=========
Databases
=========
Django officially supports the following databases:
* :ref:`PostgreSQL <postgresql-notes>`
* :ref:`MariaDB <mariadb-notes>`
* :ref:`MySQL <mysql-notes>`
* :ref:`Oracle <oracle-notes>`
* :ref:`SQLite <sqlite-notes>`
There are also a number of :ref:`database backends provided by third parties
<third-party-notes>`.
Django attempts to support as many features as possible on all database
backends. However, not all database backends are alike, and we've had to make
design decisions on which features to support and which assumptions we can make
safely.
This file describes some of the features that might be relevant to Django
usage. Of course, it is not intended as a replacement for server-specific
documentation or reference manuals.
General notes
=============
.. _persistent-database-connections:
Persistent connections
----------------------
Persistent connections avoid the overhead of re-establishing a connection to
the database in each request. They're controlled by the
:setting:`CONN_MAX_AGE` parameter which defines the maximum lifetime of a
connection. It can be set independently for each database.
The default value is ``0``, preserving the historical behavior of closing the
database connection at the end of each request. To enable persistent
connections, set :setting:`CONN_MAX_AGE` to a positive integer of seconds. For
unlimited persistent connections, set it to ``None``.
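
For example, a configuration that keeps connections open for up to ten minutes
might look like this (a minimal sketch; the engine and database name below are
placeholders)::

    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql',
            'NAME': 'mydatabase',
            # Reuse connections for up to 600 seconds; 0 (the default) closes
            # them at the end of each request, None keeps them open forever.
            'CONN_MAX_AGE': 600,
        },
    }
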
Connection management
~~~~~~~~~~~~~~~~~~~~~
Django opens a connection to the database when it first makes a database
query. It keeps this connection open and reuses it in subsequent requests.
Django closes the connection once it exceeds the maximum age defined by
:setting:`CONN_MAX_AGE` or when it isn't usable any longer.
In detail, Django automatically opens a connection to the database whenever it
needs one and doesn't have one already — either because this is the first
connection, or because the previous connection was closed.
At the beginning of each request, Django closes the connection if it has
reached its maximum age. If your database terminates idle connections after
some time, you should set :setting:`CONN_MAX_AGE` to a lower value, so that
Django doesn't attempt to use a connection that has been terminated by the
database server. (This problem may only affect very low traffic sites.)
At the end of each request, Django closes the connection if it has reached its
maximum age or if it is in an unrecoverable error state. If any database
errors have occurred while processing the requests, Django checks whether the
connection still works, and closes it if it doesn't. Thus, database errors
affect at most one request; if the connection becomes unusable, the next
request gets a fresh connection.
Caveats
~~~~~~~
Since each thread maintains its own connection, your database must support at
least as many simultaneous connections as you have worker threads.
Sometimes a database won't be accessed by the majority of your views, for
example because it's the database of an external system, or thanks to caching.
In such cases, you should set :setting:`CONN_MAX_AGE` to a low value or even
``0``, because it doesn't make sense to maintain a connection that's unlikely
to be reused. This will help keep the number of simultaneous connections to
this database small.
The development server creates a new thread for each request it handles,
negating the effect of persistent connections. Don't enable them during
development.
When Django establishes a connection to the database, it sets up appropriate
parameters, depending on the backend being used. If you enable persistent
connections, this setup is no longer repeated every request. If you modify
parameters such as the connection's isolation level or time zone, you should
either restore Django's defaults at the end of each request, force an
appropriate value at the beginning of each request, or disable persistent
connections.
Encoding
--------
Django assumes that all databases use UTF-8 encoding. Using other encodings may
result in unexpected behavior such as "value too long" errors from your
database for data that is valid in Django. See the database specific notes
below for information on how to set up your database correctly.
.. _postgresql-notes:
PostgreSQL notes
================
Django supports PostgreSQL 9.5 and higher. `psycopg2`_ 2.5.4 or higher is
required, though the latest release is recommended.
.. _psycopg2: https://www.psycopg.org/
PostgreSQL connection settings
-------------------------------
See :setting:`HOST` for details.
Optimizing PostgreSQL's configuration
-------------------------------------
Django needs the following parameters for its database connections:
- ``client_encoding``: ``'UTF8'``,
- ``default_transaction_isolation``: ``'read committed'`` by default,
  or the value set in the connection options (see below),
- ``timezone``:

  - when :setting:`USE_TZ` is ``True``, ``'UTC'`` by default, or the
    :setting:`TIME_ZONE <DATABASE-TIME_ZONE>` value set for the connection,
  - when :setting:`USE_TZ` is ``False``, the value of the global
    :setting:`TIME_ZONE` setting.
If these parameters already have the correct values, Django won't set them for
every new connection, which improves performance slightly. You can configure
them directly in :file:`postgresql.conf` or more conveniently per database
user with `ALTER ROLE`_.
Django will work just fine without this optimization, but each new connection
will do some additional queries to set these parameters.
.. _ALTER ROLE: https://www.postgresql.org/docs/current/sql-alterrole.html
.. _database-isolation-level:
Isolation level
---------------
Like PostgreSQL itself, Django defaults to the ``READ COMMITTED`` `isolation
level`_. If you need a higher isolation level such as ``REPEATABLE READ`` or
``SERIALIZABLE``, set it in the :setting:`OPTIONS` part of your database
configuration in :setting:`DATABASES`::

    import psycopg2.extensions

    DATABASES = {
        # ...
        'OPTIONS': {
            'isolation_level': psycopg2.extensions.ISOLATION_LEVEL_SERIALIZABLE,
        },
    }

.. note::
Under higher isolation levels, your application should be prepared to
handle exceptions raised on serialization failures. This option is
designed for advanced uses.
.. _isolation level: https://www.postgresql.org/docs/current/transaction-iso.html
Indexes for ``varchar`` and ``text`` columns
--------------------------------------------
When specifying ``db_index=True`` on your model fields, Django typically
outputs a single ``CREATE INDEX`` statement. However, if the database type
for the field is either ``varchar`` or ``text`` (e.g., used by ``CharField``,
``FileField``, and ``TextField``), then Django will create
an additional index that uses an appropriate `PostgreSQL operator class`_
for the column. The extra index is necessary to correctly perform
lookups that use the ``LIKE`` operator in their SQL, as is done with the
``contains`` and ``startswith`` lookup types.
.. _PostgreSQL operator class: https://www.postgresql.org/docs/current/indexes-opclass.html
Migration operation for adding extensions
-----------------------------------------
If you need to add a PostgreSQL extension (like ``hstore``, ``postgis``, etc.)
using a migration, use the
:class:`~django.contrib.postgres.operations.CreateExtension` operation.
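
A minimal sketch of such a migration (the dependency entry is a placeholder to
fill in for your project)::

    from django.contrib.postgres.operations import CreateExtension
    from django.db import migrations

    class Migration(migrations.Migration):

        dependencies = [
            # ('myapp', '0001_initial'),  # placeholder dependency
        ]

        operations = [
            CreateExtension('hstore'),
        ]
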
.. _postgresql-server-side-cursors:
Server-side cursors
-------------------
When using :meth:`QuerySet.iterator()
<django.db.models.query.QuerySet.iterator>`, Django opens a :ref:`server-side
cursor <psycopg2:server-side-cursors>`. By default, PostgreSQL assumes that
only the first 10% of the results of cursor queries will be fetched. The query
planner spends less time planning the query and starts returning results
faster, but this could diminish performance if more than 10% of the results are
retrieved. PostgreSQL's assumption about the number of rows retrieved for a
cursor query is controlled with the `cursor_tuple_fraction`_ option.
.. _cursor_tuple_fraction: https://www.postgresql.org/docs/current/runtime-config-query.html#GUC-CURSOR-TUPLE-FRACTION
.. _transaction-pooling-server-side-cursors:
Transaction pooling and server-side cursors
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Using a connection pooler in transaction pooling mode (e.g. `PgBouncer`_)
requires disabling server-side cursors for that connection.
Server-side cursors are local to a connection and remain open at the end of a
transaction when :setting:`AUTOCOMMIT <DATABASE-AUTOCOMMIT>` is ``True``. A
subsequent transaction may attempt to fetch more results from a server-side
cursor. In transaction pooling mode, there's no guarantee that subsequent
transactions will use the same connection. If a different connection is used,
an error is raised when the transaction references the server-side cursor,
because server-side cursors are only accessible in the connection in which they
were created.
One solution is to disable server-side cursors for a connection in
:setting:`DATABASES` by setting :setting:`DISABLE_SERVER_SIDE_CURSORS
<DATABASE-DISABLE_SERVER_SIDE_CURSORS>` to ``True``.
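
For example (a minimal sketch; only the relevant key is shown)::

    DATABASES = {
        'default': {
            # ...
            # Needed when this connection goes through a pooler in
            # transaction pooling mode, such as PgBouncer.
            'DISABLE_SERVER_SIDE_CURSORS': True,
        },
    }
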
To benefit from server-side cursors in transaction pooling mode, you could set
up :doc:`another connection to the database </topics/db/multi-db>` in order to
perform queries that use server-side cursors. This connection needs to either
be directly to the database or to a connection pooler in session pooling mode.
Another option is to wrap each ``QuerySet`` using server-side cursors in an
:func:`~django.db.transaction.atomic` block, because it disables ``autocommit``
for the duration of the transaction. This way, the server-side cursor will only
live for the duration of the transaction.
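
A sketch of that approach, assuming a hypothetical model ``MyModel``::

    from django.db import transaction

    with transaction.atomic():
        # The server-side cursor opened by iterator() only lives for the
        # duration of this transaction, so it can't leak across pooled
        # connections.
        for obj in MyModel.objects.iterator():
            ...
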
.. _PgBouncer: https://pgbouncer.github.io/
.. _manually-specified-autoincrement-pk:
Manually-specifying values of auto-incrementing primary keys
------------------------------------------------------------
Django uses PostgreSQL's `SERIAL data type`_ to store auto-incrementing primary
keys. A ``SERIAL`` column is populated with values from a `sequence`_ that
keeps track of the next available value. Manually assigning a value to an
auto-incrementing field doesn't update the field's sequence, which might later
cause a conflict. For example::

    >>> from django.contrib.auth.models import User
    >>> User.objects.create(username='alice', pk=1)
    <User: alice>
    >>> # The sequence hasn't been updated; its next value is 1.
    >>> User.objects.create(username='bob')
    ...
    IntegrityError: duplicate key value violates unique constraint
    "auth_user_pkey" DETAIL:  Key (id)=(1) already exists.

If you need to specify such values, reset the sequence afterwards to avoid
reusing a value that's already in the table. The :djadmin:`sqlsequencereset`
management command generates the SQL statements to do that.
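
If you prefer to produce those statements from Python instead of the command
line, a sketch using ``call_command()`` (the ``auth`` app label is just an
example)::

    from django.core.management import call_command

    # Prints the sequence-resetting SQL for the given app to stdout;
    # run that output against your database to realign the sequences.
    call_command('sqlsequencereset', 'auth')
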
.. _SERIAL data type: https://www.postgresql.org/docs/current/datatype-numeric.html#DATATYPE-SERIAL
.. _sequence: https://www.postgresql.org/docs/current/sql-createsequence.html
Test database templates
-----------------------
You can use the :setting:`TEST['TEMPLATE'] <TEST_TEMPLATE>` setting to specify
a `template`_ (e.g. ``'template0'``) from which to create a test database.
.. _template: https://www.postgresql.org/docs/current/sql-createdatabase.html
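
For example (a minimal sketch; only the relevant keys are shown)::

    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql',
            # ...
            'TEST': {
                'TEMPLATE': 'template0',
            },
        },
    }
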
Speeding up test execution with non-durable settings
----------------------------------------------------
You can speed up test execution times by `configuring PostgreSQL to be
non-durable <https://www.postgresql.org/docs/current/non-durability.html>`_.
.. warning::
This is dangerous: it will make your database more susceptible to data loss
or corruption in the case of a server crash or power loss. Only use this on
a development machine where you can easily restore the entire contents of
all databases in the cluster.
.. _mariadb-notes:
MariaDB notes
=============
.. versionadded:: 3.0
Django supports MariaDB 10.2 and higher.
To use MariaDB, use the MySQL backend, which is shared between the two. See the
:ref:`MySQL notes <mysql-notes>` for more details.
.. _mysql-notes:
MySQL notes
===========
Version support
---------------
Django supports MySQL 5.6 and higher.
Django's ``inspectdb`` feature uses the ``information_schema`` database, which
contains detailed data on all database schemas.
Django expects the database to support Unicode (UTF-8 encoding) and delegates to
it the task of enforcing transactions and referential integrity. Be aware that
the latter two aren't actually enforced by MySQL when using the MyISAM storage
engine; see the next section.
.. _mysql-storage-engines:
Storage engines
---------------
MySQL has several `storage engines`_. You can change the default storage engine
in the server configuration.
MySQL's default storage engine is InnoDB_. This engine is fully transactional
and supports foreign key references. It's the recommended choice. However, the
InnoDB autoincrement counter is lost on a MySQL restart because it does not
remember the ``AUTO_INCREMENT`` value, instead recreating it as "max(id)+1".
This may result in an inadvertent reuse of :class:`~django.db.models.AutoField`
values.
The main drawbacks of MyISAM_ are that it doesn't support transactions or
enforce foreign-key constraints.
.. _storage engines: https://dev.mysql.com/doc/refman/en/storage-engines.html
.. _MyISAM: https://dev.mysql.com/doc/refman/en/myisam-storage-engine.html
.. _InnoDB: https://dev.mysql.com/doc/refman/en/innodb-storage-engine.html
.. _mysql-db-api-drivers:
MySQL DB API Drivers
--------------------
MySQL has a couple of drivers that implement the Python Database API described in
:pep:`249`:
- `mysqlclient`_ is a native driver. It's **the recommended choice**.
- `MySQL Connector/Python`_ is a pure Python driver from Oracle that does not
require the MySQL client library or any Python modules outside the standard
library.
.. _mysqlclient: https://pypi.org/project/mysqlclient/
.. _MySQL Connector/Python: https://dev.mysql.com/downloads/connector/python
These drivers are thread-safe and provide connection pooling.
In addition to a DB API driver, Django needs an adapter to access the database
drivers from its ORM. Django provides an adapter for mysqlclient while MySQL
Connector/Python includes `its own`_.
.. _its own: https://dev.mysql.com/doc/connector-python/en/connector-python-django-backend.html
mysqlclient
~~~~~~~~~~~
Django requires `mysqlclient`_ 1.3.13 or later.
MySQL Connector/Python
~~~~~~~~~~~~~~~~~~~~~~
MySQL Connector/Python is available from the `download page`_.
The Django adapter is available in versions 1.1.X and later. It may not
support the most recent releases of Django.
.. _download page: https://dev.mysql.com/downloads/connector/python/
.. _mysql-time-zone-definitions:
Time zone definitions
---------------------
If you plan on using Django's :doc:`timezone support </topics/i18n/timezones>`,
use `mysql_tzinfo_to_sql`_ to load time zone tables into the MySQL database.
This needs to be done just once for your MySQL server, not per database.
.. _mysql_tzinfo_to_sql: https://dev.mysql.com/doc/refman/en/mysql-tzinfo-to-sql.html
Creating your database
----------------------
You can `create your database`_ using the command-line tools and this SQL::

    CREATE DATABASE <dbname> CHARACTER SET utf8;

This ensures all tables and columns will use UTF-8 by default.
.. _create your database: https://dev.mysql.com/doc/refman/en/create-database.html
.. _mysql-collation:
Collation settings
~~~~~~~~~~~~~~~~~~
The collation setting for a column controls the order in which data is sorted
as well as what strings compare as equal. It can be set on a database-wide
level and also per-table and per-column. This is `documented thoroughly`_ in
the MySQL documentation. In all cases, you set the collation by directly
manipulating the database tables; Django doesn't provide a way to set this on
the model definition.
.. _documented thoroughly: https://dev.mysql.com/doc/refman/en/charset.html
By default, with a UTF-8 database, MySQL will use the
``utf8_general_ci`` collation. This results in all string equality
comparisons being done in a *case-insensitive* manner. That is, ``"Fred"`` and
``"freD"`` are considered equal at the database level. If you have a unique
constraint on a field, it would be illegal to try to insert both ``"aa"`` and
``"AA"`` into the same column, since they compare as equal (and, hence,
non-unique) with the default collation. If you want case-sensitive comparisons
on a particular column or table, change the column or table to use the
``utf8_bin`` collation.
Please note that according to `MySQL Unicode Character Sets`_, comparisons for
the ``utf8_general_ci`` collation are faster, but slightly less correct, than
comparisons for ``utf8_unicode_ci``. If this is acceptable for your application,
you should use ``utf8_general_ci`` because it is faster. If this is not acceptable
(for example, if you require German dictionary order), use ``utf8_unicode_ci``
because it is more accurate.
.. _MySQL Unicode Character Sets: https://dev.mysql.com/doc/refman/en/charset-unicode-sets.html
.. warning::
Model formsets validate unique fields in a case-sensitive manner. Thus when
using a case-insensitive collation, a formset with unique field values that
differ only by case will pass validation, but upon calling ``save()``, an
``IntegrityError`` will be raised.
Connecting to the database
--------------------------
Refer to the :doc:`settings documentation </ref/settings>`.
Connection settings are used in this order:
#. :setting:`OPTIONS`.
#. :setting:`NAME`, :setting:`USER`, :setting:`PASSWORD`, :setting:`HOST`,
:setting:`PORT`
#. MySQL option files.
In other words, if you set the name of the database in :setting:`OPTIONS`,
this will take precedence over :setting:`NAME`, which would override
anything in a `MySQL option file`_.
Here's a sample configuration which uses a MySQL option file::

    # settings.py
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'OPTIONS': {
                'read_default_file': '/path/to/my.cnf',
            },
        }
    }


    # my.cnf
    [client]
    database = NAME
    user = USER
    password = PASSWORD
    default-character-set = utf8

Several other `MySQLdb connection options`_ may be useful, such as ``ssl``,
``init_command``, and ``sql_mode``.
.. _MySQL option file: https://dev.mysql.com/doc/refman/en/option-files.html
.. _MySQLdb connection options: https://mysqlclient.readthedocs.io/user_guide.html#functions-and-attributes
.. _mysql-sql-mode:
Setting ``sql_mode``
~~~~~~~~~~~~~~~~~~~~
From MySQL 5.7 onwards and on fresh installs of MySQL 5.6, the default value of
the ``sql_mode`` option contains ``STRICT_TRANS_TABLES``. That option escalates
warnings into errors when data are truncated upon insertion, so Django highly
recommends activating a `strict mode`_ for MySQL to prevent data loss (either
``STRICT_TRANS_TABLES`` or ``STRICT_ALL_TABLES``).
.. _strict mode: https://dev.mysql.com/doc/refman/en/sql-mode.html#sql-mode-strict
If you need to customize the SQL mode, you can set the ``sql_mode`` variable
like other MySQL options: either in a config file or with the entry
``'init_command': "SET sql_mode='STRICT_TRANS_TABLES'"`` in the
:setting:`OPTIONS` part of your database configuration in :setting:`DATABASES`.
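
For example (a minimal sketch; only the relevant keys are shown)::

    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            # ...
            'OPTIONS': {
                'init_command': "SET sql_mode='STRICT_TRANS_TABLES'",
            },
        },
    }
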
.. _mysql-isolation-level:
Isolation level
~~~~~~~~~~~~~~~
When running concurrent loads, database transactions from different sessions
(say, separate threads handling different requests) may interact with each
other. These interactions are affected by each session's `transaction isolation
level`_. You can set a connection's isolation level with an
``'isolation_level'`` entry in the :setting:`OPTIONS` part of your database
configuration in :setting:`DATABASES`. Valid values for
this entry are the four standard isolation levels:
* ``'read uncommitted'``
* ``'read committed'``
* ``'repeatable read'``
* ``'serializable'``
or ``None`` to use the server's configured isolation level. However, Django
works best with and defaults to read committed rather than MySQL's default,
repeatable read. Data loss is possible with repeatable read. In particular,
you may see cases where :meth:`~django.db.models.query.QuerySet.get_or_create`
will raise an :exc:`~django.db.IntegrityError` but the object won't appear in
a subsequent :meth:`~django.db.models.query.QuerySet.get` call.
.. _transaction isolation level: https://dev.mysql.com/doc/refman/en/innodb-transaction-isolation-levels.html
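
For example, to pin a connection to the level that Django works best with (a
minimal sketch; only the relevant keys are shown)::

    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            # ...
            'OPTIONS': {
                'isolation_level': 'read committed',
            },
        },
    }
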
Creating your tables
--------------------
When Django generates the schema, it doesn't specify a storage engine, so
tables will be created with whatever default storage engine your database
server is configured for. The easiest solution is to set your database server's
default storage engine to the desired engine.
If you're using a hosting service and can't change your server's default
storage engine, you have a couple of options.
* After the tables are created, execute an ``ALTER TABLE`` statement to
  convert a table to a new storage engine (such as InnoDB)::

      ALTER TABLE <tablename> ENGINE=INNODB;

  This can be tedious if you have a lot of tables.

* Another option is to use the ``init_command`` option for MySQLdb prior to
  creating your tables::

      'OPTIONS': {
          'init_command': 'SET default_storage_engine=INNODB',
      }

  This sets the default storage engine upon connecting to the database.
  After your tables have been created, you should remove this option as it
  adds a query that is only needed during table creation to each database
  connection.
Table names
-----------
There are `known issues`_ in even the latest versions of MySQL that can cause the
case of a table name to be altered when certain SQL statements are executed
under certain conditions. It is recommended that you use lowercase table
names, if possible, to avoid any problems that might arise from this behavior.
Django uses lowercase table names when it auto-generates table names from
models, so this is mainly a consideration if you are overriding the table name
via the :class:`~django.db.models.Options.db_table` parameter.
.. _known issues: https://bugs.mysql.com/bug.php?id=48875
Savepoints
----------
Both the Django ORM and MySQL (when using the InnoDB :ref:`storage engine
<mysql-storage-engines>`) support database :ref:`savepoints
<topics-db-transactions-savepoints>`.
If you use the MyISAM storage engine, be aware that you will receive
database-generated errors if you try to use the :ref:`savepoint-related
methods of the transactions API <topics-db-transactions-savepoints>`. The
reason is that detecting the storage engine of a MySQL database/table is an
expensive operation, so it was decided it isn't worth dynamically converting
these methods into no-ops based on the result of such detection.
Notes on specific fields
------------------------
.. _mysql-character-fields:
Character fields
~~~~~~~~~~~~~~~~
Any fields that are stored with ``VARCHAR`` column types may have their
``max_length`` restricted to 255 characters if you are using ``unique=True``
for the field. This affects :class:`~django.db.models.CharField` and
:class:`~django.db.models.SlugField`. See `the MySQL documentation`_ for more
details.
.. _the MySQL documentation: https://dev.mysql.com/doc/refman/en/create-index.html#create-index-column-prefixes
``TextField`` limitations
~~~~~~~~~~~~~~~~~~~~~~~~~
MySQL can index only the first N chars of a ``BLOB`` or ``TEXT`` column. Since
``TextField`` doesn't have a defined length, you can't mark it as
``unique=True``. MySQL will report: "BLOB/TEXT column '<db_column>' used in key
specification without a key length".
.. _mysql-fractional-seconds:
Fractional seconds support for Time and DateTime fields
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
MySQL 5.6.4 and later can store fractional seconds, provided that the
column definition includes a fractional indication (e.g. ``DATETIME(6)``).
Earlier versions do not support them at all.
Django will not upgrade existing columns to include fractional seconds if the
database server supports it. If you want to enable them on an existing database,
it's up to you to either manually update the column on the target database by
executing a command like::

    ALTER TABLE `your_table` MODIFY `your_datetime_column` DATETIME(6)

or using a :class:`~django.db.migrations.operations.RunSQL` operation in a
:ref:`data migration <data-migrations>`.
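
A sketch of the latter approach (the table and column names are the same
placeholders as in the command above, and the dependency entry is for you to
fill in)::

    from django.db import migrations

    class Migration(migrations.Migration):

        dependencies = [
            # ('myapp', '0001_initial'),  # placeholder dependency
        ]

        operations = [
            migrations.RunSQL(
                "ALTER TABLE `your_table` MODIFY `your_datetime_column` DATETIME(6)",
            ),
        ]
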
``TIMESTAMP`` columns
~~~~~~~~~~~~~~~~~~~~~
If you are using a legacy database that contains ``TIMESTAMP`` columns, you must
set :setting:`USE_TZ = False <USE_TZ>` to avoid data corruption.
:djadmin:`inspectdb` maps these columns to
:class:`~django.db.models.DateTimeField` and if you enable timezone support,
both MySQL and Django will attempt to convert the values from UTC to local time.
Row locking with ``QuerySet.select_for_update()``
-------------------------------------------------
MySQL and MariaDB do not support some options to the ``SELECT ... FOR UPDATE``
statement. If ``select_for_update()`` is used with an unsupported option, then
a :exc:`~django.db.NotSupportedError` is raised.
===============  =========  ==========
Option           MariaDB    MySQL
===============  =========  ==========
``SKIP LOCKED``             X (≥8.0.1)
``NOWAIT``       X (≥10.3)  X (≥8.0.1)
``OF``
===============  =========  ==========
When using ``select_for_update()`` on MySQL, make sure you filter a queryset
against at least a set of fields contained in unique constraints or only
against fields covered by indexes. Otherwise, an exclusive write lock will be
acquired over the full table for the duration of the transaction.
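
For example, a sketch assuming a hypothetical model ``MyModel``::

    from django.db import transaction

    with transaction.atomic():
        # Filtering on the primary key lets InnoDB lock only the matching
        # row; an unfiltered select_for_update() would lock the whole table.
        obj = MyModel.objects.select_for_update().get(pk=42)
        ...
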
Automatic typecasting can cause unexpected results
--------------------------------------------------
When performing a query on a string type, but with an integer value, MySQL will
coerce the types of all values in the table to an integer before performing the
comparison. If your table contains the values ``'abc'``, ``'def'`` and you
query for ``WHERE mycolumn=0``, both rows will match. Similarly, ``WHERE mycolumn=1``
will match the value ``'abc1'``. Therefore, string type fields included in Django
will always cast the value to a string before using it in a query.
If you implement custom model fields that inherit from
:class:`~django.db.models.Field` directly, are overriding
:meth:`~django.db.models.Field.get_prep_value`, or use
:class:`~django.db.models.expressions.RawSQL`,
:meth:`~django.db.models.query.QuerySet.extra`, or
:meth:`~django.db.models.Manager.raw`, you should ensure that you perform
appropriate typecasting.
.. _sqlite-notes:
SQLite notes
============
Django supports SQLite 3.8.3 and later.
SQLite_ provides an excellent development alternative for applications that
are predominantly read-only or require a smaller installation footprint. As
with all database servers, though, there are some differences that are
specific to SQLite that you should be aware of.
.. _SQLite: https://www.sqlite.org/
.. _sqlite-string-matching:
Substring matching and case sensitivity
---------------------------------------
For all SQLite versions, there is some slightly counter-intuitive behavior when
attempting to match some types of strings. These are triggered when using the
:lookup:`iexact` or :lookup:`contains` filters in QuerySets. The behavior
splits into two cases:
1. For substring matching, all matches are done case-insensitively. That is, a
   filter such as ``filter(name__contains="aa")`` will match a name of
   ``"Aabb"``.

2. For strings containing characters outside the ASCII range, all exact string
   matches are performed case-sensitively, even when the case-insensitive
   options are passed into the query. So the :lookup:`iexact` filter will
   behave exactly the same as the :lookup:`exact` filter in these cases.
Some possible workarounds for this are `documented at sqlite.org`_, but they
aren't utilized by the default SQLite backend in Django, as incorporating them
would be fairly difficult to do robustly. Thus, Django exposes the default
SQLite behavior and you should be aware of this when doing case-insensitive or
substring filtering.
.. _documented at sqlite.org: https://www.sqlite.org/faq.html#q18
.. _sqlite-decimal-handling:
Decimal handling
----------------
SQLite has no real decimal internal type. Decimal values are internally
converted to the ``REAL`` data type (8-byte IEEE floating point number), as
explained in the `SQLite datatypes documentation`__, so they don't support
correctly-rounded decimal floating point arithmetic.
__ https://www.sqlite.org/datatype3.html#storage_classes_and_datatypes
"Database is locked" errors
---------------------------
SQLite is meant to be a lightweight database, and thus can't support a high
level of concurrency. ``OperationalError: database is locked`` errors indicate
that your application is experiencing more concurrency than ``sqlite`` can
handle in its default configuration. This error means that one thread or
process has an exclusive lock on the database connection and another thread
timed out waiting for the lock to be released.
Python's SQLite wrapper has
a default timeout value that determines how long the second thread is allowed to
wait on the lock before it times out and raises the ``OperationalError: database
is locked`` error.
If you're getting this error, you can solve it by:
* Switching to another database backend. At a certain point SQLite becomes
too "lite" for real-world applications, and these sorts of concurrency
errors indicate you've reached that point.
* Rewriting your code to reduce concurrency and ensure that database
transactions are short-lived.
* Increasing the default timeout value by setting the ``timeout`` database
  option::

      'OPTIONS': {
          # ...
          'timeout': 20,
          # ...
      }

  This will make SQLite wait a bit longer before throwing "database is locked"
  errors; it won't really do anything to solve them.
``QuerySet.select_for_update()`` not supported
----------------------------------------------
SQLite does not support the ``SELECT ... FOR UPDATE`` syntax. Calling it will
have no effect.
"pyformat" parameter style in raw queries not supported
-------------------------------------------------------
For most backends, raw queries (``Manager.raw()`` or ``cursor.execute()``)
can use the "pyformat" parameter style, where placeholders in the query
are given as ``'%(name)s'`` and the parameters are passed as a dictionary
rather than a list. SQLite does not support this.
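
For example, a sketch against a hypothetical ``myapp_person`` table::

    from django.db import connection

    with connection.cursor() as cursor:
        # The positional "format" style works on all backends, including
        # SQLite.
        cursor.execute("SELECT id FROM myapp_person WHERE last_name = %s", ["Lennon"])

        # The "pyformat" style below works on most backends, but not SQLite:
        # cursor.execute(
        #     "SELECT id FROM myapp_person WHERE last_name = %(name)s",
        #     {"name": "Lennon"},
        # )
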
.. _sqlite-isolation:
Isolation when using ``QuerySet.iterator()``
--------------------------------------------
There are special considerations described in `Isolation In SQLite`_ when
modifying a table while iterating over it using :meth:`.QuerySet.iterator`. If
a row is added, changed, or deleted within the loop, then that row may or may
not appear, or may appear twice, in subsequent results fetched from the
iterator. Your code must handle this.
.. _`Isolation in SQLite`: https://sqlite.org/isolation.html
.. _oracle-notes:
Oracle notes
============
Django supports `Oracle Database Server`_ versions 12.2 and higher. Version
6.0 or higher of the `cx_Oracle`_ Python driver is required.
.. _`Oracle Database Server`: https://www.oracle.com/
.. _`cx_Oracle`: https://oracle.github.io/python-cx_Oracle/
In order for the ``python manage.py migrate`` command to work, your Oracle
database user must have privileges to run the following commands:
* CREATE TABLE
* CREATE SEQUENCE
* CREATE PROCEDURE
* CREATE TRIGGER
To run a project's test suite, the user usually needs these *additional*
privileges:
* CREATE USER
* ALTER USER
* DROP USER
* CREATE TABLESPACE
* DROP TABLESPACE
* CREATE SESSION WITH ADMIN OPTION
* CREATE TABLE WITH ADMIN OPTION
* CREATE SEQUENCE WITH ADMIN OPTION
* CREATE PROCEDURE WITH ADMIN OPTION
* CREATE TRIGGER WITH ADMIN OPTION
While the ``RESOURCE`` role has the required ``CREATE TABLE``,
``CREATE SEQUENCE``, ``CREATE PROCEDURE``, and ``CREATE TRIGGER`` privileges,
and a user granted ``RESOURCE WITH ADMIN OPTION`` can grant ``RESOURCE``, such
a user cannot grant the individual privileges (e.g. ``CREATE TABLE``), and thus
``RESOURCE WITH ADMIN OPTION`` is not usually sufficient for running tests.
Some test suites also create views or materialized views; to run these, the
user also needs ``CREATE VIEW WITH ADMIN OPTION`` and
``CREATE MATERIALIZED VIEW WITH ADMIN OPTION`` privileges. In particular, this
is needed for Django's own test suite.
All of these privileges are included in the DBA role, which is appropriate
for use on a private developer's database.
The Oracle database backend uses the ``SYS.DBMS_LOB`` and ``SYS.DBMS_RANDOM``
packages, so your user will require execute permissions on them. They're
normally accessible to all users by default, but in case they are not, you'll
need to grant permissions like so:

.. code-block:: sql

    GRANT EXECUTE ON SYS.DBMS_LOB TO user;
    GRANT EXECUTE ON SYS.DBMS_RANDOM TO user;

Connecting to the database
--------------------------
To connect using the service name of your Oracle database, your ``settings.py``
file should look something like this::

    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.oracle',
            'NAME': 'xe',
            'USER': 'a_user',
            'PASSWORD': 'a_password',
            'HOST': '',
            'PORT': '',
        }
    }

In this case, you should leave both :setting:`HOST` and :setting:`PORT` empty.
However, if you don't use a ``tnsnames.ora`` file or a similar naming method
and want to connect using the SID ("xe" in this example), then fill in both
:setting:`HOST` and :setting:`PORT` like so::

    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.oracle',
            'NAME': 'xe',
            'USER': 'a_user',
            'PASSWORD': 'a_password',
            'HOST': 'dbprod01ned.mycompany.com',
            'PORT': '1540',
        }
    }

You should either supply both :setting:`HOST` and :setting:`PORT`, or leave
both as empty strings. Django will use a different connect descriptor depending
on that choice.
Full DSN and Easy Connect
~~~~~~~~~~~~~~~~~~~~~~~~~
A Full DSN or Easy Connect string can be used in :setting:`NAME` if both
:setting:`HOST` and :setting:`PORT` are empty. This format is required when
using RAC or pluggable databases without ``tnsnames.ora``, for example.
Example of an Easy Connect string::

    'NAME': 'localhost:1521/orclpdb1',

Example of a full DSN string::

    'NAME': (
        '(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=localhost)(PORT=1521))'
        '(CONNECT_DATA=(SERVICE_NAME=orclpdb1)))'
    ),

Threaded option
---------------
If you plan to run Django in a multithreaded environment (e.g. Apache using the
default MPM module on any modern operating system), then you **must** set
the ``threaded`` option of your Oracle database configuration to ``True``::

    'OPTIONS': {
        'threaded': True,
    },

Failure to do this may result in crashes and other odd behavior.
INSERT ... RETURNING INTO
-------------------------
By default, the Oracle backend uses a ``RETURNING INTO`` clause to efficiently
retrieve the value of an ``AutoField`` when inserting new rows. This behavior
may result in a ``DatabaseError`` in certain unusual setups, such as when
inserting into a remote table, or into a view with an ``INSTEAD OF`` trigger.
The ``RETURNING INTO`` clause can be disabled by setting the
``use_returning_into`` option of the database configuration to ``False``::

    'OPTIONS': {
        'use_returning_into': False,
    },

In this case, the Oracle backend will use a separate ``SELECT`` query to
retrieve ``AutoField`` values.
Naming issues
-------------
Oracle imposes a name length limit of 30 characters. To accommodate this, the
backend truncates database identifiers to fit, replacing the final four
characters of the truncated name with a repeatable MD5 hash value.
Additionally, the backend turns database identifiers to all-uppercase.
To prevent these transformations (this is usually required only when dealing
with legacy databases or accessing tables which belong to other users), use
a quoted name as the value for ``db_table``::

    class LegacyModel(models.Model):
        class Meta:
            db_table = '"name_left_in_lowercase"'

    class ForeignModel(models.Model):
        class Meta:
            db_table = '"OTHER_USER"."NAME_ONLY_SEEMS_OVER_30"'

Quoted names can also be used with Django's other supported database
backends; except for Oracle, however, the quotes have no effect.
When running ``migrate``, an ``ORA-06552`` error may be encountered if
certain Oracle keywords are used as the name of a model field or the
value of a ``db_column`` option. Django quotes all identifiers used
in queries to prevent most such problems, but this error can still
occur when an Oracle datatype is used as a column name. In
particular, take care to avoid using the names ``date``,
``timestamp``, ``number`` or ``float`` as a field name.
.. _oracle-null-empty-strings:
NULL and empty strings
----------------------
Django generally prefers to use the empty string (``''``) rather than
``NULL``, but Oracle treats both identically. To get around this, the
Oracle backend ignores an explicit ``null`` option on fields that
have the empty string as a possible value and generates DDL as if
``null=True``. When fetching from the database, it is assumed that
a ``NULL`` value in one of these fields really means the empty
string, and the data is silently converted to reflect this assumption.
``TextField`` limitations
-------------------------
The Oracle backend stores ``TextFields`` as ``NCLOB`` columns. Oracle imposes
some limitations on the usage of such LOB columns in general:
* LOB columns may not be used as primary keys.
* LOB columns may not be used in indexes.
* LOB columns may not be used in a ``SELECT DISTINCT`` list. This means that
attempting to use the ``QuerySet.distinct`` method on a model that
includes ``TextField`` columns will result in an ``ORA-00932`` error when
run against Oracle. As a workaround, use the ``QuerySet.defer`` method in
conjunction with ``distinct()`` to prevent ``TextField`` columns from being
included in the ``SELECT DISTINCT`` list.
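
A sketch of that workaround, assuming a hypothetical ``Entry`` model with a
``TextField`` named ``body``::

    # Deferring the NCLOB-backed column keeps it out of the SELECT DISTINCT
    # list, so the query no longer raises ORA-00932.
    entries = Entry.objects.defer('body').distinct()
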
.. _subclassing-database-backends:
Subclassing the built-in database backends
==========================================
Django comes with built-in database backends. You may subclass an existing
database backend to modify its behavior, features, or configuration.
Consider, for example, that you need to change a single database feature.
First, you have to create a new directory with a ``base`` module in it. For
example::

    mysite/
        ...
        mydbengine/
            __init__.py
            base.py

The ``base.py`` module must contain a class named ``DatabaseWrapper`` that
subclasses an existing engine from the ``django.db.backends`` module. Here's an
example of subclassing the PostgreSQL engine to change the database feature
``allows_group_by_selected_pks_on_model``:

.. code-block:: python
    :caption: mysite/mydbengine/base.py

    from django.db.backends.postgresql import base, features

    class DatabaseFeatures(features.DatabaseFeatures):
        def allows_group_by_selected_pks_on_model(self, model):
            return True

    class DatabaseWrapper(base.DatabaseWrapper):
        features_class = DatabaseFeatures

Finally, you must specify a :setting:`DATABASE-ENGINE` in your ``settings.py``
file::

    DATABASES = {
        'default': {
            'ENGINE': 'mydbengine',
            ...
        },
    }

You can see the current list of database engines by looking in
:source:`django/db/backends`.
.. _third-party-notes:
Using a 3rd-party database backend
==================================
In addition to the officially supported databases, there are backends provided
by 3rd parties that allow you to use other databases with Django:
* `IBM DB2`_
* `Microsoft SQL Server`_
* Firebird_
* ODBC_
The Django versions and ORM features supported by these unofficial backends
vary considerably. Queries regarding the specific capabilities of these
unofficial backends, along with any support queries, should be directed to
the support channels provided by each 3rd party project.
.. _IBM DB2: https://pypi.org/project/ibm_db_django/
.. _Microsoft SQL Server: https://pypi.org/project/django-pyodbc-azure/
.. _Firebird: https://github.com/maxirobaina/django-firebird
.. _ODBC: https://github.com/lionheart/django-pyodbc/