The old names were downright confusing. Some seemed to mean the opposite
of what the class actually did.
The new names follow a consistent nomenclature:
(Forward|Reverse)(ManyToOne|OneToOne|ManyToMany)Descriptor.
The docstring mentions combinations that do not exist in order to help
people who would search for them in the code base.
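For orientation, a minimal sketch (assuming hypothetical Author/Book models in an installed app) of where the renamed classes show up: accessing a relation on the model class returns the descriptor itself rather than a related object.

```python
from django.db import models

class Author(models.Model):
    name = models.CharField(max_length=100)

class Book(models.Model):
    author = models.ForeignKey(Author, on_delete=models.CASCADE, related_name='books')

# Forward side of the ForeignKey (many-to-one):
type(Book.author)    # ForwardManyToOneDescriptor
# Reverse side created by related_name='books':
type(Author.books)   # ReverseManyToOneDescriptor
```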
At 2800 lines it was the largest module in the django package. This
commit brings it down to a more manageable 1620 lines.
Very small changes were made to unify the import style.
Moved data loss check when assigning to a reverse one-to-one relation on
an unsaved instance to Model.save(). This is exactly the same change as
e4b813c but for reverse relations.
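A minimal sketch of the new behavior, assuming hypothetical Place/Restaurant models where Restaurant has a OneToOneField to Place:

```python
place = Place(name='Ace Hardware')          # unsaved, so place.pk is None
restaurant = Restaurant(serves_pizza=True)

place.restaurant = restaurant   # reverse one-to-one assignment no longer raises here
restaurant.save()               # the data loss check now fires in Model.save(), raising
                                # ValueError because place has no primary key yet
```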
Moved the lookup in Field.swappable_setting to Apps, and added
an lru_cache to cache the results.
Refs #24743
Thanks Marten Kenbeek for the initial work on the patch. Thanks Aymeric
Augustin and Tim Graham for the review.
When the pk was a relation field, qs.filter(pk__in=qs) didn't work.
In addition, fixed Restaurant.objects.filter(place=restaurant_instance),
where place is a OneToOneField and the primary key of Restaurant.
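Both cases, sketched with hypothetical models in which the primary key is itself the relation:

```python
from django.db import models

class Place(models.Model):
    name = models.CharField(max_length=50)

class Restaurant(models.Model):
    place = models.OneToOneField(Place, primary_key=True, on_delete=models.CASCADE)
    serves_pizza = models.BooleanField(default=False)

pizza_places = Restaurant.objects.filter(serves_pizza=True)
Restaurant.objects.filter(pk__in=pizza_places)        # pk is a relation field here
restaurant_instance = Restaurant.objects.first()
Restaurant.objects.filter(place=restaurant_instance)  # a Restaurant passed for its pk relation
```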
A big thank you to Josh for review and to Tim for review and cosmetic
edits.
Thanks to Beauhurst for commissioning the work on this ticket.
This adds a new method, Apps.lazy_model_operation(), and a helper function,
lazy_related_operation(), which together supersede add_lazy_relation() and
make lazy model operations the responsibility of the App registry. This
system no longer uses the class_prepared signal.
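A minimal sketch of the registry-level hook (the callback is hypothetical); the function runs once every named model is ready, or immediately if it already is:

```python
from django.apps import apps

def on_user_ready(model):
    print('ready:', model.__name__)

# Model keys are (app_label, model_name) tuples.
apps.lazy_model_operation(on_user_ready, ('auth', 'user'))
```

lazy_related_operation() builds on the same mechanism for related fields, resolving references such as 'self' or 'app.Model' relative to the field's model before registering the callback.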
Field.rel is now deprecated. Rel objects now also have a remote_field
attribute, which means that self == self.remote_field.remote_field.
In addition, made the Rel objects a bit more like Field objects. Still,
ManyToManyFields are marked as null=True.
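A minimal sketch of the new layout, assuming a hypothetical Book model with an author ForeignKey:

```python
field = Book._meta.get_field('author')

field.remote_field                         # the rel object; field.rel is deprecated
field.remote_field.remote_field is field   # True: rel objects point back at their field
```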
Previously, related fields didn't implement get_lookup; instead they
were treated specially. This commit removed some of that special
handling. In particular, related fields now return Lookup instances,
too.
Another notable change in this commit is the removal of support for
annotations in names_to_path().
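A minimal sketch (the same hypothetical Book.author ForeignKey) of the consequence: lookups on related fields now resolve through the ordinary get_lookup() machinery.

```python
field = Book._meta.get_field('author')

field.get_lookup('exact')   # a Lookup subclass rather than special-cased handling
field.get_lookup('in')
field.get_lookup('isnull')
```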
If Field.choices is provided as an iterator, consume it in __init__ instead
of using itertools.tee (which ends up holding everything in memory
anyway). Fixes a bug where deconstruct() was consuming the iterator but
bypassing the call to `tee`.
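A minimal sketch with a hypothetical model: a generator passed as choices is materialized once in __init__, so later calls such as deconstruct() see the full list.

```python
from django.db import models

def color_choices():
    yield ('r', 'Red')
    yield ('g', 'Green')

class Widget(models.Model):
    color = models.CharField(max_length=1, choices=color_choices())

field = Widget._meta.get_field('color')
field.choices        # a concrete sequence, usable repeatedly
field.deconstruct()  # no longer exhausts an iterator behind choices' back
```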
These cached properties were causing problems with pickling, and in
addition they were confusingly defined: field.rel.model._meta was
not the same as field.rel.opts.
Instead, users should use field.rel.related_model._meta in place of
field.rel.opts, and field.rel.to._meta in place of field.rel.to_opts.
As suggested by Anssi. This has the slightly strange side effect that
Expression.convert_value has the expression itself passed back to it,
but it allows more complex patterns of expressions.
During direct assignment, evaluating the iterable before the transaction
is started avoids leaving the transaction dirty if an exception is raised.
This is slightly more wasteful but probably not enough to warrant a change
of behavior.
Thanks Anssi for the feedback. Refs #6707.
The method is mainly intended for use with UUIDField. For UUIDField we
want to call the field's default even when the primary key value is
explicitly set to None, to match the behavior of AutoField.
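A sketch of the intended behavior with a hypothetical Article model: an explicitly None primary key still picks up the UUID default on save.

```python
import uuid
from django.db import models

class Article(models.Model):
    id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
    title = models.CharField(max_length=100)

article = Article(pk=None, title='Hello')
article.save()
article.pk    # filled in from the field's default rather than left as None
```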
Thanks to Marc Tamlyn and Tim Graham for review.
An explicit `__exact` lookup in the related managers' filters
was interpreted as a reference to a field named `exact` on the related model.
Thanks to Trac alias zhiyajun11 for the report, Josh for the investigation,
Loïc for the test name and Tim for the review.
Several issues are resolved here, following from a report that a base_field
of GenericIPAddressField was failing.
We were using get_prep_value instead of get_db_prep_value in ArrayField
which was bypassing any extra modifications to the value being made in
the base field's get_db_prep_value. Changing this broke datetime
support, so the postgres backend has gained the relevant operation
methods to send dates/times/datetimes directly to the db backend instead
of casting them to strings. Similarly, a new database feature has been
added allowing the uuid to be passed directly to the backend, as we do
with timedeltas.
On the other side, psycopg2 expects an Inet() instance for IP address
fields, so we add a value_to_db_ipaddress method to wrap the strings on
postgres. We also have to manually add a database adapter to psycopg2,
as we do not wish to use the built in adapter which would turn
everything into Inet() instances.
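The reported case, sketched with a hypothetical model: an ArrayField whose base field relies on get_db_prep_value() (and, for IP addresses, the psycopg2 Inet wrapping).

```python
from django.contrib.postgres.fields import ArrayField
from django.db import models

class Host(models.Model):
    addresses = ArrayField(models.GenericIPAddressField())
    checked_at = ArrayField(models.DateTimeField(), default=list)

Host.objects.create(addresses=['127.0.0.1', '::1'], checked_at=[])
```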
Thanks to smclenithan for the report.
These refactorings make overriding some text-based lookup names on
other fields (specifically `contains`) much cleaner. They also remove a
bunch of duplication in the contrib.postgres lookups.
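A hypothetical sketch of the pattern this makes cleaner: a field registering its own `contains` lookup whose SQL has nothing to do with the default LIKE-based one, much as contrib.postgres does for arrays.

```python
from django.db import models
from django.db.models import Lookup

class IntArrayField(models.Field):          # illustrative field, storage details omitted
    def db_type(self, connection):
        return 'integer[]'

class IntArrayContains(Lookup):
    lookup_name = 'contains'

    def as_sql(self, compiler, connection):
        lhs, lhs_params = self.process_lhs(compiler, connection)
        rhs, rhs_params = self.process_rhs(compiler, connection)
        # PostgreSQL array containment instead of the text LIKE behaviour.
        return '%s @> %s' % (lhs, rhs), lhs_params + rhs_params

IntArrayField.register_lookup(IntArrayContains)
```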
Refactored compiler SELECT, GROUP BY and ORDER BY generation.
While there, also refactored select_related() implementation
(get_cached_row() and get_klass_info() are now gone!).
Made get_db_converters() method work on expressions instead of
internal_type. This allows the backend converters to target
specific expressions if need be.
Added query.context, which can be used to set per-query state.
Also changed the signature of database converters. They now accept
context as an argument.
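A minimal sketch of the converter shape implied here (the exact 1.8-era argument order, with context last, is an assumption for illustration):

```python
from django.db import models

class HandField(models.CharField):          # hypothetical field
    def get_db_converters(self, connection):
        return [self.convert_hand_value] + super().get_db_converters(connection)

    def convert_hand_value(self, value, expression, connection, context):
        # `expression` lets a converter target specific expressions;
        # `context` carries the per-query state set via query.context.
        return value
```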
Use INTERVAL DAY(9) TO SECOND(6) for DurationField on Oracle rather than
storing as a NUMBER(19) of microseconds.
There are issues with cx_Oracle which require some extra data
manipulation in the database backend when constructing queries, but it
handles the conversion back to timedelta objects cleanly.
Thanks to Shai for the review.
A field for storing periods of time - modeled in Python by timedelta. It
is stored in the native interval data type on PostgreSQL and as a bigint
of microseconds on other backends.
Also includes significant changes to the internals of time-related maths
in expressions, including the removal of DateModifierNode.
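A minimal sketch of the field in use, with a hypothetical model:

```python
from datetime import timedelta
from django.db import models

class Run(models.Model):
    name = models.CharField(max_length=50)
    duration = models.DurationField()

Run.objects.create(name='5k', duration=timedelta(minutes=25, seconds=30))
Run.objects.filter(duration__lt=timedelta(minutes=30))
```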
Thanks to Tim and Josh in particular for reviews.
Fixed an issue with the warning message displayed for unbound naive datetime
objects when USE_TZ is True. Added a unit test that demonstrates the issue
(discoverable when using a custom lookup in MySQL).
Added update_or_create to RelatedManager, ManyRelatedManager and
GenericRelatedObjectManager.
Added missing get_or_create to GenericRelatedObjectManager.
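A minimal sketch, assuming hypothetical Author/Book models with related_name='books':

```python
# update_or_create is now available on the reverse-FK related manager:
book, created = author.books.update_or_create(
    title='New title',
    defaults={'pages': 300},
)
```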
Validates that related_name is a valid Python identifier or ends with a '+',
and that it is not a keyword. Without this check, invalid values passed
silently, leading to unpredictable problems.
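A sketch of what the check catches, with hypothetical models:

```python
from django.db import models

class Author(models.Model):
    pass

class Book(models.Model):
    # Rejected: contains a space, so it is not a valid Python identifier.
    author = models.ForeignKey(Author, on_delete=models.CASCADE, related_name='book list')
    # Rejected: 'class' is a Python keyword.
    editor = models.ForeignKey(Author, on_delete=models.CASCADE, related_name='class')
    # Allowed: ends with '+', which disables the reverse accessor.
    proofreader = models.ForeignKey(Author, on_delete=models.CASCADE, related_name='+')
```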
Thanks Konrad Świat for the initial work.
Fixed the get_queryset() methods of SingleRelatedObjectDescriptor
and ReverseSingleRelatedObjectDescriptor so they actually return QuerySet
instances.
Also ensured that SingleRelatedObjectDescriptor.get_queryset() accounts
for use_for_related_fields=True.
This cleanup lays the groundwork for #23533.
Thanks Anssi Kääriäinen for the review.
Complete rework of translating data values from the database.
Deprecation of SubfieldBase, removal of resolve_columns and
convert_values in favour of a more general converter-based approach and
a public API, Field.from_db_value(). Now works seamlessly with aggregation,
.values() and raw queries.
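A minimal sketch of the new public hook (the 1.8-era signature including `context` is assumed here); the conversion now also runs for aggregation, .values() and raw queries.

```python
from django.db import models

class CommaSeparatedIntsField(models.CharField):
    """Illustrative custom field storing a list of ints as a comma-separated string."""

    def from_db_value(self, value, expression, connection, context):
        # Called whenever data is loaded from the database; no SubfieldBase needed.
        if value is None or value == '':
            return []
        return [int(part) for part in value.split(',')]

    def to_python(self, value):
        if isinstance(value, list) or value is None:
            return value
        if value == '':
            return []
        return [int(part) for part in value.split(',')]

    def get_prep_value(self, value):
        if value is None:
            return value
        return ','.join(str(v) for v in value)
```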
Thanks to akaariai in particular for extensive advice and inspiration,
also to shaib, manfre and timograham for their reviews.