Compare commits: main ... stable/1.4 (372 commits)

Author | SHA1 | Date
---|---|---
Ramon Moraes | 5864e1f8e9 | |
Tim Graham | 018efef59a | |
Tim Graham | 9ff23eb7cc | |
Tim Graham | 575f59f9bc | |
Tim Graham | 8b0d63914f | |
Tim Graham | 3b324970e3 | |
Tim Graham | 3df6495c12 | |
Tim Graham | 622a11513e | |
Tim Graham | 1ba1cdce7d | |
Carl Meyer | 2e47f3e401 | |
Tim Graham | c570a5ec3e | |
Tim Graham | 91a395fa80 | |
Tim Graham | 664ad1252c | |
Tim Graham | b2a7878c10 | |
Tim Graham | 5388692144 | |
Tim Graham | 2342693b31 | |
Tim Graham | 3b20558beb | |
Carl Meyer | 785e57e296 | |
Tim Graham | e60557c249 | |
Tim Graham | 4376d6ef7b | |
Tim Graham | 7dd4c5221a | |
Benjamin Richter | 1e39d0f628 | |
Tim Graham | 9435474068 | |
Tim Graham | 99e6ac77f2 | |
Tim Graham | 4296a1da8b | |
Tim Graham | bd9dcd226b | |
Tim Graham | 88b7957b34 | |
Tim Graham | d020da6646 | |
Tim Graham | 4c241f1b71 | |
Carl Meyer | 4f6fffc1dc | |
Tim Graham | 113a8980f4 | |
Tim Graham | 2fd8054fda | |
Tim Graham | 032ffade8a | |
Tim Graham | 52136afda4 | |
Tim Graham | 592187e11b | |
Tim Graham | 35dc639cd6 | |
Tim Graham | a25c444bc7 | |
Simon Charette | 5940da16af | |
Tim Graham | c83b024b37 | |
Tim Graham | a1dcd82b28 | |
Tim Graham | 486b6ca3bc | |
James Bennett | 151d6dbf9c | |
Tim Graham | a92e386e26 | |
Tim Graham | 643374bcf5 | |
Emmanuelle Delescolle | f58392d8d8 | |
Tim Graham | df657a7682 | |
Joseph Dougherty | 3132edae41 | |
Claude Paroz | ba2be27613 | |
Simon Charette | 065caafa70 | |
Tim Graham | 78085844a7 | |
Tim Graham | 89157fe11f | |
James Bennett | 0517f498cd | |
Simon Charette | 4685026840 | |
Tim Graham | 8adc56ca78 | |
Tim Graham | 27c682ffa0 | |
Tim Graham | e484df76b6 | |
James Bennett | 4fce0193d2 | |
Simon Charette | 027bd34864 | |
Preston Holmes | c9e3b9949c | |
Tim Graham | 30042d475b | |
Florian Apolloner | c2fe73133b | |
Tim Graham | 4d5e972a2c | |
Tim Graham | 88cb7aa6aa | |
Tim Graham | 399052d224 | |
Tim Graham | d23d19c15e | |
Erik Romijn | bc03817b42 | |
Tim Graham | 778a555342 | |
Ramiro Morales | aa9c45c2e4 | |
Tim Graham | b44519072e | |
Tim Graham | d29f3b9e87 | |
Tim Graham | d39fcff11a | |
Jacob Kaplan-Moss | 37d6821d35 | |
Jacob Kaplan-Moss | 53b98b5a7c | |
Jacob Kaplan-Moss | fe5b3e36a2 | |
Tim Graham | 7feb54bbae | |
Aymeric Augustin | 28e23306aa | |
Tim Graham | e1812617cf | |
Tim Graham | 48a4729cd7 | |
James Bennett | b1b680c8fe | |
Tim Graham | b91c385e32 | |
Tim Graham | 1edb163592 | |
James Bennett | 194159ba44 | |
Erik Romijn | 8010908313 | |
Erik Romijn | aa80f498de | |
Aymeric Augustin | 1170f285dd | |
Tim Graham | c1a8c420fe | |
Matt Lauber | ca3927dfb9 | |
Tim Graham | 83420e70ef | |
Tim Graham | f2a9f71565 | |
Claude Paroz | f108b1f7d7 | |
Tim Graham | b8713ee69a | |
Tim Graham | 74181c0a2c | |
Tim Graham | 257f8528b7 | |
Tim Graham | 85057522bc | |
Jacob Kaplan-Moss | 03d9b9ea0a | |
Tim Graham | 1036e3ec7c | |
Luke Plant | 2c1d92bc64 | |
Ben Spaulding | 474e7dd6d0 | |
Aymeric Augustin | 2d4f399ad4 | |
Alasdair Nicol | 23126866ec | |
Aymeric Augustin | 8e8584f959 | |
Baptiste Mispelon | 46755c50ee | |
Tim Graham | c5d071f85a | |
James Bennett | 30eb916bdb | |
Florian Apolloner | 848a759474 | |
Aymeric Augustin | b149d1fcd6 | |
Loic Bistuer | 7984b58e78 | |
Paolo Melchiorre | d491702ed7 | |
Tim Graham | 11b750b031 | |
James Bennett | 8f36d1fd95 | |
Tim Graham | 3a46f621fe | |
Shai Berger | 6de3726423 | |
Tim Graham | ead7c496a4 | |
Florian Apolloner | c4f29c91f9 | |
Aymeric Augustin | ea04c81d37 | |
Anssi Kääriäinen | 037ec1054c | |
Florian Apolloner | e2403db95a | |
Florian Apolloner | 0317edf0c7 | |
Tim Graham | ca77e38d24 | |
Tim Graham | efee30e6b0 | |
Claude Paroz | 629813a804 | |
Russell Keith-Magee | 6903d1690a | |
James Bennett | 3ffc7b52f8 | |
Russell Keith-Magee | 3f3d887a68 | |
Tim Graham | 75d2bcda10 | |
Tim Graham | cca302cde6 | |
Florian Apolloner | 434d122a74 | |
Tim Graham | fba6af5a1e | |
Loic Bistuer | 3203f684e8 | |
James Bennett | 701c1a11bc | |
Tim Graham | d1dc8a0d00 | |
Tim Graham | 87d2750b39 | |
Садовский Николай | 9ab7ed9b72 | |
Shai Berger | 7826824aef | |
Shai Berger | d9dc98159d | |
Luke Plant | d5da495a2e | |
Anssi Kääriäinen | bf611f14ec | |
Florian Apolloner | 08e5fcb3e6 | |
Jacob Kaplan-Moss | 0d4ef66f7c | |
Tim Graham | d77ce64fe8 | |
Jacob Kaplan-Moss | 506913cdd8 | |
Tim Graham | e61e20e497 | |
Jacob Kaplan-Moss | 30e17be1f6 | |
Jacob Kaplan-Moss | ec67af0bd6 | |
Tim Graham | b50be6857c | |
Tim Graham | 8af0b1afd2 | |
SusanTan | ed6ec47ff7 | |
mark hellewell | f3a961f009 | |
Brenton Cleeland | eda39fe704 | |
Matt Deacalion Stevens | dfe36f10df | |
Tim Graham | 6b4b18e7e2 | |
Tim Graham | 288d70fccc | |
Tim Graham | e8971345b4 | |
Tim Graham | 7b7592cafa | |
Baptiste Mispelon | 165cc1dc2f | |
Aymeric Augustin | e2b86571bf | |
Aymeric Augustin | e3b6fed320 | |
Tim Graham | c97cc85b74 | |
Gavin Wahl | 9b5fe02215 | |
Tim Graham | 227d7f63e4 | |
Tim Graham | 1deeda5785 | |
Alasdair Nicol | e149d8ebf0 | |
Wilfred Hughes | 528345069d | |
Alex Gaynor | 6297673efd | |
Tim Graham | fbac080691 | |
Nimesh Ghelani | d2b8834839 | |
Carl Meyer | 4c6fb23dd4 | |
Donald Stufft | 41af26dd53 | |
Donald Stufft | 843034a8d6 | |
Claude Paroz | 577a27a9fc | |
Aymeric Augustin | 97a67b26f3 | |
Tim Graham | 52bac4ede1 | |
Tim Graham | db1e8bdc33 | |
Preston Holmes | 0f555f813b | |
Anssi Kääriäinen | 3872bc51c9 | |
James Bennett | 67a937c2c2 | |
Carl Meyer | 3adfc3f97d | |
Carl Meyer | 4cdfb24c98 | |
Carl Meyer | 5d1791ffd2 | |
James Bennett | f61f800c29 | |
Carl Meyer | 62d5338bf2 | |
Aymeric Augustin | 0cc350a896 | |
Carl Meyer | 0e7861aec7 | |
Carl Meyer | 1c60d07ba2 | |
Carl Meyer | 9936fdb11d | |
Tim Graham | 57b62a74cb | |
Tim Graham | 83e512fa6e | |
Alex Hunley | 3d6388941d | |
Tim Graham | 9eb7d59665 | |
Anssi Kääriäinen | dec7dd99f0 | |
Claude Paroz | b4fb448f83 | |
Anssi Kääriäinen | 209f174e58 | |
Anssi Kääriäinen | 9918b3f502 | |
Anssi Kääriäinen | 498a5de07b | |
Tim Graham | 056b2b5f65 | |
Claude Paroz | ec93ecdd10 | |
Claude Paroz | 3610d11ba0 | |
Claude Paroz | 6bd3896fcb | |
Tim Graham | 89ba1b27b4 | |
Tim Graham | c26541f5cb | |
Tim Graham | c4a9e5bd8d | |
Ramiro Morales | 6474105107 | |
Alex Gaynor | 8ab2aceb65 | |
Florian Apolloner | f2530dcb17 | |
James Bennett | 1f0af3c529 | |
Florian Apolloner | 319627c184 | |
Florian Apolloner | b2ae0a63ae | |
Julien Phalip | 8c9a8fd5c4 | |
Sebastián Magrí | c72172244e | |
Anssi Kääriäinen | 3e4058be9f | |
Aymeric Augustin | 046300c43b | |
Anssi Kääriäinen | c7dcb1d808 | |
Tim Graham | 9ee9a7265a | |
Preston Holmes | 19710955e4 | |
Luke Plant | 9003d6fece | |
Tim Graham | 06c14a63a2 | |
Anssi Kääriäinen | 25e041f270 | |
Nicolas Ippolito | fdb855e7b2 | |
Aymeric Augustin | f8c005b4ec | |
Aymeric Augustin | 2733253633 | |
Claude Paroz | ad2d57a2cc | |
Anssi Kääriäinen | 37c87b785d | |
Tim Graham | baf1f1dcde | |
Carl Meyer | ce168bb899 | |
Preston Holmes | e86e4ce0bd | |
Tim Graham | 6c1c490f64 | |
Tim Graham | 13bbe9161d | |
Tim Graham | e7685b87c1 | |
Tim Graham | 700717db1f | |
Tim Graham | fd90a90633 | |
Preston Holmes | 773a29295a | |
James Bennett | 8c46ead92b | |
James Bennett | 0f54fed0b6 | |
Preston Holmes | 58806ce153 | |
Preston Holmes | 92d3430f12 | |
Tim Graham | 73991b0b32 | |
Tim Graham | 33d11463a0 | |
Tim Graham | 6ebb6f9188 | |
Tim Graham | 81020708ea | |
Julien Phalip | cc0478606a | |
Claude Paroz | 4cdc416d03 | |
Tim Graham | d2891d1c07 | |
Tim Graham | e2dea54efe | |
Tim Graham | 8139a7990a | |
Tim Graham | a1d21c0877 | |
Tim Graham | cf17d5e267 | |
Tim Graham | 3ac70a5907 | |
Tim Graham | 1be0515fe9 | |
Tim Graham | c06b724a00 | |
Claude Paroz | 0636c9583f | |
Tim Graham | a35d7fd1e1 | |
Tim Graham | 8868a067e0 | |
Tim Graham | b1462e0a36 | |
Tim Graham | cf482d6e2a | |
Anssi Kääriäinen | 4dba4ed548 | |
Anssi Kääriäinen | 1f537335d9 | |
Tim Graham | bd514f28e4 | |
Tim Graham | 1189dca471 | |
Tim Graham | 57cdbf3bf8 | |
Tim Graham | 3a64adef61 | |
Julien Phalip | 336dfc3413 | |
Tim Graham | 421ce44e8b | |
Tim Graham | 18d88a169f | |
Tim Graham | 81c77d24ef | |
Tim Graham | 1b5b8b874f | |
Tim Graham | 7c6630920e | |
Tim Graham | cd5181f84c | |
Tim Graham | b0e2cb8e47 | |
Tim Graham | 7e8483e70b | |
Aymeric Augustin | 376a18993b | |
Tim Graham | 53f533f864 | |
Anssi Kääriäinen | 2326860851 | |
Tim Graham | f0c469c7be | |
Tim Graham | c9f1a13f87 | |
Claude Paroz | 92f7af3c36 | |
Tim Graham | c2f1aa5a3c | |
Tim Graham | c088a42670 | |
Tim Graham | c274a9cbd0 | |
Tim Graham | 90fee02fee | |
Claude Paroz | f6159d426b | |
Tim Graham | df0c1055cd | |
Claude Paroz | b8340d26e4 | |
Tim Graham | fa8a09fdc5 | |
Tim Graham | f6351851b6 | |
Tim Graham | 81bfe428e1 | |
Tim Graham | 4b8c6c4056 | |
Tim Graham | 27c2ccc1ea | |
Tim Graham | 42aee6ffe5 | |
Tim Graham | b05d2f51b8 | |
Jeremy Cowgar | eaa6e4e2d1 | |
Tim Graham | 232a308044 | |
Tim Saylor | 03e79c3386 | |
Tim Graham | e4b7e7d86d | |
Tim Graham | 01b0231717 | |
Raphaël Hertzog | 57d9ccc4aa | |
Tim Graham | 3264894ee0 | |
Tim Graham | fba0149e16 | |
Tim Graham | 6536f7597b | |
Tim Graham | df8a2bf4cb | |
Tim Graham | c54034a2ad | |
Claude Paroz | 49f9bb271d | |
Tim Graham | 1d280026c3 | |
Tim Graham | c8e681f624 | |
Tim Graham | e9f458133c | |
Tim Graham | d1ae3c899f | |
Tim Graham | 445b9663e7 | |
Simon Meers | 68fd7f56e1 | |
Tim Graham | a276bdebb2 | |
Tim Graham | 751a34e4b3 | |
Tim Graham | fa6577f5b2 | |
James Bennett | 28a4d039a2 | |
Florian Apolloner | e34685034b | |
Florian Apolloner | c14f325c4e | |
Florian Apolloner | da33d67181 | |
Tim Graham | 94e91f75b9 | |
Florian Apolloner | 498bf5c26c | |
Ramiro Morales | c2ff027861 | |
Kevin London | c6d06a9453 | |
Tim Graham | 8ba78a0daf | |
Tim Graham | dcede04715 | |
Julien Phalip | 9a2ca4266a | |
Claude Paroz | fd88fe657b | |
Aymeric Augustin | f1e416566a | |
Tim Graham | f5db3bddb3 | |
Tim Graham | c5e35afbcc | |
Tim Graham | 8bea1a7e4e | |
Tim Graham | 32bd77d392 | |
Tim Graham | fea5e0b80f | |
Raúl Cumplido | 342e8a6246 | |
Tim Graham | a89e76d151 | |
Claude Paroz | d92c38a281 | |
Tim Graham | 9014b138e6 | |
Tim Graham | 3631a028e2 | |
Luke Plant | ff6ee5f06c | |
Jacob Kaplan-Moss | 45d43317b7 | |
Luke Plant | 0a8a6b92b2 | |
Luke Plant | 3bd937aec2 | |
Karen Tracey | 03f1d69f1e | |
Florian Apolloner | 1c13cc023f | |
Julien Phalip | 4d2fdd4185 | |
Michael Newman | 0f69a16785 | |
Adrian Holovaty | d3fa8d92ea | |
Claude Paroz | 6bb85d98b0 | |
Claude Paroz | 589af4971e | |
Jannis Leidel | 35423f6fb1 | |
Claude Paroz | ffe620f203 | |
Aymeric Augustin | a3c8201b77 | |
Claude Paroz | 521fe472e5 | |
Ramiro Morales | 839a71b0a5 | |
Claude Paroz | 143305126b | |
Claude Paroz | 64bbf5187c | |
Claude Paroz | 2fa8b3f143 | |
Claude Paroz | 3f77b84489 | |
Claude Paroz | ee43524e22 | |
Claude Paroz | 8ed9e9074c | |
Aymeric Augustin | 01dfe35b38 | |
Claude Paroz | 8adfdf08de | |
Julien Phalip | a6ba67ffd1 | |
Ramiro Morales | 9a3e9c27c2 | |
Claude Paroz | 61b13444c5 | |
Claude Paroz | 456d4db251 | |
Julien Phalip | aafa73db54 | |
Aymeric Augustin | 13822974dd | |
Claude Paroz | 35124ae3e2 | |
Claude Paroz | 814385321b | |
Claude Paroz | 515b3b85ed | |
Claude Paroz | 6c5933d175 | |
Claude Paroz | 277661c2af | |
Claude Paroz | 37c0e10e8c | |
Claude Paroz | ec2119e194 | |
Claude Paroz | a815fd1652 | |
Aymeric Augustin | bbb2595f89 |

@@ -1,4 +1,13 @@
 *.egg-info
 *.pot
 *.py[co]
+__pycache__
 MANIFEST
+dist/
+docs/_build/
+docs/locale/
+node_modules/
+tests/coverage_html/
+tests/.coverage
+build/
+tests/report/

@@ -3,4 +3,13 @@ syntax:glob
 *.egg-info
 *.pot
 *.py[co]
+__pycache__
 MANIFEST
+dist/
+docs/_build/
+docs/locale/
+node_modules/
+tests/coverage_html/
+tests/.coverage
+build/
+tests/report/

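The ignore patterns above are shell-style globs; Python's `fnmatch` shows how an entry like `*.py[co]` matches both compiled-file extensions in one pattern:

```python
from fnmatch import fnmatch

# '*.egg-info' matches any directory/file with that suffix.
assert fnmatch("Django.egg-info", "*.egg-info")

# The character class [co] matches .pyc and .pyo but not .py itself.
assert fnmatch("module.pyc", "*.py[co]")
assert fnmatch("module.pyo", "*.py[co]")
assert not fnmatch("module.py", "*.py[co]")
```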
AUTHORS

@@ -453,6 +453,7 @@ answer newbie questions, and generally made Django that much better:
 Vinay Sajip <vinay_sajip@yahoo.co.uk>
 Bartolome Sanchez Salado <i42sasab@uco.es>
 Kadesarin Sanjek
+Tim Saylor <tim.saylor@gmail.com>
 Massimo Scamarcia <massimo.scamarcia@gmail.com>
 Paulo Scardine <paulo@scardine.com.br>
 David Schein

@@ -1,4 +1,4 @@
-VERSION = (1, 4, 0, 'final', 0)
+VERSION = (1, 4, 23, 'alpha', 0)
 
 def get_version(version=None):
     """Derives a PEP386-compliant version number from VERSION."""

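The VERSION tuple above feeds `get_version()`; here is a simplified sketch of the PEP 386-style rendering. This is an illustration only, not Django's actual `get_version()`, which also derives `.devN` suffixes for pre-releases from VCS data:

```python
def simple_version(version):
    """Render a (major, minor, micro, releaselevel, serial) tuple.

    Simplified sketch of PEP 386-style version strings; hypothetical
    helper, not part of Django.
    """
    major, minor, micro, level, serial = version
    # Drop a zero micro version, as in '1.4' rather than '1.4.0'.
    parts = [major, minor] + ([micro] if micro else [])
    main = ".".join(str(p) for p in parts)
    if level == "final":
        return main
    # Pre-releases get a one-letter tag plus the serial number.
    return main + {"alpha": "a", "beta": "b", "rc": "c"}[level] + str(serial)

assert simple_version((1, 4, 0, "final", 0)) == "1.4"
assert simple_version((1, 4, 23, "alpha", 0)) == "1.4.23a0"
```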
@ -29,6 +29,10 @@ ADMINS = ()
|
|||
# * Receive x-headers
|
||||
INTERNAL_IPS = ()
|
||||
|
||||
# Hosts/domain names that are valid for this site.
|
||||
# "*" matches anything, ".example.com" matches example.com and all subdomains
|
||||
ALLOWED_HOSTS = ['*']
|
||||
|
||||
# Local time zone for this installation. All choices can be found here:
|
||||
# http://en.wikipedia.org/wiki/List_of_tz_zones_by_name (although not all
|
||||
# systems may support all possibilities). When USE_TZ is True, this is
|
||||
|
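The matching rules described in the ALLOWED_HOSTS comment can be sketched in a few lines. This is a simplified illustration of the documented semantics, not Django's `validate_host()` implementation:

```python
def host_matches(host, pattern):
    """Simplified ALLOWED_HOSTS-style matching (illustrative only):
    '*' matches anything; a leading '.' matches the domain and all of
    its subdomains; anything else is an exact match."""
    host = host.lower()
    pattern = pattern.lower()
    if pattern == "*":
        return True
    if pattern.startswith("."):
        return host.endswith(pattern) or host == pattern[1:]
    return host == pattern

assert host_matches("anything.at.all", "*")
assert host_matches("www.example.com", ".example.com")
assert host_matches("example.com", ".example.com")
assert not host_matches("evil.com", ".example.com")
```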
@@ -512,6 +516,7 @@ PASSWORD_HASHERS = (
     'django.contrib.auth.hashers.BCryptPasswordHasher',
     'django.contrib.auth.hashers.SHA1PasswordHasher',
     'django.contrib.auth.hashers.MD5PasswordHasher',
+    'django.contrib.auth.hashers.UnsaltedSHA1PasswordHasher',
     'django.contrib.auth.hashers.UnsaltedMD5PasswordHasher',
     'django.contrib.auth.hashers.CryptPasswordHasher',
 )

@@ -20,13 +20,14 @@ DATABASES = {
     }
 }
 
+# Hosts/domain names that are valid for this site; required if DEBUG is False
+# See https://docs.djangoproject.com/en/1.4/ref/settings/#allowed-hosts
+ALLOWED_HOSTS = []
+
 # Local time zone for this installation. Choices can be found here:
 # http://en.wikipedia.org/wiki/List_of_tz_zones_by_name
 # although not all choices may be available on all operating systems.
-# On Unix systems, a value of None will cause Django to use the same
-# timezone as the operating system.
-# If running in a Windows environment this must be set to the same as your
-# system time zone.
+# In a Windows environment this must be set to your system time zone.
 TIME_ZONE = 'America/Chicago'
 
 # Language code for this installation. All choices can be found here:

@@ -0,0 +1,6 @@
+from django.core.exceptions import SuspiciousOperation
+
+
+class DisallowedModelAdminToField(SuspiciousOperation):
+    """Invalid to_field was passed to admin view via URL query string"""
+    pass

@@ -8,13 +8,13 @@ certain test -- e.g. being a DateField or ForeignKey.
 import datetime
 
 from django.db import models
-from django.core.exceptions import ImproperlyConfigured
-from django.utils.encoding import smart_unicode
+from django.core.exceptions import ImproperlyConfigured, ValidationError
+from django.utils.encoding import smart_unicode, force_unicode
 from django.utils.translation import ugettext_lazy as _
 from django.utils import timezone
 
 from django.contrib.admin.util import (get_model_from_relation,
     reverse_field_path, get_limit_choices_to_from_path, prepare_lookup_value)
+from django.contrib.admin.options import IncorrectLookupParameters
 
 class ListFilter(object):
     title = None  # Human-readable title to appear in the right sidebar.

@@ -102,7 +102,7 @@ class SimpleListFilter(ListFilter):
         }
         for lookup, title in self.lookup_choices:
             yield {
-                'selected': self.value() == lookup,
+                'selected': self.value() == force_unicode(lookup),
                 'query_string': cl.get_query_string({
                     self.parameter_name: lookup,
                 }, []),

@@ -129,7 +129,10 @@ class FieldListFilter(ListFilter):
         return True
 
     def queryset(self, request, queryset):
-        return queryset.filter(**self.used_parameters)
+        try:
+            return queryset.filter(**self.used_parameters)
+        except ValidationError, e:
+            raise IncorrectLookupParameters(e)
 
     @classmethod
     def register(cls, test, list_filter_class, take_priority=False):

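The try/except added to `FieldListFilter.queryset()` is the common translate-the-exception pattern: a low-level validation failure is re-raised as the one exception type the caller already handles. A self-contained sketch with stand-in exception classes (not the Django classes themselves):

```python
class ValidationError(Exception):
    """Stand-in for django.core.exceptions.ValidationError."""

class IncorrectLookupParameters(Exception):
    """Stand-in for the admin's IncorrectLookupParameters."""

def filter_queryset(apply_filter):
    # Invalid user-supplied lookup values surface as one well-known
    # exception type that the changelist view knows how to handle.
    try:
        return apply_filter()
    except ValidationError as exc:
        raise IncorrectLookupParameters(exc)

def bad_filter():
    raise ValidationError("invalid date value")

caught = False
try:
    filter_queryset(bad_filter)
except IncorrectLookupParameters:
    caught = True
assert caught
assert filter_queryset(lambda: [1, 2]) == [1, 2]
```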
@@ -155,7 +158,10 @@ class FieldListFilter(ListFilter):
 class RelatedFieldListFilter(FieldListFilter):
     def __init__(self, field, request, params, model, model_admin, field_path):
         other_model = get_model_from_relation(field)
-        rel_name = other_model._meta.pk.name
+        if hasattr(field, 'rel'):
+            rel_name = field.rel.get_related_field().name
+        else:
+            rel_name = other_model._meta.pk.name
         self.lookup_kwarg = '%s__%s__exact' % (field_path, rel_name)
         self.lookup_kwarg_isnull = '%s__isnull' % field_path
         self.lookup_val = request.GET.get(self.lookup_kwarg, None)

@@ -299,7 +305,7 @@ class DateFieldListFilter(FieldListFilter):
         else:       # field is a models.DateField
             today = now.date()
             tomorrow = today + datetime.timedelta(days=1)
-
+
         self.lookup_kwarg_since = '%s__gte' % field_path
         self.lookup_kwarg_until = '%s__lt' % field_path
         self.links = (

@@ -245,7 +245,7 @@ class BaseModelAdmin(object):
         # if foo has been specifically included in the lookup list; so
         # drop __id if it is the last part. However, first we need to find
         # the pk attribute name.
-        pk_attr_name = None
+        rel_name = None
         for part in parts[:-1]:
             try:
                 field, _, _, _ = model._meta.get_field_by_name(part)

@@ -255,13 +255,13 @@ class BaseModelAdmin(object):
                 return True
             if hasattr(field, 'rel'):
                 model = field.rel.to
-                pk_attr_name = model._meta.pk.name
+                rel_name = field.rel.get_related_field().name
             elif isinstance(field, RelatedObject):
                 model = field.model
-                pk_attr_name = model._meta.pk.name
+                rel_name = model._meta.pk.name
             else:
-                pk_attr_name = None
-        if pk_attr_name and len(parts) > 1 and parts[-1] == pk_attr_name:
+                rel_name = None
+        if rel_name and len(parts) > 1 and parts[-1] == rel_name:
             parts.pop()
 
         if len(parts) == 1:

@@ -269,6 +269,39 @@ class BaseModelAdmin(object):
         clean_lookup = LOOKUP_SEP.join(parts)
         return clean_lookup in self.list_filter or clean_lookup == self.date_hierarchy
 
+    def to_field_allowed(self, request, to_field):
+        """
+        Returns True if the model associated with this admin should be
+        allowed to be referenced by the specified field.
+        """
+        opts = self.model._meta
+
+        try:
+            field = opts.get_field(to_field)
+        except FieldDoesNotExist:
+            return False
+
+        # Always allow referencing the primary key since it's already possible
+        # to get this information from the change view URL.
+        if field.primary_key:
+            return True
+
+        # Make sure at least one of the models registered for this site
+        # references this field through a FK or a M2M relationship.
+        registered_models = set()
+        for model, admin in self.admin_site._registry.items():
+            registered_models.add(model)
+            for inline in admin.inlines:
+                registered_models.add(inline.model)
+
+        for related_object in opts.get_all_related_objects(include_hidden=True):
+            related_model = related_object.model
+            if (any(issubclass(model, related_model) for model in registered_models) and
+                    related_object.field.rel.get_related_field() == field):
+                return True
+
+        return False
+
     def has_add_permission(self, request):
         """
         Returns True if the given request has permission to add an object.

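The registry walk in `to_field_allowed()` boils down to a subclass test over two sets; a toy illustration with plain classes standing in for models (all names here are hypothetical, not Django APIs):

```python
class Author:
    pass

class Book:
    pass  # imagine Book.author is a ForeignKey targeting the Author field

class SignedBook(Book):
    pass  # a subclass registered with the admin instead of Book itself

# Models actually registered with this (hypothetical) admin site.
registered_models = {SignedBook}

# Models whose FK/M2M points at the field being checked.
referencing_models = {Book}

def field_is_referenced():
    # Mirrors the issubclass() test above: a registered subclass of a
    # referencing model is enough to allow the to_field reference.
    return any(
        issubclass(model, related)
        for related in referencing_models
        for model in registered_models
    )

assert field_is_referenced()
```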
@@ -1317,15 +1350,21 @@ class ModelAdmin(BaseModelAdmin):
     def history_view(self, request, object_id, extra_context=None):
         "The 'history' admin view for this model."
         from django.contrib.admin.models import LogEntry
+        # First check if the user can see this history.
         model = self.model
+        obj = get_object_or_404(model, pk=unquote(object_id))
+
+        if not self.has_change_permission(request, obj):
+            raise PermissionDenied
+
+        # Then get the history for this object.
         opts = model._meta
         app_label = opts.app_label
         action_list = LogEntry.objects.filter(
             object_id = object_id,
             content_type__id__exact = ContentType.objects.get_for_model(model).id
         ).select_related().order_by('action_time')
-        # If no history was found, see whether this object even exists.
-        obj = get_object_or_404(model, pk=unquote(object_id))
 
         context = {
             'title': _('Change history: %s') % force_unicode(obj),
             'action_list': action_list,

@@ -41,7 +41,8 @@
     text-align: left;
 }
 
-.selector .selector-filter label {
+.selector .selector-filter label,
+.inline-group .aligned .selector .selector-filter label {
     width: 16px;
     padding: 2px;
 }

@@ -25,9 +25,9 @@ class AdminSeleniumWebDriverTestCase(LiveServerTestCase):
 
     @classmethod
     def tearDownClass(cls):
-        super(AdminSeleniumWebDriverTestCase, cls).tearDownClass()
         if hasattr(cls, 'selenium'):
             cls.selenium.quit()
+        super(AdminSeleniumWebDriverTestCase, cls).tearDownClass()
 
     def wait_until(self, callback, timeout=10):
         """

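The reordering in `tearDownClass()` releases the subclass's resource (the browser) before the superclass tears down the live server that resource may still be talking to; a generic sketch of the ordering:

```python
events = []

class LiveServerCaseSketch:
    @classmethod
    def tearDownClass(cls):
        # Stand-in for LiveServerTestCase shutting down the test server.
        events.append("server stopped")

class SeleniumCaseSketch(LiveServerCaseSketch):
    @classmethod
    def tearDownClass(cls):
        # Quit the browser *before* the superclass stops the server.
        events.append("browser quit")
        super().tearDownClass()

SeleniumCaseSketch.tearDownClass()
assert events == ["browser quit", "server stopped"]
```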
@@ -49,6 +49,20 @@ class AdminSeleniumWebDriverTestCase(LiveServerTestCase):
             timeout
         )
 
+    def wait_page_loaded(self):
+        """
+        Block until page has started to load.
+        """
+        from selenium.common.exceptions import TimeoutException
+        try:
+            # Wait for the next page to be loaded
+            self.wait_loaded_tag('body')
+        except TimeoutException:
+            # IE7 occasionally returns an error "Internet Explorer cannot
+            # display the webpage" and doesn't load the next page. We just
+            # ignore it.
+            pass
+
     def admin_login(self, username, password, login_url='/admin/'):
         """
         Helper function to log into the admin.

@@ -61,8 +75,7 @@ class AdminSeleniumWebDriverTestCase(LiveServerTestCase):
         login_text = _('Log in')
         self.selenium.find_element_by_xpath(
             '//input[@value="%s"]' % login_text).click()
-        # Wait for the next page to be loaded.
-        self.wait_loaded_tag('body')
+        self.wait_page_loaded()
 
     def get_css_value(self, selector, attribute):
         """

@@ -102,4 +115,4 @@ class AdminSeleniumWebDriverTestCase(LiveServerTestCase):
         `klass`.
         """
         return (self.selenium.find_element_by_css_selector(selector)
-            .get_attribute('class').find(klass) != -1)
+                .get_attribute('class').find(klass) != -1)

@@ -10,6 +10,7 @@ from django.utils.translation import ugettext, ugettext_lazy
 from django.utils.http import urlencode
 
 from django.contrib.admin import FieldListFilter
+from django.contrib.admin.exceptions import DisallowedModelAdminToField
 from django.contrib.admin.options import IncorrectLookupParameters
 from django.contrib.admin.util import (quote, get_fields_from_path,
     lookup_needs_distinct, prepare_lookup_value)

@@ -56,7 +57,10 @@ class ChangeList(object):
         self.page_num = 0
         self.show_all = ALL_VAR in request.GET
         self.is_popup = IS_POPUP_VAR in request.GET
-        self.to_field = request.GET.get(TO_FIELD_VAR)
+        to_field = request.GET.get(TO_FIELD_VAR)
+        if to_field and not model_admin.to_field_allowed(request, to_field):
+            raise DisallowedModelAdminToField("The field %s cannot be referenced." % to_field)
+        self.to_field = to_field
         self.params = dict(request.GET.items())
         if PAGE_VAR in self.params:
             del self.params[PAGE_VAR]

@@ -258,7 +262,7 @@ class ChangeList(object):
         if not (set(ordering) & set(['pk', '-pk', pk_name, '-' + pk_name])):
             # The two sets do not intersect, meaning the pk isn't present. So
             # we add it.
-            ordering.append('pk')
+            ordering.append('-pk')
 
         return ordering

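Appending `'-pk'` instead of `'pk'` serves the same purpose, a deterministic total order when the visible sort keys tie; the tie-breaking idea sketched with dicts standing in for rows:

```python
# Rows that compare equal on the visible sort key need a pk tie-breaker,
# otherwise their relative order is database-dependent.
rows = [
    {"pk": 1, "name": "x"},
    {"pk": 3, "name": "x"},
    {"pk": 2, "name": "x"},
]

# '-pk' (descending pk) as the final criterion makes the order fully
# deterministic: highest pk first among ties.
ordered = sorted(rows, key=lambda r: (r["name"], -r["pk"]))
assert [r["pk"] for r in ordered] == [3, 2, 1]
```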
@@ -17,6 +17,8 @@ from django.views.decorators.csrf import csrf_protect
 from django.views.decorators.debug import sensitive_post_parameters
 
 csrf_protect_m = method_decorator(csrf_protect)
+sensitive_post_parameters_m = method_decorator(sensitive_post_parameters())
+
 
 class GroupAdmin(admin.ModelAdmin):
     search_fields = ('name',)

@@ -83,7 +85,7 @@ class UserAdmin(admin.ModelAdmin):
                 self.admin_site.admin_view(self.user_change_password))
         ) + super(UserAdmin, self).get_urls()
 
-    @sensitive_post_parameters()
+    @sensitive_post_parameters_m
     @csrf_protect_m
     @transaction.commit_on_success
     def add_view(self, request, form_url='', extra_context=None):

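Wrapping `sensitive_post_parameters()` in `method_decorator()` is needed because the decorator is written for plain views whose first argument is the request, while a method receives `self` first. A simplified illustration of the adaptation (this is a sketch, not Django's `method_decorator` implementation):

```python
from functools import wraps

def log_request(view):
    # A decorator written for *plain* view functions whose first
    # positional argument is the request.
    @wraps(view)
    def wrapper(request, *args, **kwargs):
        calls.append(request)
        return view(request, *args, **kwargs)
    return wrapper

calls = []

def method_decorator_sketch(decorator):
    # Re-applies the function decorator after 'self' is bound, so the
    # decorator still sees the request as its first argument.
    def transform(method):
        @wraps(method)
        def inner(self, *args, **kwargs):
            def bound(*a, **kw):
                return method(self, *a, **kw)
            return decorator(bound)(*args, **kwargs)
        return inner
    return transform

class UserAdminSketch:
    @method_decorator_sketch(log_request)
    def add_view(self, request):
        return "handled %s" % request

assert UserAdminSketch().add_view("req-1") == "handled req-1"
assert calls == ["req-1"]
```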
@@ -113,7 +115,7 @@ class UserAdmin(admin.ModelAdmin):
         return super(UserAdmin, self).add_view(request, form_url,
                                                extra_context)
 
-    @sensitive_post_parameters()
+    @sensitive_post_parameters_m
     def user_change_password(self, request, id, form_url=''):
         if not self.has_change_permission(request):
             raise PermissionDenied

@@ -170,4 +172,3 @@ class UserAdmin(admin.ModelAdmin):
 
 admin.site.register(Group, GroupAdmin)
 admin.site.register(User, UserAdmin)
-

@@ -11,6 +11,11 @@ class PermLookupDict(object):
     def __getitem__(self, perm_name):
         return self.user.has_perm("%s.%s" % (self.module_name, perm_name))
 
+    def __iter__(self):
+        # To fix 'item in perms.someapp' and __getitem__ interaction we need to
+        # define __iter__. See #18979 for details.
+        raise TypeError("PermLookupDict is not iterable.")
+
     def __nonzero__(self):
         return self.user.has_module_perms(self.module_name)

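Defining `__iter__` to raise TypeError matters because, without it, the `in` operator falls back to probing `__getitem__` with successive integers, which can loop endlessly or succeed by accident; a minimal demonstration:

```python
class NoIter:
    def __getitem__(self, index):
        # With no __iter__/__contains__, 'in' probes this with 0, 1, 2, ...
        return False

class WithIter(NoIter):
    def __iter__(self):
        raise TypeError("WithIter is not iterable.")

# Without __iter__: 'in' silently probes __getitem__ and compares values.
assert False in NoIter()  # probe at index 0 returns False, which matches

# With __iter__ raising TypeError: membership tests fail loudly instead.
blocked = False
try:
    "anything" in WithIter()
except TypeError:
    blocked = True
assert blocked
```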
@@ -35,8 +35,14 @@ def check_password(password, encoded, setter=None, preferred='default'):
     password = smart_str(password)
     encoded = smart_str(encoded)
 
-    if len(encoded) == 32 and '$' not in encoded:
+    # Ancient versions of Django created plain MD5 passwords and accepted
+    # MD5 passwords with an empty salt.
+    if ((len(encoded) == 32 and '$' not in encoded) or
+            (len(encoded) == 37 and encoded.startswith('md5$$'))):
         hasher = get_hasher('unsalted_md5')
+    # Ancient versions of Django accepted SHA1 passwords with an empty salt.
+    elif len(encoded) == 46 and encoded.startswith('sha1$$'):
+        hasher = get_hasher('unsalted_sha1')
     else:
         algorithm = encoded.split('$', 1)[0]
         hasher = get_hasher(algorithm)

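The length and prefix tests above are pure string checks; they can be verified with `hashlib` alone, no Django required:

```python
import hashlib

password = b"secret"

# Plain unsalted MD5: 32 hex chars, no '$' separator anywhere.
plain_md5 = hashlib.md5(password).hexdigest()
assert len(plain_md5) == 32 and "$" not in plain_md5

# MD5 with an empty salt: 'md5$$' prefix plus the 32-char digest -> 37 chars.
empty_salt_md5 = "md5$$%s" % plain_md5
assert len(empty_salt_md5) == 37 and empty_salt_md5.startswith("md5$$")

# SHA1 with an empty salt: 'sha1$$' prefix plus the 40-char digest -> 46 chars.
empty_salt_sha1 = "sha1$$%s" % hashlib.sha1(password).hexdigest()
assert len(empty_salt_sha1) == 46 and empty_salt_sha1.startswith("sha1$$")
```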
@@ -329,14 +335,48 @@ class MD5PasswordHasher(BasePasswordHasher):
         ])
 
 
+class UnsaltedSHA1PasswordHasher(BasePasswordHasher):
+    """
+    Very insecure algorithm that you should *never* use; stores SHA1 hashes
+    with an empty salt.
+
+    This class is implemented because Django used to accept such password
+    hashes. Some older Django installs still have these values lingering
+    around so we need to handle and upgrade them properly.
+    """
+    algorithm = "unsalted_sha1"
+
+    def salt(self):
+        return ''
+
+    def encode(self, password, salt):
+        assert salt == ''
+        hash = hashlib.sha1(password).hexdigest()
+        return 'sha1$$%s' % hash
+
+    def verify(self, password, encoded):
+        encoded_2 = self.encode(password, '')
+        return constant_time_compare(encoded, encoded_2)
+
+    def safe_summary(self, encoded):
+        assert encoded.startswith('sha1$$')
+        hash = encoded[6:]
+        return SortedDict([
+            (_('algorithm'), self.algorithm),
+            (_('hash'), mask_hash(hash)),
+        ])
+
+
 class UnsaltedMD5PasswordHasher(BasePasswordHasher):
     """
-    I am an incredibly insecure algorithm you should *never* use;
-    stores unsalted MD5 hashes without the algorithm prefix.
+    Incredibly insecure algorithm that you should *never* use; stores unsalted
+    MD5 hashes without the algorithm prefix, also accepts MD5 hashes with an
+    empty salt.
 
-    This class is implemented because Django used to store passwords
-    this way. Some older Django installs still have these values
-    lingering around so we need to handle and upgrade them properly.
+    This class is implemented because Django used to store passwords this way
+    and to accept such password hashes. Some older Django installs still have
+    these values lingering around so we need to handle and upgrade them
+    properly.
     """
     algorithm = "unsalted_md5"
 

@@ -344,9 +384,12 @@ class UnsaltedMD5PasswordHasher(BasePasswordHasher):
         return ''
 
     def encode(self, password, salt):
+        assert salt == ''
         return hashlib.md5(password).hexdigest()
 
     def verify(self, password, encoded):
+        if len(encoded) == 37 and encoded.startswith('md5$$'):
+            encoded = encoded[5:]
         encoded_2 = self.encode(password, '')
         return constant_time_compare(encoded, encoded_2)

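The updated `verify()` accepts both the bare digest and the `'md5$$'`-prefixed empty-salt form; a rough Python 3 sketch, using `hmac.compare_digest` in place of Django's `constant_time_compare`:

```python
import hashlib
import hmac

def encode(password: bytes) -> str:
    # Bare unsalted MD5 hex digest, no algorithm prefix.
    return hashlib.md5(password).hexdigest()

def verify(password: bytes, encoded: str) -> bool:
    # Also accept the 'md5$$<digest>' form with an empty salt.
    if len(encoded) == 37 and encoded.startswith("md5$$"):
        encoded = encoded[5:]
    # Timing-safe comparison, as with Django's constant_time_compare.
    return hmac.compare_digest(encoded, encode(password))

assert verify(b"secret", encode(b"secret"))
assert verify(b"secret", "md5$$" + encode(b"secret"))
assert not verify(b"wrong", encode(b"secret"))
```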
@@ -1,4 +1,5 @@
 from django.contrib import auth
+from django.contrib.auth.backends import RemoteUserBackend
 from django.core.exceptions import ImproperlyConfigured
 from django.utils.functional import SimpleLazyObject

@@ -47,9 +48,11 @@ class RemoteUserMiddleware(object):
         try:
             username = request.META[self.header]
         except KeyError:
-            # If specified header doesn't exist then return (leaving
-            # request.user set to AnonymousUser by the
-            # AuthenticationMiddleware).
+            # If specified header doesn't exist then remove any existing
+            # authenticated remote-user, or return (leaving request.user set to
+            # AnonymousUser by the AuthenticationMiddleware).
+            if request.user.is_authenticated():
+                self._remove_invalid_user(request)
             return
         # If the user is already authenticated and that user is the user we are
         # getting passed in the headers, then the correct user is already

@@ -57,6 +60,11 @@ class RemoteUserMiddleware(object):
         if request.user.is_authenticated():
             if request.user.username == self.clean_username(username, request):
                 return
+            else:
+                # An authenticated user is associated with the request, but
+                # it does not match the authorized user in the header.
+                self._remove_invalid_user(request)
+
         # We are seeing this user for the first time in this session, attempt
         # to authenticate the user.
         user = auth.authenticate(remote_user=username)

@@ -78,3 +86,17 @@ class RemoteUserMiddleware(object):
         except AttributeError: # Backend has no clean_username method.
             pass
         return username
+
+    def _remove_invalid_user(self, request):
+        """
+        Removes the current authenticated user in the request which is invalid
+        but only if the user is authenticated via the RemoteUserBackend.
+        """
+        try:
+            stored_backend = auth.load_backend(request.session.get(auth.BACKEND_SESSION_KEY, ''))
+        except ImproperlyConfigured:
+            # backend failed to load
+            auth.logout(request)
+        else:
+            if isinstance(stored_backend, RemoteUserBackend):
+                auth.logout(request)

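The hardened middleware flow, removing a previously authenticated remote user when the header disappears or changes, can be sketched with stand-in objects (none of these names are Django APIs; they are hypothetical simplifications):

```python
class FakeRequest:
    # Hypothetical stand-in for an HTTP request; not a Django object.
    def __init__(self, headers, username=None):
        self.META = headers
        self.username = username  # None means anonymous

def process_request(request, header="REMOTE_USER"):
    """Sketch of the control flow above: a vanished or mismatched
    header logs out the previously authenticated remote user."""
    try:
        username = request.META[header]
    except KeyError:
        if request.username is not None:
            request.username = None  # _remove_invalid_user
        return
    if request.username is not None and request.username != username:
        request.username = None      # stale user: force re-authentication
    if request.username is None:
        request.username = username  # authenticate(remote_user=...)

r = FakeRequest({"REMOTE_USER": "alice"})
process_request(r)
assert r.username == "alice"

r.META = {}              # header disappears on a later request
process_request(r)
assert r.username is None  # previously authenticated user is removed
```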
@@ -2,12 +2,58 @@ import os

from django.conf import global_settings
from django.contrib.auth import authenticate
from django.contrib.auth.context_processors import PermWrapper, PermLookupDict
from django.db.models import Q
from django.template import context
from django.test import TestCase
from django.test.utils import override_settings


class MockUser(object):
    def has_module_perm(self, perm):
        if perm == 'mockapp.someapp':
            return True
        return False

    def has_perm(self, perm):
        if perm == 'someperm':
            return True
        return False


class PermWrapperTests(TestCase):
    """
    Test some details of the PermWrapper implementation.
    """
    class EQLimiterObject(object):
        """
        This object makes sure __eq__ will not be called endlessly.
        """
        def __init__(self):
            self.eq_calls = 0

        def __eq__(self, other):
            if self.eq_calls > 0:
                return True
            self.eq_calls += 1
            return False

    def test_permwrapper_in(self):
        """
        Test that 'something' in PermWrapper doesn't end up in an endless loop.
        """
        perms = PermWrapper(MockUser())
        def raises():
            self.EQLimiterObject() in perms
        self.assertRaises(TypeError, raises)

    def test_permlookupdict_in(self):
        pldict = PermLookupDict(MockUser(), 'mockapp')
        def raises():
            self.EQLimiterObject() in pldict
        self.assertRaises(TypeError, raises)


class AuthContextProcessorTests(TestCase):
    """
    Tests for the ``django.contrib.auth.context_processors.auth`` processor

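The `EQLimiterObject` above exists because a membership test (`x in seq`) falls back to equality comparisons against every element, so an object with a misbehaving `__eq__` can loop or match unexpectedly. A minimal standalone demonstration (the `CountingEq` class is hypothetical, not part of the test suite):

```python
class CountingEq(object):
    """Counts how often a membership test invokes __eq__ on this object."""
    def __init__(self):
        self.eq_calls = 0

    def __eq__(self, other):
        self.eq_calls += 1
        return False  # never equal, so `in` must try every element

probe = CountingEq()
# `in` on a plain sequence compares the probe against each element in turn;
# str.__eq__ returns NotImplemented, so the reflected CountingEq.__eq__ runs.
found = probe in ['add_perm', 'change_perm', 'delete_perm']
```

The fixed `PermWrapper` instead raises `TypeError` for non-string lookups, which is what the two `assertRaises` tests pin down.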
@@ -9,6 +9,7 @@ from django.test import TestCase
from django.test.utils import override_settings
from django.utils.encoding import force_unicode
from django.utils import translation
from django.utils.translation import ugettext as _


class UserCreationFormTest(TestCase):
@@ -333,6 +334,6 @@ class PasswordResetFormTest(TestCase):
        form = PasswordResetForm(data)
        self.assertFalse(form.is_valid())
        self.assertEqual(form["email"].errors,
-            [u"The user account associated with this e-mail address cannot reset the password."])
+            [_(u"The user account associated with this e-mail address cannot reset the password.")])

PasswordResetFormTest = override_settings(USE_TZ=False)(PasswordResetFormTest)

@@ -1,5 +1,5 @@
from django.conf.global_settings import PASSWORD_HASHERS as default_hashers
from django.contrib.auth.hashers import (is_password_usable,
    check_password, make_password, PBKDF2PasswordHasher, load_hashers,
    PBKDF2SHA1PasswordHasher, get_hasher, UNUSABLE_PASSWORD)
from django.utils import unittest
@@ -14,6 +14,10 @@ except ImportError:

try:
    import bcrypt
    # Django 1.4 works only with py-bcrypt, not with bcrypt. py-bcrypt has
    # '_bcrypt' attribute, bcrypt doesn't.
    if not hasattr(bcrypt, '_bcrypt'):
        bcrypt = None
except ImportError:
    bcrypt = None

@@ -31,7 +35,7 @@ class TestUtilsHashPass(unittest.TestCase):

    def test_pkbdf2(self):
        encoded = make_password('letmein', 'seasalt', 'pbkdf2_sha256')
        self.assertEqual(encoded,
            'pbkdf2_sha256$10000$seasalt$FQCNpiZpTb0zub+HBsH6TOwyRxJ19FwvjbweatNmK/Y=')
        self.assertTrue(is_password_usable(encoded))
        self.assertTrue(check_password(u'letmein', encoded))
@@ -39,7 +43,7 @@ class TestUtilsHashPass(unittest.TestCase):

    def test_sha1(self):
        encoded = make_password('letmein', 'seasalt', 'sha1')
        self.assertEqual(encoded,
            'sha1$seasalt$fec3530984afba6bade3347b7140d1a7da7da8c7')
        self.assertTrue(is_password_usable(encoded))
        self.assertTrue(check_password(u'letmein', encoded))
@@ -47,18 +51,33 @@ class TestUtilsHashPass(unittest.TestCase):

    def test_md5(self):
        encoded = make_password('letmein', 'seasalt', 'md5')
        self.assertEqual(encoded,
            'md5$seasalt$f5531bef9f3687d0ccf0f617f0e25573')
        self.assertTrue(is_password_usable(encoded))
        self.assertTrue(check_password(u'letmein', encoded))
        self.assertFalse(check_password('letmeinz', encoded))

    def test_unsalted_md5(self):
-        encoded = make_password('letmein', 'seasalt', 'unsalted_md5')
+        encoded = make_password('letmein', '', 'unsalted_md5')
        self.assertEqual(encoded, '0d107d09f5bbe40cade3de5c71e9e9b7')
        self.assertTrue(is_password_usable(encoded))
        self.assertTrue(check_password(u'letmein', encoded))
        self.assertFalse(check_password('letmeinz', encoded))
        # Alternate unsalted syntax
        alt_encoded = "md5$$%s" % encoded
        self.assertTrue(is_password_usable(alt_encoded))
        self.assertTrue(check_password(u'letmein', alt_encoded))
        self.assertFalse(check_password('letmeinz', alt_encoded))

    def test_unsalted_sha1(self):
        encoded = make_password('letmein', '', 'unsalted_sha1')
        self.assertEqual(encoded, 'sha1$$b7a875fc1ea228b9061041b7cec4bd3c52ab3ce3')
        self.assertTrue(is_password_usable(encoded))
        self.assertTrue(check_password('letmein', encoded))
        self.assertFalse(check_password('letmeinz', encoded))
        # Raw SHA1 isn't acceptable
        alt_encoded = encoded[6:]
        self.assertRaises(ValueError, check_password, 'letmein', alt_encoded)

    @skipUnless(crypt, "no crypt module to generate password.")
    def test_crypt(self):
@@ -93,14 +112,14 @@ class TestUtilsHashPass(unittest.TestCase):
    def test_low_level_pkbdf2(self):
        hasher = PBKDF2PasswordHasher()
        encoded = hasher.encode('letmein', 'seasalt')
        self.assertEqual(encoded,
            'pbkdf2_sha256$10000$seasalt$FQCNpiZpTb0zub+HBsH6TOwyRxJ19FwvjbweatNmK/Y=')
        self.assertTrue(hasher.verify('letmein', encoded))

    def test_low_level_pbkdf2_sha1(self):
        hasher = PBKDF2SHA1PasswordHasher()
        encoded = hasher.encode('letmein', 'seasalt')
        self.assertEqual(encoded,
            'pbkdf2_sha1$10000$seasalt$91JiNKgwADC8j2j86Ije/cc4vfQ=')
        self.assertTrue(hasher.verify('letmein', encoded))

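The expected strings in these tests follow Django's stored-hash layouts: bare hex for `unsalted_md5`, `sha1$<salt>$<hexdigest>` for salted SHA-1, and `pbkdf2_sha256$<iterations>$<salt>$<base64 digest>` for PBKDF2. A sketch reproducing the document's expected values with only `hashlib`, assuming (as Django does by design) that its PBKDF2 matches standard PBKDF2-HMAC:

```python
import base64
import hashlib

def unsalted_md5(password):
    # 'unsalted_md5' stores the bare hex digest; "md5$$<digest>" is an
    # accepted alternate spelling of the same hash.
    return hashlib.md5(password.encode()).hexdigest()

def salted_sha1(password, salt):
    # Salted SHA-1 stores "sha1$<salt>$<hexdigest of salt+password>".
    digest = hashlib.sha1((salt + password).encode()).hexdigest()
    return "sha1$%s$%s" % (salt, digest)

def pbkdf2_sha256(password, salt, iterations=10000):
    # PBKDF2 stores "pbkdf2_sha256$<iterations>$<salt>$<base64 raw digest>".
    raw = hashlib.pbkdf2_hmac('sha256', password.encode(), salt.encode(), iterations)
    return "pbkdf2_sha256$%d$%s$%s" % (iterations, salt, base64.b64encode(raw).decode())
```

The helper names here are illustrative; the real encoders live on Django's hasher classes (`make_password`, `PBKDF2PasswordHasher.encode`).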
@@ -95,6 +95,24 @@ class RemoteUserTest(TestCase):
        response = self.client.get('/remote_user/', REMOTE_USER=self.known_user)
        self.assertEqual(default_login, response.context['user'].last_login)

    def test_user_switch_forces_new_login(self):
        """
        Tests that if the username in the header changes between requests
        that the original user is logged out
        """
        User.objects.create(username='knownuser')
        # Known user authenticates
        response = self.client.get('/remote_user/',
                                   **{'REMOTE_USER': self.known_user})
        self.assertEqual(response.context['user'].username, 'knownuser')
        # During the session, the REMOTE_USER changes to a different user.
        response = self.client.get('/remote_user/',
                                   **{'REMOTE_USER': "newnewuser"})
        # Ensure that the current user is not the prior remote_user.
        # In backends that create a new user, username is "newnewuser";
        # in backends that do not create new users, it is '' (anonymous user).
        self.assertNotEqual(response.context['user'].username, 'knownuser')

    def tearDown(self):
        """Restores settings to avoid breaking other tests."""
        settings.MIDDLEWARE_CLASSES = self.curr_middleware

@@ -51,6 +51,7 @@ urlpatterns = urlpatterns + patterns('',
    (r'^logout/next_page/$', 'django.contrib.auth.views.logout', dict(next_page='/somewhere/')),
    (r'^remote_user/$', remote_user_auth_view),
    (r'^password_reset_from_email/$', 'django.contrib.auth.views.password_reset', dict(from_email='staffmember@example.com')),
    (r'^admin_password_reset/$', 'django.contrib.auth.views.password_reset', dict(is_admin_site=True)),
    (r'^login_required/$', login_required(password_reset)),
    (r'^login_required_login_url/$', login_required(password_reset, login_url='/somewhere/')),

@@ -7,6 +7,7 @@ from django.conf import settings
from django.contrib.sites.models import Site, RequestSite
from django.contrib.auth.models import User
from django.core import mail
from django.core.exceptions import SuspiciousOperation
from django.core.urlresolvers import reverse, NoReverseMatch
from django.http import QueryDict
from django.utils.encoding import force_unicode
@@ -106,6 +107,47 @@ class PasswordResetTest(AuthViewsTestCase):
        self.assertEqual(len(mail.outbox), 1)
        self.assertEqual("staffmember@example.com", mail.outbox[0].from_email)

    @override_settings(ALLOWED_HOSTS=['adminsite.com'])
    def test_admin_reset(self):
        "If the reset view is marked as being for admin, the HTTP_HOST header is used for a domain override."
        response = self.client.post('/admin_password_reset/',
            {'email': 'staffmember@example.com'},
            HTTP_HOST='adminsite.com'
        )
        self.assertEqual(response.status_code, 302)
        self.assertEqual(len(mail.outbox), 1)
        self.assertTrue("http://adminsite.com" in mail.outbox[0].body)
        self.assertEqual(settings.DEFAULT_FROM_EMAIL, mail.outbox[0].from_email)

    # Skip any 500 handler action (like sending more mail...)
    @override_settings(DEBUG_PROPAGATE_EXCEPTIONS=True)
    def test_poisoned_http_host(self):
        "Poisoned HTTP_HOST headers can't be used for reset emails"
        # This attack is based on the way browsers handle URLs. The colon
        # should be used to separate the port, but if the URL contains an @,
        # the colon is interpreted as part of a username for login purposes,
        # making 'evil.com' the request domain. Since HTTP_HOST is used to
        # produce a meaningful reset URL, we need to be certain that the
        # HTTP_HOST header isn't poisoned. This is done as a check when get_host()
        # is invoked, but we check here as a practical consequence.
        with self.assertRaises(SuspiciousOperation):
            self.client.post('/password_reset/',
                {'email': 'staffmember@example.com'},
                HTTP_HOST='www.example:dr.frankenstein@evil.tld'
            )
        self.assertEqual(len(mail.outbox), 0)

    # Skip any 500 handler action (like sending more mail...)
    @override_settings(DEBUG_PROPAGATE_EXCEPTIONS=True)
    def test_poisoned_http_host_admin_site(self):
        "Poisoned HTTP_HOST headers can't be used for reset emails on admin views"
        with self.assertRaises(SuspiciousOperation):
            self.client.post('/admin_password_reset/',
                {'email': 'staffmember@example.com'},
                HTTP_HOST='www.example:dr.frankenstein@evil.tld'
            )
        self.assertEqual(len(mail.outbox), 0)

    def _test_confirm_start(self):
        # Start by creating the email
        response = self.client.post('/password_reset/', {'email': 'staffmember@example.com'})
@@ -265,9 +307,12 @@ class LoginTest(AuthViewsTestCase):

        # Those URLs should not pass the security check
        for bad_url in ('http://example.com',
                        'http:///example.com',
                        'https://example.com',
                        'ftp://exampel.com',
-                        '//example.com'):
+                        '///example.com',
+                        '//example.com',
+                        'javascript:alert("XSS")'):

            nasty_url = '%(url)s?%(next)s=%(bad_url)s' % {
                'url': login_url,
@@ -287,7 +332,8 @@ class LoginTest(AuthViewsTestCase):
                         '/view/?param=https://example.com',
                         '/view?param=ftp://exampel.com',
                         'view/?param=//example.com',
-                         'https:///',
+                         'https://testserver/',
+                         'HTTPS://testserver/',
                         '//testserver/',
                         '/url%20with%20spaces/'):  # see ticket #12534
            safe_url = '%(url)s?%(next)s=%(good_url)s' % {
@@ -423,9 +469,12 @@ class LogoutTest(AuthViewsTestCase):

        # Those URLs should not pass the security check
        for bad_url in ('http://example.com',
                        'http:///example.com',
                        'https://example.com',
                        'ftp://exampel.com',
-                        '//example.com'):
+                        '///example.com',
+                        '//example.com',
+                        'javascript:alert("XSS")'):
            nasty_url = '%(url)s?%(next)s=%(bad_url)s' % {
                'url': logout_url,
                'next': REDIRECT_FIELD_NAME,
@@ -443,7 +492,8 @@ class LogoutTest(AuthViewsTestCase):
                         '/view/?param=https://example.com',
                         '/view?param=ftp://exampel.com',
                         'view/?param=//example.com',
-                         'https:///',
+                         'https://testserver/',
+                         'HTTPS://testserver/',
                         '//testserver/',
                         '/url%20with%20spaces/'):  # see ticket #12534
            safe_url = '%(url)s?%(next)s=%(good_url)s' % {

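The bad/good URL lists above encode the contract of the redirect-safety check: reject absolute URLs pointing at other hosts, scheme-relative `//host` (and `///host`) URLs, and non-HTTP schemes such as `javascript:`, while allowing same-host and host-relative targets. A simplified re-implementation under those assumptions (`is_safe_redirect` is a hypothetical name, not Django's exact `is_safe_url`):

```python
from urllib.parse import urlparse

def is_safe_redirect(url, host):
    """Allow only redirect targets that stay on `host` or are host-relative."""
    if not url:
        return False
    # Browsers collapse '///host' into '//host', so treat it as off-site.
    if url.startswith('///'):
        return False
    parsed = urlparse(url)
    # Reject dangerous schemes outright (e.g. javascript:).
    if parsed.scheme and parsed.scheme.lower() not in ('http', 'https'):
        return False
    # A scheme with an empty netloc ('http:///x') is malformed/ambiguous.
    if parsed.scheme and not parsed.netloc:
        return False
    # Any remaining netloc must match our own host.
    return not parsed.netloc or parsed.netloc == host
```

Run against the lists from the tests, every `bad_url` is rejected and every `good_url` (with host `testserver`) is accepted.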
@@ -4,7 +4,7 @@ from django.conf import settings
from django.core.urlresolvers import reverse
from django.http import HttpResponseRedirect, QueryDict
from django.template.response import TemplateResponse
-from django.utils.http import base36_to_int
+from django.utils.http import base36_to_int, is_safe_url
from django.utils.translation import ugettext as _
from django.views.decorators.debug import sensitive_post_parameters
from django.views.decorators.cache import never_cache
@@ -34,18 +34,11 @@ def login(request, template_name='registration/login.html',
    if request.method == "POST":
        form = authentication_form(data=request.POST)
        if form.is_valid():
-            netloc = urlparse.urlparse(redirect_to)[1]
-
-            # Use default setting if redirect_to is empty
-            if not redirect_to:
+            # Ensure the user-originating redirection url is safe.
+            if not is_safe_url(url=redirect_to, host=request.get_host()):
                redirect_to = settings.LOGIN_REDIRECT_URL

-            # Heavier security check -- don't allow redirection to a different
-            # host.
-            elif netloc and netloc != request.get_host():
-                redirect_to = settings.LOGIN_REDIRECT_URL

-            # Okay, security checks complete. Log the user in.
+            # Okay, security check complete. Log the user in.
            auth_login(request, form.get_user())

            if request.session.test_cookie_worked():
@@ -78,27 +71,27 @@ def logout(request, next_page=None,
    Logs out the user and displays 'You are logged out' message.
    """
    auth_logout(request)
-    redirect_to = request.REQUEST.get(redirect_field_name, '')
-    if redirect_to:
-        netloc = urlparse.urlparse(redirect_to)[1]
-        # Security check -- don't allow redirection to a different host.
-        if not (netloc and netloc != request.get_host()):
-            return HttpResponseRedirect(redirect_to)

-    if next_page is None:
-        current_site = get_current_site(request)
-        context = {
-            'site': current_site,
-            'site_name': current_site.name,
-            'title': _('Logged out')
-        }
-        if extra_context is not None:
-            context.update(extra_context)
-        return TemplateResponse(request, template_name, context,
-            current_app=current_app)
-    else:
+    if redirect_field_name in request.REQUEST:
+        next_page = request.REQUEST[redirect_field_name]
+        # Security check -- don't allow redirection to a different host.
+        if not is_safe_url(url=next_page, host=request.get_host()):
+            next_page = request.path

+    if next_page:
        # Redirect to this page until the session has been cleared.
-        return HttpResponseRedirect(next_page or request.path)
+        return HttpResponseRedirect(next_page)

+    current_site = get_current_site(request)
+    context = {
+        'site': current_site,
+        'site_name': current_site.name,
+        'title': _('Logged out')
+    }
+    if extra_context is not None:
+        context.update(extra_context)
+    return TemplateResponse(request, template_name, context,
+        current_app=current_app)

def logout_then_login(request, login_url=None, current_app=None, extra_context=None):
    """
@@ -156,7 +149,7 @@ def password_reset(request, is_admin_site=False,
                'request': request,
            }
            if is_admin_site:
-                opts = dict(opts, domain_override=request.META['HTTP_HOST'])
+                opts = dict(opts, domain_override=request.get_host())
            form.save(**opts)
            return HttpResponseRedirect(post_reset_redirect)
    else:

@@ -44,9 +44,6 @@ def post_comment(request, next=None, using=None):
        if not data.get('email', ''):
            data["email"] = request.user.email

-    # Check to see if the POST data overrides the view's next argument.
-    next = data.get("next", next)
-
    # Look up the object we're trying to comment about
    ctype = data.get("content_type")
    object_pk = data.get("object_pk")
@@ -98,9 +95,9 @@ def post_comment(request, next=None, using=None):
        ]
        return render_to_response(
            template_list, {
-                "comment" : form.data.get("comment", ""),
-                "form" : form,
-                "next": next,
+                "comment": form.data.get("comment", ""),
+                "form": form,
+                "next": data.get("next", next),
            },
            RequestContext(request, {})
        )
@@ -131,7 +128,7 @@ def post_comment(request, next=None, using=None):
        request = request
    )

-    return next_redirect(data, next, comment_done, c=comment._get_pk_val())
+    return next_redirect(request, next, comment_done, c=comment._get_pk_val())

comment_done = confirmation_view(
    template = "comments/posted.html",

@@ -10,7 +10,6 @@ from django.shortcuts import get_object_or_404, render_to_response
from django.views.decorators.csrf import csrf_protect


@csrf_protect
@login_required
def flag(request, comment_id, next=None):
@@ -27,7 +26,7 @@ def flag(request, comment_id, next=None):
    # Flag on POST
    if request.method == 'POST':
        perform_flag(request, comment)
-        return next_redirect(request.POST.copy(), next, flag_done, c=comment.pk)
+        return next_redirect(request, next, flag_done, c=comment.pk)

    # Render a form on GET
    else:
@@ -54,7 +53,7 @@ def delete(request, comment_id, next=None):
    if request.method == 'POST':
        # Flag the comment as deleted instead of actually deleting it.
        perform_delete(request, comment)
-        return next_redirect(request.POST.copy(), next, delete_done, c=comment.pk)
+        return next_redirect(request, next, delete_done, c=comment.pk)

    # Render a form on GET
    else:
@@ -81,7 +80,7 @@ def approve(request, comment_id, next=None):
    if request.method == 'POST':
        # Flag the comment as approved.
        perform_approve(request, comment)
-        return next_redirect(request.POST.copy(), next, approve_done, c=comment.pk)
+        return next_redirect(request, next, approve_done, c=comment.pk)

    # Render a form on GET
    else:

@@ -4,14 +4,15 @@ A few bits of helper functions for comment views.

import urllib
import textwrap
-from django.http import HttpResponseRedirect
from django.core import urlresolvers
+from django.http import HttpResponseRedirect
from django.shortcuts import render_to_response
from django.template import RequestContext
from django.core.exceptions import ObjectDoesNotExist
from django.contrib import comments
+from django.utils.http import is_safe_url

-def next_redirect(data, default, default_view, **get_kwargs):
+def next_redirect(request, default, default_view, **get_kwargs):
    """
    Handle the "where should I go next?" part of comment views.
@@ -21,9 +22,10 @@ def next_redirect(data, default, default_view, **get_kwargs):

    Returns an ``HttpResponseRedirect``.
    """
-    next = data.get("next", default)
-    if next is None:
+    next = request.POST.get('next', default)
+    if not is_safe_url(url=next, host=request.get_host()):
        next = urlresolvers.reverse(default_view)

    if get_kwargs:
        if '#' in next:
            tmp = next.rsplit('#', 1)

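The tail of `next_redirect` (the `'#' in next` branch visible above) splices extra GET parameters into the redirect target while keeping any `#fragment` at the end. A minimal standalone sketch of that splicing, with a hypothetical helper name and sorted parameters for determinism:

```python
from urllib.parse import urlencode

def add_query_params(next_url, **get_kwargs):
    """Append GET parameters to a redirect target, keeping any #fragment last
    (mirrors the rsplit('#', 1) handling in next_redirect)."""
    anchor = ''
    if '#' in next_url:
        next_url, frag = next_url.rsplit('#', 1)
        anchor = '#' + frag
    # Use '&' when the URL already has a query string, '?' otherwise.
    joiner = '&' if '?' in next_url else '?'
    return next_url + joiner + urlencode(sorted(get_kwargs.items())) + anchor
```

For example, a comment confirmation redirect carrying `c=<pk>` must land before the fragment, not after it.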
@@ -9,6 +9,7 @@ from django.contrib.sites.models import Site
from django.http import HttpRequest, Http404
from django.test import TestCase
from django.utils.encoding import smart_str
from django.test.utils import override_settings


class FooWithoutUrl(models.Model):
@@ -114,6 +115,7 @@ class ContentTypesTests(TestCase):
            FooWithUrl: ContentType.objects.get_for_model(FooWithUrl),
        })

    @override_settings(ALLOWED_HOSTS=['example.com'])
    def test_shortcut_view(self):
        """
        Check that the shortcut view (used for the admin "view on site"

@@ -7,29 +7,7 @@ class GeoSQLCompiler(BaseGeoSQLCompiler, SQLCompiler):
    pass

class SQLInsertCompiler(compiler.SQLInsertCompiler, GeoSQLCompiler):
-    def placeholder(self, field, val):
-        if field is None:
-            # A field value of None means the value is raw.
-            return val
-        elif hasattr(field, 'get_placeholder'):
-            # Some fields (e.g. geo fields) need special munging before
-            # they can be inserted.
-            ph = field.get_placeholder(val, self.connection)
-            if ph == 'NULL':
-                # If the placeholder returned is 'NULL', then we need to
-                # to remove None from the Query parameters. Specifically,
-                # cx_Oracle will assume a CHAR type when a placeholder ('%s')
-                # is used for columns of MDSYS.SDO_GEOMETRY. Thus, we use
-                # 'NULL' for the value, and remove None from the query params.
-                # See also #10888.
-                param_idx = self.query.columns.index(field.column)
-                params = list(self.query.params)
-                params.pop(param_idx)
-                self.query.params = tuple(params)
-            return ph
-        else:
-            # Return the common case for the placeholder
-            return '%s'
+    pass

class SQLDeleteCompiler(compiler.SQLDeleteCompiler, GeoSQLCompiler):
    pass

@@ -9,6 +9,7 @@
"""
import re
from decimal import Decimal
from itertools import izip

from django.db.backends.oracle.base import DatabaseOperations
from django.contrib.gis.db.backends.base import BaseSpatialOperations
@@ -287,3 +288,12 @@ class OracleOperations(DatabaseOperations, BaseSpatialOperations):
    def spatial_ref_sys(self):
        from django.contrib.gis.db.backends.oracle.models import SpatialRefSys
        return SpatialRefSys

    def modify_insert_params(self, placeholders, params):
        """Drop out insert parameters for NULL placeholder. Needed for Oracle Spatial
        backend due to #10888.
        """
        # This code doesn't work for bulk insert cases.
        assert len(placeholders) == 1
        return [[param for pholder, param
                 in izip(placeholders[0], params[0]) if pholder != 'NULL'], ]

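The `modify_insert_params` hook above is pure list filtering: it drops the parameter whose placeholder is the literal string `'NULL'`, so cx_Oracle never sees a `None` bound against an `MDSYS.SDO_GEOMETRY` column. A standalone sketch of the same filter using `zip` (`izip` is the Python 2 spelling):

```python
def modify_insert_params(placeholders, params):
    """Drop the parameter that pairs with a literal 'NULL' placeholder,
    as the Oracle spatial backend does for geometry columns (#10888)."""
    assert len(placeholders) == 1, "bulk insert not supported here"
    return [[param for pholder, param in zip(placeholders[0], params[0])
             if pholder != 'NULL']]
```

The placeholder list and parameter list stay index-aligned, which is why a dropped placeholder must drop its parameter too.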
@@ -101,8 +101,11 @@ geos_version.argtypes = None
geos_version.restype = c_char_p

# Regular expression should be able to parse version strings such as
-# '3.0.0rc4-CAPI-1.3.3', '3.0.0-CAPI-1.4.1' or '3.4.0dev-CAPI-1.8.0'
-version_regex = re.compile(r'^(?P<version>(?P<major>\d+)\.(?P<minor>\d+)\.(?P<subminor>\d+))((rc(?P<release_candidate>\d+))|dev)?-CAPI-(?P<capi_version>\d+\.\d+\.\d+)$')
+# '3.0.0rc4-CAPI-1.3.3', '3.0.0-CAPI-1.4.1', '3.4.0dev-CAPI-1.8.0' or '3.4.0dev-CAPI-1.8.0 r0'
+version_regex = re.compile(
+    r'^(?P<version>(?P<major>\d+)\.(?P<minor>\d+)\.(?P<subminor>\d+))'
+    r'((rc(?P<release_candidate>\d+))|dev)?-CAPI-(?P<capi_version>\d+\.\d+\.\d+)( r\d+)?$'
+)

def geos_version_info():
    """
    Returns a dictionary containing the various version metadata parsed from
@@ -112,8 +115,10 @@ def geos_version_info():
    """
    ver = geos_version()
    m = version_regex.match(ver)
-    if not m: raise GEOSException('Could not parse version info string "%s"' % ver)
-    return dict((key, m.group(key)) for key in ('version', 'release_candidate', 'capi_version', 'major', 'minor', 'subminor'))
+    if not m:
+        raise GEOSException('Could not parse version info string "%s"' % ver)
+    return dict((key, m.group(key)) for key in (
+        'version', 'release_candidate', 'capi_version', 'major', 'minor', 'subminor'))

# Version numbers and whether or not prepared geometry support is available.
_verinfo = geos_version_info()

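The regex change above exists to accept GEOS version strings with an optional trailing revision such as `3.4.0dev-CAPI-1.8.0 r0`. The pattern itself is self-contained and can be exercised outside Django:

```python
import re

# The pattern as given in the hunk above: optional 'rcN'/'dev' tag and an
# optional trailing ' rN' revision after the CAPI version.
version_regex = re.compile(
    r'^(?P<version>(?P<major>\d+)\.(?P<minor>\d+)\.(?P<subminor>\d+))'
    r'((rc(?P<release_candidate>\d+))|dev)?-CAPI-(?P<capi_version>\d+\.\d+\.\d+)( r\d+)?$'
)

# The new test case from this changeset: a dev build with a revision suffix.
m = version_regex.match('3.4.0dev-CAPI-1.8.0 r0')
```

All four strings from `test28_geos_version` match, including the `rc4` release-candidate form.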
@@ -662,7 +662,7 @@ class GEOSTest(unittest.TestCase, TestDataMixin):
        for i in range(len(mp)):
            # Creating a random point.
            pnt = mp[i]
-            new = Point(random.randint(1, 100), random.randint(1, 100))
+            new = Point(random.randint(21, 100), random.randint(21, 100))
            # Testing the assignment
            mp[i] = new
            s = str(new)  # what was used for the assignment is still accessible
@@ -1050,15 +1050,17 @@ class GEOSTest(unittest.TestCase, TestDataMixin):
        print "\nEND - expecting GEOS_NOTICE; safe to ignore.\n"

    def test28_geos_version(self):
-        "Testing the GEOS version regular expression."
+        """Testing the GEOS version regular expression."""
        from django.contrib.gis.geos.libgeos import version_regex
-        versions = [ ('3.0.0rc4-CAPI-1.3.3', '3.0.0'),
-                     ('3.0.0-CAPI-1.4.1', '3.0.0'),
-                     ('3.4.0dev-CAPI-1.8.0', '3.4.0') ]
-        for v, expected in versions:
-            m = version_regex.match(v)
-            self.assertTrue(m)
-            self.assertEqual(m.group('version'), expected)
+        versions = [('3.0.0rc4-CAPI-1.3.3', '3.0.0', '1.3.3'),
+                    ('3.0.0-CAPI-1.4.1', '3.0.0', '1.4.1'),
+                    ('3.4.0dev-CAPI-1.8.0', '3.4.0', '1.8.0'),
+                    ('3.4.0dev-CAPI-1.8.0 r0', '3.4.0', '1.8.0')]
+        for v_init, v_geos, v_capi in versions:
+            m = version_regex.match(v_init)
+            self.assertTrue(m, msg="Unable to parse the version string '%s'" % v_init)
+            self.assertEqual(m.group('version'), v_geos)
+            self.assertEqual(m.group('capi_version'), v_capi)


def suite():

@@ -68,23 +68,27 @@ class OGRInspectTest(TestCase):
                              layer_key=AllOGRFields._meta.db_table,
                              decimal=['f_decimal'])

-        expected = [
-            '# This is an auto-generated Django model module created by ogrinspect.',
-            'from django.contrib.gis.db import models',
-            '',
-            'class Measurement(models.Model):',
-            '    f_decimal = models.DecimalField(max_digits=0, decimal_places=0)',
-            '    f_int = models.IntegerField()',
-            '    f_datetime = models.DateTimeField()',
-            '    f_time = models.TimeField()',
-            '    f_float = models.FloatField()',
-            '    f_char = models.CharField(max_length=10)',
-            '    f_date = models.DateField()',
-            '    geom = models.PolygonField()',
-            '    objects = models.GeoManager()',
-        ]
+        self.assertTrue(model_def.startswith(
+            '# This is an auto-generated Django model module created by ogrinspect.\n'
+            'from django.contrib.gis.db import models\n'
+            '\n'
+            'class Measurement(models.Model):\n'
+        ))

+        # The ordering of model fields might vary depending on several factors (version of GDAL, etc.)
+        self.assertIn('    f_decimal = models.DecimalField(max_digits=0, decimal_places=0)', model_def)
+        self.assertIn('    f_int = models.IntegerField()', model_def)
+        self.assertIn('    f_datetime = models.DateTimeField()', model_def)
+        self.assertIn('    f_time = models.TimeField()', model_def)
+        self.assertIn('    f_float = models.FloatField()', model_def)
+        self.assertIn('    f_char = models.CharField(max_length=10)', model_def)
+        self.assertIn('    f_date = models.DateField()', model_def)

+        self.assertTrue(model_def.endswith(
+            '    geom = models.PolygonField()\n'
+            '    objects = models.GeoManager()'
+        ))

-        self.assertEqual(model_def, '\n'.join(expected))

def get_ogr_db_string():
    # Construct the DB string that GDAL will use to inspect the database.

@@ -65,8 +65,8 @@ def markdown(value, arg=''):
            safe_mode = True
        else:
            safe_mode = False
-        python_markdown_deprecation = "The use of Python-Markdown "
-            "< 2.1 in Django is deprecated; please update to the current version"
+        python_markdown_deprecation = ("The use of Python-Markdown "
+            "< 2.1 in Django is deprecated; please update to the current version")
        # Unicode support only in markdown v1.7 or above. Version_info
        # exist only in markdown v1.6.2rc-2 or above.
        markdown_vers = getattr(markdown, "version_info", None)

@@ -128,6 +128,13 @@ class SessionBase(object):
        self.accessed = True
        self.modified = True

    def is_empty(self):
        "Returns True when there is no session_key and the session is empty"
        try:
            return not bool(self._session_key) and not self._session_cache
        except AttributeError:
            return True

    def _get_new_session_key(self):
        "Returns session key that isn't being used."
        # Todo: move to 0-9a-z charset in 1.5
@@ -230,7 +237,7 @@ class SessionBase(object):
        """
        self.clear()
        self.delete()
-        self.create()
+        self._session_key = None

    def cycle_key(self):
        """

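The new `is_empty()` method lets the session middleware distinguish "no session at all" from "session with data", so an empty session never forces a cookie onto the client. A bare-bones standalone model of the same logic (`SessionStub` is a hypothetical stand-in, not Django's `SessionBase`):

```python
class SessionStub(object):
    """Minimal stand-in for a session object, just enough to show is_empty()."""
    def __init__(self, session_key=None, data=None):
        if data is not None:
            self._session_cache = data  # only set once the session is loaded
        self._session_key = session_key

    def is_empty(self):
        # True only when there is no key AND no loaded data; a session that
        # was never loaded (missing _session_cache) also counts as empty.
        try:
            return not bool(self._session_key) and not self._session_cache
        except AttributeError:
            return True
```

Note the short-circuit: a session with a key is never "empty", even if its data dict is.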
@@ -25,7 +25,7 @@ class SessionStore(SessionBase):
            session_data = None
        if session_data is not None:
            return session_data
-        self.create()
+        self._session_key = None
        return {}

    def create(self):
@@ -45,6 +45,8 @@ class SessionStore(SessionBase):
        raise RuntimeError("Unable to create a new session key.")

    def save(self, must_create=False):
+        if self.session_key is None:
+            return self.create()
        if must_create:
            func = self._cache.add
        else:
@@ -56,7 +58,7 @@ class SessionStore(SessionBase):
            raise CreateError

    def exists(self, session_key):
-        return (KEY_PREFIX + session_key) in self._cache
+        return session_key and (KEY_PREFIX + session_key) in self._cache

    def delete(self, session_key=None):
        if session_key is None:

@@ -30,11 +30,12 @@ class SessionStore(DBStore):
            data = None
        if data is None:
            data = super(SessionStore, self).load()
-            cache.set(self.cache_key, data, settings.SESSION_COOKIE_AGE)
+            if self.session_key:
+                cache.set(self.cache_key, data, settings.SESSION_COOKIE_AGE)
        return data

    def exists(self, session_key):
-        if (KEY_PREFIX + session_key) in cache:
+        if session_key and (KEY_PREFIX + session_key) in cache:
            return True
        return super(SessionStore, self).exists(session_key)
@@ -57,4 +58,4 @@ class SessionStore(DBStore):
        """
        self.clear()
        self.delete(self.session_key)
-        self.create()
+        self._session_key = None

@@ -20,7 +20,7 @@ class SessionStore(SessionBase):
            )
            return self.decode(force_unicode(s.session_data))
        except (Session.DoesNotExist, SuspiciousOperation):
-            self.create()
+            self._session_key = None
            return {}

    def exists(self, session_key):
@@ -37,7 +36,6 @@ class SessionStore(SessionBase):
                # Key wasn't unique. Try again.
                continue
            self.modified = True
-            self._session_cache = {}
            return

    def save(self, must_create=False):
@@ -47,6 +46,8 @@ class SessionStore(SessionBase):
        create a *new* entry (as opposed to possibly updating an existing
        entry).
        """
+        if self.session_key is None:
+            return self.create()
        obj = Session(
            session_key=self._get_or_create_session_key(),
            session_data=self.encode(self._get_session(no_load=must_create)),

@@ -56,11 +56,11 @@ class SessionStore(SessionBase):
                 try:
                     session_data = self.decode(file_data)
                 except (EOFError, SuspiciousOperation):
-                    self.create()
+                    self._session_key = None
             finally:
                 session_file.close()
         except IOError:
-            self.create()
+            self._session_key = None
         return session_data

     def create(self):
@@ -71,10 +71,11 @@ class SessionStore(SessionBase):
             except CreateError:
                 continue
             self.modified = True
-            self._session_cache = {}
             return

     def save(self, must_create=False):
+        if self.session_key is None:
+            return self.create()
         # Get the session data now, before we start messing
         # with the file it is stored within.
         session_data = self._get_session(no_load=must_create)

@@ -14,30 +14,38 @@ class SessionMiddleware(object):
     def process_response(self, request, response):
         """
         If request.session was modified, or if the configuration is to save the
-        session every time, save the changes and set a session cookie.
+        session every time, save the changes and set a session cookie or delete
+        the session cookie if the session has been emptied.
         """
         try:
             accessed = request.session.accessed
             modified = request.session.modified
+            empty = request.session.is_empty()
         except AttributeError:
             pass
         else:
-            if accessed:
-                patch_vary_headers(response, ('Cookie',))
-            if modified or settings.SESSION_SAVE_EVERY_REQUEST:
-                if request.session.get_expire_at_browser_close():
-                    max_age = None
-                    expires = None
-                else:
-                    max_age = request.session.get_expiry_age()
-                    expires_time = time.time() + max_age
-                    expires = cookie_date(expires_time)
-                # Save the session data and refresh the client cookie.
-                request.session.save()
-                response.set_cookie(settings.SESSION_COOKIE_NAME,
-                        request.session.session_key, max_age=max_age,
-                        expires=expires, domain=settings.SESSION_COOKIE_DOMAIN,
-                        path=settings.SESSION_COOKIE_PATH,
-                        secure=settings.SESSION_COOKIE_SECURE or None,
-                        httponly=settings.SESSION_COOKIE_HTTPONLY or None)
+            # First check if we need to delete this cookie.
+            # The session should be deleted only if the session is entirely empty
+            if settings.SESSION_COOKIE_NAME in request.COOKIES and empty:
+                response.delete_cookie(settings.SESSION_COOKIE_NAME,
+                    domain=settings.SESSION_COOKIE_DOMAIN)
+            else:
+                if accessed:
+                    patch_vary_headers(response, ('Cookie',))
+                if (modified or settings.SESSION_SAVE_EVERY_REQUEST) and not empty:
+                    if request.session.get_expire_at_browser_close():
+                        max_age = None
+                        expires = None
+                    else:
+                        max_age = request.session.get_expiry_age()
+                        expires_time = time.time() + max_age
+                        expires = cookie_date(expires_time)
+                    # Save the session data and refresh the client cookie.
+                    request.session.save()
+                    response.set_cookie(settings.SESSION_COOKIE_NAME,
+                            request.session.session_key, max_age=max_age,
+                            expires=expires, domain=settings.SESSION_COOKIE_DOMAIN,
+                            path=settings.SESSION_COOKIE_PATH,
+                            secure=settings.SESSION_COOKIE_SECURE or None,
+                            httponly=settings.SESSION_COOKIE_HTTPONLY or None)
         return response

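The new branch in the middleware emits a "delete this cookie" header when the session has been emptied: an empty value, an epoch expiry, and `Max-Age=0`. A small sketch of what such a header looks like, built with the Python standard library's `http.cookies` (not Django's cookie class, whose formatting differs slightly, e.g. the stdlib renders an empty value as `""`):

```python
from http.cookies import SimpleCookie

def delete_cookie_header(name, domain=None):
    """Build a 'delete this cookie' Set-Cookie header fragment: empty
    value, epoch expiry date, and Max-Age=0, mirroring the shape the
    middleware diff above produces via response.delete_cookie()."""
    cookie = SimpleCookie()
    cookie[name] = ''
    cookie[name]['expires'] = 'Thu, 01-Jan-1970 00:00:00 GMT'
    cookie[name]['max-age'] = 0
    cookie[name]['path'] = '/'
    if domain:
        cookie[name]['domain'] = domain
    return cookie[name].OutputString()

header = delete_cookie_header('sessionid', domain='.example.local')
assert 'Max-Age=0' in header
assert 'Domain=.example.local' in header
assert 'expires=Thu, 01-Jan-1970 00:00:00 GMT' in header
```

Browsers treat either the past expiry date or `Max-Age=0` as an instruction to drop the cookie; sending both covers old and new cookie implementations.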
@@ -150,6 +150,7 @@ class SessionTestsMixin(object):
         self.session.flush()
         self.assertFalse(self.session.exists(prev_key))
         self.assertNotEqual(self.session.session_key, prev_key)
+        self.assertIsNone(self.session.session_key)
         self.assertTrue(self.session.modified)
         self.assertTrue(self.session.accessed)

@@ -162,6 +163,11 @@ class SessionTestsMixin(object):
         self.assertNotEqual(self.session.session_key, prev_key)
         self.assertEqual(self.session.items(), prev_data)

+    def test_save_doesnt_clear_data(self):
+        self.session['a'] = 'b'
+        self.session.save()
+        self.assertEqual(self.session['a'], 'b')
+
     def test_invalid_key(self):
         # Submitting an invalid session key (either by guessing, or if the db has
         # removed the key) results in a new key being generated.
@@ -256,6 +262,20 @@ class SessionTestsMixin(object):
         encoded = self.session.encode(data)
         self.assertEqual(self.session.decode(encoded), data)

+    def test_session_load_does_not_create_record(self):
+        """
+        Loading an unknown session key does not create a session record.
+
+        Creating session records on load is a DOS vulnerability.
+        """
+        if self.backend is CookieSession:
+            raise unittest.SkipTest("Cookie backend doesn't have an external store to create records in.")
+        session = self.backend('deadbeef')
+        session.load()
+
+        self.assertFalse(session.exists(session.session_key))
+        # provided unknown key was cycled, not reused
+        self.assertNotEqual(session.session_key, 'deadbeef')
+
 class DatabaseSessionTests(SessionTestsMixin, TestCase):

@@ -413,6 +433,75 @@ class SessionMiddlewareTests(unittest.TestCase):
         self.assertNotIn('httponly',
                          str(response.cookies[settings.SESSION_COOKIE_NAME]))

+    def test_session_delete_on_end(self):
+        request = RequestFactory().get('/')
+        response = HttpResponse('Session test')
+        middleware = SessionMiddleware()
+
+        # Before deleting, there has to be an existing cookie
+        request.COOKIES[settings.SESSION_COOKIE_NAME] = 'abc'
+
+        # Simulate a request that ends the session
+        middleware.process_request(request)
+        request.session.flush()
+
+        # Handle the response through the middleware
+        response = middleware.process_response(request, response)
+
+        # Check that the cookie was deleted, not recreated.
+        # A deleted cookie header looks like:
+        #  Set-Cookie: sessionid=; expires=Thu, 01-Jan-1970 00:00:00 GMT; Max-Age=0; Path=/
+        self.assertEqual(
+            'Set-Cookie: %s=; expires=Thu, 01-Jan-1970 00:00:00 GMT; '
+            'Max-Age=0; Path=/' % settings.SESSION_COOKIE_NAME,
+            str(response.cookies[settings.SESSION_COOKIE_NAME])
+        )
+
+    @override_settings(SESSION_COOKIE_DOMAIN='.example.local')
+    def test_session_delete_on_end_with_custom_domain(self):
+        request = RequestFactory().get('/')
+        response = HttpResponse('Session test')
+        middleware = SessionMiddleware()
+
+        # Before deleting, there has to be an existing cookie
+        request.COOKIES[settings.SESSION_COOKIE_NAME] = 'abc'
+
+        # Simulate a request that ends the session
+        middleware.process_request(request)
+        request.session.flush()
+
+        # Handle the response through the middleware
+        response = middleware.process_response(request, response)
+
+        # Check that the cookie was deleted, not recreated.
+        # A deleted cookie header with a custom domain looks like:
+        #  Set-Cookie: sessionid=; Domain=.example.local;
+        #              expires=Thu, 01-Jan-1970 00:00:00 GMT; Max-Age=0; Path=/
+        self.assertEqual(
+            'Set-Cookie: %s=; Domain=.example.local; expires=Thu, '
+            '01-Jan-1970 00:00:00 GMT; Max-Age=0; Path=/' % (
+                settings.SESSION_COOKIE_NAME,
+            ),
+            str(response.cookies[settings.SESSION_COOKIE_NAME])
+        )
+
+    def test_flush_empty_without_session_cookie_doesnt_set_cookie(self):
+        request = RequestFactory().get('/')
+        response = HttpResponse('Session test')
+        middleware = SessionMiddleware()
+
+        # Simulate a request that ends the session
+        middleware.process_request(request)
+        request.session.flush()
+
+        # Handle the response through the middleware
+        response = middleware.process_response(request, response)
+
+        # A cookie should not be set.
+        self.assertEqual(response.cookies, {})
+        # The session is accessed so "Vary: Cookie" should be set.
+        self.assertEqual(response['Vary'], 'Cookie')
+
 
 class CookieSessionTests(SessionTestsMixin, TestCase):

@@ -3,6 +3,7 @@ from django.contrib.sites.models import Site, RequestSite, get_current_site
 from django.core.exceptions import ObjectDoesNotExist
 from django.http import HttpRequest
 from django.test import TestCase
+from django.test.utils import override_settings


 class SitesFrameworkTests(TestCase):
@@ -39,6 +40,7 @@ class SitesFrameworkTests(TestCase):
         site = Site.objects.get_current()
         self.assertEqual(u"Example site", site.name)

+    @override_settings(ALLOWED_HOSTS=['example.com'])
     def test_get_current_site(self):
         # Test that the correct Site object is returned
         request = HttpRequest()

@@ -190,8 +190,8 @@ class CachedFilesMixin(object):
         if dry_run:
             return

-        # delete cache of all handled paths
-        self.cache.delete_many([self.cache_key(path) for path in paths])
+        # where to store the new paths
+        hashed_paths = {}

         # build a list of adjustable files
         matches = lambda path: matches_patterns(path, self._patterns.keys())
@@ -240,9 +240,12 @@ class CachedFilesMixin(object):
             hashed_name = force_unicode(saved_name.replace('\\', '/'))

             # and then set the cache accordingly
-            self.cache.set(self.cache_key(name), hashed_name)
+            hashed_paths[self.cache_key(name)] = hashed_name
             yield name, hashed_name, processed

+        # Finally set the cache
+        self.cache.set_many(hashed_paths)
+

 class CachedStaticFilesStorage(CachedFilesMixin, StaticFilesStorage):
     """

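The change above accumulates every `path -> hashed_name` mapping in a dict and writes the cache once with `set_many`, instead of issuing one `cache.set` per file. A minimal sketch of that batching pattern, under the assumption of a cache object exposing `set_many` (the `DictCache` class and the hash expression are illustrative stand-ins, not Django's storage API):

```python
def post_process(paths, cache):
    """Hash each path once, remember the mapping, and write the cache
    in a single batched set_many() call instead of one set() per file."""
    hashed_paths = {}
    for path in paths:
        # Stand-in for the real content hash of the file.
        hashed_name = "%s.%08x" % (path, abs(hash(path)) % 16 ** 8)
        hashed_paths[path] = hashed_name
        yield path, hashed_name
    # One round-trip to the cache backend for all entries.
    cache.set_many(hashed_paths)

class DictCache:
    """Hypothetical in-memory cache exposing set_many()."""
    def __init__(self):
        self.data = {}
    def set_many(self, mapping):
        self.data.update(mapping)

cache = DictCache()
results = list(post_process(["css/site.css", "js/app.js"], cache))
assert len(cache.data) == 2
```

For a networked cache (memcached, Redis), collapsing N writes into one `set_many` round-trip is the whole point of the change.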
@@ -47,13 +47,18 @@ def get_image_dimensions(file_or_path, close=False):
         file = open(file_or_path, 'rb')
         close = True
     try:
+        # Most of the time PIL only needs a small chunk to parse the image and
+        # get the dimensions, but with some TIFF files PIL needs to parse the
+        # whole file.
+        chunk_size = 1024
         while 1:
-            data = file.read(1024)
+            data = file.read(chunk_size)
             if not data:
                 break
             p.feed(data)
             if p.image:
                 return p.image.size
+            chunk_size = chunk_size*2
         return None
     finally:
         if close:

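The doubling of `chunk_size` keeps the number of reads logarithmic in the file size, so a pathological image that forces the parser to consume the whole file costs a handful of reads rather than thousands of fixed 1 KiB ones. A self-contained sketch of the same loop shape, with a fake parser in place of PIL:

```python
from io import BytesIO

def read_until_parsed(stream, parse_chunk, initial_chunk_size=1024):
    """Feed a stream to a parser in chunks, doubling the chunk size each
    round; mirrors the loop in the diff above. parse_chunk is a stand-in
    for PIL's ImageFile parser."""
    chunk_size = initial_chunk_size
    reads = 0
    while True:
        data = stream.read(chunk_size)
        reads += 1
        if not data:
            return None, reads
        result = parse_chunk(data)
        if result is not None:
            return result, reads
        chunk_size *= 2

# A parser stand-in that only "succeeds" after seeing the full 1 MiB.
seen = []
def fake_parser(data):
    seen.append(len(data))
    return sum(seen) if sum(seen) >= 2 ** 20 else None

stream = BytesIO(b"\0" * 2 ** 20)
result, reads = read_until_parsed(stream, fake_parser)
assert result == 2 ** 20
# 1 MiB is consumed in 11 doubling reads instead of 1024 fixed-size ones.
assert reads == 11
```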
@@ -1,13 +1,13 @@
 import os
 import errno
 import urlparse
-import itertools
 from datetime import datetime

 from django.conf import settings
 from django.core.exceptions import ImproperlyConfigured, SuspiciousOperation
 from django.core.files import locks, File
 from django.core.files.move import file_move_safe
+from django.utils.crypto import get_random_string
 from django.utils.encoding import force_unicode, filepath_to_uri
 from django.utils.functional import LazyObject
 from django.utils.importlib import import_module
@@ -63,13 +63,12 @@ class Storage(object):
         """
         dir_name, file_name = os.path.split(name)
         file_root, file_ext = os.path.splitext(file_name)
-        # If the filename already exists, add an underscore and a number (before
-        # the file extension, if one exists) to the filename until the generated
-        # filename doesn't exist.
-        count = itertools.count(1)
+        # If the filename already exists, add an underscore and a random 7
+        # character alphanumeric string (before the file extension, if one
+        # exists) to the filename until the generated filename doesn't exist.
         while self.exists(name):
             # file_ext includes the dot.
-            name = os.path.join(dir_name, "%s_%s%s" % (file_root, count.next(), file_ext))
+            name = os.path.join(dir_name, "%s_%s%s" % (file_root, get_random_string(7), file_ext))

         return name

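Switching from a `_1, _2, ...` counter to a random 7-character suffix makes collision names unpredictable and keeps the retry cost constant even when many same-named files already exist. A stdlib-only sketch of the new behavior (the `exists` callback stands in for the storage backend's exists check; Django's `get_random_string` is replaced by `random.choice` here purely for self-containment, a real implementation would want a cryptographic source):

```python
import os
import random
import string

def get_available_name(name, exists):
    """Append _<7 random alphanumerics> before the extension until the
    candidate name is free, as in the diff above. 'exists' is a
    hypothetical predicate standing in for Storage.exists()."""
    dir_name, file_name = os.path.split(name)
    file_root, file_ext = os.path.splitext(file_name)
    alphabet = string.ascii_letters + string.digits
    while exists(name):
        suffix = ''.join(random.choice(alphabet) for _ in range(7))
        # file_ext includes the dot.
        name = os.path.join(dir_name, "%s_%s%s" % (file_root, suffix, file_ext))
    return name

taken = {"uploads/report.pdf"}
new_name = get_available_name("uploads/report.pdf", lambda n: n in taken)
assert new_name != "uploads/report.pdf"
assert new_name.startswith("uploads/report_") and new_name.endswith(".pdf")
```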
@@ -14,8 +14,6 @@ class BaseHandler(object):
     response_fixes = [
         http.fix_location_header,
         http.conditional_content_removal,
-        http.fix_IE_for_attach,
-        http.fix_IE_for_vary,
     ]

     def __init__(self):

@@ -35,4 +35,11 @@ class Command(BaseCommand):
         # a strange error -- it causes this handle() method to be called
         # multiple times.
         shutdown_message = '\nServer stopped.\nNote that the test database, %r, has not been deleted. You can explore it on your own.' % db_name
-        call_command('runserver', addrport=addrport, shutdown_message=shutdown_message, use_reloader=False, use_ipv6=options['use_ipv6'])
+        use_threading = connection.features.test_db_allows_multiple_connections
+        call_command('runserver',
+            addrport=addrport,
+            shutdown_message=shutdown_message,
+            use_reloader=False,
+            use_ipv6=options['use_ipv6'],
+            use_threading=use_threading
+        )

@@ -8,6 +8,8 @@ from django.db import models, DEFAULT_DB_ALIAS
 from django.utils.xmlutils import SimplerXMLGenerator
 from django.utils.encoding import smart_unicode
 from xml.dom import pulldom
+from xml.sax import handler
+from xml.sax.expatreader import ExpatParser as _ExpatParser

 class Serializer(base.Serializer):
     """
@@ -149,9 +151,13 @@ class Deserializer(base.Deserializer):

     def __init__(self, stream_or_string, **options):
         super(Deserializer, self).__init__(stream_or_string, **options)
-        self.event_stream = pulldom.parse(self.stream)
+        self.event_stream = pulldom.parse(self.stream, self._make_parser())
         self.db = options.pop('using', DEFAULT_DB_ALIAS)

+    def _make_parser(self):
+        """Create a hardened XML parser (no custom/external entities)."""
+        return DefusedExpatParser()
+
     def next(self):
         for event, node in self.event_stream:
             if event == "START_ELEMENT" and node.nodeName == "object":
@@ -290,3 +296,90 @@ def getInnerText(node):
         else:
             pass
     return u"".join(inner_text)
+
+
+# Below code based on Christian Heimes' defusedxml
+
+
+class DefusedExpatParser(_ExpatParser):
+    """
+    An expat parser hardened against XML bomb attacks.
+
+    Forbids DTDs, external entity references
+
+    """
+    def __init__(self, *args, **kwargs):
+        _ExpatParser.__init__(self, *args, **kwargs)
+        self.setFeature(handler.feature_external_ges, False)
+        self.setFeature(handler.feature_external_pes, False)
+
+    def start_doctype_decl(self, name, sysid, pubid, has_internal_subset):
+        raise DTDForbidden(name, sysid, pubid)
+
+    def entity_decl(self, name, is_parameter_entity, value, base,
+                    sysid, pubid, notation_name):
+        raise EntitiesForbidden(name, value, base, sysid, pubid, notation_name)
+
+    def unparsed_entity_decl(self, name, base, sysid, pubid, notation_name):
+        # expat 1.2
+        raise EntitiesForbidden(name, None, base, sysid, pubid, notation_name)
+
+    def external_entity_ref_handler(self, context, base, sysid, pubid):
+        raise ExternalReferenceForbidden(context, base, sysid, pubid)
+
+    def reset(self):
+        _ExpatParser.reset(self)
+        parser = self._parser
+        parser.StartDoctypeDeclHandler = self.start_doctype_decl
+        parser.EntityDeclHandler = self.entity_decl
+        parser.UnparsedEntityDeclHandler = self.unparsed_entity_decl
+        parser.ExternalEntityRefHandler = self.external_entity_ref_handler
+
+
+class DefusedXmlException(ValueError):
+    """Base exception."""
+    def __repr__(self):
+        return str(self)
+
+
+class DTDForbidden(DefusedXmlException):
+    """Document type definition is forbidden."""
+    def __init__(self, name, sysid, pubid):
+        super(DTDForbidden, self).__init__()
+        self.name = name
+        self.sysid = sysid
+        self.pubid = pubid
+
+    def __str__(self):
+        tpl = "DTDForbidden(name='{}', system_id={!r}, public_id={!r})"
+        return tpl.format(self.name, self.sysid, self.pubid)
+
+
+class EntitiesForbidden(DefusedXmlException):
+    """Entity definition is forbidden."""
+    def __init__(self, name, value, base, sysid, pubid, notation_name):
+        super(EntitiesForbidden, self).__init__()
+        self.name = name
+        self.value = value
+        self.base = base
+        self.sysid = sysid
+        self.pubid = pubid
+        self.notation_name = notation_name
+
+    def __str__(self):
+        tpl = "EntitiesForbidden(name='{}', system_id={!r}, public_id={!r})"
+        return tpl.format(self.name, self.sysid, self.pubid)
+
+
+class ExternalReferenceForbidden(DefusedXmlException):
+    """Resolving an external reference is forbidden."""
+    def __init__(self, context, base, sysid, pubid):
+        super(ExternalReferenceForbidden, self).__init__()
+        self.context = context
+        self.base = base
+        self.sysid = sysid
+        self.pubid = pubid
+
+    def __str__(self):
+        tpl = "ExternalReferenceForbidden(system_id='{}', public_id={})"
+        return tpl.format(self.sysid, self.pubid)

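The hardening technique above is to subclass the SAX expat reader, hook its low-level pyexpat handlers, and raise before a DTD or entity declaration can be processed, which blocks billion-laughs-style entity-expansion bombs. A condensed, runnable sketch of the same approach (only the DTD hook, with a simplified `DTDForbidden`):

```python
from xml.sax.expatreader import ExpatParser
from xml.sax.handler import ContentHandler

class DTDForbidden(ValueError):
    pass

class DefusedParser(ExpatParser):
    """Sketch of the hardening approach: install a StartDoctypeDeclHandler
    that rejects any document carrying a DTD (the vector for entity-
    expansion 'XML bomb' attacks)."""
    def reset(self):
        # reset() is where ExpatParser (re)creates the pyexpat parser,
        # so the handler must be reinstalled here, as in the diff above.
        ExpatParser.reset(self)
        self._parser.StartDoctypeDeclHandler = self._forbid_dtd

    def _forbid_dtd(self, name, sysid, pubid, has_internal_subset):
        raise DTDForbidden(name)

def parse_bytes(data):
    parser = DefusedParser()
    parser.setContentHandler(ContentHandler())
    parser.feed(data)
    parser.close()

# Plain XML without a DTD parses normally.
parse_bytes(b"<root><child/></root>")

bomb = (b'<?xml version="1.0"?>\n'
        b'<!DOCTYPE bomb [<!ENTITY a "aaaa">]>\n'
        b'<bomb>&a;</bomb>')
rejected = False
try:
    parse_bytes(bomb)
except DTDForbidden:
    rejected = True
assert rejected
```

Exceptions raised inside a pyexpat callback propagate out of `feed()`, which is what lets the parser abort before any entity expansion happens.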
@@ -138,39 +138,6 @@ class WSGIRequestHandler(simple_server.WSGIRequestHandler, object):
         self.style = color_style()
         super(WSGIRequestHandler, self).__init__(*args, **kwargs)

-    def get_environ(self):
-        env = self.server.base_environ.copy()
-        env['SERVER_PROTOCOL'] = self.request_version
-        env['REQUEST_METHOD'] = self.command
-        if '?' in self.path:
-            path,query = self.path.split('?',1)
-        else:
-            path,query = self.path,''
-
-        env['PATH_INFO'] = urllib.unquote(path)
-        env['QUERY_STRING'] = query
-        env['REMOTE_ADDR'] = self.client_address[0]
-
-        if self.headers.typeheader is None:
-            env['CONTENT_TYPE'] = self.headers.type
-        else:
-            env['CONTENT_TYPE'] = self.headers.typeheader
-
-        length = self.headers.getheader('content-length')
-        if length:
-            env['CONTENT_LENGTH'] = length
-
-        for h in self.headers.headers:
-            k,v = h.split(':',1)
-            k=k.replace('-','_').upper(); v=v.strip()
-            if k in env:
-                continue                    # skip content length, type,etc.
-            if 'HTTP_'+k in env:
-                env['HTTP_'+k] += ','+v     # comma-separate multiple headers
-            else:
-                env['HTTP_'+k] = v
-        return env
-
     def log_message(self, format, *args):
         # Don't bother logging requests for admin images or the favicon.
         if (self.path.startswith(self.admin_media_prefix)
@@ -199,6 +166,17 @@ class WSGIRequestHandler(simple_server.WSGIRequestHandler, object):

         sys.stderr.write(msg)

+    def get_environ(self):
+        # Strip all headers with underscores in the name before constructing
+        # the WSGI environ. This prevents header-spoofing based on ambiguity
+        # between underscores and dashes both normalized to underscores in WSGI
+        # env vars. Nginx and Apache 2.4+ both do this as well.
+        for k, v in self.headers.items():
+            if '_' in k:
+                del self.headers[k]
+
+        return super(WSGIRequestHandler, self).get_environ()
+

 class AdminMediaHandler(handlers.StaticFilesHandler):
     """

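The spoofing risk the new `get_environ` closes comes from CGI/WSGI header normalization: both dashes and underscores in a raw header name map to underscores in the environ key, so two distinct wire headers collide into one variable. A small demonstration of the collision and of the stripping fix (`strip_underscore_headers` is an illustrative helper, not the server's API):

```python
def wsgi_header_name(http_header):
    """CGI/WSGI normalization: dashes become underscores, so
    'X-Auth-User' and 'X-Auth_User' collapse to the same environ key."""
    return 'HTTP_' + http_header.upper().replace('-', '_')

assert wsgi_header_name('X-Auth-User') == wsgi_header_name('X-Auth_User')

def strip_underscore_headers(headers):
    """The fix from the diff above: drop any header whose raw name
    contains an underscore before building the environ, so the
    underscore variant can never shadow or augment the dashed one."""
    return {k: v for k, v in headers.items() if '_' not in k}

headers = {'X-Auth-User': 'alice', 'X-Auth_User': 'mallory'}
safe = strip_underscore_headers(headers)
assert safe == {'X-Auth-User': 'alice'}
```

Without the stripping, an attacker-supplied `X-Auth_User` could append to or replace the value an upstream proxy set in `X-Auth-User`, since both end up in `HTTP_X_AUTH_USER`.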
@@ -7,6 +7,7 @@ a string) and returns a tuple in this format:

     (view_function, function_args, function_kwargs)
 """

+import functools
 import re
 from threading import local

@@ -230,6 +231,10 @@ class RegexURLResolver(LocaleRegexProvider):
         self._reverse_dict = {}
         self._namespace_dict = {}
         self._app_dict = {}
+        # set of dotted paths to all functions and classes that are used in
+        # urlpatterns
+        self._callback_strs = set()
+        self._populated = False

     def __repr__(self):
         return smart_str(u'<%s %s (%s:%s) %s>' % (self.__class__.__name__, self.urlconf_name, self.app_name, self.namespace, self.regex.pattern))
@@ -240,6 +245,18 @@ class RegexURLResolver(LocaleRegexProvider):
         apps = {}
         language_code = get_language()
         for pattern in reversed(self.url_patterns):
+            if hasattr(pattern, '_callback_str'):
+                self._callback_strs.add(pattern._callback_str)
+            elif hasattr(pattern, '_callback'):
+                callback = pattern._callback
+                if isinstance(callback, functools.partial):
+                    callback = callback.func
+
+                if not hasattr(callback, '__name__'):
+                    lookup_str = callback.__module__ + "." + callback.__class__.__name__
+                else:
+                    lookup_str = callback.__module__ + "." + callback.__name__
+                self._callback_strs.add(lookup_str)
             p_pattern = pattern.regex.pattern
             if p_pattern.startswith('^'):
                 p_pattern = p_pattern[1:]
@@ -260,6 +277,7 @@ class RegexURLResolver(LocaleRegexProvider):
                     namespaces[namespace] = (p_pattern + prefix, sub_pattern)
                 for app_name, namespace_list in pattern.app_dict.items():
                     apps.setdefault(app_name, []).extend(namespace_list)
+                self._callback_strs.update(pattern._callback_strs)
             else:
                 bits = normalize(p_pattern)
                 lookups.appendlist(pattern.callback, (bits, p_pattern, pattern.default_args))
@@ -268,6 +286,7 @@ class RegexURLResolver(LocaleRegexProvider):
         self._reverse_dict[language_code] = lookups
         self._namespace_dict[language_code] = namespaces
         self._app_dict[language_code] = apps
+        self._populated = True

     @property
     def reverse_dict(self):
@@ -356,8 +375,13 @@ class RegexURLResolver(LocaleRegexProvider):
     def _reverse_with_prefix(self, lookup_view, _prefix, *args, **kwargs):
         if args and kwargs:
             raise ValueError("Don't mix *args and **kwargs in call to reverse()!")
+
+        if not self._populated:
+            self._populate()
+
         try:
-            lookup_view = get_callable(lookup_view, True)
+            if lookup_view in self._callback_strs:
+                lookup_view = get_callable(lookup_view, True)
         except (ImportError, AttributeError), e:
             raise NoReverseMatch("Error importing '%s': %s." % (lookup_view, e))
         possibilities = self.reverse_dict.getlist(lookup_view)
@@ -382,6 +406,8 @@ class RegexURLResolver(LocaleRegexProvider):
                     unicode_kwargs = dict([(k, force_unicode(v)) for (k, v) in kwargs.items()])
                     candidate = (prefix_norm + result) % unicode_kwargs
                 if re.search(u'^%s%s' % (_prefix, pattern), candidate, re.UNICODE):
+                    if candidate.startswith('//'):
+                        candidate = '/%%2F%s' % candidate[2:]
                     return candidate
         # lookup_view can be URL label, or dotted path, or callable, Any of
         # these can be passed in at the top, but callables are not friendly in

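The `_populate` additions above derive a dotted-path string for every view callable so that `reverse()` only imports names it has actually seen in urlpatterns. The derivation has to unwrap `functools.partial` and fall back to the class name for callables without `__name__`. A standalone sketch of that logic (`lookup_string` is an illustrative name, not the resolver's API):

```python
import functools

def lookup_string(callback):
    """Derive the dotted-path string for a view callable: unwrap
    functools.partial, then prefer __name__, falling back to the class
    name for instances (mirrors the branch added in _populate above)."""
    if isinstance(callback, functools.partial):
        callback = callback.func
    if not hasattr(callback, '__name__'):
        return callback.__module__ + "." + callback.__class__.__name__
    return callback.__module__ + "." + callback.__name__

def view(request, greeting="hi"):
    return greeting

# Plain functions and partials both resolve to the function's path.
assert lookup_string(view).endswith(".view")
assert lookup_string(functools.partial(view, greeting="yo")).endswith(".view")

class ClassView:
    """A callable instance: no __name__, so the class name is used."""
    def __call__(self, request):
        return "ok"

assert lookup_string(ClassView()).endswith(".ClassView")
```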
@@ -50,7 +50,7 @@ class URLValidator(RegexValidator):
         r'localhost|'  # localhost...
         r'\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})'  # ...or ip
         r'(?::\d+)?'  # optional port
-        r'(?:/?|[/?]\S+)$', re.IGNORECASE)
+        r'(?:/?|[/?]\S+)\Z', re.IGNORECASE)

     def __init__(self, verify_exists=False,
                  validator_user_agent=URL_VALIDATOR_USER_AGENT):
@@ -133,11 +133,16 @@ class URLValidator(RegexValidator):
             raise broken_error


+integer_validator = RegexValidator(
+    re.compile('^-?\d+\Z'),
+    message=_('Enter a valid integer.'),
+    code='invalid',
+)
+
+
 def validate_integer(value):
-    try:
-        int(value)
-    except (ValueError, TypeError):
-        raise ValidationError('')
+    return integer_validator(value)


 class EmailValidator(RegexValidator):

@@ -160,14 +165,14 @@ email_re = re.compile(
     r"(^[-!#$%&'*+/=?^_`{}|~0-9A-Z]+(\.[-!#$%&'*+/=?^_`{}|~0-9A-Z]+)*"  # dot-atom
     # quoted-string, see also http://tools.ietf.org/html/rfc2822#section-3.2.5
     r'|^"([\001-\010\013\014\016-\037!#-\[\]-\177]|\\[\001-\011\013\014\016-\177])*"'
-    r')@((?:[A-Z0-9](?:[A-Z0-9-]{0,61}[A-Z0-9])?\.)+[A-Z]{2,6}\.?$)'  # domain
-    r'|\[(25[0-5]|2[0-4]\d|[0-1]?\d?\d)(\.(25[0-5]|2[0-4]\d|[0-1]?\d?\d)){3}\]$', re.IGNORECASE)  # literal form, ipv4 address (SMTP 4.1.3)
+    r')@((?:[A-Z0-9](?:[A-Z0-9-]{0,61}[A-Z0-9])?\.)+[A-Z]{2,6}\.?\Z)'  # domain
+    r'|\[(25[0-5]|2[0-4]\d|[0-1]?\d?\d)(\.(25[0-5]|2[0-4]\d|[0-1]?\d?\d)){3}\]\Z', re.IGNORECASE)  # literal form, ipv4 address (SMTP 4.1.3)
 validate_email = EmailValidator(email_re, _(u'Enter a valid e-mail address.'), 'invalid')

-slug_re = re.compile(r'^[-\w]+$')
+slug_re = re.compile(r'^[-\w]+\Z')
 validate_slug = RegexValidator(slug_re, _(u"Enter a valid 'slug' consisting of letters, numbers, underscores or hyphens."), 'invalid')

-ipv4_re = re.compile(r'^(25[0-5]|2[0-4]\d|[0-1]?\d?\d)(\.(25[0-5]|2[0-4]\d|[0-1]?\d?\d)){3}$')
+ipv4_re = re.compile(r'^(25[0-5]|2[0-4]\d|[0-1]?\d?\d)(\.(25[0-5]|2[0-4]\d|[0-1]?\d?\d)){3}\Z')
 validate_ipv4_address = RegexValidator(ipv4_re, _(u'Enter a valid IPv4 address.'), 'invalid')

 def validate_ipv6_address(value):
@@ -205,7 +210,7 @@ def ip_address_validators(protocol, unpack_ipv4):
         raise ValueError("The protocol '%s' is unknown. Supported: %s"
                          % (protocol, ip_address_validator_map.keys()))

-comma_separated_int_list_re = re.compile('^[\d,]+$')
+comma_separated_int_list_re = re.compile('^[\d,]+\Z')
 validate_comma_separated_integer_list = RegexValidator(comma_separated_int_list_re, _(u'Enter only digits separated by commas.'), 'invalid')

@@ -249,4 +254,3 @@ class MaxLengthValidator(BaseValidator):
     clean = lambda self, x: len(x)
     message = _(u'Ensure this value has at most %(limit_value)d characters (it has %(show_value)d).')
     code = 'max_length'
-

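Every `$` anchor in these validators becomes `\Z` because, in Python regular expressions, `$` also matches just before a trailing newline, so a `$`-anchored validator quietly accepts values with a newline appended. A minimal demonstration:

```python
import re

# '$' matches at a trailing newline too, so a validator anchored with '$'
# accepts 'abc\n' -- letting a newline slip into a "validated" value.
assert re.search(r'^[-\w]+$', 'abc\n') is not None

# '\Z' only matches at the true end of the string, rejecting the newline.
assert re.search(r'^[-\w]+\Z', 'abc\n') is None
assert re.search(r'^[-\w]+\Z', 'abc') is not None
```

That smuggled newline matters when the validated value is later embedded in a context where newlines are significant, such as email or HTTP headers.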
@@ -42,8 +42,14 @@ backend = load_backend(connection.settings_dict['ENGINE'])
 # Register an event that closes the database connection
 # when a Django request is finished.
 def close_connection(**kwargs):
-    for conn in connections.all():
-        conn.close()
+    # Avoid circular imports
+    from django.db import transaction
+    for conn in connections:
+        # If an error happens here the connection will be left in broken
+        # state. Once a good db connection is again available, the
+        # connection state will be cleaned up.
+        transaction.abort(conn)
+        connections[conn].close()
 signals.request_finished.connect(close_connection)

 # Register an event that resets connection.queries

@@ -83,6 +83,17 @@ class BaseDatabaseWrapper(object):
             return
         self.cursor().execute(self.ops.savepoint_commit_sql(sid))

+    def abort(self):
+        """
+        Roll back any ongoing transaction and clean the transaction state
+        stack.
+        """
+        if self._dirty:
+            self._rollback()
+            self._dirty = False
+        while self.transaction_state:
+            self.leave_transaction_management()
+
     def enter_transaction_management(self, managed=True):
         """
         Enters transaction management for a running thread. It must be balanced with
@@ -470,6 +481,14 @@ class BaseDatabaseOperations(object):
         """
         return None

+    def bulk_batch_size(self, fields, objs):
+        """
+        Returns the maximum allowed batch size for the backend. The fields
+        are the fields going to be inserted in the batch, the objs contains
+        all the objects to be inserted.
+        """
+        return len(objs)
+
     def date_extract_sql(self, lookup_type, field_name):
         """
         Given a lookup_type of 'year', 'month' or 'day', returns the SQL that
@@ -507,6 +526,17 @@ class BaseDatabaseOperations(object):
         """
         return ''

+    def distinct_sql(self, fields):
+        """
+        Returns an SQL DISTINCT clause which removes duplicate rows from the
+        result set. If any fields are given, only the given fields are being
+        checked for duplicates.
+        """
+        if fields:
+            raise NotImplementedError('DISTINCT ON fields is not supported by this database backend')
+        else:
+            return 'DISTINCT'
+
     def drop_foreignkey_sql(self):
         """
         Returns the SQL command that drops a foreign key.
@@ -562,17 +592,6 @@ class BaseDatabaseOperations(object):
         """
         raise NotImplementedError('Full-text search is not implemented for this database backend')

-    def distinct_sql(self, fields):
-        """
-        Returns an SQL DISTINCT clause which removes duplicate rows from the
-        result set. If any fields are given, only the given fields are being
-        checked for duplicates.
-        """
-        if fields:
-            raise NotImplementedError('DISTINCT ON fields is not supported by this database backend')
-        else:
-            return 'DISTINCT'
-
     def last_executed_query(self, cursor, sql, params):
         """
         Returns a string of the query last executed by the given cursor, with
@@ -866,6 +885,12 @@ class BaseDatabaseOperations(object):
             conn = ' %s ' % connector
         return conn.join(sub_expressions)

+    def modify_insert_params(self, placeholders, params):
+        """Allow modification of insert parameters. Needed for Oracle Spatial
+        backend due to #10888.
+        """
+        return params
+
 class BaseDatabaseIntrospection(object):
     """
     This class encapsulates all backend-specific introspection utilities

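The `bulk_batch_size` hook above exists so backends with a cap on bound query parameters (SQLite's historical default is 999 variables per statement) can split a bulk insert into statements that fit. A sketch of how such a backend would compute and apply the batch size (the function names and the 999 limit are illustrative, not Django's exact implementation):

```python
def bulk_batch_size(num_fields, max_query_params=999):
    """How many objects fit in one INSERT when each object contributes
    num_fields bound parameters and the backend caps the total at
    max_query_params (999 mirrors SQLite's historical default)."""
    return max(1, max_query_params // num_fields)

def batched(objs, batch_size):
    """Yield successive slices of at most batch_size objects."""
    for i in range(0, len(objs), batch_size):
        yield objs[i:i + batch_size]

objs = list(range(2500))
size = bulk_batch_size(num_fields=3)
assert size == 333                      # 999 // 3 parameters per row
batches = list(batched(objs, size))
assert sum(len(b) for b in batches) == 2500
assert all(len(b) <= 333 for b in batches)
```

The base class returns `len(objs)` (one statement for everything); only backends with a real parameter limit override it.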
@@ -177,7 +177,7 @@ class DatabaseFeatures(BaseDatabaseFeatures):
             # will tell you the default table type of the created
             # table. Since all Django's test tables will have the same
             # table type, that's enough to evaluate the feature.
-            cursor.execute("SHOW TABLE STATUS WHERE Name='INTROSPECT_TEST'")
+            cursor.execute("SHOW TABLE STATUS LIKE 'INTROSPECT_TEST'")
             result = cursor.fetchone()
             cursor.execute('DROP TABLE INTROSPECT_TEST')
             self._storage_engine = result[1]
@@ -407,11 +407,20 @@ class DatabaseWrapper(BaseDatabaseWrapper):

     def get_server_version(self):
         if not self.server_version:
+            new_connection = False
             if not self._valid_connection():
-                self.cursor()
-            m = server_version_re.match(self.connection.get_server_info())
+                # Ensure we have a connection with the DB by using a temporary
+                # cursor
+                new_connection = True
+                self.cursor().close()
+            server_info = self.connection.get_server_info()
+            if new_connection:
+                # Make sure we close the connection
+                self.connection.close()
+                self.connection = None
+            m = server_version_re.match(server_info)
             if not m:
-                raise Exception('Unable to determine MySQL version from version string %r' % self.connection.get_server_info())
+                raise Exception('Unable to determine MySQL version from version string %r' % server_info)
             self.server_version = tuple([int(x) for x in m.groups()])
         return self.server_version

@@ -167,6 +167,7 @@ class DatabaseCreation(BaseDatabaseCreation):
                IDENTIFIED BY %(password)s
                DEFAULT TABLESPACE %(tblspace)s
                TEMPORARY TABLESPACE %(tblspace_temp)s
+               QUOTA UNLIMITED ON %(tblspace)s
            """,
         """GRANT CONNECT, RESOURCE TO %(user)s""",
     ]

@@ -72,14 +72,14 @@ class DatabaseIntrospection(BaseDatabaseIntrospection):
            FROM user_constraints, USER_CONS_COLUMNS ca, USER_CONS_COLUMNS cb,
                 user_tab_cols ta, user_tab_cols tb
            WHERE user_constraints.table_name = %s AND
-                 ta.table_name = %s AND
+                 ta.table_name = user_constraints.table_name AND
                  ta.column_name = ca.column_name AND
-                 ca.table_name = %s AND
+                 ca.table_name = ta.table_name AND
                  user_constraints.constraint_name = ca.constraint_name AND
                  user_constraints.r_constraint_name = cb.constraint_name AND
                  cb.table_name = tb.table_name AND
                  cb.column_name = tb.column_name AND
-                 ca.position = cb.position""", [table_name, table_name, table_name])
+                 ca.position = cb.position""", [table_name])

         relations = {}
         for row in cursor.fetchall():

@@ -87,36 +87,31 @@ class DatabaseIntrospection(BaseDatabaseIntrospection):
         return relations

     def get_indexes(self, cursor, table_name):
-        sql = """
-            SELECT LOWER(uic1.column_name) AS column_name,
-                CASE user_constraints.constraint_type
-                    WHEN 'P' THEN 1 ELSE 0
-                END AS is_primary_key,
-                CASE user_indexes.uniqueness
-                    WHEN 'UNIQUE' THEN 1 ELSE 0
-                END AS is_unique
-            FROM user_constraints, user_indexes, user_ind_columns uic1
-            WHERE user_constraints.constraint_type (+) = 'P'
-                AND user_constraints.index_name (+) = uic1.index_name
-                AND user_indexes.uniqueness (+) = 'UNIQUE'
-                AND user_indexes.index_name (+) = uic1.index_name
-                AND uic1.table_name = UPPER(%s)
-                AND uic1.column_position = 1
-                AND NOT EXISTS (
-                    SELECT 1
-                    FROM user_ind_columns uic2
-                    WHERE uic2.index_name = uic1.index_name
-                        AND uic2.column_position = 2
-                )
-        """
+        """
+        Returns a dictionary of fieldname -> infodict for the given table,
+        where each infodict is in the format:
+            {'primary_key': boolean representing whether it's the primary key,
+             'unique': boolean representing whether it's a unique index}
+        """
+        # This query retrieves each index on the given table, including the
+        # first associated field name
+        # "We were in the nick of time; you were in great peril!"
+        sql = """\
+            SELECT LOWER(all_tab_cols.column_name) AS column_name,
+                CASE user_constraints.constraint_type
+                    WHEN 'P' THEN 1 ELSE 0
+                END AS is_primary_key,
+                CASE user_indexes.uniqueness
+                    WHEN 'UNIQUE' THEN 1 ELSE 0
+                END AS is_unique
+            FROM all_tab_cols, user_cons_columns, user_constraints, user_ind_columns, user_indexes
+            WHERE all_tab_cols.column_name = user_cons_columns.column_name (+)
+                AND all_tab_cols.table_name = user_cons_columns.table_name (+)
+                AND user_cons_columns.constraint_name = user_constraints.constraint_name (+)
+                AND user_constraints.constraint_type (+) = 'P'
+                AND user_ind_columns.column_name (+) = all_tab_cols.column_name
+                AND user_ind_columns.table_name (+) = all_tab_cols.table_name
+                AND user_indexes.uniqueness (+) = 'UNIQUE'
+                AND user_indexes.index_name (+) = user_ind_columns.index_name
+                AND all_tab_cols.table_name = UPPER(%s)
+        """
         cursor.execute(sql, [table_name])
         indexes = {}
         for row in cursor.fetchall():
-            indexes[row[0]] = {'primary_key': row[1], 'unique': row[2]}
+            indexes[row[0]] = {'primary_key': bool(row[1]),
+                               'unique': bool(row[2])}
         return indexes

@@ -83,7 +83,7 @@ class DatabaseFeatures(BaseDatabaseFeatures):
     supports_1000_query_parameters = False
     supports_mixed_date_datetime_comparisons = False
     has_bulk_insert = True
-    can_combine_inserts_with_and_without_auto_increment_pk = True
+    can_combine_inserts_with_and_without_auto_increment_pk = False

     def _supports_stddev(self):
         """Confirm support for STDDEV and related stats functions

@@ -104,6 +104,13 @@ class DatabaseFeatures(BaseDatabaseFeatures):
         return has_support

 class DatabaseOperations(BaseDatabaseOperations):
+    def bulk_batch_size(self, fields, objs):
+        """
+        SQLite has a compile-time default (SQLITE_LIMIT_VARIABLE_NUMBER) of
+        999 variables per query.
+        """
+        return (999 // len(fields)) if len(fields) > 0 else len(objs)
+
     def date_extract_sql(self, lookup_type, field_name):
         # sqlite doesn't support extract, so we fake it with the user-defined
         # function django_extract that's registered in connect(). Note that

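The `bulk_batch_size()` added above is simple integer arithmetic: with a 999-variable limit per statement, at most `999 // len(fields)` rows fit in one INSERT. A standalone sketch of the same calculation (module-less, for illustration):

```python
def bulk_batch_size(fields, objs):
    # SQLite binds one variable per field per row, and allows at most 999
    # bound variables per statement, so at most 999 // len(fields) rows
    # can go into a single INSERT.
    return (999 // len(fields)) if len(fields) > 0 else len(objs)

# e.g. a model with 3 concrete fields can be inserted 333 rows at a time.
size = bulk_batch_size(['pk', 'name', 'email'], list(range(10)))
```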
@@ -208,7 +215,7 @@ class DatabaseOperations(BaseDatabaseOperations):
         res.append("SELECT %s" % ", ".join(
             "%%s AS %s" % self.quote_name(f.column) for f in fields
         ))
-        res.extend(["UNION SELECT %s" % ", ".join(["%s"] * len(fields))] * (num_values - 1))
+        res.extend(["UNION ALL SELECT %s" % ", ".join(["%s"] * len(fields))] * (num_values - 1))
         return " ".join(res)

 class DatabaseWrapper(BaseDatabaseWrapper):

@@ -911,6 +911,12 @@ class FilePathField(Field):
         kwargs['max_length'] = kwargs.get('max_length', 100)
         Field.__init__(self, verbose_name, name, **kwargs)

+    def get_prep_value(self, value):
+        value = super(FilePathField, self).get_prep_value(value)
+        if value is None:
+            return None
+        return smart_unicode(value)
+
     def formfield(self, **kwargs):
         defaults = {
             'path': self.path,

@@ -1010,6 +1016,12 @@ class IPAddressField(Field):
         kwargs['max_length'] = 15
         Field.__init__(self, *args, **kwargs)

+    def get_prep_value(self, value):
+        value = super(IPAddressField, self).get_prep_value(value)
+        if value is None:
+            return None
+        return smart_unicode(value)
+
     def get_internal_type(self):
         return "IPAddressField"

@@ -1023,13 +1035,14 @@ class GenericIPAddressField(Field):
     description = _("IP address")
     default_error_messages = {}

-    def __init__(self, protocol='both', unpack_ipv4=False, *args, **kwargs):
+    def __init__(self, verbose_name=None, name=None, protocol='both',
+                 unpack_ipv4=False, *args, **kwargs):
         self.unpack_ipv4 = unpack_ipv4
         self.default_validators, invalid_error_message = \
             validators.ip_address_validators(protocol, unpack_ipv4)
         self.default_error_messages['invalid'] = invalid_error_message
         kwargs['max_length'] = 39
-        Field.__init__(self, *args, **kwargs)
+        Field.__init__(self, verbose_name, name, *args, **kwargs)

     def get_internal_type(self):
         return "GenericIPAddressField"

@@ -1046,12 +1059,14 @@ class GenericIPAddressField(Field):
         return value or None

     def get_prep_value(self, value):
+        if value is None:
+            return value
         if value and ':' in value:
             try:
                 return clean_ipv6_address(value, self.unpack_ipv4)
             except exceptions.ValidationError:
                 pass
-        return value
+        return smart_unicode(value)

     def formfield(self, **kwargs):
         defaults = {'form_class': forms.GenericIPAddressField}

@@ -239,7 +239,7 @@ class SingleRelatedObjectDescriptor(object):
     def get_prefetch_query_set(self, instances):
         vals = set(instance._get_pk_val() for instance in instances)
         params = {'%s__pk__in' % self.related.field.name: vals}
-        return (self.get_query_set(instance=instances[0]),
+        return (self.get_query_set(instance=instances[0]).filter(**params),
                 attrgetter(self.related.field.attname),
                 lambda obj: obj._get_pk_val(),
                 True,

@@ -531,9 +531,33 @@ def create_many_related_manager(superclass, rel):
             self.reverse = reverse
             self.through = through
             self.prefetch_cache_name = prefetch_cache_name
-            self._pk_val = self.instance.pk
-            if self._pk_val is None:
-                raise ValueError("%r instance needs to have a primary key value before a many-to-many relationship can be used." % instance.__class__.__name__)
+            self._fk_val = self._get_fk_val(instance, source_field_name)
+            if self._fk_val is None:
+                raise ValueError('"%r" needs to have a value for field "%s" before '
+                                 'this many-to-many relationship can be used.' %
+                                 (instance, source_field_name))
+            # Even if this relation is not to pk, we require still pk value.
+            # The wish is that the instance has been already saved to DB,
+            # although having a pk value isn't a guarantee of that.
+            if instance.pk is None:
+                raise ValueError("%r instance needs to have a primary key value before "
+                                 "a many-to-many relationship can be used." %
+                                 instance.__class__.__name__)
+
+        def _get_fk_val(self, obj, field_name):
+            """
+            Returns the correct value for this relationship's foreign key. This
+            might be something else than pk value when to_field is used.
+            """
+            if not self.through:
+                # Make custom m2m fields with no through model defined usable.
+                return obj.pk
+            fk = self.through._meta.get_field(field_name)
+            if fk.rel.field_name and fk.rel.field_name != fk.rel.to._meta.pk.attname:
+                attname = fk.rel.get_related_field().get_attname()
+                return fk.get_prep_lookup('exact', getattr(obj, attname))
+            else:
+                return obj.pk

         def get_query_set(self):
             try:

@@ -635,7 +659,11 @@ def create_many_related_manager(superclass, rel):
                     if not router.allow_relation(obj, self.instance):
                         raise ValueError('Cannot add "%r": instance is on database "%s", value is on database "%s"' %
                                          (obj, self.instance._state.db, obj._state.db))
-                    new_ids.add(obj.pk)
+                    fk_val = self._get_fk_val(obj, target_field_name)
+                    if fk_val is None:
+                        raise ValueError('Cannot add "%r": the value for field "%s" is None' %
+                                         (obj, target_field_name))
+                    new_ids.add(self._get_fk_val(obj, target_field_name))
                 elif isinstance(obj, Model):
                     raise TypeError("'%s' instance expected, got %r" % (self.model._meta.object_name, obj))
                 else:

@@ -643,7 +671,7 @@ def create_many_related_manager(superclass, rel):
                 db = router.db_for_write(self.through, instance=self.instance)
                 vals = self.through._default_manager.using(db).values_list(target_field_name, flat=True)
                 vals = vals.filter(**{
-                    source_field_name: self._pk_val,
+                    source_field_name: self._fk_val,
                     '%s__in' % target_field_name: new_ids,
                 })
                 new_ids = new_ids - set(vals)

@@ -657,11 +685,12 @@ def create_many_related_manager(superclass, rel):
                 # Add the ones that aren't there already
                 self.through._default_manager.using(db).bulk_create([
                     self.through(**{
-                        '%s_id' % source_field_name: self._pk_val,
+                        '%s_id' % source_field_name: self._fk_val,
                         '%s_id' % target_field_name: obj_id,
                     })
                     for obj_id in new_ids
                 ])

                 if self.reverse or source_field_name == self.source_field_name:
                     # Don't send the signal when we are inserting the
                     # duplicate data row for symmetrical reverse entries.

@@ -680,7 +709,7 @@ def create_many_related_manager(superclass, rel):
                 old_ids = set()
                 for obj in objs:
                     if isinstance(obj, self.model):
-                        old_ids.add(obj.pk)
+                        old_ids.add(self._get_fk_val(obj, target_field_name))
                     else:
                         old_ids.add(obj)
                 # Work out what DB we're operating on

@@ -694,7 +723,7 @@ def create_many_related_manager(superclass, rel):
                         model=self.model, pk_set=old_ids, using=db)
                 # Remove the specified objects from the join table
                 self.through._default_manager.using(db).filter(**{
-                    source_field_name: self._pk_val,
+                    source_field_name: self._fk_val,
                     '%s__in' % target_field_name: old_ids
                 }).delete()
                 if self.reverse or source_field_name == self.source_field_name:

@@ -714,7 +743,7 @@ def create_many_related_manager(superclass, rel):
                     instance=self.instance, reverse=self.reverse,
                     model=self.model, pk_set=None, using=db)
                 self.through._default_manager.using(db).filter(**{
-                    source_field_name: self._pk_val
+                    source_field_name: self._fk_val
                 }).delete()
                 if self.reverse or source_field_name == self.source_field_name:
                     # Don't send the signal when we are clearing the

@@ -377,7 +377,7 @@ class QuerySet(object):
             obj.save(force_insert=True, using=self.db)
         return obj

-    def bulk_create(self, objs):
+    def bulk_create(self, objs, batch_size=None):
         """
         Inserts each of the instances into the database. This does *not* call
         save() on each of the instances, does not send any pre/post save

@@ -390,8 +390,10 @@ class QuerySet(object):
         # this could be implemented if you didn't have an autoincrement pk,
         # and 2) you could do it by doing O(n) normal inserts into the parent
         # tables to get the primary keys back, and then doing a single bulk
-        # insert into the childmost table. We're punting on these for now
-        # because they are relatively rare cases.
+        # insert into the childmost table. Some databases might allow doing
+        # this by using RETURNING clause for the insert query. We're punting
+        # on these for now because they are relatively rare cases.
+        assert batch_size is None or batch_size > 0
         if self.model._meta.parents:
             raise ValueError("Can't bulk create an inherited model")
         if not objs:

@@ -407,13 +409,14 @@ class QuerySet(object):
         try:
             if (connection.features.can_combine_inserts_with_and_without_auto_increment_pk
                 and self.model._meta.has_auto_field):
-                self.model._base_manager._insert(objs, fields=fields, using=self.db)
+                self._batched_insert(objs, fields, batch_size)
             else:
                 objs_with_pk, objs_without_pk = partition(lambda o: o.pk is None, objs)
                 if objs_with_pk:
-                    self.model._base_manager._insert(objs_with_pk, fields=fields, using=self.db)
+                    self._batched_insert(objs_with_pk, fields, batch_size)
                 if objs_without_pk:
-                    self.model._base_manager._insert(objs_without_pk, fields=[f for f in fields if not isinstance(f, AutoField)], using=self.db)
+                    fields = [f for f in fields if not isinstance(f, AutoField)]
+                    self._batched_insert(objs_without_pk, fields, batch_size)
             if forced_managed:
                 transaction.commit(using=self.db)
             else:

@@ -849,6 +852,20 @@ class QuerySet(object):
     ###################
     # PRIVATE METHODS #
     ###################
+    def _batched_insert(self, objs, fields, batch_size):
+        """
+        A little helper method for bulk_insert to insert the bulk one batch
+        at a time. Inserts recursively a batch from the front of the bulk and
+        then _batched_insert() the remaining objects again.
+        """
+        if not objs:
+            return
+        ops = connections[self.db].ops
+        batch_size = (batch_size or max(ops.bulk_batch_size(fields, objs), 1))
+        for batch in [objs[i:i+batch_size]
+                      for i in range(0, len(objs), batch_size)]:
+            self.model._base_manager._insert(batch, fields=fields,
+                                             using=self.db)

     def _clone(self, klass=None, setup=False, **kwargs):
         if klass is None:

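The slicing in `_batched_insert()` above is the standard "chop a list into consecutive batches" idiom. Isolated as a plain function (the name `batched` is ours, not Django's):

```python
def batched(objs, batch_size):
    # Consecutive slices of at most batch_size items, covering the whole
    # list; the final slice may be shorter.
    return [objs[i:i + batch_size] for i in range(0, len(objs), batch_size)]

# Seven objects with a batch size of three yield batches of 3, 3 and 1.
parts = batched(list(range(7)), 3)
```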
@@ -885,6 +885,8 @@ class SQLInsertCompiler(SQLCompiler):
                 [self.placeholder(field, v) for field, v in izip(fields, val)]
                 for val in values
             ]
+            # Oracle Spatial needs to remove some values due to #10888
+            params = self.connection.ops.modify_insert_params(placeholders, params)
             if self.return_id and self.connection.features.can_return_id_from_insert:
                 params = params[0]
                 col = "%s.%s" % (qn(opts.db_table), qn(opts.pk.column))

@@ -25,6 +25,21 @@ class TransactionManagementError(Exception):
     """
     pass

+def abort(using=None):
+    """
+    Roll back any ongoing transactions and clean the transaction management
+    state of the connection.
+
+    This method is to be used only in cases where using balanced
+    leave_transaction_management() calls isn't possible. For example after a
+    request has finished, the transaction state isn't known, yet the connection
+    must be cleaned up for the next request.
+    """
+    if using is None:
+        using = DEFAULT_DB_ALIAS
+    connection = connections[using]
+    connection.abort()
+
 def enter_transaction_management(managed=True, using=None):
     """
     Enters transaction management for a running thread. It must be balanced with

@@ -570,20 +570,10 @@ class ImageField(FileField):
             file = StringIO(data['content'])

         try:
-            # load() is the only method that can spot a truncated JPEG,
-            # but it cannot be called sanely after verify()
-            trial_image = Image.open(file)
-            trial_image.load()
-
-            # Since we're about to use the file again we have to reset the
-            # file object if possible.
-            if hasattr(file, 'reset'):
-                file.reset()
-
-            # verify() is the only method that can spot a corrupt PNG,
-            # but it must be called immediately after the constructor
-            trial_image = Image.open(file)
-            trial_image.verify()
+            # load() could spot a truncated JPEG, but it loads the entire
+            # image in memory, which is a DoS vector. See #3848 and #18520.
+            # verify() must be called immediately after the constructor.
+            Image.open(file).verify()
         except ImportError:
             # Under PyPy, it is possible to import PIL. However, the underlying
             # _imaging C module isn't available, so an ImportError will be

@@ -19,6 +19,9 @@ MAX_NUM_FORM_COUNT = 'MAX_NUM_FORMS'
 ORDERING_FIELD_NAME = 'ORDER'
 DELETION_FIELD_NAME = 'DELETE'

+# default maximum number of forms in a formset, to prevent memory exhaustion
+DEFAULT_MAX_NUM = 1000
+
 class ManagementForm(Form):
     """
     ``ManagementForm`` is used to keep track of how many form instances

@@ -111,7 +114,7 @@ class BaseFormSet(StrAndUnicode):
     def _construct_forms(self):
         # instantiate all the forms and put them in self.forms
         self.forms = []
-        for i in xrange(self.total_form_count()):
+        for i in xrange(min(self.total_form_count(), self.absolute_max)):
             self.forms.append(self._construct_form(i))

     def _construct_form(self, i, **kwargs):

@@ -360,9 +363,14 @@ class BaseFormSet(StrAndUnicode):
 def formset_factory(form, formset=BaseFormSet, extra=1, can_order=False,
                     can_delete=False, max_num=None):
     """Return a FormSet for the given form class."""
+    if max_num is None:
+        max_num = DEFAULT_MAX_NUM
+    # hard limit on forms instantiated, to prevent memory-exhaustion attacks
+    # limit defaults to DEFAULT_MAX_NUM, but developer can increase it via max_num
+    absolute_max = max(DEFAULT_MAX_NUM, max_num)
     attrs = {'form': form, 'extra': extra,
              'can_order': can_order, 'can_delete': can_delete,
-             'max_num': max_num}
+             'max_num': max_num, 'absolute_max': absolute_max}
     return type(form.__name__ + 'FormSet', (formset,), attrs)

 def all_valid(formsets):

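The interaction between `max_num` and the hard cap in the formset hunks above reduces to two lines of arithmetic; a standalone sketch (the helper name `compute_limits` is ours, for illustration):

```python
DEFAULT_MAX_NUM = 1000

def compute_limits(max_num=None):
    # max_num keeps its per-formset meaning; absolute_max is the hard cap
    # used by _construct_forms(), never lower than DEFAULT_MAX_NUM so a
    # developer can still raise it via max_num.
    if max_num is None:
        max_num = DEFAULT_MAX_NUM
    absolute_max = max(DEFAULT_MAX_NUM, max_num)
    return max_num, absolute_max
```

The number of forms actually instantiated is then `min(total_form_count, absolute_max)`, so a forged ManagementForm claiming millions of forms cannot exhaust memory.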
@@ -487,15 +487,18 @@ class TimeInput(Input):
             pass
         return super(TimeInput, self)._has_changed(self._format_value(initial), data)

+
+# Defined at module level so that CheckboxInput is picklable (#17976)
+def boolean_check(v):
+    return not (v is False or v is None or v == '')
+
+
 class CheckboxInput(Widget):
     def __init__(self, attrs=None, check_test=None):
         super(CheckboxInput, self).__init__(attrs)
         # check_test is a callable that takes a value and returns True
         # if the checkbox should be checked for that value.
-        if check_test is None:
-            self.check_test = lambda v: not (v is False or v is None or v == '')
-        else:
-            self.check_test = check_test
+        self.check_test = boolean_check if check_test is None else check_test

     def render(self, name, value, attrs=None):
         final_attrs = self.build_attrs(attrs, type='checkbox', name=name)

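The reason for moving the default `check_test` to module level, sketched in isolation: pickle serializes functions by qualified name, so a module-level `def` round-trips while a lambda stored on the instance does not.

```python
import pickle

# Module-level function, as in the patched CheckboxInput: pickle can find
# it by name, so widgets holding it are picklable.
def boolean_check(v):
    return not (v is False or v is None or v == '')

restored = pickle.loads(pickle.dumps(boolean_check))

# The old default was a lambda stored on the instance; pickle refuses it
# because '<lambda>' cannot be looked up by name.
try:
    pickle.dumps(lambda v: not (v is False or v is None or v == ''))
    lambda_picklable = True
except Exception:
    lambda_picklable = False
```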
@@ -9,7 +9,7 @@ import warnings

 from pprint import pformat
 from urllib import urlencode, quote
-from urlparse import urljoin
+from urlparse import urljoin, urlparse
 try:
     from cStringIO import StringIO
 except ImportError:

@@ -114,7 +114,7 @@ class CompatCookie(SimpleCookie):

 from django.conf import settings
 from django.core import signing
-from django.core.exceptions import ImproperlyConfigured
+from django.core.exceptions import ImproperlyConfigured, SuspiciousOperation
 from django.core.files import uploadhandler
 from django.http.multipartparser import MultiPartParser
 from django.http.utils import *

@@ -126,6 +126,8 @@ from django.utils import timezone
 RESERVED_CHARS="!*'();:@&=+$,/?%#[]"

 absolute_http_url_re = re.compile(r"^https?://", re.I)
+host_validation_re = re.compile(r"^([a-z0-9.-]+|\[[a-f0-9]*:[a-f0-9:]+\])(:\d+)?$")
+

 class Http404(Exception):
     pass

@@ -212,7 +214,13 @@ class HttpRequest(object):
             server_port = str(self.META['SERVER_PORT'])
             if server_port != (self.is_secure() and '443' or '80'):
                 host = '%s:%s' % (host, server_port)
-        return host
+
+        allowed_hosts = ['*'] if settings.DEBUG else settings.ALLOWED_HOSTS
+        if validate_host(host, allowed_hosts):
+            return host
+        else:
+            raise SuspiciousOperation(
+                "Invalid HTTP_HOST header (you may need to set ALLOWED_HOSTS): %s" % host)

     def get_full_path(self):
         # RFC 3986 requires query string arguments to be in the ASCII range.

@@ -731,20 +739,22 @@ class HttpResponse(object):
             raise Exception("This %s instance cannot tell its position" % self.__class__)
         return sum([len(str(chunk)) for chunk in self._container])

-class HttpResponseRedirect(HttpResponse):
+class HttpResponseRedirectBase(HttpResponse):
+    allowed_schemes = ['http', 'https', 'ftp']
+
+    def __init__(self, redirect_to):
+        super(HttpResponseRedirectBase, self).__init__()
+        parsed = urlparse(redirect_to)
+        if parsed.scheme and parsed.scheme not in self.allowed_schemes:
+            raise SuspiciousOperation("Unsafe redirect to URL with scheme '%s'" % parsed.scheme)
+        self['Location'] = iri_to_uri(redirect_to)
+
+class HttpResponseRedirect(HttpResponseRedirectBase):
     status_code = 302

-    def __init__(self, redirect_to):
-        super(HttpResponseRedirect, self).__init__()
-        self['Location'] = iri_to_uri(redirect_to)
-
-class HttpResponsePermanentRedirect(HttpResponse):
+class HttpResponsePermanentRedirect(HttpResponseRedirectBase):
     status_code = 301

-    def __init__(self, redirect_to):
-        super(HttpResponsePermanentRedirect, self).__init__()
-        self['Location'] = iri_to_uri(redirect_to)
-
 class HttpResponseNotModified(HttpResponse):
     status_code = 304

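The scheme check that `HttpResponseRedirectBase` introduces above is a plain `urlparse` test. A standalone sketch of the same rule (the helper name `is_safe_redirect` is ours; relative URLs have no scheme and pass through):

```python
try:
    from urllib.parse import urlparse  # Python 3
except ImportError:
    from urlparse import urlparse      # Python 2, as in the patch

ALLOWED_SCHEMES = ['http', 'https', 'ftp']

def is_safe_redirect(redirect_to):
    # Reject redirects whose explicit scheme is not on the allow list;
    # scheme-less (relative) targets are allowed.
    parsed = urlparse(redirect_to)
    return not parsed.scheme or parsed.scheme in ALLOWED_SCHEMES
```

This is what stops a `javascript:` payload from being emitted as a Location header.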
@@ -790,3 +800,43 @@ def str_to_unicode(s, encoding):
     else:
         return s

+def validate_host(host, allowed_hosts):
+    """
+    Validate the given host header value for this site.
+
+    Check that the host looks valid and matches a host or host pattern in the
+    given list of ``allowed_hosts``. Any pattern beginning with a period
+    matches a domain and all its subdomains (e.g. ``.example.com`` matches
+    ``example.com`` and any subdomain), ``*`` matches anything, and anything
+    else must match exactly.
+
+    Return ``True`` for a valid host, ``False`` otherwise.
+
+    """
+    # All validation is case-insensitive
+    host = host.lower()
+
+    # Basic sanity check
+    if not host_validation_re.match(host):
+        return False
+
+    # Validate only the domain part.
+    if host[-1] == ']':
+        # It's an IPv6 address without a port.
+        domain = host
+    else:
+        domain = host.rsplit(':', 1)[0]
+
+    for pattern in allowed_hosts:
+        pattern = pattern.lower()
+        match = (
+            pattern == '*' or
+            pattern.startswith('.') and (
+                domain.endswith(pattern) or domain == pattern[1:]
+            ) or
+            pattern == domain
+        )
+        if match:
+            return True
+
+    return False

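The pattern rules of `validate_host()` above can be exercised standalone; this condenses the function and its regex (copied from the hunks) into a self-contained script to show how the three pattern kinds behave:

```python
import re

# Copied from the patched http module above.
host_validation_re = re.compile(r"^([a-z0-9.-]+|\[[a-f0-9]*:[a-f0-9:]+\])(:\d+)?$")

def validate_host(host, allowed_hosts):
    host = host.lower()
    if not host_validation_re.match(host):
        return False
    # Strip the port; a trailing ']' means an IPv6 literal without a port.
    if host[-1] == ']':
        domain = host
    else:
        domain = host.rsplit(':', 1)[0]
    for pattern in allowed_hosts:
        pattern = pattern.lower()
        match = (
            pattern == '*' or
            pattern.startswith('.') and (
                domain.endswith(pattern) or domain == pattern[1:]
            ) or
            pattern == domain
        )
        if match:
            return True
    return False
```

A leading-dot pattern covers the bare domain and all subdomains, `*` matches anything, and everything else must match exactly.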
@@ -31,57 +31,3 @@ def conditional_content_removal(request, response):
     if request.method == 'HEAD':
         response.content = ''
     return response
-
-def fix_IE_for_attach(request, response):
-    """
-    This function will prevent Django from serving a Content-Disposition header
-    while expecting the browser to cache it (only when the browser is IE). This
-    leads to IE not allowing the client to download.
-    """
-    useragent = request.META.get('HTTP_USER_AGENT', '').upper()
-    if 'MSIE' not in useragent and 'CHROMEFRAME' not in useragent:
-        return response
-
-    offending_headers = ('no-cache', 'no-store')
-    if response.has_header('Content-Disposition'):
-        try:
-            del response['Pragma']
-        except KeyError:
-            pass
-        if response.has_header('Cache-Control'):
-            cache_control_values = [value.strip() for value in
-                    response['Cache-Control'].split(',')
-                    if value.strip().lower() not in offending_headers]
-
-            if not len(cache_control_values):
-                del response['Cache-Control']
-            else:
-                response['Cache-Control'] = ', '.join(cache_control_values)
-
-    return response
-
-def fix_IE_for_vary(request, response):
-    """
-    This function will fix the bug reported at
-    http://support.microsoft.com/kb/824847/en-us?spid=8722&sid=global
-    by clearing the Vary header whenever the mime-type is not safe
-    enough for Internet Explorer to handle. Poor thing.
-    """
-    useragent = request.META.get('HTTP_USER_AGENT', '').upper()
-    if 'MSIE' not in useragent and 'CHROMEFRAME' not in useragent:
-        return response
-
-    # These mime-types that are decreed "Vary-safe" for IE:
-    safe_mime_types = ('text/html', 'text/plain', 'text/sgml')
-
-    # The first part of the Content-Type field will be the MIME type,
-    # everything after ';', such as character-set, can be ignored.
-    mime_type = response.get('Content-Type', '').partition(';')[0]
-    if mime_type not in safe_mime_types:
-        try:
-            del response['Vary']
-        except KeyError:
-            pass
-
-    return response

@@ -50,7 +50,8 @@ More details about how the caching works:

 from django.conf import settings
 from django.core.cache import get_cache, DEFAULT_CACHE_ALIAS
-from django.utils.cache import get_cache_key, learn_cache_key, patch_response_headers, get_max_age
+from django.utils.cache import (get_cache_key, get_max_age, has_vary_header,
+    learn_cache_key, patch_response_headers)


 class UpdateCacheMiddleware(object):

@@ -93,8 +94,15 @@ class UpdateCacheMiddleware(object):
         if not self._should_update_cache(request, response):
             # We don't need to update the cache, just return.
             return response

+        if not response.status_code == 200:
+            return response
+
+        # Don't cache responses that set a user-specific (and maybe security
+        # sensitive) cookie in response to a cookie-less request.
+        if not request.COOKIES and response.cookies and has_vary_header(response, 'Cookie'):
+            return response
+
         # Try to get the timeout from the "max-age" section of the "Cache-
         # Control" header before reverting to using the default cache_timeout
         # length.

@@ -1,6 +1,6 @@
 import re

-from django.utils.text import compress_string
+from django.utils.text import compress_string, compress_sequence
 from django.utils.cache import patch_vary_headers

 re_accepts_gzip = re.compile(r'\bgzip\b')

@@ -12,8 +12,9 @@ class GZipMiddleware(object):
     on the Accept-Encoding header.
     """
     def process_response(self, request, response):
+        # The response object can tell us whether content is a string or an iterable
         # It's not worth attempting to compress really short responses.
-        if len(response.content) < 200:
+        if not response._base_content_is_iter and len(response.content) < 200:
             return response

         patch_vary_headers(response, ('Accept-Encoding',))

@@ -32,15 +33,23 @@ class GZipMiddleware(object):
         if not re_accepts_gzip.search(ae):
             return response

-        # Return the compressed content only if it's actually shorter.
-        compressed_content = compress_string(response.content)
-        if len(compressed_content) >= len(response.content):
-            return response
+        # The response object can tell us whether content is a string or an iterable
+        if response._base_content_is_iter:
+            # If the response content is iterable we don't know the length, so delete the header.
+            del response['Content-Length']
+            # Wrap the response content in a streaming gzip iterator (direct access to inner response._container)
+            response.content = compress_sequence(response._container)
+        else:
+            # Return the compressed content only if it's actually shorter.
+            compressed_content = compress_string(response.content)
+            if len(compressed_content) >= len(response.content):
+                return response
+            response.content = compressed_content
+            response['Content-Length'] = str(len(response.content))

         if response.has_header('ETag'):
             response['ETag'] = re.sub('"$', ';gzip"', response['ETag'])

-        response.content = compressed_content
         response['Content-Encoding'] = 'gzip'
-        response['Content-Length'] = str(len(response.content))

         return response

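The "only if it's actually shorter" guard above exists because gzip has fixed overhead, so tiny payloads grow. A self-contained sketch using the stdlib (a minimal stand-in for `django.utils.text.compress_string`, not Django's actual implementation):

```python
import gzip
import io

def compress_string(s):
    # Minimal stand-in for django.utils.text.compress_string: gzip the
    # whole byte string in one shot.
    buf = io.BytesIO()
    with gzip.GzipFile(mode='wb', fileobj=buf) as f:
        f.write(s)
    return buf.getvalue()

# The ~20-byte gzip header makes tiny payloads grow, which is why the
# middleware skips short responses and keeps the compressed body only
# when it is actually shorter than the original.
tiny = compress_string(b'ok')          # longer than the 2-byte input
large = compress_string(b'x' * 1000)   # far shorter than the input
```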
@ -15,6 +15,10 @@ class TransactionMiddleware(object):
|
|||
def process_exception(self, request, exception):
|
||||
"""Rolls back the database and leaves transaction management"""
|
||||
if transaction.is_dirty():
|
||||
# This rollback might fail because of network failure for example.
|
||||
# If rollback isn't possible it is impossible to clean the
|
||||
# connection's state. So leave the connection in dirty state and
|
||||
# let request_finished signal deal with cleaning the connection.
|
||||
transaction.rollback()
|
||||
transaction.leave_transaction_management()
|
||||
|
||||
|
@@ -22,6 +26,21 @@ class TransactionMiddleware(object):

        """Commits and leaves transaction management."""
        if transaction.is_managed():
            if transaction.is_dirty():
                transaction.commit()
                # Note: it is possible that the commit fails. If the reason is
                # a closed connection or some similar problem, then there is
                # little hope of proceeding nicely. However, in some cases
                # (deferred foreign key checks, for example) it is still
                # possible to rollback().
                try:
                    transaction.commit()
                except Exception:
                    # If the rollback fails, the transaction state will be
                    # messed up. It doesn't matter; the connection will be set
                    # to a clean state after the request finishes. And we can't
                    # clean the state here properly even if we wanted to: the
                    # connection is in a transaction but we can't roll back.
                    transaction.rollback()
                    transaction.leave_transaction_management()
                    raise
            transaction.leave_transaction_management()
        return response
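The commit-then-rollback-on-failure pattern this hunk introduces can be sketched with plain DB-API calls; `sqlite3` stands in here for Django's managed connection (an illustrative sketch, not Django code):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY)")
conn.execute("INSERT INTO t (id) VALUES (1)")
try:
    conn.commit()
except Exception:
    # Mirror of the middleware: if the commit fails, attempt a rollback
    # and re-raise so the caller still sees the original error.
    conn.rollback()
    raise
rows = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(rows)
```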
@@ -1,5 +1,6 @@

"""Default tags used by the template system, available to all templates."""

import os
import sys
import re
from datetime import datetime
@@ -309,6 +310,7 @@ class RegroupNode(Node):

        return ''


def include_is_allowed(filepath):
    filepath = os.path.abspath(filepath)
    for root in settings.ALLOWED_INCLUDE_ROOTS:
        if filepath.startswith(root):
            return True
@@ -63,6 +63,7 @@ real_rollback = transaction.rollback

real_enter_transaction_management = transaction.enter_transaction_management
real_leave_transaction_management = transaction.leave_transaction_management
real_managed = transaction.managed
real_abort = transaction.abort


def nop(*args, **kwargs):
    return
@@ -73,6 +74,7 @@ def disable_transaction_methods():

    transaction.enter_transaction_management = nop
    transaction.leave_transaction_management = nop
    transaction.managed = nop
    transaction.abort = nop


def restore_transaction_methods():
    transaction.commit = real_commit
@@ -80,6 +82,7 @@ def restore_transaction_methods():

    transaction.enter_transaction_management = real_enter_transaction_management
    transaction.leave_transaction_management = real_leave_transaction_management
    transaction.managed = real_managed
    transaction.abort = real_abort


def assert_and_parse_html(self, html, user_msg, msg):
@@ -1143,4 +1146,11 @@ class LiveServerTestCase(TransactionTestCase):

        if hasattr(cls, 'server_thread'):
            # Terminate the live server's thread
            cls.server_thread.join()

        # Restore sqlite connections' non-shareability
        for conn in connections.all():
            if (conn.settings_dict['ENGINE'] == 'django.db.backends.sqlite3'
                    and conn.settings_dict['NAME'] == ':memory:'):
                conn.allow_thread_sharing = False

        super(LiveServerTestCase, cls).tearDownClass()
@@ -3,6 +3,7 @@ from __future__ import with_statement

import warnings

from django.conf import settings, UserSettingsHolder
from django.core import mail
from django import http
from django.test.signals import template_rendered, setting_changed
from django.template import Template, loader, TemplateDoesNotExist
from django.template.loaders import cached
@@ -69,12 +70,18 @@ def setup_test_environment():

    - Set the email backend to the locmem email backend.
    - Setting the active locale to match the LANGUAGE_CODE setting.
    """
    Template.original_render = Template._render
    Template._original_render = Template._render
    Template._render = instrumented_test_render

    mail.original_email_backend = settings.EMAIL_BACKEND
    # Storing previous values in the settings module itself is problematic.
    # Store them in arbitrary (but related) modules instead. See #20636.

    mail._original_email_backend = settings.EMAIL_BACKEND
    settings.EMAIL_BACKEND = 'django.core.mail.backends.locmem.EmailBackend'

    http._original_allowed_hosts = settings.ALLOWED_HOSTS
    settings.ALLOWED_HOSTS = ['*']

    mail.outbox = []

    deactivate()
@@ -87,11 +94,14 @@ def teardown_test_environment():

    - Restoring the email sending functions

    """
    Template._render = Template.original_render
    del Template.original_render
    Template._render = Template._original_render
    del Template._original_render

    settings.EMAIL_BACKEND = mail.original_email_backend
    del mail.original_email_backend
    settings.EMAIL_BACKEND = mail._original_email_backend
    del mail._original_email_backend

    settings.ALLOWED_HOSTS = http._original_allowed_hosts
    del http._original_allowed_hosts

    del mail.outbox
@@ -106,21 +106,6 @@ def _long_to_bin(x, hex_format_string):

    return binascii.unhexlify(hex_format_string % x)


def _fast_hmac(key, msg, digest):
    """
    A trimmed-down version of Python's HMAC implementation.
    """
    dig1, dig2 = digest(), digest()
    if len(key) > dig1.block_size:
        key = digest(key).digest()
    key += chr(0) * (dig1.block_size - len(key))
    dig1.update(key.translate(_trans_36))
    dig1.update(msg)
    dig2.update(key.translate(_trans_5c))
    dig2.update(dig1.digest())
    return dig2


def pbkdf2(password, salt, iterations, dklen=0, digest=None):
    """
    Implements PBKDF2 as defined in RFC 2898, section 5.2
@@ -146,11 +131,21 @@ def pbkdf2(password, salt, iterations, dklen=0, digest=None):

    hex_format_string = "%%0%ix" % (hlen * 2)

    inner, outer = digest(), digest()
    if len(password) > inner.block_size:
        password = digest(password).digest()
    password += '\x00' * (inner.block_size - len(password))
    inner.update(password.translate(_trans_36))
    outer.update(password.translate(_trans_5c))

    def F(i):
        def U():
            u = salt + struct.pack('>I', i)
            for j in xrange(int(iterations)):
                u = _fast_hmac(password, u, digest).digest()
                dig1, dig2 = inner.copy(), outer.copy()
                dig1.update(u)
                dig2.update(dig1.digest())
                u = dig2.digest()
                yield _bin_to_long(u)
        return _long_to_bin(reduce(operator.xor, U()), hex_format_string)
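The generator-based implementation above targets Python 2. On modern Pythons the same primitive is exposed directly as `hashlib.pbkdf2_hmac`; a minimal sketch, checked against the RFC 6070 PBKDF2-HMAC-SHA1 test vector:

```python
import hashlib

# RFC 6070 test vector 1: P="password", S="salt", c=1, dkLen=20
dk = hashlib.pbkdf2_hmac('sha1', b'password', b'salt', 1, 20)
print(dk.hex())  # 0c60c80f961f0e71f3a9b524af6012062fe037a6
```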
@@ -19,8 +19,11 @@ class datetime(real_datetime):

    def strftime(self, fmt):
        return strftime(self, fmt)

    def combine(self, date, time):
        return datetime(date.year, date.month, date.day, time.hour, time.minute, time.microsecond, time.tzinfo)

    @classmethod
    def combine(cls, date, time):
        return cls(date.year, date.month, date.day,
                   time.hour, time.minute, time.second,
                   time.microsecond, time.tzinfo)

    def date(self):
        return date(self.year, self.month, self.day)
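The bug this hunk fixes is visible against the stdlib: the old override passed `time.microsecond` where `time.second` belonged (silently dropping seconds) and was not a classmethod. Stdlib `datetime.combine` preserves every field:

```python
from datetime import date, time, datetime

d = date(2014, 7, 15)
t = time(10, 30, 45, 123456)
dt = datetime.combine(d, t)  # classmethod: builds a full datetime
print(dt.second, dt.microsecond)
```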
@@ -183,3 +183,15 @@ try:

    codecs.lookup(DEFAULT_LOCALE_ENCODING)
except:
    DEFAULT_LOCALE_ENCODING = 'ascii'

# Forwards compatibility with Django 1.5


def python_2_unicode_compatible(klass):
    # Always use the Python 2 branch of the decorator here in Django 1.4
    klass.__unicode__ = klass.__str__
    klass.__str__ = lambda self: self.__unicode__().encode('utf-8')
    return klass

smart_text = smart_unicode
force_text = force_unicode
smart_bytes = smart_str
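For context, the full Django 1.5 decorator branches on the interpreter; a version-aware sketch (assumption: the Python 2 branch matches the backport above, the Python 3 branch is a no-op since `__str__` already returns text):

```python
import sys

def python_2_unicode_compatible(klass):
    # Python 2: move __str__ to __unicode__ and re-encode; Python 3: no-op.
    if sys.version_info[0] == 2:
        klass.__unicode__ = klass.__str__
        klass.__str__ = lambda self: self.__unicode__().encode('utf-8')
    return klass

@python_2_unicode_compatible
class Greeting(object):
    def __str__(self):
        return 'héllo'

print(str(Greeting()))
```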
@@ -1,98 +1,108 @@

import HTMLParser as _HTMLParser
import re
import sys

current_version = sys.version_info

use_workaround = (
    (current_version < (2, 7, 3)) or
    (current_version >= (3, 0) and current_version < (3, 2, 3))
)

if not use_workaround:
    HTMLParser = _HTMLParser.HTMLParser
else:
    class HTMLParser(_HTMLParser.HTMLParser):
        """
        Patched version of stdlib's HTMLParser with patch from:
        http://bugs.python.org/issue670664
        """
        def __init__(self):
            _HTMLParser.HTMLParser.__init__(self)
            self.cdata_tag = None

        def set_cdata_mode(self, tag):
            try:
                self.interesting = _HTMLParser.interesting_cdata
            except AttributeError:
                self.interesting = re.compile(r'</\s*%s\s*>' % tag.lower(), re.I)
            self.cdata_tag = tag.lower()

        def clear_cdata_mode(self):
            self.interesting = _HTMLParser.interesting_normal
            self.cdata_tag = None

        # Internal -- handle starttag, return end or -1 if not terminated
        def parse_starttag(self, i):
            self.__starttag_text = None
            endpos = self.check_for_whole_start_tag(i)
            if endpos < 0:
                return endpos
            rawdata = self.rawdata
            self.__starttag_text = rawdata[i:endpos]

            # Now parse the data between i+1 and j into a tag and attrs
            attrs = []
            match = _HTMLParser.tagfind.match(rawdata, i + 1)
            assert match, 'unexpected call to parse_starttag()'
            k = match.end()
            self.lasttag = tag = rawdata[i + 1:k].lower()

            while k < endpos:
                m = _HTMLParser.attrfind.match(rawdata, k)
                if not m:
                    break
                attrname, rest, attrvalue = m.group(1, 2, 3)
                if not rest:
                    attrvalue = None
                elif attrvalue[:1] == '\'' == attrvalue[-1:] or \
                        attrvalue[:1] == '"' == attrvalue[-1:]:
                    attrvalue = attrvalue[1:-1]
                    attrvalue = self.unescape(attrvalue)
                attrs.append((attrname.lower(), attrvalue))
                k = m.end()

            end = rawdata[k:endpos].strip()
            if end not in (">", "/>"):
                lineno, offset = self.getpos()
                if "\n" in self.__starttag_text:
                    lineno = lineno + self.__starttag_text.count("\n")
                    offset = len(self.__starttag_text) \
                        - self.__starttag_text.rfind("\n")
                else:
                    offset = offset + len(self.__starttag_text)
                self.error("junk characters in start tag: %r"
                           % (rawdata[k:endpos][:20],))
            if end.endswith('/>'):
                # XHTML-style empty tag: <span attr="value" />
                self.handle_startendtag(tag, attrs)
            else:
                self.handle_starttag(tag, attrs)
                if tag in self.CDATA_CONTENT_ELEMENTS:
                    self.set_cdata_mode(tag)  # <--------------------------- Changed
            return endpos

        # Internal -- parse endtag, return end or -1 if incomplete
        def parse_endtag(self, i):
            rawdata = self.rawdata
            assert rawdata[i:i + 2] == "</", "unexpected call to parse_endtag"
            match = _HTMLParser.endendtag.search(rawdata, i + 1)  # >
            if not match:
                return -1
            j = match.end()
            match = _HTMLParser.endtagfind.match(rawdata, i)  # </ + tag + >
            if not match:
                if self.cdata_tag is not None:  # *** add ***
                    self.handle_data(rawdata[i:j])  # *** add ***
                    return j  # *** add ***
                self.error("bad end tag: %r" % (rawdata[i:j],))
            # --- changed start ---------------------------------------------------
            tag = match.group(1).strip()
            if self.cdata_tag is not None:
                if tag.lower() != self.cdata_tag:
                    self.handle_data(rawdata[i:j])
                    return j
            # --- changed end -----------------------------------------------------
            self.handle_endtag(tag.lower())
            self.clear_cdata_mode()
            return j
@@ -4,6 +4,7 @@ import re

import sys
import urllib
import urlparse
import unicodedata
from email.utils import formatdate

from django.utils.datastructures import MultiValueDict
@@ -224,3 +225,35 @@ else:

    """
    p1, p2 = urlparse.urlparse(url1), urlparse.urlparse(url2)
    return p1[0:2] == p2[0:2]


def is_safe_url(url, host=None):
    """
    Return ``True`` if the url is a safe redirection (i.e. it doesn't point to
    a different host and uses a safe scheme).

    Always returns ``False`` on an empty url.
    """
    if url is not None:
        url = url.strip()
    if not url:
        return False
    # Chrome treats \ completely as /
    url = url.replace('\\', '/')
    # Chrome considers any URL with more than two slashes to be absolute, but
    # urlparse is not so flexible. Treat any url with three slashes as unsafe.
    if url.startswith('///'):
        return False
    url_info = urlparse.urlparse(url)
    # Forbid URLs like http:///example.com - with a scheme, but without a hostname.
    # In that URL, example.com is not the hostname but a path component. However,
    # Chrome will still consider example.com to be the hostname, so we must not
    # allow this syntax.
    if not url_info[1] and url_info[0]:
        return False
    # Forbid URLs that start with control characters. Some browsers (like
    # Chrome) ignore quite a few control characters at the start of a
    # URL and might consider the URL as scheme relative.
    if unicodedata.category(unicode(url[0]))[0] == 'C':
        return False
    return (not url_info[1] or url_info[1] == host) and \
        (not url_info[0] or url_info[0] in ['http', 'https'])
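The backported function above targets Python 2 (`urlparse`, `unicode`). An illustrative Python 3 port of the same checks (a sketch of the logic, not Django's own code), which makes the hunk easy to exercise:

```python
import unicodedata
from urllib.parse import urlparse

def is_safe_url(url, host=None):
    # Python 3 re-statement of the checks in the hunk above.
    if url is not None:
        url = url.strip()
    if not url:
        return False
    url = url.replace('\\', '/')              # Chrome treats \ as /
    if url.startswith('///'):                 # >2 leading slashes: unsafe
        return False
    url_info = urlparse(url)
    if url_info.scheme and not url_info.netloc:   # e.g. http:///example.com
        return False
    if unicodedata.category(url[0])[0] == 'C':    # leading control character
        return False
    return ((not url_info.netloc or url_info.netloc == host) and
            (not url_info.scheme or url_info.scheme in ('http', 'https')))

print(is_safe_url('/accounts/profile/', host='example.com'))
```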
@@ -117,3 +117,9 @@ def mark_for_escaping(s):

        return EscapeUnicode(s)
    return EscapeString(str(s))

# Forwards compatibility with Django 1.5

EscapeBytes = EscapeString
EscapeText = EscapeUnicode
SafeBytes = SafeString
SafeText = SafeUnicode
@@ -0,0 +1,788 @@

"""Utilities for writing code that runs on Python 2 and 3"""

# Copyright (c) 2010-2014 Benjamin Peterson
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.

from __future__ import absolute_import

import functools
import operator
import sys
import types

__author__ = "Benjamin Peterson <benjamin@python.org>"
__version__ = "1.8.0"


# Useful for very coarse version differentiation.
PY2 = sys.version_info[0] == 2
PY3 = sys.version_info[0] == 3

if PY3:
    string_types = str,
    integer_types = int,
    class_types = type,
    text_type = str
    binary_type = bytes

    MAXSIZE = sys.maxsize
else:
    string_types = basestring,
    integer_types = (int, long)
    class_types = (type, types.ClassType)
    text_type = unicode
    binary_type = str

    if sys.platform.startswith("java"):
        # Jython always uses 32 bits.
        MAXSIZE = int((1 << 31) - 1)
    else:
        # It's possible to have sizeof(long) != sizeof(Py_ssize_t).
        class X(object):
            def __len__(self):
                return 1 << 31
        try:
            len(X())
        except OverflowError:
            # 32-bit
            MAXSIZE = int((1 << 31) - 1)
        else:
            # 64-bit
            MAXSIZE = int((1 << 63) - 1)
        del X


def _add_doc(func, doc):
    """Add documentation to a function."""
    func.__doc__ = doc


def _import_module(name):
    """Import module, returning the module after the last dot."""
    __import__(name)
    return sys.modules[name]


class _LazyDescr(object):

    def __init__(self, name):
        self.name = name

    def __get__(self, obj, tp):
        result = self._resolve()
        setattr(obj, self.name, result)  # Invokes __set__.
        # This is a bit ugly, but it avoids running this again.
        delattr(obj.__class__, self.name)
        return result


class MovedModule(_LazyDescr):

    def __init__(self, name, old, new=None):
        super(MovedModule, self).__init__(name)
        if PY3:
            if new is None:
                new = name
            self.mod = new
        else:
            self.mod = old

    def _resolve(self):
        return _import_module(self.mod)

    def __getattr__(self, attr):
        _module = self._resolve()
        value = getattr(_module, attr)
        setattr(self, attr, value)
        return value


class _LazyModule(types.ModuleType):

    def __init__(self, name):
        super(_LazyModule, self).__init__(name)
        self.__doc__ = self.__class__.__doc__

    def __dir__(self):
        attrs = ["__doc__", "__name__"]
        attrs += [attr.name for attr in self._moved_attributes]
        return attrs

    # Subclasses should override this
    _moved_attributes = []


class MovedAttribute(_LazyDescr):

    def __init__(self, name, old_mod, new_mod, old_attr=None, new_attr=None):
        super(MovedAttribute, self).__init__(name)
        if PY3:
            if new_mod is None:
                new_mod = name
            self.mod = new_mod
            if new_attr is None:
                if old_attr is None:
                    new_attr = name
                else:
                    new_attr = old_attr
            self.attr = new_attr
        else:
            self.mod = old_mod
            if old_attr is None:
                old_attr = name
            self.attr = old_attr

    def _resolve(self):
        module = _import_module(self.mod)
        return getattr(module, self.attr)


class _SixMetaPathImporter(object):
    """
    A meta path importer to import six.moves and its submodules.

    This class implements a PEP302 finder and loader. It should be compatible
    with Python 2.5 and all existing versions of Python3
    """
    def __init__(self, six_module_name):
        self.name = six_module_name
        self.known_modules = {}

    def _add_module(self, mod, *fullnames):
        for fullname in fullnames:
            self.known_modules[self.name + "." + fullname] = mod

    def _get_module(self, fullname):
        return self.known_modules[self.name + "." + fullname]

    def find_module(self, fullname, path=None):
        if fullname in self.known_modules:
            return self
        return None

    def __get_module(self, fullname):
        try:
            return self.known_modules[fullname]
        except KeyError:
            raise ImportError("This loader does not know module " + fullname)

    def load_module(self, fullname):
        try:
            # in case of a reload
            return sys.modules[fullname]
        except KeyError:
            pass
        mod = self.__get_module(fullname)
        if isinstance(mod, MovedModule):
            mod = mod._resolve()
        else:
            mod.__loader__ = self
        sys.modules[fullname] = mod
        return mod

    def is_package(self, fullname):
        """
        Return true, if the named module is a package.

        We need this method to get correct spec objects with
        Python 3.4 (see PEP451)
        """
        return hasattr(self.__get_module(fullname), "__path__")

    def get_code(self, fullname):
        """Return None

        Required, if is_package is implemented"""
        self.__get_module(fullname)  # eventually raises ImportError
        return None
    get_source = get_code  # same as get_code

_importer = _SixMetaPathImporter(__name__)
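The descriptor machinery above boils down to "import lazily, from whichever location matches the running interpreter". A stripped-down, Python 3-only sketch of `MovedAttribute`'s resolution step (names simplified; not the vendored six code):

```python
import sys

def _import_module(name):
    # Same helper six uses: import by name, return the module object.
    __import__(name)
    return sys.modules[name]

class MovedAttribute(object):
    # Stripped-down sketch keeping only the Python 3 ("new") location.
    def __init__(self, name, new_mod, new_attr=None):
        self.mod = new_mod
        self.attr = new_attr or name

    def _resolve(self):
        return getattr(_import_module(self.mod), self.attr)

# "reduce" moved from builtins into functools on Python 3.
reduce_ = MovedAttribute("reduce", "functools")._resolve()
print(reduce_(lambda a, b: a + b, [1, 2, 3, 4]))
```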
class _MovedItems(_LazyModule):
    """Lazy loading of moved objects"""
    __path__ = []  # mark as package


_moved_attributes = [
    MovedAttribute("cStringIO", "cStringIO", "io", "StringIO"),
    MovedAttribute("filter", "itertools", "builtins", "ifilter", "filter"),
    MovedAttribute("filterfalse", "itertools", "itertools", "ifilterfalse", "filterfalse"),
    MovedAttribute("input", "__builtin__", "builtins", "raw_input", "input"),
    MovedAttribute("intern", "__builtin__", "sys"),
    MovedAttribute("map", "itertools", "builtins", "imap", "map"),
    MovedAttribute("range", "__builtin__", "builtins", "xrange", "range"),
    MovedAttribute("reload_module", "__builtin__", "imp", "reload"),
    MovedAttribute("reduce", "__builtin__", "functools"),
    MovedAttribute("shlex_quote", "pipes", "shlex", "quote"),
    MovedAttribute("StringIO", "StringIO", "io"),
    MovedAttribute("UserDict", "UserDict", "collections"),
    MovedAttribute("UserList", "UserList", "collections"),
    MovedAttribute("UserString", "UserString", "collections"),
    MovedAttribute("xrange", "__builtin__", "builtins", "xrange", "range"),
    MovedAttribute("zip", "itertools", "builtins", "izip", "zip"),
    MovedAttribute("zip_longest", "itertools", "itertools", "izip_longest", "zip_longest"),

    MovedModule("builtins", "__builtin__"),
    MovedModule("configparser", "ConfigParser"),
    MovedModule("copyreg", "copy_reg"),
    MovedModule("dbm_gnu", "gdbm", "dbm.gnu"),
    MovedModule("_dummy_thread", "dummy_thread", "_dummy_thread"),
    MovedModule("http_cookiejar", "cookielib", "http.cookiejar"),
    MovedModule("http_cookies", "Cookie", "http.cookies"),
    MovedModule("html_entities", "htmlentitydefs", "html.entities"),
    MovedModule("html_parser", "HTMLParser", "html.parser"),
    MovedModule("http_client", "httplib", "http.client"),
    MovedModule("email_mime_multipart", "email.MIMEMultipart", "email.mime.multipart"),
    MovedModule("email_mime_nonmultipart", "email.MIMENonMultipart", "email.mime.nonmultipart"),
    MovedModule("email_mime_text", "email.MIMEText", "email.mime.text"),
    MovedModule("email_mime_base", "email.MIMEBase", "email.mime.base"),
    MovedModule("BaseHTTPServer", "BaseHTTPServer", "http.server"),
    MovedModule("CGIHTTPServer", "CGIHTTPServer", "http.server"),
    MovedModule("SimpleHTTPServer", "SimpleHTTPServer", "http.server"),
    MovedModule("cPickle", "cPickle", "pickle"),
    MovedModule("queue", "Queue"),
    MovedModule("reprlib", "repr"),
    MovedModule("socketserver", "SocketServer"),
    MovedModule("_thread", "thread", "_thread"),
    MovedModule("tkinter", "Tkinter"),
    MovedModule("tkinter_dialog", "Dialog", "tkinter.dialog"),
    MovedModule("tkinter_filedialog", "FileDialog", "tkinter.filedialog"),
    MovedModule("tkinter_scrolledtext", "ScrolledText", "tkinter.scrolledtext"),
    MovedModule("tkinter_simpledialog", "SimpleDialog", "tkinter.simpledialog"),
    MovedModule("tkinter_tix", "Tix", "tkinter.tix"),
    MovedModule("tkinter_ttk", "ttk", "tkinter.ttk"),
    MovedModule("tkinter_constants", "Tkconstants", "tkinter.constants"),
    MovedModule("tkinter_dnd", "Tkdnd", "tkinter.dnd"),
    MovedModule("tkinter_colorchooser", "tkColorChooser",
                "tkinter.colorchooser"),
    MovedModule("tkinter_commondialog", "tkCommonDialog",
                "tkinter.commondialog"),
    MovedModule("tkinter_tkfiledialog", "tkFileDialog", "tkinter.filedialog"),
    MovedModule("tkinter_font", "tkFont", "tkinter.font"),
    MovedModule("tkinter_messagebox", "tkMessageBox", "tkinter.messagebox"),
    MovedModule("tkinter_tksimpledialog", "tkSimpleDialog",
                "tkinter.simpledialog"),
    MovedModule("urllib_parse", __name__ + ".moves.urllib_parse", "urllib.parse"),
    MovedModule("urllib_error", __name__ + ".moves.urllib_error", "urllib.error"),
    MovedModule("urllib", __name__ + ".moves.urllib", __name__ + ".moves.urllib"),
    MovedModule("urllib_robotparser", "robotparser", "urllib.robotparser"),
    MovedModule("xmlrpc_client", "xmlrpclib", "xmlrpc.client"),
    MovedModule("xmlrpc_server", "SimpleXMLRPCServer", "xmlrpc.server"),
    MovedModule("winreg", "_winreg"),
]
for attr in _moved_attributes:
    setattr(_MovedItems, attr.name, attr)
    if isinstance(attr, MovedModule):
        _importer._add_module(attr, "moves." + attr.name)
del attr

_MovedItems._moved_attributes = _moved_attributes

moves = _MovedItems(__name__ + ".moves")
_importer._add_module(moves, "moves")


class Module_six_moves_urllib_parse(_LazyModule):
    """Lazy loading of moved objects in six.moves.urllib_parse"""


_urllib_parse_moved_attributes = [
    MovedAttribute("ParseResult", "urlparse", "urllib.parse"),
    MovedAttribute("SplitResult", "urlparse", "urllib.parse"),
    MovedAttribute("parse_qs", "urlparse", "urllib.parse"),
    MovedAttribute("parse_qsl", "urlparse", "urllib.parse"),
    MovedAttribute("urldefrag", "urlparse", "urllib.parse"),
    MovedAttribute("urljoin", "urlparse", "urllib.parse"),
    MovedAttribute("urlparse", "urlparse", "urllib.parse"),
    MovedAttribute("urlsplit", "urlparse", "urllib.parse"),
    MovedAttribute("urlunparse", "urlparse", "urllib.parse"),
    MovedAttribute("urlunsplit", "urlparse", "urllib.parse"),
    MovedAttribute("quote", "urllib", "urllib.parse"),
    MovedAttribute("quote_plus", "urllib", "urllib.parse"),
    MovedAttribute("unquote", "urllib", "urllib.parse"),
    MovedAttribute("unquote_plus", "urllib", "urllib.parse"),
    MovedAttribute("urlencode", "urllib", "urllib.parse"),
    MovedAttribute("splitquery", "urllib", "urllib.parse"),
    MovedAttribute("splittag", "urllib", "urllib.parse"),
    MovedAttribute("splituser", "urllib", "urllib.parse"),
    MovedAttribute("uses_fragment", "urlparse", "urllib.parse"),
    MovedAttribute("uses_netloc", "urlparse", "urllib.parse"),
    MovedAttribute("uses_params", "urlparse", "urllib.parse"),
    MovedAttribute("uses_query", "urlparse", "urllib.parse"),
    MovedAttribute("uses_relative", "urlparse", "urllib.parse"),
]
for attr in _urllib_parse_moved_attributes:
    setattr(Module_six_moves_urllib_parse, attr.name, attr)
del attr

Module_six_moves_urllib_parse._moved_attributes = _urllib_parse_moved_attributes

_importer._add_module(Module_six_moves_urllib_parse(__name__ + ".moves.urllib_parse"),
                      "moves.urllib_parse", "moves.urllib.parse")


class Module_six_moves_urllib_error(_LazyModule):
    """Lazy loading of moved objects in six.moves.urllib_error"""


_urllib_error_moved_attributes = [
    MovedAttribute("URLError", "urllib2", "urllib.error"),
    MovedAttribute("HTTPError", "urllib2", "urllib.error"),
    MovedAttribute("ContentTooShortError", "urllib", "urllib.error"),
]
for attr in _urllib_error_moved_attributes:
    setattr(Module_six_moves_urllib_error, attr.name, attr)
del attr

Module_six_moves_urllib_error._moved_attributes = _urllib_error_moved_attributes

_importer._add_module(Module_six_moves_urllib_error(__name__ + ".moves.urllib.error"),
                      "moves.urllib_error", "moves.urllib.error")


class Module_six_moves_urllib_request(_LazyModule):
    """Lazy loading of moved objects in six.moves.urllib_request"""


_urllib_request_moved_attributes = [
    MovedAttribute("urlopen", "urllib2", "urllib.request"),
    MovedAttribute("install_opener", "urllib2", "urllib.request"),
    MovedAttribute("build_opener", "urllib2", "urllib.request"),
    MovedAttribute("pathname2url", "urllib", "urllib.request"),
    MovedAttribute("url2pathname", "urllib", "urllib.request"),
    MovedAttribute("getproxies", "urllib", "urllib.request"),
    MovedAttribute("Request", "urllib2", "urllib.request"),
    MovedAttribute("OpenerDirector", "urllib2", "urllib.request"),
    MovedAttribute("HTTPDefaultErrorHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPRedirectHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPCookieProcessor", "urllib2", "urllib.request"),
    MovedAttribute("ProxyHandler", "urllib2", "urllib.request"),
    MovedAttribute("BaseHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPPasswordMgr", "urllib2", "urllib.request"),
    MovedAttribute("HTTPPasswordMgrWithDefaultRealm", "urllib2", "urllib.request"),
    MovedAttribute("AbstractBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("ProxyBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("AbstractDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("ProxyDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPSHandler", "urllib2", "urllib.request"),
    MovedAttribute("FileHandler", "urllib2", "urllib.request"),
    MovedAttribute("FTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("CacheFTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("UnknownHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPErrorProcessor", "urllib2", "urllib.request"),
    MovedAttribute("urlretrieve", "urllib", "urllib.request"),
    MovedAttribute("urlcleanup", "urllib", "urllib.request"),
    MovedAttribute("URLopener", "urllib", "urllib.request"),
    MovedAttribute("FancyURLopener", "urllib", "urllib.request"),
    MovedAttribute("proxy_bypass", "urllib", "urllib.request"),
]
for attr in _urllib_request_moved_attributes:
    setattr(Module_six_moves_urllib_request, attr.name, attr)
del attr

Module_six_moves_urllib_request._moved_attributes = _urllib_request_moved_attributes

_importer._add_module(Module_six_moves_urllib_request(__name__ + ".moves.urllib.request"),
                      "moves.urllib_request", "moves.urllib.request")


class Module_six_moves_urllib_response(_LazyModule):
    """Lazy loading of moved objects in six.moves.urllib_response"""


_urllib_response_moved_attributes = [
    MovedAttribute("addbase", "urllib", "urllib.response"),
    MovedAttribute("addclosehook", "urllib", "urllib.response"),
    MovedAttribute("addinfo", "urllib", "urllib.response"),
    MovedAttribute("addinfourl", "urllib", "urllib.response"),
]
|
||||
for attr in _urllib_response_moved_attributes:
|
||||
setattr(Module_six_moves_urllib_response, attr.name, attr)
|
||||
del attr
|
||||
|
||||
Module_six_moves_urllib_response._moved_attributes = _urllib_response_moved_attributes
|
||||
|
||||
_importer._add_module(Module_six_moves_urllib_response(__name__ + ".moves.urllib.response"),
|
||||
"moves.urllib_response", "moves.urllib.response")
|
||||
|
||||
|
||||
class Module_six_moves_urllib_robotparser(_LazyModule):
|
||||
"""Lazy loading of moved objects in six.moves.urllib_robotparser"""
|
||||
|
||||
|
||||
_urllib_robotparser_moved_attributes = [
|
||||
MovedAttribute("RobotFileParser", "robotparser", "urllib.robotparser"),
|
||||
]
|
||||
for attr in _urllib_robotparser_moved_attributes:
|
||||
setattr(Module_six_moves_urllib_robotparser, attr.name, attr)
|
||||
del attr
|
||||
|
||||
Module_six_moves_urllib_robotparser._moved_attributes = _urllib_robotparser_moved_attributes
|
||||
|
||||
_importer._add_module(Module_six_moves_urllib_robotparser(__name__ + ".moves.urllib.robotparser"),
|
||||
"moves.urllib_robotparser", "moves.urllib.robotparser")
|
||||
|
||||
|
||||
class Module_six_moves_urllib(types.ModuleType):
|
||||
"""Create a six.moves.urllib namespace that resembles the Python 3 namespace"""
|
||||
__path__ = [] # mark as package
|
||||
parse = _importer._get_module("moves.urllib_parse")
|
||||
error = _importer._get_module("moves.urllib_error")
|
||||
request = _importer._get_module("moves.urllib_request")
|
||||
response = _importer._get_module("moves.urllib_response")
|
||||
robotparser = _importer._get_module("moves.urllib_robotparser")
|
||||
|
||||
def __dir__(self):
|
||||
return ['parse', 'error', 'request', 'response', 'robotparser']
|
||||
|
||||
_importer._add_module(Module_six_moves_urllib(__name__ + ".moves.urllib"),
|
||||
"moves.urllib")
|
||||
|
||||
|
||||
def add_move(move):
|
||||
"""Add an item to six.moves."""
|
||||
setattr(_MovedItems, move.name, move)
|
||||
|
||||
|
||||
def remove_move(name):
|
||||
"""Remove item from six.moves."""
|
||||
try:
|
||||
delattr(_MovedItems, name)
|
||||
except AttributeError:
|
||||
try:
|
||||
del moves.__dict__[name]
|
||||
except KeyError:
|
||||
raise AttributeError("no such move, %r" % (name,))
|
||||
|
||||
|
||||
if PY3:
|
||||
_meth_func = "__func__"
|
||||
_meth_self = "__self__"
|
||||
|
||||
_func_closure = "__closure__"
|
||||
_func_code = "__code__"
|
||||
_func_defaults = "__defaults__"
|
||||
_func_globals = "__globals__"
|
||||
else:
|
||||
_meth_func = "im_func"
|
||||
_meth_self = "im_self"
|
||||
|
||||
_func_closure = "func_closure"
|
||||
_func_code = "func_code"
|
||||
_func_defaults = "func_defaults"
|
||||
_func_globals = "func_globals"
|
||||
|
||||
|
||||
try:
|
||||
advance_iterator = next
|
||||
except NameError:
|
||||
def advance_iterator(it):
|
||||
return it.next()
|
||||
next = advance_iterator
|
||||
|
||||
|
||||
try:
|
||||
callable = callable
|
||||
except NameError:
|
||||
def callable(obj):
|
||||
return any("__call__" in klass.__dict__ for klass in type(obj).__mro__)
|
||||
|
||||
|
||||
if PY3:
|
||||
def get_unbound_function(unbound):
|
||||
return unbound
|
||||
|
||||
create_bound_method = types.MethodType
|
||||
|
||||
Iterator = object
|
||||
else:
|
||||
def get_unbound_function(unbound):
|
||||
return unbound.im_func
|
||||
|
||||
def create_bound_method(func, obj):
|
||||
return types.MethodType(func, obj, obj.__class__)
|
||||
|
||||
class Iterator(object):
|
||||
|
||||
def next(self):
|
||||
return type(self).__next__(self)
|
||||
|
||||
callable = callable
|
||||
_add_doc(get_unbound_function,
|
||||
"""Get the function out of a possibly unbound function""")
|
||||
|
||||
|
||||
get_method_function = operator.attrgetter(_meth_func)
|
||||
get_method_self = operator.attrgetter(_meth_self)
|
||||
get_function_closure = operator.attrgetter(_func_closure)
|
||||
get_function_code = operator.attrgetter(_func_code)
|
||||
get_function_defaults = operator.attrgetter(_func_defaults)
|
||||
get_function_globals = operator.attrgetter(_func_globals)
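These attrgetter helpers are thin wrappers over whichever attribute names the Python version uses; a standalone sketch of the Python 3 branch (plain `operator.attrgetter`, no six required):

```python
import operator

# Python 3 attribute names, as selected by the PY3 branch above.
get_function_code = operator.attrgetter("__code__")
get_function_defaults = operator.attrgetter("__defaults__")

def greet(name, punctuation="!"):
    return "Hello, " + name + punctuation

# attrgetter simply reads the named attribute off the function object.
code = get_function_code(greet)
defaults = get_function_defaults(greet)
print(code.co_name)  # 'greet'
print(defaults)      # ('!',)
```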
if PY3:
    def iterkeys(d, **kw):
        return iter(d.keys(**kw))

    def itervalues(d, **kw):
        return iter(d.values(**kw))

    def iteritems(d, **kw):
        return iter(d.items(**kw))

    def iterlists(d, **kw):
        return iter(d.lists(**kw))
else:
    def iterkeys(d, **kw):
        return iter(d.iterkeys(**kw))

    def itervalues(d, **kw):
        return iter(d.itervalues(**kw))

    def iteritems(d, **kw):
        return iter(d.iteritems(**kw))

    def iterlists(d, **kw):
        return iter(d.iterlists(**kw))

_add_doc(iterkeys, "Return an iterator over the keys of a dictionary.")
_add_doc(itervalues, "Return an iterator over the values of a dictionary.")
_add_doc(iteritems,
         "Return an iterator over the (key, value) pairs of a dictionary.")
_add_doc(iterlists,
         "Return an iterator over the (key, [values]) pairs of a dictionary.")
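A standalone sketch of the PY3 branch above: the wrappers just defer to the dict's own view methods and wrap them in `iter()` so callers always get a true iterator on either Python version:

```python
def iterkeys(d, **kw):
    return iter(d.keys(**kw))

def iteritems(d, **kw):
    return iter(d.items(**kw))

d = {"a": 1, "b": 2}
items = iteritems(d)   # a true iterator, not a list
first = next(items)    # can be consumed lazily
print(sorted(iteritems(d)))  # [('a', 1), ('b', 2)]
```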
if PY3:
    def b(s):
        return s.encode("latin-1")
    def u(s):
        return s
    unichr = chr
    if sys.version_info[1] <= 1:
        def int2byte(i):
            return bytes((i,))
    else:
        # This is about 2x faster than the implementation above on 3.2+
        int2byte = operator.methodcaller("to_bytes", 1, "big")
    byte2int = operator.itemgetter(0)
    indexbytes = operator.getitem
    iterbytes = iter
    import io
    StringIO = io.StringIO
    BytesIO = io.BytesIO
else:
    def b(s):
        return s
    # Workaround for standalone backslash
    def u(s):
        return unicode(s.replace(r'\\', r'\\\\'), "unicode_escape")
    unichr = unichr
    int2byte = chr
    def byte2int(bs):
        return ord(bs[0])
    def indexbytes(buf, i):
        return ord(buf[i])
    def iterbytes(buf):
        return (ord(byte) for byte in buf)
    import StringIO
    StringIO = BytesIO = StringIO.StringIO
_add_doc(b, """Byte literal""")
_add_doc(u, """Text literal""")
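The byte-indexing helpers in the PY3 branch are plain `operator` machinery; a standalone sketch of the 3.2+ fast path:

```python
import operator

# int.to_bytes(1, "big") via methodcaller, as in the fast path above.
int2byte = operator.methodcaller("to_bytes", 1, "big")
# Indexing bytes on Python 3 already yields ints.
byte2int = operator.itemgetter(0)
indexbytes = operator.getitem

print(int2byte(65))           # b'A'
print(byte2int(b"ABC"))       # 65
print(indexbytes(b"ABC", 2))  # 67
```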
if PY3:
    exec_ = getattr(moves.builtins, "exec")

    def reraise(tp, value, tb=None):
        if value is None:
            value = tp()
        if value.__traceback__ is not tb:
            raise value.with_traceback(tb)
        raise value

else:
    def exec_(_code_, _globs_=None, _locs_=None):
        """Execute code in a namespace."""
        if _globs_ is None:
            frame = sys._getframe(1)
            _globs_ = frame.f_globals
            if _locs_ is None:
                _locs_ = frame.f_locals
            del frame
        elif _locs_ is None:
            _locs_ = _globs_
        exec("""exec _code_ in _globs_, _locs_""")

    exec_("""def reraise(tp, value, tb=None):
    raise tp, value, tb
""")


print_ = getattr(moves.builtins, "print", None)
if print_ is None:
    def print_(*args, **kwargs):
        """The new-style print function for Python 2.4 and 2.5."""
        fp = kwargs.pop("file", sys.stdout)
        if fp is None:
            return
        def write(data):
            if not isinstance(data, basestring):
                data = str(data)
            # If the file has an encoding, encode unicode with it.
            if (isinstance(fp, file) and
                    isinstance(data, unicode) and
                    fp.encoding is not None):
                errors = getattr(fp, "errors", None)
                if errors is None:
                    errors = "strict"
                data = data.encode(fp.encoding, errors)
            fp.write(data)
        want_unicode = False
        sep = kwargs.pop("sep", None)
        if sep is not None:
            if isinstance(sep, unicode):
                want_unicode = True
            elif not isinstance(sep, str):
                raise TypeError("sep must be None or a string")
        end = kwargs.pop("end", None)
        if end is not None:
            if isinstance(end, unicode):
                want_unicode = True
            elif not isinstance(end, str):
                raise TypeError("end must be None or a string")
        if kwargs:
            raise TypeError("invalid keyword arguments to print()")
        if not want_unicode:
            for arg in args:
                if isinstance(arg, unicode):
                    want_unicode = True
                    break
        if want_unicode:
            newline = unicode("\n")
            space = unicode(" ")
        else:
            newline = "\n"
            space = " "
        if sep is None:
            sep = space
        if end is None:
            end = newline
        for i, arg in enumerate(args):
            if i:
                write(sep)
            write(arg)
        write(end)

_add_doc(reraise, """Reraise an exception.""")
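The PY3 branch of `reraise` can be exercised on its own; this sketch re-raises a captured `sys.exc_info()` triple and checks that the original exception instance (and its traceback) survives:

```python
import sys

# Python 3 branch of reraise, as defined above.
def reraise(tp, value, tb=None):
    if value is None:
        value = tp()
    if value.__traceback__ is not tb:
        raise value.with_traceback(tb)
    raise value

def fail():
    raise ValueError("boom")

try:
    fail()
except ValueError:
    exc_info = sys.exc_info()

# Later, re-raise with the original traceback attached.
try:
    reraise(*exc_info)
except ValueError as exc:
    caught = exc

print(caught is exc_info[1])  # True: the very same exception instance
```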
if sys.version_info[0:2] < (3, 4):
    def wraps(wrapped, assigned=functools.WRAPPER_ASSIGNMENTS,
              updated=functools.WRAPPER_UPDATES):
        def wrapper(f):
            f = functools.wraps(wrapped)(f)
            f.__wrapped__ = wrapped
            return f
        return wrapper
else:
    wraps = functools.wraps
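The pre-3.4 backport above simply layers an explicit `__wrapped__` attribute on top of `functools.wraps` (Python 3.4+ sets it itself); a standalone sketch:

```python
import functools

# The backport shown above.
def wraps(wrapped, assigned=functools.WRAPPER_ASSIGNMENTS,
          updated=functools.WRAPPER_UPDATES):
    def wrapper(f):
        f = functools.wraps(wrapped)(f)
        f.__wrapped__ = wrapped
        return f
    return wrapper

def original(x):
    """Docstring survives wrapping."""
    return x + 1

@wraps(original)
def decorated(x):
    return original(x) * 2

print(decorated.__name__)                 # 'original'
print(decorated.__wrapped__ is original)  # True
```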
def with_metaclass(meta, *bases):
    """Create a base class with a metaclass."""
    # This requires a bit of explanation: the basic idea is to make a dummy
    # metaclass for one level of class instantiation that replaces itself with
    # the actual metaclass.
    class metaclass(meta):
        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)
    return type.__new__(metaclass, 'temporary_class', (), {})
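`with_metaclass` can be exercised standalone on Python 3: the dummy metaclass exists for exactly one level of class creation, and the real class is then built by `meta` with the intended bases. `Meta` and `Base` here are hypothetical names for illustration:

```python
# The helper above, verbatim.
def with_metaclass(meta, *bases):
    class metaclass(meta):
        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)
    return type.__new__(metaclass, 'temporary_class', (), {})

class Meta(type):
    def __new__(mcs, name, bases, d):
        d["tagged"] = True  # the metaclass visibly ran
        return super(Meta, mcs).__new__(mcs, name, bases, d)

class Base(object):
    pass

class MyClass(with_metaclass(Meta, Base)):
    pass

print(type(MyClass) is Meta)    # True
print(MyClass.tagged)           # True
print(Base in MyClass.__mro__)  # True; temporary_class is gone from the MRO
```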
def add_metaclass(metaclass):
    """Class decorator for creating a class with a metaclass."""
    def wrapper(cls):
        orig_vars = cls.__dict__.copy()
        slots = orig_vars.get('__slots__')
        if slots is not None:
            if isinstance(slots, str):
                slots = [slots]
            for slots_var in slots:
                orig_vars.pop(slots_var)
        orig_vars.pop('__dict__', None)
        orig_vars.pop('__weakref__', None)
        return metaclass(cls.__name__, cls.__bases__, orig_vars)
    return wrapper
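`add_metaclass` re-creates the decorated class under the given metaclass (stripping `__dict__`/`__weakref__` so the new class can define its own); a standalone sketch with a hypothetical registering metaclass:

```python
# The decorator above, verbatim.
def add_metaclass(metaclass):
    def wrapper(cls):
        orig_vars = cls.__dict__.copy()
        slots = orig_vars.get('__slots__')
        if slots is not None:
            if isinstance(slots, str):
                slots = [slots]
            for slots_var in slots:
                orig_vars.pop(slots_var)
        orig_vars.pop('__dict__', None)
        orig_vars.pop('__weakref__', None)
        return metaclass(cls.__name__, cls.__bases__, orig_vars)
    return wrapper

class Registry(type):
    classes = []
    def __new__(mcs, name, bases, d):
        new_cls = super(Registry, mcs).__new__(mcs, name, bases, d)
        Registry.classes.append(name)
        return new_cls

@add_metaclass(Registry)
class Widget(object):
    size = 3

print(type(Widget) is Registry)  # True
print(Widget.size)               # 3; class body attributes are preserved
print(Registry.classes)          # ['Widget']
```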
# Complete the moves implementation.
# This code is at the end of this module to speed up module loading.
# Turn this module into a package.
__path__ = []  # required for PEP 302 and PEP 451
__package__ = __name__  # see PEP 366 @ReservedAssignment
if globals().get("__spec__") is not None:
    __spec__.submodule_search_locations = []  # PEP 451 @UndefinedVariable
# Remove other six meta path importers, since they cause problems. This can
# happen if six is removed from sys.modules and then reloaded. (Setuptools does
# this for some reason.)
if sys.meta_path:
    for i, importer in enumerate(sys.meta_path):
        # Here's some real nastiness: Another "instance" of the six module might
        # be floating around. Therefore, we can't use isinstance() to check for
        # the six meta path importer, since the other six instance will have
        # inserted an importer with different class.
        if (type(importer).__name__ == "_SixMetaPathImporter" and
                importer.name == __name__):
            del sys.meta_path[i]
            break
    del i, importer
# Finally, add the importer to the meta path import hook.
sys.meta_path.append(_importer)


### Additional customizations for Django ###

if PY3:
    _assertRaisesRegex = "assertRaisesRegex"
    _assertRegex = "assertRegex"
    memoryview = memoryview
else:
    _assertRaisesRegex = "assertRaisesRegexp"
    _assertRegex = "assertRegexpMatches"
    # memoryview and buffer are not strictly equivalent, but should be fine for
    # django core usage (mainly BinaryField). However, Jython doesn't support
    # buffer (see http://bugs.jython.org/issue1521), so we have to be careful.
    if sys.platform.startswith('java'):
        memoryview = memoryview
    else:
        memoryview = buffer


def assertRaisesRegex(self, *args, **kwargs):
    return getattr(self, _assertRaisesRegex)(*args, **kwargs)


def assertRegex(self, *args, **kwargs):
    return getattr(self, _assertRegex)(*args, **kwargs)
@@ -286,6 +286,39 @@ def compress_string(s):

 ustring_re = re.compile(u"([\u0080-\uffff])")

+# Backported from django 1.5
+class StreamingBuffer(object):
+    def __init__(self):
+        self.vals = []
+
+    def write(self, val):
+        self.vals.append(val)
+
+    def read(self):
+        ret = ''.join(self.vals)
+        self.vals = []
+        return ret
+
+    def flush(self):
+        return
+
+    def close(self):
+        return
+
+# Backported from django 1.5
+# Like compress_string, but for iterators of strings.
+def compress_sequence(sequence):
+    buf = StreamingBuffer()
+    zfile = GzipFile(mode='wb', compresslevel=6, fileobj=buf)
+    # Output headers...
+    yield buf.read()
+    for item in sequence:
+        zfile.write(item)
+        zfile.flush()
+        yield buf.read()
+    zfile.close()
+    yield buf.read()
+
 def javascript_quote(s, quote_double_quotes=False):

     def fix(match):
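The backported `compress_sequence` above targets Python 2 (it joins `str`); the same streaming-gzip idea adapted to Python 3 bytes might look like this (a sketch, not Django's code):

```python
import gzip
from gzip import GzipFile

class StreamingBuffer(object):
    """Collects what GzipFile writes so it can be drained incrementally."""
    def __init__(self):
        self.vals = []

    def write(self, val):
        self.vals.append(val)

    def read(self):
        ret = b''.join(self.vals)
        self.vals = []
        return ret

    def flush(self):
        pass

    def close(self):
        pass

def compress_sequence(sequence):
    buf = StreamingBuffer()
    zfile = GzipFile(mode='wb', compresslevel=6, fileobj=buf)
    yield buf.read()       # gzip header, written at construction
    for item in sequence:
        zfile.write(item)
        zfile.flush()      # sync-flush so each chunk is emitted promptly
        yield buf.read()
    zfile.close()
    yield buf.read()       # trailer (CRC + uncompressed size)

chunks = list(compress_sequence([b"hello ", b"world"]))
print(gzip.decompress(b"".join(chunks)))  # b'hello world'
```

Yielding after every `flush()` is what makes the output usable for chunked HTTP responses: each piece is a valid continuation of the stream, and the concatenation decompresses to the original data.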
@@ -438,8 +438,8 @@ def blankout(src, char):
     return dot_re.sub(char, src)

 context_re = re.compile(r"""^\s+.*context\s+((?:"[^"]*?")|(?:'[^']*?'))\s*""")
-inline_re = re.compile(r"""^\s*trans\s+((?:"[^"]*?")|(?:'[^']*?'))(\s+.*context\s+(?:"[^"]*?")|(?:'[^']*?'))?\s*""")
-block_re = re.compile(r"""^\s*blocktrans(\s+.*context\s+(?:"[^"]*?")|(?:'[^']*?'))?(?:\s+|$)""")
+inline_re = re.compile(r"""^\s*trans\s+((?:"[^"]*?")|(?:'[^']*?'))(\s+.*context\s+((?:"[^"]*?")|(?:'[^']*?')))?\s*""")
+block_re = re.compile(r"""^\s*blocktrans(\s+.*context\s+((?:"[^"]*?")|(?:'[^']*?')))?(?:\s+|$)""")
 endblock_re = re.compile(r"""^\s*endblocktrans$""")
 plural_re = re.compile(r"""^\s*plural$""")
 constant_re = re.compile(r"""_\(((?:".*?")|(?:'.*?'))\)""")
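The fix adds the grouping parentheses that were missing around the quoted context string, so the whole alternation sits inside the optional group and the context is captured on its own; a quick standalone check of the corrected `inline_re`:

```python
import re

# The corrected pattern from the hunk above.
inline_re = re.compile(
    r"""^\s*trans\s+((?:"[^"]*?")|(?:'[^']*?'))(\s+.*context\s+((?:"[^"]*?")|(?:'[^']*?')))?\s*""")

m = inline_re.match('trans "Hello" context "greeting"')
print(m.group(1))  # '"Hello"'
print(m.group(3))  # '"greeting"'

# Without a context clause, the optional groups are simply None.
m2 = inline_re.match("trans 'Bye'")
print(m2.group(1), m2.group(3))
```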
@@ -155,9 +155,20 @@ class SafeExceptionReporterFilter(ExceptionReporterFilter):
         Replaces the values of variables marked as sensitive with
         stars (*********).
         """
-        func_name = tb_frame.f_code.co_name
-        func = tb_frame.f_globals.get(func_name)
-        sensitive_variables = getattr(func, 'sensitive_variables', [])
+        # Loop through the frame's callers to see if the sensitive_variables
+        # decorator was used.
+        current_frame = tb_frame.f_back
+        sensitive_variables = None
+        while current_frame is not None:
+            if (current_frame.f_code.co_name == 'sensitive_variables_wrapper'
+                    and 'sensitive_variables_wrapper' in current_frame.f_locals):
+                # The sensitive_variables decorator was used, so we take note
+                # of the sensitive variables' names.
+                wrapper = current_frame.f_locals['sensitive_variables_wrapper']
+                sensitive_variables = getattr(wrapper, 'sensitive_variables', None)
+                break
+            current_frame = current_frame.f_back

         cleansed = []
         if self.is_active(request) and sensitive_variables:
             if sensitive_variables == '__ALL__':
@@ -1,5 +1,7 @@
+import functools
+
 from django.http import HttpRequest


 def sensitive_variables(*variables):
     """

@@ -26,13 +28,13 @@ def sensitive_variables(*variables):
     """
     def decorator(func):
+        @functools.wraps(func)
-        def wrapper(*args, **kwargs):
+        def sensitive_variables_wrapper(*args, **kwargs):
             if variables:
-                wrapper.sensitive_variables = variables
+                sensitive_variables_wrapper.sensitive_variables = variables
             else:
-                wrapper.sensitive_variables = '__ALL__'
+                sensitive_variables_wrapper.sensitive_variables = '__ALL__'
             return func(*args, **kwargs)
-        return wrapper
+        return sensitive_variables_wrapper
     return decorator

@@ -61,11 +63,15 @@ def sensitive_post_parameters(*parameters):
     """
     def decorator(view):
+        @functools.wraps(view)
-        def wrapper(request, *args, **kwargs):
+        def sensitive_post_parameters_wrapper(request, *args, **kwargs):
+            assert isinstance(request, HttpRequest), (
+                "sensitive_post_parameters didn't receive an HttpRequest. If you "
+                "are decorating a classmethod, be sure to use @method_decorator."
+            )
             if parameters:
                 request.sensitive_post_parameters = parameters
             else:
                 request.sensitive_post_parameters = '__ALL__'
             return view(request, *args, **kwargs)
-        return wrapper
+        return sensitive_post_parameters_wrapper
     return decorator
@@ -8,6 +8,8 @@ from django.utils.translation import check_for_language, activate, to_locale, ge
 from django.utils.text import javascript_quote
 from django.utils.encoding import smart_unicode
 from django.utils.formats import get_format_modules, get_format
+from django.utils.http import is_safe_url


 def set_language(request):
     """

@@ -20,11 +22,11 @@ def set_language(request):
     redirect to the page in the request (the 'next' parameter) without changing
     any state.
     """
-    next = request.REQUEST.get('next', None)
-    if not next:
-        next = request.META.get('HTTP_REFERER', None)
-        if not next:
-            next = '/'
+    next = request.REQUEST.get('next')
+    if not is_safe_url(url=next, host=request.get_host()):
+        next = request.META.get('HTTP_REFERER')
+        if not is_safe_url(url=next, host=request.get_host()):
+            next = '/'
     response = http.HttpResponseRedirect(next)
     if request.method == 'POST':
         lang_code = request.POST.get('language', None)
@@ -16,6 +16,9 @@ from django.template import loader, Template, Context, TemplateDoesNotExist
 from django.utils.http import http_date, parse_http_date
 from django.utils.translation import ugettext as _, ugettext_noop

+STREAM_CHUNK_SIZE = 4096
+
+
 def serve(request, path, document_root=None, show_indexes=False):
     """
     Serve static files below a given point in the directory structure.

@@ -59,8 +62,8 @@ def serve(request, path, document_root=None, show_indexes=False):
     if not was_modified_since(request.META.get('HTTP_IF_MODIFIED_SINCE'),
                               statobj.st_mtime, statobj.st_size):
         return HttpResponseNotModified(mimetype=mimetype)
-    with open(fullpath, 'rb') as f:
-        response = HttpResponse(f.read(), mimetype=mimetype)
+    f = open(fullpath, 'rb')
+    response = HttpResponse(iter(lambda: f.read(STREAM_CHUNK_SIZE), ''), mimetype=mimetype)
     response["Last-Modified"] = http_date(statobj.st_mtime)
     if stat.S_ISREG(statobj.st_mode):
         response["Content-Length"] = statobj.st_size
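The two-argument `iter(callable, sentinel)` form used here yields chunks until `read()` returns the sentinel, so the file is streamed instead of slurped into memory; the same pattern sketched against an in-memory file (note that on Python 3 the sentinel for a binary file must be `b''`):

```python
import io

STREAM_CHUNK_SIZE = 4096

data = b"x" * 10000
f = io.BytesIO(data)

# iter(callable, sentinel) calls f.read(...) repeatedly until it
# returns the sentinel (empty bytes at end of file).
chunks = list(iter(lambda: f.read(STREAM_CHUNK_SIZE), b""))

print(len(chunks))               # 3 chunks: 4096 + 4096 + 1808 bytes
print(b"".join(chunks) == data)  # True
```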
@@ -105,14 +105,22 @@ class DjangoHTMLTranslator(SmartyPantsHTMLTranslator):

     # Don't use border=1, which docutils does by default.
     def visit_table(self, node):
+        self.context.append(self.compact_p)
+        self.compact_p = True
+        self._table_row_index = 0  # Needed by Sphinx
         self.body.append(self.starttag(node, 'table', CLASS='docutils'))

     # <big>? Really?
+    def depart_table(self, node):
+        self.compact_p = self.context.pop()
+        self.body.append('</table>\n')
+
     def visit_desc_parameterlist(self, node):
-        self.body.append('(')
+        self.body.append('(')  # by default sphinx puts <big> around the "("
         self.first_param = 1
+        self.optional_param_level = 0
+        self.param_separator = node.child_text_separator
+        self.required_params_left = sum([isinstance(c, addnodes.desc_parameter)
+                                         for c in node.children])

     def depart_desc_parameterlist(self, node):
         self.body.append(')')
@@ -17,6 +17,9 @@
 {%- endmacro %}

 {% block extrahead %}
+{# When building htmlhelp (CHM format) disable JQuery inclusion, #}
+{# as it causes problems in compiled CHM files. #}
+{% if builder != "htmlhelp" %}
 {{ super() }}
 <script type="text/javascript" src="{{ pathto('templatebuiltins.js', 1) }}"></script>
 <script type="text/javascript">

@@ -51,6 +54,7 @@
     });
 })(jQuery);
 </script>
+{% endif %}
 {% endblock %}

 {% block document %}
@@ -1,7 +1,7 @@
 /*** setup ***/
 html { background:#092e20;}
 body { font:12px/1.5 Verdana,sans-serif; background:#092e20; color: white;}
-#custom-doc { width:76.54em;*width:74.69em;min-width:995px; max-width:100em; margin:auto; text-align:left; padding-top:16px; margin-top:0;}
+#custom-doc { width:76.54em;*width:74.69em;min-width:995px; max-width:100em; margin:auto; text-align:left; padding-top:16px; margin-top:0;}
 #hd { padding: 4px 0 12px 0; }
 #bd { background:#234F32; }
 #ft { color:#487858; font-size:90%; padding-bottom: 2em; }

@@ -54,7 +54,7 @@ hr { color:#ccc; background-color:#ccc; height:1px; border:0; }
 p, ul, dl { margin-top:.6em; margin-bottom:1em; padding-bottom: 0.1em;}
 #yui-main div.yui-b img { max-width: 50em; margin-left: auto; margin-right: auto; display: block; }
 caption { font-size:1em; font-weight:bold; margin-top:0.5em; margin-bottom:0.5em; margin-left: 2px; text-align: center; }
-blockquote { padding: 0 1em; margin: 1em 0; font:125%/1.2em "Trebuchet MS", sans-serif; color:#234f32; border-left:2px solid #94da3a; }
+blockquote { padding: 0 1em; margin: 1em 0; font:125%/1.2em "Trebuchet MS", sans-serif; color:#234f32; border-left:2px solid #94da3a; }
 strong { font-weight: bold; }
 em { font-style: italic; }
 ins { font-weight: bold; text-decoration: none; }

@@ -111,10 +111,11 @@ dt .literal, table .literal { background:none; }
 .note, .admonition { padding-left:65px; background:url(docicons-note.png) .8em .8em no-repeat;}
 div.admonition-philosophy { padding-left:65px; background:url(docicons-philosophy.png) .8em .8em no-repeat;}
 div.admonition-behind-the-scenes { padding-left:65px; background:url(docicons-behindscenes.png) .8em .8em no-repeat;}
+.admonition.warning { background:url(docicons-warning.png) .8em .8em no-repeat; border:1px solid #ffc83c;}

 /*** versionadded/changes ***/
 div.versionadded, div.versionchanged { }
-div.versionadded span.title, div.versionchanged span.title { font-weight: bold; }
+div.versionadded span.title, div.versionchanged span.title, div.deprecated span.title { font-weight: bold; }

 /*** p-links ***/
 a.headerlink { color: #c60f0f; font-size: 0.8em; padding: 0 4px 0 4px; text-decoration: none; visibility: hidden; }
Binary file not shown (new image, 782 B).
@@ -50,9 +50,9 @@ copyright = 'Django Software Foundation and contributors'
 # built documents.
 #
 # The short X.Y version.
-version = '1.4'
+version = '1.4.22'
 # The full version, including alpha/beta/rc tags.
-release = '1.4'
+release = '1.4.22'
 # The next version to be released
 django_next_version = '1.5'
@@ -42,18 +42,15 @@ Yes. See :doc:`Integrating with a legacy database </howto/legacy-databases>`.
 If I make changes to a model, how do I update the database?
 -----------------------------------------------------------

-If you don't mind clearing data, your project's ``manage.py`` utility has an
-option to reset the SQL for a particular application::
-
-    manage.py reset appname
-
-This drops any tables associated with ``appname`` and recreates them.
+If you don't mind clearing data, your project's ``manage.py`` utility has a
+:djadmin:`flush` option to reset the database to the state it was in
+immediately after :djadmin:`syncdb` was executed.

 If you do care about deleting data, you'll have to execute the ``ALTER TABLE``
 statements manually in your database.

 There are `external projects which handle schema updates
-<http://djangopackages.com/grids/g/database-migration/>`_, of which the current
+<http://www.djangopackages.com/grids/g/database-migration/>`_, of which the current
 defacto standard is `south <http://south.aeracode.org/>`_.

 Do Django models support multiple-column primary keys?
@@ -86,5 +86,13 @@ the provided filename into account. The ``name`` argument passed to this method
 will have already cleaned to a filename valid for the storage system, according
 to the ``get_valid_name()`` method described above.

-The code provided on ``Storage`` simply appends ``"_1"``, ``"_2"``, etc. to the
-filename until it finds one that's available in the destination directory.
+.. versionchanged:: 1.4.14
+
+If a file with ``name`` already exists, an underscore plus a random 7
+character alphanumeric string is appended to the filename before the
+extension.
+
+Previously, an underscore followed by a number (e.g. ``"_1"``, ``"_2"``,
+etc.) was appended to the filename until an available name in the destination
+directory was found. A malicious user could exploit this deterministic
+algorithm to create a denial-of-service attack.
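The random-suffix behaviour described in this change can be sketched with the standard library (illustrative only: `available_name` is a hypothetical helper, and Django itself builds the suffix with its `get_random_string` utility):

```python
import os
import random
import string

def available_name(name):
    """Append '_' plus a random 7-character alphanumeric string
    before the extension, as the changed behavior describes."""
    root, ext = os.path.splitext(name)
    suffix = "".join(random.choice(string.ascii_letters + string.digits)
                     for _ in range(7))
    return "%s_%s%s" % (root, suffix, ext)

print(available_name("photo.jpg"))  # e.g. 'photo_a1B2c3D.jpg'
```

Because the suffix is unpredictable, an attacker can no longer pre-compute the sequence of fallback names and exhaust them, which is what made the old `_1`, `_2`, ... scheme a denial-of-service vector.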
@@ -120,7 +120,7 @@ default options such as :djadminopt:`--verbosity` and :djadminopt:`--traceback`.

     class Command(BaseCommand):
         ...
-        self.can_import_settings = True
+        can_import_settings = True

         def handle(self, *args, **options):
@@ -181,10 +181,10 @@ card values plus their suits; 104 characters in total.
 Many of Django's model fields accept options that they don't do anything
 with. For example, you can pass both
 :attr:`~django.db.models.Field.editable` and
-:attr:`~django.db.models.Field.auto_now` to a
+:attr:`~django.db.models.DateField.auto_now` to a
 :class:`django.db.models.DateField` and it will simply ignore the
 :attr:`~django.db.models.Field.editable` parameter
-(:attr:`~django.db.models.Field.auto_now` being set implies
+(:attr:`~django.db.models.DateField.auto_now` being set implies
 ``editable=False``). No error is raised in this case.

 This behavior simplifies the field classes, because they don't need to
@@ -334,7 +334,6 @@ Once you have ``MytypeField``, you can use it in any model, just like any other

     class Person(models.Model):
         name = models.CharField(max_length=80)
-        gender = models.CharField(max_length=1)
         something_else = MytypeField()

 If you aim to build a database-agnostic application, you should account for

@@ -448,6 +447,13 @@ called when it is created, you should be using `The SubfieldBase metaclass`_
 mentioned earlier. Otherwise :meth:`.to_python` won't be called
 automatically.

+.. warning::
+
+    If your custom field allows ``null=True``, any field method that takes
+    ``value`` as an argument, like :meth:`~Field.to_python` and
+    :meth:`~Field.get_prep_value`, should handle the case when ``value`` is
+    ``None``.
+
 Converting Python objects to query values
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -476,6 +482,16 @@ For example::

         return ''.join([''.join(l) for l in (value.north,
             value.east, value.south, value.west)])

+.. warning::
+
+    If your custom field uses the ``CHAR``, ``VARCHAR`` or ``TEXT``
+    types for MySQL, you must make sure that :meth:`.get_prep_value`
+    always returns a string type. MySQL performs flexible and unexpected
+    matching when a query is performed on these types and the provided
+    value is an integer, which can cause queries to include unexpected
+    objects in their results. This problem cannot occur if you always
+    return a string type from :meth:`.get_prep_value`.
+
 Converting query values to database values
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -525,8 +541,8 @@ for the first time, the ``add`` parameter will be ``True``, otherwise it will be
 You only need to override this method if you want to preprocess the value
 somehow, just before saving. For example, Django's
 :class:`~django.db.models.DateTimeField` uses this method to set the attribute
-correctly in the case of :attr:`~django.db.models.Field.auto_now` or
-:attr:`~django.db.models.Field.auto_now_add`.
+correctly in the case of :attr:`~django.db.models.DateField.auto_now` or
+:attr:`~django.db.models.DateField.auto_now_add`.

 If you do override this method, you must return the value of the attribute at
 the end. You should also update the model's attribute if you make any changes

@@ -685,7 +701,7 @@ data storage anyway, we can reuse some existing conversion code::

     def value_to_string(self, obj):
         value = self._get_val_from_obj(obj)
-        return self.get_db_prep_value(value)
+        return self.get_prep_value(value)

 Some general advice
 --------------------
@@ -18,8 +18,8 @@ The `official mod_wsgi documentation`_ is fantastic; it's your source for all
 the details about how to use mod_wsgi. You'll probably want to start with the
 `installation and configuration documentation`_.

-.. _official mod_wsgi documentation: http://www.modwsgi.org/
-.. _installation and configuration documentation: http://www.modwsgi.org/wiki/InstallationInstructions
+.. _official mod_wsgi documentation: http://code.google.com/p/modwsgi/
+.. _installation and configuration documentation: http://code.google.com/p/modwsgi/wiki/InstallationInstructions

 Basic configuration
 ===================

@@ -61,10 +61,10 @@ Using a virtualenv

 If you install your project's Python dependencies inside a `virtualenv`_,
 you'll need to add the path to this virtualenv's ``site-packages`` directory to
-your Python path as well. To do this, you can add another line to your
-Apache configuration::
+your Python path as well. To do this, add an additional path to your
+`WSGIPythonPath` directive, with multiple paths separated by a colon::

-    WSGIPythonPath /path/to/your/venv/lib/python2.X/site-packages
+    WSGIPythonPath /path/to/mysite.com:/path/to/your/venv/lib/python2.X/site-packages

 Make sure you give the correct path to your virtualenv, and replace
 ``python2.X`` with the correct Python version (e.g. ``python2.7``).

@@ -75,12 +75,20 @@ Using mod_wsgi daemon mode
 ==========================

 "Daemon mode" is the recommended mode for running mod_wsgi (on non-Windows
-platforms). See the `official mod_wsgi documentation`_ for details on setting
-up daemon mode. The only change required to the above configuration if you use
-daemon mode is that you can't use ``WSGIPythonPath``; instead you should use
-the ``python-path`` option to ``WSGIDaemonProcess``, for example::
+platforms). To create the required daemon process group and delegate the
+Django instance to run in it, you will need to add appropriate
+``WSGIDaemonProcess`` and ``WSGIProcessGroup`` directives. A further change
+required to the above configuration if you use daemon mode is that you can't
+use ``WSGIPythonPath``; instead you should use the ``python-path`` option to
+``WSGIDaemonProcess``, for example::

     WSGIDaemonProcess example.com python-path=/path/to/mysite.com:/path/to/venv/lib/python2.7/site-packages
     WSGIProcessGroup example.com

+See the official mod_wsgi documentation for `details on setting up daemon
+mode`_.
+
+.. _details on setting up daemon mode: http://code.google.com/p/modwsgi/wiki/QuickConfigurationGuide#Delegation_To_Daemon_Process

 .. _serving-files:

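Whichever mode mod_wsgi runs in, the daemon process ultimately invokes a standard WSGI ``application`` callable exposed by ``mysite/wsgi.py`` (Django builds the real one with ``get_wsgi_application()``). A framework-free sketch of such a callable; the ``run`` helper below is only a test harness, not part of mod_wsgi:

```python
# A minimal, framework-free WSGI application, illustrating the
# `application` callable that mod_wsgi locates and serves.
def application(environ, start_response):
    body = b"Hello from WSGI\n"
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]


# Exercise the callable without any server (illustrative harness):
def run(app, path="/"):
    captured = {}

    def start_response(status, headers):
        captured["status"] = status
        captured["headers"] = headers

    chunks = app({"PATH_INFO": path, "REQUEST_METHOD": "GET"}, start_response)
    return captured["status"], b"".join(chunks)


print(run(application))  # ('200 OK', b'Hello from WSGI\n')
```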
@@ -47,7 +47,7 @@ uWSGI supports multiple ways to configure the process. See uWSGI's
 Here's an example command to start a uWSGI server::

     uwsgi --chdir=/path/to/your/project
-        --module='mysite.wsgi:application' \
+        --module=mysite.wsgi:application \
         --env DJANGO_SETTINGS_MODULE=mysite.settings \
         --master --pidfile=/tmp/project-master.pid \
         --socket=127.0.0.1:49152 \      # can also be a file

@@ -81,7 +81,7 @@ Example ini configuration file::

     [uwsgi]
     chdir=/path/to/your/project
-    module='mysite.wsgi:application'
+    module=mysite.wsgi:application
     master=True
     pidfile=/tmp/project-master.pid
     vacuum=True

@@ -93,6 +93,6 @@ Example ini configuration file usage::
     uwsgi --ini uwsgi.ini

 See the uWSGI docs on `managing the uWSGI process`_ for information on
-starting, stoping and reloading the uWSGI workers.
+starting, stopping and reloading the uWSGI workers.

 .. _managing the uWSGI process: http://projects.unbit.it/uwsgi/wiki/Management

@@ -123,6 +123,8 @@ Filtering error reports
 Filtering sensitive information
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

+.. currentmodule:: django.views.decorators.debug
+
 Error reports are really helpful for debugging errors, so it is generally
 useful to record as much relevant information about those errors as possible.
 For example, by default Django records the `full traceback`_ for the

@@ -236,11 +238,13 @@ attribute::
         request.exception_reporter_filter = CustomExceptionReporterFilter()
         ...

+.. currentmodule:: django.views.debug
+
 Your custom filter class needs to inherit from
 :class:`django.views.debug.SafeExceptionReporterFilter` and may override the
 following methods:

-.. class:: django.views.debug.SafeExceptionReporterFilter
+.. class:: SafeExceptionReporterFilter

 .. method:: SafeExceptionReporterFilter.is_active(self, request)

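To make the filtering idea concrete, here is a Django-free sketch of the kind of scrubbing a custom ``SafeExceptionReporterFilter`` subclass might perform on request data before it reaches an error report. The names ``SENSITIVE_KEYS`` and ``scrub_post_parameters`` are illustrative only, not Django API:

```python
# Illustrative scrubbing logic; not Django's implementation.
SENSITIVE_KEYS = {"password", "secret", "token"}
CLEANSED = "********************"


def scrub_post_parameters(post):
    """Return a copy of POST data with sensitive values masked."""
    return {
        key: CLEANSED if key.lower() in SENSITIVE_KEYS else value
        for key, value in post.items()
    }


print(scrub_post_parameters({"username": "alice", "password": "hunter2"}))
```

The masked copy, rather than the raw POST data, is what would be embedded in the error report.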
@@ -163,4 +163,4 @@ Backend-specific SQL data is executed before non-backend-specific SQL
 data. For example, if your app contains the files ``sql/person.sql``
 and ``sql/person.sqlite3.sql`` and you're installing the app on
 SQLite, Django will execute the contents of
-``sql/person.sqlite.sql`` first, then ``sql/person.sql``.
+``sql/person.sqlite3.sql`` first, then ``sql/person.sql``.

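The execution order described above (backend-specific initial SQL first, then the generic file) can be sketched as a small helper. This is only an illustration of the ordering rule, not Django's actual implementation:

```python
# Sketch of the ordering rule: backend-specific initial SQL files run
# before the generic ones. `available` is the set of files the app ships.
def ordered_sql_files(model_name, backend, available):
    candidates = [
        "sql/%s.%s.sql" % (model_name, backend),  # backend-specific first
        "sql/%s.sql" % model_name,                # then generic
    ]
    return [f for f in candidates if f in available]


files = {"sql/person.sql", "sql/person.sqlite3.sql"}
print(ordered_sql_files("person", "sqlite3", files))
# ['sql/person.sqlite3.sql', 'sql/person.sql']
```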
@@ -4,70 +4,7 @@ Running Django on Jython

 .. index:: Jython, Java, JVM

-Jython_ is an implementation of Python that runs on the Java platform (JVM).
-Django runs cleanly on Jython version 2.5 or later, which means you can deploy
-Django on any Java platform.
-
-This document will get you up and running with Django on top of Jython.
-
-.. _jython: http://www.jython.org/
-
-Installing Jython
-=================
-
-Django works with Jython versions 2.5b3 and higher. Download Jython at
-http://www.jython.org/.
-
-Creating a servlet container
-============================
-
-If you just want to experiment with Django, skip ahead to the next section;
-Django includes a lightweight Web server you can use for testing, so you won't
-need to set up anything else until you're ready to deploy Django in production.
-
-If you want to use Django on a production site, use a Java servlet container,
-such as `Apache Tomcat`_. Full JavaEE applications servers such as `GlassFish`_
-or `JBoss`_ are also OK, if you need the extra features they include.
-
-.. _`Apache Tomcat`: http://tomcat.apache.org/
-.. _GlassFish: https://glassfish.dev.java.net/
-.. _JBoss: http://www.jboss.org/
-
-Installing Django
-=================
-
-The next step is to install Django itself. This is exactly the same as
-installing Django on standard Python, so see
-:ref:`removing-old-versions-of-django` and :ref:`install-django-code` for
-instructions.
-
-Installing Jython platform support libraries
-============================================
-
-The `django-jython`_ project contains database backends and management commands
-for Django/Jython development. Note that the builtin Django backends won't work
-on top of Jython.
+`django-jython`_ supports Django 1.7. Please use that version of the
+documentation for details.

 .. _`django-jython`: http://code.google.com/p/django-jython/
-
-To install it, follow the `installation instructions`_ detailed on the project
-Web site. Also, read the `database backends`_ documentation there.
-
-.. _`installation instructions`: http://code.google.com/p/django-jython/wiki/Install
-.. _`database backends`: http://code.google.com/p/django-jython/wiki/DatabaseBackends
-
-Differences with Django on Jython
-=================================
-
-.. index:: JYTHONPATH
-
-At this point, Django on Jython should behave nearly identically to Django
-running on standard Python. However, are a few differences to keep in mind:
-
-* Remember to use the ``jython`` command instead of ``python``. The
-  documentation uses ``python`` for consistency, but if you're using Jython
-  you'll want to mentally replace ``python`` with ``jython`` every time it
-  occurs.
-
-* Similarly, you'll need to use the ``JYTHONPATH`` environment variable
-  instead of ``PYTHONPATH``.

@@ -21,7 +21,7 @@ Here's an example::
     def some_view(request):
        # Create the HttpResponse object with the appropriate CSV header.
        response = HttpResponse(mimetype='text/csv')
-        response['Content-Disposition'] = 'attachment; filename=somefilename.csv'
+        response['Content-Disposition'] = 'attachment; filename="somefilename.csv"'

        writer = csv.writer(response)
        writer.writerow(['First row', 'Foo', 'Bar', 'Baz'])

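The quotes added around the filename keep ``Content-Disposition`` intact when the filename contains spaces or other special characters. A Django-free sketch of the header value plus the ``csv.writer`` call, using ``io.StringIO`` in place of the response object; the ``content_disposition`` helper is illustrative, not a Django function:

```python
import csv
import io


def content_disposition(filename):
    # Quoting the filename (the fix shown in the diff above) keeps
    # filenames with spaces or commas intact in the header value.
    return 'attachment; filename="%s"' % filename


# csv.writer accepts any file-like object; Django's HttpResponse works
# the same way because it exposes a write() method.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(['First row', 'Foo', 'Bar', 'Baz'])

print(content_disposition('somefilename.csv'))
print(buf.getvalue().strip())  # First row,Foo,Bar,Baz
```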
@@ -93,7 +93,7 @@ Here's an example, which generates the same CSV file as above::
     def some_view(request):
        # Create the HttpResponse object with the appropriate CSV header.
        response = HttpResponse(mimetype='text/csv')
-        response['Content-Disposition'] = 'attachment; filename=somefilename.csv'
+        response['Content-Disposition'] = 'attachment; filename="somefilename.csv"'

        # The data is hard-coded here, but you could load it from a database or
        # some other source.