pax_global_header: comment=f5609f0e571bf85361f9796b1f2f709e192161d1

django-cachalot-2.8.0/.coveragerc:

[run]
omit = */tests/*

[report]
exclude_lines =
    pragma: no cover
    pragma: nocover
    def __repr__
    if __name__ == .__main__.:
    if TYPE_CHECKING:
    except ImportError:

django-cachalot-2.8.0/.coveralls.yml:

repo_token: p6sOGerwOQcVD2w9HwXcpxyawFShxpxPw

django-cachalot-2.8.0/.github/ISSUE_TEMPLATE/bug.md:

---
name: Bug Report
about: Report a bug
---

## What happened?

## What should've happened instead?

## Steps to reproduce

[//]: # (Include host configuration: OS, Django version, database, etc.)

django-cachalot-2.8.0/.github/ISSUE_TEMPLATE/feature.md:

---
name: Feature Proposal
about: Propose a feature that we could add to cachalot
---

## Description

[//]: # (Describe what you are implementing and how, if possible)

## Rationale

[//]: # (Why should this feature be implemented?)
[//]: # (Does it improve upon the software?)

## Use case(s) / visualization(s)

[//]: # (Who would use this? What kind of audience?)

## Alternatives?

[//]: # (With every design comes another better or worse one)
[//]: # (We need to know all possible alternatives)

django-cachalot-2.8.0/.github/ISSUE_TEMPLATE/improvement.md:

---
name: Improvement Suggestion
about: Efficiency is key and we want to know how to make it more efficient
---

## Description

[//]: # (What are you proposing and how would we implement it?)

## Rationale

[//]: # (Why should this be implemented?)

django-cachalot-2.8.0/.github/ISSUE_TEMPLATE/question.md:

---
name: Question
about: Ask a generic/specific question here or on the Discord chat
---

## Question

## What have you tried so far?

[//]: # (Should include:)
[//]: # (- Host configuration: OS, database, Django version, etc.)
[//]: # (- Steps to reproduce)
[//]: # (- Logs and errors)
[//]: # (- Other potentially useful information)

django-cachalot-2.8.0/.github/PULL_REQUEST_TEMPLATE.md:

[//]: # (Thanks for helping us out! Cache invalidation sucks, so your help is greatly appreciated!)

## Description

[//]: # (What're you proposing?)

## Rationale

[//]: # (Why should we implement it?)
django-cachalot-2.8.0/.github/workflows/000077500000000000000000000000001500004256100201255ustar00rootroot00000000000000django-cachalot-2.8.0/.github/workflows/ci.yml000066400000000000000000000047701500004256100212530ustar00rootroot00000000000000name: CI on: push: branches: [ master ] pull_request: branches: [ master ] concurrency: group: ${{ github.workflow }}-${{ github.ref }} cancel-in-progress: true jobs: build: runs-on: ubuntu-latest strategy: fail-fast: false matrix: python-version: ['3.8', '3.9', '3.10', '3.11', '3.12'] django-version: ['3.2', '4.1', '4.2', '5.0', '5.1', '5.2'] exclude: - python-version: '3.11' django-version: '3.2' - python-version: '3.12' django-version: '3.2' - python-version: '3.11' django-version: '4.1' - python-version: '3.12' django-version: '4.1' services: redis: image: redis:6 ports: - 6379:6379 postgres: image: postgres ports: - 5432:5432 env: POSTGRES_USER: cachalot POSTGRES_PASSWORD: password POSTGRES_DB: cachalot options: >- --health-cmd pg_isready --health-interval 10s --health-timeout 5s --health-retries 5 mysql: image: mysql env: MYSQL_ALLOW_EMPTY_PASSWORD: yes MYSQL_DATABASE: cachalot ports: - 3306:3306 memcached: image: memcached ports: - 11211:11211 steps: - uses: actions/checkout@v2 - name: Set up Python ${{ matrix.python-version }} uses: actions/setup-python@v2 with: python-version: ${{ matrix.python-version }} - name: Get pip cache dir id: pip-cache run: | echo "::set-output name=dir::$(pip cache dir)" - name: Cache uses: actions/cache@v4 with: path: ${{ steps.pip-cache.outputs.dir }} key: ${{ matrix.python-version }}-v1-${{ hashFiles('**/setup.py') }} restore-keys: | ${{ matrix.python-version }}-v1- - name: Install dependencies run: | sudo apt-get install -y libmemcached-dev zlib1g-dev libpq-dev python -m pip install --upgrade pip wheel python -m pip install tox tox-gh-actions coveralls - name: Tox Test run: tox env: TOX_TESTENV_PASSENV: POSTGRES_PASSWORD POSTGRES_PASSWORD: password PYTHON_VER: ${{ 
matrix.python-version }} DJANGO: ${{ matrix.django-version }} - name: Coverage (Coveralls) if: ${{ success() }} run: coveralls --service=github env: GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} django-cachalot-2.8.0/.github/workflows/main-ci.yml000066400000000000000000000034531500004256100221720ustar00rootroot00000000000000name: Django Main Testing CI on: schedule: - cron: "0 2 * * *" workflow_dispatch: jobs: build: runs-on: ubuntu-latest strategy: fail-fast: false matrix: python-version: ['3.10'] services: redis: image: redis:6 ports: - 6379:6379 postgres: image: postgres ports: - 5432:5432 env: POSTGRES_USER: cachalot POSTGRES_PASSWORD: password POSTGRES_DB: cachalot options: >- --health-cmd pg_isready --health-interval 10s --health-timeout 5s --health-retries 5 mysql: image: mysql env: MYSQL_ALLOW_EMPTY_PASSWORD: yes MYSQL_DATABASE: cachalot ports: - 3306:3306 memcached: image: memcached ports: - 11211:11211 steps: - uses: actions/checkout@v2 - name: Set up Python ${{ matrix.python-version }} uses: actions/setup-python@v2 with: python-version: ${{ matrix.python-version }} - name: Get pip cache dir id: pip-cache run: | echo "::set-output name=dir::$(pip cache dir)" - name: Cache uses: actions/cache@v2 with: path: ${{ steps.pip-cache.outputs.dir }} key: ${{ matrix.python-version }}-v1-${{ hashFiles('**/setup.py') }} restore-keys: | ${{ matrix.python-version }}-v1- - name: Install dependencies run: | sudo apt-get install -y libmemcached-dev zlib1g-dev python -m pip install --upgrade pip wheel python -m pip install tox tox-gh-actions coveralls - name: Tox Test run: tox env: TOX_TESTENV_PASSENV: POSTGRES_PASSWORD POSTGRES_PASSWORD: password DJANGO: 'main' django-cachalot-2.8.0/.github/workflows/publish.yml000066400000000000000000000015161500004256100223210ustar00rootroot00000000000000# This workflows will upload a Python Package using Twine when a release is created # For more information see: 
https://help.github.com/en/actions/language-and-framework-guides/using-python-with-github-actions#publishing-to-package-registries name: Upload Python Package on: release: types: [created] jobs: deploy: runs-on: ubuntu-latest steps: - uses: actions/checkout@v2 - name: Set up Python uses: actions/setup-python@v2 with: python-version: '3.x' - name: Install dependencies run: | python -m pip install --upgrade pip pip install setuptools wheel twine - name: Build and publish env: TWINE_USERNAME: __token__ TWINE_PASSWORD: ${{ secrets.PYPI_PASSWORD }} run: | python setup.py sdist bdist_wheel twine upload dist/* django-cachalot-2.8.0/.gitignore000066400000000000000000000032731500004256100165250ustar00rootroot00000000000000# Byte-compiled / optimized / DLL files __pycache__/ *.py[cod] *$py.class # C extensions *.so # Distribution / packaging .Python build/ develop-eggs/ dist/ downloads/ eggs/ .eggs/ lib/ lib64/ parts/ sdist/ var/ wheels/ pip-wheel-metadata/ share/python-wheels/ *.egg-info/ .installed.cfg *.egg MANIFEST # PyInstaller # Usually these files are written by a python script from a template # before PyInstaller builds the exe, so as to inject date/other infos into it. *.manifest *.spec # Installer logs pip-log.txt pip-delete-this-directory.txt # Unit test / coverage reports htmlcov/ .tox/ .nox/ .coverage .coverage.* .cache nosetests.xml coverage.xml *.cover .hypothesis/ .pytest_cache/ # Translations *.mo *.pot # Django stuff: *.log local_settings.py *.sqlite3 db.sqlite3 db.sqlite3-journal # Flask stuff: instance/ .webassets-cache # Scrapy stuff: .scrapy # Sphinx documentation docs/_build/ # PyBuilder target/ # Jupyter Notebook .ipynb_checkpoints # IPython profile_default/ ipython_config.py # pyenv .python-version # pipenv # According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control. 
# However, in case of collaboration, if having platform-specific dependencies or dependencies # having no cross-platform support, pipenv may install dependencies that don't work, or not # install all needed dependencies. #Pipfile.lock # celery beat schedule file celerybeat-schedule # SageMath parsed files *.sage.py # Environments .env .venv env/ venv/ ENV/ env.bak/ venv.bak/ # Spyder project settings .spyderproject .spyproject # Rope project settings .ropeproject # mkdocs documentation /site # mypy .mypy_cache/ .dmypy.json dmypy.json # Pyre type checker .pyre/ django-cachalot-2.8.0/.readthedocs.yaml000066400000000000000000000007241500004256100177620ustar00rootroot00000000000000# Read the Docs configuration file # See https://docs.readthedocs.io/en/stable/config-file/v2.html for details # Required version: 2 # Set the version of Python and other tools you might need build: os: ubuntu-22.04 tools: python: "3.11" # Build documentation in the docs/ directory with Sphinx sphinx: configuration: docs/conf.py # Declare the Python requirements required to build your docs python: install: - requirements: docs/requirements.txt django-cachalot-2.8.0/CHANGELOG.rst000066400000000000000000000336321500004256100165600ustar00rootroot00000000000000What’s new in django-cachalot? 
==============================

2.8.0
-----

- Add a setting for disabling iterator caching (#263)
- Add support for testing against the builtin redis backend (#264)
- Add Django 5.2 support to tox configuration (#272)

2.7.0
-----

- Add support for caching queries with enum filters (#267)

2.6.3
-----

- Support for Django 5.1

2.6.2
-----

- Support for Python 3.12 and Django 5.0

2.6.1
-----

- Hot fix when Django is externally imported due to connections import (#242)

2.6.0
-----

- Dropped Django 2.2 and 4.0 support
- Added Django 4.2 and Python 3.11 support
- Added psycopg support (#229)
- Updated tests to account for the `BEGIN` and `COMMIT` query changes in Django 4.2
- Standardized django version comparisons in tests

2.5.3
-----

- Verify get_meta isn't None before requesting db_table (#225 #226)

2.5.2
-----

- Added Django 4.1 support (#217)

2.5.1
-----

- Table invalidation condition enhanced (#213)
- Add test settings to sdist (#203)
- Include docs in sdist (#202)

2.5.0
-----

- Add final SQL check to include potentially overlooked tables when looking up involved tables (#199)
- Add ``CACHALOT_FINAL_SQL_CHECK`` for enabling the final SQL check

2.4.5
-----

- Dropped Python 3.6 and Django 3.1 support. Added Django 4.0 support (#208)

2.4.4
-----

- Handle queryset implementations without lhs/rhs attribute (#204)
- Add Python 3.10 support (#206)
- (Internal) Omit additional unnecessary code in coverage

2.4.3
-----

- Fix annotated Now being cached (#195)
- Fix conditional annotated expressions not being cached (#196)
- Simplify annotation handling by using the flatten method (#197)
- Fix Django 3.2 default_app_config deprecation (#198)
- (Internal) Pinned psycopg2 to <2.9 due to Django 2.2 incompatibility

2.4.2
-----

- Add convenience settings `CACHALOT_ONLY_CACHABLE_APPS` and `CACHALOT_UNCACHABLE_APPS` (#187)
- Drop support for Django 3.0 (#189)
- (Internal) Added Django main-branch CI on cron job
- (Internal) Removed duplicate code (#190)

2.4.1
-----

- Fix Django requirement constraint to include 3.2.X, not just 3.2
- (Internal) Deleted obsolete travis-matrix.py file

2.4.0
-----

- Add support for Django 3.2 (#181)
- Remove enforced system check for Django version (#175)
- Drop support for Django 2.0-2.1 and Python 3.5 (#181)
- Add support for Pymemcache for Django 3.2+ (#181)
- Reverts #157 with proper fix (#181)
- Add ``CACHALOT_ADDITIONAL_TABLES`` setting for unmanaged models (#183)

2.3.5
-----

- Fix cachalot_disabled (#174)

2.3.4
-----

- Fix bug with externally invalidated cache keys (#120)
- Omit test files in coverage

2.3.3
-----

- Remove deprecated signal argument (#165)
- Add Python 3.9 support
- Use Discord instead, since Slack doesn't save messages, @Andrew-Chen-Wang is not on there very much, and Discord has phenomenal search functionality (with ES)

2.3.2
-----

- Cast memoryview objects to bytes to be able to pickle them (#163)

2.3.1
-----

- Added support for Django 3.1, including the new, native JSONField

2.3.0
-----

- Added context manager for temporarily disabling cachalot using `cachalot_disabled()`
- Fix for certain Subquery cases
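A quick illustration of the ``cachalot_disabled()`` context manager added in 2.3.0. This is a usage sketch only, not runnable on its own: it assumes a configured Django project with django-cachalot installed, and ``Article`` is a hypothetical model used for illustration:

```python
from cachalot.api import cachalot_disabled

from myapp.models import Article  # hypothetical model, for illustration

# Queries evaluated inside the block bypass cachalot entirely: they are
# neither served from the cache nor stored in it, so they always hit
# the database. Note the queryset is evaluated (list()) inside the block.
with cachalot_disabled():
    fresh_articles = list(Article.objects.filter(published=True))
```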
2.2.2
-----

- Drop support for Django 1.11 and Python 2.7
- Added fix for subqueries from Django 2.2

2.2.0
-----

- Adds Django 2.2 and 3.0 support
- Dropped official support for Python 3.4: it won't run properly with Travis CI tests on MySQL
- All Travis CI tests are fully functional

2.1.0
-----

- Adds Django 2.1 support

2.0.2
-----

- Adds support for ``.union``, ``.intersection`` & ``.difference`` that should have been introduced in 1.5.0
- Fixes an error raised in some rare and undetermined cases, when the cache backend doesn't yield data as expected

2.0.1
-----

- Allows specifying a schema name in ``Model._meta.db_table``

2.0.0
-----

- Adds Django 2.0 support
- Drops Django 1.10 support
- Drops Django 1.8 support (1.9 support was dropped in 1.5.0)
- Adds a check to make sure it is used with a supported Django version
- Fixes a bug partially breaking django-cachalot when an error occurred during the end of a ``transaction.atomic`` block, typically when using deferred constraints

1.5.0
-----

- Adds Django 1.11 support
- Adds Python 3.6 support
- Drops Django 1.9 support (but 1.8 is still supported)
- Drops Python 3.3 support
- Adds ``CACHALOT_DATABASES`` to specify which databases have django-cachalot enabled (by default, only supported databases are enabled)
- Stops advising users to dynamically override cachalot settings, as it cannot be thread-safe due to Django's internals
- Invalidates tables after raw ``CREATE``, ``ALTER`` & ``DROP`` SQL queries
- Allows specifying model lookups like ``auth.User`` in the API functions (previously, this could only be done in the Django template tag, not in the Jinja2 ``get_last_invalidation`` function nor in API functions)
- Fixes the cache used by ``CachalotPanel`` if ``CACHALOT_CACHE`` is different from ``'default'``
- Uploads a wheel distribution of this package to PyPI starting now, in addition to the source release
- Improves tests

1.4.1
-----

- Fixes a circular import occurring when CachalotPanel is used and django-debug-toolbar is listed before django-cachalot in ``INSTALLED_APPS``
- Stops checking compatibility for caches other than ``CACHALOT_CACHE``

1.4.0
-----

- Fixes a bad design: ``QuerySet.select_for_update`` was cached, but that's not correct since it does not lock data in the database once data was cached, leading to the database lock being useless in some cases
- Stops automatically invalidating other caches than ``CACHALOT_CACHE`` for consistency, performance, and usefulness reasons
- Fixes a minor issue: the ``post_invalidation`` signal was sent during transactions when calling the ``invalidate`` command
- Creates `a gitter chat room `_
- Removes the Slack team. Slack does not allow public chat, so this was a bad idea

1.3.0
-----

- Adds Django 1.10 support
- Drops Django 1.7 support
- Drops Python 3.2 support
- Adds a Jinja2 extension with a ``cache`` statement and the ``get_last_invalidation`` function
- Adds a ``CACHALOT_TIMEOUT`` setting after dozens of private & public requests, but it's not really useful
- Fixes a ``RuntimeError`` occurring if a ``DatabaseCache`` was used in a project, even if not used by django-cachalot
- Allows bytes raw queries (except on SQLite, where it's not supposed to work)
- Creates `a Slack team `_ to discuss, easier than using Google Groups

1.2.1
-----

**Mandatory update if you're using django-cachalot 1.2.0.**

This version reverts the cache keys hashing change from 1.2.0, as it was leading to a non-shared cache when Python used a random seed for hashing, which is the case by default on Python 3.3, 3.4, & 3.5, and also on 2.7 & 3.2 if you set ``PYTHONHASHSEED=random``.
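The hashing problem behind the 1.2.1 revert is easy to reproduce: with a randomised hash seed, the built-in ``hash()`` of the same string differs between interpreter processes, so cache keys derived from it are not shared. A small self-contained demonstration (the ``hash_of`` helper is illustrative, not part of cachalot):

```python
import os
import subprocess
import sys

def hash_of(seed: str) -> str:
    """Return hash('cachalot') as printed by a fresh interpreter process
    started with the given PYTHONHASHSEED value."""
    env = {**os.environ, "PYTHONHASHSEED": seed}
    result = subprocess.run(
        [sys.executable, "-c", "print(hash('cachalot'))"],
        env=env, capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

# With a random seed (the Python 3.3+ default), each process hashes the
# same string differently, so hash()-derived cache keys are per-process.
print(hash_of("random") == hash_of("random"))  # almost certainly False

# Pinning the seed makes the value reproducible across processes.
print(hash_of("0") == hash_of("0"))
```

This is why shared cache keys have to come from a stable digest (e.g. ``hashlib``) rather than the built-in ``hash()``.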
1.2.0
-----

**WARNING: This version is unsafe, it can lead to invalidation errors**

- Adds Django 1.9 support
- Simplifies and speeds up cache keys hashing
- Documents how to use django-cachalot with a replica database
- Adds ``DummyCache`` to ``VALID_CACHE_BACKENDS``
- Updates the comparison with django-cache-machine & django-cacheops by checking features and measuring performance instead of relying on their documentations and a 2-years-ago experience of them

1.1.0
-----

**Backwards incompatible changes:**

- Adds Django 1.8 support and drops Django 1.6 & Python 2.6 support
- Merges the 3 API functions ``invalidate_all``, ``invalidate_tables``, & ``invalidate_models`` into a single ``invalidate`` function while optimising it

Other additions:

- Adds a ``get_last_invalidation`` function to the API and the equivalent template tag
- Adds a ``CACHALOT_ONLY_CACHABLE_TABLES`` setting in order to make a whitelist of the only table names django-cachalot can cache
- Caches queries with IP addresses, floats, or decimals in parameters
- Adds a Django check to ensure the project uses compatible cache and database backends
- Adds a lot of tests, especially to test django.contrib.postgres
- Adds a comparison with django-cache-machine and django-cacheops in the documentation

Fixed:

- Removes a useless extra invalidation during each write operation to the database, leading to a small speedup during data modification and tests
- The ``post_invalidation`` signal was triggered during transactions and was not triggered when using the API or raw write queries: both issues are now fixed
- Fixes a very unlikely invalidation issue occurring only when an error occurred in a transaction after a transaction of another database nested in the first transaction was committed, like this:

  .. code:: python

      from django.db import transaction

      assert list(YourModel.objects.using('another_db')) == []

      try:
          with transaction.atomic():
              with transaction.atomic('another_db'):
                  obj = YourModel.objects.using('another_db').create(name='test')
              raise ZeroDivisionError
      except ZeroDivisionError:
          pass

      # Before django-cachalot 1.1.0, this assert was failing.
      assert list(YourModel.objects.using('another_db')) == [obj]

1.0.3
-----

- Fixes an invalidation issue that could rarely occur when querying on a ``BinaryField`` with PostgreSQL, or with some geographic queries (there was a small chance that a same query with different parameters could erroneously give the same result as the previous one)
- Adds a ``CACHALOT_UNCACHABLE_TABLES`` setting
- Fixes a Django 1.7 migrations invalidation issue in tests (that was leading to this error half of the time: ``RuntimeError: Error creating new content types. Please make sure contenttypes is migrated before trying to migrate apps individually.``)
- Optimises tests when using django-cachalot by avoiding several useless cache invalidations

1.0.2
-----

- Fixes an ``AttributeError`` occurring when excluding through a many-to-many relation on a child model (using multi-table inheritance)
- Stops caching queries with random subqueries – for example ``User.objects.filter(pk__in=User.objects.order_by('?'))``
- Optimises automatic invalidation
- Adds a note about clock synchronisation

1.0.1
-----

- Fixes an invalidation issue discovered by Helen Warren that was occurring when updating a ``ManyToManyField`` after using ``.exclude`` on that relation. For example, ``Permission.objects.all().delete()`` was not invalidating ``User.objects.exclude(user_permissions=None)``
- Fixes a ``UnicodeDecodeError`` introduced with python-memcached 1.54
- Adds a ``post_invalidation`` signal

1.0.0
-----

Fixes a bug occurring when caching a SQL query using a non-ascii table name.
1.0.0rc ------- Added: - Adds an `invalidate_cachalot` command to invalidate django-cachalot from a script without having to clear the whole cache - Adds the benchmark introduction, conditions & results to the documentation - Adds a short guide on how to configure Redis as a LRU cache Fixed: - Fixes a rare invalidation issue occurring when updating a many-to-many table after executing a queryset generating a ``HAVING`` SQL statement – for example, ``User.objects.first().user_permissions.add(Permission.objects.first())`` was not invalidating ``User.objects.annotate(n=Count('user_permissions')).filter(n__gte=1)`` - Fixes an even rarer invalidation issue occurring when updating a many-to-many table after executing a queryset filtering nested subqueries by another subquery through that many-to-many table – for example:: User.objects.filter( pk__in=User.objects.filter( pk__in=User.objects.filter( user_permissions__in=Permission.objects.all()))) - Avoids setting useless cache keys by using table names instead of Django-generated table alias 0.9.0 ----- Added: - Caches all queries implying ``Queryset.extra`` - Invalidates raw queries - Adds a simple API containing: ``invalidate_tables``, ``invalidate_models``, ``invalidate_all`` - Adds file-based cache support for Django 1.7 - Adds a setting to choose if random queries must be cached - Adds 2 settings to customize how cache keys are generated - Adds a django-debug-toolbar panel - Adds a benchmark Fixed: - Rewrites invalidation for a better speed & memory performance - Fixes a stale cache issue occurring when an invalidation is done exactly during a SQL request on the invalidated table(s) - Fixes a stale cache issue occurring after concurrent transactions - Uses an infinite timeout Removed: - Simplifies ``cachalot_settings`` and forbids its use or modification 0.8.1 ----- - Fixes an issue with pip if Django is not yet installed 0.8.0 ----- - Adds multi-database support - Adds invalidation when altering the DB schema using 
`migrate`, `syncdb`, `flush`, `loaddata` commands (also invalidates South, if you use it) - Small optimizations & simplifications - Adds several tests 0.7.0 ----- - Adds thread-safety - Optimizes the amount of cache queries during transaction 0.6.0 ----- - Adds memcached support 0.5.0 ----- - Adds ``CACHALOT_ENABLED`` & ``CACHALOT_CACHE`` settings - Allows settings to be dynamically overridden using ``cachalot_settings`` - Adds some missing tests 0.4.1 ----- - Fixes ``pip install``. 0.4.0 (**install broken**) -------------------------- - Adds Travis CI and adds compatibility for: - Django 1.6 & 1.7 - Python 2.6, 2.7, 3.2, 3.3, & 3.4 - locmem & Redis - SQLite, PostgreSQL, MySQL 0.3.0 ----- - Handles transactions - Adds lots of tests for complex cases 0.2.0 ----- - Adds a test suite - Fixes invalidation for data creation/deletion - Stops caching on queries defining ``select`` or ``where`` arguments with ``QuerySet.extra`` 0.1.0 ----- Prototype simply caching all SQL queries reading the database and trying to invalidate them when SQL queries modify the database. Has issues invalidating deletions and creations. Also caches ``QuerySet.extra`` queries but can’t reliably invalidate them. No transaction support, no test, no multi-database support, etc. django-cachalot-2.8.0/CONTRIBUTING.rst000066400000000000000000000013041500004256100171670ustar00rootroot00000000000000Thanks for contributing to Django Cachalot! We appreciate any support in improvements to this system in performance, erasing dependency errors, or in killing bugs. When you start a PR or issue, please follow the layout provided. To start developing, install the requirements and run the tests via tox. Make sure you have the following services: * Memcached * Redis * PostgreSQL * MySQL For setup: #. Install: ``pip install -r requirements/hacking.txt`` #. For PostgreSQL: ``CREATE ROLE cachalot LOGIN SUPERUSER;`` #. Run: ``tox --current-env`` to run the test suite on your current Python version. #. 
You can also run specific databases and Django versions: ``tox -e py312-django5.2-postgresql-redis``

django-cachalot-2.8.0/LICENSE:

Copyright (c) 2014-2016, Bertrand Bordage
All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

* Redistributions of source code must retain the above copyright notice, this
  list of conditions and the following disclaimer.

* Redistributions in binary form must reproduce the above copyright notice,
  this list of conditions and the following disclaimer in the documentation
  and/or other materials provided with the distribution.

* Neither the name of django-cachalot nor the names of its contributors may
  be used to endorse or promote products derived from this software without
  specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
django-cachalot-2.8.0/MANIFEST.in000066400000000000000000000002371500004256100162700ustar00rootroot00000000000000include README.rst LICENSE CHANGELOG.rst requirements.txt tox.ini runtests.py runtests_urls.py settings.py recursive-include cachalot *.json *.html graft docs django-cachalot-2.8.0/README.rst000066400000000000000000000136701500004256100162260ustar00rootroot00000000000000Django Cachalot =============== Caches your Django ORM queries and automatically invalidates them. Documentation: http://django-cachalot.readthedocs.io ---- .. image:: http://img.shields.io/pypi/v/django-cachalot.svg?style=flat-square&maxAge=3600 :target: https://pypi.python.org/pypi/django-cachalot .. image:: https://img.shields.io/pypi/pyversions/django-cachalot :target: https://django-cachalot.readthedocs.io/en/latest/ .. image:: https://github.com/noripyt/django-cachalot/actions/workflows/ci.yml/badge.svg :target: https://github.com/noripyt/django-cachalot/actions/workflows/ci.yml .. image:: http://img.shields.io/coveralls/noripyt/django-cachalot/master.svg?style=flat-square&maxAge=3600 :target: https://coveralls.io/r/noripyt/django-cachalot?branch=master .. image:: http://img.shields.io/scrutinizer/g/noripyt/django-cachalot/master.svg?style=flat-square&maxAge=3600 :target: https://scrutinizer-ci.com/g/noripyt/django-cachalot/ .. image:: https://img.shields.io/discord/773656139207802881 :target: https://discord.gg/WFGFBk8rSU ---- Table of Contents: - Quickstart - Usage - Hacking - Benchmark - Third-Party Cache Comparison - Discussion Quickstart ---------- Cachalot officially supports Python 3.8-3.12 and Django 3.2, 4.1, 4.2, 5.0, 5.1, 5.2 with the databases PostgreSQL, SQLite, and MySQL. Note: an upper limit on Django version is set for your safety. Please do not ignore it. Usage ----- #. ``pip install django-cachalot`` #. Add ``'cachalot',`` to your ``INSTALLED_APPS`` #. If you use multiple servers with a common cache server, `double check their clock synchronisation `_ #. 
If you modify data outside Django – typically after restoring a SQL database –, use the `manage.py command `_ #. Be aware of `the few other limits `_ #. If you use `django-debug-toolbar `_, you can add ``'cachalot.panels.CachalotPanel',`` to your ``DEBUG_TOOLBAR_PANELS`` #. Enjoy! Hacking ------- To start developing, install the requirements and run the tests via tox. Make sure you have the following services: * Memcached * Redis * PostgreSQL * MySQL For setup: #. Install: ``pip install -r requirements/hacking.txt`` #. For PostgreSQL: ``CREATE ROLE cachalot LOGIN SUPERUSER;`` #. Run: ``tox --current-env`` to run the test suite on your current Python version. #. You can also run specific databases and Django versions: ``tox -e py312-django5.2-postgresql-redis`` Benchmark --------- Currently, benchmarks are supported on Linux and Mac/Darwin. You will need a database called "cachalot" on MySQL and PostgreSQL. Additionally, on PostgreSQL, you will need to create a role called "cachalot." You can also run the benchmark, and it'll raise errors with specific instructions for how to fix it. #. Install: ``pip install -r requirements/benchmark.txt`` #. Run: ``python benchmark.py`` The output will be in benchmark/TODAY'S_DATE/ TODO Create Docker-compose file to allow for easier running of data. Third-Party Cache Comparison ---------------------------- There are three main third party caches: cachalot, cache-machine, and cache-ops. Which do you use? We suggest a mix: TL;DR Use cachalot for cold or modified <50 times per minutes (Most people should stick with only cachalot since you most likely won't need to scale to the point of needing cache-machine added to the bowl). If you're an enterprise that already has huge statistics, then mixing cold caches for cachalot and your hot caches with cache-machine is the best mix. However, when performing joins with ``select_related`` and ``prefetch_related``, you can get a nearly 100x speed up for your initial deployment. 
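One practical way to apply this advice: keep cachalot enabled globally but exclude the handful of hot, frequently modified tables, so only cold tables stay cached. A sketch for ``settings.py`` using the documented ``CACHALOT_UNCACHABLE_TABLES`` setting (the ``myapp_activitylog`` table name is a made-up example):

```python
# settings.py: cache everything except hot tables whose constant writes
# would invalidate the cached table dozens of times per minute.
CACHALOT_UNCACHABLE_TABLES = frozenset((
    'django_session',
    'myapp_activitylog',  # hypothetical hot table
))
```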
Recall, cachalot caches THE ENTIRE TABLE. That's where its inefficiency stems from: if you keep updating the records, then the cachalot constantly invalidates the table and re-caches. Luckily caching is very efficient, it's just the cache invalidation part that kills all our systems. Look at Note 1 below to see how Reddit deals with it. Cachalot is more-or-less intended for cold caches or "just-right" conditions. If you find a partition library for Django (also authored but work-in-progress by `Andrew Chen Wang`_), then the caching will work better since sharding the cold/accessed-the-least records aren't invalidated as much. Cachalot is good when there are <50 modifications per minute on a hot cached table. This is mostly due to cache invalidation. It's the same with any cache, which is why we suggest you use cache-machine for hot caches. Cache-machine caches individual objects, taking up more in the memory store but invalidates those individual objects instead of the entire table like cachalot. Yes, the bane of our entire existence lies in cache invalidation and naming variables. Why does cachalot suck when stuck with a huge table that's modified rapidly? Since you've mixed your cold (90% of) with your hot (10% of) records, you're caching and invalidating an entire table. It's like trying to boil 1 ton of noodles inside ONE pot instead of 100 pots boiling 1 ton of noodles. Which is more efficient? The splitting up of them. Note 1: My personal experience with caches stems from Reddit's: https://web.archive.org/web/20210803213621/https://redditblog.com/2017/01/17/caching-at-reddit/ Note 2: Technical comparison: https://django-cachalot.readthedocs.io/en/latest/introduction.html#comparison-with-similar-tools Discussion ---------- Help? Technical chat? `It's here on Discord `_. Legacy chats: - https://gitter.im/django-cachalot/Lobby - https://join.slack.com/t/cachalotdjango/shared_invite/zt-dd0tj27b-cIH6VlaSOjAWnTG~II5~qw .. 
_Andrew Chen Wang: https://github.com/Andrew-Chen-Wang .. image:: https://raw.github.com/noripyt/django-cachalot/master/django-cachalot.jpg django-cachalot-2.8.0/benchmark.py000077500000000000000000000307531500004256100170470ustar00rootroot00000000000000import io import os import platform import re import sqlite3 from collections import OrderedDict from datetime import datetime from random import choice from subprocess import check_output from time import time os.environ.setdefault("DJANGO_SETTINGS_MODULE", "settings") import django django.setup() import matplotlib.pyplot as plt import pandas as pd import psycopg2 from django.conf import settings from django.contrib.auth.models import Group, User from django.core.cache import caches from django.db import connection, connections from django.test.utils import CaptureQueriesContext, override_settings from django.utils.encoding import force_text from MySQLdb import _mysql import cachalot from cachalot.api import invalidate from cachalot.tests.models import Test RESULTS_PATH = f"benchmark/docs/{datetime.now().date()}/" CONTEXTS = ("Control", "Cold cache", "Hot cache") DIVIDER = "divider" LINUX_DATA_PATH = "/var/lib/" DISK_DATA_RE = re.compile(r'^MODEL="(.*)" MOUNTPOINT="(.*)"$') def get_disk_model_for_path_linux(path): out = force_text(check_output(["lsblk", "-Po", "MODEL,MOUNTPOINT"])) mount_points = [] previous_model = None for model, mount_point in [ DISK_DATA_RE.match(line).groups() for line in out.split("\n") if line ]: if model: previous_model = model.strip() if mount_point: mount_points.append((previous_model, mount_point)) mount_points = sorted(mount_points, key=lambda t: -len(t[1])) for model, mount_point in mount_points: if path.startswith(mount_point): return model def write_conditions(): versions = OrderedDict() distribution = platform.uname() # Linux if distribution.system == "Linux": # CPU with open("/proc/cpuinfo") as f: versions["CPU"] = re.search( r"^model name\s+: (.+)$", f.read(), flags=re.MULTILINE 
).group(1) # RAM with open("/proc/meminfo") as f: versions["RAM"] = re.search( r"^MemTotal:\s+(.+)$", f.read(), flags=re.MULTILINE ).group(1) # Disk Model versions.update((("Disk", get_disk_model_for_path_linux(LINUX_DATA_PATH)),)) # OS versions["Linux distribution"] = f"{distribution.system} {distribution.release}" # Darwin else: # CPU versions["CPU"] = os.popen("sysctl -n machdep.cpu.brand_string").read().rstrip("\n") # RAM versions["RAM"] = os.popen("sysctl -n hw.memsize").read().rstrip("\n") # Disk Model versions["DISK"] = os.popen( "diskutil info /dev/disk0 | grep 'Device / Media Name'" ).read().split(":")[1].rstrip("\n").lstrip(" ") # OS versions["OS"] = f"{distribution.system} {distribution.release}" versions.update( ( ("Python", platform.python_version()), ("Django", django.__version__), ("cachalot", cachalot.__version__), ("sqlite", sqlite3.sqlite_version), ) ) # PostgreSQL try: with connections["postgresql"].cursor() as cursor: cursor.execute("SELECT version();") versions["PostgreSQL"] = re.match( r"^PostgreSQL\s+(\S+)\s", cursor.fetchone()[0] ).group(1) except django.db.utils.OperationalError: raise django.db.utils.OperationalError( "You need a PostgreSQL DB called \"cachalot\" first. " "Login with \"psql -U postgres -h localhost\" and run: " "CREATE DATABASE cachalot;" ) # MySQL try: with connections["mysql"].cursor() as cursor: cursor.execute("SELECT version();") versions["MySQL"] = cursor.fetchone()[0].split("-")[0] except django.db.utils.OperationalError: raise django.db.utils.OperationalError( "You need a MySQL DB called \"cachalot\" first. 
" "Login with \"mysql -u root\" and run: CREATE DATABASE cachalot;" ) # Redis out = force_text(check_output(["redis-cli", "INFO", "server"])).replace("\r", "") versions["Redis"] = re.search( r"^redis_version:([\d\.]+)$", out, flags=re.MULTILINE ).group(1) # memcached out = force_text(check_output(["memcached", "-h"])) versions["memcached"] = re.match( r"^memcached ([\d\.]+)$", out, flags=re.MULTILINE ).group(1) versions.update( ( ("psycopg2", psycopg2.__version__.split()[0]), ("mysqlclient", _mysql.__version__), ) ) with io.open(os.path.join(RESULTS_PATH, "conditions.rst"), "w") as f: f.write( "In this benchmark, a small database is generated, " "and each test is executed %s times " "under the following conditions:\n\n" % Benchmark.n ) def write_table_sep(char="="): f.write((char * 20) + " " + (char * 50) + "\n") write_table_sep() for k, v in versions.items(): f.write(k.ljust(20) + " " + v + "\n") write_table_sep() class AssertNumQueries(CaptureQueriesContext): def __init__(self, n, using=None): self.n = n self.using = using super(AssertNumQueries, self).__init__(self.get_connection()) def get_connection(self): if self.using is None: return connection return connections[self.using] def __exit__(self, exc_type, exc_val, exc_tb): super(AssertNumQueries, self).__exit__(exc_type, exc_val, exc_tb) if len(self) != self.n: print( "The amount of queries should be %s, but %s were captured." 
% (self.n, len(self)) ) class Benchmark(object): n = 20 def __init__(self): self.data = [] def bench_once(self, context, num_queries, invalidate_before=False): for _ in range(self.n): if invalidate_before: invalidate(db_alias=self.db_alias) with AssertNumQueries(num_queries, using=self.db_alias): start = time() self.query_function(self.db_alias) end = time() self.data.append( { "query": self.query_name, "time": end - start, "context": context, "db": self.db_vendor, "cache": self.cache_name, } ) def benchmark(self, query_str, to_list=True, num_queries=1): # Clears the cache before a single benchmark to ensure the same # conditions across single benchmarks. caches[settings.CACHALOT_CACHE].clear() self.query_name = query_str query_str = "Test.objects.using(using)" + query_str if to_list: query_str = "list(%s)" % query_str self.query_function = eval("lambda using: " + query_str) with override_settings(CACHALOT_ENABLED=False): self.bench_once(CONTEXTS[0], num_queries) self.bench_once(CONTEXTS[1], num_queries, invalidate_before=True) self.bench_once(CONTEXTS[2], 0) def execute_benchmark(self): self.benchmark(".count()", to_list=False) self.benchmark(".first()", to_list=False) self.benchmark("[:10]") self.benchmark("[5000:5010]") self.benchmark(".filter(name__icontains='e')[0:10]") self.benchmark(".filter(name__icontains='e')[5000:5010]") self.benchmark(".order_by('owner')[0:10]") self.benchmark(".order_by('owner')[5000:5010]") self.benchmark(".select_related('owner')[0:10]") self.benchmark(".select_related('owner')[5000:5010]") self.benchmark(".prefetch_related('owner__groups')[0:10]", num_queries=3) self.benchmark(".prefetch_related('owner__groups')[5000:5010]", num_queries=3) def run(self): for db_alias in settings.DATABASES: self.db_alias = db_alias self.db_vendor = connections[self.db_alias].vendor print("Benchmarking %s…" % self.db_vendor) for cache_alias in settings.CACHES: cache = caches[cache_alias] self.cache_name = cache.__class__.__name__[:-5].lower() with 
override_settings(CACHALOT_CACHE=cache_alias): self.execute_benchmark() self.df = pd.DataFrame.from_records(self.data) if not os.path.exists(RESULTS_PATH): os.mkdir(RESULTS_PATH) self.df.to_csv(os.path.join(RESULTS_PATH, "data.csv")) self.xlim = (0, self.df["time"].max() * 1.01) self.output("db") self.output("cache") def output(self, param): gp = self.df.groupby(["context", "query", param])["time"] self.means = gp.mean().unstack().unstack().reindex(CONTEXTS) los = self.means - gp.min().unstack().unstack().reindex(CONTEXTS) ups = gp.max().unstack().unstack().reindex(CONTEXTS) - self.means self.errors = dict( ( key, dict( ( subkey, [ [los[key][subkey][context] for context in self.means.index], [ups[key][subkey][context] for context in self.means.index], ], ) for subkey in self.means.columns.levels[1] ), ) for key in self.means.columns.levels[0] ) self.get_perfs(param) self.plot_detail(param) gp = self.df.groupby(["context", param])["time"] self.means = gp.mean().unstack().reindex(CONTEXTS) los = self.means - gp.min().unstack().reindex(CONTEXTS) ups = gp.max().unstack().reindex(CONTEXTS) - self.means self.errors = [ [ [los[key][context] for context in self.means.index], [ups[key][context] for context in self.means.index], ] for key in self.means ] self.plot_general(param) def get_perfs(self, param): with io.open(os.path.join(RESULTS_PATH, param + "_results.rst"), "w") as f: for v in self.means.columns.levels[0]: g = self.means[v].mean(axis=1) perf = "%s is %.1f× slower than %.1f× faster" % ( v.ljust(10), g[CONTEXTS[1]] / g[CONTEXTS[0]], g[CONTEXTS[0]] / g[CONTEXTS[2]], ) print(perf) f.write("- %s\n" % perf) def plot_detail(self, param): for v in self.means.columns.levels[0]: plt.figure() axes = self.means[v].plot( kind="barh", xerr=self.errors[v], xlim=self.xlim, figsize=(15, 15), subplots=True, layout=(6, 2), sharey=True, legend=False, ) plt.gca().invert_yaxis() for row in axes: for ax in row: ax.xaxis.grid(True) ax.set_ylabel("") ax.set_xlabel("Time (s)")
plt.savefig(os.path.join(RESULTS_PATH, "%s_%s.svg" % (param, v))) def plot_general(self, param): plt.figure() ax = self.means.plot(kind="barh", xerr=self.errors, xlim=self.xlim) ax.invert_yaxis() ax.xaxis.grid(True) ax.set_ylabel("") ax.set_xlabel("Time (s)") plt.savefig(os.path.join(RESULTS_PATH, "%s.svg" % param)) def create_data(using): User.objects.using(using).bulk_create( [User(username="user%d" % i) for i in range(50)] ) Group.objects.using(using).bulk_create( [Group(name="test%d" % i) for i in range(10)] ) groups = list(Group.objects.using(using)) for u in User.objects.using(using): u.groups.add(choice(groups), choice(groups)) users = list(User.objects.using(using)) Test.objects.using(using).bulk_create( [Test(name="test%d" % i, owner=choice(users)) for i in range(10000)] ) if __name__ == "__main__": if not os.path.exists(RESULTS_PATH): os.mkdir(RESULTS_PATH) write_conditions() old_db_names = {} for alias in connections: conn = connections[alias] old_db_names[alias] = conn.settings_dict["NAME"] conn.creation.create_test_db(autoclobber=True) print("Populating %s…" % connections[alias].vendor) create_data(alias) Benchmark().run() for alias in connections: connections[alias].creation.destroy_test_db(old_db_names[alias]) django-cachalot-2.8.0/benchmark/000077500000000000000000000000001500004256100164625ustar00rootroot00000000000000django-cachalot-2.8.0/benchmark/docs/000077500000000000000000000000001500004256100174125ustar00rootroot00000000000000django-cachalot-2.8.0/benchmark/docs/2018-08-09/000077500000000000000000000000001500004256100204575ustar00rootroot00000000000000django-cachalot-2.8.0/benchmark/docs/2018-08-09/cache.svg000066400000000000000000001123131500004256100222440ustar00rootroot00000000000000 django-cachalot-2.8.0/benchmark/docs/2018-08-09/cache_filebased.svg000066400000000000000000004703421500004256100242530ustar00rootroot00000000000000 
django-cachalot-2.8.0/benchmark/docs/2018-08-09/cache_locmem.svg000066400000000000000000004703441500004256100236130ustar00rootroot00000000000000 django-cachalot-2.8.0/benchmark/docs/2018-08-09/cache_memcached.svg000066400000000000000000004703411500004256100242420ustar00rootroot00000000000000 django-cachalot-2.8.0/benchmark/docs/2018-08-09/cache_pylibmc.svg000066400000000000000000004703271500004256100237770ustar00rootroot00000000000000 django-cachalot-2.8.0/benchmark/docs/2018-08-09/cache_redis.svg000066400000000000000000004703301500004256100234400ustar00rootroot00000000000000 django-cachalot-2.8.0/benchmark/docs/2018-08-09/cache_results.rst000066400000000000000000000003531500004256100240360ustar00rootroot00000000000000- filebased is 1.2× slower then 5.8× faster - locmem is 1.1× slower then 6.1× faster - memcached is 1.1× slower then 5.0× faster - pylibmc is 1.1× slower then 5.6× faster - redis is 1.1× slower then 5.6× faster django-cachalot-2.8.0/benchmark/docs/2018-08-09/conditions.rst000066400000000000000000000013171500004256100233640ustar00rootroot00000000000000In this benchmark, a small database is generated, and each test is executed 20 times under the following conditions: ==================== ================================================== CPU Intel(R) Core(TM) i7-7700HQ CPU @ 2.80GHz RAM 24634516 kB Disk SAMSUNG MZVPW256HEGL-00000 Linux distribution Ubuntu 18.04 bionic Python 3.7.0b3 Django 2.1 cachalot 2.1.0 sqlite 3.22.0 PostgreSQL 10.4 MySQL 5.7.23 Redis 4.0.9 memcached 1.5.6 psycopg2 2.7.5 mysqlclient 1.3.13 ==================== ================================================== django-cachalot-2.8.0/benchmark/docs/2018-08-09/db.svg000066400000000000000000001060371500004256100215740ustar00rootroot00000000000000 django-cachalot-2.8.0/benchmark/docs/2018-08-09/db_mysql.svg000066400000000000000000004703421500004256100230240ustar00rootroot00000000000000 
django-cachalot-2.8.0/benchmark/docs/2018-08-09/db_postgresql.svg000066400000000000000000004703521500004256100240630ustar00rootroot00000000000000 django-cachalot-2.8.0/benchmark/docs/2018-08-09/db_results.rst000066400000000000000000000002151500004256100233550ustar00rootroot00000000000000- mysql is 1.1× slower then 4.0× faster - postgresql is 1.1× slower then 9.0× faster - sqlite is 1.2× slower then 4.3× faster django-cachalot-2.8.0/benchmark/docs/2018-08-09/db_sqlite.svg000066400000000000000000004703431500004256100231610ustar00rootroot00000000000000 django-cachalot-2.8.0/cachalot/000077500000000000000000000000001500004256100163065ustar00rootroot00000000000000django-cachalot-2.8.0/cachalot/__init__.py000066400000000000000000000001631500004256100204170ustar00rootroot00000000000000VERSION = (2, 8, 0) __version__ = ".".join(map(str, VERSION)) default_app_config = "cachalot.apps.CachalotConfig" django-cachalot-2.8.0/cachalot/admin_tests/000077500000000000000000000000001500004256100206205ustar00rootroot00000000000000django-cachalot-2.8.0/cachalot/admin_tests/__init__.py000066400000000000000000000000001500004256100227170ustar00rootroot00000000000000django-cachalot-2.8.0/cachalot/admin_tests/admin.py000066400000000000000000000002501500004256100222570ustar00rootroot00000000000000from django.contrib import admin from .models import TestModel @admin.register(TestModel) class TestModelAdmin(admin.ModelAdmin): list_display = ('name', 'owner') django-cachalot-2.8.0/cachalot/admin_tests/migrations/000077500000000000000000000000001500004256100227745ustar00rootroot00000000000000django-cachalot-2.8.0/cachalot/admin_tests/migrations/0001_initial.py000066400000000000000000000027301500004256100254410ustar00rootroot00000000000000# Generated by Django 4.1.7 on 2023-03-10 19:33 from django.conf import settings from django.db import migrations, models import django.db.models.deletion import django.db.models.functions.text class Migration(migrations.Migration): initial = True dependencies = 
[ migrations.swappable_dependency(settings.AUTH_USER_MODEL), ] operations = [ migrations.CreateModel( name="TestModel", fields=[ ( "id", models.AutoField( auto_created=True, primary_key=True, serialize=False, verbose_name="ID", ), ), ("name", models.CharField(max_length=20)), ( "owner", models.ForeignKey( blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to=settings.AUTH_USER_MODEL, ), ), ], options={ "ordering": ("name",), }, ), migrations.AddConstraint( model_name="testmodel", constraint=models.UniqueConstraint( fields=["name"], condition=models.Q(owner=None), name="unique_name", ), ), ] django-cachalot-2.8.0/cachalot/admin_tests/migrations/__init__.py000066400000000000000000000000001500004256100250730ustar00rootroot00000000000000django-cachalot-2.8.0/cachalot/admin_tests/models.py000066400000000000000000000010251500004256100224530ustar00rootroot00000000000000from django.conf import settings from django.db.models import Q, UniqueConstraint, Model, CharField, ForeignKey, SET_NULL class TestModel(Model): name = CharField(max_length=20) owner = ForeignKey(settings.AUTH_USER_MODEL, null=True, blank=True, on_delete=SET_NULL) class Meta: ordering = ('name',) constraints = [ UniqueConstraint( fields=["name"], condition=Q(owner=None), name="unique_name", ) ] django-cachalot-2.8.0/cachalot/admin_tests/test_admin.py000066400000000000000000000013521500004256100233220ustar00rootroot00000000000000from django.test import TestCase from django.contrib.auth.models import User from .models import TestModel from django.test import Client class AdminTestCase(TestCase): def setUp(self): self.client = Client() self.user = User.objects.create(username='admin', is_staff=True, is_superuser=True) def test_save_test_model(self): """ Model 'TestModel' has UniqueConstraint which caused problems when saving TestModelAdmin in Django >= 4.1 """ self.client.force_login(self.user) response = self.client.post('/admin/admin_tests/testmodel/add/', {'name': 'test', 'public': True}) 
self.assertEqual(response.status_code, 302) self.assertEqual(TestModel.objects.count(), 1) django-cachalot-2.8.0/cachalot/api.py000066400000000000000000000136041500004256100174350ustar00rootroot00000000000000from contextlib import contextmanager from typing import Any, Optional, Tuple, Union from django.apps import apps from django.conf import settings from django.db import connections from .cache import cachalot_caches from .settings import cachalot_settings from .signals import post_invalidation from .transaction import AtomicCache from .utils import _invalidate_tables try: from asgiref.local import Local LOCAL_STORAGE = Local() except ImportError: import threading LOCAL_STORAGE = threading.local() __all__ = ('invalidate', 'get_last_invalidation', 'cachalot_disabled') def _cache_db_tables_iterator(tables, cache_alias, db_alias): no_tables = not tables cache_aliases = settings.CACHES if cache_alias is None else (cache_alias,) db_aliases = settings.DATABASES if db_alias is None else (db_alias,) for db_alias in db_aliases: if no_tables: tables = connections[db_alias].introspection.table_names() if tables: for cache_alias in cache_aliases: yield cache_alias, db_alias, tables def _get_tables(tables_or_models): for table_or_model in tables_or_models: if isinstance(table_or_model, str) and '.' in table_or_model: try: table_or_model = apps.get_model(table_or_model) except LookupError: pass yield (table_or_model if isinstance(table_or_model, str) else table_or_model._meta.db_table) def invalidate( *tables_or_models: Tuple[Union[str, Any], ...], cache_alias: Optional[str] = None, db_alias: Optional[str] = None, ) -> None: """ Clears what was cached by django-cachalot implying one or more SQL tables or models from ``tables_or_models``. If ``tables_or_models`` is not specified, all tables found in the database (including those outside Django) are invalidated. 
If ``cache_alias`` is specified, it only clears the SQL queries stored on this cache, otherwise queries from all caches are cleared. If ``db_alias`` is specified, it only clears the SQL queries executed on this database, otherwise queries from all databases are cleared. :arg tables_or_models: SQL tables names, models or models lookups (or a combination) :type tables_or_models: tuple of strings or models :arg cache_alias: Alias from the Django ``CACHES`` setting :arg db_alias: Alias from the Django ``DATABASES`` setting :returns: Nothing """ send_signal = False invalidated = set() for cache_alias, db_alias, tables in _cache_db_tables_iterator( list(_get_tables(tables_or_models)), cache_alias, db_alias): cache = cachalot_caches.get_cache(cache_alias, db_alias) if not isinstance(cache, AtomicCache): send_signal = True _invalidate_tables(cache, db_alias, tables) invalidated.update(tables) if send_signal: for table in invalidated: post_invalidation.send(table, db_alias=db_alias) def get_last_invalidation( *tables_or_models: Tuple[Union[str, Any], ...], cache_alias: Optional[str] = None, db_alias: Optional[str] = None, ) -> float: """ Returns the timestamp of the most recent invalidation of the given ``tables_or_models``. If ``tables_or_models`` is not specified, all tables found in the database (including those outside Django) are used. If ``cache_alias`` is specified, it only fetches invalidations in this cache, otherwise invalidations in all caches are fetched. If ``db_alias`` is specified, it only fetches invalidations for this database, otherwise invalidations for all databases are fetched. 
:arg tables_or_models: SQL tables names, models or models lookups (or a combination) :type tables_or_models: tuple of strings or models :arg cache_alias: Alias from the Django ``CACHES`` setting :arg db_alias: Alias from the Django ``DATABASES`` setting :returns: The timestamp of the most recent invalidation """ last_invalidation = 0.0 for cache_alias, db_alias, tables in _cache_db_tables_iterator( list(_get_tables(tables_or_models)), cache_alias, db_alias): get_table_cache_key = cachalot_settings.CACHALOT_TABLE_KEYGEN table_cache_keys = [get_table_cache_key(db_alias, t) for t in tables] invalidations = cachalot_caches.get_cache( cache_alias, db_alias).get_many(table_cache_keys).values() if invalidations: current_last_invalidation = max(invalidations) if current_last_invalidation > last_invalidation: last_invalidation = current_last_invalidation return last_invalidation @contextmanager def cachalot_disabled(all_queries: bool = False): """ Context manager for temporarily disabling cachalot. If you evaluate the same queryset a second time, like normally for Django querysets, this will access the variable that saved it in-memory. For example: .. code-block:: python with cachalot_disabled(): qs = Test.objects.filter(blah=blah) # Does a single query to the db list(qs) # Evaluates queryset # Because the qs was evaluated, it's # saved in memory: list(qs) # this does 0 queries. # This does 1 query to the db list(Test.objects.filter(blah=blah)) If you evaluate the queryset outside the context manager, any duplicate query will use the cached result unless an object creation happens in between the original and duplicate query. :arg all_queries: Any query, including already evaluated queries, are re-evaluated. 
""" was_enabled = getattr(LOCAL_STORAGE, "cachalot_enabled", cachalot_settings.CACHALOT_ENABLED) LOCAL_STORAGE.cachalot_enabled = False LOCAL_STORAGE.disable_on_all = all_queries yield LOCAL_STORAGE.cachalot_enabled = was_enabled django-cachalot-2.8.0/cachalot/apps.py000066400000000000000000000067361500004256100176370ustar00rootroot00000000000000import copyreg from django.apps import AppConfig from django.conf import settings from django.core.checks import register, Tags, Warning, Error from cachalot.utils import ITERABLES from .settings import ( cachalot_settings, SUPPORTED_CACHE_BACKENDS, SUPPORTED_DATABASE_ENGINES, SUPPORTED_ONLY) @register(Tags.caches, Tags.compatibility) def check_cache_compatibility(app_configs, **kwargs): cache = settings.CACHES[cachalot_settings.CACHALOT_CACHE] cache_backend = cache['BACKEND'] if cache_backend not in SUPPORTED_CACHE_BACKENDS: return [Warning( 'Cache backend %r is not supported by django-cachalot.' % cache_backend, hint='Switch to a supported cache backend ' 'like Redis or Memcached.', id='cachalot.W001')] return [] @register(Tags.database, Tags.compatibility) def check_databases_compatibility(app_configs, **kwargs): errors = [] databases = settings.DATABASES original_enabled_databases = getattr(settings, 'CACHALOT_DATABASES', SUPPORTED_ONLY) enabled_databases = cachalot_settings.CACHALOT_DATABASES if original_enabled_databases == SUPPORTED_ONLY: if not cachalot_settings.CACHALOT_DATABASES: errors.append(Warning( 'None of the configured databases are supported ' 'by django-cachalot.', hint='Use a supported database, or remove django-cachalot, or ' 'put at least one database alias in `CACHALOT_DATABASES` ' 'to force django-cachalot to use it.', id='cachalot.W002' )) elif enabled_databases.__class__ in ITERABLES: for db_alias in enabled_databases: if db_alias in databases: engine = databases[db_alias]['ENGINE'] if engine not in SUPPORTED_DATABASE_ENGINES: errors.append(Warning( 'Database engine %r is not supported ' 'by 
django-cachalot.' % engine, hint='Switch to a supported database engine.', id='cachalot.W003' )) else: errors.append(Error( 'Database alias %r from `CACHALOT_DATABASES` ' 'is not defined in `DATABASES`.' % db_alias, hint='Change `CACHALOT_DATABASES` to be compliant with ' '`DATABASES`.', id='cachalot.E001', )) if not enabled_databases: errors.append(Warning( 'Django-cachalot is useless because no database ' 'is configured in `CACHALOT_DATABASES`.', hint='Reconfigure django-cachalot or remove it.', id='cachalot.W004' )) else: errors.append(Error( "`CACHALOT_DATABASES` must be either %r or a list, tuple, " "frozenset or set of database aliases." % SUPPORTED_ONLY, hint='Remove `CACHALOT_DATABASES` or change it.', id='cachalot.E002', )) return errors class CachalotConfig(AppConfig): name = 'cachalot' def ready(self): # Cast memoryview objects to bytes to be able to pickle them. # https://docs.python.org/3/library/copyreg.html#copyreg.pickle copyreg.pickle(memoryview, lambda val: (memoryview, (bytes(val),))) cachalot_settings.load() django-cachalot-2.8.0/cachalot/cache.py000066400000000000000000000040351500004256100177250ustar00rootroot00000000000000from collections import defaultdict from threading import local from django.core.cache import caches from django.db import DEFAULT_DB_ALIAS from .settings import cachalot_settings from .signals import post_invalidation from .transaction import AtomicCache class CacheHandler(local): @property def atomic_caches(self): if not hasattr(self, '_atomic_caches'): self._atomic_caches = defaultdict(list) return self._atomic_caches def get_atomic_cache(self, cache_alias, db_alias, level): if cache_alias not in self.atomic_caches[db_alias][level]: self.atomic_caches[db_alias][level][cache_alias] = AtomicCache( self.get_cache(cache_alias, db_alias, level-1), db_alias) return self.atomic_caches[db_alias][level][cache_alias] def get_cache(self, cache_alias=None, db_alias=None, atomic_level=-1): if db_alias is None: db_alias =
DEFAULT_DB_ALIAS if cache_alias is None: cache_alias = cachalot_settings.CACHALOT_CACHE min_level = -len(self.atomic_caches[db_alias]) if atomic_level < min_level: return caches[cache_alias] return self.get_atomic_cache(cache_alias, db_alias, atomic_level) def enter_atomic(self, db_alias): if db_alias is None: db_alias = DEFAULT_DB_ALIAS self.atomic_caches[db_alias].append({}) def exit_atomic(self, db_alias, commit): if db_alias is None: db_alias = DEFAULT_DB_ALIAS atomic_caches = self.atomic_caches[db_alias].pop().values() if commit: to_be_invalidated = set() for atomic_cache in atomic_caches: atomic_cache.commit() to_be_invalidated.update(atomic_cache.to_be_invalidated) # This happens when committing the outermost atomic block. if not self.atomic_caches[db_alias]: for table in to_be_invalidated: post_invalidation.send(table, db_alias=db_alias) cachalot_caches = CacheHandler() django-cachalot-2.8.0/cachalot/jinja2ext.py000066400000000000000000000052411500004256100205600ustar00rootroot00000000000000from django.core.cache import caches, DEFAULT_CACHE_ALIAS from django.core.cache.utils import make_template_fragment_key from jinja2.nodes import Keyword, Const, CallBlock from jinja2.ext import Extension from .api import get_last_invalidation class CachalotExtension(Extension): tags = {'cache'} allowed_kwargs = ('cache_key', 'timeout', 'cache_alias') def __init__(self, environment): super(CachalotExtension, self).__init__(environment) self.environment.globals.update( get_last_invalidation=get_last_invalidation) def parse_args(self, parser): args = [] kwargs = [] stream = parser.stream while stream.current.type != 'block_end': if stream.current.type == 'name' \ and stream.look().type == 'assign': key = stream.current.value if key not in self.allowed_kwargs: parser.fail( "'%s' is not a valid keyword argument " "for {%% cache %%}" % key, stream.current.lineno) stream.skip(2) value = parser.parse_expression() kwargs.append(Keyword(key, value, lineno=value.lineno)) else: 
args.append(parser.parse_expression()) if stream.current.type == 'block_end': break parser.stream.expect('comma') return args, kwargs def parse(self, parser): tag = parser.stream.current.value lineno = next(parser.stream).lineno args, kwargs = self.parse_args(parser) default_cache_key = (None if parser.filename is None else '%s:%d' % (parser.filename, lineno)) kwargs.append(Keyword('default_cache_key', Const(default_cache_key), lineno=lineno)) body = parser.parse_statements(['name:end' + tag], drop_needle=True) return CallBlock(self.call_method('cache', args, kwargs), [], [], body).set_lineno(lineno) def cache(self, *args, **kwargs): cache_alias = kwargs.get('cache_alias', DEFAULT_CACHE_ALIAS) cache_key = kwargs.get('cache_key', kwargs['default_cache_key']) if cache_key is None: raise ValueError( 'You must set `cache_key` when the template is not a file.') cache_key = make_template_fragment_key(cache_key, args) out = caches[cache_alias].get(cache_key) if out is None: out = kwargs['caller']() caches[cache_alias].set(cache_key, out, kwargs.get('timeout')) return out cachalot = CachalotExtension django-cachalot-2.8.0/cachalot/management/000077500000000000000000000000001500004256100204225ustar00rootroot00000000000000django-cachalot-2.8.0/cachalot/management/__init__.py000066400000000000000000000000001500004256100225210ustar00rootroot00000000000000django-cachalot-2.8.0/cachalot/management/commands/000077500000000000000000000000001500004256100222235ustar00rootroot00000000000000django-cachalot-2.8.0/cachalot/management/commands/__init__.py000066400000000000000000000000001500004256100243220ustar00rootroot00000000000000django-cachalot-2.8.0/cachalot/management/commands/invalidate_cachalot.py000066400000000000000000000036301500004256100265550ustar00rootroot00000000000000from django.conf import settings from django.core.management.base import BaseCommand from django.apps import apps from ...api import invalidate class Command(BaseCommand): help = 'Invalidates the cache keys 
set by django-cachalot.' def add_arguments(self, parser): parser.add_argument('app_label[.model_name]', nargs='*') parser.add_argument( '-c', '--cache', action='store', dest='cache_alias', choices=list(settings.CACHES.keys()), help='Cache alias from the CACHES setting.') parser.add_argument( '-d', '--db', action='store', dest='db_alias', choices=list(settings.DATABASES.keys()), help='Database alias from the DATABASES setting.') def handle(self, *args, **options): cache_alias = options['cache_alias'] db_alias = options['db_alias'] verbosity = int(options['verbosity']) labels = options['app_label[.model_name]'] models = [] for label in labels: try: models.extend(apps.get_app_config(label).get_models()) except LookupError: app_label = '.'.join(label.split('.')[:-1]) model_name = label.split('.')[-1] models.append(apps.get_model(app_label, model_name)) cache_str = '' if cache_alias is None else "on cache '%s'" % cache_alias db_str = '' if db_alias is None else "for database '%s'" % db_alias keys_str = 'keys for %s models' % len(models) if labels else 'all keys' if verbosity > 0: self.stdout.write(' '.join(filter(bool, ['Invalidating', keys_str, cache_str, db_str])) + '...') invalidate(*models, cache_alias=cache_alias, db_alias=db_alias) if verbosity > 0: self.stdout.write('Cache keys successfully invalidated.') django-cachalot-2.8.0/cachalot/models.py000066400000000000000000000000001500004256100201310ustar00rootroot00000000000000django-cachalot-2.8.0/cachalot/monkey_patch.py000066400000000000000000000156071500004256100213520ustar00rootroot00000000000000import re import types from collections.abc import Iterable from functools import wraps from time import time from django.core.exceptions import EmptyResultSet from django.db.backends.utils import CursorWrapper from django.db.models.signals import post_migrate from django.db.models.sql.compiler import ( SQLCompiler, SQLInsertCompiler, SQLUpdateCompiler, SQLDeleteCompiler, ) from django.db.transaction import Atomic, 
get_connection from .api import invalidate, LOCAL_STORAGE from .cache import cachalot_caches from .settings import cachalot_settings, ITERABLES from .utils import ( _get_table_cache_keys, _get_tables_from_sql, UncachableQuery, is_cachable, filter_cachable, ) WRITE_COMPILERS = (SQLInsertCompiler, SQLUpdateCompiler, SQLDeleteCompiler) SQL_DATA_CHANGE_RE = re.compile( '|'.join([ fr'(\W|\A){re.escape(keyword)}(\W|\Z)' for keyword in ['update', 'insert', 'delete', 'alter', 'create', 'drop'] ]), flags=re.IGNORECASE, ) def _unset_raw_connection(original): def inner(compiler, *args, **kwargs): compiler.connection.raw = False try: return original(compiler, *args, **kwargs) finally: compiler.connection.raw = True return inner def _get_result_or_execute_query(execute_query_func, cache, cache_key, table_cache_keys): try: data = cache.get_many(table_cache_keys + [cache_key]) except (KeyError, ModuleNotFoundError): data = None new_table_cache_keys = set(table_cache_keys) if data: new_table_cache_keys.difference_update(data) if not new_table_cache_keys: try: timestamp, result = data.pop(cache_key) if timestamp >= max(data.values()): return result except (KeyError, TypeError, ValueError): # In case `cache_key` is not in `data` or contains bad data, # we simply run the query and cache again the results. 
pass result = execute_query_func() if result.__class__ == types.GeneratorType and not cachalot_settings.CACHALOT_CACHE_ITERATORS: return result if result.__class__ not in ITERABLES and isinstance(result, Iterable): result = list(result) now = time() to_be_set = {k: now for k in new_table_cache_keys} to_be_set[cache_key] = (now, result) cache.set_many(to_be_set, cachalot_settings.CACHALOT_TIMEOUT) return result def _patch_compiler(original): @wraps(original) @_unset_raw_connection def inner(compiler, *args, **kwargs): execute_query_func = lambda: original(compiler, *args, **kwargs) # Checks if utils/cachalot_disabled if not getattr(LOCAL_STORAGE, "cachalot_enabled", True): return execute_query_func() db_alias = compiler.using if db_alias not in cachalot_settings.CACHALOT_DATABASES \ or isinstance(compiler, WRITE_COMPILERS): return execute_query_func() try: cache_key = cachalot_settings.CACHALOT_QUERY_KEYGEN(compiler) table_cache_keys = _get_table_cache_keys(compiler) except (EmptyResultSet, UncachableQuery): return execute_query_func() return _get_result_or_execute_query( execute_query_func, cachalot_caches.get_cache(db_alias=db_alias), cache_key, table_cache_keys) return inner def _patch_write_compiler(original): @wraps(original) @_unset_raw_connection def inner(write_compiler, *args, **kwargs): db_alias = write_compiler.using table = write_compiler.query.get_meta().db_table if is_cachable(table): invalidate(table, db_alias=db_alias, cache_alias=cachalot_settings.CACHALOT_CACHE) return original(write_compiler, *args, **kwargs) return inner def _patch_orm(): if cachalot_settings.CACHALOT_ENABLED: SQLCompiler.execute_sql = _patch_compiler(SQLCompiler.execute_sql) for compiler in WRITE_COMPILERS: compiler.execute_sql = _patch_write_compiler(compiler.execute_sql) def _unpatch_orm(): if hasattr(SQLCompiler.execute_sql, '__wrapped__'): SQLCompiler.execute_sql = SQLCompiler.execute_sql.__wrapped__ for compiler in WRITE_COMPILERS: compiler.execute_sql = 
compiler.execute_sql.__wrapped__ def _patch_cursor(): def _patch_cursor_execute(original): @wraps(original) def inner(cursor, sql, *args, **kwargs): try: return original(cursor, sql, *args, **kwargs) finally: connection = cursor.db if getattr(connection, 'raw', True): if isinstance(sql, bytes): sql = sql.decode('utf-8') sql = sql.lower() if SQL_DATA_CHANGE_RE.search(sql): tables = filter_cachable( _get_tables_from_sql(connection, sql)) if tables: invalidate( *tables, db_alias=connection.alias, cache_alias=cachalot_settings.CACHALOT_CACHE) return inner if cachalot_settings.CACHALOT_INVALIDATE_RAW: CursorWrapper.execute = _patch_cursor_execute(CursorWrapper.execute) CursorWrapper.executemany = _patch_cursor_execute(CursorWrapper.executemany) def _unpatch_cursor(): if hasattr(CursorWrapper.execute, '__wrapped__'): CursorWrapper.execute = CursorWrapper.execute.__wrapped__ CursorWrapper.executemany = CursorWrapper.executemany.__wrapped__ def _patch_atomic(): def patch_enter(original): @wraps(original) def inner(self): cachalot_caches.enter_atomic(self.using) original(self) return inner def patch_exit(original): @wraps(original) def inner(self, exc_type, exc_value, traceback): needs_rollback = get_connection(self.using).needs_rollback try: original(self, exc_type, exc_value, traceback) finally: cachalot_caches.exit_atomic( self.using, exc_type is None and not needs_rollback) return inner Atomic.__enter__ = patch_enter(Atomic.__enter__) Atomic.__exit__ = patch_exit(Atomic.__exit__) def _unpatch_atomic(): Atomic.__enter__ = Atomic.__enter__.__wrapped__ Atomic.__exit__ = Atomic.__exit__.__wrapped__ def _invalidate_on_migration(sender, **kwargs): invalidate(*sender.get_models(), db_alias=kwargs['using'], cache_alias=cachalot_settings.CACHALOT_CACHE) def patch(): post_migrate.connect(_invalidate_on_migration) _patch_cursor() _patch_atomic() _patch_orm() def unpatch(): post_migrate.disconnect(_invalidate_on_migration) _unpatch_cursor() _unpatch_atomic() _unpatch_orm() 
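The patch/unpatch pairs in `monkey_patch.py` above all rely on the same trick: `functools.wraps` stores the original callable on the wrapper's `__wrapped__` attribute, so unpatching is just an attribute lookup. A minimal standalone sketch of that pattern (no Django involved; the `Compiler` class and return value here are illustrative, not part of cachalot's API):

```python
from functools import wraps

class Compiler:
    def execute_sql(self):
        return 'executed'

def _patch(original):
    @wraps(original)
    def inner(self, *args, **kwargs):
        # cachalot would consult the cache here; this sketch just delegates.
        return original(self, *args, **kwargs)
    return inner

def patch():
    Compiler.execute_sql = _patch(Compiler.execute_sql)

def unpatch():
    # @wraps stored the original under __wrapped__, so restoring is trivial.
    if hasattr(Compiler.execute_sql, '__wrapped__'):
        Compiler.execute_sql = Compiler.execute_sql.__wrapped__

patch()
assert Compiler().execute_sql() == 'executed'  # wrapper delegates transparently
unpatch()
assert Compiler().execute_sql() == 'executed'  # original restored
```

This is also why `_unpatch_orm` guards with `hasattr(..., '__wrapped__')`: calling unpatch twice, or before patch, is then a harmless no-op.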
django-cachalot-2.8.0/cachalot/panels.py000066400000000000000000000046601500004256100201500ustar00rootroot00000000000000from collections import defaultdict from datetime import datetime from debug_toolbar.panels import Panel from django.apps import apps from django.conf import settings from django.utils.translation import gettext_lazy as _ from django.utils.timesince import timesince from .cache import cachalot_caches from .settings import cachalot_settings class CachalotPanel(Panel): title = 'Cachalot' template = 'cachalot/panel.html' def __init__(self, *args, **kwargs): self.last_invalidation = None super(CachalotPanel, self).__init__(*args, **kwargs) @property def enabled(self): enabled = super(CachalotPanel, self).enabled if enabled: self.enable_instrumentation() else: self.disable_instrumentation() return enabled def enable_instrumentation(self): settings.CACHALOT_ENABLED = True cachalot_settings.reload() def disable_instrumentation(self): settings.CACHALOT_ENABLED = False cachalot_settings.reload() def process_request(self, request): self.collect_invalidations() return super(CachalotPanel, self).process_request(request) def collect_invalidations(self): models = apps.get_models() data = defaultdict(list) cache = cachalot_caches.get_cache() for db_alias in settings.DATABASES: get_table_cache_key = cachalot_settings.CACHALOT_TABLE_KEYGEN model_cache_keys = { get_table_cache_key(db_alias, model._meta.db_table): model for model in models} for cache_key, timestamp in cache.get_many( model_cache_keys.keys()).items(): invalidation = datetime.fromtimestamp(timestamp) model = model_cache_keys[cache_key] data[db_alias].append( (model._meta.app_label, model.__name__, invalidation)) if self.last_invalidation is None \ or invalidation > self.last_invalidation: self.last_invalidation = invalidation data[db_alias].sort(key=lambda row: row[2], reverse=True) self.record_stats({'invalidations_per_db': data.items()}) @property def nav_subtitle(self): if self.enabled and 
self.last_invalidation is not None: return (_('Last invalidation: %s') % timesince(self.last_invalidation)) return '' django-cachalot-2.8.0/cachalot/settings.py000066400000000000000000000102611500004256100205200ustar00rootroot00000000000000from itertools import chain from django.apps import apps from django.conf import settings from django.utils.module_loading import import_string SUPPORTED_DATABASE_ENGINES = { 'django.db.backends.sqlite3', 'django.db.backends.postgresql', 'django.db.backends.mysql', # GeoDjango 'django.contrib.gis.db.backends.spatialite', 'django.contrib.gis.db.backends.postgis', 'django.contrib.gis.db.backends.mysql', # django-transaction-hooks 'transaction_hooks.backends.sqlite3', 'transaction_hooks.backends.postgis', 'transaction_hooks.backends.mysql', # django-prometheus wrapped engines 'django_prometheus.db.backends.sqlite3', 'django_prometheus.db.backends.postgresql', 'django_prometheus.db.backends.mysql', } SUPPORTED_CACHE_BACKENDS = { 'django.core.cache.backends.dummy.DummyCache', 'django.core.cache.backends.locmem.LocMemCache', 'django.core.cache.backends.filebased.FileBasedCache', 'django.core.cache.backends.redis.RedisCache', 'django_redis.cache.RedisCache', 'django.core.cache.backends.memcached.MemcachedCache', 'django.core.cache.backends.memcached.PyLibMCCache', 'django.core.cache.backends.memcached.PyMemcacheCache', } SUPPORTED_ONLY = 'supported_only' ITERABLES = {tuple, list, frozenset, set} class Settings(object): patched = False converters = {} CACHALOT_ENABLED = True CACHALOT_CACHE = 'default' CACHALOT_DATABASES = 'supported_only' CACHALOT_TIMEOUT = None CACHALOT_CACHE_RANDOM = False CACHALOT_CACHE_ITERATORS = True CACHALOT_INVALIDATE_RAW = True CACHALOT_ONLY_CACHABLE_TABLES = () CACHALOT_ONLY_CACHABLE_APPS = () CACHALOT_UNCACHABLE_TABLES = ('django_migrations',) CACHALOT_UNCACHABLE_APPS = () CACHALOT_ADDITIONAL_TABLES = () CACHALOT_QUERY_KEYGEN = 'cachalot.utils.get_query_cache_key' CACHALOT_TABLE_KEYGEN = 
'cachalot.utils.get_table_cache_key' CACHALOT_FINAL_SQL_CHECK = False @classmethod def add_converter(cls, setting): def inner(func): cls.converters[setting] = func return inner @classmethod def get_names(cls): return {name for name in cls.__dict__ if name[:2] != '__' and name.isupper()} def load(self): for name in self.get_names(): value = getattr(settings, name, getattr(self.__class__, name)) converter = self.converters.get(name) if converter is not None: value = converter(value) setattr(self, name, value) if not self.patched: from .monkey_patch import patch patch() self.patched = True def unload(self): if self.patched: from .monkey_patch import unpatch unpatch() self.patched = False def reload(self): self.unload() self.load() @Settings.add_converter('CACHALOT_DATABASES') def convert(value): if value == SUPPORTED_ONLY: value = {alias for alias, setting in settings.DATABASES.items() if setting['ENGINE'] in SUPPORTED_DATABASE_ENGINES} if value.__class__ in ITERABLES: return frozenset(value) return value def convert_tables(value, setting_app_name): dj_apps = getattr(settings, setting_app_name, ()) if dj_apps: dj_apps = tuple(model._meta.db_table for model in chain.from_iterable( apps.all_models[_app].values() for _app in dj_apps )) # Use [] lookup to make sure app is loaded (via INSTALLED_APP's order) return frozenset(tuple(value) + dj_apps) return frozenset(value) @Settings.add_converter('CACHALOT_ONLY_CACHABLE_TABLES') def convert(value): return convert_tables(value, 'CACHALOT_ONLY_CACHABLE_APPS') @Settings.add_converter('CACHALOT_UNCACHABLE_TABLES') def convert(value): return convert_tables(value, 'CACHALOT_UNCACHABLE_APPS') @Settings.add_converter('CACHALOT_ADDITIONAL_TABLES') def convert(value): return list(value) @Settings.add_converter('CACHALOT_QUERY_KEYGEN') def convert(value): return import_string(value) @Settings.add_converter('CACHALOT_TABLE_KEYGEN') def convert(value): return import_string(value) cachalot_settings = Settings() 
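`settings.py` above wires per-setting conversion through `Settings.add_converter`, a classmethod decorator that registers a function in a class-level dict which `load()` then consults for each setting. A stripped-down sketch of that registry pattern (standalone; the setting names `TIMEOUT`/`TABLES` are illustrative, not cachalot's):

```python
class Settings:
    converters = {}
    TIMEOUT = None
    TABLES = ()

    @classmethod
    def add_converter(cls, setting):
        def inner(func):
            cls.converters[setting] = func
        return inner

    def load(self, overrides):
        for name in ('TIMEOUT', 'TABLES'):
            # Fall back to the class-level default, like cachalot falls
            # back to its defaults when django.conf.settings lacks a name.
            value = overrides.get(name, getattr(self.__class__, name))
            converter = self.converters.get(name)
            if converter is not None:
                value = converter(value)
            setattr(self, name, value)

@Settings.add_converter('TABLES')
def convert(value):
    # Normalize any iterable of table names to a frozenset.
    return frozenset(value)

settings = Settings()
settings.load({'TABLES': ['a', 'b', 'a']})
assert settings.TABLES == frozenset({'a', 'b'})  # converted
assert settings.TIMEOUT is None                  # no converter registered
```

The decorator's `inner` returns nothing on purpose: each `convert` function only exists to be registered, which is why the real module reuses the name `convert` repeatedly without conflict.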
django-cachalot-2.8.0/cachalot/signals.py000066400000000000000000000002241500004256100203160ustar00rootroot00000000000000from django.dispatch import Signal # sender: name of table invalidated # db_alias: name of database that was affected post_invalidation = Signal() django-cachalot-2.8.0/cachalot/templates/000077500000000000000000000000001500004256100203045ustar00rootroot00000000000000django-cachalot-2.8.0/cachalot/templates/cachalot/000077500000000000000000000000001500004256100220625ustar00rootroot00000000000000django-cachalot-2.8.0/cachalot/templates/cachalot/panel.html000066400000000000000000000012001500004256100240400ustar00rootroot00000000000000{% load i18n %} {% for db_alias, invalidations in invalidations_per_db %}

{% blocktrans %}Database '{{ db_alias }}'{% endblocktrans %}

{% for app_label, model, datetime in invalidations %} {% endfor %}
{% trans 'Application' %} {% trans 'Model' %} {% trans 'Last invalidation' %}
{{ app_label }} {{ model }} {{ datetime|timesince }}
{% endfor %} django-cachalot-2.8.0/cachalot/templatetags/000077500000000000000000000000001500004256100210005ustar00rootroot00000000000000django-cachalot-2.8.0/cachalot/templatetags/__init__.py000066400000000000000000000000001500004256100230770ustar00rootroot00000000000000django-cachalot-2.8.0/cachalot/templatetags/cachalot.py000066400000000000000000000002211500004256100231230ustar00rootroot00000000000000from django.template import Library from ..api import get_last_invalidation register = Library() register.simple_tag(get_last_invalidation) django-cachalot-2.8.0/cachalot/tests/000077500000000000000000000000001500004256100174505ustar00rootroot00000000000000django-cachalot-2.8.0/cachalot/tests/__init__.py000066400000000000000000000013001500004256100215530ustar00rootroot00000000000000from django.core.signals import setting_changed from django.dispatch import receiver from ..settings import cachalot_settings from .read import ReadTestCase, ParameterTypeTestCase from .write import WriteTestCase, DatabaseCommandTestCase from .transaction import AtomicCacheTestCase, AtomicTestCase from .thread_safety import ThreadSafetyTestCase from .multi_db import MultiDatabaseTestCase from .settings import SettingsTestCase from .api import APITestCase, CommandTestCase from .signals import SignalsTestCase from .postgres import PostgresReadTestCase from .debug_toolbar import DebugToolbarTestCase @receiver(setting_changed) def reload_settings(sender, **kwargs): cachalot_settings.reload() django-cachalot-2.8.0/cachalot/tests/api.py000066400000000000000000000375251500004256100206070ustar00rootroot00000000000000import os from time import time, sleep from unittest import skipIf from django.conf import settings from django.contrib.auth.models import Permission, User from django.core.cache import DEFAULT_CACHE_ALIAS, caches from django.core.management import call_command from django.db import connection, transaction, DEFAULT_DB_ALIAS from django.template import engines from django.test import 
TransactionTestCase from jinja2.exceptions import TemplateSyntaxError from ..api import * from .models import Test from .test_utils import TestUtilsMixin class APITestCase(TestUtilsMixin, TransactionTestCase): databases = set(settings.DATABASES.keys()) def setUp(self): super(APITestCase, self).setUp() self.t1 = Test.objects.create(name='test1') self.cache_alias2 = next(alias for alias in settings.CACHES if alias != DEFAULT_CACHE_ALIAS) # For cachalot_disabled test self.user = User.objects.create_user('user') self.t1__permission = (Permission.objects.order_by('?') .select_related('content_type')[0]) def test_invalidate_tables(self): with self.assertNumQueries(1): data1 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data1, ['test1']) with self.settings(CACHALOT_INVALIDATE_RAW=False): with connection.cursor() as cursor: cursor.execute( "INSERT INTO cachalot_test (name, public) " "VALUES ('test2', %s);", [1 if self.is_sqlite else True]) with self.assertNumQueries(0): data2 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data2, ['test1']) invalidate('cachalot_test') with self.assertNumQueries(1): data3 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data3, ['test1', 'test2']) def test_invalidate_models_lookups(self): with self.assertNumQueries(1): data1 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data1, ['test1']) with self.settings(CACHALOT_INVALIDATE_RAW=False): with connection.cursor() as cursor: cursor.execute( "INSERT INTO cachalot_test (name, public) " "VALUES ('test2', %s);", [1 if self.is_sqlite else True]) with self.assertNumQueries(0): data2 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data2, ['test1']) invalidate('cachalot.Test') with self.assertNumQueries(1): data3 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data3, ['test1', 'test2']) def test_invalidate_models(self): with self.assertNumQueries(1): 
data1 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data1, ['test1']) with self.settings(CACHALOT_INVALIDATE_RAW=False): with connection.cursor() as cursor: cursor.execute( "INSERT INTO cachalot_test (name, public) " "VALUES ('test2', %s);", [1 if self.is_sqlite else True]) with self.assertNumQueries(0): data2 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data2, ['test1']) invalidate(Test) with self.assertNumQueries(1): data3 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data3, ['test1', 'test2']) def test_invalidate_all(self): with self.assertNumQueries(1): Test.objects.get() with self.assertNumQueries(0): Test.objects.get() invalidate() with self.assertNumQueries(1): Test.objects.get() def test_invalidate_all_in_atomic(self): with transaction.atomic(): with self.assertNumQueries(1): Test.objects.get() with self.assertNumQueries(0): Test.objects.get() invalidate() with self.assertNumQueries(1): Test.objects.get() with self.assertNumQueries(1): Test.objects.get() def test_get_last_invalidation(self): invalidate() timestamp = get_last_invalidation() delta = 0.15 if os.environ.get("CACHE_BACKEND") == "filebased" else 0.1 self.assertAlmostEqual(timestamp, time(), delta=delta) sleep(0.1) invalidate('cachalot_test') timestamp = get_last_invalidation('cachalot_test') self.assertAlmostEqual(timestamp, time(), delta=delta) same_timestamp = get_last_invalidation('cachalot.Test') self.assertEqual(same_timestamp, timestamp) same_timestamp = get_last_invalidation(Test) self.assertEqual(same_timestamp, timestamp) timestamp = get_last_invalidation('cachalot_testparent') self.assertNotAlmostEqual(timestamp, time(), delta=0.1) timestamp = get_last_invalidation('cachalot_testparent', 'cachalot_test') self.assertAlmostEqual(timestamp, time(), delta=delta) def test_get_last_invalidation_template_tag(self): # Without arguments original_timestamp = engines['django'].from_string( "{{ timestamp }}" 
).render({ 'timestamp': get_last_invalidation(), }) template = engines['django'].from_string(""" {% load cachalot %} {% get_last_invalidation as timestamp %} {{ timestamp }} """) timestamp = template.render().strip() self.assertNotEqual(timestamp, '') self.assertNotEqual(timestamp, '0.0') self.assertAlmostEqual(float(timestamp), float(original_timestamp), delta=0.1) # With arguments original_timestamp = engines['django'].from_string( "{{ timestamp }}" ).render({ 'timestamp': get_last_invalidation('auth.Group', 'cachalot_test'), }) template = engines['django'].from_string(""" {% load cachalot %} {% get_last_invalidation 'auth.Group' 'cachalot_test' as timestamp %} {{ timestamp }} """) timestamp = template.render().strip() self.assertNotEqual(timestamp, '') self.assertNotEqual(timestamp, '0.0') self.assertAlmostEqual(float(timestamp), float(original_timestamp), delta=0.1) # While using the `cache` template tag, with invalidation template = engines['django'].from_string(""" {% load cachalot cache %} {% get_last_invalidation 'auth.Group' 'cachalot_test' as timestamp %} {% cache 10 cache_key_name timestamp %} {{ content }} {% endcache %} """) content = template.render({'content': 'something'}).strip() self.assertEqual(content, 'something') content = template.render({'content': 'anything'}).strip() self.assertEqual(content, 'something') invalidate('cachalot_test') content = template.render({'content': 'yet another'}).strip() self.assertEqual(content, 'yet another') def test_get_last_invalidation_jinja2(self): original_timestamp = engines['jinja2'].from_string( "{{ timestamp }}" ).render({ 'timestamp': get_last_invalidation('auth.Group', 'cachalot_test'), }) template = engines['jinja2'].from_string( "{{ get_last_invalidation('auth.Group', 'cachalot_test') }}") timestamp = template.render({}) self.assertNotEqual(timestamp, '') self.assertNotEqual(timestamp, '0.0') self.assertAlmostEqual(float(timestamp), float(original_timestamp), delta=0.1) def test_cache_jinja2(self): # 
Invalid arguments with self.assertRaises(TemplateSyntaxError, msg="'invalid' is not a valid keyword argument " "for {% cache %}"): engines['jinja2'].from_string(""" {% cache cache_key='anything', invalid='what?' %}{% endcache %} """) with self.assertRaises(ValueError, msg='You must set `cache_key` when ' 'the template is not a file.'): engines['jinja2'].from_string( '{% cache %} broken {% endcache %}').render() # With the minimum number of arguments template = engines['jinja2'].from_string(""" {%- cache cache_key='first' -%} {{ content1 }} {%- endcache -%} {%- cache cache_key='second' -%} {{ content2 }} {%- endcache -%} """) content = template.render({'content1': 'abc', 'content2': 'def'}) self.assertEqual(content, 'abcdef') invalidate() content = template.render({'content1': 'ghi', 'content2': 'jkl'}) self.assertEqual(content, 'abcdef') # With the maximum number of arguments template = engines['jinja2'].from_string(""" {%- cache get_last_invalidation('auth.Group', 'cachalot_test', cache_alias=cache), timeout=10, cache_key='cache_key_name', cache_alias=cache -%} {{ content }} {%- endcache -%} """) content = template.render({'content': 'something', 'cache': self.cache_alias2}) self.assertEqual(content, 'something') content = template.render({'content': 'anything', 'cache': self.cache_alias2}) self.assertEqual(content, 'something') invalidate('cachalot_test', cache_alias=DEFAULT_CACHE_ALIAS) content = template.render({'content': 'yet another', 'cache': self.cache_alias2}) self.assertEqual(content, 'something') invalidate('cachalot_test') content = template.render({'content': 'will you change?', 'cache': self.cache_alias2}) self.assertEqual(content, 'will you change?') caches[self.cache_alias2].clear() content = template.render({'content': 'better!', 'cache': self.cache_alias2}) self.assertEqual(content, 'better!') def test_cachalot_disabled_multiple_queries_ignoring_in_mem_cache(self): """ Test that when queries are given the `cachalot_disabled` context manager, the 
queries will not be cached. """ with cachalot_disabled(True): qs = Test.objects.all() with self.assertNumQueries(1): data1 = list(qs.all()) Test.objects.create( name='test3', owner=self.user, date='1789-07-14', datetime='1789-07-14T16:43:27', permission=self.t1__permission) with self.assertNumQueries(1): data2 = list(qs.all()) self.assertNotEqual(data1, data2) def test_query_cachalot_disabled_even_if_already_cached(self): """ Test that when a query is given the `cachalot_disabled` context manager, the query outside of the context manager will be cached. Any duplicated query will use the original query's cached result. """ qs = Test.objects.all() self.assert_query_cached(qs) with cachalot_disabled() and self.assertNumQueries(0): list(qs.all()) def test_duplicate_query_execute_anyways(self): """After an object is created, a duplicate query should execute rather than use the cached result. """ qs = Test.objects.all() self.assert_query_cached(qs) Test.objects.create( name='test3', owner=self.user, date='1789-07-14', datetime='1789-07-14T16:43:27', permission=self.t1__permission) with cachalot_disabled() and self.assertNumQueries(1): list(qs.all()) class CommandTestCase(TransactionTestCase): multi_db = True databases = "__all__" def setUp(self): self.db_alias2 = next(alias for alias in settings.DATABASES if alias != DEFAULT_DB_ALIAS) self.cache_alias2 = next(alias for alias in settings.CACHES if alias != DEFAULT_CACHE_ALIAS) self.t1 = Test.objects.create(name='test1') self.t2 = Test.objects.using(self.db_alias2).create(name='test2') self.u = User.objects.create_user('test') def test_invalidate_cachalot(self): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [self.t1]) call_command('invalidate_cachalot', verbosity=0) with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [self.t1]) call_command('invalidate_cachalot', 'auth', verbosity=0) with self.assertNumQueries(0): self.assertListEqual(list(Test.objects.all()), 
[self.t1]) call_command('invalidate_cachalot', 'cachalot', verbosity=0) with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [self.t1]) call_command('invalidate_cachalot', 'cachalot.testchild', verbosity=0) with self.assertNumQueries(0): self.assertListEqual(list(Test.objects.all()), [self.t1]) call_command('invalidate_cachalot', 'cachalot.test', verbosity=0) with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [self.t1]) with self.assertNumQueries(1): self.assertListEqual(list(User.objects.all()), [self.u]) call_command('invalidate_cachalot', 'cachalot.test', 'auth.user', verbosity=0) with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [self.t1]) with self.assertNumQueries(1): self.assertListEqual(list(User.objects.all()), [self.u]) @skipIf(len(settings.DATABASES) == 1, 'We can’t change the DB used since there’s only one configured') def test_invalidate_cachalot_multi_db(self): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [self.t1]) call_command('invalidate_cachalot', verbosity=0, db_alias=self.db_alias2) with self.assertNumQueries(0): self.assertListEqual(list(Test.objects.all()), [self.t1]) with self.assertNumQueries(1, using=self.db_alias2): self.assertListEqual(list(Test.objects.using(self.db_alias2)), [self.t2]) call_command('invalidate_cachalot', verbosity=0, db_alias=self.db_alias2) with self.assertNumQueries(1, using=self.db_alias2): self.assertListEqual(list(Test.objects.using(self.db_alias2)), [self.t2]) @skipIf(len(settings.CACHES) == 1, 'We can’t change the cache used since there’s only one configured') def test_invalidate_cachalot_multi_cache(self): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [self.t1]) call_command('invalidate_cachalot', verbosity=0, cache_alias=self.cache_alias2) with self.assertNumQueries(0): self.assertListEqual(list(Test.objects.all()), [self.t1]) with self.assertNumQueries(1): with 
self.settings(CACHALOT_CACHE=self.cache_alias2): self.assertListEqual(list(Test.objects.all()), [self.t1]) call_command('invalidate_cachalot', verbosity=0, cache_alias=self.cache_alias2) with self.assertNumQueries(1): with self.settings(CACHALOT_CACHE=self.cache_alias2): self.assertListEqual(list(Test.objects.all()), [self.t1]) django-cachalot-2.8.0/cachalot/tests/db_router.py000066400000000000000000000012171500004256100220100ustar00rootroot00000000000000from django.conf import settings class PostgresRouter(object): @staticmethod def in_postgres(model): app_label = model._meta.app_label model_name = model._meta.model_name return app_label == 'cachalot' and model_name == 'postgresmodel' def get_postgresql_alias(self): return ('postgresql' if 'postgresql' in settings.DATABASES else 'default') def allow_migrate(self, db, app_label, model=None, **hints): if hints.get('extension') in ('hstore', 'unaccent') \ or (model is not None and self.in_postgres(model)): return db == self.get_postgresql_alias() django-cachalot-2.8.0/cachalot/tests/debug_toolbar.py000066400000000000000000000022321500004256100226310ustar00rootroot00000000000000from uuid import UUID from bs4 import BeautifulSoup from django.conf import settings from django.test import LiveServerTestCase, override_settings @override_settings(DEBUG=True) class DebugToolbarTestCase(LiveServerTestCase): databases = set(settings.DATABASES.keys()) def test_rendering(self): # # Rendering toolbar # response = self.client.get('/') self.assertEqual(response.status_code, 200) soup = BeautifulSoup(response.content.decode('utf-8'), 'html.parser') toolbar = soup.find(id='djDebug') self.assertIsNotNone(toolbar) store_id = toolbar.attrs['data-store-id'] # Checks that store_id is a valid UUID. 
UUID(store_id) render_panel_url = toolbar.attrs['data-render-panel-url'] panel_id = soup.find(title='Cachalot')['class'][0] panel_url = ('%s?store_id=%s&panel_id=%s' % (render_panel_url, store_id, panel_id)) # # Rendering panel # panel_response = self.client.get(panel_url) self.assertEqual(panel_response.status_code, 200) # TODO: Check that the displayed data is correct. django-cachalot-2.8.0/cachalot/tests/loaddata_fixture.json000066400000000000000000000000721500004256100236610ustar00rootroot00000000000000[{"fields": {"name": "test2"}, "model": "cachalot.test"}] django-cachalot-2.8.0/cachalot/tests/migrations/000077500000000000000000000000001500004256100216245ustar00rootroot00000000000000django-cachalot-2.8.0/cachalot/tests/migrations/0001_initial.py000066400000000000000000000105451500004256100242740ustar00rootroot00000000000000from django import VERSION as DJANGO_VERSION from django.conf import settings from django.contrib.postgres.fields import ( ArrayField, HStoreField, IntegerRangeField, DateRangeField, DateTimeRangeField, DecimalRangeField) from django.contrib.postgres.operations import ( HStoreExtension, UnaccentExtension) from django.db import models, migrations def extra_regular_available_fields(): fields = [] try: from django.db.models import JSONField fields.append(('json', JSONField(null=True, blank=True))) except ImportError: pass return fields def extra_postgres_available_fields(): fields = [] # Future proofing with Django 40 deprecation if DJANGO_VERSION[0] < 4: # TODO Remove when Dj40 support is dropped from django.contrib.postgres.fields import JSONField fields.append(('json', JSONField(null=True, blank=True))) return fields class Migration(migrations.Migration): dependencies = [ ('auth', '0001_initial'), migrations.swappable_dependency(settings.AUTH_USER_MODEL), ] operations = [ migrations.CreateModel( name='Test', fields=[ ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)), ('name', 
models.CharField(max_length=20)), ('public', models.BooleanField(default=False)), ('date', models.DateField(null=True, blank=True)), ('datetime', models.DateTimeField(null=True, blank=True)), ('owner', models.ForeignKey(blank=True, to=settings.AUTH_USER_MODEL, null=True, on_delete=models.SET_NULL)), ('permission', models.ForeignKey(blank=True, to='auth.Permission', null=True, on_delete=models.PROTECT)), ('a_float', models.FloatField(null=True, blank=True)), ('a_decimal', models.DecimalField(null=True, blank=True, max_digits=5, decimal_places=2)), ('a_choice', models.CharField(choices=[("foo","foo"), ("bar","bar")], null=True, max_length=3)), ('a_dict_choice', models.CharField(choices=[("foo","foo"), ("bar","bar")], null=True, max_length=3)), ('bin', models.BinaryField(null=True, blank=True)), ('ip', models.GenericIPAddressField(null=True, blank=True)), ('duration', models.DurationField(null=True, blank=True)), ('uuid', models.UUIDField(null=True, blank=True)), ] + extra_regular_available_fields(), options={ 'ordering': ('name',), }, ), migrations.CreateModel( name='TestParent', fields=[ ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)), ('name', models.CharField(max_length=20)), ], ), migrations.CreateModel( name='TestChild', fields=[ ('testparent_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='cachalot.TestParent', on_delete=models.CASCADE)), ('public', models.BooleanField(default=False)), ('permissions', models.ManyToManyField('auth.Permission', blank=True)) ], bases=('cachalot.testparent',), ), HStoreExtension(), UnaccentExtension(), migrations.CreateModel( name='PostgresModel', fields=[ ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)), ('int_array', ArrayField( models.IntegerField(null=True, blank=True), size=3, null=True, blank=True)), ('hstore', HStoreField(null=True, blank=True)), ('int_range', 
IntegerRangeField(null=True, blank=True)), ('date_range', DateRangeField(null=True, blank=True)), ('datetime_range', DateTimeRangeField(null=True, blank=True)), ('decimal_range', DecimalRangeField(null=True, blank=True)) ] + extra_postgres_available_fields(), ), migrations.RunSQL('CREATE TABLE cachalot_unmanagedmodel ' '(id SERIAL PRIMARY KEY, name VARCHAR(50));'), ] django-cachalot-2.8.0/cachalot/tests/migrations/__init__.py000066400000000000000000000000001500004256100237230ustar00rootroot00000000000000django-cachalot-2.8.0/cachalot/tests/models.py000066400000000000000000000061361500004256100213130ustar00rootroot00000000000000from django import VERSION as DJANGO_VERSION from django.conf import settings from django.contrib.postgres.fields import ( ArrayField, HStoreField, IntegerRangeField, DateRangeField, DateTimeRangeField) from django.db.models import ( Model, CharField, ForeignKey, BooleanField, DateField, DateTimeField, ManyToManyField, BinaryField, IntegerField, GenericIPAddressField, TextChoices, FloatField, DecimalField, DurationField, UUIDField, SET_NULL, PROTECT) class SomeChoices(TextChoices): foo = 'foo' bar = 'bar' class Test(Model): name = CharField(max_length=20) owner = ForeignKey(settings.AUTH_USER_MODEL, null=True, blank=True, on_delete=SET_NULL) public = BooleanField(default=False) date = DateField(null=True, blank=True) datetime = DateTimeField(null=True, blank=True) permission = ForeignKey('auth.Permission', null=True, blank=True, on_delete=PROTECT) # We can’t use the exact names `float` or `decimal` as database column name # since it fails on MySQL. 
    a_float = FloatField(null=True, blank=True)
    a_decimal = DecimalField(null=True, blank=True,
                             max_digits=5, decimal_places=2)
    a_choice = CharField(max_length=3, choices=SomeChoices.choices, null=True)
    bin = BinaryField(null=True, blank=True)
    ip = GenericIPAddressField(null=True, blank=True)
    duration = DurationField(null=True, blank=True)
    uuid = UUIDField(null=True, blank=True)
    try:
        from django.db.models import JSONField
        json = JSONField(null=True, blank=True)
    except ImportError:
        pass

    class Meta:
        ordering = ('name',)


class TestParent(Model):
    name = CharField(max_length=20)


class TestChild(TestParent):
    """
    A OneToOneField to TestParent is automatically added here.

    https://docs.djangoproject.com/en/3.2/topics/db/models/#multi-table-inheritance
    """
    public = BooleanField(default=False)
    permissions = ManyToManyField('auth.Permission', blank=True)


class PostgresModel(Model):
    int_array = ArrayField(IntegerField(null=True, blank=True), size=3,
                           null=True, blank=True)
    hstore = HStoreField(null=True, blank=True)
    if DJANGO_VERSION < (4, 0):
        from django.contrib.postgres.fields import JSONField
        json = JSONField(null=True, blank=True)
    int_range = IntegerRangeField(null=True, blank=True)
    try:
        from django.contrib.postgres.fields import FloatRangeField
        float_range = FloatRangeField(null=True, blank=True)
    except ImportError:
        pass
    try:
        from django.contrib.postgres.fields import DecimalRangeField
        decimal_range = DecimalRangeField(null=True, blank=True)
    except ImportError:
        pass
    date_range = DateRangeField(null=True, blank=True)
    datetime_range = DateTimeRangeField(null=True, blank=True)

    class Meta:
        # Tests schema name in table name.
        db_table = '"public"."cachalot_postgresmodel"'


class UnmanagedModel(Model):
    name = CharField(max_length=50)

    class Meta:
        managed = False

django-cachalot-2.8.0/cachalot/tests/multi_db.py

from unittest import skipIf

from django.conf import settings
from django.db import DEFAULT_DB_ALIAS, connections, transaction
from django.test import TransactionTestCase

from .models import Test


@skipIf(len(settings.DATABASES) == 1,
        'We can’t change the DB used since there’s only one configured')
class MultiDatabaseTestCase(TransactionTestCase):
    multi_db = True
    databases = "__all__"

    def setUp(self):
        self.t1 = Test.objects.create(name='test1')
        self.t2 = Test.objects.create(name='test2')
        self.db_alias2 = next(alias for alias in settings.DATABASES
                              if alias != DEFAULT_DB_ALIAS)
        connection2 = connections[self.db_alias2]
        self.is_sqlite2 = connection2.vendor == 'sqlite'
        self.is_mysql2 = connection2.vendor == 'mysql'
        if connection2.vendor in ('mysql', 'postgresql'):
            # We need to reopen the connection or Django
            # will execute an extra SQL request below.
            connection2.cursor()

    def test_read(self):
        with self.assertNumQueries(1):
            data1 = list(Test.objects.all())
        self.assertListEqual(data1, [self.t1, self.t2])
        with self.assertNumQueries(1, using=self.db_alias2):
            data2 = list(Test.objects.using(self.db_alias2))
        self.assertListEqual(data2, [])
        with self.assertNumQueries(0, using=self.db_alias2):
            data3 = list(Test.objects.using(self.db_alias2))
        self.assertListEqual(data3, [])

    def test_invalidate_other_db(self):
        """
        Tests if the non-default database is invalidated when modified.
        """
        with self.assertNumQueries(1, using=self.db_alias2):
            data1 = list(Test.objects.using(self.db_alias2))
        self.assertListEqual(data1, [])
        with self.assertNumQueries(1, using=self.db_alias2):
            t3 = Test.objects.using(self.db_alias2).create(name='test3')
        with self.assertNumQueries(1, using=self.db_alias2):
            data2 = list(Test.objects.using(self.db_alias2))
        self.assertListEqual(data2, [t3])

    def test_invalidation_independence(self):
        """
        Tests if invalidation doesn’t affect the unmodified databases.
        """
        with self.assertNumQueries(1):
            data1 = list(Test.objects.all())
        self.assertListEqual(data1, [self.t1, self.t2])
        with self.assertNumQueries(1, using=self.db_alias2):
            Test.objects.using(self.db_alias2).create(name='test3')
        with self.assertNumQueries(0):
            data2 = list(Test.objects.all())
        self.assertListEqual(data2, [self.t1, self.t2])

    def test_heterogeneous_atomics(self):
        """
        Checks that an atomic block for a database nested inside
        another atomic block for another database
        has no impact on their caching.
        """
        with transaction.atomic():
            with transaction.atomic(self.db_alias2):
                with self.assertNumQueries(1):
                    data1 = list(Test.objects.all())
                self.assertListEqual(data1, [self.t1, self.t2])
                with self.assertNumQueries(1, using=self.db_alias2):
                    data2 = list(Test.objects.using(self.db_alias2))
                self.assertListEqual(data2, [])

                t3 = Test.objects.using(self.db_alias2).create(name='test3')
                with self.assertNumQueries(1, using=self.db_alias2):
                    data3 = list(Test.objects.using(self.db_alias2))
                self.assertListEqual(data3, [t3])
                with self.assertNumQueries(0):
                    data4 = list(Test.objects.all())
                self.assertListEqual(data4, [self.t1, self.t2])

            with self.assertNumQueries(1):
                data5 = list(Test.objects.filter(name='test3'))
            self.assertListEqual(data5, [])

    def test_heterogeneous_atomics_independence(self):
        """
        Checks that interrupting an atomic block after the commit
        of another atomic block for another database nested inside it
        correctly invalidates the cache for the committed transaction.
        """
        with self.assertNumQueries(1, using=self.db_alias2):
            data1 = list(Test.objects.using(self.db_alias2))
        self.assertListEqual(data1, [])

        try:
            with transaction.atomic():
                with transaction.atomic(self.db_alias2):
                    t3 = Test.objects.using(
                        self.db_alias2).create(name='test3')
                raise ZeroDivisionError
        except ZeroDivisionError:
            pass

        with self.assertNumQueries(1, using=self.db_alias2):
            data2 = list(Test.objects.using(self.db_alias2))
        self.assertListEqual(data2, [t3])

django-cachalot-2.8.0/cachalot/tests/postgres.py

from datetime import date, datetime
from decimal import Decimal
from unittest import skipUnless

from django import VERSION
from django.contrib.postgres.functions import TransactionNow
from django.db import connection
from django.test import TransactionTestCase, override_settings

# If we are using Django 4.2 or higher, we need to use:
if VERSION >= (4, 2):
    from django.db.backends.postgresql.psycopg_any import (
        DateRange,
        DateTimeTZRange,
        NumericRange,
    )
else:
    from psycopg2.extras import DateRange, DateTimeTZRange, NumericRange

from pytz import timezone

from ..utils import UncachableQuery
from .api import invalidate
from .models import PostgresModel, Test
from .test_utils import TestUtilsMixin
from .tests_decorators import all_final_sql_checks, no_final_sql_check, with_final_sql_check


# FIXME: Add tests for aggregations.
def is_pg_field_available(name):
    fields = []
    try:
        from django.contrib.postgres.fields import FloatRangeField
        fields.append("FloatRangeField")
    except ImportError:
        pass
    try:
        from django.contrib.postgres.fields import DecimalRangeField
        fields.append("DecimalRangeField")
    except ImportError:
        pass
    try:
        from django import VERSION
        from django.contrib.postgres.fields import JSONField
        if VERSION[0] < 4:
            fields.append("JSONField")
    except ImportError:
        pass
    return name in fields


@skipUnless(connection.vendor == 'postgresql',
            'This test is only for PostgreSQL')
@override_settings(USE_TZ=True)
class PostgresReadTestCase(TestUtilsMixin, TransactionTestCase):
    def setUp(self):
        self.obj1 = PostgresModel(
            int_array=[1, 2, 3],
            hstore={'a': 'b', 'c': None},
            int_range=[1900, 2000],
            date_range=['1678-03-04', '1741-07-28'],
            datetime_range=[
                datetime(1989, 1, 30, 12, 20,
                         tzinfo=timezone('Europe/Paris')),
                None
            ]
        )
        self.obj2 = PostgresModel(
            int_array=[4, None, 6],
            hstore={'a': '1', 'b': '2'},
            int_range=[1989, None],
            date_range=['1989-01-30', None],
            datetime_range=[None, None])
        if is_pg_field_available("JSONField"):
            self.obj1.json = {'a': 1, 'b': 2}
            self.obj2.json = [
                'something',
                {
                    'a': 1,
                    'b': None,
                    'c': 123.456,
                    'd': True,
                    'e': {
                        'another': 'dict',
                        'and yet': {
                            'another': 'one',
                            'with a list': [],
                        },
                    },
                },
            ]
        if is_pg_field_available("FloatRangeField"):
            self.obj1.float_range = [-1e3, 9.87654321]
            self.obj2.float_range = [0.0, None]
        if is_pg_field_available("DecimalRangeField"):
            self.obj1.decimal_range = [-1e3, 9.87654321]
            self.obj2.decimal_range = [0.0, None]
        self.obj1.save()
        self.obj2.save()

    @all_final_sql_checks
    def test_unaccent(self):
        obj1 = Test.objects.create(name='Clémentine')
        obj2 = Test.objects.create(name='Clementine')
        qs = (Test.objects.filter(name__unaccent='Clémentine')
              .values_list('name', flat=True))
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, ['Clementine', 'Clémentine'])
        obj1.delete()
        obj2.delete()

    @all_final_sql_checks
    def test_int_array(self):
        with self.assertNumQueries(1):
            data1 = [o.int_array for o in PostgresModel.objects.all()]
        with self.assertNumQueries(1):
            data2 = list(PostgresModel.objects
                         .values_list('int_array', flat=True))
        self.assertListEqual(data2, data1)
        self.assertListEqual(data2, [[1, 2, 3], [4, None, 6]])

        invalidate(PostgresModel)

        qs = PostgresModel.objects.values_list('int_array', flat=True)
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [[1, 2, 3], [4, None, 6]])

        qs = (PostgresModel.objects.filter(int_array__contains=[3])
              .values_list('int_array', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [[1, 2, 3]])

        qs = (PostgresModel.objects
              .filter(int_array__contained_by=[1, 2, 3, 4, 5, 6])
              .values_list('int_array', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [[1, 2, 3]])

        qs = (PostgresModel.objects.filter(int_array__overlap=[3, 4])
              .values_list('int_array', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [[1, 2, 3], [4, None, 6]])

        qs = (PostgresModel.objects.filter(int_array__len__in=(2, 3))
              .values_list('int_array', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [[1, 2, 3], [4, None, 6]])

        qs = (PostgresModel.objects.filter(int_array__2=6)
              .values_list('int_array', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [[4, None, 6]])

        qs = (PostgresModel.objects.filter(int_array__0_2=(1, 2))
              .values_list('int_array', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [[1, 2, 3]])

    @all_final_sql_checks
    def test_hstore(self):
        with self.assertNumQueries(1):
            data1 = [o.hstore for o in PostgresModel.objects.all()]
        with self.assertNumQueries(1):
            data2 = list(PostgresModel.objects
                         .values_list('hstore', flat=True))
        self.assertListEqual(data2, data1)
        self.assertListEqual(data2, [{'a': 'b', 'c': None},
                                     {'a': '1', 'b': '2'}])

        invalidate(PostgresModel)

        qs = PostgresModel.objects.values_list('hstore', flat=True)
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [{'a': 'b', 'c': None},
                                      {'a': '1', 'b': '2'}])

        qs = (PostgresModel.objects.filter(hstore__a='1')
              .values_list('hstore', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [{'a': '1', 'b': '2'}])

        qs = (PostgresModel.objects.filter(hstore__contains={'a': 'b'})
              .values_list('hstore', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [{'a': 'b', 'c': None}])

        qs = (PostgresModel.objects
              .filter(hstore__contained_by={'a': 'b', 'c': None, 'b': '2'})
              .values_list('hstore', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [{'a': 'b', 'c': None}])

        qs = (PostgresModel.objects.filter(hstore__has_key='c')
              .values_list('hstore', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [{'a': 'b', 'c': None}])

        qs = (PostgresModel.objects.filter(hstore__has_keys=['a', 'b'])
              .values_list('hstore', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [{'a': '1', 'b': '2'}])

        qs = (PostgresModel.objects.filter(hstore__keys=['a', 'b'])
              .values_list('hstore', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [{'a': '1', 'b': '2'}])

        qs = (PostgresModel.objects.filter(hstore__values=['1', '2'])
              .values_list('hstore', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [{'a': '1', 'b': '2'}])

    @all_final_sql_checks
    @skipUnless(is_pg_field_available("JSONField"),
                "JSONField was removed in Dj 4.0")
    def test_json(self):
        with self.assertNumQueries(1):
            data1 = [o.json for o in PostgresModel.objects.all()]
        with self.assertNumQueries(1):
            data2 = list(PostgresModel.objects.values_list('json', flat=True))
        self.assertListEqual(data2, data1)
        self.assertListEqual(data2, [self.obj1.json, self.obj2.json])

        invalidate(PostgresModel)

        qs = PostgresModel.objects.values_list('json', flat=True)
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [self.obj1.json, self.obj2.json])

        # Tests an index.
        qs = (PostgresModel.objects.filter(json__0='something')
              .values_list('json', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [self.obj2.json])

        qs = (PostgresModel.objects
              .filter(json__0__nonexistent_key='something')
              .values_list('json', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [])

        # Tests a path with spaces.
        qs = (PostgresModel.objects
              .filter(**{'json__1__e__and yet__another': 'one'})
              .values_list('json', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [self.obj2.json])

        qs = (PostgresModel.objects.filter(json__contains=['something'])
              .values_list('json', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [self.obj2.json])

        qs = (PostgresModel.objects
              .filter(json__contained_by={'a': 1, 'b': 2, 'any': 'thing'})
              .values_list('json', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [self.obj1.json])

        qs = (PostgresModel.objects.filter(json__has_key='a')
              .values_list('json', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [self.obj1.json])

        qs = (PostgresModel.objects.filter(json__has_any_keys=['a', 'b', 'c'])
              .values_list('json', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [self.obj1.json])

        qs = (PostgresModel.objects.filter(json__has_keys=['a', 'b'])
              .values_list('json', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [self.obj1.json])

    @skipUnless(is_pg_field_available("JSONField"),
                "JSONField was removed in Dj 4.0")
    def test_mutable_result_change(self):
        """
        Checks that changing a mutable returned by a query
        has no effect on other executions of the query.
        """
        qs = PostgresModel.objects.values_list('int_array', flat=True)
        data = list(qs.all())
        self.assertListEqual(data, [[1, 2, 3], [4, None, 6]])
        data[0].append(4)
        data[1].remove(4)
        data[1][0] = 5
        self.assertListEqual(data, [[1, 2, 3, 4], [5, 6]])
        self.assertListEqual(list(qs.all()), [[1, 2, 3], [4, None, 6]])

        qs = PostgresModel.objects.values_list('json', flat=True)
        data = list(qs.all())
        self.assertListEqual(data, [self.obj1.json, self.obj2.json])
        data[0]['c'] = 3
        del data[0]['b']
        data[1].pop(0)
        data[1][0]['e']['and yet']['some other'] = True
        data[1][0]['f'] = 6
        json1 = {'a': 1, 'c': 3}
        json2 = [
            {
                'a': 1,
                'b': None,
                'c': 123.456,
                'd': True,
                'e': {
                    'another': 'dict',
                    'and yet': {
                        'another': 'one',
                        'with a list': [],
                        'some other': True
                    },
                },
                'f': 6
            },
        ]
        self.assertListEqual(data, [json1, json2])
        self.assertListEqual(list(qs.all()), [self.obj1.json, self.obj2.json])

    @all_final_sql_checks
    def test_int_range(self):
        with self.assertNumQueries(1):
            data1 = [o.int_range for o in PostgresModel.objects.all()]
        with self.assertNumQueries(1):
            data2 = list(PostgresModel.objects
                         .values_list('int_range', flat=True))
        self.assertListEqual(data2, data1)
        self.assertListEqual(data2, [NumericRange(1900, 2000),
                                     NumericRange(1989)])

        invalidate(PostgresModel)

        qs = PostgresModel.objects.values_list('int_range', flat=True)
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [NumericRange(1900, 2000),
                                      NumericRange(1989)])

        qs = (PostgresModel.objects.filter(int_range__contains=2015)
              .values_list('int_range', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [NumericRange(1989)])

        qs = (PostgresModel.objects
              .filter(int_range__contains=NumericRange(1950, 1990))
              .values_list('int_range', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [NumericRange(1900, 2000)])

        qs = (PostgresModel.objects
              .filter(int_range__contained_by=NumericRange(0, 2050))
              .values_list('int_range', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [NumericRange(1900, 2000)])

        qs = (PostgresModel.objects.filter(int_range__fully_lt=(2015, None))
              .values_list('int_range', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [NumericRange(1900, 2000)])

        qs = (PostgresModel.objects.filter(int_range__fully_gt=(1970, 1980))
              .values_list('int_range', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [NumericRange(1989)])

        qs = (PostgresModel.objects.filter(int_range__not_lt=(1970, 1980))
              .values_list('int_range', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [NumericRange(1989)])

        qs = (PostgresModel.objects.filter(int_range__not_gt=(1970, 1980))
              .values_list('int_range', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [])

        qs = (PostgresModel.objects.filter(int_range__adjacent_to=(1900, 1989))
              .values_list('int_range', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [NumericRange(1989)])

        qs = (PostgresModel.objects.filter(int_range__startswith=1900)
              .values_list('int_range', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [NumericRange(1900, 2000)])

        qs = (PostgresModel.objects.filter(int_range__endswith=2000)
              .values_list('int_range', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [NumericRange(1900, 2000)])

        obj = PostgresModel.objects.create(int_range=[1900, 1900])
        qs = (PostgresModel.objects.filter(int_range__isempty=True)
              .values_list('int_range', flat=True))
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [NumericRange(empty=True)])
        obj.delete()

    @all_final_sql_checks
    @skipUnless(is_pg_field_available("FloatRangeField"),
                "FloatRangeField was removed in Dj 3.1")
    def test_float_range(self):
        qs = PostgresModel.objects.values_list('float_range', flat=True)
        self.assert_tables(qs, PostgresModel)
        # For a strange reason, probably a misconception in psycopg2
        # or a bad name in django.contrib.postgres (less probable),
        # FloatRange returns decimals instead of floats.
        # Note from ACW: crisis averted, renamed to DecimalRangeField
        self.assert_query_cached(qs, [
            NumericRange(Decimal('-1000.0'), Decimal('9.87654321')),
            NumericRange(Decimal('0.0'))])

    @all_final_sql_checks
    @skipUnless(is_pg_field_available("DecimalRangeField"),
                "DecimalRangeField was added in Dj 2.2")
    def test_decimal_range(self):
        qs = PostgresModel.objects.values_list('decimal_range', flat=True)
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [
            NumericRange(Decimal('-1000.0'), Decimal('9.87654321')),
            NumericRange(Decimal('0.0'))])

    @all_final_sql_checks
    def test_date_range(self):
        qs = PostgresModel.objects.values_list('date_range', flat=True)
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [
            DateRange(date(1678, 3, 4), date(1741, 7, 28)),
            DateRange(date(1989, 1, 30))])

    @all_final_sql_checks
    def test_datetime_range(self):
        qs = PostgresModel.objects.values_list('datetime_range', flat=True)
        self.assert_tables(qs, PostgresModel)
        self.assert_query_cached(qs, [
            DateTimeTZRange(datetime(1989, 1, 30, 12, 20,
                                     tzinfo=timezone('Europe/Paris'))),
            DateTimeTZRange(bounds='()')])

    @all_final_sql_checks
    def test_transaction_now(self):
        """
        Checks that queries with a TransactionNow() parameter are not cached.
        """
        obj = Test.objects.create(datetime='1992-07-02T12:00:00')
        qs = Test.objects.filter(datetime__lte=TransactionNow())
        with self.assertRaises(UncachableQuery):
            self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [obj], after=1)
        obj.delete()

django-cachalot-2.8.0/cachalot/tests/read.py

import datetime
from unittest import skipIf
from uuid import UUID
from decimal import Decimal

from django import VERSION as DJANGO_VERSION
from django.conf import settings
from django.contrib.auth.models import Group, Permission, User
from django.contrib.contenttypes.models import ContentType
from django.db import (
    connection, transaction, DEFAULT_DB_ALIAS,
    ProgrammingError, OperationalError)
from django.db.models import Case, Count, Q, Value, When
from django.db.models.expressions import RawSQL, Subquery, OuterRef, Exists
from django.db.models.functions import Coalesce, Now
from django.db.transaction import TransactionManagementError
from django.test import TransactionTestCase, skipUnlessDBFeature, override_settings
from pytz import UTC

from cachalot.cache import cachalot_caches
from ..settings import cachalot_settings
from ..utils import UncachableQuery
from .models import SomeChoices, Test, TestChild, TestParent, UnmanagedModel
from .test_utils import TestUtilsMixin, FilteredTransactionTestCase
from .tests_decorators import all_final_sql_checks, with_final_sql_check, no_final_sql_check


def is_field_available(name):
    fields = []
    try:
        from django.db.models import JSONField
        fields.append("JSONField")
    except ImportError:
        pass
    return name in fields


class ReadTestCase(TestUtilsMixin, FilteredTransactionTestCase):
    """
    Tests if every SQL request that only reads data is cached.

    The only exception is for requests that don’t go through the ORM,
    using ``QuerySet.extra`` with ``select`` or ``where`` arguments,
    ``Model.objects.raw``, or ``cursor.execute``.
    """

    def setUp(self):
        super(ReadTestCase, self).setUp()

        self.group = Group.objects.create(name='test_group')
        self.group__permissions = list(Permission.objects.all()[:3])
        self.group.permissions.add(*self.group__permissions)
        self.user = User.objects.create_user('user')
        self.user__permissions = list(
            Permission.objects.filter(content_type__app_label='auth')[3:6])
        self.user.groups.add(self.group)
        self.user.user_permissions.add(*self.user__permissions)
        self.admin = User.objects.create_superuser('admin', 'admin@test.me',
                                                   'password')
        self.t1__permission = (Permission.objects
                               .order_by('?')
                               .select_related('content_type')[0])
        self.t1 = Test.objects.create(
            name='test1', owner=self.user,
            date='1789-07-14', datetime='1789-07-14T16:43:27',
            permission=self.t1__permission)
        self.t2 = Test.objects.create(
            name='test2', owner=self.admin, public=True,
            date='1944-06-06', datetime='1944-06-06T06:35:00')

    def test_empty(self):
        with self.assertNumQueries(0):
            data1 = list(Test.objects.none())
        with self.assertNumQueries(0):
            data2 = list(Test.objects.none())
        self.assertListEqual(data2, data1)
        self.assertListEqual(data2, [])

    def test_exists(self):
        with self.assertNumQueries(1):
            n1 = Test.objects.exists()
        with self.assertNumQueries(0):
            n2 = Test.objects.exists()
        self.assertEqual(n2, n1)
        self.assertTrue(n2)

    def test_count(self):
        with self.assertNumQueries(1):
            n1 = Test.objects.count()
        with self.assertNumQueries(0):
            n2 = Test.objects.count()
        self.assertEqual(n2, n1)
        self.assertEqual(n2, 2)

    def test_get(self):
        with self.assertNumQueries(1):
            data1 = Test.objects.get(name='test1')
        with self.assertNumQueries(0):
            data2 = Test.objects.get(name='test1')
        self.assertEqual(data2, data1)
        self.assertEqual(data2, self.t1)

    def test_first(self):
        with self.assertNumQueries(1):
            self.assertEqual(Test.objects.filter(name='bad').first(), None)
        with self.assertNumQueries(0):
            self.assertEqual(Test.objects.filter(name='bad').first(), None)

        with self.assertNumQueries(1):
            data1 = Test.objects.first()
        with self.assertNumQueries(0):
            data2 = Test.objects.first()
        self.assertEqual(data2, data1)
        self.assertEqual(data2, self.t1)

    def test_last(self):
        with self.assertNumQueries(1):
            data1 = Test.objects.last()
        with self.assertNumQueries(0):
            data2 = Test.objects.last()
        self.assertEqual(data2, data1)
        self.assertEqual(data2, self.t2)

    def test_all(self):
        with self.assertNumQueries(1):
            data1 = list(Test.objects.all())
        with self.assertNumQueries(0):
            data2 = list(Test.objects.all())
        self.assertListEqual(data2, data1)
        self.assertListEqual(data2, [self.t1, self.t2])

    @all_final_sql_checks
    def test_filter(self):
        qs = Test.objects.filter(public=True)
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [self.t2])

        qs = Test.objects.filter(name__in=['test2', 'test72'])
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [self.t2])

        qs = Test.objects.filter(date__gt=datetime.date(1900, 1, 1))
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [self.t2])

        qs = Test.objects.filter(datetime__lt=datetime.datetime(1900, 1, 1))
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [self.t1])

    @all_final_sql_checks
    def test_filter_empty(self):
        qs = Test.objects.filter(public=True, name='user')
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [])

    @all_final_sql_checks
    def test_exclude(self):
        qs = Test.objects.exclude(public=True)
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [self.t1])

        qs = Test.objects.exclude(name__in=['test2', 'test72'])
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [self.t1])

    @all_final_sql_checks
    def test_slicing(self):
        qs = Test.objects.all()[:1]
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [self.t1])

    @all_final_sql_checks
    def test_order_by(self):
        qs = Test.objects.order_by('pk')
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [self.t1, self.t2])

        qs = Test.objects.order_by('-name')
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [self.t2, self.t1])

    @all_final_sql_checks
    def test_random_order_by(self):
        qs = Test.objects.order_by('?')
        with self.assertRaises(UncachableQuery):
            self.assert_tables(qs, Test)
        self.assert_query_cached(qs, after=1, compare_results=False)

    @with_final_sql_check
    def test_order_by_field_of_another_table_with_check(self):
        qs = Test.objects.order_by('owner__username')
        self.assert_tables(qs, Test, User)
        self.assert_query_cached(qs, [self.t2, self.t1])

    @no_final_sql_check
    def test_order_by_field_of_another_table_no_check(self):
        qs = Test.objects.order_by('owner__username')
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [self.t2, self.t1])

    @with_final_sql_check
    def test_order_by_field_of_another_table_with_expression_with_check(self):
        qs = Test.objects.order_by(Coalesce('name', 'owner__username'))
        self.assert_tables(qs, Test, User)
        self.assert_query_cached(qs, [self.t1, self.t2])

    @no_final_sql_check
    def test_order_by_field_of_another_table_with_expression_no_check(self):
        qs = Test.objects.order_by(Coalesce('name', 'owner__username'))
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [self.t1, self.t2])

    @all_final_sql_checks
    @skipIf(connection.vendor == 'mysql',
            'MySQL does not support limit/offset on a subquery. '
            'Since Django only applies ordering in subqueries when they are '
            'offset/limited, we can’t test it on MySQL.')
    def test_random_order_by_subquery(self):
        qs = Test.objects.filter(
            pk__in=Test.objects.order_by('?')[:10])
        with self.assertRaises(UncachableQuery):
            self.assert_tables(qs, Test)
        self.assert_query_cached(qs, after=1, compare_results=False)

    @all_final_sql_checks
    def test_reverse(self):
        qs = Test.objects.reverse()
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [self.t2, self.t1])

    @all_final_sql_checks
    def test_distinct(self):
        # We ensure that the query without distinct should return duplicate
        # objects, in order to have a real-world example.
        qs = Test.objects.filter(
            owner__user_permissions__content_type__app_label='auth')
        self.assert_tables(qs, Test, User, User.user_permissions.through,
                           Permission, ContentType)
        self.assert_query_cached(qs, [self.t1, self.t1, self.t1])

        qs = qs.distinct()
        self.assert_tables(qs, Test, User, User.user_permissions.through,
                           Permission, ContentType)
        self.assert_query_cached(qs, [self.t1])

    def test_django_enums(self):
        t = Test.objects.create(name='test1', a_choice=SomeChoices.foo)
        qs = Test.objects.filter(a_choice=SomeChoices.foo)
        self.assert_query_cached(qs, [t])

    def test_iterator(self):
        with self.settings(CACHALOT_CACHE_ITERATORS=False):
            with self.assertNumQueries(2):
                data1 = list(Test.objects.iterator())
                data2 = list(Test.objects.iterator())
            self.assertListEqual(data2, data1)
            self.assertListEqual(data2, [self.t1, self.t2])

        with self.assertNumQueries(1):
            data1 = list(Test.objects.iterator())
        with self.assertNumQueries(0):
            data2 = list(Test.objects.iterator())
        self.assertListEqual(data2, data1)
        self.assertListEqual(data2, [self.t1, self.t2])

    def test_in_bulk(self):
        with self.assertNumQueries(1):
            data1 = Test.objects.in_bulk((5432, self.t2.pk, 9200))
        with self.assertNumQueries(0):
            data2 = Test.objects.in_bulk((5432, self.t2.pk, 9200))
        self.assertDictEqual(data2, data1)
        self.assertDictEqual(data2, {self.t2.pk: self.t2})

    @all_final_sql_checks
    def test_values(self):
        qs = Test.objects.values('name', 'public')
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [{'name': 'test1', 'public': False},
                                      {'name': 'test2', 'public': True}])

    @all_final_sql_checks
    def test_values_list(self):
        qs = Test.objects.values_list('name', flat=True)
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, ['test1', 'test2'])

    def test_earliest(self):
        with self.assertNumQueries(1):
            data1 = Test.objects.earliest('date')
        with self.assertNumQueries(0):
            data2 = Test.objects.earliest('date')
        self.assertEqual(data2, data1)
        self.assertEqual(data2, self.t1)

    def test_latest(self):
        with self.assertNumQueries(1):
            data1 = Test.objects.latest('date')
        with self.assertNumQueries(0):
            data2 = Test.objects.latest('date')
        self.assertEqual(data2, data1)
        self.assertEqual(data2, self.t2)

    @all_final_sql_checks
    def test_dates(self):
        qs = Test.objects.dates('date', 'year')
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [datetime.date(1789, 1, 1),
                                      datetime.date(1944, 1, 1)])

    @all_final_sql_checks
    def test_datetimes(self):
        qs = Test.objects.datetimes('datetime', 'hour')
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [datetime.datetime(1789, 7, 14, 16),
                                      datetime.datetime(1944, 6, 6, 6)])

    @all_final_sql_checks
    @skipIf(connection.vendor == 'mysql',
            'Time zones are not supported by MySQL.')
    @override_settings(USE_TZ=True)
    def test_datetimes_with_time_zones(self):
        qs = Test.objects.datetimes('datetime', 'hour')
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [
            datetime.datetime(1789, 7, 14, 16, tzinfo=UTC),
            datetime.datetime(1944, 6, 6, 6, tzinfo=UTC)])

    @all_final_sql_checks
    def test_foreign_key(self):
        with self.assertNumQueries(3):
            data1 = [t.owner for t in Test.objects.all()]
        with self.assertNumQueries(0):
            data2 = [t.owner for t in Test.objects.all()]
        self.assertListEqual(data2, data1)
        self.assertListEqual(data2, [self.user, self.admin])

        qs = Test.objects.values_list('owner', flat=True)
        self.assert_tables(qs, Test, User)
        self.assert_query_cached(qs, [self.user.pk, self.admin.pk])

    def _test_many_to_many(self):
        u = User.objects.create_user('test_user')
        ct = ContentType.objects.get_for_model(User)
        u.user_permissions.add(
            Permission.objects.create(
                name='Can discuss', content_type=ct, codename='discuss'),
            Permission.objects.create(
                name='Can touch', content_type=ct, codename='touch'),
            Permission.objects.create(
                name='Can cuddle', content_type=ct, codename='cuddle'))
        return u.user_permissions.values_list('codename', flat=True)

    @with_final_sql_check
    def test_many_to_many_when_sql_check(self):
        qs = self._test_many_to_many()
        self.assert_tables(qs, User, User.user_permissions.through,
                           Permission, ContentType)
        self.assert_query_cached(qs, ['cuddle', 'discuss', 'touch'])

    @no_final_sql_check
    def test_many_to_many_when_no_sql_check(self):
        qs = self._test_many_to_many()
        self.assert_tables(qs, User, User.user_permissions.through,
                           Permission)
        self.assert_query_cached(qs, ['cuddle', 'discuss', 'touch'])

    @all_final_sql_checks
    def test_subquery(self):
        additional_tables = []
        if (DJANGO_VERSION >= (4, 0) and DJANGO_VERSION < (4, 1)
                and settings.CACHALOT_FINAL_SQL_CHECK):
            # Django 4.0 comes with some query optimizations
            # that build selects a little differently.
            additional_tables.append('django_content_type')

        qs = Test.objects.filter(owner__in=User.objects.all())
        self.assert_tables(qs, Test, User)
        self.assert_query_cached(qs, [self.t1, self.t2])

        qs = Test.objects.filter(
            owner__groups__permissions__in=Permission.objects.all()
        )
        self.assert_tables(
            qs, Test, User, User.groups.through, Group,
            Group.permissions.through, Permission, *additional_tables
        )
        self.assert_query_cached(qs, [self.t1, self.t1, self.t1])

        qs = Test.objects.filter(
            owner__groups__permissions__in=Permission.objects.all()
        ).distinct()
        self.assert_tables(
            qs, Test, User, User.groups.through, Group,
            Group.permissions.through, Permission, *additional_tables
        )
        self.assert_query_cached(qs, [self.t1])

        qs = TestChild.objects.exclude(permissions__isnull=True)
        self.assert_tables(
            qs, TestParent, TestChild, TestChild.permissions.through,
            Permission
        )
        self.assert_query_cached(qs, [])

        qs = TestChild.objects.exclude(permissions__name='')
        self.assert_tables(
            qs, TestParent, TestChild, TestChild.permissions.through,
            Permission
        )
        self.assert_query_cached(qs, [])

    @with_final_sql_check
    def test_custom_subquery_with_check(self):
        tests = Test.objects.filter(permission=OuterRef('pk')).values('name')
        qs = Permission.objects.annotate(first_permission=Subquery(tests[:1]))
        self.assert_tables(qs, Permission, Test, ContentType)
        self.assert_query_cached(qs, list(Permission.objects.all()))

    @no_final_sql_check
    def test_custom_subquery_no_check(self):
        tests = Test.objects.filter(permission=OuterRef('pk')).values('name')
        qs = Permission.objects.annotate(first_permission=Subquery(tests[:1]))
        self.assert_tables(qs, Permission, Test)
        self.assert_query_cached(qs, list(Permission.objects.all()))

    # Both variants were originally named `test_custom_subquery_exists`,
    # so the second definition shadowed the first and it never ran;
    # they are suffixed here following the `_with_check`/`_no_check`
    # convention used above.
    @with_final_sql_check
    def test_custom_subquery_exists_with_check(self):
        tests = Test.objects.filter(permission=OuterRef('pk'))
        qs = Permission.objects.annotate(has_tests=Exists(tests))
        self.assert_tables(qs, Permission, Test, ContentType)
        self.assert_query_cached(qs, list(Permission.objects.all()))

    @no_final_sql_check
    def test_custom_subquery_exists_no_check(self):
        tests = Test.objects.filter(permission=OuterRef('pk'))
        qs = Permission.objects.annotate(has_tests=Exists(tests))
        self.assert_tables(qs, Permission, Test)
        self.assert_query_cached(qs, list(Permission.objects.all()))

    @all_final_sql_checks
    def test_raw_subquery(self):
        with self.assertNumQueries(0):
            raw_sql = RawSQL('SELECT id FROM auth_permission WHERE id = %s',
                             (self.t1__permission.pk,))
            qs = Test.objects.filter(permission=raw_sql)
        self.assert_tables(qs, Test, Permission)
        self.assert_query_cached(qs, [self.t1])

        qs = Test.objects.filter(
            pk__in=Test.objects.filter(permission=raw_sql))
        self.assert_tables(qs, Test, Permission)
        self.assert_query_cached(qs, [self.t1])

    @all_final_sql_checks
    def test_aggregate(self):
        test3 = Test.objects.create(name='test3', owner=self.user)
        with self.assertNumQueries(1):
            n1 = User.objects.aggregate(n=Count('test'))['n']
        with self.assertNumQueries(0):
            n2 = User.objects.aggregate(n=Count('test'))['n']
        self.assertEqual(n2, n1)
        self.assertEqual(n2, 3)
        test3.delete()

    @all_final_sql_checks
    def test_annotate(self):
        test3 = Test.objects.create(name='test3', owner=self.user)
        qs = (User.objects.annotate(n=Count('test')).order_by('pk')
              .values_list('n', flat=True))
        self.assert_tables(qs, User, Test)
        self.assert_query_cached(qs, [2, 1])
        test3.delete()

    @all_final_sql_checks
    def test_annotate_subquery(self):
        tests = Test.objects.filter(owner=OuterRef('pk')).values('name')
        qs = User.objects.annotate(first_test=Subquery(tests[:1]))
        self.assert_tables(qs, User, Test)
        self.assert_query_cached(qs, [self.user, self.admin])

    @all_final_sql_checks
    def test_annotate_case_with_when_and_query_in_default(self):
        tests = Test.objects.filter(owner=OuterRef('pk')).values('name')
        qs = User.objects.annotate(
            first_test=Case(
                When(Q(pk=1), then=Value('noname')),
                default=Subquery(tests[:1])
            )
        )
        self.assert_tables(qs, User, Test)
        self.assert_query_cached(qs, [self.user, self.admin])

    @all_final_sql_checks
    def test_annotate_case_with_when(self):
        tests = Test.objects.filter(owner=OuterRef('pk')).values('name')
        qs = User.objects.annotate(
            first_test=Case(
                When(Q(pk=1), then=Subquery(tests[:1])),
                default=Value('noname')
            )
        )
        self.assert_tables(qs, User, Test)
        self.assert_query_cached(qs, [self.user, self.admin])

    @all_final_sql_checks
    def test_annotate_coalesce(self):
        tests = Test.objects.filter(owner=OuterRef('pk')).values('name')
        qs = User.objects.annotate(
            name=Coalesce(
                Subquery(tests[:1]),
                Value('notest')
            )
        )
        self.assert_tables(qs, User, Test)
        self.assert_query_cached(qs, [self.user, self.admin])

    @all_final_sql_checks
    def test_annotate_raw(self):
        qs = User.objects.annotate(
            perm_id=RawSQL('SELECT id FROM auth_permission WHERE id = %s',
                           (self.t1__permission.pk,))
        )
        self.assert_tables(qs, User, Permission)
        self.assert_query_cached(qs, [self.user, self.admin])

    @all_final_sql_checks
    def test_only(self):
        with self.assertNumQueries(1):
            t1 = Test.objects.only('name').first()
            t1.name
        with self.assertNumQueries(0):
            t2 = Test.objects.only('name').first()
            t2.name
        with self.assertNumQueries(1):
            t1.public
        with self.assertNumQueries(0):
            t2.public
        self.assertEqual(t2, t1)
        self.assertEqual(t2.name, t1.name)
        self.assertEqual(t2.public, t1.public)

    @all_final_sql_checks
    def test_defer(self):
        with self.assertNumQueries(1):
            t1 = Test.objects.defer('name').first()
            t1.public
        with
self.assertNumQueries(0): t2 = Test.objects.defer('name').first() t2.public with self.assertNumQueries(1): t1.name with self.assertNumQueries(0): t2.name self.assertEqual(t2, t1) self.assertEqual(t2.name, t1.name) self.assertEqual(t2.public, t1.public) @all_final_sql_checks def test_select_related(self): # Simple select_related with self.assertNumQueries(1): t1 = Test.objects.select_related('owner').get(name='test1') self.assertEqual(t1.owner, self.user) with self.assertNumQueries(0): t2 = Test.objects.select_related('owner').get(name='test1') self.assertEqual(t2.owner, self.user) self.assertEqual(t2, t1) self.assertEqual(t2, self.t1) # Select_related through a foreign key with self.assertNumQueries(1): t3 = Test.objects.select_related('permission__content_type')[0] self.assertEqual(t3.permission, self.t1.permission) self.assertEqual(t3.permission.content_type, self.t1__permission.content_type) with self.assertNumQueries(0): t4 = Test.objects.select_related('permission__content_type')[0] self.assertEqual(t4.permission, self.t1.permission) self.assertEqual(t4.permission.content_type, self.t1__permission.content_type) self.assertEqual(t4, t3) self.assertEqual(t4, self.t1) @all_final_sql_checks def test_prefetch_related(self): # Simple prefetch_related with self.assertNumQueries(2): data1 = list(User.objects.prefetch_related('user_permissions')) with self.assertNumQueries(0): permissions1 = [p for u in data1 for p in u.user_permissions.all()] with self.assertNumQueries(0): data2 = list(User.objects.prefetch_related('user_permissions')) permissions2 = [p for u in data2 for p in u.user_permissions.all()] self.assertListEqual(permissions2, permissions1) self.assertListEqual(permissions2, self.user__permissions) # Prefetch_related through a foreign key where exactly # the same prefetch_related SQL request was executed before with self.assertNumQueries(1): data3 = list(Test.objects.select_related('owner') .prefetch_related('owner__user_permissions')) with 
self.assertNumQueries(0): permissions3 = [p for t in data3 for p in t.owner.user_permissions.all()] with self.assertNumQueries(0): data4 = list(Test.objects.select_related('owner') .prefetch_related('owner__user_permissions')) permissions4 = [p for t in data4 for p in t.owner.user_permissions.all()] self.assertListEqual(permissions4, permissions3) self.assertListEqual(permissions4, self.user__permissions) # Prefetch_related through a foreign key where exactly # the same prefetch_related SQL request was not fetched before with self.assertNumQueries(2): data5 = list(Test.objects .select_related('owner') .prefetch_related('owner__user_permissions')[:1]) with self.assertNumQueries(0): permissions5 = [p for t in data5 for p in t.owner.user_permissions.all()] with self.assertNumQueries(0): data6 = list(Test.objects.select_related('owner') .prefetch_related('owner__user_permissions')[:1]) permissions6 = [p for t in data6 for p in t.owner.user_permissions.all()] self.assertListEqual(permissions6, permissions5) self.assertListEqual(permissions6, self.user__permissions) # Prefetch_related through a many to many with self.assertNumQueries(2): data7 = list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) with self.assertNumQueries(0): permissions7 = [p for t in data7 for g in t.owner.groups.all() for p in g.permissions.all()] with self.assertNumQueries(0): data8 = list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) permissions8 = [p for t in data8 for g in t.owner.groups.all() for p in g.permissions.all()] self.assertListEqual(permissions8, permissions7) self.assertListEqual(permissions8, self.group__permissions) @all_final_sql_checks def test_test_parent(self): child = TestChild.objects.create(name='child') qs = TestChild.objects.filter(name='child') self.assert_query_cached(qs) parent = TestParent.objects.all().first() parent.name = 'another name' parent.save() child = TestChild.objects.all().first() 
        self.assertEqual(child.name, 'another name')

    def _filtered_relation(self):
        """
        Resulting query:

        SELECT "cachalot_testparent"."id",
               "cachalot_testparent"."name",
               "cachalot_testchild"."testparent_ptr_id",
               "cachalot_testchild"."public"
        FROM "cachalot_testchild"
        INNER JOIN "cachalot_testparent"
            ON ("cachalot_testchild"."testparent_ptr_id"
                = "cachalot_testparent"."id")
        """
        from django.db.models import FilteredRelation
        qs = TestChild.objects.annotate(
            filtered_permissions=FilteredRelation(
                'permissions', condition=Q(permissions__pk__gt=1))
        )
        return qs

    def _filtered_relation_common_asserts(self, qs):
        self.assert_query_cached(qs)

        values_qs = qs.values('filtered_permissions')
        self.assert_tables(
            values_qs, TestParent, TestChild,
            TestChild.permissions.through, Permission
        )
        self.assert_query_cached(values_qs)

        filtered_qs = qs.filter(filtered_permissions__pk__gt=2)
        self.assert_tables(
            filtered_qs, TestParent, TestChild,
            TestChild.permissions.through, Permission
        )
        self.assert_query_cached(filtered_qs)

    @with_final_sql_check
    def test_filtered_relation_with_check(self):
        qs = self._filtered_relation()
        self.assert_tables(qs, TestParent, TestChild)
        self._filtered_relation_common_asserts(qs)

    @no_final_sql_check
    def test_filtered_relation_no_check(self):
        qs = self._filtered_relation()
        self.assert_tables(qs, TestChild)
        self._filtered_relation_common_asserts(qs)

    def _test_union(self, check: bool):
        qs = (
            Test.objects.filter(pk__lt=5)
            | Test.objects.filter(permission__name__contains='a')
        )
        self.assert_tables(qs, Test, Permission)
        self.assert_query_cached(qs)

        with self.assertRaisesMessage(
            AssertionError if DJANGO_VERSION < (4, 0) else TypeError,
            'Cannot combine queries on two different base models.'
        ):
            Test.objects.all() | Permission.objects.all()

        qs = Test.objects.filter(pk__lt=5)
        sub_qs = Test.objects.filter(permission__name__contains='a')
        if self.is_sqlite:
            qs = qs.order_by()
            sub_qs = sub_qs.order_by()
        qs = qs.union(sub_qs)
        self.assert_tables(qs, Test, Permission)
        self.assert_query_cached(qs)

        qs = Test.objects.all()
        sub_qs = Permission.objects.all()
        if self.is_sqlite:
            qs = qs.order_by()
            sub_qs = sub_qs.order_by()
        qs = qs.union(sub_qs)
        tables = {Test, Permission}
        # Sqlite does not do an ORDER BY django_content_type
        if not self.is_sqlite and check:
            tables.add(ContentType)
        self.assert_tables(qs, *tables)
        with self.assertRaises((ProgrammingError, OperationalError)):
            self.assert_query_cached(qs)

    @with_final_sql_check
    @skipUnlessDBFeature('supports_select_union')
    def test_union_with_sql_check(self):
        self._test_union(check=True)

    @no_final_sql_check
    @skipUnlessDBFeature('supports_select_union')
    def test_union_no_sql_check(self):
        self._test_union(check=False)

    def _test_intersection(self, check: bool):
        qs = (Test.objects.filter(pk__lt=5)
              & Test.objects.filter(permission__name__contains='a'))
        self.assert_tables(qs, Test, Permission)
        self.assert_query_cached(qs)

        with self.assertRaisesMessage(
                AssertionError if DJANGO_VERSION < (4, 0) else TypeError,
                'Cannot combine queries on two different base models.'):
            Test.objects.all() & Permission.objects.all()

        qs = Test.objects.filter(pk__lt=5)
        sub_qs = Test.objects.filter(permission__name__contains='a')
        if self.is_sqlite:
            qs = qs.order_by()
            sub_qs = sub_qs.order_by()
        qs = qs.intersection(sub_qs)
        self.assert_tables(qs, Test, Permission)
        self.assert_query_cached(qs)

        qs = Test.objects.all()
        sub_qs = Permission.objects.all()
        if self.is_sqlite:
            qs = qs.order_by()
            sub_qs = sub_qs.order_by()
        qs = qs.intersection(sub_qs)
        tables = {Test, Permission}
        if not self.is_sqlite and check:
            tables.add(ContentType)
        self.assert_tables(qs, *tables)
        with self.assertRaises((ProgrammingError, OperationalError)):
            self.assert_query_cached(qs)

    @with_final_sql_check
    @skipUnlessDBFeature('supports_select_intersection')
    def test_intersection_with_check(self):
        self._test_intersection(check=True)

    @no_final_sql_check
    @skipUnlessDBFeature('supports_select_intersection')
    def test_intersection_no_check(self):
        self._test_intersection(check=False)

    def _test_difference(self, check: bool):
        qs = Test.objects.filter(pk__lt=5)
        sub_qs = Test.objects.filter(permission__name__contains='a')
        if self.is_sqlite:
            qs = qs.order_by()
            sub_qs = sub_qs.order_by()
        qs = qs.difference(sub_qs)
        self.assert_tables(qs, Test, Permission)
        self.assert_query_cached(qs)

        qs = Test.objects.all()
        sub_qs = Permission.objects.all()
        if self.is_sqlite:
            qs = qs.order_by()
            sub_qs = sub_qs.order_by()
        qs = qs.difference(sub_qs)
        tables = {Test, Permission}
        if not self.is_sqlite and check:
            tables.add(ContentType)
        self.assert_tables(qs, *tables)
        with self.assertRaises((ProgrammingError, OperationalError)):
            self.assert_query_cached(qs)

    @with_final_sql_check
    @skipUnlessDBFeature('supports_select_difference')
    def test_difference_with_check(self):
        self._test_difference(check=True)

    @no_final_sql_check
    @skipUnlessDBFeature('supports_select_difference')
    def test_difference_no_check(self):
        self._test_difference(check=False)

    @skipUnlessDBFeature('has_select_for_update')
    def test_select_for_update(self):
        """
        Tests if ``select_for_update`` queries are not cached.
        """
        with self.assertRaises(TransactionManagementError):
            list(Test.objects.select_for_update())

        with self.assertNumQueries(1):
            with transaction.atomic():
                data1 = list(Test.objects.select_for_update())
                self.assertListEqual(data1, [self.t1, self.t2])
                self.assertListEqual([t.name for t in data1],
                                     ['test1', 'test2'])

        with self.assertNumQueries(1):
            with transaction.atomic():
                data2 = list(Test.objects.select_for_update())
                self.assertListEqual(data2, [self.t1, self.t2])
                self.assertListEqual([t.name for t in data2],
                                     ['test1', 'test2'])

        with self.assertNumQueries(2):
            with transaction.atomic():
                data3 = list(Test.objects.select_for_update())
                data4 = list(Test.objects.select_for_update())
                self.assertListEqual(data3, [self.t1, self.t2])
                self.assertListEqual(data4, [self.t1, self.t2])
                self.assertListEqual([t.name for t in data3],
                                     ['test1', 'test2'])
                self.assertListEqual([t.name for t in data4],
                                     ['test1', 'test2'])

    @all_final_sql_checks
    def test_having(self):
        qs = (User.objects.annotate(n=Count('user_permissions'))
              .filter(n__gte=1))
        self.assert_tables(qs, User, User.user_permissions.through,
                           Permission)
        self.assert_query_cached(qs, [self.user])

        with self.assertNumQueries(1):
            self.assertEqual(
                User.objects.annotate(n=Count('user_permissions'))
                .filter(n__gte=1).count(), 1)
        with self.assertNumQueries(0):
            self.assertEqual(
                User.objects.annotate(n=Count('user_permissions'))
                .filter(n__gte=1).count(), 1)

    def test_extra_select(self):
        username_length_sql = """
            SELECT LENGTH(%(user_table)s.username)
            FROM %(user_table)s
            WHERE %(user_table)s.id = %(test_table)s.owner_id
        """ % {'user_table': User._meta.db_table,
               'test_table': Test._meta.db_table}

        with self.assertNumQueries(1):
            data1 = list(Test.objects.extra(
                select={'username_length': username_length_sql}))
            self.assertListEqual(data1, [self.t1, self.t2])
            self.assertListEqual([o.username_length for o in data1], [4, 5])

        with self.assertNumQueries(0):
            data2 = list(Test.objects.extra(
                select={'username_length': username_length_sql}))
            self.assertListEqual(data2, [self.t1, self.t2])
            self.assertListEqual([o.username_length for o in data2], [4, 5])

    @all_final_sql_checks
    def test_extra_where(self):
        sql_condition = ("owner_id IN "
                         "(SELECT id FROM auth_user WHERE username = 'admin')")
        qs = Test.objects.extra(where=[sql_condition])
        self.assert_tables(qs, Test, User)
        self.assert_query_cached(qs, [self.t2])

    @all_final_sql_checks
    def test_extra_tables(self):
        qs = Test.objects.extra(tables=['auth_user'],
                                select={'extra_id': 'auth_user.id'})
        self.assert_tables(qs, Test, User)
        self.assert_query_cached(qs)

    @all_final_sql_checks
    def test_extra_order_by(self):
        qs = Test.objects.extra(order_by=['-cachalot_test.name'])
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs, [self.t2, self.t1])

    def test_table_inheritance(self):
        with self.assertNumQueries(2):
            t_child = TestChild.objects.create(name='test_child')

        with self.assertNumQueries(1):
            self.assertEqual(TestChild.objects.get(), t_child)
        with self.assertNumQueries(0):
            self.assertEqual(TestChild.objects.get(), t_child)

    def test_explain(self):
        explain_kwargs = {}
        if self.is_sqlite:
            expected = (r'\d+ 0 0 SCAN cachalot_test\n'
                        r'\d+ 0 0 USE TEMP B-TREE FOR ORDER BY')
        elif self.is_mysql:
            expected = (
                r'-> Sort row IDs: cachalot_test.`name` '
                r'\(cost=[\d\.]+ rows=\d\)\n '
                r'-> Table scan on cachalot_test '
                r'\(cost=[\d\.]+ rows=\d\)\n'
            )
        else:
            explain_kwargs.update(
                analyze=True,
                costs=False,
            )
            operation_detail = (r'\(actual time=[\d\.]+..[\d\.]+\ '
                                r'rows=\d+ loops=\d+\)')
            expected = (
                r'^Sort %s\n'
                r' Sort Key: name\n'
                r' Sort Method: quicksort Memory: \d+kB\n'
                r' -> Seq Scan on cachalot_test %s\n'
                r'Planning Time: [\d\.]+ ms\n'
                r'Execution Time: [\d\.]+ ms$') % (operation_detail,
                                                   operation_detail)
        with self.assertNumQueries(1):
            explanation1 = Test.objects.explain(**explain_kwargs)
        self.assertRegex(explanation1, expected)
        with self.assertNumQueries(0):
            explanation2 = Test.objects.explain(**explain_kwargs)
        self.assertEqual(explanation2, explanation1)

    def test_raw(self):
        """
        Tests if ``Model.objects.raw`` queries are not cached.
        """
        sql = 'SELECT * FROM %s;' % Test._meta.db_table
        with self.assertNumQueries(1):
            data1 = list(Test.objects.raw(sql))
        with self.assertNumQueries(1):
            data2 = list(Test.objects.raw(sql))
        self.assertListEqual(data2, data1)
        self.assertListEqual(data2, [self.t1, self.t2])

    def test_raw_no_table(self):
        sql = 'SELECT * FROM (SELECT 1 AS id UNION ALL SELECT 2) AS t;'
        with self.assertNumQueries(1):
            data1 = list(Test.objects.raw(sql))
        with self.assertNumQueries(1):
            data2 = list(Test.objects.raw(sql))
        self.assertListEqual(data2, data1)
        self.assertListEqual(data2, [Test(pk=1), Test(pk=2)])

    def test_cursor_execute_unicode(self):
        """
        Tests if queries executed from a DB cursor are not cached.
        """
        attname_column_list = [f.get_attname_column()
                               for f in Test._meta.fields]
        attnames = [t[0] for t in attname_column_list]
        columns = [t[1] for t in attname_column_list]
        sql = "SELECT CAST('é' AS CHAR), %s FROM %s;" % (', '.join(columns),
                                                         Test._meta.db_table)
        with self.assertNumQueries(1):
            with connection.cursor() as cursor:
                cursor.execute(sql)
                data1 = list(cursor.fetchall())
        with self.assertNumQueries(1):
            with connection.cursor() as cursor:
                cursor.execute(sql)
                data2 = list(cursor.fetchall())
        self.assertListEqual(data2, data1)
        self.assertListEqual(
            data2,
            [('é',) + l for l in Test.objects.values_list(*attnames)])

    @skipIf(connection.vendor == 'sqlite',
            'SQLite doesn’t accept bytes as raw query.')
    def test_cursor_execute_bytes(self):
        attname_column_list = [f.get_attname_column()
                               for f in Test._meta.fields]
        attnames = [t[0] for t in attname_column_list]
        columns = [t[1] for t in attname_column_list]
        sql = "SELECT CAST('é' AS CHAR), %s FROM %s;" % (', '.join(columns),
                                                         Test._meta.db_table)
        sql = sql.encode('utf-8')
        with self.assertNumQueries(1):
            with connection.cursor() as cursor:
                cursor.execute(sql)
                data1 = list(cursor.fetchall())
        with self.assertNumQueries(1):
            with connection.cursor() as cursor:
                cursor.execute(sql)
                data2 = list(cursor.fetchall())
        self.assertListEqual(data2, data1)
        self.assertListEqual(
            data2,
            [('é',) + l for l in Test.objects.values_list(*attnames)])

    def test_cursor_execute_no_table(self):
        sql = 'SELECT * FROM (SELECT 1 AS id UNION ALL SELECT 2) AS t;'
        with self.assertNumQueries(1):
            with connection.cursor() as cursor:
                cursor.execute(sql)
                data1 = list(cursor.fetchall())
        with self.assertNumQueries(1):
            with connection.cursor() as cursor:
                cursor.execute(sql)
                data2 = list(cursor.fetchall())
        self.assertListEqual(data2, data1)
        self.assertListEqual(data2, [(1,), (2,)])

    @all_final_sql_checks
    def test_missing_table_cache_key(self):
        qs = Test.objects.all()
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs)

        table_cache_key = cachalot_settings.CACHALOT_TABLE_KEYGEN(
            connection.alias, Test._meta.db_table)
        cachalot_caches.get_cache().delete(table_cache_key)

        self.assert_query_cached(qs)

    @all_final_sql_checks
    def test_broken_query_cache_value(self):
        """
        In some undetermined cases, cache.get_many returns wrong values
        such as `None` or other invalid values. They should be
        gracefully handled.

        See https://github.com/noripyt/django-cachalot/issues/110

        This test artificially creates a wrong value, but it’s usually
        a cache backend bug that leads to these wrong values.
        """
        qs = Test.objects.all()
        self.assert_tables(qs, Test)
        self.assert_query_cached(qs)

        query_cache_key = cachalot_settings.CACHALOT_QUERY_KEYGEN(
            qs.query.get_compiler(DEFAULT_DB_ALIAS))
        cachalot_caches.get_cache().set(query_cache_key, (),
                                        cachalot_settings.CACHALOT_TIMEOUT)

        self.assert_query_cached(qs)

    def test_unicode_get(self):
        with self.assertNumQueries(1):
            with self.assertRaises(Test.DoesNotExist):
                Test.objects.get(name='Clémentine')
        with self.assertNumQueries(0):
            with self.assertRaises(Test.DoesNotExist):
                Test.objects.get(name='Clémentine')

    @all_final_sql_checks
    def test_unicode_table_name(self):
        """
        Tests if using unicode in table names does not break caching.
""" table_name = 'Clémentine' if self.is_postgresql: table_name = '"%s"' % table_name with connection.cursor() as cursor: cursor.execute('CREATE TABLE %s (taste VARCHAR(20));' % table_name) qs = Test.objects.extra(tables=['Clémentine'], select={'taste': '%s.taste' % table_name}) # Here the table `Clémentine` is not detected because it is not # registered by Django, and we only check for registered tables # to avoid creating an extra SQL query fetching table names. self.assert_tables(qs, Test) self.assert_query_cached(qs) with connection.cursor() as cursor: cursor.execute('DROP TABLE %s;' % table_name) @all_final_sql_checks def test_unmanaged_model(self): qs = UnmanagedModel.objects.all() self.assert_tables(qs, UnmanagedModel) self.assert_query_cached(qs) def test_now_annotate(self): """Check that queries with a Now() annotation are not cached #193""" qs = Test.objects.annotate(now=Now()) self.assert_query_cached(qs, after=1) class ParameterTypeTestCase(TestUtilsMixin, TransactionTestCase): @all_final_sql_checks def test_tuple(self): qs = Test.objects.filter(pk__in=(1, 2, 3)) self.assert_tables(qs, Test) self.assert_query_cached(qs) qs = Test.objects.filter(pk__in=(4, 5, 6)) self.assert_tables(qs, Test) self.assert_query_cached(qs) @all_final_sql_checks def test_list(self): qs = Test.objects.filter(pk__in=[1, 2, 3]) self.assert_tables(qs, Test) self.assert_query_cached(qs) l = [4, 5, 6] qs = Test.objects.filter(pk__in=l) self.assert_tables(qs, Test) self.assert_query_cached(qs) l.append(7) self.assert_tables(qs, Test) # The queryset is not taking the new element into account because # the list was copied during `.filter()`. self.assert_query_cached(qs, before=0) qs = Test.objects.filter(pk__in=l) self.assert_tables(qs, Test) self.assert_query_cached(qs) @all_final_sql_checks def test_binary(self): """ Binary data should be cached on PostgreSQL & MySQL, but not on SQLite, because SQLite uses a type making it hard to access data itself. 
So this also tests how django-cachalot handles unknown params, in this case the `memory` object passed to SQLite. """ qs = Test.objects.filter(bin=None) self.assert_tables(qs, Test) self.assert_query_cached(qs) qs = Test.objects.filter(bin=b'abc') self.assert_tables(qs, Test) self.assert_query_cached(qs, after=1 if self.is_sqlite else 0) qs = Test.objects.filter(bin=b'def') self.assert_tables(qs, Test) self.assert_query_cached(qs, after=1 if self.is_sqlite else 0) def test_float(self): with self.assertNumQueries(1): Test.objects.create(name='test1', a_float=0.123456789) with self.assertNumQueries(1): Test.objects.create(name='test2', a_float=12345.6789) with self.assertNumQueries(1): data1 = list(Test.objects.values_list('a_float', flat=True).filter( a_float__isnull=False).order_by('a_float')) with self.assertNumQueries(0): data2 = list(Test.objects.values_list('a_float', flat=True).filter( a_float__isnull=False).order_by('a_float')) self.assertListEqual(data2, data1) self.assertEqual(len(data2), 2) self.assertAlmostEqual(data2[0], 0.123456789, delta=0.0001) self.assertAlmostEqual(data2[1], 12345.6789, delta=0.0001) with self.assertNumQueries(1): Test.objects.get(a_float=0.123456789) with self.assertNumQueries(0): Test.objects.get(a_float=0.123456789) @all_final_sql_checks def test_decimal(self): with self.assertNumQueries(1): test1 = Test.objects.create(name='test1', a_decimal=Decimal('123.45')) with self.assertNumQueries(1): test2 = Test.objects.create(name='test2', a_decimal=Decimal('12.3')) qs = Test.objects.values_list('a_decimal', flat=True).filter( a_decimal__isnull=False).order_by('a_decimal') self.assert_tables(qs, Test) self.assert_query_cached(qs, [Decimal('12.3'), Decimal('123.45')]) with self.assertNumQueries(1): Test.objects.get(a_decimal=Decimal('123.45')) with self.assertNumQueries(0): Test.objects.get(a_decimal=Decimal('123.45')) test1.delete() test2.delete() @all_final_sql_checks def test_ipv4_address(self): with self.assertNumQueries(1): test1 = 
Test.objects.create(name='test1', ip='127.0.0.1') with self.assertNumQueries(1): test2 = Test.objects.create(name='test2', ip='192.168.0.1') qs = Test.objects.values_list('ip', flat=True).filter( ip__isnull=False).order_by('ip') self.assert_tables(qs, Test) self.assert_query_cached(qs, ['127.0.0.1', '192.168.0.1']) with self.assertNumQueries(1): Test.objects.get(ip='127.0.0.1') with self.assertNumQueries(0): Test.objects.get(ip='127.0.0.1') test1.delete() test2.delete() @all_final_sql_checks def test_ipv6_address(self): with self.assertNumQueries(1): test1 = Test.objects.create(name='test1', ip='2001:db8:a0b:12f0::1') with self.assertNumQueries(1): test2 = Test.objects.create(name='test2', ip='2001:db8:0:85a3::ac1f:8001') qs = Test.objects.values_list('ip', flat=True).filter( ip__isnull=False).order_by('ip') self.assert_tables(qs, Test) self.assert_query_cached(qs, ['2001:db8:0:85a3::ac1f:8001', '2001:db8:a0b:12f0::1']) with self.assertNumQueries(1): Test.objects.get(ip='2001:db8:0:85a3::ac1f:8001') with self.assertNumQueries(0): Test.objects.get(ip='2001:db8:0:85a3::ac1f:8001') test1.delete() test2.delete() @all_final_sql_checks def test_duration(self): with self.assertNumQueries(1): test1 = Test.objects.create(name='test1', duration=datetime.timedelta(30)) with self.assertNumQueries(1): test2 = Test.objects.create(name='test2', duration=datetime.timedelta(60)) qs = Test.objects.values_list('duration', flat=True).filter( duration__isnull=False).order_by('duration') self.assert_tables(qs, Test) self.assert_query_cached(qs, [datetime.timedelta(30), datetime.timedelta(60)]) with self.assertNumQueries(1): Test.objects.get(duration=datetime.timedelta(30)) with self.assertNumQueries(0): Test.objects.get(duration=datetime.timedelta(30)) test1.delete() test2.delete() @all_final_sql_checks def test_uuid(self): with self.assertNumQueries(1): test1 = Test.objects.create(name='test1', uuid='1cc401b7-09f4-4520-b8d0-c267576d196b') with self.assertNumQueries(1): test2 = 
Test.objects.create(name='test2', uuid='ebb3b6e1-1737-4321-93e3-4c35d61ff491') qs = Test.objects.values_list('uuid', flat=True).filter( uuid__isnull=False).order_by('uuid') self.assert_tables(qs, Test) self.assert_query_cached(qs, [ UUID('1cc401b7-09f4-4520-b8d0-c267576d196b'), UUID('ebb3b6e1-1737-4321-93e3-4c35d61ff491')]) with self.assertNumQueries(1): Test.objects.get(uuid=UUID('1cc401b7-09f4-4520-b8d0-c267576d196b')) with self.assertNumQueries(0): Test.objects.get(uuid=UUID('1cc401b7-09f4-4520-b8d0-c267576d196b')) test1.delete() test2.delete() def test_now(self): """ Checks that queries with a Now() parameter are not cached. """ obj = Test.objects.create(datetime='1992-07-02T12:00:00') qs = Test.objects.filter(datetime__lte=Now()) with self.assertNumQueries(1): obj1 = qs.get() with self.assertNumQueries(1): obj2 = qs.get() self.assertEqual(obj1, obj2) self.assertEqual(obj1, obj) django-cachalot-2.8.0/cachalot/tests/settings.py000066400000000000000000000337571500004256100217010ustar00rootroot00000000000000from time import sleep from unittest import skipIf from unittest.mock import MagicMock, patch from django.conf import settings from django.contrib.auth.models import User from django.core.cache import DEFAULT_CACHE_ALIAS from django.core.checks import Error, Tags, Warning, run_checks from django.db import connection from django.test import TransactionTestCase from django.test.utils import override_settings from ..api import invalidate from ..settings import SUPPORTED_DATABASE_ENGINES, SUPPORTED_ONLY from ..utils import _get_tables from .models import Test, TestChild, TestParent, UnmanagedModel from .test_utils import TestUtilsMixin class SettingsTestCase(TestUtilsMixin, TransactionTestCase): @override_settings(CACHALOT_ENABLED=False) def test_decorator(self): self.assert_query_cached(Test.objects.all(), after=1) def test_django_override(self): with self.settings(CACHALOT_ENABLED=False): qs = Test.objects.all() self.assert_query_cached(qs, after=1) with 
self.settings(CACHALOT_ENABLED=True): self.assert_query_cached(qs) def test_enabled(self): qs = Test.objects.all() with self.settings(CACHALOT_ENABLED=True): self.assert_query_cached(qs) with self.settings(CACHALOT_ENABLED=False): self.assert_query_cached(qs, after=1) with self.assertNumQueries(0): list(Test.objects.all()) with self.settings(CACHALOT_ENABLED=False): with self.assertNumQueries(1): t = Test.objects.create(name='test') with self.assertNumQueries(1): data = list(Test.objects.all()) self.assertListEqual(data, [t]) @skipIf(len(settings.CACHES) == 1, 'We can’t change the cache used ' 'since there’s only one configured.') def test_cache(self): other_cache_alias = next(alias for alias in settings.CACHES if alias != DEFAULT_CACHE_ALIAS) invalidate(Test, cache_alias=other_cache_alias) qs = Test.objects.all() with self.settings(CACHALOT_CACHE=DEFAULT_CACHE_ALIAS): self.assert_query_cached(qs) with self.settings(CACHALOT_CACHE=other_cache_alias): self.assert_query_cached(qs) Test.objects.create(name='test') # Only `CACHALOT_CACHE` is invalidated, so changing the database should # not invalidate all caches. 
with self.settings(CACHALOT_CACHE=other_cache_alias): self.assert_query_cached(qs, before=0) def test_databases(self): qs = Test.objects.all() with self.settings(CACHALOT_DATABASES=SUPPORTED_ONLY): self.assert_query_cached(qs) invalidate(Test) engine = connection.settings_dict['ENGINE'] SUPPORTED_DATABASE_ENGINES.remove(engine) with self.settings(CACHALOT_DATABASES=SUPPORTED_ONLY): self.assert_query_cached(qs, after=1) SUPPORTED_DATABASE_ENGINES.add(engine) with self.settings(CACHALOT_DATABASES=SUPPORTED_ONLY): self.assert_query_cached(qs) with self.settings(CACHALOT_DATABASES=[]): self.assert_query_cached(qs, after=1) def test_cache_timeout(self): qs = Test.objects.all() with self.assertNumQueries(1): list(qs.all()) sleep(1) with self.assertNumQueries(0): list(qs.all()) invalidate(Test) with self.settings(CACHALOT_TIMEOUT=0): with self.assertNumQueries(1): list(qs.all()) sleep(0.05) with self.assertNumQueries(1): list(qs.all()) # We have to test with a full second and not a shorter time because # memcached only takes the integer part of the timeout into account. 
with self.settings(CACHALOT_TIMEOUT=1): self.assert_query_cached(qs) sleep(1) with self.assertNumQueries(1): list(Test.objects.all()) def test_cache_random(self): qs = Test.objects.order_by('?') self.assert_query_cached(qs, after=1, compare_results=False) with self.settings(CACHALOT_CACHE_RANDOM=True): self.assert_query_cached(qs) def test_invalidate_raw(self): with self.assertNumQueries(1): list(Test.objects.all()) with self.settings(CACHALOT_INVALIDATE_RAW=False): with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute("UPDATE %s SET name = 'new name';" % Test._meta.db_table) with self.assertNumQueries(0): list(Test.objects.all()) def test_only_cachable_tables(self): with self.settings(CACHALOT_ONLY_CACHABLE_TABLES=('cachalot_test',)): self.assert_query_cached(Test.objects.all()) self.assert_query_cached(TestParent.objects.all(), after=1) self.assert_query_cached(Test.objects.select_related('owner'), after=1) self.assert_query_cached(TestParent.objects.all()) with self.settings(CACHALOT_ONLY_CACHABLE_TABLES=( 'cachalot_test', 'cachalot_testchild', 'auth_user')): self.assert_query_cached(Test.objects.select_related('owner')) # TestChild uses multi-table inheritance, and since its parent, # 'cachalot_testparent', is not cachable, a basic # TestChild query can’t be cached self.assert_query_cached(TestChild.objects.all(), after=1) # However, if we only fetch data from the 'cachalot_testchild' # table, it’s cachable. self.assert_query_cached(TestChild.objects.values('public')) @override_settings(CACHALOT_ONLY_CACHABLE_APPS=('cachalot',)) def test_only_cachable_apps(self): self.assert_query_cached(Test.objects.all()) self.assert_query_cached(TestParent.objects.all()) self.assert_query_cached(Test.objects.select_related('owner'), after=1) # Must use override_settings to get the correct effect. 
Using the cm doesn't # reload settings on cachalot's side @override_settings(CACHALOT_ONLY_CACHABLE_TABLES=('cachalot_test', 'auth_user'), CACHALOT_ONLY_CACHABLE_APPS=('cachalot',)) def test_only_cachable_apps_set_combo(self): self.assert_query_cached(Test.objects.all()) self.assert_query_cached(TestParent.objects.all()) self.assert_query_cached(Test.objects.select_related('owner')) def test_uncachable_tables(self): qs = Test.objects.all() with self.settings(CACHALOT_UNCACHABLE_TABLES=('cachalot_test',)): self.assert_query_cached(qs, after=1) self.assert_query_cached(qs) with self.settings(CACHALOT_UNCACHABLE_TABLES=('cachalot_test',)): self.assert_query_cached(qs, after=1) @override_settings(CACHALOT_UNCACHABLE_APPS=('cachalot',)) def test_uncachable_apps(self): self.assert_query_cached(Test.objects.all(), after=1) self.assert_query_cached(TestParent.objects.all(), after=1) @override_settings(CACHALOT_UNCACHABLE_TABLES=('cachalot_test',), CACHALOT_UNCACHABLE_APPS=('cachalot',)) def test_uncachable_apps_set_combo(self): self.assert_query_cached(Test.objects.all(), after=1) self.assert_query_cached(TestParent.objects.all(), after=1) def test_only_cachable_and_uncachable_table(self): with self.settings( CACHALOT_ONLY_CACHABLE_TABLES=('cachalot_test', 'cachalot_testparent'), CACHALOT_UNCACHABLE_TABLES=('cachalot_test',)): self.assert_query_cached(Test.objects.all(), after=1) self.assert_query_cached(TestParent.objects.all()) self.assert_query_cached(User.objects.all(), after=1) def test_uncachable_unmanaged_table(self): qs = UnmanagedModel.objects.all() with self.settings( CACHALOT_UNCACHABLE_TABLES=("cachalot_unmanagedmodel",), CACHALOT_ADDITIONAL_TABLES=("cachalot_unmanagedmodel",) ): self.assert_query_cached(qs, after=1) def test_cache_compatibility(self): compatible_cache = { 'BACKEND': 'django.core.cache.backends.locmem.LocMemCache', } incompatible_cache = { 'BACKEND': 'django.core.cache.backends.db.DatabaseCache', 'LOCATION': 'cache_table' } with 
self.settings(CACHES={'default': compatible_cache, 'secondary': incompatible_cache}): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, []) warning001 = Warning( 'Cache backend %r is not supported by django-cachalot.' % 'django.core.cache.backends.db.DatabaseCache', hint='Switch to a supported cache backend ' 'like Redis or Memcached.', id='cachalot.W001') with self.settings(CACHES={'default': incompatible_cache}): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, [warning001]) with self.settings(CACHES={'default': compatible_cache, 'secondary': incompatible_cache}, CACHALOT_CACHE='secondary'): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, [warning001]) def test_database_compatibility(self): compatible_database = { 'ENGINE': 'django.db.backends.sqlite3', 'NAME': 'non_existent_db.sqlite3', } incompatible_database = { 'ENGINE': 'django.db.backends.oracle', 'NAME': 'non_existent_db', } warning002 = Warning( 'None of the configured databases are supported ' 'by django-cachalot.', hint='Use a supported database, or remove django-cachalot, or ' 'put at least one database alias in `CACHALOT_DATABASES` ' 'to force django-cachalot to use it.', id='cachalot.W002' ) warning003 = Warning( 'Database engine %r is not supported by django-cachalot.' % 'django.db.backends.oracle', hint='Switch to a supported database engine.', id='cachalot.W003' ) warning004 = Warning( 'Django-cachalot is useless because no database ' 'is configured in `CACHALOT_DATABASES`.', hint='Reconfigure django-cachalot or remove it.', id='cachalot.W004' ) error001 = Error( 'Database alias %r from `CACHALOT_DATABASES` ' 'is not defined in `DATABASES`.' % 'secondary', hint='Change `CACHALOT_DATABASES` to be compliant with' '`CACHALOT_DATABASES`', id='cachalot.E001', ) error002 = Error( "`CACHALOT_DATABASES` must be either %r or a list, tuple, " "frozenset or set of database aliases." 
% SUPPORTED_ONLY, hint='Remove `CACHALOT_DATABASES` or change it.', id='cachalot.E002', ) with self.settings(DATABASES={'default': incompatible_database}): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, [warning002]) with self.settings(DATABASES={'default': compatible_database, 'secondary': incompatible_database}): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, []) with self.settings(DATABASES={'default': incompatible_database, 'secondary': compatible_database}): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, []) with self.settings(DATABASES={'default': incompatible_database}, CACHALOT_DATABASES=['default']): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, [warning003]) with self.settings(DATABASES={'default': incompatible_database}, CACHALOT_DATABASES=[]): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, [warning004]) with self.settings(DATABASES={'default': incompatible_database}, CACHALOT_DATABASES=['secondary']): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, [error001]) with self.settings(DATABASES={'default': compatible_database}, CACHALOT_DATABASES=['default', 'secondary']): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, [error001]) with self.settings(CACHALOT_DATABASES='invalid value'): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, [error002]) def call_get_tables(self): qs = Test.objects.all() compiler_mock = MagicMock() compiler_mock.__cachalot_generated_sql = '' tables = _get_tables(qs.db, qs.query, compiler_mock) self.assertTrue(tables) return tables @override_settings(CACHALOT_FINAL_SQL_CHECK=True) @patch('cachalot.utils._get_tables_from_sql') def test_cachalot_final_sql_check_when_true(self, _get_tables_from_sql): _get_tables_from_sql.return_value = {'patched'} tables = self.call_get_tables() 
_get_tables_from_sql.assert_called_once() self.assertIn('patched', tables) @override_settings(CACHALOT_FINAL_SQL_CHECK=False) @patch('cachalot.utils._get_tables_from_sql') def test_cachalot_final_sql_check_when_false(self, _get_tables_from_sql): _get_tables_from_sql.return_value = {'patched'} tables = self.call_get_tables() _get_tables_from_sql.assert_not_called() self.assertNotIn('patched', tables) django-cachalot-2.8.0/cachalot/tests/signals.py000066400000000000000000000104641500004256100214670ustar00rootroot00000000000000from unittest import skipIf from django.conf import settings from django.contrib.auth.models import User from django.db import DEFAULT_DB_ALIAS, transaction from django.test import TransactionTestCase from ..api import invalidate from ..signals import post_invalidation from .models import Test class SignalsTestCase(TransactionTestCase): databases = set(settings.DATABASES.keys()) def test_table_invalidated(self): l = [] def receiver(sender, **kwargs): db_alias = kwargs['db_alias'] l.append((sender, db_alias)) post_invalidation.connect(receiver) self.assertListEqual(l, []) list(Test.objects.all()) self.assertListEqual(l, []) Test.objects.create(name='test1') self.assertListEqual(l, [('cachalot_test', DEFAULT_DB_ALIAS)]) post_invalidation.disconnect(receiver) del l[:] # Empties the list post_invalidation.connect(receiver, sender=User._meta.db_table) Test.objects.create(name='test2') self.assertListEqual(l, []) User.objects.create_user('user') self.assertListEqual(l, [('auth_user', DEFAULT_DB_ALIAS)]) post_invalidation.disconnect(receiver, sender=User._meta.db_table) def test_table_invalidated_in_transaction(self): """ Checks that the ``post_invalidation`` signal is triggered only after the end of a transaction. 
""" l = [] def receiver(sender, **kwargs): db_alias = kwargs['db_alias'] l.append((sender, db_alias)) post_invalidation.connect(receiver) self.assertListEqual(l, []) with transaction.atomic(): Test.objects.create(name='test1') self.assertListEqual(l, []) self.assertListEqual(l, [('cachalot_test', DEFAULT_DB_ALIAS)]) del l[:] # Empties the list self.assertListEqual(l, []) with transaction.atomic(): Test.objects.create(name='test2') with transaction.atomic(): Test.objects.create(name='test3') self.assertListEqual(l, []) self.assertListEqual(l, []) self.assertListEqual(l, [('cachalot_test', DEFAULT_DB_ALIAS)]) post_invalidation.disconnect(receiver) def test_table_invalidated_once_per_transaction_or_invalidate(self): """ Checks that the ``post_invalidation`` signal is triggered only after the end of a transaction. """ l = [] def receiver(sender, **kwargs): db_alias = kwargs['db_alias'] l.append((sender, db_alias)) post_invalidation.connect(receiver) self.assertListEqual(l, []) with transaction.atomic(): Test.objects.create(name='test1') self.assertListEqual(l, []) Test.objects.create(name='test2') self.assertListEqual(l, []) self.assertListEqual(l, [('cachalot_test', DEFAULT_DB_ALIAS)]) del l[:] # Empties the list self.assertListEqual(l, []) invalidate(Test, db_alias=DEFAULT_DB_ALIAS) self.assertListEqual(l, [('cachalot_test', DEFAULT_DB_ALIAS)]) del l[:] # Empties the list self.assertListEqual(l, []) with transaction.atomic(): invalidate(Test, db_alias=DEFAULT_DB_ALIAS) self.assertListEqual(l, []) self.assertListEqual(l, [('cachalot_test', DEFAULT_DB_ALIAS)]) post_invalidation.disconnect(receiver) @skipIf(len(settings.DATABASES) == 1, 'We can’t change the DB used since there’s only one configured') def test_table_invalidated_multi_db(self): db_alias2 = next(alias for alias in settings.DATABASES if alias != DEFAULT_DB_ALIAS) l = [] def receiver(sender, **kwargs): db_alias = kwargs['db_alias'] l.append((sender, db_alias)) post_invalidation.connect(receiver) 
        self.assertListEqual(l, [])
        Test.objects.using(DEFAULT_DB_ALIAS).create(name='test')
        self.assertListEqual(l, [('cachalot_test', DEFAULT_DB_ALIAS)])
        Test.objects.using(db_alias2).create(name='test')
        self.assertListEqual(l, [
            ('cachalot_test', DEFAULT_DB_ALIAS),
            ('cachalot_test', db_alias2)])

        post_invalidation.disconnect(receiver)


django-cachalot-2.8.0/cachalot/tests/test_utils.py

from django.core.management.color import no_style
from django.db import DEFAULT_DB_ALIAS, connection, connections, transaction
from django.test import TransactionTestCase
from django.test.utils import CaptureQueriesContext

from ..utils import _get_tables
from .models import PostgresModel


class TestUtilsMixin:
    def setUp(self):
        self.is_sqlite = connection.vendor == 'sqlite'
        self.is_mysql = connection.vendor == 'mysql'
        self.is_postgresql = connection.vendor == 'postgresql'
        self.force_reopen_connection()

    # TODO: Remove this workaround when this issue is fixed:
    # https://code.djangoproject.com/ticket/29494
    def tearDown(self):
        if connection.vendor == 'postgresql':
            flush_args = [no_style(), (PostgresModel._meta.db_table,)]
            flush_sql_list = connection.ops.sql_flush(*flush_args)
            with transaction.atomic():
                for sql in flush_sql_list:
                    with connection.cursor() as cursor:
                        cursor.execute(sql)

    def force_reopen_connection(self):
        if connection.vendor in ('mysql', 'postgresql'):
            # We need to reopen the connection or Django
            # will execute an extra SQL request below.
connection.cursor() def assert_tables(self, queryset, *tables): tables = {table if isinstance(table, str) else table._meta.db_table for table in tables} self.assertSetEqual(_get_tables(queryset.db, queryset.query), tables, str(queryset.query)) def assert_query_cached(self, queryset, result=None, result_type=None, compare_results=True, before=1, after=0): if result_type is None: result_type = list if result is None else type(result) with self.assertNumQueries(before): data1 = queryset.all() if result_type is list: data1 = list(data1) with self.assertNumQueries(after): data2 = queryset.all() if result_type is list: data2 = list(data2) if not compare_results: return assert_functions = { list: self.assertListEqual, set: self.assertSetEqual, dict: self.assertDictEqual, } assert_function = assert_functions.get(result_type, self.assertEqual) assert_function(data2, data1) if result is not None: assert_function(data2, result) class FilteredTransactionTestCase(TransactionTestCase): """ TransactionTestCase with assertNumQueries that ignores BEGIN, COMMIT and ROLLBACK queries. """ def assertNumQueries(self, num, func=None, *args, using=DEFAULT_DB_ALIAS, **kwargs): conn = connections[using] context = FilteredAssertNumQueriesContext(self, num, conn) if func is None: return context with context: func(*args, **kwargs) class FilteredAssertNumQueriesContext(CaptureQueriesContext): """ Capture queries and assert their number ignoring BEGIN, COMMIT and ROLLBACK queries. 
""" EXCLUDE = ('BEGIN', 'COMMIT', 'ROLLBACK') def __init__(self, test_case, num, connection): self.test_case = test_case self.num = num super().__init__(connection) def __exit__(self, exc_type, exc_value, traceback): super().__exit__(exc_type, exc_value, traceback) if exc_type is not None: return filtered_queries = [] excluded_queries = [] for q in self.captured_queries: if q['sql'].upper() not in self.EXCLUDE: filtered_queries.append(q) else: excluded_queries.append(q) executed = len(filtered_queries) self.test_case.assertEqual( executed, self.num, f"\n{executed} queries executed on {self.connection.vendor}, {self.num} expected\n" + "\nCaptured queries were:\n" + "".join( f"{i}. {query['sql']}\n" for i, query in enumerate(filtered_queries, start=1) ) + "\nCaptured queries, that were excluded:\n" + "".join( f"{i}. {query['sql']}\n" for i, query in enumerate(excluded_queries, start=1) ) ) django-cachalot-2.8.0/cachalot/tests/tests_decorators.py000066400000000000000000000024031500004256100234100ustar00rootroot00000000000000import logging from functools import wraps from django.core.cache import cache from django.test.utils import override_settings logger = logging.getLogger(__name__) def all_final_sql_checks(func): """ Runs test as two sub-tests: one with CACHALOT_FINAL_SQL_CHECK setting True, one with False """ @wraps(func) def wrapper(self, *args, **kwargs): for final_sql_check in (True, False): with self.subTest(msg=f'CACHALOT_FINAL_SQL_CHECK = {final_sql_check}'): with override_settings( CACHALOT_FINAL_SQL_CHECK=final_sql_check ): func(self, *args, **kwargs) cache.clear() return wrapper def no_final_sql_check(func): """ Runs test with CACHALOT_FINAL_SQL_CHECK = False """ @wraps(func) def wrapper(self, *args, **kwargs): with override_settings(CACHALOT_FINAL_SQL_CHECK=False): func(self, *args, **kwargs) return wrapper def with_final_sql_check(func): """ Runs test with CACHALOT_FINAL_SQL_CHECK = True """ @wraps(func) def wrapper(self, *args, **kwargs): with 
override_settings(CACHALOT_FINAL_SQL_CHECK=True): func(self, *args, **kwargs) return wrapper django-cachalot-2.8.0/cachalot/tests/thread_safety.py000066400000000000000000000072211500004256100226460ustar00rootroot00000000000000from threading import Thread from django.db import connection, transaction from django.test import skipUnlessDBFeature from .models import Test from .test_utils import TestUtilsMixin, FilteredTransactionTestCase class TestThread(Thread): def start_and_join(self): self.start() self.join() return self.t def run(self): self.t = Test.objects.first() connection.close() @skipUnlessDBFeature('test_db_allows_multiple_connections') class ThreadSafetyTestCase(TestUtilsMixin, FilteredTransactionTestCase): def test_concurrent_caching(self): t1 = TestThread().start_and_join() t = Test.objects.create(name='test') t2 = TestThread().start_and_join() self.assertEqual(t1, None) self.assertEqual(t2, t) def test_concurrent_caching_during_atomic(self): with self.assertNumQueries(1): with transaction.atomic(): t1 = TestThread().start_and_join() t = Test.objects.create(name='test') t2 = TestThread().start_and_join() self.assertEqual(t1, None) self.assertEqual(t2, None) with self.assertNumQueries(1): data = Test.objects.first() self.assertEqual(data, t) def test_concurrent_caching_before_and_during_atomic_1(self): t1 = TestThread().start_and_join() with self.assertNumQueries(1): with transaction.atomic(): t2 = TestThread().start_and_join() t = Test.objects.create(name='test') self.assertEqual(t1, None) self.assertEqual(t2, None) with self.assertNumQueries(1): data = Test.objects.first() self.assertEqual(data, t) def test_concurrent_caching_before_and_during_atomic_2(self): t1 = TestThread().start_and_join() with self.assertNumQueries(1): with transaction.atomic(): t = Test.objects.create(name='test') t2 = TestThread().start_and_join() self.assertEqual(t1, None) self.assertEqual(t2, None) with self.assertNumQueries(1): data = Test.objects.first() 
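The `TestThread.start_and_join` helper used in these thread-safety tests wraps the start/join/return-result dance in one call. The same pattern, stripped of Django, can be sketched as follows (the `ResultThread` class is an illustrative stand-in, not part of the test suite):

```python
from threading import Thread


class ResultThread(Thread):
    """Thread that stores its result on the instance, mirroring how
    TestThread.start_and_join returns the value computed in run()."""

    def __init__(self, func):
        super().__init__()
        self.func = func
        self.result = None

    def start_and_join(self):
        self.start()
        self.join()
        return self.result

    def run(self):
        # Runs in the spawned thread; the result is read after join().
        self.result = self.func()


value = ResultThread(lambda: 6 * 7).start_and_join()
assert value == 42
```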
self.assertEqual(data, t) def test_concurrent_caching_during_and_after_atomic_1(self): with self.assertNumQueries(1): with transaction.atomic(): t1 = TestThread().start_and_join() t = Test.objects.create(name='test') t2 = TestThread().start_and_join() self.assertEqual(t1, None) self.assertEqual(t2, t) with self.assertNumQueries(0): data = Test.objects.first() self.assertEqual(data, t) def test_concurrent_caching_during_and_after_atomic_2(self): with self.assertNumQueries(1): with transaction.atomic(): t = Test.objects.create(name='test') t1 = TestThread().start_and_join() t2 = TestThread().start_and_join() self.assertEqual(t1, None) self.assertEqual(t2, t) with self.assertNumQueries(0): data = Test.objects.first() self.assertEqual(data, t) def test_concurrent_caching_during_and_after_atomic_3(self): with self.assertNumQueries(1): with transaction.atomic(): t1 = TestThread().start_and_join() t = Test.objects.create(name='test') t2 = TestThread().start_and_join() t3 = TestThread().start_and_join() self.assertEqual(t1, None) self.assertEqual(t2, None) self.assertEqual(t3, t) with self.assertNumQueries(0): data = Test.objects.first() self.assertEqual(data, t) django-cachalot-2.8.0/cachalot/tests/transaction.py000066400000000000000000000175241500004256100223600ustar00rootroot00000000000000from cachalot.transaction import AtomicCache from django.contrib.auth.models import User from django.core.cache import cache from django.db import transaction, connection, IntegrityError from django.test import SimpleTestCase, skipUnlessDBFeature from .models import Test from .test_utils import TestUtilsMixin, FilteredTransactionTestCase class AtomicTestCase(TestUtilsMixin, FilteredTransactionTestCase): def test_successful_read_atomic(self): with self.assertNumQueries(1): with transaction.atomic(): data1 = list(Test.objects.all()) self.assertListEqual(data1, []) with self.assertNumQueries(0): data2 = list(Test.objects.all()) self.assertListEqual(data2, []) def 
test_unsuccessful_read_atomic(self): with self.assertNumQueries(1): try: with transaction.atomic(): data1 = list(Test.objects.all()) raise ZeroDivisionError except ZeroDivisionError: pass self.assertListEqual(data1, []) with self.assertNumQueries(1): data2 = list(Test.objects.all()) self.assertListEqual(data2, []) def test_successful_write_atomic(self): with self.assertNumQueries(1): data1 = list(Test.objects.all()) self.assertListEqual(data1, []) with self.assertNumQueries(1): with transaction.atomic(): t1 = Test.objects.create(name='test1') with self.assertNumQueries(1): data2 = list(Test.objects.all()) self.assertListEqual(data2, [t1]) with self.assertNumQueries(1): with transaction.atomic(): t2 = Test.objects.create(name='test2') with self.assertNumQueries(1): data3 = list(Test.objects.all()) self.assertListEqual(data3, [t1, t2]) with self.assertNumQueries(3): with transaction.atomic(): data4 = list(Test.objects.all()) t3 = Test.objects.create(name='test3') t4 = Test.objects.create(name='test4') data5 = list(Test.objects.all()) self.assertListEqual(data4, [t1, t2]) self.assertListEqual(data5, [t1, t2, t3, t4]) self.assertNotEqual(t4, t3) def test_unsuccessful_write_atomic(self): with self.assertNumQueries(1): data1 = list(Test.objects.all()) self.assertListEqual(data1, []) with self.assertNumQueries(1): try: with transaction.atomic(): Test.objects.create(name='test') raise ZeroDivisionError except ZeroDivisionError: pass with self.assertNumQueries(0): data2 = list(Test.objects.all()) self.assertListEqual(data2, []) with self.assertNumQueries(1): with self.assertRaises(Test.DoesNotExist): Test.objects.get(name='test') def test_cache_inside_atomic(self): with self.assertNumQueries(1): with transaction.atomic(): data1 = list(Test.objects.all()) data2 = list(Test.objects.all()) self.assertListEqual(data2, data1) self.assertListEqual(data2, []) def test_invalidation_inside_atomic(self): with self.assertNumQueries(3): with transaction.atomic(): data1 = 
list(Test.objects.all()) t = Test.objects.create(name='test') data2 = list(Test.objects.all()) self.assertListEqual(data1, []) self.assertListEqual(data2, [t]) def test_successful_nested_read_atomic(self): with self.assertNumQueries(6): with transaction.atomic(): list(Test.objects.all()) with transaction.atomic(): list(User.objects.all()) with self.assertNumQueries(2): with transaction.atomic(): list(User.objects.all()) with self.assertNumQueries(0): list(User.objects.all()) with self.assertNumQueries(0): list(Test.objects.all()) list(User.objects.all()) def test_unsuccessful_nested_read_atomic(self): with self.assertNumQueries(5): with transaction.atomic(): try: with transaction.atomic(): with self.assertNumQueries(1): list(Test.objects.all()) raise ZeroDivisionError except ZeroDivisionError: pass with self.assertNumQueries(1): list(Test.objects.all()) def test_successful_nested_write_atomic(self): with self.assertNumQueries(12): with transaction.atomic(): t1 = Test.objects.create(name='test1') with transaction.atomic(): t2 = Test.objects.create(name='test2') data1 = list(Test.objects.all()) self.assertListEqual(data1, [t1, t2]) with transaction.atomic(): t3 = Test.objects.create(name='test3') with transaction.atomic(): data2 = list(Test.objects.all()) self.assertListEqual(data2, [t1, t2, t3]) t4 = Test.objects.create(name='test4') data3 = list(Test.objects.all()) self.assertListEqual(data3, [t1, t2, t3, t4]) def test_unsuccessful_nested_write_atomic(self): with self.assertNumQueries(15): with transaction.atomic(): t1 = Test.objects.create(name='test1') try: with transaction.atomic(): t2 = Test.objects.create(name='test2') data1 = list(Test.objects.all()) self.assertListEqual(data1, [t1, t2]) raise ZeroDivisionError except ZeroDivisionError: pass data2 = list(Test.objects.all()) self.assertListEqual(data2, [t1]) try: with transaction.atomic(): t3 = Test.objects.create(name='test3') with transaction.atomic(): data2 = list(Test.objects.all()) 
self.assertListEqual(data2, [t1, t3]) raise ZeroDivisionError except ZeroDivisionError: pass with self.assertNumQueries(1): data3 = list(Test.objects.all()) self.assertListEqual(data3, [t1]) @skipUnlessDBFeature('can_defer_constraint_checks') def test_deferred_error(self): """ Checks that an error occurring during the end of a transaction has no impact on future queries. """ with connection.cursor() as cursor: cursor.execute( 'CREATE TABLE example (' 'id int UNIQUE DEFERRABLE INITIALLY DEFERRED);') with self.assertRaises(IntegrityError): with transaction.atomic(): with self.assertNumQueries(1): list(Test.objects.all()) cursor.execute( 'INSERT INTO example VALUES (1), (1);' '-- ' + Test._meta.db_table) # Should invalidate Test. with self.assertNumQueries(1): list(Test.objects.all()) class AtomicCacheTestCase(SimpleTestCase): def setUp(self): self.atomic_cache = AtomicCache(cache, 'db_alias') def test_set(self): self.assertDictEqual(self.atomic_cache, {}) self.atomic_cache.set('key', 'value', None) self.assertDictEqual(self.atomic_cache, {'key': 'value'}) django-cachalot-2.8.0/cachalot/tests/write.py000066400000000000000000001144361500004256100211650ustar00rootroot00000000000000from unittest import skipIf, skipUnless from django.contrib.auth.models import User, Permission, Group from django.contrib.contenttypes.models import ContentType from django.core.exceptions import MultipleObjectsReturned from django.core.management import call_command from django.db import ( connection, transaction, ProgrammingError, OperationalError) from django.db.models import Count from django.db.models.expressions import RawSQL from django.test import TransactionTestCase, skipUnlessDBFeature from .models import Test, TestParent, TestChild from .test_utils import TestUtilsMixin, FilteredTransactionTestCase class WriteTestCase(TestUtilsMixin, FilteredTransactionTestCase): """ Tests if every SQL request writing data is not cached and invalidates the implied data. 
""" def test_create(self): with self.assertNumQueries(1): data1 = list(Test.objects.all()) self.assertListEqual(data1, []) with self.assertNumQueries(1): t1 = Test.objects.create(name='test1') with self.assertNumQueries(1): t2 = Test.objects.create(name='test2') with self.assertNumQueries(1): data2 = list(Test.objects.all()) with self.assertNumQueries(1): t3 = Test.objects.create(name='test3') with self.assertNumQueries(1): data3 = list(Test.objects.all()) self.assertListEqual(data2, [t1, t2]) self.assertListEqual(data3, [t1, t2, t3]) with self.assertNumQueries(1): t3_copy = Test.objects.create(name='test3') self.assertNotEqual(t3_copy, t3) with self.assertNumQueries(1): data4 = list(Test.objects.all()) self.assertListEqual(data4, [t1, t2, t3, t3_copy]) def test_get_or_create(self): """ Tests if the ``SELECT`` query of a ``QuerySet.get_or_create`` is cached, but not the ``INSERT`` one. """ with self.assertNumQueries(1): data1 = list(Test.objects.all()) self.assertListEqual(data1, []) with self.assertNumQueries(2): t, created = Test.objects.get_or_create(name='test') self.assertTrue(created) with self.assertNumQueries(1): t_clone, created = Test.objects.get_or_create(name='test') self.assertFalse(created) self.assertEqual(t_clone, t) with self.assertNumQueries(0): t_clone, created = Test.objects.get_or_create(name='test') self.assertFalse(created) self.assertEqual(t_clone, t) with self.assertNumQueries(1): data2 = list(Test.objects.all()) self.assertListEqual(data2, [t]) def test_update_or_create(self): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), []) with self.assertNumQueries(4): t, created = Test.objects.update_or_create( name='test', defaults={'public': True}) self.assertTrue(created) self.assertEqual(t.name, 'test') self.assertEqual(t.public, True) with self.assertNumQueries(2): t, created = Test.objects.update_or_create( name='test', defaults={'public': False}) self.assertFalse(created) self.assertEqual(t.name, 'test') 
self.assertEqual(t.public, False) # The number of SQL queries doesn’t decrease because update_or_create # always calls an UPDATE, even when data wasn’t changed. with self.assertNumQueries(2): t, created = Test.objects.update_or_create( name='test', defaults={'public': False}) self.assertFalse(created) self.assertEqual(t.name, 'test') self.assertEqual(t.public, False) with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [t]) def test_bulk_create(self): with self.assertNumQueries(1): data1 = list(Test.objects.all()) self.assertListEqual(data1, []) with self.assertNumQueries(1): unsaved_tests = [Test(name='test%02d' % i) for i in range(1, 11)] Test.objects.bulk_create(unsaved_tests) self.assertEqual(Test.objects.count(), 10) with self.assertNumQueries(1): unsaved_tests = [Test(name='test%02d' % i) for i in range(1, 11)] Test.objects.bulk_create(unsaved_tests) self.assertEqual(Test.objects.count(), 20) with self.assertNumQueries(1): data2 = list(Test.objects.all()) self.assertEqual(len(data2), 20) self.assertListEqual([t.name for t in data2], ['test%02d' % (i // 2) for i in range(2, 22)]) def test_update(self): with self.assertNumQueries(1): t = Test.objects.create(name='test1') with self.assertNumQueries(1): t1 = Test.objects.get() with self.assertNumQueries(1): t.name = 'test2' t.save() with self.assertNumQueries(1): t2 = Test.objects.get() self.assertEqual(t1.name, 'test1') self.assertEqual(t2.name, 'test2') with self.assertNumQueries(1): Test.objects.update(name='test3') with self.assertNumQueries(1): t3 = Test.objects.get() self.assertEqual(t3.name, 'test3') def test_delete(self): with self.assertNumQueries(1): t1 = Test.objects.create(name='test1') with self.assertNumQueries(1): t2 = Test.objects.create(name='test2') with self.assertNumQueries(1): data1 = list(Test.objects.values_list('name', flat=True)) with self.assertNumQueries(1): t2.delete() with self.assertNumQueries(1): data2 = list(Test.objects.values_list('name', flat=True)) 
        self.assertListEqual(data1, [t1.name, t2.name])
        self.assertListEqual(data2, [t1.name])

        with self.assertNumQueries(1):
            Test.objects.bulk_create([Test(name='test%s' % i)
                                      for i in range(2, 11)])
        with self.assertNumQueries(1):
            self.assertEqual(Test.objects.count(), 10)
        with self.assertNumQueries(1):
            Test.objects.all().delete()
        with self.assertNumQueries(1):
            self.assertEqual(Test.objects.count(), 0)

    def test_invalidate_exists(self):
        with self.assertNumQueries(1):
            self.assertFalse(Test.objects.exists())

        Test.objects.create(name='test')

        with self.assertNumQueries(1):
            self.assertTrue(Test.objects.exists())

    def test_invalidate_count(self):
        with self.assertNumQueries(1):
            self.assertEqual(Test.objects.count(), 0)

        Test.objects.create(name='test1')

        with self.assertNumQueries(1):
            self.assertEqual(Test.objects.count(), 1)

        Test.objects.create(name='test2')

        with self.assertNumQueries(1):
            self.assertEqual(Test.objects.count(), 2)

    def test_invalidate_get(self):
        with self.assertNumQueries(1):
            with self.assertRaises(Test.DoesNotExist):
                Test.objects.get(name='test')

        Test.objects.create(name='test')

        with self.assertNumQueries(1):
            Test.objects.get(name='test')

        Test.objects.create(name='test')

        with self.assertNumQueries(1):
            with self.assertRaises(MultipleObjectsReturned):
                Test.objects.get(name='test')

    def test_invalidate_values(self):
        with self.assertNumQueries(1):
            data1 = list(Test.objects.values('name', 'public'))
        self.assertListEqual(data1, [])

        Test.objects.bulk_create([Test(name='test1'),
                                  Test(name='test2', public=True)])

        with self.assertNumQueries(1):
            data2 = list(Test.objects.values('name', 'public'))
        self.assertEqual(len(data2), 2)
        self.assertDictEqual(data2[0], {'name': 'test1', 'public': False})
        self.assertDictEqual(data2[1], {'name': 'test2', 'public': True})

        Test.objects.all()[0].delete()

        with self.assertNumQueries(1):
            data3 = list(Test.objects.values('name', 'public'))
        self.assertEqual(len(data3), 1)
        self.assertDictEqual(data3[0], {'name': 'test2', 'public': True})

    def
test_invalidate_foreign_key(self): with self.assertNumQueries(1): data1 = [t.owner.username for t in Test.objects.all() if t.owner] self.assertListEqual(data1, []) u1 = User.objects.create_user('user1') Test.objects.bulk_create([Test(name='test1', owner=u1), Test(name='test2')]) with self.assertNumQueries(2): data2 = [t.owner.username for t in Test.objects.all() if t.owner] self.assertListEqual(data2, ['user1']) Test.objects.create(name='test3') with self.assertNumQueries(1): data3 = [t.owner.username for t in Test.objects.all() if t.owner] self.assertListEqual(data3, ['user1']) t2 = Test.objects.get(name='test2') t2.owner = u1 t2.save() with self.assertNumQueries(1): data4 = [t.owner.username for t in Test.objects.all() if t.owner] self.assertListEqual(data4, ['user1', 'user1']) u2 = User.objects.create_user('user2') Test.objects.filter(name='test3').update(owner=u2) with self.assertNumQueries(3): data5 = [t.owner.username for t in Test.objects.all() if t.owner] self.assertListEqual(data5, ['user1', 'user1', 'user2']) User.objects.filter(username='user2').update(username='user3') with self.assertNumQueries(2): data6 = [t.owner.username for t in Test.objects.all() if t.owner] self.assertListEqual(data6, ['user1', 'user1', 'user3']) u2 = User.objects.create_user('user2') Test.objects.filter(name='test2').update(owner=u2) with self.assertNumQueries(4): data7 = [t.owner.username for t in Test.objects.all() if t.owner] self.assertListEqual(data7, ['user1', 'user2', 'user3']) with self.assertNumQueries(0): data8 = [t.owner.username for t in Test.objects.all() if t.owner] self.assertListEqual(data8, ['user1', 'user2', 'user3']) def test_invalidate_many_to_many(self): u = User.objects.create_user('test_user') ct = ContentType.objects.get_for_model(User) discuss = Permission.objects.create( name='Can discuss', content_type=ct, codename='discuss') touch = Permission.objects.create( name='Can touch', content_type=ct, codename='touch') cuddle = Permission.objects.create( 
name='Can cuddle', content_type=ct, codename='cuddle') u.user_permissions.add(discuss, touch, cuddle) with self.assertNumQueries(1): data1 = [p.codename for p in u.user_permissions.all()] self.assertListEqual(data1, ['cuddle', 'discuss', 'touch']) touch.name = 'Can lick' touch.codename = 'lick' touch.save() with self.assertNumQueries(1): data2 = [p.codename for p in u.user_permissions.all()] self.assertListEqual(data2, ['cuddle', 'discuss', 'lick']) Permission.objects.filter(pk=discuss.pk).update( name='Can finger', codename='finger') with self.assertNumQueries(1): data3 = [p.codename for p in u.user_permissions.all()] self.assertListEqual(data3, ['cuddle', 'finger', 'lick']) def test_invalidate_aggregate(self): with self.assertNumQueries(1): self.assertEqual(User.objects.aggregate(n=Count('test'))['n'], 0) with self.assertNumQueries(1): u = User.objects.create_user('test') with self.assertNumQueries(1): self.assertEqual(User.objects.aggregate(n=Count('test'))['n'], 0) with self.assertNumQueries(1): Test.objects.create(name='test1') with self.assertNumQueries(1): self.assertEqual(User.objects.aggregate(n=Count('test'))['n'], 0) with self.assertNumQueries(1): Test.objects.create(name='test2', owner=u) with self.assertNumQueries(1): self.assertEqual(User.objects.aggregate(n=Count('test'))['n'], 1) with self.assertNumQueries(1): Test.objects.create(name='test3') with self.assertNumQueries(1): self.assertEqual(User.objects.aggregate(n=Count('test'))['n'], 1) def test_invalidate_annotate(self): with self.assertNumQueries(1): data1 = list(User.objects.annotate(n=Count('test')).order_by('pk')) self.assertListEqual(data1, []) with self.assertNumQueries(1): Test.objects.create(name='test1') with self.assertNumQueries(1): data2 = list(User.objects.annotate(n=Count('test')).order_by('pk')) self.assertListEqual(data2, []) with self.assertNumQueries(2): user1 = User.objects.create_user('user1') user2 = User.objects.create_user('user2') with self.assertNumQueries(1): data3 = 
list(User.objects.annotate(n=Count('test')).order_by('pk')) self.assertListEqual(data3, [user1, user2]) self.assertListEqual([u.n for u in data3], [0, 0]) with self.assertNumQueries(1): Test.objects.create(name='test2', owner=user1) with self.assertNumQueries(1): data4 = list(User.objects.annotate(n=Count('test')).order_by('pk')) self.assertListEqual(data4, [user1, user2]) self.assertListEqual([u.n for u in data4], [1, 0]) with self.assertNumQueries(1): Test.objects.bulk_create([ Test(name='test3', owner=user1), Test(name='test4', owner=user2), Test(name='test5', owner=user1), Test(name='test6', owner=user2), ]) with self.assertNumQueries(1): data5 = list(User.objects.annotate(n=Count('test')).order_by('pk')) self.assertListEqual(data5, [user1, user2]) self.assertListEqual([u.n for u in data5], [3, 2]) def test_invalidate_subquery(self): with self.assertNumQueries(1): data1 = list(Test.objects.filter(owner__in=User.objects.all())) self.assertListEqual(data1, []) u = User.objects.create_user('test') with self.assertNumQueries(1): data2 = list(Test.objects.filter(owner__in=User.objects.all())) self.assertListEqual(data2, []) t = Test.objects.create(name='test', owner=u) with self.assertNumQueries(1): data3 = list(Test.objects.filter(owner__in=User.objects.all())) self.assertListEqual(data3, [t]) with self.assertNumQueries(1): data4 = list( Test.objects.filter( owner__groups__permissions__in=Permission.objects.all() ).distinct()) self.assertListEqual(data4, []) g = Group.objects.create(name='test_group') with self.assertNumQueries(1): data5 = list( Test.objects.filter( owner__groups__permissions__in=Permission.objects.all() ).distinct()) self.assertListEqual(data5, []) p = Permission.objects.first() g.permissions.add(p) with self.assertNumQueries(1): data6 = list( Test.objects.filter( owner__groups__permissions__in=Permission.objects.all() ).distinct()) self.assertListEqual(data6, []) u.groups.add(g) with self.assertNumQueries(1): data7 = list( Test.objects.filter( 
owner__groups__permissions__in=Permission.objects.all() ).distinct()) self.assertListEqual(data7, [t]) with self.assertNumQueries(1): data8 = list( User.objects.filter(user_permissions__in=g.permissions.all()) ) self.assertListEqual(data8, []) u.user_permissions.add(p) with self.assertNumQueries(1): data9 = list( User.objects.filter(user_permissions__in=g.permissions.all()) ) self.assertListEqual(data9, [u]) g.permissions.remove(p) with self.assertNumQueries(1): data10 = list( User.objects.filter(user_permissions__in=g.permissions.all()) ) self.assertListEqual(data10, []) with self.assertNumQueries(1): data11 = list(User.objects.exclude(user_permissions=None)) self.assertListEqual(data11, [u]) u.user_permissions.clear() with self.assertNumQueries(1): data12 = list(User.objects.exclude(user_permissions=None)) self.assertListEqual(data12, []) def test_invalidate_nested_subqueries(self): with self.assertNumQueries(1): data1 = list( User.objects.filter( pk__in=User.objects.filter( user_permissions__in=Permission.objects.all() ) ) ) self.assertListEqual(data1, []) u = User.objects.create_user('test') with self.assertNumQueries(1): data2 = list( User.objects.filter( pk__in=User.objects.filter( user_permissions__in=Permission.objects.all() ) ) ) self.assertListEqual(data2, []) p = Permission.objects.first() u.user_permissions.add(p) with self.assertNumQueries(1): data3 = list( User.objects.filter( pk__in=User.objects.filter( user_permissions__in=Permission.objects.all() ) ) ) self.assertListEqual(data3, [u]) with self.assertNumQueries(1): data4 = list( User.objects.filter( pk__in=User.objects.filter( pk__in=User.objects.filter( user_permissions__in=Permission.objects.all() ) ) ) ) self.assertListEqual(data4, [u]) u.user_permissions.remove(p) with self.assertNumQueries(1): data5 = list( User.objects.filter( pk__in=User.objects.filter( pk__in=User.objects.filter( user_permissions__in=Permission.objects.all() ) ) ) ) self.assertListEqual(data5, []) def 
test_invalidate_raw_subquery(self): permission = Permission.objects.first() with self.assertNumQueries(0): raw_sql = RawSQL('SELECT id FROM auth_permission WHERE id = %s', (permission.pk,)) with self.assertNumQueries(1): data1 = list(Test.objects.filter(permission=raw_sql)) self.assertListEqual(data1, []) test = Test.objects.create(name='test', permission=permission) with self.assertNumQueries(1): data2 = list(Test.objects.filter(permission=raw_sql)) self.assertListEqual(data2, [test]) permission.save() with self.assertNumQueries(1): data3 = list(Test.objects.filter(permission=raw_sql)) self.assertListEqual(data3, [test]) test.delete() with self.assertNumQueries(1): data4 = list(Test.objects.filter(permission=raw_sql)) self.assertListEqual(data4, []) def test_invalidate_nested_raw_subquery(self): permission = Permission.objects.first() with self.assertNumQueries(0): raw_sql = RawSQL('SELECT id FROM auth_permission WHERE id = %s', (permission.pk,)) with self.assertNumQueries(1): data1 = list(Test.objects.filter( pk__in=Test.objects.filter(permission=raw_sql))) self.assertListEqual(data1, []) test = Test.objects.create(name='test', permission=permission) with self.assertNumQueries(1): data2 = list(Test.objects.filter( pk__in=Test.objects.filter(permission=raw_sql))) self.assertListEqual(data2, [test]) permission.save() with self.assertNumQueries(1): data3 = list(Test.objects.filter( pk__in=Test.objects.filter(permission=raw_sql))) self.assertListEqual(data3, [test]) test.delete() with self.assertNumQueries(1): data4 = list(Test.objects.filter( pk__in=Test.objects.filter(permission=raw_sql))) self.assertListEqual(data4, []) def test_invalidate_select_related(self): with self.assertNumQueries(1): data1 = list(Test.objects.select_related('owner')) self.assertListEqual(data1, []) with self.assertNumQueries(2): u1 = User.objects.create_user('test1') u2 = User.objects.create_user('test2') with self.assertNumQueries(1): data2 = list(Test.objects.select_related('owner')) 
self.assertListEqual(data2, []) with self.assertNumQueries(1): Test.objects.bulk_create([ Test(name='test1', owner=u1), Test(name='test2', owner=u2), Test(name='test3', owner=u2), Test(name='test4', owner=u1), ]) with self.assertNumQueries(1): data3 = list(Test.objects.select_related('owner')) self.assertEqual(data3[0].owner, u1) self.assertEqual(data3[1].owner, u2) self.assertEqual(data3[2].owner, u2) self.assertEqual(data3[3].owner, u1) with self.assertNumQueries(1): Test.objects.filter(name__in=['test1', 'test2']).delete() with self.assertNumQueries(1): data4 = list(Test.objects.select_related('owner')) self.assertEqual(data4[0].owner, u2) self.assertEqual(data4[1].owner, u1) def test_invalidate_prefetch_related(self): with self.assertNumQueries(1): data1 = list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) self.assertListEqual(data1, []) with self.assertNumQueries(1): t1 = Test.objects.create(name='test1') with self.assertNumQueries(1): data2 = list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) self.assertListEqual(data2, [t1]) self.assertEqual(data2[0].owner, None) with self.assertNumQueries(2): u = User.objects.create_user('user') t1.owner = u t1.save() with self.assertNumQueries(2): data3 = list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) self.assertListEqual(data3, [t1]) self.assertEqual(data3[0].owner, u) self.assertListEqual(list(data3[0].owner.groups.all()), []) with self.assertNumQueries(4): group = Group.objects.create(name='test_group') permissions = list(Permission.objects.all()[:5]) group.permissions.add(*permissions) u.groups.add(group) with self.assertNumQueries(2): data4 = list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) self.assertListEqual(data4, [t1]) owner = data4[0].owner self.assertEqual(owner, u) groups = list(owner.groups.all()) self.assertListEqual(groups, [group]) 
self.assertListEqual(list(groups[0].permissions.all()), permissions) with self.assertNumQueries(1): t2 = Test.objects.create(name='test2') with self.assertNumQueries(1): data5 = list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) self.assertListEqual(data5, [t1, t2]) owners = [t.owner for t in data5 if t.owner is not None] self.assertListEqual(owners, [u]) groups = [g for o in owners for g in o.groups.all()] self.assertListEqual(groups, [group]) data5_permissions = [p for g in groups for p in g.permissions.all()] self.assertListEqual(data5_permissions, permissions) with self.assertNumQueries(1): permissions[0].save() with self.assertNumQueries(1): list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) with self.assertNumQueries(1): group.name = 'modified_test_group' group.save() with self.assertNumQueries(2): data6 = list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) g = list(data6[0].owner.groups.all())[0] self.assertEqual(g.name, 'modified_test_group') with self.assertNumQueries(1): User.objects.update(username='modified_user') with self.assertNumQueries(2): data7 = list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) self.assertEqual(data7[0].owner.username, 'modified_user') @skipUnlessDBFeature('has_select_for_update') def test_invalidate_select_for_update(self): with self.assertNumQueries(1): Test.objects.bulk_create([Test(name='test1'), Test(name='test2')]) with self.assertNumQueries(1): with transaction.atomic(): data1 = list(Test.objects.select_for_update()) self.assertListEqual([t.name for t in data1], ['test1', 'test2']) with self.assertNumQueries(1): with transaction.atomic(): qs = Test.objects.select_for_update() qs.update(name='test3') with self.assertNumQueries(1): with transaction.atomic(): data2 = list(Test.objects.select_for_update()) self.assertListEqual([t.name for t in data2], ['test3'] * 2) def 
test_invalidate_extra_select(self): user = User.objects.create_user('user') t1 = Test.objects.create(name='test1', owner=user, public=True) username_length_sql = """ SELECT LENGTH(%(user_table)s.username) FROM %(user_table)s WHERE %(user_table)s.id = %(test_table)s.owner_id """ % {'user_table': User._meta.db_table, 'test_table': Test._meta.db_table} with self.assertNumQueries(1): data1 = list(Test.objects.extra( select={'username_length': username_length_sql})) self.assertListEqual(data1, [t1]) self.assertListEqual([o.username_length for o in data1], [4]) Test.objects.update(public=False) with self.assertNumQueries(1): data2 = list(Test.objects.extra( select={'username_length': username_length_sql})) self.assertListEqual(data2, [t1]) self.assertListEqual([o.username_length for o in data2], [4]) admin = User.objects.create_superuser('admin', 'admin@test.me', 'password') with self.assertNumQueries(1): data3 = list(Test.objects.extra( select={'username_length': username_length_sql})) self.assertListEqual(data3, [t1]) self.assertListEqual([o.username_length for o in data3], [4]) t2 = Test.objects.create(name='test2', owner=admin) with self.assertNumQueries(1): data4 = list(Test.objects.extra( select={'username_length': username_length_sql})) self.assertListEqual(data4, [t1, t2]) self.assertListEqual([o.username_length for o in data4], [4, 5]) def test_invalidate_having(self): def _query(): return User.objects.annotate(n=Count('user_permissions')).filter(n__gte=1) with self.assertNumQueries(1): data1 = list(_query()) self.assertListEqual(data1, []) u = User.objects.create_user('user') with self.assertNumQueries(1): data2 = list(_query()) self.assertListEqual(data2, []) p = Permission.objects.first() p.save() with self.assertNumQueries(1): data3 = list(_query()) self.assertListEqual(data3, []) u.user_permissions.add(p) with self.assertNumQueries(1): data3 = list(_query()) self.assertListEqual(data3, [u]) with self.assertNumQueries(1): self.assertEqual(_query().count(), 
1) u.user_permissions.clear() with self.assertNumQueries(1): self.assertEqual(_query().count(), 0) def test_invalidate_extra_where(self): sql_condition = ("owner_id IN " "(SELECT id FROM auth_user WHERE username = 'admin')") with self.assertNumQueries(1): data1 = list(Test.objects.extra(where=[sql_condition])) self.assertListEqual(data1, []) admin = User.objects.create_superuser('admin', 'admin@test.me', 'password') with self.assertNumQueries(1): data2 = list(Test.objects.extra(where=[sql_condition])) self.assertListEqual(data2, []) t = Test.objects.create(name='test', owner=admin) with self.assertNumQueries(1): data3 = list(Test.objects.extra(where=[sql_condition])) self.assertListEqual(data3, [t]) admin.username = 'modified' admin.save() with self.assertNumQueries(1): data4 = list(Test.objects.extra(where=[sql_condition])) self.assertListEqual(data4, []) def test_invalidate_extra_tables(self): with self.assertNumQueries(1): User.objects.create_user('user1') with self.assertNumQueries(1): data1 = list(Test.objects.all().extra(tables=['auth_user'])) self.assertListEqual(data1, []) with self.assertNumQueries(1): t1 = Test.objects.create(name='test1') with self.assertNumQueries(1): data2 = list(Test.objects.all().extra(tables=['auth_user'])) self.assertListEqual(data2, [t1]) with self.assertNumQueries(1): t2 = Test.objects.create(name='test2') with self.assertNumQueries(1): data3 = list(Test.objects.all().extra(tables=['auth_user'])) self.assertListEqual(data3, [t1, t2]) with self.assertNumQueries(1): User.objects.create_user('user2') with self.assertNumQueries(1): data4 = list(Test.objects.all().extra(tables=['auth_user'])) self.assertListEqual(data4, [t1, t1, t2, t2]) def test_invalidate_extra_order_by(self): with self.assertNumQueries(1): data1 = list(Test.objects.extra(order_by=['-cachalot_test.name'])) self.assertListEqual(data1, []) t1 = Test.objects.create(name='test1') with self.assertNumQueries(1): data2 = 
list(Test.objects.extra(order_by=['-cachalot_test.name'])) self.assertListEqual(data2, [t1]) t2 = Test.objects.create(name='test2') with self.assertNumQueries(1): data2 = list(Test.objects.extra(order_by=['-cachalot_test.name'])) self.assertListEqual(data2, [t2, t1]) def test_invalidate_table_inheritance(self): with self.assertNumQueries(1): with self.assertRaises(TestChild.DoesNotExist): TestChild.objects.get() with self.assertNumQueries(2): t_child = TestChild.objects.create(name='test_child') with self.assertNumQueries(1): self.assertEqual(TestChild.objects.get(), t_child) with self.assertNumQueries(1): TestParent.objects.filter(pk=t_child.pk).update(name='modified') with self.assertNumQueries(1): modified_t_child = TestChild.objects.get() self.assertEqual(modified_t_child.pk, t_child.pk) self.assertEqual(modified_t_child.name, 'modified') with self.assertNumQueries(2): TestChild.objects.filter(pk=t_child.pk).update(name='modified2') with self.assertNumQueries(1): modified2_t_child = TestChild.objects.get() self.assertEqual(modified2_t_child.pk, t_child.pk) self.assertEqual(modified2_t_child.name, 'modified2') def test_raw_insert(self): with self.assertNumQueries(1): self.assertListEqual( list(Test.objects.values_list('name', flat=True)), []) with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute( "INSERT INTO cachalot_test (name, public) " "VALUES ('test1', %s)", [True]) with self.assertNumQueries(1): self.assertListEqual( list(Test.objects.values_list('name', flat=True)), ['test1']) with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute( "INSERT INTO cachalot_test (name, public) " "VALUES ('test2', %s)", [True]) with self.assertNumQueries(1): self.assertListEqual( list(Test.objects.values_list('name', flat=True)), ['test1', 'test2']) with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.executemany( "INSERT INTO cachalot_test (name, public) " "VALUES ('test3', %s)", [[True]]) with 
self.assertNumQueries(1): self.assertListEqual( list(Test.objects.values_list('name', flat=True)), ['test1', 'test2', 'test3']) def test_raw_update(self): with self.assertNumQueries(1): Test.objects.create(name='test') with self.assertNumQueries(1): self.assertListEqual( list(Test.objects.values_list('name', flat=True)), ['test']) with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute("UPDATE cachalot_test SET name = 'new name';") with self.assertNumQueries(1): self.assertListEqual( list(Test.objects.values_list('name', flat=True)), ['new name']) def test_raw_delete(self): with self.assertNumQueries(1): Test.objects.create(name='test') with self.assertNumQueries(1): self.assertListEqual( list(Test.objects.values_list('name', flat=True)), ['test']) with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute('DELETE FROM cachalot_test;') with self.assertNumQueries(1): self.assertListEqual( list(Test.objects.values_list('name', flat=True)), []) def test_raw_create(self): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), []) try: with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute( 'CREATE INDEX tmp_index ON cachalot_test(name);') with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), []) finally: with connection.cursor() as cursor: cursor.execute('DROP INDEX tmp_index ON cachalot_test;' if self.is_mysql else 'DROP INDEX tmp_index;') @skipIf(connection.vendor == 'sqlite', 'SQLite does not support column drop, ' 'making it hard to test this.') def test_raw_alter(self): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), []) try: with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute( 'ALTER TABLE cachalot_test ADD COLUMN tmp INTEGER;') with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), []) finally: with connection.cursor() as cursor: cursor.execute('ALTER TABLE 
cachalot_test DROP COLUMN tmp;') @skipUnless( connection.vendor == 'postgresql', 'SQLite & MySQL do not revert schema changes in a transaction, ' 'making it hard to test this.') @transaction.atomic def test_raw_drop(self): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), []) with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute('DROP TABLE cachalot_test;') # The table no longer exists, so an error should be raised # after querying it. with self.assertRaises((ProgrammingError, OperationalError)): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), []) class DatabaseCommandTestCase(TestUtilsMixin, TransactionTestCase): def setUp(self): self.t = Test.objects.create(name='test1') def test_flush(self): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [self.t]) call_command('flush', verbosity=0, interactive=False) self.force_reopen_connection() with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), []) def test_loaddata(self): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [self.t]) call_command('loaddata', 'cachalot/tests/loaddata_fixture.json', verbosity=0) self.force_reopen_connection() with self.assertNumQueries(1): self.assertListEqual([t.name for t in Test.objects.all()], ['test1', 'test2']) django-cachalot-2.8.0/cachalot/transaction.py000066400000000000000000000021621500004256100212060ustar00rootroot00000000000000from .settings import cachalot_settings class AtomicCache(dict): def __init__(self, parent_cache, db_alias): super().__init__() self.parent_cache = parent_cache self.db_alias = db_alias self.to_be_invalidated = set() def set(self, k, v, timeout): self[k] = v def get_many(self, keys): data = {k: self[k] for k in keys if k in self} missing_keys = set(keys) missing_keys.difference_update(data) data.update(self.parent_cache.get_many(missing_keys)) return data def set_many(self, data, timeout): 
self.update(data) def commit(self): # We import this here to avoid a circular import issue. from .utils import _invalidate_tables if self: self.parent_cache.set_many( self, cachalot_settings.CACHALOT_TIMEOUT) # The previous `set_many` is not enough. The parent cache needs to be # invalidated in case another transaction occurred in the meantime. _invalidate_tables(self.parent_cache, self.db_alias, self.to_be_invalidated) django-cachalot-2.8.0/cachalot/utils.py000066400000000000000000000250441500004256100200250ustar00rootroot00000000000000import datetime from decimal import Decimal from hashlib import sha1 from time import time from typing import TYPE_CHECKING from uuid import UUID from django.contrib.postgres.functions import TransactionNow from django.db.models import Exists, QuerySet, Subquery from django.db.models.enums import Choices from django.db.models.expressions import RawSQL from django.db.models.functions import Now from django.db.models.sql import Query, AggregateQuery from django.db.models.sql.where import ExtraWhere, WhereNode, NothingNode from .settings import ITERABLES, cachalot_settings from .transaction import AtomicCache if TYPE_CHECKING: from django.db.models.expressions import BaseExpression class UncachableQuery(Exception): pass class IsRawQuery(Exception): pass CACHABLE_PARAM_TYPES = { bool, int, float, Decimal, bytearray, bytes, str, type(None), datetime.date, datetime.time, datetime.datetime, datetime.timedelta, UUID, } UNCACHABLE_FUNCS = {Now, TransactionNow} try: from psycopg.dbapi20 import Binary from django.db.backends.postgresql.psycopg_any import ( NumericRange, DateRange, DateTimeRange, DateTimeTZRange, Inet, ) from psycopg.types.numeric import ( Int2, Int4, Int8, Float4, Float8, ) from ipaddress import ( IPv4Address, IPv6Address, ) from psycopg.types.json import ( Json, Jsonb, ) CACHABLE_PARAM_TYPES.update(( NumericRange, DateRange, DateTimeRange, DateTimeTZRange, Inet, Json, Jsonb, Int2, Int4, Int8, Float4, Float8, IPv4Address, 
IPv6Address, Binary, )) except ImportError: try: from psycopg2 import Binary from psycopg2.extras import ( NumericRange, DateRange, DateTimeRange, DateTimeTZRange, Inet, Json) CACHABLE_PARAM_TYPES.update(( Binary, NumericRange, DateRange, DateTimeRange, DateTimeTZRange, Inet, Json,)) except ImportError: pass def check_parameter_types(params): for p in params: cl = p.__class__ if cl not in CACHABLE_PARAM_TYPES: if cl in ITERABLES: check_parameter_types(p) elif cl is dict: check_parameter_types(p.items()) elif issubclass(cl, Choices): # Handle the case where a parameter from a Choices field is passed. # Django Choices use enum.unique() so the values are guaranteed to be unique. # Dereference the true "value" and verify that it is cachable. check_parameter_types([p.value]) else: raise UncachableQuery def get_query_cache_key(compiler): """ Generates a cache key from a SQLCompiler. This cache key is specific to the SQL query and its context (which database is used). The same query in the same context (= the same database) must generate the same cache key. :arg compiler: A SQLCompiler that will generate the SQL query :type compiler: django.db.models.sql.compiler.SQLCompiler :return: A cache key :rtype: int """ sql, params = compiler.as_sql() check_parameter_types(params) cache_key = '%s:%s:%s' % (compiler.using, sql, [str(p) for p in params]) # Set attribute on compiler for later access # to the generated SQL. This prevents another as_sql() call! compiler.__cachalot_generated_sql = sql.lower() return sha1(cache_key.encode('utf-8')).hexdigest() def get_table_cache_key(db_alias, table): """ Generates a cache key from a SQL table. 
:arg db_alias: Alias of the used database :type db_alias: str or unicode :arg table: Name of the SQL table :type table: str or unicode :return: A cache key :rtype: int """ cache_key = '%s:%s' % (db_alias, table) return sha1(cache_key.encode('utf-8')).hexdigest() def _get_tables_from_sql(connection, lowercased_sql, enable_quote: bool = False): """Returns names of involved tables after analyzing the final SQL query.""" return {table for table in (connection.introspection.django_table_names() + cachalot_settings.CACHALOT_ADDITIONAL_TABLES) if _quote_table_name(table, connection, enable_quote) in lowercased_sql} def _quote_table_name(table_name, connection, enable_quote: bool): """ Returns quoted table name. Put database-specific quotation marks around the table name to preven that tables with substrings of the table are considered. E.g. cachalot_testparent must not return cachalot_test. """ return f'{connection.ops.quote_name(table_name)}' \ if enable_quote else table_name def _find_rhs_lhs_subquery(side): h_class = side.__class__ if h_class is Query: return side elif h_class is QuerySet: return side.query elif h_class in (Subquery, Exists): # Subquery allows QuerySet & Query return side.query.query if side.query.__class__ is QuerySet else side.query elif h_class in UNCACHABLE_FUNCS: raise UncachableQuery def _find_subqueries_in_where(children): for child in children: child_class = child.__class__ if child_class is WhereNode: for grand_child in _find_subqueries_in_where(child.children): yield grand_child elif child_class is ExtraWhere: raise IsRawQuery elif child_class is NothingNode: pass else: try: child_rhs = child.rhs child_lhs = child.lhs except AttributeError: raise UncachableQuery rhs = _find_rhs_lhs_subquery(child_rhs) if rhs is not None: yield rhs lhs = _find_rhs_lhs_subquery(child_lhs) if lhs is not None: yield lhs def is_cachable(table): whitelist = cachalot_settings.CACHALOT_ONLY_CACHABLE_TABLES if whitelist and table not in whitelist: return False return 
table not in cachalot_settings.CACHALOT_UNCACHABLE_TABLES def are_all_cachable(tables): whitelist = cachalot_settings.CACHALOT_ONLY_CACHABLE_TABLES if whitelist and not tables.issubset(whitelist): return False return tables.isdisjoint(cachalot_settings.CACHALOT_UNCACHABLE_TABLES) def filter_cachable(tables): whitelist = cachalot_settings.CACHALOT_ONLY_CACHABLE_TABLES tables = tables.difference(cachalot_settings.CACHALOT_UNCACHABLE_TABLES) if whitelist: return tables.intersection(whitelist) return tables def _flatten(expression: 'BaseExpression'): """ Recursively yield this expression and all subexpressions, in depth-first order. Taken from Django 3.2 as the previous Django versions don’t check for existence of flatten. """ yield expression for expr in expression.get_source_expressions(): if expr: if hasattr(expr, 'flatten'): yield from _flatten(expr) else: yield expr def _get_tables(db_alias, query, compiler=False): from django.db import connections if query.select_for_update or ( not cachalot_settings.CACHALOT_CACHE_RANDOM and '?' in query.order_by): raise UncachableQuery try: if query.extra_select: raise IsRawQuery # Gets all tables already found by the ORM. tables = set(query.table_map) if query.get_meta(): tables.add(query.get_meta().db_table) # Gets tables in subquery annotations. for annotation in query.annotations.values(): if type(annotation) in UNCACHABLE_FUNCS: raise UncachableQuery for expression in _flatten(annotation): if isinstance(expression, Subquery): # Django 2.2 only: no query, only queryset if not hasattr(expression, 'query'): tables.update(_get_tables(db_alias, expression.queryset.query)) # Django 3+ else: tables.update(_get_tables(db_alias, expression.query)) elif isinstance(expression, RawSQL): sql = expression.as_sql(None, None)[0].lower() tables.update(_get_tables_from_sql(connections[db_alias], sql)) # Gets tables in WHERE subqueries. 
        for subquery in _find_subqueries_in_where(query.where.children):
            tables.update(_get_tables(db_alias, subquery))

        # Gets tables in HAVING subqueries.
        if isinstance(query, AggregateQuery):
            try:
                tables.update(
                    _get_tables_from_sql(connections[db_alias],
                                         query.subquery))
            except TypeError:  # For Django 3.2+
                tables.update(_get_tables(db_alias, query.inner_query))

        # Gets tables in combined queries
        # using `.union`, `.intersection`, or `.difference`.
        if query.combined_queries:
            for combined_query in query.combined_queries:
                tables.update(_get_tables(db_alias, combined_query))
    except IsRawQuery:
        sql = query.get_compiler(db_alias).as_sql()[0].lower()
        tables = _get_tables_from_sql(connections[db_alias], sql)
    else:
        # Additional check of the final SQL.
        # Potentially overlooked tables are added here. Tables may be
        # overlooked by the regular checks as not all expressions are
        # handled yet. This final check acts as a safety net.
        if cachalot_settings.CACHALOT_FINAL_SQL_CHECK:
            if compiler:
                # Access generated SQL stored when caching the query!
sql = compiler.__cachalot_generated_sql else: sql = query.get_compiler(db_alias).as_sql()[0].lower() final_check_tables = _get_tables_from_sql(connections[db_alias], sql, enable_quote=True) tables.update(final_check_tables) if not are_all_cachable(tables): raise UncachableQuery return tables def _get_table_cache_keys(compiler): db_alias = compiler.using get_table_cache_key = cachalot_settings.CACHALOT_TABLE_KEYGEN return [get_table_cache_key(db_alias, t) for t in _get_tables(db_alias, compiler.query, compiler)] def _invalidate_tables(cache, db_alias, tables): tables = filter_cachable(set(tables)) if not tables: return now = time() get_table_cache_key = cachalot_settings.CACHALOT_TABLE_KEYGEN cache.set_many( {get_table_cache_key(db_alias, t): now for t in tables}, cachalot_settings.CACHALOT_TIMEOUT) if isinstance(cache, AtomicCache): cache.to_be_invalidated.update(tables) django-cachalot-2.8.0/django-cachalot.jpg000066400000000000000000003454551500004256100202700ustar00rootroot00000000000000JFIFHH;CREATOR: gd-jpeg v1.0 (using IJG JPEG v62), quality = 95 C     C  W  v!o 7B ̱8ZAO̩\Ig"Rlbyii'b3 WlEr\kY[5S ^QE̓;;MkhL6R̛iwэդDhNvl+>]:ͣ&/4XV ٞCUiSMSIJK3(G]EU^d-gPc0Ð46Vźpd$mOZcI.%学Lf.* 2e0ì3WGU:%R;;R$CTez%j-,2qڛVŀ՘!3^EټU2)I1Z6ܶH!v#yIu3JtX2PB'ȶ2t޽ˠW$D[`CXu6-MeC _ euG'fNym+= W\ܭԻkPpعUnƲ nߛыp5LMZr8z?>;oArS斚Fe21,g)$XD]jkԙnHBVLFDfgV1Fr푌z*ݰ4IѼs6gm= 29?5]8T6Yz3Eg=f^Ϧ9[Zl|ubϵѫ+0!Tr{IYi,[FE =t-;da%޵aeW=Uml:&Lbۥ`R\]s^̤]Tժ5-72Ͳlg9E6ޚX{tL˫wV8|ٶ,ۑnT^՚me)瓣rtdRݜ~}ic,q*tgIheFxֹoL2h}ZưZ3L\V[(=LqRբ`-EfhF.WQn_Z٪7s<[h$J5YS}4iUzi] ZyO=.;y ׇFEXNVӺ9=MWȫB0e{“rѓwf;zk)a]A'2^&p؅kD'Bhtۧˑusv1 ؤKz4Ռ&Pf:= fZF,J]ZC0Z$96:n}7jiL]g+ּ6wcҽ4Ӟ^"} Խi&3Z} |}K(քJ\d9LH9(fxEshMn+-mW-q)V[;9 aԫ(M%XF~b'?ժ=6o\ͮ3iw7f&^uVގHL;MS|hA 2CԻp, !Xa`a(HMjۚi* ]ճtPZlVg,UiͲ3G/yt~ߝՔ^К6[[M+^e|eV~bEjXJttCN:~f]&yä絵 ZH1> WN PZ`rМ2 8& Bp1 ' #2+ӏٸ*t}$aG=KJw&Fd+:S& &#b$ iVj7K+]k7& n"uo /XQgmEewOc0-e p= CЀ2B0p,IL^NzWOfSÛTHн̺ZlVnȔIӕB| [q,\zzOKe%U=ugVKr 橔\]P2]Q<. 
django-cachalot-2.8.0/django-cachalot.jpg  (binary JPEG image; contents omitted)
7&e-kkOtITSa ]y-6A&{~TL+cHTsN5a^j˨)bzɃ׬?g1T&57 /!<Օh|6=pItnncG=TFqa甕̄ K~Tp52A"s%ow RE9hK"@6RoBVm&^m[!I<ObVكQw xhpVnjKJuoT>1+(_vmd!Rir}FjuBJTmnbJ<2╼?:1 n~D4|c6oT/}mxKN¤-Fdp}j$050<\pO,x ,R8Njx0 pyW7[ |n}0#QcqS\.tM'Fbq5[rͷMpS8v6 Ô6G%;!yA%=y >EFi>Yu/i3TaZB`0RX1IHv>x V7óAmUʻU&YZŴ57t3踝dh^D_OeGxeDF/6,6c77-#NpxݸA}ĉ|PsQPpiSۑpI]Wylb5[QĢRJd D #^x7G_}+PRKnQx~$4zck)[,|aLCp/n$4+XC7ͅƐipg- y\76eKRʟ}/ ^1oO㬟m猰7r Pf[+5LN+`yxVSKdRu8 38u . 0B ۵{bjMStp{ʩM>7eDKwg9Z(2 A))$Oab~$߼a0l#RϜwtK8+KǍʐx`(P嫏&/kiަ` WE a}#DTfY*4st!X-cakZ"Xj1짷`ғ;;Q$͐'ziX(X( 8()%7:ɡd.7X;۞nbGԦJ6,̂Ǭ AT X:L1.mQ7HZuz/O6Ez#bjP_'R|u` p'X BS8`oqi1&`xhX`/\⼭kFun&~ ?}~0Kd697)u}LP;ÕWcu7y}a'9:‚2E.jnDZ5p sR7 Czo-'/:mӔޏZք&(G >QAѭ^o,4LlYNTXYL[ Xbf'] i.a+kjH&uS4(r9ɵ -i:=QiY{KW愾t ilCL]fk`WC9A/umeTJ;BV.mZ2QB7՞:H8hޚyyzk":s&JTT޲q xҮ4zT`UTTvd4cUrA4ጵ4xhd p9' )h$7Pѕ_ +$dxaETl )ࣙ"ֲם&<hM=n=I q)F 95e@pZ[pbv߻!bL|M~0%hxЕx%.sGW1E06Êkk )R3V-H#5GߌEYXg%Y#OI{xݸ8&zYR=|!JW1~6=.xEYF"z= ķcٹbs/x%p SQؚt)r'8s@LNqp׋aV`k$D#,NƿCVW`Qoz\T$k7 hL)0 qIhTD&@<޺WCwxL"Py.M>djڤfYeFOqAc+w=3e€4R{ʦP@QaH!\WdFBt !nwa>7sQcJ& 7"eKahS6 :vA PmLQYyxe)M"DE89ӬbHptMc쎋 5mp0F[}+f2x,J\O/pE?Xj/9&"=LD^\/<l"{Rm p/%8:ߞ]j>0(7YO aYIS4a|/)id1zk8P;7T1K]Z.ֿPNfLd.y!OLj6|[G4X xybj 6ž1F\Z$@u xףk!v~Zen905qJݯa} &k)wʝ^|^xLA]vC2.tb:uk:DJhen@*Bm>%GoyBKLPE<=eeD&&h<ݦeWX 8t0j PoY@@Y" 䣅$i"GyuqG/yVS&؂&ᆍ#8QZ.f~} :-H`Z6.iM"M r%$ Ympm5q(VɄ K>ϼDb;aq v^f|J.pAE_x{aD;8 }2"9.*uZh1B;BMj@f>;uVZ2]_Mந(]'8aK&H! #f>憤iiƈ  qgMT)ϔ~\:6xnε8)Z:&W]^Q::ӉYt{ ]2-E5ґ*MYâ<7'yn?iw8 0F(Y x$Tp u~ .8yr ;}ɩȼ:!/9> b_6ƪaNBXrZ7f-*Uv5e^X&C$*_듌7ݟvNFm{`SEM:jpZJ\t=I,%IcR/XA'RD>BVJL{p!#/͈.K>VwAa37kP:F/X-kX',}kCB7Gj `ȧ&a }GxQ[m?qҩ*W-jnÌ\委G{#aXI". T76^dŜ4 ;@ Ea]"<X `pԥH`r'jt\;p9rQ=JpoPS^U tQEcvJDA/ t1ETOJrt.Ɯkvwb¥N;<^+cǁ6è-%p*pX]Iw lji+hآ+ @@|Mn߳b"h>0@FrqE@7+x[%P"ȔL2ƒU %.:p=*RQrAO. 
django-cachalot-2.8.0/docs/Makefile

# Makefile for Sphinx documentation
#

# You can set these variables from the command line.
SPHINXOPTS    =
SPHINXBUILD   = sphinx-build
PAPER         =
BUILDDIR      = _build

# User-friendly check for sphinx-build
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
endif

# Internal variables.
PAPEROPT_a4     = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS   = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS  = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext

help:
	@echo "Please use \`make <target>' where <target> is one of"
	@echo "  html       to make standalone HTML files"
	@echo "  dirhtml    to make HTML files named index.html in directories"
	@echo "  singlehtml to make a single large HTML file"
	@echo "  pickle     to make pickle files"
	@echo "  json       to make JSON files"
	@echo "  htmlhelp   to make HTML files and a HTML help project"
	@echo "  qthelp     to make HTML files and a qthelp project"
	@echo "  devhelp    to make HTML files and a Devhelp project"
	@echo "  epub       to make an epub"
	@echo "  latex      to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
	@echo "  latexpdf   to make LaTeX files and run them through pdflatex"
	@echo "  latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
	@echo "  text       to make text files"
	@echo "  man        to make manual pages"
	@echo "  texinfo    to make Texinfo files"
	@echo "  info       to make Texinfo files and run them through makeinfo"
	@echo "  gettext    to make PO message catalogs"
	@echo "  changes    to make an overview of all changed/added/deprecated items"
	@echo "  xml        to make Docutils-native XML files"
	@echo "  pseudoxml  to make pseudoxml-XML files for display purposes"
	@echo "  linkcheck  to check all external links for integrity"
	@echo "  doctest    to run all doctests embedded in the documentation (if enabled)"

clean:
	rm -rf $(BUILDDIR)/*

html:
	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."

dirhtml:
	$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."

singlehtml:
	$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
	@echo
	@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."

pickle:
	$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
	@echo
	@echo "Build finished; now you can process the pickle files."
json:
	$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
	@echo
	@echo "Build finished; now you can process the JSON files."

htmlhelp:
	$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
	@echo
	@echo "Build finished; now you can run HTML Help Workshop with the" \
	      ".hhp project file in $(BUILDDIR)/htmlhelp."

qthelp:
	$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
	@echo
	@echo "Build finished; now you can run "qcollectiongenerator" with the" \
	      ".qhcp project file in $(BUILDDIR)/qthelp, like this:"
	@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/django-cachalot.qhcp"
	@echo "To view the help file:"
	@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/django-cachalot.qhc"

devhelp:
	$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
	@echo
	@echo "Build finished."
	@echo "To view the help file:"
	@echo "# mkdir -p $$HOME/.local/share/devhelp/django-cachalot"
	@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/django-cachalot"
	@echo "# devhelp"

epub:
	$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
	@echo
	@echo "Build finished. The epub file is in $(BUILDDIR)/epub."

latex:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo
	@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
	@echo "Run \`make' in that directory to run these through (pdf)latex" \
	      "(use \`make latexpdf' here to do that automatically)."

latexpdf:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through pdflatex..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

latexpdfja:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through platex and dvipdfmx..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

text:
	$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
	@echo
	@echo "Build finished. The text files are in $(BUILDDIR)/text."
man:
	$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
	@echo
	@echo "Build finished. The manual pages are in $(BUILDDIR)/man."

texinfo:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo
	@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
	@echo "Run \`make' in that directory to run these through makeinfo" \
	      "(use \`make info' here to do that automatically)."

info:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo "Running Texinfo files through makeinfo..."
	make -C $(BUILDDIR)/texinfo info
	@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."

gettext:
	$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
	@echo
	@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."

changes:
	$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
	@echo
	@echo "The overview file is in $(BUILDDIR)/changes."

linkcheck:
	$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
	@echo
	@echo "Link check complete; look for any errors in the above output " \
	      "or in $(BUILDDIR)/linkcheck/output.txt."

doctest:
	$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
	@echo "Testing of doctests in the sources finished, look at the " \
	      "results in $(BUILDDIR)/doctest/output.txt."

xml:
	$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
	@echo
	@echo "Build finished. The XML files are in $(BUILDDIR)/xml."

pseudoxml:
	$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
	@echo
	@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."

django-cachalot-2.8.0/docs/api.rst

.. _API:

API
---

Use these tools to interact with django-cachalot, especially if you face
:ref:`raw queries limits` or if you need to create a cache key
from the last table invalidation timestamp.
.. automodule:: cachalot.api
   :members:

django-cachalot-2.8.0/docs/benchmark.rst

.. _Benchmark:

Benchmark
---------

.. contents::

Introduction
............

This benchmark does not intend to be exhaustive nor fair to SQL.
It shows how django-cachalot behaves on an unoptimised application.
On an application using perfectly optimised SQL queries only,
django-cachalot may not be useful. Unfortunately, most Django apps
(including Django itself) use unoptimised queries. Of course, they often
lack useful indexes (even though it only requires 20 characters per index…).
But what you may not know is that **the ORM currently generates
totally unoptimised queries** [#]_.

You can run the benchmarks yourself (officially supported on Linux and Mac).
You will need a database called "cachalot" on MySQL and PostgreSQL.
Additionally, on PostgreSQL, you will need to create a role called "cachalot."
Running the benchmarks can raise errors with specific instructions
for how to fix it.

#. Install: ``pip install -r requirements/benchmark.txt``
#. Run: ``python benchmark.py``

The output will be in benchmark/TODAY'S_DATE/

Conditions
..........

.. include:: ../benchmark/docs/2018-08-09/conditions.rst

Note that `MySQL’s query cache`_ is active during the benchmark.

Database results
................

.. include:: ../benchmark/docs/2018-08-09/db_results.rst

.. image:: ../benchmark/docs/2018-08-09/db.svg

Cache results
.............

.. include:: ../benchmark/docs/2018-08-09/cache_results.rst

.. image:: ../benchmark/docs/2018-08-09/cache.svg

Database detailed results
.........................

MySQL
~~~~~

.. image:: ../benchmark/docs/2018-08-09/db_mysql.svg

PostgreSQL
~~~~~~~~~~

.. image:: ../benchmark/docs/2018-08-09/db_postgresql.svg

SQLite
~~~~~~

.. image:: ../benchmark/docs/2018-08-09/db_sqlite.svg

Cache detailed results
......................

File-based
~~~~~~~~~~
.. image:: ../benchmark/docs/2018-08-09/cache_filebased.svg

Locmem
~~~~~~

.. image:: ../benchmark/docs/2018-08-09/cache_locmem.svg

Memcached (python-memcached)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. image:: ../benchmark/docs/2018-08-09/cache_memcached.svg

Memcached (pylibmc)
~~~~~~~~~~~~~~~~~~~

.. image:: ../benchmark/docs/2018-08-09/cache_pylibmc.svg

Redis
~~~~~

.. image:: ../benchmark/docs/2018-08-09/cache_redis.svg

.. [#] The ORM fetches way too much data if you don’t restrict it using
   ``.only`` and ``.defer``. You can divide the execution time of most queries
   by 2-3 by specifying what you want to fetch. But specifying which data we
   want for each query is very long and unmaintainable. An automation using
   field usage statistics is possible and would drastically improve
   performance. Other performance issues occur with slicing. You can often
   optimise a sliced query using a subquery, like
   ``YourModel.objects.filter(pk__in=YourModel.objects.filter(…)[10000:10050]).select_related(…)``
   instead of ``YourModel.objects.filter(…).select_related(…)[10000:10050]``.
   I’ll maybe work on these issues one day.

django-cachalot-2.8.0/docs/changelog.rst

.. include:: ../CHANGELOG.rst

django-cachalot-2.8.0/docs/conf.py

# -*- coding: utf-8 -*-
#
# django-cachalot documentation build configuration file, created by
# sphinx-quickstart on Tue Oct 28 22:46:50 2014.
#
# This file is execfile()d with the current directory set to its
# containing dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.
import sys
import os

sys.path.insert(0, os.path.abspath('..'))
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'settings')

import cachalot
# This sets up Django, necessary for autodoc
import runtests

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#sys.path.insert(0, os.path.abspath('.'))

# -- General configuration ------------------------------------------------

# If your documentation needs a minimal Sphinx version, state it here.
#needs_sphinx = '1.0'

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
    'sphinx.ext.autodoc',
    'sphinx.ext.intersphinx',
    'sphinx.ext.todo',
    'sphinx.ext.coverage',
    'sphinx.ext.ifconfig',
    'sphinx.ext.viewcode',
]

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

# The suffix of source filenames.
source_suffix = '.rst'

# The encoding of source files.
#source_encoding = 'utf-8-sig'

# The master toctree document.
master_doc = 'index'

# General information about the project.
project = 'django-cachalot'
copyright = '2014-2016, Bertrand Bordage'

# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
version = '%s.%s' % cachalot.VERSION[:2]
# The full version, including alpha/beta/rc tags.
release = cachalot.__version__

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#language = None

# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
#today = ''
# Else, today_fmt is used as the format for a strftime call.
#today_fmt = '%B %d, %Y'

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = ['_build']

# The reST default role (used for this markup: `text`) to use for all
# documents.
#default_role = None

# If true, '()' will be appended to :func: etc. cross-reference text.
#add_function_parentheses = True

# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
#add_module_names = True

# If true, sectionauthor and moduleauthor directives will be shown in the
# output. They are ignored by default.
#show_authors = False

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'

# A list of ignored prefixes for module index sorting.
#modindex_common_prefix = []

# If true, keep warnings as "system message" paragraphs in the built documents.
#keep_warnings = False

# -- Options for HTML output ----------------------------------------------

# The theme to use for HTML and HTML Help pages.  See the documentation for
# a list of builtin themes.
html_theme = 'sphinx_rtd_theme'

# Theme options are theme-specific and customize the look and feel of a theme
# further.  For a list of options available for each theme, see the
# documentation.
#html_theme_options = {}

# Add any paths that contain custom themes here, relative to this directory.
#html_theme_path = []

# The name for this set of Sphinx documents.  If None, it defaults to
# "<project> v<release> documentation".
#html_title = None

# A shorter title for the navigation bar.  Default is the same as html_title.
#html_short_title = None

# The name of an image file (relative to this directory) to place at the top
# of the sidebar.
#html_logo = None

# The name of an image file (within the static path) to use as favicon of the
# docs.  This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
#html_favicon = None

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']

# Add any extra paths that contain custom files (such as robots.txt or
# .htaccess) here, relative to this directory. These files are copied
# directly to the root of the documentation.
#html_extra_path = []

# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
# using the given strftime format.
#html_last_updated_fmt = '%b %d, %Y'

# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
#html_use_smartypants = True

# Custom sidebar templates, maps document names to template names.
#html_sidebars = {}

# Additional templates that should be rendered to pages, maps page names to
# template names.
#html_additional_pages = {}

# If false, no module index is generated.
#html_domain_indices = True

# If false, no index is generated.
#html_use_index = True

# If true, the index is split into individual pages for each letter.
#html_split_index = False

# If true, links to the reST sources are added to the pages.
#html_show_sourcelink = True

# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
#html_show_sphinx = True

# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
#html_show_copyright = True

# If true, an OpenSearch description file will be output, and all pages will
# contain a <link> tag referring to it.  The value of this option must be the
# base URL from which the finished HTML is served.
#html_use_opensearch = ''

# This is the file name suffix for HTML files (e.g. ".xhtml").
#html_file_suffix = None

# Output file base name for HTML help builder.
htmlhelp_basename = 'django-cachalotdoc'


# -- Options for LaTeX output ---------------------------------------------

latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
#'papersize': 'letterpaper',

# The font size ('10pt', '11pt' or '12pt').
#'pointsize': '10pt',

# Additional stuff for the LaTeX preamble.
#'preamble': '',
}

# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
#  author, documentclass [howto, manual, or own class]).
latex_documents = [
    ('index', 'django-cachalot.tex', u'django-cachalot Documentation',
     u'Bertrand Bordage', 'manual'),
]

# The name of an image file (relative to this directory) to place at the top of
# the title page.
#latex_logo = None

# For "manual" documents, if this is true, then toplevel headings are parts,
# not chapters.
#latex_use_parts = False

# If true, show page references after internal links.
#latex_show_pagerefs = False

# If true, show URL addresses after external links.
#latex_show_urls = False

# Documents to append as an appendix to all manuals.
#latex_appendices = []

# If false, no module index is generated.
#latex_domain_indices = True


# -- Options for manual page output ---------------------------------------

# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
    ('index', 'django-cachalot', u'django-cachalot Documentation',
     [u'Bertrand Bordage'], 1)
]

# If true, show URL addresses after external links.
#man_show_urls = False


# -- Options for Texinfo output -------------------------------------------

# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
#  dir menu entry, description, category)
texinfo_documents = [
    ('index', 'django-cachalot', u'django-cachalot Documentation',
     u'Bertrand Bordage', 'django-cachalot',
     'One line description of project.', 'Miscellaneous'),
]

# Documents to append as an appendix to all manuals.
#texinfo_appendices = []

# If false, no module index is generated.
#texinfo_domain_indices = True

# How to display URL addresses: 'footnote', 'no', or 'inline'.
#texinfo_show_urls = 'footnote'

# If true, do not generate a @detailmenu in the "Top" node's menu.
#texinfo_no_detailmenu = False


# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}

django-cachalot-2.8.0/docs/how.rst

How django-cachalot works
-------------------------

.. note:: If you don’t understand, you can pretend it’s magic.

Reverse engineering
...................

It’s a lot of Django reverse engineering combined with a strong test suite.
Such a test suite is crucial for a reverse engineering project.
If some important part of Django changes and breaks the expected behaviour,
you can be sure that the test suite will fail.

Monkey patching
...............

Django-cachalot modifies Django in place during execution to add a caching
tool just before SQL queries are executed. When a SQL query reads data,
we save the result in cache. If that same query is executed later,
we fetch that result from cache. When we detect ``INSERT``, ``UPDATE``
or ``DELETE``, we know which tables are modified. All the previous cached
queries can therefore be safely invalidated.

django-cachalot-2.8.0/docs/index.rst

***************
django-cachalot
***************

Caches your Django ORM queries and automatically invalidates them.

.. image:: https://raw.github.com/noripyt/django-cachalot/master/django-cachalot.jpg

----

.. image:: http://img.shields.io/pypi/v/django-cachalot.svg?style=flat-square&maxAge=3600
   :target: https://pypi.python.org/pypi/django-cachalot
.. image:: https://github.com/noripyt/django-cachalot/actions/workflows/ci.yml/badge.svg
   :target: https://github.com/noripyt/django-cachalot/actions/workflows/ci.yml

.. image:: http://img.shields.io/coveralls/noripyt/django-cachalot/master.svg?style=flat-square&maxAge=3600
   :target: https://coveralls.io/r/noripyt/django-cachalot?branch=master

.. image:: http://img.shields.io/scrutinizer/g/noripyt/django-cachalot/master.svg?style=flat-square&maxAge=3600
   :target: https://scrutinizer-ci.com/g/noripyt/django-cachalot/

.. image:: https://img.shields.io/discord/773656139207802881
   :target: https://discord.gg/WFGFBk8rSU

Usage
.....

#. ``pip install django-cachalot``
#. Add ``'cachalot',`` to your ``INSTALLED_APPS``
#. If you use multiple servers with a common cache server,
   :ref:`double check their clock synchronisation`
#. If you modify data outside Django – typically after restoring
   a SQL database –, use the :ref:`manage.py command`
#. Be aware of :ref:`the few other limits`
#. If you use `django-debug-toolbar`_, you can add
   ``'cachalot.panels.CachalotPanel',`` to your ``DEBUG_TOOLBAR_PANELS``
#. Enjoy!

Note: In settings, you can use :ref:`CACHALOT_UNCACHABLE_TABLES` as
a frozenset of table names (e.g. "public_test" if public was the app name
and test is a model name).

Why use cachalot? :ref:`Check out our comparison <Introduction>`

Below the tree is an in-depth opinion from the new maintainer:

.. toctree::
   :maxdepth: 2

   introduction
   quickstart
   limits
   api
   benchmark
   todo
   reporting
   how
   legacy
   changelog

In-depth opinion (from new maintainer):

There are three main third party caches: cachalot, cache-machine,
and cache-ops. Which do you use? We suggest a mix:

TL;DR Use cachalot for cold tables or tables modified <50 times per minute
(most people should stick with only cachalot, since you most likely won't
need to scale to the point of needing cache-machine added to the bowl).
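To make the table-level trade-off above concrete, here is a minimal,
framework-free sketch of the strategy cachalot implements: cache a query's
result, and drop every cached query touching a table as soon as that table
is written. The class and method names are illustrative only (not cachalot's
actual API), and a logical clock stands in for the per-table invalidation
timestamps cachalot stores in your cache backend.

```python
class TableLevelQueryCache:
    """Toy model of table-level caching, in the spirit of cachalot.

    Results are keyed by SQL text; a write to a table logically
    invalidates every cached query that reads from it.
    """

    def __init__(self):
        self._clock = 0          # logical clock, bumped on every event
        self._results = {}       # sql -> (cached_at, rows)
        self._invalidated = {}   # table -> logical time of last write

    def _tick(self):
        self._clock += 1
        return self._clock

    def get_or_execute(self, sql, tables, execute):
        entry = self._results.get(sql)
        if entry is not None:
            cached_at, rows = entry
            # Fresh only if no involved table was written since caching.
            if all(self._invalidated.get(t, 0) < cached_at for t in tables):
                return rows  # cache hit
        rows = execute()
        self._results[sql] = (self._tick(), rows)
        return rows

    def notify_write(self, table):
        # INSERT/UPDATE/DELETE: invalidate the whole table at once.
        self._invalidated[table] = self._tick()
```

A single ``notify_write('auth_user')`` here makes every cached query reading
``auth_user`` stale at once, which is why rarely written ("cold") tables get
near-perfect hit rates while frequently written ("hot") tables gain little.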
If you're an enterprise that already has huge statistics, then mixing
cold caches for cachalot and your hot caches with cache-machine is the best
mix. However, when performing joins with select_related and prefetch_related,
you can get a nearly 100x speed up for your initial deployment.

Recall, cachalot caches THE ENTIRE TABLE. That's where its inefficiency
stems from: if you keep updating the records, then cachalot constantly
invalidates the table and re-caches. Luckily caching is very efficient,
it's just the cache invalidation part that kills all our systems.
Look at Note 1 below to see how Reddit deals with it.

Cachalot is more-or-less intended for cold caches or "just-right" conditions.
If you find a partition library for Django (also authored but
work-in-progress by `Andrew Chen Wang`_), then the caching will work better
since the cold/accessed-the-least shards aren't invalidated as much.

Cachalot is good when there are <50 modifications per minute on a hot cached
table. This is mostly due to cache invalidation. It's the same with any
cache, which is why we suggest you use cache-machine for hot caches.
Cache-machine caches individual objects, taking up more in the memory store
but invalidating those individual objects instead of the entire table
like cachalot.

Yes, the bane of our entire existence lies in cache invalidation
and naming variables.

Why does cachalot suck when stuck with a huge table that's modified rapidly?
Since you've mixed your cold (90% of) with your hot (10% of) records,
you're caching and invalidating an entire table. It's like trying to boil
1 ton of noodles inside ONE pot instead of 100 pots boiling 1 ton of noodles.
Which is more efficient? The splitting up of them.

Note 1: My personal experience with caches stems from Reddit's:
https://redditblog.com/2017/01/17/caching-at-reddit/

django-cachalot-2.8.0/docs/introduction.rst
.. _Introduction:

Introduction
------------

Should you use it?
..................

Django-cachalot is the perfect speedup tool for most Django projects.
It will speed up a website of 100 000 visits per month without any problem.
In fact, **the more visitors you have, the faster the website becomes**.
That’s because every possible SQL query on the project ends up being cached.

Django-cachalot is especially efficient in the Django administration website
since it’s unfortunately badly optimised (use foreign keys in
``list_editable`` if you need to be convinced). Among the best-suited usages
are ``select_related`` and ``prefetch_related`` operations.

However, it’s not suited for projects where there is **a high number of
modifications per minute** on each table, like a social network with more
than 50 messages per minute. Django-cachalot may still give a small speedup
in such cases, but it may also slow things a bit (in the worst case scenario,
a 20% slowdown, according to :ref:`the benchmark <Benchmark>`).
If you have a website like that, optimising your SQL database and queries
is the number one thing you have to do.

There is also an obvious case where you don’t need django-cachalot:
when the project is already fast enough (all pages load in less than 300 ms).
Like any other dependency, django-cachalot is a potential source of problems
(even though it’s currently bug free). Don’t use dependencies you can avoid,
a “future you” may thank you for that.

Features
........

- **Saves in cache the results of any SQL query** generated by the Django ORM
  that reads data. These saved results are then returned instead of executing
  the same SQL query, which is faster.
- The first time a query is executed is about 10% slower, then the following
  times are way faster (7× faster being the average).
- Automatically invalidates saved results, so that
  **you never get stale results**.
- **Invalidates per table, not per object**: if you change an object,
  all the queries done on other objects of the same model are also
  invalidated. It is unfortunately technically impossible to make a reliable
  per-object cache. Don’t be fooled by packages pretending to have that
  per-object feature, they are unreliable and dangerous for your data.
- **Handles everything in the ORM**. You can use the most advanced features
  from the ORM without a single issue, django-cachalot is extremely robust.
- An easy control thanks to :ref:`settings` and :ref:`a simple API <API>`.
  But that’s only required if you have a complex infrastructure.
  Most people will never use settings or the API.
- A few bonus features like :ref:`a signal triggered at each database change`
  (including bulk changes) and :ref:`a template tag for a better
  template fragment caching