Merge pull request #6716 from netbox-community/6639-drop-cacheops

Closes #6639: Replace django-cacheops with django-redis for caching
Jeremy Stretch 2021-07-07 22:08:01 -04:00 committed by GitHub
commit 028c876bca
16 changed files with 60 additions and 244 deletions

View File

@@ -2,10 +2,6 @@
# https://github.com/django/django
Django
# Django caching using Redis
# https://github.com/Suor/django-cacheops
django-cacheops
# Django middleware which permits cross-domain API requests
# https://github.com/OttoYiu/django-cors-headers
django-cors-headers
@@ -34,6 +30,10 @@ django-pglocks
# https://github.com/korfuri/django-prometheus
django-prometheus
# Django caching backend using Redis
# https://github.com/jazzband/django-redis
django-redis
# Django integration for RQ (Redis queuing)
# https://github.com/rq/django-rq
django-rq

View File

@@ -1,25 +0,0 @@
# Caching
NetBox supports database query caching using [django-cacheops](https://github.com/Suor/django-cacheops) and Redis. When a query is made, the results are cached in Redis for a short period of time, as defined by the [CACHE_TIMEOUT](../configuration/optional-settings.md#cache_timeout) parameter (15 minutes by default). Within that time, all recurrences of that specific query will return the pre-fetched results from the cache.
If a change is made to any of the objects returned by the query within that time, or if the timeout expires, the results are automatically invalidated and the next request for those results will be sent to the database.
## Invalidating Cached Data
Although caching is performed automatically and rarely requires administrative intervention, NetBox provides the `invalidate` management command to force invalidation of cached results. This command can reference a specific object by its type and numeric ID:
```no-highlight
$ python netbox/manage.py invalidate dcim.Device.34
```
Alternatively, it can also delete all cached results for an object type:
```no-highlight
$ python netbox/manage.py invalidate dcim.Device
```
Finally, calling it with the `all` argument will force invalidation of the entire cache database:
```no-highlight
$ python netbox/manage.py invalidate all
```
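With django-cacheops removed, there is no direct replacement for the `invalidate` command. As a rough sketch only (an editorial illustration, not part of this changeset), ad-hoc invalidation can be performed through Django's low-level cache API, which the new django-redis backend serves:
```python
# Rough sketch, assuming the django-redis cache backend configured in settings.py.
# Run inside `python netbox/manage.py shell`.
from django.core.cache import cache

cache.delete('latest_release')  # drop a single cached key (used by the release check)
cache.clear()                   # flush the entire cache database
```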

View File

@@ -52,14 +52,6 @@ BASE_PATH = 'netbox/'
---
## CACHE_TIMEOUT
Default: 900
The number of seconds that cache entries will be retained before expiring.
---
## CHANGELOG_RETENTION
Default: 90

View File

@@ -113,7 +113,6 @@ NetBox looks for the `config` variable within a plugin's `__init__.py` to load i
| `min_version` | Minimum version of NetBox with which the plugin is compatible |
| `max_version` | Maximum version of NetBox with which the plugin is compatible |
| `middleware` | A list of middleware classes to append after NetBox's built-in middleware |
| `caching_config` | Plugin-specific cache configuration |
| `template_extensions` | The dotted path to the list of template extension classes (default: `template_content.template_extensions`) |
| `menu_items` | The dotted path to the list of menu items provided by the plugin (default: `navigation.menu_items`) |
@@ -384,32 +383,4 @@ class SiteAnimalCount(PluginTemplateExtension):
})
template_extensions = [SiteAnimalCount]
```
## Caching Configuration
By default, all query operations within a plugin are cached. To change this, define a caching configuration under the PluginConfig class' `caching_config` attribute. All configuration keys will be applied within the context of the plugin; there is no need to include the plugin name. An example configuration is below:
```python
class MyPluginConfig(PluginConfig):
...
caching_config = {
'foo': {
'ops': 'get',
'timeout': 60 * 15,
},
'*': {
'ops': 'all',
}
}
```
To disable caching for your plugin entirely, set:
```python
caching_config = {
'*': None
}
```
See the [django-cacheops](https://github.com/Suor/django-cacheops) documentation for more detail on configuring caching.
```
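With `caching_config` removed, a plugin that still needs caching can call Django's cache framework directly; the sketch below is purely illustrative (the key name and `expensive_count()` helper are hypothetical, not a NetBox API):
```python
from django.core.cache import cache

def get_animal_count(site):
    key = f'myplugin:animal_count:{site.pk}'  # hypothetical cache key
    count = cache.get(key)
    if count is None:
        count = expensive_count(site)  # hypothetical helper doing the expensive query
        cache.set(key, count, timeout=60 * 15)  # cache for 15 minutes
    return count
```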

View File

@@ -5,6 +5,9 @@
### Breaking Changes
* The default CSV export format for all objects now includes all available data. Additionally, the CSV headers now use human-friendly titles rather than the raw field names.
* Support for queryset caching configuration (`caching_config`) has been removed from the plugins API (see [#6639](https://github.com/netbox-community/netbox/issues/6639)).
* The `cacheops_*` metrics have been removed from the Prometheus exporter (see [#6639](https://github.com/netbox-community/netbox/issues/6639)).
* The `invalidate` management command has been removed.
### New Features
@@ -64,6 +67,11 @@ CustomValidator can also be subclassed to enforce more complex logic by overridi
* [#5994](https://github.com/netbox-community/netbox/issues/5994) - Drop support for `display_field` argument on ObjectVar
* [#6068](https://github.com/netbox-community/netbox/issues/6068) - Drop support for legacy static CSV export
* [#6338](https://github.com/netbox-community/netbox/issues/6338) - Decimal fields are no longer coerced to strings in REST API
* [#6639](https://github.com/netbox-community/netbox/issues/6639) - Drop support for queryset caching (django-cacheops)
### Configuration Changes
* The `CACHE_TIMEOUT` configuration parameter has been removed.
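The `caching` subsection of `REDIS` in configuration.py is still required and is now consumed by django-redis; a minimal sketch with placeholder values (key names follow the settings.py diff further below):
```python
REDIS = {
    # ...
    'caching': {
        'HOST': 'localhost',  # placeholder values; adjust for your environment
        'PORT': 6379,
        'PASSWORD': '',
        'DATABASE': 1,
        'SSL': False,
    },
}
```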
### REST API Changes

View File

@@ -1,6 +1,5 @@
import logging
from cacheops import invalidate_obj
from django.contrib.contenttypes.models import ContentType
from django.db.models.signals import post_save, post_delete, pre_delete
from django.db import transaction
@@ -33,7 +32,6 @@ def rebuild_paths(obj):
for cp in cable_paths:
cp.delete()
if cp.origin:
invalidate_obj(cp.origin)
create_cablepath(cp.origin)

View File

@@ -1,4 +1,3 @@
from cacheops import invalidate_model
from django.apps import apps
from django.core.management.base import BaseCommand, CommandError
@@ -108,8 +107,5 @@ class Command(BaseCommand):
elif options['verbosity']:
self.stdout.write(self.style.SUCCESS(str(count)))
# Invalidate cached queries
invalidate_model(model)
if options['verbosity']:
self.stdout.write(self.style.SUCCESS("Done."))

View File

@@ -47,11 +47,6 @@ class PluginConfig(AppConfig):
# Middleware classes provided by the plugin
middleware = []
# Cacheops configuration. Cache all operations by default.
caching_config = {
'*': {'ops': 'all'},
}
# Default integration paths. Plugin authors can override these to customize the paths to
# integrated components.
template_extensions = 'template_content.template_extensions'

View File

@@ -1,4 +1,3 @@
from cacheops.signals import cache_invalidated, cache_read
from django.conf import settings
from django.contrib.contenttypes.models import ContentType
from django.db.models.signals import m2m_changed, post_save, pre_delete
@@ -138,27 +137,3 @@ def run_custom_validators(sender, instance, **kwargs):
validators = settings.CUSTOM_VALIDATORS.get(model_name, [])
for validator in validators:
validator(instance)
#
# Caching
#
cacheops_cache_hit = Counter('cacheops_cache_hit', 'Number of cache hits')
cacheops_cache_miss = Counter('cacheops_cache_miss', 'Number of cache misses')
cacheops_cache_invalidated = Counter('cacheops_cache_invalidated', 'Number of cache invalidations')
def cache_read_collector(sender, func, hit, **kwargs):
if hit:
cacheops_cache_hit.inc()
else:
cacheops_cache_miss.inc()
def cache_invalidated_collector(sender, obj_dict, **kwargs):
cacheops_cache_invalidated.inc()
cache_read.connect(cache_read_collector)
cache_invalidated.connect(cache_invalidated_collector)

View File

@@ -80,12 +80,6 @@ class PluginTest(TestCase):
"""
self.assertIn('extras.tests.dummy_plugin.middleware.DummyMiddleware', settings.MIDDLEWARE)
def test_caching_config(self):
"""
Check that plugin caching configuration is registered.
"""
self.assertIn('extras.tests.dummy_plugin.*', settings.CACHEOPS)
def test_min_version(self):
"""
Check enforcement of minimum NetBox version.

View File

@@ -89,9 +89,6 @@ BANNER_LOGIN = ''
# BASE_PATH = 'netbox/'
BASE_PATH = ''
# Cache timeout in seconds. Set to 0 to disable caching. Defaults to 900 (15 minutes)
CACHE_TIMEOUT = 900
# Maximum number of days to retain logged changes. Set to 0 to retain changes indefinitely. (Default: 90)
CHANGELOG_RETENTION = 90

View File

@@ -1,7 +1,7 @@
import logging
from cacheops import CacheMiss, cache
from django.conf import settings
from django.core.cache import cache
from django_rq import get_queue
from utilities.background_tasks import get_releases
@@ -12,12 +12,11 @@ logger = logging.getLogger('netbox.releases')
def get_latest_release(pre_releases=False):
if settings.RELEASE_CHECK_URL:
logger.debug("Checking for most recent release")
try:
latest_release = cache.get('latest_release')
if latest_release:
logger.debug("Found cached release: {}".format(latest_release))
return latest_release
except CacheMiss:
latest_release = cache.get('latest_release')
if latest_release:
logger.debug(f"Found cached release: {latest_release}")
return latest_release
else:
# Check for an existing job. This can happen if the RQ worker process is not running.
queue = get_queue('check_releases')
if queue.jobs:

View File

@@ -45,6 +45,10 @@ except ModuleNotFoundError as e:
)
raise
# Warn on removed config parameters
if hasattr(configuration, 'CACHE_TIMEOUT'):
warnings.warn("The CACHE_TIMEOUT configuration parameter was removed in v3.0.0 and no longer has any effect.")
# Enforce required configuration parameters
for parameter in ['ALLOWED_HOSTS', 'DATABASE', 'SECRET_KEY', 'REDIS']:
if not hasattr(configuration, parameter):
@@ -69,7 +73,6 @@ BANNER_TOP = getattr(configuration, 'BANNER_TOP', '')
BASE_PATH = getattr(configuration, 'BASE_PATH', '')
if BASE_PATH:
BASE_PATH = BASE_PATH.strip('/') + '/' # Enforce trailing slash only
CACHE_TIMEOUT = getattr(configuration, 'CACHE_TIMEOUT', 900)
CHANGELOG_RETENTION = getattr(configuration, 'CHANGELOG_RETENTION', 90)
CORS_ORIGIN_ALLOW_ALL = getattr(configuration, 'CORS_ORIGIN_ALLOW_ALL', False)
CORS_ORIGIN_REGEX_WHITELIST = getattr(configuration, 'CORS_ORIGIN_REGEX_WHITELIST', [])
@@ -225,19 +228,31 @@ if 'caching' not in REDIS:
raise ImproperlyConfigured(
"REDIS section in configuration.py is missing caching subsection."
)
CACHING_REDIS = REDIS['caching']
CACHING_REDIS_HOST = CACHING_REDIS.get('HOST', 'localhost')
CACHING_REDIS_PORT = CACHING_REDIS.get('PORT', 6379)
CACHING_REDIS_SENTINELS = CACHING_REDIS.get('SENTINELS', [])
CACHING_REDIS_USING_SENTINEL = all([
isinstance(CACHING_REDIS_SENTINELS, (list, tuple)),
len(CACHING_REDIS_SENTINELS) > 0
])
CACHING_REDIS_SENTINEL_SERVICE = CACHING_REDIS.get('SENTINEL_SERVICE', 'default')
CACHING_REDIS_PASSWORD = CACHING_REDIS.get('PASSWORD', '')
CACHING_REDIS_DATABASE = CACHING_REDIS.get('DATABASE', 0)
CACHING_REDIS_SSL = CACHING_REDIS.get('SSL', False)
CACHING_REDIS_SKIP_TLS_VERIFY = CACHING_REDIS.get('INSECURE_SKIP_TLS_VERIFY', False)
CACHING_REDIS_HOST = REDIS['caching'].get('HOST', 'localhost')
CACHING_REDIS_PORT = REDIS['caching'].get('PORT', 6379)
CACHING_REDIS_DATABASE = REDIS['caching'].get('DATABASE', 0)
CACHING_REDIS_PASSWORD = REDIS['caching'].get('PASSWORD', '')
CACHING_REDIS_SENTINELS = REDIS['caching'].get('SENTINELS', [])
CACHING_REDIS_SENTINEL_SERVICE = REDIS['caching'].get('SENTINEL_SERVICE', 'default')
CACHING_REDIS_PROTO = 'rediss' if REDIS['caching'].get('SSL', False) else 'redis'
CACHING_REDIS_SKIP_TLS_VERIFY = REDIS['caching'].get('INSECURE_SKIP_TLS_VERIFY', False)
CACHES = {
'default': {
'BACKEND': 'django_redis.cache.RedisCache',
'LOCATION': f'{CACHING_REDIS_PROTO}://{CACHING_REDIS_HOST}:{CACHING_REDIS_PORT}/{CACHING_REDIS_DATABASE}',
'OPTIONS': {
'CLIENT_CLASS': 'django_redis.client.DefaultClient',
'PASSWORD': CACHING_REDIS_PASSWORD,
}
}
}
if CACHING_REDIS_SENTINELS:
CACHES['default']['LOCATION'] = f'{CACHING_REDIS_PROTO}://{CACHING_REDIS_SENTINEL_SERVICE}/{CACHING_REDIS_DATABASE}'
CACHES['default']['OPTIONS']['CLIENT_CLASS'] = 'django_redis.client.SentinelClient'
CACHES['default']['OPTIONS']['SENTINELS'] = CACHING_REDIS_SENTINELS
if CACHING_REDIS_SKIP_TLS_VERIFY:
CACHES['default']['OPTIONS']['CONNECTION_POOL_KWARGS']['ssl_cert_reqs'] = False
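As a quick, illustrative check (not part of this changeset) that the django-redis backend configured above is reachable, Django's standard cache API can be exercised from a shell:
```python
# Illustrative only; run via `python netbox/manage.py shell`.
from django.core.cache import cache

cache.set('healthcheck', 'ok', timeout=30)
assert cache.get('healthcheck') == 'ok'
```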
#
@@ -280,7 +295,6 @@ INSTALLED_APPS = [
'django.contrib.messages',
'django.contrib.staticfiles',
'django.contrib.humanize',
'cacheops',
'corsheaders',
'debug_toolbar',
'graphiql_debug_toolbar',
@@ -396,53 +410,6 @@ EXEMPT_EXCLUDE_MODELS = (
('users', 'objectpermission'),
)
#
# Caching
#
if CACHING_REDIS_USING_SENTINEL:
CACHEOPS_SENTINEL = {
'locations': CACHING_REDIS_SENTINELS,
'service_name': CACHING_REDIS_SENTINEL_SERVICE,
'db': CACHING_REDIS_DATABASE,
'password': CACHING_REDIS_PASSWORD,
}
else:
CACHEOPS_REDIS = {
'host': CACHING_REDIS_HOST,
'port': CACHING_REDIS_PORT,
'db': CACHING_REDIS_DATABASE,
'password': CACHING_REDIS_PASSWORD,
'ssl': CACHING_REDIS_SSL,
'ssl_cert_reqs': None if CACHING_REDIS_SKIP_TLS_VERIFY else 'required',
}
if not CACHE_TIMEOUT:
CACHEOPS_ENABLED = False
else:
CACHEOPS_ENABLED = True
CACHEOPS_DEFAULTS = {
'timeout': CACHE_TIMEOUT
}
CACHEOPS = {
'auth.user': {'ops': 'get', 'timeout': 60 * 15},
'auth.*': {'ops': ('fetch', 'get')},
'auth.permission': {'ops': 'all'},
'circuits.*': {'ops': 'all'},
'dcim.inventoryitem': None, # MPTT models are exempt due to raw SQL
'dcim.region': None, # MPTT models are exempt due to raw SQL
'dcim.location': None, # MPTT models are exempt due to raw SQL
'dcim.*': {'ops': 'all'},
'ipam.*': {'ops': 'all'},
'extras.*': {'ops': 'all'},
'users.*': {'ops': 'all'},
'tenancy.tenantgroup': None, # MPTT models are exempt due to raw SQL
'tenancy.*': {'ops': 'all'},
'virtualization.*': {'ops': 'all'},
}
CACHEOPS_DEGRADE_ON_FAILURE = True
#
# Django Prometheus
@@ -632,12 +599,3 @@ for plugin_name in PLUGINS:
plugin_middleware = plugin_config.middleware
if plugin_middleware and type(plugin_middleware) in (list, tuple):
MIDDLEWARE.extend(plugin_middleware)
# Apply cacheops config
if type(plugin_config.caching_config) is not dict:
raise ImproperlyConfigured(
"Plugin {} caching_config must be a dictionary.".format(plugin_name)
)
CACHEOPS.update({
"{}.{}".format(plugin_name, key): value for key, value in plugin_config.caching_config.items()
})

View File

@@ -3,8 +3,8 @@ from logging import ERROR
from unittest.mock import Mock, patch
import requests
from cacheops import CacheMiss, RedisCache
from django.conf import settings
from django.core.cache import cache
from django.test import SimpleTestCase, override_settings
from packaging.version import Version
from requests import Response
@@ -60,10 +60,8 @@ def unsuccessful_github_response(url, *_args, **_kwargs):
@override_settings(RELEASE_CHECK_URL='https://localhost/unittest/releases', RELEASE_CHECK_TIMEOUT=160876)
class GetReleasesTestCase(SimpleTestCase):
@patch.object(requests, 'get')
@patch.object(RedisCache, 'set')
@patch.object(RedisCache, 'get')
def test_pre_releases(self, dummy_cache_get: Mock, dummy_cache_set: Mock, dummy_request_get: Mock):
dummy_cache_get.side_effect = CacheMiss()
@patch.object(cache, 'set')
def test_pre_releases(self, dummy_cache_set: Mock, dummy_request_get: Mock):
dummy_request_get.side_effect = successful_github_response
releases = get_releases(pre_releases=True)
@@ -90,10 +88,8 @@
)
@patch.object(requests, 'get')
@patch.object(RedisCache, 'set')
@patch.object(RedisCache, 'get')
def test_no_pre_releases(self, dummy_cache_get: Mock, dummy_cache_set: Mock, dummy_request_get: Mock):
dummy_cache_get.side_effect = CacheMiss()
@patch.object(cache, 'set')
def test_no_pre_releases(self, dummy_cache_set: Mock, dummy_request_get: Mock):
dummy_request_get.side_effect = successful_github_response
releases = get_releases(pre_releases=False)
@@ -119,10 +115,7 @@
)
@patch.object(requests, 'get')
@patch.object(RedisCache, 'set')
@patch.object(RedisCache, 'get')
def test_failed_request(self, dummy_cache_get: Mock, dummy_cache_set: Mock, dummy_request_get: Mock):
dummy_cache_get.side_effect = CacheMiss()
def test_failed_request(self, dummy_request_get: Mock):
dummy_request_get.side_effect = unsuccessful_github_response
with self.assertLogs(level=ERROR) as cm:
@@ -143,28 +136,3 @@
headers={'Accept': 'application/vnd.github.v3+json'},
proxies=settings.HTTP_PROXIES
)
# Check if failure is put in cache
dummy_cache_set.assert_called_once_with(
'latest_release_no_retry',
'https://localhost/unittest/releases',
900
)
@patch.object(requests, 'get')
@patch.object(RedisCache, 'set')
@patch.object(RedisCache, 'get')
def test_blocked_retry(self, dummy_cache_get: Mock, dummy_cache_set: Mock, dummy_request_get: Mock):
dummy_cache_get.return_value = 'https://localhost/unittest/releases'
dummy_request_get.side_effect = successful_github_response
releases = get_releases()
# Check result
self.assertListEqual(releases, [])
# Check if request is NOT made
dummy_request_get.assert_not_called()
# Check if cache is not updated
dummy_cache_set.assert_not_called()

View File

@@ -1,8 +1,8 @@
import logging
import requests
from cacheops.simple import cache, CacheMiss
from django.conf import settings
from django.core.cache import cache
from django_rq import job
from packaging import version
@@ -18,16 +18,8 @@ def get_releases(pre_releases=False):
}
releases = []
# Check whether this URL has failed recently and shouldn't be retried yet
try:
if url == cache.get('latest_release_no_retry'):
logger.info("Skipping release check; URL failed recently: {}".format(url))
return []
except CacheMiss:
pass
try:
logger.debug("Fetching new releases from {}".format(url))
logger.info(f"Fetching new releases from {url}")
response = requests.get(url, headers=headers, proxies=settings.HTTP_PROXIES)
response.raise_for_status()
total_releases = len(response.json())
@@ -38,12 +30,10 @@
if not pre_releases and (release.get('devrelease') or release.get('prerelease')):
continue
releases.append((version.parse(release['tag_name']), release.get('html_url')))
logger.debug("Found {} releases; {} usable".format(total_releases, len(releases)))
logger.debug(f"Found {total_releases} releases; {len(releases)} usable")
except requests.exceptions.RequestException:
# The request failed. Set a flag in the cache to disable future checks to this URL for 15 minutes.
logger.exception("Error while fetching {}. Disabling checks for 15 minutes.".format(url))
cache.set('latest_release_no_retry', url, 900)
except requests.exceptions.RequestException as exc:
logger.exception(f"Error while fetching latest release from {url}: {exc}")
return []
# Cache the most recent release

View File

@@ -1,5 +1,4 @@
Django==3.2.5
django-cacheops==6.0
django-cors-headers==3.7.0
django-debug-toolbar==3.2.1
django-filter==2.4.0
@@ -7,6 +6,7 @@ django-graphiql-debug-toolbar==0.1.4
django-mptt==0.12.0
django-pglocks==1.0.4
django-prometheus==2.1.0
django-redis==5.0.0
django-rq==2.4.1
django-tables2==2.4.0
django-taggit==1.5.1