Merge branch 'develop' into 16315-cant_filter_changelog_by_object_type_no_results_found

Julio-Oliveira-Encora 2024-06-03 12:26:10 -03:00
commit 81c3e72e11
29 changed files with 5785 additions and 4936 deletions

View File

@@ -12,10 +12,10 @@ jobs:
   auto-assign:
     runs-on: ubuntu-latest
     steps:
-      - uses: pozil/auto-assign-issue@v1
+      - uses: pozil/auto-assign-issue@v2
         if: "contains(github.event.issue.labels.*.name, 'status: needs triage')"
         with:
           # Weighted assignments
-          assignees: arthanson:3, jeffgdotorg:3, jeremystretch:3, abhi1693, DanSheps
+          assignees: arthanson:3, jeffgdotorg:3, jeremystretch:3, DanSheps
           numOfAssignee: 1
           abortIfPreviousAssignees: true

View File

@@ -1,7 +1,18 @@
 name: CI
-on: [push, pull_request]
+on:
+  push:
+    paths-ignore:
+      - 'contrib/**'
+      - 'docs/**'
+  pull_request:
+    paths-ignore:
+      - 'contrib/**'
+      - 'docs/**'
 permissions:
   contents: read
 jobs:
   build:
     runs-on: ubuntu-latest
@@ -34,12 +45,12 @@ jobs:
         uses: actions/checkout@v4
       - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v4
+        uses: actions/setup-python@v5
         with:
           python-version: ${{ matrix.python-version }}
       - name: Use Node.js ${{ matrix.node-version }}
-        uses: actions/setup-node@v3
+        uses: actions/setup-node@v4
         with:
           node-version: ${{ matrix.node-version }}
@@ -47,7 +58,7 @@ jobs:
         run: npm install -g yarn
       - name: Setup Node.js with Yarn Caching
-        uses: actions/setup-node@v3
+        uses: actions/setup-node@v4
         with:
           node-version: ${{ matrix.node-version }}
           cache: yarn

View File

@@ -0,0 +1,45 @@
+name: Update translation strings
+
+on:
+  schedule:
+    - cron: '0 5 * * *'
+  workflow_dispatch:
+
+permissions:
+  contents: write
+
+env:
+  LOCALE: "en"
+
+jobs:
+  makemessages:
+    runs-on: ubuntu-latest
+    env:
+      NETBOX_CONFIGURATION: netbox.configuration_testing
+    steps:
+      - name: Check out repo
+        uses: actions/checkout@v4
+
+      - name: Set up Python
+        uses: actions/setup-python@v5
+        with:
+          python-version: 3.11
+
+      - name: Install system dependencies
+        run: sudo apt install -y gettext
+
+      - name: Install Python dependencies
+        run: |
+          python -m pip install --upgrade pip
+          pip install -r requirements.txt
+
+      - name: Run makemessages
+        run: python netbox/manage.py makemessages -l ${{ env.LOCALE }}
+
+      - name: Commit changes
+        uses: EndBug/add-and-commit@v9
+        with:
+          add: 'netbox/translations/'
+          default_author: github_actions
+          message: 'Update source translation strings'

View File

@@ -86,15 +86,7 @@ This will automatically update the schema file at `contrib/generated_schema.json`.
 
 ### Update & Compile Translations
 
-Log into [Transifex](https://app.transifex.com/netbox-community/netbox/dashboard/) to download the updated string maps. Download the resource (portable object, or `.po`) file for each language and save them to `netbox/translations/$lang/LC_MESSAGES/django.po`, overwriting the current files. (Be sure to click the **Download for use** link.)
-
-![Transifex download](../media/development/transifex_download.png)
-
-Once the resource files for all languages have been updated, compile the machine object (`.mo`) files using the `compilemessages` management command:
-
-```nohighlight
-./manage.py compilemessages
-```
+Updated language translations should be pulled from [Transifex](https://app.transifex.com/netbox-community/netbox/dashboard/) and re-compiled for each new release. Follow the documented process for [updating translated strings](./translations.md#updating-translated-strings) to do this.
 
 ### Update Version and Changelog

View File

@@ -6,17 +6,38 @@ All language translations in NetBox are generated from the source file found at
 
 Reviewers log into Transifex and navigate to their designated language(s) to translate strings. The initial translation for most strings will be machine-generated via the AWS Translate service. Human reviewers are responsible for reviewing these translations and making corrections where necessary.
 
 Immediately prior to each NetBox release, the translation maps for all completed languages will be downloaded from Transifex, compiled, and checked into the NetBox code base by a maintainer.
 
 ## Updating Translation Sources
 
-To update the English `.po` file from which all translations are derived, use the `makemessages` management command:
+To update the English `.po` file from which all translations are derived, use the `makemessages` management command (ignoring the `project-static/` directory):
 
 ```nohighlight
-./manage.py makemessages -l en
+./manage.py makemessages -l en -i "project-static/*"
 ```
 
-Then, commit the change and push to the `develop` branch on GitHub. After some time, any new strings will appear for translation on Transifex automatically.
+Then, commit the change and push to the `develop` branch on GitHub. Any new strings will appear for translation on Transifex automatically.
 
 ## Updating Translated Strings
 
+Typically, translated strings need to be updated only as part of the NetBox [release process](./release-checklist.md).
+
+To update translated strings, start by initiating a sync from Transifex. From the Transifex dashboard, navigate to Settings > Integrations > GitHub > Manage, and click the **Manual Sync** button at top right.
+
+![Transifex manual sync](../media/development/transifex_sync.png)
+
+Enter a threshold percentage of 1 (to ensure all translations are captured) and select the `develop` branch, then click **Sync**. This will initiate a pull request to GitHub to update any newly modified translation (`.po`) files.
+
+!!! tip
+    The new PR should appear within a few minutes. If it does not, check that there are in fact new translations to be added.
+
+![Transifex pull request](../media/development/transifex_pull_request.png)
+
+Once the PR has been merged, the updated strings need to be compiled into new `.mo` files so they can be used by the application. Update the `develop` branch locally to pull in the changes from the Transifex PR, then run Django's [`compilemessages`](https://docs.djangoproject.com/en/stable/ref/django-admin/#django-admin-compilemessages) management command:
+
+```nohighlight
+./manage.py compilemessages
+```
+
+Once any new `.mo` files have been generated, they need to be committed and pushed back up to GitHub. (Again, this is typically done as part of publishing a new NetBox release.)
+
 ## Proposing New Languages

Binary file not shown. (Before: 54 KiB)

Binary file not shown. (After: 108 KiB)

Binary file not shown. (After: 42 KiB)

View File

@@ -2,6 +2,18 @@
 ## v4.0.4 (FUTURE)
 
 ### Enhancements
 
+* [#14810](https://github.com/netbox-community/netbox/issues/14810) - Enable contact assignment for services
+* [#15489](https://github.com/netbox-community/netbox/issues/15489) - Add 1000Base-TX interface type
+* [#16290](https://github.com/netbox-community/netbox/issues/16290) - Capture entire object in changelog data (but continue to display only non-internal attributes)
+
+### Bug Fixes
+
+* [#13422](https://github.com/netbox-community/netbox/issues/13422) - Rebuild MPTT trees for applicable models after merging staged changes
+* [#16202](https://github.com/netbox-community/netbox/issues/16202) - Fix site map button URL for certain localizations
+* [#16286](https://github.com/netbox-community/netbox/issues/16286) - Fix global search support for provider accounts
+
 ---
 
 ## v4.0.3 (2024-05-22)

View File

@@ -43,14 +43,6 @@ MODULEBAY_STATUS = """
 """
 
-def get_cabletermination_row_class(record):
-    if record.mark_connected:
-        return 'success'
-    elif record.cable:
-        return record.cable.get_status_color()
-    return ''
-
 #
 # Device roles
 #
@@ -339,6 +331,14 @@ class CableTerminationTable(NetBoxTable):
         verbose_name=_('Mark Connected'),
     )
 
+    class Meta:
+        row_attrs = {
+            'data-name': lambda record: record.name,
+            'data-mark-connected': lambda record: "true" if record.mark_connected else "false",
+            'data-cable-status': lambda record: record.cable.status if record.cable else "",
+            'data-type': lambda record: record.type
+        }
+
     def value_link_peer(self, value):
         return ', '.join([
             f"{termination.parent_object} > {termination}" for termination in value
@@ -386,16 +386,13 @@ class DeviceConsolePortTable(ConsolePortTable):
         extra_buttons=CONSOLEPORT_BUTTONS
     )
 
-    class Meta(DeviceComponentTable.Meta):
+    class Meta(CableTerminationTable.Meta, DeviceComponentTable.Meta):
         model = models.ConsolePort
         fields = (
             'pk', 'id', 'name', 'module_bay', 'module', 'label', 'type', 'speed', 'description', 'mark_connected',
             'cable', 'cable_color', 'link_peer', 'connection', 'tags', 'actions'
         )
         default_columns = ('pk', 'name', 'label', 'type', 'speed', 'description', 'cable', 'connection')
-        row_attrs = {
-            'class': get_cabletermination_row_class
-        }
 
 
 class ConsoleServerPortTable(ModularDeviceComponentTable, PathEndpointTable):
@@ -431,16 +428,13 @@ class DeviceConsoleServerPortTable(ConsoleServerPortTable):
         extra_buttons=CONSOLESERVERPORT_BUTTONS
     )
 
-    class Meta(DeviceComponentTable.Meta):
+    class Meta(CableTerminationTable.Meta, DeviceComponentTable.Meta):
         model = models.ConsoleServerPort
         fields = (
             'pk', 'id', 'name', 'module_bay', 'module', 'label', 'type', 'speed', 'description', 'mark_connected',
             'cable', 'cable_color', 'link_peer', 'connection', 'tags', 'actions',
         )
         default_columns = ('pk', 'name', 'label', 'type', 'speed', 'description', 'cable', 'connection')
-        row_attrs = {
-            'class': get_cabletermination_row_class
-        }
 
 
 class PowerPortTable(ModularDeviceComponentTable, PathEndpointTable):
@@ -483,7 +477,7 @@ class DevicePowerPortTable(PowerPortTable):
         extra_buttons=POWERPORT_BUTTONS
     )
 
-    class Meta(DeviceComponentTable.Meta):
+    class Meta(CableTerminationTable.Meta, DeviceComponentTable.Meta):
         model = models.PowerPort
         fields = (
             'pk', 'id', 'name', 'module_bay', 'module', 'label', 'type', 'maximum_draw', 'allocated_draw',
@@ -492,9 +486,6 @@ class DevicePowerPortTable(PowerPortTable):
         default_columns = (
             'pk', 'name', 'label', 'type', 'maximum_draw', 'allocated_draw', 'description', 'cable', 'connection',
         )
-        row_attrs = {
-            'class': get_cabletermination_row_class
-        }
 
 
 class PowerOutletTable(ModularDeviceComponentTable, PathEndpointTable):
@@ -534,7 +525,7 @@ class DevicePowerOutletTable(PowerOutletTable):
         extra_buttons=POWEROUTLET_BUTTONS
    )
 
-    class Meta(DeviceComponentTable.Meta):
+    class Meta(CableTerminationTable.Meta, DeviceComponentTable.Meta):
         model = models.PowerOutlet
         fields = (
             'pk', 'id', 'name', 'module_bay', 'module', 'label', 'type', 'power_port', 'feed_leg', 'description',
@@ -543,9 +534,6 @@ class DevicePowerOutletTable(PowerOutletTable):
         default_columns = (
             'pk', 'name', 'label', 'type', 'power_port', 'feed_leg', 'description', 'cable', 'connection',
         )
-        row_attrs = {
-            'class': get_cabletermination_row_class
-        }
 
 
 class BaseInterfaceTable(NetBoxTable):
@@ -733,7 +721,7 @@ class DeviceFrontPortTable(FrontPortTable):
         extra_buttons=FRONTPORT_BUTTONS
     )
 
-    class Meta(DeviceComponentTable.Meta):
+    class Meta(CableTerminationTable.Meta, DeviceComponentTable.Meta):
         model = models.FrontPort
         fields = (
             'pk', 'id', 'name', 'module_bay', 'module', 'label', 'type', 'rear_port', 'rear_port_position',
@@ -742,9 +730,6 @@ class DeviceFrontPortTable(FrontPortTable):
         default_columns = (
             'pk', 'name', 'label', 'type', 'rear_port', 'rear_port_position', 'description', 'cable', 'link_peer',
        )
-        row_attrs = {
-            'class': get_cabletermination_row_class
-        }
 
 
 class RearPortTable(ModularDeviceComponentTable, CableTerminationTable):
@@ -783,7 +768,7 @@ class DeviceRearPortTable(RearPortTable):
         extra_buttons=REARPORT_BUTTONS
     )
 
-    class Meta(DeviceComponentTable.Meta):
+    class Meta(CableTerminationTable.Meta, DeviceComponentTable.Meta):
         model = models.RearPort
         fields = (
             'pk', 'id', 'name', 'module_bay', 'module', 'label', 'type', 'positions', 'description', 'mark_connected',
@@ -792,9 +777,6 @@ class DeviceRearPortTable(RearPortTable):
         default_columns = (
             'pk', 'name', 'label', 'type', 'positions', 'description', 'cable', 'link_peer',
         )
-        row_attrs = {
-            'class': get_cabletermination_row_class
-        }
 
 
 class DeviceBayTable(DeviceComponentTable):
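
The row-class callable removed above is superseded by data attributes declared once on `CableTerminationTable.Meta`, which every device component table then mixes into its own `Meta` (presumably so row styling and filtering can be driven client-side from the `data-*` attributes). The sharing is ordinary Python attribute lookup through the MRO; a minimal standalone sketch of that pattern, with illustrative class names that are not NetBox's:

```python
# Sketch: sharing row_attrs across tables via Meta multiple inheritance.
class CableTerminationMeta:
    # Declared once; django-tables2 consumes row_attrs when rendering <tr> elements
    row_attrs = {
        'data-mark-connected': lambda record: "true" if record.mark_connected else "false",
    }

class DeviceComponentMeta:
    attrs = {'class': 'table table-hover'}

class Meta(CableTerminationMeta, DeviceComponentMeta):
    # The first base supplies row_attrs; the second still supplies attrs
    pass

assert 'data-mark-connected' in Meta.row_attrs
assert Meta.attrs == {'class': 'table table-hover'}
```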

View File

@@ -30,6 +30,16 @@ class ObjectChangeSerializer(BaseModelSerializer):
     changed_object = serializers.SerializerMethodField(
         read_only=True
     )
+    prechange_data = serializers.JSONField(
+        source='prechange_data_clean',
+        read_only=True,
+        allow_null=True
+    )
+    postchange_data = serializers.JSONField(
+        source='postchange_data_clean',
+        read_only=True,
+        allow_null=True
+    )
 
     class Meta:
         model = ObjectChange
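
The two fields added above redirect the API representation to the cleaned properties while the raw database columns keep everything. A standalone DRF sketch of the `source=` redirection; the `FakeChange` class and its data are invented for illustration:

```python
import django
from django.conf import settings

settings.configure()  # minimal standalone setup, not NetBox's configuration
django.setup()

from rest_framework import serializers

class FakeChange:
    """Stand-in for ObjectChange: raw data plus a cleaned view of it."""
    prechange_data = {'name': 'Site 1', '_name': 'site 00000001'}

    @property
    def prechange_data_clean(self):
        # Drop private (underscore-prefixed) attributes for display
        return {k: v for k, v in self.prechange_data.items() if not k.startswith('_')}

class FakeChangeSerializer(serializers.Serializer):
    # `source` points the API field at the cleaned property, not the raw attribute
    prechange_data = serializers.JSONField(source='prechange_data_clean', read_only=True, allow_null=True)

print(FakeChangeSerializer(FakeChange()).data)  # {'prechange_data': {'name': 'Site 1'}}
```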

View File

@@ -13,13 +13,14 @@ def event_tracking(request):
 
     :param request: WSGIRequest object with a unique `id` set
     """
     current_request.set(request)
-    events_queue.set([])
+    events_queue.set({})
 
     yield
 
     # Flush queued webhooks to RQ
-    flush_events(events_queue.get())
+    if events := list(events_queue.get().values()):
+        flush_events(events)
 
     # Clear context vars
     current_request.set(None)
-    events_queue.set([])
+    events_queue.set({})
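
The context manager guarantees each request gets a fresh dict-backed queue and that it is flushed exactly once on the way out. A self-contained sketch of the same shape using `contextlib` and `ContextVar`, with simplified stand-in names:

```python
from contextlib import contextmanager
from contextvars import ContextVar

events_queue = ContextVar('events_queue', default=dict())

def flush_events(events):
    print(f'flushing {len(events)} event(s)')  # stand-in for the events pipeline

@contextmanager
def event_tracking():
    events_queue.set({})          # fresh, request-local queue
    yield
    if events := list(events_queue.get().values()):
        flush_events(events)      # flush once, only if something was queued
    events_queue.set({})          # clear for the next request

with event_tracking():
    queue = events_queue.get()
    queue['dcim.site:1'] = {'event': 'created'}
    queue['dcim.site:1'] = {'event': 'updated'}  # same key: overwritten, not duplicated
# -> flushing 1 event(s)
```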

View File

@@ -265,6 +265,7 @@ class ObjectListWidget(DashboardWidget):
             parameters = self.config.get('url_params') or {}
             if page_size := self.config.get('page_size'):
                 parameters['per_page'] = page_size
+            parameters['embedded'] = True
 
             if parameters:
                 try:

View File

@@ -58,7 +58,13 @@ def enqueue_object(queue, instance, user, request_id, action):
     if model_name not in registry['model_features']['event_rules'].get(app_label, []):
         return
 
-    queue.append({
+    assert instance.pk is not None
+    key = f'{app_label}.{model_name}:{instance.pk}'
+    if key in queue:
+        queue[key]['data'] = serialize_for_event(instance)
+        queue[key]['snapshots']['postchange'] = get_snapshots(instance, action)['postchange']
+    else:
+        queue[key] = {
@@ -66,7 +72,7 @@ def enqueue_object(queue, instance, user, request_id, action):
         'snapshots': get_snapshots(instance, action),
         'username': user.username,
         'request_id': request_id
-    })
+    }
 
 
 def process_event_rules(event_rules, model_name, event, data, username=None, snapshots=None, request_id=None):
@@ -163,14 +169,14 @@ def process_event_queue(events):
     )
 
 
-def flush_events(queue):
+def flush_events(events):
     """
-    Flush a list of object representation to RQ for webhook processing.
+    Flush a list of object representations to RQ for event processing.
     """
-    if queue:
+    if events:
         for name in settings.EVENTS_PIPELINE:
             try:
                 func = import_string(name)
-                func(queue)
+                func(events)
             except Exception as e:
                 logger.error(_("Cannot import events pipeline {name} error: {error}").format(name=name, error=e))
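
Keying the queue by `app_label.model_name:pk` is what collapses repeated saves of one object within a single request into a single event: a second `enqueue_object()` call finds the key and refreshes the payload in place rather than appending. A stripped-down sketch of the pattern, with a stand-in `serialize()` rather than NetBox's `serialize_for_event()`:

```python
def serialize(obj):
    # Stand-in for serialize_for_event()
    return {'name': obj['name']}

def enqueue(queue, app_label, model_name, obj):
    key = f"{app_label}.{model_name}:{obj['pk']}"
    if key in queue:
        # Already queued during this request: update the payload in place
        queue[key]['data'] = serialize(obj)
    else:
        queue[key] = {'object_id': obj['pk'], 'data': serialize(obj)}

queue = {}
site = {'pk': 1, 'name': 'Site 1'}
enqueue(queue, 'dcim', 'site', site)
site['name'] = 'Site 1 (renamed)'
enqueue(queue, 'dcim', 'site', site)  # same key: refreshed, not duplicated

assert len(queue) == 1
assert queue['dcim.site:1']['data']['name'] == 'Site 1 (renamed)'
```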

View File

@ -1,12 +1,17 @@
from functools import cached_property
from django.conf import settings
from django.contrib.contenttypes.fields import GenericForeignKey
from django.core.exceptions import ValidationError
from django.db import models
from django.urls import reverse
from django.utils.translation import gettext_lazy as _
from mptt.models import MPTTModel
from core.models import ObjectType
from extras.choices import *
from netbox.models.features import ChangeLoggingMixin
from utilities.data import shallow_compare_dict
from ..querysets import ObjectChangeQuerySet
__all__ = (
@ -136,6 +141,71 @@ class ObjectChange(models.Model):
def get_action_color(self):
return ObjectChangeActionChoices.colors.get(self.action)
@property
@cached_property
def has_changes(self):
return self.prechange_data != self.postchange_data
@cached_property
def diff_exclude_fields(self):
"""
Return a set of attributes which should be ignored when calculating a diff
between the pre- and post-change data. (For instance, it would not make
sense to compare the "last updated" times as these are expected to differ.)
"""
model = self.changed_object_type.model_class()
attrs = set()
# Exclude auto-populated change tracking fields
if issubclass(model, ChangeLoggingMixin):
attrs.update({'created', 'last_updated'})
# Exclude MPTT-internal fields
if issubclass(model, MPTTModel):
attrs.update({'level', 'lft', 'rght', 'tree_id'})
return attrs
def get_clean_data(self, prefix):
"""
Return only the pre-/post-change attributes which are relevant for calculating a diff.
"""
ret = {}
change_data = getattr(self, f'{prefix}_data') or {}
for k, v in change_data.items():
if k not in self.diff_exclude_fields and not k.startswith('_'):
ret[k] = v
return ret
@cached_property
def prechange_data_clean(self):
return self.get_clean_data('prechange')
@cached_property
def postchange_data_clean(self):
return self.get_clean_data('postchange')
def diff(self):
"""
Return a dictionary of pre- and post-change values for attributes which have changed.
"""
prechange_data = self.prechange_data_clean
postchange_data = self.postchange_data_clean
# Determine which attributes have changed
if self.action == ObjectChangeActionChoices.ACTION_CREATE:
changed_attrs = sorted(postchange_data.keys())
elif self.action == ObjectChangeActionChoices.ACTION_DELETE:
changed_attrs = sorted(prechange_data.keys())
else:
# TODO: Support deep (recursive) comparison
changed_data = shallow_compare_dict(prechange_data, postchange_data)
changed_attrs = sorted(changed_data.keys())
return {
'pre': {
k: prechange_data.get(k) for k in changed_attrs
},
'post': {
k: postchange_data.get(k) for k in changed_attrs
},
}
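
With `get_clean_data()` filtering at read time, `diff()` can compare raw records that still carry internal attributes. A plain-Python sketch of the resulting contract for an update, with made-up field values:

```python
def clean(data, exclude=('created', 'last_updated')):
    # Mirrors get_clean_data(): drop excluded and underscore-prefixed keys
    return {k: v for k, v in data.items() if k not in exclude and not k.startswith('_')}

pre = {'name': 'Site 1', '_name': 'site 00000001', 'last_updated': '2024-06-01'}
post = {'name': 'Site 2', '_name': 'site 00000002', 'last_updated': '2024-06-03'}

pre_clean, post_clean = clean(pre), clean(post)
changed = sorted(k for k in post_clean if pre_clean.get(k) != post_clean.get(k))

diff = {
    'pre': {k: pre_clean.get(k) for k in changed},
    'post': {k: post_clean.get(k) for k in changed},
}
assert diff == {'pre': {'name': 'Site 1'}, 'post': {'name': 'Site 2'}}
```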

View File

@@ -4,6 +4,7 @@ from django.contrib.auth import get_user_model
 from django.contrib.contenttypes.fields import GenericForeignKey
 from django.db import models, transaction
 from django.utils.translation import gettext_lazy as _
+from mptt.models import MPTTModel
 
 from extras.choices import ChangeActionChoices
 from netbox.models import ChangeLoggedModel
@@ -124,6 +125,11 @@ class StagedChange(CustomValidationMixin, EventRulesMixin, models.Model):
             instance = self.model.objects.get(pk=self.object_id)
             logger.info(f'Deleting {self.model._meta.verbose_name} {instance}')
             instance.delete()
+
+        # Rebuild the MPTT tree where applicable
+        if issubclass(self.model, MPTTModel):
+            self.model.objects.rebuild()
+
     apply.alters_data = True
 
     def get_action_color(self):

View File

@@ -55,18 +55,6 @@ def run_validators(instance, validators):
 clear_events = Signal()
 
 
-def is_same_object(instance, webhook_data, request_id):
-    """
-    Compare the given instance to the most recent queued webhook object, returning True
-    if they match. This check is used to avoid creating duplicate webhook entries.
-    """
-    return (
-        ContentType.objects.get_for_model(instance) == webhook_data['content_type'] and
-        instance.pk == webhook_data['object_id'] and
-        request_id == webhook_data['request_id']
-    )
-
-
 @receiver((post_save, m2m_changed))
 def handle_changed_object(sender, instance, **kwargs):
     """
@@ -112,13 +100,12 @@ def handle_changed_object(sender, instance, **kwargs):
         objectchange.request_id = request.id
         objectchange.save()
 
-    # If this is an M2M change, update the previously queued webhook (from post_save)
+    # Ensure that we're working with fresh M2M assignments
+    if m2m_changed:
+        instance.refresh_from_db()
+
+    # Enqueue the object for event processing
     queue = events_queue.get()
-    if m2m_changed and queue and is_same_object(instance, queue[-1], request.id):
-        instance.refresh_from_db()  # Ensure that we're working with fresh M2M assignments
-        queue[-1]['data'] = serialize_for_event(instance)
-        queue[-1]['snapshots']['postchange'] = get_snapshots(instance, action)['postchange']
-    else:
-        enqueue_object(queue, instance, request.user, request.id, action)
+    enqueue_object(queue, instance, request.user, request.id, action)
     events_queue.set(queue)
@@ -179,7 +166,7 @@ def handle_deleted_object(sender, instance, **kwargs):
             obj.snapshot()  # Ensure the change record includes the "before" state
             getattr(obj, related_field_name).remove(instance)
 
-    # Enqueue webhooks
+    # Enqueue the object for event processing
     queue = events_queue.get()
     enqueue_object(queue, instance, request.user, request.id, ObjectChangeActionChoices.ACTION_DELETE)
     events_queue.set(queue)
@@ -195,7 +182,7 @@ def clear_events_queue(sender, **kwargs):
     """
     logger = logging.getLogger('events')
     logger.info(f"Clearing {len(events_queue.get())} queued events ({sender})")
-    events_queue.set([])
+    events_queue.set({})
#

View File

@@ -75,6 +75,10 @@ class ChangeLogViewTest(ModelViewTestCase):
         self.assertEqual(oc.postchange_data['custom_fields']['cf2'], form_data['cf_cf2'])
         self.assertEqual(oc.postchange_data['tags'], ['Tag 1', 'Tag 2'])
 
+        # Check that private attributes were included in raw data but not display data
+        self.assertIn('_name', oc.postchange_data)
+        self.assertNotIn('_name', oc.postchange_data_clean)
+
     def test_update_object(self):
         site = Site(name='Site 1', slug='site-1')
         site.save()
@@ -112,6 +116,12 @@ class ChangeLogViewTest(ModelViewTestCase):
         self.assertEqual(oc.postchange_data['custom_fields']['cf2'], form_data['cf_cf2'])
         self.assertEqual(oc.postchange_data['tags'], ['Tag 3'])
 
+        # Check that private attributes were included in raw data but not display data
+        self.assertIn('_name', oc.prechange_data)
+        self.assertNotIn('_name', oc.prechange_data_clean)
+        self.assertIn('_name', oc.postchange_data)
+        self.assertNotIn('_name', oc.postchange_data_clean)
+
     def test_delete_object(self):
         site = Site(
             name='Site 1',
@@ -142,6 +152,10 @@ class ChangeLogViewTest(ModelViewTestCase):
         self.assertEqual(oc.prechange_data['tags'], ['Tag 1', 'Tag 2'])
         self.assertEqual(oc.postchange_data, None)
 
+        # Check that private attributes were included in raw data but not display data
+        self.assertIn('_name', oc.prechange_data)
+        self.assertNotIn('_name', oc.prechange_data_clean)
+
     def test_bulk_update_objects(self):
         sites = (
             Site(name='Site 1', slug='site-1', status=SiteStatusChoices.STATUS_ACTIVE),
@@ -338,6 +352,10 @@ class ChangeLogAPITest(APITestCase):
         self.assertEqual(oc.postchange_data['custom_fields'], data['custom_fields'])
         self.assertEqual(oc.postchange_data['tags'], ['Tag 1', 'Tag 2'])
 
+        # Check that private attributes were included in raw data but not display data
+        self.assertIn('_name', oc.postchange_data)
+        self.assertNotIn('_name', oc.postchange_data_clean)
+
     def test_update_object(self):
         site = Site(name='Site 1', slug='site-1')
         site.save()
@@ -370,6 +388,12 @@ class ChangeLogAPITest(APITestCase):
         self.assertEqual(oc.postchange_data['custom_fields'], data['custom_fields'])
         self.assertEqual(oc.postchange_data['tags'], ['Tag 3'])
 
+        # Check that private attributes were included in raw data but not display data
+        self.assertIn('_name', oc.prechange_data)
+        self.assertNotIn('_name', oc.prechange_data_clean)
+        self.assertIn('_name', oc.postchange_data)
+        self.assertNotIn('_name', oc.postchange_data_clean)
+
     def test_delete_object(self):
         site = Site(
             name='Site 1',
@@ -398,6 +422,10 @@ class ChangeLogAPITest(APITestCase):
         self.assertEqual(oc.prechange_data['tags'], ['Tag 1', 'Tag 2'])
         self.assertEqual(oc.postchange_data, None)
 
+        # Check that private attributes were included in raw data but not display data
+        self.assertIn('_name', oc.prechange_data)
+        self.assertNotIn('_name', oc.prechange_data_clean)
+
     def test_bulk_create_objects(self):
         data = (
             {

View File

@@ -4,6 +4,7 @@ from unittest.mock import patch
 
 import django_rq
 from django.http import HttpResponse
 from django.test import RequestFactory
+from django.urls import reverse
 
 from requests import Session
 from rest_framework import status
@@ -12,6 +13,7 @@ from core.models import ObjectType
 from dcim.choices import SiteStatusChoices
 from dcim.models import Site
 from extras.choices import EventRuleActionChoices, ObjectChangeActionChoices
+from extras.context_managers import event_tracking
 from extras.events import enqueue_object, flush_events, serialize_for_event
 from extras.models import EventRule, Tag, Webhook
 from extras.webhooks import generate_signature, send_webhook
@@ -360,7 +362,7 @@ class EventRuleTest(APITestCase):
             return HttpResponse()
 
         # Enqueue a webhook for processing
-        webhooks_queue = []
+        webhooks_queue = {}
         site = Site.objects.create(name='Site 1', slug='site-1')
         enqueue_object(
             webhooks_queue,
@@ -369,7 +371,7 @@ class EventRuleTest(APITestCase):
             request_id=request_id,
             action=ObjectChangeActionChoices.ACTION_CREATE
         )
-        flush_events(webhooks_queue)
+        flush_events(list(webhooks_queue.values()))
 
         # Retrieve the job from queue
         job = self.queue.jobs[0]
@@ -377,3 +379,24 @@ class EventRuleTest(APITestCase):
 
         # Patch the Session object with our dummy_send() method, then process the webhook for sending
         with patch.object(Session, 'send', dummy_send) as mock_send:
             send_webhook(**job.kwargs)
+
+    def test_duplicate_triggers(self):
+        """
+        Test for erroneous duplicate event triggers resulting from saving an object multiple times
+        within the span of a single request.
+        """
+        url = reverse('dcim:site_add')
+
+        request = RequestFactory().get(url)
+        request.id = uuid.uuid4()
+        request.user = self.user
+
+        self.assertEqual(self.queue.count, 0, msg="Unexpected jobs found in queue")
+
+        with event_tracking(request):
+            site = Site(name='Site 1', slug='site-1')
+            site.save()
+
+            # Save the site a second time
+            site.save()
+
+        self.assertEqual(self.queue.count, 1, msg="Duplicate jobs found in queue")

View File

@@ -723,15 +723,15 @@ class ObjectChangeView(generic.ObjectView):
 
         if not instance.prechange_data and instance.action in ['update', 'delete'] and prev_change:
             non_atomic_change = True
-            prechange_data = prev_change.postchange_data
+            prechange_data = prev_change.postchange_data_clean
         else:
             non_atomic_change = False
-            prechange_data = instance.prechange_data
+            prechange_data = instance.prechange_data_clean
 
         if prechange_data and instance.postchange_data:
             diff_added = shallow_compare_dict(
                 prechange_data or dict(),
-                instance.postchange_data or dict(),
+                instance.postchange_data_clean or dict(),
                 exclude=['last_updated'],
             )
             diff_removed = {

View File

@@ -7,4 +7,4 @@ __all__ = (
 
 current_request = ContextVar('current_request', default=None)
-events_queue = ContextVar('events_queue', default=[])
+events_queue = ContextVar('events_queue', default=dict())

Binary file not shown.

View File

@@ -1,7 +1,7 @@
 // Serialized data from change records
 pre.change-data {
-  padding-right: 0;
-  padding-left: 0;
+  border-radius: 0;
+  padding: 0;
 
   // Display each line individually for highlighting
   > span {

View File

@@ -5,6 +5,7 @@
 {% load helpers %}
 {% load plugins %}
 {% load i18n %}
+{% load l10n %}
 {% load mptt %}
 
 {% block content %}
@@ -63,7 +64,7 @@
           {% if object.latitude and object.longitude %}
             {% if config.MAPS_URL %}
               <div class="position-absolute top-50 end-0 translate-middle-y d-print-none">
-                <a href="{{ config.MAPS_URL }}{{ object.latitude }},{{ object.longitude }}" target="_blank" class="btn btn-primary">
+                <a href="{{ config.MAPS_URL }}{{ object.latitude|unlocalize }},{{ object.longitude|unlocalize }}" target="_blank" class="btn btn-primary">
                   <i class="mdi mdi-map-marker"></i> {% trans "Map It" %}
                 </a>
               </div>
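
The `unlocalize` filter is the substance of the #16202 fix: in locales that use a decimal comma, localized rendering of the latitude and longitude corrupts the query string handed to the maps provider. A standalone sketch of the behavior (the locale, settings, and coordinates here are arbitrary):

```python
import django
from django.conf import settings

settings.configure(
    LANGUAGE_CODE='de',  # a comma-decimal locale
    TEMPLATES=[{'BACKEND': 'django.template.backends.django.DjangoTemplates'}],
)
django.setup()

from django.template import Context, Template
from django.utils import translation

translation.activate('de')
tpl = Template('{% load l10n %}{{ lat }},{{ lng }} | {{ lat|unlocalize }},{{ lng|unlocalize }}')
print(tpl.render(Context({'lat': 52.52, 'lng': 13.405})))
# -> 52,52,13,405 | 52.52,13.405  (only the unlocalized pair is a usable URL fragment)
```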

View File

@@ -3,6 +3,7 @@
 {% load plugins %}
 {% load tz %}
 {% load i18n %}
+{% load l10n %}
 {% load mptt %}
 
 {% block breadcrumbs %}
@@ -95,7 +96,7 @@
           {% if object.latitude and object.longitude %}
             {% if config.MAPS_URL %}
              <div class="position-absolute top-50 end-0 translate-middle-y d-print-none">
-                <a href="{{ config.MAPS_URL }}{{ object.latitude }},{{ object.longitude }}" target="_blank" class="btn btn-primary">
+                <a href="{{ config.MAPS_URL }}{{ object.latitude|unlocalize }},{{ object.longitude|unlocalize }}" target="_blank" class="btn btn-primary">
                   <i class="mdi mdi-map-marker"></i> {% trans "Map It" %}
                 </a>
               </div>

View File

@@ -112,7 +112,7 @@
           {% if object.prechange_data %}
             {% spaceless %}
               <pre class="change-data">
-                {% for k, v in object.prechange_data.items %}
+                {% for k, v in object.prechange_data_clean.items %}
                   <span{% if k in diff_removed %} class="removed"{% endif %}>{{ k }}: {{ v|json }}</span>
                 {% endfor %}
               </pre>
@@ -132,7 +132,7 @@
           {% if object.postchange_data %}
             {% spaceless %}
               <pre class="change-data">
-                {% for k, v in object.postchange_data.items %}
+                {% for k, v in object.postchange_data_clean.items %}
                   <span{% if k in diff_added %} class="added"{% endif %}>{{ k }}: {{ v|json }}</span>
                 {% endfor %}
               </pre>

File diff suppressed because it is too large.

View File

@@ -2,7 +2,6 @@ import json
 
 from django.contrib.contenttypes.models import ContentType
 from django.core import serializers
-from mptt.models import MPTTModel
 
 from extras.utils import is_taggable
@@ -16,8 +15,7 @@ def serialize_object(obj, resolve_tags=True, extra=None, exclude=None):
     """
     Return a generic JSON representation of an object using Django's built-in serializer. (This is used for things like
     change logging, not the REST API.) Optionally include a dictionary to supplement the object data. A list of keys
-    can be provided to exclude them from the returned dictionary. Private fields (prefaced with an underscore) are
-    implicitly excluded.
+    can be provided to exclude them from the returned dictionary.
 
     Args:
         obj: The object to serialize
@@ -30,11 +28,6 @@ def serialize_object(obj, resolve_tags=True, extra=None, exclude=None):
     data = json.loads(json_str)[0]['fields']
     exclude = exclude or []
 
-    # Exclude any MPTTModel fields
-    if issubclass(obj.__class__, MPTTModel):
-        for field in ['level', 'lft', 'rght', 'tree_id']:
-            data.pop(field)
-
     # Include custom_field_data as "custom_fields"
     if hasattr(obj, 'custom_field_data'):
         data['custom_fields'] = data.pop('custom_field_data')
@@ -45,9 +38,9 @@ def serialize_object(obj, resolve_tags=True, extra=None, exclude=None):
         tags = getattr(obj, '_tags', None) or obj.tags.all()
         data['tags'] = sorted([tag.name for tag in tags])
 
-    # Skip excluded and private (prefixed with an underscore) attributes
+    # Skip any excluded attributes
     for key in list(data.keys()):
-        if key in exclude or (isinstance(key, str) and key.startswith('_')):
+        if key in exclude:
             data.pop(key)
 
     # Append any extra data
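
The removal above inverts the old contract: `serialize_object()` now records private and MPTT bookkeeping fields in the stored change data, and the filtering happens later, in `ObjectChange.get_clean_data()`. A plain-Python sketch of the before/after behavior, with invented data:

```python
MPTT_FIELDS = ('level', 'lft', 'rght', 'tree_id')

record = {
    'name': 'Region 1',
    '_name': 'region 00000001',  # naturalized sort key
    'level': 0, 'lft': 1, 'rght': 2, 'tree_id': 3,  # MPTT internals
}

def old_serialize(data):
    # Previously: stripped at write time, so the stored record lost these fields
    return {k: v for k, v in data.items() if not k.startswith('_') and k not in MPTT_FIELDS}

def new_clean(data):
    # Now: store everything; filter only when displaying or diffing
    return {k: v for k, v in data.items() if not k.startswith('_') and k not in MPTT_FIELDS}

stored = dict(record)  # full capture, per #16290
assert new_clean(stored) == old_serialize(record)  # display output is unchanged
assert '_name' in stored                           # but the raw record keeps private fields
```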

View File

@@ -173,6 +173,8 @@ class VirtualMachineVMInterfaceTable(VMInterfaceTable):
         default_columns = ('pk', 'name', 'enabled', 'mac_address', 'mtu', 'mode', 'description', 'ip_addresses')
         row_attrs = {
             'data-name': lambda record: record.name,
+            'data-virtual': lambda record: "true",
+            'data-enabled': lambda record: "true" if record.enabled else "false",
         }