Compare commits


No commits in common. "main" and "v4.3.2" have entirely different histories.
main ... v4.3.2

127 changed files with 4636 additions and 6908 deletions

View File

@ -15,7 +15,7 @@ body:
attributes:
label: NetBox version
description: What version of NetBox are you currently running?
placeholder: v4.3.4
placeholder: v4.3.2
validations:
required: true
- type: dropdown

View File

@ -27,7 +27,7 @@ body:
attributes:
label: NetBox Version
description: What version of NetBox are you currently running?
placeholder: v4.3.4
placeholder: v4.3.2
validations:
required: true
- type: dropdown

View File

@ -8,7 +8,7 @@
</h3>
<h3>
:jigsaw: <a href="#jigsaw-creating-plugins">Create a plugin</a> &middot;
:briefcase: <a href="#briefcase-looking-for-a-job">Work with us!</a> &middot;
:rescue_worker_helmet: <a href="#rescue_worker_helmet-become-a-maintainer">Become a maintainer</a> &middot;
:heart: <a href="#heart-other-ways-to-contribute">Other ideas</a>
</h3>
</div>
@ -109,9 +109,21 @@ Do you have an idea for something you'd like to build in NetBox, but might not b
Check out our [plugin development tutorial](https://github.com/netbox-community/netbox-plugin-tutorial) to get started!
## :briefcase: Looking for a Job?
## :rescue_worker_helmet: Become a Maintainer
At [NetBox Labs](https://netboxlabs.com/), we're always looking for highly skilled and motivated people to join our team. While NetBox is a core part of our product lineup, we have an ever-expanding suite of solutions serving the network automation space. Check out our [current openings](https://netboxlabs.com/careers/) to see if you might be a fit!
We're always looking for motivated individuals to join the maintainers team and help drive NetBox's long-term development. Some of our most sought-after skills include:
* Python development with a strong focus on the [Django](https://www.djangoproject.com/) framework
* Expertise working with PostgreSQL databases
* Javascript & TypeScript proficiency
* A knack for web application design (HTML & CSS)
* Familiarity with git and software development best practices
* Excellent attention to detail
* Working experience in the field of network operations & engineering
We generally ask that maintainers dedicate around four hours of work to the project each week on average, which includes both hands-on development and project management tasks such as issue triage. Maintainers are also encouraged (but not required) to attend our bi-weekly Zoom call to catch up on recent items.
Interested? You can contact our lead maintainer, Jeremy Stretch, at jeremy@netbox.dev or on the [NetDev Community Slack](https://netdev.chat/). We'd love to have you on the team!
## :heart: Other Ways to Contribute

View File

@ -6,9 +6,9 @@
<a href="https://github.com/netbox-community/netbox/graphs/contributors"><img src="https://img.shields.io/github/contributors/netbox-community/netbox?color=blue" alt="Contributors" /></a>
<a href="https://github.com/netbox-community/netbox/stargazers"><img src="https://img.shields.io/github/stars/netbox-community/netbox?style=flat" alt="GitHub stars" /></a>
<a href="https://explore.transifex.com/netbox-community/netbox/"><img src="https://img.shields.io/badge/languages-15-blue" alt="Languages supported" /></a>
<a href="https://github.com/netbox-community/netbox/actions/workflows/ci.yml"><img src="https://github.com/netbox-community/netbox/actions/workflows/ci.yml/badge.svg" alt="CI status" /></a>
<a href="https://github.com/netbox-community/netbox/actions/workflows/ci.yml"><img src="https://github.com/netbox-community/netbox/workflows/CI/badge.svg?branch=main" alt="CI status" /></a>
<p>
<strong><a href="https://netboxlabs.com/community/">NetBox Community</a></strong> |
<strong><a href="https://github.com/netbox-community/netbox/">NetBox Community</a></strong> |
<strong><a href="https://netboxlabs.com/netbox-cloud/">NetBox Cloud</a></strong> |
<strong><a href="https://netboxlabs.com/netbox-enterprise/">NetBox Enterprise</a></strong>
</p>

View File

@ -14,10 +14,6 @@ django-debug-toolbar
# https://github.com/carltongibson/django-filter/blob/main/CHANGES.rst
django-filter
# Django Debug Toolbar extension for GraphiQL
# https://github.com/flavors/django-graphiql-debug-toolbar/blob/main/CHANGES.rst
django-graphiql-debug-toolbar
# HTMX utilities for Django
# https://django-htmx.readthedocs.io/en/latest/changelog.html
django-htmx
@ -112,7 +108,6 @@ nh3
# Fork of PIL (Python Imaging Library) for image processing
# https://github.com/python-pillow/Pillow/releases
# https://pillow.readthedocs.io/en/stable/releasenotes/
Pillow
# PostgreSQL database adapter for Python
@ -131,22 +126,21 @@ requests
# https://github.com/rq/rq/blob/master/CHANGES.md
rq
# Django app for social-auth-core
# https://github.com/python-social-auth/social-app-django/blob/master/CHANGELOG.md
social-auth-app-django
# Social authentication framework
# https://github.com/python-social-auth/social-core/blob/master/CHANGELOG.md
social-auth-core
# Django app for social-auth-core
# https://github.com/python-social-auth/social-app-django/blob/master/CHANGELOG.md
social-auth-app-django
# Strawberry GraphQL
# https://github.com/strawberry-graphql/strawberry/blob/main/CHANGELOG.md
strawberry-graphql
# Strawberry GraphQL Django extension
# https://github.com/strawberry-graphql/strawberry-django/releases
# See #19771
strawberry-graphql-django==0.60.0
strawberry-graphql-django
# SVG image rendering (used for rack elevations)
# https://github.com/mozman/svgwrite/blob/master/NEWS.rst

View File

@ -158,7 +158,6 @@ LOGGING = {
* `netbox.<app>.<model>` - Generic form for model-specific log messages
* `netbox.auth.*` - Authentication events
* `netbox.api.views.*` - Views which handle business logic for the REST API
* `netbox.event_rules` - Event rules
* `netbox.reports.*` - Report execution (`module.name`)
* `netbox.scripts.*` - Custom script execution (`module.name`)
* `netbox.views.*` - Views which handle business logic for the web UI
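For illustration, a minimal sketch of a `LOGGING` configuration that attaches a console handler to one of these loggers (the handler name and log level below are assumptions, not part of the documented example):
```python
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',
        },
    },
    'loggers': {
        # Captures output from all custom scripts (netbox.scripts.<module>.<name>)
        'netbox.scripts': {
            'handlers': ['console'],
            'level': 'INFO',
        },
    },
}
```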

View File

@ -147,7 +147,7 @@ For UI development you will need to review the [Web UI Development Guide](web-ui
## Populating Demo Data
Once you have your development environment up and running, it might be helpful to populate some "dummy" data to make interacting with the UI and APIs more convenient. Check out the [netbox-demo-data](https://github.com/netbox-community/netbox-demo-data) repo on GitHub, which houses a collection of sample data that can be easily imported to any new NetBox deployment. This sample data is used to populate the [public demo instance](https://demo.netbox.dev).
Once you have your development environment up and running, it might be helpful to populate some "dummy" data to make interacting with the UI and APIs more convenient. Check out the [netbox-demo-data](https://github.com/netbox-community/netbox-demo-data) repo on GitHub, which houses a collection of sample data that can be easily imported to any new NetBox deployment. (This sample data is used to populate the public demo instance at <https://demo.netbox.dev>.)
The demo data is provided in JSON format and loaded into an empty database using Django's `loaddata` management command. Consult the demo data repo's `README` file for complete instructions on populating the data.
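For illustration, a minimal sketch of the loading sequence (the file path below is a placeholder; the authoritative file names and steps are in the demo data repo's `README`):
```no-highlight
$ cd /opt/netbox/
$ source /opt/netbox/venv/bin/activate
(venv) $ python3 netbox/manage.py loaddata /path/to/netbox-demo-data.json
```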

View File

@ -166,8 +166,7 @@ Then, compile these portable (`.po`) files for use in the application:
### Update Version and Changelog
* Update the version number and published date in `netbox/release.yaml`. Add or remove the designation (e.g. `beta1`) if applicable.
* Copy the version number from `release.yaml` to `pyproject.toml` in the project root.
* Update the version number and date in `netbox/release.yaml` and `pyproject.toml`. Add or remove the designation (e.g. `beta1`) if applicable.
* Update the example version numbers in the feature request and bug report templates under `.github/ISSUE_TEMPLATES/`.
* Add a section for this release at the top of the changelog page for the minor version (e.g. `docs/release-notes/version-4.2.md`) listing all relevant changes made in this release.
@ -193,3 +192,15 @@ Create a [new release](https://github.com/netbox-community/netbox/releases/new)
* **Description:** Copy from the pull request body, then promote the `###` headers to `##` ones
Once created, the release will become available for users to install.
### Update the Public Documentation
After a release has been published, the public NetBox documentation needs to be updated. This is accomplished by running two actions on the [netboxlabs-docs](https://github.com/netboxlabs/netboxlabs-docs) repository.
First, run the `build-site` action by navigating to Actions > build-site > Run workflow. This process compiles the documentation along with an overlay for integration with the documentation portal at <https://netboxlabs.com/docs>. The job should take about two minutes.
Once the documentation files have been compiled, they must be published by running the `deploy-kinsta` action. Select the desired deployment environment (staging or production) and specify `latest` as the deploy tag.
Clear the CDN cache from the [Kinsta](https://my.kinsta.com/) portal. Navigate to _Sites_ / _NetBox Labs_ / _Live_, select _Cache_ in the left-nav, click the _Clear Cache_ button, and confirm the clear operation.
Finally, verify that the documentation at <https://netboxlabs.com/docs/netbox/en/stable/> has been updated.

View File

@ -2,9 +2,9 @@
NetBox includes the ability to execute certain functions as background tasks. These include:
* [Report](../customization/reports.md) execution
* [Custom script](../customization/custom-scripts.md) execution
* Synchronization of [remote data sources](../integrations/synchronized-data.md)
* Housekeeping tasks
Additionally, NetBox plugins can enqueue their own background tasks. This is accomplished using the [Job model](../models/core/job.md). Background tasks are executed by the `rqworker` process(es).

View File

@ -135,7 +135,7 @@ Check out the desired release by specifying its tag. For example:
```
cd /opt/netbox && \
sudo git fetch --tags && \
sudo git fetch && \
sudo git checkout v4.2.7
```

Binary file not shown (image changed: 24 KiB before, 15 KiB after).

View File

@ -15,6 +15,7 @@ A background job implements a basic [Job](../../models/core/job.md) executor for
```python title="jobs.py"
from netbox.jobs import JobRunner
class MyTestJob(JobRunner):
class Meta:
name = "My Test Job"
@ -24,8 +25,6 @@ class MyTestJob(JobRunner):
# your logic goes here
```
Completed jobs will have their status updated to "completed" by default, or "errored" if an unhandled exception was raised by the `run()` method. To intentionally mark a job as failed, raise the `core.exceptions.JobFailed` exception. (Note that "failed" differs from "errored" in that a failure may be expected under certain conditions, whereas an error is not.)
You can schedule the background job from within your code (e.g. from a model's `save()` method or a view) by calling `MyTestJob.enqueue()`. This method passes through all arguments to `Job.enqueue()`. However, a `name` argument must not be passed, as the background job's name will be used instead.
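A minimal sketch combining both points, building on the `MyTestJob` class above (the `device` argument and the failure condition shown are illustrative assumptions):
```python
from core.exceptions import JobFailed
from netbox.jobs import JobRunner


class MyTestJob(JobRunner):
    class Meta:
        name = "My Test Job"

    def run(self, *args, **kwargs):
        # "device" is a hypothetical keyword argument supplied via enqueue()
        if kwargs.get("device") is None:
            # Expected failure condition: report the job as "failed" rather than "errored"
            raise JobFailed()
        # your logic goes here


def schedule_test_job(device):
    # Arguments pass through to Job.enqueue(); a `name` argument must not be
    # supplied, as the job's Meta name is used instead.
    MyTestJob.enqueue(device=device)
```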
!!! tip

View File

@ -86,69 +86,3 @@ netbox=> DELETE FROM django_migrations WHERE app='pluginname';
!!! warning
Exercise extreme caution when altering Django system tables. Users are strongly encouraged to perform a backup of their database immediately before taking these actions.
## Clean Up Content Types and Permissions
After removing a plugin and its database tables, you may find that object type references (`ContentTypes`) created by the plugin still appear in the permissions management section (e.g., when editing permissions in the NetBox UI).
This happens because the `django_content_type` table retains entries for the models that the plugin registered with Django.
!!! warning
Please use caution when removing `ContentTypes`. It is strongly recommended to **back up your database** before making these changes.
**Identify Stale Content Types:**
Open the Django shell to inspect lingering `ContentType` entries related to the removed plugin.
Typically, the Content Type's `app_label` matches the plugin's name.
```no-highlight
$ cd /opt/netbox/
$ source /opt/netbox/venv/bin/activate
(venv) $ python3 netbox/manage.py nbshell
```
Then, in the shell:
```no-highlight
from django.contrib.contenttypes.models import ContentType
# Replace 'pluginname' with your plugin's actual name
stale_types = ContentType.objects.filter(app_label="pluginname")
for ct in stale_types:
print(ct)
### ^^^ These will be removed; make sure it's OK
```
!!! warning
Review the output carefully and confirm that each listed Content Type is related to the plugin you removed.
**Remove Stale Content Types and Related Permissions:**
Next, check for any permissions associated with these Content Types:
```no-highlight
from django.contrib.auth.models import Permission
for ct in stale_types:
perms = Permission.objects.filter(content_type=ct)
print(list(perms))
```
If there are related Permissions, you can remove them safely:
```no-highlight
for ct in stale_types:
Permission.objects.filter(content_type=ct).delete()
```
After removing any related permissions, delete the Content Type entries:
```no-highlight
stale_types.delete()
```
**Restart NetBox:**
After making these changes, restart the NetBox service to ensure all changes are reflected.
```no-highlight
sudo systemctl restart netbox
```

View File

@ -1,55 +1,5 @@
# NetBox v4.3
## v4.3.4 (2025-07-15)
### Enhancements
* [#18811](https://github.com/netbox-community/netbox/issues/18811) - Match expanded form IPv6 addresses in global search
* [#19550](https://github.com/netbox-community/netbox/issues/19550) - Enable lazy loading for rack elevations
* [#19571](https://github.com/netbox-community/netbox/issues/19571) - Add a default module type profile for expansion cards
* [#19793](https://github.com/netbox-community/netbox/issues/19793) - Support custom dynamic navigation menu links
* [#19828](https://github.com/netbox-community/netbox/issues/19828) - Expose L2VPN termination in interface GraphQL response
### Bug Fixes
* [#19413](https://github.com/netbox-community/netbox/issues/19413) - Custom fields should be grouped in filter forms
* [#19633](https://github.com/netbox-community/netbox/issues/19633) - Introduce InvalidCondition exception and log all evaluations of invalid event rule conditions
* [#19800](https://github.com/netbox-community/netbox/issues/19800) - Module type bulk import should support profile assignment
* [#19806](https://github.com/netbox-community/netbox/issues/19806) - Introduce JobFailed exception to allow marking background jobs as failed
* [#19827](https://github.com/netbox-community/netbox/issues/19827) - Enforce uniqueness for device role names & slugs
* [#19839](https://github.com/netbox-community/netbox/issues/19839) - Enable export of parent assignment for recursively nested objects
* [#19876](https://github.com/netbox-community/netbox/issues/19876) - Remove Markdown rendering from CustomFieldChoiceSet description field
---
## v4.3.3 (2025-06-26)
### Enhancements
* [#17183](https://github.com/netbox-community/netbox/issues/17183) - Enable associating tags with object types during bulk import
* [#17719](https://github.com/netbox-community/netbox/issues/17719) - Introduce a user preference for table row striping
* [#19492](https://github.com/netbox-community/netbox/issues/19492) - Add a UI button to download the output of an executed custom script
* [#19499](https://github.com/netbox-community/netbox/issues/19499) - Support qualifying interfaces by parent device when bulk importing wireless links
### Bug Fixes
* [#19529](https://github.com/netbox-community/netbox/issues/19529) - Fix support for running custom scripts via the `runscript` management command
* [#19555](https://github.com/netbox-community/netbox/issues/19555) - Fix support for `schedule_at` when invoking a custom script via the REST API
* [#19617](https://github.com/netbox-community/netbox/issues/19617) - Ensure consistent styling of "connect" buttons in UI
* [#19640](https://github.com/netbox-community/netbox/issues/19640) - Restore ability to filter FHRP group assignments by device/VM in GraphQL API
* [#19644](https://github.com/netbox-community/netbox/issues/19644) - Atomic transactions should always employ database routing
* [#19659](https://github.com/netbox-community/netbox/issues/19659) - Populate initial device/VM selection for "add a service" button
* [#19665](https://github.com/netbox-community/netbox/issues/19665) - Correct field reference in wireless link model validation
* [#19667](https://github.com/netbox-community/netbox/issues/19667) - Fix `TypeError` exception when creating a new module profile type with no schema
* [#19673](https://github.com/netbox-community/netbox/issues/19673) - Ignore custom field references when compiling table prefetches
* [#19677](https://github.com/netbox-community/netbox/issues/19677) - Fix exception when passing null value to `present_in_vrf` filter
* [#19680](https://github.com/netbox-community/netbox/issues/19680) - Correct chronological ordering of change records resulting from device deletions
* [#19687](https://github.com/netbox-community/netbox/issues/19687) - Cellular interface types should be considered non-connectable
* [#19702](https://github.com/netbox-community/netbox/issues/19702) - Fix `DoesNotExist` exception when deleting a notification group with an associated event rule
* [#19745](https://github.com/netbox-community/netbox/issues/19745) - Fix bulk import of services with IP addresses assigned to FHRP groups
---
## v4.3.2 (2025-06-05)
### Enhancements

View File

@ -1,5 +1,5 @@
from django.contrib import messages
from django.db import router, transaction
from django.db import transaction
from django.shortcuts import get_object_or_404, redirect, render
from django.utils.translation import gettext_lazy as _
@ -384,7 +384,7 @@ class CircuitSwapTerminations(generic.ObjectEditView):
if termination_a and termination_z:
# Use a placeholder to avoid an IntegrityError on the (circuit, term_side) unique constraint
with transaction.atomic(using=router.db_for_write(CircuitTermination)):
with transaction.atomic():
termination_a.term_side = '_'
termination_a.save()
termination_z.term_side = 'A'

View File

@ -1,19 +1,9 @@
from django.core.exceptions import ImproperlyConfigured
__all__ = (
'IncompatiblePluginError',
'JobFailed',
'SyncError',
)
class IncompatiblePluginError(ImproperlyConfigured):
pass
class JobFailed(Exception):
pass
class SyncError(Exception):
pass
class IncompatiblePluginError(ImproperlyConfigured):
pass

View File

@ -187,14 +187,15 @@ class Job(models.Model):
"""
Mark the job as completed, optionally specifying a particular termination status.
"""
if status not in JobStatusChoices.TERMINAL_STATE_CHOICES:
valid_statuses = JobStatusChoices.TERMINAL_STATE_CHOICES
if status not in valid_statuses:
raise ValueError(
_("Invalid status for job termination. Choices are: {choices}").format(
choices=', '.join(JobStatusChoices.TERMINAL_STATE_CHOICES)
choices=', '.join(valid_statuses)
)
)
# Set the job's status and completion time
# Mark the job as completed
self.status = status
if error:
self.error = error

View File

@ -162,12 +162,6 @@ def handle_deleted_object(sender, instance, **kwargs):
getattr(obj, related_field_name).remove(instance)
elif type(relation) is ManyToOneRel and relation.field.null is True:
setattr(obj, related_field_name, None)
# make sure the object hasn't been deleted - in case of
# deletion chaining of related objects
try:
obj.refresh_from_db()
except DoesNotExist:
continue
obj.save()
# Enqueue the object for event processing

View File

@ -6,13 +6,12 @@ from rest_framework import status
from core.choices import ObjectChangeActionChoices
from core.models import ObjectChange, ObjectType
from dcim.choices import SiteStatusChoices
from dcim.models import Site, CableTermination, Device, DeviceType, DeviceRole, Interface, Cable
from dcim.models import Site
from extras.choices import *
from extras.models import CustomField, CustomFieldChoiceSet, Tag
from utilities.testing import APITestCase
from utilities.testing.utils import create_tags, post_data
from utilities.testing.views import ModelViewTestCase
from dcim.models import Manufacturer
class ChangeLogViewTest(ModelViewTestCase):
@ -271,81 +270,6 @@ class ChangeLogViewTest(ModelViewTestCase):
# Check that no ObjectChange records have been created
self.assertEqual(ObjectChange.objects.count(), 0)
def test_ordering_genericrelation(self):
# Create required objects first
manufacturer = Manufacturer.objects.create(name='Manufacturer 1')
device_type = DeviceType.objects.create(
manufacturer=manufacturer,
model='Model 1',
slug='model-1'
)
device_role = DeviceRole.objects.create(
name='Role 1',
slug='role-1'
)
site = Site.objects.create(
name='Site 1',
slug='site-1'
)
# Create two devices
device1 = Device.objects.create(
name='Device 1',
device_type=device_type,
role=device_role,
site=site
)
device2 = Device.objects.create(
name='Device 2',
device_type=device_type,
role=device_role,
site=site
)
# Create interfaces on both devices
interface1 = Interface.objects.create(
device=device1,
name='eth0',
type='1000base-t'
)
interface2 = Interface.objects.create(
device=device2,
name='eth0',
type='1000base-t'
)
# Create a cable between the interfaces
_ = Cable.objects.create(
a_terminations=[interface1],
b_terminations=[interface2],
status='connected'
)
# Delete device1
request = {
'path': reverse('dcim:device_delete', kwargs={'pk': device1.pk}),
'data': post_data({'confirm': True}),
}
self.add_permissions(
'dcim.delete_device',
'dcim.delete_interface',
'dcim.delete_cable',
'dcim.delete_cabletermination'
)
response = self.client.post(**request)
self.assertHttpStatus(response, 302)
# Get the ObjectChange records for delete actions ordered by time
changes = ObjectChange.objects.filter(
action=ObjectChangeActionChoices.ACTION_DELETE
).order_by('time')[:3]
# Verify the order of deletion
self.assertEqual(len(changes), 3)
self.assertEqual(changes[0].changed_object_type, ContentType.objects.get_for_model(CableTermination))
self.assertEqual(changes[1].changed_object_type, ContentType.objects.get_for_model(Interface))
self.assertEqual(changes[2].changed_object_type, ContentType.objects.get_for_model(Device))
class ChangeLogAPITest(APITestCase):

View File

@ -53,11 +53,6 @@ WIRELESS_IFACE_TYPES = [
InterfaceTypeChoices.TYPE_802151,
InterfaceTypeChoices.TYPE_802154,
InterfaceTypeChoices.TYPE_OTHER_WIRELESS,
InterfaceTypeChoices.TYPE_GSM,
InterfaceTypeChoices.TYPE_CDMA,
InterfaceTypeChoices.TYPE_LTE,
InterfaceTypeChoices.TYPE_4G,
InterfaceTypeChoices.TYPE_5G,
]
NONCONNECTABLE_IFACE_TYPES = VIRTUAL_IFACE_TYPES + WIRELESS_IFACE_TYPES

View File

@ -470,8 +470,8 @@ class ModuleTypeImportForm(NetBoxModelImportForm):
class Meta:
model = ModuleType
fields = [
'manufacturer', 'model', 'part_number', 'description', 'airflow', 'weight', 'weight_unit', 'profile',
'comments', 'tags'
'manufacturer', 'model', 'part_number', 'description', 'airflow', 'weight', 'weight_unit', 'comments',
'tags',
]

View File

@ -33,7 +33,6 @@ if TYPE_CHECKING:
from tenancy.graphql.types import TenantType
from users.graphql.types import UserType
from virtualization.graphql.types import ClusterType, VMInterfaceType, VirtualMachineType
from vpn.graphql.types import L2VPNTerminationType
from wireless.graphql.types import WirelessLANType, WirelessLinkType
__all__ = (
@ -441,7 +440,6 @@ class InterfaceType(IPAddressesMixin, ModularComponentType, CabledObjectMixin, P
primary_mac_address: Annotated["MACAddressType", strawberry.lazy('dcim.graphql.types')] | None
qinq_svlan: Annotated["VLANType", strawberry.lazy('ipam.graphql.types')] | None
vlan_translation_policy: Annotated["VLANTranslationPolicyType", strawberry.lazy('ipam.graphql.types')] | None
l2vpn_termination: Annotated["L2VPNTerminationType", strawberry.lazy('vpn.graphql.types')] | None
vdcs: List[Annotated["VirtualDeviceContextType", strawberry.lazy('dcim.graphql.types')]]
tagged_vlans: List[Annotated["VLANType", strawberry.lazy('ipam.graphql.types')]]

View File

@ -19,8 +19,7 @@ def load_initial_data(apps, schema_editor):
'gpu',
'hard_disk',
'memory',
'power_supply',
'expansion_card'
'power_supply'
)
for name in initial_profiles:

View File

@ -1,44 +0,0 @@
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('dcim', '0207_remove_redundant_indexes'),
('extras', '0129_fix_script_paths'),
]
operations = [
migrations.AddConstraint(
model_name='devicerole',
constraint=models.UniqueConstraint(
fields=('parent', 'name'),
name='dcim_devicerole_parent_name'
),
),
migrations.AddConstraint(
model_name='devicerole',
constraint=models.UniqueConstraint(
condition=models.Q(('parent__isnull', True)),
fields=('name',),
name='dcim_devicerole_name',
violation_error_message='A top-level device role with this name already exists.'
),
),
migrations.AddConstraint(
model_name='devicerole',
constraint=models.UniqueConstraint(
fields=('parent', 'slug'),
name='dcim_devicerole_parent_slug'
),
),
migrations.AddConstraint(
model_name='devicerole',
constraint=models.UniqueConstraint(
condition=models.Q(('parent__isnull', True)),
fields=('slug',),
name='dcim_devicerole_slug',
violation_error_message='A top-level device role with this slug already exists.'
),
),
]

View File

@ -1,15 +0,0 @@
{
"name": "Expansion card",
"schema": {
"properties": {
"connector_type": {
"type": "string",
"description": "Connector type e.g. PCIe x4"
},
"bandwidth": {
"type": "integer",
"description": "Total Bandwidth for this module"
}
}
}
}

View File

@ -398,28 +398,6 @@ class DeviceRole(NestedGroupModel):
class Meta:
ordering = ('name',)
constraints = (
models.UniqueConstraint(
fields=('parent', 'name'),
name='%(app_label)s_%(class)s_parent_name'
),
models.UniqueConstraint(
fields=('name',),
name='%(app_label)s_%(class)s_name',
condition=Q(parent__isnull=True),
violation_error_message=_("A top-level device role with this name already exists.")
),
models.UniqueConstraint(
fields=('parent', 'slug'),
name='%(app_label)s_%(class)s_parent_slug'
),
models.UniqueConstraint(
fields=('slug',),
name='%(app_label)s_%(class)s_slug',
condition=Q(parent__isnull=True),
violation_error_message=_("A top-level device role with this slug already exists.")
),
)
verbose_name = _('device role')
verbose_name_plural = _('device roles')

View File

@ -144,7 +144,7 @@ class ModuleType(ImageAttachmentsMixin, PrimaryModel, WeightMixin):
super().clean()
# Validate any attributes against the assigned profile's schema
if self.profile and self.profile.schema:
if self.profile:
try:
jsonschema.validate(self.attribute_data, schema=self.profile.schema)
except JSONValidationError as e:

View File

@ -63,10 +63,6 @@ class DeviceRoleTable(NetBoxTable):
verbose_name=_('Name'),
linkify=True
)
parent = tables.Column(
verbose_name=_('Parent'),
linkify=True,
)
device_count = columns.LinkedCountColumn(
viewname='dcim:device_list',
url_params={'role_id': 'pk'},
@ -92,8 +88,8 @@ class DeviceRoleTable(NetBoxTable):
class Meta(NetBoxTable.Meta):
model = models.DeviceRole
fields = (
'pk', 'id', 'name', 'parent', 'device_count', 'vm_count', 'color', 'vm_role', 'config_template',
'description', 'slug', 'tags', 'actions', 'created', 'last_updated',
'pk', 'id', 'name', 'device_count', 'vm_count', 'color', 'vm_role', 'config_template', 'description',
'slug', 'tags', 'actions', 'created', 'last_updated',
)
default_columns = ('pk', 'name', 'device_count', 'vm_count', 'color', 'vm_role', 'description')

View File

@ -24,10 +24,6 @@ class RegionTable(ContactsColumnMixin, NetBoxTable):
verbose_name=_('Name'),
linkify=True
)
parent = tables.Column(
verbose_name=_('Parent'),
linkify=True,
)
site_count = columns.LinkedCountColumn(
viewname='dcim:site_list',
url_params={'region_id': 'pk'},
@ -43,7 +39,7 @@ class RegionTable(ContactsColumnMixin, NetBoxTable):
class Meta(NetBoxTable.Meta):
model = Region
fields = (
'pk', 'id', 'name', 'parent', 'slug', 'site_count', 'description', 'comments', 'contacts', 'tags',
'pk', 'id', 'name', 'slug', 'site_count', 'description', 'comments', 'contacts', 'tags',
'created', 'last_updated', 'actions',
)
default_columns = ('pk', 'name', 'site_count', 'description')
@ -58,10 +54,6 @@ class SiteGroupTable(ContactsColumnMixin, NetBoxTable):
verbose_name=_('Name'),
linkify=True
)
parent = tables.Column(
verbose_name=_('Parent'),
linkify=True,
)
site_count = columns.LinkedCountColumn(
viewname='dcim:site_list',
url_params={'group_id': 'pk'},
@ -77,7 +69,7 @@ class SiteGroupTable(ContactsColumnMixin, NetBoxTable):
class Meta(NetBoxTable.Meta):
model = SiteGroup
fields = (
'pk', 'id', 'name', 'parent', 'slug', 'site_count', 'description', 'comments', 'contacts', 'tags',
'pk', 'id', 'name', 'slug', 'site_count', 'description', 'comments', 'contacts', 'tags',
'created', 'last_updated', 'actions',
)
default_columns = ('pk', 'name', 'site_count', 'description')
@ -143,10 +135,6 @@ class LocationTable(TenancyColumnsMixin, ContactsColumnMixin, NetBoxTable):
verbose_name=_('Name'),
linkify=True
)
parent = tables.Column(
verbose_name=_('Parent'),
linkify=True,
)
site = tables.Column(
verbose_name=_('Site'),
linkify=True
@ -182,8 +170,8 @@ class LocationTable(TenancyColumnsMixin, ContactsColumnMixin, NetBoxTable):
class Meta(NetBoxTable.Meta):
model = Location
fields = (
'pk', 'id', 'name', 'parent', 'site', 'status', 'facility', 'tenant', 'tenant_group', 'rack_count',
'device_count', 'description', 'slug', 'comments', 'contacts', 'tags', 'actions', 'created', 'last_updated',
'pk', 'id', 'name', 'site', 'status', 'facility', 'tenant', 'tenant_group', 'rack_count', 'device_count',
'description', 'slug', 'comments', 'contacts', 'tags', 'actions', 'created', 'last_updated',
'vlangroup_count',
)
default_columns = (

View File

@ -954,19 +954,6 @@ class CableTestCase(TestCase):
with self.assertRaises(ValidationError):
cable.clean()
@tag('regression')
def test_cable_cannot_terminate_to_a_cellular_interface(self):
"""
A cable cannot terminate to a cellular interface
"""
device1 = Device.objects.get(name='TestDevice1')
interface2 = Interface.objects.get(device__name='TestDevice2', name='eth0')
cellular_interface = Interface(device=device1, name="W1", type=InterfaceTypeChoices.TYPE_LTE)
cable = Cable(a_terminations=[interface2], b_terminations=[cellular_interface])
with self.assertRaises(ValidationError):
cable.clean()
class VirtualDeviceContextTestCase(TestCase):

View File

@ -3,7 +3,7 @@ from decimal import Decimal
from zoneinfo import ZoneInfo
import yaml
from django.test import override_settings, tag
from django.test import override_settings
from django.urls import reverse
from netaddr import EUI
@ -1000,7 +1000,18 @@ inventory-items:
self.assertEqual(response.get('Content-Type'), 'text/csv; charset=utf-8')
class ModuleTypeTestCase(ViewTestCases.PrimaryObjectViewTestCase):
# TODO: Change base class to PrimaryObjectViewTestCase
# Blocked by absence of bulk import view for ModuleTypes
class ModuleTypeTestCase(
ViewTestCases.GetObjectViewTestCase,
ViewTestCases.GetObjectChangelogViewTestCase,
ViewTestCases.CreateObjectViewTestCase,
ViewTestCases.EditObjectViewTestCase,
ViewTestCases.DeleteObjectViewTestCase,
ViewTestCases.ListObjectsViewTestCase,
ViewTestCases.BulkEditObjectsViewTestCase,
ViewTestCases.BulkDeleteObjectsViewTestCase
):
model = ModuleType
@classmethod
@ -1012,7 +1023,7 @@ class ModuleTypeTestCase(ViewTestCases.PrimaryObjectViewTestCase):
)
Manufacturer.objects.bulk_create(manufacturers)
module_types = ModuleType.objects.bulk_create([
ModuleType.objects.bulk_create([
ModuleType(model='Module Type 1', manufacturer=manufacturers[0]),
ModuleType(model='Module Type 2', manufacturer=manufacturers[0]),
ModuleType(model='Module Type 3', manufacturer=manufacturers[0]),
@ -1020,8 +1031,6 @@ class ModuleTypeTestCase(ViewTestCases.PrimaryObjectViewTestCase):
tags = create_tags('Alpha', 'Bravo', 'Charlie')
fan_module_type_profile = ModuleTypeProfile.objects.get(name='Fan')
cls.form_data = {
'manufacturer': manufacturers[1].pk,
'model': 'Device Type X',
@ -1035,70 +1044,6 @@ class ModuleTypeTestCase(ViewTestCases.PrimaryObjectViewTestCase):
'part_number': '456DEF',
}
cls.csv_data = (
"manufacturer,model,part_number,comments,profile",
f"Manufacturer 1,fan0,generic-fan,,{fan_module_type_profile.name}"
)
cls.csv_update_data = (
"id,model",
f"{module_types[0].id},test model",
)
@override_settings(EXEMPT_VIEW_PERMISSIONS=['*'])
def test_bulk_update_objects_with_permission(self):
self.add_permissions(
'dcim.add_consoleporttemplate',
'dcim.add_consoleserverporttemplate',
'dcim.add_powerporttemplate',
'dcim.add_poweroutlettemplate',
'dcim.add_interfacetemplate',
'dcim.add_frontporttemplate',
'dcim.add_rearporttemplate',
'dcim.add_modulebaytemplate',
)
# run base test
super().test_bulk_update_objects_with_permission()
@tag('regression')
@override_settings(EXEMPT_VIEW_PERMISSIONS=['*'], EXEMPT_EXCLUDE_MODELS=[])
def test_bulk_import_objects_with_permission(self):
self.add_permissions(
'dcim.add_consoleporttemplate',
'dcim.add_consoleserverporttemplate',
'dcim.add_powerporttemplate',
'dcim.add_poweroutlettemplate',
'dcim.add_interfacetemplate',
'dcim.add_frontporttemplate',
'dcim.add_rearporttemplate',
'dcim.add_modulebaytemplate',
)
# run base test
super().test_bulk_import_objects_with_permission()
# TODO: remove extra regression asserts once parent test supports testing all import fields
fan_module_type = ModuleType.objects.get(part_number='generic-fan')
fan_module_type_profile = ModuleTypeProfile.objects.get(name='Fan')
assert fan_module_type.profile == fan_module_type_profile
@override_settings(EXEMPT_VIEW_PERMISSIONS=['*'], EXEMPT_EXCLUDE_MODELS=[])
def test_bulk_import_objects_with_constrained_permission(self):
self.add_permissions(
'dcim.add_consoleporttemplate',
'dcim.add_consoleserverporttemplate',
'dcim.add_powerporttemplate',
'dcim.add_poweroutlettemplate',
'dcim.add_interfacetemplate',
'dcim.add_frontporttemplate',
'dcim.add_rearporttemplate',
'dcim.add_modulebaytemplate',
)
super().test_bulk_import_objects_with_constrained_permission()
@override_settings(EXEMPT_VIEW_PERMISSIONS=['*'])
def test_moduletype_consoleports(self):
moduletype = ModuleType.objects.first()
@ -1859,9 +1804,9 @@ class DeviceRoleTestCase(ViewTestCases.OrganizationalObjectViewTestCase):
cls.csv_data = (
"name,slug,color",
"Device Role 6,device-role-6,ff0000",
"Device Role 7,device-role-7,00ff00",
"Device Role 8,device-role-8,0000ff",
"Device Role 4,device-role-4,ff0000",
"Device Role 5,device-role-5,00ff00",
"Device Role 6,device-role-6,0000ff",
)
cls.csv_update_data = (

View File

@ -1,6 +1,6 @@
from django.apps import apps
from django.contrib.contenttypes.models import ContentType
from django.db import router, transaction
from django.db import transaction
def compile_path_node(ct_id, object_id):
@ -53,7 +53,7 @@ def rebuild_paths(terminations):
for obj in terminations:
cable_paths = CablePath.objects.filter(_nodes__contains=obj)
with transaction.atomic(using=router.db_for_write(CablePath)):
with transaction.atomic():
for cp in cable_paths:
cp.delete()
create_cablepath(cp.origins)

View File

@ -1,7 +1,7 @@
from django.contrib import messages
from django.contrib.contenttypes.models import ContentType
from django.core.paginator import EmptyPage, PageNotAnInteger
from django.db import router, transaction
from django.db import transaction
from django.db.models import Prefetch
from django.forms import ModelMultipleChoiceField, MultipleHiddenInput, modelformset_factory
from django.shortcuts import get_object_or_404, redirect, render
@ -124,7 +124,7 @@ class BulkDisconnectView(GetReturnURLMixin, ObjectPermissionRequiredMixin, View)
if form.is_valid():
with transaction.atomic(using=router.db_for_write(Cable)):
with transaction.atomic():
count = 0
cable_ids = set()
for obj in self.queryset.filter(pk__in=form.cleaned_data['pk']):
@ -3746,7 +3746,7 @@ class VirtualChassisEditView(ObjectPermissionRequiredMixin, GetReturnURLMixin, V
if vc_form.is_valid() and formset.is_valid():
with transaction.atomic(using=router.db_for_write(Device)):
with transaction.atomic():
# Save the VirtualChassis
vc_form.save()

View File

@ -66,11 +66,11 @@ class ScriptInputSerializer(serializers.Serializer):
interval = serializers.IntegerField(required=False, allow_null=True)
def validate_schedule_at(self, value):
if value and not self.context['script'].python_class.scheduling_enabled:
if value and not self.context['script'].scheduling_enabled:
raise serializers.ValidationError(_("Scheduling is not enabled for this script."))
return value
def validate_interval(self, value):
if value and not self.context['script'].python_class.scheduling_enabled:
if value and not self.context['script'].scheduling_enabled:
raise serializers.ValidationError(_("Scheduling is not enabled for this script."))
return value

View File

@ -270,7 +270,6 @@ class ScriptViewSet(ModelViewSet):
module_name, script_name = pk.split('.', maxsplit=1)
except ValueError:
raise Http404
return get_object_or_404(self.queryset, module__file_path=f'{module_name}.py', name=script_name)
def retrieve(self, request, pk):

View File

@ -1,14 +1,13 @@
import functools
import operator
import re
from django.utils.translation import gettext as _
__all__ = (
'Condition',
'ConditionSet',
'InvalidCondition',
)
AND = 'and'
OR = 'or'
@ -20,10 +19,6 @@ def is_ruleset(data):
return type(data) is dict and len(data) == 1 and list(data.keys())[0] in (AND, OR)
class InvalidCondition(Exception):
pass
class Condition:
"""
An individual conditional rule that evaluates a single attribute and its value.
@ -66,7 +61,6 @@ class Condition:
self.attr = attr
self.value = value
self.op = op
self.eval_func = getattr(self, f'eval_{op}')
self.negate = negate
@ -76,17 +70,16 @@ class Condition:
"""
def _get(obj, key):
if isinstance(obj, list):
return [operator.getitem(item or {}, key) for item in obj]
return operator.getitem(obj or {}, key)
return [dict.get(i, key) for i in obj]
return dict.get(obj, key)
try:
value = functools.reduce(_get, self.attr.split('.'), data)
except KeyError:
raise InvalidCondition(f"Invalid key path: {self.attr}")
try:
result = self.eval_func(value)
except TypeError as e:
raise InvalidCondition(f"Invalid data type at '{self.attr}' for '{self.op}' evaluation: {e}")
except TypeError:
# Invalid key path
value = None
result = self.eval_func(value)
if self.negate:
return not result

View File

@ -192,5 +192,5 @@ def flush_events(events):
try:
func = import_string(name)
func(events)
except ImportError as e:
except Exception as e:
logger.error(_("Cannot import events pipeline {name} error: {error}").format(name=name, error=e))

View File

@ -238,18 +238,10 @@ class TagImportForm(CSVModelForm):
label=_('Weight'),
required=False
)
object_types = CSVMultipleContentTypeField(
label=_('Object types'),
queryset=ObjectType.objects.with_feature('tags'),
help_text=_("One or more assigned object types"),
required=False,
)
class Meta:
model = Tag
fields = (
'name', 'slug', 'color', 'weight', 'description', 'object_types',
)
fields = ('name', 'slug', 'color', 'weight', 'description')
class JournalEntryImportForm(NetBoxModelImportForm):

View File

@ -1,8 +1,13 @@
from core.choices import JobIntervalChoices
from core.forms import ManagedFileForm
import os
from django import forms
from django.conf import settings
from django.core.files.storage import storages
from django.utils.translation import gettext_lazy as _
from core.choices import JobIntervalChoices
from core.forms import ManagedFileForm
from extras.storage import ScriptFileSystemStorage
from utilities.datetime import local_now
from utilities.forms.widgets import DateTimePicker, NumberWithOptions
@ -69,7 +74,12 @@ class ScriptFileForm(ManagedFileForm):
storage = storages.create_storage(storages.backends["scripts"])
filename = self.cleaned_data['upload_file'].name
self.instance.file_path = filename
if isinstance(storage, ScriptFileSystemStorage):
full_path = os.path.join(settings.SCRIPTS_ROOT, filename)
else:
full_path = filename
self.instance.file_path = full_path
data = self.cleaned_data['upload_file']
storage.save(filename, data)

View File

@ -39,9 +39,6 @@ class ScriptJob(JobRunner):
try:
try:
# A script can modify multiple models so need to do an atomic lock on
# both the default database (for non ChangeLogged models) and potentially
# any other database (for ChangeLogged models)
with transaction.atomic():
script.output = script.run(data, commit)
if not commit:

View File

@ -18,22 +18,9 @@ class Empty(Lookup):
return f"CAST(LENGTH({sql}) AS BOOLEAN) IS TRUE", params
class NetHost(Lookup):
"""
Similar to ipam.lookups.NetHost, but casts the field to INET.
"""
lookup_name = 'net_host'
def as_sql(self, qn, connection):
lhs, lhs_params = self.process_lhs(qn, connection)
rhs, rhs_params = self.process_rhs(qn, connection)
params = lhs_params + rhs_params
return 'HOST(CAST(%s AS INET)) = HOST(%s)' % (lhs, rhs), params
class NetContainsOrEquals(Lookup):
"""
Similar to ipam.lookups.NetContainsOrEquals, but casts the field to INET.
This lookup has the same functionality as the one from the ipam app except lhs is cast to inet
"""
lookup_name = 'net_contains_or_equals'
@ -45,5 +32,4 @@ class NetContainsOrEquals(Lookup):
CharField.register_lookup(Empty)
CachedValueField.register_lookup(NetHost)
CachedValueField.register_lookup(NetContainsOrEquals)

View File

@ -1,56 +0,0 @@
from django.conf import settings
from django.core.files.storage import storages
from django.db import migrations
from urllib.parse import urlparse
from extras.storage import ScriptFileSystemStorage
def normalize(url):
parsed_url = urlparse(url)
if not parsed_url.path.endswith('/'):
return url + '/'
return url
def fix_script_paths(apps, schema_editor):
"""
Fix script paths for scripts that had incorrect path from NB 4.3.
"""
storage = storages.create_storage(storages.backends["scripts"])
if not isinstance(storage, ScriptFileSystemStorage):
return
ScriptModule = apps.get_model('extras', 'ScriptModule')
script_root_path = normalize(settings.SCRIPTS_ROOT)
for script in ScriptModule.objects.filter(file_path__startswith=script_root_path):
script.file_path = script.file_path[len(script_root_path):]
script.save()
class Migration(migrations.Migration):
dependencies = [
('extras', '0128_tableconfig'),
]
operations = [
migrations.RunPython(code=fix_script_paths, reverse_code=migrations.RunPython.noop),
]
def oc_fix_script_paths(objectchange, reverting):
script_root_path = normalize(settings.SCRIPTS_ROOT)
for data in (objectchange.prechange_data, objectchange.postchange_data):
if data is None:
continue
if file_path := data.get('file_path'):
if file_path.startswith(script_root_path):
data['file_path'] = file_path[len(script_root_path):]
objectchange_migrators = {
'extras.scriptmodule': oc_fix_script_paths,
}

View File

@ -13,7 +13,7 @@ from rest_framework.utils.encoders import JSONEncoder
from core.models import ObjectType
from extras.choices import *
from extras.conditions import ConditionSet, InvalidCondition
from extras.conditions import ConditionSet
from extras.constants import *
from extras.utils import image_upload
from extras.models.mixins import RenderTemplateMixin
@ -142,15 +142,7 @@ class EventRule(CustomFieldsMixin, ExportTemplatesMixin, TagsMixin, ChangeLogged
if not self.conditions:
return True
logger = logging.getLogger('netbox.event_rules')
try:
result = ConditionSet(self.conditions).eval(data)
logger.debug(f'{self.name}: Evaluated as {result}')
return result
except InvalidCondition as e:
logger.error(f"{self.name}: Evaluation failed. {e}")
return False
return ConditionSet(self.conditions).eval(data)
class Webhook(CustomFieldsMixin, ExportTemplatesMixin, TagsMixin, ChangeLoggedModel):

View File

@ -1,7 +1,7 @@
from functools import cached_property
from django.conf import settings
from django.contrib.contenttypes.fields import GenericForeignKey, GenericRelation
from django.contrib.contenttypes.fields import GenericForeignKey
from django.core.exceptions import ValidationError
from django.db import models
from django.urls import reverse
@ -144,12 +144,6 @@ class NotificationGroup(ChangeLoggedModel):
blank=True,
related_name='notification_groups'
)
event_rules = GenericRelation(
to='extras.EventRule',
content_type_field='action_object_type',
object_id_field='action_object_id',
related_query_name='+'
)
objects = RestrictedQuerySet.as_manager()

View File

@ -4,7 +4,7 @@ from django.test import TestCase
from core.events import *
from dcim.choices import SiteStatusChoices
from dcim.models import Site
from extras.conditions import Condition, ConditionSet, InvalidCondition
from extras.conditions import Condition, ConditionSet
from extras.events import serialize_for_event
from extras.forms import EventRuleForm
from extras.models import EventRule, Webhook
@ -12,11 +12,16 @@ from extras.models import EventRule, Webhook
class ConditionTestCase(TestCase):
def test_dotted_path_access(self):
c = Condition('a.b.c', 1, 'eq')
self.assertTrue(c.eval({'a': {'b': {'c': 1}}}))
self.assertFalse(c.eval({'a': {'b': {'c': 2}}}))
self.assertFalse(c.eval({'a': {'b': {'x': 1}}}))
def test_undefined_attr(self):
c = Condition('x', 1, 'eq')
self.assertFalse(c.eval({}))
self.assertTrue(c.eval({'x': 1}))
with self.assertRaises(InvalidCondition):
c.eval({})
#
# Validation tests
@ -32,13 +37,10 @@ class ConditionTestCase(TestCase):
# dict type is unsupported
Condition('x', 1, dict())
def test_invalid_op_types(self):
def test_invalid_op_type(self):
with self.assertRaises(ValueError):
# 'gt' supports only numeric values
Condition('x', 'foo', 'gt')
with self.assertRaises(ValueError):
# 'in' supports only iterable values
Condition('x', 123, 'in')
#
# Nested attrs tests
@ -48,10 +50,7 @@ class ConditionTestCase(TestCase):
c = Condition('x.y.z', 1)
self.assertTrue(c.eval({'x': {'y': {'z': 1}}}))
self.assertFalse(c.eval({'x': {'y': {'z': 2}}}))
with self.assertRaises(InvalidCondition):
c.eval({'x': {'y': None}})
with self.assertRaises(InvalidCondition):
c.eval({'x': {'y': {'a': 1}}})
self.assertFalse(c.eval({'a': {'b': {'c': 1}}}))
#
# Operator tests
@ -75,31 +74,23 @@ class ConditionTestCase(TestCase):
c = Condition('x', 1, 'gt')
self.assertTrue(c.eval({'x': 2}))
self.assertFalse(c.eval({'x': 1}))
with self.assertRaises(InvalidCondition):
c.eval({'x': 'foo'}) # Invalid type
def test_gte(self):
c = Condition('x', 1, 'gte')
self.assertTrue(c.eval({'x': 2}))
self.assertTrue(c.eval({'x': 1}))
self.assertFalse(c.eval({'x': 0}))
with self.assertRaises(InvalidCondition):
c.eval({'x': 'foo'}) # Invalid type
def test_lt(self):
c = Condition('x', 2, 'lt')
self.assertTrue(c.eval({'x': 1}))
self.assertFalse(c.eval({'x': 2}))
with self.assertRaises(InvalidCondition):
c.eval({'x': 'foo'}) # Invalid type
def test_lte(self):
c = Condition('x', 2, 'lte')
self.assertTrue(c.eval({'x': 1}))
self.assertTrue(c.eval({'x': 2}))
self.assertFalse(c.eval({'x': 3}))
with self.assertRaises(InvalidCondition):
c.eval({'x': 'foo'}) # Invalid type
def test_in(self):
c = Condition('x', [1, 2, 3], 'in')
@ -115,8 +106,6 @@ class ConditionTestCase(TestCase):
c = Condition('x', 1, 'contains')
self.assertTrue(c.eval({'x': [1, 2, 3]}))
self.assertFalse(c.eval({'x': [2, 3, 4]}))
with self.assertRaises(InvalidCondition):
c.eval({'x': 123}) # Invalid type
def test_contains_negated(self):
c = Condition('x', 1, 'contains', negate=True)

View File

@ -444,8 +444,6 @@ class TagTestCase(ViewTestCases.OrganizationalObjectViewTestCase):
@classmethod
def setUpTestData(cls):
site_ct = ContentType.objects.get_for_model(Site)
tags = (
Tag(name='Tag 1', slug='tag-1'),
Tag(name='Tag 2', slug='tag-2', weight=1),
@ -458,15 +456,14 @@ class TagTestCase(ViewTestCases.OrganizationalObjectViewTestCase):
'slug': 'tag-x',
'color': 'c0c0c0',
'comments': 'Some comments',
'object_types': [site_ct.pk],
'weight': 11,
}
cls.csv_data = (
"name,slug,color,description,object_types,weight",
"Tag 4,tag-4,ff0000,Fourth tag,dcim.interface,0",
"Tag 5,tag-5,00ff00,Fifth tag,'dcim.device,dcim.site',1111",
"Tag 6,tag-6,0000ff,Sixth tag,dcim.site,0",
"name,slug,color,description,weight",
"Tag 4,tag-4,ff0000,Fourth tag,0",
"Tag 5,tag-5,00ff00,Fifth tag,1111",
"Tag 6,tag-6,0000ff,Sixth tag,0",
)
cls.csv_update_data = (

View File

@ -1476,16 +1476,7 @@ class ScriptResultView(TableMixin, generic.ObjectView):
table = None
job = get_object_or_404(Job.objects.all(), pk=kwargs.get('job_pk'))
# If a direct export output has been requested, return the job data content as a
# downloadable file.
if job.completed and request.GET.get('export') == 'output':
content = (job.data.get("output") or "").encode()
response = HttpResponse(content, content_type='text')
filename = f"{job.object.name or 'script-output'}_{job.completed.strftime('%Y-%m-%d_%H%M%S')}.txt"
response['Content-Disposition'] = f'attachment; filename="{filename}"'
return response
elif job.completed:
if job.completed:
table = self.get_table(job, request, bulk_actions=False)
log_threshold = request.GET.get('log_threshold', LogLevelChoices.LOG_INFO)

View File

@ -2,7 +2,7 @@ from copy import deepcopy
from django.contrib.contenttypes.prefetch import GenericPrefetch
from django.core.exceptions import ObjectDoesNotExist, PermissionDenied
from django.db import router, transaction
from django.db import transaction
from django.shortcuts import get_object_or_404
from django.utils.translation import gettext as _
from django_pglocks import advisory_lock
@ -295,7 +295,7 @@ class AvailableObjectsView(ObjectValidationMixin, APIView):
# Create the new IP address(es)
try:
with transaction.atomic(using=router.db_for_write(self.queryset.model)):
with transaction.atomic():
created = serializer.save()
self._validate_objects(created)
except ObjectDoesNotExist:

View File

@ -449,7 +449,7 @@ class PrefixFilterSet(NetBoxModelFilterSet, ScopedFilterSet, TenancyFilterSet, C
@extend_schema_field(OpenApiTypes.STR)
def filter_present_in_vrf(self, queryset, name, vrf):
if vrf is None:
return queryset.none()
return queryset.none
return queryset.filter(
Q(vrf=vrf) |
Q(vrf__export_targets__in=vrf.import_targets.all())
@ -729,7 +729,7 @@ class IPAddressFilterSet(NetBoxModelFilterSet, TenancyFilterSet, ContactModelFil
@extend_schema_field(OpenApiTypes.STR)
def filter_present_in_vrf(self, queryset, name, vrf):
if vrf is None:
return queryset.none()
return queryset.none
return queryset.filter(
Q(vrf=vrf) |
Q(vrf__export_targets__in=vrf.import_targets.all())

View File

@ -633,10 +633,7 @@ class ServiceImportForm(NetBoxModelImportForm):
# triggered
parent = self.cleaned_data.get('parent')
for ip_address in self.cleaned_data.get('ipaddresses', []):
if not (assigned := ip_address.assigned_object) or ( # no assigned object
(isinstance(parent, FHRPGroup) and assigned != parent) # assigned to FHRPGroup
and getattr(assigned, 'parent_object') != parent # assigned to [VM]Interface
):
if not ip_address.assigned_object or getattr(ip_address.assigned_object, 'parent_object') != parent:
raise forms.ValidationError(
_("{ip} is not assigned to this parent.").format(ip=ip_address)
)

View File

@ -826,7 +826,7 @@ class ServiceForm(NetBoxModelForm):
except ObjectDoesNotExist:
pass
if self.instance and self.instance.pk and parent_object_type_id != self.instance.parent_object_type_id:
if self.instance and parent_object_type_id != self.instance.parent_object_type_id:
self.initial['parent'] = None
def clean(self):

View File

@ -11,12 +11,10 @@ from strawberry_django import FilterLookup, DateFilterLookup
from core.graphql.filter_mixins import BaseObjectTypeFilterMixin, ChangeLogFilterMixin
from dcim.graphql.filter_mixins import ScopedFilterMixin
from dcim.models import Device
from ipam import models
from ipam.graphql.filter_mixins import ServiceBaseFilterMixin
from netbox.graphql.filter_mixins import NetBoxModelFilterMixin, OrganizationalModelFilterMixin, PrimaryModelFilterMixin
from tenancy.graphql.filter_mixins import ContactFilterMixin, TenancyFilterMixin
from virtualization.models import VMInterface
if TYPE_CHECKING:
from netbox.graphql.filter_lookups import IntegerArrayLookup, IntegerLookup
@ -118,30 +116,6 @@ class FHRPGroupAssignmentFilter(BaseObjectTypeFilterMixin, ChangeLogFilterMixin)
strawberry_django.filter_field()
)
@strawberry_django.filter_field()
def device_id(self, queryset, value: list[str], prefix) -> Q:
return self.filter_device('id', value)
@strawberry_django.filter_field()
def device(self, value: list[str], prefix) -> Q:
return self.filter_device('name', value)
@strawberry_django.filter_field()
def virtual_machine_id(self, value: list[str], prefix) -> Q:
return Q(interface_id__in=VMInterface.objects.filter(virtual_machine_id__in=value))
@strawberry_django.filter_field()
def virtual_machine(self, value: list[str], prefix) -> Q:
return Q(interface_id__in=VMInterface.objects.filter(virtual_machine__name__in=value))
def filter_device(self, field, value) -> Q:
"""Helper to standardize logic for device and device_id filters"""
devices = Device.objects.filter(**{f'{field}__in': value})
interface_ids = []
for device in devices:
interface_ids.extend(device.vc_interfaces().values_list('id', flat=True))
return Q(interface_id__in=interface_ids)
@strawberry_django.filter_type(models.IPAddress, lookups=True)
class IPAddressFilter(ContactFilterMixin, TenancyFilterMixin, PrimaryModelFilterMixin):

View File

@ -162,11 +162,6 @@ class Aggregate(ContactsMixin, GetAvailablePrefixesMixin, PrimaryModel):
return self.prefix.version
return None
@property
def ipv6_full(self):
if self.prefix and self.prefix.version == 6:
return netaddr.IPAddress(self.prefix).format(netaddr.ipv6_full)
def get_child_prefixes(self):
"""
Return all Prefixes within this Aggregate
@ -335,11 +330,6 @@ class Prefix(ContactsMixin, GetAvailablePrefixesMixin, CachedScopeMixin, Primary
def mask_length(self):
return self.prefix.prefixlen if self.prefix else None
@property
def ipv6_full(self):
if self.prefix and self.prefix.version == 6:
return netaddr.IPAddress(self.prefix).format(netaddr.ipv6_full)
@property
def depth(self):
return self._depth
@ -818,11 +808,6 @@ class IPAddress(ContactsMixin, PrimaryModel):
self._original_assigned_object_id = self.__dict__.get('assigned_object_id')
self._original_assigned_object_type_id = self.__dict__.get('assigned_object_type_id')
@property
def ipv6_full(self):
if self.address and self.address.version == 6:
return netaddr.IPAddress(self.address).format(netaddr.ipv6_full)
def get_duplicates(self):
return IPAddress.objects.filter(
vrf=self.vrf,

View File

@ -1068,9 +1068,6 @@ class ServiceTestCase(ViewTestCases.PrimaryObjectViewTestCase):
role = DeviceRole.objects.create(name='Device Role 1', slug='device-role-1')
device = Device.objects.create(name='Device 1', site=site, device_type=devicetype, role=role)
interface = Interface.objects.create(device=device, name='Interface 1', type=InterfaceTypeChoices.TYPE_VIRTUAL)
fhrp_group = FHRPGroup.objects.create(
name='Group 1', group_id=1234, protocol=FHRPGroupProtocolChoices.PROTOCOL_CARP
)
services = (
Service(parent=device, name='Service 1', protocol=ServiceProtocolChoices.PROTOCOL_TCP, ports=[101]),
@ -1082,7 +1079,6 @@ class ServiceTestCase(ViewTestCases.PrimaryObjectViewTestCase):
ip_addresses = (
IPAddress(assigned_object=interface, address='192.0.2.1/24'),
IPAddress(assigned_object=interface, address='192.0.2.2/24'),
IPAddress(assigned_object=fhrp_group, address='192.0.2.3/24'),
)
IPAddress.objects.bulk_create(ip_addresses)
@ -1104,7 +1100,6 @@ class ServiceTestCase(ViewTestCases.PrimaryObjectViewTestCase):
"dcim.device,Device 1,Service 1,tcp,1,192.0.2.1/24,First service",
"dcim.device,Device 1,Service 2,tcp,2,192.0.2.2/24,Second service",
"dcim.device,Device 1,Service 3,udp,3,,Third service",
"ipam.fhrpgroup,Group 1,Service 4,udp,4,192.0.2.3/24,Fourth service",
)
cls.csv_update_data = (

View File

@ -2,7 +2,7 @@ import logging
from functools import cached_property
from django.core.exceptions import ObjectDoesNotExist, PermissionDenied
from django.db import router, transaction
from django.db import transaction
from django.db.models import ProtectedError, RestrictedError
from django_pglocks import advisory_lock
from netbox.constants import ADVISORY_LOCK_KEYS
@ -170,7 +170,7 @@ class NetBoxModelViewSet(
# Enforce object-level permissions on save()
try:
with transaction.atomic(using=router.db_for_write(model)):
with transaction.atomic():
instance = serializer.save()
self._validate_objects(instance)
except ObjectDoesNotExist:
@ -190,7 +190,7 @@ class NetBoxModelViewSet(
# Enforce object-level permissions on save()
try:
with transaction.atomic(using=router.db_for_write(model)):
with transaction.atomic():
instance = serializer.save()
self._validate_objects(instance)
except ObjectDoesNotExist:

View File

@ -1,5 +1,5 @@
from django.core.exceptions import ObjectDoesNotExist
from django.db import router, transaction
from django.db import transaction
from django.http import Http404
from rest_framework import status
from rest_framework.response import Response
@ -56,22 +56,22 @@ class SequentialBulkCreatesMixin:
which depends on the evaluation of existing objects (such as checking for free space within a rack) functions
appropriately.
"""
@transaction.atomic
def create(self, request, *args, **kwargs):
with transaction.atomic(using=router.db_for_write(self.queryset.model)):
if not isinstance(request.data, list):
# Creating a single object
return super().create(request, *args, **kwargs)
if not isinstance(request.data, list):
# Creating a single object
return super().create(request, *args, **kwargs)
return_data = []
for data in request.data:
serializer = self.get_serializer(data=data)
serializer.is_valid(raise_exception=True)
self.perform_create(serializer)
return_data.append(serializer.data)
return_data = []
for data in request.data:
serializer = self.get_serializer(data=data)
serializer.is_valid(raise_exception=True)
self.perform_create(serializer)
return_data.append(serializer.data)
headers = self.get_success_headers(serializer.data)
headers = self.get_success_headers(serializer.data)
return Response(return_data, status=status.HTTP_201_CREATED, headers=headers)
return Response(return_data, status=status.HTTP_201_CREATED, headers=headers)
class BulkUpdateModelMixin:
@ -113,7 +113,7 @@ class BulkUpdateModelMixin:
return Response(data, status=status.HTTP_200_OK)
def perform_bulk_update(self, objects, update_data, partial):
with transaction.atomic(using=router.db_for_write(self.queryset.model)):
with transaction.atomic():
data_list = []
for obj in objects:
data = update_data.get(obj.id)
@ -157,7 +157,7 @@ class BulkDestroyModelMixin:
return Response(status=status.HTTP_204_NO_CONTENT)
def perform_bulk_destroy(self, objects):
with transaction.atomic(using=router.db_for_write(self.queryset.model)):
with transaction.atomic():
for obj in objects:
if hasattr(obj, 'snapshot'):
obj.snapshot()
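SequentialBulkCreatesMixin, shown above, exists so that a list posted to a REST endpoint is validated and saved one record at a time inside a single transaction, letting each record see the ones created before it (the rack free-space check is the canonical case). A hypothetical client-side illustration; the host, token, and numeric IDs are placeholders:

import requests

# Posting a JSON list triggers the sequential bulk-create path: the second device
# is validated against a database that already contains the first.
resp = requests.post(
    'https://netbox.example.com/api/dcim/devices/',
    headers={'Authorization': 'Token 0123456789abcdef0123456789abcdef01234567'},
    json=[
        {'name': 'device-1', 'device_type': 1, 'role': 1, 'site': 1},
        {'name': 'device-2', 'device_type': 1, 'role': 1, 'site': 1},
    ],
)
resp.raise_for_status()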


@ -231,19 +231,14 @@ SESSION_FILE_PATH = None
# DISK_BASE_UNIT = 1024
# RAM_BASE_UNIT = 1024
# Within the STORAGES dictionary, "default" is used for image uploads, "staticfiles" is for static files and "scripts"
# is used for custom scripts. See django-storages and django-storage-swift libraries for more details. By default the
# following configuration is used:
# STORAGES = {
# "default": {
# "BACKEND": "django.core.files.storage.FileSystemStorage",
# },
# "staticfiles": {
# "BACKEND": "django.contrib.staticfiles.storage.StaticFilesStorage",
# },
# "scripts": {
# "BACKEND": "extras.storage.ScriptFileSystemStorage",
# },
# By default, uploaded media is stored on the local filesystem. Using Django-storages is also supported. Provide the
# class path of the storage driver in STORAGE_BACKEND and any configuration options in STORAGE_CONFIG. For example:
# STORAGE_BACKEND = 'storages.backends.s3boto3.S3Boto3Storage'
# STORAGE_CONFIG = {
# 'AWS_ACCESS_KEY_ID': 'Key ID',
# 'AWS_SECRET_ACCESS_KEY': 'Secret',
# 'AWS_STORAGE_BUCKET_NAME': 'netbox',
# 'AWS_S3_REGION_NAME': 'eu-west-1',
# }
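Bridging the two configuration styles above: under the newer STORAGES dictionary, a remote backend such as the django-storages S3 driver would be wired in roughly as follows. This is a sketch only; the OPTIONS keys come from django-storages' documented constructor arguments, not from this repository:

# STORAGES = {
#     "default": {
#         "BACKEND": "storages.backends.s3boto3.S3Boto3Storage",
#         "OPTIONS": {
#             "access_key": "Key ID",
#             "secret_key": "Secret",
#             "bucket_name": "netbox",
#             "region_name": "eu-west-1",
#         },
#     },
#     "staticfiles": {
#         "BACKEND": "django.contrib.staticfiles.storage.StaticFilesStorage",
#     },
# }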
# Time zone (default: UTC)


@ -8,7 +8,6 @@ from django_pglocks import advisory_lock
from rq.timeouts import JobTimeoutException
from core.choices import JobStatusChoices
from core.exceptions import JobFailed
from core.models import Job, ObjectType
from netbox.constants import ADVISORY_LOCK_KEYS
from netbox.registry import registry
@ -74,21 +73,15 @@ class JobRunner(ABC):
This method is called by the Job Scheduler to handle the execution of all job commands. It will maintain the
job's metadata and handle errors. For periodic jobs, a new job is automatically scheduled using its `interval`.
"""
logger = logging.getLogger('netbox.jobs')
try:
job.start()
cls(job).run(*args, **kwargs)
job.terminate()
except JobFailed:
logger.warning(f"Job {job} failed")
job.terminate(status=JobStatusChoices.STATUS_FAILED)
except Exception as e:
job.terminate(status=JobStatusChoices.STATUS_ERRORED, error=repr(e))
if type(e) is JobTimeoutException:
logger.error(e)
logging.error(e)
# If the executed job is a periodic job, schedule its next execution at the specified interval.
finally:
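The JobFailed handling removed above is what lets a background job mark itself as failed (logged as a warning, no stack trace) instead of errored. A minimal sketch of a job that uses it; the job class is hypothetical, and the import path is the one used by current NetBox releases:

from core.exceptions import JobFailed
from netbox.jobs import JobRunner

class AuditJob(JobRunner):
    class Meta:
        name = 'Audit job'

    def run(self, *args, **kwargs):
        if kwargs.get('problems_found'):
            # Terminates the job with STATUS_FAILED rather than STATUS_ERRORED
            raise JobFailed()

# Enqueued elsewhere, e.g.: AuditJob.enqueue(immediate=True, problems_found=True)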


@ -1,90 +0,0 @@
import logging
from django.contrib.contenttypes.fields import GenericRelation
from django.db import router
from django.db.models.deletion import Collector
logger = logging.getLogger("netbox.models.deletion")
class CustomCollector(Collector):
"""
Custom collector that handles GenericRelations correctly.
"""
def collect(
self,
objs,
source=None,
nullable=False,
collect_related=True,
source_attr=None,
reverse_dependency=False,
keep_parents=False,
fail_on_restricted=True,
):
"""
Override collect to first collect standard dependencies,
then add GenericRelations to the dependency graph.
"""
# Call parent collect first to get all standard dependencies
super().collect(
objs,
source=source,
nullable=nullable,
collect_related=collect_related,
source_attr=source_attr,
reverse_dependency=reverse_dependency,
keep_parents=keep_parents,
fail_on_restricted=fail_on_restricted,
)
# Track which GenericRelations we've already processed to prevent infinite recursion
processed_relations = set()
# Now add GenericRelations to the dependency graph
for _, instances in list(self.data.items()):
for instance in instances:
# Get all GenericRelations for this model
for field in instance._meta.private_fields:
if isinstance(field, GenericRelation):
# Create a unique key for this relation
relation_key = f"{instance._meta.model_name}.{field.name}"
if relation_key in processed_relations:
continue
processed_relations.add(relation_key)
# Add the model that the generic relation points to as a dependency
self.add_dependency(field.related_model, instance, reverse_dependency=True)
class DeleteMixin:
"""
Mixin to override the model delete function to use our custom collector.
"""
def delete(self, using=None, keep_parents=False):
"""
Override delete to use our custom collector.
"""
using = using or router.db_for_write(self.__class__, instance=self)
assert self._get_pk_val() is not None, "%s object can't be deleted because its %s attribute is set to None." % (
self._meta.object_name,
self._meta.pk.attname,
)
collector = CustomCollector(using=using)
collector.collect([self], keep_parents=keep_parents)
return collector.delete()
delete.alters_data = True
@classmethod
def verify_mro(cls, instance):
"""
Verify that this mixin is first in the MRO.
"""
mro = instance.__class__.__mro__
if mro.index(cls) != 0:
raise RuntimeError(f"{cls.__name__} must be first in the MRO. Current MRO: {mro}")
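For context on the deleted module above: the collector walks each instance's GenericRelations and adds the related models to the deletion dependency graph, so generically related objects are removed in a safe order. A small self-contained sketch of the kind of relation involved; the models are hypothetical, not NetBox's:

from django.contrib.contenttypes.fields import GenericForeignKey, GenericRelation
from django.contrib.contenttypes.models import ContentType
from django.db import models

class Attachment(models.Model):
    content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE)
    object_id = models.PositiveBigIntegerField()
    parent = GenericForeignKey('content_type', 'object_id')

class Region(models.Model):
    name = models.CharField(max_length=100)
    # The declared GenericRelation is what the collector inspects (via
    # _meta.private_fields) to learn that Attachments depend on Region.
    attachments = GenericRelation(Attachment)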


@ -16,7 +16,6 @@ from extras.choices import *
from extras.constants import CUSTOMFIELD_EMPTY_VALUES
from extras.utils import is_taggable
from netbox.config import get_config
from netbox.models.deletion import DeleteMixin
from netbox.registry import registry
from netbox.signals import post_clean
from utilities.json import CustomFieldJSONEncoder
@ -46,7 +45,7 @@ __all__ = (
# Feature mixins
#
class ChangeLoggingMixin(DeleteMixin, models.Model):
class ChangeLoggingMixin(models.Model):
"""
Provides change logging support for a model. Adds the `created` and `last_updated` fields.
"""


@ -1,8 +1,6 @@
from dataclasses import dataclass
from typing import Sequence, Optional
from django.urls import reverse_lazy
__all__ = (
'get_model_item',
@ -24,46 +22,20 @@ class MenuItemButton:
link: str
title: str
icon_class: str
_url: Optional[str] = None
permissions: Optional[Sequence[str]] = ()
color: Optional[str] = None
def __post_init__(self):
if self.link:
self._url = reverse_lazy(self.link)
@property
def url(self):
return self._url
@url.setter
def url(self, value):
self._url = value
@dataclass
class MenuItem:
link: str
link_text: str
_url: Optional[str] = None
permissions: Optional[Sequence[str]] = ()
auth_required: Optional[bool] = False
staff_only: Optional[bool] = False
buttons: Optional[Sequence[MenuItemButton]] = ()
def __post_init__(self):
if self.link:
self._url = reverse_lazy(self.link)
@property
def url(self):
return self._url
@url.setter
def url(self, value):
self._url = value
@dataclass
class MenuGroup:


@ -1,4 +1,3 @@
from django.urls import reverse_lazy
from django.utils.text import slugify
from django.utils.translation import gettext as _
@ -33,23 +32,17 @@ class PluginMenuItem:
This class represents a navigation menu item. This constitutes the primary link and its text, but also allows for
specifying additional link buttons that appear to the right of the item in the nav menu.
Links are specified as Django reverse URL strings suitable for rendering via {% url item.link %}.
Alternatively, a pre-generated url can be set on the object which will be rendered literally.
Links are specified as Django reverse URL strings.
Buttons are each specified as a list of PluginMenuButton instances.
"""
permissions = []
buttons = []
_url = None
def __init__(
self, link, link_text, auth_required=False, staff_only=False, permissions=None, buttons=None
):
def __init__(self, link, link_text, auth_required=False, staff_only=False, permissions=None, buttons=None):
self.link = link
self.link_text = link_text
self.auth_required = auth_required
self.staff_only = staff_only
if link:
self._url = reverse_lazy(link)
if permissions is not None:
if type(permissions) not in (list, tuple):
raise TypeError(_("Permissions must be passed as a tuple or list."))
@ -59,14 +52,6 @@ class PluginMenuItem:
raise TypeError(_("Buttons must be passed as a tuple or list."))
self.buttons = buttons
@property
def url(self):
return self._url
@url.setter
def url(self, value):
self._url = value
class PluginMenuButton:
"""
@ -75,14 +60,11 @@ class PluginMenuButton:
"""
color = ButtonColorChoices.DEFAULT
permissions = []
_url = None
def __init__(self, link, title, icon_class, color=None, permissions=None):
self.link = link
self.title = title
self.icon_class = icon_class
if link:
self._url = reverse_lazy(link)
if permissions is not None:
if type(permissions) not in (list, tuple):
raise TypeError(_("Permissions must be passed as a tuple or list."))
@ -91,11 +73,3 @@ class PluginMenuButton:
if color not in ButtonColorChoices.values():
raise ValueError(_("Button color must be a choice within ButtonColorChoices."))
self.color = color
@property
def url(self):
return self._url
@url.setter
def url(self, value):
self._url = value
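Both navigation modules above change how a menu item's link becomes a URL: on main the view name is resolved with reverse_lazy and exposed through a url property, while on v4.3.2 the template resolves it with {% url %}. A plugin declares its items the same way in either case; a minimal sketch with hypothetical plugin and view names:

from netbox.plugins import PluginMenuButton, PluginMenuItem

menu_items = (
    PluginMenuItem(
        link='plugins:example_plugin:widget_list',   # a Django view name, not a literal URL
        link_text='Widgets',
        permissions=['example_plugin.view_widget'],
        buttons=(
            PluginMenuButton(
                link='plugins:example_plugin:widget_add',
                title='Add',
                icon_class='mdi mdi-plus-thick',
            ),
        ),
    ),
)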


@ -54,14 +54,6 @@ PREFERENCES = {
default='bottom',
description=_('Where the paginator controls will be displayed relative to a table')
),
'ui.tables.striping': UserPreference(
label=_('Striped table rows'),
choices=(
('', _('Disabled')),
('true', _('Enabled')),
),
description=_('Render table rows with alternating colors to increase readability'),
),
# Miscellaneous
'data_format': UserPreference(


@ -115,13 +115,11 @@ class CachedValueSearchBackend(SearchBackend):
if lookup in (LookupTypes.STARTSWITH, LookupTypes.ENDSWITH):
# "Starts/ends with" matches are valid only on string values
query_filter &= Q(type=FieldTypes.STRING)
elif lookup in (LookupTypes.PARTIAL, LookupTypes.EXACT):
elif lookup == LookupTypes.PARTIAL:
try:
# If the value looks like an IP address, add extra filters for CIDR/INET values
# If the value looks like an IP address, add an extra match for CIDR values
address = str(netaddr.IPNetwork(value.strip()).cidr)
query_filter |= Q(type=FieldTypes.INET) & Q(value__net_host=address)
if lookup == LookupTypes.PARTIAL:
query_filter |= Q(type=FieldTypes.CIDR) & Q(value__net_contains_or_equals=address)
query_filter |= Q(type=FieldTypes.CIDR) & Q(value__net_contains_or_equals=address)
except (AddrFormatError, ValueError):
pass
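The lookup handling above normalizes any query that parses as an IP before matching it against cached INET and CIDR values; the .cidr call strips host bits so the query aligns with stored prefixes. A quick standalone illustration of that normalization:

import netaddr

# A host-form query is reduced to its network before comparison
print(str(netaddr.IPNetwork('192.0.2.1/24').cidr))   # 192.0.2.0/24

# A bare address receives an implicit /32 (or /128 for IPv6)
print(str(netaddr.IPNetwork('192.0.2.1').cidr))      # 192.0.2.1/32

# Values that do not parse fall back to the plain string match
try:
    netaddr.IPNetwork('not an ip')
except (netaddr.AddrFormatError, ValueError):
    pass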


@ -66,9 +66,6 @@ class BaseTable(tables.Table):
if column.visible:
model = getattr(self.Meta, 'model')
accessor = column.accessor
if accessor.startswith('custom_field_data__'):
# Ignore custom field references
continue
prefetch_path = []
for field_name in accessor.split(accessor.SEPARATOR):
try:
@ -166,8 +163,6 @@ class BaseTable(tables.Table):
columns = userconfig.get(f"tables.{self.name}.columns")
if ordering is None:
ordering = userconfig.get(f"tables.{self.name}.ordering")
if userconfig.get("ui.tables.striping"):
self.attrs['class'] += ' table-striped'
# Fall back to the default columns & ordering
if columns is None and hasattr(settings, 'DEFAULT_USER_PREFERENCES'):


@ -7,15 +7,11 @@ from django_rq import get_queue
from ..jobs import *
from core.models import DataSource, Job
from core.choices import JobStatusChoices
from core.exceptions import JobFailed
from utilities.testing import disable_warnings
class TestJobRunner(JobRunner):
def run(self, *args, **kwargs):
if kwargs.get('make_fail', False):
raise JobFailed()
pass
class JobRunnerTestCase(TestCase):
@ -53,12 +49,6 @@ class JobRunnerTest(JobRunnerTestCase):
self.assertEqual(job.status, JobStatusChoices.STATUS_COMPLETED)
def test_handle_failed(self):
with disable_warnings('netbox.jobs'):
job = TestJobRunner.enqueue(immediate=True, make_fail=True)
self.assertEqual(job.status, JobStatusChoices.STATUS_FAILED)
def test_handle_errored(self):
class ErroredJobRunner(TestJobRunner):
EXP = Exception('Test error')


@ -6,7 +6,7 @@ from django.contrib import messages
from django.contrib.contenttypes.fields import GenericForeignKey, GenericRel
from django.contrib.contenttypes.models import ContentType
from django.core.exceptions import FieldDoesNotExist, ObjectDoesNotExist, ValidationError
from django.db import IntegrityError, router, transaction
from django.db import transaction, IntegrityError
from django.db.models import ManyToManyField, ProtectedError, RestrictedError
from django.db.models.fields.reverse_related import ManyToManyRel
from django.forms import ModelMultipleChoiceField, MultipleHiddenInput
@ -278,7 +278,7 @@ class BulkCreateView(GetReturnURLMixin, BaseMultiObjectView):
logger.debug("Form validation was successful")
try:
with transaction.atomic(using=router.db_for_write(model)):
with transaction.atomic():
new_objs = self._create_objects(form, request)
# Enforce object-level permissions
@ -501,7 +501,7 @@ class BulkImportView(GetReturnURLMixin, BaseMultiObjectView):
try:
# Iterate through data and bind each record to a new model form instance.
with transaction.atomic(using=router.db_for_write(model)):
with transaction.atomic():
new_objs = self.create_and_update_objects(form, request)
# Enforce object-level permissions
@ -681,7 +681,7 @@ class BulkEditView(GetReturnURLMixin, BaseMultiObjectView):
if form.is_valid():
logger.debug("Form validation was successful")
try:
with transaction.atomic(using=router.db_for_write(model)):
with transaction.atomic():
updated_objects = self._update_objects(form, request)
# Enforce object-level permissions
@ -778,7 +778,7 @@ class BulkRenameView(GetReturnURLMixin, BaseMultiObjectView):
if form.is_valid():
try:
with transaction.atomic(using=router.db_for_write(self.queryset.model)):
with transaction.atomic():
renamed_pks = self._rename_objects(form, selected_objects)
if '_apply' in request.POST:
@ -875,7 +875,7 @@ class BulkDeleteView(GetReturnURLMixin, BaseMultiObjectView):
queryset = self.queryset.filter(pk__in=pk_list)
deleted_count = queryset.count()
try:
with transaction.atomic(using=router.db_for_write(model)):
with transaction.atomic():
for obj in queryset:
# Take a snapshot of change-logged models
if hasattr(obj, 'snapshot'):
@ -980,7 +980,7 @@ class BulkComponentCreateView(GetReturnURLMixin, BaseMultiObjectView):
}
try:
with transaction.atomic(using=router.db_for_write(self.queryset.model)):
with transaction.atomic():
for obj in data['pk']:


@ -1,7 +1,7 @@
from django.contrib.auth.mixins import LoginRequiredMixin
from django.contrib.contenttypes.models import ContentType
from django.contrib import messages
from django.db import router, transaction
from django.db import transaction
from django.db.models import Q
from django.shortcuts import get_object_or_404, redirect, render
from django.utils.translation import gettext_lazy as _
@ -240,7 +240,7 @@ class BulkSyncDataView(GetReturnURLMixin, BaseMultiObjectView):
data_file__isnull=False
)
with transaction.atomic(using=router.db_for_write(self.queryset.model)):
with transaction.atomic():
for obj in selected_objects:
obj.sync(save=True)


@ -282,7 +282,7 @@ class ObjectEditView(GetReturnURLMixin, BaseObjectView):
logger.debug("Form validation was successful")
try:
with transaction.atomic(using=router.db_for_write(model)):
with transaction.atomic():
object_created = form.instance.pk is None
obj = form.save()
@ -570,7 +570,7 @@ class ComponentCreateView(GetReturnURLMixin, BaseObjectView):
if not form.errors and not component_form.errors:
try:
with transaction.atomic(using=router.db_for_write(self.queryset.model)):
with transaction.atomic():
# Create the new components
new_objs = []
for component_form in new_components:



@ -23,14 +23,14 @@
},
"dependencies": {
"@mdi/font": "7.4.47",
"@tabler/core": "1.4.0",
"bootstrap": "5.3.7",
"@tabler/core": "1.3.2",
"bootstrap": "5.3.6",
"clipboard": "2.0.11",
"flatpickr": "4.6.13",
"gridstack": "12.2.2",
"htmx.org": "2.0.6",
"query-string": "9.2.2",
"sass": "1.89.2",
"gridstack": "12.2.1",
"htmx.org": "2.0.4",
"query-string": "9.2.0",
"sass": "1.89.1",
"tom-select": "2.4.3",
"typeface-inter": "3.18.1",
"typeface-roboto-mono": "1.1.13"
@ -39,15 +39,15 @@
"@types/bootstrap": "5.2.10",
"@types/cookie": "^0.6.0",
"@types/node": "^22.3.0",
"@typescript-eslint/eslint-plugin": "^8.37.0",
"@typescript-eslint/parser": "^8.37.0",
"esbuild": "^0.25.6",
"@typescript-eslint/eslint-plugin": "^8.1.0",
"@typescript-eslint/parser": "^8.1.0",
"esbuild": "^0.25.3",
"esbuild-sass-plugin": "^3.3.1",
"eslint": "<9.0",
"eslint-config-prettier": "^9.1.0",
"eslint-import-resolver-typescript": "^3.6.3",
"eslint-plugin-import": "^2.32.0",
"eslint-plugin-prettier": "^5.5.1",
"eslint-plugin-import": "^2.30.0",
"eslint-plugin-prettier": "^5.2.1",
"prettier": "^3.3.3",
"typescript": "<5.5"
},


@ -1,4 +0,0 @@
.rack-loading-container {
min-height: 200px;
margin-left: 30px;
}


@ -27,4 +27,3 @@
@import 'custom/markdown';
@import 'custom/misc';
@import 'custom/notifications';
@import 'custom/racks';

File diff suppressed because it is too large


@ -1,3 +1,3 @@
version: "4.3.4"
version: "4.3.2"
edition: "Community"
published: "2025-07-15"
published: "2025-06-05"


@ -45,7 +45,7 @@
</div>
{% elif perms.dcim.add_cable %}
<div class="dropdown">
<button type="button" class="btn btn-primary dropdown-toggle" data-bs-toggle="dropdown" aria-haspopup="true" aria-expanded="false">
<button type="button" class="btn btn-success dropdown-toggle" data-bs-toggle="dropdown" aria-haspopup="true" aria-expanded="false">
<span class="mdi mdi-ethernet-cable" aria-hidden="true"></span> {% trans "Connect" %}
</button>
<ul class="dropdown-menu">


@ -65,7 +65,7 @@
{% trans "Not Connected" %}
{% if perms.dcim.add_cable %}
<div class="dropdown float-end">
<button type="button" class="btn btn-primary dropdown-toggle" data-bs-toggle="dropdown" aria-expanded="false">
<button type="button" class="btn btn-primary btn-sm dropdown-toggle" data-bs-toggle="dropdown" aria-expanded="false">
<span class="mdi mdi-ethernet-cable" aria-hidden="true"></span> {% trans "Connect" %}
</button>
<ul class="dropdown-menu dropdown-menu-end">


@ -65,7 +65,7 @@
{% trans "Not Connected" %}
{% if perms.dcim.add_cable %}
<div class="dropdown float-end">
<button type="button" class="btn btn-primary dropdown-toggle" data-bs-toggle="dropdown" aria-expanded="false">
<button type="button" class="btn btn-primary btn-sm dropdown-toggle" data-bs-toggle="dropdown" aria-expanded="false">
<span class="mdi mdi-ethernet-cable" aria-hidden="true"></span> {% trans "Connect" %}
</button>
<ul class="dropdown-menu dropdown-menu-end">


@ -308,7 +308,7 @@
{% trans "Services" %}
{% if perms.ipam.add_service %}
<div class="card-actions">
<a href="{% url 'ipam:service_add' %}?parent_object_type={{ object|content_type_id }}&parent={{ object.pk }}" class="btn btn-ghost-primary btn-sm">
<a href="{% url 'ipam:service_add' %}?device={{ object.pk }}" class="btn btn-ghost-primary btn-sm">
<span class="mdi mdi-plus-thick" aria-hidden="true"></span> {% trans "Add a service" %}
</a>
</div>


@ -1,17 +1,6 @@
{% load i18n %}
<div style="margin-left: -30px">
<div
hx-get="{% url 'dcim-api:rack-elevation' pk=object.pk %}?face={{ face }}&render=svg{% if extra_params %}&{{ extra_params }}{% endif %}"
hx-trigger="intersect"
hx-swap="outerHTML"
aria-label="{% trans "Rack elevation" %}"
>
<div class="d-flex justify-content-center align-items-center rack-loading-container">
<div class="spinner-border" role="status">
<span class="visually-hidden">{% trans "Loading..." %}</span>
</div>
</div>
</div>
<object data="{% url 'dcim-api:rack-elevation' pk=object.pk %}?face={{face}}&render=svg{% if extra_params %}&{{ extra_params }}{% endif %}" class="rack_elevation" aria-label="{% trans "Rack elevation" %}"></object>
</div>
<div class="text-center mt-3">
<a class="btn btn-outline-primary" href="{% url 'dcim-api:rack-elevation' pk=object.pk %}?face={{face}}&render=svg{% if extra_params %}&{{ extra_params }}{% endif %}" hx-boost="false">


@ -118,6 +118,10 @@
{% else %}
<div class="card-body text-muted">
{% trans "Not connected" %}
</div>
{% endif %}
{% if not object.mark_connected and not object.cable %}
<div class="card-footer">
{% if perms.dcim.add_cable %}
<a href="{% url 'dcim:cable_add' %}?a_terminations_type=dcim.powerfeed&a_terminations={{ object.pk }}&b_terminations_type=dcim.powerport&return_url={{ object.get_absolute_url }}" class="btn btn-primary float-end">
<i class="mdi mdi-ethernet-cable" aria-hidden="true"></i> {% trans "Connect" %}


@ -14,7 +14,7 @@
</tr>
<tr>
<th scope="row">Description</th>
<td>{{ object.description|placeholder }}</td>
<td>{{ object.description|markdown|placeholder }}</td>
</tr>
<tr>
<th scope="row">Base Choices</th>


@ -53,16 +53,7 @@
{# Script output. Legacy reports will not have this. #}
{% if 'output' in job.data %}
<div class="card mb-3">
<h2 class="card-header d-flex justify-content-between">
{% trans "Output" %}
{% if job.completed %}
<div>
<a href="?export=output" class="btn btn-primary lh-1" role="button">
<i class="mdi mdi-download" aria-hidden="true"></i> {% trans "Download" %}
</a>
</div>
{% endif %}
</h2>
<h2 class="card-header">{% trans "Output" %}</h2>
{% if job.data.output %}
<pre class="card-body font-monospace">{{ job.data.output }}</pre>
{% else %}


@ -29,7 +29,11 @@
<div class="hr-text">
<span>{% trans "Custom Fields" %}</span>
</div>
{% render_custom_fields filter_form %}
{% for name in filter_form.custom_fields %}
{% with field=filter_form|get_item:name %}
{% render_field field %}
{% endwith %}
{% endfor %}
</div>
{% endif %}
</div>


@ -154,7 +154,7 @@
{% trans "Services" %}
{% if perms.ipam.add_service %}
<div class="card-actions">
<a href="{% url 'ipam:service_add' %}?parent_object_type={{ object|content_type_id }}&parent={{ object.pk }}" class="btn btn-ghost-primary btn-sm">
<a href="{% url 'ipam:service_add' %}?virtual_machine={{ object.pk }}" class="btn btn-ghost-primary btn-sm">
<span class="mdi mdi-plus-thick" aria-hidden="true"></span> {% trans "Add a service" %}
</a>
</div>


@ -19,10 +19,6 @@ class ContactGroupTable(NetBoxTable):
verbose_name=_('Name'),
linkify=True
)
parent = tables.Column(
verbose_name=_('Parent'),
linkify=True,
)
contact_count = columns.LinkedCountColumn(
viewname='tenancy:contact_list',
url_params={'group_id': 'pk'},
@ -38,7 +34,7 @@ class ContactGroupTable(NetBoxTable):
class Meta(NetBoxTable.Meta):
model = ContactGroup
fields = (
'pk', 'name', 'parent', 'contact_count', 'description', 'comments', 'slug', 'tags', 'created',
'pk', 'name', 'contact_count', 'description', 'comments', 'slug', 'tags', 'created',
'last_updated', 'actions',
)
default_columns = ('pk', 'name', 'contact_count', 'description')


@ -16,10 +16,6 @@ class TenantGroupTable(NetBoxTable):
verbose_name=_('Name'),
linkify=True
)
parent = tables.Column(
verbose_name=_('Parent'),
linkify=True,
)
tenant_count = columns.LinkedCountColumn(
viewname='tenancy:tenant_list',
url_params={'group_id': 'pk'},
@ -35,7 +31,7 @@ class TenantGroupTable(NetBoxTable):
class Meta(NetBoxTable.Meta):
model = TenantGroup
fields = (
'pk', 'id', 'name', 'parent', 'tenant_count', 'description', 'comments', 'slug', 'tags', 'created',
'pk', 'id', 'name', 'tenant_count', 'description', 'comments', 'slug', 'tags', 'created',
'last_updated', 'actions',
)
default_columns = ('pk', 'name', 'tenant_count', 'description')

File diff suppressed because it is too large
Some files were not shown because too many files have changed in this diff