Compare commits

..

201 Commits

Author SHA1 Message Date
Arthur
588c069ff1 #20378 fix delete of DataSource 2025-11-06 15:57:07 -08:00
Robin Schneider
3cdc6251be docs(configuration): PROTECTION_RULES missing in list
Closes: #20709
2025-11-04 09:53:06 -05:00
jniec-js
0e1705b870 Closes #20297: add additional coaxial cable type choices (#20741) 2025-11-04 08:45:37 -06:00
github-actions
cbf9b62f12 Update source translation strings
2025-11-01 05:02:02 +00:00
Martin Hauser
c429cc3638 Closes #14171: Add VLAN-related fields to import forms (#20730)
2025-10-31 16:17:58 -05:00
Jeremy Stretch
032ed4f11c Closes #20715: Remove OpenAPI schema check from pre-commit (#20716)
2025-10-31 09:29:56 -07:00
Jason Novinger
7ca4342c15 Fixes #20721: Fix breadcrumb link on task detail page (#20724) 2025-10-31 09:29:28 -07:00
Martin Hauser
70bc1c226a fix(utilities): Ensure unique signal handlers for counter models
Updates `connect_counters` to prevent duplicate signal handlers by
using consistent `dispatch_uid` values per sender. Adds a check to
avoid reconnecting models already processed during registration.

Fixes #20697
2025-10-31 10:12:41 -04:00
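A minimal sketch of the deduplication approach described above (names such as `update_counts` are placeholders, not the actual NetBox helpers): a stable `dispatch_uid` per sender makes repeated registration idempotent.

```python
from django.db.models.signals import post_delete, post_save

def update_counts(sender, instance, **kwargs):
    # Placeholder handler; the real project recalculates cached counters here.
    pass

def connect_counters(*models):
    # Connect each signal at most once per sender: a consistent dispatch_uid
    # means calling this function again cannot attach duplicate handlers.
    for model in models:
        label = model._meta.label_lower
        post_save.connect(update_counts, sender=model, dispatch_uid=f'update_counts.{label}')
        post_delete.connect(update_counts, sender=model, dispatch_uid=f'update_counts.{label}')
```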
Robin Schneider
6a21459ccc docs(configuration): close Markdown inline code, "`" was forgotten
https://netboxlabs.com/docs/netbox/configuration/security/#csrf_trusted_origins
2025-10-31 08:17:48 -04:00
github-actions
635de4af2e Update source translation strings
2025-10-31 05:03:42 +00:00
Robin Gruyters
df96f7dd0f Closes #20647: add cleanup for interface import (#20702)
Co-authored-by: Robin Gruyters <2082795+rgruyters@users.noreply.github.com>
Co-authored-by: Martin Hauser <git@pheus.dev>
2025-10-30 20:08:24 -05:00
Jeremy Stretch
0b61d69e05 Fixes #20713: Record pre-change snapshots on VC members being added/removed (#20714)
2025-10-30 07:50:10 -05:00
github-actions
df688ce064 Update source translation strings
2025-10-30 05:03:02 +00:00
bctiemann
1a1ab2a19d Merge pull request #20708 from netbox-community/20699-changelog-ordering
Fixes #20699: Ensure proper ordering of changelog entries resulting from cascading deletions
2025-10-29 14:13:21 -04:00
Jo
80f03daad6 Improved docs on background jobs on instances (#20489) 2025-10-29 10:15:49 -07:00
Jeremy Stretch
d04c41d0f6 Add test for ordering of cascading deletions 2025-10-29 09:22:17 -04:00
github-actions
1fc849eb40 Update source translation strings
2025-10-29 05:02:12 +00:00
Jeremy Stretch
bbf1f6181d Extend custom collector to force expected ordering of cascading deletions 2025-10-28 16:37:21 -04:00
Jeremy Stretch
729b0365e0 Fix errant update of objects being deleted via cascade 2025-10-28 15:13:03 -04:00
Jeremy Stretch
43cb476223 Release v4.4.5
2025-10-28 14:34:18 -04:00
Martin Hauser
d6f756d315 feat(tables): Add ContactsColumnMixin to multiple tables
Integrate `ContactsColumnMixin` into various IPAM and VPN tables to
improve contact management. Updates table fields to include `contacts`.

Fixes #20700
2025-10-28 13:34:27 -04:00
Martin Hauser
afc62b6ffd fix(ipam): Correct VLAN ID range calculation logic
Adjust VLAN ID range calculation to use half‑open intervals for
consistency. Add a test to validate `_total_vlan_ids`.

Fixes #20610
2025-10-28 13:14:34 -04:00
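With half-open intervals the ID count falls out of simple subtraction; a minimal sketch assuming `(lower, upper)` tuples (not the actual `_total_vlan_ids` implementation):

```python
def total_vlan_ids(vid_ranges):
    # Each half-open interval [lower, upper) contributes exactly upper - lower IDs.
    return sum(upper - lower for lower, upper in vid_ranges)

# Example: (1, 100) covers VIDs 1-99 and (200, 300) covers 200-299, i.e. 199 IDs total.
assert total_vlan_ids([(1, 100), (200, 300)]) == 199
```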
bctiemann
3d4841f17f Merge pull request #20612 from pheus/20301-add-clear-all-option-to-user-notifications-dropdown
Closes #20301: Add "Dismiss all" action to notifications dropdown
2025-10-28 12:08:53 -04:00
Alexander Zimin
2aefb3af73 Add contacts field to ip addresses table view #20692 2025-10-28 08:48:36 -04:00
github-actions
4eff4d6a4a Update source translation strings
2025-10-28 05:03:24 +00:00
rinna11
9381564cab Fixes #20422: Allow Aggregate and Prefix to filter by family in GraphQL (#20626)
Co-authored-by: Rinna Izumi <rizumi@bethel.jw.org>
Co-authored-by: Jason Novinger <jnovinger@gmail.com>
2025-10-27 09:02:28 -05:00
Jeremy Stretch
3d143d635b Closes #20675: Enable NetBox Copilot integration (#20682) 2025-10-27 08:54:38 -05:00
Martin Hauser
77307b3c91 fix(users): Disable sorting on Permission flag columns
Mark `can_view`, `can_add`, `can_change`, and `can_delete` columns in
the Permissions list as `orderable=False`. Sorting by these computed
flags persisted an invalid sort key which triggers a `FieldError` when
loading `/users/permissions/`.

Fixes #20655
2025-10-27 09:25:36 -04:00
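A minimal django-tables2 sketch of the fix described above (table and column names are illustrative): declaring the computed flag columns as non-orderable keeps the UI from persisting a sort key the database cannot resolve.

```python
import django_tables2 as tables

class ObjectPermissionTable(tables.Table):
    # Computed flags cannot be sorted at the database level, so disable ordering.
    can_view = tables.BooleanColumn(orderable=False)
    can_add = tables.BooleanColumn(orderable=False)
    can_change = tables.BooleanColumn(orderable=False)
    can_delete = tables.BooleanColumn(orderable=False)
```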
bctiemann
aa4571b61f Merge pull request #20672 from pheus/20389-allow-all-bulk-rename
Fixes #20389: Add FilterSet support to BulkRenameView
2025-10-27 09:23:39 -04:00
Jo
56d9146323 Fixes #20499: Documented ObjectListView quick search feature for plugins (#20500)
2025-10-26 20:59:59 -05:00
github-actions
e192f64dd2 Update source translation strings
2025-10-26 05:03:34 +00:00
Martin Hauser
d433a28524 Fixes #20646: Prevent cables from connecting to marked objects (#20678)
2025-10-25 10:22:03 -05:00
Pl0xym0r
dbfdf318ad Closes #20459: clean is_oob and is_primary on bulk_import (#20657) 2025-10-25 10:10:20 -05:00
Martin Hauser
639bc4462b Fixes #20541: Enhance filter methods with dynamic prefixing (#20579)
2025-10-24 14:58:31 -05:00
Jeremy Stretch
1c59d411f7 Apply the "netbox" label automatically for all new issues (#20666)
2025-10-24 09:27:41 -05:00
Martin Hauser
ac7a4ec4a3 feat(views): Add FilterSet support to BulkRenameView
Allow passing a FilterSet to BulkRenameView for consistent behavior with
BulkEditView and BulkDeleteView. Enables the
"Select all N matching query" functionality to expand across the full
queryset. Updates logic to handle PK lists appropriately when editing
all matched objects.

Fixes #20389
2025-10-24 14:43:35 +02:00
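The "Select all N matching query" expansion described above can be sketched as follows (a hedged illustration, not the actual BulkRenameView code; the `_all` flag and the filterset call are assumptions based on the description):

```python
def resolve_pk_list(request, model, filterset_class=None):
    # When the user opted to select all objects matching the current filters,
    # expand the selection to the full filtered queryset rather than only the
    # PKs present on the current page.
    if request.POST.get('_all') and filterset_class is not None:
        queryset = filterset_class(request.GET, model.objects.all()).qs
        return list(queryset.values_list('pk', flat=True))
    return request.POST.getlist('pk')
```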
github-actions
0cf58e62b2 Update source translation strings
2025-10-24 05:02:27 +00:00
Jason Novinger
fb8d41b527 Fixes #20641: Handle viewsets with queryset=None in get_view_name() (#20642)
The get_view_name() utility function crashed with AttributeError when
called on viewsets that override get_queryset() without setting a
class-level queryset attribute (e.g., ObjectChangeViewSet).

This pattern became necessary in #20089 to force re-evaluation of
valid_models() on each request, ensuring ObjectChange querysets reflect
current ContentType state.

Added None check to fall back to DRF's default view naming when no
class-level queryset exists.
2025-10-23 09:39:49 -07:00
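A sketch of the None check described above (the naming fallback is simplified and illustrative, not DRF's exact default):

```python
def get_view_name(view):
    # Viewsets that only override get_queryset() may have no class-level
    # queryset; fall back to a name derived from the view class itself.
    queryset = getattr(view, 'queryset', None)
    if queryset is None:
        return view.__class__.__name__
    return str(queryset.model._meta.verbose_name_plural).title()
```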
bctiemann
ae5d7911f9 Merge pull request #20665 from netbox-community/20637-improve-device-q-filter
Fixes #20637: Omit inventory item serials from device search filter to improve performance
2025-10-23 11:08:22 -04:00
Jeremy Stretch
3bd0186870 Fixes #20637: Omit inventory item serials from device search filter to improve performance 2025-10-23 10:11:08 -04:00
bctiemann
09ce8a808d Merge pull request #20651 from netbox-community/19872-script-validation-errors
Fixes #19872: Display script form validation errors
2025-10-23 09:59:29 -04:00
Martin Hauser
8eaff9dce7 feat(extras): Add "Dismiss all" action to notifications dropdown
Introduce a view to allow users to dismiss all unread notifications with
a single action. Update the notifications template to include a
"Dismiss all" button for enhanced usability. This addition streamlines
notification management and improves the user experience.

Fixes #20301
2025-10-22 13:59:54 +02:00
github-actions
cb3308a166 Update source translation strings
2025-10-22 05:02:23 +00:00
Jason Novinger
5fbae8407e Only show non-rendered field errors in toast
When script form validation fails, display error messages for fields not
in fieldsets. Fields in fieldsets show inline errors only; hidden fields
show toast notifications to provide feedback instead of failing silently.
2025-10-21 11:54:46 -05:00
Jason Novinger
2fdd46f64c Fixes #19872: Display form validation errors for script execution
When script form validation fails (e.g., required fields excluded from
fieldsets), display error messages via Django's message framework instead
of failing silently. Error format: "field: error1, error2; field2: error".
2025-10-21 11:16:56 -05:00
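A minimal sketch of the error reporting described above (the helper name is hypothetical): flatten `form.errors` into the "field: error1, error2; field2: error" format and surface it through Django's messages framework.

```python
from django.contrib import messages

def report_form_errors(request, form):
    # Render validation errors as a single message instead of failing silently.
    rendered = '; '.join(
        f"{field}: {', '.join(errors)}"
        for field, errors in form.errors.items()
    )
    messages.error(request, rendered)
```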
Martin Hauser
c5124cb2e4 feat(templates): Update user menu icon class names for consistency
Switch icons in the top-right User dropdown to Tabler’s
`dropdown-item-icon` to standardize spacing between the icon and label.
Improves readability and ensures alignment with the overall UI styling.

Fixes #20608
2025-10-21 08:35:50 -04:00
Jason Novinger
d01d7b4156 Fixes #20551: Support quick-add form prefix in automatic slug generation (#20624)
* Fixes #20551: Support quick-add form prefix in automatic slug generation

The slug generation logic in `reslug.ts` looks for form fields using hard-coded ID selectors like `#id_slug` and `#id_name`. In quick-add modals, Django applies a `quickadd` prefix to form fields (introduced in #20542), resulting in IDs like `#id_quickadd-slug` and `#id_quickadd-name`. The logic couldn't find these prefixed fields, so automatic slug generation failed silently in quick-add modals. This fix updates the field selectors to try both unprefixed and prefixed patterns using the nullish coalescing operator (`??`), checking for the standard field ID first and falling back to the quickadd-prefixed ID if the standard one isn't found.

* Address PR feedback

The slug generation logic required updates to support form prefixes like `quickadd`. Python-side changes
ensure `SlugField.get_bound_field()` updates the `slug-source` attribute to include the form prefix when
present, so JavaScript receives the correct prefixed field ID. `SlugWidget.__init__()` now adds a
`slug-field` class to enable selector-based field discovery. On the frontend, `reslug.ts` now uses class
selectors (`button.reslug` and `input.slug-field`) instead of ID-based lookups, eliminating the need for
fallback logic. The template was updated to use `class="reslug"` instead of `id="reslug"` on the button to
avoid ID duplication issues.
2025-10-21 08:33:10 -04:00
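The Python side of the prefix handling might look roughly like this (a sketch under the assumptions stated in the commit; attribute and class names mirror the description, not the exact NetBox code):

```python
from django import forms

class SlugField(forms.SlugField):
    def get_bound_field(self, form, field_name):
        bound_field = super().get_bound_field(form, field_name)
        # When the form carries a prefix (e.g. "quickadd"), rewrite the
        # slug-source attribute so reslug.ts can resolve the prefixed field ID.
        source = self.widget.attrs.get('slug-source')
        if form.prefix and source:
            self.widget.attrs['slug-source'] = f'{form.prefix}-{source}'
        return bound_field
```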
github-actions
4db6123fb2 Update source translation strings
2025-10-21 05:03:30 +00:00
Jeremy Stretch
43648d629b Fixes #20606: Enable copying text from badges in UI (#20633) 2025-10-20 17:12:42 -05:00
bctiemann
0b97df0984 Merge pull request #20625 from netbox-community/20498-url-custom-field-validation-regex
Fixes #20498: Apply validation regex to URL custom fields
2025-10-20 15:30:33 -04:00
Martin Hauser
5334c8143c feat(forms): Add context handling for ModuleBay field (#20586) 2025-10-20 10:16:53 -07:00
Martin Hauser
bbb330becf feat(filtersets): Add assigned and primary filters for MACAddress (#20620)
Introduce Boolean filters `assigned` and `primary` to the MACAddress
filterset, improving filtering capabilities. Update forms, tables, and
GraphQL queries to incorporate the new filters. Add tests to validate
the correct functionality.

Fixes #20399
2025-10-20 10:01:25 -07:00
Jeremy Stretch
e4c74ce6a3 Closes #20614: Update ruff for pre-commit check (#20631) 2025-10-20 09:07:12 -07:00
Martin Hauser
a4868f894d feat(ipam): Add ContactsColumnMixin to ServiceTable
Enhance `ServiceTable` by incorporating `ContactsColumnMixin` for better
contact management. Updates the fields to include `contacts`.

Fixes #20567
2025-10-20 09:07:25 -04:00
github-actions
531ea34207 Update source translation strings 2025-10-20 05:03:22 +00:00
Jason Novinger
6747c82a1a Fixes #20498: Apply validation regex to URL custom fields
The validation_regex field was not being enforced for URL type custom
fields. This fix adds regex validation in two places:

1. to_form_field() - Applies regex validator to form fields (UI validation)
2. validate() - Applies regex check in model validation (API/programmatic)

Note: The original issue reported UI validation only, but this fix also
adds API validation for consistency with text field behavior and to
ensure data integrity across all entry points.
2025-10-19 18:30:54 -05:00
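A hedged sketch of the UI-side change (function and attribute names follow the description, not necessarily the real implementation): attach a `RegexValidator` to the generated URL form field whenever `validation_regex` is set.

```python
from django import forms
from django.core.validators import RegexValidator

def to_form_field(custom_field):
    # Build the form field for a URL-type custom field and enforce the
    # configured validation_regex, mirroring text field behavior.
    field = forms.URLField(required=custom_field.required)
    if custom_field.validation_regex:
        field.validators.append(RegexValidator(
            regex=custom_field.validation_regex,
            message=f"Value must match regex: {custom_field.validation_regex}",
        ))
    return field
```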
Martin Hauser
e251ea10b5 Closes #20605: Document variable prefilling via URL parameters (#20619) 2025-10-19 15:42:09 -05:00
Martin Hauser
a1aaf465ac Fixes #20466: Correct handling of assigned filter logic (#20538) 2025-10-19 12:51:44 -05:00
Martin Hauser
2a1d315d85 Fixes #20524: Enhance API script scheduling validation (#20616) 2025-10-19 12:29:14 -05:00
github-actions
8cc6589a35 Update source translation strings
2025-10-16 05:03:49 +00:00
Jason Novinger
bee0080917 Release v4.4.4 (#20594)
2025-10-15 14:25:43 -05:00
bctiemann
389c44e5d6 Merge pull request #20591 from pheus/20554-add-missing-contenttypefilter-to-filtersets
Fixes #20554: Add ContentTypeFilter to several filtersets
2025-10-15 14:16:51 -04:00
bctiemann
9cb2c78e34 Init storage at class level of BaseScript instead of in findsource function (#20575) 2025-10-15 11:09:22 -07:00
Jason Novinger
2ae98f0353 Fixes #20587: Handle stale ContentTypes in has_feature()
When deleting stale ContentTypes during remove_stale_contenttypes, the
pre_delete signal triggers notify_object_changed(), which calls
has_feature() with the ContentType instance. For stale types (those with
no corresponding model class), model_class() returns None, which then gets
passed to issubclass() in the feature test lambda, causing a TypeError.

The previous implementation in has_feature() checked for None before
attempting ObjectType lookup. The optimization in 5ceb6a6 removed this
safety check when refactoring the ContentType code path to use direct
feature registry lookups. This restores the null check to maintain the
original behavior of returning False for stale ContentTypes.
2025-10-15 14:09:04 -04:00
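The restored safety check amounts to something like the following sketch (the registry lookup is a stand-in; only the None guard reflects the change described above):

```python
def has_feature(content_type, feature, feature_registry):
    # Stale ContentTypes have no corresponding model class; treat them as not
    # supporting any feature instead of passing None into issubclass().
    model = content_type.model_class()
    if model is None:
        return False
    return model in feature_registry.get(feature, set())
```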
Jeremy Stretch
addda0538f Fixes #20584: Ensure consistent validation between Interface & InterfaceTemplate (#20589) 2025-10-15 11:04:39 -07:00
Jeremy Stretch
c902a1c510 Fixes #20585: Fix AttributeError exception for conditionless single-field UniqueConstraints (#20590) 2025-10-15 12:51:33 -05:00
Martin Hauser
f23ee0a46f feat(filtersets): Add ContentTypeFilter to enhance filtering
Introduce `ContentTypeFilter` across several filtersets, including
`object_type`, `related_object_type`, `assigned_object_type`, and
`parent_object_type`. This improvement enhances filtering specificity
and aligns with existing usability standards.

Closes #20554
2025-10-15 18:24:42 +02:00
github-actions
b4acc3fb36 Update source translation strings
2025-10-15 05:04:04 +00:00
Jeremy Stretch
a69bbcf651 Release v4.4.3
2025-10-14 13:51:41 -04:00
Jeremy Stretch
2edfde5753 Fixes #19302: Fix uniqueness validation in REST API for nullable fields (#20549) 2025-10-14 09:19:10 -07:00
Martin Hauser
cfbd9632ac feat(utilities): Add ranges_to_string_list
Introduce `ranges_to_string_list` for converting numeric ranges into a
list of readable strings. Update the `vid_ranges_list` property and
templates to use this method for better readability and maintainability.
Add related tests to ensure functionality.

Closes #20516
2025-10-14 09:39:09 -04:00
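A minimal sketch of such a helper, assuming half-open `(lower, upper)` tuples (not the exact NetBox signature):

```python
def ranges_to_string_list(ranges):
    # Render each half-open range in inclusive terms: "100-199", or just "300"
    # when the range covers a single value.
    result = []
    for lower, upper in ranges:
        last = upper - 1
        result.append(str(lower) if lower == last else f'{lower}-{last}')
    return result

# Example: ranges_to_string_list([(100, 200), (300, 301)]) == ['100-199', '300']
```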
bctiemann
c9386bc9c3 Merge pull request #20558 from netbox-community/20557-update-to-django-5.2.7
Closes #20557: Upgrade Django to v5.2.7
2025-10-13 07:02:44 -04:00
Jason Novinger
c826c5cdb0 Closes #20557: Upgrade Django to v5.2.7
Upgrade Django to v5.2.7 to address upstream vulnerability reports

https://www.djangoproject.com/weblog/2025/oct/01/security-releases/
2025-10-13 01:06:23 -05:00
Aaron
a4ab4f885d Fixes #20156: Fixed rack view not using previous setting (#20556)
2025-10-13 00:38:45 -05:00
Arthur Hanson
61d77dff14 Fixes #19615: Properly set version request parameter for static files in S3 (#20455) 2025-10-12 18:49:42 -05:00
github-actions
24a83acc34 Update source translation strings
2025-10-10 05:03:50 +00:00
bctiemann
dbc71158ec Merge pull request #20525 from mathieumd/19818-hide_primary_ip_at_vm_creation
Fixes #19818: Hide IP fields when creating VM
2025-10-09 17:54:22 -04:00
Jason Novinger
f0523611d1 Fixes #20542: Add form prefix to POST handler in ObjectEditView (#20550)
Commit d22246688 added form prefix support to the `GET` handler to fix
Markdown preview functionality in quick add modals. The form prefix
allows Django to properly namespace field names and IDs when rendering
forms within the quick add modal context.

However, the corresponding change was not made to the `POST` handler. This
created a mismatch where form fields were rendered with the `quickadd-`
prefix during `GET` requests, but the `POST` handler instantiated forms
without the prefix. When users submitted quick add forms, Django looked
for unprefixed field names like `address` and `status` in the `POST` data,
but the actual submitted data used prefixed names like `quickadd-address`
and `quickadd-status`. This caused validation to fail immediately with
"This field is required" errors for all required fields, making every
quick add form unusable.

The fix adds the same prefix detection logic to the `POST` handler that was
added to the `GET` handler, checking for the `_quickadd` parameter in the
query string and applying the `quickadd` prefix when present. This ensures
consistent form field naming between rendering and validation.

A regression test has been added to `MACAddressTestCase` to verify that MAC
addresses can be successfully created via the quick add modal, preventing
this issue from recurring. This test should be promoted to a template
test whenever it becomes possible to determine if a model should support
quick-add functionality.
2025-10-09 14:42:59 -07:00
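The prefix detection described above reduces to a small check; a hedged sketch with the view plumbing omitted (names assumed):

```python
def get_form_prefix(request):
    # Mirror the GET handler: quick-add submissions carry the _quickadd flag in
    # the query string, so their field names are prefixed with "quickadd-".
    return 'quickadd' if '_quickadd' in request.GET else None

# In the POST handler:
#   form = FormClass(data=request.POST, files=request.FILES, prefix=get_form_prefix(request))
```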
Daniel Sheppard
7719b98697 Fixes #19825: Prevent inaccurate config revision activation when not intended (#20219)
2025-10-09 01:36:41 -05:00
Martin Hauser
f383067ecb Closes #20527: Address deprecation warnings (#20533) 2025-10-09 00:47:09 -05:00
github-actions
20de263565 Update source translation strings 2025-10-09 05:04:28 +00:00
Jeremy Stretch
5ceb6a60da Fixes #20290: Avoid exceptions when upgrading to v4.4 from early releases due to missing ObjectTypes table
2025-10-08 13:00:27 -04:00
Martin Hauser
33d4759871 feat(extras): Add range_contains ORM lookup
Introduce a generic lookup for ArrayField(RangeField) that matches rows
where a scalar value is contained by any range in the array
(e.g. VLANGroup.vid_ranges).
Replace the raw-SQL helper in the VLANGroup FilterSet (`contains_vid`)
with the ORM lookup for better maintainability.
Add tests for the lookup and the FilterSet behavior.

Closes #20497
2025-10-08 09:57:15 -04:00
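A PostgreSQL-specific sketch of such a lookup (the SQL shape is an assumption based on the description, not the merged implementation):

```python
from django.contrib.postgres.fields import ArrayField
from django.db.models import Lookup

class RangeContains(Lookup):
    # Match rows where any range in an ArrayField(RangeField) column contains
    # the given scalar value (e.g. a VLAN ID within VLANGroup.vid_ranges).
    lookup_name = 'range_contains'

    def as_sql(self, compiler, connection):
        lhs, lhs_params = self.process_lhs(compiler, connection)
        rhs, rhs_params = self.process_rhs(compiler, connection)
        sql = f'EXISTS (SELECT 1 FROM unnest({lhs}) AS r WHERE r @> {rhs})'
        return sql, lhs_params + rhs_params

ArrayField.register_lookup(RangeContains)
# Usage (illustrative): VLANGroup.objects.filter(vid_ranges__range_contains=100)
```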
Amir-Bakar
2abc5ac69a Update base.html
Update base.html to account for cases other than LDAP where passwords are not used (SSO solutions, for example).
2025-10-08 09:56:15 -04:00
bctiemann
f8c074045f Merge pull request #20528 from netbox-community/02496-max-page
20496 make max_page_size upper bound
2025-10-07 13:11:59 -04:00
Arthur
4db3d488ad Merge branch 'main' into 02496-max-page
2025-10-07 09:12:33 -07:00
Martin Hauser
b7cae04572 fix(api): Update NumericRange handling to use half-open intervals (#20478) 2025-10-07 09:01:29 -07:00
Martin Hauser
51528ae429 fix(utilities): Enhance ranges_to_string for improved clarity (#20479) 2025-10-07 08:47:01 -07:00
Jeremy Stretch
d5e8480367 Update OpenAPI schema (#20519) 2025-10-07 08:22:24 -07:00
Matthew Papaleo
05e26b82c1 Fixes #20507 Contacts returned for ASN via graphql API 2025-10-07 09:08:04 -04:00
Mathieu
d8e4c95bcc Fixes #19818: Hide IP fields when creating VM 2025-10-07 14:03:01 +02:00
github-actions
faa89a53ff Update source translation strings
2025-10-07 05:02:29 +00:00
Dmitry Smirnov
d18bbe48c1 add tag copy_content and id 'job_data_output'
2025-10-06 15:17:39 -04:00
Martin Hauser
99e367cbaf docs(api): Correct IntegerRangeSerializer schema definition
Adjusts the schema mapping for `IntegerRangeSerializer` by setting
`match_subclasses` to `True` and refining the array definition. Adds
an example field for clarity in generated OpenAPI documentation.

Fixes #20494
2025-10-06 15:09:57 -04:00
Daniel Sheppard
f5ed095738 Fixes: #21040 - Registered denormalized fields (#20503)
2025-10-06 09:12:27 -05:00
Johannes Erwerle
b70f1211ab Fixed wrong link in plugin filtersets documentation 2025-10-06 10:03:47 -04:00
Arthur
10e8e7b071 20496 fix test 2025-10-03 14:54:08 -07:00
Arthur
c770e6b45d 20496 fix max_page_size for REST API 2025-10-03 14:22:55 -07:00
Jason Novinger
c094699dc0 Fixes #20484: Configure CodeQL to exclude URL redirect false positives
2025-10-03 08:48:02 -04:00
Martin Hauser
5f77d684e1 chore(core): Remove unused imports in plugins and migrations
Cleans up unused imports across `plugins.py` and a migration file.

Closes #20482
2025-10-02 17:11:07 -04:00
github-actions
f23eb53312 Update source translation strings
2025-10-02 05:02:10 +00:00
bctiemann
91d5d284ca Merge pull request #20464 from netbox-community/20248-fix-translation-error
Fixes #20248: Tweak help text to avoid error when compiling translations
2025-10-01 20:45:42 -04:00
github-actions
c4dcc62c04 Update source translation strings
2025-10-01 05:02:17 +00:00
Jeremy Stretch
5a96b76cd4 Release v4.4.2
2025-09-30 16:14:35 -04:00
Jeremy Stretch
26fc06b817 Fixes #20248: Tweak help text to avoid error when compiling translations 2025-09-30 15:10:53 -04:00
Jeremy Stretch
9bc60a157b Fixes #20243: Prevent scheduled system jobs from re-running multiple times (#20450) 2025-09-30 13:27:31 -05:00
Jeremy Stretch
28cc8e5c89 Fixes #18878: Automatically assign a designated primary MAC address upon creation of a new interface (#20457) 2025-09-30 13:26:52 -05:00
Martin Hauser
ba1c0d6d84 Closes #20449: Add user preferences documentation (#20460) 2025-09-30 13:16:36 -05:00
Jeremy Stretch
f31a5551ff Closes #19765: Linkify object types under saved filter view (#20458)
2025-09-30 08:29:59 -07:00
Jeremy Stretch
b0a8b86a93 #20382: Additional GraphQL API tips (#20451)
* #20382: Additional GraphQL API tips

* Add graphql hint for syntax highlighting
2025-09-30 11:29:29 -04:00
Jeremy Stretch
d222466882 Fixes #20245: Fix Markdown preview functionality within "quick add" modal 2025-09-30 11:19:50 -04:00
Martin Hauser
9e75a2f955 fix(api): Fix schema and field definitions for OpenAPI
Add `get_internal_type()` to custom field classes for Django compatibility,
annotate path parameters and operation IDs for background endpoints, and
provide serializer context on the RQ base viewset to clear schema warnings.

Fixes #20365
2025-09-30 10:46:03 -04:00
Jeremy Stretch
10e76597a8 Closes #20332: Add a "none" option to object tag filters (#20452) 2025-09-30 09:45:15 -05:00
Martin Hauser
18862586e5 feat(dcim): Add "facility" field to bulk edit forms for Site and Location
Introduces a new "facility" field in the bulk edit forms for Site and
Location models. Updates fieldsets and nullable fields to incorporate
the "facility" field.

Closes #20438
2025-09-30 08:48:26 -04:00
github-actions
69a7c97c3e Update source translation strings
2025-09-30 05:04:06 +00:00
Jeremy Stretch
bfd1adf0b5 Fixes #20441: Fix display of the "groups" column in contact assignments table (#20446)
2025-09-29 13:05:52 -05:00
Robert Drake
030f03b1a8 Typo and alphabetical fixes for Interface choices
This fixes the alphabetical ordering of the interface types, and it
corrects the typo in the BiDi names.

fixes #20392
2025-09-29 13:38:37 -04:00
Jeremy Stretch
6cf6e2cd7f Fixes #20419: Correct action buttons for child object views (#20445) 2025-09-29 09:14:16 -07:00
RasmusThing
0b7baae23c Fixes #20412: linkify cluster type (#20413)
2025-09-29 06:44:15 -05:00
Elliott Balsley
0c22fc9408 Fixes #19590: Display related columns on DeviceComponents table (#20344) 2025-09-29 05:45:37 -05:00
bctiemann
a437931aef Merge pull request #20393 from netbox-community/20390-pagination-dropdown
Fixes #20390: Fix styling of pagination dropdown menu
2025-09-22 07:15:53 -04:00
github-actions
0fac8e671e Update source translation strings
2025-09-20 05:02:20 +00:00
Jeremy Stretch
6547a16ab6 Fixes #20398: Rely on browser-native form field validation (#20401)
2025-09-19 15:13:47 -05:00
Jeremy Stretch
07a53c8315 Closes #17010: Show admin navigation menu items only for staff & superusers (#20386) 2025-09-19 12:52:16 -07:00
Elliott Balsley
55cda3ca45 Fixes #20253: GraphQL filter by contacts (#20288)
* filter models by contacts

* remove unused import

* simpler solution
2025-09-19 10:52:02 -04:00
github-actions
a173a9b4ac Update source translation strings
2025-09-19 05:03:24 +00:00
bctiemann
d34ce7794c Merge pull request #20381 from netbox-community/20380-sentry_config
Closes #20380: Introduce the `SENTRY_CONFIG` config parameter
2025-09-18 22:15:11 -04:00
Jeremy Stretch
f45a11d079 Closes #20382: Document performance best practices (#20384)
2025-09-18 14:17:06 -05:00
Jeremy Stretch
56db60f8c9 Fixes #20375: Preserve filter params when performing bulk operations (#20387) 2025-09-18 14:08:50 -05:00
Jeremy Stretch
c8b30270a8 Fixes #20390: Fix styling of pagination dropdown menu 2025-09-18 14:05:00 -04:00
Jeremy Stretch
8e332055bc Closes #20380: Introduce the SENTRY_CONFIG config parameter 2025-09-17 14:25:41 -04:00
bctiemann
3c09ee8b11 Merge pull request #20350 from llamafilm/17824-hotkeys
add global search hotkey
2025-09-17 13:59:37 -04:00
Jeremy Stretch
a4f0b76cb5 Closes #20367: Document best practices for modeling SFPs (#20377)
2025-09-17 11:31:11 -05:00
Elliott Balsley
f2097cce33 no search at login page 2025-09-16 20:05:35 -07:00
Elliott Balsley
499ebb8ab4 Merge branch 'main' into 17824-hotkeys 2025-09-16 19:26:14 -07:00
Jeremy Stretch
8fa1abd371 Release v4.4.1 (#20366)
* Release v4.4.1

* Revert django-mptt to v0.17.0
2025-09-16 11:56:50 -04:00
github-actions
81401b9e17 Update source translation strings
2025-09-16 05:02:30 +00:00
Jason Novinger
5bfbca9a83 Fixes #20298: Add placeholder for failed image thumbnail generation (#20359)
2025-09-15 16:49:43 -07:00
Robin Schneider
85689b25de feat: add Wi-Fi Alliance generation labels to Interface type texts (#20348)
* feat: add Wi-Fi Alliance generation labels to Interface type texts

Closes: #20347

* Shorten labels for WiGig choices

---------

Co-authored-by: Jeremy Stretch <jstretch@netboxlabs.com>
2025-09-15 14:52:46 -04:00
Jeremy Stretch
c2aa87a4c9 Closes #20321: Add PHY interface types for pluggable transceivers (#20343) 2025-09-15 13:39:05 -05:00
Martin Hauser
34b111bdc4 feat(users): Add support for cloning ObjectPermission objects
Introduces cloning functionality for ObjectPermission objects using the
CloningMixin. Updates the constraints field handling, adds JSONField,
and introduces logic to process initial data for cloned objects.

Fixes #15492
2025-09-15 13:50:09 -04:00
Martin Hauser
684106031a feat(dcim): Improve CableTypeChoices structure and grouping
Refactors `CableTypeChoices` by reorganizing cable types into more
specific subcategories. Enhances clarity with distinct groups such as
Copper (Twisted Pair, Twinax, Coaxial) and Fiber (Multi Mode, Single
Mode, Other).

Closes #19865
2025-09-15 13:34:29 -04:00
Martin Hauser
31644b4ce6 fix(ipam): Remove FHRP IP prefix constraint
Remove `FHRPGroupAssignmentForm.__init__` logic that tied group choices
to the interface IP prefix. Add `group_id` to the `q` filter to enable
matching by group ID.

Fixes #19262
2025-09-15 13:31:19 -04:00
Jason Novinger
fb004bb94e #20327: Device queries now faster when including ConfigContexts (#20346)
* Fixes #20327: Device queries are now faster when including ConfigContexts
Move .distinct() from main queryset to tag subquery to eliminate
performance bottleneck when querying devices with config contexts.

The .distinct() call on the main device queryset was causing PostgreSQL
to sort all devices before pagination, resulting in 15x slower API
responses for large installations (10k+ devices, 100+ config contexts).

Moving .distinct() to the tag subquery eliminates duplicates at their
source (GenericForeignKey tag relationships) while preserving the fix
for issues #5314 and #5387 without impacting overall query performance.

* Add performance regression test for config context annotation

The test verifies that:
- Main device queries do not use expensive DISTINCT operations
- Tag subqueries properly use DISTINCT to prevent duplicates from issue #5387

This ensures the optimization from issue #20327 (moving .distinct() from the main
query to the tag subquery) cannot be accidentally reverted while maintaining the
correctness guarantees for issues #5314 and #5387.

* Address PR feedback, clean up new regression test

The new regression test now avoids casting the query to a string and
inspecting the string, which was brittle at best.

The new approach asserts directly against `queryset.distinct` for the
main query and then finds the subquery that we expect to have distinct
set and verifies that is in fact the case.

I also realized that the use of `connection.query_log` was problematic,
in that it didn't seem to return any queries as expected. This meant
that the test was actually not making any assertions since none of the
code inside of the for loop over `device_queries` ever ran.
2025-09-15 13:04:56 -04:00
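The queryset pattern described above, sketched generically (the model names are stand-ins and the annotation is illustrative; the real change lives in NetBox's config context queryset):

```python
from django.db.models import Exists, OuterRef, QuerySet

def annotate_tags(device_qs: QuerySet, tagged_item_qs: QuerySet) -> QuerySet:
    # Deduplicate inside the tag subquery rather than calling .distinct() on
    # the outer device queryset, so the database avoids sorting every device
    # before pagination.
    tags = (
        tagged_item_qs
        .filter(object_id=OuterRef('pk'))
        .order_by()
        .values('tag_id')
        .distinct()
    )
    return device_qs.annotate(is_tagged=Exists(tags))  # no outer .distinct()
```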
bctiemann
192440a4d3 Merge pull request #20334 from 991jo/patch-2
Extended plugin development documentation regarding bulk edit/delete …
2025-09-15 08:54:54 -04:00
Elliott Balsley
03a6032f36 remove debug line 2025-09-13 11:45:17 -07:00
Martin Hauser
2dac09cea0 Closes #20341: Drop legacy django_admin_log table (#20349)
2025-09-13 13:11:13 -05:00
github-actions
2a99aadc5d Update source translation strings
2025-09-13 05:03:28 +00:00
Martin Hauser
2d6b3d19e7 Fixes #20236: Improve file naming and upload handling (#20315)
2025-09-12 17:41:49 -05:00
Martin Hauser
103939ad3c Fixes #20197: Correct validation for virtual chassis parent interface (#20337)
2025-09-12 08:53:08 -05:00
Jeremy Stretch
4b17faae52 Bump Django to v5.2.6 (#20340) 2025-09-12 08:33:49 -05:00
Jo
37644eed3f Extended plugin development documentation regarding bulk edit/delete buttons in tables 2025-09-12 08:22:16 +02:00
github-actions
cf0ef92268 Update source translation strings
2025-09-12 05:02:16 +00:00
Jeremy Stretch
77376524f9 Fixes #20329: Fix InconsistentMigrationHistory exception when upgrading from v4.3 (#20330)
Reverts "Fixes #20290: Fix ordering of migrations to support upgrading from v3.7"
2025-09-11 15:28:07 -05:00
Jason Novinger
53d1b1aa50 Closes #19944: Add multi-scenario CSV import testing support with cleanup (#20302)
* Closes #19944: Add multi-scenario CSV import testing support with cleanup

Enhanced BulkImportObjectsViewTestCase to support multiple CSV import scenarios via dictionary format,
where each scenario runs as a separate subtest with automatic cleanup. This enables testing different
import configurations (e.g., with/without optional fields) in a single test run with clear output
showing which scenario is being tested.

Introduces cleanupSubTest() context manager that uses database savepoints to automatically roll back
changes between subtests, providing test isolation similar to separate test methods. This allows
subtests to create/modify objects without affecting subsequent subtests in the same test method.

Added post_import_callback parameter to bulk import tests, allowing child classes to inject custom
assertions that run before database cleanup. This solves the inheritance problem where child classes
need to verify imported data but the parent's cleanup would roll back the data before assertions could
run.

The callback approach is cleaner than conditional cleanup parameters - it makes the execution timing
explicit and maintains test isolation while still allowing extensibility.

* Fixup ModuleTypeTestCase bulk import test to work with callback mechanism

* Update CableTestCase to use expanded CSV scenario testing

* Remove unneeded permission cleanup

Co-authored-by: Jeremy Stretch <jstretch@netboxlabs.com>

* Consolidate scenario name retrieval into method

---------

Co-authored-by: Jeremy Stretch <jstretch@netboxlabs.com>
2025-09-11 12:47:23 -04:00
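The savepoint-based isolation described above could look roughly like this (a sketch assuming the test already runs inside a transaction, as Django's TestCase does; names follow the commit message):

```python
from contextlib import contextmanager
from django.db import transaction

@contextmanager
def cleanup_subtest():
    # Wrap one import scenario in a savepoint and roll it back afterwards so
    # each subtest starts from the same database state.
    sid = transaction.savepoint()
    try:
        yield
    finally:
        transaction.savepoint_rollback(sid)
```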
bctiemann
d172e6210b Merge pull request #20323 from netbox-community/20206-document-env-var-config-approach
#20206: Clarify `django-storages` configuration from env vars
2025-09-11 12:08:45 -04:00
Jason Novinger
cd122a7dde Address PR feedback 2025-09-11 10:00:22 -06:00
Jason Novinger
d1e40281f3 Fixes #20242: Conditionally log request.id in EventRule triggered script (#20322) 2025-09-11 08:46:04 -07:00
Elliott Balsley
be4db9a899 format script results timestamp (#20307) 2025-09-11 08:43:26 -07:00
bctiemann
01f1228e3b Merge pull request #20314 from netbox-community/20290-fix-migration
Fixes #20290: Fix ordering of migrations to support upgrading from v3.7
2025-09-11 11:19:00 -04:00
Jason Novinger
c57d9f9a37 Fix 'dim' type --> 'dcim' 2025-09-11 08:51:50 -06:00
Jason Novinger
6f01da90b4 Closes #20206: Clarifies django-storages configuration from env vars 2025-09-11 08:48:14 -06:00
Martin Hauser
bf7356473c fix(extras): Inherit ConfigContext from ancestors locations (#20291)
Some checks failed
CI / build (20.x, 3.10) (push) Waiting to run
CI / build (20.x, 3.11) (push) Waiting to run
CI / build (20.x, 3.12) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, javascript-typescript) (push) Has been cancelled
CodeQL / Analyze (${{ matrix.language }}) (none, actions) (push) Has been cancelled
CodeQL / Analyze (${{ matrix.language }}) (none, python) (push) Has been cancelled
2025-09-10 10:00:22 -07:00
Jeremy Stretch
a99e21afd6 Fixes #20290: Fix ordering of migrations to support upgrading from v3.7 2025-09-10 12:33:36 -04:00
github-actions
0e627d4d9b Update source translation strings
Some checks are pending
CodeQL / Analyze (${{ matrix.language }}) (none, actions) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, javascript-typescript) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, python) (push) Waiting to run
2025-09-10 05:02:17 +00:00
Elliott Balsley
53b15e3e41 add global search hotkey 2025-09-09 19:07:17 -07:00
Aaron
1034f738af Fixes #20217: Fix '0 VLANs available' in the VLANs table in VLAN Groups (#20261)
Some checks are pending
CI / build (20.x, 3.10) (push) Waiting to run
CI / build (20.x, 3.11) (push) Waiting to run
CI / build (20.x, 3.12) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, javascript-typescript) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, python) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, actions) (push) Waiting to run
* Fixes #20217: hide 0 VLANs available message in VLAN groups

* Simplified fix to improve readability
2025-09-09 15:33:11 -04:00
Jeremy Stretch
873372f61e Closes #20241: Record A & B terminations on cable changelog records (#20246) 2025-09-09 11:56:08 -05:00
github-actions
1d9d7f2d84 Update source translation strings
Some checks are pending
CodeQL / Analyze (${{ matrix.language }}) (none, actions) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, javascript-typescript) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, python) (push) Waiting to run
2025-09-09 05:02:37 +00:00
bctiemann
83fe973fea Merge pull request #20280 from pheus/20264-fix-plugin-icon-display-in-plugin-table
Some checks are pending
CI / build (20.x, 3.10) (push) Waiting to run
CI / build (20.x, 3.11) (push) Waiting to run
CI / build (20.x, 3.12) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, actions) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, javascript-typescript) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, python) (push) Waiting to run
Fixes #20264: Update plugin title rendering with default icon
2025-09-08 13:53:16 -04:00
bctiemann
8ebc677372 Merge pull request #20267 from pheus/19744-fix-active-column-sorting-in-plugin-table
Fixes #19744: Add accessor for is_loaded in TemplateColumn
2025-09-08 13:40:19 -04:00
Jeremy Stretch
9d0e80571c Closes #20277: Add support for attribute assignment to deserialize_object() (#20281) 2025-09-08 10:28:14 -07:00
Jeremy Stretch
291010737a Closes #20296: Misc updates to issue templates (#20293) 2025-09-08 10:05:14 -07:00
Martin Hauser
b24f8fb340 feat(core): Update plugin title rendering with default icon
Replaces inline plugin title HTML with a reusable template in
`template_code.py`. Adds a default icon for plugins without custom icons
and updates the table logic to use this template.
Removes redundant logic from the `render_title_long` method to improve
maintainability.
Changes the `order_by` field in `plugins.py` from `name` to
`title_long`.

Fixes #20264
2025-09-08 18:44:00 +02:00
Elliott Balsley
a611ade5d3 Fixes #19729: GraphQL filter interfaces by kind (#20289) 2025-09-08 09:51:01 -05:00
Martin Hauser
099f3b2f34 feat(core): Add Sync button for DataSource actions
Introduces a sync button in the DataSource table for improved user
interaction. Enables users to trigger sync actions directly from the
table, with context-sensitive availability based on permissions and
record status.

Closes #19547
2025-09-08 09:39:53 -04:00
bctiemann
1b83d32f4a Merge pull request #20274 from netbox-community/20215-configcontextfilter-requires-filter-fields
Fixes #20215: Make ConfigContextFilter filters optional
2025-09-08 09:25:50 -04:00
bctiemann
af6f4ce3ab Merge pull request #20254 from netbox-community/19428-device-table-height-column
Closes #19428: Add `u_height` column to devices table
2025-09-08 09:23:33 -04:00
bctiemann
d2c0026b9d Merge pull request #20287 from mr1716/20286-Improve-Grammar-Of-Documentation
Some checks are pending
CI / build (20.x, 3.10) (push) Waiting to run
CI / build (20.x, 3.11) (push) Waiting to run
CI / build (20.x, 3.12) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, actions) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, javascript-typescript) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, python) (push) Waiting to run
#20286 Update Documentation To Cleanup Grammar
2025-09-08 08:17:32 -04:00
mr1716
1eeede0931 Update Grammar 2025-09-07 08:35:59 -04:00
mr1716
c3b37db8f7 Update netbox-shell.md To Reflect Proper Grammar 2025-09-06 11:15:15 -04:00
mr1716
c9dc2005b0 Update planning.md to cleanup grammar 2025-09-06 11:09:01 -04:00
github-actions
c9f823167c Update source translation strings
Some checks failed
CodeQL / Analyze (${{ matrix.language }}) (none, actions) (push) Has been cancelled
CodeQL / Analyze (${{ matrix.language }}) (none, javascript-typescript) (push) Has been cancelled
CodeQL / Analyze (${{ matrix.language }}) (none, python) (push) Has been cancelled
2025-09-06 05:02:29 +00:00
jetomit
5ca2cea016 Closes #20222: Enable HttpOnly flag for the CSRF cookie (#20262)
Some checks failed
CodeQL / Analyze (${{ matrix.language }}) (none, actions) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, javascript-typescript) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, python) (push) Waiting to run
CI / build (20.x, 3.10) (push) Has been cancelled
CI / build (20.x, 3.11) (push) Has been cancelled
CI / build (20.x, 3.12) (push) Has been cancelled
2025-09-05 15:04:02 -07:00
Jason Novinger
026737b62b Fixes #19851: Fix WirelessLANImportForm has no field scope, improve validation (#20273) 2025-09-05 14:59:38 -07:00
Jeremy Stretch
94faf58c27 Closes #19408: Enable export templates for circuit terminations (#20251) 2025-09-05 14:23:07 -07:00
Jeremy Stretch
de499ca686 Fixes #20282: Fix styling of warning for missing prerequisite objects (#20283) 2025-09-05 15:26:11 -05:00
Martin Hauser
f04a2b965f Fixes #20252: Remove generic AddObject from ObjectChildrenView (#20279) 2025-09-05 15:10:24 -05:00
Jason Novinger
fcb380b5c5 Fixes #20221: JSON CustomField does not coerce {} to null
Some checks are pending
CI / build (20.x, 3.10) (push) Waiting to run
CI / build (20.x, 3.11) (push) Waiting to run
CI / build (20.x, 3.12) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, actions) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, javascript-typescript) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, python) (push) Waiting to run
This fix actually fixes this for all valid JSON values that evaluate to
`False` in Python when loaded and cast to bool:
`bool(json.loads(<val>))`.

- `{}`
- `[]`
- `0`
- `False`

This does not change the behavior of `()` or `""` which are both
explicitly cited as "empty" values on `JSONField`.
2025-09-05 15:54:25 -04:00
Martin Hauser
8311f457b5 Fixes #20258: Correct typographical errors in labels (#20278) 2025-09-05 14:07:12 -05:00
Jason Novinger
2ba2864a6a Fixes #20215: Make ConfigContextFilter filters optional 2025-09-05 10:37:39 -05:00
Martin Hauser
47e4947ca0 Fixes #20234: Correct add_button return_url (#20268) 2025-09-05 08:01:28 -05:00
Jeremy Stretch
545773e221 Fixes #20227: Fix paragraph spacing in rendered Markdown content (#20256) 2025-09-05 07:05:36 -05:00
Martin Hauser
f9159ad9bd fix(plugins): Add accessor for is_loaded in TemplateColumn
Adds the `accessor` attribute with `tables.A('is_loaded')` to the
`is_installed` column in the plugin's table. This ensures proper data
access and improves the table's functionality.

Fixes #19744
2025-09-05 10:55:58 +02:00
github-actions
2ddec1ef48 Update source translation strings
Some checks are pending
CodeQL / Analyze (${{ matrix.language }}) (none, actions) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, javascript-typescript) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, python) (push) Waiting to run
2025-09-05 05:03:32 +00:00
Jonathan Ramstedt
309e434064 Fixes #19896: cf minmax must be int (#20207)
Some checks are pending
CI / build (20.x, 3.10) (push) Waiting to run
CI / build (20.x, 3.11) (push) Waiting to run
CI / build (20.x, 3.12) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, actions) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, javascript-typescript) (push) Waiting to run
CodeQL / Analyze (${{ matrix.language }}) (none, python) (push) Waiting to run
2025-09-04 16:10:05 -07:00
Jeremy Stretch
8a1db81111 Closes #20203: Add a pre-commit check for OpenAPI schema changes (#20230) 2025-09-04 16:02:12 -07:00
Martin Hauser
399d51b466 fix(vpn): Update to_field_name in bulk import form
Changes the value of `to_field_name` from `name` to `address` in the
VPN bulk import form. This ensures proper mapping and validation for
IP address selection during the bulk import process.

Closes #20238
2025-09-04 16:42:13 -04:00
Martin Hauser
6135fb8cd7 feat(vpn): Add search index for TunnelGroup
Introduces `TunnelGroupIndex` for enabling search functionality on
Tunnel Groups. Includes searchable fields for `name` and `description`
with respective weights and display attributes.

Closes #20237
2025-09-04 16:33:39 -04:00
Jeremy Stretch
0a336465f2 Closes #19428: Add u_height column to devices table 2025-09-04 15:44:34 -04:00
github-actions
ea50786b5c Update source translation strings
Some checks failed
CodeQL / Analyze (${{ matrix.language }}) (none, actions) (push) Has been cancelled
CodeQL / Analyze (${{ matrix.language }}) (none, javascript-typescript) (push) Has been cancelled
CodeQL / Analyze (${{ matrix.language }}) (none, python) (push) Has been cancelled
2025-09-03 05:02:17 +00:00
248 changed files with 53133 additions and 43680 deletions

View File

@@ -2,7 +2,7 @@
name: ✨ Feature Request
type: Feature
description: Propose a new NetBox feature or enhancement
labels: ["type: feature", "status: needs triage"]
labels: ["netbox", "type: feature", "status: needs triage"]
body:
- type: markdown
attributes:
@@ -15,7 +15,7 @@ body:
attributes:
label: NetBox version
description: What version of NetBox are you currently running?
placeholder: v4.4.0
placeholder: v4.4.5
validations:
required: true
- type: dropdown

View File

@@ -2,32 +2,32 @@
name: 🐛 Bug Report
type: Bug
description: Report a reproducible bug in the current release of NetBox
labels: ["type: bug", "status: needs triage"]
labels: ["netbox", "type: bug", "status: needs triage"]
body:
- type: markdown
attributes:
value: >
**NOTE:** This form is only for reporting _reproducible bugs_ in a current NetBox
installation. If you're having trouble with installation or just looking for
assistance with using NetBox, please visit our
release. If you're having trouble with installation or just looking for assistance
using NetBox, please visit our
[discussion forum](https://github.com/netbox-community/netbox/discussions) instead.
- type: dropdown
attributes:
label: Deployment Type
label: NetBox Edition
description: >
How are you running NetBox? (For issues with the Docker image, please go to the
[netbox-docker](https://github.com/netbox-community/netbox-docker) repo.)
Users of [NetBox Cloud](https://netboxlabs.com/netbox-cloud/) or
[NetBox Enterprise](https://netboxlabs.com/netbox-enterprise/), please contact the
[NetBox Labs](https://netboxlabs.com/) support team for assistance to ensure your
request receives immediate attention.
options:
- NetBox Cloud
- NetBox Enterprise
- Self-hosted
- NetBox Community
validations:
required: true
- type: input
attributes:
label: NetBox Version
description: What version of NetBox are you currently running?
placeholder: v4.4.0
placeholder: v4.4.5
validations:
required: true
- type: dropdown

View File

@@ -2,7 +2,7 @@
name: 📖 Documentation Change
type: Documentation
description: Suggest an addition or modification to the NetBox documentation
labels: ["type: documentation", "status: needs triage"]
labels: ["netbox", "type: documentation", "status: needs triage"]
body:
- type: dropdown
attributes:
@@ -25,9 +25,12 @@ body:
- Getting started
- Configuration
- Customization
- Best practices
- Integrations/API
- Plugins
- Administration
- Data model
- Reference
- Development
- Other
validations:

View File

@@ -2,7 +2,7 @@
name: 🌍 Translation
type: Translation
description: Request support for a new language in the user interface
labels: ["type: translation"]
labels: ["netbox", "type: translation"]
body:
- type: markdown
attributes:

View File

@@ -2,7 +2,7 @@
name: 🏡 Housekeeping
type: Housekeeping
description: A change pertaining to the codebase itself (developers only)
labels: ["type: housekeeping"]
labels: ["netbox", "type: housekeeping"]
body:
- type: markdown
attributes:

View File

@@ -2,7 +2,7 @@
name: 🗑️ Deprecation
type: Deprecation
description: The removal of an existing feature or resource
labels: ["type: deprecation"]
labels: ["netbox", "type: deprecation"]
body:
- type: textarea
attributes:

View File

@@ -13,9 +13,6 @@ contact_links:
- name: 🌎 Correct a Translation
url: https://explore.transifex.com/netbox-community/netbox/
about: "Spot an incorrect translation? You can propose a fix on Transifex."
- name: 💡 Plugin Idea
url: https://plugin-ideas.netbox.dev
about: "Have an idea for a plugin? Head over to the ideas board!"
- name: 💬 Community Slack
url: https://netdev.chat
about: "Join #netbox on the NetDev Community Slack for assistance with installation issues and other problems."

View File

@@ -1,3 +1,11 @@
paths-ignore:
# Ignore compiled JS
- netbox/project-static/dist
query-filters:
# Exclude py/url-redirection: NetBox uses safe_for_redirect() wrapper function
# which validates all redirects via Django's url_has_allowed_host_and_scheme().
# CodeQL's taint tracking doesn't recognize wrapper functions without custom
# query configuration. See #20484.
- exclude:
id: py/url-redirection

View File

@@ -1,6 +1,6 @@
repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.6.9
rev: v0.14.1
hooks:
- id: ruff
name: "Ruff linter"
@@ -21,14 +21,6 @@ repos:
language: system
pass_filenames: false
types: [python]
- id: openapi-check
name: "Validate OpenAPI schema"
description: "Check for any unexpected changes to the OpenAPI schema"
files: api/.*\.py$
entry: scripts/verify-openapi.sh
language: system
pass_filenames: false
types: [python]
- id: mkdocs-build
name: "Build documentation"
description: "Build the documentation with mkdocs"

View File

@@ -91,7 +91,6 @@ NetBox automatically logs the creation, modification, and deletion of all manage
* Join the conversation on [the discussion forum](https://github.com/netbox-community/netbox/discussions) and [Slack](https://netdev.chat/)!
* Already a power user? You can [suggest a feature](https://github.com/netbox-community/netbox/issues/new?assignees=&labels=type%3A+feature&template=feature_request.yaml) or [report a bug](https://github.com/netbox-community/netbox/issues/new?assignees=&labels=type%3A+bug&template=bug_report.yaml) on GitHub.
* Contributions from the community are encouraged and appreciated! Check out our [contributing guide](CONTRIBUTING.md) to get started.
* [Share your idea](https://plugin-ideas.netbox.dev/) for a new plugin, or [learn how to build one](https://github.com/netbox-community/netbox-plugin-tutorial) yourself!
## Screenshots

View File

@@ -34,4 +34,4 @@ For any security concerns regarding the community-maintained Docker image for Ne
### Bug Bounties
As NetBox is provided as free open source software, we do not offer any monetary compensation for vulnerability or bug reports, however your contributions are greatly appreciated.
As NetBox is provided as free open source software, we do not offer any monetary compensation for vulnerability or bug reports; however, your contributions are greatly appreciated.

View File

@@ -12,9 +12,7 @@ django-cors-headers
# Runtime UI tool for debugging Django
# https://github.com/jazzband/django-debug-toolbar/blob/main/docs/changes.rst
# django-debug-toolbar v6.0.0 raises "Attribute Error at /: 'function' object has no attribute 'set'"
# see https://github.com/netbox-community/netbox/issues/19974
django-debug-toolbar==5.2.0
django-debug-toolbar
# Library for writing reusable URL query filters
# https://github.com/carltongibson/django-filter/blob/main/CHANGES.rst
@@ -30,7 +28,8 @@ django-htmx
# Modified Preorder Tree Traversal (recursive nesting of objects)
# https://github.com/django-mptt/django-mptt/blob/main/CHANGELOG.rst
django-mptt
# v0.18.0 introduces errant migrations which need to be resolved
django-mptt==0.17.0
# Context managers for PostgreSQL advisory locks
# https://github.com/Xof/django-pglocks/blob/master/CHANGES.txt
@@ -70,7 +69,8 @@ django-timezone-field
# A REST API framework for Django projects
# https://www.django-rest-framework.org/community/release-notes/
djangorestframework
# TODO: Re-evaluate the monkey-patch of get_unique_validators() before upgrading
djangorestframework==3.16.1
# Sane and flexible OpenAPI 3 schema generation for Django REST framework.
# https://github.com/tfranzel/drf-spectacular/blob/master/CHANGELOG.rst
@@ -139,8 +139,7 @@ requests
# rq
# https://github.com/rq/rq/blob/master/CHANGES.md
# RQ v2.5 drops support for Redis < 5.0
rq==2.4.1
rq
# Django app for social-auth-core
# https://github.com/python-social-auth/social-app-django/blob/master/CHANGELOG.md
@@ -167,7 +166,8 @@ strawberry-graphql-django
svgwrite
# Tabular dataset library (for table-based exports)
# https://github.com/jazzband/tablib/blob/master/HISTORY.md
# Current: https://github.com/jazzband/tablib/releases
# Previous: https://github.com/jazzband/tablib/blob/master/HISTORY.md
tablib
# Timezone data (required by django-timezone-field on Python 3.9+)

View File

@@ -330,46 +330,120 @@
"100base-lfx",
"100base-tx",
"100base-t1",
"1000base-t",
"1000base-sx",
"1000base-bx10-d",
"1000base-bx10-u",
"1000base-cwdm",
"1000base-cx",
"1000base-dwdm",
"1000base-ex",
"1000base-lsx",
"1000base-lx",
"1000base-lx10",
"1000base-sx",
"1000base-t",
"1000base-tx",
"1000base-zx",
"2.5gbase-t",
"5gbase-t",
"10gbase-t",
"10gbase-br-d",
"10gbase-br-u",
"10gbase-cx4",
"10gbase-er",
"10gbase-lr",
"10gbase-lrm",
"10gbase-lx4",
"10gbase-sr",
"10gbase-t",
"10gbase-zr",
"25gbase-cr",
"25gbase-er",
"25gbase-lr",
"25gbase-sr",
"25gbase-t",
"40gbase-cr4",
"40gbase-er4",
"40gbase-fr4",
"40gbase-lr4",
"40gbase-sr4",
"50gbase-cr",
"50gbase-er",
"50gbase-fr",
"50gbase-lr",
"50gbase-sr",
"100gbase-cr1",
"100gbase-cr2",
"100gbase-cr4",
"100gbase-cr10",
"100gbase-cwdm4",
"100gbase-dr",
"100gbase-er4",
"100gbase-fr1",
"100gbase-lr1",
"100gbase-lr4",
"100gbase-sr1",
"100gbase-sr1.2",
"100gbase-sr2",
"100gbase-sr4",
"100gbase-sr10",
"100gbase-zr",
"200gbase-cr2",
"200gbase-cr4",
"200gbase-dr4",
"200gbase-er4",
"200gbase-fr4",
"200gbase-lr4",
"200gbase-sr2",
"200gbase-sr4",
"200gbase-vr2",
"400gbase-cr4",
"400gbase-dr4",
"400gbase-er8",
"400gbase-fr4",
"400gbase-fr8",
"400gbase-lr4",
"400gbase-lr8",
"400gbase-sr4",
"400gbase-sr4_2",
"400gbase-sr8",
"400gbase-sr16",
"400gbase-vr4",
"400gbase-zr",
"800gbase-cr8",
"800gbase-dr8",
"800gbase-sr8",
"800gbase-vr8",
"100base-x-sfp",
"1000base-x-gbic",
"1000base-x-sfp",
"10gbase-x-sfpp",
"10gbase-x-xfp",
"10gbase-x-xenpak",
"10gbase-x-xfp",
"10gbase-x-x2",
"25gbase-x-sfp28",
"50gbase-x-sfp56",
"40gbase-x-qsfpp",
"50gbase-x-sfp28",
"50gbase-x-sfp56",
"100gbase-x-cfp",
"100gbase-x-cfp2",
"200gbase-x-cfp2",
"400gbase-x-cfp2",
"100gbase-x-cfp4",
"100gbase-x-cxp",
"100gbase-x-cpak",
"100gbase-x-dsfp",
"100gbase-x-sfpdd",
"100gbase-x-qsfp28",
"100gbase-x-qsfpdd",
"100gbase-x-sfpdd",
"200gbase-x-cfp2",
"200gbase-x-qsfp56",
"200gbase-x-qsfpdd",
"400gbase-x-qsfp112",
"400gbase-x-qsfpdd",
"400gbase-x-cdfp",
"400gbase-x-cfp2",
"400gbase-x-cfp8",
"400gbase-x-osfp",
"400gbase-x-osfp-rhs",
"400gbase-x-cdfp",
"400gbase-x-cfp8",
"800gbase-x-qsfpdd",
"800gbase-x-osfp",
"800gbase-x-qsfpdd",
"1000base-kx",
"2.5gbase-kx",
"5gbase-kr",

File diff suppressed because one or more lines are too long

View File

@@ -25,7 +25,7 @@ Once finished, make note of the application (client) ID; this will be used when
![Completed app registration](../../media/authentication/azure_ad_app_registration_created.png)
!!! tip "Multitenant authentication"
NetBox also supports multitenant authentication via Azure AD, however it requires a different backend and an additional configuration parameter. Please see the [`python-social-auth` documentation](https://python-social-auth.readthedocs.io/en/latest/backends/azuread.html#tenant-support) for details concerning multitenant authentication.
NetBox also supports multitenant authentication via Azure AD; however, it requires a different backend and an additional configuration parameter. Please see the [`python-social-auth` documentation](https://python-social-auth.readthedocs.io/en/latest/backends/azuread.html#tenant-support) for details concerning multitenant authentication.
### 3. Create a secret

View File

@@ -106,7 +106,7 @@ This approach can span multiple levels of relations. For example, the following
```
!!! note
While the above query is functional, it's not very efficient. There are ways to optimize such requests, however they are out of scope for this document. For more information, see the [Django queryset method reference](https://docs.djangoproject.com/en/stable/ref/models/querysets/) documentation.
While the above query is functional, it's not very efficient. There are ways to optimize such requests; however, they are out of scope for this document. For more information, see the [Django queryset method reference](https://docs.djangoproject.com/en/stable/ref/models/querysets/) documentation.
Reverse relationships can be traversed as well. For example, the following will find all devices with an interface named "em0":

View File

@@ -0,0 +1,74 @@
# Modeling Pluggable Transceivers
## Use Case
Many network devices utilize field-swappable [small form-factor pluggable transceivers (SFPs)](https://en.wikipedia.org/wiki/Small_Form-factor_Pluggable) to enable changing the physical media type of a fixed interface. For example, a 10 Gigabit Ethernet interface might be connected using copper, multimode fiber, or single-mode fiber, each of which requires a different type of SFP+ transceiver.
It can be challenging to model SFPs given their dynamic nature. This guide intends to capture the recommended strategy for modeling SFPs on NetBox v4.4 and later.
## Modeling Strategy
Pluggable transceivers are most accurately represented in NetBox as discrete [modules](../models/dcim/module.md) which are installed within [module bays](../models/dcim/modulebay.md). A module can deliver one or more [interfaces](../models/dcim/interface.md) (or other components) to the device in which it is installed. This approach ensures that a new interface is automatically created on the device when the module is installed, and deleted when the module is removed.
```mermaid
flowchart BT
interface1[Interface 1/1]--> module1[SFP]
interface2[Interface 2/1]--> module2[SFP]
interface3[Interface 3/1] & interface4[Interface 3/2]--> module3[SFP]
module1 --> modulebay1[Module Bay 1]
module2 --> modulebay2[Module Bay 2]
module3 --> modulebay3[Module Bay 3]
modulebay1 & modulebay2 & modulebay3 --> device[Device]
```
### 1. Create an SFP Module Type Profile
If one has not already been defined, create a [module type profile](../models/dcim/moduletypeprofile.md) for SFPs. This profile will be assigned to all module types which represent a pluggable transceiver. Typically, you will need only one profile for all pluggable transceivers.
You might opt to define custom attributes for the profile by defining a custom [JSON schema](https://json-schema.org/). Profile attributes might be used to define characteristics unique to transceivers, such as optical wavelength and power ranges. Adding profile attributes is optional, and can be done at a later point.
!!! note
Creating a module type profile is optional, but recommended as it allows for defining custom module attributes.
### 2. Create a Module Type for Each SFP Model in Inventory
Next, create a [module type](../models/dcim/moduletype.md) to represent each unique SFP model present in your network. Each module type should define a manufacturer and a unique model name, and may also include a part number. For example, you might create a module type for each of the following transceivers:
| Manufacturer | Model | Media Type |
|--------------|------------------|------------|
| Cisco | SFP-10G-SR | 10GE MMF |
| Cisco | SFP-10G-LR | 10GE SMF |
| Juniper | QFX-QSFP-40G-SR4 | 40GE MMF |
| Juniper | JNP-QSFP-DAC-5M | 40GE DAC |
### 3. Add an Interface to the Module Type
After creating each module type, create an interface template on it to represent its physical interface. The definition of this interface template will depend on the transceiver's physical media type. (Reference the table above for examples.) When a new module is "installed" within a module bay on a device, its templated interface(s) will be automatically instantiated on that device as child interfaces of the module.
Determining which name to use for the transceiver's interface can be tricky, as the interface name might depend on the type of device in which the SFP is installed. To avoid having to rename interfaces, consider using the `{module}` token in place of a static interface name. The interface's name will inherit the position of the bay in which its parent module is installed. If creating multiple interfaces on a module, be sure to append a unique ID (e.g. `{module}:1`) to ensure each interface gets assigned a unique name.
### 4. Create Device Types
If you haven't already, create a [device type](../models/dcim/devicetype.md) to represent each unique device model in your network.
!!! note
Skip this step if you've already created the necessary device types.
### 5. Add Module Bays to the Device Type
Once you've created a device type, add the appropriate number of module bays on each device type to represent its SFP slots. For example, a Juniper QFX5110 would have module bays numbered `0/0/0` through `0/0/55`: 48 SFP+ bays and 8 QSFP28 bays (56 total).
Be sure to define both the name **and position** of each module bay with a unique value. The module bay's position will be used to automatically name SFP interfaces.
### 6. Create a Device
Create a new device using the device type added in the previous step. The module bays (and any other components) defined on the device type will be instantiated on the new device automatically.
!!! note
If you've already created the necessary devices in NetBox, you'll need to add their module bays manually. You can add multiple module bays at once by selecting the desired devices from the device list and selecting **Add Components > Module Bays** at the bottom of the page.
### 7. Add the SFP Modules
Finally, create each SFP in the new device by "installing" a new module of the appropriate type in each module bay. The interface(s) defined on the selected module type will be automatically populated on the new module. If present, the `{module}` token in the name of each interface template will be replaced with the position of the bay in which the module is being installed. For example, an interface template with the name `et-{module}` being created on a module installed in a bay with position `0/0/14` will create an interface named `et-0/0/14`.
When adding many modules at once, you may find it helpful to utilize NetBox's bulk import functionality. This allows you to create many modules at once from CSV, JSON, or YAML data.

View File

@@ -0,0 +1,187 @@
# Performance Handbook
The purpose of this handbook is to help users and administrators use NetBox efficiently. It contains assorted recommendations and best practices compiled over time, intended to serve a wide variety of use cases.
## Server Configuration
### WSGI Server Configuration
NetBox operates as a [Web Server Gateway Interface (WSGI)](https://en.wikipedia.org/wiki/Web_Server_Gateway_Interface) application, which sits behind a frontend HTTP server such as nginx or Apache. The HTTP server handles low-level HTTP request processing and serves static assets, forwarding application-level requests to NetBox via WSGI.
A backend WSGI server (typically [Gunicorn](https://gunicorn.org/) or [uWSGI](https://uwsgi-docs.readthedocs.io/en/latest/)) is responsible for running the NetBox application. This is accomplished by initializing a number of WSGI worker processes which accept WSGI requests relayed from the frontend HTTP server.
Tuning your WSGI server is crucial to realizing optimal performance from NetBox. Below are some recommended configuration parameters.
#### Provision Multiple Workers
General guidance is to set the number of worker processes to double the number of CPU cores available, plus one (`2 * CPUs + 1`).
#### Limit the Worker Lifetime
Set a maximum number of requests that a worker can service before being respawned. This helps protect against potential memory leaks.
#### Set a Request Timeout
Limit the time a worker may spend processing any request. This prevents a long-running request from tying up a worker beyond an acceptable threshold. We suggest a limit of 120 seconds as a reasonable safeguard.
#### Bind Using a Unix Socket
When running the HTTP frontend and WSGI server on the same machine, binding via a Unix socket (instead of a TCP socket) may yield slight performance gains.
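For reference, the recommendations above map onto a Gunicorn configuration roughly as follows. This is a minimal sketch, assuming Gunicorn as the WSGI server; the socket path and numeric values are illustrative placeholders to adjust for your environment, not NetBox defaults.
```python
# gunicorn.py: example tuning values only (a sketch, not a recommended default)
import multiprocessing

# Bind via a Unix socket when the HTTP frontend runs on the same machine
bind = "unix:/run/netbox/gunicorn.sock"

# Provision 2 * CPUs + 1 worker processes
workers = multiprocessing.cpu_count() * 2 + 1

# Respawn workers periodically to guard against memory leaks
max_requests = 5000
max_requests_jitter = 500

# Limit the time a worker may spend on a single request
timeout = 120
```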
### NetBox Configuration
NetBox ships with a reasonable default configuration for most environments, but administrators are encouraged to explore all the [available parameters](../configuration/index.md) to tune their installation. Some of the most notable parameters impacting performance are called out below.
#### Reduce the Maximum Page Size
NetBox paginates large result sets to reduce the overall response size. The [`MAX_PAGE_SIZE`](../configuration/miscellaneous.md#max_page_size) parameter specifies the maximum number of results per page that a client can request. This is set to 1,000 by default. Consider lowering this number if you find that API clients are frequently requesting very large result sets.
#### Limit GraphQL Aliases
By default, NetBox restricts a GraphQL query to 10 aliases. Consider reducing this number by setting [`GRAPHQL_MAX_ALIASES`](../configuration/graphql-api.md#graphql_max_aliases) to a lower value.
#### Designate Isolated Deployments
If your NetBox installation does not have Internet access, set [`ISOLATED_DEPLOYMENT`](../configuration/system.md#isolated_deployment) to True. This will prevent the application from attempting routine external requests.
#### Reduce Sentry Sampling
If [Sentry](https://sentry.io/) has been enabled for error reporting and analytics, consider lowering its sampling rate. This can be accomplished by modifying the values for `sample_rate` and `traces_sample_rate` under [`SENTRY_CONFIG`](../configuration/error-reporting.md#sentry_config).
#### Remove Unneeded Event Handlers
Check whether any custom event handlers have been added under [`EVENTS_PIPELINE`](../configuration/miscellaneous.md#events_pipeline). Remove any that are no longer needed.
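Several of the parameters above are set in `configuration.py`. The snippet below is a minimal sketch with illustrative values only; the appropriate numbers depend on your workload.
```python
# configuration.py: illustrative values for the performance-related parameters above
MAX_PAGE_SIZE = 500              # lower the cap on requested page sizes
GRAPHQL_MAX_ALIASES = 5          # tighten the per-query alias limit
ISOLATED_DEPLOYMENT = True       # skip routine external requests (no Internet access)

SENTRY_CONFIG = {
    "sample_rate": 0.25,         # report one quarter of errors
    "traces_sample_rate": 0.05,  # sample 5% of transactions
}
```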
### Background Task Workers
NetBox defers the execution of certain tasks to Redis queues serviced by one or more background workers. These workers operate asynchronously from the frontend WSGI workers, and process tasks in the order they are enqueued.
NetBox creates three default queues for background tasks: `high`, `default`, and `low`. Additional queues can be configured via the [`QUEUE_MAPPINGS`](../configuration/miscellaneous.md#queue_mappings) configuration parameter.
By default, a background worker (spawned via `manage.py rqworker`) will listen to all available queues. To improve responsiveness to high-priority background tasks, consider dedicating one or more workers to service the `high` queue only:
```
$ ./manage.py rqworker high
19:31:20 Worker 861be45b32214afc95c235beeb19c9fa: started with PID 2300029, version 2.6.0
19:31:20 Worker 861be45b32214afc95c235beeb19c9fa: subscribing to channel rq:pubsub:861be45b32214afc95c235beeb19c9fa
19:31:20 *** Listening on high...
19:31:20 Worker 861be45b32214afc95c235beeb19c9fa: cleaning registries for queue: high
19:31:20 Scheduler for high started with PID 2300096
```
## API Clients
### REST API
NetBox's [REST API](../integrations/rest-api.md) is the primary means of integration with external systems, allowing full create, read, update, and delete (CRUD) operations. There are a few performance considerations to keep in mind when dealing with very large data sets.
#### Use "Brief" Mode for Simple Lists
In cases where you need to retrieve only a minimal representation of objects, append `?brief=True` to the URL. This instructs NetBox to omit all fields except the following:
* ID
* URL
* Display text
* Name (or similar identifier)
* Slug (if present)
* Description
* Counts of notable related objects (where applicable)
For example, a site fetched using brief mode returns only the following:
```json
{
"id": 2,
"url": "https://netbox/api/dcim/sites/2/",
"display": "DM-Akron",
"name": "DM-Akron",
"slug": "dm-akron",
"description": ""
}
```
Omitting all other fields (especially those which fetch and return related objects) often results in much faster queries.
#### Declare Selected Fields
If you need more flexibility regarding the fields to be returned for an object type, you can specify a list of fields to include using the `fields` query parameter. For example, a request for `/api/dcim/sites/?fields=id,name,status,region` will return the following:
```json
{
"id": 2,
"name": "DM-Akron",
"status": {
"value": "active",
"label": "Active"
},
"region": {
"id": 51,
"url": "https://netbox/api/dcim/regions/51/",
"display": "Ohio",
"name": "Ohio",
"slug": "us-oh",
"description": "",
"site_count": 0,
"_depth": 2
}
}
```
Like brief mode, this approach can significantly reduce the response time of an API request by omitting unneeded data.
#### Employ Pagination
Like the user interface, the REST API employs pagination to limit the number of objects returned in a single response. If a page size is not specified by the request (e.g. by passing `?limit=10`), NetBox will use the default size defined by [`PAGINATE_COUNT`](../configuration/default-values.md#paginate_count). The default page size is 50.
For some requests, especially those using brief mode or a minimal selection of fields, it may be desirable to specify a higher page size, so that fewer requests are needed to retrieve all objects. Appending `?limit=0` to the request effectively disables pagination. (Note, however, that the requested page size cannot exceed the value of [`MAX_PAGE_SIZE`](../configuration/miscellaneous.md#max_page_size), which defaults to 1,000.)
Complex API requests, which pull in many related objects, generate a relatively high load on the application, and generally benefit from reduced page size. If you find that your API requests are taking an inordinate amount of time, try reducing the page size from the default value so that fewer objects need to be returned for each request.
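As an illustration, the sketch below walks a paginated endpoint by following the `next` URL returned in each response. It assumes the standard NetBox response envelope (`count`, `next`, `results`) and token authentication; the host, token, and page size are placeholders.
```python
# Fetch all sites page by page (a sketch; adjust the host, token, and limit)
import requests

BASE_URL = "https://netbox.example.com/api/dcim/sites/"
HEADERS = {"Authorization": "Token 0123456789abcdef"}  # placeholder token


def fetch_all(url):
    results = []
    # Request larger pages in brief mode to reduce the number of round trips
    params = {"brief": "true", "limit": 250}
    while url:
        response = requests.get(url, headers=HEADERS, params=params, timeout=30)
        response.raise_for_status()
        payload = response.json()
        results.extend(payload["results"])
        url = payload["next"]  # None once the last page has been retrieved
        params = None          # the "next" URL already carries the query string
    return results


sites = fetch_all(BASE_URL)
print(f"Retrieved {len(sites)} sites")
```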
### GraphQL API
NetBox's read-only [GraphQL API](../integrations/graphql-api.md) offers an alternative to its REST API, and provides a very flexible means of retrieving data. GraphQL enables the client to request any object from a single endpoint, specifying only the desired attributes and relations. Many users prefer this to the more rigid structure of the REST API, but it's important to understand the trade-offs of crafting complex queries.
#### Request Only the Necessary Fields
For optimal performance, craft your GraphQL queries to return only the fields needed by the client. This will reduce the overall query time, especially when omitting related objects.
#### Avoid Overly Complex Queries
The primary benefit of the GraphQL API is that it allows the client to offload to the server the work of stitching together various related objects, which would require the client to make multiple requests to different endpoints if using the REST API. However, this advantage does not come for free: The more information that is requested in a single query, the more work the server needs to do to fetch the raw data from the database and render it into a GraphQL response. Very complex queries can yield dozens or hundreds of SQL queries on the backend, which increase the time it takes to render a response.
While it can be tempting to pack as much data as possible into a single GraphQL query, realize that there is a balance to be struck between minimizing the number of queries needed and avoiding complexity in the interest of performance. For example, while it is possible to retrieve via a single GraphQL API request all the IP addresses and all attached cables for every device in a site, it is probably more efficient (often _much_ more efficient) to make two or three separate requests and correlate the data locally.
#### Use Filters
You can specify filters when making a GraphQL query to limit the set of objects returned. This works a bit differently from the REST API, as filters are declared inside the query statement rather than appended to the URL, but the concept is the same. For example, to return only active sites:
```graphql
query {
site_list(
filters: {
status: STATUS_ACTIVE
}
) {
name
}
}
```
This returns only sites with a status of "active" and avoids needing to parse through all the others. For further information about filters, see the [GraphQL API documentation](../integrations/graphql-api.md).
#### Employ Pagination
Like the REST API, the GraphQL API supports pagination. Queries which return a large number of objects should employ pagination to limit the size of each response.
```graphql
{
device_list(
pagination: {limit: 100}
) {
id
name
serial
status
}
}
```

View File

@@ -17,7 +17,7 @@ CUSTOM_VALIDATORS = {
},
"my_plugin.validators.Validator1"
],
"dim.device": [
"dcim.device": [
"my_plugin.validators.Validator1"
]
}

View File

@@ -4,7 +4,7 @@
This parameter controls the content and layout of a user's default dashboard. Once the dashboard has been created, the user is free to customize it as they please by adding, removing, and reconfiguring widgets.
This parameter must specify an iterable of dictionaries, each representing a discrete dashboard widget and its configuration. The follow widget attributes are supported:
This parameter must specify an iterable of dictionaries, each representing a discrete dashboard widget and its configuration. The following widget attributes are supported:
* `widget`: Dotted path to the Python class (required)
* `width`: Default widget width (between 1 and 12, inclusive)
@@ -63,6 +63,8 @@ DEFAULT_USER_PREFERENCES = {
For a complete list of available preferences, log into NetBox and navigate to `/user/preferences/`. A period in a preference name indicates a level of nesting in the JSON data. The example above maps to `pagination.per_page`.
See also: [Clearing table preferences](../features/user-preferences.md#clearing-table-preferences) for resolving errors caused by saved table columns or ordering.
---
## PAGINATE_COUNT

View File

@@ -1,7 +1,32 @@
# Error Reporting Settings
## SENTRY_CONFIG
A dictionary mapping keyword arguments to values, to be passed to `sentry_sdk.init()`. See the [Sentry Python SDK documentation](https://docs.sentry.io/platforms/python/) for more information on supported parameters.
The default configuration is shown below:
```python
{
"sample_rate": 1.0,
"send_default_pii": False,
"traces_sample_rate": 0,
}
```
Additionally, `http_proxy` and `https_proxy` are set to the HTTP and HTTPS proxies, respectively, configured for NetBox (if any).
## SENTRY_DSN
!!! warning "This parameter will be removed in NetBox v4.5."
Set this using `SENTRY_CONFIG` instead:
```
SENTRY_CONFIG = {
"dsn": "https://examplePublicKey@o0.ingest.sentry.io/0",
}
```
Default: `None`
Defines a Sentry data source name (DSN) for automated error reporting. `SENTRY_ENABLED` must be `True` for this parameter to take effect. For example:
@@ -25,6 +50,15 @@ Set to `True` to enable automatic error reporting via [Sentry](https://sentry.io
## SENTRY_SAMPLE_RATE
!!! warning "This parameter will be removed in NetBox v4.5."
Set this using `SENTRY_CONFIG` instead:
```
SENTRY_CONFIG = {
"sample_rate": 0.2,
}
```
Default: `1.0` (all)
The sampling rate for errors. Must be a value between 0 (disabled) and 1.0 (report on all errors).
@@ -33,6 +67,15 @@ The sampling rate for errors. Must be a value between 0 (disabled) and 1.0 (repo
## SENTRY_SEND_DEFAULT_PII
!!! warning "This parameter will be removed in NetBox v4.5."
Set this using `SENTRY_CONFIG` instead:
```
SENTRY_CONFIG = {
"send_default_pii": True,
}
```
Default: `False`
Maps to the Sentry SDK's [`send_default_pii`](https://docs.sentry.io/platforms/python/configuration/options/#send-default-pii) parameter. If enabled, certain personally identifiable information (PII) is added.
@@ -60,6 +103,15 @@ SENTRY_TAGS = {
## SENTRY_TRACES_SAMPLE_RATE
!!! warning "This parameter will be removed in NetBox v4.5."
Set this using `SENTRY_CONFIG` instead:
```
SENTRY_CONFIG = {
"traces_sample_rate": 0.2,
}
```
Default: `0` (disabled)
The sampling rate for transactions. Must be a value between 0 (disabled) and 1.0 (report on all transactions).

View File

@@ -35,6 +35,7 @@ Some configuration parameters are primarily controlled via NetBox's admin interf
* [`POWERFEED_DEFAULT_MAX_UTILIZATION`](./default-values.md#powerfeed_default_max_utilization)
* [`POWERFEED_DEFAULT_VOLTAGE`](./default-values.md#powerfeed_default_voltage)
* [`PREFER_IPV4`](./miscellaneous.md#prefer_ipv4)
* [`PROTECTION_RULES`](./data-validation.md#protection_rules)
* [`RACK_ELEVATION_DEFAULT_UNIT_HEIGHT`](./default-values.md#rack_elevation_default_unit_height)
* [`RACK_ELEVATION_DEFAULT_UNIT_WIDTH`](./default-values.md#rack_elevation_default_unit_width)

View File

@@ -53,6 +53,16 @@ Sets content for the top banner in the user interface.
---
## COPILOT_ENABLED
!!! tip "Dynamic Configuration Parameter"
Default: `True`
Enables or disables the [NetBox Copilot](https://netboxlabs.com/docs/copilot/) agent globally. When enabled, users can opt to toggle the agent individually.
---
## CENSUS_REPORTING_ENABLED
Default: `True`

View File

@@ -92,7 +92,7 @@ If `True`, the cookie employed for cross-site request forgery (CSRF) protection
Default: `[]`
Defines a list of trusted origins for unsafe (e.g. `POST`) requests. This is a pass-through to Django's [`CSRF_TRUSTED_ORIGINS`](https://docs.djangoproject.com/en/stable/ref/settings/#csrf-trusted-origins) setting. Note that each host listed must specify a scheme (e.g. `http://` or `https://).
Defines a list of trusted origins for unsafe (e.g. `POST`) requests. This is a pass-through to Django's [`CSRF_TRUSTED_ORIGINS`](https://docs.djangoproject.com/en/stable/ref/settings/#csrf-trusted-origins) setting. Note that each host listed must specify a scheme (e.g. `http://` or `https://`).
```python
CSRF_TRUSTED_ORIGINS = (

View File

@@ -257,6 +257,46 @@ The specific configuration settings for each storage backend can be found in the
!!! note
Any keys defined in the `STORAGES` configuration parameter replace those in the default configuration. It is only necessary to define keys within the `STORAGES` for the specific backend(s) you wish to configure.
### Environment Variables and Third-Party Libraries
NetBox uses an explicit Python configuration approach rather than automatic environment variable detection. While this provides clear configuration management and version control capabilities, it affects how some third-party libraries like `django-storages` function within NetBox's context.
Many Django libraries (including `django-storages`) expect to automatically detect environment variables like `AWS_STORAGE_BUCKET_NAME` or `AWS_S3_ACCESS_KEY_ID`. However, NetBox's configuration processing prevents this automatic detection from working as documented in some of these libraries.
When using third-party libraries that rely on environment variable detection, you may need to explicitly read environment variables in your NetBox `configuration.py`:
```python
import os
STORAGES = {
'default': {
'BACKEND': 'storages.backends.s3.S3Storage',
'OPTIONS': {
'bucket_name': os.environ.get('AWS_STORAGE_BUCKET_NAME'),
'access_key': os.environ.get('AWS_S3_ACCESS_KEY_ID'),
'secret_key': os.environ.get('AWS_S3_SECRET_ACCESS_KEY'),
'endpoint_url': os.environ.get('AWS_S3_ENDPOINT_URL'),
'location': 'media/',
}
},
'staticfiles': {
'BACKEND': 'storages.backends.s3.S3Storage',
'OPTIONS': {
'bucket_name': os.environ.get('AWS_STORAGE_BUCKET_NAME'),
'access_key': os.environ.get('AWS_S3_ACCESS_KEY_ID'),
'secret_key': os.environ.get('AWS_S3_SECRET_ACCESS_KEY'),
'endpoint_url': os.environ.get('AWS_S3_ENDPOINT_URL'),
'location': 'static/',
}
},
}
```
This approach works because the environment variables are resolved during NetBox's configuration processing, before the third-party library attempts its own environment variable detection.
!!! warning "Configuration Behavior"
Simply setting environment variables like `AWS_STORAGE_BUCKET_NAME` without explicitly reading them in your configuration will not work. The variables must be read using `os.environ.get()` within your `configuration.py` file.
---
## TIME_ZONE

View File

@@ -404,6 +404,61 @@ A complete date & time. Returns a `datetime.datetime` object.
Custom scripts can be run via the web UI by navigating to the script, completing any required form data, and clicking the "run script" button. It is possible to schedule a script to be executed at a specified time in the future. A scheduled script can be canceled by deleting the associated job result object.
#### Prefilling variables via URL parameters
Script form fields can be prefilled by appending query parameters to the script URL. Each parameter name must match the variable name defined on the script class. Prefilled values are treated as initial values and can be edited before execution. Multiple values can be supplied by repeating the same parameter. Query values must be percent-encoded where required (for example, spaces as `%20`).
Examples:
For string and integer variables, when a script defines:
```python
from extras.scripts import Script, StringVar, IntegerVar
class MyScript(Script):
name = StringVar()
count = IntegerVar()
```
the following URL prefills the `name` and `count` fields:
```
https://<netbox>/extras/scripts/<script_id>/?name=Branch42&count=3
```
For object variables (`ObjectVar`), supply the object's primary key (PK):
```
https://<netbox>/extras/scripts/<script_id>/?device=1
```
If an object ID cannot be resolved or the object is not visible to the requesting user, the field remains unpopulated.
Supported variable types:
| Variable class | Expected input | Example query string |
|--------------------------|---------------------------------|---------------------------------------------|
| `StringVar` | string (percent-encoded) | `?name=Branch42` |
| `TextVar` | string (percent-encoded) | `?notes=Initial%20value` |
| `IntegerVar` | integer | `?count=3` |
| `DecimalVar` | decimal number | `?ratio=0.75` |
| `BooleanVar` | value → `True`; empty → `False` | `?enabled=true` (True), `?enabled=` (False) |
| `ChoiceVar` | choice value (not label) | `?role=edge` |
| `MultiChoiceVar` | choice values (repeat) | `?roles=edge&roles=core` |
| `ObjectVar(Device)` | PK (integer) | `?device=1` |
| `MultiObjectVar(Device)` | PKs (repeat) | `?devices=1&devices=2` |
| `IPAddressVar` | IP address | `?ip=198.51.100.10` |
| `IPAddressWithMaskVar` | IP address with mask | `?addr=192.0.2.1/24` |
| `IPNetworkVar` | IP network prefix | `?network=2001:db8::/64` |
| `DateVar` | date `YYYY-MM-DD` | `?date=2025-01-05` |
| `DateTimeVar` | ISO datetime | `?when=2025-01-05T14:30:00` |
| `FileVar` | — (not supported) | — |
!!! note
- The parameter names above are examples; use the actual variable attribute names defined by the script.
- For `BooleanVar`, only an empty value (`?enabled=`) unchecks the box; any other value including `false` or `0` checks it.
- File uploads (`FileVar`) cannot be prefilled via URL parameters.
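When generating prefill links programmatically, the query string can be assembled with the standard library, repeating parameters for multi-value variables as described above. A minimal sketch; the host, script path, and variable names are placeholders matching the examples in this section.
```python
# Build a script prefill URL (a sketch; host, script ID, and variable names are placeholders)
from urllib.parse import quote, urlencode

base = "https://netbox.example.com/extras/scripts/42/"
params = {
    "name": "Branch 42",        # StringVar (spaces are percent-encoded as %20)
    "count": 3,                 # IntegerVar
    "roles": ["edge", "core"],  # MultiChoiceVar: repeated as roles=edge&roles=core
    "devices": [1, 2],          # MultiObjectVar: primary keys, repeated
}
url = f"{base}?{urlencode(params, doseq=True, quote_via=quote)}"
print(url)
# https://netbox.example.com/extras/scripts/42/?name=Branch%2042&count=3&roles=edge&roles=core&devices=1&devices=2
```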
### Via the API
To run a script via the REST API, issue a POST request to the script's endpoint specifying the form data and commitment. For example, to run a script named `example.MyReport`, we would make a request such as the following:

View File

@@ -2,12 +2,18 @@
The `users.UserConfig` model holds individual preferences for each user in the form of JSON data. This page serves as a manifest of all recognized user preferences in NetBox.
For end-user guidance on resetting saved table layouts, see [Features > User Preferences](../features/user-preferences.md#clearing-table-preferences).
## Available Preferences
| Name | Description |
|--------------------------|---------------------------------------------------------------|
| data_format | Preferred format when rendering raw data (JSON or YAML) |
| pagination.per_page | The number of items to display per page of a paginated table |
| pagination.placement | Where to display the paginator controls relative to the table |
| tables.${table}.columns | The ordered list of columns to display when viewing the table |
| tables.${table}.ordering | A list of column names by which the table should be ordered |
| Name | Description |
|----------------------------|---------------------------------------------------------------|
| `csv_delimiter` | The delimiting character used when exporting CSV data |
| `data_format` | Preferred format when rendering raw data (JSON or YAML) |
| `locale.language` | The language selected for UI translation |
| `pagination.per_page` | The number of items to display per page of a paginated table |
| `pagination.placement` | Where to display the paginator controls relative to the table |
| `tables.${table}.columns` | The ordered list of columns to display when viewing the table |
| `tables.${table}.ordering` | A list of column names by which the table should be ordered |
| `ui.copilot_enabled` | Toggles the NetBox Copilot AI agent |
| `ui.tables.striping` | Toggles visual striping of tables in the UI |

View File

@@ -2,6 +2,8 @@
While NetBox strives to meet the needs of every network, the needs of users to cater to their own unique environments cannot be ignored. NetBox was built with this in mind, and can be customized in many ways to better suit your particular needs.
For end-user personalization topics (bookmarks, table preferences, language, CSV delimiter, and more), see [Features > User Preferences](../features/user-preferences.md).
## Tags
Most objects in NetBox can be assigned user-created tags to aid with organization and filtering. Tag values are completely arbitrary: They may be used to store data in key-value pairs, or they may be employed simply as labels against which objects can be filtered. Each tag can also be assigned a color for quicker differentiation in the user interface.
@@ -18,10 +20,6 @@ The `tag` filter can be specified multiple times to match only objects which hav
GET /api/dcim/devices/?tag=monitored&tag=deprecated
```
## Bookmarks
Users can bookmark their most commonly visited objects for convenient access. Bookmarks are listed under a user's profile, and can be displayed with custom filtering and ordering on the user's personal dashboard.
## Custom Fields
While NetBox provides a rather extensive data model out of the box, the need may arise to store certain additional data associated with NetBox objects. For example, you might need to record the invoice ID alongside an installed device, or record an approving authority when creating a new IP prefix. NetBox administrators can create custom fields on built-in objects to meet these needs.
@@ -38,7 +36,7 @@ Custom links allow you to conveniently reference external resources related to N
http://server.local/vms/?name={{ object.name }}
```
Now, when viewing a virtual machine in NetBox, a user will see a handy button with the chosen title and link (complete with the name of the VM being viewed). Both the text and URL of custom links can be templatized in this manner, and custom links can be grouped together into dropdowns for more efficient display.
Now, when viewing a virtual machine in NetBox, a user will see a handy button with the chosen title and link (complete with the name of the VM being viewed). Both the text and URL of custom links can be templatized in this manner, and custom links can be grouped together into dropdowns for a more efficient display.
To learn more about this feature, check out the [custom link documentation](../customization/custom-links.md).

View File

@@ -0,0 +1,63 @@
# User Preferences
NetBox stores per-user options that control aspects of the web interface and data display. Preferences persist across sessions and can be managed under **User → Preferences**.
## Table configurations
When a list view is configured using **Configure**, NetBox records the selected columns and ordering as per-user table preferences for that table. These preferences are applied automatically on subsequent visits.
### Clearing table preferences
Saved table preferences may need to be reset, for example, if a table fails to render or after an upgrade that changes available columns.
To clear saved preferences for one or more tables:
1. Click the username in the top-right corner.
2. Select **Preferences** from the dropdown.
3. Scroll to the **Table Configurations** section.
4. Select the tables to reset.
5. Click **Submit** to clear the selected preferences.
After clearing preferences, reopen the list view and use **Configure** to set the desired columns and ordering.
!!! note
Per-user table preferences are distinct from **Table Configs**, which are named, reusable configurations managed under *Customization → Table Configs*. Clearing preferences does not delete any Table Configs. See [Table Configs](../models/extras/tableconfig.md) for details.
## Other preferences
### Language
Selects the user interface language from installed translations (subject to system configuration).
### Page length
Sets the default number of rows displayed on paginated tables.
### Paginator placement
Controls where pagination controls are rendered relative to a table.
### HTMX navigation (experimental)
Enables partial-page navigation for supported views. Disable this preference if unexpected behavior is observed.
### Striped table rows
Toggles alternating row backgrounds on tables.
### Data format (raw views)
Sets the default format (JSON or YAML) when rendering raw data blocks.
### CSV delimiter
Overrides the delimiter used when exporting CSV data.
## Bookmarks
Users can bookmark frequently visited objects for convenient access. Bookmarks appear under the user menu and can be displayed on the personal dashboard using the bookmarks widget. See [Bookmark](../models/extras/bookmark.md) for model details.
## Notifications and subscriptions
Users may subscribe to objects to receive notifications when changes occur. Notifications are listed under the user menu and can be marked as read or deleted. See [Features > Notifications](notifications.md) and the data model references for [Subscription](../models/extras/subscription.md) and [Notification](../models/extras/notification.md).
## Admin defaults
Administrators can define defaults for new users via [`DEFAULT_USER_PREFERENCES`](../configuration/default-values.md#default_user_preferences). Users may override these values under their own preferences.
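For example, an administrator might seed new accounts with a larger page size and table striping enabled. A minimal sketch; the keys mirror the dotted preference names listed in the development manifest, expressed as nested dictionaries.
```python
# configuration.py: example defaults for new users (keys mirror the preference names)
DEFAULT_USER_PREFERENCES = {
    "pagination": {
        "per_page": 100,       # pagination.per_page
    },
    "ui": {
        "tables": {
            "striping": True,  # ui.tables.striping
        },
    },
}
```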
## See also
- [Development > User Preferences](../development/user-preferences.md) (manifest of recognized preference keys)

View File

@@ -17,7 +17,7 @@ Dedicate some time to take stock of your own sources of truth for your infrastru
* **Multiple conflicting sources** for a given domain. For example, there may be multiple versions of a spreadsheet circulating, each of which asserts a conflicting set of data.
* **Sources with no domain defined.** You may find that different teams within your organization use different tools for the same purpose, with no formal definition of when each should be used.
* **Inaccessible data formatting.** Some tools are better suited for programmatic usage than others. For example, spreadsheets are generally very easy to parse and export, however free-form notes on wiki or similar application are much more difficult to consume.
* **Inaccessible data formatting.** Some tools are better suited for programmatic usage than others. For example, spreadsheets are generally very easy to parse and export; however, free-form notes on a wiki or similar application are much more difficult to consume.
* **There is no source of truth.** Sometimes you'll find that a source of truth simply doesn't exist for a domain. For example, when assigning IP addresses, operators may be just using any (presumed) available IP from a subnet without ever recording its usage.
See if you can identify each domain of infrastructure data for your organization, and the source of truth for each. Once you have these compiled, you'll need to determine what belongs in NetBox.

View File

@@ -4,6 +4,9 @@ This object represents the saved configuration of an object table in NetBox. Tab
For example, you might wish to create a table config for the devices list to assist in inventory tasks. This view might show the device name, location, serial number, and asset tag, but omit operational details like IP addresses. Once applied, this table config can be saved for reuse in future audits.
!!! note
Per-user table preferences (columns and ordering remembered for an individual user) are distinct from Table Configs. If a list view fails to render due to outdated saved preferences, see [Clearing table preferences](../../features/user-preferences.md#clearing-table-preferences).
## Fields
### Name
@@ -20,7 +23,7 @@ The type of NetBox object to which the table config pertains.
### Table
The name of the specific table to which the table config pertains. (Some NetBox object use multiple tables.)
The name of the specific table to which the table config pertains. (Some NetBox objects use multiple tables.)
### Weight

View File

@@ -60,6 +60,13 @@ Four of the standard Python logging levels are supported:
Log entries recorded using the runner's logger will be saved in the job's log in the database in addition to being processed by other [system logging handlers](../../configuration/system.md#logging).
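For illustration, a minimal sketch of a job emitting log entries through the runner's logger; the `netbox.jobs.JobRunner` import path and the `self.logger` attribute name are assumptions based on the description above.
```python
# A minimal sketch, assuming the runner's logger is exposed as `self.logger`
# and that JobRunner is importable from netbox.jobs (both assumptions).
from netbox.jobs import JobRunner


class NightlyAuditJob(JobRunner):
    class Meta:
        name = "Nightly audit"

    def run(self, *args, **kwargs):
        self.logger.info("Audit started")         # saved to the job's log
        self.logger.warning("3 devices skipped")  # also handled by system logging
```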
### Jobs running for Model instances
A Job can be executed for a specific instance of a Model.
To enable this functionality, the model must include the `JobsMixin`.
When enqueuing a Job, you can associate it with a particular instance by passing that instance to the `instance` parameter.
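A minimal sketch of such a call, reusing the hypothetical `MyTestJob` and `MyModel` names from the example further below:
```python
# A minimal sketch: bind a queued job to a specific model instance via the
# `instance` parameter. The import paths and names are hypothetical.
from .jobs import MyTestJob
from .models import MyModel

obj = MyModel.objects.get(pk=1)
MyTestJob.enqueue(instance=obj)
```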
### Scheduled Jobs
As described above, jobs can be scheduled for immediate execution or for any later time using the `enqueue()` method. However, for management purposes, the `enqueue_once()` method allows a job to be scheduled exactly once, avoiding duplicates. If a job is already scheduled for a particular instance, a second one won't be scheduled; this check is thread-safe. An example use case would be a periodic task that is bound to an instance in general, but not to any particular event of that instance (such as updates). The parameters of the `enqueue_once()` method are identical to those of `enqueue()`.
@@ -73,9 +80,10 @@ As described above, jobs can be scheduled for immediate execution or at any late
from django.db import models
from core.choices import JobIntervalChoices
from netbox.models import NetBoxModel
from netbox.models.features import JobsMixin
from .jobs import MyTestJob
class MyModel(NetBoxModel):
class MyModel(JobsMixin, NetBoxModel):
foo = models.CharField()
def save(self, *args, **kwargs):

View File

@@ -1,6 +1,6 @@
# Filters & Filter Sets
Filter sets define the mechanisms available for filtering or searching through a set of objects in NetBox. For instance, sites can be filtered by their parent region or group, status, facility ID, and so on. The same filter set is used consistently for a model whether the request is made via the UI or REST API. (Note that the GraphQL API uses a separate filter class.) NetBox employs the [django-filters2](https://django-tables2.readthedocs.io/en/latest/) library to define filter sets.
Filter sets define the mechanisms available for filtering or searching through a set of objects in NetBox. For instance, sites can be filtered by their parent region or group, status, facility ID, and so on. The same filter set is used consistently for a model whether the request is made via the UI or REST API. (Note that the GraphQL API uses a separate filter class.) NetBox employs the [django-filter](https://django-filter.readthedocs.io/en/stable/) library to define filter sets.
## FilterSet Classes
@@ -55,6 +55,27 @@ class MyModelViewSet(...):
filterset_class = filtersets.MyModelFilterSet
```
### Implementing Quick Search
The `ObjectListView` has a field called Quick Search. For Quick Search to work, the corresponding FilterSet has to override the `search` method implemented in `NetBoxModelFilterSet`. This method takes a queryset, may perform arbitrary operations on it, and returns it. A common use case is to search for the given value in multiple fields:
```python
from django.db.models import Q
from netbox.filtersets import NetBoxModelFilterSet
class MyFilterSet(NetBoxModelFilterSet):
    ...

    def search(self, queryset, name, value):
        if not value.strip():
            return queryset
        return queryset.filter(
            Q(name__icontains=value) |
            Q(description__icontains=value)
        )
```
The `search` method is also used by the `q` filter in `NetBoxModelFilterSet` which in turn is used by the Search field in the filters tab.
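As a usage sketch (with the hypothetical names from the example above), the same logic can be exercised by passing `q` directly to the filter set:
```python
# A minimal sketch: the `q` filter delegates to search(), so quick-search
# behavior can be verified against the filter set directly.
# The import paths below are hypothetical plugin modules.
from .filtersets import MyFilterSet
from .models import MyModel

queryset = MyModel.objects.all()
filtered = MyFilterSet({'q': 'core-switch'}, queryset=queryset).qs
```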
## Filter Classes
### TagFilter

View File

@@ -66,7 +66,7 @@ The top level is the project root, which can have any name that you like. Immedi
* `README.md` - A brief introduction to your plugin, how to install and configure it, where to find help, and any other pertinent information. It is recommended to write `README` files using a markup language such as Markdown to enable human-friendly display.
* The plugin source directory. This must be a valid Python package name, typically comprising only lowercase letters, numbers, and underscores.
The plugin source directory contains all the actual Python code and other resources used by your plugin. Its structure is left to the author's discretion, however it is recommended to follow best practices as outlined in the [Django documentation](https://docs.djangoproject.com/en/stable/intro/reusable-apps/). At a minimum, this directory **must** contain an `__init__.py` file containing an instance of NetBox's `PluginConfig` class, discussed below.
The plugin source directory contains all the actual Python code and other resources used by your plugin. Its structure is left to the author's discretion; however, it is recommended to follow best practices as outlined in the [Django documentation](https://docs.djangoproject.com/en/stable/intro/reusable-apps/). At a minimum, this directory **must** contain an `__init__.py` file containing an instance of NetBox's `PluginConfig` class, discussed below.
**Note:** The [Cookiecutter NetBox Plugin](https://github.com/netbox-community/cookiecutter-netbox-plugin) can be used to auto-generate all the needed directories and files for a new plugin.
@@ -186,7 +186,7 @@ Many of these are self-explanatory, but for more information, see the [pyproject
## Create a Virtual Environment
It is strongly recommended to create a Python [virtual environment](https://docs.python.org/3/tutorial/venv.html) for the development of your plugin, as opposed to using system-wide packages. This will afford you complete control over the installed versions of all dependencies and avoid conflict with system packages. This environment can live wherever you'd like, however it should be excluded from revision control. (A popular convention is to keep all virtual environments in the user's home directory, e.g. `~/.virtualenvs/`.)
It is strongly recommended to create a Python [virtual environment](https://docs.python.org/3/tutorial/venv.html) for the development of your plugin, as opposed to using system-wide packages. This will afford you complete control over the installed versions of all dependencies and avoid conflict with system packages. This environment can live wherever you'd like; however, it should be excluded from revision control. (A popular convention is to keep all virtual environments in the user's home directory, e.g. `~/.virtualenvs/`.)
```shell
python3 -m venv ~/.virtualenvs/my_plugin

View File

@@ -47,6 +47,11 @@ table.configure(request)
This will automatically apply any user-specific preferences for the table. (If using a generic view provided by NetBox, table configuration is handled automatically.)
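For plugin views that build their own tables, a minimal sketch (with hypothetical `MyModel`, `MyModelTable`, and template names) might look like this:
```python
# A minimal sketch of a plugin view applying per-user table preferences.
# MyModel, MyModelTable, and the template path are hypothetical.
from django.shortcuts import render

from .models import MyModel
from .tables import MyModelTable


def mymodel_list(request):
    table = MyModelTable(MyModel.objects.all())
    table.configure(request)  # apply the user's saved columns and ordering
    return render(request, 'my_plugin/mymodel_list.html', {'table': table})
```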
### Bulk Edit and Delete Actions
Bulk edit and delete buttons are automatically added to the table if an appropriate view is registered under the `${modelname}_bulk_edit` or `${modelname}_bulk_delete` path name, as in the sketch below.
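A hedged sketch of registering such views, following the `register_model_view` pattern used throughout NetBox; `MyModel` and the view class names are hypothetical, and real views typically also declare a table, filterset, and (for bulk edit) a form:
```python
# A minimal sketch: registering bulk edit/delete views so the corresponding
# table buttons appear. MyModel and the view classes are hypothetical; real
# views usually also set table, filterset, and form attributes.
from netbox.views import generic
from utilities.views import register_model_view

from .models import MyModel


@register_model_view(MyModel, 'bulk_edit', path='edit', detail=False)
class MyModelBulkEditView(generic.BulkEditView):
    queryset = MyModel.objects.all()


@register_model_view(MyModel, 'bulk_delete', path='delete', detail=False)
class MyModelBulkDeleteView(generic.BulkDeleteView):
    queryset = MyModel.objects.all()
```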
## Columns
The table column classes listed below are supported for use in plugins. These classes can be imported from `netbox.tables.columns`.

View File

@@ -357,7 +357,7 @@ And the response:
...
```
All GraphQL requests are made at the `/graphql` URL (which also serves the GraphiQL UI). The API is currently read-only, however users who wish to disable it until needed can do so by setting the `GRAPHQL_ENABLED` configuration parameter to False. For more detail on NetBox's GraphQL implementation, see [the GraphQL API documentation](../integrations/graphql-api.md).
All GraphQL requests are made at the `/graphql` URL (which also serves the GraphiQL UI). The API is currently read-only; however, users who wish to disable it until needed can do so by setting the `GRAPHQL_ENABLED` configuration parameter to False. For more detail on NetBox's GraphQL implementation, see [the GraphQL API documentation](../integrations/graphql-api.md).
#### IP Ranges ([#834](https://github.com/netbox-community/netbox/issues/834))

View File

@@ -1,5 +1,145 @@
# NetBox v4.4
## v4.4.5 (2025-10-28)
### Enhancements
* [#19751](https://github.com/netbox-community/netbox/issues/19751) - Disable occupied module bays in form dropdowns when installing a new module
* [#20301](https://github.com/netbox-community/netbox/issues/20301) - Add a "dismiss all" option to the notifications dropdown
* [#20399](https://github.com/netbox-community/netbox/issues/20399) - Add `assigned` and `primary` boolean filters for MAC addresses
* [#20567](https://github.com/netbox-community/netbox/issues/20567) - Add contacts column to services table
* [#20675](https://github.com/netbox-community/netbox/issues/20675) - Enable [NetBox Copilot](https://netboxlabs.com/products/netbox-copilot/) integration
* [#20692](https://github.com/netbox-community/netbox/issues/20692) - Add contacts column to IP addresses table
* [#20700](https://github.com/netbox-community/netbox/issues/20700) - Add contacts table column for various additional models
### Bug Fixes
* [#19872](https://github.com/netbox-community/netbox/issues/19872) - Ensure custom script validation failures display error messages
* [#20389](https://github.com/netbox-community/netbox/issues/20389) - Fix "select all" behavior for bulk rename views
* [#20422](https://github.com/netbox-community/netbox/issues/20422) - Enable filtering of aggregates and prefixes by family in GraphQL API
* [#20459](https://github.com/netbox-community/netbox/issues/20459) - Fix validation of `is_oob` & `is_primary` fields under IP address bulk import
* [#20466](https://github.com/netbox-community/netbox/issues/20466) - Fix querying of devices with a primary IP assigned in GraphQL API
* [#20498](https://github.com/netbox-community/netbox/issues/20498) - Enforce the validation regex (if set) for custom URL fields
* [#20524](https://github.com/netbox-community/netbox/issues/20524) - Raise a validation error when attempting to schedule a custom script for a past date/time
* [#20541](https://github.com/netbox-community/netbox/issues/20541) - Fix resolution of GraphQL object fields which rely on custom filters
* [#20551](https://github.com/netbox-community/netbox/issues/20551) - Fix automatic slug generation in quick-add UI form
* [#20606](https://github.com/netbox-community/netbox/issues/20606) - Enable copying of values from table columns rendered as badges
* [#20641](https://github.com/netbox-community/netbox/issues/20641) - Fix `AttributeError` exception raised by the object changes REST API endpoint
* [#20646](https://github.com/netbox-community/netbox/issues/20646) - Prevent cables from connecting to objects marked as connected
* [#20655](https://github.com/netbox-community/netbox/issues/20655) - Fix `FieldError` exception when attempting to sort permissions list by actions
---
## v4.4.4 (2025-10-15)
### Bug Fixes
* [#20554](https://github.com/netbox-community/netbox/issues/20554) - Fix generic relation filters to accept `<app>.<model>` format matching POST requests
* [#20574](https://github.com/netbox-community/netbox/issues/20574) - Fix excessive storage initialization overhead when listing scripts with remote backends
* [#20584](https://github.com/netbox-community/netbox/issues/20584) - Enforce PoE mode requirement on interface templates when PoE type is set
* [#20585](https://github.com/netbox-community/netbox/issues/20585) - Fix API schema generation crash for models with single-field UniqueConstraints
* [#20587](https://github.com/netbox-community/netbox/issues/20587) - Fix upgrade.sh failure when removing stale content types
---
## v4.4.3 (2025-10-14)
### Enhancements
* [#20426](https://github.com/netbox-community/netbox/issues/20426) - Add a copy-to-clipboard button for custom script output
* [#20516](https://github.com/netbox-community/netbox/issues/20516) - Improve rendering of VLAN ID ranges in VLAN group tables
### Bug Fixes
* [#19302](https://github.com/netbox-community/netbox/issues/19302) - Fix uniqueness validation in REST API for nullable fields
* [#19615](https://github.com/netbox-community/netbox/issues/19615) - Fix support for static file parameters in templates when external storage is in use
* [#19818](https://github.com/netbox-community/netbox/issues/19818) - Hide primary IP assignment fields when creating a new virtual machine in the UI
* [#19825](https://github.com/netbox-community/netbox/issues/19825) - Prevent cache for config revisions from being erroneously overwritten when debugging is enabled
* [#20140](https://github.com/netbox-community/netbox/issues/20140) - Changing a site's region or group should update any associated circuit terminations
* [#20156](https://github.com/netbox-community/netbox/issues/20156) - Fix display of rack elevation labels
* [#20290](https://github.com/netbox-community/netbox/issues/20290) - Fix migration error when upgrading to NetBox v4.4 from releases earlier than v4.3
* [#20471](https://github.com/netbox-community/netbox/issues/20471) - Saving an unmodified VLAN group should not generate a change record
* [#20475](https://github.com/netbox-community/netbox/issues/20475) - Collapse singleton VLAN IDs in VLAN group display
* [#20494](https://github.com/netbox-community/netbox/issues/20494) - Correct OpenAPI schema definition for `IntegerRangeSerializer`
* [#20496](https://github.com/netbox-community/netbox/issues/20496) - REST API should always honor `MAX_PAGE_SIZE` value
* [#20497](https://github.com/netbox-community/netbox/issues/20497) - Fix filtering of VLAN groups by VLAN ID range in GraphQL API
* [#20507](https://github.com/netbox-community/netbox/issues/20507) - Fix support for fetching ASN contacts via GraphQL API
* [#20523](https://github.com/netbox-community/netbox/issues/20523) - Hide password change form for users authenticated via SSO
* [#20542](https://github.com/netbox-community/netbox/issues/20542) - Fix the creation of MAC addresses using the "quick add" form
---
## v4.4.2 (2025-09-30)
### Enhancements
* [#17010](https://github.com/netbox-community/netbox/issues/17010) - Show admin navigation menu items only for staff & superusers
* [#19590](https://github.com/netbox-community/netbox/issues/19590) - Add columns for device site & location to device component tables
* [#19765](https://github.com/netbox-community/netbox/issues/19765) - Linkify assigned object types under saved filter view
* [#20308](https://github.com/netbox-community/netbox/issues/20308) - Add a hotkey (`/`) for the global search field
* [#20332](https://github.com/netbox-community/netbox/issues/20332) - Add a "none" option to object tag filters
* [#20380](https://github.com/netbox-community/netbox/issues/20380) - Introduce the `SENTRY_CONFIG` configuration parameter
* [#20412](https://github.com/netbox-community/netbox/issues/20412) - Linkify cluster type on virtual machine detail view
* [#20438](https://github.com/netbox-community/netbox/issues/20438) - Add `facility` field to bulk edit forms for sites and locations
### Bug Fixes
* [#18878](https://github.com/netbox-community/netbox/issues/18878) - Automatically assign a designated primary MAC address upon creation of a new interface
* [#20243](https://github.com/netbox-community/netbox/issues/20243) - Prevent scheduled system jobs from re-running multiple times
* [#20253](https://github.com/netbox-community/netbox/issues/20253) - Fix support for filtering object contact assignments in GraphQL API
* [#20365](https://github.com/netbox-community/netbox/issues/20365) - Address various inaccuracies in generated OpenAPI schema
* [#20375](https://github.com/netbox-community/netbox/issues/20375) - Preserve filter parameters when performing bulk operations
* [#20390](https://github.com/netbox-community/netbox/issues/20390) - Fix styling of page size selection dropdown
* [#20392](https://github.com/netbox-community/netbox/issues/20392) - Clean up ordering of interface type options
* [#20398](https://github.com/netbox-community/netbox/issues/20398) - Fix misleading error reporting for min/max custom field values
* [#20419](https://github.com/netbox-community/netbox/issues/20419) - Correct action buttons for child object views
* [#20425](https://github.com/netbox-community/netbox/issues/20425) - Fix Markdown preview functionality within "quick add" modal
* [#20441](https://github.com/netbox-community/netbox/issues/20441) - Fix display of the "groups" column in contact assignments table
---
## v4.4.1 (2025-09-16)
### Enhancements
* [#15492](https://github.com/netbox-community/netbox/issues/15492) - Enable cloning of permissions
* [#16381](https://github.com/netbox-community/netbox/issues/16381) - Display script result timestamps in system timezone
* [#19262](https://github.com/netbox-community/netbox/issues/19262) - No longer restrict FHRP group assignment by assigned IP address
* [#19408](https://github.com/netbox-community/netbox/issues/19408) - Support export templates for circuit terminations and virtual circuit terminations
* [#19428](https://github.com/netbox-community/netbox/issues/19428) - Add an optional U height field to the devices table
* [#19547](https://github.com/netbox-community/netbox/issues/19547) - Add individual "sync" buttons in data sources table
* [#19865](https://github.com/netbox-community/netbox/issues/19865) - Reorganize cable type groupings
* [#20222](https://github.com/netbox-community/netbox/issues/20222) - Enable the `HttpOnly` flag for CSRF cookie
* [#20237](https://github.com/netbox-community/netbox/issues/20237) - Include VPN tunnel groups in global search results
* [#20241](https://github.com/netbox-community/netbox/issues/20241) - Record A & B terminations in cable changelog data
* [#20277](https://github.com/netbox-community/netbox/issues/20277) - Add support for attribute assignment to `deserialize_object()` utility
* [#20321](https://github.com/netbox-community/netbox/issues/20321) - Add physical media types for transceiver interfaces
* [#20347](https://github.com/netbox-community/netbox/issues/20347) - Add Wi-Fi Alliance aliases to 802.11 interface types
### Bug Fixes
* [#19729](https://github.com/netbox-community/netbox/issues/19729) - Restore `kind` filter for interfaces in GraphQL API
* [#19744](https://github.com/netbox-community/netbox/issues/19744) - Plugins list should be orderable by "active" column
* [#19851](https://github.com/netbox-community/netbox/issues/19851) - Fix `ValueError` complaining of missing `scope` when bulk importing wireless LANs
* [#19896](https://github.com/netbox-community/netbox/issues/19896) - Min/max values for decimal custom fields should accept decimal values
* [#20197](https://github.com/netbox-community/netbox/issues/20197) - Correct validation for virtual chassis parent interface
* [#20215](https://github.com/netbox-community/netbox/issues/20215) - All GraphQL filters for config contexts should be optional
* [#20217](https://github.com/netbox-community/netbox/issues/20217) - Remove "0 VLANs available" row at end of VLAN range table
* [#20221](https://github.com/netbox-community/netbox/issues/20221) - JSON fields should not coerce empty dictionaries to null
* [#20227](https://github.com/netbox-community/netbox/issues/20227) - Ensure consistent padding of Markdown content
* [#20234](https://github.com/netbox-community/netbox/issues/20234) - Fix "add" button link for prerequisite object warning in UI
* [#20236](https://github.com/netbox-community/netbox/issues/20236) - Strip invalid characters from uploaded image file names
* [#20238](https://github.com/netbox-community/netbox/issues/20238) - Fix support for outside IP assignment during bulk import of tunnel terminations
* [#20242](https://github.com/netbox-community/netbox/issues/20242) - Avoid `AttributeError` exception on background jobs with no request ID
* [#20252](https://github.com/netbox-community/netbox/issues/20252) - Remove generic AddObject from ObjectChildrenView to prevent duplicate "add" buttons
* [#20264](https://github.com/netbox-community/netbox/issues/20264) - Fix rendering of default icon in plugins list
* [#20272](https://github.com/netbox-community/netbox/issues/20272) - ConfigContexts assigned to ancestor locations should apply to device/VM
* [#20282](https://github.com/netbox-community/netbox/issues/20282) - Fix styling of prerequisite objects warning
* [#20298](https://github.com/netbox-community/netbox/issues/20298) - Display a placeholder when an image thumbnail fails to load
* [#20327](https://github.com/netbox-community/netbox/issues/20327) - Avoid calling `distinct()` on device/VM queryset when fetching config context data
---
## v4.4.0 (2025-09-02)
### New Features

View File

@@ -86,6 +86,7 @@ nav:
- Change Logging: 'features/change-logging.md'
- Journaling: 'features/journaling.md'
- Event Rules: 'features/event-rules.md'
- User Preferences: 'features/user-preferences.md'
- Notifications: 'features/notifications.md'
- Background Jobs: 'features/background-jobs.md'
- Auth & Permissions: 'features/authentication-permissions.md'
@@ -124,6 +125,9 @@ nav:
- Export Templates: 'customization/export-templates.md'
- Reports: 'customization/reports.md'
- Custom Scripts: 'customization/custom-scripts.md'
- Best Practices:
- Modeling Pluggable Transceivers: 'best-practices/modeling-pluggable-transceivers.md'
- Performance Handbook: 'best-practices/performance-handbook.md'
- Integrations:
- REST API: 'integrations/rest-api.md'
- GraphQL API: 'integrations/graphql-api.md'

View File

@@ -1,5 +1,7 @@
from django.apps import AppConfig
from netbox import denormalized
class CircuitsConfig(AppConfig):
name = "circuits"
@@ -8,6 +10,16 @@ class CircuitsConfig(AppConfig):
def ready(self):
from netbox.models.features import register_models
from . import signals, search # noqa: F401
from .models import CircuitTermination
# Register models
register_models(*self.get_models())
denormalized.register(CircuitTermination, '_site', {
'_region': 'region',
'_site_group': 'group',
})
denormalized.register(CircuitTermination, '_location', {
'_site': 'site',
})

View File

@@ -6,7 +6,6 @@ from django.urls import reverse
from django.utils.translation import gettext_lazy as _
from circuits.choices import *
from circuits.constants import *
from dcim.models import CabledObjectModel
from netbox.models import ChangeLoggedModel, OrganizationalModel, PrimaryModel
from netbox.models.mixins import DistanceMixin
@@ -231,6 +230,7 @@ class CircuitGroupAssignment(CustomFieldsMixin, ExportTemplatesMixin, TagsMixin,
class CircuitTermination(
CustomFieldsMixin,
CustomLinksMixin,
ExportTemplatesMixin,
TagsMixin,
ChangeLoggedModel,
CabledObjectModel

View File

@@ -8,7 +8,7 @@ from django.utils.translation import gettext_lazy as _
from circuits.choices import *
from netbox.models import ChangeLoggedModel, PrimaryModel
from netbox.models.features import CustomFieldsMixin, CustomLinksMixin, TagsMixin
from netbox.models.features import CustomFieldsMixin, CustomLinksMixin, ExportTemplatesMixin, TagsMixin
from .base import BaseCircuitType
__all__ = (
@@ -121,6 +121,7 @@ class VirtualCircuit(PrimaryModel):
class VirtualCircuitTermination(
CustomFieldsMixin,
CustomLinksMixin,
ExportTemplatesMixin,
TagsMixin,
ChangeLoggedModel
):

View File

@@ -83,6 +83,7 @@ class ProviderBulkEditView(generic.BulkEditView):
@register_model_view(Provider, 'bulk_rename', path='rename', detail=False)
class ProviderBulkRenameView(generic.BulkRenameView):
queryset = Provider.objects.all()
filterset = filtersets.ProviderFilterSet
@register_model_view(Provider, 'bulk_delete', path='delete', detail=False)
@@ -150,6 +151,7 @@ class ProviderAccountBulkEditView(generic.BulkEditView):
@register_model_view(ProviderAccount, 'bulk_rename', path='rename', detail=False)
class ProviderAccountBulkRenameView(generic.BulkRenameView):
queryset = ProviderAccount.objects.all()
filterset = filtersets.ProviderAccountFilterSet
@register_model_view(ProviderAccount, 'bulk_delete', path='delete', detail=False)
@@ -226,6 +228,7 @@ class ProviderNetworkBulkEditView(generic.BulkEditView):
@register_model_view(ProviderNetwork, 'bulk_rename', path='rename', detail=False)
class ProviderNetworkBulkRenameView(generic.BulkRenameView):
queryset = ProviderNetwork.objects.all()
filterset = filtersets.ProviderNetworkFilterSet
@register_model_view(ProviderNetwork, 'bulk_delete', path='delete', detail=False)
@@ -290,6 +293,7 @@ class CircuitTypeBulkEditView(generic.BulkEditView):
@register_model_view(CircuitType, 'bulk_rename', path='rename', detail=False)
class CircuitTypeBulkRenameView(generic.BulkRenameView):
queryset = CircuitType.objects.all()
filterset = filtersets.CircuitTypeFilterSet
@register_model_view(CircuitType, 'bulk_delete', path='delete', detail=False)
@@ -362,6 +366,7 @@ class CircuitBulkEditView(generic.BulkEditView):
class CircuitBulkRenameView(generic.BulkRenameView):
queryset = Circuit.objects.all()
field_name = 'cid'
filterset = filtersets.CircuitFilterSet
@register_model_view(Circuit, 'bulk_delete', path='delete', detail=False)
@@ -557,6 +562,7 @@ class CircuitGroupBulkEditView(generic.BulkEditView):
@register_model_view(CircuitGroup, 'bulk_rename', path='rename', detail=False)
class CircuitGroupBulkRenameView(generic.BulkRenameView):
queryset = CircuitGroup.objects.all()
filterset = filtersets.CircuitGroupFilterSet
@register_model_view(CircuitGroup, 'bulk_delete', path='delete', detail=False)
@@ -672,6 +678,7 @@ class VirtualCircuitTypeBulkEditView(generic.BulkEditView):
@register_model_view(VirtualCircuitType, 'bulk_rename', path='rename', detail=False)
class VirtualCircuitTypeBulkRenameView(generic.BulkRenameView):
queryset = VirtualCircuitType.objects.all()
filterset = filtersets.VirtualCircuitTypeFilterSet
@register_model_view(VirtualCircuitType, 'bulk_delete', path='delete', detail=False)
@@ -744,6 +751,7 @@ class VirtualCircuitBulkEditView(generic.BulkEditView):
class VirtualCircuitBulkRenameView(generic.BulkRenameView):
queryset = VirtualCircuit.objects.all()
field_name = 'cid'
filterset = filtersets.VirtualCircuitFilterSet
@register_model_view(VirtualCircuit, 'bulk_delete', path='delete', detail=False)

View File

@@ -282,18 +282,18 @@ class FixSerializedPKRelatedField(OpenApiSerializerFieldExtension):
class FixIntegerRangeSerializerSchema(OpenApiSerializerExtension):
target_class = 'netbox.api.fields.IntegerRangeSerializer'
match_subclasses = True
def map_serializer(self, auto_schema: 'AutoSchema', direction: Direction) -> _SchemaType:
# One range = two integers; many=True will wrap this in an outer array
return {
'type': 'array',
'items': {
'type': 'array',
'items': {
'type': 'integer',
},
'minItems': 2,
'maxItems': 2,
'type': 'integer',
},
'minItems': 2,
'maxItems': 2,
'example': [10, 20],
}

View File

@@ -13,7 +13,7 @@ class BackgroundTaskSerializer(serializers.Serializer):
url = serializers.HyperlinkedIdentityField(
view_name='core-api:rqtask-detail',
lookup_field='id',
lookup_url_kwarg='pk'
lookup_url_kwarg='id'
)
description = serializers.CharField()
origin = serializers.CharField()

View File

@@ -5,7 +5,7 @@ from django_rq.queues import get_redis_connection
from django_rq.settings import QUEUES_LIST
from django_rq.utils import get_statistics
from drf_spectacular.types import OpenApiTypes
from drf_spectacular.utils import extend_schema
from drf_spectacular.utils import OpenApiParameter, extend_schema
from rest_framework import viewsets
from rest_framework.decorators import action
from rest_framework.exceptions import PermissionDenied
@@ -24,6 +24,7 @@ from netbox.api.authentication import IsAuthenticatedOrLoginNotRequired
from netbox.api.metadata import ContentTypeMetadata
from netbox.api.pagination import LimitOffsetListPagination
from netbox.api.viewsets import NetBoxModelViewSet, NetBoxReadOnlyModelViewSet
from . import serializers
@@ -117,29 +118,49 @@ class BaseRQViewSet(viewsets.ViewSet):
def get_serializer(self, *args, **kwargs):
"""
Return the serializer instance that should be used for validating and
deserializing input, and for serializing output.
deserializing input and for serializing output.
"""
serializer_class = self.get_serializer_class()
kwargs['context'] = self.get_serializer_context()
return serializer_class(*args, **kwargs)
def get_serializer_class(self):
"""
Return the class to use for the serializer.
"""
return self.serializer_class
def get_serializer_context(self):
"""
Extra context provided to the serializer class.
"""
return {
'request': self.request,
'format': self.format_kwarg,
'view': self,
}
class BackgroundQueueViewSet(BaseRQViewSet):
"""
Retrieve a list of RQ Queues.
Note: Queue names are not URL safe so not returning a detail view.
Note: Queue names are not URL safe, so not returning a detail view.
"""
serializer_class = serializers.BackgroundQueueSerializer
lookup_field = 'name'
lookup_value_regex = r'[\w.@+-]+'
def get_view_name(self):
return "Background Queues"
return 'Background Queues'
def get_data(self):
return get_statistics(run_maintenance_tasks=True)["queues"]
return get_statistics(run_maintenance_tasks=True)['queues']
@extend_schema(responses={200: OpenApiTypes.OBJECT})
@extend_schema(
operation_id='core_background_queues_retrieve_by_name',
parameters=[OpenApiParameter(name='name', type=OpenApiTypes.STR, location=OpenApiParameter.PATH)],
responses={200: OpenApiTypes.OBJECT},
)
def retrieve(self, request, name):
data = self.get_data()
if not data:
@@ -161,12 +182,17 @@ class BackgroundWorkerViewSet(BaseRQViewSet):
lookup_field = 'name'
def get_view_name(self):
return "Background Workers"
return 'Background Workers'
def get_data(self):
config = QUEUES_LIST[0]
return Worker.all(get_redis_connection(config['connection_config']))
@extend_schema(
operation_id='core_background_workers_retrieve_by_name',
parameters=[OpenApiParameter(name='name', type=OpenApiTypes.STR, location=OpenApiParameter.PATH)],
responses={200: OpenApiTypes.OBJECT},
)
def retrieve(self, request, name):
# all the RQ queues should use the same connection
config = QUEUES_LIST[0]
@@ -184,9 +210,10 @@ class BackgroundTaskViewSet(BaseRQViewSet):
Retrieve a list of RQ Tasks.
"""
serializer_class = serializers.BackgroundTaskSerializer
lookup_field = 'id'
def get_view_name(self):
return "Background Tasks"
return 'Background Tasks'
def get_data(self):
return get_rq_jobs()
@@ -199,45 +226,53 @@ class BackgroundTaskViewSet(BaseRQViewSet):
return task
@extend_schema(responses={200: OpenApiTypes.OBJECT})
def retrieve(self, request, pk):
@extend_schema(
operation_id='core_background_tasks_retrieve_by_id',
parameters=[OpenApiParameter(name='id', type=OpenApiTypes.STR, location=OpenApiParameter.PATH)],
responses={200: OpenApiTypes.OBJECT},
)
def retrieve(self, request, id):
"""
Retrieve the details of the specified RQ Task.
"""
task = self.get_task_from_id(pk)
task = self.get_task_from_id(id)
serializer = self.serializer_class(task, context={'request': request})
return Response(serializer.data)
@action(methods=["POST"], detail=True)
def delete(self, request, pk):
@extend_schema(parameters=[OpenApiParameter(name='id', type=OpenApiTypes.STR, location=OpenApiParameter.PATH)])
@action(methods=['POST'], detail=True)
def delete(self, request, id):
"""
Delete the specified RQ Task.
"""
delete_rq_job(pk)
delete_rq_job(id)
return HttpResponse(status=200)
@action(methods=["POST"], detail=True)
def requeue(self, request, pk):
@extend_schema(parameters=[OpenApiParameter(name='id', type=OpenApiTypes.STR, location=OpenApiParameter.PATH)])
@action(methods=['POST'], detail=True)
def requeue(self, request, id):
"""
Requeues the specified RQ Task.
"""
requeue_rq_job(pk)
requeue_rq_job(id)
return HttpResponse(status=200)
@action(methods=["POST"], detail=True)
def enqueue(self, request, pk):
@extend_schema(parameters=[OpenApiParameter(name='id', type=OpenApiTypes.STR, location=OpenApiParameter.PATH)])
@action(methods=['POST'], detail=True)
def enqueue(self, request, id):
"""
Enqueues the specified RQ Task.
"""
enqueue_rq_job(pk)
enqueue_rq_job(id)
return HttpResponse(status=200)
@action(methods=["POST"], detail=True)
def stop(self, request, pk):
@extend_schema(parameters=[OpenApiParameter(name='id', type=OpenApiTypes.STR, location=OpenApiParameter.PATH)])
@action(methods=['POST'], detail=True)
def stop(self, request, id):
"""
Stops the specified RQ Task.
"""
stopped_jobs = stop_rq_job(pk)
stopped_jobs = stop_rq_job(id)
if len(stopped_jobs) == 1:
return HttpResponse(status=200)
else:

View File

@@ -80,6 +80,7 @@ class JobFilterSet(BaseFilterSet):
method='search',
label=_('Search'),
)
object_type = ContentTypeFilter()
created = django_filters.DateTimeFilter()
created__before = django_filters.DateTimeFilter(
field_name='created',
@@ -169,6 +170,7 @@ class ObjectChangeFilterSet(BaseFilterSet):
changed_object_type_id = django_filters.ModelMultipleChoiceFilter(
queryset=ContentType.objects.all()
)
related_object_type = ContentTypeFilter()
user_id = django_filters.ModelMultipleChoiceFilter(
queryset=User.objects.all(),
label=_('User (ID)'),

View File

@@ -166,8 +166,8 @@ class ConfigRevisionForm(forms.ModelForm, metaclass=ConfigFormMetaclass):
FieldSet('CUSTOM_VALIDATORS', 'PROTECTION_RULES', name=_('Validation')),
FieldSet('DEFAULT_USER_PREFERENCES', name=_('User Preferences')),
FieldSet(
'MAINTENANCE_MODE', 'GRAPHQL_ENABLED', 'CHANGELOG_RETENTION', 'JOB_RETENTION', 'MAPS_URL',
name=_('Miscellaneous')
'MAINTENANCE_MODE', 'COPILOT_ENABLED', 'GRAPHQL_ENABLED', 'CHANGELOG_RETENTION', 'JOB_RETENTION',
'MAPS_URL', name=_('Miscellaneous'),
),
FieldSet('comment', name=_('Config Revision'))
)

View File

@@ -3,12 +3,12 @@ from typing import Annotated, List, TYPE_CHECKING
import strawberry
import strawberry_django
from django.contrib.contenttypes.models import ContentType
from strawberry.types import Info
from core.models import ObjectChange
if TYPE_CHECKING:
from core.graphql.types import DataFileType, DataSourceType
from netbox.core.graphql.types import ObjectChangeType
from core.graphql.types import DataFileType, DataSourceType, ObjectChangeType
__all__ = (
'ChangelogMixin',
@@ -20,7 +20,7 @@ __all__ = (
class ChangelogMixin:
@strawberry_django.field
def changelog(self, info) -> List[Annotated["ObjectChangeType", strawberry.lazy('.types')]]: # noqa: F821
def changelog(self, info: Info) -> List[Annotated['ObjectChangeType', strawberry.lazy('.types')]]: # noqa: F821
content_type = ContentType.objects.get_for_model(self)
object_changes = ObjectChange.objects.filter(
changed_object_type=content_type,
@@ -31,5 +31,5 @@ class ChangelogMixin:
@strawberry.type
class SyncedDataMixin:
data_source: Annotated["DataSourceType", strawberry.lazy('core.graphql.types')] | None
data_file: Annotated["DataFileType", strawberry.lazy('core.graphql.types')] | None
data_source: Annotated['DataSourceType', strawberry.lazy('core.graphql.types')] | None
data_file: Annotated['DataFileType', strawberry.lazy('core.graphql.types')] | None

View File

@@ -0,0 +1,48 @@
# Generated by Django 5.2.5 on 2025-09-09 16:48
from django.db import migrations, models
def get_active(apps, schema_editor):
from django.core.cache import cache
ConfigRevision = apps.get_model('core', 'ConfigRevision')
version = None
revision = None
# Try and get the latest version from cache
try:
version = cache.get('config_version')
except Exception:
pass
# If there is a version in cache, attempt to set revision to the current version from cache
# If the version in cache does not exist or there is no version, try the latest revision in the database
if not version or (version and not (revision := ConfigRevision.objects.filter(pk=version).first())):
revision = ConfigRevision.objects.order_by('-created').first()
# If there is a revision set, set the active revision
if revision:
revision.active = True
revision.save()
class Migration(migrations.Migration):
dependencies = [
('core', '0018_concrete_objecttype'),
]
operations = [
migrations.AddField(
model_name='configrevision',
name='active',
field=models.BooleanField(default=False),
),
migrations.RunPython(code=get_active, reverse_code=migrations.RunPython.noop),
migrations.AddConstraint(
model_name='configrevision',
constraint=models.UniqueConstraint(
condition=models.Q(('active', True)), fields=('active',), name='unique_active_config_revision'
),
),
]

View File

@@ -14,6 +14,9 @@ class ConfigRevision(models.Model):
"""
An atomic revision of NetBox's configuration.
"""
active = models.BooleanField(
default=False
)
created = models.DateTimeField(
verbose_name=_('created'),
auto_now_add=True
@@ -35,6 +38,13 @@ class ConfigRevision(models.Model):
ordering = ['-created']
verbose_name = _('config revision')
verbose_name_plural = _('config revisions')
constraints = [
models.UniqueConstraint(
fields=('active',),
condition=models.Q(active=True),
name='unique_active_config_revision',
)
]
def __str__(self):
if not self.pk:
@@ -59,8 +69,13 @@ class ConfigRevision(models.Model):
"""
cache.set('config', self.data, None)
cache.set('config_version', self.pk, None)
# Set all instances of ConfigRevision to false and set this instance to true
ConfigRevision.objects.all().update(active=False)
ConfigRevision.objects.filter(pk=self.pk).update(active=True)
activate.alters_data = True
@property
def is_active(self):
return cache.get('config_version') == self.pk
return self.active

View File

@@ -6,7 +6,6 @@ from django.conf import settings
from django.core.exceptions import ValidationError
from django.db import models
from django.core.files.storage import storages
from django.urls import reverse
from django.utils.translation import gettext as _
from ..choices import ManagedFileRootPathChoices
@@ -64,9 +63,6 @@ class ManagedFile(SyncedDataMixin, models.Model):
def __str__(self):
return self.name
def get_absolute_url(self):
return reverse('core:managedfile', args=[self.pk])
@property
def name(self):
return self.file_path

View File

@@ -5,7 +5,7 @@ from django.contrib.contenttypes.models import ContentType
from django.contrib.postgres.fields import ArrayField
from django.contrib.postgres.indexes import GinIndex
from django.core.exceptions import ObjectDoesNotExist
from django.db import models
from django.db import connection, models
from django.db.models import Q
from django.utils.translation import gettext as _
@@ -66,6 +66,14 @@ class ObjectTypeManager(models.Manager):
"""
from netbox.models.features import get_model_features, model_is_public
# TODO: Remove this in NetBox v5.0
# If the ObjectType table has not yet been provisioned (e.g. because we're in a pre-v4.4 migration),
# fall back to ContentType.
if 'core_objecttype' not in connection.introspection.table_names():
ct = ContentType.objects.get_for_model(model, for_concrete_model=for_concrete_model)
ct.features = get_model_features(ct.model_class())
return ct
if not inspect.isclass(model):
model = model.__class__
opts = self._get_opts(model, for_concrete_model)

View File

@@ -1,6 +1,5 @@
import datetime
import importlib
import importlib.util
from dataclasses import dataclass, field
from typing import Optional

View File

@@ -3,6 +3,7 @@ from threading import local
from django.contrib.contenttypes.models import ContentType
from django.core.exceptions import ObjectDoesNotExist, ValidationError
from django.db.models import CASCADE
from django.db.models.fields.reverse_related import ManyToManyRel, ManyToOneRel
from django.db.models.signals import m2m_changed, post_migrate, post_save, pre_delete
from django.dispatch import receiver, Signal
@@ -220,14 +221,8 @@ def handle_deleted_object(sender, instance, **kwargs):
obj.snapshot() # Ensure the change record includes the "before" state
if type(relation) is ManyToManyRel:
getattr(obj, related_field_name).remove(instance)
elif type(relation) is ManyToOneRel and relation.field.null is True:
elif type(relation) is ManyToOneRel and relation.null and relation.on_delete is not CASCADE:
setattr(obj, related_field_name, None)
# make sure the object hasn't been deleted - in case of
# deletion chaining of related objects
try:
obj.refresh_from_db()
except DoesNotExist:
continue
obj.save()
# Enqueue the object for event processing

View File

@@ -4,6 +4,7 @@ import django_tables2 as tables
from core.models import *
from netbox.tables import NetBoxTable, columns
from .columns import BackendTypeColumn
from .template_code import DATA_SOURCE_SYNC_BUTTON
__all__ = (
'DataFileTable',
@@ -37,6 +38,9 @@ class DataSourceTable(NetBoxTable):
tags = columns.TagColumn(
url_name='core:datasource_list',
)
actions = columns.ActionsColumn(
extra_buttons=DATA_SOURCE_SYNC_BUTTON,
)
class Meta(NetBoxTable.Meta):
model = DataSource

View File

@@ -1,10 +1,8 @@
import django_tables2 as tables
from django.urls import reverse
from django.utils.safestring import mark_safe
from django.utils.translation import gettext_lazy as _
from netbox.tables import BaseTable, columns
from .template_code import PLUGIN_IS_INSTALLED
from .template_code import PLUGIN_IS_INSTALLED, PLUGIN_NAME_TEMPLATE
__all__ = (
'CatalogPluginTable',
@@ -12,12 +10,6 @@ __all__ = (
)
PLUGIN_NAME_TEMPLATE = """
<img class="plugin-icon" src="{{ record.icon_url }}">
<a href="{% url 'core:plugin' record.config_name %}">{{ record.title_long }}</a>
"""
class PluginVersionTable(BaseTable):
version = tables.Column(
verbose_name=_('Version')
@@ -61,6 +53,7 @@ class CatalogPluginTable(BaseTable):
verbose_name=_('Local')
)
is_installed = columns.TemplateColumn(
accessor=tables.A('is_loaded'),
verbose_name=_('Active'),
template_code=PLUGIN_IS_INSTALLED
)
@@ -93,10 +86,4 @@ class CatalogPluginTable(BaseTable):
)
# List installed plugins first, then certified plugins, then
# everything else (with each tranche ordered alphabetically)
order_by = ('-is_installed', '-is_certified', 'name')
def render_title_long(self, value, record):
if record.static:
return value
url = reverse('core:plugin', args=[record.config_name])
return mark_safe(f"<a href='{url}'>{value}</a>")
order_by = ('-is_installed', '-is_certified', 'title_long')

View File

@@ -26,3 +26,29 @@ PLUGIN_IS_INSTALLED = """
<span class="text-muted">&mdash;</span>
{% endif %}
"""
PLUGIN_NAME_TEMPLATE = """
{% load static %}
{% if record.icon_url %}
<img class="plugin-icon" src="{{ record.icon_url }}">
{% else %}
<img class="plugin-icon" src="{% static 'plugin-default.svg' %}">
{% endif %}
<a href="{% url 'core:plugin' record.config_name %}">{{ record.title_long }}</a>
"""
DATA_SOURCE_SYNC_BUTTON = """
{% load helpers %}
{% load i18n %}
{% if perms.core.sync_datasource %}
{% if record.ready_for_sync %}
<button class="btn btn-primary btn-sm" type="submit" formaction="{% url 'core:datasource_sync' pk=record.pk %}?return_url={{ request.get_full_path|urlencode }}" formmethod="post">
<i class="mdi mdi-sync" aria-hidden="true"></i> {% trans "Sync" %}
</button>
{% else %}
<button class="btn btn-primary btn-sm" disabled>
<i class="mdi mdi-sync" aria-hidden="true"></i> {% trans "Sync" %}
</button>
{% endif %}
{% endif %}
"""

View File

@@ -5,14 +5,16 @@ from rest_framework import status
from core.choices import ObjectChangeActionChoices
from core.models import ObjectChange, ObjectType
from dcim.choices import SiteStatusChoices
from dcim.models import Site, CableTermination, Device, DeviceType, DeviceRole, Interface, Cable
from dcim.choices import InterfaceTypeChoices, ModuleStatusChoices, SiteStatusChoices
from dcim.models import (
Cable, CableTermination, Device, DeviceRole, DeviceType, Manufacturer, Module, ModuleBay, ModuleType, Interface,
Site,
)
from extras.choices import *
from extras.models import CustomField, CustomFieldChoiceSet, Tag
from utilities.testing import APITestCase
from utilities.testing.utils import create_tags, post_data
from utilities.testing.utils import create_tags, create_test_device, post_data
from utilities.testing.views import ModelViewTestCase
from dcim.models import Manufacturer
class ChangeLogViewTest(ModelViewTestCase):
@@ -622,3 +624,64 @@ class ChangeLogAPITest(APITestCase):
self.assertEqual(objectchange.prechange_data['name'], 'Site 1')
self.assertEqual(objectchange.prechange_data['slug'], 'site-1')
self.assertEqual(objectchange.postchange_data, None)
def test_deletion_ordering(self):
"""
Check that the cascading deletion of dependent objects is recorded in the correct order.
"""
device = create_test_device('device1')
module_bay = ModuleBay.objects.create(device=device, name='Module Bay 1')
module_type = ModuleType.objects.create(manufacturer=Manufacturer.objects.first(), model='Module Type 1')
self.add_permissions('dcim.add_module', 'dcim.add_interface', 'dcim.delete_module')
self.assertEqual(ObjectChange.objects.count(), 0) # Sanity check
# Create a new Module
data = {
'device': device.pk,
'module_bay': module_bay.pk,
'module_type': module_type.pk,
'status': ModuleStatusChoices.STATUS_ACTIVE,
}
url = reverse('dcim-api:module-list')
response = self.client.post(url, data, format='json', **self.header)
self.assertHttpStatus(response, status.HTTP_201_CREATED)
module = device.modules.first()
# Create an Interface on the Module
data = {
'device': device.pk,
'module': module.pk,
'name': 'Interface 1',
'type': InterfaceTypeChoices.TYPE_1GE_FIXED,
}
url = reverse('dcim-api:interface-list')
response = self.client.post(url, data, format='json', **self.header)
self.assertHttpStatus(response, status.HTTP_201_CREATED)
interface = device.interfaces.first()
# Delete the Module
url = reverse('dcim-api:module-detail', kwargs={'pk': module.pk})
response = self.client.delete(url, **self.header)
self.assertHttpStatus(response, status.HTTP_204_NO_CONTENT)
self.assertEqual(Module.objects.count(), 0)
self.assertEqual(Interface.objects.count(), 0)
# Verify the creation of the expected ObjectChange records. We should see four total records, in this order:
# 1. Module created
# 2. Interface created
# 3. Interface deleted
# 4. Module deleted
changes = ObjectChange.objects.order_by('time')
self.assertEqual(len(changes), 4)
self.assertEqual(changes[0].changed_object_type, ContentType.objects.get_for_model(Module))
self.assertEqual(changes[0].changed_object_id, module.pk)
self.assertEqual(changes[0].action, ObjectChangeActionChoices.ACTION_CREATE)
self.assertEqual(changes[1].changed_object_type, ContentType.objects.get_for_model(Interface))
self.assertEqual(changes[1].changed_object_id, interface.pk)
self.assertEqual(changes[1].action, ObjectChangeActionChoices.ACTION_CREATE)
self.assertEqual(changes[2].changed_object_type, ContentType.objects.get_for_model(Interface))
self.assertEqual(changes[2].changed_object_id, interface.pk)
self.assertEqual(changes[2].action, ObjectChangeActionChoices.ACTION_DELETE)
self.assertEqual(changes[3].changed_object_type, ContentType.objects.get_for_model(Module))
self.assertEqual(changes[3].changed_object_id, module.pk)
self.assertEqual(changes[3].action, ObjectChangeActionChoices.ACTION_DELETE)

View File

@@ -33,7 +33,13 @@ from utilities.forms import ConfirmationForm
from utilities.htmx import htmx_partial
from utilities.json import ConfigJSONEncoder
from utilities.query import count_related
from utilities.views import ContentTypePermissionRequiredMixin, GetRelatedModelsMixin, ViewTab, register_model_view
from utilities.views import (
ContentTypePermissionRequiredMixin,
GetRelatedModelsMixin,
GetReturnURLMixin,
ViewTab,
register_model_view,
)
from . import filtersets, forms, tables
from .jobs import SyncDataSourceJob
from .models import *
@@ -66,7 +72,7 @@ class DataSourceView(GetRelatedModelsMixin, generic.ObjectView):
@register_model_view(DataSource, 'sync')
class DataSourceSyncView(BaseObjectView):
class DataSourceSyncView(GetReturnURLMixin, BaseObjectView):
queryset = DataSource.objects.all()
def get_required_permission(self):
@@ -85,7 +91,7 @@ class DataSourceSyncView(BaseObjectView):
request,
_("Queued job #{id} to sync {datasource}").format(id=job.pk, datasource=datasource)
)
return redirect(datasource.get_absolute_url())
return redirect(self.get_return_url(request, datasource))
@register_model_view(DataSource, 'add', detail=False)
@@ -119,6 +125,7 @@ class DataSourceBulkEditView(generic.BulkEditView):
@register_model_view(DataSource, 'bulk_rename', path='rename', detail=False)
class DataSourceBulkRenameView(generic.BulkRenameView):
queryset = DataSource.objects.all()
filterset = filtersets.DataSourceFilterSet
@register_model_view(DataSource, 'bulk_delete', path='delete', detail=False)

View File

@@ -889,22 +889,118 @@ class InterfaceTypeChoices(ChoiceSet):
TYPE_BRIDGE = 'bridge'
TYPE_LAG = 'lag'
# Ethernet
# FastEthernet
TYPE_100ME_FX = '100base-fx'
TYPE_100ME_LFX = '100base-lfx'
TYPE_100ME_FIXED = '100base-tx'
TYPE_100ME_FIXED = '100base-tx' # TODO: Rename to _TX
TYPE_100ME_T1 = '100base-t1'
# GigabitEthernet
TYPE_1GE_BX10_D = '1000base-bx10-d'
TYPE_1GE_BX10_U = '1000base-bx10-u'
TYPE_1GE_CWDM = '1000base-cwdm'
TYPE_1GE_CX = '1000base-cx'
TYPE_1GE_DWDM = '1000base-dwdm'
TYPE_1GE_EX = '1000base-ex'
TYPE_1GE_SX_FIXED = '1000base-sx' # TODO: Drop _FIXED suffix
TYPE_1GE_LSX = '1000base-lsx'
TYPE_1GE_LX_FIXED = '1000base-lx' # TODO: Drop _FIXED suffix
TYPE_1GE_LX10 = '1000base-lx10'
TYPE_1GE_FIXED = '1000base-t' # TODO: Rename to _T
TYPE_1GE_TX_FIXED = '1000base-tx' # TODO: Drop _FIXED suffix
TYPE_1GE_ZX = '1000base-zx'
# 2.5/5 Gbps Ethernet
TYPE_2GE_FIXED = '2.5gbase-t' # TODO: Rename to _T
TYPE_5GE_FIXED = '5gbase-t' # TODO: Rename to _T
# 10 Gbps Ethernet
TYPE_10GE_BR_D = '10gbase-br-d'
TYPE_10GE_BR_U = '10gbase-br-u'
TYPE_10GE_CX4 = '10gbase-cx4'
TYPE_10GE_ER = '10gbase-er'
TYPE_10GE_LR = '10gbase-lr'
TYPE_10GE_LRM = '10gbase-lrm'
TYPE_10GE_LX4 = '10gbase-lx4'
TYPE_10GE_SR = '10gbase-sr'
TYPE_10GE_FIXED = '10gbase-t'
TYPE_10GE_ZR = '10gbase-zr'
# 25 Gbps Ethernet
TYPE_25GE_CR = '25gbase-cr'
TYPE_25GE_ER = '25gbase-er'
TYPE_25GE_LR = '25gbase-lr'
TYPE_25GE_SR = '25gbase-sr'
TYPE_25GE_T = '25gbase-t'
# 40 Gbps Ethernet
TYPE_40GE_CR4 = '40gbase-cr4'
TYPE_40GE_ER4 = '40gbase-er4'
TYPE_40GE_FR4 = '40gbase-fr4'
TYPE_40GE_LR4 = '40gbase-lr4'
TYPE_40GE_SR4 = '40gbase-sr4'
# 50 Gbps Ethernet
TYPE_50GE_CR = '50gbase-cr'
TYPE_50GE_ER = '50gbase-er'
TYPE_50GE_FR = '50gbase-fr'
TYPE_50GE_LR = '50gbase-lr'
TYPE_50GE_SR = '50gbase-sr'
# 100 Gbps Ethernet
TYPE_100GE_CR1 = '100gbase-cr1'
TYPE_100GE_CR2 = '100gbase-cr2'
TYPE_100GE_CR4 = '100gbase-cr4'
TYPE_100GE_CR10 = '100gbase-cr10'
TYPE_100GE_CWDM4 = '100gbase-cwdm4'
TYPE_100GE_DR = '100gbase-dr'
TYPE_100GE_FR1 = '100gbase-fr1'
TYPE_100GE_ER4 = '100gbase-er4'
TYPE_100GE_LR1 = '100gbase-lr1'
TYPE_100GE_LR4 = '100gbase-lr4'
TYPE_100GE_SR1 = '100gbase-sr1'
TYPE_100GE_SR1_2 = '100gbase-sr1.2'
TYPE_100GE_SR2 = '100gbase-sr2'
TYPE_100GE_SR4 = '100gbase-sr4'
TYPE_100GE_SR10 = '100gbase-sr10'
TYPE_100GE_ZR = '100gbase-zr'
# 200 Gbps Ethernet
TYPE_200GE_CR2 = '200gbase-cr2'
TYPE_200GE_CR4 = '200gbase-cr4'
TYPE_200GE_SR2 = '200gbase-sr2'
TYPE_200GE_SR4 = '200gbase-sr4'
TYPE_200GE_DR4 = '200gbase-dr4'
TYPE_200GE_FR4 = '200gbase-fr4'
TYPE_200GE_LR4 = '200gbase-lr4'
TYPE_200GE_ER4 = '200gbase-er4'
TYPE_200GE_VR2 = '200gbase-vr2'
# 400 Gbps Ethernet
TYPE_400GE_CR4 = '400gbase-cr4'
TYPE_400GE_DR4 = '400gbase-dr4'
TYPE_400GE_ER8 = '400gbase-er8'
TYPE_400GE_FR4 = '400gbase-fr4'
TYPE_400GE_FR8 = '400gbase-fr8'
TYPE_400GE_LR4 = '400gbase-lr4'
TYPE_400GE_LR8 = '400gbase-lr8'
TYPE_400GE_SR4 = '400gbase-sr4'
TYPE_400GE_SR4_2 = '400gbase-sr4_2'
TYPE_400GE_SR8 = '400gbase-sr8'
TYPE_400GE_SR16 = '400gbase-sr16'
TYPE_400GE_VR4 = '400gbase-vr4'
TYPE_400GE_ZR = '400gbase-zr'
# 800 Gbps Ethernet
TYPE_800GE_CR8 = '800gbase-cr8'
TYPE_800GE_DR8 = '800gbase-dr8'
TYPE_800GE_SR8 = '800gbase-sr8'
TYPE_800GE_VR8 = '800gbase-vr8'
# Ethernet (modular)
TYPE_100ME_SFP = '100base-x-sfp'
TYPE_1GE_FIXED = '1000base-t'
TYPE_1GE_SX_FIXED = '1000base-sx'
TYPE_1GE_LX_FIXED = '1000base-lx'
TYPE_1GE_TX_FIXED = '1000base-tx'
TYPE_1GE_GBIC = '1000base-x-gbic'
TYPE_1GE_SFP = '1000base-x-sfp'
TYPE_2GE_FIXED = '2.5gbase-t'
TYPE_5GE_FIXED = '5gbase-t'
TYPE_10GE_FIXED = '10gbase-t'
TYPE_10GE_CX4 = '10gbase-cx4'
TYPE_10GE_SFP_PLUS = '10gbase-x-sfpp'
TYPE_10GE_XFP = '10gbase-x-xfp'
TYPE_10GE_XENPAK = '10gbase-x-xenpak'
@@ -935,7 +1031,7 @@ class InterfaceTypeChoices(ChoiceSet):
TYPE_800GE_QSFP_DD = '800gbase-x-qsfpdd'
TYPE_800GE_OSFP = '800gbase-x-osfp'
# Ethernet Backplane
# Backplane Ethernet
TYPE_1GE_KX = '1000base-kx'
TYPE_2GE_KX = '2.5gbase-kx'
TYPE_5GE_KR = '5gbase-kr'
@@ -1054,61 +1150,185 @@ class InterfaceTypeChoices(ChoiceSet):
),
),
(
_('Ethernet (fixed)'),
_('FastEthernet (100 Mbps)'),
(
(TYPE_100ME_FX, '100BASE-FX (10/100ME FIBER)'),
(TYPE_100ME_LFX, '100BASE-LFX (10/100ME FIBER)'),
(TYPE_100ME_FX, '100BASE-FX (10/100ME)'),
(TYPE_100ME_LFX, '100BASE-LFX (10/100ME)'),
(TYPE_100ME_FIXED, '100BASE-TX (10/100ME)'),
(TYPE_100ME_T1, '100BASE-T1 (10/100ME Single Pair)'),
(TYPE_1GE_FIXED, '1000BASE-T (1GE)'),
(TYPE_1GE_SX_FIXED, '1000BASE-SX (1GE)'),
(TYPE_100ME_T1, '100BASE-T1 (10/100ME)'),
),
),
(
_('GigabitEthernet (1 Gbps)'),
(
(TYPE_1GE_BX10_D, '1000BASE-BX10-D (1GE BiDi Down)'),
(TYPE_1GE_BX10_U, '1000BASE-BX10-U (1GE BiDi Up)'),
(TYPE_1GE_CWDM, '1000BASE-CWDM (1GE)'),
(TYPE_1GE_CX, '1000BASE-CX (1GE DAC)'),
(TYPE_1GE_DWDM, '1000BASE-DWDM (1GE)'),
(TYPE_1GE_EX, '1000BASE-EX (1GE)'),
(TYPE_1GE_LSX, '1000BASE-LSX (1GE)'),
(TYPE_1GE_LX_FIXED, '1000BASE-LX (1GE)'),
(TYPE_1GE_LX10, '1000BASE-LX10/LH (1GE)'),
(TYPE_1GE_SX_FIXED, '1000BASE-SX (1GE)'),
(TYPE_1GE_FIXED, '1000BASE-T (1GE)'),
(TYPE_1GE_TX_FIXED, '1000BASE-TX (1GE)'),
(TYPE_1GE_ZX, '1000BASE-ZX (1GE)'),
),
),
(
_('2.5/5 Gbps Ethernet'),
(
(TYPE_2GE_FIXED, '2.5GBASE-T (2.5GE)'),
(TYPE_5GE_FIXED, '5GBASE-T (5GE)'),
),
),
(
_('10 Gbps Ethernet'),
(
(TYPE_10GE_BR_D, '10GBASE-BR-D (10GE BiDi Down)'),
(TYPE_10GE_BR_U, '10GBASE-BR-U (10GE BiDi Up)'),
(TYPE_10GE_CX4, '10GBASE-CX4 (10GE DAC)'),
(TYPE_10GE_ER, '10GBASE-ER (10GE)'),
(TYPE_10GE_LR, '10GBASE-LR (10GE)'),
(TYPE_10GE_LRM, '10GBASE-LRM (10GE)'),
(TYPE_10GE_LX4, '10GBASE-LX4 (10GE)'),
(TYPE_10GE_SR, '10GBASE-SR (10GE)'),
(TYPE_10GE_FIXED, '10GBASE-T (10GE)'),
(TYPE_10GE_CX4, '10GBASE-CX4 (10GE)'),
(TYPE_10GE_ZR, '10GBASE-ZR (10GE)'),
)
),
(
_('Ethernet (modular)'),
_('25 Gbps Ethernet'),
(
(TYPE_25GE_CR, '25GBASE-CR (25GE DAC)'),
(TYPE_25GE_ER, '25GBASE-ER (25GE)'),
(TYPE_25GE_LR, '25GBASE-LR (25GE)'),
(TYPE_25GE_SR, '25GBASE-SR (25GE)'),
(TYPE_25GE_T, '25GBASE-T (25GE)'),
)
),
(
_('40 Gbps Ethernet'),
(
(TYPE_40GE_CR4, '40GBASE-CR4 (40GE DAC)'),
(TYPE_40GE_ER4, '40GBASE-ER4 (40GE)'),
(TYPE_40GE_FR4, '40GBASE-FR4 (40GE)'),
(TYPE_40GE_LR4, '40GBASE-LR4 (40GE)'),
(TYPE_40GE_SR4, '40GBASE-SR4 (40GE)'),
)
),
(
_('50 Gbps Ethernet'),
(
(TYPE_50GE_CR, '50GBASE-CR (50GE DAC)'),
(TYPE_50GE_ER, '50GBASE-ER (50GE)'),
(TYPE_50GE_FR, '50GBASE-FR (50GE)'),
(TYPE_50GE_LR, '50GBASE-LR (50GE)'),
(TYPE_50GE_SR, '50GBASE-SR (50GE)'),
)
),
(
_('100 Gbps Ethernet'),
(
(TYPE_100GE_CR1, '100GBASE-CR1 (100GE DAC)'),
(TYPE_100GE_CR2, '100GBASE-CR2 (100GE DAC)'),
(TYPE_100GE_CR4, '100GBASE-CR4 (100GE DAC)'),
(TYPE_100GE_CR10, '100GBASE-CR10 (100GE DAC)'),
(TYPE_100GE_CWDM4, '100GBASE-CWDM4 (100GE)'),
(TYPE_100GE_DR, '100GBASE-DR (100GE)'),
(TYPE_100GE_ER4, '100GBASE-ER4 (100GE)'),
(TYPE_100GE_FR1, '100GBASE-FR1 (100GE)'),
(TYPE_100GE_LR1, '100GBASE-LR1 (100GE)'),
(TYPE_100GE_LR4, '100GBASE-LR4 (100GE)'),
(TYPE_100GE_SR1, '100GBASE-SR1 (100GE)'),
(TYPE_100GE_SR1_2, '100GBASE-SR1.2 (100GE BiDi)'),
(TYPE_100GE_SR2, '100GBASE-SR2 (100GE)'),
(TYPE_100GE_SR4, '100GBASE-SR4 (100GE)'),
(TYPE_100GE_SR10, '100GBASE-SR10 (100GE)'),
(TYPE_100GE_ZR, '100GBASE-ZR (100GE)'),
)
),
(
_('200 Gbps Ethernet'),
(
(TYPE_200GE_CR2, '200GBASE-CR2 (200GE)'),
(TYPE_200GE_CR4, '200GBASE-CR4 (200GE)'),
(TYPE_200GE_DR4, '200GBASE-DR4 (200GE)'),
(TYPE_200GE_ER4, '200GBASE-ER4 (200GE)'),
(TYPE_200GE_FR4, '200GBASE-FR4 (200GE)'),
(TYPE_200GE_LR4, '200GBASE-LR4 (200GE)'),
(TYPE_200GE_SR2, '200GBASE-SR2 (200GE)'),
(TYPE_200GE_SR4, '200GBASE-SR4 (200GE)'),
(TYPE_200GE_VR2, '200GBASE-VR2 (200GE)'),
)
),
(
_('400 Gbps Ethernet'),
(
(TYPE_400GE_CR4, '400GBASE-CR4 (400GE)'),
(TYPE_400GE_DR4, '400GBASE-DR4 (400GE)'),
(TYPE_400GE_ER8, '400GBASE-ER8 (400GE)'),
(TYPE_400GE_FR4, '400GBASE-FR4 (400GE)'),
(TYPE_400GE_FR8, '400GBASE-FR8 (400GE)'),
(TYPE_400GE_LR4, '400GBASE-LR4 (400GE)'),
(TYPE_400GE_LR8, '400GBASE-LR8 (400GE)'),
(TYPE_400GE_SR4, '400GBASE-SR4 (400GE)'),
(TYPE_400GE_SR4_2, '400GBASE-SR4.2 (400GE BiDi)'),
(TYPE_400GE_SR8, '400GBASE-SR8 (400GE)'),
(TYPE_400GE_SR16, '400GBASE-SR16 (400GE)'),
(TYPE_400GE_VR4, '400GBASE-VR4 (400GE)'),
(TYPE_400GE_ZR, '400GBASE-ZR (400GE)'),
)
),
(
_('800 Gbps Ethernet'),
(
(TYPE_800GE_CR8, '800GBASE-CR8 (800GE)'),
(TYPE_800GE_DR8, '800GBASE-DR8 (800GE)'),
(TYPE_800GE_SR8, '800GBASE-SR8 (800GE)'),
(TYPE_800GE_VR8, '800GBASE-VR8 (800GE)'),
)
),
(
_('Pluggable transceivers'),
(
(TYPE_100ME_SFP, 'SFP (100ME)'),
(TYPE_1GE_GBIC, 'GBIC (1GE)'),
(TYPE_1GE_SFP, 'SFP (1GE)'),
(TYPE_10GE_SFP_PLUS, 'SFP+ (10GE)'),
(TYPE_10GE_XFP, 'XFP (10GE)'),
(TYPE_10GE_XENPAK, 'XENPAK (10GE)'),
(TYPE_10GE_XFP, 'XFP (10GE)'),
(TYPE_10GE_X2, 'X2 (10GE)'),
(TYPE_25GE_SFP28, 'SFP28 (25GE)'),
(TYPE_50GE_SFP56, 'SFP56 (50GE)'),
(TYPE_40GE_QSFP_PLUS, 'QSFP+ (40GE)'),
(TYPE_50GE_QSFP28, 'QSFP28 (50GE)'),
(TYPE_50GE_SFP56, 'SFP56 (50GE)'),
(TYPE_100GE_CFP, 'CFP (100GE)'),
(TYPE_100GE_CFP2, 'CFP2 (100GE)'),
(TYPE_200GE_CFP2, 'CFP2 (200GE)'),
(TYPE_400GE_CFP2, 'CFP2 (400GE)'),
(TYPE_100GE_CFP4, 'CFP4 (100GE)'),
(TYPE_100GE_CXP, 'CXP (100GE)'),
(TYPE_100GE_CPAK, 'Cisco CPAK (100GE)'),
(TYPE_100GE_DSFP, 'DSFP (100GE)'),
(TYPE_100GE_SFP_DD, 'SFP-DD (100GE)'),
(TYPE_100GE_QSFP28, 'QSFP28 (100GE)'),
(TYPE_100GE_QSFP_DD, 'QSFP-DD (100GE)'),
(TYPE_100GE_SFP_DD, 'SFP-DD (100GE)'),
(TYPE_200GE_CFP2, 'CFP2 (200GE)'),
(TYPE_200GE_QSFP56, 'QSFP56 (200GE)'),
(TYPE_200GE_QSFP_DD, 'QSFP-DD (200GE)'),
(TYPE_400GE_QSFP112, 'QSFP112 (400GE)'),
(TYPE_400GE_QSFP_DD, 'QSFP-DD (400GE)'),
(TYPE_400GE_CDFP, 'CDFP (400GE)'),
(TYPE_400GE_CFP2, 'CFP2 (400GE)'),
(TYPE_400GE_CFP8, 'CFP8 (400GE)'),
(TYPE_400GE_OSFP, 'OSFP (400GE)'),
(TYPE_400GE_OSFP_RHS, 'OSFP-RHS (400GE)'),
(TYPE_400GE_CDFP, 'CDFP (400GE)'),
(TYPE_400GE_CFP8, 'CFP8 (400GE)'),
(TYPE_800GE_QSFP_DD, 'QSFP-DD (800GE)'),
(TYPE_800GE_OSFP, 'OSFP (800GE)'),
(TYPE_800GE_QSFP_DD, 'QSFP-DD (800GE)'),
)
),
(
_('Ethernet (backplane)'),
_('Backplane Ethernet'),
(
(TYPE_1GE_KX, '1000BASE-KX (1GE)'),
(TYPE_2GE_KX, '2.5GBASE-KX (2.5GE)'),
@@ -1128,12 +1348,12 @@ class InterfaceTypeChoices(ChoiceSet):
(
(TYPE_80211A, 'IEEE 802.11a'),
(TYPE_80211G, 'IEEE 802.11b/g'),
(TYPE_80211N, 'IEEE 802.11n'),
(TYPE_80211AC, 'IEEE 802.11ac'),
(TYPE_80211AD, 'IEEE 802.11ad'),
(TYPE_80211AX, 'IEEE 802.11ax'),
(TYPE_80211AY, 'IEEE 802.11ay'),
(TYPE_80211BE, 'IEEE 802.11be'),
(TYPE_80211N, 'IEEE 802.11n (Wi-Fi 4)'),
(TYPE_80211AC, 'IEEE 802.11ac (Wi-Fi 5)'),
(TYPE_80211AD, 'IEEE 802.11ad (WiGig)'),
(TYPE_80211AX, 'IEEE 802.11ax (Wi-Fi 6)'),
(TYPE_80211AY, 'IEEE 802.11ay (WiGig)'),
(TYPE_80211BE, 'IEEE 802.11be (Wi-Fi 7)'),
(TYPE_802151, 'IEEE 802.15.1 (Bluetooth)'),
(TYPE_802154, 'IEEE 802.15.4 (LR-WPAN)'),
(TYPE_OTHER_WIRELESS, 'Other (Wireless)'),
@@ -1497,8 +1717,9 @@ class PortTypeChoices(ChoiceSet):
# Cables/links
#
class CableTypeChoices(ChoiceSet):
class CableTypeChoices(ChoiceSet):
# Copper - Twisted Pair (UTP/STP)
TYPE_CAT3 = 'cat3'
TYPE_CAT5 = 'cat5'
TYPE_CAT5E = 'cat5e'
@@ -1507,26 +1728,50 @@ class CableTypeChoices(ChoiceSet):
TYPE_CAT7 = 'cat7'
TYPE_CAT7A = 'cat7a'
TYPE_CAT8 = 'cat8'
TYPE_MRJ21_TRUNK = 'mrj21-trunk'
# Copper - Twinax (DAC)
TYPE_DAC_ACTIVE = 'dac-active'
TYPE_DAC_PASSIVE = 'dac-passive'
TYPE_MRJ21_TRUNK = 'mrj21-trunk'
# Copper - Coaxial
TYPE_COAXIAL = 'coaxial'
TYPE_RG_6 = 'rg-6'
TYPE_RG_8 = 'rg-8'
TYPE_RG_11 = 'rg-11'
TYPE_RG_59 = 'rg-59'
TYPE_RG_62 = 'rg-62'
TYPE_RG_213 = 'rg-213'
TYPE_LMR_100 = 'lmr-100'
TYPE_LMR_200 = 'lmr-200'
TYPE_LMR_400 = 'lmr-400'
# Fiber Optic - Multimode
TYPE_MMF = 'mmf'
TYPE_MMF_OM1 = 'mmf-om1'
TYPE_MMF_OM2 = 'mmf-om2'
TYPE_MMF_OM3 = 'mmf-om3'
TYPE_MMF_OM4 = 'mmf-om4'
TYPE_MMF_OM5 = 'mmf-om5'
# Fiber Optic - Single-mode
TYPE_SMF = 'smf'
TYPE_SMF_OS1 = 'smf-os1'
TYPE_SMF_OS2 = 'smf-os2'
# Fiber Optic - Other
TYPE_AOC = 'aoc'
# Power
TYPE_POWER = 'power'
# USB
TYPE_USB = 'usb'
CHOICES = (
(
_('Copper'), (
_('Copper - Twisted Pair (UTP/STP)'),
(
(TYPE_CAT3, 'CAT3'),
(TYPE_CAT5, 'CAT5'),
(TYPE_CAT5E, 'CAT5e'),
@@ -1535,28 +1780,66 @@ class CableTypeChoices(ChoiceSet):
(TYPE_CAT7, 'CAT7'),
(TYPE_CAT7A, 'CAT7a'),
(TYPE_CAT8, 'CAT8'),
(TYPE_DAC_ACTIVE, 'Direct Attach Copper (Active)'),
(TYPE_DAC_PASSIVE, 'Direct Attach Copper (Passive)'),
(TYPE_MRJ21_TRUNK, 'MRJ21 Trunk'),
(TYPE_COAXIAL, 'Coaxial'),
),
),
(
_('Fiber'), (
_('Copper - Twinax (DAC)'),
(
(TYPE_DAC_ACTIVE, 'Direct Attach Copper (Active)'),
(TYPE_DAC_PASSIVE, 'Direct Attach Copper (Passive)'),
),
),
(
_('Copper - Coaxial'),
(
(TYPE_COAXIAL, 'Coaxial'),
(TYPE_RG_6, 'RG-6'),
(TYPE_RG_8, 'RG-8'),
(TYPE_RG_11, 'RG-11'),
(TYPE_RG_59, 'RG-59'),
(TYPE_RG_62, 'RG-62'),
(TYPE_RG_213, 'RG-213'),
(TYPE_LMR_100, 'LMR-100'),
(TYPE_LMR_200, 'LMR-200'),
(TYPE_LMR_400, 'LMR-400'),
),
),
(
_('Fiber - Multimode'),
(
(TYPE_MMF, 'Multimode Fiber'),
(TYPE_MMF_OM1, 'Multimode Fiber (OM1)'),
(TYPE_MMF_OM2, 'Multimode Fiber (OM2)'),
(TYPE_MMF_OM3, 'Multimode Fiber (OM3)'),
(TYPE_MMF_OM4, 'Multimode Fiber (OM4)'),
(TYPE_MMF_OM5, 'Multimode Fiber (OM5)'),
(TYPE_SMF, 'Singlemode Fiber'),
(TYPE_SMF_OS1, 'Singlemode Fiber (OS1)'),
(TYPE_SMF_OS2, 'Singlemode Fiber (OS2)'),
(TYPE_AOC, 'Active Optical Cabling (AOC)'),
),
),
(TYPE_USB, _('USB')),
(TYPE_POWER, _('Power')),
(
_('Fiber - Single-mode'),
(
(TYPE_SMF, 'Single-mode Fiber'),
(TYPE_SMF_OS1, 'Single-mode Fiber (OS1)'),
(TYPE_SMF_OS2, 'Single-mode Fiber (OS2)'),
),
),
(
_('Fiber - Other'),
((TYPE_AOC, 'Active Optical Cabling (AOC)'),),
),
(
_('Power'),
(
(TYPE_POWER, 'Power'),
),
),
(
_('USB'),
(
(TYPE_USB, 'USB'),
),
),
)
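
The regrouped CHOICES above keep Django's standard nested-choices shape: each entry is either a (group label, ((value, label), ...)) pair or a flat (value, label) pair. A minimal, dependency-free sketch of walking such a structure to collect the raw slugs (illustrative only, not part of this change):

def iter_choice_values(choices):
    """Yield raw values from a Django-style choices tuple, grouped or flat."""
    for first, second in choices:
        if isinstance(second, (list, tuple)):
            # Grouped entry: (group label, ((value, label), ...))
            for value, _label in second:
                yield value
        else:
            # Flat entry: (value, label)
            yield first

# e.g. sorted(set(iter_choice_values(CableTypeChoices.CHOICES))) lists every cable type slug once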

View File

@@ -26,7 +26,7 @@ class eui64_unix_expanded_uppercase(eui64_unix_expanded):
#
class MACAddressField(models.Field):
description = "PostgreSQL MAC Address field"
description = 'PostgreSQL MAC Address field'
def python_type(self):
return EUI
@@ -34,6 +34,9 @@ class MACAddressField(models.Field):
def from_db_value(self, value, expression, connection):
return self.to_python(value)
def get_internal_type(self):
return 'CharField'
def to_python(self, value):
if value is None:
return value
@@ -54,7 +57,7 @@ class MACAddressField(models.Field):
class WWNField(models.Field):
description = "World Wide Name field"
description = 'World Wide Name field'
def python_type(self):
return EUI
@@ -62,6 +65,9 @@ class WWNField(models.Field):
def from_db_value(self, value, expression, connection):
return self.to_python(value)
def get_internal_type(self):
return 'CharField'
def to_python(self, value):
if value is None:
return value
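
For context on the two fields above: values are stored as text (get_internal_type() reports CharField) while to_python() hands back netaddr EUI objects. A small standalone sketch of the normalization netaddr provides, assuming only the netaddr package (this is not NetBox code):

from netaddr import EUI, mac_unix_expanded

raw = 'AA:BB:CC:DD:EE:FF'                  # hypothetical input
mac = EUI(raw, dialect=mac_unix_expanded)  # parse and normalize
print(str(mac))                            # 'aa:bb:cc:dd:ee:ff'
print(int(mac))                            # integer form, handy for ordering and comparison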

View File

@@ -14,16 +14,16 @@ from netbox.filtersets import (
AttributeFiltersMixin, BaseFilterSet, ChangeLoggedModelFilterSet, NestedGroupModelFilterSet, NetBoxModelFilterSet,
OrganizationalModelFilterSet,
)
from tenancy.filtersets import TenancyFilterSet, ContactModelFilterSet
from tenancy.filtersets import ContactModelFilterSet, TenancyFilterSet
from tenancy.models import *
from users.models import User
from utilities.filters import (
ContentTypeFilter, MultiValueCharFilter, MultiValueMACAddressFilter, MultiValueNumberFilter, MultiValueWWNFilter,
NumericArrayFilter, TreeNodeMultipleChoiceFilter,
)
from virtualization.models import Cluster, ClusterGroup, VMInterface, VirtualMachine
from virtualization.models import Cluster, ClusterGroup, VirtualMachine, VMInterface
from vpn.models import L2VPN
from wireless.choices import WirelessRoleChoices, WirelessChannelChoices
from wireless.choices import WirelessChannelChoices, WirelessRoleChoices
from wireless.models import WirelessLAN, WirelessLink
from .choices import *
from .constants import *
@@ -1288,7 +1288,6 @@ class DeviceFilterSet(
Q(name__icontains=value) |
Q(virtual_chassis__name__icontains=value) |
Q(serial__icontains=value.strip()) |
Q(inventoryitems__serial__icontains=value.strip()) |
Q(asset_tag__icontains=value.strip()) |
Q(description__icontains=value.strip()) |
Q(comments__icontains=value) |
@@ -1764,6 +1763,7 @@ class PowerOutletFilterSet(
class MACAddressFilterSet(NetBoxModelFilterSet):
mac_address = MultiValueMACAddressFilter()
assigned_object_type = ContentTypeFilter()
device = MultiValueCharFilter(
method='filter_device',
field_name='name',
@@ -1806,6 +1806,14 @@ class MACAddressFilterSet(NetBoxModelFilterSet):
queryset=VMInterface.objects.all(),
label=_('VM interface (ID)'),
)
assigned = django_filters.BooleanFilter(
method='filter_assigned',
label=_('Is assigned'),
)
primary = django_filters.BooleanFilter(
method='filter_primary',
label=_('Is primary'),
)
class Meta:
model = MACAddress
@@ -1842,6 +1850,29 @@ class MACAddressFilterSet(NetBoxModelFilterSet):
vminterface__in=interface_ids
)
def filter_assigned(self, queryset, name, value):
params = {
'assigned_object_type__isnull': True,
'assigned_object_id__isnull': True,
}
if value:
return queryset.exclude(**params)
else:
return queryset.filter(**params)
def filter_primary(self, queryset, name, value):
interface_mac_ids = Interface.objects.filter(primary_mac_address_id__isnull=False).values_list(
'primary_mac_address_id', flat=True
)
vminterface_mac_ids = VMInterface.objects.filter(primary_mac_address_id__isnull=False).values_list(
'primary_mac_address_id', flat=True
)
query = Q(pk__in=interface_mac_ids) | Q(pk__in=vminterface_mac_ids)
if value:
return queryset.filter(query)
else:
return queryset.exclude(query)
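
Together, filter_assigned() and filter_primary() let MAC addresses be narrowed by assignment state. A quick sketch of how the new filters might be exercised, following the same pattern the filter set tests further down use:

from dcim.filtersets import MACAddressFilterSet
from dcim.models import MACAddress

queryset = MACAddress.objects.all()
unassigned = MACAddressFilterSet({'assigned': False}, queryset=queryset).qs  # no interface or VM interface
primaries = MACAddressFilterSet({'primary': True}, queryset=queryset).qs     # primary MAC of some interface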
class CommonInterfaceFilterSet(django_filters.FilterSet):
mode = django_filters.MultipleChoiceFilter(

View File

@@ -133,6 +133,11 @@ class SiteBulkEditForm(NetBoxModelBulkEditForm):
queryset=Tenant.objects.all(),
required=False
)
facility = forms.CharField(
label=_('Facility'),
max_length=50,
required=False
)
asns = DynamicModelMultipleChoiceField(
queryset=ASN.objects.all(),
label=_('ASNs'),
@@ -166,10 +171,10 @@ class SiteBulkEditForm(NetBoxModelBulkEditForm):
model = Site
fieldsets = (
FieldSet('status', 'region', 'group', 'tenant', 'asns', 'time_zone', 'description'),
FieldSet('status', 'region', 'group', 'tenant', 'facility', 'asns', 'time_zone', 'description'),
)
nullable_fields = (
'region', 'group', 'tenant', 'asns', 'time_zone', 'description', 'comments',
'region', 'group', 'tenant', 'facility', 'asns', 'time_zone', 'description', 'comments',
)
@@ -198,6 +203,11 @@ class LocationBulkEditForm(NetBoxModelBulkEditForm):
queryset=Tenant.objects.all(),
required=False
)
facility = forms.CharField(
label=_('Facility'),
max_length=50,
required=False
)
description = forms.CharField(
label=_('Description'),
max_length=200,
@@ -207,9 +217,9 @@ class LocationBulkEditForm(NetBoxModelBulkEditForm):
model = Location
fieldsets = (
FieldSet('site', 'parent', 'status', 'tenant', 'description'),
FieldSet('site', 'parent', 'status', 'tenant', 'facility', 'description'),
)
nullable_fields = ('parent', 'tenant', 'description', 'comments')
nullable_fields = ('parent', 'tenant', 'facility', 'description', 'comments')
class RackRoleBulkEditForm(NetBoxModelBulkEditForm):

View File

@@ -9,7 +9,8 @@ from dcim.choices import *
from dcim.constants import *
from dcim.models import *
from extras.models import ConfigTemplate
from ipam.models import VRF, IPAddress
from ipam.choices import VLANQinQRoleChoices
from ipam.models import VLAN, VRF, IPAddress, VLANGroup
from netbox.choices import *
from netbox.forms import NetBoxModelImportForm
from tenancy.models import Tenant
@@ -17,7 +18,7 @@ from utilities.forms.fields import (
CSVChoiceField, CSVContentTypeField, CSVModelChoiceField, CSVModelMultipleChoiceField, CSVTypedChoiceField,
SlugField,
)
from virtualization.models import Cluster, VMInterface, VirtualMachine
from virtualization.models import Cluster, VirtualMachine, VMInterface
from wireless.choices import WirelessRoleChoices
from .common import ModuleCommonForm
@@ -938,7 +939,7 @@ class InterfaceImportForm(NetBoxModelImportForm):
required=False,
to_field_name='name',
help_text=mark_safe(
_('VDC names separated by commas, encased with double quotes. Example:') + ' <code>vdc1,vdc2,vdc3</code>'
_('VDC names separated by commas, encased with double quotes. Example:') + ' <code>"vdc1,vdc2,vdc3"</code>'
)
)
type = CSVChoiceField(
@@ -967,7 +968,41 @@ class InterfaceImportForm(NetBoxModelImportForm):
label=_('Mode'),
choices=InterfaceModeChoices,
required=False,
help_text=_('IEEE 802.1Q operational mode (for L2 interfaces)')
help_text=_('IEEE 802.1Q operational mode (for L2 interfaces)'),
)
vlan_group = CSVModelChoiceField(
label=_('VLAN group'),
queryset=VLANGroup.objects.all(),
required=False,
to_field_name='name',
help_text=_('Filter VLANs available for assignment by group'),
)
untagged_vlan = CSVModelChoiceField(
label=_('Untagged VLAN'),
queryset=VLAN.objects.all(),
required=False,
to_field_name='vid',
help_text=_('Assigned untagged VLAN ID (filtered by VLAN group)'),
)
tagged_vlans = CSVModelMultipleChoiceField(
label=_('Tagged VLANs'),
queryset=VLAN.objects.all(),
required=False,
to_field_name='vid',
help_text=mark_safe(
_(
'Assigned tagged VLAN IDs separated by commas, encased with double quotes '
'(filtered by VLAN group). Example:'
)
+ ' <code>"100,200,300"</code>'
),
)
qinq_svlan = CSVModelChoiceField(
label=_('Q-in-Q Service VLAN'),
queryset=VLAN.objects.filter(qinq_role=VLANQinQRoleChoices.ROLE_SERVICE),
required=False,
to_field_name='vid',
help_text=_('Assigned Q-in-Q Service VLAN ID (filtered by VLAN group)'),
)
vrf = CSVModelChoiceField(
label=_('VRF'),
@@ -988,7 +1023,8 @@ class InterfaceImportForm(NetBoxModelImportForm):
fields = (
'device', 'name', 'label', 'parent', 'bridge', 'lag', 'type', 'speed', 'duplex', 'enabled',
'mark_connected', 'wwn', 'vdcs', 'mtu', 'mgmt_only', 'description', 'poe_mode', 'poe_type', 'mode',
'vrf', 'rf_role', 'rf_channel', 'rf_channel_frequency', 'rf_channel_width', 'tx_power', 'tags'
'vlan_group', 'untagged_vlan', 'tagged_vlans', 'qinq_svlan', 'vrf', 'rf_role', 'rf_channel',
'rf_channel_frequency', 'rf_channel_width', 'tx_power', 'tags'
)
def __init__(self, data=None, *args, **kwargs):
@@ -1005,6 +1041,13 @@ class InterfaceImportForm(NetBoxModelImportForm):
self.fields['lag'].queryset = self.fields['lag'].queryset.filter(**params)
self.fields['vdcs'].queryset = self.fields['vdcs'].queryset.filter(**params)
# Limit choices for VLANs to the assigned VLAN group
if vlan_group := data.get('vlan_group'):
params = {f"group__{self.fields['vlan_group'].to_field_name}": vlan_group}
self.fields['untagged_vlan'].queryset = self.fields['untagged_vlan'].queryset.filter(**params)
self.fields['tagged_vlans'].queryset = self.fields['tagged_vlans'].queryset.filter(**params)
self.fields['qinq_svlan'].queryset = self.fields['qinq_svlan'].queryset.filter(**params)
def clean_enabled(self):
# Make sure enabled is True when it's not included in the uploaded data
if 'enabled' not in self.data:
@@ -1181,7 +1224,7 @@ class InventoryItemImportForm(NetBoxModelImportForm):
help_text=_('Component Type')
)
component_name = forms.CharField(
label=_('Compnent name'),
label=_('Component name'),
required=False,
help_text=_('Component Name')
)
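
The new VLAN columns on InterfaceImportForm take VIDs rather than names, and vlan_group (when given) narrows which VLANs those VIDs may resolve to. An illustrative CSV record; the device, group name, and VIDs here are invented:

device,name,type,mode,vlan_group,untagged_vlan,tagged_vlans,qinq_svlan
access-sw-01,GigabitEthernet0/1,1000base-t,tagged,Campus VLANs,100,"200,300,400",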

View File

@@ -1676,12 +1676,16 @@ class MACAddressFilterForm(NetBoxModelFilterSetForm):
model = MACAddress
fieldsets = (
FieldSet('q', 'filter_id', 'tag'),
FieldSet('mac_address', 'device_id', 'virtual_machine_id', name=_('MAC address')),
FieldSet('mac_address', name=_('Attributes')),
FieldSet(
'device_id', 'virtual_machine_id', 'assigned', 'primary',
name=_('Assignments'),
),
)
selector_fields = ('filter_id', 'q', 'device_id', 'virtual_machine_id')
mac_address = forms.CharField(
required=False,
label=_('MAC address')
label=_('MAC address'),
)
device_id = DynamicModelMultipleChoiceField(
queryset=Device.objects.all(),
@@ -1693,6 +1697,20 @@ class MACAddressFilterForm(NetBoxModelFilterSetForm):
required=False,
label=_('Assigned VM'),
)
assigned = forms.NullBooleanField(
required=False,
label=_('Assigned to an interface'),
widget=forms.Select(
choices=BOOLEAN_WITH_BLANK_CHOICES
),
)
primary = forms.NullBooleanField(
required=False,
label=_('Primary MAC of an interface'),
widget=forms.Select(
choices=BOOLEAN_WITH_BLANK_CHOICES
),
)
tag = TagFilterField(model)

View File

@@ -1,6 +1,6 @@
from django import forms
from django.contrib.contenttypes.models import ContentType
from django.core.exceptions import ObjectDoesNotExist
from django.core.exceptions import ObjectDoesNotExist, ValidationError
from django.utils.translation import gettext_lazy as _
from dcim.constants import LOCATION_SCOPE_TYPES
@@ -48,8 +48,17 @@ class ScopedForm(forms.Form):
def clean(self):
super().clean()
scope = self.cleaned_data.get('scope')
scope_type = self.cleaned_data.get('scope_type')
if scope_type and not scope:
raise ValidationError({
'scope': _(
"Please select a {scope_type}."
).format(scope_type=scope_type.model_class()._meta.model_name)
})
# Assign the selected scope (if any)
self.instance.scope = self.cleaned_data.get('scope')
self.instance.scope = scope
def _set_scoped_values(self):
if scope_type_id := get_field_value(self, 'scope_type'):
@@ -107,3 +116,15 @@ class ScopedImportForm(forms.Form):
required=False,
label=_('Scope type (app & model)')
)
def clean(self):
super().clean()
scope_id = self.cleaned_data.get('scope_id')
scope_type = self.cleaned_data.get('scope_type')
if scope_type and not scope_id:
raise ValidationError({
'scope_id': _(
"Please select a {scope_type}."
).format(scope_type=scope_type.model_class()._meta.model_name)
})

View File

@@ -755,7 +755,10 @@ class ModuleForm(ModuleCommonForm, NetBoxModelForm):
queryset=ModuleBay.objects.all(),
query_params={
'device_id': '$device'
}
},
context={
'disabled': 'installed_module',
},
)
module_type = DynamicModelChoiceField(
label=_('Module type'),

View File

@@ -453,6 +453,7 @@ class VirtualChassisCreateForm(NetBoxModelForm):
if instance.pk and self.cleaned_data['members']:
initial_position = self.cleaned_data.get('initial_position', 1)
for i, member in enumerate(self.cleaned_data['members'], start=initial_position):
member.snapshot()
member.virtual_chassis = instance
member.vc_position = i
member.save()
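
The added member.snapshot() call follows NetBox's usual change-logging pattern: capture the object's pre-change state before mutating and saving it, so the resulting change record carries a proper "before" snapshot. A minimal sketch of the same pattern outside this form (the device name is hypothetical):

member = Device.objects.get(name='member-1')  # hypothetical VC member
member.snapshot()                             # cache pre-change state for the changelog
member.vc_position = 2
member.save()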

View File

@@ -12,6 +12,7 @@ __all__ = (
'DeviceFaceEnum',
'DeviceStatusEnum',
'InterfaceDuplexEnum',
'InterfaceKindEnum',
'InterfaceModeEnum',
'InterfacePoEModeEnum',
'InterfacePoETypeEnum',
@@ -48,6 +49,7 @@ DeviceAirflowEnum = strawberry.enum(DeviceAirflowChoices.as_enum(prefix='airflow
DeviceFaceEnum = strawberry.enum(DeviceFaceChoices.as_enum(prefix='face'))
DeviceStatusEnum = strawberry.enum(DeviceStatusChoices.as_enum(prefix='status'))
InterfaceDuplexEnum = strawberry.enum(InterfaceDuplexChoices.as_enum(prefix='duplex'))
InterfaceKindEnum = strawberry.enum(InterfaceKindChoices.as_enum(prefix='kind'))
InterfaceModeEnum = strawberry.enum(InterfaceModeChoices.as_enum(prefix='mode'))
InterfacePoEModeEnum = strawberry.enum(InterfacePoEModeChoices.as_enum(prefix='mode'))
InterfacePoETypeEnum = strawberry.enum(InterfacePoETypeChoices.as_enum())

View File

@@ -1,5 +1,6 @@
from typing import Annotated, TYPE_CHECKING
from django.db.models import Q
import strawberry
import strawberry_django
from strawberry.scalars import ID
@@ -7,6 +8,8 @@ from strawberry_django import FilterLookup
from core.graphql.filter_mixins import ChangeLogFilterMixin
from dcim import models
from dcim.constants import *
from dcim.graphql.enums import InterfaceKindEnum
from extras.graphql.filter_mixins import ConfigContextFilterMixin
from netbox.graphql.filter_mixins import (
PrimaryModelFilterMixin,
@@ -15,7 +18,9 @@ from netbox.graphql.filter_mixins import (
ImageAttachmentFilterMixin,
WeightFilterMixin,
)
from tenancy.graphql.filter_mixins import TenancyFilterMixin, ContactFilterMixin
from tenancy.graphql.filter_mixins import ContactFilterMixin, TenancyFilterMixin
from virtualization.models import VMInterface
from .filter_mixins import (
CabledObjectModelFilterMixin,
ComponentModelFilterMixin,
@@ -416,6 +421,24 @@ class MACAddressFilter(PrimaryModelFilterMixin):
)
assigned_object_id: ID | None = strawberry_django.filter_field()
@strawberry_django.filter_field()
def assigned(self, value: bool, prefix) -> Q:
return Q(**{f'{prefix}assigned_object_id__isnull': (not value)})
@strawberry_django.filter_field()
def primary(self, value: bool, prefix) -> Q:
interface_mac_ids = models.Interface.objects.filter(primary_mac_address_id__isnull=False).values_list(
'primary_mac_address_id', flat=True
)
vminterface_mac_ids = VMInterface.objects.filter(primary_mac_address_id__isnull=False).values_list(
'primary_mac_address_id', flat=True
)
query = Q(**{f'{prefix}pk__in': interface_mac_ids}) | Q(**{f'{prefix}pk__in': vminterface_mac_ids})
if value:
return Q(query)
else:
return ~Q(query)
@strawberry_django.filter_type(models.Interface, lookups=True)
class InterfaceFilter(ModularComponentModelFilterMixin, InterfaceBaseFilterMixin, CabledObjectModelFilterMixin):
@@ -485,6 +508,27 @@ class InterfaceFilter(ModularComponentModelFilterMixin, InterfaceBaseFilterMixin
strawberry_django.filter_field()
)
@strawberry_django.filter_field
def connected(self, queryset, value: bool, prefix: str):
if value is True:
return queryset, Q(**{f"{prefix}_path__is_active": True})
else:
return queryset, Q(**{f"{prefix}_path__isnull": True}) | Q(**{f"{prefix}_path__is_active": False})
@strawberry_django.filter_field
def kind(
self,
queryset,
value: Annotated['InterfaceKindEnum', strawberry.lazy('dcim.graphql.enums')],
prefix: str
):
if value == InterfaceKindEnum.KIND_PHYSICAL:
return queryset, ~Q(**{f"{prefix}type__in": NONCONNECTABLE_IFACE_TYPES})
elif value == InterfaceKindEnum.KIND_VIRTUAL:
return queryset, Q(**{f"{prefix}type__in": VIRTUAL_IFACE_TYPES})
elif value == InterfaceKindEnum.KIND_WIRELESS:
return queryset, Q(**{f"{prefix}type__in": WIRELESS_IFACE_TYPES})
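
The kind filter above reuses the same type-set constants the ORM already relies on. For reference, a hedged sketch of the equivalent ORM queries (illustrative only, outside the GraphQL layer):

from dcim.constants import NONCONNECTABLE_IFACE_TYPES, VIRTUAL_IFACE_TYPES, WIRELESS_IFACE_TYPES
from dcim.models import Interface

physical = Interface.objects.exclude(type__in=NONCONNECTABLE_IFACE_TYPES)
virtual = Interface.objects.filter(type__in=VIRTUAL_IFACE_TYPES)
wireless = Interface.objects.filter(type__in=WIRELESS_IFACE_TYPES)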
@strawberry_django.filter_type(models.InterfaceTemplate, lookups=True)
class InterfaceTemplateFilter(ModularComponentTemplateFilterMixin):

View File

@@ -1,3 +1,5 @@
from strawberry.types import Info
from circuits.graphql.types import CircuitTerminationType, ProviderNetworkType
from circuits.models import CircuitTermination, ProviderNetwork
from dcim.graphql.types import (
@@ -49,7 +51,7 @@ class InventoryItemTemplateComponentType:
)
@classmethod
def resolve_type(cls, instance, info):
def resolve_type(cls, instance, info: Info):
if type(instance) is ConsolePortTemplate:
return ConsolePortTemplateType
if type(instance) is ConsoleServerPortTemplate:
@@ -79,7 +81,7 @@ class InventoryItemComponentType:
)
@classmethod
def resolve_type(cls, instance, info):
def resolve_type(cls, instance, info: Info):
if type(instance) is ConsolePort:
return ConsolePortType
if type(instance) is ConsoleServerPort:
@@ -112,7 +114,7 @@ class ConnectedEndpointType:
)
@classmethod
def resolve_type(cls, instance, info):
def resolve_type(cls, instance, info: Info):
if type(instance) is CircuitTermination:
return CircuitTerminationType
if type(instance) is ConsolePortType:

View File

@@ -3,9 +3,7 @@ import django.db.models.deletion
import taggit.managers
from django.db import migrations, models
import utilities.fields
import utilities.json
import utilities.ordering
class Migration(migrations.Migration):

View File

@@ -18,6 +18,7 @@ from utilities.conversion import to_meters
from utilities.exceptions import AbortRequest
from utilities.fields import ColorField, GenericArrayForeignKey
from utilities.querysets import RestrictedQuerySet
from utilities.serialization import deserialize_object, serialize_object
from wireless.models import WirelessLink
from .device_components import FrontPort, RearPort, PathEndpoint
@@ -119,43 +120,61 @@ class Cable(PrimaryModel):
pk = self.pk or self._pk
return self.label or f'#{pk}'
@property
def a_terminations(self):
if hasattr(self, '_a_terminations'):
return self._a_terminations
def get_status_color(self):
return LinkStatusChoices.colors.get(self.status)
def _get_x_terminations(self, side):
"""
Return the terminating objects for the given cable end (A or B).
"""
if side not in (CableEndChoices.SIDE_A, CableEndChoices.SIDE_B):
raise ValueError(f"Unknown cable side: {side}")
attr = f'_{side.lower()}_terminations'
if hasattr(self, attr):
return getattr(self, attr)
if not self.pk:
return []
# Query self.terminations.all() to leverage cached results
return [
ct.termination for ct in self.terminations.all() if ct.cable_end == CableEndChoices.SIDE_A
# Query self.terminations.all() to leverage cached results
ct.termination for ct in self.terminations.all() if ct.cable_end == side
]
def _set_x_terminations(self, side, value):
"""
Set the terminating objects for the given cable end (A or B).
"""
if side not in (CableEndChoices.SIDE_A, CableEndChoices.SIDE_B):
raise ValueError(f"Unknown cable side: {side}")
_attr = f'_{side.lower()}_terminations'
# If the provided value is a list of CableTermination IDs, resolve them
# to their corresponding termination objects.
if all(isinstance(item, int) for item in value):
value = [
ct.termination for ct in CableTermination.objects.filter(pk__in=value).prefetch_related('termination')
]
if not self.pk or getattr(self, _attr, []) != list(value):
self._terminations_modified = True
setattr(self, _attr, value)
@property
def a_terminations(self):
return self._get_x_terminations(CableEndChoices.SIDE_A)
@a_terminations.setter
def a_terminations(self, value):
if not self.pk or self.a_terminations != list(value):
self._terminations_modified = True
self._a_terminations = value
self._set_x_terminations(CableEndChoices.SIDE_A, value)
@property
def b_terminations(self):
if hasattr(self, '_b_terminations'):
return self._b_terminations
if not self.pk:
return []
# Query self.terminations.all() to leverage cached results
return [
ct.termination for ct in self.terminations.all() if ct.cable_end == CableEndChoices.SIDE_B
]
return self._get_x_terminations(CableEndChoices.SIDE_B)
@b_terminations.setter
def b_terminations(self, value):
if not self.pk or self.b_terminations != list(value):
self._terminations_modified = True
self._b_terminations = value
self._set_x_terminations(CableEndChoices.SIDE_B, value)
@property
def color_name(self):
@@ -208,7 +227,7 @@ class Cable(PrimaryModel):
for termination in self.b_terminations:
CableTermination(cable=self, cable_end='B', termination=termination).clean()
def save(self, *args, **kwargs):
def save(self, *args, force_insert=False, force_update=False, using=None, update_fields=None):
_created = self.pk is None
# Store the given length (if any) in meters for use in database ordering
@@ -221,39 +240,87 @@ class Cable(PrimaryModel):
if self.length is None:
self.length_unit = None
super().save(*args, **kwargs)
# If this is a new Cable, save it before attempting to create its CableTerminations
if self._state.adding:
super().save(*args, force_insert=True, using=using, update_fields=update_fields)
# Update the private PK used in __str__()
self._pk = self.pk
# Update the private pk used in __str__ in case this is a new object (i.e. just got its pk)
self._pk = self.pk
# Retrieve existing A/B terminations for the Cable
a_terminations = {ct.termination: ct for ct in self.terminations.filter(cable_end='A')}
b_terminations = {ct.termination: ct for ct in self.terminations.filter(cable_end='B')}
# Delete stale CableTerminations
if self._terminations_modified:
for termination, ct in a_terminations.items():
if termination.pk and termination not in self.a_terminations:
ct.delete()
for termination, ct in b_terminations.items():
if termination.pk and termination not in self.b_terminations:
ct.delete()
self.update_terminations()
super().save(*args, force_update=True, using=using, update_fields=update_fields)
# Save new CableTerminations (if any)
if self._terminations_modified:
for termination in self.a_terminations:
if not termination.pk or termination not in a_terminations:
CableTermination(cable=self, cable_end='A', termination=termination).save()
for termination in self.b_terminations:
if not termination.pk or termination not in b_terminations:
CableTermination(cable=self, cable_end='B', termination=termination).save()
try:
trace_paths.send(Cable, instance=self, created=_created)
except UnsupportedCablePath as e:
raise AbortRequest(e)
def get_status_color(self):
return LinkStatusChoices.colors.get(self.status)
def serialize_object(self, exclude=None):
data = serialize_object(self, exclude=exclude or [])
# Add A & B terminations to the serialized data
a_terminations, b_terminations = self.get_terminations()
data['a_terminations'] = sorted([ct.pk for ct in a_terminations.values()])
data['b_terminations'] = sorted([ct.pk for ct in b_terminations.values()])
return data
@classmethod
def deserialize_object(cls, data, pk=None):
a_terminations = data.pop('a_terminations', [])
b_terminations = data.pop('b_terminations', [])
instance = deserialize_object(cls, data, pk=pk)
# Assign A & B termination objects to the Cable instance
queryset = CableTermination.objects.prefetch_related('termination')
instance.a_terminations = [
ct.termination for ct in queryset.filter(pk__in=a_terminations)
]
instance.b_terminations = [
ct.termination for ct in queryset.filter(pk__in=b_terminations)
]
return instance
def get_terminations(self):
"""
Return two dictionaries mapping A & B side terminating objects to their corresponding CableTerminations
for this Cable.
"""
a_terminations = {}
b_terminations = {}
for ct in CableTermination.objects.filter(cable=self).prefetch_related('termination'):
if ct.cable_end == CableEndChoices.SIDE_A:
a_terminations[ct.termination] = ct
else:
b_terminations[ct.termination] = ct
return a_terminations, b_terminations
def update_terminations(self):
"""
Create/delete CableTerminations for this Cable to reflect its current state.
"""
a_terminations, b_terminations = self.get_terminations()
# Delete any stale CableTerminations
for termination, ct in a_terminations.items():
if termination.pk and termination not in self.a_terminations:
ct.delete()
for termination, ct in b_terminations.items():
if termination.pk and termination not in self.b_terminations:
ct.delete()
# Save any new CableTerminations
for termination in self.a_terminations:
if not termination.pk or termination not in a_terminations:
CableTermination(cable=self, cable_end='A', termination=termination).save()
for termination in self.b_terminations:
if not termination.pk or termination not in b_terminations:
CableTermination(cable=self, cable_end='B', termination=termination).save()
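
With save() now split into an insert phase, update_terminations(), and a follow-up update, the external usage stays the same: terminations can be passed at construction time and are materialized on save. A minimal sketch, with hypothetical device and interface names:

from dcim.models import Cable, Interface

a_iface = Interface.objects.get(device__name='switch-a', name='eth0')
b_iface = Interface.objects.get(device__name='switch-b', name='eth0')

cable = Cable(a_terminations=[a_iface], b_terminations=[b_iface])
cable.full_clean()  # also exercises the new mark_connected check in CableTermination.clean()
cable.save()        # inserts the Cable, creates its CableTerminations, then traces paths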
class CableTermination(ChangeLoggedModel):
@@ -326,6 +393,17 @@ class CableTermination(ChangeLoggedModel):
def clean(self):
super().clean()
# Disallow connecting a cable to any termination object that is
# explicitly flagged as "mark connected".
termination = getattr(self, 'termination', None)
if termination is not None and getattr(termination, "mark_connected", False):
raise ValidationError(
_("Cannot connect a cable to {obj_parent} > {obj} because it is marked as connected.").format(
obj_parent=termination.parent_object,
obj=termination,
)
)
# Check for existing termination
qs = CableTermination.objects.filter(
termination_type=self.termination_type,
@@ -337,14 +415,14 @@ class CableTermination(ChangeLoggedModel):
existing_termination = qs.first()
if existing_termination is not None:
raise ValidationError(
_("Duplicate termination found for {app_label}.{model} {termination_id}: cable {cable_pk}".format(
_("Duplicate termination found for {app_label}.{model} {termination_id}: cable {cable_pk}").format(
app_label=self.termination_type.app_label,
model=self.termination_type.model,
termination_id=self.termination_id,
cable_pk=existing_termination.cable.pk
))
)
)
# Validate interface type (if applicable)
# Validate the interface type (if applicable)
if self.termination_type.model == 'interface' and self.termination.type in NONCONNECTABLE_IFACE_TYPES:
raise ValidationError(
_("Cables cannot be terminated to {type_display} interfaces").format(

View File

@@ -7,6 +7,7 @@ from mptt.models import MPTTModel, TreeForeignKey
from dcim.choices import *
from dcim.constants import *
from dcim.models.mixins import InterfaceValidationMixin
from netbox.models import ChangeLoggedModel
from utilities.fields import ColorField, NaturalOrderingField
from utilities.mptt import TreeManager
@@ -405,7 +406,7 @@ class PowerOutletTemplate(ModularComponentTemplateModel):
}
class InterfaceTemplate(ModularComponentTemplateModel):
class InterfaceTemplate(InterfaceValidationMixin, ModularComponentTemplateModel):
"""
A template for a physical data interface on a new Device.
"""
@@ -469,8 +470,6 @@ class InterfaceTemplate(ModularComponentTemplateModel):
super().clean()
if self.bridge:
if self.pk and self.bridge_id == self.pk:
raise ValidationError({'bridge': _("An interface cannot be bridged to itself.")})
if self.device_type and self.device_type != self.bridge.device_type:
raise ValidationError({
'bridge': _(
@@ -484,11 +483,6 @@ class InterfaceTemplate(ModularComponentTemplateModel):
).format(bridge=self.bridge)
})
if self.rf_role and self.type not in WIRELESS_IFACE_TYPES:
raise ValidationError({
'rf_role': "Wireless role may be set only on wireless interfaces."
})
def instantiate(self, **kwargs):
return self.component_model(
name=self.resolve_name(kwargs.get('module')),

View File

@@ -11,6 +11,7 @@ from mptt.models import MPTTModel, TreeForeignKey
from dcim.choices import *
from dcim.constants import *
from dcim.fields import WWNField
from dcim.models.mixins import InterfaceValidationMixin
from netbox.choices import ColorChoices
from netbox.models import OrganizationalModel, NetBoxModel
from utilities.fields import ColorField, NaturalOrderingField
@@ -632,10 +633,17 @@ class BaseInterface(models.Model):
})
# Check that the primary MAC address (if any) is assigned to this interface
if self.primary_mac_address and self.primary_mac_address.assigned_object != self:
if (
self.primary_mac_address and
self.primary_mac_address.assigned_object is not None and
self.primary_mac_address.assigned_object != self
):
raise ValidationError({
'primary_mac_address': _("MAC address {mac_address} is not assigned to this interface.").format(
mac_address=self.primary_mac_address
'primary_mac_address': _(
"MAC address {mac_address} is assigned to a different interface ({interface})."
).format(
mac_address=self.primary_mac_address,
interface=self.primary_mac_address.assigned_object,
)
})
@@ -669,7 +677,14 @@ class BaseInterface(models.Model):
return self.primary_mac_address.mac_address
class Interface(ModularComponentModel, BaseInterface, CabledObjectModel, PathEndpoint, TrackingModelMixin):
class Interface(
InterfaceValidationMixin,
ModularComponentModel,
BaseInterface,
CabledObjectModel,
PathEndpoint,
TrackingModelMixin,
):
"""
A network interface within a Device. A physical Interface can connect to exactly one other Interface.
"""
@@ -872,25 +887,21 @@ class Interface(ModularComponentModel, BaseInterface, CabledObjectModel, PathEnd
"The selected parent interface ({interface}) belongs to a different device ({device})"
).format(interface=self.parent, device=self.parent.device)
})
elif self.parent.device.virtual_chassis != self.parent.virtual_chassis:
elif self.parent.device.virtual_chassis != self.device.virtual_chassis:
raise ValidationError({
'parent': _(
"The selected parent interface ({interface}) belongs to {device}, which is not part of "
"virtual chassis {virtual_chassis}."
).format(
interface=self.parent,
device=self.parent_device,
device=self.parent.device,
virtual_chassis=self.device.virtual_chassis
)
})
# Bridge validation
# An interface cannot be bridged to itself
if self.pk and self.bridge_id == self.pk:
raise ValidationError({'bridge': _("An interface cannot be bridged to itself.")})
# A bridged interface belong to the same device or virtual chassis
# A bridged interface belongs to the same device or virtual chassis
if self.bridge and self.bridge.device != self.device:
if self.device.virtual_chassis is None:
raise ValidationError({
@@ -935,29 +946,9 @@ class Interface(ModularComponentModel, BaseInterface, CabledObjectModel, PathEnd
)
})
# PoE validation
# Only physical interfaces may have a PoE mode/type assigned
if self.poe_mode and self.is_virtual:
raise ValidationError({
'poe_mode': _("Virtual interfaces cannot have a PoE mode.")
})
if self.poe_type and self.is_virtual:
raise ValidationError({
'poe_type': _("Virtual interfaces cannot have a PoE type.")
})
# An interface with a PoE type set must also specify a mode
if self.poe_type and not self.poe_mode:
raise ValidationError({
'poe_type': _("Must specify PoE mode when designating a PoE type.")
})
# Wireless validation
# RF role & channel may only be set for wireless interfaces
if self.rf_role and not self.is_wireless:
raise ValidationError({'rf_role': _("Wireless role may be set only on wireless interfaces.")})
# RF channel may only be set for wireless interfaces
if self.rf_channel and not self.is_wireless:
raise ValidationError({'rf_channel': _("Channel may be set only on wireless interfaces.")})

View File

@@ -4,8 +4,11 @@ from django.core.exceptions import ValidationError
from django.db import models
from django.utils.translation import gettext_lazy as _
from dcim.constants import VIRTUAL_IFACE_TYPES, WIRELESS_IFACE_TYPES
__all__ = (
'CachedScopeMixin',
'InterfaceValidationMixin',
'RenderConfigMixin',
)
@@ -87,11 +90,9 @@ class CachedScopeMixin(models.Model):
def clean(self):
if self.scope_type and not (self.scope or self.scope_id):
scope_type = self.scope_type.model_class()
raise ValidationError({
'scope': _(
"Please select a {scope_type}."
).format(scope_type=scope_type._meta.model_name)
})
raise ValidationError(
_("Please select a {scope_type}.").format(scope_type=scope_type._meta.model_name)
)
super().clean()
def save(self, *args, **kwargs):
@@ -118,3 +119,33 @@ class CachedScopeMixin(models.Model):
self._site = self.scope.site
self._location = self.scope
cache_related_objects.alters_data = True
class InterfaceValidationMixin:
def clean(self):
super().clean()
# An interface cannot be bridged to itself
if self.pk and self.bridge_id == self.pk:
raise ValidationError({'bridge': _("An interface cannot be bridged to itself.")})
# Only physical interfaces may have a PoE mode/type assigned
if self.poe_mode and self.type in VIRTUAL_IFACE_TYPES:
raise ValidationError({
'poe_mode': _("Virtual interfaces cannot have a PoE mode.")
})
if self.poe_type and self.type in VIRTUAL_IFACE_TYPES:
raise ValidationError({
'poe_type': _("Virtual interfaces cannot have a PoE type.")
})
# An interface with a PoE type set must also specify a mode
if self.poe_type and not self.poe_mode:
raise ValidationError({
'poe_type': _("Must specify PoE mode when designating a PoE type.")
})
# RF role may be set only for wireless interfaces
if self.rf_role and self.type not in WIRELESS_IFACE_TYPES:
raise ValidationError({'rf_role': _("Wireless role may be set only on wireless interfaces.")})
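
Because both Interface and InterfaceTemplate now inherit this mixin (see the model changes above), the PoE and RF constraints live in one place. A hedged sketch of the resulting behaviour, assuming an existing Device instance bound to the name device:

from django.core.exceptions import ValidationError
from dcim.choices import InterfacePoEModeChoices, InterfaceTypeChoices
from dcim.models import Interface

iface = Interface(
    device=device,                              # assumed pre-existing Device
    name='vlan100',
    type=InterfaceTypeChoices.TYPE_VIRTUAL,
    poe_mode=InterfacePoEModeChoices.MODE_PD,   # PoE mode on a virtual interface
)
try:
    iface.full_clean()
except ValidationError as e:
    print(e.message_dict.get('poe_mode'))       # ["Virtual interfaces cannot have a PoE mode."]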

View File

@@ -4,6 +4,7 @@ from django.db.models.signals import post_save, post_delete, pre_delete
from django.dispatch import receiver
from dcim.choices import CableEndChoices, LinkStatusChoices
from virtualization.models import VMInterface
from .models import (
Cable, CablePath, CableTermination, ConsolePort, ConsoleServerPort, Device, DeviceBay, FrontPort, Interface,
InventoryItem, ModuleBay, PathEndpoint, PowerOutlet, PowerPanel, PowerPort, Rack, RearPort, Location,
@@ -170,3 +171,15 @@ def extend_rearport_cable_paths(instance, created, raw, **kwargs):
rearport = instance.rear_port
for cablepath in CablePath.objects.filter(_nodes__contains=rearport):
cablepath.retrace()
@receiver(post_save, sender=Interface)
@receiver(post_save, sender=VMInterface)
def update_mac_address_interface(instance, created, raw, **kwargs):
"""
When creating a new Interface or VMInterface, check whether a MACAddress has been designated as its primary. If so,
assign the MACAddress to the interface.
"""
if created and not raw and instance.primary_mac_address:
instance.primary_mac_address.assigned_object = instance
instance.primary_mac_address.save()
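
In practice the handler above means a MAC address supplied as primary at interface creation time is linked back to that interface automatically. A short sketch, again assuming an existing Device instance bound to the name device:

from dcim.models import Interface, MACAddress

mac = MACAddress.objects.create(mac_address='00-00-00-aa-bb-cc')
iface = Interface.objects.create(
    device=device,
    name='eth0',
    type='1000base-t',
    primary_mac_address=mac,
)
mac.refresh_from_db()
assert mac.assigned_object == iface  # set by update_mac_address_interface()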

View File

@@ -195,6 +195,11 @@ class DeviceTable(TenancyColumnsMixin, ContactsColumnMixin, NetBoxTable):
linkify=True,
verbose_name=_('Type')
)
u_height = columns.TemplateColumn(
accessor=tables.A('device_type__u_height'),
verbose_name=_('U Height'),
template_code='{{ value|floatformat }}'
)
platform = tables.Column(
linkify=True,
verbose_name=_('Platform')
@@ -307,6 +312,16 @@ class DeviceComponentTable(NetBoxTable):
verbose_name=_('Name'),
linkify=True,
)
device_location = tables.Column(
accessor=tables.A('device__location'),
verbose_name=_('Device Location'),
linkify=True,
)
device_site = tables.Column(
accessor=tables.A('device__site'),
verbose_name=_('Device Site'),
linkify=True,
)
device_status = columns.ChoiceFieldColumn(
accessor=tables.A('device__status'),
verbose_name=_('Device Status'),
@@ -1159,6 +1174,9 @@ class MACAddressTable(NetBoxTable):
orderable=False,
verbose_name=_('Parent')
)
is_primary = columns.BooleanColumn(
verbose_name=_('Primary')
)
tags = columns.TagColumn(
url_name='dcim:macaddress_list'
)
@@ -1169,7 +1187,7 @@ class MACAddressTable(NetBoxTable):
class Meta(DeviceComponentTable.Meta):
model = models.MACAddress
fields = (
'pk', 'id', 'mac_address', 'assigned_object_parent', 'assigned_object', 'description', 'comments', 'tags',
'created', 'last_updated',
'pk', 'id', 'mac_address', 'assigned_object_parent', 'assigned_object', 'description', 'is_primary',
'comments', 'tags', 'created', 'last_updated',
)
default_columns = ('pk', 'mac_address', 'assigned_object_parent', 'assigned_object', 'description')

View File

@@ -10,7 +10,7 @@ from netbox.choices import ColorChoices, WeightUnitChoices
from tenancy.models import Tenant, TenantGroup
from users.models import User
from utilities.testing import ChangeLoggedFilterSetTests, create_test_device, create_test_virtualmachine
from virtualization.models import Cluster, ClusterType, ClusterGroup, VMInterface, VirtualMachine
from virtualization.models import Cluster, ClusterGroup, ClusterType, VirtualMachine, VMInterface
from wireless.choices import WirelessChannelChoices, WirelessRoleChoices
from wireless.models import WirelessLink
@@ -7164,9 +7164,20 @@ class MACAddressTestCase(TestCase, ChangeLoggedFilterSetTests):
MACAddress(mac_address='00-00-00-05-01-01', assigned_object=vm_interfaces[1]),
MACAddress(mac_address='00-00-00-06-01-01', assigned_object=vm_interfaces[2]),
MACAddress(mac_address='00-00-00-06-01-02', assigned_object=vm_interfaces[2]),
# unassigned
MACAddress(mac_address='00-00-00-07-01-01'),
)
MACAddress.objects.bulk_create(mac_addresses)
# Set MAC addresses as primary
for idx, interface in enumerate(interfaces):
interface.primary_mac_address = mac_addresses[idx]
interface.save()
for idx, vm_interface in enumerate(vm_interfaces):
# Offset by 4 for device MACs
vm_interface.primary_mac_address = mac_addresses[idx + 4]
vm_interface.save()
def test_mac_address(self):
params = {'mac_address': ['00-00-00-01-01-01', '00-00-00-02-01-01']}
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
@@ -7198,3 +7209,15 @@ class MACAddressTestCase(TestCase, ChangeLoggedFilterSetTests):
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
params = {'vminterface': [vm_interfaces[0].name, vm_interfaces[1].name]}
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
def test_assigned(self):
params = {'assigned': True}
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 8)
params = {'assigned': False}
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 1)
def test_primary(self):
params = {'primary': True}
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 6)
params = {'primary': False}
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 3)

View File

@@ -967,6 +967,18 @@ class CableTestCase(TestCase):
with self.assertRaises(ValidationError):
cable.clean()
def test_cannot_cable_to_mark_connected(self):
"""
Test that a cable cannot be connected to an interface marked as connected.
"""
device1 = Device.objects.get(name='TestDevice1')
interface1 = Interface.objects.get(device__name='TestDevice2', name='eth1')
mark_connected_interface = Interface(device=device1, name='mark_connected1', mark_connected=True)
cable = Cable(a_terminations=[mark_connected_interface], b_terminations=[interface1])
with self.assertRaises(ValidationError):
cable.clean()
class VirtualDeviceContextTestCase(TestCase):

View File

@@ -7,13 +7,14 @@ from django.test import override_settings, tag
from django.urls import reverse
from netaddr import EUI
from core.models import ObjectType
from dcim.choices import *
from dcim.constants import *
from dcim.models import *
from ipam.models import ASN, RIR, VLAN, VRF
from netbox.choices import CSVDelimiterChoices, ImportFormatChoices, WeightUnitChoices
from tenancy.models import Tenant
from users.models import User
from users.models import ObjectPermission, User
from utilities.testing import ViewTestCases, create_tags, create_test_device, post_data
from wireless.models import WirelessLAN
@@ -1078,14 +1079,14 @@ class ModuleTypeTestCase(ViewTestCases.PrimaryObjectViewTestCase):
'dcim.add_modulebaytemplate',
)
def verify_module_type_profile(scenario_name):
# TODO: remove extra regression asserts once parent test supports testing all import fields
fan_module_type = ModuleType.objects.get(part_number='generic-fan')
fan_module_type_profile = ModuleTypeProfile.objects.get(name='Fan')
assert fan_module_type.profile == fan_module_type_profile
# run base test
super().test_bulk_import_objects_with_permission()
# TODO: remove extra regression asserts once parent test supports testing all import fields
fan_module_type = ModuleType.objects.get(part_number='generic-fan')
fan_module_type_profile = ModuleTypeProfile.objects.get(name='Fan')
assert fan_module_type.profile == fan_module_type_profile
super().test_bulk_import_objects_with_permission(post_import_callback=verify_module_type_profile)
@override_settings(EXEMPT_VIEW_PERMISSIONS=['*'], EXEMPT_EXCLUDE_MODELS=[])
def test_bulk_import_objects_with_constrained_permission(self):
@@ -2833,10 +2834,19 @@ class InterfaceTestCase(ViewTestCases.DeviceComponentViewTestCase):
}
cls.csv_data = (
"device,name,type,vrf.pk,poe_mode,poe_type",
f"Device 1,Interface 4,1000base-t,{vrfs[0].pk},pse,type1-ieee802.3af",
f"Device 1,Interface 5,1000base-t,{vrfs[0].pk},pse,type1-ieee802.3af",
f"Device 1,Interface 6,1000base-t,{vrfs[0].pk},pse,type1-ieee802.3af",
"device,name,type,vrf.pk,poe_mode,poe_type,mode,untagged_vlan,tagged_vlans",
(
f"Device 1,Interface 4,1000base-t,{vrfs[0].pk},pse,type1-ieee802.3af,"
f"tagged,{vlans[0].vid},'{','.join([str(v.vid) for v in vlans[1:4]])}'"
),
(
f"Device 1,Interface 5,1000base-t,{vrfs[0].pk},pse,type1-ieee802.3af,"
f"tagged,{vlans[0].vid},'{','.join([str(v.vid) for v in vlans[1:4]])}'"
),
(
f"Device 1,Interface 6,1000base-t,{vrfs[0].pk},pse,type1-ieee802.3af,"
f"tagged,{vlans[0].vid},'{','.join([str(v.vid) for v in vlans[1:4]])}'"
),
)
cls.csv_update_data = (
@@ -2884,6 +2894,43 @@ class InterfaceTestCase(ViewTestCases.DeviceComponentViewTestCase):
self.client.post(self._get_url('bulk_delete'), data)
self.assertEqual(device.interfaces.count(), 4) # Child & parent were both deleted
def test_rename_select_all_spans_pages(self):
"""
Tests the bulk rename functionality for interfaces spanning multiple pages in the UI.
"""
device_name = 'DeviceRename'
device = create_test_device(device_name)
# Create > default page size (25) so selection spans multiple pages
for i in range(37):
Interface.objects.create(device=device, name=f'eth{i}')
self.add_permissions('dcim.change_interface')
# Filter to this device's interfaces to simulate a real list filter
get_qs = {'device_id': Device.objects.get(name=device_name).pk}
post_url = f'{self._get_url("bulk_rename")}?device_id={get_qs["device_id"]}'
# Preview step: ensure 37 selected (not just one page)
data = {'_preview': '1', '_all': '1', 'find': 'eth', 'replace': 'xe'}
response = self.client.post(post_url, data=data)
self.assertHttpStatus(response, 200)
self.assertEqual(len(response.context['selected_objects']), 37)
# Extract pk[] just like the browser would submit on Apply
# (either from the form's initial, or from selected_objects)
pk_list = response.context['form'].initial.get('pk')
if not pk_list:
pk_list = [obj.pk for obj in response.context['selected_objects']]
pk_list = [str(pk) for pk in pk_list]
# Apply step: include pk[] in the POST
apply_data = {'_apply': '1', '_all': '1', 'find': 'eth', 'replace': 'xe', 'pk': pk_list}
response = self.client.post(post_url, data=apply_data)
# On success the view redirects back to the return URL
self.assertHttpStatus(response, 302)
self.assertEqual(Interface.objects.filter(device=device, name__startswith='xe').count(), 37)
class FrontPortTestCase(ViewTestCases.DeviceComponentViewTestCase):
model = FrontPort
@@ -3290,8 +3337,10 @@ class CableTestCase(
Device(name='Device 1', site=sites[0], device_type=devicetype, role=role),
Device(name='Device 2', site=sites[0], device_type=devicetype, role=role),
Device(name='Device 3', site=sites[0], device_type=devicetype, role=role),
Device(name='Device 4', site=sites[0], device_type=devicetype, role=role),
# Create 'Device 1' assigned to 'Site 2' (allowed since the site is different)
Device(name='Device 1', site=sites[1], device_type=devicetype, role=role),
Device(name='Device 5', site=sites[1], device_type=devicetype, role=role),
)
Device.objects.bulk_create(devices)
@@ -3300,22 +3349,36 @@ class CableTestCase(
vc.save()
interfaces = (
# Device 1, Site 1
Interface(device=devices[0], name='Interface 1', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
Interface(device=devices[0], name='Interface 2', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
Interface(device=devices[0], name='Interface 3', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
# Device 2, Site 1
Interface(device=devices[1], name='Interface 1', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
Interface(device=devices[1], name='Interface 2', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
Interface(device=devices[1], name='Interface 3', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
# Device 3, Site 1
Interface(device=devices[2], name='Interface 1', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
Interface(device=devices[2], name='Interface 2', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
Interface(device=devices[2], name='Interface 3', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
# Device 4, Site 1
Interface(device=devices[3], name='Interface 1', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
Interface(device=devices[3], name='Interface 2', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
Interface(device=devices[3], name='Interface 3', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
# Device 1, Site 2
Interface(device=devices[4], name='Interface 1', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
Interface(device=devices[4], name='Interface 2', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
Interface(device=devices[4], name='Interface 3', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
# Device 5, Site 2
Interface(device=devices[5], name='Interface 1', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
Interface(device=devices[5], name='Interface 2', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
Interface(device=devices[5], name='Interface 3', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
Interface(device=devices[1], name='Device 2 Interface', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
Interface(device=devices[2], name='Device 3 Interface', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
Interface(device=devices[3], name='Interface 4', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
Interface(device=devices[3], name='Interface 5', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
Interface(device=devices[4], name='Interface 4', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
Interface(device=devices[4], name='Interface 5', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
)
Interface.objects.bulk_create(interfaces)
@@ -3342,16 +3405,29 @@ class CableTestCase(
'tags': [t.pk for t in tags],
}
# Ensure that CSV bulk import supports assigning terminations from parent devices that share
# the same device name, provided those devices belong to different sites.
cls.csv_data = (
"side_a_site,side_a_device,side_a_type,side_a_name,side_b_site,side_b_device,side_b_type,side_b_name",
"Site 1,Device 3,dcim.interface,Interface 1,Site 2,Device 1,dcim.interface,Interface 1",
"Site 1,Device 3,dcim.interface,Interface 2,Site 2,Device 1,dcim.interface,Interface 2",
"Site 1,Device 3,dcim.interface,Interface 3,Site 2,Device 1,dcim.interface,Interface 3",
"Site 1,Device 1,dcim.interface,Device 2 Interface,Site 2,Device 1,dcim.interface,Interface 4",
"Site 1,Device 1,dcim.interface,Device 3 Interface,Site 2,Device 1,dcim.interface,Interface 5",
)
cls.csv_data = {
'default': (
"side_a_device,side_a_type,side_a_name,side_b_device,side_b_type,side_b_name",
"Device 4,dcim.interface,Interface 1,Device 5,dcim.interface,Interface 1",
"Device 3,dcim.interface,Interface 2,Device 4,dcim.interface,Interface 2",
"Device 3,dcim.interface,Interface 3,Device 4,dcim.interface,Interface 3",
# The following is no longer possible in this scenario, because there are multiple
# devices named "Device 1" across multiple sites. See the "site-filtering" scenario
# below for how to specify a site for non-unique device names.
# "Device 1,dcim.interface,Device 3 Interface,Device 4,dcim.interface,Interface 5",
),
'site-filtering': (
# Ensure that CSV bulk import supports assigning terminations from parent devices
# that share the same device name, provided those devices belong to different sites.
"side_a_site,side_a_device,side_a_type,side_a_name,side_b_site,side_b_device,side_b_type,side_b_name",
"Site 1,Device 3,dcim.interface,Interface 1,Site 2,Device 1,dcim.interface,Interface 1",
"Site 1,Device 3,dcim.interface,Interface 2,Site 2,Device 1,dcim.interface,Interface 2",
"Site 1,Device 3,dcim.interface,Interface 3,Site 2,Device 1,dcim.interface,Interface 3",
"Site 1,Device 1,dcim.interface,Device 2 Interface,Site 2,Device 1,dcim.interface,Interface 4",
"Site 1,Device 1,dcim.interface,Device 3 Interface,Site 2,Device 1,dcim.interface,Interface 5",
)
}
cls.csv_update_data = (
"id,label,color",
@@ -3699,3 +3775,29 @@ class MACAddressTestCase(ViewTestCases.PrimaryObjectViewTestCase):
cls.bulk_edit_data = {
'description': 'New description',
}
@tag('regression') # Issue #20542
@override_settings(EXEMPT_VIEW_PERMISSIONS=['*'], EXEMPT_EXCLUDE_MODELS=[])
def test_create_macaddress_via_quickadd(self):
"""
Test creating a MAC address via quick-add modal (e.g., from Interface form).
Regression test for issue #20542 where form prefix was missing in POST handler.
"""
obj_perm = ObjectPermission(name='Test permission', actions=['add'])
obj_perm.save()
obj_perm.users.add(self.user)
obj_perm.object_types.add(ObjectType.objects.get_for_model(self.model))
# Simulate quick-add form submission with 'quickadd-' prefix
formatted_data = post_data(self.form_data)
quickadd_data = {f'quickadd-{k}': v for k, v in formatted_data.items()}
quickadd_data['_quickadd'] = 'True'
initial_count = self._get_queryset().count()
url = f"{self._get_url('add')}?_quickadd=True&target=id_primary_mac_address"
response = self.client.post(url, data=quickadd_data)
# Should successfully create the MAC address and return the quick_add_created template
self.assertHttpStatus(response, 200)
self.assertIn(b'quick-add-object', response.content)
self.assertEqual(initial_count + 1, self._get_queryset().count())

View File

@@ -295,6 +295,7 @@ class RegionBulkEditView(generic.BulkEditView):
@register_model_view(Region, 'bulk_rename', path='rename', detail=False)
class RegionBulkRenameView(generic.BulkRenameView):
queryset = Region.objects.all()
filterset = filtersets.RegionFilterSet
@register_model_view(Region, 'bulk_delete', path='delete', detail=False)
@@ -426,6 +427,7 @@ class SiteGroupBulkEditView(generic.BulkEditView):
@register_model_view(SiteGroup, 'bulk_rename', path='rename', detail=False)
class SiteGroupBulkRenameView(generic.BulkRenameView):
queryset = SiteGroup.objects.all()
filterset = filtersets.SiteGroupFilterSet
@register_model_view(SiteGroup, 'bulk_delete', path='delete', detail=False)
@@ -516,6 +518,7 @@ class SiteBulkEditView(generic.BulkEditView):
@register_model_view(Site, 'bulk_rename', path='rename', detail=False)
class SiteBulkRenameView(generic.BulkRenameView):
queryset = Site.objects.all()
filterset = filtersets.SiteFilterSet
@register_model_view(Site, 'bulk_delete', path='delete', detail=False)
@@ -625,6 +628,7 @@ class LocationBulkEditView(generic.BulkEditView):
@register_model_view(Location, 'bulk_rename', path='rename', detail=False)
class LocationBulkRenameView(generic.BulkRenameView):
queryset = Location.objects.all()
filterset = filtersets.LocationFilterSet
@register_model_view(Location, 'bulk_delete', path='delete', detail=False)
@@ -695,6 +699,7 @@ class RackRoleBulkEditView(generic.BulkEditView):
@register_model_view(RackRole, 'bulk_rename', path='rename', detail=False)
class RackRoleBulkRenameView(generic.BulkRenameView):
queryset = RackRole.objects.all()
filterset = filtersets.RackRoleFilterSet
@register_model_view(RackRole, 'bulk_delete', path='delete', detail=False)
@@ -760,6 +765,7 @@ class RackTypeBulkEditView(generic.BulkEditView):
class RackTypeBulkRenameView(generic.BulkRenameView):
queryset = RackType.objects.all()
field_name = 'model'
filterset = filtersets.RackTypeFilterSet
@register_model_view(RackType, 'bulk_delete', path='delete', detail=False)
@@ -944,6 +950,7 @@ class RackBulkEditView(generic.BulkEditView):
@register_model_view(Rack, 'bulk_rename', path='rename', detail=False)
class RackBulkRenameView(generic.BulkRenameView):
queryset = Rack.objects.all()
filterset = filtersets.RackFilterSet
@register_model_view(Rack, 'bulk_delete', path='delete', detail=False)
@@ -1083,6 +1090,7 @@ class ManufacturerBulkEditView(generic.BulkEditView):
@register_model_view(Manufacturer, 'bulk_rename', path='rename', detail=False)
class ManufacturerBulkRenameView(generic.BulkRenameView):
queryset = Manufacturer.objects.all()
filterset = filtersets.ManufacturerFilterSet
@register_model_view(Manufacturer, 'bulk_delete', path='delete', detail=False)
@@ -1336,6 +1344,7 @@ class DeviceTypeBulkEditView(generic.BulkEditView):
class DeviceTypeBulkRenameView(generic.BulkRenameView):
queryset = DeviceType.objects.all()
field_name = 'model'
filterset = filtersets.DeviceTypeFilterSet
@register_model_view(DeviceType, 'bulk_delete', path='delete', detail=False)
@@ -1397,6 +1406,7 @@ class ModuleTypeProfileBulkEditView(generic.BulkEditView):
@register_model_view(ModuleTypeProfile, 'bulk_rename', path='rename', detail=False)
class ModuleTypeProfileBulkRenameView(generic.BulkRenameView):
queryset = ModuleTypeProfile.objects.all()
filterset = filtersets.ModuleTypeProfileFilterSet
@register_model_view(ModuleTypeProfile, 'bulk_delete', path='delete', detail=False)
@@ -1612,6 +1622,7 @@ class ModuleTypeBulkEditView(generic.BulkEditView):
@register_model_view(ModuleType, 'bulk_rename', path='rename', detail=False)
class ModuleTypeBulkRenameView(generic.BulkRenameView):
queryset = ModuleType.objects.all()
filterset = filtersets.ModuleTypeFilterSet
@register_model_view(ModuleType, 'bulk_delete', path='delete', detail=False)
@@ -2100,6 +2111,7 @@ class DeviceRoleBulkEditView(generic.BulkEditView):
@register_model_view(DeviceRole, 'bulk_rename', path='rename', detail=False)
class DeviceRoleBulkRenameView(generic.BulkRenameView):
queryset = DeviceRole.objects.all()
filterset = filtersets.DeviceRoleFilterSet
@register_model_view(DeviceRole, 'bulk_delete', path='delete', detail=False)
@@ -2175,6 +2187,7 @@ class PlatformBulkEditView(generic.BulkEditView):
@register_model_view(Platform, 'bulk_rename', path='rename', detail=False)
class PlatformBulkRenameView(generic.BulkRenameView):
queryset = Platform.objects.all()
filterset = filtersets.PlatformFilterSet
@register_model_view(Platform, 'bulk_delete', path='delete', detail=False)
@@ -2582,6 +2595,7 @@ class ConsolePortBulkEditView(generic.BulkEditView):
@register_model_view(ConsolePort, 'bulk_rename', path='rename', detail=False)
class ConsolePortBulkRenameView(generic.BulkRenameView):
queryset = ConsolePort.objects.all()
filterset = filtersets.ConsolePortFilterSet
@register_model_view(ConsolePort, 'bulk_disconnect', path='disconnect', detail=False)
@@ -2652,6 +2666,7 @@ class ConsoleServerPortBulkEditView(generic.BulkEditView):
@register_model_view(ConsoleServerPort, 'bulk_rename', path='rename', detail=False)
class ConsoleServerPortBulkRenameView(generic.BulkRenameView):
queryset = ConsoleServerPort.objects.all()
filterset = filtersets.ConsoleServerPortFilterSet
@register_model_view(ConsoleServerPort, 'bulk_disconnect', path='disconnect', detail=False)
@@ -2722,6 +2737,7 @@ class PowerPortBulkEditView(generic.BulkEditView):
@register_model_view(PowerPort, 'bulk_rename', path='rename', detail=False)
class PowerPortBulkRenameView(generic.BulkRenameView):
queryset = PowerPort.objects.all()
filterset = filtersets.PowerPortFilterSet
@register_model_view(PowerPort, 'bulk_disconnect', path='disconnect', detail=False)
@@ -2792,6 +2808,7 @@ class PowerOutletBulkEditView(generic.BulkEditView):
@register_model_view(PowerOutlet, 'bulk_rename', path='rename', detail=False)
class PowerOutletBulkRenameView(generic.BulkRenameView):
queryset = PowerOutlet.objects.all()
filterset = filtersets.PowerOutletFilterSet
@register_model_view(PowerOutlet, 'bulk_disconnect', path='disconnect', detail=False)
@@ -2934,6 +2951,7 @@ class InterfaceBulkEditView(generic.BulkEditView):
@register_model_view(Interface, 'bulk_rename', path='rename', detail=False)
class InterfaceBulkRenameView(generic.BulkRenameView):
queryset = Interface.objects.all()
filterset = filtersets.InterfaceFilterSet
@register_model_view(Interface, 'bulk_disconnect', path='disconnect', detail=False)
@@ -3005,6 +3023,7 @@ class FrontPortBulkEditView(generic.BulkEditView):
@register_model_view(FrontPort, 'bulk_rename', path='rename', detail=False)
class FrontPortBulkRenameView(generic.BulkRenameView):
queryset = FrontPort.objects.all()
filterset = filtersets.FrontPortFilterSet
@register_model_view(FrontPort, 'bulk_disconnect', path='disconnect', detail=False)
@@ -3080,6 +3099,7 @@ class RearPortBulkRenameView(generic.BulkRenameView):
@register_model_view(RearPort, 'bulk_disconnect', path='disconnect', detail=False)
class RearPortBulkDisconnectView(BulkDisconnectView):
queryset = RearPort.objects.all()
filterset = filtersets.RearPortFilterSet
@register_model_view(RearPort, 'bulk_delete', path='delete', detail=False)
@@ -3145,6 +3165,7 @@ class ModuleBayBulkEditView(generic.BulkEditView):
@register_model_view(ModuleBay, 'bulk_rename', path='rename', detail=False)
class ModuleBayBulkRenameView(generic.BulkRenameView):
queryset = ModuleBay.objects.all()
filterset = filtersets.ModuleBayFilterSet
@register_model_view(ModuleBay, 'bulk_delete', path='delete', detail=False)
@@ -3287,6 +3308,7 @@ class DeviceBayBulkEditView(generic.BulkEditView):
@register_model_view(DeviceBay, 'bulk_rename', path='rename', detail=False)
class DeviceBayBulkRenameView(generic.BulkRenameView):
queryset = DeviceBay.objects.all()
filterset = filtersets.DeviceBayFilterSet
@register_model_view(DeviceBay, 'bulk_delete', path='delete', detail=False)
@@ -3348,6 +3370,7 @@ class InventoryItemBulkEditView(generic.BulkEditView):
@register_model_view(InventoryItem, 'bulk_rename', path='rename', detail=False)
class InventoryItemBulkRenameView(generic.BulkRenameView):
queryset = InventoryItem.objects.all()
filterset = filtersets.InventoryItemFilterSet
@register_model_view(InventoryItem, 'bulk_delete', path='delete', detail=False)
@@ -3431,6 +3454,7 @@ class InventoryItemRoleBulkEditView(generic.BulkEditView):
@register_model_view(InventoryItemRole, 'bulk_rename', path='rename', detail=False)
class InventoryItemRoleBulkRenameView(generic.BulkRenameView):
queryset = InventoryItemRole.objects.all()
filterset = filtersets.InventoryItemRoleFilterSet
@register_model_view(InventoryItemRole, 'bulk_delete', path='delete', detail=False)
@@ -3634,6 +3658,7 @@ class CableBulkEditView(generic.BulkEditView):
class CableBulkRenameView(generic.BulkRenameView):
queryset = Cable.objects.all()
field_name = 'label'
filterset = filtersets.CableFilterSet
@register_model_view(Cable, 'bulk_delete', path='delete', detail=False)
@@ -3754,6 +3779,7 @@ class VirtualChassisEditView(ObjectPermissionRequiredMixin, GetReturnURLMixin, V
def post(self, request, pk):
virtual_chassis = get_object_or_404(self.queryset, pk=pk)
virtual_chassis.snapshot()
VCMemberFormSet = modelformset_factory(
model=Device,
form=forms.DeviceVCMembershipForm,
@@ -3806,9 +3832,7 @@ class VirtualChassisAddMemberView(ObjectPermissionRequiredMixin, GetReturnURLMix
return 'dcim.change_virtualchassis'
def get(self, request, pk):
virtual_chassis = get_object_or_404(self.queryset, pk=pk)
initial_data = {k: request.GET[k] for k in request.GET}
member_select_form = forms.VCMemberSelectForm(initial=initial_data)
membership_form = forms.DeviceVCMembershipForm(initial=initial_data)
@@ -3821,20 +3845,20 @@ class VirtualChassisAddMemberView(ObjectPermissionRequiredMixin, GetReturnURLMix
})
def post(self, request, pk):
virtual_chassis = get_object_or_404(self.queryset, pk=pk)
member_select_form = forms.VCMemberSelectForm(request.POST)
if member_select_form.is_valid():
device = member_select_form.cleaned_data['device']
device.snapshot()
device.virtual_chassis = virtual_chassis
data = {k: request.POST[k] for k in ['vc_position', 'vc_priority']}
data = {
'vc_position': request.POST['vc_position'],
'vc_priority': request.POST['vc_priority'],
}
membership_form = forms.DeviceVCMembershipForm(data=data, validate_vc_position=True, instance=device)
if membership_form.is_valid():
membership_form.save()
messages.success(request, mark_safe(
_('Added member <a href="{url}">{device}</a>').format(
@@ -3844,11 +3868,9 @@ class VirtualChassisAddMemberView(ObjectPermissionRequiredMixin, GetReturnURLMix
if '_addanother' in request.POST and safe_for_redirect(request.get_full_path()):
return redirect(request.get_full_path())
return redirect(self.get_return_url(request, device))
else:
membership_form = forms.DeviceVCMembershipForm(data=request.POST)
return render(request, 'dcim/virtualchassis_add_member.html', {
@@ -3866,7 +3888,6 @@ class VirtualChassisRemoveMemberView(ObjectPermissionRequiredMixin, GetReturnURL
return 'dcim.change_device'
def get(self, request, pk):
device = get_object_or_404(self.queryset, pk=pk, virtual_chassis__isnull=False)
form = ConfirmationForm(initial=request.GET)
@@ -3877,7 +3898,6 @@ class VirtualChassisRemoveMemberView(ObjectPermissionRequiredMixin, GetReturnURL
})
def post(self, request, pk):
device = get_object_or_404(self.queryset, pk=pk, virtual_chassis__isnull=False)
form = ConfirmationForm(request.POST)
@@ -3891,13 +3911,11 @@ class VirtualChassisRemoveMemberView(ObjectPermissionRequiredMixin, GetReturnURL
return redirect(device.get_absolute_url())
if form.is_valid():
devices = Device.objects.filter(pk=device.pk)
for device in devices:
device.virtual_chassis = None
device.vc_position = None
device.vc_priority = None
device.save()
device.snapshot()
device.virtual_chassis = None
device.vc_position = None
device.vc_priority = None
device.save()
msg = _('Removed {device} from virtual chassis {chassis}').format(
device=device,
@@ -3931,6 +3949,7 @@ class VirtualChassisBulkEditView(generic.BulkEditView):
@register_model_view(VirtualChassis, 'bulk_rename', path='rename', detail=False)
class VirtualChassisBulkRenameView(generic.BulkRenameView):
queryset = VirtualChassis.objects.all()
filterset = filtersets.VirtualChassisFilterSet
@register_model_view(VirtualChassis, 'bulk_delete', path='delete', detail=False)
@@ -3993,6 +4012,7 @@ class PowerPanelBulkEditView(generic.BulkEditView):
@register_model_view(PowerPanel, 'bulk_rename', path='rename', detail=False)
class PowerPanelBulkRenameView(generic.BulkRenameView):
queryset = PowerPanel.objects.all()
filterset = filtersets.PowerPanelFilterSet
@register_model_view(PowerPanel, 'bulk_delete', path='delete', detail=False)
@@ -4050,6 +4070,7 @@ class PowerFeedBulkEditView(generic.BulkEditView):
@register_model_view(PowerFeed, 'bulk_rename', path='rename', detail=False)
class PowerFeedBulkRenameView(generic.BulkRenameView):
queryset = PowerFeed.objects.all()
filterset = filtersets.PowerFeedFilterSet
@register_model_view(PowerFeed, 'bulk_disconnect', path='disconnect', detail=False)
@@ -4128,6 +4149,7 @@ class VirtualDeviceContextBulkEditView(generic.BulkEditView):
@register_model_view(VirtualDeviceContext, 'bulk_rename', path='rename', detail=False)
class VirtualDeviceContextBulkRenameView(generic.BulkRenameView):
queryset = VirtualDeviceContext.objects.all()
filterset = filtersets.VirtualDeviceContextFilterSet
@register_model_view(VirtualDeviceContext, 'bulk_delete', path='delete', detail=False)
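# Every bulk-rename view registered above now follows the same shape; a minimal
# sketch of the pattern using a hypothetical Widget model and WidgetFilterSet:
#
#     @register_model_view(Widget, 'bulk_rename', path='rename', detail=False)
#     class WidgetBulkRenameView(generic.BulkRenameView):
#         queryset = Widget.objects.all()
#         filterset = filtersets.WidgetFilterSet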


@@ -26,6 +26,7 @@ class CustomFieldChoiceSetSerializer(ChangeLogMessageSerializer, ValidatedModelS
max_length=2
)
)
choices_count = serializers.IntegerField(read_only=True)
class Meta:
model = CustomFieldChoiceSet


@@ -5,6 +5,7 @@ from rest_framework import serializers
from core.api.serializers_.jobs import JobSerializer
from extras.models import Script
from netbox.api.serializers import ValidatedModelSerializer
from utilities.datetime import local_now
__all__ = (
'ScriptDetailSerializer',
@@ -66,11 +67,31 @@ class ScriptInputSerializer(serializers.Serializer):
interval = serializers.IntegerField(required=False, allow_null=True)
def validate_schedule_at(self, value):
if value and not self.context['script'].python_class.scheduling_enabled:
raise serializers.ValidationError(_("Scheduling is not enabled for this script."))
"""
Validates the specified schedule time for a script execution.
"""
if value:
if not self.context['script'].python_class.scheduling_enabled:
raise serializers.ValidationError(_('Scheduling is not enabled for this script.'))
if value < local_now():
raise serializers.ValidationError(_('Scheduled time must be in the future.'))
return value
def validate_interval(self, value):
"""
Validates the provided interval based on the script's scheduling configuration.
"""
if value and not self.context['script'].python_class.scheduling_enabled:
raise serializers.ValidationError(_("Scheduling is not enabled for this script."))
raise serializers.ValidationError(_('Scheduling is not enabled for this script.'))
return value
def validate(self, data):
"""
Validates the given data and ensures the necessary fields are populated.
"""
# Set the schedule_at time to now if only an interval is provided
# while handling the case where schedule_at is null.
if data.get('interval') and not data.get('schedule_at'):
data['schedule_at'] = local_now()
return super().validate(data)
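# A minimal sketch of how the validators above behave, assuming a `script` whose
# python_class has scheduling_enabled = True:
#
#     serializer = ScriptInputSerializer(
#         data={'data': {}, 'commit': True, 'interval': 60},
#         context={'script': script},
#     )
#     if serializer.is_valid():
#         # validate() backfilled schedule_at with local_now()
#         assert serializer.validated_data['schedule_at'] is not None
#
# A schedule_at in the past, or any scheduling input when scheduling_enabled is
# False, is rejected with a field-level ValidationError instead.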


@@ -76,11 +76,11 @@ class CustomFieldBulkEditForm(ChangelogMessageMixin, BulkEditForm):
required=False,
widget=BulkEditNullBooleanSelect()
)
validation_minimum = forms.IntegerField(
validation_minimum = forms.DecimalField(
label=_('Minimum value'),
required=False,
)
validation_maximum = forms.IntegerField(
validation_maximum = forms.DecimalField(
label=_('Maximum value'),
required=False,
)


@@ -103,11 +103,11 @@ class CustomFieldFilterForm(SavedFiltersMixin, FilterForm):
choices=BOOLEAN_WITH_BLANK_CHOICES
)
)
validation_minimum = forms.IntegerField(
validation_minimum = forms.DecimalField(
label=_('Minimum value'),
required=False
)
validation_maximum = forms.IntegerField(
validation_maximum = forms.DecimalField(
label=_('Maximum value'),
required=False
)


@@ -17,7 +17,7 @@ if TYPE_CHECKING:
)
from tenancy.graphql.filters import TenantFilter, TenantGroupFilter
from netbox.graphql.enums import ColorEnum
from netbox.graphql.filter_lookups import IntegerLookup, JSONFilter, StringArrayLookup, TreeNodeFilter
from netbox.graphql.filter_lookups import FloatLookup, IntegerLookup, JSONFilter, StringArrayLookup, TreeNodeFilter
from users.graphql.filters import GroupFilter, UserFilter
from virtualization.graphql.filters import ClusterFilter, ClusterGroupFilter, ClusterTypeFilter
from .enums import *
@@ -43,12 +43,12 @@ __all__ = (
@strawberry_django.filter_type(models.ConfigContext, lookups=True)
class ConfigContextFilter(BaseObjectTypeFilterMixin, SyncedDataFilterMixin, ChangeLogFilterMixin):
name: FilterLookup[str] = strawberry_django.filter_field()
name: FilterLookup[str] | None = strawberry_django.filter_field()
weight: Annotated['IntegerLookup', strawberry.lazy('netbox.graphql.filter_lookups')] | None = (
strawberry_django.filter_field()
)
description: FilterLookup[str] = strawberry_django.filter_field()
is_active: FilterLookup[bool] = strawberry_django.filter_field()
description: FilterLookup[str] | None = strawberry_django.filter_field()
is_active: FilterLookup[bool] | None = strawberry_django.filter_field()
regions: Annotated['RegionFilter', strawberry.lazy('dcim.graphql.filters')] | None = (
strawberry_django.filter_field()
)
@@ -151,10 +151,10 @@ class CustomFieldFilter(BaseObjectTypeFilterMixin, ChangeLogFilterMixin):
weight: Annotated['IntegerLookup', strawberry.lazy('netbox.graphql.filter_lookups')] | None = (
strawberry_django.filter_field()
)
validation_minimum: Annotated['IntegerLookup', strawberry.lazy('netbox.graphql.filter_lookups')] | None = (
validation_minimum: Annotated['FloatLookup', strawberry.lazy('netbox.graphql.filter_lookups')] | None = (
strawberry_django.filter_field()
)
validation_maximum: Annotated['IntegerLookup', strawberry.lazy('netbox.graphql.filter_lookups')] | None = (
validation_maximum: Annotated['FloatLookup', strawberry.lazy('netbox.graphql.filter_lookups')] | None = (
strawberry_django.filter_field()
)
validation_regex: FilterLookup[str] | None = strawberry_django.filter_field()


@@ -2,6 +2,7 @@ from typing import TYPE_CHECKING, Annotated, List
import strawberry
import strawberry_django
from strawberry.types import Info
__all__ = (
'ConfigContextMixin',
@@ -37,7 +38,7 @@ class CustomFieldsMixin:
class ImageAttachmentsMixin:
@strawberry_django.field
def image_attachments(self, info) -> List[Annotated["ImageAttachmentType", strawberry.lazy('.types')]]:
def image_attachments(self, info: Info) -> List[Annotated['ImageAttachmentType', strawberry.lazy('.types')]]:
return self.images.restrict(info.context.request.user, 'view')
@@ -45,17 +46,17 @@ class ImageAttachmentsMixin:
class JournalEntriesMixin:
@strawberry_django.field
def journal_entries(self, info) -> List[Annotated["JournalEntryType", strawberry.lazy('.types')]]:
def journal_entries(self, info: Info) -> List[Annotated['JournalEntryType', strawberry.lazy('.types')]]:
return self.journal_entries.all()
@strawberry.type
class TagsMixin:
tags: List[Annotated["TagType", strawberry.lazy('.types')]]
tags: List[Annotated['TagType', strawberry.lazy('.types')]]
@strawberry.type
class ContactsMixin:
contacts: List[Annotated["ContactAssignmentType", strawberry.lazy('tenancy.graphql.types')]]
contacts: List[Annotated['ContactAssignmentType', strawberry.lazy('tenancy.graphql.types')]]


@@ -106,7 +106,7 @@ class ScriptJob(JobRunner):
# Add the current request as a property of the script
script.request = request
self.logger.debug(f"Request ID: {request.id}")
self.logger.debug(f"Request ID: {request.id if request else None}")
# Execute the script. If commit is True, wrap it with the event_tracking context manager to ensure we process
# change logging, event rules, etc.


@@ -1,9 +1,39 @@
from django.contrib.postgres.fields import ArrayField
from django.contrib.postgres.fields.ranges import RangeField
from django.db.models import CharField, JSONField, Lookup
from django.db.models.fields.json import KeyTextTransform
from .fields import CachedValueField
class RangeContains(Lookup):
"""
Filter ArrayField(RangeField) columns where ANY element-range contains the scalar RHS.
Usage (ORM):
Model.objects.filter(<range_array_field>__range_contains=<scalar>)
Works with int4range[], int8range[], daterange[], tstzrange[], etc.
"""
lookup_name = 'range_contains'
def as_sql(self, compiler, connection):
# Compile LHS (the array-of-ranges column/expression) and RHS (scalar)
lhs, lhs_params = self.process_lhs(compiler, connection)
rhs, rhs_params = self.process_rhs(compiler, connection)
# Guard: only allow ArrayField whose base_field is a PostgreSQL RangeField
field = getattr(self.lhs, 'output_field', None)
if not (isinstance(field, ArrayField) and isinstance(field.base_field, RangeField)):
raise TypeError('range_contains is only valid for ArrayField(RangeField) columns')
# Range-contains-element using EXISTS + UNNEST keeps the range on the LHS: r @> value
sql = f"EXISTS (SELECT 1 FROM unnest({lhs}) AS r WHERE r @> {rhs})"
params = lhs_params + rhs_params
return sql, params
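# Usage sketch (model and field are hypothetical; any ArrayField whose base_field
# is a PostgreSQL RangeField satisfies the guard above):
#
#     class MaintenanceWindow(models.Model):
#         slots = ArrayField(DateTimeRangeField())
#
#     MaintenanceWindow.objects.filter(slots__range_contains=timezone.now())
#
# which compiles to roughly:
#     ... WHERE EXISTS (SELECT 1 FROM unnest("maintenancewindow"."slots") AS r WHERE r @> %s)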
class Empty(Lookup):
"""
Filter on whether a string is empty.
@@ -25,7 +55,7 @@ class JSONEmpty(Lookup):
A key is considered empty if it is "", null, or does not exist.
"""
lookup_name = "empty"
lookup_name = 'empty'
def as_sql(self, compiler, connection):
# self.lhs.lhs is the parent expression (could be a JSONField or another KeyTransform)
@@ -69,6 +99,7 @@ class NetContainsOrEquals(Lookup):
return 'CAST(%s AS INET) >>= %s' % (lhs, rhs), params
ArrayField.register_lookup(RangeContains)
CharField.register_lookup(Empty)
JSONField.register_lookup(JSONEmpty)
CachedValueField.register_lookup(NetHost)


@@ -0,0 +1,21 @@
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('extras', '0132_configcontextprofile'),
]
operations = [
migrations.AlterField(
model_name='customfield',
name='validation_maximum',
field=models.DecimalField(blank=True, decimal_places=4, max_digits=16, null=True),
),
migrations.AlterField(
model_name='customfield',
name='validation_minimum',
field=models.DecimalField(blank=True, decimal_places=4, max_digits=16, null=True),
),
]
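# Illustration of what the wider DecimalField columns allow once this migration is
# applied (field name and limits are hypothetical):
#
#     CustomField.objects.create(
#         name='ambient_temp',
#         type=CustomFieldTypeChoices.TYPE_DECIMAL,
#         validation_minimum=Decimal('-40.5'),
#         validation_maximum=Decimal('125.25'),
#     )
#
# Fractional bounds like these could not be stored in the previous BigIntegerField columns.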


@@ -174,13 +174,17 @@ class CustomField(CloningMixin, ExportTemplatesMixin, ChangeLoggedModel):
verbose_name=_('display weight'),
help_text=_('Fields with higher weights appear lower in a form.')
)
validation_minimum = models.BigIntegerField(
validation_minimum = models.DecimalField(
max_digits=16,
decimal_places=4,
blank=True,
null=True,
verbose_name=_('minimum value'),
help_text=_('Minimum allowed value (for numeric fields)')
)
validation_maximum = models.BigIntegerField(
validation_maximum = models.DecimalField(
max_digits=16,
decimal_places=4,
blank=True,
null=True,
verbose_name=_('maximum value'),
@@ -471,7 +475,7 @@ class CustomField(CloningMixin, ExportTemplatesMixin, ChangeLoggedModel):
field = forms.DecimalField(
required=required,
initial=initial,
max_digits=12,
max_digits=16,
decimal_places=4,
min_value=self.validation_minimum,
max_value=self.validation_maximum
@@ -531,10 +535,19 @@ class CustomField(CloningMixin, ExportTemplatesMixin, ChangeLoggedModel):
# URL
elif self.type == CustomFieldTypeChoices.TYPE_URL:
field = LaxURLField(assume_scheme='https', required=required, initial=initial)
if self.validation_regex:
field.validators = [
RegexValidator(
regex=self.validation_regex,
message=mark_safe(_("Values must match this regex: <code>{regex}</code>").format(
regex=escape(self.validation_regex)
))
)
]
# JSON
elif self.type == CustomFieldTypeChoices.TYPE_JSON:
field = JSONField(required=required, initial=json.dumps(initial) if initial else None)
field = JSONField(required=required, initial=json.dumps(initial) if initial is not None else None)
# Object
elif self.type == CustomFieldTypeChoices.TYPE_OBJECT:
@@ -680,6 +693,13 @@ class CustomField(CloningMixin, ExportTemplatesMixin, ChangeLoggedModel):
if self.validation_regex and not re.match(self.validation_regex, value):
raise ValidationError(_("Value must match regex '{regex}'").format(regex=self.validation_regex))
# Validate URL field
elif self.type == CustomFieldTypeChoices.TYPE_URL:
if type(value) is not str:
raise ValidationError(_("Value must be a string."))
if self.validation_regex and not re.match(self.validation_regex, value):
raise ValidationError(_("Value must match regex '{regex}'").format(regex=self.validation_regex))
# Validate integer
elif self.type == CustomFieldTypeChoices.TYPE_INTEGER:
if type(value) is not int:


@@ -1,6 +1,6 @@
import json
import os
import urllib.parse
from pathlib import Path
from django.conf import settings
from django.contrib.contenttypes.fields import GenericForeignKey, GenericRelation
@@ -728,7 +728,9 @@ class ImageAttachment(ChangeLoggedModel):
@property
def filename(self):
return os.path.basename(self.image.name).split('_', 2)[2]
base_name = Path(self.image.name).name
prefix = f"{self.object_type.model}_{self.object_id}_"
return base_name.removeprefix(prefix)
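# e.g. an image stored as "image-attachments/rack_12_My_File.png" on rack 12 yields
# "My_File.png", while a mismatched prefix ("rack_13_other.png") or a legacy nested
# path simply falls back to the basename.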
@property
def html_tag(self):


@@ -22,9 +22,10 @@ class ConfigContextQuerySet(RestrictedQuerySet):
aggregate_data: If True, use the JSONBAgg aggregate function to return only the list of JSON data objects
"""
# Device type and location assignment is relevant only for Devices
# Device type and location assignment are relevant only for Devices
device_type = getattr(obj, 'device_type', None)
location = getattr(obj, 'location', None)
locations = location.get_ancestors(include_self=True) if location else []
# Get assigned cluster, group, and type (if any)
cluster = getattr(obj, 'cluster', None)
@@ -49,7 +50,7 @@ class ConfigContextQuerySet(RestrictedQuerySet):
Q(regions__in=regions) | Q(regions=None),
Q(site_groups__in=sitegroups) | Q(site_groups=None),
Q(sites=obj.site) | Q(sites=None),
Q(locations=location) | Q(locations=None),
Q(locations__in=locations) | Q(locations=None),
Q(device_types=device_type) | Q(device_types=None),
Q(roles__in=device_roles) | Q(roles=None),
Q(platforms=obj.platform) | Q(platforms=None),
@@ -89,10 +90,10 @@ class ConfigContextModelQuerySet(RestrictedQuerySet):
ConfigContext.objects.filter(
self._get_config_context_filters()
).annotate(
_data=EmptyGroupByJSONBAgg('data', ordering=['weight', 'name'])
_data=EmptyGroupByJSONBAgg('data', order_by=['weight', 'name'])
).values("_data").order_by()
)
).distinct()
)
def _get_config_context_filters(self):
# Construct the set of Q objects for the specific object types
@@ -116,7 +117,7 @@ class ConfigContextModelQuerySet(RestrictedQuerySet):
).values_list(
'tag_id',
flat=True
)
).distinct()
)
) | Q(tags=None),
is_active=True,
@@ -124,7 +125,15 @@ class ConfigContextModelQuerySet(RestrictedQuerySet):
# Apply Location & DeviceType filters only for VirtualMachines
if self.model._meta.model_name == 'device':
base_query.add((Q(locations=OuterRef('location')) | Q(locations=None)), Q.AND)
base_query.add(
(Q(
locations__tree_id=OuterRef('location__tree_id'),
locations__level__lte=OuterRef('location__level'),
locations__lft__lte=OuterRef('location__lft'),
locations__rght__gte=OuterRef('location__rght'),
) | Q(locations=None)),
Q.AND
)
base_query.add((Q(device_types=OuterRef('device_type')) | Q(device_types=None)), Q.AND)
elif self.model._meta.model_name == 'virtualmachine':
base_query.add(Q(locations=None), Q.AND)
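# Sketch of the behaviour the ancestor matching above enables (object names are
# hypothetical; site, device_type and role are assumed to exist already):
#
#     floor = Location.objects.create(site=site, name='Floor 1', slug='floor-1')
#     row = Location.objects.create(site=site, name='Row 1', slug='row-1', parent=floor)
#     ctx = ConfigContext.objects.create(name='floor-ctx', data={'tier': 'edge'})
#     ctx.locations.set([floor])
#     device = Device.objects.create(
#         name='dev1', site=site, location=row, device_type=device_type, role=role,
#     )
#     assert 'tier' in device.get_config_context()  # inherited from the parent location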


@@ -329,6 +329,9 @@ class BaseScript:
# Declare the placeholder for the current request
self.request = None
# Initiate the storage backend (local, S3, etc) as a class attr
self.storage = storages.create_storage(storages.backends["scripts"])
# Compile test methods and initialize results skeleton
for method in dir(self):
if method.startswith('test_') and callable(getattr(self, method)):
@@ -394,8 +397,7 @@ class BaseScript:
return inspect.getfile(self.__class__)
def findsource(self, object):
storage = storages.create_storage(storages.backends["scripts"])
with storage.open(os.path.basename(self.filename), 'r') as f:
with self.storage.open(os.path.basename(self.filename), 'r') as f:
data = f.read()
# Break the source code into lines


@@ -725,8 +725,9 @@ class ScriptResultsTable(BaseTable):
index = tables.Column(
verbose_name=_('Line')
)
time = tables.Column(
verbose_name=_('Time')
time = columns.DateTimeColumn(
verbose_name=_('Time'),
timespec='seconds'
)
status = tables.TemplateColumn(
template_code="""{% load log_levels %}{% log_level record.status %}""",


@@ -3,6 +3,7 @@ import datetime
from django.contrib.contenttypes.models import ContentType
from django.urls import reverse
from django.utils.timezone import make_aware, now
from rest_framework import status
from core.choices import ManagedFileRootPathChoices
from core.events import *
@@ -858,16 +859,16 @@ class ConfigTemplateTest(APIViewTestCases.APIViewTestCase):
class ScriptTest(APITestCase):
class TestScriptClass(PythonClass):
class Meta:
name = "Test script"
name = 'Test script'
commit = True
scheduling_enabled = True
var1 = StringVar()
var2 = IntegerVar()
var3 = BooleanVar()
def run(self, data, commit=True):
self.log_info(data['var1'])
self.log_success(data['var2'])
self.log_failure(data['var3'])
@@ -878,14 +879,16 @@ class ScriptTest(APITestCase):
def setUpTestData(cls):
module = ScriptModule.objects.create(
file_root=ManagedFileRootPathChoices.SCRIPTS,
file_path='/var/tmp/script.py'
file_path='script.py',
)
Script.objects.create(
script = Script.objects.create(
module=module,
name="Test script",
name='Test script',
is_executable=True,
)
cls.url = reverse('extras-api:script-detail', kwargs={'pk': script.pk})
@property
def python_class(self):
return self.TestScriptClass
@@ -898,7 +901,7 @@ class ScriptTest(APITestCase):
def test_get_script(self):
module = ScriptModule.objects.get(
file_root=ManagedFileRootPathChoices.SCRIPTS,
file_path='/var/tmp/script.py'
file_path='script.py',
)
script = module.scripts.all().first()
url = reverse('extras-api:script-detail', kwargs={'pk': script.pk})
@@ -909,6 +912,76 @@ class ScriptTest(APITestCase):
self.assertEqual(response.data['vars']['var2'], 'IntegerVar')
self.assertEqual(response.data['vars']['var3'], 'BooleanVar')
def test_schedule_script_past_time_rejected(self):
"""
Scheduling with past schedule_at should fail.
"""
self.add_permissions('extras.run_script')
payload = {
'data': {'var1': 'hello', 'var2': 1, 'var3': False},
'commit': True,
'schedule_at': now() - datetime.timedelta(hours=1),
}
response = self.client.post(self.url, payload, format='json', **self.header)
self.assertHttpStatus(response, status.HTTP_400_BAD_REQUEST)
self.assertIn('schedule_at', response.data)
# Be tolerant of exact wording but ensure we failed on schedule_at being in the past
self.assertIn('future', str(response.data['schedule_at']).lower())
def test_schedule_script_interval_only(self):
"""
Interval without schedule_at should auto-set schedule_at to the current time.
"""
self.add_permissions('extras.run_script')
payload = {
'data': {'var1': 'hello', 'var2': 1, 'var3': False},
'commit': True,
'interval': 60,
}
response = self.client.post(self.url, payload, format='json', **self.header)
self.assertHttpStatus(response, status.HTTP_200_OK)
# The latest job is returned in the script detail serializer under "result"
self.assertIn('result', response.data)
self.assertEqual(response.data['result']['interval'], 60)
# Ensure a start time was autopopulated
self.assertIsNotNone(response.data['result']['scheduled'])
def test_schedule_script_when_disabled(self):
"""
Scheduling should fail when script.scheduling_enabled=False.
"""
self.add_permissions('extras.run_script')
# Temporarily disable scheduling on the in-test Python class
original = getattr(self.TestScriptClass.Meta, 'scheduling_enabled', True)
self.TestScriptClass.Meta.scheduling_enabled = False
base = {
'data': {'var1': 'hello', 'var2': 1, 'var3': False},
'commit': True,
}
# Check both schedule_at and interval paths
cases = [
{**base, 'schedule_at': now() + datetime.timedelta(minutes=5)},
{**base, 'interval': 60},
]
try:
for case in cases:
with self.subTest(case=list(case.keys())):
response = self.client.post(self.url, case, format='json', **self.header)
self.assertHttpStatus(response, status.HTTP_400_BAD_REQUEST)
# Error should be attached to whichever field we used
key = 'schedule_at' if 'schedule_at' in case else 'interval'
self.assertIn(key, response.data)
self.assertIn('scheduling is not enabled', str(response.data[key]).lower())
finally:
# Restore the original setting for other tests
self.TestScriptClass.Meta.scheduling_enabled = original
class CreatedUpdatedFilterTest(APITestCase):


@@ -1,7 +1,9 @@
import datetime
import json
from decimal import Decimal
from django.core.exceptions import ValidationError
from django.test import tag
from django.urls import reverse
from rest_framework import status
@@ -269,6 +271,60 @@ class CustomFieldTest(TestCase):
instance.refresh_from_db()
self.assertIsNone(instance.custom_field_data.get(cf.name))
@tag('regression')
def test_json_field_falsy_defaults(self):
"""Test that falsy JSON default values are properly handled"""
falsy_test_cases = [
({}, 'empty_dict'),
([], 'empty_array'),
(0, 'zero'),
(False, 'false_bool'),
("", 'empty_string'),
]
for default, suffix in falsy_test_cases:
with self.subTest(default=default, suffix=suffix):
cf = CustomField.objects.create(
name=f'json_falsy_{suffix}',
type=CustomFieldTypeChoices.TYPE_JSON,
default=default,
required=False
)
cf.object_types.set([self.object_type])
instance = Site.objects.create(name=f'Test Site {suffix}', slug=f'test-site-{suffix}')
self.assertIsNotNone(instance.custom_field_data)
self.assertIn(cf.name, instance.custom_field_data)
instance.refresh_from_db()
stored = instance.custom_field_data[cf.name]
self.assertEqual(stored, default)
@tag('regression')
def test_json_field_falsy_to_form_field(self):
"""Test form field generation preserves falsy defaults"""
falsy_test_cases = (
({}, json.dumps({}), 'empty_dict'),
([], json.dumps([]), 'empty_array'),
(0, json.dumps(0), 'zero'),
(False, json.dumps(False), 'false_bool'),
("", '""', 'empty_string'),
)
for default, expected, suffix in falsy_test_cases:
with self.subTest(default=default, expected=expected, suffix=suffix):
cf = CustomField.objects.create(
name=f'json_falsy_{suffix}',
type=CustomFieldTypeChoices.TYPE_JSON,
default=default,
required=False
)
cf.object_types.set([self.object_type])
form_field = cf.to_form_field(set_initial=True)
self.assertEqual(form_field.initial, expected)
def test_select_field(self):
CHOICES = (
('a', 'Option A'),
@@ -1244,6 +1300,28 @@ class CustomFieldAPITest(APITestCase):
response = self.client.patch(url, data, format='json', **self.header)
self.assertHttpStatus(response, status.HTTP_200_OK)
def test_url_regex_validation(self):
"""
Test that validation_regex is applied to URL custom fields (fixes #20498).
"""
site2 = Site.objects.get(name='Site 2')
url = reverse('dcim-api:site-detail', kwargs={'pk': site2.pk})
self.add_permissions('dcim.change_site')
cf_url = CustomField.objects.get(name='url_field')
cf_url.validation_regex = r'^https://' # Require HTTPS
cf_url.save()
# Test invalid URL (http instead of https)
data = {'custom_fields': {'url_field': 'http://example.com'}}
response = self.client.patch(url, data, format='json', **self.header)
self.assertHttpStatus(response, status.HTTP_400_BAD_REQUEST)
# Test valid URL (https)
data = {'custom_fields': {'url_field': 'https://example.com'}}
response = self.client.patch(url, data, format='json', **self.header)
self.assertHttpStatus(response, status.HTTP_200_OK)
def test_uniqueness_validation(self):
# Create a unique custom field
cf_text = CustomField.objects.get(name='text_field')


@@ -1,17 +1,95 @@
import tempfile
from pathlib import Path
from django.contrib.contenttypes.models import ContentType
from django.core.files.uploadedfile import SimpleUploadedFile
from django.forms import ValidationError
from django.test import tag, TestCase
from core.models import DataSource, ObjectType
from dcim.models import Device, DeviceRole, DeviceType, Location, Manufacturer, Platform, Region, Site, SiteGroup
from extras.models import ConfigContext, ConfigContextProfile, ConfigTemplate, Tag
from extras.models import ConfigContext, ConfigContextProfile, ConfigTemplate, ImageAttachment, Tag, TaggedItem
from tenancy.models import Tenant, TenantGroup
from utilities.exceptions import AbortRequest
from virtualization.models import Cluster, ClusterGroup, ClusterType, VirtualMachine
class ImageAttachmentTests(TestCase):
@classmethod
def setUpTestData(cls):
cls.ct_rack = ContentType.objects.get(app_label='dcim', model='rack')
cls.image_content = b''
def _stub_image_attachment(self, object_id, image_filename, name=None):
"""
Creates an instance of ImageAttachment with the provided object_id and image_filename.
This method prepares a stubbed image attachment to test functionalities that
require an ImageAttachment object.
The function initializes the attachment with a specified file name and
pre-defined image content.
"""
ia = ImageAttachment(
object_type=self.ct_rack,
object_id=object_id,
name=name,
image=SimpleUploadedFile(
name=image_filename,
content=self.image_content,
content_type='image/jpeg',
),
)
return ia
def test_filename_strips_expected_prefix(self):
"""
Tests that the filename of the image attachment is stripped of the expected
prefix.
"""
ia = self._stub_image_attachment(12, 'image-attachments/rack_12_My_File.png')
self.assertEqual(ia.filename, 'My_File.png')
def test_filename_legacy_nested_path_returns_basename(self):
"""
Tests if the filename of a legacy-nested path correctly returns only the basename.
"""
# e.g. "image-attachments/rack_12_5/31/23.jpg" -> "23.jpg"
ia = self._stub_image_attachment(12, 'image-attachments/rack_12_5/31/23.jpg')
self.assertEqual(ia.filename, '23.jpg')
def test_filename_no_prefix_returns_basename(self):
"""
Tests that the filename property correctly returns the basename for an image
attachment that has no leading prefix in its path.
"""
ia = self._stub_image_attachment(42, 'image-attachments/just_name.webp')
self.assertEqual(ia.filename, 'just_name.webp')
def test_mismatched_prefix_is_not_stripped(self):
"""
Tests that a mismatched prefix in the filename is not stripped.
"""
# Prefix does not match object_id -> leave as-is (basename only)
ia = self._stub_image_attachment(12, 'image-attachments/rack_13_other.png')
self.assertEqual('rack_13_other.png', ia.filename)
def test_str_uses_name_when_present(self):
"""
Tests that the `str` representation of the object uses the
`name` attribute when provided.
"""
ia = self._stub_image_attachment(12, 'image-attachments/rack_12_file.png', name='Human title')
self.assertEqual('Human title', str(ia))
def test_str_falls_back_to_filename(self):
"""
Tests that the `str` representation of the object falls back to
the filename if the name attribute is not set.
"""
ia = self._stub_image_attachment(12, 'image-attachments/rack_12_file.png', name='')
self.assertEqual('file.png', str(ia))
class TagTest(TestCase):
def test_default_ordering_weight_then_name_is_set(self):
@@ -445,7 +523,7 @@ class ConfigContextTest(TestCase):
vm1 = VirtualMachine.objects.create(name="VM 1", site=site, role=vm_role)
vm2 = VirtualMachine.objects.create(name="VM 2", cluster=cluster, role=vm_role)
# Check that their individually-rendered config contexts are identical
# Check that their individually rendered config contexts are identical
self.assertEqual(
vm1.get_config_context(),
vm2.get_config_context()
@@ -458,11 +536,39 @@ class ConfigContextTest(TestCase):
vms[1].get_config_context()
)
def test_valid_local_context_data(self):
device = Device.objects.first()
device.local_context_data = None
device.clean()
device.local_context_data = {"foo": "bar"}
device.clean()
def test_invalid_local_context_data(self):
device = Device.objects.first()
device.local_context_data = ""
with self.assertRaises(ValidationError):
device.clean()
device.local_context_data = 0
with self.assertRaises(ValidationError):
device.clean()
device.local_context_data = False
with self.assertRaises(ValidationError):
device.clean()
device.local_context_data = 'foo'
with self.assertRaises(ValidationError):
device.clean()
@tag('regression')
def test_multiple_tags_return_distinct_objects(self):
"""
Tagged items use a generic relationship, which results in duplicate rows being returned when queried.
This is combated by appending distinct() to the config context querysets. This test creates a config
context assigned to two tags and ensures objects related by those same two tags result in only a single
context assigned to two tags and ensures objects related to those same two tags result in only a single
config context record being returned.
See https://github.com/netbox-community/netbox/issues/5314
@@ -495,14 +601,15 @@ class ConfigContextTest(TestCase):
self.assertEqual(ConfigContext.objects.get_for_object(device).count(), 1)
self.assertEqual(device.get_config_context(), annotated_queryset[0].get_config_context())
def test_multiple_tags_return_distinct_objects_with_seperate_config_contexts(self):
@tag('regression')
def test_multiple_tags_return_distinct_objects_with_separate_config_contexts(self):
"""
Tagged items use a generic relationship, which results in duplicate rows being returned when queried.
This is combatted by by appending distinct() to the config context querysets. This test creates a config
context assigned to two tags and ensures objects related by those same two tags result in only a single
This is combated by appending distinct() to the config context querysets. This test creates a config
context assigned to two tags and ensures objects related to those same two tags result in only a single
config context record being returned.
This test case is seperate from the above in that it deals with multiple config context objects in play.
This test case is separate from the above in that it deals with multiple config context objects in play.
See https://github.com/netbox-community/netbox/issues/5387
"""
@@ -543,32 +650,47 @@ class ConfigContextTest(TestCase):
self.assertEqual(ConfigContext.objects.get_for_object(device).count(), 2)
self.assertEqual(device.get_config_context(), annotated_queryset[0].get_config_context())
def test_valid_local_context_data(self):
device = Device.objects.first()
device.local_context_data = None
device.clean()
device.local_context_data = {"foo": "bar"}
device.clean()
def test_invalid_local_context_data(self):
device = Device.objects.first()
device.local_context_data = ""
with self.assertRaises(ValidationError):
device.clean()
device.local_context_data = 0
with self.assertRaises(ValidationError):
device.clean()
device.local_context_data = False
with self.assertRaises(ValidationError):
device.clean()
device.local_context_data = 'foo'
with self.assertRaises(ValidationError):
device.clean()
@tag('performance', 'regression')
def test_config_context_annotation_query_optimization(self):
"""
Regression test for issue #20327: Ensure config context annotation
doesn't use expensive DISTINCT on main query.
Verifies that DISTINCT is only used in tag subquery where needed,
not on the main device query which is expensive for large datasets.
"""
device = Device.objects.first()
queryset = Device.objects.filter(pk=device.pk).annotate_config_context_data()
# Main device query should NOT use DISTINCT
self.assertFalse(queryset.query.distinct)
# Check that tag subqueries DO use DISTINCT by inspecting the annotation
config_annotation = queryset.query.annotations.get('config_context_data')
self.assertIsNotNone(config_annotation)
def find_tag_subqueries(where_node):
"""Find subqueries in WHERE clause that relate to tag filtering"""
subqueries = []
def traverse(node):
if hasattr(node, 'children'):
for child in node.children:
try:
if child.rhs.query.model is TaggedItem:
subqueries.append(child.rhs.query)
except AttributeError:
traverse(child)
traverse(where_node)
return subqueries
# Find subqueries in the WHERE clause that should have DISTINCT
tag_subqueries = find_tag_subqueries(config_annotation.query.where)
distinct_subqueries = [sq for sq in tag_subqueries if sq.distinct]
# Verify we found at least one DISTINCT subquery for tags
self.assertEqual(len(distinct_subqueries), 1)
self.assertTrue(distinct_subqueries[0].distinct)
class ConfigTemplateTest(TestCase):


@@ -1,7 +1,10 @@
from types import SimpleNamespace
from django.contrib.contenttypes.models import ContentType
from django.test import TestCase
from extras.models import ExportTemplate
from extras.utils import filename_from_model
from extras.utils import filename_from_model, image_upload
from tenancy.models import ContactGroup, TenantGroup
from wireless.models import WirelessLANGroup
@@ -17,3 +20,141 @@ class FilenameFromModelTests(TestCase):
for model, expected in cases:
self.assertEqual(filename_from_model(model), expected)
class ImageUploadTests(TestCase):
@classmethod
def setUpTestData(cls):
# We only need a ContentType with model="rack" for the prefix;
# this doesn't require creating a Rack object.
cls.ct_rack = ContentType.objects.get(app_label='dcim', model='rack')
def _stub_instance(self, object_id=12, name=None):
"""
Creates a minimal stub for use with the `image_upload()` function.
This method generates an instance of `SimpleNamespace` containing a set
of attributes required to simulate the expected input for the
`image_upload()` method.
It is designed to simplify testing or processing by providing a
lightweight representation of an object.
"""
return SimpleNamespace(object_type=self.ct_rack, object_id=object_id, name=name)
def _second_segment(self, path: str):
"""
Extracts and returns the portion of the input string after the
first '/' character.
"""
return path.split('/', 1)[1]
def test_windows_fake_path_and_extension_lowercased(self):
"""
Tests handling of a Windows file path with a fake directory and extension.
"""
inst = self._stub_instance(name=None)
path = image_upload(inst, r'C:\fake_path\MyPhoto.JPG')
# Base directory and single-level path
seg2 = self._second_segment(path)
self.assertTrue(path.startswith('image-attachments/rack_12_'))
self.assertNotIn('/', seg2, 'should not create nested directories')
# Extension from the uploaded file, lowercased
self.assertTrue(seg2.endswith('.jpg'))
def test_name_with_slashes_is_flattened_no_subdirectories(self):
"""
Tests that a name with slashes is flattened and does not
create subdirectories.
"""
inst = self._stub_instance(name='5/31/23')
path = image_upload(inst, 'image.png')
seg2 = self._second_segment(path)
self.assertTrue(seg2.startswith('rack_12_'))
self.assertNotIn('/', seg2)
self.assertNotIn('\\', seg2)
self.assertTrue(seg2.endswith('.png'))
def test_name_with_backslashes_is_flattened_no_subdirectories(self):
"""
Tests that a name including backslashes is correctly flattened
into a single directory name without creating subdirectories.
"""
inst = self._stub_instance(name=r'5\31\23')
path = image_upload(inst, 'image_name.png')
seg2 = self._second_segment(path)
self.assertTrue(seg2.startswith('rack_12_'))
self.assertNotIn('/', seg2)
self.assertNotIn('\\', seg2)
self.assertTrue(seg2.endswith('.png'))
def test_prefix_format_is_as_expected(self):
"""
Tests the output path format generated by the `image_upload` function.
"""
inst = self._stub_instance(object_id=99, name='label')
path = image_upload(inst, 'a.webp')
# The second segment must begin with "rack_99_"
seg2 = self._second_segment(path)
self.assertTrue(seg2.startswith('rack_99_'))
self.assertTrue(seg2.endswith('.webp'))
def test_unsupported_file_extension(self):
"""
Test that when the file extension is not allowed, the extension
is omitted.
"""
inst = self._stub_instance(name='test')
path = image_upload(inst, 'document.txt')
seg2 = self._second_segment(path)
self.assertTrue(seg2.startswith('rack_12_test'))
self.assertFalse(seg2.endswith('.txt'))
# When not allowed, no extension should be appended
self.assertNotRegex(seg2, r'\.txt$')
def test_instance_name_with_whitespace_and_special_chars(self):
"""
Test that an instance name with leading/trailing whitespace and
special characters is sanitized properly.
"""
# Suppose the instance name has surrounding whitespace and
# extra slashes.
inst = self._stub_instance(name=' my/complex\\name ')
path = image_upload(inst, 'irrelevant.png')
# The output should be flattened and sanitized.
# We expect the name to be transformed into a valid filename without
# path separators.
seg2 = self._second_segment(path)
self.assertNotIn(' ', seg2)
self.assertNotIn('/', seg2)
self.assertNotIn('\\', seg2)
self.assertTrue(seg2.endswith('.png'))
def test_separator_variants_with_subTest(self):
"""
Tests that both forward slash and backslash in file paths are
handled consistently by the `image_upload` function and
processed into a sanitized uniform format.
"""
for name in ['2025/09/12', r'2025\09\12']:
with self.subTest(name=name):
inst = self._stub_instance(name=name)
path = image_upload(inst, 'x.jpeg')
seg2 = self._second_segment(path)
self.assertTrue(seg2.startswith('rack_12_'))
self.assertNotIn('/', seg2)
self.assertNotIn('\\', seg2)
self.assertTrue(seg2.endswith('.jpeg') or seg2.endswith('.jpg'))
def test_fallback_on_suspicious_file_operation(self):
"""
Test that when default_storage.get_valid_name() raises a
SuspiciousFileOperation, the fallback default is used.
"""
inst = self._stub_instance(name=' ')
path = image_upload(inst, 'sample.png')
# Expect the fallback name 'unnamed' to be used.
self.assertIn('unnamed', path)
self.assertTrue(path.startswith('image-attachments/rack_12_'))

Some files were not shown because too many files have changed in this diff.