Merge branch 'main' into feature

Jeremy Stretch 2025-11-25 15:25:53 -05:00
commit afba5b2791
80 changed files with 12025 additions and 7045 deletions


@@ -15,7 +15,7 @@ body:
 attributes:
 label: NetBox version
 description: What version of NetBox are you currently running?
-placeholder: v4.4.6
+placeholder: v4.4.7
 validations:
 required: true
 - type: dropdown


@@ -27,7 +27,7 @@ body:
 attributes:
 label: NetBox Version
 description: What version of NetBox are you currently running?
-placeholder: v4.4.6
+placeholder: v4.4.7
 validations:
 required: true
 - type: dropdown


@@ -186,6 +186,7 @@
 "usb-3-micro-b",
 "molex-micro-fit-1x2",
 "molex-micro-fit-2x2",
+"molex-micro-fit-2x3",
 "molex-micro-fit-2x4",
 "dc-terminal",
 "saf-d-grid",
@@ -293,6 +294,7 @@
 "usb-c",
 "molex-micro-fit-1x2",
 "molex-micro-fit-2x2",
+"molex-micro-fit-2x3",
 "molex-micro-fit-2x4",
 "dc-terminal",
 "eaton-c39",

File diff suppressed because one or more lines are too long


@@ -232,6 +232,9 @@ STORAGES = {
 },
 "scripts": {
 "BACKEND": "extras.storage.ScriptFileSystemStorage",
+"OPTIONS": {
+"allow_overwrite": True,
+},
 },
 }
 ```
@@ -247,6 +250,7 @@ STORAGES = {
 "OPTIONS": {
 'access_key': 'access key',
 'secret_key': 'secret key',
+"allow_overwrite": True,
 }
 },
 }


@@ -95,7 +95,7 @@ An example fieldset definition is provided below:
 ```python
 class MyScript(Script):
-class Meta:
+class Meta(Script.Meta):
 fieldsets = (
 ('First group', ('field1', 'field2', 'field3')),
 ('Second group', ('field4', 'field5')),
@@ -499,7 +499,7 @@ from extras.scripts import *
 class NewBranchScript(Script):
-class Meta:
+class Meta(Script.Meta):
 name = "New Branch"
 description = "Provision a new branch site"
 field_order = ['site_name', 'switch_count', 'switch_model']


@@ -1,5 +1,36 @@
 # NetBox v4.4
+## v4.4.7 (2025-11-25)
+### Enhancements
+* [#20371](https://github.com/netbox-community/netbox/issues/20371) - Add Molex Micro-Fit 2x3 for power ports & power outlets
+* [#20731](https://github.com/netbox-community/netbox/issues/20731) - Enable specifying `data_source` & `data_file` when bulk importing config templates
+* [#20820](https://github.com/netbox-community/netbox/issues/20820) - Enable filtering of custom fields by object type
+* [#20823](https://github.com/netbox-community/netbox/issues/20823) - Disallow creation of API tokens with an expiration date in the past
+* [#20841](https://github.com/netbox-community/netbox/issues/20841) - Support advanced filtering for available rack types when creating/editing a rack
+### Bug Fixes
+* [#20134](https://github.com/netbox-community/netbox/issues/20134) - Prevent out-of-band HTMX content swaps in embedded tables
+* [#20432](https://github.com/netbox-community/netbox/issues/20432) - Fix tracing of cables across multiple circuits in parallel
+* [#20465](https://github.com/netbox-community/netbox/issues/20465) - Ensure that scripts are updated immediately when a new file is uploaded
+* [#20638](https://github.com/netbox-community/netbox/issues/20638) - Correct OpenAPI schema for bulk create operations
+* [#20649](https://github.com/netbox-community/netbox/issues/20649) - Enforce view permissions on REST API endpoint for custom scripts
+* [#20740](https://github.com/netbox-community/netbox/issues/20740) - Ensure permissions constraints are enforced when executing custom scripts via the REST API
+* [#20743](https://github.com/netbox-community/netbox/issues/20743) - Pass request context to custom script when triggered by an event rule
+* [#20766](https://github.com/netbox-community/netbox/issues/20766) - Fix inadvertent translations on server error page
+* [#20775](https://github.com/netbox-community/netbox/issues/20775) - Fix `TypeError` exception when bulk renaming unnamed devices
+* [#20822](https://github.com/netbox-community/netbox/issues/20822) - Add missing `auto_sync_enabled` field in bulk edit forms
+* [#20827](https://github.com/netbox-community/netbox/issues/20827) - Fix UI styling issue when toggling between light and dark mode
+* [#20839](https://github.com/netbox-community/netbox/issues/20839) - Fix filtering by object type in UI for custom links and saved filters
+* [#20840](https://github.com/netbox-community/netbox/issues/20840) - Remove extraneous references to airflow for RackType model
+* [#20844](https://github.com/netbox-community/netbox/issues/20844) - Fix object type filter for L2VPN terminations
+* [#20859](https://github.com/netbox-community/netbox/issues/20859) - Prevent dashboard crash due to exception raised by a widget
+* [#20865](https://github.com/netbox-community/netbox/issues/20865) - Enforce proper min/max values for latitude & longitude fields
+---
 ## v4.4.6 (2025-11-11)
 ### Enhancements


@@ -12,6 +12,7 @@ from drf_spectacular.utils import Direction
 from netbox.api.fields import ChoiceField
 from netbox.api.serializers import WritableNestedSerializer
+from netbox.api.viewsets import NetBoxModelViewSet
 # see netbox.api.routers.NetBoxRouter
 BULK_ACTIONS = ("bulk_destroy", "bulk_partial_update", "bulk_update")
@@ -49,6 +50,11 @@ class ChoiceFieldFix(OpenApiSerializerFieldExtension):
 )
+def viewset_handles_bulk_create(view):
+"""Check if view automatically provides list-based bulk create"""
+return isinstance(view, NetBoxModelViewSet)
 class NetBoxAutoSchema(AutoSchema):
 """
 Overrides to drf_spectacular.openapi.AutoSchema to fix following issues:
@@ -128,6 +134,36 @@
 return response_serializers
def _get_request_for_media_type(self, serializer, direction='request'):
"""
Override to generate oneOf schema for serializers that support both
single object and array input (NetBoxModelViewSet POST operations).
Refs: #20638
"""
# Get the standard schema first
schema, required = super()._get_request_for_media_type(serializer, direction)
# If this serializer supports arrays (marked in get_request_serializer),
# wrap the schema in oneOf to allow single object OR array
if (
direction == 'request' and
schema is not None and
getattr(self.view, 'action', None) == 'create' and
viewset_handles_bulk_create(self.view)
):
return {
'oneOf': [
schema, # Single object
{
'type': 'array',
'items': schema, # Array of objects
}
]
}, required
return schema, required
def _get_serializer_name(self, serializer, direction, bypass_extensions=False) -> str:
name = super()._get_serializer_name(serializer, direction, bypass_extensions)


@@ -0,0 +1,108 @@
"""
Unit tests for OpenAPI schema generation.
Refs: #20638
"""
import json
from django.test import TestCase
class OpenAPISchemaTestCase(TestCase):
"""Tests for OpenAPI schema generation."""
def setUp(self):
"""Fetch schema via API endpoint."""
response = self.client.get('/api/schema/', {'format': 'json'})
self.assertEqual(response.status_code, 200)
self.schema = json.loads(response.content)
def test_post_operation_documents_single_or_array(self):
"""
POST operations on NetBoxModelViewSet endpoints should document
support for both single objects and arrays via oneOf.
Refs: #20638
"""
# Test representative endpoints across different apps
test_paths = [
'/api/core/data-sources/',
'/api/dcim/sites/',
'/api/users/users/',
'/api/ipam/ip-addresses/',
]
for path in test_paths:
with self.subTest(path=path):
operation = self.schema['paths'][path]['post']
# Get the request body schema
request_schema = operation['requestBody']['content']['application/json']['schema']
# Should have oneOf with two options
self.assertIn('oneOf', request_schema, f"POST {path} should have oneOf schema")
self.assertEqual(
len(request_schema['oneOf']), 2,
f"POST {path} oneOf should have exactly 2 options"
)
# First option: single object (has $ref or properties)
single_schema = request_schema['oneOf'][0]
self.assertTrue(
'$ref' in single_schema or 'properties' in single_schema,
f"POST {path} first oneOf option should be single object"
)
# Second option: array of objects
array_schema = request_schema['oneOf'][1]
self.assertEqual(
array_schema['type'], 'array',
f"POST {path} second oneOf option should be array"
)
self.assertIn('items', array_schema, f"POST {path} array should have items")
def test_bulk_update_operations_require_array_only(self):
"""
Bulk update/patch operations should require arrays only, not oneOf.
They don't support single object input.
Refs: #20638
"""
test_paths = [
'/api/dcim/sites/',
'/api/users/users/',
]
for path in test_paths:
for method in ['put', 'patch']:
with self.subTest(path=path, method=method):
operation = self.schema['paths'][path][method]
request_schema = operation['requestBody']['content']['application/json']['schema']
# Should be array-only, not oneOf
self.assertNotIn(
'oneOf', request_schema,
f"{method.upper()} {path} should NOT have oneOf (array-only)"
)
self.assertEqual(
request_schema['type'], 'array',
f"{method.upper()} {path} should require array"
)
self.assertIn(
'items', request_schema,
f"{method.upper()} {path} array should have items"
)
def test_bulk_delete_requires_array(self):
"""
Bulk delete operations should require arrays.
Refs: #20638
"""
path = '/api/dcim/sites/'
operation = self.schema['paths'][path]['delete']
request_schema = operation['requestBody']['content']['application/json']['schema']
# Should be array-only
self.assertNotIn('oneOf', request_schema, "DELETE should NOT have oneOf")
self.assertEqual(request_schema['type'], 'array', "DELETE should require array")
self.assertIn('items', request_schema, "DELETE array should have items")


@@ -461,6 +461,7 @@ class PowerPortTypeChoices(ChoiceSet):
 # Molex
 TYPE_MOLEX_MICRO_FIT_1X2 = 'molex-micro-fit-1x2'
 TYPE_MOLEX_MICRO_FIT_2X2 = 'molex-micro-fit-2x2'
+TYPE_MOLEX_MICRO_FIT_2X3 = 'molex-micro-fit-2x3'
 TYPE_MOLEX_MICRO_FIT_2X4 = 'molex-micro-fit-2x4'
 # Direct current (DC)
 TYPE_DC = 'dc-terminal'
@@ -588,6 +589,7 @@ class PowerPortTypeChoices(ChoiceSet):
 ('Molex', (
 (TYPE_MOLEX_MICRO_FIT_1X2, 'Molex Micro-Fit 1x2'),
 (TYPE_MOLEX_MICRO_FIT_2X2, 'Molex Micro-Fit 2x2'),
+(TYPE_MOLEX_MICRO_FIT_2X3, 'Molex Micro-Fit 2x3'),
 (TYPE_MOLEX_MICRO_FIT_2X4, 'Molex Micro-Fit 2x4'),
 )),
 ('DC', (
@@ -710,6 +712,7 @@ class PowerOutletTypeChoices(ChoiceSet):
 # Molex
 TYPE_MOLEX_MICRO_FIT_1X2 = 'molex-micro-fit-1x2'
 TYPE_MOLEX_MICRO_FIT_2X2 = 'molex-micro-fit-2x2'
+TYPE_MOLEX_MICRO_FIT_2X3 = 'molex-micro-fit-2x3'
 TYPE_MOLEX_MICRO_FIT_2X4 = 'molex-micro-fit-2x4'
 # Direct current (DC)
 TYPE_DC = 'dc-terminal'
@@ -831,6 +834,7 @@ class PowerOutletTypeChoices(ChoiceSet):
 ('Molex', (
 (TYPE_MOLEX_MICRO_FIT_1X2, 'Molex Micro-Fit 1x2'),
 (TYPE_MOLEX_MICRO_FIT_2X2, 'Molex Micro-Fit 2x2'),
+(TYPE_MOLEX_MICRO_FIT_2X3, 'Molex Micro-Fit 2x3'),
 (TYPE_MOLEX_MICRO_FIT_2X4, 'Molex Micro-Fit 2x4'),
 )),
 ('DC', (


@@ -291,11 +291,6 @@ class RackBaseFilterForm(PrimaryModelFilterSetForm):
 choices=BOOLEAN_WITH_BLANK_CHOICES
 )
 )
-airflow = forms.MultipleChoiceField(
-label=_('Airflow'),
-choices=add_blank_choice(RackAirflowChoices),
-required=False
-)
 weight = forms.DecimalField(
 label=_('Weight'),
 required=False,
@@ -399,6 +394,11 @@ class RackFilterForm(TenancyFilterForm, ContactModelFilterForm, RackBaseFilterFo
 },
 label=_('Rack type')
 )
+airflow = forms.MultipleChoiceField(
+label=_('Airflow'),
+choices=add_blank_choice(RackAirflowChoices),
+required=False
+)
 serial = forms.CharField(
 label=_('Serial'),
 required=False


@@ -259,7 +259,8 @@ class RackForm(TenancyForm, PrimaryModelForm):
 label=_('Rack Type'),
 queryset=RackType.objects.all(),
 required=False,
-help_text=_("Select a pre-defined rack type, or set physical characteristics below.")
+selector=True,
+help_text=_("Select a pre-defined rack type, or set physical characteristics below."),
 )
 fieldsets = (


@@ -0,0 +1,67 @@
import django.core.validators
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('dcim', '0215_rackreservation_status'),
]
operations = [
migrations.AlterField(
model_name='device',
name='latitude',
field=models.DecimalField(
blank=True,
decimal_places=6,
max_digits=8,
null=True,
validators=[
django.core.validators.MinValueValidator(-90.0),
django.core.validators.MaxValueValidator(90.0),
],
),
),
migrations.AlterField(
model_name='device',
name='longitude',
field=models.DecimalField(
blank=True,
decimal_places=6,
max_digits=9,
null=True,
validators=[
django.core.validators.MinValueValidator(-180.0),
django.core.validators.MaxValueValidator(180.0),
],
),
),
migrations.AlterField(
model_name='site',
name='latitude',
field=models.DecimalField(
blank=True,
decimal_places=6,
max_digits=8,
null=True,
validators=[
django.core.validators.MinValueValidator(-90.0),
django.core.validators.MaxValueValidator(90.0),
],
),
),
migrations.AlterField(
model_name='site',
name='longitude',
field=models.DecimalField(
blank=True,
decimal_places=6,
max_digits=9,
null=True,
validators=[
django.core.validators.MinValueValidator(-180.0),
django.core.validators.MaxValueValidator(180.0),
],
),
),
]


@@ -5,7 +5,7 @@ from django.db import migrations
 class Migration(migrations.Migration):
 dependencies = [
-('dcim', '0215_rackreservation_status'),
+('dcim', '0216_latitude_longitude_validators'),
 ]
 operations = [


@@ -4,7 +4,7 @@ from django.db import migrations, models
 class Migration(migrations.Migration):
 dependencies = [
-('dcim', '0216_poweroutlettemplate_color'),
+('dcim', '0217_poweroutlettemplate_color'),
 ('users', '0015_owner'),
 ]


@@ -35,7 +35,7 @@ def populate_rack_type_rack_count(apps, schema_editor):
 class Migration(migrations.Migration):
 dependencies = [
-('dcim', '0217_owner'),
+('dcim', '0218_owner'),
 ]
 operations = [


@@ -5,7 +5,7 @@ from django.db import migrations, models
 class Migration(migrations.Migration):
 dependencies = [
-('dcim', '0218_devicetype_device_count'),
+('dcim', '0219_devicetype_device_count'),
 ]
 operations = [


@@ -4,7 +4,7 @@ from django.db import migrations, models
 class Migration(migrations.Migration):
 dependencies = [
-('dcim', '0219_cable_profile'),
+('dcim', '0220_cable_profile'),
 ]
 operations = [


@@ -11,6 +11,7 @@ from django.utils.translation import gettext_lazy as _
 from core.models import ObjectType
 from dcim.choices import *
 from dcim.constants import *
+from dcim.exceptions import UnsupportedCablePath
 from dcim.fields import PathField
 from dcim.utils import decompile_path_node, object_to_path_node
 from netbox.choices import ColorChoices
@@ -29,8 +30,6 @@ __all__ = (
 'CableTermination',
 )
-from ..exceptions import UnsupportedCablePath
 trace_paths = Signal()
@@ -652,7 +651,7 @@ class CablePath(models.Model):
 Cable or WirelessLink connects (interfaces, console ports, circuit termination, etc.). All terminations must be
 of the same type and must belong to the same parent object.
 """
-from circuits.models import CircuitTermination
+from circuits.models import CircuitTermination, Circuit
 if not terminations:
 return None
@@ -674,8 +673,11 @@
 raise UnsupportedCablePath(_("All mid-span terminations must have the same termination type"))
 # All mid-span terminations must all be attached to the same device
-if (not isinstance(terminations[0], PathEndpoint) and not
-all(t.parent_object == terminations[0].parent_object for t in terminations[1:])):
+if (
+not isinstance(terminations[0], PathEndpoint) and
+not isinstance(terminations[0].parent_object, Circuit) and
+not all(t.parent_object == terminations[0].parent_object for t in terminations[1:])
+):
 raise UnsupportedCablePath(_("All mid-span terminations must have the same parent object"))
 # Check for a split path (e.g. rear port fanning out to multiple front ports with
@@ -830,32 +832,39 @@
 elif isinstance(remote_terminations[0], CircuitTermination):
 # Follow a CircuitTermination to its corresponding CircuitTermination (A to Z or vice versa)
-if len(remote_terminations) > 1:
-is_split = True
-break
-circuit_termination = CircuitTermination.objects.filter(
-circuit=remote_terminations[0].circuit,
-term_side='Z' if remote_terminations[0].term_side == 'A' else 'A'
-).first()
-if circuit_termination is None:
-break
-elif circuit_termination._provider_network:
+qs = Q()
+for remote_termination in remote_terminations:
+qs |= Q(
+circuit=remote_termination.circuit,
+term_side='Z' if remote_termination.term_side == 'A' else 'A'
+)
+# Get all circuit terminations
+circuit_terminations = CircuitTermination.objects.filter(qs)
+if not circuit_terminations.exists():
+break
+elif all([ct._provider_network for ct in circuit_terminations]):
 # Circuit terminates to a ProviderNetwork
 path.extend([
-[object_to_path_node(circuit_termination)],
-[object_to_path_node(circuit_termination._provider_network)],
+[object_to_path_node(ct) for ct in circuit_terminations],
+[object_to_path_node(ct._provider_network) for ct in circuit_terminations],
 ])
 is_complete = True
 break
-elif circuit_termination.termination and not circuit_termination.cable:
+elif all([ct.termination and not ct.cable for ct in circuit_terminations]):
 # Circuit terminates to a Region/Site/etc.
 path.extend([
-[object_to_path_node(circuit_termination)],
-[object_to_path_node(circuit_termination.termination)],
+[object_to_path_node(ct) for ct in circuit_terminations],
+[object_to_path_node(ct.termination) for ct in circuit_terminations],
 ])
 break
+elif any([ct.cable in links for ct in circuit_terminations]):
+# No valid path
+is_split = True
+break
-terminations = [circuit_termination]
+terminations = circuit_terminations
 else:
 # Check for non-symmetric path


@@ -650,6 +650,7 @@ class Device(
 decimal_places=6,
 blank=True,
 null=True,
+validators=[MinValueValidator(-90.0), MaxValueValidator(90.0)],
 help_text=_("GPS coordinate in decimal format (xx.yyyyyy)")
 )
 longitude = models.DecimalField(
@@ -658,6 +659,7 @@ class Device(
 decimal_places=6,
 blank=True,
 null=True,
+validators=[MinValueValidator(-180.0), MaxValueValidator(180.0)],
 help_text=_("GPS coordinate in decimal format (xx.yyyyyy)")
 )
 services = GenericRelation(


@@ -1,5 +1,6 @@
 from django.contrib.contenttypes.fields import GenericRelation
 from django.core.exceptions import ValidationError
+from django.core.validators import MaxValueValidator, MinValueValidator
 from django.db import models
 from django.utils.translation import gettext_lazy as _
 from timezone_field import TimeZoneField
@@ -210,6 +211,7 @@ class Site(ContactsMixin, ImageAttachmentsMixin, PrimaryModel):
 decimal_places=6,
 blank=True,
 null=True,
+validators=[MinValueValidator(-90.0), MaxValueValidator(90.0)],
 help_text=_('GPS coordinate in decimal format (xx.yyyyyy)')
 )
 longitude = models.DecimalField(
@@ -218,6 +220,7 @@ class Site(ContactsMixin, ImageAttachmentsMixin, PrimaryModel):
 decimal_places=6,
 blank=True,
 null=True,
+validators=[MinValueValidator(-180.0), MaxValueValidator(180.0)],
 help_text=_('GPS coordinate in decimal format (xx.yyyyyy)')
 )


@@ -89,8 +89,8 @@ class RackTypeTable(PrimaryModelTable):
 model = RackType
 fields = (
 'pk', 'id', 'model', 'manufacturer', 'form_factor', 'u_height', 'starting_unit', 'width', 'outer_width',
-'outer_height', 'outer_depth', 'mounting_depth', 'airflow', 'weight', 'max_weight', 'description',
-'comments', 'rack_count', 'tags', 'created', 'last_updated',
+'outer_height', 'outer_depth', 'mounting_depth', 'weight', 'max_weight', 'description', 'comments',
+'rack_count', 'tags', 'created', 'last_updated',
 )
 default_columns = (
 'pk', 'model', 'manufacturer', 'type', 'u_height', 'description', 'rack_count',


@@ -2191,7 +2191,81 @@ class LegacyCablePathTests(CablePathTestCase):
 CableTraceSVG(interface1).render()
 CableTraceSVG(interface2).render()
-def test_223_single_path_via_multiple_pass_throughs_with_breakouts(self):
+def test_223_interface_to_interface_via_multiple_circuit_terminations(self):
provider = Provider.objects.first()
circuit_type = CircuitType.objects.first()
circuit1 = self.circuit
circuit2 = Circuit.objects.create(provider=provider, type=circuit_type, cid='Circuit 2')
interface1 = Interface.objects.create(device=self.device, name='Interface 1')
interface2 = Interface.objects.create(device=self.device, name='Interface 2')
circuittermination1_A = CircuitTermination.objects.create(
circuit=circuit1,
termination=self.site,
term_side='A'
)
circuittermination1_Z = CircuitTermination.objects.create(
circuit=circuit1,
termination=self.site,
term_side='Z'
)
circuittermination2_A = CircuitTermination.objects.create(
circuit=circuit2,
termination=self.site,
term_side='A'
)
circuittermination2_Z = CircuitTermination.objects.create(
circuit=circuit2,
termination=self.site,
term_side='Z'
)
# Create cables
cable1 = Cable(
a_terminations=[interface1],
b_terminations=[circuittermination1_A, circuittermination2_A]
)
cable2 = Cable(
a_terminations=[interface2],
b_terminations=[circuittermination1_Z, circuittermination2_Z]
)
cable1.save()
cable2.save()
self.assertEqual(CablePath.objects.count(), 2)
path1 = self.assertPathExists(
(
interface1,
cable1,
(circuittermination1_A, circuittermination2_A),
(circuittermination1_Z, circuittermination2_Z),
cable2,
interface2
),
is_active=True,
is_complete=True,
)
interface1.refresh_from_db()
self.assertPathIsSet(interface1, path1)
path2 = self.assertPathExists(
(
interface2,
cable2,
(circuittermination1_Z, circuittermination2_Z),
(circuittermination1_A, circuittermination2_A),
cable1,
interface1
),
is_active=True,
is_complete=True,
)
interface2.refresh_from_db()
self.assertPathIsSet(interface2, path2)
def test_224_single_path_via_multiple_pass_throughs_with_breakouts(self):
""" """
[IF1] --C1-- [FP1] [RP1] --C2-- [IF3] [IF1] --C1-- [FP1] [RP1] --C2-- [IF3]
[IF2] [FP2] [RP2] [IF4] [IF2] [FP2] [RP2] [IF4]
@@ -2480,3 +2554,33 @@
 is_active=True
 )
 self.assertEqual(CablePath.objects.count(), 0)
def test_402_exclude_circuit_loopback(self):
interface = Interface.objects.create(device=self.device, name='Interface 1')
circuittermination1 = CircuitTermination.objects.create(
circuit=self.circuit,
termination=self.site,
term_side='A'
)
circuittermination2 = CircuitTermination.objects.create(
circuit=self.circuit,
termination=self.site,
term_side='Z'
)
# Create cables
cable = Cable(
a_terminations=[interface],
b_terminations=[circuittermination1, circuittermination2]
)
cable.save()
path = self.assertPathExists(
(interface, cable, (circuittermination1, circuittermination2)),
is_active=True,
is_complete=False,
is_split=True
)
self.assertEqual(CablePath.objects.count(), 1)
interface.refresh_from_db()
self.assertPathIsSet(interface, path)


@@ -29,6 +29,6 @@ class ConfigTemplateSerializer(
 fields = [
 'id', 'url', 'display_url', 'display', 'name', 'description', 'environment_params', 'template_code',
 'mime_type', 'file_name', 'file_extension', 'as_attachment', 'data_source', 'data_path', 'data_file',
-'data_synced', 'owner', 'tags', 'created', 'last_updated',
+'auto_sync_enabled', 'data_synced', 'owner', 'tags', 'created', 'last_updated',
 ]
 brief_fields = ('id', 'url', 'display', 'name', 'description')


@@ -276,6 +276,14 @@ class ScriptViewSet(ModelViewSet):
 _ignore_model_permissions = True
 lookup_value_regex = '[^/]+'  # Allow dots
+def initial(self, request, *args, **kwargs):
+super().initial(request, *args, **kwargs)
+# Restrict the view's QuerySet to allow only the permitted objects
+if request.user.is_authenticated:
+action = 'run' if request.method == 'POST' else 'view'
+self.queryset = self.queryset.restrict(request.user, action)
 def _get_script(self, pk):
 # If pk is numeric, retrieve script by ID
 if pk.isnumeric():
@@ -299,10 +307,12 @@
 """
 Run a Script identified by its numeric PK or module & name and return the pending Job as the result
 """
-if not request.user.has_perm('extras.run_script'):
-raise PermissionDenied("This user does not have permission to run scripts.")
 script = self._get_script(pk)
+if not request.user.has_perm('extras.run_script', obj=script):
+raise PermissionDenied("This user does not have permission to run this script.")
 input_serializer = serializers.ScriptInputSerializer(
 data=request.data,
 context={'script': script}


@@ -209,7 +209,10 @@ class ObjectCountsWidget(DashboardWidget):
 url = get_action_url(model, action='list')
 except NoReverseMatch:
 url = None
-qs = model.objects.restrict(request.user, 'view')
+try:
+qs = model.objects.restrict(request.user, 'view')
+except AttributeError:
+qs = model.objects.all()
 # Apply any specified filters
 if url and (filters := self.config.get('filters')):
 params = dict_to_querydict(filters)


@@ -134,11 +134,18 @@ def process_event_rules(event_rules, object_type, event_type, data, username=Non
 # Enqueue a Job to record the script's execution
 from extras.jobs import ScriptJob
+params = {
+"instance": event_rule.action_object,
+"name": script.name,
+"user": user,
+"data": event_data
+}
+if snapshots:
+params["snapshots"] = snapshots
+if request:
+params["request"] = copy_safe_request(request)
 ScriptJob.enqueue(
-instance=event_rule.action_object,
-name=script.name,
-user=user,
-data=event_data
+**params
 )
 # Notification groups


@@ -392,8 +392,12 @@ class ConfigTemplateBulkEditForm(ChangelogMessageMixin, OwnerMixin, BulkEditForm
 required=False,
 widget=BulkEditNullBooleanSelect()
 )
+auto_sync_enabled = forms.NullBooleanField(
+label=_('Auto sync enabled'),
+required=False,
+widget=BulkEditNullBooleanSelect()
+)
-nullable_fields = ('description', 'mime_type', 'file_name', 'file_extension')
+nullable_fields = ('description', 'mime_type', 'file_name', 'file_extension', 'auto_sync_enabled',)
 class ImageAttachmentBulkEditForm(ChangelogMessageMixin, BulkEditForm):


@@ -5,7 +5,7 @@ from django.contrib.postgres.forms import SimpleArrayField
 from django.core.exceptions import ObjectDoesNotExist
 from django.utils.translation import gettext_lazy as _
-from core.models import ObjectType
+from core.models import DataFile, DataSource, ObjectType
 from extras.choices import *
 from extras.models import *
 from netbox.events import get_event_type_choices
@@ -160,14 +160,41 @@ class ConfigContextProfileImportForm(PrimaryModelImportForm):
 class ConfigTemplateImportForm(OwnerCSVMixin, CSVModelForm):
+data_source = CSVModelChoiceField(
+label=_('Data source'),
+queryset=DataSource.objects.all(),
+required=False,
+to_field_name='name',
+help_text=_('Data source which provides the data file')
+)
+data_file = CSVModelChoiceField(
+label=_('Data file'),
+queryset=DataFile.objects.all(),
+required=False,
+to_field_name='path',
+help_text=_('Data file containing the template code')
+)
+auto_sync_enabled = forms.BooleanField(
+required=False,
+label=_('Auto sync enabled'),
+help_text=_("Enable automatic synchronization of template content when the data file is updated")
+)
 class Meta:
 model = ConfigTemplate
 fields = (
-'name', 'description', 'template_code', 'environment_params', 'mime_type', 'file_name', 'file_extension',
-'as_attachment', 'owner', 'tags',
+'name', 'description', 'template_code', 'data_source', 'data_file', 'auto_sync_enabled',
+'environment_params', 'mime_type', 'file_name', 'file_extension', 'as_attachment', 'owner', 'tags',
 )
+def clean(self):
+super().clean()
+# Make sure template_code is None when it's not included in the uploaded data
+if not self.data.get('template_code') and not self.data.get('data_file'):
+raise forms.ValidationError(_("Must specify either local content or a data file"))
+return self.cleaned_data['template_code']
 class SavedFilterImportForm(OwnerCSVMixin, CSVModelForm):
 object_types = CSVMultipleContentTypeField(


@@ -43,17 +43,20 @@ class CustomFieldFilterForm(SavedFiltersMixin, FilterForm):
 model = CustomField
 fieldsets = (
 FieldSet('q', 'filter_id'),
-FieldSet(
-'type', 'related_object_type_id', 'group_name', 'weight', 'required', 'unique', 'choice_set_id',
-name=_('Attributes')
-),
+FieldSet('object_type_id', 'type', 'group_name', 'weight', 'required', 'unique', name=_('Attributes')),
+FieldSet('choice_set_id', 'related_object_type_id', name=_('Type Options')),
 FieldSet('ui_visible', 'ui_editable', 'is_cloneable', name=_('Behavior')),
 FieldSet('validation_minimum', 'validation_maximum', 'validation_regex', name=_('Validation')),
 )
-related_object_type_id = ContentTypeMultipleChoiceField(
+object_type_id = ContentTypeMultipleChoiceField(
 queryset=ObjectType.objects.with_feature('custom_fields'),
 required=False,
-label=_('Related object type')
+label=_('Object types'),
+)
+related_object_type_id = ContentTypeMultipleChoiceField(
+queryset=ObjectType.objects.public(),
+required=False,
+label=_('Related object type'),
 )
 type = forms.MultipleChoiceField(
 choices=CustomFieldTypeChoices,
@@ -147,12 +150,12 @@ class CustomLinkFilterForm(SavedFiltersMixin, FilterForm):
 model = CustomLink
 fieldsets = (
 FieldSet('q', 'filter_id'),
-FieldSet('object_type', 'enabled', 'new_window', 'weight', name=_('Attributes')),
+FieldSet('object_type_id', 'enabled', 'new_window', 'weight', name=_('Attributes')),
 )
-object_type = ContentTypeMultipleChoiceField(
+object_type_id = ContentTypeMultipleChoiceField(
 label=_('Object types'),
 queryset=ObjectType.objects.with_feature('custom_links'),
-required=False
+required=False,
 )
 enabled = forms.NullBooleanField(
 label=_('Enabled'),
@@ -251,12 +254,12 @@ class SavedFilterFilterForm(SavedFiltersMixin, FilterForm):
 model = SavedFilter
 fieldsets = (
 FieldSet('q', 'filter_id'),
-FieldSet('object_type', 'enabled', 'shared', 'weight', name=_('Attributes')),
+FieldSet('object_type_id', 'enabled', 'shared', 'weight', name=_('Attributes')),
 )
-object_type = ContentTypeMultipleChoiceField(
+object_type_id = ContentTypeMultipleChoiceField(
 label=_('Object types'),
 queryset=ObjectType.objects.public(),
-required=False
+required=False,
 )
 enabled = forms.NullBooleanField(
 label=_('Enabled'),
@@ -521,7 +524,7 @@ class ConfigTemplateFilterForm(SavedFiltersMixin, FilterForm):
 model = ConfigTemplate
 fieldsets = (
 FieldSet('q', 'filter_id', 'tag'),
-FieldSet('data_source_id', 'data_file_id', name=_('Data')),
+FieldSet('data_source_id', 'data_file_id', 'auto_sync_enabled', name=_('Data')),
 FieldSet('mime_type', 'file_name', 'file_extension', 'as_attachment', name=_('Rendering'))
 )
 data_source_id = DynamicModelMultipleChoiceField(
@@ -537,6 +540,13 @@ class ConfigTemplateFilterForm(SavedFiltersMixin, FilterForm):
 'source_id': '$data_source_id'
 }
 )
+auto_sync_enabled = forms.NullBooleanField(
+label=_('Auto sync enabled'),
+required=False,
+widget=forms.Select(
+choices=BOOLEAN_WITH_BLANK_CHOICES
+)
+)
 tag = TagFilterField(ConfigTemplate)
 mime_type = forms.CharField(
 required=False,


@@ -668,6 +668,10 @@ class ConfigTemplateTable(NetBoxTable):
 orderable=False,
 verbose_name=_('Synced')
 )
+auto_sync_enabled = columns.BooleanColumn(
+verbose_name=_('Auto Sync Enabled'),
+orderable=False,
+)
 mime_type = tables.Column(
 verbose_name=_('MIME Type')
 )


@@ -1,4 +1,6 @@
 from django import template
+from django.utils.safestring import mark_safe
+from django.utils.translation import gettext as _
 register = template.Library()
@@ -8,4 +10,16 @@ register = template.Library()
 def render_widget(context, widget):
 request = context['request']
-return widget.render(request)
+try:
+return widget.render(request)
+except Exception as e:
+message1 = _('An error was encountered when attempting to render this widget:')
+message2 = _('Please try reconfiguring the widget, or remove it from your dashboard.')
+return mark_safe(f"""
+<p>
+<span class="text-danger"><i class="mdi mdi-alert"></i></span>
+{message1}
+</p>
+<p class="font-monospace ps-3">{e}</p>
+<p>{message2}</p>
+""")


@@ -936,18 +936,13 @@ class ScriptTest(APITestCase):
 def setUp(self):
 super().setUp()
+self.add_permissions('extras.view_script')
 # Monkey-patch the Script model to return our TestScriptClass above
 Script.python_class = self.python_class
 def test_get_script(self):
-module = ScriptModule.objects.get(
-file_root=ManagedFileRootPathChoices.SCRIPTS,
-file_path='script.py',
-)
-script = module.scripts.all().first()
-url = reverse('extras-api:script-detail', kwargs={'pk': script.pk})
-response = self.client.get(url, **self.header)
+response = self.client.get(self.url, **self.header)
 self.assertEqual(response.data['name'], self.TestScriptClass.Meta.name)
 self.assertEqual(response.data['vars']['var1'], 'StringVar')


@@ -250,6 +250,9 @@ SESSION_FILE_PATH = None
 # },
 # "scripts": {
 # "BACKEND": "extras.storage.ScriptFileSystemStorage",
+# "OPTIONS": {
+# "allow_overwrite": True,
+# },
 # },
 # }


@@ -297,6 +297,9 @@ DEFAULT_STORAGES = {
 },
 "scripts": {
 "BACKEND": "extras.storage.ScriptFileSystemStorage",
+"OPTIONS": {
+"allow_overwrite": True,
+},
 },
 }
 STORAGES = DEFAULT_STORAGES | STORAGES


@@ -851,12 +851,12 @@ class BulkRenameView(GetReturnURLMixin, BaseMultiObjectView):
 replace = form.cleaned_data['replace']
 if form.cleaned_data['use_regex']:
 try:
-obj.new_name = re.sub(find, replace, getattr(obj, self.field_name, ''))
+obj.new_name = re.sub(find, replace, getattr(obj, self.field_name, '') or '')
 # Catch regex group reference errors
 except re.error:
 obj.new_name = getattr(obj, self.field_name)
 else:
-obj.new_name = getattr(obj, self.field_name, '').replace(find, replace)
+obj.new_name = (getattr(obj, self.field_name, '') or '').replace(find, replace)
 renamed_pks.append(obj.pk)
 return renamed_pks

Binary file not shown.


@@ -30,7 +30,7 @@
 "gridstack": "12.3.3",
 "htmx.org": "2.0.8",
 "query-string": "9.3.1",
-"sass": "1.94.0",
+"sass": "1.94.2",
 "tom-select": "2.4.3",
 "typeface-inter": "3.18.1",
 "typeface-roboto-mono": "1.1.13"


@@ -162,3 +162,18 @@ pre code {
 vertical-align: .05em;
 height: auto;
 }
// Theme-based visibility utilities
// Tabler's .hide-theme-* utilities expect data-bs-theme on :root, but NetBox applies
// it to body. These overrides use higher specificity selectors to ensure theme-based
// visibility works correctly. The :root:not(.dummy) pattern provides the additional
// specificity needed to override Tabler's :root:not() rules.
:root:not(.dummy) body[data-bs-theme='light'] .hide-theme-light,
:root:not(.dummy) body[data-bs-theme='dark'] .hide-theme-dark {
display: none !important;
}
:root:not(.dummy) body[data-bs-theme='dark'] .hide-theme-light,
:root:not(.dummy) body[data-bs-theme='light'] .hide-theme-dark {
display: inline-flex !important;
}


@@ -3190,10 +3190,10 @@ safe-regex-test@^1.1.0:
 es-errors "^1.3.0"
 is-regex "^1.2.1"
-sass@1.94.0:
-version "1.94.0"
-resolved "https://registry.yarnpkg.com/sass/-/sass-1.94.0.tgz#a04198d8940358ca6ad537d2074051edbbe7c1a7"
-integrity sha512-Dqh7SiYcaFtdv5Wvku6QgS5IGPm281L+ZtVD1U2FJa7Q0EFRlq8Z3sjYtz6gYObsYThUOz9ArwFqPZx+1azILQ==
+sass@1.94.2:
+version "1.94.2"
+resolved "https://registry.yarnpkg.com/sass/-/sass-1.94.2.tgz#198511fc6fdd2fc0a71b8d1261735c12608d4ef3"
+integrity sha512-N+7WK20/wOr7CzA2snJcUSSNTCzeCGUTFY3OgeQP3mZ1aj9NMQ0mSTXwlrnd89j33zzQJGqIN52GIOmYrfq46A==
 dependencies:
 chokidar "^4.0.0"
 immutable "^5.0.2"


@@ -1,3 +1,3 @@
-version: "4.4.6"
+version: "4.4.7"
 edition: "Community"
-published: "2025-11-11"
+published: "2025-11-25"


@@ -8,10 +8,10 @@
 <p>
 <i class="mdi mdi-alert"></i>
 <strong>{% trans "Missing required packages" %}.</strong>
-{% blocktrans trimmed %}
+{% blocktrans trimmed with req_file="requirements.txt" local_req_file="local_requirements.txt" pip_cmd="pip freeze" %}
 This installation of NetBox might be missing one or more required Python packages. These packages are listed in
-<code>requirements.txt</code> and <code>local_requirements.txt</code>, and are normally installed as part of the
-installation or upgrade process. To verify installed packages, run <code>pip freeze</code> from the console and
+<code>{{ req_file }}</code> and <code>{{ local_req_file }}</code>, and are normally installed as part of the
+installation or upgrade process. To verify installed packages, run <code>{{ pip_cmd }}</code> from the console and
 compare the output to the list of required packages.
 {% endblocktrans %}
 </p>


@@ -8,17 +8,17 @@
 <p>
 <i class="mdi mdi-alert"></i>
 <strong>{% trans "Database migrations missing" %}.</strong>
-{% blocktrans trimmed %}
+{% blocktrans trimmed with command="python3 manage.py migrate" %}
 When upgrading to a new NetBox release, the upgrade script must be run to apply any new database migrations. You
-can run migrations manually by executing <code>python3 manage.py migrate</code> from the command line.
+can run migrations manually by executing <code>{{ command }}</code> from the command line.
 {% endblocktrans %}
 </p>
 <p>
 <i class="mdi mdi-alert"></i>
 <strong>{% trans "Unsupported PostgreSQL version" %}.</strong>
-{% blocktrans trimmed %}
+{% blocktrans trimmed with sql_query="SELECT VERSION()" %}
 Ensure that PostgreSQL version 14 or later is in use. You can check this by connecting to the database using
-NetBox's credentials and issuing a query for <code>SELECT VERSION()</code>.
+NetBox's credentials and issuing a query for <code>{{ sql_query }}</code>.
 {% endblocktrans %}
 </p>
 {% endblock message %}


@@ -62,6 +62,10 @@
 <th scope="row">{% trans "Data Synced" %}</th>
 <td>{{ object.data_synced|placeholder }}</td>
 </tr>
+<tr>
+<th scope="row">{% trans "Auto Sync Enabled" %}</th>
+<td>{% checkmark object.auto_sync_enabled %}</td>
+</tr>
 </table>
 </div>
 {% include 'inc/panels/tags.html' %}


@@ -17,15 +17,17 @@
 {% if request.htmx %}
 {# Include the updated object count for display elsewhere on the page #}
-<div hx-swap-oob="innerHTML:.total-object-count">{{ table.rows|length }}</div>
+{% if not table.embedded %}
+<div hx-swap-oob="innerHTML:.total-object-count">{{ table.rows|length }}</div>
+{% endif %}
 {# Include the updated "save" link for the table configuration #}
-{% if table.config_params %}
+{% if table.config_params and not table.embedded %}
 <a class="dropdown-item" hx-swap-oob="outerHTML:#table_save_link" href="{% url 'extras:tableconfig_add' %}?{{ table.config_params }}&return_url={{ request.path }}" id="table_save_link">Save</a>
 {% endif %}
 {# Update the bulk action buttons with new query parameters #}
-{% if actions %}
+{% if actions and not table.embedded %}
 <div class="bulk-action-buttons" hx-swap-oob="outerHTML:.bulk-action-buttons">
 {% action_buttons actions model multi=True %}
 </div>


@@ -26,8 +26,8 @@
 <p>{% trans "Check the following" %}:</p>
 <ul>
 <li class="tip">
-{% blocktrans trimmed %}
-<code>manage.py collectstatic</code> was run during the most recent upgrade. This installs the most
+{% blocktrans trimmed with command="manage.py collectstatic" %}
+<code>{{ command }}</code> was run during the most recent upgrade. This installs the most
 recent iteration of each static file into the static root path.
 {% endblocktrans %}
 </li>

File diff suppressed because it is too large (15 files)


@@ -1,6 +1,7 @@
 import hashlib
 import hmac
 import random
+import zoneinfo
 from django.conf import settings
 from django.contrib.postgres.fields import ArrayField
@@ -180,12 +181,29 @@ class Token(models.Model):
 self.update_digest()
 def clean(self):
+super().clean()
 if self._state.adding:
 if self.pepper_id is not None and self.pepper_id not in settings.API_TOKEN_PEPPERS:
 raise ValidationError(_(
 "Invalid pepper ID: {id}. Check configured API_TOKEN_PEPPERS."
 ).format(id=self.pepper_id))
+# Prevent creating a token with a past expiration date
+# while allowing updates to existing tokens.
+if self.pk is None and self.is_expired:
+current_tz = zoneinfo.ZoneInfo(settings.TIME_ZONE)
+now = timezone.now().astimezone(current_tz)
+current_time_str = f'{now.date().isoformat()} {now.time().isoformat(timespec="seconds")}'
+# Translators: {current_time} is the current server date and time in ISO format,
+# {timezone} is the configured server time zone (for example, "UTC" or "Europe/Berlin").
+message = _(
+'Expiration time must be in the future. Current server time is {current_time} ({timezone}).'
+).format(current_time=current_time_str, timezone=current_tz.key)
+raise ValidationError({'expires': message})
 def save(self, *args, **kwargs):
 # If creating a new Token and no token value has been specified, generate one
 if self._state.adding and self.token is None:


@@ -1,6 +1,72 @@
-from django.test import TestCase
-from users.models import User
+from datetime import timedelta
+from django.core.exceptions import ValidationError
+from django.test import TestCase
+from django.utils import timezone
+from users.models import User, Token
+from utilities.testing import create_test_user
class TokenTest(TestCase):
"""
Test class for testing the functionality of the Token model.
"""
@classmethod
def setUpTestData(cls):
"""
Set up test data for the Token model.
"""
cls.user = create_test_user('User 1')
def test_is_expired(self):
"""
Test the is_expired property.
"""
# Token with no expiration
token = Token(user=self.user, expires=None)
self.assertFalse(token.is_expired)
# Token with future expiration
token.expires = timezone.now() + timedelta(days=1)
self.assertFalse(token.is_expired)
# Token with past expiration
token.expires = timezone.now() - timedelta(days=1)
self.assertTrue(token.is_expired)
def test_cannot_create_token_with_past_expiration(self):
"""
Test that creating a token with an expiration date in the past raises a ValidationError.
"""
past_date = timezone.now() - timedelta(days=1)
token = Token(user=self.user, expires=past_date)
with self.assertRaises(ValidationError) as cm:
token.clean()
self.assertIn('expires', cm.exception.error_dict)
def test_can_update_existing_expired_token(self):
"""
Test that updating an already expired token does NOT raise a ValidationError.
"""
# Create a valid token first with an expiration date in the past
# bypasses the clean() method
token = Token.objects.create(user=self.user)
token.expires = timezone.now() - timedelta(days=1)
token.save()
# Try to update the description
token.description = 'New Description'
try:
token.clean()
token.save()
except ValidationError:
self.fail('Updating an expired token should not raise ValidationError')
token.refresh_from_db()
self.assertEqual(token.description, 'New Description')
class UserConfigTest(TestCase):


@@ -2,6 +2,7 @@ import django_filters
 from django.db.models import Q
 from django.utils.translation import gettext as _
+from core.models import ObjectType
 from dcim.models import Device, Interface
 from ipam.models import IPAddress, RouteTarget, VLAN
 from netbox.filtersets import NetBoxModelFilterSet, OrganizationalModelFilterSet, PrimaryModelFilterSet
@@ -429,6 +430,10 @@ class L2VPNTerminationFilterSet(NetBoxModelFilterSet):
 queryset=VLAN.objects.all(),
 label=_('VLAN (ID)'),
 )
+assigned_object_type_id = django_filters.ModelMultipleChoiceFilter(
+queryset=ObjectType.objects.all(),
+field_name='assigned_object_type'
+)
 assigned_object_type = ContentTypeFilter()
 class Meta:


@@ -3,7 +3,7 @@
 [project]
 name = "netbox"
-version = "4.4.6"
+version = "4.4.7"
 requires-python = ">=3.10"
 description = "The premier source of truth powering network automation."
 readme = "README.md"


@@ -10,9 +10,9 @@ django-pglocks==1.0.4
 django-prometheus==2.4.1
 django-redis==6.0.0
 django-rich==2.2.0
-django-rq==3.1
+django-rq==3.2.1
 django-storages==1.14.6
-django-tables2==2.7.5
+django-tables2==2.8.0
 django-taggit==6.1.0
 django-timezone-field==7.1
 djangorestframework==3.16.1
@@ -23,21 +23,21 @@ gunicorn==23.0.0
 Jinja2==3.1.6
 jsonschema==4.25.1
 Markdown==3.10
-mkdocs-material==9.6.22
+mkdocs-material==9.7.0
 mkdocstrings==0.30.1
 mkdocstrings-python==1.19.0
 netaddr==1.3.0
 nh3==0.3.2
 Pillow==12.0.0
-psycopg[c,pool]==3.2.12
+psycopg[c,pool]==3.2.13
 PyYAML==6.0.3
 requests==2.32.5
-rq==2.6.0
+rq==2.6.1
 social-auth-app-django==5.6.0
 social-auth-core==4.8.1
 sorl-thumbnail==12.11.0
-strawberry-graphql==0.285.0
-strawberry-graphql-django==0.67.0
+strawberry-graphql==0.287.0
+strawberry-graphql-django==0.67.2
 svgwrite==1.4.3
 tablib==3.9.0
 tzdata==2025.2