Mirror of https://github.com/netbox-community/netbox.git (synced 2025-07-14 01:41:22 -06:00)
* Resolve F541 errors
* Resolve F841 errors
* Resolve F811 errors
* Resolve F901 errors
* Resolve E714 errors
* Ignore F821 errors for GraphQL mixins
* Replace pycodestyle with ruff
* Move ignores to ruff.toml
parent 1e6f222475
commit 7ac6dff96d
.github/workflows/ci.yml (vendored, 4 changes)
@@ -73,7 +73,7 @@ jobs:
 run: |
 python -m pip install --upgrade pip
 pip install -r requirements.txt
-pip install pycodestyle coverage tblib
+pip install ruff coverage tblib

 - name: Build documentation
 run: mkdocs build
@@ -85,7 +85,7 @@ jobs:
 run: python netbox/manage.py makemigrations --check

 - name: Check PEP8 compliance
-run: pycodestyle --ignore=W504,E501 --exclude=node_modules netbox/
+run: ruff check netbox/

 - name: Check UI ESLint, TypeScript, and Prettier Compliance
 run: yarn --cwd netbox/project-static validate
@@ -70,10 +70,10 @@ NetBox ships with a [git pre-commit hook](https://githooks.com/) script that aut
 cd .git/hooks/
 ln -s ../../scripts/git-hooks/pre-commit
 ```
-For the pre-commit hooks to work, you will also need to install the pycodestyle package:
+For the pre-commit hooks to work, you will also need to install the [ruff](https://docs.astral.sh/ruff/) linter:

 ```no-highlight
-python -m pip install pycodestyle
+python -m pip install ruff
 ```
 ...and set up the yarn packages as shown in the [Web UI Development Guide](web-ui.md)

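The documentation change above swaps pycodestyle for ruff in the pre-commit workflow. As a purely illustrative aside (not part of this commit, and not NetBox's actual scripts/git-hooks/pre-commit script), a hook enforcing the same check could be as simple as the following Python sketch, assuming ruff is installed and on the PATH:

```python
#!/usr/bin/env python3
# Hypothetical pre-commit hook: run ruff over the netbox/ package and block the
# commit when any violations are reported. Purely illustrative; the real hook
# shipped with NetBox may perform additional checks.
import subprocess
import sys

result = subprocess.run(["ruff", "check", "netbox/"])
if result.returncode != 0:
    print("ruff reported linting errors; commit aborted.")
    sys.exit(1)
```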
@@ -1,6 +1,6 @@
 # Style Guide

-NetBox generally follows the [Django style guide](https://docs.djangoproject.com/en/stable/internals/contributing/writing-code/coding-style/), which is itself based on [PEP 8](https://www.python.org/dev/peps/pep-0008/). [Pycodestyle](https://github.com/pycqa/pycodestyle) is used to validate code formatting, ignoring certain violations.
+NetBox generally follows the [Django style guide](https://docs.djangoproject.com/en/stable/internals/contributing/writing-code/coding-style/), which is itself based on [PEP 8](https://www.python.org/dev/peps/pep-0008/). [ruff](https://docs.astral.sh/ruff/) is used for linting (with certain [exceptions](#linter-exceptions)).

 ## Code

@@ -20,32 +20,32 @@ NetBox generally follows the [Django style guide](https://docs.djangoproject.com

 * Nested API serializers generate minimal representations of an object. These are stored separately from the primary serializers to avoid circular dependencies. Always import nested serializers from other apps directly. For example, from within the DCIM app you would write `from ipam.api.nested_serializers import NestedIPAddressSerializer`.

-### PEP 8 Exceptions
+### Linting

-NetBox ignores certain PEP8 assertions. These are listed below.
+The [ruff](https://docs.astral.sh/ruff/) linter is used to enforce code style. A [pre-commit hook](./getting-started.md#3-enable-pre-commit-hooks) which runs this automatically is included with NetBox. To invoke `ruff` manually, run:

-#### Wildcard Imports
+```
+ruff check netbox/
+```

+#### Linter Exceptions

+The following rules are ignored when linting.

+##### [E501](https://docs.astral.sh/ruff/rules/line-too-long/): Line too long

+NetBox does not enforce a hard restriction on line length, although a maximum length of 120 characters is strongly encouraged for Python code where possible. The maximum length does not apply to HTML templates or to automatically generated code (e.g. database migrations).

+##### [F403](https://docs.astral.sh/ruff/rules/undefined-local-with-import-star/): Undefined local with import star

 Wildcard imports (for example, `from .constants import *`) are acceptable under any of the following conditions:

 * The library being import contains only constant declarations (e.g. `constants.py`)
 * The library being imported explicitly defines `__all__`

-#### Maximum Line Length (E501)
+##### [F405](https://docs.astral.sh/ruff/rules/undefined-local-with-import-star-usage/): Undefined local with import star usage

-NetBox does not restrict lines to a maximum length of 79 characters. We use a maximum line length of 120 characters, however this is not enforced by CI. The maximum length does not apply to HTML templates or to automatically generated code (e.g. database migrations).
+The justification for ignoring this rule is the same as F403 above.

-#### Line Breaks Following Binary Operators (W504)

-Line breaks are permitted following binary operators.

-### Enforcing Code Style

-The [`pycodestyle`](https://pypi.org/project/pycodestyle/) utility (formerly `pep8`) is used by the CI process to enforce code style. A [pre-commit hook](./getting-started.md#3-enable-pre-commit-hooks) which runs this automatically is included with NetBox. To invoke `pycodestyle` manually, run:

-```
-pycodestyle --ignore=W504,E501 netbox/
-```

 ### Introducing New Dependencies

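The F403/F405 exceptions documented in the new style guide text exist to permit the wildcard-import pattern described above. A minimal sketch of that pattern is shown below; the module and constant names are hypothetical and not taken from the NetBox source:

```python
# constants.py: a hypothetical module containing only constant declarations.
# Because it defines __all__, a wildcard import of it remains unambiguous.
__all__ = (
    'STATUS_ACTIVE',
    'STATUS_PLANNED',
)

STATUS_ACTIVE = 'active'
STATUS_PLANNED = 'planned'


# choices.py: importing the module above with a wildcard is acceptable under the
# documented exceptions, so ruff's F403/F405 findings are ignored for this pattern.
from .constants import *

DEFAULT_STATUS = STATUS_ACTIVE
```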
@@ -18,7 +18,7 @@ __all__ = [

 # TODO: Remove in v4.2
 warnings.warn(
-f"Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
+"Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
 DeprecationWarning
 )

@@ -171,7 +171,7 @@ class CircuitTestCase(ViewTestCases.PrimaryObjectViewTestCase):
 )

 cls.csv_update_data = (
-f"id,cid,description,status",
+"id,cid,description,status",
 f"{circuits[0].pk},Circuit 7,New description7,{CircuitStatusChoices.STATUS_DECOMMISSIONED}",
 f"{circuits[1].pk},Circuit 8,New description8,{CircuitStatusChoices.STATUS_DECOMMISSIONED}",
 f"{circuits[2].pk},Circuit 9,New description9,{CircuitStatusChoices.STATUS_DECOMMISSIONED}",
@@ -16,7 +16,7 @@ __all__ = (

 # TODO: Remove in v4.2
 warnings.warn(
-f"Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
+"Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
 DeprecationWarning
 )

@@ -34,7 +34,7 @@ class LocalBackend(DataBackend):

 @contextmanager
 def fetch(self):
-logger.debug(f"Data source type is local; skipping fetch")
+logger.debug("Data source type is local; skipping fetch")
 local_path = urlparse(self.url).path # Strip file:// scheme

 yield local_path
@@ -15,7 +15,7 @@ __all__ = (
 class ChangelogMixin:

 @strawberry_django.field
-def changelog(self, info) -> List[Annotated["ObjectChangeType", strawberry.lazy('.types')]]:
+def changelog(self, info) -> List[Annotated["ObjectChangeType", strawberry.lazy('.types')]]: # noqa: F821
 content_type = ContentType.objects.get_for_model(self)
 object_changes = ObjectChange.objects.filter(
 changed_object_type=content_type,
@@ -26,7 +26,7 @@ class Command(BaseCommand):
 if invalid_names := set(options['name']) - found_names:
 raise CommandError(f"Invalid data source names: {', '.join(invalid_names)}")
 else:
-raise CommandError(f"Must specify at least one data source, or set --all.")
+raise CommandError("Must specify at least one data source, or set --all.")

 if len(options['name']) > 1:
 self.stdout.write(f"Syncing {len(datasources)} data sources.")
@@ -43,4 +43,4 @@ class Command(BaseCommand):
 raise e

 if len(options['name']) > 1:
-self.stdout.write(f"Finished.")
+self.stdout.write("Finished.")
@@ -125,7 +125,7 @@ class DataSource(JobsMixin, PrimaryModel):
 # Ensure URL scheme matches selected type
 if self.backend_class.is_local and self.url_scheme not in ('file', ''):
 raise ValidationError({
-'source_url': f"URLs for local sources must start with file:// (or specify no scheme)"
+'source_url': "URLs for local sources must start with file:// (or specify no scheme)"
 })

 def to_objectchange(self, action):
@@ -118,9 +118,9 @@ class Job(models.Model):
 # TODO: Employ dynamic registration
 if self.object_type:
 if self.object_type.model == 'reportmodule':
-return reverse(f'extras:report_result', kwargs={'job_pk': self.pk})
+return reverse('extras:report_result', kwargs={'job_pk': self.pk})
 elif self.object_type.model == 'scriptmodule':
-return reverse(f'extras:script_result', kwargs={'job_pk': self.pk})
+return reverse('extras:script_result', kwargs={'job_pk': self.pk})
 return reverse('core:job', args=[self.pk])

 def get_status_color(self):
@@ -56,7 +56,7 @@ __all__ = [

 # TODO: Remove in v4.2
 warnings.warn(
-f"Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
+"Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
 DeprecationWarning
 )

@@ -261,8 +261,8 @@ class FrontPortCreateForm(ComponentCreateForm, model_forms.FrontPortForm):
 # TODO: Clean up the application of HTMXSelect attributes
 attrs={
 'hx-get': '.',
-'hx-include': f'#form_fields',
+'hx-include': '#form_fields',
-'hx-target': f'#form_fields',
+'hx-target': '#form_fields',
 }
 )
 )
@@ -10,18 +10,18 @@ __all__ = (

 @strawberry.type
 class CabledObjectMixin:
-cable: Annotated["CableType", strawberry.lazy('dcim.graphql.types')] | None
+cable: Annotated["CableType", strawberry.lazy('dcim.graphql.types')] | None # noqa: F821

 link_peers: List[Annotated[Union[
-Annotated["CircuitTerminationType", strawberry.lazy('circuits.graphql.types')],
+Annotated["CircuitTerminationType", strawberry.lazy('circuits.graphql.types')], # noqa: F821
-Annotated["ConsolePortType", strawberry.lazy('dcim.graphql.types')],
+Annotated["ConsolePortType", strawberry.lazy('dcim.graphql.types')], # noqa: F821
-Annotated["ConsoleServerPortType", strawberry.lazy('dcim.graphql.types')],
+Annotated["ConsoleServerPortType", strawberry.lazy('dcim.graphql.types')], # noqa: F821
-Annotated["FrontPortType", strawberry.lazy('dcim.graphql.types')],
+Annotated["FrontPortType", strawberry.lazy('dcim.graphql.types')], # noqa: F821
-Annotated["InterfaceType", strawberry.lazy('dcim.graphql.types')],
+Annotated["InterfaceType", strawberry.lazy('dcim.graphql.types')], # noqa: F821
-Annotated["PowerFeedType", strawberry.lazy('dcim.graphql.types')],
+Annotated["PowerFeedType", strawberry.lazy('dcim.graphql.types')], # noqa: F821
-Annotated["PowerOutletType", strawberry.lazy('dcim.graphql.types')],
+Annotated["PowerOutletType", strawberry.lazy('dcim.graphql.types')], # noqa: F821
-Annotated["PowerPortType", strawberry.lazy('dcim.graphql.types')],
+Annotated["PowerPortType", strawberry.lazy('dcim.graphql.types')], # noqa: F821
-Annotated["RearPortType", strawberry.lazy('dcim.graphql.types')],
+Annotated["RearPortType", strawberry.lazy('dcim.graphql.types')], # noqa: F821
 ], strawberry.union("LinkPeerType")]]

@@ -29,14 +29,14 @@ class CabledObjectMixin:
 class PathEndpointMixin:

 connected_endpoints: List[Annotated[Union[
-Annotated["CircuitTerminationType", strawberry.lazy('circuits.graphql.types')],
+Annotated["CircuitTerminationType", strawberry.lazy('circuits.graphql.types')], # noqa: F821
-Annotated["ConsolePortType", strawberry.lazy('dcim.graphql.types')],
+Annotated["ConsolePortType", strawberry.lazy('dcim.graphql.types')], # noqa: F821
-Annotated["ConsoleServerPortType", strawberry.lazy('dcim.graphql.types')],
+Annotated["ConsoleServerPortType", strawberry.lazy('dcim.graphql.types')], # noqa: F821
-Annotated["FrontPortType", strawberry.lazy('dcim.graphql.types')],
+Annotated["FrontPortType", strawberry.lazy('dcim.graphql.types')], # noqa: F821
-Annotated["InterfaceType", strawberry.lazy('dcim.graphql.types')],
+Annotated["InterfaceType", strawberry.lazy('dcim.graphql.types')], # noqa: F821
-Annotated["PowerFeedType", strawberry.lazy('dcim.graphql.types')],
+Annotated["PowerFeedType", strawberry.lazy('dcim.graphql.types')], # noqa: F821
-Annotated["PowerOutletType", strawberry.lazy('dcim.graphql.types')],
+Annotated["PowerOutletType", strawberry.lazy('dcim.graphql.types')], # noqa: F821
-Annotated["PowerPortType", strawberry.lazy('dcim.graphql.types')],
+Annotated["PowerPortType", strawberry.lazy('dcim.graphql.types')], # noqa: F821
-Annotated["ProviderNetworkType", strawberry.lazy('circuits.graphql.types')],
+Annotated["ProviderNetworkType", strawberry.lazy('circuits.graphql.types')], # noqa: F821
-Annotated["RearPortType", strawberry.lazy('dcim.graphql.types')],
+Annotated["RearPortType", strawberry.lazy('dcim.graphql.types')], # noqa: F821
 ], strawberry.union("ConnectedEndpointType")]]
@@ -60,7 +60,7 @@ class Command(BaseCommand):
 self.stdout.write((self.style.SUCCESS(f' Deleted {deleted_count} paths')))

 # Reinitialize the model's PK sequence
-self.stdout.write(f'Resetting database sequence for CablePath model')
+self.stdout.write('Resetting database sequence for CablePath model')
 sequence_sql = connection.ops.sequence_reset_sql(no_style(), [CablePath])
 with connection.cursor() as cursor:
 for sql in sequence_sql:
@@ -160,7 +160,6 @@ class ModularComponentTemplateModel(ComponentTemplateModel):

 def _get_module_tree(self, module):
 modules = []
-all_module_bays = module.device.modulebays.all().select_related('module')
 while module:
 modules.append(module)
 if module.module_bay:
@@ -1,6 +1,5 @@
-from django.utils.translation import gettext_lazy as _
 import django_tables2 as tables
-from django.utils.translation import gettext as _
+from django.utils.translation import gettext_lazy as _

 from dcim import models
 from netbox.tables import NetBoxTable, columns
@@ -2135,12 +2135,12 @@ class ConnectedDeviceTest(APITestCase):
 def test_get_connected_device(self):
 url = reverse('dcim-api:connected-device-list')

-url_params = f'?peer_device=TestDevice1&peer_interface=eth0'
+url_params = '?peer_device=TestDevice1&peer_interface=eth0'
 response = self.client.get(url + url_params, **self.header)
 self.assertHttpStatus(response, status.HTTP_200_OK)
 self.assertEqual(response.data['name'], 'TestDevice2')

-url_params = f'?peer_device=TestDevice1&peer_interface=eth1'
+url_params = '?peer_device=TestDevice1&peer_interface=eth1'
 response = self.client.get(url + url_params, **self.header)
 self.assertHttpStatus(response, status.HTTP_404_NOT_FOUND)

@@ -4838,13 +4838,6 @@ class InventoryItemTestCase(TestCase, ChangeLoggedFilterSetTests):
 params = {'device_role': [role[0].slug, role[1].slug]}
 self.assertEqual(self.filterset(params, self.queryset).qs.count(), 4)

-def test_role(self):
-role = DeviceRole.objects.all()[:2]
-params = {'role_id': [role[0].pk, role[1].pk]}
-self.assertEqual(self.filterset(params, self.queryset).qs.count(), 4)
-params = {'role': [role[0].slug, role[1].slug]}
-self.assertEqual(self.filterset(params, self.queryset).qs.count(), 4)

 def test_device(self):
 devices = Device.objects.all()[:2]
 params = {'device_id': [devices[0].pk, devices[1].pk]}
@@ -662,10 +662,8 @@ class ModuleBayTestCase(TestCase):

 def test_module_bay_recursion(self):
 module_bay_1 = ModuleBay.objects.get(name='Module Bay 1')
-module_bay_2 = ModuleBay.objects.get(name='Module Bay 2')
 module_bay_3 = ModuleBay.objects.get(name='Module Bay 3')
 module_1 = Module.objects.get(module_bay=module_bay_1)
-module_2 = Module.objects.get(module_bay=module_bay_2)
 module_3 = Module.objects.get(module_bay=module_bay_3)

 # Confirm error if ModuleBay recurses
@@ -681,8 +679,6 @@ class ModuleBayTestCase(TestCase):
 module_1.save()

 def test_single_module_token(self):
-module_bays = ModuleBay.objects.all()
-modules = Module.objects.all()
 device_type = DeviceType.objects.first()
 device_role = DeviceRole.objects.first()
 site = Site.objects.first()
@@ -708,7 +704,7 @@ class ModuleBayTestCase(TestCase):
 location=location,
 rack=rack
 )
-cp = device.consoleports.first()
+device.consoleports.first()

 def test_nested_module_token(self):
 pass
@@ -733,39 +729,41 @@ class CableTestCase(TestCase):
 device2 = Device.objects.create(
 device_type=devicetype, role=role, name='TestDevice2', site=site
 )
-interface1 = Interface.objects.create(device=device1, name='eth0')
+interfaces = (
-interface2 = Interface.objects.create(device=device2, name='eth0')
+Interface(device=device1, name='eth0'),
-interface3 = Interface.objects.create(device=device2, name='eth1')
+Interface(device=device2, name='eth0'),
-Cable(a_terminations=[interface1], b_terminations=[interface2]).save()
+Interface(device=device2, name='eth1'),
+)
+Interface.objects.bulk_create(interfaces)
+Cable(a_terminations=[interfaces[0]], b_terminations=[interfaces[1]]).save()
+PowerPort.objects.create(device=device2, name='psu1')

-power_port1 = PowerPort.objects.create(device=device2, name='psu1')
+patch_panel = Device.objects.create(
-patch_pannel = Device.objects.create(
 device_type=devicetype, role=role, name='TestPatchPanel', site=site
 )
-rear_port1 = RearPort.objects.create(device=patch_pannel, name='RP1', type='8p8c')
+rear_ports = (
-front_port1 = FrontPort.objects.create(
+RearPort(device=patch_panel, name='RP1', type='8p8c'),
-device=patch_pannel, name='FP1', type='8p8c', rear_port=rear_port1, rear_port_position=1
+RearPort(device=patch_panel, name='RP2', type='8p8c', positions=2),
+RearPort(device=patch_panel, name='RP3', type='8p8c', positions=3),
+RearPort(device=patch_panel, name='RP4', type='8p8c', positions=3),
 )
-rear_port2 = RearPort.objects.create(device=patch_pannel, name='RP2', type='8p8c', positions=2)
+RearPort.objects.bulk_create(rear_ports)
-front_port2 = FrontPort.objects.create(
+front_ports = (
-device=patch_pannel, name='FP2', type='8p8c', rear_port=rear_port2, rear_port_position=1
+FrontPort(device=patch_panel, name='FP1', type='8p8c', rear_port=rear_ports[0], rear_port_position=1),
-)
+FrontPort(device=patch_panel, name='FP2', type='8p8c', rear_port=rear_ports[1], rear_port_position=1),
-rear_port3 = RearPort.objects.create(device=patch_pannel, name='RP3', type='8p8c', positions=3)
+FrontPort(device=patch_panel, name='FP3', type='8p8c', rear_port=rear_ports[2], rear_port_position=1),
-front_port3 = FrontPort.objects.create(
+FrontPort(device=patch_panel, name='FP4', type='8p8c', rear_port=rear_ports[3], rear_port_position=1),
-device=patch_pannel, name='FP3', type='8p8c', rear_port=rear_port3, rear_port_position=1
-)
-rear_port4 = RearPort.objects.create(device=patch_pannel, name='RP4', type='8p8c', positions=3)
-front_port4 = FrontPort.objects.create(
-device=patch_pannel, name='FP4', type='8p8c', rear_port=rear_port4, rear_port_position=1
 )
+FrontPort.objects.bulk_create(front_ports)

 provider = Provider.objects.create(name='Provider 1', slug='provider-1')
 provider_network = ProviderNetwork.objects.create(name='Provider Network 1', provider=provider)
 circuittype = CircuitType.objects.create(name='Circuit Type 1', slug='circuit-type-1')
 circuit1 = Circuit.objects.create(provider=provider, type=circuittype, cid='1')
 circuit2 = Circuit.objects.create(provider=provider, type=circuittype, cid='2')
-circuittermination1 = CircuitTermination.objects.create(circuit=circuit1, site=site, term_side='A')
+CircuitTermination.objects.create(circuit=circuit1, site=site, term_side='A')
-circuittermination2 = CircuitTermination.objects.create(circuit=circuit1, site=site, term_side='Z')
+CircuitTermination.objects.create(circuit=circuit1, site=site, term_side='Z')
-circuittermination3 = CircuitTermination.objects.create(circuit=circuit2, provider_network=provider_network, term_side='A')
+CircuitTermination.objects.create(circuit=circuit2, provider_network=provider_network, term_side='A')

 def test_cable_creation(self):
 """
@@ -2571,7 +2571,7 @@ class InterfaceTestCase(ViewTestCases.DeviceComponentViewTestCase):
 }

 cls.csv_data = (
-f"device,name,type,vrf.pk,poe_mode,poe_type",
+"device,name,type,vrf.pk,poe_mode,poe_type",
 f"Device 1,Interface 4,1000base-t,{vrfs[0].pk},pse,type1-ieee802.3af",
 f"Device 1,Interface 5,1000base-t,{vrfs[0].pk},pse,type1-ieee802.3af",
 f"Device 1,Interface 6,1000base-t,{vrfs[0].pk},pse,type1-ieee802.3af",
@@ -24,7 +24,7 @@ __all__ = [

 # TODO: Remove in v4.2
 warnings.warn(
-f"Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
+"Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
 DeprecationWarning
 )

@@ -48,7 +48,7 @@ class ScriptJob(JobRunner):
 except AbortTransaction:
 script.log_info(message=_("Database changes have been reverted automatically."))
 if script.failed:
-logger.warning(f"Script failed")
+logger.warning("Script failed")
 raise

 except Exception as e:
@@ -95,7 +95,7 @@ class Command(BaseCommand):
 self.stdout.write("[*] Checking for latest release")
 if settings.ISOLATED_DEPLOYMENT:
 if options['verbosity']:
-self.stdout.write(f"\tSkipping: ISOLATED_DEPLOYMENT is enabled")
+self.stdout.write("\tSkipping: ISOLATED_DEPLOYMENT is enabled")
 elif settings.RELEASE_CHECK_URL:
 headers = {
 'Accept': 'application/vnd.github.v3+json',
@@ -129,7 +129,7 @@ class Command(BaseCommand):
 self.stdout.write(f"\tRequest error: {exc}", self.style.ERROR)
 else:
 if options['verbosity']:
-self.stdout.write(f"\tSkipping: RELEASE_CHECK_URL not set")
+self.stdout.write("\tSkipping: RELEASE_CHECK_URL not set")

 if options['verbosity']:
 self.stdout.write("Finished.", self.style.SUCCESS)
@@ -96,9 +96,9 @@ class Command(BaseCommand):
 if i:
 self.stdout.write(f'{i} entries cached.')
 else:
-self.stdout.write(f'No objects found.')
+self.stdout.write('No objects found.')

-msg = f'Completed.'
+msg = 'Completed.'
 if total_count := search_backend.size:
 msg += f' Total entries: {total_count}'
 self.stdout.write(msg, self.style.SUCCESS)
@@ -51,7 +51,7 @@ class Command(BaseCommand):
 user = User.objects.filter(is_superuser=True).order_by('pk')[0]

 # Setup logging to Stdout
-formatter = logging.Formatter(f'[%(asctime)s][%(levelname)s] - %(message)s')
+formatter = logging.Formatter('[%(asctime)s][%(levelname)s] - %(message)s')
 stdouthandler = logging.StreamHandler(sys.stdout)
 stdouthandler.setLevel(logging.DEBUG)
 stdouthandler.setFormatter(formatter)
@@ -283,7 +283,7 @@ class CustomField(CloningMixin, ExportTemplatesMixin, ChangeLoggedModel):
 """
 for ct in content_types:
 model = ct.model_class()
-instances = model.objects.exclude(**{f'custom_field_data__contains': self.name})
+instances = model.objects.exclude(**{'custom_field_data__contains': self.name})
 for instance in instances:
 instance.custom_field_data[self.name] = self.default
 model.objects.bulk_update(instances, ['custom_field_data'], batch_size=100)
@@ -554,7 +554,7 @@ class BaseScript:
 """
 Run the report and save its results. Each test method will be executed in order.
 """
-self.logger.info(f"Running report")
+self.logger.info("Running report")

 try:
 for test_name in self.tests:
@@ -162,7 +162,7 @@ class CustomValidatorTest(TestCase):
 Site(name='abcdef123', slug='abcdef123').clean()

 @override_settings(CUSTOM_VALIDATORS={'dcim.site': [region_validator]})
-def test_valid(self):
+def test_related_object(self):
 region1 = Region(name='Foo', slug='foo')
 region1.save()
 region2 = Region(name='Bar', slug='bar')
@@ -49,11 +49,11 @@ class ConfigContextTest(TestCase):
 sitegroup = SiteGroup.objects.create(name='Site Group')
 site = Site.objects.create(name='Site 1', slug='site-1', region=region, group=sitegroup)
 location = Location.objects.create(name='Location 1', slug='location-1', site=site)
-platform = Platform.objects.create(name='Platform')
+Platform.objects.create(name='Platform')
 tenantgroup = TenantGroup.objects.create(name='Tenant Group')
-tenant = Tenant.objects.create(name='Tenant', group=tenantgroup)
+Tenant.objects.create(name='Tenant', group=tenantgroup)
-tag1 = Tag.objects.create(name='Tag', slug='tag')
+Tag.objects.create(name='Tag', slug='tag')
-tag2 = Tag.objects.create(name='Tag2', slug='tag2')
+Tag.objects.create(name='Tag2', slug='tag2')

 Device.objects.create(
 name='Device 1',
@@ -417,7 +417,7 @@ class EventRulesTestCase(ViewTestCases.PrimaryObjectViewTestCase):
 }

 cls.csv_data = (
-f'name,object_types,event_types,action_type,action_object',
+'name,object_types,event_types,action_type,action_object',
 f'Webhook 4,dcim.site,"{OBJECT_CREATED},{OBJECT_UPDATED}",webhook,Webhook 1',
 )

@@ -30,7 +30,7 @@ __all__ = [

 # TODO: Remove in v4.2
 warnings.warn(
-f"Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
+"Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
 DeprecationWarning
 )

@@ -186,13 +186,13 @@ class AvailableObjectsView(ObjectValidationMixin, APIView):
 """
 Return the parent object.
 """
-raise NotImplemented()
+raise NotImplementedError()

 def get_available_objects(self, parent, limit=None):
 """
 Return all available objects for the parent.
 """
-raise NotImplemented()
+raise NotImplementedError()

 def get_extra_context(self, parent):
 """
|
|||||||
# Determine if the requested number of objects is available
|
# Determine if the requested number of objects is available
|
||||||
if not self.check_sufficient_available(serializer.validated_data, available_objects):
|
if not self.check_sufficient_available(serializer.validated_data, available_objects):
|
||||||
return Response(
|
return Response(
|
||||||
{"detail": f"Insufficient resources are available to satisfy the request"},
|
{"detail": "Insufficient resources are available to satisfy the request"},
|
||||||
status=status.HTTP_409_CONFLICT
|
status=status.HTTP_409_CONFLICT
|
||||||
)
|
)
|
||||||
|
|
||||||
|
@ -10,9 +10,9 @@ __all__ = (
|
|||||||
|
|
||||||
@strawberry.type
|
@strawberry.type
|
||||||
class IPAddressesMixin:
|
class IPAddressesMixin:
|
||||||
ip_addresses: List[Annotated["IPAddressType", strawberry.lazy('ipam.graphql.types')]]
|
ip_addresses: List[Annotated["IPAddressType", strawberry.lazy('ipam.graphql.types')]] # noqa: F821
|
||||||
|
|
||||||
|
|
||||||
@strawberry.type
|
@strawberry.type
|
||||||
class VLANGroupsMixin:
|
class VLANGroupsMixin:
|
||||||
vlan_groups: List[Annotated["VLANGroupType", strawberry.lazy('ipam.graphql.types')]]
|
vlan_groups: List[Annotated["VLANGroupType", strawberry.lazy('ipam.graphql.types')]] # noqa: F821
|
||||||
|
@@ -700,8 +700,6 @@ class IPAddressTest(APIViewTestCases.APIViewTestCase):
 device1.primary_ip4 = ip_addresses[0]
 device1.save()

-ip2 = ip_addresses[1]

 url = reverse('ipam-api:ipaddress-detail', kwargs={'pk': ip1.pk})
 self.add_permissions('ipam.change_ipaddress')

@@ -50,7 +50,7 @@ class ASNRangeTestCase(ViewTestCases.PrimaryObjectViewTestCase):
 }

 cls.csv_data = (
-f"name,slug,rir,tenant,start,end,description",
+"name,slug,rir,tenant,start,end,description",
 f"ASN Range 4,asn-range-4,{rirs[1].name},{tenants[1].name},400,499,Fourth range",
 f"ASN Range 5,asn-range-5,{rirs[1].name},{tenants[1].name},500,599,Fifth range",
 f"ASN Range 6,asn-range-6,{rirs[1].name},{tenants[1].name},600,699,Sixth range",
@@ -770,14 +770,14 @@ class VLANGroupTestCase(ViewTestCases.OrganizationalObjectViewTestCase):
 }

 cls.csv_data = (
-f"name,slug,scope_type,scope_id,description",
+"name,slug,scope_type,scope_id,description",
-f"VLAN Group 4,vlan-group-4,,,Fourth VLAN group",
+"VLAN Group 4,vlan-group-4,,,Fourth VLAN group",
 f"VLAN Group 5,vlan-group-5,dcim.site,{sites[0].pk},Fifth VLAN group",
 f"VLAN Group 6,vlan-group-6,dcim.site,{sites[1].pk},Sixth VLAN group",
 )

 cls.csv_update_data = (
-f"id,name,description",
+"id,name,description",
 f"{vlan_groups[0].pk},VLAN Group 7,Fourth VLAN group7",
 f"{vlan_groups[1].pk},VLAN Group 8,Fifth VLAN group8",
 f"{vlan_groups[2].pk},VLAN Group 9,Sixth VLAN group9",
@@ -85,7 +85,7 @@ class Config:
 logger.debug("Loaded configuration data from database")
 except DatabaseError:
 # The database may not be available yet (e.g. when running a management command)
-logger.warning(f"Skipping config initialization (database unavailable)")
+logger.warning("Skipping config initialization (database unavailable)")
 return

 revision.activate()
@@ -50,4 +50,4 @@ class DataBackend:
 2. Yields the local path at which data has been replicated
 3. Performs any necessary cleanup
 """
-raise NotImplemented()
+raise NotImplementedError()
@@ -386,57 +386,57 @@ ADMIN_MENU = Menu(
 label=_('Authentication'),
 items=(
 MenuItem(
-link=f'users:user_list',
+link='users:user_list',
 link_text=_('Users'),
 auth_required=True,
-permissions=[f'users.view_user'],
+permissions=['users.view_user'],
 buttons=(
 MenuItemButton(
-link=f'users:user_add',
+link='users:user_add',
 title='Add',
 icon_class='mdi mdi-plus-thick',
-permissions=[f'users.add_user']
+permissions=['users.add_user']
 ),
 MenuItemButton(
-link=f'users:user_import',
+link='users:user_import',
 title='Import',
 icon_class='mdi mdi-upload',
-permissions=[f'users.add_user']
+permissions=['users.add_user']
 )
 )
 ),
 MenuItem(
-link=f'users:group_list',
+link='users:group_list',
 link_text=_('Groups'),
 auth_required=True,
-permissions=[f'users.view_group'],
+permissions=['users.view_group'],
 buttons=(
 MenuItemButton(
-link=f'users:group_add',
+link='users:group_add',
 title='Add',
 icon_class='mdi mdi-plus-thick',
-permissions=[f'users.add_group']
+permissions=['users.add_group']
 ),
 MenuItemButton(
-link=f'users:group_import',
+link='users:group_import',
 title='Import',
 icon_class='mdi mdi-upload',
-permissions=[f'users.add_group']
+permissions=['users.add_group']
 )
 )
 ),
 MenuItem(
-link=f'users:token_list',
+link='users:token_list',
 link_text=_('API Tokens'),
 auth_required=True,
-permissions=[f'users.view_token'],
+permissions=['users.view_token'],
 buttons=get_model_buttons('users', 'token')
 ),
 MenuItem(
-link=f'users:objectpermission_list',
+link='users:objectpermission_list',
 link_text=_('Permissions'),
 auth_required=True,
-permissions=[f'users.view_objectpermission'],
+permissions=['users.view_objectpermission'],
 buttons=get_model_buttons('users', 'objectpermission', actions=['add'])
 ),
 ),
@@ -198,7 +198,7 @@ if len(SECRET_KEY) < 50:
 if RELEASE_CHECK_URL:
 try:
 URLValidator()(RELEASE_CHECK_URL)
-except ValidationError as e:
+except ValidationError:
 raise ImproperlyConfigured(
 "RELEASE_CHECK_URL must be a valid URL. Example: https://api.github.com/repos/netbox-community/netbox"
 )
@@ -80,7 +80,7 @@ class checkout:
 Create Change instances for all actions stored in the queue.
 """
 if not self.queue:
-logger.debug(f"No queued changes; aborting")
+logger.debug("No queued changes; aborting")
 return
 logger.debug(f"Processing {len(self.queue)} queued changes")

@@ -21,7 +21,7 @@ class DummyModelsView(View):
 class DummyModelAddView(View):

 def get(self, request):
-return HttpResponse(f"Create an instance")
+return HttpResponse("Create an instance")

 def post(self, request):
 instance = DummyModel(
|
|||||||
number=random.randint(1, 100000)
|
number=random.randint(1, 100000)
|
||||||
)
|
)
|
||||||
instance.save()
|
instance.save()
|
||||||
return HttpResponse(f"Instance created")
|
return HttpResponse("Instance created")
|
||||||
|
|
||||||
|
|
||||||
@register_model_view(Site, 'extra', path='other-stuff')
|
@register_model_view(Site, 'extra', path='other-stuff')
|
||||||
|
@ -106,7 +106,7 @@ class ExternalAuthenticationTestCase(TestCase):
|
|||||||
self.assertEqual(settings.REMOTE_AUTH_HEADER, 'HTTP_REMOTE_USER')
|
self.assertEqual(settings.REMOTE_AUTH_HEADER, 'HTTP_REMOTE_USER')
|
||||||
|
|
||||||
# Client should not be authenticated
|
# Client should not be authenticated
|
||||||
response = self.client.get(reverse('home'), follow=True, **headers)
|
self.client.get(reverse('home'), follow=True, **headers)
|
||||||
self.assertNotIn('_auth_user_id', self.client.session)
|
self.assertNotIn('_auth_user_id', self.client.session)
|
||||||
|
|
||||||
@override_settings(
|
@override_settings(
|
||||||
|
@ -77,7 +77,6 @@ class CSVImportTestCase(ModelViewTestCase):
|
|||||||
self.assertHttpStatus(self.client.post(self._get_url('import'), data), 302)
|
self.assertHttpStatus(self.client.post(self._get_url('import'), data), 302)
|
||||||
regions = Region.objects.all()
|
regions = Region.objects.all()
|
||||||
self.assertEqual(regions.count(), 4)
|
self.assertEqual(regions.count(), 4)
|
||||||
region = Region.objects.get(slug="region-4")
|
|
||||||
self.assertEqual(
|
self.assertEqual(
|
||||||
list(regions[0].tags.values_list('name', flat=True)),
|
list(regions[0].tags.values_list('name', flat=True)),
|
||||||
['Alpha', 'Bravo']
|
['Alpha', 'Bravo']
|
||||||
|
@@ -15,7 +15,7 @@ __all__ = [

 # TODO: Remove in v4.2
 warnings.warn(
-f"Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
+"Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
 DeprecationWarning
 )

@@ -10,4 +10,4 @@ __all__ = (
 @strawberry.type
 class ContactAssignmentsMixin:

-assignments: List[Annotated["ContactAssignmentType", strawberry.lazy('tenancy.graphql.types')]]
+assignments: List[Annotated["ContactAssignmentType", strawberry.lazy('tenancy.graphql.types')]] # noqa: F821
@@ -18,7 +18,7 @@ __all__ = [

 # TODO: Remove in v4.2
 warnings.warn(
-f"Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
+"Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
 DeprecationWarning
 )

@@ -73,7 +73,7 @@ class TokenProvisionView(APIView):

 def perform_create(self, serializer):
 model = serializer.Meta.model
-logger = logging.getLogger(f'netbox.api.views.TokenProvisionView')
+logger = logging.getLogger('netbox.api.views.TokenProvisionView')
 logger.info(f"Creating new {model._meta.verbose_name}")
 serializer.save()

@@ -36,7 +36,6 @@ class UserConfigFormMetaclass(forms.models.ModelFormMetaclass):
 # Emulate a declared field for each supported user preference
 preference_fields = {}
 for field_name, preference in PREFERENCES.items():
-description = f'{preference.description}<br />' if preference.description else ''
 help_text = f'<code>{field_name}</code>'
 if preference.description:
 help_text = f'{preference.description}<br />{help_text}'
@@ -51,11 +51,11 @@ class UserPreferencesTest(TestCase):

 # Check that table ordering preference has been recorded
 self.user.refresh_from_db()
-ordering = self.user.config.get(f'tables.SiteTable.ordering')
+ordering = self.user.config.get('tables.SiteTable.ordering')
 self.assertEqual(ordering, ['status'])

 # Check that a recorded preference is honored by default
-self.user.config.set(f'tables.SiteTable.ordering', ['-status'], commit=True)
+self.user.config.set('tables.SiteTable.ordering', ['-status'], commit=True)
 table = SiteTable(Site.objects.all())
 request = RequestFactory().get(url)
 request.user = self.user
@@ -142,7 +142,7 @@ class DynamicModelChoiceMixin:

 if data:
 # When the field is multiple choice pass the data as a list if it's not already
-if isinstance(bound_field.field, DynamicModelMultipleChoiceField) and not type(data) is list:
+if isinstance(bound_field.field, DynamicModelMultipleChoiceField) and type(data) is not list:
 data = [data]

 field_name = getattr(self, 'to_field_name') or 'pk'
@@ -59,7 +59,7 @@ def highlight(value, highlight, trim_pre=None, trim_post=None, trim_placeholder=
 else:
 highlight = re.escape(highlight)
 pre, match, post = re.split(fr'({highlight})', value, maxsplit=1, flags=re.IGNORECASE)
-except ValueError as e:
+except ValueError:
 # Match not found
 return escape(value)

@@ -149,7 +149,7 @@ class APIPaginationTestCase(APITestCase):

 self.assertHttpStatus(response, status.HTTP_200_OK)
 self.assertEqual(response.data['count'], 100)
-self.assertTrue(response.data['next'].endswith(f'?limit=10&offset=10'))
+self.assertTrue(response.data['next'].endswith('?limit=10&offset=10'))
 self.assertIsNone(response.data['previous'])
 self.assertEqual(len(response.data['results']), 10)

|
|||||||
|
|
||||||
self.assertHttpStatus(response, status.HTTP_200_OK)
|
self.assertHttpStatus(response, status.HTTP_200_OK)
|
||||||
self.assertEqual(response.data['count'], 100)
|
self.assertEqual(response.data['count'], 100)
|
||||||
self.assertTrue(response.data['next'].endswith(f'?limit=20&offset=20'))
|
self.assertTrue(response.data['next'].endswith('?limit=20&offset=20'))
|
||||||
self.assertIsNone(response.data['previous'])
|
self.assertIsNone(response.data['previous'])
|
||||||
self.assertEqual(len(response.data['results']), 20)
|
self.assertEqual(len(response.data['results']), 20)
|
||||||
|
|
||||||
|
@ -85,7 +85,7 @@ class CountersTest(TestCase):
|
|||||||
def test_mptt_child_delete(self):
|
def test_mptt_child_delete(self):
|
||||||
device1, device2 = Device.objects.all()
|
device1, device2 = Device.objects.all()
|
||||||
inventory_item1 = InventoryItem.objects.create(device=device1, name='Inventory Item 1')
|
inventory_item1 = InventoryItem.objects.create(device=device1, name='Inventory Item 1')
|
||||||
inventory_item2 = InventoryItem.objects.create(device=device1, name='Inventory Item 2', parent=inventory_item1)
|
InventoryItem.objects.create(device=device1, name='Inventory Item 2', parent=inventory_item1)
|
||||||
device1.refresh_from_db()
|
device1.refresh_from_db()
|
||||||
self.assertEqual(device1.inventory_item_count, 2)
|
self.assertEqual(device1.inventory_item_count, 2)
|
||||||
|
|
||||||
|
@ -18,7 +18,7 @@ __all__ = [
|
|||||||
|
|
||||||
# TODO: Remove in v4.2
|
# TODO: Remove in v4.2
|
||||||
warnings.warn(
|
warnings.warn(
|
||||||
f"Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
|
"Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
|
||||||
DeprecationWarning
|
DeprecationWarning
|
||||||
)
|
)
|
||||||
|
|
||||||
|
@@ -354,14 +354,14 @@ class VMInterfaceTestCase(ViewTestCases.DeviceComponentViewTestCase):
 }

 cls.csv_data = (
-f"virtual_machine,name,vrf.pk",
+"virtual_machine,name,vrf.pk",
 f"Virtual Machine 2,Interface 4,{vrfs[0].pk}",
 f"Virtual Machine 2,Interface 5,{vrfs[0].pk}",
 f"Virtual Machine 2,Interface 6,{vrfs[0].pk}",
 )

 cls.csv_update_data = (
-f"id,name,description",
+"id,name,description",
 f"{interfaces[0].pk},Interface 7,New description 7",
 f"{interfaces[1].pk},Interface 8,New description 8",
 f"{interfaces[2].pk},Interface 9,New description 9",
@@ -438,14 +438,14 @@ class VirtualDiskTestCase(ViewTestCases.DeviceComponentViewTestCase):
 }

 cls.csv_data = (
-f"virtual_machine,name,size,description",
+"virtual_machine,name,size,description",
-f"Virtual Machine 1,Disk 4,20,Fourth",
+"Virtual Machine 1,Disk 4,20,Fourth",
-f"Virtual Machine 1,Disk 5,20,Fifth",
+"Virtual Machine 1,Disk 5,20,Fifth",
-f"Virtual Machine 1,Disk 6,20,Sixth",
+"Virtual Machine 1,Disk 6,20,Sixth",
 )

 cls.csv_update_data = (
-f"id,name,size",
+"id,name,size",
 f"{disks[0].pk},disk1,20",
 f"{disks[1].pk},disk2,20",
 f"{disks[2].pk},disk3,20",
@@ -657,7 +657,7 @@ class VirtualMachineBulkAddInterfaceView(generic.BulkComponentCreateView):
     default_return_url = 'virtualization:virtualmachine_list'

     def get_required_permission(self):
-        return f'virtualization.add_vminterface'
+        return 'virtualization.add_vminterface'


 class VirtualMachineBulkAddVirtualDiskView(generic.BulkComponentCreateView):
@@ -671,4 +671,4 @@ class VirtualMachineBulkAddVirtualDiskView(generic.BulkComponentCreateView):
     default_return_url = 'virtualization:virtualmachine_list'

     def get_required_permission(self):
-        return f'virtualization.add_virtualdisk'
+        return 'virtualization.add_virtualdisk'
@@ -21,7 +21,7 @@ __all__ = (

 # TODO: Remove in v4.2
 warnings.warn(
-    f"Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
+    "Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
     DeprecationWarning
 )
@@ -147,17 +147,6 @@ class IKEProposalFilterSet(NetBoxModelFilterSet):
     group = django_filters.MultipleChoiceFilter(
         choices=DHGroupChoices
     )
-    ike_policy_id = django_filters.ModelMultipleChoiceFilter(
-        field_name='ike_policies',
-        queryset=IKEPolicy.objects.all(),
-        label=_('IKE policy (ID)'),
-    )
-    ike_policy = django_filters.ModelMultipleChoiceFilter(
-        field_name='ike_policies__name',
-        queryset=IKEPolicy.objects.all(),
-        to_field_name='name',
-        label=_('IKE policy (name)'),
-    )

     class Meta:
         model = IKEProposal
@@ -385,13 +385,6 @@ class IKEProposalTestCase(TestCase, ChangeLoggedFilterSetTests):
         params = {'sa_lifetime': [1000, 2000]}
         self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)

-    def test_ike_policy(self):
-        ike_policies = IKEPolicy.objects.all()[:2]
-        params = {'ike_policy_id': [ike_policies[0].pk, ike_policies[1].pk]}
-        self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
-        params = {'ike_policy': [ike_policies[0].name, ike_policies[1].name]}
-        self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
-

 class IKEPolicyTestCase(TestCase, ChangeLoggedFilterSetTests):
     queryset = IKEPolicy.objects.all()
@@ -542,9 +542,9 @@ class IPSecProfileTestCase(ViewTestCases.PrimaryObjectViewTestCase):

         cls.csv_data = (
             "name,mode,ike_policy,ipsec_policy",
-            f"IKE Proposal 4,ah,IKE Policy 2,IPSec Policy 2",
-            f"IKE Proposal 5,ah,IKE Policy 2,IPSec Policy 2",
-            f"IKE Proposal 6,ah,IKE Policy 2,IPSec Policy 2",
+            "IKE Proposal 4,ah,IKE Policy 2,IPSec Policy 2",
+            "IKE Proposal 5,ah,IKE Policy 2,IPSec Policy 2",
+            "IKE Proposal 6,ah,IKE Policy 2,IPSec Policy 2",
         )

         cls.csv_update_data = (
@@ -661,7 +661,7 @@ class L2VPNTerminationTestCase(
         )

         cls.csv_update_data = (
-            f"id,l2vpn",
+            "id,l2vpn",
             f"{terminations[0].pk},{l2vpns[0].name}",
             f"{terminations[1].pk},{l2vpns[0].name}",
             f"{terminations[2].pk},{l2vpns[0].name}",
@@ -12,7 +12,7 @@ __all__ = (

 # TODO: Remove in v4.2
 warnings.warn(
-    f"Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
+    "Dedicated nested serializers will be removed in NetBox v4.2. Use Serializer(nested=True) instead.",
     DeprecationWarning
 )
@@ -102,14 +102,14 @@ class WirelessLANTestCase(ViewTestCases.PrimaryObjectViewTestCase):
         }

         cls.csv_data = (
-            f"group,ssid,status,tenant",
+            "group,ssid,status,tenant",
             f"Wireless LAN Group 2,WLAN4,{WirelessLANStatusChoices.STATUS_ACTIVE},{tenants[0].name}",
             f"Wireless LAN Group 2,WLAN5,{WirelessLANStatusChoices.STATUS_DISABLED},{tenants[1].name}",
             f"Wireless LAN Group 2,WLAN6,{WirelessLANStatusChoices.STATUS_RESERVED},{tenants[2].name}",
         )

         cls.csv_update_data = (
-            f"id,ssid",
+            "id,ssid",
             f"{wireless_lans[0].pk},WLAN7",
             f"{wireless_lans[1].pk},WLAN8",
             f"{wireless_lans[2].pk},WLAN9",
@@ -167,7 +167,7 @@ class WirelessLinkTestCase(ViewTestCases.PrimaryObjectViewTestCase):
         }

         cls.csv_data = (
-            f"interface_a,interface_b,status,tenant",
+            "interface_a,interface_b,status,tenant",
             f"{interfaces[6].pk},{interfaces[7].pk},connected,{tenants[0].name}",
             f"{interfaces[8].pk},{interfaces[9].pk},connected,{tenants[1].name}",
             f"{interfaces[10].pk},{interfaces[11].pk},connected,{tenants[2].name}",
@@ -28,8 +28,8 @@ if [ ${NOVALIDATE} ]; then
     exit $EXIT
 fi

-echo "Validating PEP8 compliance..."
-pycodestyle --ignore=W504,E501 --exclude=node_modules netbox/
+echo "Linting with ruff..."
+ruff check netbox/
 if [ $? != 0 ]; then
     EXIT=1
 fi