Mirror of https://github.com/netbox-community/netbox.git (synced 2025-07-16 04:02:52 -06:00)

Compare commits (55 commits):

b88b5b0b1b, 6eeb382512, e5d6c71171, f777bfee2e, 8b63eb64c1, cff29f9551, a5c0cae112,
2a27e475e4, 44efa037cc, 6c17629159, f13d028c98, f5d32b1bf1, f05897d61a, b5421f1cd6,
23cc4f1c41, 9c2cd66162, f61a2964c8, ee94fb0b94, 8fb8f4c75b, e33793dc82, 3b8841ee3b,
ea4c205a37, 2a5d3abafb, 71e6ea5785, 0a9887b42f, 3ecf29d797, c48e4f590e, aee83a434a,
a17699d261, f97d07a11c, 1fd3d390ae, 7dab7d730d, c660f1c019, 334b45f55a, e6c1cebd34,
a9af541e81, f706572113, 6a6286777c, afeddee10d, a48bee2a2e, b9db6ebd63, 9e0493c64c,
e3509c092a, 762cfc7d10, 522f80ed9d, fd6062de75, c872cce59f, dc8267d890, 2bfb9f4ed0,
dda0a55e5e, 2680f855ff, 6ca791850a, 43df06f210, 7e6b1bbd79, 716acaa657
@@ -15,7 +15,7 @@ body:
 attributes:
 label: NetBox version
 description: What version of NetBox are you currently running?
-placeholder: v4.3.2
+placeholder: v4.3.4
 validations:
 required: true
 - type: dropdown

.github/ISSUE_TEMPLATE/02-bug_report.yaml (vendored, 2 changes)
@@ -27,7 +27,7 @@ body:
 attributes:
 label: NetBox Version
 description: What version of NetBox are you currently running?
-placeholder: v4.3.2
+placeholder: v4.3.4
 validations:
 required: true
 - type: dropdown
@@ -8,7 +8,7 @@
 </h3>
 <h3>
 :jigsaw: <a href="#jigsaw-creating-plugins">Create a plugin</a> ·
-:rescue_worker_helmet: <a href="#rescue_worker_helmet-become-a-maintainer">Become a maintainer</a> ·
+:briefcase: <a href="#briefcase-looking-for-a-job">Work with us!</a> ·
 :heart: <a href="#heart-other-ways-to-contribute">Other ideas</a>
 </h3>
 </div>
@@ -109,21 +109,9 @@ Do you have an idea for something you'd like to build in NetBox, but might not b
 
 Check out our [plugin development tutorial](https://github.com/netbox-community/netbox-plugin-tutorial) to get started!
 
-## :rescue_worker_helmet: Become a Maintainer
+## :briefcase: Looking for a Job?
 
-We're always looking for motivated individuals to join the maintainers team and help drive NetBox's long-term development. Some of our most sought-after skills include:
-
-* Python development with a strong focus on the [Django](https://www.djangoproject.com/) framework
-* Expertise working with PostgreSQL databases
-* Javascript & TypeScript proficiency
-* A knack for web application design (HTML & CSS)
-* Familiarity with git and software development best practices
-* Excellent attention to detail
-* Working experience in the field of network operations & engineering
-
-We generally ask that maintainers dedicate around four hours of work to the project each week on average, which includes both hands-on development and project management tasks such as issue triage. Maintainers are also encouraged (but not required) to attend our bi-weekly Zoom call to catch up on recent items.
-
-Interested? You can contact our lead maintainer, Jeremy Stretch, at jeremy@netbox.dev or on the [NetDev Community Slack](https://netdev.chat/). We'd love to have you on the team!
+At [NetBox Labs](https://netboxlabs.com/), we're always looking for highly skilled and motivated people to join our team. While NetBox is a core part of our product lineup, we have an ever-expanding suite of solutions serving the network automation space. Check out our [current openings](https://netboxlabs.com/careers/) to see if you might be a fit!
 
 ## :heart: Other Ways to Contribute
 
@@ -6,9 +6,9 @@
 <a href="https://github.com/netbox-community/netbox/graphs/contributors"><img src="https://img.shields.io/github/contributors/netbox-community/netbox?color=blue" alt="Contributors" /></a>
 <a href="https://github.com/netbox-community/netbox/stargazers"><img src="https://img.shields.io/github/stars/netbox-community/netbox?style=flat" alt="GitHub stars" /></a>
 <a href="https://explore.transifex.com/netbox-community/netbox/"><img src="https://img.shields.io/badge/languages-15-blue" alt="Languages supported" /></a>
-<a href="https://github.com/netbox-community/netbox/actions/workflows/ci.yml"><img src="https://github.com/netbox-community/netbox/workflows/CI/badge.svg?branch=main" alt="CI status" /></a>
+<a href="https://github.com/netbox-community/netbox/actions/workflows/ci.yml"><img src="https://github.com/netbox-community/netbox/actions/workflows/ci.yml/badge.svg" alt="CI status" /></a>
 <p>
-<strong><a href="https://github.com/netbox-community/netbox/">NetBox Community</a></strong> |
+<strong><a href="https://netboxlabs.com/community/">NetBox Community</a></strong> |
 <strong><a href="https://netboxlabs.com/netbox-cloud/">NetBox Cloud</a></strong> |
 <strong><a href="https://netboxlabs.com/netbox-enterprise/">NetBox Enterprise</a></strong>
 </p>
@@ -14,6 +14,10 @@ django-debug-toolbar
 # https://github.com/carltongibson/django-filter/blob/main/CHANGES.rst
 django-filter
 
+# Django Debug Toolbar extension for GraphiQL
+# https://github.com/flavors/django-graphiql-debug-toolbar/blob/main/CHANGES.rst
+django-graphiql-debug-toolbar
+
 # HTMX utilities for Django
 # https://django-htmx.readthedocs.io/en/latest/changelog.html
 django-htmx
@@ -108,6 +112,7 @@ nh3
 
 # Fork of PIL (Python Imaging Library) for image processing
 # https://github.com/python-pillow/Pillow/releases
+# https://pillow.readthedocs.io/en/stable/releasenotes/
 Pillow
 
 # PostgreSQL database adapter for Python
@@ -126,21 +131,22 @@ requests
 # https://github.com/rq/rq/blob/master/CHANGES.md
 rq
 
-# Social authentication framework
-# https://github.com/python-social-auth/social-core/blob/master/CHANGELOG.md
-social-auth-core
-
 # Django app for social-auth-core
 # https://github.com/python-social-auth/social-app-django/blob/master/CHANGELOG.md
 social-auth-app-django
 
+# Social authentication framework
+# https://github.com/python-social-auth/social-core/blob/master/CHANGELOG.md
+social-auth-core
+
 # Strawberry GraphQL
 # https://github.com/strawberry-graphql/strawberry/blob/main/CHANGELOG.md
 strawberry-graphql
 
 # Strawberry GraphQL Django extension
 # https://github.com/strawberry-graphql/strawberry-django/releases
-strawberry-graphql-django
+# See #19771
+strawberry-graphql-django==0.60.0
 
 # SVG image rendering (used for rack elevations)
 # https://github.com/mozman/svgwrite/blob/master/NEWS.rst
@@ -158,6 +158,7 @@ LOGGING = {
 * `netbox.<app>.<model>` - Generic form for model-specific log messages
 * `netbox.auth.*` - Authentication events
 * `netbox.api.views.*` - Views which handle business logic for the REST API
+* `netbox.event_rules` - Event rules
 * `netbox.reports.*` - Report execution (`module.name`)
 * `netbox.scripts.*` - Custom script execution (`module.name`)
 * `netbox.views.*` - Views which handle business logic for the web UI
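For reference, a minimal sketch of how one of these logger names might be wired up in a standard Django `LOGGING` dict (the handler name below is an arbitrary example, not part of NetBox):

```python
# Illustrative logging configuration; only the logger names come from the docs above.
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'console': {'class': 'logging.StreamHandler'},
    },
    'loggers': {
        # Capture event rule evaluation messages via the new netbox.event_rules logger
        'netbox.event_rules': {'handlers': ['console'], 'level': 'INFO'},
        # Capture authentication events
        'netbox.auth': {'handlers': ['console'], 'level': 'INFO'},
    },
}
```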
@@ -147,7 +147,7 @@ For UI development you will need to review the [Web UI Development Guide](web-ui
 
 ## Populating Demo Data
 
-Once you have your development environment up and running, it might be helpful to populate some "dummy" data to make interacting with the UI and APIs more convenient. Check out the [netbox-demo-data](https://github.com/netbox-community/netbox-demo-data) repo on GitHub, which houses a collection of sample data that can be easily imported to any new NetBox deployment. (This sample data is used to populate the public demo instance at <https://demo.netbox.dev>.)
+Once you have your development environment up and running, it might be helpful to populate some "dummy" data to make interacting with the UI and APIs more convenient. Check out the [netbox-demo-data](https://github.com/netbox-community/netbox-demo-data) repo on GitHub, which houses a collection of sample data that can be easily imported to any new NetBox deployment. This sample data is used to populate the [public demo instance](https://demo.netbox.dev).
 
 The demo data is provided in JSON format and loaded into an empty database using Django's `loaddata` management command. Consult the demo data repo's `README` file for complete instructions on populating the data.
 
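As a rough sketch only, the `loaddata` step could also be driven from the Django shell with `call_command`; the fixture filename below is a placeholder, so consult the demo data repo's README for the actual files and procedure:

```python
# Illustrative only: load a demo-data fixture into an empty NetBox database.
# '/tmp/netbox-demo-data.json' is a hypothetical path.
from django.core.management import call_command

call_command('loaddata', '/tmp/netbox-demo-data.json')
```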
@@ -166,7 +166,8 @@ Then, compile these portable (`.po`) files for use in the application:
 
 ### Update Version and Changelog
 
-* Update the version number and date in `netbox/release.yaml` and `pyproject.toml`. Add or remove the designation (e.g. `beta1`) if applicable.
+* Update the version number and published date in `netbox/release.yaml`. Add or remove the designation (e.g. `beta1`) if applicable.
+* Copy the version number from `release.yaml` to `pyproject.toml` in the project root.
 * Update the example version numbers in the feature request and bug report templates under `.github/ISSUE_TEMPLATES/`.
 * Add a section for this release at the top of the changelog page for the minor version (e.g. `docs/release-notes/version-4.2.md`) listing all relevant changes made in this release.
 
@@ -192,15 +193,3 @@ Create a [new release](https://github.com/netbox-community/netbox/releases/new)
 * **Description:** Copy from the pull request body, then promote the `###` headers to `##` ones
 
 Once created, the release will become available for users to install.
-
-### Update the Public Documentation
-
-After a release has been published, the public NetBox documentation needs to be updated. This is accomplished by running two actions on the [netboxlabs-docs](https://github.com/netboxlabs/netboxlabs-docs) repository.
-
-First, run the `build-site` action, by navigating to Actions > build-site > Run workflow. This process compiles the documentation along with an overlay for integration with the documentation portal at <https://netboxlabs.com/docs>. The job should take about two minutes.
-
-Once the documentation files have been compiled, they must be published by running the `deploy-kinsta` action. Select the desired deployment environment (staging or production) and specify `latest` as the deploy tag.
-
-Clear the CDN cache from the [Kinsta](https://my.kinsta.com/) portal. Navigate to _Sites_ / _NetBox Labs_ / _Live_, select _Cache_ in the left-nav, click the _Clear Cache_ button, and confirm the clear operation.
-
-Finally, verify that the documentation at <https://netboxlabs.com/docs/netbox/en/stable/> has been updated.
@@ -2,9 +2,9 @@
 
 NetBox includes the ability to execute certain functions as background tasks. These include:
 
-* [Report](../customization/reports.md) execution
 * [Custom script](../customization/custom-scripts.md) execution
 * Synchronization of [remote data sources](../integrations/synchronized-data.md)
+* Housekeeping tasks
 
 Additionally, NetBox plugins can enqueue their own background tasks. This is accomplished using the [Job model](../models/core/job.md). Background tasks are executed by the `rqworker` process(es).
 
@@ -135,7 +135,7 @@ Check out the desired release by specifying its tag. For example:
 
 ```
 cd /opt/netbox && \
-sudo git fetch && \
+sudo git fetch --tags && \
 sudo git checkout v4.2.7
 ```
 
Binary file not shown. (Before: 15 KiB; after: 24 KiB.)
@@ -15,7 +15,6 @@ A background job implements a basic [Job](../../models/core/job.md) executor for
 ```python title="jobs.py"
 from netbox.jobs import JobRunner
 
-
 class MyTestJob(JobRunner):
     class Meta:
         name = "My Test Job"

@@ -25,6 +24,8 @@ class MyTestJob(JobRunner):
         # your logic goes here
 ```
 
+Completed jobs will have their status updated to "completed" by default, or "errored" if an unhandled exception was raised by the `run()` method. To intentionally mark a job as failed, raise the `core.exceptions.JobFailed` exception. (Note that "failed" differs from "errored" in that a failure may be expected under certain conditions, whereas an error is not.)
+
 You can schedule the background job from within your code (e.g. from a model's `save()` method or a view) by calling `MyTestJob.enqueue()`. This method passes through all arguments to `Job.enqueue()`. However, no `name` argument must be passed, as the background job name will be used instead.
 
 !!! tip
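As a rough illustration of the `JobFailed` behavior added above, a job might mark an expected failure like this. This is a sketch only; `check_reachability()` is a hypothetical helper, not part of NetBox:

```python
from core.exceptions import JobFailed
from netbox.jobs import JobRunner


class MyTestJob(JobRunner):
    class Meta:
        name = "My Test Job"

    def run(self, *args, **kwargs):
        if not check_reachability(**kwargs):  # hypothetical helper
            # An expected failure condition: mark the job "failed" rather than "errored"
            raise JobFailed("Target could not be reached")
        # Normal logic continues here; an unhandled exception would mark the job "errored"
```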
@@ -86,3 +86,69 @@ netbox=> DELETE FROM django_migrations WHERE app='pluginname';
 
 !!! warning
     Exercise extreme caution when altering Django system tables. Users are strongly encouraged to perform a backup of their database immediately before taking these actions.
+
+## Clean Up Content Types and Permissions
+
+After removing a plugin and its database tables, you may find that object type references (`ContentTypes`) created by the plugin still appear in the permissions management section (e.g., when editing permissions in the NetBox UI).
+This happens because the `django_content_type` table retains entries for the models that the plugin registered with Django.
+
+!!! warning
+    Please use caution when removing `ContentTypes`. It is strongly recommended to **back up your database** before making these changes.
+
+**Identify Stale Content Types:**
+
+Open the Django shell to inspect lingering `ContentType` entries related to the removed plugin.
+Typically, the Content Type's `app_label` matches the plugin's name.
+
+```no-highlight
+$ cd /opt/netbox/
+$ source /opt/netbox/venv/bin/activate
+(venv) $ python3 netbox/manage.py nbshell
+```
+
+Then, in the shell:
+
+```no-highlight
+from django.contrib.contenttypes.models import ContentType
+
+# Replace 'pluginname' with your plugin's actual name
+stale_types = ContentType.objects.filter(app_label="pluginname")
+for ct in stale_types:
+    print(ct)  # these entries will be removed; make sure that's OK
+```
+
+!!! warning
+    Review the output carefully and confirm that each listed Content Type is related to the plugin you removed.
+
+**Remove Stale Content Types and Related Permissions:**
+
+Next, check for any permissions associated with these Content Types:
+
+```no-highlight
+from django.contrib.auth.models import Permission
+
+for ct in stale_types:
+    perms = Permission.objects.filter(content_type=ct)
+    print(list(perms))
+```
+
+If there are related Permissions, you can remove them safely:
+
+```no-highlight
+for ct in stale_types:
+    Permission.objects.filter(content_type=ct).delete()
+```
+
+After removing any related permissions, delete the Content Type entries:
+
+```no-highlight
+stale_types.delete()
+```
+
+**Restart NetBox:**
+
+After making these changes, restart the NetBox service to ensure all changes are reflected.
+
+```no-highlight
+sudo systemctl restart netbox
+```
@@ -1,5 +1,55 @@
 # NetBox v4.3
 
+## v4.3.4 (2025-07-15)
+
+### Enhancements
+
+* [#18811](https://github.com/netbox-community/netbox/issues/18811) - Match expanded form IPv6 addresses in global search
+* [#19550](https://github.com/netbox-community/netbox/issues/19550) - Enable lazy loading for rack elevations
+* [#19571](https://github.com/netbox-community/netbox/issues/19571) - Add a default module type profile for expansion cards
+* [#19793](https://github.com/netbox-community/netbox/issues/19793) - Support custom dynamic navigation menu links
+* [#19828](https://github.com/netbox-community/netbox/issues/19828) - Expose L2VPN termination in interface GraphQL response
+
+### Bug Fixes
+
+* [#19413](https://github.com/netbox-community/netbox/issues/19413) - Custom fields should be grouped in filter forms
+* [#19633](https://github.com/netbox-community/netbox/issues/19633) - Introduce InvalidCondition exception and log all evaluations of invalid event rule conditions
+* [#19800](https://github.com/netbox-community/netbox/issues/19800) - Module type bulk import should support profile assignment
+* [#19806](https://github.com/netbox-community/netbox/issues/19806) - Introduce JobFailed exception to allow marking background jobs as failed
+* [#19827](https://github.com/netbox-community/netbox/issues/19827) - Enforce uniqueness for device role names & slugs
+* [#19839](https://github.com/netbox-community/netbox/issues/19839) - Enable export of parent assignment for recursively nested objects
+* [#19876](https://github.com/netbox-community/netbox/issues/19876) - Remove Markdown rendering from CustomFieldChoiceSet description field
+
+---
+
+## v4.3.3 (2025-06-26)
+
+### Enhancements
+
+* [#17183](https://github.com/netbox-community/netbox/issues/17183) - Enable associating tags with object types during bulk import
+* [#17719](https://github.com/netbox-community/netbox/issues/17719) - Introduce a user preference for table row striping
+* [#19492](https://github.com/netbox-community/netbox/issues/19492) - Add a UI button to download the output of an executed custom script
+* [#19499](https://github.com/netbox-community/netbox/issues/19499) - Support qualifying interfaces by parent device when bulk importing wireless links
+
+### Bug Fixes
+
+* [#19529](https://github.com/netbox-community/netbox/issues/19529) - Fix support for running custom scripts via the `runscript` management command
+* [#19555](https://github.com/netbox-community/netbox/issues/19555) - Fix support for `schedule_at` when invoking a custom script via the REST API
+* [#19617](https://github.com/netbox-community/netbox/issues/19617) - Ensure consistent styling of "connect" buttons in UI
+* [#19640](https://github.com/netbox-community/netbox/issues/19640) - Restore ability to filter FHRP group assignments by device/VM in GraphQL API
+* [#19644](https://github.com/netbox-community/netbox/issues/19644) - Atomic transactions should always employ database routing
+* [#19659](https://github.com/netbox-community/netbox/issues/19659) - Populate initial device/VM selection for "add a service" button
+* [#19665](https://github.com/netbox-community/netbox/issues/19665) - Correct field reference in wireless link model validation
+* [#19667](https://github.com/netbox-community/netbox/issues/19667) - Fix `TypeError` exception when creating a new module profile type with no schema
+* [#19673](https://github.com/netbox-community/netbox/issues/19673) - Ignore custom field references when compiling table prefetches
+* [#19677](https://github.com/netbox-community/netbox/issues/19677) - Fix exception when passing null value to `present_in_vrf` filter
+* [#19680](https://github.com/netbox-community/netbox/issues/19680) - Correct chronological ordering of change records resulting from device deletions
+* [#19687](https://github.com/netbox-community/netbox/issues/19687) - Cellular interface types should be considered non-connectable
+* [#19702](https://github.com/netbox-community/netbox/issues/19702) - Fix `DoesNotExist` exception when deleting a notification group with an associated event rule
+* [#19745](https://github.com/netbox-community/netbox/issues/19745) - Fix bulk import of services with IP addresses assigned to FHRP groups
+
+---
+
 ## v4.3.2 (2025-06-05)
 
 ### Enhancements
@@ -1,5 +1,5 @@
 from django.contrib import messages
-from django.db import transaction
+from django.db import router, transaction
 from django.shortcuts import get_object_or_404, redirect, render
 from django.utils.translation import gettext_lazy as _
 
@@ -384,7 +384,7 @@ class CircuitSwapTerminations(generic.ObjectEditView):
 
         if termination_a and termination_z:
             # Use a placeholder to avoid an IntegrityError on the (circuit, term_side) unique constraint
-            with transaction.atomic():
+            with transaction.atomic(using=router.db_for_write(CircuitTermination)):
                 termination_a.term_side = '_'
                 termination_a.save()
                 termination_z.term_side = 'A'
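Several hunks in this set apply the same change (per #19644): resolve the write database through Django's router instead of assuming the default alias. A generic sketch of the pattern, using a placeholder model name:

```python
from django.db import router, transaction

from myapp.models import SomeModel  # placeholder model, for illustration only


def swap_values(obj_a, obj_b):
    # Route the transaction to whichever database the router selects for writes on this model
    with transaction.atomic(using=router.db_for_write(SomeModel)):
        obj_a.value, obj_b.value = obj_b.value, obj_a.value
        obj_a.save()
        obj_b.save()
```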
@@ -1,9 +1,19 @@
 from django.core.exceptions import ImproperlyConfigured
 
-
-class SyncError(Exception):
-    pass
+__all__ = (
+    'IncompatiblePluginError',
+    'JobFailed',
+    'SyncError',
+)
 
 
 class IncompatiblePluginError(ImproperlyConfigured):
     pass
+
+
+class JobFailed(Exception):
+    pass
+
+
+class SyncError(Exception):
+    pass
@@ -187,15 +187,14 @@ class Job(models.Model):
         """
         Mark the job as completed, optionally specifying a particular termination status.
         """
-        valid_statuses = JobStatusChoices.TERMINAL_STATE_CHOICES
-        if status not in valid_statuses:
+        if status not in JobStatusChoices.TERMINAL_STATE_CHOICES:
             raise ValueError(
                 _("Invalid status for job termination. Choices are: {choices}").format(
-                    choices=', '.join(valid_statuses)
+                    choices=', '.join(JobStatusChoices.TERMINAL_STATE_CHOICES)
                 )
             )
 
-        # Mark the job as completed
+        # Set the job's status and completion time
         self.status = status
         if error:
             self.error = error
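The refactored check above validates the requested status against `JobStatusChoices.TERMINAL_STATE_CHOICES` before recording it. A hedged usage sketch follows, with the argument names inferred from the diff rather than taken from documented API:

```python
from core.choices import JobStatusChoices

# Assuming `job` is an existing core.models.Job instance whose work has finished:
job.terminate(status=JobStatusChoices.STATUS_FAILED, error="Upstream device unreachable")

# Passing a non-terminal status raises ValueError, per the validation shown above.
```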
@@ -162,6 +162,12 @@ def handle_deleted_object(sender, instance, **kwargs):
                 getattr(obj, related_field_name).remove(instance)
             elif type(relation) is ManyToOneRel and relation.field.null is True:
                 setattr(obj, related_field_name, None)
+            # make sure the object hasn't been deleted - in case of
+            # deletion chaining of related objects
+            try:
+                obj.refresh_from_db()
+            except DoesNotExist:
+                continue
             obj.save()
 
     # Enqueue the object for event processing
@@ -6,12 +6,13 @@ from rest_framework import status
 from core.choices import ObjectChangeActionChoices
 from core.models import ObjectChange, ObjectType
 from dcim.choices import SiteStatusChoices
-from dcim.models import Site
+from dcim.models import Site, CableTermination, Device, DeviceType, DeviceRole, Interface, Cable
 from extras.choices import *
 from extras.models import CustomField, CustomFieldChoiceSet, Tag
 from utilities.testing import APITestCase
 from utilities.testing.utils import create_tags, post_data
 from utilities.testing.views import ModelViewTestCase
+from dcim.models import Manufacturer
 
 
 class ChangeLogViewTest(ModelViewTestCase):
@@ -270,6 +271,81 @@ class ChangeLogViewTest(ModelViewTestCase):
         # Check that no ObjectChange records have been created
         self.assertEqual(ObjectChange.objects.count(), 0)
 
+    def test_ordering_genericrelation(self):
+        # Create required objects first
+        manufacturer = Manufacturer.objects.create(name='Manufacturer 1')
+        device_type = DeviceType.objects.create(
+            manufacturer=manufacturer,
+            model='Model 1',
+            slug='model-1'
+        )
+        device_role = DeviceRole.objects.create(
+            name='Role 1',
+            slug='role-1'
+        )
+        site = Site.objects.create(
+            name='Site 1',
+            slug='site-1'
+        )
+
+        # Create two devices
+        device1 = Device.objects.create(
+            name='Device 1',
+            device_type=device_type,
+            role=device_role,
+            site=site
+        )
+        device2 = Device.objects.create(
+            name='Device 2',
+            device_type=device_type,
+            role=device_role,
+            site=site
+        )
+
+        # Create interfaces on both devices
+        interface1 = Interface.objects.create(
+            device=device1,
+            name='eth0',
+            type='1000base-t'
+        )
+        interface2 = Interface.objects.create(
+            device=device2,
+            name='eth0',
+            type='1000base-t'
+        )
+
+        # Create a cable between the interfaces
+        _ = Cable.objects.create(
+            a_terminations=[interface1],
+            b_terminations=[interface2],
+            status='connected'
+        )
+
+        # Delete device1
+        request = {
+            'path': reverse('dcim:device_delete', kwargs={'pk': device1.pk}),
+            'data': post_data({'confirm': True}),
+        }
+        self.add_permissions(
+            'dcim.delete_device',
+            'dcim.delete_interface',
+            'dcim.delete_cable',
+            'dcim.delete_cabletermination'
+        )
+        response = self.client.post(**request)
+        self.assertHttpStatus(response, 302)
+
+        # Get the ObjectChange records for delete actions ordered by time
+        changes = ObjectChange.objects.filter(
+            action=ObjectChangeActionChoices.ACTION_DELETE
+        ).order_by('time')[:3]
+
+        # Verify the order of deletion
+        self.assertEqual(len(changes), 3)
+        self.assertEqual(changes[0].changed_object_type, ContentType.objects.get_for_model(CableTermination))
+        self.assertEqual(changes[1].changed_object_type, ContentType.objects.get_for_model(Interface))
+        self.assertEqual(changes[2].changed_object_type, ContentType.objects.get_for_model(Device))
+
 
 class ChangeLogAPITest(APITestCase):
 
@@ -53,6 +53,11 @@ WIRELESS_IFACE_TYPES = [
     InterfaceTypeChoices.TYPE_802151,
     InterfaceTypeChoices.TYPE_802154,
     InterfaceTypeChoices.TYPE_OTHER_WIRELESS,
+    InterfaceTypeChoices.TYPE_GSM,
+    InterfaceTypeChoices.TYPE_CDMA,
+    InterfaceTypeChoices.TYPE_LTE,
+    InterfaceTypeChoices.TYPE_4G,
+    InterfaceTypeChoices.TYPE_5G,
 ]
 
 NONCONNECTABLE_IFACE_TYPES = VIRTUAL_IFACE_TYPES + WIRELESS_IFACE_TYPES
@@ -470,8 +470,8 @@ class ModuleTypeImportForm(NetBoxModelImportForm):
     class Meta:
         model = ModuleType
         fields = [
-            'manufacturer', 'model', 'part_number', 'description', 'airflow', 'weight', 'weight_unit', 'comments',
-            'tags',
+            'manufacturer', 'model', 'part_number', 'description', 'airflow', 'weight', 'weight_unit', 'profile',
+            'comments', 'tags'
         ]
 
 
|
|||||||
from tenancy.graphql.types import TenantType
|
from tenancy.graphql.types import TenantType
|
||||||
from users.graphql.types import UserType
|
from users.graphql.types import UserType
|
||||||
from virtualization.graphql.types import ClusterType, VMInterfaceType, VirtualMachineType
|
from virtualization.graphql.types import ClusterType, VMInterfaceType, VirtualMachineType
|
||||||
|
from vpn.graphql.types import L2VPNTerminationType
|
||||||
from wireless.graphql.types import WirelessLANType, WirelessLinkType
|
from wireless.graphql.types import WirelessLANType, WirelessLinkType
|
||||||
|
|
||||||
__all__ = (
|
__all__ = (
|
||||||
@ -440,6 +441,7 @@ class InterfaceType(IPAddressesMixin, ModularComponentType, CabledObjectMixin, P
|
|||||||
primary_mac_address: Annotated["MACAddressType", strawberry.lazy('dcim.graphql.types')] | None
|
primary_mac_address: Annotated["MACAddressType", strawberry.lazy('dcim.graphql.types')] | None
|
||||||
qinq_svlan: Annotated["VLANType", strawberry.lazy('ipam.graphql.types')] | None
|
qinq_svlan: Annotated["VLANType", strawberry.lazy('ipam.graphql.types')] | None
|
||||||
vlan_translation_policy: Annotated["VLANTranslationPolicyType", strawberry.lazy('ipam.graphql.types')] | None
|
vlan_translation_policy: Annotated["VLANTranslationPolicyType", strawberry.lazy('ipam.graphql.types')] | None
|
||||||
|
l2vpn_termination: Annotated["L2VPNTerminationType", strawberry.lazy('vpn.graphql.types')] | None
|
||||||
|
|
||||||
vdcs: List[Annotated["VirtualDeviceContextType", strawberry.lazy('dcim.graphql.types')]]
|
vdcs: List[Annotated["VirtualDeviceContextType", strawberry.lazy('dcim.graphql.types')]]
|
||||||
tagged_vlans: List[Annotated["VLANType", strawberry.lazy('ipam.graphql.types')]]
|
tagged_vlans: List[Annotated["VLANType", strawberry.lazy('ipam.graphql.types')]]
|
||||||
|
@@ -19,7 +19,8 @@ def load_initial_data(apps, schema_editor):
         'gpu',
         'hard_disk',
         'memory',
-        'power_supply'
+        'power_supply',
+        'expansion_card'
     )
 
     for name in initial_profiles:

netbox/dcim/migrations/0208_devicerole_uniqueness.py (new file, 44 lines)
@@ -0,0 +1,44 @@
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('dcim', '0207_remove_redundant_indexes'),
+        ('extras', '0129_fix_script_paths'),
+    ]
+
+    operations = [
+        migrations.AddConstraint(
+            model_name='devicerole',
+            constraint=models.UniqueConstraint(
+                fields=('parent', 'name'),
+                name='dcim_devicerole_parent_name'
+            ),
+        ),
+        migrations.AddConstraint(
+            model_name='devicerole',
+            constraint=models.UniqueConstraint(
+                condition=models.Q(('parent__isnull', True)),
+                fields=('name',),
+                name='dcim_devicerole_name',
+                violation_error_message='A top-level device role with this name already exists.'
+            ),
+        ),
+        migrations.AddConstraint(
+            model_name='devicerole',
+            constraint=models.UniqueConstraint(
+                fields=('parent', 'slug'),
+                name='dcim_devicerole_parent_slug'
+            ),
+        ),
+        migrations.AddConstraint(
+            model_name='devicerole',
+            constraint=models.UniqueConstraint(
+                condition=models.Q(('parent__isnull', True)),
+                fields=('slug',),
+                name='dcim_devicerole_slug',
+                violation_error_message='A top-level device role with this slug already exists.'
+            ),
+        ),
+    ]
@@ -0,0 +1,15 @@
+{
+    "name": "Expansion card",
+    "schema": {
+        "properties": {
+            "connector_type": {
+                "type": "string",
+                "description": "Connector type e.g. PCIe x4"
+            },
+            "bandwidth": {
+                "type": "integer",
+                "description": "Total Bandwidth for this module"
+            }
+        }
+    }
+}
@@ -398,6 +398,28 @@ class DeviceRole(NestedGroupModel):
 
     class Meta:
         ordering = ('name',)
+        constraints = (
+            models.UniqueConstraint(
+                fields=('parent', 'name'),
+                name='%(app_label)s_%(class)s_parent_name'
+            ),
+            models.UniqueConstraint(
+                fields=('name',),
+                name='%(app_label)s_%(class)s_name',
+                condition=Q(parent__isnull=True),
+                violation_error_message=_("A top-level device role with this name already exists.")
+            ),
+            models.UniqueConstraint(
+                fields=('parent', 'slug'),
+                name='%(app_label)s_%(class)s_parent_slug'
+            ),
+            models.UniqueConstraint(
+                fields=('slug',),
+                name='%(app_label)s_%(class)s_slug',
+                condition=Q(parent__isnull=True),
+                violation_error_message=_("A top-level device role with this slug already exists.")
+            ),
+        )
         verbose_name = _('device role')
         verbose_name_plural = _('device roles')
 
@@ -144,7 +144,7 @@ class ModuleType(ImageAttachmentsMixin, PrimaryModel, WeightMixin):
         super().clean()
 
         # Validate any attributes against the assigned profile's schema
-        if self.profile:
+        if self.profile and self.profile.schema:
             try:
                 jsonschema.validate(self.attribute_data, schema=self.profile.schema)
             except JSONValidationError as e:
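The guard above skips validation when a profile has no schema defined. For reference, a standalone sketch of the same `jsonschema` pattern, independent of NetBox models (the schema and data here are example values):

```python
import jsonschema
from jsonschema.exceptions import ValidationError as JSONValidationError

schema = {
    "properties": {
        "connector_type": {"type": "string"},
        "bandwidth": {"type": "integer"},
    }
}
data = {"connector_type": "PCIe x4", "bandwidth": 16}

if schema:  # an empty or missing schema means there is nothing to validate
    try:
        jsonschema.validate(data, schema=schema)
    except JSONValidationError as e:
        print(f"Invalid attributes: {e.message}")
```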
@@ -63,6 +63,10 @@ class DeviceRoleTable(NetBoxTable):
         verbose_name=_('Name'),
         linkify=True
     )
+    parent = tables.Column(
+        verbose_name=_('Parent'),
+        linkify=True,
+    )
     device_count = columns.LinkedCountColumn(
         viewname='dcim:device_list',
         url_params={'role_id': 'pk'},
@@ -88,8 +92,8 @@ class DeviceRoleTable(NetBoxTable):
     class Meta(NetBoxTable.Meta):
         model = models.DeviceRole
         fields = (
-            'pk', 'id', 'name', 'device_count', 'vm_count', 'color', 'vm_role', 'config_template', 'description',
-            'slug', 'tags', 'actions', 'created', 'last_updated',
+            'pk', 'id', 'name', 'parent', 'device_count', 'vm_count', 'color', 'vm_role', 'config_template',
+            'description', 'slug', 'tags', 'actions', 'created', 'last_updated',
         )
         default_columns = ('pk', 'name', 'device_count', 'vm_count', 'color', 'vm_role', 'description')
 
@@ -24,6 +24,10 @@ class RegionTable(ContactsColumnMixin, NetBoxTable):
         verbose_name=_('Name'),
         linkify=True
     )
+    parent = tables.Column(
+        verbose_name=_('Parent'),
+        linkify=True,
+    )
     site_count = columns.LinkedCountColumn(
         viewname='dcim:site_list',
         url_params={'region_id': 'pk'},
@@ -39,7 +43,7 @@ class RegionTable(ContactsColumnMixin, NetBoxTable):
     class Meta(NetBoxTable.Meta):
         model = Region
         fields = (
-            'pk', 'id', 'name', 'slug', 'site_count', 'description', 'comments', 'contacts', 'tags',
+            'pk', 'id', 'name', 'parent', 'slug', 'site_count', 'description', 'comments', 'contacts', 'tags',
             'created', 'last_updated', 'actions',
         )
         default_columns = ('pk', 'name', 'site_count', 'description')
@@ -54,6 +58,10 @@ class SiteGroupTable(ContactsColumnMixin, NetBoxTable):
         verbose_name=_('Name'),
         linkify=True
     )
+    parent = tables.Column(
+        verbose_name=_('Parent'),
+        linkify=True,
+    )
     site_count = columns.LinkedCountColumn(
         viewname='dcim:site_list',
         url_params={'group_id': 'pk'},
@@ -69,7 +77,7 @@ class SiteGroupTable(ContactsColumnMixin, NetBoxTable):
     class Meta(NetBoxTable.Meta):
         model = SiteGroup
         fields = (
-            'pk', 'id', 'name', 'slug', 'site_count', 'description', 'comments', 'contacts', 'tags',
+            'pk', 'id', 'name', 'parent', 'slug', 'site_count', 'description', 'comments', 'contacts', 'tags',
             'created', 'last_updated', 'actions',
         )
         default_columns = ('pk', 'name', 'site_count', 'description')
@@ -135,6 +143,10 @@ class LocationTable(TenancyColumnsMixin, ContactsColumnMixin, NetBoxTable):
         verbose_name=_('Name'),
         linkify=True
     )
+    parent = tables.Column(
+        verbose_name=_('Parent'),
+        linkify=True,
+    )
     site = tables.Column(
         verbose_name=_('Site'),
         linkify=True
@@ -170,8 +182,8 @@ class LocationTable(TenancyColumnsMixin, ContactsColumnMixin, NetBoxTable):
     class Meta(NetBoxTable.Meta):
         model = Location
         fields = (
-            'pk', 'id', 'name', 'site', 'status', 'facility', 'tenant', 'tenant_group', 'rack_count', 'device_count',
-            'description', 'slug', 'comments', 'contacts', 'tags', 'actions', 'created', 'last_updated',
+            'pk', 'id', 'name', 'parent', 'site', 'status', 'facility', 'tenant', 'tenant_group', 'rack_count',
+            'device_count', 'description', 'slug', 'comments', 'contacts', 'tags', 'actions', 'created', 'last_updated',
             'vlangroup_count',
         )
         default_columns = (
@@ -954,6 +954,19 @@ class CableTestCase(TestCase):
         with self.assertRaises(ValidationError):
             cable.clean()
 
+    @tag('regression')
+    def test_cable_cannot_terminate_to_a_cellular_interface(self):
+        """
+        A cable cannot terminate to a cellular interface
+        """
+        device1 = Device.objects.get(name='TestDevice1')
+        interface2 = Interface.objects.get(device__name='TestDevice2', name='eth0')
+
+        cellular_interface = Interface(device=device1, name="W1", type=InterfaceTypeChoices.TYPE_LTE)
+        cable = Cable(a_terminations=[interface2], b_terminations=[cellular_interface])
+        with self.assertRaises(ValidationError):
+            cable.clean()
+
 
 class VirtualDeviceContextTestCase(TestCase):
 
@@ -3,7 +3,7 @@ from decimal import Decimal
 from zoneinfo import ZoneInfo
 
 import yaml
-from django.test import override_settings
+from django.test import override_settings, tag
 from django.urls import reverse
 from netaddr import EUI
 
@@ -1000,18 +1000,7 @@ inventory-items:
         self.assertEqual(response.get('Content-Type'), 'text/csv; charset=utf-8')
 
 
-# TODO: Change base class to PrimaryObjectViewTestCase
-# Blocked by absence of bulk import view for ModuleTypes
-class ModuleTypeTestCase(
-    ViewTestCases.GetObjectViewTestCase,
-    ViewTestCases.GetObjectChangelogViewTestCase,
-    ViewTestCases.CreateObjectViewTestCase,
-    ViewTestCases.EditObjectViewTestCase,
-    ViewTestCases.DeleteObjectViewTestCase,
-    ViewTestCases.ListObjectsViewTestCase,
-    ViewTestCases.BulkEditObjectsViewTestCase,
-    ViewTestCases.BulkDeleteObjectsViewTestCase
-):
+class ModuleTypeTestCase(ViewTestCases.PrimaryObjectViewTestCase):
     model = ModuleType
 
     @classmethod
@@ -1023,7 +1012,7 @@ class ModuleTypeTestCase(
         )
         Manufacturer.objects.bulk_create(manufacturers)
 
-        ModuleType.objects.bulk_create([
+        module_types = ModuleType.objects.bulk_create([
            ModuleType(model='Module Type 1', manufacturer=manufacturers[0]),
            ModuleType(model='Module Type 2', manufacturer=manufacturers[0]),
            ModuleType(model='Module Type 3', manufacturer=manufacturers[0]),
@@ -1031,6 +1020,8 @@
 
         tags = create_tags('Alpha', 'Bravo', 'Charlie')
 
+        fan_module_type_profile = ModuleTypeProfile.objects.get(name='Fan')
+
         cls.form_data = {
             'manufacturer': manufacturers[1].pk,
             'model': 'Device Type X',
@@ -1044,6 +1035,70 @@
             'part_number': '456DEF',
         }
 
+        cls.csv_data = (
+            "manufacturer,model,part_number,comments,profile",
+            f"Manufacturer 1,fan0,generic-fan,,{fan_module_type_profile.name}"
+        )
+
+        cls.csv_update_data = (
+            "id,model",
+            f"{module_types[0].id},test model",
+        )
+
+    @override_settings(EXEMPT_VIEW_PERMISSIONS=['*'])
+    def test_bulk_update_objects_with_permission(self):
+        self.add_permissions(
+            'dcim.add_consoleporttemplate',
+            'dcim.add_consoleserverporttemplate',
+            'dcim.add_powerporttemplate',
+            'dcim.add_poweroutlettemplate',
+            'dcim.add_interfacetemplate',
+            'dcim.add_frontporttemplate',
+            'dcim.add_rearporttemplate',
+            'dcim.add_modulebaytemplate',
+        )
+
+        # run base test
+        super().test_bulk_update_objects_with_permission()
+
+    @tag('regression')
+    @override_settings(EXEMPT_VIEW_PERMISSIONS=['*'], EXEMPT_EXCLUDE_MODELS=[])
+    def test_bulk_import_objects_with_permission(self):
+        self.add_permissions(
+            'dcim.add_consoleporttemplate',
+            'dcim.add_consoleserverporttemplate',
+            'dcim.add_powerporttemplate',
+            'dcim.add_poweroutlettemplate',
+            'dcim.add_interfacetemplate',
+            'dcim.add_frontporttemplate',
+            'dcim.add_rearporttemplate',
+            'dcim.add_modulebaytemplate',
+        )
+
+        # run base test
+        super().test_bulk_import_objects_with_permission()
+
+        # TODO: remove extra regression asserts once parent test supports testing all import fields
+        fan_module_type = ModuleType.objects.get(part_number='generic-fan')
+        fan_module_type_profile = ModuleTypeProfile.objects.get(name='Fan')
+
+        assert fan_module_type.profile == fan_module_type_profile
+
+    @override_settings(EXEMPT_VIEW_PERMISSIONS=['*'], EXEMPT_EXCLUDE_MODELS=[])
+    def test_bulk_import_objects_with_constrained_permission(self):
+        self.add_permissions(
+            'dcim.add_consoleporttemplate',
+            'dcim.add_consoleserverporttemplate',
+            'dcim.add_powerporttemplate',
+            'dcim.add_poweroutlettemplate',
+            'dcim.add_interfacetemplate',
+            'dcim.add_frontporttemplate',
+            'dcim.add_rearporttemplate',
+            'dcim.add_modulebaytemplate',
+        )
+
+        super().test_bulk_import_objects_with_constrained_permission()
+
     @override_settings(EXEMPT_VIEW_PERMISSIONS=['*'])
     def test_moduletype_consoleports(self):
         moduletype = ModuleType.objects.first()
@@ -1804,9 +1859,9 @@ class DeviceRoleTestCase(ViewTestCases.OrganizationalObjectViewTestCase):
 
         cls.csv_data = (
             "name,slug,color",
-            "Device Role 4,device-role-4,ff0000",
-            "Device Role 5,device-role-5,00ff00",
-            "Device Role 6,device-role-6,0000ff",
+            "Device Role 6,device-role-6,ff0000",
+            "Device Role 7,device-role-7,00ff00",
+            "Device Role 8,device-role-8,0000ff",
         )
 
         cls.csv_update_data = (
@@ -1,6 +1,6 @@
 from django.apps import apps
 from django.contrib.contenttypes.models import ContentType
-from django.db import transaction
+from django.db import router, transaction
 
 
 def compile_path_node(ct_id, object_id):
@@ -53,7 +53,7 @@ def rebuild_paths(terminations):
     for obj in terminations:
         cable_paths = CablePath.objects.filter(_nodes__contains=obj)
 
-        with transaction.atomic():
+        with transaction.atomic(using=router.db_for_write(CablePath)):
             for cp in cable_paths:
                 cp.delete()
                 create_cablepath(cp.origins)
@@ -1,7 +1,7 @@
 from django.contrib import messages
 from django.contrib.contenttypes.models import ContentType
 from django.core.paginator import EmptyPage, PageNotAnInteger
-from django.db import transaction
+from django.db import router, transaction
 from django.db.models import Prefetch
 from django.forms import ModelMultipleChoiceField, MultipleHiddenInput, modelformset_factory
 from django.shortcuts import get_object_or_404, redirect, render
@@ -124,7 +124,7 @@ class BulkDisconnectView(GetReturnURLMixin, ObjectPermissionRequiredMixin, View)
 
         if form.is_valid():
 
-            with transaction.atomic():
+            with transaction.atomic(using=router.db_for_write(Cable)):
                 count = 0
                 cable_ids = set()
                 for obj in self.queryset.filter(pk__in=form.cleaned_data['pk']):
@@ -3746,7 +3746,7 @@ class VirtualChassisEditView(ObjectPermissionRequiredMixin, GetReturnURLMixin, V
 
         if vc_form.is_valid() and formset.is_valid():
 
-            with transaction.atomic():
+            with transaction.atomic(using=router.db_for_write(Device)):
 
                 # Save the VirtualChassis
                 vc_form.save()
@@ -66,11 +66,11 @@ class ScriptInputSerializer(serializers.Serializer):
     interval = serializers.IntegerField(required=False, allow_null=True)

     def validate_schedule_at(self, value):
-        if value and not self.context['script'].scheduling_enabled:
+        if value and not self.context['script'].python_class.scheduling_enabled:
             raise serializers.ValidationError(_("Scheduling is not enabled for this script."))
         return value

     def validate_interval(self, value):
-        if value and not self.context['script'].scheduling_enabled:
+        if value and not self.context['script'].python_class.scheduling_enabled:
             raise serializers.ValidationError(_("Scheduling is not enabled for this script."))
         return value
@@ -270,6 +270,7 @@ class ScriptViewSet(ModelViewSet):
             module_name, script_name = pk.split('.', maxsplit=1)
         except ValueError:
             raise Http404

         return get_object_or_404(self.queryset, module__file_path=f'{module_name}.py', name=script_name)

     def retrieve(self, request, pk):
@@ -1,13 +1,14 @@
 import functools
+import operator
 import re
 from django.utils.translation import gettext as _

 __all__ = (
     'Condition',
     'ConditionSet',
+    'InvalidCondition',
 )


 AND = 'and'
 OR = 'or'

@@ -19,6 +20,10 @@ def is_ruleset(data):
     return type(data) is dict and len(data) == 1 and list(data.keys())[0] in (AND, OR)


+class InvalidCondition(Exception):
+    pass
+
+
 class Condition:
     """
     An individual conditional rule that evaluates a single attribute and its value.
@@ -61,6 +66,7 @@ class Condition:

         self.attr = attr
         self.value = value
+        self.op = op
         self.eval_func = getattr(self, f'eval_{op}')
         self.negate = negate

@@ -70,16 +76,17 @@
         """
         def _get(obj, key):
             if isinstance(obj, list):
-                return [dict.get(i, key) for i in obj]
-
-            return dict.get(obj, key)
+                return [operator.getitem(item or {}, key) for item in obj]
+            return operator.getitem(obj or {}, key)

         try:
             value = functools.reduce(_get, self.attr.split('.'), data)
-        except TypeError:
-            # Invalid key path
-            value = None
-        result = self.eval_func(value)
+        except KeyError:
+            raise InvalidCondition(f"Invalid key path: {self.attr}")
+        try:
+            result = self.eval_func(value)
+        except TypeError as e:
+            raise InvalidCondition(f"Invalid data type at '{self.attr}' for '{self.op}' evaluation: {e}")

         if self.negate:
             return not result
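Taken together, the rewritten eval() now raises instead of silently returning False on bad input. An illustrative use of the classes from this hunk (the data values are made up):

from extras.conditions import Condition, InvalidCondition

c = Condition('status.value', 'active', 'eq')

print(c.eval({'status': {'value': 'active'}}))   # True
print(c.eval({'status': {'value': 'planned'}}))  # False

try:
    # 'status' resolves to None, so the key path cannot be followed
    c.eval({'status': None})
except InvalidCondition as exc:
    print(f"condition rejected: {exc}")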
@@ -192,5 +192,5 @@ def flush_events(events):
         try:
             func = import_string(name)
             func(events)
-        except Exception as e:
+        except ImportError as e:
             logger.error(_("Cannot import events pipeline {name} error: {error}").format(name=name, error=e))
@@ -238,10 +238,18 @@ class TagImportForm(CSVModelForm):
         label=_('Weight'),
         required=False
     )
+    object_types = CSVMultipleContentTypeField(
+        label=_('Object types'),
+        queryset=ObjectType.objects.with_feature('tags'),
+        help_text=_("One or more assigned object types"),
+        required=False,
+    )

     class Meta:
         model = Tag
-        fields = ('name', 'slug', 'color', 'weight', 'description')
+        fields = (
+            'name', 'slug', 'color', 'weight', 'description', 'object_types',
+        )


 class JournalEntryImportForm(NetBoxModelImportForm):
@@ -1,13 +1,8 @@
-import os
-
-from django import forms
-from django.conf import settings
-from django.core.files.storage import storages
-from django.utils.translation import gettext_lazy as _
-
 from core.choices import JobIntervalChoices
 from core.forms import ManagedFileForm
-from extras.storage import ScriptFileSystemStorage
+from django import forms
+from django.core.files.storage import storages
+from django.utils.translation import gettext_lazy as _
 from utilities.datetime import local_now
 from utilities.forms.widgets import DateTimePicker, NumberWithOptions

@@ -74,12 +69,7 @@ class ScriptFileForm(ManagedFileForm):
         storage = storages.create_storage(storages.backends["scripts"])

         filename = self.cleaned_data['upload_file'].name
-        if isinstance(storage, ScriptFileSystemStorage):
-            full_path = os.path.join(settings.SCRIPTS_ROOT, filename)
-        else:
-            full_path = filename
-
-        self.instance.file_path = full_path
+        self.instance.file_path = filename
         data = self.cleaned_data['upload_file']
         storage.save(filename, data)

@@ -39,6 +39,9 @@ class ScriptJob(JobRunner):

         try:
             try:
+                # A script can modify multiple models so need to do an atomic lock on
+                # both the default database (for non ChangeLogged models) and potentially
+                # any other database (for ChangeLogged models)
                 with transaction.atomic():
                     script.output = script.run(data, commit)
                     if not commit:
@@ -18,9 +18,22 @@ class Empty(Lookup):
         return f"CAST(LENGTH({sql}) AS BOOLEAN) IS TRUE", params


+class NetHost(Lookup):
+    """
+    Similar to ipam.lookups.NetHost, but casts the field to INET.
+    """
+    lookup_name = 'net_host'
+
+    def as_sql(self, qn, connection):
+        lhs, lhs_params = self.process_lhs(qn, connection)
+        rhs, rhs_params = self.process_rhs(qn, connection)
+        params = lhs_params + rhs_params
+        return 'HOST(CAST(%s AS INET)) = HOST(%s)' % (lhs, rhs), params
+
+
 class NetContainsOrEquals(Lookup):
     """
-    This lookup has the same functionality as the one from the ipam app except lhs is cast to inet
+    Similar to ipam.lookups.NetContainsOrEquals, but casts the field to INET.
     """
     lookup_name = 'net_contains_or_equals'

@@ -32,4 +45,5 @@ class NetContainsOrEquals(Lookup):


 CharField.register_lookup(Empty)
+CachedValueField.register_lookup(NetHost)
 CachedValueField.register_lookup(NetContainsOrEquals)
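A hedged sketch of what the new lookup enables once registered on CachedValueField: matching cached global-search values on host address regardless of mask. CachedValue is the search-cache model in the extras app; treat the query itself as illustrative.

from extras.models import CachedValue

# Roughly: WHERE HOST(CAST(value AS INET)) = HOST('192.0.2.1/24')
hits = CachedValue.objects.filter(value__net_host='192.0.2.1/24')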
netbox/extras/migrations/0129_fix_script_paths.py (new file, 56 lines)
@@ -0,0 +1,56 @@
+from django.conf import settings
+from django.core.files.storage import storages
+from django.db import migrations
+from urllib.parse import urlparse
+
+from extras.storage import ScriptFileSystemStorage
+
+
+def normalize(url):
+    parsed_url = urlparse(url)
+    if not parsed_url.path.endswith('/'):
+        return url + '/'
+    return url
+
+
+def fix_script_paths(apps, schema_editor):
+    """
+    Fix script paths for scripts that had incorrect path from NB 4.3.
+    """
+    storage = storages.create_storage(storages.backends["scripts"])
+    if not isinstance(storage, ScriptFileSystemStorage):
+        return
+
+    ScriptModule = apps.get_model('extras', 'ScriptModule')
+    script_root_path = normalize(settings.SCRIPTS_ROOT)
+    for script in ScriptModule.objects.filter(file_path__startswith=script_root_path):
+        script.file_path = script.file_path[len(script_root_path):]
+        script.save()
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('extras', '0128_tableconfig'),
+    ]
+
+    operations = [
+        migrations.RunPython(code=fix_script_paths, reverse_code=migrations.RunPython.noop),
+    ]
+
+
+def oc_fix_script_paths(objectchange, reverting):
+    script_root_path = normalize(settings.SCRIPTS_ROOT)
+
+    for data in (objectchange.prechange_data, objectchange.postchange_data):
+        if data is None:
+            continue
+
+        if file_path := data.get('file_path'):
+            if file_path.startswith(script_root_path):
+                data['file_path'] = file_path[len(script_root_path):]
+
+
+objectchange_migrators = {
+    'extras.scriptmodule': oc_fix_script_paths,
+}
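The migration's path fix in miniature, with assumed example values:

SCRIPTS_ROOT = '/opt/netbox/netbox/scripts'            # assumed setting
stored = '/opt/netbox/netbox/scripts/my_report.py'     # bad path written by NB 4.3

root = SCRIPTS_ROOT if SCRIPTS_ROOT.endswith('/') else SCRIPTS_ROOT + '/'
print(stored[len(root):] if stored.startswith(root) else stored)   # -> my_report.py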
@@ -13,7 +13,7 @@ from rest_framework.utils.encoders import JSONEncoder

 from core.models import ObjectType
 from extras.choices import *
-from extras.conditions import ConditionSet
+from extras.conditions import ConditionSet, InvalidCondition
 from extras.constants import *
 from extras.utils import image_upload
 from extras.models.mixins import RenderTemplateMixin
@@ -142,7 +142,15 @@ class EventRule(CustomFieldsMixin, ExportTemplatesMixin, TagsMixin, ChangeLogged
         if not self.conditions:
             return True

-        return ConditionSet(self.conditions).eval(data)
+        logger = logging.getLogger('netbox.event_rules')
+
+        try:
+            result = ConditionSet(self.conditions).eval(data)
+            logger.debug(f'{self.name}: Evaluated as {result}')
+            return result
+        except InvalidCondition as e:
+            logger.error(f"{self.name}: Evaluation failed. {e}")
+            return False


 class Webhook(CustomFieldsMixin, ExportTemplatesMixin, TagsMixin, ChangeLoggedModel):
@@ -1,7 +1,7 @@
 from functools import cached_property

 from django.conf import settings
-from django.contrib.contenttypes.fields import GenericForeignKey
+from django.contrib.contenttypes.fields import GenericForeignKey, GenericRelation
 from django.core.exceptions import ValidationError
 from django.db import models
 from django.urls import reverse
@@ -144,6 +144,12 @@ class NotificationGroup(ChangeLoggedModel):
         blank=True,
         related_name='notification_groups'
     )
+    event_rules = GenericRelation(
+        to='extras.EventRule',
+        content_type_field='action_object_type',
+        object_id_field='action_object_id',
+        related_query_name='+'
+    )

     objects = RestrictedQuerySet.as_manager()

@@ -4,7 +4,7 @@ from django.test import TestCase
 from core.events import *
 from dcim.choices import SiteStatusChoices
 from dcim.models import Site
-from extras.conditions import Condition, ConditionSet
+from extras.conditions import Condition, ConditionSet, InvalidCondition
 from extras.events import serialize_for_event
 from extras.forms import EventRuleForm
 from extras.models import EventRule, Webhook
@@ -12,16 +12,11 @@ from extras.models import EventRule, Webhook

 class ConditionTestCase(TestCase):

-    def test_dotted_path_access(self):
-        c = Condition('a.b.c', 1, 'eq')
-        self.assertTrue(c.eval({'a': {'b': {'c': 1}}}))
-        self.assertFalse(c.eval({'a': {'b': {'c': 2}}}))
-        self.assertFalse(c.eval({'a': {'b': {'x': 1}}}))
-
     def test_undefined_attr(self):
         c = Condition('x', 1, 'eq')
-        self.assertFalse(c.eval({}))
         self.assertTrue(c.eval({'x': 1}))
+        with self.assertRaises(InvalidCondition):
+            c.eval({})

     #
     # Validation tests
@@ -37,10 +32,13 @@ class ConditionTestCase(TestCase):
             # dict type is unsupported
             Condition('x', 1, dict())

-    def test_invalid_op_type(self):
+    def test_invalid_op_types(self):
         with self.assertRaises(ValueError):
             # 'gt' supports only numeric values
             Condition('x', 'foo', 'gt')
+        with self.assertRaises(ValueError):
+            # 'in' supports only iterable values
+            Condition('x', 123, 'in')

     #
     # Nested attrs tests
@@ -50,7 +48,10 @@ class ConditionTestCase(TestCase):
         c = Condition('x.y.z', 1)
         self.assertTrue(c.eval({'x': {'y': {'z': 1}}}))
         self.assertFalse(c.eval({'x': {'y': {'z': 2}}}))
-        self.assertFalse(c.eval({'a': {'b': {'c': 1}}}))
+        with self.assertRaises(InvalidCondition):
+            c.eval({'x': {'y': None}})
+        with self.assertRaises(InvalidCondition):
+            c.eval({'x': {'y': {'a': 1}}})

     #
     # Operator tests
@@ -74,23 +75,31 @@ class ConditionTestCase(TestCase):
         c = Condition('x', 1, 'gt')
         self.assertTrue(c.eval({'x': 2}))
         self.assertFalse(c.eval({'x': 1}))
+        with self.assertRaises(InvalidCondition):
+            c.eval({'x': 'foo'})  # Invalid type

     def test_gte(self):
         c = Condition('x', 1, 'gte')
         self.assertTrue(c.eval({'x': 2}))
         self.assertTrue(c.eval({'x': 1}))
         self.assertFalse(c.eval({'x': 0}))
+        with self.assertRaises(InvalidCondition):
+            c.eval({'x': 'foo'})  # Invalid type

     def test_lt(self):
         c = Condition('x', 2, 'lt')
         self.assertTrue(c.eval({'x': 1}))
         self.assertFalse(c.eval({'x': 2}))
+        with self.assertRaises(InvalidCondition):
+            c.eval({'x': 'foo'})  # Invalid type

     def test_lte(self):
         c = Condition('x', 2, 'lte')
         self.assertTrue(c.eval({'x': 1}))
         self.assertTrue(c.eval({'x': 2}))
         self.assertFalse(c.eval({'x': 3}))
+        with self.assertRaises(InvalidCondition):
+            c.eval({'x': 'foo'})  # Invalid type

     def test_in(self):
         c = Condition('x', [1, 2, 3], 'in')
@@ -106,6 +115,8 @@ class ConditionTestCase(TestCase):
         c = Condition('x', 1, 'contains')
         self.assertTrue(c.eval({'x': [1, 2, 3]}))
         self.assertFalse(c.eval({'x': [2, 3, 4]}))
+        with self.assertRaises(InvalidCondition):
+            c.eval({'x': 123})  # Invalid type

     def test_contains_negated(self):
         c = Condition('x', 1, 'contains', negate=True)
@@ -444,6 +444,8 @@ class TagTestCase(ViewTestCases.OrganizationalObjectViewTestCase):
     @classmethod
     def setUpTestData(cls):

+        site_ct = ContentType.objects.get_for_model(Site)
+
         tags = (
             Tag(name='Tag 1', slug='tag-1'),
             Tag(name='Tag 2', slug='tag-2', weight=1),
@@ -456,14 +458,15 @@ class TagTestCase(ViewTestCases.OrganizationalObjectViewTestCase):
             'slug': 'tag-x',
             'color': 'c0c0c0',
             'comments': 'Some comments',
+            'object_types': [site_ct.pk],
             'weight': 11,
         }

         cls.csv_data = (
-            "name,slug,color,description,weight",
-            "Tag 4,tag-4,ff0000,Fourth tag,0",
-            "Tag 5,tag-5,00ff00,Fifth tag,1111",
-            "Tag 6,tag-6,0000ff,Sixth tag,0",
+            "name,slug,color,description,object_types,weight",
+            "Tag 4,tag-4,ff0000,Fourth tag,dcim.interface,0",
+            "Tag 5,tag-5,00ff00,Fifth tag,'dcim.device,dcim.site',1111",
+            "Tag 6,tag-6,0000ff,Sixth tag,dcim.site,0",
         )

         cls.csv_update_data = (
@@ -1476,7 +1476,16 @@ class ScriptResultView(TableMixin, generic.ObjectView):
         table = None
         job = get_object_or_404(Job.objects.all(), pk=kwargs.get('job_pk'))

-        if job.completed:
+        # If a direct export output has been requested, return the job data content as a
+        # downloadable file.
+        if job.completed and request.GET.get('export') == 'output':
+            content = (job.data.get("output") or "").encode()
+            response = HttpResponse(content, content_type='text')
+            filename = f"{job.object.name or 'script-output'}_{job.completed.strftime('%Y-%m-%d_%H%M%S')}.txt"
+            response['Content-Disposition'] = f'attachment; filename="{filename}"'
+            return response
+
+        elif job.completed:
             table = self.get_table(job, request, bulk_actions=False)

             log_threshold = request.GET.get('log_threshold', LogLevelChoices.LOG_INFO)
@@ -2,7 +2,7 @@ from copy import deepcopy

 from django.contrib.contenttypes.prefetch import GenericPrefetch
 from django.core.exceptions import ObjectDoesNotExist, PermissionDenied
-from django.db import transaction
+from django.db import router, transaction
 from django.shortcuts import get_object_or_404
 from django.utils.translation import gettext as _
 from django_pglocks import advisory_lock
@@ -295,7 +295,7 @@ class AvailableObjectsView(ObjectValidationMixin, APIView):

         # Create the new IP address(es)
         try:
-            with transaction.atomic():
+            with transaction.atomic(using=router.db_for_write(self.queryset.model)):
                 created = serializer.save()
                 self._validate_objects(created)
         except ObjectDoesNotExist:
@@ -449,7 +449,7 @@ class PrefixFilterSet(NetBoxModelFilterSet, ScopedFilterSet, TenancyFilterSet, C
     @extend_schema_field(OpenApiTypes.STR)
     def filter_present_in_vrf(self, queryset, name, vrf):
         if vrf is None:
-            return queryset.none
+            return queryset.none()
         return queryset.filter(
             Q(vrf=vrf) |
             Q(vrf__export_targets__in=vrf.import_targets.all())
@@ -729,7 +729,7 @@ class IPAddressFilterSet(NetBoxModelFilterSet, TenancyFilterSet, ContactModelFil
     @extend_schema_field(OpenApiTypes.STR)
     def filter_present_in_vrf(self, queryset, name, vrf):
         if vrf is None:
-            return queryset.none
+            return queryset.none()
         return queryset.filter(
             Q(vrf=vrf) |
             Q(vrf__export_targets__in=vrf.import_targets.all())
@@ -633,7 +633,10 @@ class ServiceImportForm(NetBoxModelImportForm):
        # triggered
        parent = self.cleaned_data.get('parent')
        for ip_address in self.cleaned_data.get('ipaddresses', []):
-            if not ip_address.assigned_object or getattr(ip_address.assigned_object, 'parent_object') != parent:
+            if not (assigned := ip_address.assigned_object) or (  # no assigned object
+                (isinstance(parent, FHRPGroup) and assigned != parent)  # assigned to FHRPGroup
+                and getattr(assigned, 'parent_object') != parent  # assigned to [VM]Interface
+            ):
                raise forms.ValidationError(
                    _("{ip} is not assigned to this parent.").format(ip=ip_address)
                )
@@ -826,7 +826,7 @@ class ServiceForm(NetBoxModelForm):
        except ObjectDoesNotExist:
            pass

-        if self.instance and parent_object_type_id != self.instance.parent_object_type_id:
+        if self.instance and self.instance.pk and parent_object_type_id != self.instance.parent_object_type_id:
            self.initial['parent'] = None

    def clean(self):
@@ -11,10 +11,12 @@ from strawberry_django import FilterLookup, DateFilterLookup

 from core.graphql.filter_mixins import BaseObjectTypeFilterMixin, ChangeLogFilterMixin
 from dcim.graphql.filter_mixins import ScopedFilterMixin
+from dcim.models import Device
 from ipam import models
 from ipam.graphql.filter_mixins import ServiceBaseFilterMixin
 from netbox.graphql.filter_mixins import NetBoxModelFilterMixin, OrganizationalModelFilterMixin, PrimaryModelFilterMixin
 from tenancy.graphql.filter_mixins import ContactFilterMixin, TenancyFilterMixin
+from virtualization.models import VMInterface

 if TYPE_CHECKING:
     from netbox.graphql.filter_lookups import IntegerArrayLookup, IntegerLookup
@@ -116,6 +118,30 @@ class FHRPGroupAssignmentFilter(BaseObjectTypeFilterMixin, ChangeLogFilterMixin)
         strawberry_django.filter_field()
     )

+    @strawberry_django.filter_field()
+    def device_id(self, queryset, value: list[str], prefix) -> Q:
+        return self.filter_device('id', value)
+
+    @strawberry_django.filter_field()
+    def device(self, value: list[str], prefix) -> Q:
+        return self.filter_device('name', value)
+
+    @strawberry_django.filter_field()
+    def virtual_machine_id(self, value: list[str], prefix) -> Q:
+        return Q(interface_id__in=VMInterface.objects.filter(virtual_machine_id__in=value))
+
+    @strawberry_django.filter_field()
+    def virtual_machine(self, value: list[str], prefix) -> Q:
+        return Q(interface_id__in=VMInterface.objects.filter(virtual_machine__name__in=value))
+
+    def filter_device(self, field, value) -> Q:
+        """Helper to standardize logic for device and device_id filters"""
+        devices = Device.objects.filter(**{f'{field}__in': value})
+        interface_ids = []
+        for device in devices:
+            interface_ids.extend(device.vc_interfaces().values_list('id', flat=True))
+        return Q(interface_id__in=interface_ids)
+

 @strawberry_django.filter_type(models.IPAddress, lookups=True)
 class IPAddressFilter(ContactFilterMixin, TenancyFilterMixin, PrimaryModelFilterMixin):
@@ -162,6 +162,11 @@ class Aggregate(ContactsMixin, GetAvailablePrefixesMixin, PrimaryModel):
             return self.prefix.version
         return None

+    @property
+    def ipv6_full(self):
+        if self.prefix and self.prefix.version == 6:
+            return netaddr.IPAddress(self.prefix).format(netaddr.ipv6_full)
+
     def get_child_prefixes(self):
         """
         Return all Prefixes within this Aggregate
@@ -330,6 +335,11 @@ class Prefix(ContactsMixin, GetAvailablePrefixesMixin, CachedScopeMixin, Primary
     def mask_length(self):
         return self.prefix.prefixlen if self.prefix else None

+    @property
+    def ipv6_full(self):
+        if self.prefix and self.prefix.version == 6:
+            return netaddr.IPAddress(self.prefix).format(netaddr.ipv6_full)
+
     @property
     def depth(self):
         return self._depth
@@ -808,6 +818,11 @@ class IPAddress(ContactsMixin, PrimaryModel):
         self._original_assigned_object_id = self.__dict__.get('assigned_object_id')
         self._original_assigned_object_type_id = self.__dict__.get('assigned_object_type_id')

+    @property
+    def ipv6_full(self):
+        if self.address and self.address.version == 6:
+            return netaddr.IPAddress(self.address).format(netaddr.ipv6_full)
+
     def get_duplicates(self):
         return IPAddress.objects.filter(
             vrf=self.vrf,
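The three new ipv6_full properties all lean on netaddr's verbose IPv6 dialect; for reference, the formatting itself looks like this (illustrative address):

import netaddr

print(netaddr.IPAddress('2001:db8::1').format(netaddr.ipv6_full))
# 2001:0db8:0000:0000:0000:0000:0000:0001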
@@ -1068,6 +1068,9 @@ class ServiceTestCase(ViewTestCases.PrimaryObjectViewTestCase):
         role = DeviceRole.objects.create(name='Device Role 1', slug='device-role-1')
         device = Device.objects.create(name='Device 1', site=site, device_type=devicetype, role=role)
         interface = Interface.objects.create(device=device, name='Interface 1', type=InterfaceTypeChoices.TYPE_VIRTUAL)
+        fhrp_group = FHRPGroup.objects.create(
+            name='Group 1', group_id=1234, protocol=FHRPGroupProtocolChoices.PROTOCOL_CARP
+        )

         services = (
             Service(parent=device, name='Service 1', protocol=ServiceProtocolChoices.PROTOCOL_TCP, ports=[101]),
@@ -1079,6 +1082,7 @@ class ServiceTestCase(ViewTestCases.PrimaryObjectViewTestCase):
         ip_addresses = (
             IPAddress(assigned_object=interface, address='192.0.2.1/24'),
             IPAddress(assigned_object=interface, address='192.0.2.2/24'),
+            IPAddress(assigned_object=fhrp_group, address='192.0.2.3/24'),
         )
         IPAddress.objects.bulk_create(ip_addresses)

@@ -1100,6 +1104,7 @@ class ServiceTestCase(ViewTestCases.PrimaryObjectViewTestCase):
             "dcim.device,Device 1,Service 1,tcp,1,192.0.2.1/24,First service",
             "dcim.device,Device 1,Service 2,tcp,2,192.0.2.2/24,Second service",
             "dcim.device,Device 1,Service 3,udp,3,,Third service",
+            "ipam.fhrpgroup,Group 1,Service 4,udp,4,192.0.2.3/24,Fourth service",
         )

         cls.csv_update_data = (
@@ -2,7 +2,7 @@ import logging
 from functools import cached_property

 from django.core.exceptions import ObjectDoesNotExist, PermissionDenied
-from django.db import transaction
+from django.db import router, transaction
 from django.db.models import ProtectedError, RestrictedError
 from django_pglocks import advisory_lock
 from netbox.constants import ADVISORY_LOCK_KEYS
@@ -170,7 +170,7 @@ class NetBoxModelViewSet(

         # Enforce object-level permissions on save()
         try:
-            with transaction.atomic():
+            with transaction.atomic(using=router.db_for_write(model)):
                 instance = serializer.save()
                 self._validate_objects(instance)
         except ObjectDoesNotExist:
@@ -190,7 +190,7 @@ class NetBoxModelViewSet(

         # Enforce object-level permissions on save()
         try:
-            with transaction.atomic():
+            with transaction.atomic(using=router.db_for_write(model)):
                 instance = serializer.save()
                 self._validate_objects(instance)
         except ObjectDoesNotExist:
@@ -1,5 +1,5 @@
 from django.core.exceptions import ObjectDoesNotExist
-from django.db import transaction
+from django.db import router, transaction
 from django.http import Http404
 from rest_framework import status
 from rest_framework.response import Response
@@ -56,22 +56,22 @@ class SequentialBulkCreatesMixin:
     which depends on the evaluation of existing objects (such as checking for free space within a rack) functions
     appropriately.
     """
-    @transaction.atomic
     def create(self, request, *args, **kwargs):
-        if not isinstance(request.data, list):
-            # Creating a single object
-            return super().create(request, *args, **kwargs)
+        with transaction.atomic(using=router.db_for_write(self.queryset.model)):
+            if not isinstance(request.data, list):
+                # Creating a single object
+                return super().create(request, *args, **kwargs)

-        return_data = []
-        for data in request.data:
-            serializer = self.get_serializer(data=data)
-            serializer.is_valid(raise_exception=True)
-            self.perform_create(serializer)
-            return_data.append(serializer.data)
+            return_data = []
+            for data in request.data:
+                serializer = self.get_serializer(data=data)
+                serializer.is_valid(raise_exception=True)
+                self.perform_create(serializer)
+                return_data.append(serializer.data)

-        headers = self.get_success_headers(serializer.data)
+            headers = self.get_success_headers(serializer.data)

-        return Response(return_data, status=status.HTTP_201_CREATED, headers=headers)
+            return Response(return_data, status=status.HTTP_201_CREATED, headers=headers)


 class BulkUpdateModelMixin:
@@ -113,7 +113,7 @@ class BulkUpdateModelMixin:
         return Response(data, status=status.HTTP_200_OK)

     def perform_bulk_update(self, objects, update_data, partial):
-        with transaction.atomic():
+        with transaction.atomic(using=router.db_for_write(self.queryset.model)):
             data_list = []
             for obj in objects:
                 data = update_data.get(obj.id)
@@ -157,7 +157,7 @@ class BulkDestroyModelMixin:
         return Response(status=status.HTTP_204_NO_CONTENT)

     def perform_bulk_destroy(self, objects):
-        with transaction.atomic():
+        with transaction.atomic(using=router.db_for_write(self.queryset.model)):
             for obj in objects:
                 if hasattr(obj, 'snapshot'):
                     obj.snapshot()
@@ -231,14 +231,19 @@ SESSION_FILE_PATH = None
 # DISK_BASE_UNIT = 1024
 # RAM_BASE_UNIT = 1024

-# By default, uploaded media is stored on the local filesystem. Using Django-storages is also supported. Provide the
-# class path of the storage driver in STORAGE_BACKEND and any configuration options in STORAGE_CONFIG. For example:
-# STORAGE_BACKEND = 'storages.backends.s3boto3.S3Boto3Storage'
-# STORAGE_CONFIG = {
-#     'AWS_ACCESS_KEY_ID': 'Key ID',
-#     'AWS_SECRET_ACCESS_KEY': 'Secret',
-#     'AWS_STORAGE_BUCKET_NAME': 'netbox',
-#     'AWS_S3_REGION_NAME': 'eu-west-1',
+# Within the STORAGES dictionary, "default" is used for image uploads, "staticfiles" is for static files and "scripts"
+# is used for custom scripts. See django-storages and django-storage-swift libraries for more details. By default the
+# following configuration is used:
+# STORAGES = {
+#     "default": {
+#         "BACKEND": "django.core.files.storage.FileSystemStorage",
+#     },
+#     "staticfiles": {
+#         "BACKEND": "django.contrib.staticfiles.storage.StaticFilesStorage",
+#     },
+#     "scripts": {
+#         "BACKEND": "extras.storage.ScriptFileSystemStorage",
+#     },
 # }

 # Time zone (default: UTC)
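For deployments that do not want the defaults shown in that comment, Django's STORAGES setting also accepts per-backend OPTIONS. A hedged example pointing "default" at django-storages' S3 backend; the bucket name is a placeholder and the other backends are left at the documented defaults:

STORAGES = {
    "default": {
        "BACKEND": "storages.backends.s3boto3.S3Boto3Storage",
        "OPTIONS": {
            "bucket_name": "netbox-media",   # placeholder bucket
        },
    },
    "staticfiles": {
        "BACKEND": "django.contrib.staticfiles.storage.StaticFilesStorage",
    },
    "scripts": {
        "BACKEND": "extras.storage.ScriptFileSystemStorage",
    },
}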
@@ -8,6 +8,7 @@ from django_pglocks import advisory_lock
 from rq.timeouts import JobTimeoutException

 from core.choices import JobStatusChoices
+from core.exceptions import JobFailed
 from core.models import Job, ObjectType
 from netbox.constants import ADVISORY_LOCK_KEYS
 from netbox.registry import registry
@@ -73,15 +74,21 @@ class JobRunner(ABC):
         This method is called by the Job Scheduler to handle the execution of all job commands. It will maintain the
         job's metadata and handle errors. For periodic jobs, a new job is automatically scheduled using its `interval`.
         """
+        logger = logging.getLogger('netbox.jobs')
+
         try:
             job.start()
             cls(job).run(*args, **kwargs)
             job.terminate()

+        except JobFailed:
+            logger.warning(f"Job {job} failed")
+            job.terminate(status=JobStatusChoices.STATUS_FAILED)
+
         except Exception as e:
             job.terminate(status=JobStatusChoices.STATUS_ERRORED, error=repr(e))
             if type(e) is JobTimeoutException:
-                logging.error(e)
+                logger.error(e)

         # If the executed job is a periodic job, schedule its next execution at the specified interval.
         finally:
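A minimal sketch of the new failure path from a job's point of view; the job class below is hypothetical, and only JobFailed and JobRunner are taken from the hunks above:

from core.exceptions import JobFailed
from netbox.jobs import JobRunner


class NightlyAudit(JobRunner):          # hypothetical example job
    class Meta:
        name = 'Nightly audit'

    def run(self, *args, **kwargs):
        findings = kwargs.get('findings', 0)
        if findings:
            # Ends the job in STATUS_FAILED (with a warning log) rather than
            # STATUS_ERRORED, which stays reserved for unexpected exceptions.
            raise JobFailed()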
netbox/netbox/models/deletion.py (new file, 90 lines)
@@ -0,0 +1,90 @@
+import logging
+
+from django.contrib.contenttypes.fields import GenericRelation
+from django.db import router
+from django.db.models.deletion import Collector
+
+logger = logging.getLogger("netbox.models.deletion")
+
+
+class CustomCollector(Collector):
+    """
+    Custom collector that handles GenericRelations correctly.
+    """
+
+    def collect(
+        self,
+        objs,
+        source=None,
+        nullable=False,
+        collect_related=True,
+        source_attr=None,
+        reverse_dependency=False,
+        keep_parents=False,
+        fail_on_restricted=True,
+    ):
+        """
+        Override collect to first collect standard dependencies,
+        then add GenericRelations to the dependency graph.
+        """
+        # Call parent collect first to get all standard dependencies
+        super().collect(
+            objs,
+            source=source,
+            nullable=nullable,
+            collect_related=collect_related,
+            source_attr=source_attr,
+            reverse_dependency=reverse_dependency,
+            keep_parents=keep_parents,
+            fail_on_restricted=fail_on_restricted,
+        )
+
+        # Track which GenericRelations we've already processed to prevent infinite recursion
+        processed_relations = set()
+
+        # Now add GenericRelations to the dependency graph
+        for _, instances in list(self.data.items()):
+            for instance in instances:
+                # Get all GenericRelations for this model
+                for field in instance._meta.private_fields:
+                    if isinstance(field, GenericRelation):
+                        # Create a unique key for this relation
+                        relation_key = f"{instance._meta.model_name}.{field.name}"
+                        if relation_key in processed_relations:
+                            continue
+                        processed_relations.add(relation_key)
+
+                        # Add the model that the generic relation points to as a dependency
+                        self.add_dependency(field.related_model, instance, reverse_dependency=True)
+
+
+class DeleteMixin:
+    """
+    Mixin to override the model delete function to use our custom collector.
+    """
+
+    def delete(self, using=None, keep_parents=False):
+        """
+        Override delete to use our custom collector.
+        """
+        using = using or router.db_for_write(self.__class__, instance=self)
+        assert self._get_pk_val() is not None, "%s object can't be deleted because its %s attribute is set to None." % (
+            self._meta.object_name,
+            self._meta.pk.attname,
+        )
+
+        collector = CustomCollector(using=using)
+        collector.collect([self], keep_parents=keep_parents)
+
+        return collector.delete()
+
+    delete.alters_data = True
+
+    @classmethod
+    def verify_mro(cls, instance):
+        """
+        Verify that this mixin is first in the MRO.
+        """
+        mro = instance.__class__.__mro__
+        if mro.index(cls) != 0:
+            raise RuntimeError(f"{cls.__name__} must be first in the MRO. Current MRO: {mro}")
@@ -16,6 +16,7 @@ from extras.choices import *
 from extras.constants import CUSTOMFIELD_EMPTY_VALUES
 from extras.utils import is_taggable
 from netbox.config import get_config
+from netbox.models.deletion import DeleteMixin
 from netbox.registry import registry
 from netbox.signals import post_clean
 from utilities.json import CustomFieldJSONEncoder
@@ -45,7 +46,7 @@ __all__ = (
 # Feature mixins
 #

-class ChangeLoggingMixin(models.Model):
+class ChangeLoggingMixin(DeleteMixin, models.Model):
     """
     Provides change logging support for a model. Adds the `created` and `last_updated` fields.
     """
@@ -1,6 +1,8 @@
 from dataclasses import dataclass
 from typing import Sequence, Optional

+from django.urls import reverse_lazy
+

 __all__ = (
     'get_model_item',
@@ -22,20 +24,46 @@ class MenuItemButton:
     link: str
     title: str
     icon_class: str
+    _url: Optional[str] = None
     permissions: Optional[Sequence[str]] = ()
     color: Optional[str] = None

+    def __post_init__(self):
+        if self.link:
+            self._url = reverse_lazy(self.link)
+
+    @property
+    def url(self):
+        return self._url
+
+    @url.setter
+    def url(self, value):
+        self._url = value
+

 @dataclass
 class MenuItem:

     link: str
     link_text: str
+    _url: Optional[str] = None
     permissions: Optional[Sequence[str]] = ()
     auth_required: Optional[bool] = False
     staff_only: Optional[bool] = False
     buttons: Optional[Sequence[MenuItemButton]] = ()

+    def __post_init__(self):
+        if self.link:
+            self._url = reverse_lazy(self.link)
+
+    @property
+    def url(self):
+        return self._url
+
+    @url.setter
+    def url(self, value):
+        self._url = value
+

 @dataclass
 class MenuGroup:
@@ -1,3 +1,4 @@
+from django.urls import reverse_lazy
 from django.utils.text import slugify
 from django.utils.translation import gettext as _

@@ -32,17 +33,23 @@ class PluginMenuItem:
     This class represents a navigation menu item. This constitutes primary link and its text, but also allows for
     specifying additional link buttons that appear to the right of the item in the van menu.

-    Links are specified as Django reverse URL strings.
+    Links are specified as Django reverse URL strings suitable for rendering via {% url item.link %}.
+    Alternatively, a pre-generated url can be set on the object which will be rendered literally.
     Buttons are each specified as a list of PluginMenuButton instances.
     """
     permissions = []
     buttons = []
+    _url = None

-    def __init__(self, link, link_text, auth_required=False, staff_only=False, permissions=None, buttons=None):
+    def __init__(
+        self, link, link_text, auth_required=False, staff_only=False, permissions=None, buttons=None
+    ):
         self.link = link
         self.link_text = link_text
         self.auth_required = auth_required
         self.staff_only = staff_only
+        if link:
+            self._url = reverse_lazy(link)
         if permissions is not None:
             if type(permissions) not in (list, tuple):
                 raise TypeError(_("Permissions must be passed as a tuple or list."))
@@ -52,6 +59,14 @@ class PluginMenuItem:
                 raise TypeError(_("Buttons must be passed as a tuple or list."))
             self.buttons = buttons

+    @property
+    def url(self):
+        return self._url
+
+    @url.setter
+    def url(self, value):
+        self._url = value
+

 class PluginMenuButton:
     """
@@ -60,11 +75,14 @@ class PluginMenuButton:
     """
     color = ButtonColorChoices.DEFAULT
     permissions = []
+    _url = None

     def __init__(self, link, title, icon_class, color=None, permissions=None):
         self.link = link
         self.title = title
         self.icon_class = icon_class
+        if link:
+            self._url = reverse_lazy(link)
         if permissions is not None:
             if type(permissions) not in (list, tuple):
                 raise TypeError(_("Permissions must be passed as a tuple or list."))
@@ -73,3 +91,11 @@ class PluginMenuButton:
             if color not in ButtonColorChoices.values():
                 raise ValueError(_("Button color must be a choice within ButtonColorChoices."))
             self.color = color
+
+    @property
+    def url(self):
+        return self._url
+
+    @url.setter
+    def url(self, value):
+        self._url = value
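Based on the new url property, a plugin can keep passing a view name and let it resolve lazily, or assign a literal URL afterwards; the plugin and URL names below are hypothetical:

from netbox.plugins import PluginMenuItem

item = PluginMenuItem(
    link='plugins:my_plugin:widget_list',   # resolved via reverse_lazy()
    link_text='Widgets',
)

# Rendered literally, bypassing URL reversal:
item.url = '/plugins/my-plugin/widgets/?status=active'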
@ -54,6 +54,14 @@ PREFERENCES = {
|
|||||||
default='bottom',
|
default='bottom',
|
||||||
description=_('Where the paginator controls will be displayed relative to a table')
|
description=_('Where the paginator controls will be displayed relative to a table')
|
||||||
),
|
),
|
||||||
|
'ui.tables.striping': UserPreference(
|
||||||
|
label=_('Striped table rows'),
|
||||||
|
choices=(
|
||||||
|
('', _('Disabled')),
|
||||||
|
('true', _('Enabled')),
|
||||||
|
),
|
||||||
|
description=_('Render table rows with alternating colors to increase readability'),
|
||||||
|
),
|
||||||
|
|
||||||
# Miscellaneous
|
# Miscellaneous
|
||||||
'data_format': UserPreference(
|
'data_format': UserPreference(
|
||||||
|
@ -115,11 +115,13 @@ class CachedValueSearchBackend(SearchBackend):
|
|||||||
if lookup in (LookupTypes.STARTSWITH, LookupTypes.ENDSWITH):
|
if lookup in (LookupTypes.STARTSWITH, LookupTypes.ENDSWITH):
|
||||||
# "Starts/ends with" matches are valid only on string values
|
# "Starts/ends with" matches are valid only on string values
|
||||||
query_filter &= Q(type=FieldTypes.STRING)
|
query_filter &= Q(type=FieldTypes.STRING)
|
||||||
elif lookup == LookupTypes.PARTIAL:
|
elif lookup in (LookupTypes.PARTIAL, LookupTypes.EXACT):
|
||||||
try:
|
try:
|
||||||
# If the value looks like an IP address, add an extra match for CIDR values
|
# If the value looks like an IP address, add extra filters for CIDR/INET values
|
||||||
address = str(netaddr.IPNetwork(value.strip()).cidr)
|
address = str(netaddr.IPNetwork(value.strip()).cidr)
|
||||||
query_filter |= Q(type=FieldTypes.CIDR) & Q(value__net_contains_or_equals=address)
|
query_filter |= Q(type=FieldTypes.INET) & Q(value__net_host=address)
|
||||||
|
if lookup == LookupTypes.PARTIAL:
|
||||||
|
query_filter |= Q(type=FieldTypes.CIDR) & Q(value__net_contains_or_equals=address)
|
||||||
except (AddrFormatError, ValueError):
|
except (AddrFormatError, ValueError):
|
||||||
pass
|
pass
|
||||||
|
|
||||||
|
@ -66,6 +66,9 @@ class BaseTable(tables.Table):
|
|||||||
if column.visible:
|
if column.visible:
|
||||||
model = getattr(self.Meta, 'model')
|
model = getattr(self.Meta, 'model')
|
||||||
accessor = column.accessor
|
accessor = column.accessor
|
||||||
|
if accessor.startswith('custom_field_data__'):
|
||||||
|
# Ignore custom field references
|
||||||
|
continue
|
||||||
prefetch_path = []
|
prefetch_path = []
|
||||||
for field_name in accessor.split(accessor.SEPARATOR):
|
for field_name in accessor.split(accessor.SEPARATOR):
|
||||||
try:
|
try:
|
@@ -163,6 +166,8 @@ class BaseTable(tables.Table):
             columns = userconfig.get(f"tables.{self.name}.columns")
             if ordering is None:
                 ordering = userconfig.get(f"tables.{self.name}.ordering")
+            if userconfig.get("ui.tables.striping"):
+                self.attrs['class'] += ' table-striped'
 
         # Fall back to the default columns & ordering
         if columns is None and hasattr(settings, 'DEFAULT_USER_PREFERENCES'):

@@ -7,11 +7,15 @@ from django_rq import get_queue
 from ..jobs import *
 from core.models import DataSource, Job
 from core.choices import JobStatusChoices
+from core.exceptions import JobFailed
+from utilities.testing import disable_warnings
 
 
 class TestJobRunner(JobRunner):
 
     def run(self, *args, **kwargs):
-        pass
+        if kwargs.get('make_fail', False):
+            raise JobFailed()
 
 
 class JobRunnerTestCase(TestCase):

@@ -49,6 +53,12 @@ class JobRunnerTest(JobRunnerTestCase):
 
         self.assertEqual(job.status, JobStatusChoices.STATUS_COMPLETED)
 
+    def test_handle_failed(self):
+        with disable_warnings('netbox.jobs'):
+            job = TestJobRunner.enqueue(immediate=True, make_fail=True)
+
+        self.assertEqual(job.status, JobStatusChoices.STATUS_FAILED)
+
     def test_handle_errored(self):
         class ErroredJobRunner(TestJobRunner):
             EXP = Exception('Test error')

@@ -6,7 +6,7 @@ from django.contrib import messages
 from django.contrib.contenttypes.fields import GenericForeignKey, GenericRel
 from django.contrib.contenttypes.models import ContentType
 from django.core.exceptions import FieldDoesNotExist, ObjectDoesNotExist, ValidationError
-from django.db import transaction, IntegrityError
+from django.db import IntegrityError, router, transaction
 from django.db.models import ManyToManyField, ProtectedError, RestrictedError
 from django.db.models.fields.reverse_related import ManyToManyRel
 from django.forms import ModelMultipleChoiceField, MultipleHiddenInput

@@ -278,7 +278,7 @@ class BulkCreateView(GetReturnURLMixin, BaseMultiObjectView):
             logger.debug("Form validation was successful")
 
             try:
-                with transaction.atomic():
+                with transaction.atomic(using=router.db_for_write(model)):
                     new_objs = self._create_objects(form, request)
 
                     # Enforce object-level permissions

@@ -501,7 +501,7 @@ class BulkImportView(GetReturnURLMixin, BaseMultiObjectView):
 
             try:
                 # Iterate through data and bind each record to a new model form instance.
-                with transaction.atomic():
+                with transaction.atomic(using=router.db_for_write(model)):
                     new_objs = self.create_and_update_objects(form, request)
 
                     # Enforce object-level permissions

@@ -681,7 +681,7 @@ class BulkEditView(GetReturnURLMixin, BaseMultiObjectView):
         if form.is_valid():
             logger.debug("Form validation was successful")
             try:
-                with transaction.atomic():
+                with transaction.atomic(using=router.db_for_write(model)):
                     updated_objects = self._update_objects(form, request)
 
                     # Enforce object-level permissions

@@ -778,7 +778,7 @@ class BulkRenameView(GetReturnURLMixin, BaseMultiObjectView):
 
         if form.is_valid():
             try:
-                with transaction.atomic():
+                with transaction.atomic(using=router.db_for_write(self.queryset.model)):
                     renamed_pks = self._rename_objects(form, selected_objects)
 
                     if '_apply' in request.POST:

@@ -875,7 +875,7 @@ class BulkDeleteView(GetReturnURLMixin, BaseMultiObjectView):
             queryset = self.queryset.filter(pk__in=pk_list)
             deleted_count = queryset.count()
             try:
-                with transaction.atomic():
+                with transaction.atomic(using=router.db_for_write(model)):
                     for obj in queryset:
                         # Take a snapshot of change-logged models
                         if hasattr(obj, 'snapshot'):

@@ -980,7 +980,7 @@ class BulkComponentCreateView(GetReturnURLMixin, BaseMultiObjectView):
             }
 
             try:
-                with transaction.atomic():
+                with transaction.atomic(using=router.db_for_write(self.queryset.model)):
 
                     for obj in data['pk']:
 

@@ -1,7 +1,7 @@
 from django.contrib.auth.mixins import LoginRequiredMixin
 from django.contrib.contenttypes.models import ContentType
 from django.contrib import messages
-from django.db import transaction
+from django.db import router, transaction
 from django.db.models import Q
 from django.shortcuts import get_object_or_404, redirect, render
 from django.utils.translation import gettext_lazy as _

@@ -240,7 +240,7 @@ class BulkSyncDataView(GetReturnURLMixin, BaseMultiObjectView):
             data_file__isnull=False
         )
 
-        with transaction.atomic():
+        with transaction.atomic(using=router.db_for_write(self.queryset.model)):
             for obj in selected_objects:
                 obj.sync(save=True)
 

@@ -282,7 +282,7 @@ class ObjectEditView(GetReturnURLMixin, BaseObjectView):
             logger.debug("Form validation was successful")
 
             try:
-                with transaction.atomic():
+                with transaction.atomic(using=router.db_for_write(model)):
                     object_created = form.instance.pk is None
                     obj = form.save()
 

@@ -570,7 +570,7 @@ class ComponentCreateView(GetReturnURLMixin, BaseObjectView):
 
         if not form.errors and not component_form.errors:
             try:
-                with transaction.atomic():
+                with transaction.atomic(using=router.db_for_write(self.queryset.model)):
                     # Create the new components
                     new_objs = []
                     for component_form in new_components:

BIN netbox/project-static/dist/netbox.css (vendored): binary file not shown
BIN netbox/project-static/dist/netbox.js (vendored): binary file not shown
BIN netbox/project-static/dist/netbox.js.map (vendored): binary file not shown

@@ -23,14 +23,14 @@
   },
   "dependencies": {
     "@mdi/font": "7.4.47",
-    "@tabler/core": "1.3.2",
-    "bootstrap": "5.3.6",
+    "@tabler/core": "1.4.0",
+    "bootstrap": "5.3.7",
     "clipboard": "2.0.11",
     "flatpickr": "4.6.13",
-    "gridstack": "12.2.1",
-    "htmx.org": "2.0.4",
-    "query-string": "9.2.0",
-    "sass": "1.89.1",
+    "gridstack": "12.2.2",
+    "htmx.org": "2.0.6",
+    "query-string": "9.2.2",
+    "sass": "1.89.2",
     "tom-select": "2.4.3",
     "typeface-inter": "3.18.1",
     "typeface-roboto-mono": "1.1.13"
@@ -39,15 +39,15 @@
     "@types/bootstrap": "5.2.10",
     "@types/cookie": "^0.6.0",
     "@types/node": "^22.3.0",
-    "@typescript-eslint/eslint-plugin": "^8.1.0",
-    "@typescript-eslint/parser": "^8.1.0",
-    "esbuild": "^0.25.3",
+    "@typescript-eslint/eslint-plugin": "^8.37.0",
+    "@typescript-eslint/parser": "^8.37.0",
+    "esbuild": "^0.25.6",
     "esbuild-sass-plugin": "^3.3.1",
     "eslint": "<9.0",
     "eslint-config-prettier": "^9.1.0",
     "eslint-import-resolver-typescript": "^3.6.3",
-    "eslint-plugin-import": "^2.30.0",
-    "eslint-plugin-prettier": "^5.2.1",
+    "eslint-plugin-import": "^2.32.0",
+    "eslint-plugin-prettier": "^5.5.1",
     "prettier": "^3.3.3",
     "typescript": "<5.5"
   },

netbox/project-static/styles/custom/racks.scss (new file)
@@ -0,0 +1,4 @@
+.rack-loading-container {
+  min-height: 200px;
+  margin-left: 30px;
+}

@@ -27,3 +27,4 @@
 @import 'custom/markdown';
 @import 'custom/misc';
 @import 'custom/notifications';
+@import 'custom/racks';

@@ -1,3 +1,3 @@
-version: "4.3.2"
+version: "4.3.4"
 edition: "Community"
-published: "2025-06-05"
+published: "2025-07-15"

@@ -45,7 +45,7 @@
       </div>
     {% elif perms.dcim.add_cable %}
       <div class="dropdown">
-        <button type="button" class="btn btn-success dropdown-toggle" data-bs-toggle="dropdown" aria-haspopup="true" aria-expanded="false">
+        <button type="button" class="btn btn-primary dropdown-toggle" data-bs-toggle="dropdown" aria-haspopup="true" aria-expanded="false">
          <span class="mdi mdi-ethernet-cable" aria-hidden="true"></span> {% trans "Connect" %}
        </button>
        <ul class="dropdown-menu">

@@ -65,7 +65,7 @@
     {% trans "Not Connected" %}
     {% if perms.dcim.add_cable %}
       <div class="dropdown float-end">
-        <button type="button" class="btn btn-primary btn-sm dropdown-toggle" data-bs-toggle="dropdown" aria-expanded="false">
+        <button type="button" class="btn btn-primary dropdown-toggle" data-bs-toggle="dropdown" aria-expanded="false">
          <span class="mdi mdi-ethernet-cable" aria-hidden="true"></span> {% trans "Connect" %}
        </button>
        <ul class="dropdown-menu dropdown-menu-end">

@@ -65,7 +65,7 @@
     {% trans "Not Connected" %}
     {% if perms.dcim.add_cable %}
      <div class="dropdown float-end">
-        <button type="button" class="btn btn-primary btn-sm dropdown-toggle" data-bs-toggle="dropdown" aria-expanded="false">
+        <button type="button" class="btn btn-primary dropdown-toggle" data-bs-toggle="dropdown" aria-expanded="false">
          <span class="mdi mdi-ethernet-cable" aria-hidden="true"></span> {% trans "Connect" %}
        </button>
        <ul class="dropdown-menu dropdown-menu-end">

@@ -308,7 +308,7 @@
   {% trans "Services" %}
   {% if perms.ipam.add_service %}
     <div class="card-actions">
-      <a href="{% url 'ipam:service_add' %}?device={{ object.pk }}" class="btn btn-ghost-primary btn-sm">
+      <a href="{% url 'ipam:service_add' %}?parent_object_type={{ object|content_type_id }}&parent={{ object.pk }}" class="btn btn-ghost-primary btn-sm">
        <span class="mdi mdi-plus-thick" aria-hidden="true"></span> {% trans "Add a service" %}
      </a>
    </div>

@@ -1,6 +1,17 @@
 {% load i18n %}
 <div style="margin-left: -30px">
-  <object data="{% url 'dcim-api:rack-elevation' pk=object.pk %}?face={{face}}&render=svg{% if extra_params %}&{{ extra_params }}{% endif %}" class="rack_elevation" aria-label="{% trans "Rack elevation" %}"></object>
+  <div
+    hx-get="{% url 'dcim-api:rack-elevation' pk=object.pk %}?face={{ face }}&render=svg{% if extra_params %}&{{ extra_params }}{% endif %}"
+    hx-trigger="intersect"
+    hx-swap="outerHTML"
+    aria-label="{% trans "Rack elevation" %}"
+  >
+    <div class="d-flex justify-content-center align-items-center rack-loading-container">
+      <div class="spinner-border" role="status">
+        <span class="visually-hidden">{% trans "Loading..." %}</span>
+      </div>
+    </div>
+  </div>
 </div>
 <div class="text-center mt-3">
   <a class="btn btn-outline-primary" href="{% url 'dcim-api:rack-elevation' pk=object.pk %}?face={{face}}&render=svg{% if extra_params %}&{{ extra_params }}{% endif %}" hx-boost="false">

@@ -118,10 +118,6 @@
       {% else %}
         <div class="card-body text-muted">
           {% trans "Not connected" %}
-        </div>
-      {% endif %}
-      {% if not object.mark_connected and not object.cable %}
-        <div class="card-footer">
           {% if perms.dcim.add_cable %}
             <a href="{% url 'dcim:cable_add' %}?a_terminations_type=dcim.powerfeed&a_terminations={{ object.pk }}&b_terminations_type=dcim.powerport&return_url={{ object.get_absolute_url }}" class="btn btn-primary float-end">
               <i class="mdi mdi-ethernet-cable" aria-hidden="true"></i> {% trans "Connect" %}

@@ -14,7 +14,7 @@
     </tr>
     <tr>
       <th scope="row">Description</th>
-      <td>{{ object.description|markdown|placeholder }}</td>
+      <td>{{ object.description|placeholder }}</td>
     </tr>
     <tr>
       <th scope="row">Base Choices</th>

@@ -53,7 +53,16 @@
   {# Script output. Legacy reports will not have this. #}
   {% if 'output' in job.data %}
     <div class="card mb-3">
-      <h2 class="card-header">{% trans "Output" %}</h2>
+      <h2 class="card-header d-flex justify-content-between">
+        {% trans "Output" %}
+        {% if job.completed %}
+          <div>
+            <a href="?export=output" class="btn btn-primary lh-1" role="button">
+              <i class="mdi mdi-download" aria-hidden="true"></i> {% trans "Download" %}
+            </a>
+          </div>
+        {% endif %}
+      </h2>
       {% if job.data.output %}
         <pre class="card-body font-monospace">{{ job.data.output }}</pre>
       {% else %}

@@ -29,11 +29,7 @@
       <div class="hr-text">
         <span>{% trans "Custom Fields" %}</span>
       </div>
-      {% for name in filter_form.custom_fields %}
-        {% with field=filter_form|get_item:name %}
-          {% render_field field %}
-        {% endwith %}
-      {% endfor %}
+      {% render_custom_fields filter_form %}
     </div>
   {% endif %}
 </div>

@@ -154,7 +154,7 @@
   {% trans "Services" %}
   {% if perms.ipam.add_service %}
     <div class="card-actions">
-      <a href="{% url 'ipam:service_add' %}?virtual_machine={{ object.pk }}" class="btn btn-ghost-primary btn-sm">
+      <a href="{% url 'ipam:service_add' %}?parent_object_type={{ object|content_type_id }}&parent={{ object.pk }}" class="btn btn-ghost-primary btn-sm">
        <span class="mdi mdi-plus-thick" aria-hidden="true"></span> {% trans "Add a service" %}
      </a>
    </div>

@@ -19,6 +19,10 @@ class ContactGroupTable(NetBoxTable):
         verbose_name=_('Name'),
         linkify=True
     )
+    parent = tables.Column(
+        verbose_name=_('Parent'),
+        linkify=True,
+    )
     contact_count = columns.LinkedCountColumn(
         viewname='tenancy:contact_list',
         url_params={'group_id': 'pk'},
@@ -34,7 +38,7 @@ class ContactGroupTable(NetBoxTable):
     class Meta(NetBoxTable.Meta):
         model = ContactGroup
         fields = (
-            'pk', 'name', 'contact_count', 'description', 'comments', 'slug', 'tags', 'created',
+            'pk', 'name', 'parent', 'contact_count', 'description', 'comments', 'slug', 'tags', 'created',
             'last_updated', 'actions',
         )
         default_columns = ('pk', 'name', 'contact_count', 'description')

@@ -16,6 +16,10 @@ class TenantGroupTable(NetBoxTable):
         verbose_name=_('Name'),
         linkify=True
     )
+    parent = tables.Column(
+        verbose_name=_('Parent'),
+        linkify=True,
+    )
     tenant_count = columns.LinkedCountColumn(
         viewname='tenancy:tenant_list',
         url_params={'group_id': 'pk'},
@@ -31,7 +35,7 @@ class TenantGroupTable(NetBoxTable):
     class Meta(NetBoxTable.Meta):
         model = TenantGroup
         fields = (
-            'pk', 'id', 'name', 'tenant_count', 'description', 'comments', 'slug', 'tags', 'created',
+            'pk', 'id', 'name', 'parent', 'tenant_count', 'description', 'comments', 'slug', 'tags', 'created',
             'last_updated', 'actions',
        )
        default_columns = ('pk', 'name', 'tenant_count', 'description')
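Both tenancy table hunks above follow the same pattern: a linkified `parent` column is declared and appended to `Meta.fields`. The snippet below is an illustrative django-tables2 definition of that pattern; the table name and columns are hypothetical, and a configured Django environment is assumed.

```python
# Illustrative django-tables2 table with a linkified parent column, following
# the pattern of the ContactGroup/TenantGroup hunks above (hypothetical names).
import django_tables2 as tables
from django.utils.translation import gettext_lazy as _


class GroupTable(tables.Table):
    name = tables.Column(verbose_name=_('Name'), linkify=True)
    # linkify=True renders the related object's get_absolute_url() as a link
    parent = tables.Column(verbose_name=_('Parent'), linkify=True)
```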
(Additional binary files and oversized diffs are not shown; some files were omitted because too many files changed in this diff.)