Mirror of https://github.com/netbox-community/netbox.git, synced 2025-12-27 15:47:46 -06:00

Compare commits: 20660-scri...12318-case (53 commits)
| SHA1 |
|---|
| 4f2f61c90d |
| a34553325e |
| 06052f8eaa |
| dac0a06f4f |
| 52d4498caf |
| 5bbab7eb47 |
| 87505e0bb9 |
| 7d82493052 |
| 77c08b7bf9 |
| adad7c2209 |
| 5ad6bd88f6 |
| 2bebfccf9b |
| b7cc4c418b |
| 37a9d03348 |
| a91af996d5 |
| bb290dc792 |
| fcdb7ff6c8 |
| 18a308ae3a |
| c63e60a62b |
| 82db8a9c02 |
| bb75bceec5 |
| 9a68cde95f |
| 6c723dfb1a |
| 9b85d92ad0 |
| 917a2c2618 |
| 6388705e57 |
| ac335c3d87 |
| a54c508da2 |
| d69042f26e |
| f6290dd7af |
| adce67a7cf |
| f82f084c02 |
| 43fc7fb58a |
| 11099b01bb |
| 5dc48f3a88 |
| 1ee23ba6fa |
| 23d7515b41 |
| 12818f1786 |
| f0ae0da1c7 |
| c30e4813b7 |
| 57a7afd548 |
| b4eaeead13 |
| 24fff6bd74 |
| b9567208d4 |
| cfcea7c941 |
| 21ba27fb39 |
| c0e4d1c1e3 |
| d95eaa7ba2 |
| 5506901867 |
| ec9da88134 |
| e221f1fffa |
| 530dad279a |
| b1439dc298 |
.github/ISSUE_TEMPLATE/02-bug_report.yaml (4 changes, vendored)

@@ -35,9 +35,9 @@ body:
       label: Python Version
       description: What version of Python are you currently running?
       options:
-        - "3.10"
-        - "3.11"
         - "3.12"
         - "3.13"
+        - "3.14"
     validations:
       required: true
   - type: textarea
.github/workflows/ci.yml (2 changes, vendored)

@@ -31,7 +31,7 @@ jobs:
       NETBOX_CONFIGURATION: netbox.configuration_testing
     strategy:
       matrix:
-        python-version: ['3.10', '3.11', '3.12']
+        python-version: ['3.12', '3.13']
         node-version: ['20.x']
     services:
       redis:
contrib/openapi.json (1235 changes)

File diff suppressed because it is too large.
@@ -2,7 +2,7 @@

 ## Local Authentication

-Local user accounts and groups can be created in NetBox under the "Authentication" section in the "Admin" menu. This section is available only to users with the "staff" permission enabled.
+Local user accounts and groups can be created in NetBox under the "Authentication" section in the "Admin" menu.

 At a minimum, each user account must have a username and password set. User accounts may also denote a first name, last name, and email address. [Permissions](../permissions.md) may also be assigned to individual users and/or groups as needed.
@@ -1,5 +1,15 @@
 # GraphQL API Parameters

+## GRAPHQL_DEFAULT_VERSION
+
+!!! note "This parameter was introduced in NetBox v4.5."
+
+Default: `1`
+
+Designates the default version of the GraphQL API served by `/graphql/`. To access a specific version, append the version number to the URL, e.g. `/graphql/v2/`.
+
+---
+
 ## GRAPHQL_ENABLED

 !!! tip "Dynamic Configuration Parameter"
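For illustration, a minimal sketch of calling the versioned GraphQL endpoint described above with Python's `requests` library. The hostname, token value, and the GraphQL query body are placeholders, not values taken from this changeset.

```python
import requests

# Target the explicit v2 endpoint; plain /graphql/ serves whichever version
# GRAPHQL_DEFAULT_VERSION selects.
response = requests.post(
    'https://netbox.example.com/graphql/v2/',
    headers={'Authorization': 'Bearer nbt_<key>.<token>'},
    json={'query': '{ site_list { id name } }'},  # illustrative query only
)
print(response.json())
```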
@@ -127,19 +127,3 @@ The list of groups that promote an remote User to Superuser on Login. If group i

 Default: `[]` (Empty list)

 The list of users that get promoted to Superuser on Login. If user isn't present in list on next Login, the Role gets revoked. (Requires `REMOTE_AUTH_ENABLED` and `REMOTE_AUTH_GROUP_SYNC_ENABLED` )
-
----
-
-## REMOTE_AUTH_STAFF_GROUPS
-
-Default: `[]` (Empty list)
-
-The list of groups that promote an remote User to Staff on Login. If group isn't present on next Login, the Role gets revoked. (Requires `REMOTE_AUTH_ENABLED` and `REMOTE_AUTH_GROUP_SYNC_ENABLED` )
-
----
-
-## REMOTE_AUTH_STAFF_USERS
-
-Default: `[]` (Empty list)
-
-The list of users that get promoted to Staff on Login. If user isn't present in list on next Login, the Role gets revoked. (Requires `REMOTE_AUTH_ENABLED` and `REMOTE_AUTH_GROUP_SYNC_ENABLED` )
@@ -23,6 +23,31 @@ ALLOWED_HOSTS = ['*']

 ---

+## API_TOKEN_PEPPERS
+
+!!! info "This parameter was introduced in NetBox v4.5."
+
+[Cryptographic peppers](https://en.wikipedia.org/wiki/Pepper_(cryptography)) are employed to generate hashes of sensitive values on the server. This parameter defines the peppers used to hash v2 API tokens in NetBox. You must define at least one pepper before creating a v2 API token. See the [API documentation](../integrations/rest-api.md#authentication) for further information about how peppers are used.
+
+```python
+API_TOKEN_PEPPERS = {
+    # DO NOT USE THIS EXAMPLE PEPPER IN PRODUCTION
+    1: 'kp7ht*76fiQAhUi5dHfASLlYUE_S^gI^(7J^K5M!LfoH@vl&b_',
+}
+```
+
+!!! warning "Peppers are sensitive"
+    Treat pepper values as extremely sensitive. Consider populating peppers from environment variables at initialization time rather than defining them in the configuration file, if feasible.
+
+Peppers must be at least 50 characters in length and should comprise a random string with a diverse character set. Consider using the Python script at `$INSTALL_ROOT/netbox/generate_secret_key.py` to generate a pepper value.
+
+It is recommended to start with a pepper ID of `1`. Additional peppers can be introduced later as needed to begin rotating token hashes.
+
+!!! tip
+    Although NetBox will run without `API_TOKEN_PEPPERS` defined, the use of v2 API tokens will be unavailable.
+
+---
+
 ## DATABASE

 !!! warning "Legacy Configuration Parameter"
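As a small illustration of the rotation note above, an additional pepper can be registered under a new ID alongside the existing one. The placeholder strings below are not real peppers, and exactly how NetBox selects among multiple peppers is not shown in this changeset; this only shows the shape of the setting.

```python
# Hypothetical rotation step: keep the existing pepper and add a second one
# under a new numeric ID (placeholder values only).
API_TOKEN_PEPPERS = {
    1: '<existing 50+ character random pepper>',
    2: '<newly generated 50+ character random pepper>',
}
```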
@@ -1,16 +1,5 @@
 # Security & Authentication Parameters

-## ALLOW_TOKEN_RETRIEVAL
-
-Default: `False`
-
-!!! note
-    The default value of this parameter changed from `True` to `False` in NetBox v4.3.0.
-
-If disabled, the values of API tokens will not be displayed after each token's initial creation. A user **must** record the value of a token prior to its creation, or it will be lost. Note that this affects _all_ users, regardless of assigned permissions.
-
----
-
 ## ALLOWED_URL_SCHEMES

 !!! tip "Dynamic Configuration Parameter"
@@ -131,17 +131,6 @@ self.log_info(f"Running as user {username} (IP: {ip_address})...")

 For a complete list of available request parameters, please see the [Django documentation](https://docs.djangoproject.com/en/stable/ref/request-response/).

-## Reading Data from Files
-
-The Script class provides two convenience methods for reading data from files:
-
-* `load_yaml`
-* `load_json`
-
-These two methods will load data in YAML or JSON format, respectively, from files within the local path (i.e. `SCRIPTS_ROOT`).
-
-**Note:** These convenience methods are deprecated and will be removed in NetBox v4.4. These only work if running scripts within the local path, they will not work if using a storage other than ScriptFileSystemStorage.
-
 ## Logging

 The Script object provides a set of convenient functions for recording messages at different severity levels:
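For illustration, a minimal custom script using the logging helper that appears in the hunk header above. The class name and message are placeholders; only `log_info` is confirmed by the documentation shown in this diff, and the other severity helpers are assumed to follow the same pattern.

```python
from extras.scripts import Script


class ExampleScript(Script):
    """A minimal sketch of a custom script that records log messages."""

    def run(self, data, commit):
        # log_info is shown in the documentation above; other severity levels
        # use the same self.log_*() calling convention.
        self.log_info("Starting example script...")
        return "Done"
```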
@@ -7,7 +7,7 @@ Getting started with NetBox development is pretty straightforward, and should fe

 * A Linux system or compatible environment
 * A PostgreSQL server, which can be installed locally [per the documentation](../installation/1-postgresql.md)
 * A Redis server, which can also be [installed locally](../installation/2-redis.md)
-* Python 3.10 or later
+* Python 3.12 or later

 ### 1. Fork the Repo
@@ -8,7 +8,7 @@ NetBox's REST API, powered by the [Django REST Framework](https://www.django-res

 ```no-highlight
 curl -s -X POST \
--H "Authorization: Token $TOKEN" \
+-H "Authorization: Bearer $TOKEN" \
 -H "Content-Type: application/json" \
 http://netbox/api/ipam/prefixes/ \
 --data '{"prefix": "192.0.2.0/24", "site": {"name": "Branch 12"}}'
@@ -90,3 +90,10 @@ http://netbox:8000/api/extras/config-templates/123/render/ \
   "bar": 123
 }'
 ```
+
+!!! note "Permissions"
+    Rendering configuration templates via the REST API requires appropriate permissions for the relevant object type:
+
+    * To render a device's configuration via `/api/dcim/devices/{id}/render-config/`, assign a permission for "DCIM > Device" with the `render_config` action.
+    * To render a virtual machine's configuration via `/api/virtualization/virtual-machines/{id}/render-config/`, assign a permission for "Virtualization > Virtual Machine" with the `render_config` action.
+    * To render a config template directly via `/api/extras/config-templates/{id}/render/`, assign a permission for "Extras > Config Template" with the `render` action.
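For illustration, a sketch of calling the device render-config endpoint named above from Python. The host, device ID, and token are placeholders, and the use of POST is an assumption modeled on the config-template render example earlier in this hunk.

```python
import requests

# Render the configuration for device 123 (placeholder ID) via the documented endpoint.
response = requests.post(
    'https://netbox.example.com/api/dcim/devices/123/render-config/',
    headers={
        'Authorization': 'Bearer nbt_<key>.<token>',
        'Accept': 'application/json',
    },
)
response.raise_for_status()
print(response.json())
```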
@@ -34,9 +34,6 @@ Sets the default number of rows displayed on paginated tables.

### Paginator placement
Controls where pagination controls are rendered relative to a table.

### HTMX navigation (experimental)
Enables partial‑page navigation for supported views. Disable this preference if unexpected behavior is observed.

### Striped table rows
Toggles alternating row backgrounds on tables.
@@ -6,8 +6,8 @@ This section of the documentation discusses installing and configuring the NetBo

 Begin by installing all system packages required by NetBox and its dependencies.

-!!! warning "Python 3.10 or later required"
-    NetBox supports Python 3.10, 3.11, and 3.12.
+!!! warning "Python 3.12 or later required"
+    NetBox supports only Python 3.12 or later.

 ```no-highlight
 sudo apt install -y python3 python3-pip python3-venv python3-dev \

@@ -15,7 +15,7 @@ build-essential libxml2-dev libxslt1-dev libffi-dev libpq-dev \
 libssl-dev zlib1g-dev
 ```

-Before continuing, check that your installed Python version is at least 3.10:
+Before continuing, check that your installed Python version is at least 3.12:

 ```no-highlight
 python3 -V
@@ -120,6 +120,23 @@ If you are not yet sure what the domain name and/or IP address of the NetBox ins
 ALLOWED_HOSTS = ['*']
 ```

+### API_TOKEN_PEPPERS
+
+Define at least one random cryptographic pepper, identified by a numeric ID starting at 1. This will be used to generate SHA256 checksums for API tokens.
+
+```python
+API_TOKEN_PEPPERS = {
+    # DO NOT USE THIS EXAMPLE PEPPER IN PRODUCTION
+    1: 'kp7ht*76fiQAhUi5dHfASLlYUE_S^gI^(7J^K5M!LfoH@vl&b_',
+}
+```
+
+!!! tip
+    As with [`SECRET_KEY`](#secret_key) below, you can use the `generate_secret_key.py` script to generate a random pepper:
+
+    ```no-highlight
+    python3 ../generate_secret_key.py
+    ```
+
 ### DATABASES

 This parameter holds the PostgreSQL database configuration details. The default database must be defined; additional databases may be defined as needed e.g. by plugins.
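For illustration only, a sketch of the idea behind the SHA256 checksum mentioned above: a server-side pepper is combined with the token plaintext and hashed, so only the digest needs to be stored. The concatenation order and any additional inputs are assumptions; this changeset does not show NetBox's exact construction.

```python
import hashlib

pepper = 'kp7ht*76fiQAhUi5dHfASLlYUE_S^gI^(7J^K5M!LfoH@vl&b_'  # example value from the docs above
token_plaintext = 'zjebxBPzICiPbWz0Wtx0fTL7bCKXKGTYhNzkgC2S'     # example token value from the REST API docs

# Hash the pepper together with the token plaintext; the digest is irreversible,
# so the plaintext never needs to be stored server-side.
checksum = hashlib.sha256(f'{pepper}{token_plaintext}'.encode()).hexdigest()
print(checksum)
```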
@@ -235,10 +252,10 @@ Once NetBox has been configured, we're ready to proceed with the actual installa
 sudo /opt/netbox/upgrade.sh
 ```

-Note that **Python 3.10 or later is required** for NetBox v4.0 and later releases. If the default Python installation on your server is set to a lesser version, pass the path to the supported installation as an environment variable named `PYTHON`. (Note that the environment variable must be passed _after_ the `sudo` command.)
+Note that **Python 3.12 or later is required** for NetBox v4.5 and later releases. If the default Python installation on your server is set to a lesser version, pass the path to the supported installation as an environment variable named `PYTHON`. (Note that the environment variable must be passed _after_ the `sudo` command.)

 ```no-highlight
-sudo PYTHON=/usr/bin/python3.10 /opt/netbox/upgrade.sh
+sudo PYTHON=/usr/bin/python3.12 /opt/netbox/upgrade.sh
 ```

 !!! note
@@ -60,6 +60,3 @@ You should see output similar to the following:
 If the NetBox service fails to start, issue the command `journalctl -eu netbox` to check for log messages that may indicate the problem.

 Once you've verified that the WSGI workers are up and running, move on to HTTP server setup.
-
-!!! note
-    There is a bug in the current stable release of gunicorn (v21.2.0) where automatic restarts of the worker processes can result in 502 errors under heavy load. (See [gunicorn bug #3038](https://github.com/benoitc/gunicorn/issues/3038) for more detail.) Users who encounter this issue may opt to downgrade to an earlier, unaffected release of gunicorn (`pip install gunicorn==20.1.0`). Note, however, that this earlier release does not officially support Python 3.11.
@@ -121,7 +121,6 @@ AUTH_LDAP_MIRROR_GROUPS = True

 # Define special user types using groups. Exercise great caution when assigning superuser status.
 AUTH_LDAP_USER_FLAGS_BY_GROUP = {
     "is_active": "cn=active,ou=groups,dc=example,dc=com",
-    "is_staff": "cn=staff,ou=groups,dc=example,dc=com",
     "is_superuser": "cn=superuser,ou=groups,dc=example,dc=com"
 }

@@ -134,7 +133,6 @@ AUTH_LDAP_CACHE_TIMEOUT = 3600
 ```

 * `is_active` - All users must be mapped to at least this group to enable authentication. Without this, users cannot log in.
-* `is_staff` - Users mapped to this group are enabled for access to the administration tools; this is the equivalent of checking the "staff status" box on a manually created user. This doesn't grant any specific permissions.
 * `is_superuser` - Users mapped to this group will be granted superuser status. Superusers are implicitly granted all permissions.

 !!! warning

@@ -248,7 +246,6 @@ AUTH_LDAP_MIRROR_GROUPS = True

 # Define special user types using groups. Exercise great caution when assigning superuser status.
 AUTH_LDAP_USER_FLAGS_BY_GROUP = {
     "is_active": "cn=active,ou=groups,dc=example,dc=com",
-    "is_staff": "cn=staff,ou=groups,dc=example,dc=com",
     "is_superuser": "cn=superuser,ou=groups,dc=example,dc=com"
 }
@@ -27,7 +27,7 @@ The following sections detail how to set up a new instance of NetBox:

 | Dependency | Supported Versions |
 |------------|--------------------|
-| Python     | 3.10, 3.11, 3.12   |
+| Python     | 3.12, 3.13, 3.14   |
 | PostgreSQL | 14+                |
 | Redis      | 4.0+               |
@@ -19,7 +19,7 @@ NetBox requires the following dependencies:

 | Dependency | Supported Versions |
 |------------|--------------------|
-| Python     | 3.10, 3.11, 3.12   |
+| Python     | 3.12, 3.13, 3.14   |
 | PostgreSQL | 14+                |
 | Redis      | 4.0+               |

@@ -27,6 +27,7 @@ NetBox requires the following dependencies:

 | NetBox Version | Python min | Python max | PostgreSQL min | Redis min | Documentation |
 |:--------------:|:----------:|:----------:|:--------------:|:---------:|:-----------------------------------------------------------------------------------------:|
+| 4.5            | 3.12       | 3.14       | 14             | 4.0       | [Link](https://github.com/netbox-community/netbox/blob/v4.5.0/docs/installation/index.md)  |
 | 4.4            | 3.10       | 3.12       | 14             | 4.0       | [Link](https://github.com/netbox-community/netbox/blob/v4.4.0/docs/installation/index.md)  |
 | 4.3            | 3.10       | 3.12       | 14             | 4.0       | [Link](https://github.com/netbox-community/netbox/blob/v4.3.0/docs/installation/index.md)  |
 | 4.2            | 3.10       | 3.12       | 13             | 4.0       | [Link](https://github.com/netbox-community/netbox/blob/v4.2.0/docs/installation/index.md)  |

@@ -130,7 +131,7 @@ sudo ./upgrade.sh
 If the default version of Python is not at least 3.10, you'll need to pass the path to a supported Python version as an environment variable when calling the upgrade script. For example:

 ```no-highlight
-sudo PYTHON=/usr/bin/python3.10 ./upgrade.sh
+sudo PYTHON=/usr/bin/python3.12 ./upgrade.sh
 ```

 !!! note
@@ -80,7 +80,7 @@ Likewise, the site, rack, and device objects are located under the "DCIM" applic

 The full hierarchy of available endpoints can be viewed by navigating to the API root in a web browser.

-Each model generally has two views associated with it: a list view and a detail view. The list view is used to retrieve a list of multiple objects and to create new objects. The detail view is used to retrieve, update, or delete an single existing object. All objects are referenced by their numeric primary key (`id`).
+Each model generally has two views associated with it: a list view and a detail view. The list view is used to retrieve a list of multiple objects and to create new objects. The detail view is used to retrieve, update, or delete a single existing object. All objects are referenced by their numeric primary key (`id`).

 * `/api/dcim/devices/` - List existing devices or create a new device
 * `/api/dcim/devices/123/` - Retrieve, update, or delete the device with ID 123
@@ -653,18 +653,22 @@ The NetBox REST API primarily employs token-based authentication. For convenienc

 ### Tokens

-A token is a unique identifier mapped to a NetBox user account. Each user may have one or more tokens which he or she can use for authentication when making REST API requests. To create a token, navigate to the API tokens page under your user profile.
+A token is a secret, unique identifier mapped to a NetBox user account. Each user may have one or more tokens which he or she can use for authentication when making REST API requests. To create a token, navigate to the API tokens page under your user profile. When creating a token, NetBox will automatically populate a randomly-generated token value.
+
+!!! note "Tokens cannot be retrieved once created"
+    Once a token has been created, its plaintext value cannot be retrieved. For this reason, you must take care to securely record the token locally immediately upon its creation. If a token plaintext is lost, it cannot be recovered: A new token must be created.

 By default, all users can create and manage their own REST API tokens under the user control panel in the UI or via the REST API. This ability can be disabled by overriding the [`DEFAULT_PERMISSIONS`](../configuration/security.md#default_permissions) configuration parameter.

-Each token contains a 160-bit key represented as 40 hexadecimal characters. When creating a token, you'll typically leave the key field blank so that a random key will be automatically generated. However, NetBox allows you to specify a key in case you need to restore a previously deleted token to operation.
-
 Additionally, a token can be set to expire at a specific time. This can be useful if an external client needs to be granted temporary access to NetBox.

-!!! info "Restricting Token Retrieval"
-    The ability to retrieve the key value of a previously-created API token can be restricted by disabling the [`ALLOW_TOKEN_RETRIEVAL`](../configuration/security.md#allow_token_retrieval) configuration parameter.
+#### v1 and v2 Tokens

-### Restricting Write Operations
+Beginning with NetBox v4.5, two versions of API token are supported, denoted as v1 and v2. Users are strongly encouraged to create only v2 tokens and to discontinue the use of v1 tokens. Support for v1 tokens will be removed in a future NetBox release.
+
+v2 API tokens offer much stronger security. The token plaintext given at creation time is hashed together with a configured [cryptographic pepper](../configuration/required-parameters.md#api_token_peppers) to generate a unique checksum. This checksum is irreversible; the token plaintext is never stored on the server and thus cannot be retrieved even with database-level access.
+
+#### Restricting Write Operations

 By default, a token can be used to perform all actions via the API that a user would be permitted to do via the web UI. Deselecting the "write enabled" option will restrict API requests made with the token to read operations (e.g. GET) only.
@@ -681,10 +685,22 @@ It is possible to provision authentication tokens for other users via the REST A

 ### Authenticating to the API

-An authentication token is attached to a request by setting the `Authorization` header to the string `Token` followed by a space and the user's token:
+An authentication token is included with a request in its `Authorization` header. The format of the header value depends on the version of token in use. v2 tokens use the following form, concatenating the token's prefix (`nbt_`) and key with its plaintext value, separated by a period:

 ```
-$ curl -H "Authorization: Token $TOKEN" \
+Authorization: Bearer nbt_<key>.<token>
 ```
+
+Legacy v1 tokens use the prefix `Token` rather than `Bearer`, and include only the token plaintext. (v1 tokens do not have a key.)
+
+```
+Authorization: Token <token>
+```
+
+Below is an example REST API request utilizing a v2 token.
+
+```
+$ curl -H "Authorization: Bearer nbt_4F9DAouzURLb.zjebxBPzICiPbWz0Wtx0fTL7bCKXKGTYhNzkgC2S" \
 -H "Accept: application/json; indent=4" \
 https://netbox/api/dcim/sites/
 {
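For illustration, the same request expressed with Python's `requests` library, using the example v2 token value shown above; the hostname is a placeholder.

```python
import requests

token = 'nbt_4F9DAouzURLb.zjebxBPzICiPbWz0Wtx0fTL7bCKXKGTYhNzkgC2S'  # example value from the docs above

response = requests.get(
    'https://netbox/api/dcim/sites/',
    headers={
        'Authorization': f'Bearer {token}',
        'Accept': 'application/json; indent=4',
    },
)
response.raise_for_status()
print(response.json())
```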
@@ -173,12 +173,12 @@ classifiers=[
     'Intended Audience :: Developers',
     'Natural Language :: English',
     "Programming Language :: Python :: 3 :: Only",
-    'Programming Language :: Python :: 3.10',
-    'Programming Language :: Python :: 3.11',
     'Programming Language :: Python :: 3.12',
+    'Programming Language :: Python :: 3.13',
+    'Programming Language :: Python :: 3.14',
 ]

-requires-python = ">=3.10.0"
+requires-python = ">=3.12.0"

 ```
@@ -195,7 +195,7 @@ python3 -m venv ~/.virtualenvs/my_plugin

 You can make NetBox available within this environment by creating a path file pointing to its location. This will add NetBox to the Python path upon activation. (Be sure to adjust the command below to specify your actual virtual environment path, Python version, and NetBox installation.)

 ```shell
-echo /opt/netbox/netbox > $VENV/lib/python3.10/site-packages/netbox.pth
+echo /opt/netbox/netbox > $VENV/lib/python3.12/site-packages/netbox.pth
 ```

 ## Development Installation
@@ -64,14 +64,17 @@ item1 = PluginMenuItem(

 A `PluginMenuItem` has the following attributes:

-| Attribute       | Required | Description                                                                                               |
-|-----------------|----------|-----------------------------------------------------------------------------------------------------------|
-| `link`          | Yes      | Name of the URL path to which this menu item links                                                          |
-| `link_text`     | Yes      | The text presented to the user                                                                              |
-| `permissions`   | -        | A list of permissions required to display this link                                                         |
-| `auth_required` | -        | Display only for authenticated users                                                                        |
-| `staff_only`    | -        | Display only for users who have `is_staff` set to true (any specified permissions will also be required)    |
-| `buttons`       | -        | An iterable of PluginMenuButton instances to include                                                        |
+| Attribute       | Required | Description                                            |
+|-----------------|----------|--------------------------------------------------------|
+| `link`          | Yes      | Name of the URL path to which this menu item links     |
+| `link_text`     | Yes      | The text presented to the user                         |
+| `permissions`   | -        | A list of permissions required to display this link    |
+| `auth_required` | -        | Display only for authenticated users                   |
+| `staff_only`    | -        | Display only for superusers                            |
+| `buttons`       | -        | An iterable of PluginMenuButton instances to include   |
+
+!!! note "Changed in NetBox v4.5"
+    In releases prior to NetBox v4.5, `staff_only` restricted display of a menu item to only users with `is_staff` set to True. In NetBox v4.5, the `is_staff` flag was removed from the user model. Menu items with `staff_only` set to True are now displayed only for superusers.

 ## Menu Buttons
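For illustration, a sketch of a menu item and button using the attributes tabulated above. The view names, permission, and icon class are placeholders for a real plugin's values, and the `netbox.plugins` import path is assumed from recent NetBox releases rather than shown in this diff.

```python
from netbox.plugins import PluginMenuButton, PluginMenuItem

# Hypothetical button and menu item for a plugin named "my_plugin".
add_widget_button = PluginMenuButton(
    link='plugins:my_plugin:widget_add',
    title='Add',
    icon_class='mdi mdi-plus-thick',
)

item1 = PluginMenuItem(
    link='plugins:my_plugin:widget_list',
    link_text='Widgets',
    permissions=['my_plugin.view_widget'],
    auth_required=True,
    staff_only=False,          # since v4.5: True limits display to superusers
    buttons=(add_widget_button,),
)
```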
@@ -1,57 +0,0 @@
from django.utils.translation import gettext as _

from account.models import UserToken
from netbox.tables import NetBoxTable, columns

__all__ = (
    'UserTokenTable',
)


TOKEN = """<samp><span id="token_{{ record.pk }}">{{ record }}</span></samp>"""

ALLOWED_IPS = """{{ value|join:", " }}"""

COPY_BUTTON = """
{% if settings.ALLOW_TOKEN_RETRIEVAL %}
  {% copy_content record.pk prefix="token_" color="success" %}
{% endif %}
"""


class UserTokenTable(NetBoxTable):
    """
    Table for users to manager their own API tokens under account views.
    """
    key = columns.TemplateColumn(
        verbose_name=_('Key'),
        template_code=TOKEN,
    )
    write_enabled = columns.BooleanColumn(
        verbose_name=_('Write Enabled')
    )
    created = columns.DateTimeColumn(
        timespec='minutes',
        verbose_name=_('Created'),
    )
    expires = columns.DateTimeColumn(
        timespec='minutes',
        verbose_name=_('Expires'),
    )
    last_used = columns.DateTimeColumn(
        verbose_name=_('Last Used'),
    )
    allowed_ips = columns.TemplateColumn(
        verbose_name=_('Allowed IPs'),
        template_code=ALLOWED_IPS
    )
    actions = columns.ActionsColumn(
        actions=('edit', 'delete'),
        extra_buttons=COPY_BUTTON
    )

    class Meta(NetBoxTable.Meta):
        model = UserToken
        fields = (
            'pk', 'id', 'key', 'description', 'write_enabled', 'created', 'expires', 'last_used', 'allowed_ips',
        )
@@ -26,8 +26,9 @@ from extras.tables import BookmarkTable, NotificationTable, SubscriptionTable
 from netbox.authentication import get_auth_backend_display, get_saml_idps
 from netbox.config import get_config
 from netbox.views import generic
-from users import forms, tables
+from users import forms
 from users.models import UserConfig
+from users.tables import TokenTable
 from utilities.request import safe_for_redirect
 from utilities.string import remove_linebreaks
 from utilities.views import register_model_view

@@ -328,7 +329,8 @@ class UserTokenListView(LoginRequiredMixin, View):

     def get(self, request):
         tokens = UserToken.objects.filter(user=request.user)
-        table = tables.UserTokenTable(tokens)
+        table = TokenTable(tokens)
+        table.columns.hide('user')
         table.configure(request)

         return render(request, 'account/token_list.html', {

@@ -343,11 +345,9 @@ class UserTokenView(LoginRequiredMixin, View):

     def get(self, request, pk):
         token = get_object_or_404(UserToken.objects.filter(user=request.user), pk=pk)
-        key = token.key if settings.ALLOW_TOKEN_RETRIEVAL else None

         return render(request, 'account/token.html', {
             'object': token,
-            'key': key,
         })
netbox/circuits/migrations/0053_ci_collations.py (new file, 97 lines)

@@ -0,0 +1,97 @@
from django.db import migrations, models

PATTERN_OPS_INDEXES = [
    'circuits_circuitgroup_name_ec8ac1e5_like',
    'circuits_circuitgroup_slug_61ca866b_like',
    'circuits_circuittype_name_8256ea9a_like',
    'circuits_circuittype_slug_9b4b3cf9_like',
    'circuits_provider_name_8f2514f5_like',
    'circuits_provider_slug_c3c0aa10_like',
    'circuits_virtualcircuittype_name_5184db16_like',
    'circuits_virtualcircuittype_slug_75d5c661_like',
]


def remove_indexes(apps, schema_editor):
    for idx in PATTERN_OPS_INDEXES:
        schema_editor.execute(f'DROP INDEX IF EXISTS {idx}')


class Migration(migrations.Migration):

    dependencies = [
        ('circuits', '0052_extend_circuit_abs_distance_upper_limit'),
        ('dcim', '0217_ci_collations'),
    ]

    operations = [
        migrations.RunPython(
            code=remove_indexes,
            reverse_code=migrations.RunPython.noop,
        ),
        migrations.AlterField(
            model_name='circuit',
            name='cid',
            field=models.CharField(db_collation='case_insensitive', max_length=100),
        ),
        migrations.AlterField(
            model_name='circuitgroup',
            name='name',
            field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
        ),
        migrations.AlterField(
            model_name='circuitgroup',
            name='slug',
            field=models.SlugField(db_collation='case_insensitive', max_length=100, unique=True),
        ),
        migrations.AlterField(
            model_name='circuittype',
            name='name',
            field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
        ),
        migrations.AlterField(
            model_name='circuittype',
            name='slug',
            field=models.SlugField(db_collation='case_insensitive', max_length=100, unique=True),
        ),
        migrations.AlterField(
            model_name='provider',
            name='name',
            field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
        ),
        migrations.AlterField(
            model_name='provider',
            name='slug',
            field=models.SlugField(db_collation='case_insensitive', max_length=100, unique=True),
        ),
        migrations.AlterField(
            model_name='provideraccount',
            name='account',
            field=models.CharField(db_collation='ci_natural_sort', max_length=100),
        ),
        migrations.AlterField(
            model_name='provideraccount',
            name='name',
            field=models.CharField(blank=True, db_collation='ci_natural_sort', max_length=100),
        ),
        migrations.AlterField(
            model_name='providernetwork',
            name='name',
            field=models.CharField(db_collation='ci_natural_sort', max_length=100),
        ),
        migrations.AlterField(
            model_name='virtualcircuit',
            name='cid',
            field=models.CharField(db_collation='case_insensitive', max_length=100),
        ),
        migrations.AlterField(
            model_name='virtualcircuittype',
            name='name',
            field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
        ),
        migrations.AlterField(
            model_name='virtualcircuittype',
            name='slug',
            field=models.SlugField(db_collation='case_insensitive', max_length=100, unique=True),
        ),
    ]
@@ -41,9 +41,10 @@ class Circuit(ContactsMixin, ImageAttachmentsMixin, DistanceMixin, PrimaryModel)
     ProviderAccount. Circuit port speed and commit rate are measured in Kbps.
     """
     cid = models.CharField(
-        max_length=100,
         verbose_name=_('circuit ID'),
-        help_text=_('Unique circuit ID')
+        max_length=100,
+        db_collation='case_insensitive',
+        help_text=_('Unique circuit ID'),
     )
     provider = models.ForeignKey(
         to='circuits.Provider',
@@ -21,13 +21,14 @@ class Provider(ContactsMixin, PrimaryModel):
         verbose_name=_('name'),
         max_length=100,
         unique=True,
+        db_collation='ci_natural_sort',
         help_text=_('Full name of the provider'),
-        db_collation="natural_sort"
     )
     slug = models.SlugField(
         verbose_name=_('slug'),
         max_length=100,
-        unique=True
+        unique=True,
+        db_collation='case_insensitive',
     )
     asns = models.ManyToManyField(
         to='ipam.ASN',

@@ -56,13 +57,15 @@ class ProviderAccount(ContactsMixin, PrimaryModel):
         related_name='accounts'
     )
     account = models.CharField(
+        verbose_name=_('account ID'),
         max_length=100,
-        verbose_name=_('account ID')
+        db_collation='ci_natural_sort',
     )
     name = models.CharField(
         verbose_name=_('name'),
         max_length=100,
-        blank=True
+        db_collation='ci_natural_sort',
+        blank=True,
     )

     clone_fields = ('provider', )

@@ -97,7 +100,7 @@ class ProviderNetwork(PrimaryModel):
     name = models.CharField(
         verbose_name=_('name'),
         max_length=100,
-        db_collation="natural_sort"
+        db_collation='ci_natural_sort',
     )
     provider = models.ForeignKey(
         to='circuits.Provider',
@@ -34,9 +34,10 @@ class VirtualCircuit(PrimaryModel):
     A virtual connection between two or more endpoints, delivered across one or more physical circuits.
     """
     cid = models.CharField(
-        max_length=100,
         verbose_name=_('circuit ID'),
-        help_text=_('Unique circuit ID')
+        max_length=100,
+        db_collation='case_insensitive',
+        help_text=_('Unique circuit ID'),
     )
     provider_network = models.ForeignKey(
         to='circuits.ProviderNetwork',
@@ -9,7 +9,6 @@ from drf_spectacular.utils import OpenApiParameter, extend_schema
 from rest_framework import viewsets
 from rest_framework.decorators import action
 from rest_framework.exceptions import PermissionDenied
-from rest_framework.permissions import IsAdminUser
 from rest_framework.response import Response
 from rest_framework.routers import APIRootView
 from rest_framework.viewsets import ReadOnlyModelViewSet

@@ -24,7 +23,7 @@ from netbox.api.authentication import IsAuthenticatedOrLoginNotRequired
 from netbox.api.metadata import ContentTypeMetadata
 from netbox.api.pagination import LimitOffsetListPagination
 from netbox.api.viewsets import NetBoxModelViewSet, NetBoxReadOnlyModelViewSet

+from utilities.api import IsSuperuser
 from . import serializers

@@ -100,7 +99,7 @@ class BaseRQViewSet(viewsets.ViewSet):
     """
     Base class for RQ view sets. Provides a list() method. Subclasses must implement get_data().
     """
-    permission_classes = [IsAdminUser]
+    permission_classes = [IsSuperuser]
     serializer_class = None

     def get_data(self):
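The diff above swaps DRF's `IsAdminUser` for a project-level `IsSuperuser` permission imported from `utilities.api`. Its implementation is not shown in this changeset; a minimal sketch of such a class, assuming it follows the standard DRF `BasePermission` pattern, might look like this.

```python
from rest_framework.permissions import BasePermission


class IsSuperuser(BasePermission):
    """Allow access only to active superusers (sketch; not the actual utilities.api code)."""

    def has_permission(self, request, view):
        return bool(request.user and request.user.is_active and request.user.is_superuser)
```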
netbox/core/migrations/0020_ci_collations.py (new file, 30 lines)

@@ -0,0 +1,30 @@
from django.db import migrations, models

PATTERN_OPS_INDEXES = [
    'core_datasource_name_17788499_like',
]


def remove_indexes(apps, schema_editor):
    for idx in PATTERN_OPS_INDEXES:
        schema_editor.execute(f'DROP INDEX IF EXISTS {idx}')


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0019_configrevision_active'),
        ('dcim', '0217_ci_collations'),
    ]

    operations = [
        migrations.RunPython(
            code=remove_indexes,
            reverse_code=migrations.RunPython.noop,
        ),
        migrations.AlterField(
            model_name='datasource',
            name='name',
            field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
        ),
    ]
@@ -1,3 +0,0 @@
-# TODO: Remove this module in NetBox v4.5
-# Provided for backward compatibility
-from .object_types import *
@@ -38,7 +38,8 @@ class DataSource(JobsMixin, PrimaryModel):
     name = models.CharField(
         verbose_name=_('name'),
         max_length=100,
-        unique=True
+        unique=True,
+        db_collation='ci_natural_sort',
     )
     type = models.CharField(
         verbose_name=_('type'),
@@ -8,6 +8,7 @@ from rq.job import Job as RQ_Job, JobStatus
 from rq.registry import FailedJobRegistry, StartedJobRegistry

 from rest_framework import status
+from users.constants import TOKEN_PREFIX
 from users.models import Token, User
 from utilities.testing import APITestCase, APIViewTestCases, TestCase
 from utilities.testing.utils import disable_logging

@@ -107,14 +108,14 @@ class ObjectTypeTest(APITestCase):
     def test_list_objects(self):
         object_type_count = ObjectType.objects.count()

-        response = self.client.get(reverse('extras-api:objecttype-list'), **self.header)
+        response = self.client.get(reverse('core-api:objecttype-list'), **self.header)
         self.assertHttpStatus(response, status.HTTP_200_OK)
         self.assertEqual(response.data['count'], object_type_count)

     def test_get_object(self):
         object_type = ObjectType.objects.first()

-        url = reverse('extras-api:objecttype-detail', kwargs={'pk': object_type.pk})
+        url = reverse('core-api:objecttype-detail', kwargs={'pk': object_type.pk})
         self.assertHttpStatus(self.client.get(url, **self.header), status.HTTP_200_OK)

@@ -134,12 +135,9 @@
         Create a user and token for API calls.
         """
         # Create the test user and assign permissions
-        self.user = User.objects.create_user(username='testuser')
-        self.user.is_staff = True
-        self.user.is_active = True
-        self.user.save()
+        self.user = User.objects.create_user(username='testuser', is_active=True)
         self.token = Token.objects.create(user=self.user)
-        self.header = {'HTTP_AUTHORIZATION': f'Token {self.token.key}'}
+        self.header = {'HTTP_AUTHORIZATION': f'Bearer {TOKEN_PREFIX}{self.token.key}.{self.token.token}'}

         # Clear all queues prior to running each test
         get_queue('default').connection.flushall()

@@ -150,13 +148,11 @@
         url = reverse('core-api:rqqueue-list')

         # Attempt to load view without permission
-        self.user.is_staff = False
-        self.user.save()
         response = self.client.get(url, **self.header)
         self.assertEqual(response.status_code, 403)

         # Load view with permission
-        self.user.is_staff = True
+        self.user.is_superuser = True
         self.user.save()
         response = self.client.get(url, **self.header)
         self.assertEqual(response.status_code, 200)

@@ -165,7 +161,16 @@
         self.assertIn('low', str(response.content))

     def test_background_queue(self):
-        response = self.client.get(reverse('core-api:rqqueue-detail', args=['default']), **self.header)
+        url = reverse('core-api:rqqueue-detail', args=['default'])
+
+        # Attempt to load view without permission
+        response = self.client.get(url, **self.header)
+        self.assertEqual(response.status_code, 403)
+
+        # Load view with permission
+        self.user.is_superuser = True
+        self.user.save()
+        response = self.client.get(url, **self.header)
         self.assertEqual(response.status_code, 200)
         self.assertIn('default', str(response.content))
         self.assertIn('oldest_job_timestamp', str(response.content))

@@ -174,8 +179,16 @@
     def test_background_task_list(self):
         queue = get_queue('default')
         queue.enqueue(self.dummy_job_default)
+        url = reverse('core-api:rqtask-list')

-        response = self.client.get(reverse('core-api:rqtask-list'), **self.header)
+        # Attempt to load view without permission
+        response = self.client.get(url, **self.header)
+        self.assertEqual(response.status_code, 403)
+
+        # Load view with permission
+        self.user.is_superuser = True
+        self.user.save()
+        response = self.client.get(url, **self.header)
         self.assertEqual(response.status_code, 200)
         self.assertIn('origin', str(response.content))
         self.assertIn('core.tests.test_api.BackgroundTaskTestCase.dummy_job_default()', str(response.content))

@@ -183,8 +196,16 @@
     def test_background_task(self):
         queue = get_queue('default')
         job = queue.enqueue(self.dummy_job_default)
+        url = reverse('core-api:rqtask-detail', args=[job.id])

-        response = self.client.get(reverse('core-api:rqtask-detail', args=[job.id]), **self.header)
+        # Attempt to load view without permission
+        response = self.client.get(url, **self.header)
+        self.assertEqual(response.status_code, 403)
+
+        # Load view with permission
+        self.user.is_superuser = True
+        self.user.save()
+        response = self.client.get(url, **self.header)
         self.assertEqual(response.status_code, 200)
         self.assertIn(str(job.id), str(response.content))
         self.assertIn('origin', str(response.content))

@@ -194,45 +215,65 @@
     def test_background_task_delete(self):
         queue = get_queue('default')
         job = queue.enqueue(self.dummy_job_default)
+        url = reverse('core-api:rqtask-delete', args=[job.id])

-        response = self.client.post(reverse('core-api:rqtask-delete', args=[job.id]), **self.header)
+        # Attempt to load view without permission
+        response = self.client.get(url, **self.header)
+        self.assertEqual(response.status_code, 403)
+
+        # Load view with permission
+        self.user.is_superuser = True
+        self.user.save()
+        response = self.client.post(url, **self.header)
         self.assertEqual(response.status_code, 200)
         self.assertFalse(RQ_Job.exists(job.id, connection=queue.connection))
         queue = get_queue('default')
         self.assertNotIn(job.id, queue.job_ids)

     def test_background_task_requeue(self):
-        queue = get_queue('default')
-
         # Enqueue & run a job that will fail
+        queue = get_queue('default')
         job = queue.enqueue(self.dummy_job_failing)
         worker = get_worker('default')
         with disable_logging():
             worker.work(burst=True)
         self.assertTrue(job.is_failed)
+        url = reverse('core-api:rqtask-requeue', args=[job.id])
+
+        # Attempt to requeue the job without permission
+        response = self.client.post(url, **self.header)
+        self.assertEqual(response.status_code, 403)

         # Re-enqueue the failed job and check that its status has been reset
-        response = self.client.post(reverse('core-api:rqtask-requeue', args=[job.id]), **self.header)
+        self.user.is_superuser = True
+        self.user.save()
+        response = self.client.post(url, **self.header)
         self.assertEqual(response.status_code, 200)
         job = RQ_Job.fetch(job.id, queue.connection)
         self.assertFalse(job.is_failed)

     def test_background_task_enqueue(self):
-        queue = get_queue('default')
-
         # Enqueue some jobs that each depends on its predecessor
+        queue = get_queue('default')
         job = previous_job = None
         for _ in range(0, 3):
             job = queue.enqueue(self.dummy_job_default, depends_on=previous_job)
             previous_job = job
+        url = reverse('core-api:rqtask-enqueue', args=[job.id])

         # Check that the last job to be enqueued has a status of deferred
         self.assertIsNotNone(job)
         self.assertEqual(job.get_status(), JobStatus.DEFERRED)
         self.assertIsNone(job.enqueued_at)

+        # Attempt to force-enqueue the job without permission
+        response = self.client.post(url, **self.header)
+        self.assertEqual(response.status_code, 403)
+
         # Force-enqueue the deferred job
-        response = self.client.post(reverse('core-api:rqtask-enqueue', args=[job.id]), **self.header)
+        self.user.is_superuser = True
+        self.user.save()
+        response = self.client.post(url, **self.header)
         self.assertEqual(response.status_code, 200)

         # Check that job's status is updated correctly

@@ -242,19 +283,27 @@
     def test_background_task_stop(self):
         queue = get_queue('default')

         worker = get_worker('default')
         job = queue.enqueue(self.dummy_job_default)
         worker.prepare_job_execution(job)

+        url = reverse('core-api:rqtask-stop', args=[job.id])
         self.assertEqual(job.get_status(), JobStatus.STARTED)
-        response = self.client.post(reverse('core-api:rqtask-stop', args=[job.id]), **self.header)
+
+        # Attempt to stop the task without permission
+        response = self.client.post(url, **self.header)
+        self.assertEqual(response.status_code, 403)
+
+        # Stop the task
+        self.user.is_superuser = True
+        self.user.save()
+        response = self.client.post(url, **self.header)
         self.assertEqual(response.status_code, 200)
         with disable_logging():
             worker.monitor_work_horse(job, queue)  # Sets the job as Failed and removes from Started
         started_job_registry = StartedJobRegistry(queue.name, connection=queue.connection)
         self.assertEqual(len(started_job_registry), 0)

         # Verify that the task was cancelled
         canceled_job_registry = FailedJobRegistry(queue.name, connection=queue.connection)
         self.assertEqual(len(canceled_job_registry), 1)
         self.assertIn(job.id, canceled_job_registry)

@@ -262,19 +311,34 @@
     def test_worker_list(self):
         worker1 = get_worker('default', name=uuid.uuid4().hex)
         worker1.register_birth()

         worker2 = get_worker('high')
         worker2.register_birth()
+        url = reverse('core-api:rqworker-list')

-        response = self.client.get(reverse('core-api:rqworker-list'), **self.header)
+        # Attempt to fetch the worker list without permission
+        response = self.client.get(url, **self.header)
+        self.assertEqual(response.status_code, 403)
+
+        # Fetch the worker list
+        self.user.is_superuser = True
+        self.user.save()
+        response = self.client.get(url, **self.header)
         self.assertEqual(response.status_code, 200)
         self.assertIn(str(worker1.name), str(response.content))

     def test_worker(self):
         worker1 = get_worker('default', name=uuid.uuid4().hex)
         worker1.register_birth()
+        url = reverse('core-api:rqworker-detail', args=[worker1.name])

-        response = self.client.get(reverse('core-api:rqworker-detail', args=[worker1.name]), **self.header)
+        # Attempt to fetch a worker without permission
+        response = self.client.get(url, **self.header)
+        self.assertEqual(response.status_code, 403)
+
+        # Fetch the worker
+        self.user.is_superuser = True
+        self.user.save()
+        response = self.client.get(url, **self.header)
         self.assertEqual(response.status_code, 200)
         self.assertIn(str(worker1.name), str(response.content))
         self.assertIn('birth_date', str(response.content))
@@ -158,7 +158,7 @@

     def setUp(self):
         super().setUp()
-        self.user.is_staff = True
+        self.user.is_superuser = True
         self.user.is_active = True
         self.user.save()

@@ -171,13 +171,13 @@
         url = reverse('core:background_queue_list')

         # Attempt to load view without permission
-        self.user.is_staff = False
+        self.user.is_superuser = False
         self.user.save()
         response = self.client.get(url)
         self.assertEqual(response.status_code, 403)

         # Load view with permission
-        self.user.is_staff = True
+        self.user.is_superuser = True
         self.user.save()
         response = self.client.get(url)
         self.assertEqual(response.status_code, 200)
@@ -356,7 +356,7 @@ class SystemTestCase(TestCase):
     def setUp(self):
         super().setUp()

-        self.user.is_staff = True
+        self.user.is_superuser = True
         self.user.save()

     def test_system_view_default(self):
@@ -372,7 +372,7 @@ class ConfigRevisionRestoreView(ContentTypePermissionRequiredMixin, View):

 class BaseRQView(UserPassesTestMixin, View):

     def test_func(self):
-        return self.request.user.is_staff
+        return self.request.user.is_superuser


 class BackgroundQueueListView(TableMixin, BaseRQView):

@@ -555,7 +555,7 @@ class WorkerView(BaseRQView):

 class SystemView(UserPassesTestMixin, View):

     def test_func(self):
-        return self.request.user.is_staff
+        return self.request.user.is_superuser

     def get(self, request):

@@ -638,7 +638,7 @@ class BasePluginView(UserPassesTestMixin, View):
     CACHE_KEY_CATALOG_ERROR = 'plugins-catalog-error'

     def test_func(self):
-        return self.request.user.is_staff
+        return self.request.user.is_superuser

     def get_cached_plugins(self, request):
         catalog_plugins = {}
@@ -1,10 +1,8 @@
from django.contrib.contenttypes.models import ContentType
from drf_spectacular.types import OpenApiTypes
from drf_spectacular.utils import extend_schema_field
from rest_framework import serializers

from dcim.choices import *
from dcim.constants import *
from dcim.models import Cable, CablePath, CableTermination
from netbox.api.fields import ChoiceField, ContentTypeField
from netbox.api.serializers import BaseModelSerializer, GenericObjectSerializer, NetBoxModelSerializer

@@ -51,9 +49,11 @@ class TracedCableSerializer(BaseModelSerializer):

 class CableTerminationSerializer(NetBoxModelSerializer):
     termination_type = ContentTypeField(
-        queryset=ContentType.objects.filter(CABLE_TERMINATION_MODELS)
+        read_only=True,
+    )
+    termination = serializers.SerializerMethodField(
+        read_only=True,
     )
-    termination = serializers.SerializerMethodField(read_only=True)

     class Meta:
         model = CableTermination

@@ -61,6 +61,8 @@ class CableTerminationSerializer(NetBoxModelSerializer):
             'id', 'url', 'display', 'cable', 'cable_end', 'termination_type', 'termination_id',
             'termination', 'created', 'last_updated',
         ]
+        read_only_fields = fields
         brief_fields = ('id', 'url', 'display', 'cable', 'cable_end', 'termination_type', 'termination_id')

     @extend_schema_field(serializers.JSONField(allow_null=True))
     def get_termination(self, obj):
@@ -155,7 +155,7 @@ class PowerOutletTemplateSerializer(ComponentTemplateSerializer):
         model = PowerOutletTemplate
         fields = [
             'id', 'url', 'display', 'device_type', 'module_type', 'name', 'label', 'type',
-            'power_port', 'feed_leg', 'description', 'created', 'last_updated',
+            'color', 'power_port', 'feed_leg', 'description', 'created', 'last_updated',
         ]
         brief_fields = ('id', 'url', 'display', 'name', 'description')
@@ -16,7 +16,7 @@ from extras.api.mixins import ConfigContextQuerySetMixin, RenderConfigMixin
 from netbox.api.authentication import IsAuthenticatedOrLoginNotRequired
 from netbox.api.metadata import ContentTypeMetadata
 from netbox.api.pagination import StripCountAnnotationsPaginator
-from netbox.api.viewsets import NetBoxModelViewSet, MPTTLockedMixin
+from netbox.api.viewsets import NetBoxModelViewSet, MPTTLockedMixin, NetBoxReadOnlyModelViewSet
 from netbox.api.viewsets.mixins import SequentialBulkCreatesMixin
 from utilities.api import get_serializer_for_model
 from utilities.query_functions import CollateAsChar

@@ -563,7 +563,7 @@ class CableViewSet(NetBoxModelViewSet):
     filterset_class = filtersets.CableFilterSet


-class CableTerminationViewSet(NetBoxModelViewSet):
+class CableTerminationViewSet(NetBoxReadOnlyModelViewSet):
     metadata_class = ContentTypeMetadata
     queryset = CableTermination.objects.all()
     serializer_class = serializers.CableTerminationSerializer
@@ -842,7 +842,7 @@ class PowerOutletTemplateFilterSet(ChangeLoggedModelFilterSet, ModularDeviceType

     class Meta:
         model = PowerOutletTemplate
-        fields = ('id', 'name', 'label', 'type', 'feed_leg', 'description')
+        fields = ('id', 'name', 'label', 'type', 'color', 'feed_leg', 'description')


 class InterfaceTemplateFilterSet(ChangeLoggedModelFilterSet, ModularDeviceTypeComponentFilterSet):
@@ -1163,6 +1163,10 @@ class PowerOutletTemplateBulkEditForm(ComponentTemplateBulkEditForm):
         choices=add_blank_choice(PowerOutletTypeChoices),
         required=False
     )
+    color = ColorField(
+        label=_('Color'),
+        required=False
+    )
     power_port = forms.ModelChoiceField(
         label=_('Power port'),
         queryset=PowerPortTemplate.objects.all(),
@@ -1092,14 +1092,14 @@ class PowerOutletTemplateForm(ModularComponentTemplateForm):
             FieldSet('device_type', name=_('Device Type')),
             FieldSet('module_type', name=_('Module Type')),
         ),
-        'name', 'label', 'type', 'power_port', 'feed_leg', 'description',
+        'name', 'label', 'type', 'color', 'power_port', 'feed_leg', 'description',
         ),
     )

     class Meta:
         model = PowerOutletTemplate
         fields = [
-            'device_type', 'module_type', 'name', 'label', 'type', 'power_port', 'feed_leg', 'description',
+            'device_type', 'module_type', 'name', 'label', 'type', 'color', 'power_port', 'feed_leg', 'description',
         ]
@@ -673,6 +673,7 @@ class PowerOutletType(ModularComponentType, CabledObjectMixin, PathEndpointMixin
 )
 class PowerOutletTemplateType(ModularComponentTemplateType):
     power_port: Annotated["PowerPortTemplateType", strawberry.lazy('dcim.graphql.types')] | None
+    color: str


 @strawberry_django.type(
netbox/dcim/migrations/0216_poweroutlettemplate_color.py (new file, 17 lines)

@@ -0,0 +1,17 @@
import utilities.fields
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('dcim', '0215_rackreservation_status'),
    ]

    operations = [
        migrations.AddField(
            model_name='poweroutlettemplate',
            name='color',
            field=utilities.fields.ColorField(blank=True, max_length=6),
        ),
    ]
netbox/dcim/migrations/0217_ci_collations.py (new file, 26 lines)

@@ -0,0 +1,26 @@
from django.contrib.postgres.operations import CreateCollation
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('dcim', '0216_poweroutlettemplate_color'),
    ]

    operations = [
        # Create a case-insensitive collation
        CreateCollation(
            'case_insensitive',
            provider='icu',
            locale='und-u-ks-level2',
            deterministic=False,
        ),
        # Create a case-insensitive collation with natural sorting
        CreateCollation(
            'ci_natural_sort',
            provider='icu',
            locale='und-u-kn-true-ks-level2',
            deterministic=False,
        ),
    ]
netbox/dcim/migrations/0218_ci_collations.py (new file, 311 lines)

@@ -0,0 +1,311 @@
from django.db import migrations, models

PATTERN_OPS_INDEXES = [
    'dcim_devicerole_slug_7952643b_like',
    'dcim_devicetype_slug_448745bd_like',
    'dcim_inventoryitemrole_name_4c8cfe6d_like',
    'dcim_inventoryitemrole_slug_3556c227_like',
    'dcim_location_slug_352c5472_like',
    'dcim_manufacturer_name_841fcd92_like',
    'dcim_manufacturer_slug_00430749_like',
    'dcim_moduletypeprofile_name_1709c36e_like',
    'dcim_platform_slug_b0908ae4_like',
    'dcim_rackrole_name_9077cfcc_like',
    'dcim_rackrole_slug_40bbcd3a_like',
    'dcim_racktype_slug_6bbb384a_like',
    'dcim_region_slug_ff078a66_like',
    'dcim_site_name_8fe66c76_like',
    'dcim_site_slug_4412c762_like',
    'dcim_sitegroup_slug_a11d2b04_like',
]


def remove_indexes(apps, schema_editor):
    for idx in PATTERN_OPS_INDEXES:
        schema_editor.execute(f'DROP INDEX IF EXISTS {idx}')


class Migration(migrations.Migration):
    dependencies = [
        ('dcim', '0217_ci_collations'),
        ('extras', '0134_ci_collations'),
        ('ipam', '0083_ci_collations'),
        ('tenancy', '0021_ci_collations'),
        ('virtualization', '0048_populate_mac_addresses'),
    ]

    operations = [
        migrations.RunPython(
            code=remove_indexes,
            reverse_code=migrations.RunPython.noop,
        ),
        migrations.RemoveConstraint(
            model_name='device',
            name='dcim_device_unique_name_site_tenant',
        ),
        migrations.RemoveConstraint(
            model_name='device',
            name='dcim_device_unique_name_site',
        ),
        migrations.AlterField(
            model_name='consoleport',
            name='name',
            field=models.CharField(db_collation='ci_natural_sort', max_length=64),
        ),
        migrations.AlterField(
            model_name='consoleporttemplate',
            name='name',
            field=models.CharField(db_collation='ci_natural_sort', max_length=64),
        ),
        migrations.AlterField(
            model_name='consoleserverport',
            name='name',
            field=models.CharField(db_collation='ci_natural_sort', max_length=64),
        ),
        migrations.AlterField(
            model_name='consoleserverporttemplate',
            name='name',
            field=models.CharField(db_collation='ci_natural_sort', max_length=64),
        ),
        migrations.AlterField(
            model_name='device',
            name='name',
            field=models.CharField(blank=True, db_collation='ci_natural_sort', max_length=64, null=True),
        ),
        migrations.AlterField(
            model_name='devicebay',
            name='name',
            field=models.CharField(db_collation='ci_natural_sort', max_length=64),
        ),
        migrations.AlterField(
            model_name='devicebaytemplate',
            name='name',
            field=models.CharField(db_collation='ci_natural_sort', max_length=64),
        ),
        migrations.AlterField(
            model_name='devicerole',
            name='name',
            field=models.CharField(db_collation='ci_natural_sort', max_length=100),
        ),
        migrations.AlterField(
            model_name='devicerole',
            name='slug',
            field=models.SlugField(db_collation='case_insensitive', max_length=100),
        ),
        migrations.AlterField(
            model_name='devicetype',
            name='model',
            field=models.CharField(db_collation='case_insensitive', max_length=100),
        ),
        migrations.AlterField(
            model_name='devicetype',
            name='slug',
            field=models.SlugField(db_collation='case_insensitive', max_length=100),
        ),
        migrations.AlterField(
            model_name='frontport',
            name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=64),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='frontporttemplate',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=64),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='interface',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=64),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='interfacetemplate',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=64),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='inventoryitem',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=64),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='inventoryitemrole',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='inventoryitemrole',
|
||||
name='slug',
|
||||
field=models.SlugField(db_collation='case_insensitive', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='inventoryitemtemplate',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=64),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='location',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='location',
|
||||
name='slug',
|
||||
field=models.SlugField(db_collation='case_insensitive', max_length=100),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='manufacturer',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='manufacturer',
|
||||
name='slug',
|
||||
field=models.SlugField(db_collation='case_insensitive', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='modulebay',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=64),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='modulebaytemplate',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=64),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='moduletype',
|
||||
name='model',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='moduletypeprofile',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='platform',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='platform',
|
||||
name='slug',
|
||||
field=models.SlugField(db_collation='case_insensitive', max_length=100),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='powerfeed',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='poweroutlet',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=64),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='poweroutlettemplate',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=64),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='powerpanel',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='powerport',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=64),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='powerporttemplate',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=64),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='rack',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='rackrole',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='rackrole',
|
||||
name='slug',
|
||||
field=models.SlugField(db_collation='case_insensitive', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='racktype',
|
||||
name='model',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='racktype',
|
||||
name='slug',
|
||||
field=models.SlugField(db_collation='case_insensitive', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='rearport',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=64),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='rearporttemplate',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=64),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='region',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='region',
|
||||
name='slug',
|
||||
field=models.SlugField(db_collation='case_insensitive', max_length=100),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='site',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='site',
|
||||
name='slug',
|
||||
field=models.SlugField(db_collation='case_insensitive', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='sitegroup',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='sitegroup',
|
||||
name='slug',
|
||||
field=models.SlugField(db_collation='case_insensitive', max_length=100),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='virtualdevicecontext',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=64),
|
||||
),
|
||||
migrations.AddConstraint(
|
||||
model_name='device',
|
||||
constraint=models.UniqueConstraint(
|
||||
models.F('name'), models.F('site'), models.F('tenant'), name='dcim_device_unique_name_site_tenant'
|
||||
),
|
||||
),
|
||||
migrations.AddConstraint(
|
||||
model_name='device',
|
||||
constraint=models.UniqueConstraint(
|
||||
models.F('name'),
|
||||
models.F('site'),
|
||||
condition=models.Q(('tenant__isnull', True)),
|
||||
name='dcim_device_unique_name_site',
|
||||
violation_error_message='Device name must be unique per site.',
|
||||
),
|
||||
),
|
||||
]
|
||||
@@ -43,10 +43,10 @@ class ComponentTemplateModel(ChangeLoggedModel, TrackingModelMixin):
     name = models.CharField(
         verbose_name=_('name'),
         max_length=64,
+        db_collation='ci_natural_sort',
         help_text=_(
             "{module} is accepted as a substitution for the module bay position when attached to a module type."
         ),
-        db_collation="natural_sort"
     )
     label = models.CharField(
         verbose_name=_('label'),
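For context, ci_natural_sort (ICU locale und-u-kn-true-ks-level2) gives component names the ordering users expect: digit runs compare numerically and case is ignored. A rough Python approximation of that ordering, purely for illustration — the real work is done by PostgreSQL via the column collation:

    import re

    def natural_key(value):
        # Digit runs compare as integers, everything else case-insensitively --
        # an approximation of the 'und-u-kn-true-ks-level2' ICU collation.
        return [int(part) if part.isdigit() else part.lower() for part in re.split(r'(\d+)', value)]

    names = ['eth10', 'Eth2', 'eth1', 'ETH20']
    sorted(names)                    # ['ETH20', 'Eth2', 'eth1', 'eth10'] (plain byte-wise ordering)
    sorted(names, key=natural_key)   # ['eth1', 'Eth2', 'eth10', 'ETH20']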
@@ -339,6 +339,10 @@ class PowerOutletTemplate(ModularComponentTemplateModel):
         blank=True,
         null=True
     )
+    color = ColorField(
+        verbose_name=_('color'),
+        blank=True
+    )
     power_port = models.ForeignKey(
         to='dcim.PowerPortTemplate',
         on_delete=models.SET_NULL,
@@ -389,6 +393,7 @@ class PowerOutletTemplate(ModularComponentTemplateModel):
             name=self.resolve_name(kwargs.get('module')),
             label=self.resolve_label(kwargs.get('module')),
             type=self.type,
+            color=self.color,
             power_port=power_port,
             feed_leg=self.feed_leg,
             **kwargs
@@ -399,6 +404,7 @@ class PowerOutletTemplate(ModularComponentTemplateModel):
         return {
             'name': self.name,
             'type': self.type,
+            'color': self.color,
             'power_port': self.power_port.name if self.power_port else None,
             'feed_leg': self.feed_leg,
             'label': self.label,
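A small sketch of how the new template color is expected to flow through to instantiated outlets; the surrounding objects (device_type, device) are assumed to already exist, and the instantiate(device=...) call pattern follows how other component templates are used:

    outlet_template = PowerOutletTemplate(
        device_type=device_type,        # assumed to exist already
        name='Outlet 1',
        type=PowerOutletTypeChoices.TYPE_IEC_C13,
        color='ff0000',                 # new field added in this changeset
    )
    outlet = outlet_template.instantiate(device=device)   # unsaved PowerOutlet
    outlet.color                                          # 'ff0000', copied by instantiate()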
@@ -52,7 +52,7 @@ class ComponentModel(NetBoxModel):
     name = models.CharField(
         verbose_name=_('name'),
         max_length=64,
-        db_collation="natural_sort"
+        db_collation='ci_natural_sort',
     )
     label = models.CharField(
         verbose_name=_('label'),
@@ -1,8 +1,7 @@
|
||||
import decimal
|
||||
import yaml
|
||||
|
||||
from functools import cached_property
|
||||
|
||||
import yaml
|
||||
from django.contrib.contenttypes.fields import GenericForeignKey, GenericRelation
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
from django.core.exceptions import ValidationError
|
||||
@@ -10,7 +9,6 @@ from django.core.files.storage import default_storage
|
||||
from django.core.validators import MaxValueValidator, MinValueValidator
|
||||
from django.db import models
|
||||
from django.db.models import F, ProtectedError, prefetch_related_objects
|
||||
from django.db.models.functions import Lower
|
||||
from django.db.models.signals import post_save
|
||||
from django.urls import reverse
|
||||
from django.utils.safestring import mark_safe
|
||||
@@ -25,8 +23,8 @@ from extras.querysets import ConfigContextModelQuerySet
|
||||
from netbox.choices import ColorChoices
|
||||
from netbox.config import ConfigItem
|
||||
from netbox.models import NestedGroupModel, OrganizationalModel, PrimaryModel
|
||||
from netbox.models.mixins import WeightMixin
|
||||
from netbox.models.features import ContactsMixin, ImageAttachmentsMixin
|
||||
from netbox.models.mixins import WeightMixin
|
||||
from utilities.fields import ColorField, CounterCacheField
|
||||
from utilities.prefetch import get_prefetchable_fields
|
||||
from utilities.tracking import TrackingModelMixin
|
||||
@@ -34,7 +32,6 @@ from .device_components import *
|
||||
from .mixins import RenderConfigMixin
|
||||
from .modules import Module
|
||||
|
||||
|
||||
__all__ = (
|
||||
'Device',
|
||||
'DeviceRole',
|
||||
@@ -83,11 +80,13 @@ class DeviceType(ImageAttachmentsMixin, PrimaryModel, WeightMixin):
     )
     model = models.CharField(
         verbose_name=_('model'),
-        max_length=100
+        max_length=100,
+        db_collation='case_insensitive',
     )
     slug = models.SlugField(
         verbose_name=_('slug'),
-        max_length=100
+        max_length=100,
+        db_collation='case_insensitive',
     )
     default_platform = models.ForeignKey(
         to='dcim.Platform',
@@ -525,7 +524,7 @@ class Device(
         max_length=64,
         blank=True,
         null=True,
-        db_collation="natural_sort"
+        db_collation='ci_natural_sort',
     )
     serial = models.CharField(
         max_length=50,
@@ -721,11 +720,11 @@ class Device(
         ordering = ('name', 'pk')  # Name may be null
         constraints = (
             models.UniqueConstraint(
-                Lower('name'), 'site', 'tenant',
+                'name', 'site', 'tenant',
                 name='%(app_label)s_%(class)s_unique_name_site_tenant'
             ),
             models.UniqueConstraint(
-                Lower('name'), 'site',
+                'name', 'site',
                 name='%(app_label)s_%(class)s_unique_name_site',
                 condition=Q(tenant__isnull=True),
                 violation_error_message=_("Device name must be unique per site.")
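Because name now uses a case-insensitive collation, the plain UniqueConstraint enforces the same rule that previously required wrapping the column in Lower(). An illustrative sketch only, assuming a site, role, and device type already exist:

    from django.db import IntegrityError

    Device.objects.create(name='edge-router-1', site=site, role=role, device_type=device_type)

    # Differs only in case, so the per-site uniqueness constraint rejects it:
    try:
        Device.objects.create(name='EDGE-ROUTER-1', site=site, role=role, device_type=device_type)
    except IntegrityError:
        pass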
@@ -1119,7 +1118,7 @@ class VirtualChassis(PrimaryModel):
     name = models.CharField(
         verbose_name=_('name'),
         max_length=64,
-        db_collation="natural_sort"
+        db_collation='natural_sort',
     )
     domain = models.CharField(
         verbose_name=_('domain'),
@@ -1182,7 +1181,7 @@ class VirtualDeviceContext(PrimaryModel):
     name = models.CharField(
         verbose_name=_('name'),
         max_length=64,
-        db_collation="natural_sort"
+        db_collation='ci_natural_sort',
     )
     status = models.CharField(
         verbose_name=_('status'),
@@ -31,7 +31,8 @@ class ModuleTypeProfile(PrimaryModel):
|
||||
name = models.CharField(
|
||||
verbose_name=_('name'),
|
||||
max_length=100,
|
||||
unique=True
|
||||
unique=True,
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
schema = models.JSONField(
|
||||
blank=True,
|
||||
@@ -72,7 +73,8 @@ class ModuleType(ImageAttachmentsMixin, PrimaryModel, WeightMixin):
|
||||
)
|
||||
model = models.CharField(
|
||||
verbose_name=_('model'),
|
||||
max_length=100
|
||||
max_length=100,
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
part_number = models.CharField(
|
||||
verbose_name=_('part number'),
|
||||
|
||||
@@ -37,7 +37,7 @@ class PowerPanel(ContactsMixin, ImageAttachmentsMixin, PrimaryModel):
|
||||
name = models.CharField(
|
||||
verbose_name=_('name'),
|
||||
max_length=100,
|
||||
db_collation="natural_sort"
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
|
||||
prerequisite_models = (
|
||||
@@ -88,7 +88,7 @@ class PowerFeed(PrimaryModel, PathEndpoint, CabledObjectModel):
|
||||
name = models.CharField(
|
||||
verbose_name=_('name'),
|
||||
max_length=100,
|
||||
db_collation="natural_sort"
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
status = models.CharField(
|
||||
verbose_name=_('status'),
|
||||
|
||||
@@ -137,12 +137,14 @@ class RackType(RackBase):
|
||||
)
|
||||
model = models.CharField(
|
||||
verbose_name=_('model'),
|
||||
max_length=100
|
||||
max_length=100,
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
slug = models.SlugField(
|
||||
verbose_name=_('slug'),
|
||||
max_length=100,
|
||||
unique=True
|
||||
unique=True,
|
||||
db_collation='case_insensitive',
|
||||
)
|
||||
|
||||
clone_fields = (
|
||||
@@ -262,7 +264,7 @@ class Rack(ContactsMixin, ImageAttachmentsMixin, RackBase):
|
||||
name = models.CharField(
|
||||
verbose_name=_('name'),
|
||||
max_length=100,
|
||||
db_collation="natural_sort"
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
facility_id = models.CharField(
|
||||
max_length=50,
|
||||
|
||||
@@ -142,13 +142,14 @@ class Site(ContactsMixin, ImageAttachmentsMixin, PrimaryModel):
|
||||
verbose_name=_('name'),
|
||||
max_length=100,
|
||||
unique=True,
|
||||
help_text=_("Full name of the site"),
|
||||
db_collation="natural_sort"
|
||||
db_collation='ci_natural_sort',
|
||||
help_text=_("Full name of the site")
|
||||
)
|
||||
slug = models.SlugField(
|
||||
verbose_name=_('slug'),
|
||||
max_length=100,
|
||||
unique=True
|
||||
unique=True,
|
||||
db_collation='case_insensitive',
|
||||
)
|
||||
status = models.CharField(
|
||||
verbose_name=_('status'),
|
||||
|
||||
@@ -211,6 +211,9 @@ class PowerPortTemplateTable(ComponentTemplateTable):


 class PowerOutletTemplateTable(ComponentTemplateTable):
+    color = columns.ColorColumn(
+        verbose_name=_('Color'),
+    )
     actions = columns.ActionsColumn(
         actions=('edit', 'delete'),
         extra_buttons=MODULAR_COMPONENT_TEMPLATE_BUTTONS
@@ -218,7 +221,7 @@ class PowerOutletTemplateTable(ComponentTemplateTable):

     class Meta(ComponentTemplateTable.Meta):
         model = models.PowerOutletTemplate
-        fields = ('pk', 'name', 'label', 'type', 'power_port', 'feed_leg', 'description', 'actions')
+        fields = ('pk', 'name', 'label', 'type', 'color', 'power_port', 'feed_leg', 'description', 'actions')
         empty_text = "None"
@@ -13,7 +13,8 @@ from ipam.choices import VLANQinQRoleChoices
|
||||
from ipam.models import ASN, RIR, VLAN, VRF
|
||||
from netbox.api.serializers import GenericObjectSerializer
|
||||
from tenancy.models import Tenant
|
||||
from users.models import User
|
||||
from users.constants import TOKEN_PREFIX
|
||||
from users.models import Token, User
|
||||
from utilities.testing import APITestCase, APIViewTestCases, create_test_device, disable_logging
|
||||
from virtualization.models import Cluster, ClusterType
|
||||
from wireless.choices import WirelessChannelChoices
|
||||
@@ -1306,7 +1307,6 @@ class DeviceTest(APIViewTestCases.APIViewTestCase):
|
||||
}
|
||||
user_permissions = (
|
||||
'dcim.view_site', 'dcim.view_rack', 'dcim.view_location', 'dcim.view_devicerole', 'dcim.view_devicetype',
|
||||
'extras.view_configtemplate',
|
||||
)
|
||||
|
||||
@classmethod
|
||||
@@ -1486,12 +1486,58 @@ class DeviceTest(APIViewTestCases.APIViewTestCase):
|
||||
device.config_template = configtemplate
|
||||
device.save()
|
||||
|
||||
self.add_permissions('dcim.add_device')
|
||||
url = reverse('dcim-api:device-detail', kwargs={'pk': device.pk}) + 'render-config/'
|
||||
self.add_permissions('dcim.render_config_device', 'dcim.view_device')
|
||||
url = reverse('dcim-api:device-render-config', kwargs={'pk': device.pk})
|
||||
response = self.client.post(url, {}, format='json', **self.header)
|
||||
self.assertHttpStatus(response, status.HTTP_200_OK)
|
||||
self.assertEqual(response.data['content'], f'Config for device {device.name}')
|
||||
|
||||
def test_render_config_without_permission(self):
|
||||
configtemplate = ConfigTemplate.objects.create(
|
||||
name='Config Template 1',
|
||||
template_code='Config for device {{ device.name }}'
|
||||
)
|
||||
|
||||
device = Device.objects.first()
|
||||
device.config_template = configtemplate
|
||||
device.save()
|
||||
|
||||
# No permissions added - user has no render_config permission
|
||||
url = reverse('dcim-api:device-render-config', kwargs={'pk': device.pk})
|
||||
response = self.client.post(url, {}, format='json', **self.header)
|
||||
self.assertHttpStatus(response, status.HTTP_404_NOT_FOUND)
|
||||
|
||||
def test_render_config_token_write_enabled(self):
|
||||
configtemplate = ConfigTemplate.objects.create(
|
||||
name='Config Template 1',
|
||||
template_code='Config for device {{ device.name }}'
|
||||
)
|
||||
|
||||
device = Device.objects.first()
|
||||
device.config_template = configtemplate
|
||||
device.save()
|
||||
|
||||
self.add_permissions('dcim.render_config_device', 'dcim.view_device')
|
||||
url = reverse('dcim-api:device-render-config', kwargs={'pk': device.pk})
|
||||
|
||||
# Request without token auth should fail with PermissionDenied
|
||||
response = self.client.post(url, {}, format='json')
|
||||
self.assertHttpStatus(response, status.HTTP_403_FORBIDDEN)
|
||||
|
||||
# Create token with write_enabled=False
|
||||
token = Token.objects.create(version=2, user=self.user, write_enabled=False)
|
||||
token_header = f'Bearer {TOKEN_PREFIX}{token.key}.{token.token}'
|
||||
|
||||
# Request with write-disabled token should fail
|
||||
response = self.client.post(url, {}, format='json', HTTP_AUTHORIZATION=token_header)
|
||||
self.assertHttpStatus(response, status.HTTP_403_FORBIDDEN)
|
||||
|
||||
# Enable write and retry
|
||||
token.write_enabled = True
|
||||
token.save()
|
||||
response = self.client.post(url, {}, format='json', HTTP_AUTHORIZATION=token_header)
|
||||
self.assertHttpStatus(response, status.HTTP_200_OK)
|
||||
|
||||
|
||||
class ModuleTest(APIViewTestCases.APIViewTestCase):
|
||||
model = Module
|
||||
@@ -2376,6 +2422,33 @@ class CableTest(APIViewTestCases.APIViewTestCase):
|
||||
]
|
||||
|
||||
|
||||
class CableTerminationTest(
|
||||
APIViewTestCases.GetObjectViewTestCase,
|
||||
APIViewTestCases.ListObjectsViewTestCase,
|
||||
):
|
||||
model = CableTermination
|
||||
brief_fields = ['cable', 'cable_end', 'display', 'id', 'termination_id', 'termination_type', 'url']
|
||||
|
||||
@classmethod
|
||||
def setUpTestData(cls):
|
||||
device1 = create_test_device('Device 1')
|
||||
device2 = create_test_device('Device 2')
|
||||
|
||||
interfaces = []
|
||||
for device in (device1, device2):
|
||||
for i in range(0, 10):
|
||||
interfaces.append(Interface(device=device, type=InterfaceTypeChoices.TYPE_1GE_FIXED, name=f'eth{i}'))
|
||||
Interface.objects.bulk_create(interfaces)
|
||||
|
||||
cables = (
|
||||
Cable(a_terminations=[interfaces[0]], b_terminations=[interfaces[10]], label='Cable 1'),
|
||||
Cable(a_terminations=[interfaces[1]], b_terminations=[interfaces[11]], label='Cable 2'),
|
||||
Cable(a_terminations=[interfaces[2]], b_terminations=[interfaces[12]], label='Cable 3'),
|
||||
)
|
||||
for cable in cables:
|
||||
cable.save()
|
||||
|
||||
|
||||
class ConnectedDeviceTest(APITestCase):
|
||||
|
||||
@classmethod
|
||||
|
||||
@@ -1919,18 +1919,21 @@ class PowerOutletTemplateTestCase(TestCase, DeviceComponentTemplateFilterSetTest
|
||||
device_type=device_types[0],
|
||||
name='Power Outlet 1',
|
||||
feed_leg=PowerOutletFeedLegChoices.FEED_LEG_A,
|
||||
color=ColorChoices.COLOR_RED,
|
||||
description='foobar1'
|
||||
),
|
||||
PowerOutletTemplate(
|
||||
device_type=device_types[1],
|
||||
name='Power Outlet 2',
|
||||
feed_leg=PowerOutletFeedLegChoices.FEED_LEG_B,
|
||||
color=ColorChoices.COLOR_GREEN,
|
||||
description='foobar2'
|
||||
),
|
||||
PowerOutletTemplate(
|
||||
device_type=device_types[2],
|
||||
name='Power Outlet 3',
|
||||
feed_leg=PowerOutletFeedLegChoices.FEED_LEG_C,
|
||||
color=ColorChoices.COLOR_BLUE,
|
||||
description='foobar3'
|
||||
),
|
||||
))
|
||||
@@ -1943,6 +1946,10 @@ class PowerOutletTemplateTestCase(TestCase, DeviceComponentTemplateFilterSetTest
|
||||
params = {'feed_leg': [PowerOutletFeedLegChoices.FEED_LEG_A, PowerOutletFeedLegChoices.FEED_LEG_B]}
|
||||
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
|
||||
|
||||
def test_color(self):
|
||||
params = {'color': [ColorChoices.COLOR_RED, ColorChoices.COLOR_GREEN]}
|
||||
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
|
||||
|
||||
|
||||
class InterfaceTemplateTestCase(TestCase, DeviceComponentTemplateFilterSetTests, ChangeLoggedFilterSetTests):
|
||||
queryset = InterfaceTemplate.objects.all()
|
||||
|
||||
@@ -4,6 +4,7 @@ from rest_framework.renderers import JSONRenderer
 from rest_framework.response import Response
 from rest_framework.status import HTTP_400_BAD_REQUEST

+from netbox.api.authentication import TokenWritePermission
 from netbox.api.renderers import TextRenderer
 from .serializers import ConfigTemplateSerializer

@@ -64,12 +65,24 @@ class RenderConfigMixin(ConfigTemplateRenderMixin):
     """
     Provides a /render-config/ endpoint for REST API views whose model may have a ConfigTemplate assigned.
     """

+    def get_permissions(self):
+        # For render_config action, check only token write ability (not model permissions)
+        if self.action == 'render_config':
+            return [TokenWritePermission()]
+        return super().get_permissions()
+
     @action(detail=True, methods=['post'], url_path='render-config', renderer_classes=[JSONRenderer, TextRenderer])
     def render_config(self, request, pk):
         """
         Resolve and render the preferred ConfigTemplate for this Device.
         """
+        # Override restrict() on the default queryset to enforce the render_config & view actions
+        self.queryset = self.queryset.model.objects.restrict(request.user, 'render_config').restrict(
+            request.user, 'view'
+        )
         instance = self.get_object()

         object_type = instance._meta.model_name
         configtemplate = instance.get_config_template()
         if not configtemplate:
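For reference, a sketch of calling the endpoint once these permission rules are in place: the caller needs the render_config and view object permissions, and the token must have write access. The host, device ID, and token value are placeholders.

    import os
    import requests

    url = 'https://netbox.example.com/api/dcim/devices/123/render-config/'
    headers = {
        'Authorization': f"Bearer {os.environ['NETBOX_TOKEN']}",  # v1 tokens use "Token <token>" instead
        'Accept': 'text/plain',                                   # ask for the raw rendered config
    }
    response = requests.post(url, json={}, headers=headers, timeout=10)
    response.raise_for_status()
    print(response.text)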
@@ -1,10 +1,8 @@
|
||||
from django.urls import include, path
|
||||
|
||||
from core.api.views import ObjectTypeViewSet
|
||||
from netbox.api.routers import NetBoxRouter
|
||||
from . import views
|
||||
|
||||
|
||||
router = NetBoxRouter()
|
||||
router.APIRootView = views.ExtrasRootView
|
||||
|
||||
@@ -29,9 +27,6 @@ router.register('config-context-profiles', views.ConfigContextProfileViewSet)
|
||||
router.register('config-templates', views.ConfigTemplateViewSet)
|
||||
router.register('scripts', views.ScriptViewSet, basename='script')
|
||||
|
||||
# TODO: Remove in NetBox v4.5
|
||||
router.register('object-types', ObjectTypeViewSet)
|
||||
|
||||
app_name = 'extras-api'
|
||||
urlpatterns = [
|
||||
path('dashboard/', views.DashboardView.as_view(), name='dashboard'),
|
||||
|
||||
@@ -16,7 +16,7 @@ from rq import Worker
 from extras import filtersets
 from extras.jobs import ScriptJob
 from extras.models import *
-from netbox.api.authentication import IsAuthenticatedOrLoginNotRequired
+from netbox.api.authentication import IsAuthenticatedOrLoginNotRequired, TokenWritePermission
 from netbox.api.features import SyncedDataMixin
 from netbox.api.metadata import ContentTypeMetadata
 from netbox.api.renderers import TextRenderer
@@ -238,13 +238,22 @@ class ConfigTemplateViewSet(SyncedDataMixin, ConfigTemplateRenderMixin, NetBoxMo
     serializer_class = serializers.ConfigTemplateSerializer
     filterset_class = filtersets.ConfigTemplateFilterSet

+    def get_permissions(self):
+        # For render action, check only token write ability (not model permissions)
+        if self.action == 'render':
+            return [TokenWritePermission()]
+        return super().get_permissions()
+
     @action(detail=True, methods=['post'], renderer_classes=[JSONRenderer, TextRenderer])
     def render(self, request, pk):
         """
         Render a ConfigTemplate using the context data provided (if any). If the client requests "text/plain" data,
         return the raw rendered content, rather than serialized JSON.
         """
+        # Override restrict() on the default queryset to enforce the render & view actions
+        self.queryset = self.queryset.model.objects.restrict(request.user, 'render').restrict(request.user, 'view')
         configtemplate = self.get_object()

         context = request.data

         return self.render_configtemplate(request, configtemplate, context)
netbox/extras/migrations/0134_ci_collations.py (new file, 114 lines)
@@ -0,0 +1,114 @@
|
||||
import django.core.validators
|
||||
import re
|
||||
from django.db import migrations, models
|
||||
|
||||
PATTERN_OPS_INDEXES = [
|
||||
'extras_configcontext_name_4bbfe25d_like',
|
||||
'extras_configcontextprofile_name_070de83b_like',
|
||||
'extras_customfield_name_2fe72707_like',
|
||||
'extras_customfieldchoiceset_name_963e63ea_like',
|
||||
'extras_customlink_name_daed2d18_like',
|
||||
'extras_eventrule_name_899453c6_like',
|
||||
'extras_notificationgroup_name_70b0a3f9_like',
|
||||
'extras_savedfilter_name_8a4bbd09_like',
|
||||
'extras_savedfilter_slug_4f93a959_like',
|
||||
'extras_tag_name_9550b3d9_like',
|
||||
'extras_tag_slug_aaa5b7e9_like',
|
||||
'extras_webhook_name_82cf60b5_like',
|
||||
]
|
||||
|
||||
|
||||
def remove_indexes(apps, schema_editor):
|
||||
for idx in PATTERN_OPS_INDEXES:
|
||||
schema_editor.execute(f'DROP INDEX IF EXISTS {idx}')
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
dependencies = [
|
||||
('extras', '0133_make_cf_minmax_decimal'),
|
||||
('dcim', '0217_ci_collations'),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.RunPython(
|
||||
code=remove_indexes,
|
||||
reverse_code=migrations.RunPython.noop,
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='configcontext',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='configcontextprofile',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='customfield',
|
||||
name='name',
|
||||
field=models.CharField(
|
||||
db_collation='ci_natural_sort',
|
||||
max_length=50,
|
||||
unique=True,
|
||||
validators=[
|
||||
django.core.validators.RegexValidator(
|
||||
flags=re.RegexFlag['IGNORECASE'],
|
||||
message='Only alphanumeric characters and underscores are allowed.',
|
||||
regex='^[a-z0-9_]+$',
|
||||
),
|
||||
django.core.validators.RegexValidator(
|
||||
flags=re.RegexFlag['IGNORECASE'],
|
||||
inverse_match=True,
|
||||
message='Double underscores are not permitted in custom field names.',
|
||||
regex='__',
|
||||
),
|
||||
],
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='customfieldchoiceset',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='customlink',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='eventrule',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=150, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='notificationgroup',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='savedfilter',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='savedfilter',
|
||||
name='slug',
|
||||
field=models.SlugField(db_collation='case_insensitive', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='tag',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='tag',
|
||||
name='slug',
|
||||
field=models.SlugField(allow_unicode=True, db_collation='case_insensitive', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='webhook',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=150, unique=True),
|
||||
),
|
||||
]
|
||||
@@ -35,7 +35,8 @@ class ConfigContextProfile(SyncedDataMixin, PrimaryModel):
|
||||
name = models.CharField(
|
||||
verbose_name=_('name'),
|
||||
max_length=100,
|
||||
unique=True
|
||||
unique=True,
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
description = models.CharField(
|
||||
verbose_name=_('description'),
|
||||
@@ -77,7 +78,8 @@ class ConfigContext(SyncedDataMixin, CloningMixin, CustomLinksMixin, ChangeLogge
|
||||
name = models.CharField(
|
||||
verbose_name=_('name'),
|
||||
max_length=100,
|
||||
unique=True
|
||||
unique=True,
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
profile = models.ForeignKey(
|
||||
to='extras.ConfigContextProfile',
|
||||
|
||||
@@ -94,6 +94,7 @@ class CustomField(CloningMixin, ExportTemplatesMixin, ChangeLoggedModel):
|
||||
verbose_name=_('name'),
|
||||
max_length=50,
|
||||
unique=True,
|
||||
db_collation='ci_natural_sort',
|
||||
help_text=_('Internal field name'),
|
||||
validators=(
|
||||
RegexValidator(
|
||||
@@ -779,7 +780,8 @@ class CustomFieldChoiceSet(CloningMixin, ExportTemplatesMixin, ChangeLoggedModel
|
||||
"""
|
||||
name = models.CharField(
|
||||
max_length=100,
|
||||
unique=True
|
||||
unique=True,
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
description = models.CharField(
|
||||
max_length=200,
|
||||
|
||||
@@ -59,7 +59,8 @@ class EventRule(CustomFieldsMixin, ExportTemplatesMixin, TagsMixin, ChangeLogged
|
||||
name = models.CharField(
|
||||
verbose_name=_('name'),
|
||||
max_length=150,
|
||||
unique=True
|
||||
unique=True,
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
description = models.CharField(
|
||||
verbose_name=_('description'),
|
||||
@@ -164,7 +165,8 @@ class Webhook(CustomFieldsMixin, ExportTemplatesMixin, TagsMixin, ChangeLoggedMo
|
||||
name = models.CharField(
|
||||
verbose_name=_('name'),
|
||||
max_length=150,
|
||||
unique=True
|
||||
unique=True,
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
description = models.CharField(
|
||||
verbose_name=_('description'),
|
||||
@@ -307,7 +309,8 @@ class CustomLink(CloningMixin, ExportTemplatesMixin, ChangeLoggedModel):
|
||||
name = models.CharField(
|
||||
verbose_name=_('name'),
|
||||
max_length=100,
|
||||
unique=True
|
||||
unique=True,
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
enabled = models.BooleanField(
|
||||
verbose_name=_('enabled'),
|
||||
@@ -468,12 +471,14 @@ class SavedFilter(CloningMixin, ExportTemplatesMixin, ChangeLoggedModel):
|
||||
name = models.CharField(
|
||||
verbose_name=_('name'),
|
||||
max_length=100,
|
||||
unique=True
|
||||
unique=True,
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
slug = models.SlugField(
|
||||
verbose_name=_('slug'),
|
||||
max_length=100,
|
||||
unique=True
|
||||
unique=True,
|
||||
db_collation='case_insensitive',
|
||||
)
|
||||
description = models.CharField(
|
||||
verbose_name=_('description'),
|
||||
|
||||
@@ -125,7 +125,8 @@ class NotificationGroup(ChangeLoggedModel):
|
||||
name = models.CharField(
|
||||
verbose_name=_('name'),
|
||||
max_length=100,
|
||||
unique=True
|
||||
unique=True,
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
description = models.CharField(
|
||||
verbose_name=_('description'),
|
||||
|
||||
@@ -2,7 +2,7 @@ from django.conf import settings
|
||||
from django.db import models
|
||||
from django.urls import reverse
|
||||
from django.utils.text import slugify
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
from django.utils.translation import gettext_lazy as _, pgettext_lazy
|
||||
from taggit.models import TagBase, GenericTaggedItemBase
|
||||
|
||||
from netbox.choices import ColorChoices
|
||||
@@ -25,6 +25,21 @@ class Tag(CloningMixin, ExportTemplatesMixin, ChangeLoggedModel, TagBase):
|
||||
id = models.BigAutoField(
|
||||
primary_key=True
|
||||
)
|
||||
# Override TagBase.name to set db_collation
|
||||
name = models.CharField(
|
||||
verbose_name=pgettext_lazy("A tag name", "name"),
|
||||
unique=True,
|
||||
max_length=100,
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
# Override TagBase.slug to set db_collation
|
||||
slug = models.SlugField(
|
||||
verbose_name=pgettext_lazy("A tag slug", "slug"),
|
||||
unique=True,
|
||||
max_length=100,
|
||||
allow_unicode=True,
|
||||
db_collation='case_insensitive',
|
||||
)
|
||||
color = ColorField(
|
||||
verbose_name=_('color'),
|
||||
default=ColorChoices.COLOR_GREY
|
||||
|
||||
@@ -1,12 +1,9 @@
|
||||
import inspect
|
||||
import json
|
||||
import logging
|
||||
import os
|
||||
import re
|
||||
|
||||
import yaml
|
||||
from django import forms
|
||||
from django.conf import settings
|
||||
from django.core.files.storage import storages
|
||||
from django.core.validators import RegexValidator
|
||||
from django.utils import timezone
|
||||
@@ -490,7 +487,7 @@ class BaseScript:
|
||||
if self.fieldsets:
|
||||
fieldsets.extend(self.fieldsets)
|
||||
else:
|
||||
fields = list(name for name, _ in self._get_vars().items())
|
||||
fields = list(name for name, __ in self._get_vars().items())
|
||||
fieldsets.append((_('Script Data'), fields))
|
||||
|
||||
# Append the default fieldset if defined in the Meta class
|
||||
@@ -582,40 +579,6 @@ class BaseScript:
|
||||
self._log(message, obj, level=LogLevelChoices.LOG_FAILURE)
|
||||
self.failed = True
|
||||
|
||||
#
|
||||
# Convenience functions
|
||||
#
|
||||
|
||||
def load_yaml(self, filename):
|
||||
"""
|
||||
Return data from a YAML file
|
||||
"""
|
||||
# TODO: DEPRECATED: Remove this method in v4.5
|
||||
self._log(
|
||||
_("load_yaml is deprecated and will be removed in v4.5"),
|
||||
level=LogLevelChoices.LOG_WARNING
|
||||
)
|
||||
file_path = os.path.join(settings.SCRIPTS_ROOT, filename)
|
||||
with open(file_path, 'r') as datafile:
|
||||
data = yaml.load(datafile, Loader=yaml.SafeLoader)
|
||||
|
||||
return data
|
||||
|
||||
def load_json(self, filename):
|
||||
"""
|
||||
Return data from a JSON file
|
||||
"""
|
||||
# TODO: DEPRECATED: Remove this method in v4.5
|
||||
self._log(
|
||||
_("load_json is deprecated and will be removed in v4.5"),
|
||||
level=LogLevelChoices.LOG_WARNING
|
||||
)
|
||||
file_path = os.path.join(settings.SCRIPTS_ROOT, filename)
|
||||
with open(file_path, 'r') as datafile:
|
||||
data = json.load(datafile)
|
||||
|
||||
return data
|
||||
|
||||
#
|
||||
# Legacy Report functionality
|
||||
#
|
||||
|
||||
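With Script.load_yaml() and Script.load_json() gone, scripts that still need file-based data can read it directly; a minimal replacement sketch (the helper name and file layout are illustrative):

    import json
    import os

    import yaml
    from django.conf import settings

    def load_data_file(filename):
        # Mirrors what the removed convenience methods did, without the deprecation warning.
        path = os.path.join(settings.SCRIPTS_ROOT, filename)
        with open(path) as datafile:
            if filename.endswith(('.yml', '.yaml')):
                return yaml.safe_load(datafile)
            return json.load(datafile)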
@@ -3,6 +3,7 @@ import datetime
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
from django.urls import reverse
|
||||
from django.utils.timezone import make_aware, now
|
||||
from rest_framework import status
|
||||
|
||||
from core.choices import ManagedFileRootPathChoices
|
||||
from core.events import *
|
||||
@@ -11,7 +12,8 @@ from dcim.models import Device, DeviceRole, DeviceType, Manufacturer, Rack, Loca
|
||||
from extras.choices import *
|
||||
from extras.models import *
|
||||
from extras.scripts import BooleanVar, IntegerVar, Script as PythonClass, StringVar
|
||||
from users.models import Group, User
|
||||
from users.constants import TOKEN_PREFIX
|
||||
from users.models import Group, Token, User
|
||||
from utilities.testing import APITestCase, APIViewTestCases
|
||||
|
||||
|
||||
@@ -854,6 +856,47 @@ class ConfigTemplateTest(APIViewTestCases.APIViewTestCase):
|
||||
)
|
||||
ConfigTemplate.objects.bulk_create(config_templates)
|
||||
|
||||
def test_render(self):
|
||||
configtemplate = ConfigTemplate.objects.first()
|
||||
|
||||
self.add_permissions('extras.render_configtemplate', 'extras.view_configtemplate')
|
||||
url = reverse('extras-api:configtemplate-render', kwargs={'pk': configtemplate.pk})
|
||||
response = self.client.post(url, {'foo': 'bar'}, format='json', **self.header)
|
||||
self.assertHttpStatus(response, status.HTTP_200_OK)
|
||||
self.assertEqual(response.data['content'], 'Foo: bar')
|
||||
|
||||
def test_render_without_permission(self):
|
||||
configtemplate = ConfigTemplate.objects.first()
|
||||
|
||||
# No permissions added - user has no render permission
|
||||
url = reverse('extras-api:configtemplate-render', kwargs={'pk': configtemplate.pk})
|
||||
response = self.client.post(url, {'foo': 'bar'}, format='json', **self.header)
|
||||
self.assertHttpStatus(response, status.HTTP_404_NOT_FOUND)
|
||||
|
||||
def test_render_token_write_enabled(self):
|
||||
configtemplate = ConfigTemplate.objects.first()
|
||||
|
||||
self.add_permissions('extras.render_configtemplate', 'extras.view_configtemplate')
|
||||
url = reverse('extras-api:configtemplate-render', kwargs={'pk': configtemplate.pk})
|
||||
|
||||
# Request without token auth should fail with PermissionDenied
|
||||
response = self.client.post(url, {'foo': 'bar'}, format='json')
|
||||
self.assertHttpStatus(response, status.HTTP_403_FORBIDDEN)
|
||||
|
||||
# Create token with write_enabled=False
|
||||
token = Token.objects.create(version=2, user=self.user, write_enabled=False)
|
||||
token_header = f'Bearer {TOKEN_PREFIX}{token.key}.{token.token}'
|
||||
|
||||
# Request with write-disabled token should fail
|
||||
response = self.client.post(url, {'foo': 'bar'}, format='json', HTTP_AUTHORIZATION=token_header)
|
||||
self.assertHttpStatus(response, status.HTTP_403_FORBIDDEN)
|
||||
|
||||
# Enable write and retry
|
||||
token.write_enabled = True
|
||||
token.save()
|
||||
response = self.client.post(url, {'foo': 'bar'}, format='json', HTTP_AUTHORIZATION=token_header)
|
||||
self.assertHttpStatus(response, status.HTTP_200_OK)
|
||||
|
||||
|
||||
class ScriptTest(APITestCase):
|
||||
|
||||
|
||||
@@ -363,7 +363,7 @@ class EventRuleTest(APITestCase):
|
||||
body = json.loads(request.body)
|
||||
self.assertEqual(body['event'], 'created')
|
||||
self.assertEqual(body['timestamp'], job.kwargs['timestamp'])
|
||||
self.assertEqual(body['model'], 'site')
|
||||
self.assertEqual(body['object_type'], 'dcim.site')
|
||||
self.assertEqual(body['username'], 'testuser')
|
||||
self.assertEqual(body['request_id'], str(request_id))
|
||||
self.assertEqual(body['data']['name'], 'Site 1')
|
||||
|
||||
@@ -1,5 +1,3 @@
|
||||
import logging
|
||||
import tempfile
|
||||
from datetime import date, datetime, timezone
|
||||
from decimal import Decimal
|
||||
|
||||
@@ -9,7 +7,6 @@ from netaddr import IPAddress, IPNetwork
|
||||
|
||||
from dcim.models import DeviceRole
|
||||
from extras.scripts import *
|
||||
from utilities.testing import disable_logging
|
||||
|
||||
CHOICES = (
|
||||
('ff0000', 'Red'),
|
||||
@@ -35,35 +32,6 @@ JSON_DATA = """
|
||||
"""
|
||||
|
||||
|
||||
class ScriptTest(TestCase):
|
||||
|
||||
def test_load_yaml(self):
|
||||
datafile = tempfile.NamedTemporaryFile()
|
||||
datafile.write(bytes(YAML_DATA, 'UTF-8'))
|
||||
datafile.seek(0)
|
||||
|
||||
with disable_logging(level=logging.WARNING):
|
||||
data = Script().load_yaml(datafile.name)
|
||||
self.assertEqual(data, {
|
||||
'Foo': 123,
|
||||
'Bar': 456,
|
||||
'Baz': ['A', 'B', 'C'],
|
||||
})
|
||||
|
||||
def test_load_json(self):
|
||||
datafile = tempfile.NamedTemporaryFile()
|
||||
datafile.write(bytes(JSON_DATA, 'UTF-8'))
|
||||
datafile.seek(0)
|
||||
|
||||
with disable_logging(level=logging.WARNING):
|
||||
data = Script().load_json(datafile.name)
|
||||
self.assertEqual(data, {
|
||||
'Foo': 123,
|
||||
'Bar': 456,
|
||||
'Baz': ['A', 'B', 'C'],
|
||||
})
|
||||
|
||||
|
||||
class ScriptVariablesTest(TestCase):
|
||||
|
||||
def test_stringvar(self):
|
||||
|
||||
@@ -52,7 +52,6 @@ def send_webhook(event_rule, object_type, event_type, data, timestamp, username,
         'event': WEBHOOK_EVENT_TYPES.get(event_type, event_type),
         'timestamp': timestamp,
         'object_type': '.'.join(object_type.natural_key()),
-        'model': object_type.model,
         'username': username,
         'request_id': request.id if request else None,
         'data': data,
@@ -100,7 +99,7 @@ def send_webhook(event_rule, object_type, event_type, data, timestamp, username,
         'data': body.encode('utf8'),
     }
     logger.info(
-        f"Sending {params['method']} request to {params['url']} ({context['model']} {context['event']})"
+        f"Sending {params['method']} request to {params['url']} ({context['object_type']} {context['event']})"
     )
     logger.debug(params)
     try:
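The redundant 'model' key is dropped from the webhook context, so consumers should derive the model from 'object_type' instead. An illustrative payload (values invented) and the one-line adjustment a receiver might make:

    body = {
        'event': 'created',
        'timestamp': '2025-01-01T00:00:00Z',
        'object_type': 'dcim.site',                 # 'model' is no longer sent alongside this
        'username': 'testuser',
        'request_id': '2f7a54d1-27b8-4e38-9c47-abcdef012345',
        'data': {'name': 'Site 1'},
    }
    app_label, model = body['object_type'].split('.')   # ('dcim', 'site')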
netbox/ipam/migrations/0083_ci_collations.py (new file, 100 lines)
@@ -0,0 +1,100 @@
|
||||
from django.db import migrations, models
|
||||
|
||||
PATTERN_OPS_INDEXES = [
|
||||
'ipam_asnrange_name_c7585e73_like',
|
||||
'ipam_asnrange_slug_c8a7d8a1_like',
|
||||
'ipam_rir_name_64a71982_like',
|
||||
'ipam_rir_slug_ff1a369a_like',
|
||||
'ipam_role_name_13784849_like',
|
||||
'ipam_role_slug_309ca14c_like',
|
||||
'ipam_routetarget_name_212be79f_like',
|
||||
'ipam_servicetemplate_name_1a2f3410_like',
|
||||
'ipam_vlangroup_slug_40abcf6b_like',
|
||||
'ipam_vlantranslationpolicy_name_17e0a007_like',
|
||||
'ipam_vrf_rd_0ac1bde1_like',
|
||||
]
|
||||
|
||||
|
||||
def remove_indexes(apps, schema_editor):
|
||||
for idx in PATTERN_OPS_INDEXES:
|
||||
schema_editor.execute(f'DROP INDEX IF EXISTS {idx}')
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('ipam', '0082_add_prefix_network_containment_indexes'),
|
||||
('dcim', '0217_ci_collations'),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.RunPython(
|
||||
code=remove_indexes,
|
||||
reverse_code=migrations.RunPython.noop,
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='asnrange',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='asnrange',
|
||||
name='slug',
|
||||
field=models.SlugField(db_collation='case_insensitive', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='rir',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='rir',
|
||||
name='slug',
|
||||
field=models.SlugField(db_collation='case_insensitive', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='role',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='role',
|
||||
name='slug',
|
||||
field=models.SlugField(db_collation='case_insensitive', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='routetarget',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=21, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='servicetemplate',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='vlan',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=64),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='vlangroup',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='vlangroup',
|
||||
name='slug',
|
||||
field=models.SlugField(db_collation='case_insensitive', max_length=100),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='vlantranslationpolicy',
|
||||
name='name',
|
||||
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='vrf',
|
||||
name='rd',
|
||||
field=models.CharField(blank=True, db_collation='case_insensitive', max_length=21, null=True, unique=True),
|
||||
),
|
||||
]
|
||||
@@ -18,12 +18,7 @@ class ASNRange(OrganizationalModel):
|
||||
verbose_name=_('name'),
|
||||
max_length=100,
|
||||
unique=True,
|
||||
db_collation="natural_sort"
|
||||
)
|
||||
slug = models.SlugField(
|
||||
verbose_name=_('slug'),
|
||||
max_length=100,
|
||||
unique=True
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
rir = models.ForeignKey(
|
||||
to='ipam.RIR',
|
||||
|
||||
@@ -50,7 +50,8 @@ class ServiceTemplate(ServiceBase, PrimaryModel):
|
||||
name = models.CharField(
|
||||
verbose_name=_('name'),
|
||||
max_length=100,
|
||||
unique=True
|
||||
unique=True,
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
|
||||
class Meta:
|
||||
|
||||
@@ -37,11 +37,12 @@ class VLANGroup(OrganizationalModel):
|
||||
name = models.CharField(
|
||||
verbose_name=_('name'),
|
||||
max_length=100,
|
||||
db_collation="natural_sort"
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
slug = models.SlugField(
|
||||
verbose_name=_('slug'),
|
||||
max_length=100
|
||||
max_length=100,
|
||||
db_collation='case_insensitive',
|
||||
)
|
||||
scope_type = models.ForeignKey(
|
||||
to='contenttypes.ContentType',
|
||||
@@ -214,7 +215,8 @@ class VLAN(PrimaryModel):
|
||||
)
|
||||
name = models.CharField(
|
||||
verbose_name=_('name'),
|
||||
max_length=64
|
||||
max_length=64,
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
tenant = models.ForeignKey(
|
||||
to='tenancy.Tenant',
|
||||
@@ -362,6 +364,7 @@ class VLANTranslationPolicy(PrimaryModel):
|
||||
verbose_name=_('name'),
|
||||
max_length=100,
|
||||
unique=True,
|
||||
db_collation='ci_natural_sort',
|
||||
)
|
||||
|
||||
class Meta:
|
||||
|
||||
@@ -19,11 +19,12 @@ class VRF(PrimaryModel):
|
||||
name = models.CharField(
|
||||
verbose_name=_('name'),
|
||||
max_length=100,
|
||||
db_collation="natural_sort"
|
||||
db_collation='natural_sort',
|
||||
)
|
||||
rd = models.CharField(
|
||||
max_length=VRF_RD_MAX_LENGTH,
|
||||
unique=True,
|
||||
db_collation='case_insensitive',
|
||||
blank=True,
|
||||
null=True,
|
||||
verbose_name=_('route distinguisher'),
|
||||
@@ -75,8 +76,8 @@ class RouteTarget(PrimaryModel):
|
||||
verbose_name=_('name'),
|
||||
max_length=VRF_RD_MAX_LENGTH, # Same format options as VRF RD (RFC 4360 section 4)
|
||||
unique=True,
|
||||
db_collation='ci_natural_sort',
|
||||
help_text=_('Route target value (formatted in accordance with RFC 4360)'),
|
||||
db_collation="natural_sort"
|
||||
)
|
||||
tenant = models.ForeignKey(
|
||||
to='tenancy.Tenant',
|
||||
|
||||
@@ -2,47 +2,90 @@ import logging
|
||||
|
||||
from django.conf import settings
|
||||
from django.utils import timezone
|
||||
from rest_framework import authentication, exceptions
|
||||
from drf_spectacular.extensions import OpenApiAuthenticationExtension
|
||||
from rest_framework import exceptions
|
||||
from rest_framework.authentication import BaseAuthentication, get_authorization_header
|
||||
from rest_framework.permissions import BasePermission, DjangoObjectPermissions, SAFE_METHODS
|
||||
|
||||
from netbox.config import get_config
|
||||
from users.constants import TOKEN_PREFIX
|
||||
from users.models import Token
|
||||
from utilities.request import get_client_ip
|
||||
|
||||
V1_KEYWORD = 'Token'
|
||||
V2_KEYWORD = 'Bearer'
|
||||
|
||||
class TokenAuthentication(authentication.TokenAuthentication):
|
||||
|
||||
class TokenAuthentication(BaseAuthentication):
|
||||
"""
|
||||
A custom authentication scheme which enforces Token expiration times and source IP restrictions.
|
||||
"""
|
||||
model = Token

def authenticate(self, request):
result = super().authenticate(request)

if result:
token = result[1]

# Enforce source IP restrictions (if any) set on the token
if token.allowed_ips:
client_ip = get_client_ip(request)
if client_ip is None:
raise exceptions.AuthenticationFailed(
"Client IP address could not be determined for validation. Check that the HTTP server is "
"correctly configured to pass the required header(s)."
)
if not token.validate_client_ip(client_ip):
raise exceptions.AuthenticationFailed(
f"Source IP {client_ip} is not permitted to authenticate using this token."
)

return result

def authenticate_credentials(self, key):
model = self.get_model()
# Authorization header is not present; ignore
if not (auth := get_authorization_header(request).split()):
return
# Unrecognized header; ignore
if auth[0].lower() not in (V1_KEYWORD.lower().encode(), V2_KEYWORD.lower().encode()):
return
# Check for extraneous token content
if len(auth) != 2:
raise exceptions.AuthenticationFailed(
'Invalid authorization header: Must be in the form "Bearer <key>.<token>" or "Token <token>"'
)
# Extract the key (if v2) & token plaintext from the auth header
try:
token = model.objects.prefetch_related('user').get(key=key)
except model.DoesNotExist:
raise exceptions.AuthenticationFailed("Invalid token")
auth_value = auth[1].decode()
except UnicodeError:
raise exceptions.AuthenticationFailed("Invalid authorization header: Token contains invalid characters")

# Infer token version from presence or absence of prefix
version = 2 if auth_value.startswith(TOKEN_PREFIX) else 1

if version == 1:
key, plaintext = None, auth_value
else:
auth_value = auth_value.removeprefix(TOKEN_PREFIX)
try:
key, plaintext = auth_value.split('.', 1)
except ValueError:
raise exceptions.AuthenticationFailed(
"Invalid authorization header: Could not parse key from v2 token. Did you mean to use 'Token' "
"instead of 'Bearer'?"
)

# Look for a matching token in the database
try:
qs = Token.objects.prefetch_related('user')
if version == 1:
# Fetch v1 token by querying plaintext value directly
token = qs.get(version=version, plaintext=plaintext)
else:
# Fetch v2 token by key, then validate the plaintext
token = qs.get(version=version, key=key)
if not token.validate(plaintext):
# Key is valid but plaintext is not. Raise DoesNotExist to guard against key enumeration.
raise Token.DoesNotExist()
except Token.DoesNotExist:
raise exceptions.AuthenticationFailed(f"Invalid v{version} token")

# Enforce source IP restrictions (if any) set on the token
if token.allowed_ips:
client_ip = get_client_ip(request)
if client_ip is None:
raise exceptions.AuthenticationFailed(
"Client IP address could not be determined for validation. Check that the HTTP server is "
"correctly configured to pass the required header(s)."
)
if not token.validate_client_ip(client_ip):
raise exceptions.AuthenticationFailed(
f"Source IP {client_ip} is not permitted to authenticate using this token."
)

# Enforce the Token's expiration time, if one has been set.
if token.is_expired:
raise exceptions.AuthenticationFailed("Token expired")

# Update last used, but only once per minute at most. This reduces write load on the database
if not token.last_used or (timezone.now() - token.last_used).total_seconds() > 60:
@@ -54,11 +97,8 @@ class TokenAuthentication(authentication.TokenAuthentication):
else:
Token.objects.filter(pk=token.pk).update(last_used=timezone.now())

# Enforce the Token's expiration time, if one has been set.
if token.is_expired:
raise exceptions.AuthenticationFailed("Token expired")

user = token.user

# When LDAP authentication is active try to load user data from LDAP directory
if 'netbox.authentication.LDAPBackend' in settings.REMOTE_AUTH_BACKEND:
from netbox.authentication import LDAPBackend
@@ -124,6 +164,20 @@ class TokenPermissions(DjangoObjectPermissions):
return super().has_object_permission(request, view, obj)


class TokenWritePermission(BasePermission):
"""
Verify the token has write_enabled for unsafe methods, without requiring specific model permissions.
Used for custom actions that accept user data but don't map to standard CRUD operations.
"""

def has_permission(self, request, view):
if not isinstance(request.auth, Token):
raise exceptions.PermissionDenied(
"TokenWritePermission requires token authentication."
)
return bool(request.method in SAFE_METHODS or request.auth.write_enabled)


class IsAuthenticatedOrLoginNotRequired(BasePermission):
"""
Returns True if the user is authenticated or LOGIN_REQUIRED is False.
@@ -132,3 +186,17 @@ class IsAuthenticatedOrLoginNotRequired(BasePermission):
if not settings.LOGIN_REQUIRED:
return True
return request.user.is_authenticated


class TokenScheme(OpenApiAuthenticationExtension):
target_class = 'netbox.api.authentication.TokenAuthentication'
name = 'tokenAuth'
match_subclasses = True

def get_security_definition(self, auto_schema):
return {
'type': 'apiKey',
'in': 'header',
'name': 'Authorization',
'description': '`Token <token>` (v1) or `Bearer <key>.<token>` (v2)',
}
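Editor's note: based on the header formats accepted above ("Token <token>" for v1 and "Bearer <key>.<token>" for v2, where the v2 value carries the TOKEN_PREFIX defined in users.constants), a client call might look like the following sketch. The hostname, token values, and prefix placeholder are illustrative and are not taken from this changeset.

import requests

API_URL = 'https://netbox.example.com/api/dcim/sites/'  # placeholder host

# v1 token: the plaintext token is sent as-is with the "Token" keyword
v1_headers = {'Authorization': 'Token 0123456789abcdef0123456789abcdef01234567'}

# v2 token: "<TOKEN_PREFIX><key>.<plaintext>" sent with the "Bearer" keyword;
# the server looks up the token by <key> and validates the plaintext against its hash
v2_headers = {'Authorization': 'Bearer <TOKEN_PREFIX><key>.<plaintext>'}

for headers in (v1_headers, v2_headers):
    response = requests.get(API_URL, headers=headers)
    print(response.status_code)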
@@ -184,14 +184,13 @@ class RemoteUserBackend(_RemoteUserBackend):
else:
user.groups.clear()
logger.debug(f"Stripping user {user} from Groups")

# Evaluate superuser status
user.is_superuser = self._is_superuser(user)
logger.debug(f"User {user} is Superuser: {user.is_superuser}")
logger.debug(
f"User {user} should be Superuser: {self._is_superuser(user)}")

user.is_staff = self._is_staff(user)
logger.debug(f"User {user} is Staff: {user.is_staff}")
logger.debug(f"User {user} should be Staff: {self._is_staff(user)}")
user.save()
return user

@@ -251,19 +250,8 @@ class RemoteUserBackend(_RemoteUserBackend):
return bool(result)

def _is_staff(self, user):
logger = logging.getLogger('netbox.auth.RemoteUserBackend')
staff_groups = settings.REMOTE_AUTH_STAFF_GROUPS
logger.debug(f"Superuser Groups: {staff_groups}")
staff_users = settings.REMOTE_AUTH_STAFF_USERS
logger.debug(f"Staff Users :{staff_users}")
user_groups = set()
for g in user.groups.all():
user_groups.add(g.name)
logger.debug(f"User {user.username} is in Groups:{user_groups}")
result = user.username in staff_users or (
set(user_groups) & set(staff_groups))
logger.debug(f"User {user.username} in Staff Users :{result}")
return bool(result)
# Retain for pre-v4.5 compatibility
return user.is_superuser

def configure_user(self, request, user):
logger = logging.getLogger('netbox.auth.RemoteUserBackend')
@@ -68,6 +68,16 @@ REDIS = {
# https://docs.djangoproject.com/en/stable/ref/settings/#std:setting-SECRET_KEY
SECRET_KEY = ''

# Define a mapping of cryptographic peppers to use when hashing API tokens. A minimum of one pepper is required to
# enable v2 API tokens (NetBox v4.5+). Define peppers as a mapping of numeric ID to pepper value, as shown below. Each
# pepper must be at least 50 characters in length.
#
# API_TOKEN_PEPPERS = {
#     1: "<random string>",
#     2: "<random string>",
# }
API_TOKEN_PEPPERS = {}
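Editor's note: the example configuration above leaves API_TOKEN_PEPPERS empty. A minimal sketch of producing values that satisfy the stated 50-character minimum follows; the choice of secrets.token_urlsafe and the pepper IDs are the reader's own, not part of this changeset.

import secrets

# 48 random bytes encode to roughly 64 URL-safe characters, comfortably above
# the documented 50-character minimum. Copy the printed values into
# configuration.py; a pepper ID presumably must keep its value for as long as
# tokens hashed with it exist.
API_TOKEN_PEPPERS = {
    1: secrets.token_urlsafe(48),
    2: secrets.token_urlsafe(48),
}
print(API_TOKEN_PEPPERS)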
#########################
#                       #
@@ -81,9 +91,6 @@ ADMINS = [
# ('John Doe', 'jdoe@example.com'),
]

# Permit the retrieval of API tokens after their creation.
ALLOW_TOKEN_RETRIEVAL = False

# Enable any desired validators for local account passwords below. For a list of included validators, please see the
# Django documentation at https://docs.djangoproject.com/en/stable/topics/auth/passwords/#password-validation.
AUTH_PASSWORD_VALIDATORS = [

@@ -43,7 +43,9 @@ SECRET_KEY = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789'

DEFAULT_PERMISSIONS = {}

ALLOW_TOKEN_RETRIEVAL = True
API_TOKEN_PEPPERS = {
1: 'TEST-VALUE-DO-NOT-USE-TEST-VALUE-DO-NOT-USE-TEST-VALUE-DO-NOT-USE',
}

LOGGING = {
'version': 1,

@@ -28,7 +28,6 @@ def preferences(request):
user_preferences = request.user.config if request.user.is_authenticated else {}
return {
'preferences': user_preferences,
'htmx_navigation': user_preferences.get('ui.htmx_navigation', False) == 'true'
}
@@ -1,7 +1,7 @@
import strawberry
from django.conf import settings
from strawberry_django.optimizer import DjangoOptimizerExtension
from strawberry.extensions import MaxAliasesLimiter  # , SchemaExtension
from strawberry.extensions import MaxAliasesLimiter
from strawberry.schema.config import StrawberryConfig

from circuits.graphql.schema import CircuitsQuery
@@ -16,9 +16,17 @@ from virtualization.graphql.schema import VirtualizationQuery
from vpn.graphql.schema import VPNQuery
from wireless.graphql.schema import WirelessQuery

__all__ = (
'Query',
'QueryV1',
'QueryV2',
'schema_v1',
'schema_v2',
)


@strawberry.type
class Query(
class QueryV1(
UsersQuery,
CircuitsQuery,
CoreQuery,
@@ -31,11 +39,44 @@ class Query(
WirelessQuery,
*registry['plugins']['graphql_schemas'],  # Append plugin schemas
):
"""Query class for GraphQL API v1"""
pass


schema = strawberry.Schema(
query=Query,
@strawberry.type
class QueryV2(
UsersQuery,
CircuitsQuery,
CoreQuery,
DCIMQuery,
ExtrasQuery,
IPAMQuery,
TenancyQuery,
VirtualizationQuery,
VPNQuery,
WirelessQuery,
*registry['plugins']['graphql_schemas'],  # Append plugin schemas
):
"""Query class for GraphQL API v2"""
pass


# Expose a default Query class for the configured default GraphQL version
class Query(QueryV2 if settings.GRAPHQL_DEFAULT_VERSION == 2 else QueryV1):
pass


# Generate schemas for both versions of the GraphQL API
schema_v1 = strawberry.Schema(
query=QueryV1,
config=StrawberryConfig(auto_camel_case=False),
extensions=[
DjangoOptimizerExtension(prefetch_custom_queryset=True),
MaxAliasesLimiter(max_alias_count=settings.GRAPHQL_MAX_ALIASES),
]
)
schema_v2 = strawberry.Schema(
query=QueryV2,
config=StrawberryConfig(auto_camel_case=False),
extensions=[
DjangoOptimizerExtension(prefetch_custom_queryset=True),
16
netbox/netbox/graphql/utils.py
Normal file
@@ -0,0 +1,16 @@
from django.conf import settings

from netbox.graphql.schema import schema_v1, schema_v2

__all__ = (
'get_default_schema',
)


def get_default_schema():
"""
Returns the GraphQL schema corresponding to the value of the GRAPHQL_DEFAULT_VERSION setting.
"""
if settings.GRAPHQL_DEFAULT_VERSION == 2:
return schema_v2
return schema_v1
@@ -50,21 +50,15 @@ class NetBoxFeatureSet(
# Base model classes
#

class ChangeLoggedModel(ChangeLoggingMixin, CustomValidationMixin, EventRulesMixin, models.Model):
class BaseModel(models.Model):
"""
Base model for ancillary models; provides limited functionality for models which don't
support NetBox's full feature set.
"""
objects = RestrictedQuerySet.as_manager()
A global base model for all NetBox objects.

class Meta:
abstract = True


class NetBoxModel(NetBoxFeatureSet, models.Model):
"""
Base model for most object types. Suitable for use by plugins.
This class provides some important overrides to Django's default functionality, such as
- Overriding the default manager to use RestrictedQuerySet
- Extending `clean()` to validate GenericForeignKey fields
"""

objects = RestrictedQuerySet.as_manager()

class Meta:
@@ -103,6 +97,25 @@ class NetBoxModel(NetBoxFeatureSet, models.Model):
setattr(self, field.name, obj)


class ChangeLoggedModel(ChangeLoggingMixin, CustomValidationMixin, EventRulesMixin, BaseModel):
"""
Base model for ancillary models; provides limited functionality for models which don't
support NetBox's full feature set.
"""

class Meta:
abstract = True


class NetBoxModel(NetBoxFeatureSet, BaseModel):
"""
Base model for most object types. Suitable for use by plugins.
"""

class Meta:
abstract = True


#
# NetBox internal base models
#
@@ -140,11 +153,13 @@ class NestedGroupModel(NetBoxFeatureSet, MPTTModel):
)
name = models.CharField(
verbose_name=_('name'),
max_length=100
max_length=100,
db_collation='ci_natural_sort',
)
slug = models.SlugField(
verbose_name=_('slug'),
max_length=100
max_length=100,
db_collation='case_insensitive',
)
description = models.CharField(
verbose_name=_('description'),
@@ -177,7 +192,7 @@ class NestedGroupModel(NetBoxFeatureSet, MPTTModel):
})


class OrganizationalModel(NetBoxFeatureSet, models.Model):
class OrganizationalModel(NetBoxModel):
"""
Organizational models are those which are used solely to categorize and qualify other objects, and do not convey
any real information about the infrastructure being modeled (for example, functional device roles). Organizational
@@ -189,12 +204,14 @@ class OrganizationalModel(NetBoxFeatureSet, models.Model):
name = models.CharField(
verbose_name=_('name'),
max_length=100,
unique=True
unique=True,
db_collation='ci_natural_sort',
)
slug = models.SlugField(
verbose_name=_('slug'),
max_length=100,
unique=True
unique=True,
db_collation='case_insensitive',
)
description = models.CharField(
verbose_name=_('description'),
@@ -202,8 +219,6 @@ class OrganizationalModel(NetBoxFeatureSet, models.Model):
blank=True
)

objects = RestrictedQuerySet.as_manager()

class Meta:
abstract = True
ordering = ('name',)
@@ -3,12 +3,12 @@ from collections import OrderedDict
from django.apps import apps
from django.urls.exceptions import NoReverseMatch
from drf_spectacular.utils import extend_schema
from rest_framework import permissions
from rest_framework.response import Response
from rest_framework.reverse import reverse
from rest_framework.views import APIView

from netbox.registry import registry
from utilities.api import IsSuperuser


@extend_schema(exclude=True)
@@ -16,7 +16,7 @@ class InstalledPluginsAPIView(APIView):
"""
API view for listing all installed plugins
"""
permission_classes = [permissions.IsAdminUser]
permission_classes = [IsSuperuser]
_ignore_model_permissions = True
schema = None
@@ -26,16 +26,6 @@ def get_csv_delimiters():
PREFERENCES = {

# User interface
'ui.htmx_navigation': UserPreference(
label=_('HTMX Navigation'),
choices=(
('', _('Disabled')),
('true', _('Enabled')),
),
description=_('Enable dynamic UI navigation'),
default=False,
warning=_('Experimental feature')
),
'locale.language': UserPreference(
label=_('Language'),
choices=(
@@ -20,6 +20,7 @@ from netbox.plugins import PluginConfig
from netbox.registry import registry
import storages.utils  # type: ignore
from utilities.release import load_release_data
from utilities.security import validate_peppers
from utilities.string import trailing_slash
from .monkey import get_unique_validators

@@ -43,9 +44,9 @@ VERSION = RELEASE.full_version  # Retained for backward compatibility
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

# Validate Python version
if sys.version_info < (3, 10):
if sys.version_info < (3, 12):
raise RuntimeError(
f"NetBox requires Python 3.10 or later. (Currently installed: Python {platform.python_version()})"
f"NetBox requires Python 3.12 or later. (Currently installed: Python {platform.python_version()})"
)

#
@@ -75,8 +76,8 @@ elif hasattr(configuration, 'DATABASE') and hasattr(configuration, 'DATABASES'):

# Set static config parameters
ADMINS = getattr(configuration, 'ADMINS', [])
ALLOW_TOKEN_RETRIEVAL = getattr(configuration, 'ALLOW_TOKEN_RETRIEVAL', False)
ALLOWED_HOSTS = getattr(configuration, 'ALLOWED_HOSTS')  # Required
API_TOKEN_PEPPERS = getattr(configuration, 'API_TOKEN_PEPPERS', {})
AUTH_PASSWORD_VALIDATORS = getattr(configuration, 'AUTH_PASSWORD_VALIDATORS', [
{
"NAME": "django.contrib.auth.password_validation.MinimumLengthValidator",
@@ -136,6 +137,7 @@ EVENTS_PIPELINE = getattr(configuration, 'EVENTS_PIPELINE', [
EXEMPT_VIEW_PERMISSIONS = getattr(configuration, 'EXEMPT_VIEW_PERMISSIONS', [])
FIELD_CHOICES = getattr(configuration, 'FIELD_CHOICES', {})
FILE_UPLOAD_MAX_MEMORY_SIZE = getattr(configuration, 'FILE_UPLOAD_MAX_MEMORY_SIZE', 2621440)
GRAPHQL_DEFAULT_VERSION = getattr(configuration, 'GRAPHQL_DEFAULT_VERSION', 1)
GRAPHQL_MAX_ALIASES = getattr(configuration, 'GRAPHQL_MAX_ALIASES', 10)
HOSTNAME = getattr(configuration, 'HOSTNAME', platform.node())
HTTP_PROXIES = getattr(configuration, 'HTTP_PROXIES', {})
@@ -174,8 +176,6 @@ REMOTE_AUTH_SUPERUSERS = getattr(configuration, 'REMOTE_AUTH_SUPERUSERS', [])
REMOTE_AUTH_USER_EMAIL = getattr(configuration, 'REMOTE_AUTH_USER_EMAIL', 'HTTP_REMOTE_USER_EMAIL')
REMOTE_AUTH_USER_FIRST_NAME = getattr(configuration, 'REMOTE_AUTH_USER_FIRST_NAME', 'HTTP_REMOTE_USER_FIRST_NAME')
REMOTE_AUTH_USER_LAST_NAME = getattr(configuration, 'REMOTE_AUTH_USER_LAST_NAME', 'HTTP_REMOTE_USER_LAST_NAME')
REMOTE_AUTH_STAFF_GROUPS = getattr(configuration, 'REMOTE_AUTH_STAFF_GROUPS', [])
REMOTE_AUTH_STAFF_USERS = getattr(configuration, 'REMOTE_AUTH_STAFF_USERS', [])
# Required by extras/migrations/0109_script_models.py
REPORTS_ROOT = getattr(configuration, 'REPORTS_ROOT', os.path.join(BASE_DIR, 'reports')).rstrip('/')
RQ_DEFAULT_TIMEOUT = getattr(configuration, 'RQ_DEFAULT_TIMEOUT', 300)
@@ -229,6 +229,12 @@ if len(SECRET_KEY) < 50:
f"    python {BASE_DIR}/generate_secret_key.py"
)

# Validate API token peppers
if API_TOKEN_PEPPERS:
validate_peppers(API_TOKEN_PEPPERS)
else:
warnings.warn("API_TOKEN_PEPPERS is not defined. v2 API tokens cannot be used.")
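Editor's note: the validate_peppers helper itself is not included in this diff. As a rough, assumed illustration of the constraints documented in configuration_example.py (integer IDs mapped to values of at least 50 characters), a check of that shape could look like the sketch below; the real utilities.security.validate_peppers implementation may differ.

def check_peppers(peppers: dict) -> None:
    # Assumed constraints only: at least one pepper, integer IDs, values of 50+ characters.
    if not peppers:
        raise ValueError("At least one API token pepper must be defined.")
    for pepper_id, value in peppers.items():
        if not isinstance(pepper_id, int):
            raise ValueError(f"Pepper ID {pepper_id!r} must be an integer.")
        if len(value) < 50:
            raise ValueError(f"Pepper {pepper_id} must be at least 50 characters long.")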
# Validate update repo URL and timeout
if RELEASE_CHECK_URL:
try:
@@ -270,7 +270,7 @@ class ActionsColumn(tables.Column):
if not (self.actions or self.extra_buttons):
return ''
# Skip dummy records (e.g. available VLANs or IP ranges replacing individual IPs)
if type(record) is not model or not getattr(record, 'pk', None):
if not isinstance(record, model) or not getattr(record, 'pk', None):
return ''

if request := getattr(table, 'context', {}).get('request'):
@@ -8,6 +8,7 @@ from rest_framework.test import APIClient

from core.models import ObjectType
from dcim.models import Rack, Site
from users.constants import TOKEN_PREFIX
from users.models import Group, ObjectPermission, Token, User
from utilities.testing import TestCase
from utilities.testing.api import APITestCase
@@ -16,67 +17,159 @@ from utilities.testing.api import APITestCase
class TokenAuthenticationTestCase(APITestCase):

@override_settings(LOGIN_REQUIRED=True, EXEMPT_VIEW_PERMISSIONS=['*'])
def test_token_authentication(self):
url = reverse('dcim-api:site-list')

def test_no_token(self):
# Request without a token should return a 403
response = self.client.get(url)
response = self.client.get(reverse('dcim-api:site-list'))
self.assertEqual(response.status_code, 403)

@override_settings(LOGIN_REQUIRED=True, EXEMPT_VIEW_PERMISSIONS=['*'])
def test_v1_token_valid(self):
# Create a v1 token
token = Token.objects.create(version=1, user=self.user)

# Valid token should return a 200
token = Token.objects.create(user=self.user)
response = self.client.get(url, HTTP_AUTHORIZATION=f'Token {token.key}')
self.assertEqual(response.status_code, 200)
header = f'Token {token.token}'
response = self.client.get(reverse('dcim-api:site-list'), HTTP_AUTHORIZATION=header)
self.assertEqual(response.status_code, 200, response.data)

# Check that the token's last_used time has been updated
token.refresh_from_db()
self.assertIsNotNone(token.last_used)

@override_settings(LOGIN_REQUIRED=True, EXEMPT_VIEW_PERMISSIONS=['*'])
def test_v1_token_invalid(self):
# Invalid token should return a 403
header = 'Token XXXXXXXXXX'
response = self.client.get(reverse('dcim-api:site-list'), HTTP_AUTHORIZATION=header)
self.assertEqual(response.status_code, 403)
self.assertEqual(response.data['detail'], "Invalid v1 token")

@override_settings(LOGIN_REQUIRED=True, EXEMPT_VIEW_PERMISSIONS=['*'])
def test_v2_token_valid(self):
# Create a v2 token
token = Token.objects.create(version=2, user=self.user)

# Valid token should return a 200
header = f'Bearer {TOKEN_PREFIX}{token.key}.{token.token}'
response = self.client.get(reverse('dcim-api:site-list'), HTTP_AUTHORIZATION=header)
self.assertEqual(response.status_code, 200, response.data)

# Check that the token's last_used time has been updated
token.refresh_from_db()
self.assertIsNotNone(token.last_used)

@override_settings(LOGIN_REQUIRED=True, EXEMPT_VIEW_PERMISSIONS=['*'])
def test_v2_token_invalid(self):
# Invalid token should return a 403
header = f'Bearer {TOKEN_PREFIX}XXXXXX.XXXXXXXXXX'
response = self.client.get(reverse('dcim-api:site-list'), HTTP_AUTHORIZATION=header)
self.assertEqual(response.status_code, 403)
self.assertEqual(response.data['detail'], "Invalid v2 token")

@override_settings(LOGIN_REQUIRED=True, EXEMPT_VIEW_PERMISSIONS=['*'])
def test_token_expiration(self):
url = reverse('dcim-api:site-list')

# Request without a non-expired token should succeed
token = Token.objects.create(user=self.user)
response = self.client.get(url, HTTP_AUTHORIZATION=f'Token {token.key}')
# Create v1 & v2 tokens
future = datetime.datetime(2100, 1, 1, tzinfo=datetime.timezone.utc)
token1 = Token.objects.create(version=1, user=self.user, expires=future)
token2 = Token.objects.create(version=2, user=self.user, expires=future)

# Request with a non-expired token should succeed
response = self.client.get(url, HTTP_AUTHORIZATION=f'Token {token1.token}')
self.assertEqual(response.status_code, 200)
response = self.client.get(url, HTTP_AUTHORIZATION=f'Bearer {TOKEN_PREFIX}{token2.key}.{token2.token}')
self.assertEqual(response.status_code, 200)

# Request with an expired token should fail
token.expires = datetime.datetime(2020, 1, 1, tzinfo=datetime.timezone.utc)
token.save()
response = self.client.get(url, HTTP_AUTHORIZATION=f'Token {token.key}')
past = datetime.datetime(2020, 1, 1, tzinfo=datetime.timezone.utc)
token1.expires = past
token1.save()
token2.expires = past
token2.save()
response = self.client.get(url, HTTP_AUTHORIZATION=f'Token {token1.key}')
self.assertEqual(response.status_code, 403)
response = self.client.get(url, HTTP_AUTHORIZATION=f'Bearer {TOKEN_PREFIX}{token2.key}')
self.assertEqual(response.status_code, 403)
@override_settings(LOGIN_REQUIRED=True, EXEMPT_VIEW_PERMISSIONS=['*'])
def test_token_write_enabled(self):
url = reverse('dcim-api:site-list')
data = {
'name': 'Site 1',
'slug': 'site-1',
}
data = [
{
'name': 'Site 1',
'slug': 'site-1',
},
{
'name': 'Site 2',
'slug': 'site-2',
},
]
self.add_permissions('dcim.view_site', 'dcim.add_site')

# Request with a write-disabled token should fail
token = Token.objects.create(user=self.user, write_enabled=False)
response = self.client.post(url, data, format='json', HTTP_AUTHORIZATION=f'Token {token.key}')
# Create v1 & v2 tokens
token1 = Token.objects.create(version=1, user=self.user, write_enabled=False)
token2 = Token.objects.create(version=2, user=self.user, write_enabled=False)

token1_header = f'Token {token1.token}'
token2_header = f'Bearer {TOKEN_PREFIX}{token2.key}.{token2.token}'

# GET request with a write-disabled token should succeed
response = self.client.get(url, HTTP_AUTHORIZATION=token1_header)
self.assertEqual(response.status_code, 200)
response = self.client.get(url, HTTP_AUTHORIZATION=token2_header)
self.assertEqual(response.status_code, 200)

# POST request with a write-disabled token should fail
response = self.client.post(url, data[0], format='json', HTTP_AUTHORIZATION=token1_header)
self.assertEqual(response.status_code, 403)
response = self.client.post(url, data[1], format='json', HTTP_AUTHORIZATION=token2_header)
self.assertEqual(response.status_code, 403)

# Request with a write-enabled token should succeed
token.write_enabled = True
token.save()
response = self.client.post(url, data, format='json', HTTP_AUTHORIZATION=f'Token {token.key}')
self.assertEqual(response.status_code, 403)
# POST request with a write-enabled token should succeed
token1.write_enabled = True
token1.save()
token2.write_enabled = True
token2.save()
response = self.client.post(url, data[0], format='json', HTTP_AUTHORIZATION=token1_header)
self.assertEqual(response.status_code, 201)
response = self.client.post(url, data[1], format='json', HTTP_AUTHORIZATION=token2_header)
self.assertEqual(response.status_code, 201)

@override_settings(LOGIN_REQUIRED=True, EXEMPT_VIEW_PERMISSIONS=['*'])
def test_token_allowed_ips(self):
url = reverse('dcim-api:site-list')

# Create v1 & v2 tokens
token1 = Token.objects.create(version=1, user=self.user, allowed_ips=['192.0.2.0/24'])
token2 = Token.objects.create(version=2, user=self.user, allowed_ips=['192.0.2.0/24'])

# Request from a non-allowed client IP should fail
token = Token.objects.create(user=self.user, allowed_ips=['192.0.2.0/24'])
response = self.client.get(url, HTTP_AUTHORIZATION=f'Token {token.key}', REMOTE_ADDR='127.0.0.1')
response = self.client.get(
url,
HTTP_AUTHORIZATION=f'Token {token1.token}',
REMOTE_ADDR='127.0.0.1'
)
self.assertEqual(response.status_code, 403)
response = self.client.get(
url,
HTTP_AUTHORIZATION=f'Bearer {TOKEN_PREFIX}{token2.key}.{token2.token}',
REMOTE_ADDR='127.0.0.1'
)
self.assertEqual(response.status_code, 403)

# Request with an expired token should fail
response = self.client.get(url, HTTP_AUTHORIZATION=f'Token {token.key}', REMOTE_ADDR='192.0.2.1')
# Request from an allowed client IP should succeed
response = self.client.get(
url,
HTTP_AUTHORIZATION=f'Token {token1.token}',
REMOTE_ADDR='192.0.2.1'
)
self.assertEqual(response.status_code, 200)
response = self.client.get(
url,
HTTP_AUTHORIZATION=f'Bearer {TOKEN_PREFIX}{token2.key}.{token2.token}',
REMOTE_ADDR='192.0.2.1'
)
self.assertEqual(response.status_code, 200)
@@ -427,7 +520,7 @@ class ObjectPermissionAPIViewTestCase(TestCase):
"""
self.user = User.objects.create(username='testuser')
self.token = Token.objects.create(user=self.user)
self.header = {'HTTP_AUTHORIZATION': 'Token {}'.format(self.token.key)}
self.header = {'HTTP_AUTHORIZATION': f'Bearer {TOKEN_PREFIX}{self.token.key}.{self.token.token}'}

@override_settings(EXEMPT_VIEW_PERMISSIONS=[])
def test_get_object(self):
@@ -6,7 +6,8 @@ from drf_spectacular.views import SpectacularAPIView, SpectacularRedocView, Spec

from account.views import LoginView, LogoutView
from netbox.api.views import APIRootView, StatusView
from netbox.graphql.schema import schema
from netbox.graphql.schema import schema_v1, schema_v2
from netbox.graphql.utils import get_default_schema
from netbox.graphql.views import NetBoxGraphQLView
from netbox.plugins.urls import plugin_patterns, plugin_api_patterns
from netbox.views import HomeView, MediaView, StaticMediaFailureView, SearchView, htmx
@@ -40,7 +41,7 @@ _patterns = [
# HTMX views
path('htmx/object-selector/', htmx.ObjectSelectorView.as_view(), name='htmx_object_selector'),

# API
# REST API
path('api/', APIRootView.as_view(), name='api-root'),
path('api/circuits/', include('circuits.api.urls')),
path('api/core/', include('core.api.urls')),
@@ -54,6 +55,7 @@ _patterns = [
path('api/wireless/', include('wireless.api.urls')),
path('api/status/', StatusView.as_view(), name='api-status'),

# REST API schema
path(
"api/schema/",
cache_page(timeout=86400, key_prefix=f"api_schema_{settings.RELEASE.version}")(
@@ -64,8 +66,10 @@ _patterns = [
path('api/schema/swagger-ui/', SpectacularSwaggerView.as_view(url_name='schema'), name='api_docs'),
path('api/schema/redoc/', SpectacularRedocView.as_view(url_name='schema'), name='api_redocs'),

# GraphQL
path('graphql/', NetBoxGraphQLView.as_view(schema=schema), name='graphql'),
# GraphQL API
path('graphql/', NetBoxGraphQLView.as_view(schema=get_default_schema()), name='graphql'),
path('graphql/v1/', NetBoxGraphQLView.as_view(schema=schema_v1), name='graphql_v1'),
path('graphql/v2/', NetBoxGraphQLView.as_view(schema=schema_v2), name='graphql_v2'),

# Serving static media in Django to pipe it through LoginRequiredMiddleware
path('media/<path:path>', MediaView.as_view(), name='media'),
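Editor's note: with the versioned routes above in place, a client can target a specific GraphQL schema explicitly. A minimal sketch follows; the hostname, token value, and query are illustrative placeholders, and field names depend on the schema version being queried.

import requests

GRAPHQL_V1_URL = 'https://netbox.example.com/graphql/v1/'  # placeholder host
GRAPHQL_V2_URL = 'https://netbox.example.com/graphql/v2/'
HEADERS = {
    'Authorization': 'Token 0123456789abcdef0123456789abcdef01234567',  # placeholder v1 token
    'Content-Type': 'application/json',
}
query = '{ site_list { id name } }'  # illustrative query

# The unversioned /graphql/ endpoint serves whichever schema GRAPHQL_DEFAULT_VERSION selects;
# the /v1/ and /v2/ endpoints pin the schema explicitly.
for url in (GRAPHQL_V1_URL, GRAPHQL_V2_URL):
    response = requests.post(url, json={'query': query}, headers=HEADERS)
    print(url, response.status_code)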
@@ -47,9 +47,9 @@ class HomeView(ConditionalLoginRequiredMixin, View):
))
dashboard = get_default_dashboard(config=DEFAULT_DASHBOARD).get_layout()

# Check whether a new release is available. (Only for staff/superusers.)
# Check whether a new release is available. (Only for superusers.)
new_release = None
if request.user.is_staff or request.user.is_superuser:
if request.user.is_superuser:
latest_release = cache.get('latest_release')
if latest_release:
release_version, release_url = latest_release
@@ -39,10 +39,6 @@
<th scope="row">{% trans "Superuser" %}</th>
<td>{% checkmark request.user.is_superuser %}</td>
</tr>
<tr>
<th scope="row">{% trans "Staff" %}</th>
<td>{% checkmark request.user.is_staff %}</td>
</tr>
</table>
</div>
</div>
@@ -1,62 +1,8 @@
{% extends 'generic/object.html' %}
{% load form_helpers %}
{% load helpers %}
{% extends 'users/token.html' %}
{% load i18n %}
{% load plugins %}

{% block breadcrumbs %}
<li class="breadcrumb-item"><a href="{% url 'account:usertoken_list' %}">{% trans "My API Tokens" %}</a></li>
<li class="breadcrumb-item">
<a href="{% url 'account:usertoken_list' %}">{% trans "My API Tokens" %}</a>
</li>
{% endblock breadcrumbs %}

{% block title %}{% trans "Token" %} {{ object }}{% endblock %}

{% block subtitle %}{% endblock %}

{% block content %}
<div class="row">
<div class="col col-md-12">
<div class="card">
<h2 class="card-header">{% trans "Token" %}</h2>
<table class="table table-hover attr-table">
<tr>
<th scope="row">{% trans "Key" %}</th>
<td>
{% if key %}
<div class="float-end">
{% copy_content "token_id" %}
</div>
<div id="token_id">{{ key }}</div>
{% else %}
{{ object.partial }}
{% endif %}
</td>
</tr>
<tr>
<th scope="row">{% trans "Description" %}</th>
<td>{{ object.description|placeholder }}</td>
</tr>
<tr>
<th scope="row">{% trans "Write enabled" %}</th>
<td>{% checkmark object.write_enabled %}</td>
</tr>
<tr>
<th scope="row">{% trans "Created" %}</th>
<td>{{ object.created|isodatetime }}</td>
</tr>
<tr>
<th scope="row">{% trans "Expires" %}</th>
<td>{{ object.expires|isodatetime|placeholder }}</td>
</tr>
<tr>
<th scope="row">{% trans "Last used" %}</th>
<td>{{ object.last_used|isodatetime|placeholder }}</td>
</tr>
<tr>
<th scope="row">{% trans "Allowed IPs" %}</th>
<td>{{ object.allowed_ips|join:", "|placeholder }}</td>
</tr>
</table>
</div>
</div>
</div>
{% endblock %}
@@ -95,7 +95,7 @@ Blocks:

{# Page content #}
<div class="page-wrapper">
<div id="page-content" {% htmx_boost %}>
<div id="page-content">

{# Page header #}
{% block header %}
@@ -27,15 +27,13 @@
<div class="mt-1 small text-secondary">
{% if request.user.is_superuser %}
{% trans "Admin" %}
{% elif request.user.is_staff %}
{% trans "Staff" %}
{% else %}
{% trans "User" %}
{% endif %}
</div>
</div>
</a>
<div class="dropdown-menu dropdown-menu-end dropdown-menu-arrow" {% htmx_boost %}>
<div class="dropdown-menu dropdown-menu-end dropdown-menu-arrow">
<a href="{% url 'account:profile' %}" class="dropdown-item">
<i class="mdi mdi-account"></i> {% trans "Profile" %}
</a>
@@ -37,7 +37,7 @@
path. Refer to <a href="{{ docs_url }}">the installation documentation</a> for further guidance.
{% endblocktrans %}
<ul>
{% if request.user.is_staff or request.user.is_superuser %}
{% if request.user.is_superuser %}
<li><code>STATIC_ROOT: <strong>{{ settings.STATIC_ROOT }}</strong></code></li>
{% endif %}
<li><code>STATIC_URL: <strong>{{ settings.STATIC_URL }}</strong></code></li>
@@ -14,9 +14,24 @@
<h2 class="card-header">{% trans "Token" %}</h2>
<table class="table table-hover attr-table">
<tr>
<th scope="row">{% trans "Key" %}</th>
<td>{% if settings.ALLOW_TOKEN_RETRIEVAL %}{{ object.key }}{% else %}{{ object.partial }}{% endif %}</td>
<th scope="row">{% trans "Version" %}</th>
<td>{{ object.version }}</td>
</tr>
{% if object.version == 1 %}
<tr>
<th scope="row">{% trans "Token" %}</th>
<td>{{ object.partial }}</td>
</tr>
{% else %}
<tr>
<th scope="row">{% trans "Key" %}</th>
<td>{{ object }}</td>
</tr>
<tr>
<th scope="row">{% trans "Pepper ID" %}</th>
<td>{{ object.pepper_id }}</td>
</tr>
{% endif %}
<tr>
<th scope="row">{% trans "User" %}</th>
<td>
@@ -35,10 +35,6 @@
<th scope="row">{% trans "Active" %}</th>
<td>{% checkmark object.is_active %}</td>
</tr>
<tr>
<th scope="row">{% trans "Staff" %}</th>
<td>{% checkmark object.is_staff %}</td>
</tr>
<tr>
<th scope="row">{% trans "Superuser" %}</th>
<td>{% checkmark object.is_superuser %}</td>
70
netbox/tenancy/migrations/0021_ci_collations.py
Normal file
@@ -0,0 +1,70 @@
from django.db import migrations, models

PATTERN_OPS_INDEXES = [
'tenancy_contactgroup_slug_5b0f3e75_like',
'tenancy_contactrole_name_44b01a1f_like',
'tenancy_contactrole_slug_c5837d7d_like',
'tenancy_tenant_slug_0716575e_like',
'tenancy_tenantgroup_name_53363199_like',
'tenancy_tenantgroup_slug_e2af1cb6_like',
]


def remove_indexes(apps, schema_editor):
for idx in PATTERN_OPS_INDEXES:
schema_editor.execute(f'DROP INDEX IF EXISTS {idx}')


class Migration(migrations.Migration):

dependencies = [
('tenancy', '0020_remove_contactgroupmembership'),
('dcim', '0217_ci_collations'),
]

operations = [
migrations.RunPython(
code=remove_indexes,
reverse_code=migrations.RunPython.noop,
),
migrations.AlterField(
model_name='contactgroup',
name='name',
field=models.CharField(db_collation='ci_natural_sort', max_length=100),
),
migrations.AlterField(
model_name='contactgroup',
name='slug',
field=models.SlugField(db_collation='case_insensitive', max_length=100),
),
migrations.AlterField(
model_name='contactrole',
name='name',
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
),
migrations.AlterField(
model_name='contactrole',
name='slug',
field=models.SlugField(db_collation='case_insensitive', max_length=100, unique=True),
),
migrations.AlterField(
model_name='tenant',
name='name',
field=models.CharField(db_collation='ci_natural_sort', max_length=100),
),
migrations.AlterField(
model_name='tenant',
name='slug',
field=models.SlugField(db_collation='case_insensitive', max_length=100),
),
migrations.AlterField(
model_name='tenantgroup',
name='name',
field=models.CharField(db_collation='ci_natural_sort', max_length=100, unique=True),
),
migrations.AlterField(
model_name='tenantgroup',
name='slug',
field=models.SlugField(db_collation='case_insensitive', max_length=100, unique=True),
),
]
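Editor's note: this migration only re-points existing columns at the ci_natural_sort and case_insensitive collations; the collations themselves come from dcim's 0217_ci_collations, which is not shown here. For readers unfamiliar with custom PostgreSQL collations in Django, a hedged sketch of how such a collation is typically created follows; the locale string and deterministic flag are illustrative, not taken from this changeset.

from django.contrib.postgres.operations import CreateCollation
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = []  # would follow the app's previous migration

    operations = [
        # A non-deterministic ICU collation compares strings case-insensitively.
        CreateCollation(
            'case_insensitive',
            provider='icu',
            locale='und-u-ks-level2',
            deterministic=False,
        ),
    ]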
@@ -55,7 +55,7 @@ class Contact(PrimaryModel):
name = models.CharField(
verbose_name=_('name'),
max_length=100,
db_collation="natural_sort"
db_collation='natural_sort',
)
title = models.CharField(
verbose_name=_('title'),

@@ -19,12 +19,13 @@ class TenantGroup(NestedGroupModel):
verbose_name=_('name'),
max_length=100,
unique=True,
db_collation="natural_sort"
db_collation='ci_natural_sort'
)
slug = models.SlugField(
verbose_name=_('slug'),
max_length=100,
unique=True
unique=True,
db_collation='case_insensitive'
)

class Meta:
@@ -41,11 +42,12 @@ class Tenant(ContactsMixin, PrimaryModel):
name = models.CharField(
verbose_name=_('name'),
max_length=100,
db_collation="natural_sort"
db_collation='ci_natural_sort',
)
slug = models.SlugField(
verbose_name=_('slug'),
max_length=100
max_length=100,
db_collation='case_insensitive',
)
group = models.ForeignKey(
to='tenancy.TenantGroup',
@@ -1,4 +1,3 @@
from django.conf import settings
from django.contrib.auth import authenticate
from rest_framework import serializers
from rest_framework.exceptions import AuthenticationFailed, PermissionDenied
@@ -15,14 +14,13 @@


class TokenSerializer(ValidatedModelSerializer):
key = serializers.CharField(
min_length=40,
max_length=40,
allow_blank=True,
token = serializers.CharField(
required=False,
write_only=not settings.ALLOW_TOKEN_RETRIEVAL
default=Token.generate,
)
user = UserSerializer(
nested=True
)
user = UserSerializer(nested=True)
allowed_ips = serializers.ListField(
child=IPNetworkSerializer(),
required=False,
@@ -33,15 +31,20 @@ class TokenSerializer(ValidatedModelSerializer):
class Meta:
model = Token
fields = (
'id', 'url', 'display_url', 'display', 'user', 'created', 'expires', 'last_used', 'key', 'write_enabled',
'description', 'allowed_ips',
'id', 'url', 'display_url', 'display', 'version', 'key', 'user', 'description', 'created', 'expires',
'last_used', 'write_enabled', 'pepper_id', 'allowed_ips', 'token',
)
brief_fields = ('id', 'url', 'display', 'key', 'write_enabled', 'description')
read_only_fields = ('key',)
brief_fields = ('id', 'url', 'display', 'version', 'key', 'write_enabled', 'description')

def to_internal_value(self, data):
if not getattr(self.instance, 'key', None) and 'key' not in data:
data['key'] = Token.generate_key()
return super().to_internal_value(data)
def get_fields(self):
fields = super().get_fields()

# Make user field read-only if updating an existing Token.
if self.instance is not None:
fields['user'].read_only = True

return fields

def validate(self, data):
@@ -75,8 +78,8 @@ class TokenProvisionSerializer(TokenSerializer):
class Meta:
model = Token
fields = (
'id', 'url', 'display_url', 'display', 'user', 'created', 'expires', 'last_used', 'key', 'write_enabled',
'description', 'allowed_ips', 'username', 'password',
'id', 'url', 'display_url', 'display', 'version', 'user', 'key', 'created', 'expires', 'last_used',
'write_enabled', 'description', 'allowed_ips', 'username', 'password', 'token',
)

def validate(self, data):
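Editor's note: TokenProvisionSerializer accepts a username and password and returns the token fields shown above. A hedged sketch of exercising it from a client follows; the provisioning route itself is not part of this diff (/api/users/tokens/provision/ is the conventional NetBox endpoint), and the host and credentials are placeholders.

import requests

NETBOX_URL = 'https://netbox.example.com'  # placeholder host

response = requests.post(
    f'{NETBOX_URL}/api/users/tokens/provision/',
    json={'username': 'alice', 'password': 'examplepassword'},  # placeholder credentials
)
response.raise_for_status()
token_data = response.json()
# For a hashed v2 token, the plaintext 'token' value is presumably only available
# at creation time, so capture and store it securely here.
print(token_data.get('version'), token_data.get('key'), token_data.get('token'))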
@@ -52,7 +52,7 @@ class UserSerializer(ValidatedModelSerializer):
model = User
fields = (
'id', 'url', 'display_url', 'display', 'username', 'password', 'first_name', 'last_name', 'email',
'is_staff', 'is_active', 'date_joined', 'last_login', 'groups', 'permissions',
'is_active', 'date_joined', 'last_login', 'groups', 'permissions',
)
brief_fields = ('id', 'url', 'display', 'username')
extra_kwargs = {
Some files were not shown because too many files have changed in this diff.