Mirror of https://github.com/netbox-community/netbox.git
Synced 2025-12-29 00:27:45 -06:00

Compare commits: 20660-scri...19724-grap (58 commits)
| Author | SHA1 | Date |
|---|---|---|
| | 76caae12fa | |
| | 26c91f01c6 | |
| | af55da008b | |
| | 810d1c2418 | |
| | 91b2d61ea4 | |
| | b7b7b00885 | |
| | 87505e0bb9 | |
| | 7d82493052 | |
| | 77c08b7bf9 | |
| | adad7c2209 | |
| | 595b343cd0 | |
| | 730aee9b26 | |
| | 8aa1e2802b | |
| | c2d19119cb | |
| | 0c4d0fa2e8 | |
| | 5ad6bd88f6 | |
| | 2bebfccf9b | |
| | b7cc4c418b | |
| | 37a9d03348 | |
| | a91af996d5 | |
| | bb290dc792 | |
| | fcdb7ff6c8 | |
| | 18a308ae3a | |
| | c63e60a62b | |
| | 82db8a9c02 | |
| | bb75bceec5 | |
| | 9a68cde95f | |
| | 6c723dfb1a | |
| | 9b85d92ad0 | |
| | 917a2c2618 | |
| | 6388705e57 | |
| | ac335c3d87 | |
| | a54c508da2 | |
| | d69042f26e | |
| | f6290dd7af | |
| | adce67a7cf | |
| | f82f084c02 | |
| | 43fc7fb58a | |
| | 11099b01bb | |
| | 5dc48f3a88 | |
| | 1ee23ba6fa | |
| | 23d7515b41 | |
| | 12818f1786 | |
| | f0ae0da1c7 | |
| | c30e4813b7 | |
| | 57a7afd548 | |
| | b4eaeead13 | |
| | 24fff6bd74 | |
| | b9567208d4 | |
| | cfcea7c941 | |
| | 21ba27fb39 | |
| | c0e4d1c1e3 | |
| | d95eaa7ba2 | |
| | 5506901867 | |
| | ec9da88134 | |
| | e221f1fffa | |
| | 530dad279a | |
| | b1439dc298 | |
.github/ISSUE_TEMPLATE/02-bug_report.yaml (vendored, 4 changes)

@@ -35,9 +35,9 @@ body:
       label: Python Version
       description: What version of Python are you currently running?
       options:
-        - "3.10"
-        - "3.11"
         - "3.12"
+        - "3.13"
+        - "3.14"
     validations:
       required: true
   - type: textarea
.github/workflows/ci.yml (vendored, 2 changes)

@@ -31,7 +31,7 @@ jobs:
       NETBOX_CONFIGURATION: netbox.configuration_testing
     strategy:
       matrix:
-        python-version: ['3.10', '3.11', '3.12']
+        python-version: ['3.12', '3.13']
         node-version: ['20.x']
     services:
       redis:
.pre-commit-config.yaml

@@ -1,6 +1,6 @@
 repos:
   - repo: https://github.com/astral-sh/ruff-pre-commit
-    rev: v0.14.1
+    rev: v0.6.9
     hooks:
       - id: ruff
         name: "Ruff linter"
contrib/openapi.json (1235 changes; file diff suppressed because it is too large)
@@ -2,7 +2,7 @@

 ## Local Authentication

-Local user accounts and groups can be created in NetBox under the "Authentication" section in the "Admin" menu. This section is available only to users with the "staff" permission enabled.
+Local user accounts and groups can be created in NetBox under the "Authentication" section in the "Admin" menu.

 At a minimum, each user account must have a username and password set. User accounts may also denote a first name, last name, and email address. [Permissions](../permissions.md) may also be assigned to individual users and/or groups as needed.
@@ -1,5 +1,15 @@
# GraphQL API Parameters

## GRAPHQL_DEFAULT_VERSION

!!! note "This parameter was introduced in NetBox v4.5."

Default: `1`

Designates the default version of the GraphQL API served by `/graphql/`. To access a specific version, append the version number to the URL, e.g. `/graphql/v2/`.

---

## GRAPHQL_ENABLED

!!! tip "Dynamic Configuration Parameter"
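Selecting the v2 API by default would then be a one-line setting in `configuration.py`; a minimal sketch (the value `2` is illustrative):

```python
# configuration.py (sketch): serve GraphQL v2 by default at /graphql/,
# while versioned URLs such as /graphql/v1/ remain reachable explicitly.
GRAPHQL_DEFAULT_VERSION = 2
```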
@@ -127,19 +127,3 @@ The list of groups that promote an remote User to Superuser on Login. If group i
Default: `[]` (Empty list)

The list of users that get promoted to Superuser on Login. If user isn't present in list on next Login, the Role gets revoked. (Requires `REMOTE_AUTH_ENABLED` and `REMOTE_AUTH_GROUP_SYNC_ENABLED` )

---

## REMOTE_AUTH_STAFF_GROUPS

Default: `[]` (Empty list)

The list of groups that promote an remote User to Staff on Login. If group isn't present on next Login, the Role gets revoked. (Requires `REMOTE_AUTH_ENABLED` and `REMOTE_AUTH_GROUP_SYNC_ENABLED` )

---

## REMOTE_AUTH_STAFF_USERS

Default: `[]` (Empty list)

The list of users that get promoted to Staff on Login. If user isn't present in list on next Login, the Role gets revoked. (Requires `REMOTE_AUTH_ENABLED` and `REMOTE_AUTH_GROUP_SYNC_ENABLED` )
@@ -23,6 +23,31 @@ ALLOWED_HOSTS = ['*']

---

## API_TOKEN_PEPPERS

!!! info "This parameter was introduced in NetBox v4.5."

[Cryptographic peppers](https://en.wikipedia.org/wiki/Pepper_(cryptography)) are employed to generate hashes of sensitive values on the server. This parameter defines the peppers used to hash v2 API tokens in NetBox. You must define at least one pepper before creating a v2 API token. See the [API documentation](../integrations/rest-api.md#authentication) for further information about how peppers are used.

```python
API_TOKEN_PEPPERS = {
    # DO NOT USE THIS EXAMPLE PEPPER IN PRODUCTION
    1: 'kp7ht*76fiQAhUi5dHfASLlYUE_S^gI^(7J^K5M!LfoH@vl&b_',
}
```

!!! warning "Peppers are sensitive"
    Treat pepper values as extremely sensitive. Consider populating peppers from environment variables at initialization time rather than defining them in the configuration file, if feasible.

Peppers must be at least 50 characters in length and should comprise a random string with a diverse character set. Consider using the Python script at `$INSTALL_ROOT/netbox/generate_secret_key.py` to generate a pepper value.

It is recommended to start with a pepper ID of `1`. Additional peppers can be introduced later as needed to begin rotating token hashes.

!!! tip
    Although NetBox will run without `API_TOKEN_PEPPERS` defined, the use of v2 API tokens will be unavailable.

---

## DATABASE

!!! warning "Legacy Configuration Parameter"
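As the warning above suggests, a pepper can also be read from the environment at startup instead of being hard-coded; a minimal sketch (the variable name `NETBOX_API_TOKEN_PEPPER_1` is illustrative, not an official setting):

```python
# configuration.py (sketch): load pepper 1 from the environment so the
# secret never lives in the configuration file itself.
import os

API_TOKEN_PEPPERS = {
    1: os.environ['NETBOX_API_TOKEN_PEPPER_1'],  # must be at least 50 characters
}
```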
@@ -1,16 +1,5 @@
# Security & Authentication Parameters

## ALLOW_TOKEN_RETRIEVAL

Default: `False`

!!! note
    The default value of this parameter changed from `True` to `False` in NetBox v4.3.0.

If disabled, the values of API tokens will not be displayed after each token's initial creation. A user **must** record the value of a token prior to its creation, or it will be lost. Note that this affects _all_ users, regardless of assigned permissions.

---

## ALLOWED_URL_SCHEMES

!!! tip "Dynamic Configuration Parameter"
@@ -131,17 +131,6 @@ self.log_info(f"Running as user {username} (IP: {ip_address})...")

For a complete list of available request parameters, please see the [Django documentation](https://docs.djangoproject.com/en/stable/ref/request-response/).

## Reading Data from Files

The Script class provides two convenience methods for reading data from files:

* `load_yaml`
* `load_json`

These two methods will load data in YAML or JSON format, respectively, from files within the local path (i.e. `SCRIPTS_ROOT`).

**Note:** These convenience methods are deprecated and will be removed in NetBox v4.4. These only work if running scripts within the local path, they will not work if using a storage other than ScriptFileSystemStorage.

## Logging

The Script object provides a set of convenient functions for recording messages at different severity levels:
@@ -404,61 +393,6 @@ A complete date & time. Returns a `datetime.datetime` object.

Custom scripts can be run via the web UI by navigating to the script, completing any required form data, and clicking the "run script" button. It is possible to schedule a script to be executed at specified time in the future. A scheduled script can be canceled by deleting the associated job result object.

#### Prefilling variables via URL parameters

Script form fields can be prefilled by appending query parameters to the script URL. Each parameter name must match the variable name defined on the script class. Prefilled values are treated as initial values and can be edited before execution. Multiple values can be supplied by repeating the same parameter. Query values must be percent-encoded where required (for example, spaces as `%20`).

Examples:

For string and integer variables, when a script defines:

```python
from extras.scripts import Script, StringVar, IntegerVar

class MyScript(Script):
    name = StringVar()
    count = IntegerVar()
```

the following URL prefills the `name` and `count` fields:

```
https://<netbox>/extras/scripts/<script_id>/?name=Branch42&count=3
```

For object variables (`ObjectVar`), supply the object's primary key (PK):

```
https://<netbox>/extras/scripts/<script_id>/?device=1
```

If an object ID cannot be resolved or the object is not visible to the requesting user, the field remains unpopulated.

Supported variable types:

| Variable class           | Expected input                  | Example query string                        |
|--------------------------|---------------------------------|---------------------------------------------|
| `StringVar`              | string (percent-encoded)        | `?name=Branch42`                            |
| `TextVar`                | string (percent-encoded)        | `?notes=Initial%20value`                    |
| `IntegerVar`             | integer                         | `?count=3`                                  |
| `DecimalVar`             | decimal number                  | `?ratio=0.75`                               |
| `BooleanVar`             | value → `True`; empty → `False` | `?enabled=true` (True), `?enabled=` (False) |
| `ChoiceVar`              | choice value (not label)        | `?role=edge`                                |
| `MultiChoiceVar`         | choice values (repeat)          | `?roles=edge&roles=core`                    |
| `ObjectVar(Device)`      | PK (integer)                    | `?device=1`                                 |
| `MultiObjectVar(Device)` | PKs (repeat)                    | `?devices=1&devices=2`                      |
| `IPAddressVar`           | IP address                      | `?ip=198.51.100.10`                         |
| `IPAddressWithMaskVar`   | IP address with mask            | `?addr=192.0.2.1/24`                        |
| `IPNetworkVar`           | IP network prefix               | `?network=2001:db8::/64`                    |
| `DateVar`                | date `YYYY-MM-DD`               | `?date=2025-01-05`                          |
| `DateTimeVar`            | ISO datetime                    | `?when=2025-01-05T14:30:00`                 |
| `FileVar`                | (not supported)                 |                                             |

!!! note
    - The parameter names above are examples; use the actual variable attribute names defined by the script.
    - For `BooleanVar`, only an empty value (`?enabled=`) unchecks the box; any other value including `false` or `0` checks it.
    - File uploads (`FileVar`) cannot be prefilled via URL parameters.

### Via the API

To run a script via the REST API, issue a POST request to the script's endpoint specifying the form data and commitment. For example, to run a script named `example.MyReport`, we would make a request such as the following:
@@ -7,7 +7,7 @@ Getting started with NetBox development is pretty straightforward, and should fe
 * A Linux system or compatible environment
 * A PostgreSQL server, which can be installed locally [per the documentation](../installation/1-postgresql.md)
 * A Redis server, which can also be [installed locally](../installation/2-redis.md)
-* Python 3.10 or later
+* Python 3.12 or later

 ### 1. Fork the Repo
@@ -8,7 +8,7 @@ NetBox's REST API, powered by the [Django REST Framework](https://www.django-res

 ```no-highlight
 curl -s -X POST \
--H "Authorization: Token $TOKEN" \
+-H "Authorization: Bearer $TOKEN" \
 -H "Content-Type: application/json" \
 http://netbox/api/ipam/prefixes/ \
 --data '{"prefix": "192.0.2.0/24", "site": {"name": "Branch 12"}}'
@@ -34,9 +34,6 @@ Sets the default number of rows displayed on paginated tables.
### Paginator placement
Controls where pagination controls are rendered relative to a table.

### HTMX navigation (experimental)
Enables partial-page navigation for supported views. Disable this preference if unexpected behavior is observed.

### Striped table rows
Toggles alternating row backgrounds on tables.
@@ -6,8 +6,8 @@ This section of the documentation discusses installing and configuring the NetBo

 Begin by installing all system packages required by NetBox and its dependencies.

-!!! warning "Python 3.10 or later required"
-    NetBox supports Python 3.10, 3.11, and 3.12.
+!!! warning "Python 3.12 or later required"
+    NetBox supports only Python 3.12 or later.

 ```no-highlight
 sudo apt install -y python3 python3-pip python3-venv python3-dev \
@@ -15,7 +15,7 @@ build-essential libxml2-dev libxslt1-dev libffi-dev libpq-dev \
 libssl-dev zlib1g-dev
 ```

-Before continuing, check that your installed Python version is at least 3.10:
+Before continuing, check that your installed Python version is at least 3.12:

 ```no-highlight
 python3 -V
@@ -120,6 +120,23 @@ If you are not yet sure what the domain name and/or IP address of the NetBox ins
ALLOWED_HOSTS = ['*']
```

### API_TOKEN_PEPPERS

Define at least one random cryptographic pepper, identified by a numeric ID starting at 1. This will be used to generate SHA256 checksums for API tokens.

```python
API_TOKEN_PEPPERS = {
    # DO NOT USE THIS EXAMPLE PEPPER IN PRODUCTION
    1: 'kp7ht*76fiQAhUi5dHfASLlYUE_S^gI^(7J^K5M!LfoH@vl&b_',
}
```

!!! tip
    As with [`SECRET_KEY`](#secret_key) below, you can use the `generate_secret_key.py` script to generate a random pepper:
    ```no-highlight
    python3 ../generate_secret_key.py
    ```

### DATABASES

This parameter holds the PostgreSQL database configuration details. The default database must be defined; additional databases may be defined as needed e.g. by plugins.
@@ -235,10 +252,10 @@ Once NetBox has been configured, we're ready to proceed with the actual installa
 sudo /opt/netbox/upgrade.sh
 ```

-Note that **Python 3.10 or later is required** for NetBox v4.0 and later releases. If the default Python installation on your server is set to a lesser version, pass the path to the supported installation as an environment variable named `PYTHON`. (Note that the environment variable must be passed _after_ the `sudo` command.)
+Note that **Python 3.12 or later is required** for NetBox v4.5 and later releases. If the default Python installation on your server is set to a lesser version, pass the path to the supported installation as an environment variable named `PYTHON`. (Note that the environment variable must be passed _after_ the `sudo` command.)

 ```no-highlight
-sudo PYTHON=/usr/bin/python3.10 /opt/netbox/upgrade.sh
+sudo PYTHON=/usr/bin/python3.12 /opt/netbox/upgrade.sh
 ```

 !!! note
@@ -60,6 +60,3 @@ You should see output similar to the following:
If the NetBox service fails to start, issue the command `journalctl -eu netbox` to check for log messages that may indicate the problem.

Once you've verified that the WSGI workers are up and running, move on to HTTP server setup.

!!! note
    There is a bug in the current stable release of gunicorn (v21.2.0) where automatic restarts of the worker processes can result in 502 errors under heavy load. (See [gunicorn bug #3038](https://github.com/benoitc/gunicorn/issues/3038) for more detail.) Users who encounter this issue may opt to downgrade to an earlier, unaffected release of gunicorn (`pip install gunicorn==20.1.0`). Note, however, that this earlier release does not officially support Python 3.11.
@@ -121,7 +121,6 @@ AUTH_LDAP_MIRROR_GROUPS = True
 # Define special user types using groups. Exercise great caution when assigning superuser status.
 AUTH_LDAP_USER_FLAGS_BY_GROUP = {
     "is_active": "cn=active,ou=groups,dc=example,dc=com",
-    "is_staff": "cn=staff,ou=groups,dc=example,dc=com",
     "is_superuser": "cn=superuser,ou=groups,dc=example,dc=com"
 }
@@ -134,7 +133,6 @@ AUTH_LDAP_CACHE_TIMEOUT = 3600
 ```

 * `is_active` - All users must be mapped to at least this group to enable authentication. Without this, users cannot log in.
-* `is_staff` - Users mapped to this group are enabled for access to the administration tools; this is the equivalent of checking the "staff status" box on a manually created user. This doesn't grant any specific permissions.
 * `is_superuser` - Users mapped to this group will be granted superuser status. Superusers are implicitly granted all permissions.

 !!! warning
@@ -248,7 +246,6 @@ AUTH_LDAP_MIRROR_GROUPS = True
 # Define special user types using groups. Exercise great caution when assigning superuser status.
 AUTH_LDAP_USER_FLAGS_BY_GROUP = {
     "is_active": "cn=active,ou=groups,dc=example,dc=com",
-    "is_staff": "cn=staff,ou=groups,dc=example,dc=com",
     "is_superuser": "cn=superuser,ou=groups,dc=example,dc=com"
 }
@@ -27,7 +27,7 @@ The following sections detail how to set up a new instance of NetBox:

 | Dependency | Supported Versions |
 |------------|--------------------|
-| Python     | 3.10, 3.11, 3.12   |
+| Python     | 3.12, 3.13, 3.14   |
 | PostgreSQL | 14+                |
 | Redis      | 4.0+               |
@@ -19,7 +19,7 @@ NetBox requires the following dependencies:

 | Dependency | Supported Versions |
 |------------|--------------------|
-| Python     | 3.10, 3.11, 3.12   |
+| Python     | 3.12, 3.13, 3.14   |
 | PostgreSQL | 14+                |
 | Redis      | 4.0+               |
@@ -27,6 +27,7 @@ NetBox requires the following dependencies:

 | NetBox Version | Python min | Python max | PostgreSQL min | Redis min | Documentation |
 |:--------------:|:----------:|:----------:|:--------------:|:---------:|:-------------:|
+| 4.5            | 3.12       | 3.14       | 14             | 4.0       | [Link](https://github.com/netbox-community/netbox/blob/v4.5.0/docs/installation/index.md) |
 | 4.4            | 3.10       | 3.12       | 14             | 4.0       | [Link](https://github.com/netbox-community/netbox/blob/v4.4.0/docs/installation/index.md) |
 | 4.3            | 3.10       | 3.12       | 14             | 4.0       | [Link](https://github.com/netbox-community/netbox/blob/v4.3.0/docs/installation/index.md) |
 | 4.2            | 3.10       | 3.12       | 13             | 4.0       | [Link](https://github.com/netbox-community/netbox/blob/v4.2.0/docs/installation/index.md) |
@@ -130,7 +131,7 @@ sudo ./upgrade.sh
 If the default version of Python is not at least 3.10, you'll need to pass the path to a supported Python version as an environment variable when calling the upgrade script. For example:

 ```no-highlight
-sudo PYTHON=/usr/bin/python3.10 ./upgrade.sh
+sudo PYTHON=/usr/bin/python3.12 ./upgrade.sh
 ```

 !!! note
@@ -11,7 +11,7 @@ curl -H "Authorization: Token $TOKEN" \
 -H "Content-Type: application/json" \
 -H "Accept: application/json" \
 http://netbox/graphql/ \
---data '{"query": "query {circuit_list(filters:{status: STATUS_ACTIVE}) {cid provider {name}}}"}'
+--data '{"query": "query {circuit_list(filters:{status: STATUS_ACTIVE}) {results {cid provider {name}}}}"}'
 ```

 The response will include the requested data formatted as JSON:
@@ -36,6 +36,30 @@ The response will include the requested data formatted as JSON:
  }
}
```
If using the GraphQL API v2, the format will be:

```json
{
  "data": {
    "circuit_list": {
      "results": [
        {
          "cid": "1002840283",
          "provider": {
            "name": "CenturyLink"
          }
        },
        {
          "cid": "1002840457",
          "provider": {
            "name": "CenturyLink"
          }
        }
      ]
    }
  }
}
```

!!! note
    It's recommended to pass the return data through a JSON parser such as `jq` for better readability.
@@ -47,12 +71,15 @@ NetBox provides both a singular and plural query field for each object type:

 For example, query `device(id:123)` to fetch a specific device (identified by its unique ID), and query `device_list` (with an optional set of filters) to fetch all devices.

+!!! note "Changed in NetBox v4.5"
+    If using the GraphQL API v2, list queries now return paginated results. The actual objects are contained within the `results` field of the response, along with `total_count` and `page_info` fields for pagination metadata. Prior to v4.5, list queries returned objects directly as an array.
+
 For more detail on constructing GraphQL queries, see the [GraphQL queries documentation](https://graphql.org/learn/queries/). For filtering and lookup syntax, please refer to the [Strawberry Django documentation](https://strawberry.rocks/docs/django/guide/filters).

 ## Filtering

 !!! note "Changed in NetBox v4.3"
-    The filtering syntax fo the GraphQL API has changed substantially in NetBox v4.3.
+    The filtering syntax for the GraphQL API has changed substantially in NetBox v4.3.

 Filters can be specified as key-value pairs within parentheses immediately following the query name. For example, the following will return only active sites:
@@ -67,6 +94,21 @@ query {
  }
}
```
If using the GraphQL API v2, the format will be:

```
query {
  site_list(
    filters: {
      status: STATUS_ACTIVE
    }
  ) {
    results {
      name
    }
  }
}
```

Filters can be combined with logical operators, such as `OR` and `NOT`. For example, the following will return every site that is planned _or_ assigned to a tenant named Foo:
@@ -88,6 +130,28 @@ query {
  }
}
```
If using the GraphQL API v2, the format will be:

```
query {
  site_list(
    filters: {
      status: STATUS_PLANNED,
      OR: {
        tenant: {
          name: {
            exact: "Foo"
          }
        }
      }
    }
  ) {
    results {
      name
    }
  }
}
```

Filtering can also be applied to related objects. For example, the following query will return only enabled interfaces for each device:
@@ -102,6 +166,21 @@ query {
  }
}
```
If using the GraphQL API v2, the format will be:

```
query {
  device_list {
    results {
      id
      name
      interfaces(filters: {enabled: {exact: true}}) {
        name
      }
    }
  }
}
```

## Multiple Return Types
@@ -128,6 +207,31 @@ Certain queries can return multiple types of objects, for example cable terminat
  }
}
```
If using the GraphQL API v2, the format will be:

```
{
  cable_list {
    results {
      id
      a_terminations {
        ... on CircuitTerminationType {
          id
          class_type
        }
        ... on ConsolePortType {
          id
          class_type
        }
        ... on ConsoleServerPortType {
          id
          class_type
        }
      }
    }
  }
}
```

The field "class_type" is an easy way to distinguish what type of object it is when viewing the returned data, or when filtering. It contains the class name, for example "CircuitTermination" or "ConsoleServerPort".
@@ -142,6 +246,47 @@ query {
  }
}
```
### Pagination in GraphQL API V2

All list queries return paginated results using the `OffsetPaginated` type, which includes:

- `results`: The list of objects matching the query
- `total_count`: The total number of objects matching the filters (without pagination)
- `page_info`: Pagination metadata including `offset` and `limit`

By default, queries return up to 100 results. You can control pagination by specifying the `pagination` parameter with `offset` and `limit` values:

```
query {
  device_list(pagination: { offset: 0, limit: 20 }) {
    total_count
    page_info {
      offset
      limit
    }
    results {
      id
      name
    }
  }
}
```

If you don't need pagination metadata, you can simply query the `results`:

```
query {
  device_list {
    results {
      id
      name
    }
  }
}
```

!!! note
    When not specifying the `pagination` parameter, avoid querying `page_info.limit` as it may return an undefined value. Either provide explicit pagination parameters or only query the `results` and `total_count` fields.

## Authentication
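To tie the pagination fields together, a small client sketch that walks `device_list` page by page against the v2 endpoint (the URL and token are placeholders; the query shape follows the examples above, and the header uses the `Token` scheme shown in the GraphQL curl example):

```python
import requests

GRAPHQL_URL = "https://netbox/graphql/v2/"  # versioned endpoint per the URL scheme above
HEADERS = {"Authorization": "Token <token>", "Content-Type": "application/json"}

QUERY = """
query ($offset: Int!, $limit: Int!) {
  device_list(pagination: { offset: $offset, limit: $limit }) {
    total_count
    results { id name }
  }
}
"""

def iter_devices(limit=100):
    """Yield every device by advancing the offset until total_count is reached."""
    offset = 0
    while True:
        resp = requests.post(
            GRAPHQL_URL,
            json={"query": QUERY, "variables": {"offset": offset, "limit": limit}},
            headers=HEADERS,
        )
        resp.raise_for_status()
        payload = resp.json()["data"]["device_list"]
        yield from payload["results"]
        offset += limit
        if offset >= payload["total_count"]:
            break
```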
@@ -80,7 +80,7 @@ Likewise, the site, rack, and device objects are located under the "DCIM" applic

 The full hierarchy of available endpoints can be viewed by navigating to the API root in a web browser.

-Each model generally has two views associated with it: a list view and a detail view. The list view is used to retrieve a list of multiple objects and to create new objects. The detail view is used to retrieve, update, or delete an single existing object. All objects are referenced by their numeric primary key (`id`).
+Each model generally has two views associated with it: a list view and a detail view. The list view is used to retrieve a list of multiple objects and to create new objects. The detail view is used to retrieve, update, or delete a single existing object. All objects are referenced by their numeric primary key (`id`).

 * `/api/dcim/devices/` - List existing devices or create a new device
 * `/api/dcim/devices/123/` - Retrieve, update, or delete the device with ID 123
@@ -653,18 +653,22 @@ The NetBox REST API primarily employs token-based authentication. For convenienc

### Tokens

A token is a unique identifier mapped to a NetBox user account. Each user may have one or more tokens which he or she can use for authentication when making REST API requests. To create a token, navigate to the API tokens page under your user profile.
A token is a secret, unique identifier mapped to a NetBox user account. Each user may have one or more tokens which he or she can use for authentication when making REST API requests. To create a token, navigate to the API tokens page under your user profile. When creating a token, NetBox will automatically populate a randomly-generated token value.

!!! note "Tokens cannot be retrieved once created"
    Once a token has been created, its plaintext value cannot be retrieved. For this reason, you must take care to securely record the token locally immediately upon its creation. If a token plaintext is lost, it cannot be recovered: A new token must be created.

By default, all users can create and manage their own REST API tokens under the user control panel in the UI or via the REST API. This ability can be disabled by overriding the [`DEFAULT_PERMISSIONS`](../configuration/security.md#default_permissions) configuration parameter.

Each token contains a 160-bit key represented as 40 hexadecimal characters. When creating a token, you'll typically leave the key field blank so that a random key will be automatically generated. However, NetBox allows you to specify a key in case you need to restore a previously deleted token to operation.

Additionally, a token can be set to expire at a specific time. This can be useful if an external client needs to be granted temporary access to NetBox.

!!! info "Restricting Token Retrieval"
    The ability to retrieve the key value of a previously-created API token can be restricted by disabling the [`ALLOW_TOKEN_RETRIEVAL`](../configuration/security.md#allow_token_retrieval) configuration parameter.

#### v1 and v2 Tokens

### Restricting Write Operations
Beginning with NetBox v4.5, two versions of API token are supported, denoted as v1 and v2. Users are strongly encouraged to create only v2 tokens and to discontinue the use of v1 tokens. Support for v1 tokens will be removed in a future NetBox release.

v2 API tokens offer much stronger security. The token plaintext given at creation time is hashed together with a configured [cryptographic pepper](../configuration/required-parameters.md#api_token_peppers) to generate a unique checksum. This checksum is irreversible; the token plaintext is never stored on the server and thus cannot be retrieved even with database-level access.

#### Restricting Write Operations

By default, a token can be used to perform all actions via the API that a user would be permitted to do via the web UI. Deselecting the "write enabled" option will restrict API requests made with the token to read operations (e.g. GET) only.
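A generic illustration of the peppered-hash idea described above (this is not NetBox's exact implementation; it only shows why a stored checksum cannot be reversed into the plaintext):

```python
import hashlib
import hmac

def token_checksum(plaintext: str, pepper: str) -> str:
    # HMAC-SHA256 of the token plaintext keyed with a server-side pepper:
    # only this digest is stored, so the plaintext cannot be recovered from storage.
    return hmac.new(pepper.encode(), plaintext.encode(), hashlib.sha256).hexdigest()

# Example pepper value taken from the configuration docs above (do not reuse it).
pepper = "kp7ht*76fiQAhUi5dHfASLlYUE_S^gI^(7J^K5M!LfoH@vl&b_"
print(token_checksum("zjebxBPzICiPbWz0Wtx0fTL7bCKXKGTYhNzkgC2S", pepper))
```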
@@ -681,10 +685,22 @@ It is possible to provision authentication tokens for other users via the REST A

### Authenticating to the API

An authentication token is attached to a request by setting the `Authorization` header to the string `Token` followed by a space and the user's token:
An authentication token is included with a request in its `Authorization` header. The format of the header value depends on the version of token in use. v2 tokens use the following form, concatenating the token's prefix (`nbt_`) and key with its plaintext value, separated by a period:

```
$ curl -H "Authorization: Token $TOKEN" \
Authorization: Bearer nbt_<key>.<token>
```

Legacy v1 tokens use the prefix `Token` rather than `Bearer`, and include only the token plaintext. (v1 tokens do not have a key.)

```
Authorization: Token <token>
```

Below is an example REST API request utilizing a v2 token.

```
$ curl -H "Authorization: Bearer nbt_4F9DAouzURLb.zjebxBPzICiPbWz0Wtx0fTL7bCKXKGTYhNzkgC2S" \
-H "Accept: application/json; indent=4" \
https://netbox/api/dcim/sites/
{
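A small client-side illustration of the two header formats documented above, using the `requests` library (the token values are placeholders, not real credentials):

```python
import requests

NETBOX_URL = "https://netbox/api/dcim/sites/"

# v2 token: "Bearer" scheme; value is the nbt_ prefix and key, a period, then the plaintext.
v2_headers = {
    "Authorization": "Bearer nbt_<key>.<token>",
    "Accept": "application/json",
}

# Legacy v1 token: "Token" scheme with the plaintext only.
v1_headers = {
    "Authorization": "Token <token>",
    "Accept": "application/json",
}

response = requests.get(NETBOX_URL, headers=v2_headers)
response.raise_for_status()
print(response.json())
```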
@@ -173,12 +173,12 @@ classifiers=[
     'Intended Audience :: Developers',
     'Natural Language :: English',
     "Programming Language :: Python :: 3 :: Only",
-    'Programming Language :: Python :: 3.10',
-    'Programming Language :: Python :: 3.11',
     'Programming Language :: Python :: 3.12',
+    'Programming Language :: Python :: 3.13',
+    'Programming Language :: Python :: 3.14',
 ]

-requires-python = ">=3.10.0"
+requires-python = ">=3.12.0"

 ```
@@ -195,7 +195,7 @@ python3 -m venv ~/.virtualenvs/my_plugin
 You can make NetBox available within this environment by creating a path file pointing to its location. This will add NetBox to the Python path upon activation. (Be sure to adjust the command below to specify your actual virtual environment path, Python version, and NetBox installation.)

 ```shell
-echo /opt/netbox/netbox > $VENV/lib/python3.10/site-packages/netbox.pth
+echo /opt/netbox/netbox > $VENV/lib/python3.12/site-packages/netbox.pth
 ```

 ## Development Installation
@@ -64,14 +64,17 @@ item1 = PluginMenuItem(

A `PluginMenuItem` has the following attributes:

| Attribute       | Required | Description                                                                                              |
|-----------------|----------|----------------------------------------------------------------------------------------------------------|
| `link`          | Yes      | Name of the URL path to which this menu item links                                                       |
| `link_text`     | Yes      | The text presented to the user                                                                           |
| `permissions`   | -        | A list of permissions required to display this link                                                      |
| `auth_required` | -        | Display only for authenticated users                                                                     |
| `staff_only`    | -        | Display only for users who have `is_staff` set to true (any specified permissions will also be required) |
| `buttons`       | -        | An iterable of PluginMenuButton instances to include                                                     |

| Attribute       | Required | Description                                            |
|-----------------|----------|--------------------------------------------------------|
| `link`          | Yes      | Name of the URL path to which this menu item links     |
| `link_text`     | Yes      | The text presented to the user                         |
| `permissions`   | -        | A list of permissions required to display this link    |
| `auth_required` | -        | Display only for authenticated users                   |
| `staff_only`    | -        | Display only for superusers                            |
| `buttons`       | -        | An iterable of PluginMenuButton instances to include   |

!!! note "Changed in NetBox v4.5"
    In releases prior to NetBox v4.5, `staff_only` restricted display of a menu item to only users with `is_staff` set to True. In NetBox v4.5, the `is_staff` flag was removed from the user model. Menu items with `staff_only` set to True are now displayed only for superusers.
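A minimal sketch of a menu item using these attributes (the URL name, permission string, and the `netbox.plugins` import path used by recent releases are illustrative assumptions):

```python
from netbox.plugins import PluginMenuItem

# Hypothetical plugin navigation entry; note that staff_only now means
# "superusers only" in NetBox v4.5 and later.
menu_items = (
    PluginMenuItem(
        link='plugins:my_plugin:widget_list',
        link_text='Widgets',
        permissions=['my_plugin.view_widget'],
        staff_only=True,
    ),
)
```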
## Menu Buttons
@@ -1,57 +0,0 @@
from django.utils.translation import gettext as _

from account.models import UserToken
from netbox.tables import NetBoxTable, columns

__all__ = (
    'UserTokenTable',
)


TOKEN = """<samp><span id="token_{{ record.pk }}">{{ record }}</span></samp>"""

ALLOWED_IPS = """{{ value|join:", " }}"""

COPY_BUTTON = """
{% if settings.ALLOW_TOKEN_RETRIEVAL %}
  {% copy_content record.pk prefix="token_" color="success" %}
{% endif %}
"""


class UserTokenTable(NetBoxTable):
    """
    Table for users to manager their own API tokens under account views.
    """
    key = columns.TemplateColumn(
        verbose_name=_('Key'),
        template_code=TOKEN,
    )
    write_enabled = columns.BooleanColumn(
        verbose_name=_('Write Enabled')
    )
    created = columns.DateTimeColumn(
        timespec='minutes',
        verbose_name=_('Created'),
    )
    expires = columns.DateTimeColumn(
        timespec='minutes',
        verbose_name=_('Expires'),
    )
    last_used = columns.DateTimeColumn(
        verbose_name=_('Last Used'),
    )
    allowed_ips = columns.TemplateColumn(
        verbose_name=_('Allowed IPs'),
        template_code=ALLOWED_IPS
    )
    actions = columns.ActionsColumn(
        actions=('edit', 'delete'),
        extra_buttons=COPY_BUTTON
    )

    class Meta(NetBoxTable.Meta):
        model = UserToken
        fields = (
            'pk', 'id', 'key', 'description', 'write_enabled', 'created', 'expires', 'last_used', 'allowed_ips',
        )
@@ -26,8 +26,9 @@ from extras.tables import BookmarkTable, NotificationTable, SubscriptionTable
 from netbox.authentication import get_auth_backend_display, get_saml_idps
 from netbox.config import get_config
 from netbox.views import generic
-from users import forms, tables
+from users import forms
 from users.models import UserConfig
+from users.tables import TokenTable
 from utilities.request import safe_for_redirect
 from utilities.string import remove_linebreaks
 from utilities.views import register_model_view
@@ -328,7 +329,8 @@ class UserTokenListView(LoginRequiredMixin, View):

     def get(self, request):
         tokens = UserToken.objects.filter(user=request.user)
-        table = tables.UserTokenTable(tokens)
+        table = TokenTable(tokens)
+        table.columns.hide('user')
         table.configure(request)

         return render(request, 'account/token_list.html', {
@@ -343,11 +345,9 @@ class UserTokenView(LoginRequiredMixin, View):

     def get(self, request, pk):
         token = get_object_or_404(UserToken.objects.filter(user=request.user), pk=pk)
-        key = token.key if settings.ALLOW_TOKEN_RETRIEVAL else None

         return render(request, 'account/token.html', {
             'object': token,
-            'key': key,
         })
@@ -2,12 +2,13 @@ from typing import List

 import strawberry
 import strawberry_django
+from strawberry_django.pagination import OffsetPaginated

 from .types import *


 @strawberry.type(name="Query")
-class CircuitsQuery:
+class CircuitsQueryV1:
     circuit: CircuitType = strawberry_django.field()
     circuit_list: List[CircuitType] = strawberry_django.field()
@@ -40,3 +41,41 @@ class CircuitsQuery:

    virtual_circuit_type: VirtualCircuitTypeType = strawberry_django.field()
    virtual_circuit_type_list: List[VirtualCircuitTypeType] = strawberry_django.field()


@strawberry.type(name="Query")
class CircuitsQuery:
    circuit: CircuitType = strawberry_django.field()
    circuit_list: OffsetPaginated[CircuitType] = strawberry_django.offset_paginated()

    circuit_termination: CircuitTerminationType = strawberry_django.field()
    circuit_termination_list: OffsetPaginated[CircuitTerminationType] = strawberry_django.offset_paginated()

    circuit_type: CircuitTypeType = strawberry_django.field()
    circuit_type_list: OffsetPaginated[CircuitTypeType] = strawberry_django.offset_paginated()

    circuit_group: CircuitGroupType = strawberry_django.field()
    circuit_group_list: OffsetPaginated[CircuitGroupType] = strawberry_django.offset_paginated()

    circuit_group_assignment: CircuitGroupAssignmentType = strawberry_django.field()
    circuit_group_assignment_list: OffsetPaginated[CircuitGroupAssignmentType] = strawberry_django.offset_paginated()

    provider: ProviderType = strawberry_django.field()
    provider_list: OffsetPaginated[ProviderType] = strawberry_django.offset_paginated()

    provider_account: ProviderAccountType = strawberry_django.field()
    provider_account_list: OffsetPaginated[ProviderAccountType] = strawberry_django.offset_paginated()

    provider_network: ProviderNetworkType = strawberry_django.field()
    provider_network_list: OffsetPaginated[ProviderNetworkType] = strawberry_django.offset_paginated()

    virtual_circuit: VirtualCircuitType = strawberry_django.field()
    virtual_circuit_list: OffsetPaginated[VirtualCircuitType] = strawberry_django.offset_paginated()

    virtual_circuit_termination: VirtualCircuitTerminationType = strawberry_django.field()
    virtual_circuit_termination_list: OffsetPaginated[VirtualCircuitTerminationType] = (
        strawberry_django.offset_paginated()
    )

    virtual_circuit_type: VirtualCircuitTypeType = strawberry_django.field()
    virtual_circuit_type_list: OffsetPaginated[VirtualCircuitTypeType] = strawberry_django.offset_paginated()
@@ -9,7 +9,6 @@ from drf_spectacular.utils import OpenApiParameter, extend_schema
 from rest_framework import viewsets
 from rest_framework.decorators import action
 from rest_framework.exceptions import PermissionDenied
-from rest_framework.permissions import IsAdminUser
 from rest_framework.response import Response
 from rest_framework.routers import APIRootView
 from rest_framework.viewsets import ReadOnlyModelViewSet
@@ -24,7 +23,7 @@ from netbox.api.authentication import IsAuthenticatedOrLoginNotRequired
from netbox.api.metadata import ContentTypeMetadata
from netbox.api.pagination import LimitOffsetListPagination
from netbox.api.viewsets import NetBoxModelViewSet, NetBoxReadOnlyModelViewSet

from utilities.api import IsSuperuser
from . import serializers
@@ -100,7 +99,7 @@ class BaseRQViewSet(viewsets.ViewSet):
     """
     Base class for RQ view sets. Provides a list() method. Subclasses must implement get_data().
     """
-    permission_classes = [IsAdminUser]
+    permission_classes = [IsSuperuser]
     serializer_class = None

     def get_data(self):
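For context, a DRF permission of this kind is typically a thin `BasePermission` subclass; a minimal sketch of what an `IsSuperuser` check can look like (not necessarily the exact implementation in `utilities.api`):

```python
from rest_framework.permissions import BasePermission


class IsSuperuser(BasePermission):
    """Allow access only to active superusers."""

    def has_permission(self, request, view):
        return bool(request.user and request.user.is_active and request.user.is_superuser)
```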
@@ -2,14 +2,24 @@ from typing import List

 import strawberry
 import strawberry_django
+from strawberry_django.pagination import OffsetPaginated

 from .types import *


 @strawberry.type(name="Query")
-class CoreQuery:
+class CoreQueryV1:
     data_file: DataFileType = strawberry_django.field()
     data_file_list: List[DataFileType] = strawberry_django.field()

     data_source: DataSourceType = strawberry_django.field()
     data_source_list: List[DataSourceType] = strawberry_django.field()
+
+
+@strawberry.type(name="Query")
+class CoreQuery:
+    data_file: DataFileType = strawberry_django.field()
+    data_file_list: OffsetPaginated[DataFileType] = strawberry_django.offset_paginated()
+
+    data_source: DataSourceType = strawberry_django.field()
+    data_source_list: OffsetPaginated[DataSourceType] = strawberry_django.offset_paginated()
@@ -1,3 +0,0 @@
-# TODO: Remove this module in NetBox v4.5
-# Provided for backward compatibility
-from .object_types import *
@@ -8,6 +8,7 @@ from rq.job import Job as RQ_Job, JobStatus
 from rq.registry import FailedJobRegistry, StartedJobRegistry

 from rest_framework import status
+from users.constants import TOKEN_PREFIX
 from users.models import Token, User
 from utilities.testing import APITestCase, APIViewTestCases, TestCase
 from utilities.testing.utils import disable_logging
@@ -107,14 +108,14 @@ class ObjectTypeTest(APITestCase):
     def test_list_objects(self):
         object_type_count = ObjectType.objects.count()

-        response = self.client.get(reverse('extras-api:objecttype-list'), **self.header)
+        response = self.client.get(reverse('core-api:objecttype-list'), **self.header)
         self.assertHttpStatus(response, status.HTTP_200_OK)
         self.assertEqual(response.data['count'], object_type_count)

     def test_get_object(self):
         object_type = ObjectType.objects.first()

-        url = reverse('extras-api:objecttype-detail', kwargs={'pk': object_type.pk})
+        url = reverse('core-api:objecttype-detail', kwargs={'pk': object_type.pk})
         self.assertHttpStatus(self.client.get(url, **self.header), status.HTTP_200_OK)
@@ -134,12 +135,9 @@ class BackgroundTaskTestCase(TestCase):
         Create a user and token for API calls.
         """
         # Create the test user and assign permissions
-        self.user = User.objects.create_user(username='testuser')
-        self.user.is_staff = True
-        self.user.is_active = True
-        self.user.save()
+        self.user = User.objects.create_user(username='testuser', is_active=True)
         self.token = Token.objects.create(user=self.user)
-        self.header = {'HTTP_AUTHORIZATION': f'Token {self.token.key}'}
+        self.header = {'HTTP_AUTHORIZATION': f'Bearer {TOKEN_PREFIX}{self.token.key}.{self.token.token}'}

         # Clear all queues prior to running each test
         get_queue('default').connection.flushall()
@@ -150,13 +148,11 @@ class BackgroundTaskTestCase(TestCase):
         url = reverse('core-api:rqqueue-list')

         # Attempt to load view without permission
-        self.user.is_staff = False
-        self.user.save()
         response = self.client.get(url, **self.header)
         self.assertEqual(response.status_code, 403)

         # Load view with permission
-        self.user.is_staff = True
+        self.user.is_superuser = True
         self.user.save()
         response = self.client.get(url, **self.header)
         self.assertEqual(response.status_code, 200)
@@ -165,7 +161,16 @@ class BackgroundTaskTestCase(TestCase):
         self.assertIn('low', str(response.content))

     def test_background_queue(self):
-        response = self.client.get(reverse('core-api:rqqueue-detail', args=['default']), **self.header)
+        url = reverse('core-api:rqqueue-detail', args=['default'])
+
+        # Attempt to load view without permission
+        response = self.client.get(url, **self.header)
+        self.assertEqual(response.status_code, 403)
+
+        # Load view with permission
+        self.user.is_superuser = True
+        self.user.save()
+        response = self.client.get(url, **self.header)
         self.assertEqual(response.status_code, 200)
         self.assertIn('default', str(response.content))
         self.assertIn('oldest_job_timestamp', str(response.content))
@@ -174,8 +179,16 @@ class BackgroundTaskTestCase(TestCase):
     def test_background_task_list(self):
         queue = get_queue('default')
         queue.enqueue(self.dummy_job_default)
+        url = reverse('core-api:rqtask-list')

-        response = self.client.get(reverse('core-api:rqtask-list'), **self.header)
+        # Attempt to load view without permission
+        response = self.client.get(url, **self.header)
+        self.assertEqual(response.status_code, 403)
+
+        # Load view with permission
+        self.user.is_superuser = True
+        self.user.save()
+        response = self.client.get(url, **self.header)
         self.assertEqual(response.status_code, 200)
         self.assertIn('origin', str(response.content))
         self.assertIn('core.tests.test_api.BackgroundTaskTestCase.dummy_job_default()', str(response.content))
@@ -183,8 +196,16 @@ class BackgroundTaskTestCase(TestCase):
     def test_background_task(self):
         queue = get_queue('default')
         job = queue.enqueue(self.dummy_job_default)
+        url = reverse('core-api:rqtask-detail', args=[job.id])

-        response = self.client.get(reverse('core-api:rqtask-detail', args=[job.id]), **self.header)
+        # Attempt to load view without permission
+        response = self.client.get(url, **self.header)
+        self.assertEqual(response.status_code, 403)
+
+        # Load view with permission
+        self.user.is_superuser = True
+        self.user.save()
+        response = self.client.get(url, **self.header)
         self.assertEqual(response.status_code, 200)
         self.assertIn(str(job.id), str(response.content))
         self.assertIn('origin', str(response.content))
@@ -194,45 +215,65 @@ class BackgroundTaskTestCase(TestCase):
    def test_background_task_delete(self):
        queue = get_queue('default')
        job = queue.enqueue(self.dummy_job_default)
        url = reverse('core-api:rqtask-delete', args=[job.id])

        response = self.client.post(reverse('core-api:rqtask-delete', args=[job.id]), **self.header)
        # Attempt to load view without permission
        response = self.client.get(url, **self.header)
        self.assertEqual(response.status_code, 403)

        # Load view with permission
        self.user.is_superuser = True
        self.user.save()
        response = self.client.post(url, **self.header)
        self.assertEqual(response.status_code, 200)
        self.assertFalse(RQ_Job.exists(job.id, connection=queue.connection))
        queue = get_queue('default')
        self.assertNotIn(job.id, queue.job_ids)

    def test_background_task_requeue(self):
        queue = get_queue('default')

        # Enqueue & run a job that will fail
        queue = get_queue('default')
        job = queue.enqueue(self.dummy_job_failing)
        worker = get_worker('default')
        with disable_logging():
            worker.work(burst=True)
        self.assertTrue(job.is_failed)
        url = reverse('core-api:rqtask-requeue', args=[job.id])

        # Attempt to requeue the job without permission
        response = self.client.post(url, **self.header)
        self.assertEqual(response.status_code, 403)

        # Re-enqueue the failed job and check that its status has been reset
        response = self.client.post(reverse('core-api:rqtask-requeue', args=[job.id]), **self.header)
        self.user.is_superuser = True
        self.user.save()
        response = self.client.post(url, **self.header)
        self.assertEqual(response.status_code, 200)
        job = RQ_Job.fetch(job.id, queue.connection)
        self.assertFalse(job.is_failed)

    def test_background_task_enqueue(self):
        queue = get_queue('default')

        # Enqueue some jobs that each depends on its predecessor
        queue = get_queue('default')
        job = previous_job = None
        for _ in range(0, 3):
            job = queue.enqueue(self.dummy_job_default, depends_on=previous_job)
            previous_job = job
        url = reverse('core-api:rqtask-enqueue', args=[job.id])

        # Check that the last job to be enqueued has a status of deferred
        self.assertIsNotNone(job)
        self.assertEqual(job.get_status(), JobStatus.DEFERRED)
        self.assertIsNone(job.enqueued_at)

        # Attempt to force-enqueue the job without permission
        response = self.client.post(url, **self.header)
        self.assertEqual(response.status_code, 403)

        # Force-enqueue the deferred job
        response = self.client.post(reverse('core-api:rqtask-enqueue', args=[job.id]), **self.header)
        self.user.is_superuser = True
        self.user.save()
        response = self.client.post(url, **self.header)
        self.assertEqual(response.status_code, 200)

        # Check that job's status is updated correctly
@@ -242,19 +283,27 @@ class BackgroundTaskTestCase(TestCase):

    def test_background_task_stop(self):
        queue = get_queue('default')

        worker = get_worker('default')
        job = queue.enqueue(self.dummy_job_default)
        worker.prepare_job_execution(job)

        url = reverse('core-api:rqtask-stop', args=[job.id])
        self.assertEqual(job.get_status(), JobStatus.STARTED)
        response = self.client.post(reverse('core-api:rqtask-stop', args=[job.id]), **self.header)

        # Attempt to stop the task without permission
        response = self.client.post(url, **self.header)
        self.assertEqual(response.status_code, 403)

        # Stop the task
        self.user.is_superuser = True
        self.user.save()
        response = self.client.post(url, **self.header)
        self.assertEqual(response.status_code, 200)
        with disable_logging():
            worker.monitor_work_horse(job, queue)  # Sets the job as Failed and removes from Started
        started_job_registry = StartedJobRegistry(queue.name, connection=queue.connection)
        self.assertEqual(len(started_job_registry), 0)

        # Verify that the task was cancelled
        canceled_job_registry = FailedJobRegistry(queue.name, connection=queue.connection)
        self.assertEqual(len(canceled_job_registry), 1)
        self.assertIn(job.id, canceled_job_registry)
@@ -262,19 +311,34 @@ class BackgroundTaskTestCase(TestCase):
    def test_worker_list(self):
        worker1 = get_worker('default', name=uuid.uuid4().hex)
        worker1.register_birth()

        worker2 = get_worker('high')
        worker2.register_birth()
        url = reverse('core-api:rqworker-list')

        response = self.client.get(reverse('core-api:rqworker-list'), **self.header)
        # Attempt to fetch the worker list without permission
        response = self.client.get(url, **self.header)
        self.assertEqual(response.status_code, 403)

        # Fetch the worker list
        self.user.is_superuser = True
        self.user.save()
        response = self.client.get(url, **self.header)
        self.assertEqual(response.status_code, 200)
        self.assertIn(str(worker1.name), str(response.content))

    def test_worker(self):
        worker1 = get_worker('default', name=uuid.uuid4().hex)
        worker1.register_birth()
        url = reverse('core-api:rqworker-detail', args=[worker1.name])

        response = self.client.get(reverse('core-api:rqworker-detail', args=[worker1.name]), **self.header)
        # Attempt to fetch a worker without permission
        response = self.client.get(url, **self.header)
        self.assertEqual(response.status_code, 403)

        # Fetch the worker
        self.user.is_superuser = True
        self.user.save()
        response = self.client.get(url, **self.header)
        self.assertEqual(response.status_code, 200)
        self.assertIn(str(worker1.name), str(response.content))
        self.assertIn('birth_date', str(response.content))
@@ -158,7 +158,7 @@ class BackgroundTaskTestCase(TestCase):

     def setUp(self):
         super().setUp()
-        self.user.is_staff = True
+        self.user.is_superuser = True
         self.user.is_active = True
         self.user.save()
@@ -171,13 +171,13 @@ class BackgroundTaskTestCase(TestCase):
         url = reverse('core:background_queue_list')

         # Attempt to load view without permission
-        self.user.is_staff = False
+        self.user.is_superuser = False
         self.user.save()
         response = self.client.get(url)
         self.assertEqual(response.status_code, 403)

         # Load view with permission
-        self.user.is_staff = True
+        self.user.is_superuser = True
         self.user.save()
         response = self.client.get(url)
         self.assertEqual(response.status_code, 200)
@@ -356,7 +356,7 @@ class SystemTestCase(TestCase):
     def setUp(self):
         super().setUp()

-        self.user.is_staff = True
+        self.user.is_superuser = True
         self.user.save()

     def test_system_view_default(self):
@@ -372,7 +372,7 @@ class ConfigRevisionRestoreView(ContentTypePermissionRequiredMixin, View):
 class BaseRQView(UserPassesTestMixin, View):

     def test_func(self):
-        return self.request.user.is_staff
+        return self.request.user.is_superuser


 class BackgroundQueueListView(TableMixin, BaseRQView):
@@ -555,7 +555,7 @@ class WorkerView(BaseRQView):
 class SystemView(UserPassesTestMixin, View):

     def test_func(self):
-        return self.request.user.is_staff
+        return self.request.user.is_superuser

     def get(self, request):
@@ -638,7 +638,7 @@ class BasePluginView(UserPassesTestMixin, View):
     CACHE_KEY_CATALOG_ERROR = 'plugins-catalog-error'

     def test_func(self):
-        return self.request.user.is_staff
+        return self.request.user.is_superuser

     def get_cached_plugins(self, request):
         catalog_plugins = {}
@@ -1,10 +1,8 @@
from django.contrib.contenttypes.models import ContentType
from drf_spectacular.types import OpenApiTypes
from drf_spectacular.utils import extend_schema_field
from rest_framework import serializers

from dcim.choices import *
from dcim.constants import *
from dcim.models import Cable, CablePath, CableTermination
from netbox.api.fields import ChoiceField, ContentTypeField
from netbox.api.serializers import BaseModelSerializer, GenericObjectSerializer, NetBoxModelSerializer
@@ -51,9 +49,11 @@ class TracedCableSerializer(BaseModelSerializer):

class CableTerminationSerializer(NetBoxModelSerializer):
    termination_type = ContentTypeField(
        queryset=ContentType.objects.filter(CABLE_TERMINATION_MODELS)
        read_only=True,
    )
    termination = serializers.SerializerMethodField(
        read_only=True,
    )
    termination = serializers.SerializerMethodField(read_only=True)

    class Meta:
        model = CableTermination
@@ -61,6 +61,8 @@ class CableTerminationSerializer(NetBoxModelSerializer):
             'id', 'url', 'display', 'cable', 'cable_end', 'termination_type', 'termination_id',
             'termination', 'created', 'last_updated',
         ]
+        read_only_fields = fields
         brief_fields = ('id', 'url', 'display', 'cable', 'cable_end', 'termination_type', 'termination_id')

     @extend_schema_field(serializers.JSONField(allow_null=True))
     def get_termination(self, obj):
@@ -16,7 +16,7 @@ from extras.api.mixins import ConfigContextQuerySetMixin, RenderConfigMixin
|
||||
from netbox.api.authentication import IsAuthenticatedOrLoginNotRequired
|
||||
from netbox.api.metadata import ContentTypeMetadata
|
||||
from netbox.api.pagination import StripCountAnnotationsPaginator
|
||||
from netbox.api.viewsets import NetBoxModelViewSet, MPTTLockedMixin
|
||||
from netbox.api.viewsets import NetBoxModelViewSet, MPTTLockedMixin, NetBoxReadOnlyModelViewSet
|
||||
from netbox.api.viewsets.mixins import SequentialBulkCreatesMixin
|
||||
from utilities.api import get_serializer_for_model
|
||||
from utilities.query_functions import CollateAsChar
|
||||
@@ -563,7 +563,7 @@ class CableViewSet(NetBoxModelViewSet):
|
||||
filterset_class = filtersets.CableFilterSet
|
||||
|
||||
|
||||
class CableTerminationViewSet(NetBoxModelViewSet):
|
||||
class CableTerminationViewSet(NetBoxReadOnlyModelViewSet):
|
||||
metadata_class = ContentTypeMetadata
|
||||
queryset = CableTermination.objects.all()
|
||||
serializer_class = serializers.CableTerminationSerializer
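With CableTerminationViewSet switching from NetBoxModelViewSet to NetBoxReadOnlyModelViewSet, cable terminations become list/retrieve-only through the REST API. A minimal client-side sketch of what that implies, assuming the usual /api/dcim/cable-terminations/ route and placeholder credentials:

import requests

NETBOX_URL = 'https://netbox.example.com'              # placeholder
HEADERS = {'Authorization': 'Token 0123456789abcdef'}  # placeholder token

# Read operations remain available...
resp = requests.get(f'{NETBOX_URL}/api/dcim/cable-terminations/', headers=HEADERS)
resp.raise_for_status()
print(resp.json()['count'])

# ...while write operations should now be rejected by the read-only viewset
# (an HTTP 405 Method Not Allowed is the expected outcome).
resp = requests.post(f'{NETBOX_URL}/api/dcim/cable-terminations/', headers=HEADERS, json={})
print(resp.status_code)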
|
||||
|
||||
@@ -14,16 +14,16 @@ from netbox.filtersets import (
|
||||
AttributeFiltersMixin, BaseFilterSet, ChangeLoggedModelFilterSet, NestedGroupModelFilterSet, NetBoxModelFilterSet,
|
||||
OrganizationalModelFilterSet,
|
||||
)
|
||||
from tenancy.filtersets import ContactModelFilterSet, TenancyFilterSet
|
||||
from tenancy.filtersets import TenancyFilterSet, ContactModelFilterSet
|
||||
from tenancy.models import *
|
||||
from users.models import User
|
||||
from utilities.filters import (
|
||||
ContentTypeFilter, MultiValueCharFilter, MultiValueMACAddressFilter, MultiValueNumberFilter, MultiValueWWNFilter,
|
||||
NumericArrayFilter, TreeNodeMultipleChoiceFilter,
|
||||
)
|
||||
from virtualization.models import Cluster, ClusterGroup, VirtualMachine, VMInterface
|
||||
from virtualization.models import Cluster, ClusterGroup, VMInterface, VirtualMachine
|
||||
from vpn.models import L2VPN
|
||||
from wireless.choices import WirelessChannelChoices, WirelessRoleChoices
|
||||
from wireless.choices import WirelessRoleChoices, WirelessChannelChoices
|
||||
from wireless.models import WirelessLAN, WirelessLink
|
||||
from .choices import *
|
||||
from .constants import *
|
||||
@@ -1807,14 +1807,6 @@ class MACAddressFilterSet(NetBoxModelFilterSet):
|
||||
queryset=VMInterface.objects.all(),
|
||||
label=_('VM interface (ID)'),
|
||||
)
|
||||
assigned = django_filters.BooleanFilter(
|
||||
method='filter_assigned',
|
||||
label=_('Is assigned'),
|
||||
)
|
||||
primary = django_filters.BooleanFilter(
|
||||
method='filter_primary',
|
||||
label=_('Is primary'),
|
||||
)
|
||||
|
||||
class Meta:
|
||||
model = MACAddress
|
||||
@@ -1851,29 +1843,6 @@ class MACAddressFilterSet(NetBoxModelFilterSet):
|
||||
vminterface__in=interface_ids
|
||||
)
|
||||
|
||||
def filter_assigned(self, queryset, name, value):
|
||||
params = {
|
||||
'assigned_object_type__isnull': True,
|
||||
'assigned_object_id__isnull': True,
|
||||
}
|
||||
if value:
|
||||
return queryset.exclude(**params)
|
||||
else:
|
||||
return queryset.filter(**params)
|
||||
|
||||
def filter_primary(self, queryset, name, value):
|
||||
interface_mac_ids = Interface.objects.filter(primary_mac_address_id__isnull=False).values_list(
|
||||
'primary_mac_address_id', flat=True
|
||||
)
|
||||
vminterface_mac_ids = VMInterface.objects.filter(primary_mac_address_id__isnull=False).values_list(
|
||||
'primary_mac_address_id', flat=True
|
||||
)
|
||||
query = Q(pk__in=interface_mac_ids) | Q(pk__in=vminterface_mac_ids)
|
||||
if value:
|
||||
return queryset.filter(query)
|
||||
else:
|
||||
return queryset.exclude(query)
|
||||
|
||||
|
||||
class CommonInterfaceFilterSet(django_filters.FilterSet):
|
||||
mode = django_filters.MultipleChoiceFilter(
|
||||
|
||||
@@ -1676,16 +1676,12 @@ class MACAddressFilterForm(NetBoxModelFilterSetForm):
|
||||
model = MACAddress
|
||||
fieldsets = (
|
||||
FieldSet('q', 'filter_id', 'tag'),
|
||||
FieldSet('mac_address', name=_('Attributes')),
|
||||
FieldSet(
|
||||
'device_id', 'virtual_machine_id', 'assigned', 'primary',
|
||||
name=_('Assignments'),
|
||||
),
|
||||
FieldSet('mac_address', 'device_id', 'virtual_machine_id', name=_('MAC address')),
|
||||
)
|
||||
selector_fields = ('filter_id', 'q', 'device_id', 'virtual_machine_id')
|
||||
mac_address = forms.CharField(
|
||||
required=False,
|
||||
label=_('MAC address'),
|
||||
label=_('MAC address')
|
||||
)
|
||||
device_id = DynamicModelMultipleChoiceField(
|
||||
queryset=Device.objects.all(),
|
||||
@@ -1697,20 +1693,6 @@ class MACAddressFilterForm(NetBoxModelFilterSetForm):
|
||||
required=False,
|
||||
label=_('Assigned VM'),
|
||||
)
|
||||
assigned = forms.NullBooleanField(
|
||||
required=False,
|
||||
label=_('Assigned to an interface'),
|
||||
widget=forms.Select(
|
||||
choices=BOOLEAN_WITH_BLANK_CHOICES
|
||||
),
|
||||
)
|
||||
primary = forms.NullBooleanField(
|
||||
required=False,
|
||||
label=_('Primary MAC of an interface'),
|
||||
widget=forms.Select(
|
||||
choices=BOOLEAN_WITH_BLANK_CHOICES
|
||||
),
|
||||
)
|
||||
tag = TagFilterField(model)
|
||||
|
||||
|
||||
|
||||
@@ -755,10 +755,7 @@ class ModuleForm(ModuleCommonForm, NetBoxModelForm):
|
||||
queryset=ModuleBay.objects.all(),
|
||||
query_params={
|
||||
'device_id': '$device'
|
||||
},
|
||||
context={
|
||||
'disabled': 'installed_module',
|
||||
},
|
||||
}
|
||||
)
|
||||
module_type = DynamicModelChoiceField(
|
||||
label=_('Module type'),
|
||||
|
||||
@@ -18,9 +18,7 @@ from netbox.graphql.filter_mixins import (
|
||||
ImageAttachmentFilterMixin,
|
||||
WeightFilterMixin,
|
||||
)
|
||||
from tenancy.graphql.filter_mixins import ContactFilterMixin, TenancyFilterMixin
|
||||
from virtualization.models import VMInterface
|
||||
|
||||
from tenancy.graphql.filter_mixins import TenancyFilterMixin, ContactFilterMixin
|
||||
from .filter_mixins import (
|
||||
CabledObjectModelFilterMixin,
|
||||
ComponentModelFilterMixin,
|
||||
@@ -421,24 +419,6 @@ class MACAddressFilter(PrimaryModelFilterMixin):
|
||||
)
|
||||
assigned_object_id: ID | None = strawberry_django.filter_field()
|
||||
|
||||
@strawberry_django.filter_field()
|
||||
def assigned(self, value: bool, prefix) -> Q:
|
||||
return Q(**{f'{prefix}assigned_object_id__isnull': (not value)})
|
||||
|
||||
@strawberry_django.filter_field()
|
||||
def primary(self, value: bool, prefix) -> Q:
|
||||
interface_mac_ids = models.Interface.objects.filter(primary_mac_address_id__isnull=False).values_list(
|
||||
'primary_mac_address_id', flat=True
|
||||
)
|
||||
vminterface_mac_ids = VMInterface.objects.filter(primary_mac_address_id__isnull=False).values_list(
|
||||
'primary_mac_address_id', flat=True
|
||||
)
|
||||
query = Q(**{f'{prefix}pk__in': interface_mac_ids}) | Q(**{f'{prefix}pk__in': vminterface_mac_ids})
|
||||
if value:
|
||||
return Q(query)
|
||||
else:
|
||||
return ~Q(query)
|
||||
|
||||
|
||||
@strawberry_django.filter_type(models.Interface, lookups=True)
|
||||
class InterfaceFilter(ModularComponentModelFilterMixin, InterfaceBaseFilterMixin, CabledObjectModelFilterMixin):
|
||||
|
||||
@@ -2,12 +2,13 @@ from typing import List
|
||||
|
||||
import strawberry
|
||||
import strawberry_django
|
||||
from strawberry_django.pagination import OffsetPaginated
|
||||
|
||||
from .types import *
|
||||
|
||||
|
||||
@strawberry.type(name="Query")
|
||||
class DCIMQuery:
|
||||
class DCIMQueryV1:
|
||||
cable: CableType = strawberry_django.field()
|
||||
cable_list: List[CableType] = strawberry_django.field()
|
||||
|
||||
@@ -136,3 +137,137 @@ class DCIMQuery:
|
||||
|
||||
virtual_device_context: VirtualDeviceContextType = strawberry_django.field()
|
||||
virtual_device_context_list: List[VirtualDeviceContextType] = strawberry_django.field()
|
||||
|
||||
|
||||
@strawberry.type(name="Query")
|
||||
class DCIMQuery:
|
||||
cable: CableType = strawberry_django.field()
|
||||
cable_list: OffsetPaginated[CableType] = strawberry_django.offset_paginated()
|
||||
|
||||
console_port: ConsolePortType = strawberry_django.field()
|
||||
console_port_list: OffsetPaginated[ConsolePortType] = strawberry_django.offset_paginated()
|
||||
|
||||
console_port_template: ConsolePortTemplateType = strawberry_django.field()
|
||||
console_port_template_list: OffsetPaginated[ConsolePortTemplateType] = strawberry_django.offset_paginated()
|
||||
|
||||
console_server_port: ConsoleServerPortType = strawberry_django.field()
|
||||
console_server_port_list: OffsetPaginated[ConsoleServerPortType] = strawberry_django.offset_paginated()
|
||||
|
||||
console_server_port_template: ConsoleServerPortTemplateType = strawberry_django.field()
|
||||
console_server_port_template_list: OffsetPaginated[ConsoleServerPortTemplateType] = (
|
||||
strawberry_django.offset_paginated()
|
||||
)
|
||||
|
||||
device: DeviceType = strawberry_django.field()
|
||||
device_list: OffsetPaginated[DeviceType] = strawberry_django.offset_paginated()
|
||||
|
||||
device_bay: DeviceBayType = strawberry_django.field()
|
||||
device_bay_list: OffsetPaginated[DeviceBayType] = strawberry_django.offset_paginated()
|
||||
|
||||
device_bay_template: DeviceBayTemplateType = strawberry_django.field()
|
||||
device_bay_template_list: OffsetPaginated[DeviceBayTemplateType] = strawberry_django.offset_paginated()
|
||||
|
||||
device_role: DeviceRoleType = strawberry_django.field()
|
||||
device_role_list: OffsetPaginated[DeviceRoleType] = strawberry_django.offset_paginated()
|
||||
|
||||
device_type: DeviceTypeType = strawberry_django.field()
|
||||
device_type_list: OffsetPaginated[DeviceTypeType] = strawberry_django.offset_paginated()
|
||||
|
||||
front_port: FrontPortType = strawberry_django.field()
|
||||
front_port_list: OffsetPaginated[FrontPortType] = strawberry_django.offset_paginated()
|
||||
|
||||
front_port_template: FrontPortTemplateType = strawberry_django.field()
|
||||
front_port_template_list: OffsetPaginated[FrontPortTemplateType] = strawberry_django.offset_paginated()
|
||||
|
||||
mac_address: MACAddressType = strawberry_django.field()
|
||||
mac_address_list: OffsetPaginated[MACAddressType] = strawberry_django.offset_paginated()
|
||||
|
||||
interface: InterfaceType = strawberry_django.field()
|
||||
interface_list: OffsetPaginated[InterfaceType] = strawberry_django.offset_paginated()
|
||||
|
||||
interface_template: InterfaceTemplateType = strawberry_django.field()
|
||||
interface_template_list: OffsetPaginated[InterfaceTemplateType] = strawberry_django.offset_paginated()
|
||||
|
||||
inventory_item: InventoryItemType = strawberry_django.field()
|
||||
inventory_item_list: OffsetPaginated[InventoryItemType] = strawberry_django.offset_paginated()
|
||||
|
||||
inventory_item_role: InventoryItemRoleType = strawberry_django.field()
|
||||
inventory_item_role_list: OffsetPaginated[InventoryItemRoleType] = strawberry_django.offset_paginated()
|
||||
|
||||
inventory_item_template: InventoryItemTemplateType = strawberry_django.field()
|
||||
inventory_item_template_list: OffsetPaginated[InventoryItemTemplateType] = strawberry_django.offset_paginated()
|
||||
|
||||
location: LocationType = strawberry_django.field()
|
||||
location_list: OffsetPaginated[LocationType] = strawberry_django.offset_paginated()
|
||||
|
||||
manufacturer: ManufacturerType = strawberry_django.field()
|
||||
manufacturer_list: OffsetPaginated[ManufacturerType] = strawberry_django.offset_paginated()
|
||||
|
||||
module: ModuleType = strawberry_django.field()
|
||||
module_list: OffsetPaginated[ModuleType] = strawberry_django.offset_paginated()
|
||||
|
||||
module_bay: ModuleBayType = strawberry_django.field()
|
||||
module_bay_list: OffsetPaginated[ModuleBayType] = strawberry_django.offset_paginated()
|
||||
|
||||
module_bay_template: ModuleBayTemplateType = strawberry_django.field()
|
||||
module_bay_template_list: OffsetPaginated[ModuleBayTemplateType] = strawberry_django.offset_paginated()
|
||||
|
||||
module_type_profile: ModuleTypeProfileType = strawberry_django.field()
|
||||
module_type_profile_list: OffsetPaginated[ModuleTypeProfileType] = strawberry_django.offset_paginated()
|
||||
|
||||
module_type: ModuleTypeType = strawberry_django.field()
|
||||
module_type_list: OffsetPaginated[ModuleTypeType] = strawberry_django.offset_paginated()
|
||||
|
||||
platform: PlatformType = strawberry_django.field()
|
||||
platform_list: OffsetPaginated[PlatformType] = strawberry_django.offset_paginated()
|
||||
|
||||
power_feed: PowerFeedType = strawberry_django.field()
|
||||
power_feed_list: OffsetPaginated[PowerFeedType] = strawberry_django.offset_paginated()
|
||||
|
||||
power_outlet: PowerOutletType = strawberry_django.field()
|
||||
power_outlet_list: OffsetPaginated[PowerOutletType] = strawberry_django.offset_paginated()
|
||||
|
||||
power_outlet_template: PowerOutletTemplateType = strawberry_django.field()
|
||||
power_outlet_template_list: OffsetPaginated[PowerOutletTemplateType] = strawberry_django.offset_paginated()
|
||||
|
||||
power_panel: PowerPanelType = strawberry_django.field()
|
||||
power_panel_list: OffsetPaginated[PowerPanelType] = strawberry_django.offset_paginated()
|
||||
|
||||
power_port: PowerPortType = strawberry_django.field()
|
||||
power_port_list: OffsetPaginated[PowerPortType] = strawberry_django.offset_paginated()
|
||||
|
||||
power_port_template: PowerPortTemplateType = strawberry_django.field()
|
||||
power_port_template_list: OffsetPaginated[PowerPortTemplateType] = strawberry_django.offset_paginated()
|
||||
|
||||
rack_type: RackTypeType = strawberry_django.field()
|
||||
rack_type_list: OffsetPaginated[RackTypeType] = strawberry_django.offset_paginated()
|
||||
|
||||
rack: RackType = strawberry_django.field()
|
||||
rack_list: OffsetPaginated[RackType] = strawberry_django.offset_paginated()
|
||||
|
||||
rack_reservation: RackReservationType = strawberry_django.field()
|
||||
rack_reservation_list: OffsetPaginated[RackReservationType] = strawberry_django.offset_paginated()
|
||||
|
||||
rack_role: RackRoleType = strawberry_django.field()
|
||||
rack_role_list: OffsetPaginated[RackRoleType] = strawberry_django.offset_paginated()
|
||||
|
||||
rear_port: RearPortType = strawberry_django.field()
|
||||
rear_port_list: OffsetPaginated[RearPortType] = strawberry_django.offset_paginated()
|
||||
|
||||
rear_port_template: RearPortTemplateType = strawberry_django.field()
|
||||
rear_port_template_list: OffsetPaginated[RearPortTemplateType] = strawberry_django.offset_paginated()
|
||||
|
||||
region: RegionType = strawberry_django.field()
|
||||
region_list: OffsetPaginated[RegionType] = strawberry_django.offset_paginated()
|
||||
|
||||
site: SiteType = strawberry_django.field()
|
||||
site_list: OffsetPaginated[SiteType] = strawberry_django.offset_paginated()
|
||||
|
||||
site_group: SiteGroupType = strawberry_django.field()
|
||||
site_group_list: OffsetPaginated[SiteGroupType] = strawberry_django.offset_paginated()
|
||||
|
||||
virtual_chassis: VirtualChassisType = strawberry_django.field()
|
||||
virtual_chassis_list: OffsetPaginated[VirtualChassisType] = strawberry_django.offset_paginated()
|
||||
|
||||
virtual_device_context: VirtualDeviceContextType = strawberry_django.field()
|
||||
virtual_device_context_list: OffsetPaginated[VirtualDeviceContextType] = strawberry_django.offset_paginated()
|
||||
|
||||
@@ -1174,9 +1174,6 @@ class MACAddressTable(NetBoxTable):
|
||||
orderable=False,
|
||||
verbose_name=_('Parent')
|
||||
)
|
||||
is_primary = columns.BooleanColumn(
|
||||
verbose_name=_('Primary')
|
||||
)
|
||||
tags = columns.TagColumn(
|
||||
url_name='dcim:macaddress_list'
|
||||
)
|
||||
@@ -1187,7 +1184,7 @@ class MACAddressTable(NetBoxTable):
|
||||
class Meta(DeviceComponentTable.Meta):
|
||||
model = models.MACAddress
|
||||
fields = (
|
||||
'pk', 'id', 'mac_address', 'assigned_object_parent', 'assigned_object', 'description', 'is_primary',
|
||||
'comments', 'tags', 'created', 'last_updated',
|
||||
'pk', 'id', 'mac_address', 'assigned_object_parent', 'assigned_object', 'description', 'comments', 'tags',
|
||||
'created', 'last_updated',
|
||||
)
|
||||
default_columns = ('pk', 'mac_address', 'assigned_object_parent', 'assigned_object', 'description')
|
||||
|
||||
@@ -2376,6 +2376,33 @@ class CableTest(APIViewTestCases.APIViewTestCase):
|
||||
]
|
||||
|
||||
|
||||
class CableTerminationTest(
|
||||
APIViewTestCases.GetObjectViewTestCase,
|
||||
APIViewTestCases.ListObjectsViewTestCase,
|
||||
):
|
||||
model = CableTermination
|
||||
brief_fields = ['cable', 'cable_end', 'display', 'id', 'termination_id', 'termination_type', 'url']
|
||||
|
||||
@classmethod
|
||||
def setUpTestData(cls):
|
||||
device1 = create_test_device('Device 1')
|
||||
device2 = create_test_device('Device 2')
|
||||
|
||||
interfaces = []
|
||||
for device in (device1, device2):
|
||||
for i in range(0, 10):
|
||||
interfaces.append(Interface(device=device, type=InterfaceTypeChoices.TYPE_1GE_FIXED, name=f'eth{i}'))
|
||||
Interface.objects.bulk_create(interfaces)
|
||||
|
||||
cables = (
|
||||
Cable(a_terminations=[interfaces[0]], b_terminations=[interfaces[10]], label='Cable 1'),
|
||||
Cable(a_terminations=[interfaces[1]], b_terminations=[interfaces[11]], label='Cable 2'),
|
||||
Cable(a_terminations=[interfaces[2]], b_terminations=[interfaces[12]], label='Cable 3'),
|
||||
)
|
||||
for cable in cables:
|
||||
cable.save()
|
||||
|
||||
|
||||
class ConnectedDeviceTest(APITestCase):
|
||||
|
||||
@classmethod
|
||||
|
||||
@@ -10,7 +10,7 @@ from netbox.choices import ColorChoices, WeightUnitChoices
|
||||
from tenancy.models import Tenant, TenantGroup
|
||||
from users.models import User
|
||||
from utilities.testing import ChangeLoggedFilterSetTests, create_test_device, create_test_virtualmachine
|
||||
from virtualization.models import Cluster, ClusterGroup, ClusterType, VirtualMachine, VMInterface
|
||||
from virtualization.models import Cluster, ClusterType, ClusterGroup, VMInterface, VirtualMachine
|
||||
from wireless.choices import WirelessChannelChoices, WirelessRoleChoices
|
||||
from wireless.models import WirelessLink
|
||||
|
||||
@@ -7164,20 +7164,9 @@ class MACAddressTestCase(TestCase, ChangeLoggedFilterSetTests):
|
||||
MACAddress(mac_address='00-00-00-05-01-01', assigned_object=vm_interfaces[1]),
|
||||
MACAddress(mac_address='00-00-00-06-01-01', assigned_object=vm_interfaces[2]),
|
||||
MACAddress(mac_address='00-00-00-06-01-02', assigned_object=vm_interfaces[2]),
|
||||
# unassigned
|
||||
MACAddress(mac_address='00-00-00-07-01-01'),
|
||||
)
|
||||
MACAddress.objects.bulk_create(mac_addresses)
|
||||
|
||||
# Set MAC addresses as primary
|
||||
for idx, interface in enumerate(interfaces):
|
||||
interface.primary_mac_address = mac_addresses[idx]
|
||||
interface.save()
|
||||
for idx, vm_interface in enumerate(vm_interfaces):
|
||||
# Offset by 4 for device MACs
|
||||
vm_interface.primary_mac_address = mac_addresses[idx + 4]
|
||||
vm_interface.save()
|
||||
|
||||
def test_mac_address(self):
|
||||
params = {'mac_address': ['00-00-00-01-01-01', '00-00-00-02-01-01']}
|
||||
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
|
||||
@@ -7209,15 +7198,3 @@ class MACAddressTestCase(TestCase, ChangeLoggedFilterSetTests):
|
||||
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
|
||||
params = {'vminterface': [vm_interfaces[0].name, vm_interfaces[1].name]}
|
||||
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
|
||||
|
||||
def test_assigned(self):
|
||||
params = {'assigned': True}
|
||||
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 8)
|
||||
params = {'assigned': False}
|
||||
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 1)
|
||||
|
||||
def test_primary(self):
|
||||
params = {'primary': True}
|
||||
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 6)
|
||||
params = {'primary': False}
|
||||
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 3)
|
||||
|
||||
@@ -5,7 +5,6 @@ from rest_framework import serializers
|
||||
from core.api.serializers_.jobs import JobSerializer
|
||||
from extras.models import Script
|
||||
from netbox.api.serializers import ValidatedModelSerializer
|
||||
from utilities.datetime import local_now
|
||||
|
||||
__all__ = (
|
||||
'ScriptDetailSerializer',
|
||||
@@ -67,31 +66,11 @@ class ScriptInputSerializer(serializers.Serializer):
|
||||
interval = serializers.IntegerField(required=False, allow_null=True)
|
||||
|
||||
def validate_schedule_at(self, value):
|
||||
"""
|
||||
Validates the specified schedule time for a script execution.
|
||||
"""
|
||||
if value:
|
||||
if not self.context['script'].python_class.scheduling_enabled:
|
||||
raise serializers.ValidationError(_('Scheduling is not enabled for this script.'))
|
||||
if value < local_now():
|
||||
raise serializers.ValidationError(_('Scheduled time must be in the future.'))
|
||||
if value and not self.context['script'].python_class.scheduling_enabled:
|
||||
raise serializers.ValidationError(_("Scheduling is not enabled for this script."))
|
||||
return value
|
||||
|
||||
def validate_interval(self, value):
|
||||
"""
|
||||
Validates the provided interval based on the script's scheduling configuration.
|
||||
"""
|
||||
if value and not self.context['script'].python_class.scheduling_enabled:
|
||||
raise serializers.ValidationError(_('Scheduling is not enabled for this script.'))
|
||||
raise serializers.ValidationError(_("Scheduling is not enabled for this script."))
|
||||
return value
|
||||
|
||||
def validate(self, data):
|
||||
"""
|
||||
Validates the given data and ensures the necessary fields are populated.
|
||||
"""
|
||||
# Set the schedule_at time to now if only an interval is provided
|
||||
# while handling the case where schedule_at is null.
|
||||
if data.get('interval') and not data.get('schedule_at'):
|
||||
data['schedule_at'] = local_now()
|
||||
|
||||
return super().validate(data)
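For context, a hedged sketch of how a client might exercise this serializer when invoking a script through the REST API; the host, token, and script ID are placeholders, and the payload fields (data, commit, schedule_at, interval) mirror the serializer above:

import requests

NETBOX_URL = 'https://netbox.example.com'              # placeholder
HEADERS = {'Authorization': 'Token 0123456789abcdef'}  # placeholder token

payload = {
    'data': {'var1': 'hello', 'var2': 1, 'var3': False},
    'commit': True,
    # Supplying only an interval is permitted; validate() backfills schedule_at with the current time.
    'interval': 60,
}
resp = requests.post(f'{NETBOX_URL}/api/extras/scripts/123/', headers=HEADERS, json=payload)

# A schedule_at in the past, or any scheduling input while the script's
# scheduling_enabled flag is False, should instead return a 400 validation error.
print(resp.status_code, resp.json())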
|
||||
|
||||
@@ -1,10 +1,8 @@
|
||||
from django.urls import include, path
|
||||
|
||||
from core.api.views import ObjectTypeViewSet
|
||||
from netbox.api.routers import NetBoxRouter
|
||||
from . import views
|
||||
|
||||
|
||||
router = NetBoxRouter()
|
||||
router.APIRootView = views.ExtrasRootView
|
||||
|
||||
@@ -29,9 +27,6 @@ router.register('config-context-profiles', views.ConfigContextProfileViewSet)
|
||||
router.register('config-templates', views.ConfigTemplateViewSet)
|
||||
router.register('scripts', views.ScriptViewSet, basename='script')
|
||||
|
||||
# TODO: Remove in NetBox v4.5
|
||||
router.register('object-types', ObjectTypeViewSet)
|
||||
|
||||
app_name = 'extras-api'
|
||||
urlpatterns = [
|
||||
path('dashboard/', views.DashboardView.as_view(), name='dashboard'),
|
||||
|
||||
@@ -2,12 +2,13 @@ from typing import List
|
||||
|
||||
import strawberry
|
||||
import strawberry_django
|
||||
from strawberry_django.pagination import OffsetPaginated
|
||||
|
||||
from .types import *
|
||||
|
||||
|
||||
@strawberry.type(name="Query")
|
||||
class ExtrasQuery:
|
||||
class ExtrasQueryV1:
|
||||
config_context: ConfigContextType = strawberry_django.field()
|
||||
config_context_list: List[ConfigContextType] = strawberry_django.field()
|
||||
|
||||
@@ -58,3 +59,57 @@ class ExtrasQuery:
|
||||
|
||||
event_rule: EventRuleType = strawberry_django.field()
|
||||
event_rule_list: List[EventRuleType] = strawberry_django.field()
|
||||
|
||||
|
||||
@strawberry.type(name="Query")
|
||||
class ExtrasQuery:
|
||||
config_context: ConfigContextType = strawberry_django.field()
|
||||
config_context_list: OffsetPaginated[ConfigContextType] = strawberry_django.offset_paginated()
|
||||
|
||||
config_context_profile: ConfigContextProfileType = strawberry_django.field()
|
||||
config_context_profile_list: OffsetPaginated[ConfigContextProfileType] = strawberry_django.offset_paginated()
|
||||
|
||||
config_template: ConfigTemplateType = strawberry_django.field()
|
||||
config_template_list: OffsetPaginated[ConfigTemplateType] = strawberry_django.offset_paginated()
|
||||
|
||||
custom_field: CustomFieldType = strawberry_django.field()
|
||||
custom_field_list: OffsetPaginated[CustomFieldType] = strawberry_django.offset_paginated()
|
||||
|
||||
custom_field_choice_set: CustomFieldChoiceSetType = strawberry_django.field()
|
||||
custom_field_choice_set_list: OffsetPaginated[CustomFieldChoiceSetType] = strawberry_django.offset_paginated()
|
||||
|
||||
custom_link: CustomLinkType = strawberry_django.field()
|
||||
custom_link_list: OffsetPaginated[CustomLinkType] = strawberry_django.offset_paginated()
|
||||
|
||||
export_template: ExportTemplateType = strawberry_django.field()
|
||||
export_template_list: OffsetPaginated[ExportTemplateType] = strawberry_django.offset_paginated()
|
||||
|
||||
image_attachment: ImageAttachmentType = strawberry_django.field()
|
||||
image_attachment_list: OffsetPaginated[ImageAttachmentType] = strawberry_django.offset_paginated()
|
||||
|
||||
saved_filter: SavedFilterType = strawberry_django.field()
|
||||
saved_filter_list: OffsetPaginated[SavedFilterType] = strawberry_django.offset_paginated()
|
||||
|
||||
table_config: TableConfigType = strawberry_django.field()
|
||||
table_config_list: OffsetPaginated[TableConfigType] = strawberry_django.offset_paginated()
|
||||
|
||||
journal_entry: JournalEntryType = strawberry_django.field()
|
||||
journal_entry_list: OffsetPaginated[JournalEntryType] = strawberry_django.offset_paginated()
|
||||
|
||||
notification: NotificationType = strawberry_django.field()
|
||||
notification_list: OffsetPaginated[NotificationType] = strawberry_django.offset_paginated()
|
||||
|
||||
notification_group: NotificationGroupType = strawberry_django.field()
|
||||
notification_group_list: OffsetPaginated[NotificationGroupType] = strawberry_django.offset_paginated()
|
||||
|
||||
subscription: SubscriptionType = strawberry_django.field()
|
||||
subscription_list: OffsetPaginated[SubscriptionType] = strawberry_django.offset_paginated()
|
||||
|
||||
tag: TagType = strawberry_django.field()
|
||||
tag_list: OffsetPaginated[TagType] = strawberry_django.offset_paginated()
|
||||
|
||||
webhook: WebhookType = strawberry_django.field()
|
||||
webhook_list: OffsetPaginated[WebhookType] = strawberry_django.offset_paginated()
|
||||
|
||||
event_rule: EventRuleType = strawberry_django.field()
|
||||
event_rule_list: OffsetPaginated[EventRuleType] = strawberry_django.offset_paginated()
|
||||
|
||||
@@ -535,15 +535,6 @@ class CustomField(CloningMixin, ExportTemplatesMixin, ChangeLoggedModel):
|
||||
# URL
|
||||
elif self.type == CustomFieldTypeChoices.TYPE_URL:
|
||||
field = LaxURLField(assume_scheme='https', required=required, initial=initial)
|
||||
if self.validation_regex:
|
||||
field.validators = [
|
||||
RegexValidator(
|
||||
regex=self.validation_regex,
|
||||
message=mark_safe(_("Values must match this regex: <code>{regex}</code>").format(
|
||||
regex=escape(self.validation_regex)
|
||||
))
|
||||
)
|
||||
]
|
||||
|
||||
# JSON
|
||||
elif self.type == CustomFieldTypeChoices.TYPE_JSON:
|
||||
@@ -693,13 +684,6 @@ class CustomField(CloningMixin, ExportTemplatesMixin, ChangeLoggedModel):
|
||||
if self.validation_regex and not re.match(self.validation_regex, value):
|
||||
raise ValidationError(_("Value must match regex '{regex}'").format(regex=self.validation_regex))
|
||||
|
||||
# Validate URL field
|
||||
elif self.type == CustomFieldTypeChoices.TYPE_URL:
|
||||
if type(value) is not str:
|
||||
raise ValidationError(_("Value must be a string."))
|
||||
if self.validation_regex and not re.match(self.validation_regex, value):
|
||||
raise ValidationError(_("Value must match regex '{regex}'").format(regex=self.validation_regex))
|
||||
|
||||
# Validate integer
|
||||
elif self.type == CustomFieldTypeChoices.TYPE_INTEGER:
|
||||
if type(value) is not int:
|
||||
|
||||
@@ -3,7 +3,6 @@ import importlib.util
|
||||
import os
|
||||
import sys
|
||||
|
||||
from django.core.cache import cache
|
||||
from django.core.files.storage import storages
|
||||
from django.db import models
|
||||
from django.http import HttpResponse
|
||||
@@ -31,14 +30,7 @@ class CustomStoragesLoader(importlib.abc.Loader):
|
||||
return None # Use default module creation
|
||||
|
||||
def exec_module(self, module):
|
||||
# Cache storage for 5 minutes (300 seconds)
|
||||
cache_key = "storage_scripts"
|
||||
storage = cache.get(cache_key)
|
||||
|
||||
if storage is None:
|
||||
storage = storages['scripts']
|
||||
cache.set(cache_key, storage, timeout=300) # 5 minutes
|
||||
|
||||
storage = storages.create_storage(storages.backends["scripts"])
|
||||
with storage.open(self.filename, 'rb') as f:
|
||||
code = f.read()
|
||||
exec(code, module.__dict__)
|
||||
|
||||
@@ -1,12 +1,9 @@
|
||||
import inspect
|
||||
import json
|
||||
import logging
|
||||
import os
|
||||
import re
|
||||
|
||||
import yaml
|
||||
from django import forms
|
||||
from django.conf import settings
|
||||
from django.core.files.storage import storages
|
||||
from django.core.validators import RegexValidator
|
||||
from django.utils import timezone
|
||||
@@ -490,7 +487,7 @@ class BaseScript:
|
||||
if self.fieldsets:
|
||||
fieldsets.extend(self.fieldsets)
|
||||
else:
|
||||
fields = list(name for name, _ in self._get_vars().items())
|
||||
fields = list(name for name, __ in self._get_vars().items())
|
||||
fieldsets.append((_('Script Data'), fields))
|
||||
|
||||
# Append the default fieldset if defined in the Meta class
|
||||
@@ -582,40 +579,6 @@ class BaseScript:
|
||||
self._log(message, obj, level=LogLevelChoices.LOG_FAILURE)
|
||||
self.failed = True
|
||||
|
||||
#
|
||||
# Convenience functions
|
||||
#
|
||||
|
||||
def load_yaml(self, filename):
|
||||
"""
|
||||
Return data from a YAML file
|
||||
"""
|
||||
# TODO: DEPRECATED: Remove this method in v4.5
|
||||
self._log(
|
||||
_("load_yaml is deprecated and will be removed in v4.5"),
|
||||
level=LogLevelChoices.LOG_WARNING
|
||||
)
|
||||
file_path = os.path.join(settings.SCRIPTS_ROOT, filename)
|
||||
with open(file_path, 'r') as datafile:
|
||||
data = yaml.load(datafile, Loader=yaml.SafeLoader)
|
||||
|
||||
return data
|
||||
|
||||
def load_json(self, filename):
|
||||
"""
|
||||
Return data from a JSON file
|
||||
"""
|
||||
# TODO: DEPRECATED: Remove this method in v4.5
|
||||
self._log(
|
||||
_("load_json is deprecated and will be removed in v4.5"),
|
||||
level=LogLevelChoices.LOG_WARNING
|
||||
)
|
||||
file_path = os.path.join(settings.SCRIPTS_ROOT, filename)
|
||||
with open(file_path, 'r') as datafile:
|
||||
data = json.load(datafile)
|
||||
|
||||
return data
|
||||
|
||||
#
|
||||
# Legacy Report functionality
|
||||
#
|
||||
|
||||
@@ -3,7 +3,6 @@ import datetime
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
from django.urls import reverse
|
||||
from django.utils.timezone import make_aware, now
|
||||
from rest_framework import status
|
||||
|
||||
from core.choices import ManagedFileRootPathChoices
|
||||
from core.events import *
|
||||
@@ -859,16 +858,16 @@ class ConfigTemplateTest(APIViewTestCases.APIViewTestCase):
|
||||
class ScriptTest(APITestCase):
|
||||
|
||||
class TestScriptClass(PythonClass):
|
||||
|
||||
class Meta:
|
||||
name = 'Test script'
|
||||
commit = True
|
||||
scheduling_enabled = True
|
||||
name = "Test script"
|
||||
|
||||
var1 = StringVar()
|
||||
var2 = IntegerVar()
|
||||
var3 = BooleanVar()
|
||||
|
||||
def run(self, data, commit=True):
|
||||
|
||||
self.log_info(data['var1'])
|
||||
self.log_success(data['var2'])
|
||||
self.log_failure(data['var3'])
|
||||
@@ -879,16 +878,14 @@ class ScriptTest(APITestCase):
|
||||
def setUpTestData(cls):
|
||||
module = ScriptModule.objects.create(
|
||||
file_root=ManagedFileRootPathChoices.SCRIPTS,
|
||||
file_path='script.py',
|
||||
file_path='/var/tmp/script.py'
|
||||
)
|
||||
script = Script.objects.create(
|
||||
Script.objects.create(
|
||||
module=module,
|
||||
name='Test script',
|
||||
name="Test script",
|
||||
is_executable=True,
|
||||
)
|
||||
cls.url = reverse('extras-api:script-detail', kwargs={'pk': script.pk})
|
||||
|
||||
@property
|
||||
def python_class(self):
|
||||
return self.TestScriptClass
|
||||
|
||||
@@ -901,7 +898,7 @@ class ScriptTest(APITestCase):
|
||||
def test_get_script(self):
|
||||
module = ScriptModule.objects.get(
|
||||
file_root=ManagedFileRootPathChoices.SCRIPTS,
|
||||
file_path='script.py',
|
||||
file_path='/var/tmp/script.py'
|
||||
)
|
||||
script = module.scripts.all().first()
|
||||
url = reverse('extras-api:script-detail', kwargs={'pk': script.pk})
|
||||
@@ -912,76 +909,6 @@ class ScriptTest(APITestCase):
|
||||
self.assertEqual(response.data['vars']['var2'], 'IntegerVar')
|
||||
self.assertEqual(response.data['vars']['var3'], 'BooleanVar')
|
||||
|
||||
def test_schedule_script_past_time_rejected(self):
|
||||
"""
|
||||
Scheduling with past schedule_at should fail.
|
||||
"""
|
||||
self.add_permissions('extras.run_script')
|
||||
|
||||
payload = {
|
||||
'data': {'var1': 'hello', 'var2': 1, 'var3': False},
|
||||
'commit': True,
|
||||
'schedule_at': now() - datetime.timedelta(hours=1),
|
||||
}
|
||||
response = self.client.post(self.url, payload, format='json', **self.header)
|
||||
|
||||
self.assertHttpStatus(response, status.HTTP_400_BAD_REQUEST)
|
||||
self.assertIn('schedule_at', response.data)
|
||||
# Be tolerant of exact wording but ensure we failed on schedule_at being in the past
|
||||
self.assertIn('future', str(response.data['schedule_at']).lower())
|
||||
|
||||
def test_schedule_script_interval_only(self):
|
||||
"""
|
||||
Interval without schedule_at should auto-set schedule_at now.
|
||||
"""
|
||||
self.add_permissions('extras.run_script')
|
||||
|
||||
payload = {
|
||||
'data': {'var1': 'hello', 'var2': 1, 'var3': False},
|
||||
'commit': True,
|
||||
'interval': 60,
|
||||
}
|
||||
response = self.client.post(self.url, payload, format='json', **self.header)
|
||||
|
||||
self.assertHttpStatus(response, status.HTTP_200_OK)
|
||||
# The latest job is returned in the script detail serializer under "result"
|
||||
self.assertIn('result', response.data)
|
||||
self.assertEqual(response.data['result']['interval'], 60)
|
||||
# Ensure a start time was autopopulated
|
||||
self.assertIsNotNone(response.data['result']['scheduled'])
|
||||
|
||||
def test_schedule_script_when_disabled(self):
|
||||
"""
|
||||
Scheduling should fail when script.scheduling_enabled=False.
|
||||
"""
|
||||
self.add_permissions('extras.run_script')
|
||||
|
||||
# Temporarily disable scheduling on the in-test Python class
|
||||
original = getattr(self.TestScriptClass.Meta, 'scheduling_enabled', True)
|
||||
self.TestScriptClass.Meta.scheduling_enabled = False
|
||||
base = {
|
||||
'data': {'var1': 'hello', 'var2': 1, 'var3': False},
|
||||
'commit': True,
|
||||
}
|
||||
# Check both schedule_at and interval paths
|
||||
cases = [
|
||||
{**base, 'schedule_at': now() + datetime.timedelta(minutes=5)},
|
||||
{**base, 'interval': 60},
|
||||
]
|
||||
try:
|
||||
for case in cases:
|
||||
with self.subTest(case=list(case.keys())):
|
||||
response = self.client.post(self.url, case, format='json', **self.header)
|
||||
|
||||
self.assertHttpStatus(response, status.HTTP_400_BAD_REQUEST)
|
||||
# Error should be attached to whichever field we used
|
||||
key = 'schedule_at' if 'schedule_at' in case else 'interval'
|
||||
self.assertIn(key, response.data)
|
||||
self.assertIn('scheduling is not enabled', str(response.data[key]).lower())
|
||||
finally:
|
||||
# Restore the original setting for other tests
|
||||
self.TestScriptClass.Meta.scheduling_enabled = original
|
||||
|
||||
|
||||
class CreatedUpdatedFilterTest(APITestCase):
|
||||
|
||||
|
||||
@@ -1300,28 +1300,6 @@ class CustomFieldAPITest(APITestCase):
|
||||
response = self.client.patch(url, data, format='json', **self.header)
|
||||
self.assertHttpStatus(response, status.HTTP_200_OK)
|
||||
|
||||
def test_url_regex_validation(self):
|
||||
"""
|
||||
Test that validation_regex is applied to URL custom fields (fixes #20498).
|
||||
"""
|
||||
site2 = Site.objects.get(name='Site 2')
|
||||
url = reverse('dcim-api:site-detail', kwargs={'pk': site2.pk})
|
||||
self.add_permissions('dcim.change_site')
|
||||
|
||||
cf_url = CustomField.objects.get(name='url_field')
|
||||
cf_url.validation_regex = r'^https://' # Require HTTPS
|
||||
cf_url.save()
|
||||
|
||||
# Test invalid URL (http instead of https)
|
||||
data = {'custom_fields': {'url_field': 'http://example.com'}}
|
||||
response = self.client.patch(url, data, format='json', **self.header)
|
||||
self.assertHttpStatus(response, status.HTTP_400_BAD_REQUEST)
|
||||
|
||||
# Test valid URL (https)
|
||||
data = {'custom_fields': {'url_field': 'https://example.com'}}
|
||||
response = self.client.patch(url, data, format='json', **self.header)
|
||||
self.assertHttpStatus(response, status.HTTP_200_OK)
|
||||
|
||||
def test_uniqueness_validation(self):
|
||||
# Create a unique custom field
|
||||
cf_text = CustomField.objects.get(name='text_field')
|
||||
|
||||
@@ -363,7 +363,7 @@ class EventRuleTest(APITestCase):
|
||||
body = json.loads(request.body)
|
||||
self.assertEqual(body['event'], 'created')
|
||||
self.assertEqual(body['timestamp'], job.kwargs['timestamp'])
|
||||
self.assertEqual(body['model'], 'site')
|
||||
self.assertEqual(body['object_type'], 'dcim.site')
|
||||
self.assertEqual(body['username'], 'testuser')
|
||||
self.assertEqual(body['request_id'], str(request_id))
|
||||
self.assertEqual(body['data']['name'], 'Site 1')
|
||||
|
||||
@@ -1,5 +1,3 @@
|
||||
import logging
|
||||
import tempfile
|
||||
from datetime import date, datetime, timezone
|
||||
from decimal import Decimal
|
||||
|
||||
@@ -9,7 +7,6 @@ from netaddr import IPAddress, IPNetwork
|
||||
|
||||
from dcim.models import DeviceRole
|
||||
from extras.scripts import *
|
||||
from utilities.testing import disable_logging
|
||||
|
||||
CHOICES = (
|
||||
('ff0000', 'Red'),
|
||||
@@ -35,35 +32,6 @@ JSON_DATA = """
|
||||
"""
|
||||
|
||||
|
||||
class ScriptTest(TestCase):
|
||||
|
||||
def test_load_yaml(self):
|
||||
datafile = tempfile.NamedTemporaryFile()
|
||||
datafile.write(bytes(YAML_DATA, 'UTF-8'))
|
||||
datafile.seek(0)
|
||||
|
||||
with disable_logging(level=logging.WARNING):
|
||||
data = Script().load_yaml(datafile.name)
|
||||
self.assertEqual(data, {
|
||||
'Foo': 123,
|
||||
'Bar': 456,
|
||||
'Baz': ['A', 'B', 'C'],
|
||||
})
|
||||
|
||||
def test_load_json(self):
|
||||
datafile = tempfile.NamedTemporaryFile()
|
||||
datafile.write(bytes(JSON_DATA, 'UTF-8'))
|
||||
datafile.seek(0)
|
||||
|
||||
with disable_logging(level=logging.WARNING):
|
||||
data = Script().load_json(datafile.name)
|
||||
self.assertEqual(data, {
|
||||
'Foo': 123,
|
||||
'Bar': 456,
|
||||
'Baz': ['A', 'B', 'C'],
|
||||
})
|
||||
|
||||
|
||||
class ScriptVariablesTest(TestCase):
|
||||
|
||||
def test_stringvar(self):
|
||||
|
||||
@@ -52,7 +52,6 @@ def send_webhook(event_rule, object_type, event_type, data, timestamp, username,
|
||||
'event': WEBHOOK_EVENT_TYPES.get(event_type, event_type),
|
||||
'timestamp': timestamp,
|
||||
'object_type': '.'.join(object_type.natural_key()),
|
||||
'model': object_type.model,
|
||||
'username': username,
|
||||
'request_id': request.id if request else None,
|
||||
'data': data,
|
||||
@@ -100,7 +99,7 @@ def send_webhook(event_rule, object_type, event_type, data, timestamp, username,
|
||||
'data': body.encode('utf8'),
|
||||
}
|
||||
logger.info(
|
||||
f"Sending {params['method']} request to {params['url']} ({context['model']} {context['event']})"
|
||||
f"Sending {params['method']} request to {params['url']} ({context['object_type']} {context['event']})"
|
||||
)
|
||||
logger.debug(params)
|
||||
try:
|
||||
|
||||
@@ -170,7 +170,7 @@ class IPAddressFilter(ContactFilterMixin, TenancyFilterMixin, PrimaryModelFilter
|
||||
|
||||
@strawberry_django.filter_field()
|
||||
def assigned(self, value: bool, prefix) -> Q:
|
||||
return Q(**{f"{prefix}assigned_object_id__isnull": not value})
|
||||
return Q(assigned_object_id__isnull=(not value))
|
||||
|
||||
@strawberry_django.filter_field()
|
||||
def parent(self, value: list[str], prefix) -> Q:
|
||||
|
||||
@@ -2,12 +2,13 @@ from typing import List
|
||||
|
||||
import strawberry
|
||||
import strawberry_django
|
||||
from strawberry_django.pagination import OffsetPaginated
|
||||
|
||||
from .types import *
|
||||
|
||||
|
||||
@strawberry.type(name="Query")
|
||||
class IPAMQuery:
|
||||
class IPAMQueryV1:
|
||||
asn: ASNType = strawberry_django.field()
|
||||
asn_list: List[ASNType] = strawberry_django.field()
|
||||
|
||||
@@ -61,3 +62,60 @@ class IPAMQuery:
|
||||
|
||||
vrf: VRFType = strawberry_django.field()
|
||||
vrf_list: List[VRFType] = strawberry_django.field()
|
||||
|
||||
|
||||
@strawberry.type(name="Query")
|
||||
class IPAMQuery:
|
||||
asn: ASNType = strawberry_django.field()
|
||||
asn_list: OffsetPaginated[ASNType] = strawberry_django.offset_paginated()
|
||||
|
||||
asn_range: ASNRangeType = strawberry_django.field()
|
||||
asn_range_list: OffsetPaginated[ASNRangeType] = strawberry_django.offset_paginated()
|
||||
|
||||
aggregate: AggregateType = strawberry_django.field()
|
||||
aggregate_list: OffsetPaginated[AggregateType] = strawberry_django.offset_paginated()
|
||||
|
||||
ip_address: IPAddressType = strawberry_django.field()
|
||||
ip_address_list: OffsetPaginated[IPAddressType] = strawberry_django.offset_paginated()
|
||||
|
||||
ip_range: IPRangeType = strawberry_django.field()
|
||||
ip_range_list: OffsetPaginated[IPRangeType] = strawberry_django.offset_paginated()
|
||||
|
||||
prefix: PrefixType = strawberry_django.field()
|
||||
prefix_list: OffsetPaginated[PrefixType] = strawberry_django.offset_paginated()
|
||||
|
||||
rir: RIRType = strawberry_django.field()
|
||||
rir_list: OffsetPaginated[RIRType] = strawberry_django.offset_paginated()
|
||||
|
||||
role: RoleType = strawberry_django.field()
|
||||
role_list: OffsetPaginated[RoleType] = strawberry_django.offset_paginated()
|
||||
|
||||
route_target: RouteTargetType = strawberry_django.field()
|
||||
route_target_list: OffsetPaginated[RouteTargetType] = strawberry_django.offset_paginated()
|
||||
|
||||
service: ServiceType = strawberry_django.field()
|
||||
service_list: OffsetPaginated[ServiceType] = strawberry_django.offset_paginated()
|
||||
|
||||
service_template: ServiceTemplateType = strawberry_django.field()
|
||||
service_template_list: OffsetPaginated[ServiceTemplateType] = strawberry_django.offset_paginated()
|
||||
|
||||
fhrp_group: FHRPGroupType = strawberry_django.field()
|
||||
fhrp_group_list: OffsetPaginated[FHRPGroupType] = strawberry_django.offset_paginated()
|
||||
|
||||
fhrp_group_assignment: FHRPGroupAssignmentType = strawberry_django.field()
|
||||
fhrp_group_assignment_list: OffsetPaginated[FHRPGroupAssignmentType] = strawberry_django.offset_paginated()
|
||||
|
||||
vlan: VLANType = strawberry_django.field()
|
||||
vlan_list: OffsetPaginated[VLANType] = strawberry_django.offset_paginated()
|
||||
|
||||
vlan_group: VLANGroupType = strawberry_django.field()
|
||||
vlan_group_list: OffsetPaginated[VLANGroupType] = strawberry_django.offset_paginated()
|
||||
|
||||
vlan_translation_policy: VLANTranslationPolicyType = strawberry_django.field()
|
||||
vlan_translation_policy_list: OffsetPaginated[VLANTranslationPolicyType] = strawberry_django.offset_paginated()
|
||||
|
||||
vlan_translation_rule: VLANTranslationRuleType = strawberry_django.field()
|
||||
vlan_translation_rule_list: OffsetPaginated[VLANTranslationRuleType] = strawberry_django.offset_paginated()
|
||||
|
||||
vrf: VRFType = strawberry_django.field()
|
||||
vrf_list: OffsetPaginated[VRFType] = strawberry_django.offset_paginated()
|
||||
|
||||
@@ -3,7 +3,6 @@ import django_tables2 as tables
|
||||
|
||||
from ipam.models import *
|
||||
from netbox.tables import NetBoxTable, columns
|
||||
from tenancy.tables import ContactsColumnMixin
|
||||
|
||||
__all__ = (
|
||||
'ServiceTable',
|
||||
@@ -36,7 +35,7 @@ class ServiceTemplateTable(NetBoxTable):
|
||||
default_columns = ('pk', 'name', 'protocol', 'ports', 'description')
|
||||
|
||||
|
||||
class ServiceTable(ContactsColumnMixin, NetBoxTable):
|
||||
class ServiceTable(NetBoxTable):
|
||||
name = tables.Column(
|
||||
verbose_name=_('Name'),
|
||||
linkify=True
|
||||
@@ -61,7 +60,7 @@ class ServiceTable(ContactsColumnMixin, NetBoxTable):
|
||||
class Meta(NetBoxTable.Meta):
|
||||
model = Service
|
||||
fields = (
|
||||
'pk', 'id', 'name', 'parent', 'protocol', 'ports', 'ipaddresses', 'description', 'contacts', 'comments',
|
||||
'tags', 'created', 'last_updated',
|
||||
'pk', 'id', 'name', 'parent', 'protocol', 'ports', 'ipaddresses', 'description', 'comments', 'tags',
|
||||
'created', 'last_updated',
|
||||
)
|
||||
default_columns = ('pk', 'name', 'parent', 'protocol', 'ports', 'description')
|
||||
|
||||
@@ -2,47 +2,90 @@ import logging
|
||||
|
||||
from django.conf import settings
|
||||
from django.utils import timezone
|
||||
from rest_framework import authentication, exceptions
|
||||
from drf_spectacular.extensions import OpenApiAuthenticationExtension
|
||||
from rest_framework import exceptions
|
||||
from rest_framework.authentication import BaseAuthentication, get_authorization_header
|
||||
from rest_framework.permissions import BasePermission, DjangoObjectPermissions, SAFE_METHODS
|
||||
|
||||
from netbox.config import get_config
|
||||
from users.constants import TOKEN_PREFIX
|
||||
from users.models import Token
|
||||
from utilities.request import get_client_ip
|
||||
|
||||
V1_KEYWORD = 'Token'
|
||||
V2_KEYWORD = 'Bearer'
|
||||
|
||||
class TokenAuthentication(authentication.TokenAuthentication):
|
||||
|
||||
class TokenAuthentication(BaseAuthentication):
|
||||
"""
|
||||
A custom authentication scheme which enforces Token expiration times and source IP restrictions.
|
||||
"""
|
||||
model = Token
|
||||
|
||||
def authenticate(self, request):
|
||||
result = super().authenticate(request)
|
||||
|
||||
if result:
|
||||
token = result[1]
|
||||
|
||||
# Enforce source IP restrictions (if any) set on the token
|
||||
if token.allowed_ips:
|
||||
client_ip = get_client_ip(request)
|
||||
if client_ip is None:
|
||||
raise exceptions.AuthenticationFailed(
|
||||
"Client IP address could not be determined for validation. Check that the HTTP server is "
|
||||
"correctly configured to pass the required header(s)."
|
||||
)
|
||||
if not token.validate_client_ip(client_ip):
|
||||
raise exceptions.AuthenticationFailed(
|
||||
f"Source IP {client_ip} is not permitted to authenticate using this token."
|
||||
)
|
||||
|
||||
return result
|
||||
|
||||
def authenticate_credentials(self, key):
|
||||
model = self.get_model()
|
||||
# Authorization header is not present; ignore
|
||||
if not (auth := get_authorization_header(request).split()):
|
||||
return
|
||||
# Unrecognized header; ignore
|
||||
if auth[0].lower() not in (V1_KEYWORD.lower().encode(), V2_KEYWORD.lower().encode()):
|
||||
return
|
||||
# Check for extraneous token content
|
||||
if len(auth) != 2:
|
||||
raise exceptions.AuthenticationFailed(
|
||||
'Invalid authorization header: Must be in the form "Bearer <key>.<token>" or "Token <token>"'
|
||||
)
|
||||
# Extract the key (if v2) & token plaintext from the auth header
|
||||
try:
|
||||
token = model.objects.prefetch_related('user').get(key=key)
|
||||
except model.DoesNotExist:
|
||||
raise exceptions.AuthenticationFailed("Invalid token")
|
||||
auth_value = auth[1].decode()
|
||||
except UnicodeError:
|
||||
raise exceptions.AuthenticationFailed("Invalid authorization header: Token contains invalid characters")
|
||||
|
||||
# Infer token version from presence or absence of prefix
|
||||
version = 2 if auth_value.startswith(TOKEN_PREFIX) else 1
|
||||
|
||||
if version == 1:
|
||||
key, plaintext = None, auth_value
|
||||
else:
|
||||
auth_value = auth_value.removeprefix(TOKEN_PREFIX)
|
||||
try:
|
||||
key, plaintext = auth_value.split('.', 1)
|
||||
except ValueError:
|
||||
raise exceptions.AuthenticationFailed(
|
||||
"Invalid authorization header: Could not parse key from v2 token. Did you mean to use 'Token' "
|
||||
"instead of 'Bearer'?"
|
||||
)
|
||||
|
||||
# Look for a matching token in the database
|
||||
try:
|
||||
qs = Token.objects.prefetch_related('user')
|
||||
if version == 1:
|
||||
# Fetch v1 token by querying plaintext value directly
|
||||
token = qs.get(version=version, plaintext=plaintext)
|
||||
else:
|
||||
# Fetch v2 token by key, then validate the plaintext
|
||||
token = qs.get(version=version, key=key)
|
||||
if not token.validate(plaintext):
|
||||
# Key is valid but plaintext is not. Raise DoesNotExist to guard against key enumeration.
|
||||
raise Token.DoesNotExist()
|
||||
except Token.DoesNotExist:
|
||||
raise exceptions.AuthenticationFailed(f"Invalid v{version} token")
|
||||
|
||||
# Enforce source IP restrictions (if any) set on the token
|
||||
if token.allowed_ips:
|
||||
client_ip = get_client_ip(request)
|
||||
if client_ip is None:
|
||||
raise exceptions.AuthenticationFailed(
|
||||
"Client IP address could not be determined for validation. Check that the HTTP server is "
|
||||
"correctly configured to pass the required header(s)."
|
||||
)
|
||||
if not token.validate_client_ip(client_ip):
|
||||
raise exceptions.AuthenticationFailed(
|
||||
f"Source IP {client_ip} is not permitted to authenticate using this token."
|
||||
)
|
||||
|
||||
# Enforce the Token's expiration time, if one has been set.
|
||||
if token.is_expired:
|
||||
raise exceptions.AuthenticationFailed("Token expired")
|
||||
|
||||
# Update last used, but only once per minute at most. This reduces write load on the database
|
||||
if not token.last_used or (timezone.now() - token.last_used).total_seconds() > 60:
|
||||
@@ -54,11 +97,8 @@ class TokenAuthentication(authentication.TokenAuthentication):
|
||||
else:
|
||||
Token.objects.filter(pk=token.pk).update(last_used=timezone.now())
|
||||
|
||||
# Enforce the Token's expiration time, if one has been set.
|
||||
if token.is_expired:
|
||||
raise exceptions.AuthenticationFailed("Token expired")
|
||||
|
||||
user = token.user
|
||||
|
||||
# When LDAP authentication is active try to load user data from LDAP directory
|
||||
if 'netbox.authentication.LDAPBackend' in settings.REMOTE_AUTH_BACKEND:
|
||||
from netbox.authentication import LDAPBackend
|
||||
@@ -132,3 +172,17 @@ class IsAuthenticatedOrLoginNotRequired(BasePermission):
|
||||
if not settings.LOGIN_REQUIRED:
|
||||
return True
|
||||
return request.user.is_authenticated
|
||||
|
||||
|
||||
class TokenScheme(OpenApiAuthenticationExtension):
|
||||
target_class = 'netbox.api.authentication.TokenAuthentication'
|
||||
name = 'tokenAuth'
|
||||
match_subclasses = True
|
||||
|
||||
def get_security_definition(self, auto_schema):
|
||||
return {
|
||||
'type': 'apiKey',
|
||||
'in': 'header',
|
||||
'name': 'Authorization',
|
||||
'description': '`Token <token>` (v1) or `Bearer <key>.<token>` (v2)',
|
||||
}
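Taken together, the authentication class above accepts two Authorization header forms, matching the OpenAPI description: `Token <token>` for v1 tokens and `Bearer <key>.<token>` for v2 tokens. A minimal, illustrative client sketch (all values are placeholders; a real v2 value must carry the configured TOKEN_PREFIX):

import requests

NETBOX_URL = 'https://netbox.example.com'  # placeholder

# Legacy v1 token: the full plaintext follows the 'Token' keyword.
v1_headers = {'Authorization': 'Token 0123456789abcdef0123456789abcdef01234567'}

# v2 token: 'Bearer' keyword followed by '<key>.<token>', as parsed by authenticate() above.
v2_headers = {'Authorization': 'Bearer <key>.<token>'}

for headers in (v1_headers, v2_headers):
    resp = requests.get(f'{NETBOX_URL}/api/status/', headers=headers)
    print(resp.status_code)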
|
||||
|
||||
@@ -184,14 +184,13 @@ class RemoteUserBackend(_RemoteUserBackend):
            else:
                user.groups.clear()
                logger.debug(f"Stripping user {user} from Groups")

        # Evaluate superuser status
        user.is_superuser = self._is_superuser(user)
        logger.debug(f"User {user} is Superuser: {user.is_superuser}")
        logger.debug(
            f"User {user} should be Superuser: {self._is_superuser(user)}")

        user.is_staff = self._is_staff(user)
        logger.debug(f"User {user} is Staff: {user.is_staff}")
        logger.debug(f"User {user} should be Staff: {self._is_staff(user)}")
        user.save()
        return user

@@ -251,19 +250,8 @@ class RemoteUserBackend(_RemoteUserBackend):
|
||||
return bool(result)
|
||||
|
||||
def _is_staff(self, user):
|
||||
logger = logging.getLogger('netbox.auth.RemoteUserBackend')
|
||||
staff_groups = settings.REMOTE_AUTH_STAFF_GROUPS
|
||||
logger.debug(f"Superuser Groups: {staff_groups}")
|
||||
staff_users = settings.REMOTE_AUTH_STAFF_USERS
|
||||
logger.debug(f"Staff Users :{staff_users}")
|
||||
user_groups = set()
|
||||
for g in user.groups.all():
|
||||
user_groups.add(g.name)
|
||||
logger.debug(f"User {user.username} is in Groups:{user_groups}")
|
||||
result = user.username in staff_users or (
|
||||
set(user_groups) & set(staff_groups))
|
||||
logger.debug(f"User {user.username} in Staff Users :{result}")
|
||||
return bool(result)
|
||||
# Retain for pre-v4.5 compatibility
|
||||
return user.is_superuser
|
||||
|
||||
def configure_user(self, request, user):
|
||||
logger = logging.getLogger('netbox.auth.RemoteUserBackend')
|
||||
|
||||
@@ -68,6 +68,16 @@ REDIS = {
|
||||
# https://docs.djangoproject.com/en/stable/ref/settings/#std:setting-SECRET_KEY
|
||||
SECRET_KEY = ''
|
||||
|
||||
# Define a mapping of cryptographic peppers to use when hashing API tokens. A minimum of one pepper is required to
|
||||
# enable v2 API tokens (NetBox v4.5+). Define peppers as a mapping of numeric ID to pepper value, as shown below. Each
|
||||
# pepper must be at least 50 characters in length.
|
||||
#
|
||||
# API_TOKEN_PEPPERS = {
|
||||
# 1: "<random string>",
|
||||
# 2: "<random string>",
|
||||
# }
|
||||
API_TOKEN_PEPPERS = {}
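One simple way to generate a value that satisfies the 50-character minimum described above (illustrative only; any sufficiently long random string works):

import secrets

# Produces roughly 86 URL-safe characters, comfortably above the 50-character minimum.
print(secrets.token_urlsafe(64))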
|
||||
|
||||
|
||||
#########################
|
||||
# #
|
||||
@@ -81,9 +91,6 @@ ADMINS = [
|
||||
# ('John Doe', 'jdoe@example.com'),
|
||||
]
|
||||
|
||||
# Permit the retrieval of API tokens after their creation.
|
||||
ALLOW_TOKEN_RETRIEVAL = False
|
||||
|
||||
# Enable any desired validators for local account passwords below. For a list of included validators, please see the
|
||||
# Django documentation at https://docs.djangoproject.com/en/stable/topics/auth/passwords/#password-validation.
|
||||
AUTH_PASSWORD_VALIDATORS = [
|
||||
|
||||
@@ -43,7 +43,9 @@ SECRET_KEY = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789'
|
||||
|
||||
DEFAULT_PERMISSIONS = {}
|
||||
|
||||
ALLOW_TOKEN_RETRIEVAL = True
|
||||
API_TOKEN_PEPPERS = {
|
||||
1: 'TEST-VALUE-DO-NOT-USE-TEST-VALUE-DO-NOT-USE-TEST-VALUE-DO-NOT-USE',
|
||||
}
|
||||
|
||||
LOGGING = {
|
||||
'version': 1,
|
||||
|
||||
@@ -28,7 +28,6 @@ def preferences(request):
|
||||
user_preferences = request.user.config if request.user.is_authenticated else {}
|
||||
return {
|
||||
'preferences': user_preferences,
|
||||
'htmx_navigation': user_preferences.get('ui.htmx_navigation', False) == 'true'
|
||||
}
|
||||
|
||||
|
||||
|
||||
@@ -1,24 +1,50 @@
|
||||
import strawberry
|
||||
from django.conf import settings
|
||||
from strawberry_django.optimizer import DjangoOptimizerExtension
|
||||
from strawberry.extensions import MaxAliasesLimiter # , SchemaExtension
|
||||
from strawberry.extensions import MaxAliasesLimiter
|
||||
from strawberry.schema.config import StrawberryConfig
|
||||
|
||||
from circuits.graphql.schema import CircuitsQuery
|
||||
from core.graphql.schema import CoreQuery
|
||||
from dcim.graphql.schema import DCIMQuery
|
||||
from extras.graphql.schema import ExtrasQuery
|
||||
from ipam.graphql.schema import IPAMQuery
|
||||
from circuits.graphql.schema import CircuitsQuery, CircuitsQueryV1
|
||||
from core.graphql.schema import CoreQuery, CoreQueryV1
|
||||
from dcim.graphql.schema import DCIMQuery, DCIMQueryV1
|
||||
from extras.graphql.schema import ExtrasQuery, ExtrasQueryV1
|
||||
from ipam.graphql.schema import IPAMQuery, IPAMQueryV1
|
||||
from netbox.registry import registry
|
||||
from tenancy.graphql.schema import TenancyQuery
|
||||
from users.graphql.schema import UsersQuery
|
||||
from virtualization.graphql.schema import VirtualizationQuery
|
||||
from vpn.graphql.schema import VPNQuery
|
||||
from wireless.graphql.schema import WirelessQuery
|
||||
from tenancy.graphql.schema import TenancyQuery, TenancyQueryV1
|
||||
from users.graphql.schema import UsersQuery, UsersQueryV1
|
||||
from virtualization.graphql.schema import VirtualizationQuery, VirtualizationQueryV1
|
||||
from vpn.graphql.schema import VPNQuery, VPNQueryV1
|
||||
from wireless.graphql.schema import WirelessQuery, WirelessQueryV1
|
||||
|
||||
__all__ = (
|
||||
'Query',
|
||||
'QueryV1',
|
||||
'QueryV2',
|
||||
'schema_v1',
|
||||
'schema_v2',
|
||||
)
|
||||
|
||||
|
||||
@strawberry.type
|
||||
class Query(
|
||||
class QueryV1(
|
||||
UsersQueryV1,
|
||||
CircuitsQueryV1,
|
||||
CoreQueryV1,
|
||||
DCIMQueryV1,
|
||||
ExtrasQueryV1,
|
||||
IPAMQueryV1,
|
||||
TenancyQueryV1,
|
||||
VirtualizationQueryV1,
|
||||
VPNQueryV1,
|
||||
WirelessQueryV1,
|
||||
*registry['plugins']['graphql_schemas'], # Append plugin schemas
|
||||
):
|
||||
"""Query class for GraphQL API v1"""
|
||||
pass
|
||||
|
||||
|
||||
@strawberry.type
|
||||
class QueryV2(
|
||||
UsersQuery,
|
||||
CircuitsQuery,
|
||||
CoreQuery,
|
||||
@@ -31,11 +57,26 @@ class Query(
|
||||
WirelessQuery,
|
||||
*registry['plugins']['graphql_schemas'], # Append plugin schemas
|
||||
):
|
||||
"""Query class for GraphQL API v2"""
|
||||
pass
|
||||
|
||||
|
||||
schema = strawberry.Schema(
|
||||
query=Query,
|
||||
# Expose a default Query class for the configured default GraphQL version
|
||||
class Query(QueryV2 if settings.GRAPHQL_DEFAULT_VERSION == 2 else QueryV1):
|
||||
pass
|
||||
|
||||
|
||||
# Generate schemas for both versions of the GraphQL API
|
||||
schema_v1 = strawberry.Schema(
|
||||
query=QueryV1,
|
||||
config=StrawberryConfig(auto_camel_case=False),
|
||||
extensions=[
|
||||
DjangoOptimizerExtension(prefetch_custom_queryset=True),
|
||||
MaxAliasesLimiter(max_alias_count=settings.GRAPHQL_MAX_ALIASES),
|
||||
]
|
||||
)
|
||||
schema_v2 = strawberry.Schema(
|
||||
query=QueryV2,
|
||||
config=StrawberryConfig(auto_camel_case=False),
|
||||
extensions=[
|
||||
DjangoOptimizerExtension(prefetch_custom_queryset=True),
|
||||
|
||||
16 netbox/netbox/graphql/utils.py Normal file
@@ -0,0 +1,16 @@
from django.conf import settings

from netbox.graphql.schema import schema_v1, schema_v2

__all__ = (
    'get_default_schema',
)


def get_default_schema():
    """
    Returns the GraphQL schema corresponding to the value of the GRAPHQL_DEFAULT_VERSION configuration parameter.
    """
    if settings.GRAPHQL_DEFAULT_VERSION == 2:
        return schema_v2
    return schema_v1
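
Aside (not part of this changeset): the practical difference between the two schemas selected here is the shape of list fields. A rough illustration, assuming a site ID of 1 and mirroring the queries used in the GraphQL tests later in this diff:

    # GraphQL API v1: list fields return a flat list of objects
    QUERY_V1 = '{location_list(filters: {site_id: "1"}) {id site {id}}}'

    # GraphQL API v2: list fields are paginated, wrapping objects in "results" alongside "total_count"
    QUERY_V2 = '{location_list(filters: {site_id: "1"}) {results {id site {id}} total_count}}'
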
@@ -50,21 +50,15 @@ class NetBoxFeatureSet(
|
||||
# Base model classes
|
||||
#
|
||||
|
||||
class ChangeLoggedModel(ChangeLoggingMixin, CustomValidationMixin, EventRulesMixin, models.Model):
|
||||
class BaseModel(models.Model):
|
||||
"""
|
||||
Base model for ancillary models; provides limited functionality for models which don't
|
||||
support NetBox's full feature set.
|
||||
"""
|
||||
objects = RestrictedQuerySet.as_manager()
|
||||
A global base model for all NetBox objects.
|
||||
|
||||
class Meta:
|
||||
abstract = True
|
||||
|
||||
|
||||
class NetBoxModel(NetBoxFeatureSet, models.Model):
|
||||
"""
|
||||
Base model for most object types. Suitable for use by plugins.
|
||||
This class provides some important overrides to Django's default functionality, such as
|
||||
- Overriding the default manager to use RestrictedQuerySet
|
||||
- Extending `clean()` to validate GenericForeignKey fields
|
||||
"""
|
||||
|
||||
objects = RestrictedQuerySet.as_manager()
|
||||
|
||||
class Meta:
|
||||
@@ -103,6 +97,25 @@ class NetBoxModel(NetBoxFeatureSet, models.Model):
|
||||
setattr(self, field.name, obj)
|
||||
|
||||
|
||||
class ChangeLoggedModel(ChangeLoggingMixin, CustomValidationMixin, EventRulesMixin, BaseModel):
|
||||
"""
|
||||
Base model for ancillary models; provides limited functionality for models which don't
|
||||
support NetBox's full feature set.
|
||||
"""
|
||||
|
||||
class Meta:
|
||||
abstract = True
|
||||
|
||||
|
||||
class NetBoxModel(NetBoxFeatureSet, BaseModel):
|
||||
"""
|
||||
Base model for most object types. Suitable for use by plugins.
|
||||
"""
|
||||
|
||||
class Meta:
|
||||
abstract = True
|
||||
|
||||
|
||||
#
|
||||
# NetBox internal base models
|
||||
#
|
||||
@@ -177,7 +190,7 @@ class NestedGroupModel(NetBoxFeatureSet, MPTTModel):
|
||||
})
|
||||
|
||||
|
||||
class OrganizationalModel(NetBoxFeatureSet, models.Model):
|
||||
class OrganizationalModel(NetBoxModel):
|
||||
"""
|
||||
Organizational models are those which are used solely to categorize and qualify other objects, and do not convey
|
||||
any real information about the infrastructure being modeled (for example, functional device roles). Organizational
|
||||
@@ -202,8 +215,6 @@ class OrganizationalModel(NetBoxFeatureSet, models.Model):
|
||||
blank=True
|
||||
)
|
||||
|
||||
objects = RestrictedQuerySet.as_manager()
|
||||
|
||||
class Meta:
|
||||
abstract = True
|
||||
ordering = ('name',)
|
||||
|
||||
@@ -3,12 +3,12 @@ from collections import OrderedDict
|
||||
from django.apps import apps
|
||||
from django.urls.exceptions import NoReverseMatch
|
||||
from drf_spectacular.utils import extend_schema
|
||||
from rest_framework import permissions
|
||||
from rest_framework.response import Response
|
||||
from rest_framework.reverse import reverse
|
||||
from rest_framework.views import APIView
|
||||
|
||||
from netbox.registry import registry
|
||||
from utilities.api import IsSuperuser
|
||||
|
||||
|
||||
@extend_schema(exclude=True)
|
||||
@@ -16,7 +16,7 @@ class InstalledPluginsAPIView(APIView):
|
||||
"""
|
||||
API view for listing all installed plugins
|
||||
"""
|
||||
permission_classes = [permissions.IsAdminUser]
|
||||
permission_classes = [IsSuperuser]
|
||||
_ignore_model_permissions = True
|
||||
schema = None
|
||||
|
||||
|
||||
@@ -26,16 +26,6 @@ def get_csv_delimiters():
|
||||
PREFERENCES = {
|
||||
|
||||
# User interface
|
||||
'ui.htmx_navigation': UserPreference(
|
||||
label=_('HTMX Navigation'),
|
||||
choices=(
|
||||
('', _('Disabled')),
|
||||
('true', _('Enabled')),
|
||||
),
|
||||
description=_('Enable dynamic UI navigation'),
|
||||
default=False,
|
||||
warning=_('Experimental feature')
|
||||
),
|
||||
'locale.language': UserPreference(
|
||||
label=_('Language'),
|
||||
choices=(
|
||||
|
||||
@@ -20,6 +20,7 @@ from netbox.plugins import PluginConfig
|
||||
from netbox.registry import registry
|
||||
import storages.utils # type: ignore
|
||||
from utilities.release import load_release_data
|
||||
from utilities.security import validate_peppers
|
||||
from utilities.string import trailing_slash
|
||||
from .monkey import get_unique_validators
|
||||
|
||||
@@ -43,9 +44,9 @@ VERSION = RELEASE.full_version # Retained for backward compatibility
|
||||
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
|
||||
|
||||
# Validate Python version
|
||||
if sys.version_info < (3, 10):
|
||||
if sys.version_info < (3, 12):
|
||||
raise RuntimeError(
|
||||
f"NetBox requires Python 3.10 or later. (Currently installed: Python {platform.python_version()})"
|
||||
f"NetBox requires Python 3.12 or later. (Currently installed: Python {platform.python_version()})"
|
||||
)
|
||||
|
||||
#
|
||||
@@ -75,8 +76,8 @@ elif hasattr(configuration, 'DATABASE') and hasattr(configuration, 'DATABASES'):
|
||||
|
||||
# Set static config parameters
|
||||
ADMINS = getattr(configuration, 'ADMINS', [])
|
||||
ALLOW_TOKEN_RETRIEVAL = getattr(configuration, 'ALLOW_TOKEN_RETRIEVAL', False)
|
||||
ALLOWED_HOSTS = getattr(configuration, 'ALLOWED_HOSTS') # Required
|
||||
API_TOKEN_PEPPERS = getattr(configuration, 'API_TOKEN_PEPPERS', {})
|
||||
AUTH_PASSWORD_VALIDATORS = getattr(configuration, 'AUTH_PASSWORD_VALIDATORS', [
|
||||
{
|
||||
"NAME": "django.contrib.auth.password_validation.MinimumLengthValidator",
|
||||
@@ -136,6 +137,7 @@ EVENTS_PIPELINE = getattr(configuration, 'EVENTS_PIPELINE', [
|
||||
EXEMPT_VIEW_PERMISSIONS = getattr(configuration, 'EXEMPT_VIEW_PERMISSIONS', [])
|
||||
FIELD_CHOICES = getattr(configuration, 'FIELD_CHOICES', {})
|
||||
FILE_UPLOAD_MAX_MEMORY_SIZE = getattr(configuration, 'FILE_UPLOAD_MAX_MEMORY_SIZE', 2621440)
|
||||
GRAPHQL_DEFAULT_VERSION = getattr(configuration, 'GRAPHQL_DEFAULT_VERSION', 1)
|
||||
GRAPHQL_MAX_ALIASES = getattr(configuration, 'GRAPHQL_MAX_ALIASES', 10)
|
||||
HOSTNAME = getattr(configuration, 'HOSTNAME', platform.node())
|
||||
HTTP_PROXIES = getattr(configuration, 'HTTP_PROXIES', {})
|
||||
@@ -174,8 +176,6 @@ REMOTE_AUTH_SUPERUSERS = getattr(configuration, 'REMOTE_AUTH_SUPERUSERS', [])
|
||||
REMOTE_AUTH_USER_EMAIL = getattr(configuration, 'REMOTE_AUTH_USER_EMAIL', 'HTTP_REMOTE_USER_EMAIL')
|
||||
REMOTE_AUTH_USER_FIRST_NAME = getattr(configuration, 'REMOTE_AUTH_USER_FIRST_NAME', 'HTTP_REMOTE_USER_FIRST_NAME')
|
||||
REMOTE_AUTH_USER_LAST_NAME = getattr(configuration, 'REMOTE_AUTH_USER_LAST_NAME', 'HTTP_REMOTE_USER_LAST_NAME')
|
||||
REMOTE_AUTH_STAFF_GROUPS = getattr(configuration, 'REMOTE_AUTH_STAFF_GROUPS', [])
|
||||
REMOTE_AUTH_STAFF_USERS = getattr(configuration, 'REMOTE_AUTH_STAFF_USERS', [])
|
||||
# Required by extras/migrations/0109_script_models.py
|
||||
REPORTS_ROOT = getattr(configuration, 'REPORTS_ROOT', os.path.join(BASE_DIR, 'reports')).rstrip('/')
|
||||
RQ_DEFAULT_TIMEOUT = getattr(configuration, 'RQ_DEFAULT_TIMEOUT', 300)
|
||||
@@ -229,6 +229,12 @@ if len(SECRET_KEY) < 50:
|
||||
f" python {BASE_DIR}/generate_secret_key.py"
|
||||
)
|
||||
|
||||
# Validate API token peppers
|
||||
if API_TOKEN_PEPPERS:
|
||||
validate_peppers(API_TOKEN_PEPPERS)
|
||||
else:
|
||||
warnings.warn("API_TOKEN_PEPPERS is not defined. v2 API tokens cannot be used.")
|
||||
|
||||
# Validate update repo URL and timeout
|
||||
if RELEASE_CHECK_URL:
|
||||
try:
|
||||
|
||||
@@ -270,7 +270,7 @@ class ActionsColumn(tables.Column):
|
||||
if not (self.actions or self.extra_buttons):
|
||||
return ''
|
||||
# Skip dummy records (e.g. available VLANs or IP ranges replacing individual IPs)
|
||||
if type(record) is not model or not getattr(record, 'pk', None):
|
||||
if not isinstance(record, model) or not getattr(record, 'pk', None):
|
||||
return ''
|
||||
|
||||
if request := getattr(table, 'context', {}).get('request'):
|
||||
|
||||
@@ -8,6 +8,7 @@ from rest_framework.test import APIClient
|
||||
|
||||
from core.models import ObjectType
|
||||
from dcim.models import Rack, Site
|
||||
from users.constants import TOKEN_PREFIX
|
||||
from users.models import Group, ObjectPermission, Token, User
|
||||
from utilities.testing import TestCase
|
||||
from utilities.testing.api import APITestCase
|
||||
@@ -16,67 +17,159 @@ from utilities.testing.api import APITestCase
|
||||
class TokenAuthenticationTestCase(APITestCase):
|
||||
|
||||
@override_settings(LOGIN_REQUIRED=True, EXEMPT_VIEW_PERMISSIONS=['*'])
|
||||
def test_token_authentication(self):
|
||||
url = reverse('dcim-api:site-list')
|
||||
|
||||
def test_no_token(self):
|
||||
# Request without a token should return a 403
|
||||
response = self.client.get(url)
|
||||
response = self.client.get(reverse('dcim-api:site-list'))
|
||||
self.assertEqual(response.status_code, 403)
|
||||
|
||||
@override_settings(LOGIN_REQUIRED=True, EXEMPT_VIEW_PERMISSIONS=['*'])
|
||||
def test_v1_token_valid(self):
|
||||
# Create a v1 token
|
||||
token = Token.objects.create(version=1, user=self.user)
|
||||
|
||||
# Valid token should return a 200
|
||||
token = Token.objects.create(user=self.user)
|
||||
response = self.client.get(url, HTTP_AUTHORIZATION=f'Token {token.key}')
|
||||
self.assertEqual(response.status_code, 200)
|
||||
header = f'Token {token.token}'
|
||||
response = self.client.get(reverse('dcim-api:site-list'), HTTP_AUTHORIZATION=header)
|
||||
self.assertEqual(response.status_code, 200, response.data)
|
||||
|
||||
# Check that the token's last_used time has been updated
|
||||
token.refresh_from_db()
|
||||
self.assertIsNotNone(token.last_used)
|
||||
|
||||
@override_settings(LOGIN_REQUIRED=True, EXEMPT_VIEW_PERMISSIONS=['*'])
|
||||
def test_v1_token_invalid(self):
|
||||
# Invalid token should return a 403
|
||||
header = 'Token XXXXXXXXXX'
|
||||
response = self.client.get(reverse('dcim-api:site-list'), HTTP_AUTHORIZATION=header)
|
||||
self.assertEqual(response.status_code, 403)
|
||||
self.assertEqual(response.data['detail'], "Invalid v1 token")
|
||||
|
||||
@override_settings(LOGIN_REQUIRED=True, EXEMPT_VIEW_PERMISSIONS=['*'])
|
||||
def test_v2_token_valid(self):
|
||||
# Create a v2 token
|
||||
token = Token.objects.create(version=2, user=self.user)
|
||||
|
||||
# Valid token should return a 200
|
||||
header = f'Bearer {TOKEN_PREFIX}{token.key}.{token.token}'
|
||||
response = self.client.get(reverse('dcim-api:site-list'), HTTP_AUTHORIZATION=header)
|
||||
self.assertEqual(response.status_code, 200, response.data)
|
||||
|
||||
# Check that the token's last_used time has been updated
|
||||
token.refresh_from_db()
|
||||
self.assertIsNotNone(token.last_used)
|
||||
|
||||
@override_settings(LOGIN_REQUIRED=True, EXEMPT_VIEW_PERMISSIONS=['*'])
|
||||
def test_v2_token_invalid(self):
|
||||
# Invalid token should return a 403
|
||||
header = f'Bearer {TOKEN_PREFIX}XXXXXX.XXXXXXXXXX'
|
||||
response = self.client.get(reverse('dcim-api:site-list'), HTTP_AUTHORIZATION=header)
|
||||
self.assertEqual(response.status_code, 403)
|
||||
self.assertEqual(response.data['detail'], "Invalid v2 token")
|
||||
|
||||
@override_settings(LOGIN_REQUIRED=True, EXEMPT_VIEW_PERMISSIONS=['*'])
|
||||
def test_token_expiration(self):
|
||||
url = reverse('dcim-api:site-list')
|
||||
|
||||
# Request without a non-expired token should succeed
|
||||
token = Token.objects.create(user=self.user)
|
||||
response = self.client.get(url, HTTP_AUTHORIZATION=f'Token {token.key}')
|
||||
# Create v1 & v2 tokens
|
||||
future = datetime.datetime(2100, 1, 1, tzinfo=datetime.timezone.utc)
|
||||
token1 = Token.objects.create(version=1, user=self.user, expires=future)
|
||||
token2 = Token.objects.create(version=2, user=self.user, expires=future)
|
||||
|
||||
# Request with a non-expired token should succeed
|
||||
response = self.client.get(url, HTTP_AUTHORIZATION=f'Token {token1.token}')
|
||||
self.assertEqual(response.status_code, 200)
|
||||
response = self.client.get(url, HTTP_AUTHORIZATION=f'Bearer {TOKEN_PREFIX}{token2.key}.{token2.token}')
|
||||
self.assertEqual(response.status_code, 200)
|
||||
|
||||
# Request with an expired token should fail
|
||||
token.expires = datetime.datetime(2020, 1, 1, tzinfo=datetime.timezone.utc)
|
||||
token.save()
|
||||
response = self.client.get(url, HTTP_AUTHORIZATION=f'Token {token.key}')
|
||||
past = datetime.datetime(2020, 1, 1, tzinfo=datetime.timezone.utc)
|
||||
token1.expires = past
|
||||
token1.save()
|
||||
token2.expires = past
|
||||
token2.save()
|
||||
response = self.client.get(url, HTTP_AUTHORIZATION=f'Token {token1.key}')
|
||||
self.assertEqual(response.status_code, 403)
|
||||
response = self.client.get(url, HTTP_AUTHORIZATION=f'Bearer {TOKEN_PREFIX}{token2.key}')
|
||||
self.assertEqual(response.status_code, 403)
|
||||
|
||||
@override_settings(LOGIN_REQUIRED=True, EXEMPT_VIEW_PERMISSIONS=['*'])
|
||||
def test_token_write_enabled(self):
|
||||
url = reverse('dcim-api:site-list')
|
||||
data = {
|
||||
'name': 'Site 1',
|
||||
'slug': 'site-1',
|
||||
}
|
||||
data = [
|
||||
{
|
||||
'name': 'Site 1',
|
||||
'slug': 'site-1',
|
||||
},
|
||||
{
|
||||
'name': 'Site 2',
|
||||
'slug': 'site-2',
|
||||
},
|
||||
]
|
||||
self.add_permissions('dcim.view_site', 'dcim.add_site')
|
||||
|
||||
# Request with a write-disabled token should fail
|
||||
token = Token.objects.create(user=self.user, write_enabled=False)
|
||||
response = self.client.post(url, data, format='json', HTTP_AUTHORIZATION=f'Token {token.key}')
|
||||
# Create v1 & v2 tokens
|
||||
token1 = Token.objects.create(version=1, user=self.user, write_enabled=False)
|
||||
token2 = Token.objects.create(version=2, user=self.user, write_enabled=False)
|
||||
|
||||
token1_header = f'Token {token1.token}'
|
||||
token2_header = f'Bearer {TOKEN_PREFIX}{token2.key}.{token2.token}'
|
||||
|
||||
# GET request with a write-disabled token should succeed
|
||||
response = self.client.get(url, HTTP_AUTHORIZATION=token1_header)
|
||||
self.assertEqual(response.status_code, 200)
|
||||
response = self.client.get(url, HTTP_AUTHORIZATION=token2_header)
|
||||
self.assertEqual(response.status_code, 200)
|
||||
|
||||
# POST request with a write-disabled token should fail
|
||||
response = self.client.post(url, data[0], format='json', HTTP_AUTHORIZATION=token1_header)
|
||||
self.assertEqual(response.status_code, 403)
|
||||
response = self.client.post(url, data[1], format='json', HTTP_AUTHORIZATION=token2_header)
|
||||
self.assertEqual(response.status_code, 403)
|
||||
|
||||
# Request with a write-enabled token should succeed
|
||||
token.write_enabled = True
|
||||
token.save()
|
||||
response = self.client.post(url, data, format='json', HTTP_AUTHORIZATION=f'Token {token.key}')
|
||||
self.assertEqual(response.status_code, 403)
|
||||
# POST request with a write-enabled token should succeed
|
||||
token1.write_enabled = True
|
||||
token1.save()
|
||||
token2.write_enabled = True
|
||||
token2.save()
|
||||
response = self.client.post(url, data[0], format='json', HTTP_AUTHORIZATION=token1_header)
|
||||
self.assertEqual(response.status_code, 201)
|
||||
response = self.client.post(url, data[1], format='json', HTTP_AUTHORIZATION=token2_header)
|
||||
self.assertEqual(response.status_code, 201)
|
||||
|
||||
@override_settings(LOGIN_REQUIRED=True, EXEMPT_VIEW_PERMISSIONS=['*'])
|
||||
def test_token_allowed_ips(self):
|
||||
url = reverse('dcim-api:site-list')
|
||||
|
||||
# Create v1 & v2 tokens
|
||||
token1 = Token.objects.create(version=1, user=self.user, allowed_ips=['192.0.2.0/24'])
|
||||
token2 = Token.objects.create(version=2, user=self.user, allowed_ips=['192.0.2.0/24'])
|
||||
|
||||
# Request from a non-allowed client IP should fail
|
||||
token = Token.objects.create(user=self.user, allowed_ips=['192.0.2.0/24'])
|
||||
response = self.client.get(url, HTTP_AUTHORIZATION=f'Token {token.key}', REMOTE_ADDR='127.0.0.1')
|
||||
response = self.client.get(
|
||||
url,
|
||||
HTTP_AUTHORIZATION=f'Token {token1.token}',
|
||||
REMOTE_ADDR='127.0.0.1'
|
||||
)
|
||||
self.assertEqual(response.status_code, 403)
|
||||
response = self.client.get(
|
||||
url,
|
||||
HTTP_AUTHORIZATION=f'Bearer {TOKEN_PREFIX}{token2.key}.{token2.token}',
|
||||
REMOTE_ADDR='127.0.0.1'
|
||||
)
|
||||
self.assertEqual(response.status_code, 403)
|
||||
|
||||
# Request with an expired token should fail
|
||||
response = self.client.get(url, HTTP_AUTHORIZATION=f'Token {token.key}', REMOTE_ADDR='192.0.2.1')
|
||||
# Request from an allowed client IP should succeed
|
||||
response = self.client.get(
|
||||
url,
|
||||
HTTP_AUTHORIZATION=f'Token {token1.token}',
|
||||
REMOTE_ADDR='192.0.2.1'
|
||||
)
|
||||
self.assertEqual(response.status_code, 200)
|
||||
response = self.client.get(
|
||||
url,
|
||||
HTTP_AUTHORIZATION=f'Bearer {TOKEN_PREFIX}{token2.key}.{token2.token}',
|
||||
REMOTE_ADDR='192.0.2.1'
|
||||
)
|
||||
self.assertEqual(response.status_code, 200)
|
||||
|
||||
|
||||
@@ -427,7 +520,7 @@ class ObjectPermissionAPIViewTestCase(TestCase):
|
||||
"""
|
||||
self.user = User.objects.create(username='testuser')
|
||||
self.token = Token.objects.create(user=self.user)
|
||||
self.header = {'HTTP_AUTHORIZATION': 'Token {}'.format(self.token.key)}
|
||||
self.header = {'HTTP_AUTHORIZATION': f'Bearer {TOKEN_PREFIX}{self.token.key}.{self.token.token}'}
|
||||
|
||||
@override_settings(EXEMPT_VIEW_PERMISSIONS=[])
|
||||
def test_get_object(self):
|
||||
|
||||
@@ -46,9 +46,9 @@ class GraphQLTestCase(TestCase):
|
||||
class GraphQLAPITestCase(APITestCase):
|
||||
|
||||
@override_settings(LOGIN_REQUIRED=True)
|
||||
def test_graphql_filter_objects(self):
|
||||
def test_graphql_filter_objects_v1(self):
|
||||
"""
|
||||
Test the operation of filters for GraphQL API requests.
|
||||
Test the operation of filters for GraphQL API v1 requests (old format with List[Type]).
|
||||
"""
|
||||
sites = (
|
||||
Site(name='Site 1', slug='site-1'),
|
||||
@@ -85,7 +85,7 @@ class GraphQLAPITestCase(APITestCase):
|
||||
obj_perm.object_types.add(ObjectType.objects.get_for_model(Location))
|
||||
obj_perm.object_types.add(ObjectType.objects.get_for_model(Site))
|
||||
|
||||
url = reverse('graphql')
|
||||
url = reverse('graphql_v1')
|
||||
|
||||
# A valid request should return the filtered list
|
||||
query = '{location_list(filters: {site_id: "' + str(sites[0].pk) + '"}) {id site {id}}}'
|
||||
@@ -126,3 +126,91 @@ class GraphQLAPITestCase(APITestCase):
|
||||
data = json.loads(response.content)
|
||||
self.assertNotIn('errors', data)
|
||||
self.assertEqual(len(data['data']['site']['locations']), 0)
|
||||
|
||||
@override_settings(LOGIN_REQUIRED=True)
|
||||
def test_graphql_filter_objects(self):
|
||||
"""
|
||||
Test the operation of filters for GraphQL API v2 requests (new format with OffsetPaginated).
|
||||
"""
|
||||
sites = (
|
||||
Site(name='Site 1', slug='site-1'),
|
||||
Site(name='Site 2', slug='site-2'),
|
||||
Site(name='Site 3', slug='site-3'),
|
||||
)
|
||||
Site.objects.bulk_create(sites)
|
||||
Location.objects.create(
|
||||
site=sites[0],
|
||||
name='Location 1',
|
||||
slug='location-1',
|
||||
status=LocationStatusChoices.STATUS_PLANNED
|
||||
),
|
||||
Location.objects.create(
|
||||
site=sites[1],
|
||||
name='Location 2',
|
||||
slug='location-2',
|
||||
status=LocationStatusChoices.STATUS_STAGING
|
||||
),
|
||||
Location.objects.create(
|
||||
site=sites[1],
|
||||
name='Location 3',
|
||||
slug='location-3',
|
||||
status=LocationStatusChoices.STATUS_ACTIVE
|
||||
),
|
||||
|
||||
# Add object-level permission
|
||||
obj_perm = ObjectPermission(
|
||||
name='Test permission',
|
||||
actions=['view']
|
||||
)
|
||||
obj_perm.save()
|
||||
obj_perm.users.add(self.user)
|
||||
obj_perm.object_types.add(ObjectType.objects.get_for_model(Location))
|
||||
obj_perm.object_types.add(ObjectType.objects.get_for_model(Site))
|
||||
|
||||
url = reverse('graphql_v2')
|
||||
|
||||
# A valid request should return the filtered list
|
||||
query = '{location_list(filters: {site_id: "' + str(sites[0].pk) + '"}) {results {id site {id}} total_count}}'
|
||||
response = self.client.post(url, data={'query': query}, format="json", **self.header)
|
||||
self.assertHttpStatus(response, status.HTTP_200_OK)
|
||||
data = json.loads(response.content)
|
||||
self.assertNotIn('errors', data)
|
||||
self.assertEqual(len(data['data']['location_list']['results']), 1)
|
||||
self.assertEqual(data['data']['location_list']['total_count'], 1)
|
||||
self.assertIsNotNone(data['data']['location_list']['results'][0]['site'])
|
||||
|
||||
# Test OR logic
|
||||
query = """{
|
||||
location_list( filters: {
|
||||
status: STATUS_PLANNED,
|
||||
OR: {status: STATUS_STAGING}
|
||||
}) {
|
||||
results {
|
||||
id site {id}
|
||||
}
|
||||
total_count
|
||||
}
|
||||
}"""
|
||||
response = self.client.post(url, data={'query': query}, format="json", **self.header)
|
||||
self.assertHttpStatus(response, status.HTTP_200_OK)
|
||||
data = json.loads(response.content)
|
||||
self.assertNotIn('errors', data)
|
||||
self.assertEqual(len(data['data']['location_list']['results']), 2)
|
||||
self.assertEqual(data['data']['location_list']['total_count'], 2)
|
||||
|
||||
# An invalid request should return an empty list
|
||||
query = '{location_list(filters: {site_id: "99999"}) {results {id site {id}} total_count}}' # Invalid site ID
|
||||
response = self.client.post(url, data={'query': query}, format="json", **self.header)
|
||||
self.assertHttpStatus(response, status.HTTP_200_OK)
|
||||
data = json.loads(response.content)
|
||||
self.assertEqual(len(data['data']['location_list']['results']), 0)
|
||||
self.assertEqual(data['data']['location_list']['total_count'], 0)
|
||||
|
||||
# Removing the permissions from location should result in an empty locations list
|
||||
obj_perm.object_types.remove(ObjectType.objects.get_for_model(Location))
|
||||
query = '{site(id: ' + str(sites[0].pk) + ') {id locations {id}}}'
|
||||
response = self.client.post(url, data={'query': query}, format="json", **self.header)
|
||||
self.assertHttpStatus(response, status.HTTP_200_OK)
|
||||
data = json.loads(response.content)
|
||||
self.assertNotIn('errors', data)
|
||||
self.assertEqual(len(data['data']['site']['locations']), 0)
|
||||
|
||||
@@ -6,7 +6,8 @@ from drf_spectacular.views import SpectacularAPIView, SpectacularRedocView, Spec

from account.views import LoginView, LogoutView
from netbox.api.views import APIRootView, StatusView
from netbox.graphql.schema import schema
from netbox.graphql.schema import schema_v1, schema_v2
from netbox.graphql.utils import get_default_schema
from netbox.graphql.views import NetBoxGraphQLView
from netbox.plugins.urls import plugin_patterns, plugin_api_patterns
from netbox.views import HomeView, MediaView, StaticMediaFailureView, SearchView, htmx
@@ -40,7 +41,7 @@ _patterns = [
    # HTMX views
    path('htmx/object-selector/', htmx.ObjectSelectorView.as_view(), name='htmx_object_selector'),

    # API
    # REST API
    path('api/', APIRootView.as_view(), name='api-root'),
    path('api/circuits/', include('circuits.api.urls')),
    path('api/core/', include('core.api.urls')),
@@ -54,6 +55,7 @@ _patterns = [
    path('api/wireless/', include('wireless.api.urls')),
    path('api/status/', StatusView.as_view(), name='api-status'),

    # REST API schema
    path(
        "api/schema/",
        cache_page(timeout=86400, key_prefix=f"api_schema_{settings.RELEASE.version}")(
@@ -64,8 +66,10 @@ _patterns = [
    path('api/schema/swagger-ui/', SpectacularSwaggerView.as_view(url_name='schema'), name='api_docs'),
    path('api/schema/redoc/', SpectacularRedocView.as_view(url_name='schema'), name='api_redocs'),

    # GraphQL
    path('graphql/', NetBoxGraphQLView.as_view(schema=schema), name='graphql'),
    # GraphQL API
    path('graphql/', NetBoxGraphQLView.as_view(schema=get_default_schema()), name='graphql'),
    path('graphql/v1/', NetBoxGraphQLView.as_view(schema=schema_v1), name='graphql_v1'),
    path('graphql/v2/', NetBoxGraphQLView.as_view(schema=schema_v2), name='graphql_v2'),

    # Serving static media in Django to pipe it through LoginRequiredMiddleware
    path('media/<path:path>', MediaView.as_view(), name='media'),
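
Aside (not part of this changeset): with the URL patterns above, each schema version is reachable at its own path. A minimal sketch of a v2 request, assuming a hypothetical host netbox.example.com and placeholder v2 credentials (the Bearer format mirrors the authentication tests earlier in this diff):

    import requests

    query = '{site_list {results {id name} total_count}}'
    response = requests.post(
        'https://netbox.example.com/graphql/v2/',
        json={'query': query},
        headers={'Authorization': 'Bearer nbt_<key>.<secret>'},  # placeholder token
    )
    print(response.json())
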
@@ -47,9 +47,9 @@ class HomeView(ConditionalLoginRequiredMixin, View):
|
||||
))
|
||||
dashboard = get_default_dashboard(config=DEFAULT_DASHBOARD).get_layout()
|
||||
|
||||
# Check whether a new release is available. (Only for staff/superusers.)
|
||||
# Check whether a new release is available. (Only for superusers.)
|
||||
new_release = None
|
||||
if request.user.is_staff or request.user.is_superuser:
|
||||
if request.user.is_superuser:
|
||||
latest_release = cache.get('latest_release')
|
||||
if latest_release:
|
||||
release_version, release_url = latest_release
|
||||
|
||||
2 netbox/project-static/dist/netbox.css vendored - File diff suppressed because one or more lines are too long
2 netbox/project-static/dist/netbox.js vendored - File diff suppressed because one or more lines are too long
4 netbox/project-static/dist/netbox.js.map vendored - File diff suppressed because one or more lines are too long
@@ -20,13 +20,11 @@ function slugify(slug: string, chars: number): string {
|
||||
* For any slug fields, add event listeners to handle automatically generating slug values.
|
||||
*/
|
||||
export function initReslug(): void {
|
||||
for (const slugButton of getElements<HTMLButtonElement>('button.reslug')) {
|
||||
for (const slugButton of getElements<HTMLButtonElement>('button#reslug')) {
|
||||
const form = slugButton.form;
|
||||
if (form == null) continue;
|
||||
|
||||
const slugField = form.querySelector('input.slug-field') as HTMLInputElement;
|
||||
const slugField = form.querySelector('#id_slug') as HTMLInputElement;
|
||||
if (slugField == null) continue;
|
||||
|
||||
const sourceId = slugField.getAttribute('slug-source');
|
||||
const sourceField = form.querySelector(`#id_${sourceId}`) as HTMLInputElement;
|
||||
|
||||
|
||||
@@ -16,11 +16,6 @@ pre {
|
||||
background: var(--#{$prefix}bg-surface);
|
||||
}
|
||||
|
||||
// Permit copying of badge text
|
||||
.badge {
|
||||
user-select: text;
|
||||
}
|
||||
|
||||
// Button adjustments
|
||||
.btn {
|
||||
// Tabler sets display: flex
|
||||
|
||||
@@ -39,10 +39,6 @@
|
||||
<th scope="row">{% trans "Superuser" %}</th>
|
||||
<td>{% checkmark request.user.is_superuser %}</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<th scope="row">{% trans "Staff" %}</th>
|
||||
<td>{% checkmark request.user.is_staff %}</td>
|
||||
</tr>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
@@ -1,62 +1,8 @@
|
||||
{% extends 'generic/object.html' %}
|
||||
{% load form_helpers %}
|
||||
{% load helpers %}
|
||||
{% extends 'users/token.html' %}
|
||||
{% load i18n %}
|
||||
{% load plugins %}
|
||||
|
||||
{% block breadcrumbs %}
|
||||
<li class="breadcrumb-item"><a href="{% url 'account:usertoken_list' %}">{% trans "My API Tokens" %}</a></li>
|
||||
<li class="breadcrumb-item">
|
||||
<a href="{% url 'account:usertoken_list' %}">{% trans "My API Tokens" %}</a>
|
||||
</li>
|
||||
{% endblock breadcrumbs %}
|
||||
|
||||
{% block title %}{% trans "Token" %} {{ object }}{% endblock %}
|
||||
|
||||
{% block subtitle %}{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="row">
|
||||
<div class="col col-md-12">
|
||||
<div class="card">
|
||||
<h2 class="card-header">{% trans "Token" %}</h2>
|
||||
<table class="table table-hover attr-table">
|
||||
<tr>
|
||||
<th scope="row">{% trans "Key" %}</th>
|
||||
<td>
|
||||
{% if key %}
|
||||
<div class="float-end">
|
||||
{% copy_content "token_id" %}
|
||||
</div>
|
||||
<div id="token_id">{{ key }}</div>
|
||||
{% else %}
|
||||
{{ object.partial }}
|
||||
{% endif %}
|
||||
</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<th scope="row">{% trans "Description" %}</th>
|
||||
<td>{{ object.description|placeholder }}</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<th scope="row">{% trans "Write enabled" %}</th>
|
||||
<td>{% checkmark object.write_enabled %}</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<th scope="row">{% trans "Created" %}</th>
|
||||
<td>{{ object.created|isodatetime }}</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<th scope="row">{% trans "Expires" %}</th>
|
||||
<td>{{ object.expires|isodatetime|placeholder }}</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<th scope="row">{% trans "Last used" %}</th>
|
||||
<td>{{ object.last_used|isodatetime|placeholder }}</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<th scope="row">{% trans "Allowed IPs" %}</th>
|
||||
<td>{{ object.allowed_ips|join:", "|placeholder }}</td>
|
||||
</tr>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{% endblock %}
|
||||
|
||||
@@ -95,7 +95,7 @@ Blocks:
|
||||
|
||||
{# Page content #}
|
||||
<div class="page-wrapper">
|
||||
<div id="page-content" {% htmx_boost %}>
|
||||
<div id="page-content">
|
||||
|
||||
{# Page header #}
|
||||
{% block header %}
|
||||
|
||||
@@ -27,33 +27,31 @@
|
||||
<div class="mt-1 small text-secondary">
|
||||
{% if request.user.is_superuser %}
|
||||
{% trans "Admin" %}
|
||||
{% elif request.user.is_staff %}
|
||||
{% trans "Staff" %}
|
||||
{% else %}
|
||||
{% trans "User" %}
|
||||
{% endif %}
|
||||
</div>
|
||||
</div>
|
||||
</a>
|
||||
<div class="dropdown-menu dropdown-menu-end dropdown-menu-arrow" {% htmx_boost %}>
|
||||
<div class="dropdown-menu dropdown-menu-end dropdown-menu-arrow">
|
||||
<a href="{% url 'account:profile' %}" class="dropdown-item">
|
||||
<i class="dropdown-item-icon mdi mdi-account"></i> {% trans "Profile" %}
|
||||
<i class="mdi mdi-account"></i> {% trans "Profile" %}
|
||||
</a>
|
||||
<a href="{% url 'account:bookmarks' %}" class="dropdown-item">
|
||||
<i class="dropdown-item-icon mdi mdi-bookmark"></i> {% trans "Bookmarks" %}
|
||||
<i class="mdi mdi-bookmark"></i> {% trans "Bookmarks" %}
|
||||
</a>
|
||||
<a href="{% url 'account:subscriptions' %}" class="dropdown-item">
|
||||
<i class="dropdown-item-icon mdi mdi-bell"></i> {% trans "Subscriptions" %}
|
||||
<i class="mdi mdi-bell"></i> {% trans "Subscriptions" %}
|
||||
</a>
|
||||
<a href="{% url 'account:preferences' %}" class="dropdown-item">
|
||||
<i class="dropdown-item-icon mdi mdi-wrench"></i> {% trans "Preferences" %}
|
||||
<i class="mdi mdi-wrench"></i> {% trans "Preferences" %}
|
||||
</a>
|
||||
<a href="{% url 'account:usertoken_list' %}" class="dropdown-item">
|
||||
<i class="dropdown-item-icon mdi mdi-key"></i> {% trans "API Tokens" %}
|
||||
<i class="mdi mdi-key"></i> {% trans "API Tokens" %}
|
||||
</a>
|
||||
<hr class="dropdown-divider" />
|
||||
<a href="{% url 'logout' %}" hx-disable="true" class="dropdown-item">
|
||||
<i class="dropdown-item-icon mdi mdi-logout-variant"></i> {% trans "Log Out" %}
|
||||
<i class="mdi mdi-logout-variant"></i> {% trans "Log Out" %}
|
||||
</a>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
@@ -37,7 +37,7 @@
|
||||
path. Refer to <a href="{{ docs_url }}">the installation documentation</a> for further guidance.
|
||||
{% endblocktrans %}
|
||||
<ul>
|
||||
{% if request.user.is_staff or request.user.is_superuser %}
|
||||
{% if request.user.is_superuser %}
|
||||
<li><code>STATIC_ROOT: <strong>{{ settings.STATIC_ROOT }}</strong></code></li>
|
||||
{% endif %}
|
||||
<li><code>STATIC_URL: <strong>{{ settings.STATIC_URL }}</strong></code></li>
|
||||
|
||||
@@ -14,9 +14,24 @@
|
||||
<h2 class="card-header">{% trans "Token" %}</h2>
|
||||
<table class="table table-hover attr-table">
|
||||
<tr>
|
||||
<th scope="row">{% trans "Key" %}</th>
|
||||
<td>{% if settings.ALLOW_TOKEN_RETRIEVAL %}{{ object.key }}{% else %}{{ object.partial }}{% endif %}</td>
|
||||
<th scope="row">{% trans "Version" %}</th>
|
||||
<td>{{ object.version }}</td>
|
||||
</tr>
|
||||
{% if object.version == 1 %}
|
||||
<tr>
|
||||
<th scope="row">{% trans "Token" %}</th>
|
||||
<td>{{ object.partial }}</td>
|
||||
</tr>
|
||||
{% else %}
|
||||
<tr>
|
||||
<th scope="row">{% trans "Key" %}</th>
|
||||
<td>{{ object }}</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<th scope="row">{% trans "Pepper ID" %}</th>
|
||||
<td>{{ object.pepper_id }}</td>
|
||||
</tr>
|
||||
{% endif %}
|
||||
<tr>
|
||||
<th scope="row">{% trans "User" %}</th>
|
||||
<td>
|
||||
|
||||
@@ -35,10 +35,6 @@
|
||||
<th scope="row">{% trans "Active" %}</th>
|
||||
<td>{% checkmark object.is_active %}</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<th scope="row">{% trans "Staff" %}</th>
|
||||
<td>{% checkmark object.is_staff %}</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<th scope="row">{% trans "Superuser" %}</th>
|
||||
<td>{% checkmark object.is_superuser %}</td>
|
||||
|
||||
@@ -2,12 +2,13 @@ from typing import List
|
||||
|
||||
import strawberry
|
||||
import strawberry_django
|
||||
from strawberry_django.pagination import OffsetPaginated
|
||||
|
||||
from .types import *
|
||||
|
||||
|
||||
@strawberry.type(name="Query")
|
||||
class TenancyQuery:
|
||||
class TenancyQueryV1:
|
||||
tenant: TenantType = strawberry_django.field()
|
||||
tenant_list: List[TenantType] = strawberry_django.field()
|
||||
|
||||
@@ -25,3 +26,24 @@ class TenancyQuery:
|
||||
|
||||
contact_assignment: ContactAssignmentType = strawberry_django.field()
|
||||
contact_assignment_list: List[ContactAssignmentType] = strawberry_django.field()
|
||||
|
||||
|
||||
@strawberry.type(name="Query")
|
||||
class TenancyQuery:
|
||||
tenant: TenantType = strawberry_django.field()
|
||||
tenant_list: OffsetPaginated[TenantType] = strawberry_django.offset_paginated()
|
||||
|
||||
tenant_group: TenantGroupType = strawberry_django.field()
|
||||
tenant_group_list: OffsetPaginated[TenantGroupType] = strawberry_django.offset_paginated()
|
||||
|
||||
contact: ContactType = strawberry_django.field()
|
||||
contact_list: OffsetPaginated[ContactType] = strawberry_django.offset_paginated()
|
||||
|
||||
contact_role: ContactRoleType = strawberry_django.field()
|
||||
contact_role_list: OffsetPaginated[ContactRoleType] = strawberry_django.offset_paginated()
|
||||
|
||||
contact_group: ContactGroupType = strawberry_django.field()
|
||||
contact_group_list: OffsetPaginated[ContactGroupType] = strawberry_django.offset_paginated()
|
||||
|
||||
contact_assignment: ContactAssignmentType = strawberry_django.field()
|
||||
contact_assignment_list: OffsetPaginated[ContactAssignmentType] = strawberry_django.offset_paginated()
|
||||
|
||||
File diff suppressed because it is too large
@@ -1,4 +1,3 @@
|
||||
from django.conf import settings
|
||||
from django.contrib.auth import authenticate
|
||||
from rest_framework import serializers
|
||||
from rest_framework.exceptions import AuthenticationFailed, PermissionDenied
|
||||
@@ -15,14 +14,13 @@ __all__ = (
|
||||
|
||||
|
||||
class TokenSerializer(ValidatedModelSerializer):
|
||||
key = serializers.CharField(
|
||||
min_length=40,
|
||||
max_length=40,
|
||||
allow_blank=True,
|
||||
token = serializers.CharField(
|
||||
required=False,
|
||||
write_only=not settings.ALLOW_TOKEN_RETRIEVAL
|
||||
default=Token.generate,
|
||||
)
|
||||
user = UserSerializer(
|
||||
nested=True
|
||||
)
|
||||
user = UserSerializer(nested=True)
|
||||
allowed_ips = serializers.ListField(
|
||||
child=IPNetworkSerializer(),
|
||||
required=False,
|
||||
@@ -33,15 +31,20 @@ class TokenSerializer(ValidatedModelSerializer):
|
||||
class Meta:
|
||||
model = Token
|
||||
fields = (
|
||||
'id', 'url', 'display_url', 'display', 'user', 'created', 'expires', 'last_used', 'key', 'write_enabled',
|
||||
'description', 'allowed_ips',
|
||||
'id', 'url', 'display_url', 'display', 'version', 'key', 'user', 'description', 'created', 'expires',
|
||||
'last_used', 'write_enabled', 'pepper_id', 'allowed_ips', 'token',
|
||||
)
|
||||
brief_fields = ('id', 'url', 'display', 'key', 'write_enabled', 'description')
|
||||
read_only_fields = ('key',)
|
||||
brief_fields = ('id', 'url', 'display', 'version', 'key', 'write_enabled', 'description')
|
||||
|
||||
def to_internal_value(self, data):
|
||||
if not getattr(self.instance, 'key', None) and 'key' not in data:
|
||||
data['key'] = Token.generate_key()
|
||||
return super().to_internal_value(data)
|
||||
def get_fields(self):
|
||||
fields = super().get_fields()
|
||||
|
||||
# Make user field read-only if updating an existing Token.
|
||||
if self.instance is not None:
|
||||
fields['user'].read_only = True
|
||||
|
||||
return fields
|
||||
|
||||
def validate(self, data):
|
||||
|
||||
@@ -75,8 +78,8 @@ class TokenProvisionSerializer(TokenSerializer):
|
||||
class Meta:
|
||||
model = Token
|
||||
fields = (
|
||||
'id', 'url', 'display_url', 'display', 'user', 'created', 'expires', 'last_used', 'key', 'write_enabled',
|
||||
'description', 'allowed_ips', 'username', 'password',
|
||||
'id', 'url', 'display_url', 'display', 'version', 'user', 'key', 'created', 'expires', 'last_used', 'key',
|
||||
'write_enabled', 'description', 'allowed_ips', 'username', 'password', 'token',
|
||||
)
|
||||
|
||||
def validate(self, data):
|
||||
|
||||
@@ -52,7 +52,7 @@ class UserSerializer(ValidatedModelSerializer):
|
||||
model = User
|
||||
fields = (
|
||||
'id', 'url', 'display_url', 'display', 'username', 'password', 'first_name', 'last_name', 'email',
|
||||
'is_staff', 'is_active', 'date_joined', 'last_login', 'groups', 'permissions',
|
||||
'is_active', 'date_joined', 'last_login', 'groups', 'permissions',
|
||||
)
|
||||
brief_fields = ('id', 'url', 'display', 'username')
|
||||
extra_kwargs = {
|
||||
|
||||
17 netbox/users/choices.py Normal file
@@ -0,0 +1,17 @@
from django.utils.translation import gettext_lazy as _

from utilities.choices import ChoiceSet

__all__ = (
    'TokenVersionChoices',
)


class TokenVersionChoices(ChoiceSet):
    V1 = 1
    V2 = 2

    CHOICES = [
        (V1, _('v1')),
        (V2, _('v2')),
    ]
@@ -1,3 +1,5 @@
import string

from django.db.models import Q


@@ -7,3 +9,9 @@ OBJECTPERMISSION_OBJECT_TYPES = Q(
)

CONSTRAINT_TOKEN_USER = '$user'

# API tokens
TOKEN_PREFIX = 'nbt_'  # Used for v2 tokens only
TOKEN_KEY_LENGTH = 12
TOKEN_DEFAULT_LENGTH = 40
TOKEN_CHARSET = string.ascii_letters + string.digits
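
Aside (not part of this changeset): taken together with the authentication tests, these constants suggest the client-facing form of a v2 token: the "nbt_" prefix, a 12-character lookup key, a dot, and a 40-character secret drawn from TOKEN_CHARSET. A rough sketch of assembling such a string (the helper name is illustrative, not the model's actual API; TOKEN_CHARSET and the length constants are those defined above):

    import secrets

    def make_v2_token_string(key: str, secret: str) -> str:
        # 'nbt_' + 12-char key + '.' + 40-char secret, per TOKEN_PREFIX, TOKEN_KEY_LENGTH and TOKEN_DEFAULT_LENGTH
        return f'nbt_{key}.{secret}'

    key = ''.join(secrets.choice(TOKEN_CHARSET) for _ in range(TOKEN_KEY_LENGTH))
    secret = ''.join(secrets.choice(TOKEN_CHARSET) for _ in range(TOKEN_DEFAULT_LENGTH))
    print(make_v2_token_string(key, secret))
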
@@ -81,7 +81,7 @@ class UserFilterSet(BaseFilterSet):
|
||||
class Meta:
|
||||
model = User
|
||||
fields = (
|
||||
'id', 'username', 'first_name', 'last_name', 'email', 'date_joined', 'last_login', 'is_staff', 'is_active',
|
||||
'id', 'username', 'first_name', 'last_name', 'email', 'date_joined', 'last_login', 'is_active',
|
||||
'is_superuser',
|
||||
)
|
||||
|
||||
@@ -130,15 +130,27 @@ class TokenFilterSet(BaseFilterSet):
|
||||
field_name='expires',
|
||||
lookup_expr='lte'
|
||||
)
|
||||
last_used = django_filters.DateTimeFilter()
|
||||
last_used__gte = django_filters.DateTimeFilter(
|
||||
field_name='last_used',
|
||||
lookup_expr='gte'
|
||||
)
|
||||
last_used__lte = django_filters.DateTimeFilter(
|
||||
field_name='last_used',
|
||||
lookup_expr='lte'
|
||||
)
|
||||
|
||||
class Meta:
|
||||
model = Token
|
||||
fields = ('id', 'key', 'write_enabled', 'description', 'last_used')
|
||||
fields = (
|
||||
'id', 'version', 'key', 'pepper_id', 'write_enabled', 'description', 'created', 'expires', 'last_used',
|
||||
)
|
||||
|
||||
def search(self, queryset, name, value):
|
||||
if not value.strip():
|
||||
return queryset
|
||||
return queryset.filter(
|
||||
Q(key=value) |
|
||||
Q(user__username__icontains=value) |
|
||||
Q(description__icontains=value)
|
||||
)
|
||||
|
||||
@@ -37,11 +37,6 @@ class UserBulkEditForm(BulkEditForm):
|
||||
widget=BulkEditNullBooleanSelect,
|
||||
label=_('Active')
|
||||
)
|
||||
is_staff = forms.NullBooleanField(
|
||||
required=False,
|
||||
widget=BulkEditNullBooleanSelect,
|
||||
label=_('Staff status')
|
||||
)
|
||||
is_superuser = forms.NullBooleanField(
|
||||
required=False,
|
||||
widget=BulkEditNullBooleanSelect,
|
||||
@@ -50,7 +45,7 @@ class UserBulkEditForm(BulkEditForm):
|
||||
|
||||
model = User
|
||||
fieldsets = (
|
||||
FieldSet('first_name', 'last_name', 'is_active', 'is_staff', 'is_superuser'),
|
||||
FieldSet('first_name', 'last_name', 'is_active', 'is_superuser'),
|
||||
)
|
||||
nullable_fields = ('first_name', 'last_name')
|
||||
|
||||
|
||||
@@ -1,6 +1,7 @@
|
||||
from django import forms
|
||||
from django.utils.translation import gettext as _
|
||||
from users.models import *
|
||||
from users.choices import TokenVersionChoices
|
||||
from utilities.forms import CSVModelForm
|
||||
|
||||
|
||||
@@ -23,8 +24,7 @@ class UserImportForm(CSVModelForm):
|
||||
class Meta:
|
||||
model = User
|
||||
fields = (
|
||||
'username', 'first_name', 'last_name', 'email', 'password', 'is_staff',
|
||||
'is_active', 'is_superuser'
|
||||
'username', 'first_name', 'last_name', 'email', 'password', 'is_active', 'is_superuser'
|
||||
)
|
||||
|
||||
def save(self, *args, **kwargs):
|
||||
@@ -35,12 +35,18 @@ class UserImportForm(CSVModelForm):
|
||||
|
||||
|
||||
class TokenImportForm(CSVModelForm):
|
||||
key = forms.CharField(
|
||||
label=_('Key'),
|
||||
version = forms.ChoiceField(
|
||||
choices=TokenVersionChoices,
|
||||
initial=TokenVersionChoices.V2,
|
||||
required=False,
|
||||
help_text=_("If no key is provided, one will be generated automatically.")
|
||||
help_text=_("Specify version 1 or 2 (v2 will be used by default)")
|
||||
)
|
||||
token = forms.CharField(
|
||||
label=_('Token'),
|
||||
required=False,
|
||||
help_text=_("If no token is provided, one will be generated automatically.")
|
||||
)
|
||||
|
||||
class Meta:
|
||||
model = Token
|
||||
fields = ('user', 'key', 'write_enabled', 'expires', 'description',)
|
||||
fields = ('user', 'version', 'token', 'write_enabled', 'expires', 'description',)
|
||||
|
||||
@@ -3,10 +3,12 @@ from django.utils.translation import gettext_lazy as _
|
||||
|
||||
from netbox.forms import NetBoxModelFilterSetForm
|
||||
from netbox.forms.mixins import SavedFiltersMixin
|
||||
from users.choices import TokenVersionChoices
|
||||
from users.models import Group, ObjectPermission, Token, User
|
||||
from utilities.forms import BOOLEAN_WITH_BLANK_CHOICES, FilterForm
|
||||
from utilities.forms.fields import DynamicModelMultipleChoiceField
|
||||
from utilities.forms.rendering import FieldSet
|
||||
from utilities.forms.utils import add_blank_choice
|
||||
from utilities.forms.widgets import DateTimePicker
|
||||
|
||||
__all__ = (
|
||||
@@ -29,7 +31,7 @@ class UserFilterForm(NetBoxModelFilterSetForm):
|
||||
fieldsets = (
|
||||
FieldSet('q', 'filter_id',),
|
||||
FieldSet('group_id', name=_('Group')),
|
||||
FieldSet('is_active', 'is_staff', 'is_superuser', name=_('Status')),
|
||||
FieldSet('is_active', 'is_superuser', name=_('Status')),
|
||||
)
|
||||
group_id = DynamicModelMultipleChoiceField(
|
||||
queryset=Group.objects.all(),
|
||||
@@ -43,13 +45,6 @@ class UserFilterForm(NetBoxModelFilterSetForm):
|
||||
),
|
||||
label=_('Is Active'),
|
||||
)
|
||||
is_staff = forms.NullBooleanField(
|
||||
required=False,
|
||||
widget=forms.Select(
|
||||
choices=BOOLEAN_WITH_BLANK_CHOICES
|
||||
),
|
||||
label=_('Is Staff'),
|
||||
)
|
||||
is_superuser = forms.NullBooleanField(
|
||||
required=False,
|
||||
widget=forms.Select(
|
||||
@@ -117,7 +112,11 @@ class TokenFilterForm(SavedFiltersMixin, FilterForm):
|
||||
model = Token
|
||||
fieldsets = (
|
||||
FieldSet('q', 'filter_id',),
|
||||
FieldSet('user_id', 'write_enabled', 'expires', 'last_used', name=_('Token')),
|
||||
FieldSet('version', 'user_id', 'write_enabled', 'expires', 'last_used', name=_('Token')),
|
||||
)
|
||||
version = forms.ChoiceField(
|
||||
choices=add_blank_choice(TokenVersionChoices),
|
||||
required=False,
|
||||
)
|
||||
user_id = DynamicModelMultipleChoiceField(
|
||||
queryset=User.objects.all(),
|
||||
|
||||
@@ -1,7 +1,6 @@
|
||||
import json
|
||||
|
||||
from django import forms
|
||||
from django.conf import settings
|
||||
from django.contrib.auth import password_validation
|
||||
from django.contrib.postgres.forms import SimpleArrayField
|
||||
from django.core.exceptions import FieldError
|
||||
@@ -12,14 +11,11 @@ from core.models import ObjectType
|
||||
from ipam.formfields import IPNetworkFormField
|
||||
from ipam.validators import prefix_validator
|
||||
from netbox.preferences import PREFERENCES
|
||||
from users.choices import TokenVersionChoices
|
||||
from users.constants import *
|
||||
from users.models import *
|
||||
from utilities.data import flatten_dict
|
||||
from utilities.forms.fields import (
|
||||
ContentTypeMultipleChoiceField,
|
||||
DynamicModelMultipleChoiceField,
|
||||
JSONField,
|
||||
)
|
||||
from utilities.forms.fields import ContentTypeMultipleChoiceField, DynamicModelMultipleChoiceField, JSONField
|
||||
from utilities.forms.rendering import FieldSet
|
||||
from utilities.forms.widgets import DateTimePicker, SplitMultiSelectWidget
|
||||
from utilities.permissions import qs_filter_from_constraints
|
||||
@@ -64,8 +60,7 @@ class UserConfigFormMetaclass(forms.models.ModelFormMetaclass):
|
||||
class UserConfigForm(forms.ModelForm, metaclass=UserConfigFormMetaclass):
|
||||
fieldsets = (
|
||||
FieldSet(
|
||||
'locale.language', 'pagination.per_page', 'pagination.placement', 'ui.htmx_navigation',
|
||||
'ui.tables.striping',
|
||||
'locale.language', 'pagination.per_page', 'pagination.placement', 'ui.tables.striping',
|
||||
name=_('User Interface')
|
||||
),
|
||||
FieldSet('data_format', 'csv_delimiter', name=_('Miscellaneous')),
|
||||
@@ -115,11 +110,11 @@ class UserConfigForm(forms.ModelForm, metaclass=UserConfigFormMetaclass):
|
||||
|
||||
|
||||
class UserTokenForm(forms.ModelForm):
|
||||
key = forms.CharField(
|
||||
label=_('Key'),
|
||||
token = forms.CharField(
|
||||
label=_('Token'),
|
||||
help_text=_(
|
||||
'Keys must be at least 40 characters in length. <strong>Be sure to record your key</strong> prior to '
|
||||
'submitting this form, as it may no longer be accessible once the token has been created.'
|
||||
'Tokens must be at least 40 characters in length. <strong>Be sure to record your key</strong> prior to '
|
||||
'submitting this form, as it will no longer be accessible once the token has been created.'
|
||||
),
|
||||
widget=forms.TextInput(
|
||||
attrs={'data-clipboard': 'true'}
|
||||
@@ -138,7 +133,7 @@ class UserTokenForm(forms.ModelForm):
|
||||
class Meta:
|
||||
model = Token
|
||||
fields = [
|
||||
'key', 'write_enabled', 'expires', 'description', 'allowed_ips',
|
||||
'version', 'token', 'write_enabled', 'expires', 'description', 'allowed_ips',
|
||||
]
|
||||
widgets = {
|
||||
'expires': DateTimePicker(),
|
||||
@@ -147,13 +142,24 @@ class UserTokenForm(forms.ModelForm):
|
||||
def __init__(self, *args, **kwargs):
|
||||
super().__init__(*args, **kwargs)
|
||||
|
||||
# Omit the key field if token retrieval is not permitted
|
||||
if self.instance.pk and not settings.ALLOW_TOKEN_RETRIEVAL:
|
||||
del self.fields['key']
|
||||
if self.instance.pk:
|
||||
# Disable the version & user fields for existing Tokens
|
||||
self.fields['version'].disabled = True
|
||||
self.fields['user'].disabled = True
|
||||
|
||||
# Omit the key field when editing an existing Token
|
||||
del self.fields['token']
|
||||
|
||||
# Generate an initial random key if none has been specified
|
||||
if not self.instance.pk and not self.initial.get('key'):
|
||||
self.initial['key'] = Token.generate_key()
|
||||
elif self.instance._state.adding and not self.initial.get('token'):
|
||||
self.initial['version'] = TokenVersionChoices.V2
|
||||
self.initial['token'] = Token.generate()
|
||||
|
||||
def save(self, commit=True):
|
||||
if self.instance._state.adding and self.cleaned_data.get('token'):
|
||||
self.instance.token = self.cleaned_data['token']
|
||||
|
||||
return super().save(commit=commit)
|
||||
|
||||
|
||||
class TokenForm(UserTokenForm):
|
||||
@@ -162,14 +168,17 @@ class TokenForm(UserTokenForm):
|
||||
label=_('User')
|
||||
)
|
||||
|
||||
class Meta:
|
||||
model = Token
|
||||
class Meta(UserTokenForm.Meta):
|
||||
fields = [
|
||||
'user', 'key', 'write_enabled', 'expires', 'description', 'allowed_ips',
|
||||
'version', 'token', 'user', 'write_enabled', 'expires', 'description', 'allowed_ips',
|
||||
]
|
||||
widgets = {
|
||||
'expires': DateTimePicker(),
|
||||
}
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
super().__init__(*args, **kwargs)
|
||||
|
||||
# If not creating a new Token, disable the user field
|
||||
if self.instance and not self.instance._state.adding:
|
||||
self.fields['user'].disabled = True
|
||||
|
||||
|
||||
class UserForm(forms.ModelForm):
|
||||
@@ -198,7 +207,7 @@ class UserForm(forms.ModelForm):
|
||||
fieldsets = (
|
||||
FieldSet('username', 'password', 'confirm_password', 'first_name', 'last_name', 'email', name=_('User')),
|
||||
FieldSet('groups', name=_('Groups')),
|
||||
FieldSet('is_active', 'is_staff', 'is_superuser', name=_('Status')),
|
||||
FieldSet('is_active', 'is_superuser', name=_('Status')),
|
||||
FieldSet('object_permissions', name=_('Permissions')),
|
||||
)
|
||||
|
||||
@@ -206,7 +215,7 @@ class UserForm(forms.ModelForm):
|
||||
model = User
|
||||
fields = [
|
||||
'username', 'first_name', 'last_name', 'email', 'groups', 'object_permissions',
|
||||
'is_active', 'is_staff', 'is_superuser',
|
||||
'is_active', 'is_superuser',
|
||||
]
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
|
||||
@@ -27,7 +27,6 @@ class UserFilter(BaseObjectTypeFilterMixin):
|
||||
last_name: FilterLookup[str] | None = strawberry_django.filter_field()
|
||||
email: FilterLookup[str] | None = strawberry_django.filter_field()
|
||||
is_superuser: FilterLookup[bool] | None = strawberry_django.filter_field()
|
||||
is_staff: FilterLookup[bool] | None = strawberry_django.filter_field()
|
||||
is_active: FilterLookup[bool] | None = strawberry_django.filter_field()
|
||||
date_joined: DatetimeFilterLookup[datetime] | None = strawberry_django.filter_field()
|
||||
last_login: DatetimeFilterLookup[datetime] | None = strawberry_django.filter_field()
|
||||
|
||||
@@ -2,14 +2,24 @@ from typing import List
|
||||
|
||||
import strawberry
|
||||
import strawberry_django
|
||||
from strawberry_django.pagination import OffsetPaginated
|
||||
|
||||
from .types import *
|
||||
|
||||
|
||||
@strawberry.type(name="Query")
|
||||
class UsersQuery:
|
||||
class UsersQueryV1:
|
||||
group: GroupType = strawberry_django.field()
|
||||
group_list: List[GroupType] = strawberry_django.field()
|
||||
|
||||
user: UserType = strawberry_django.field()
|
||||
user_list: List[UserType] = strawberry_django.field()
|
||||
|
||||
|
||||
@strawberry.type(name="Query")
|
||||
class UsersQuery:
|
||||
group: GroupType = strawberry_django.field()
|
||||
group_list: OffsetPaginated[GroupType] = strawberry_django.offset_paginated()
|
||||
|
||||
user: UserType = strawberry_django.field()
|
||||
user_list: OffsetPaginated[UserType] = strawberry_django.offset_paginated()
|
||||
|
||||
@@ -25,7 +25,7 @@ class GroupType(BaseObjectType):
|
||||
@strawberry_django.type(
|
||||
User,
|
||||
fields=[
|
||||
'id', 'username', 'first_name', 'last_name', 'email', 'is_staff', 'is_active', 'date_joined', 'groups',
|
||||
'id', 'username', 'first_name', 'last_name', 'email', 'is_active', 'date_joined', 'groups',
|
||||
],
|
||||
filters=UserFilter,
|
||||
pagination=True
|
||||
|
||||
15 netbox/users/migrations/0013_user_remove_is_staff.py Normal file
@@ -0,0 +1,15 @@
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('users', '0012_drop_django_admin_log_table'),
    ]

    operations = [
        migrations.RemoveField(
            model_name='user',
            name='is_staff',
        ),
    ]
100 netbox/users/migrations/0014_users_token_v2.py Normal file
@@ -0,0 +1,100 @@
|
||||
import django.core.validators
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('users', '0013_user_remove_is_staff'),
    ]

    operations = [
        # Rename the original key field to "plaintext"
        migrations.RenameField(
            model_name='token',
            old_name='key',
            new_name='plaintext',
        ),
        migrations.RunSQL(
            sql="ALTER INDEX IF EXISTS users_token_key_820deccd_like RENAME TO users_token_plaintext_46c6f315_like",
        ),
        migrations.RunSQL(
            sql="ALTER INDEX IF EXISTS users_token_key_key RENAME TO users_token_plaintext_key",
        ),

        # Make plaintext (formerly key) nullable for v2 tokens
        migrations.AlterField(
            model_name='token',
            name='plaintext',
            field=models.CharField(
                max_length=40,
                unique=True,
                blank=True,
                null=True,
                validators=[django.core.validators.MinLengthValidator(40)]
            ),
        ),

        # Add version field to distinguish v1 and v2 tokens
        migrations.AddField(
            model_name='token',
            name='version',
            field=models.PositiveSmallIntegerField(default=1),  # Mark all existing Tokens as v1
            preserve_default=False,
        ),

        # Change the default version for new tokens to v2
        migrations.AlterField(
            model_name='token',
            name='version',
            field=models.PositiveSmallIntegerField(default=2),
        ),

        # Add new key, pepper, and hmac_digest fields for v2 tokens
        migrations.AddField(
            model_name='token',
            name='key',
            field=models.CharField(
                blank=True,
                max_length=12,
                null=True,
                unique=True,
                validators=[django.core.validators.MinLengthValidator(12)]
            ),
        ),
        migrations.AddField(
            model_name='token',
            name='pepper_id',
            field=models.PositiveSmallIntegerField(blank=True, null=True),
        ),
        migrations.AddField(
            model_name='token',
            name='hmac_digest',
            field=models.CharField(blank=True, max_length=64, null=True),
        ),

        # Add constraints to enforce v1/v2-dependent fields
        migrations.AddConstraint(
            model_name='token',
            constraint=models.CheckConstraint(
                name='enforce_version_dependent_fields',
                condition=models.Q(
                    models.Q(
                        ('hmac_digest__isnull', True),
                        ('key__isnull', True),
                        ('pepper_id__isnull', True),
                        ('plaintext__isnull', False),
                        ('version', 1)
                    ),
                    models.Q(
                        ('hmac_digest__isnull', False),
                        ('key__isnull', False),
                        ('pepper_id__isnull', False),
                        ('plaintext__isnull', True),
                        ('version', 2)
                    ),
                    _connector='OR'
                )
            )
        ),
    ]
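The constraint added at the end of this migration ties each column to a token version. Outside the ORM it reduces to a simple predicate over the row; a minimal sketch of the equivalent logic in plain Python, using the field names from the migration (the sample values are illustrative):

def token_row_is_valid(version, plaintext, key, pepper_id, hmac_digest):
    """Mirror of the 'enforce_version_dependent_fields' check constraint."""
    if version == 1:
        # Legacy tokens keep only the stored plaintext value.
        return plaintext is not None and key is None and pepper_id is None and hmac_digest is None
    if version == 2:
        # Hashed tokens store a lookup key, pepper ID and HMAC digest, but no plaintext.
        return plaintext is None and key is not None and pepper_id is not None and hmac_digest is not None
    return False

assert token_row_is_valid(1, 'a' * 40, None, None, None)
assert token_row_is_valid(2, None, 'b' * 12, 0, 'c' * 64)
assert not token_row_is_valid(2, 'a' * 40, 'b' * 12, 0, 'c' * 64)
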
@@ -1,16 +1,22 @@
import binascii
import os
import hashlib
import hmac
import random

from django.conf import settings
from django.contrib.postgres.fields import ArrayField
from django.core.exceptions import ValidationError
from django.core.validators import MinLengthValidator
from django.db import models
from django.db.models import Q
from django.urls import reverse
from django.utils import timezone
from django.utils.translation import gettext_lazy as _
from netaddr import IPNetwork

from ipam.fields import IPNetworkField
from users.choices import TokenVersionChoices
from users.constants import TOKEN_CHARSET, TOKEN_DEFAULT_LENGTH, TOKEN_KEY_LENGTH, TOKEN_PREFIX
from users.utils import get_current_pepper
from utilities.querysets import RestrictedQuerySet

__all__ = (
@@ -23,11 +29,23 @@ class Token(models.Model):
    An API token used for user authentication. This extends the stock model to allow each user to have multiple tokens.
    It also supports setting an expiration time and toggling write ability.
    """
    _token = None

    version = models.PositiveSmallIntegerField(
        verbose_name=_('version'),
        choices=TokenVersionChoices,
        default=TokenVersionChoices.V2,
    )
    user = models.ForeignKey(
        to='users.User',
        on_delete=models.CASCADE,
        related_name='tokens'
    )
    description = models.CharField(
        verbose_name=_('description'),
        max_length=200,
        blank=True
    )
    created = models.DateTimeField(
        verbose_name=_('created'),
        auto_now_add=True
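TokenVersionChoices comes from users.choices, which is not part of this diff. Judging from how it is used here (integer values 1 and 2, with V2 as the new default), it presumably looks roughly like the following sketch; the actual definition may differ:

from django.db import models


class TokenVersionChoices(models.IntegerChoices):
    # Hypothetical reconstruction; the real class lives in users/choices.py.
    V1 = 1, 'Version 1 (plaintext)'
    V2 = 2, 'Version 2 (hashed)'
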
@@ -42,21 +60,41 @@ class Token(models.Model):
        blank=True,
        null=True
    )
    key = models.CharField(
        verbose_name=_('key'),
        max_length=40,
        unique=True,
        validators=[MinLengthValidator(40)]
    )
    write_enabled = models.BooleanField(
        verbose_name=_('write enabled'),
        default=True,
        help_text=_('Permit create/update/delete operations using this key')
    )
    description = models.CharField(
        verbose_name=_('description'),
        max_length=200,
        blank=True
    # For legacy v1 tokens, this field stores the plaintext 40-char token value. Not used for v2.
    plaintext = models.CharField(
        verbose_name=_('plaintext'),
        max_length=40,
        unique=True,
        blank=True,
        null=True,
        validators=[MinLengthValidator(40)],
    )
    key = models.CharField(
        verbose_name=_('key'),
        max_length=TOKEN_KEY_LENGTH,
        unique=True,
        blank=True,
        null=True,
        validators=[MinLengthValidator(TOKEN_KEY_LENGTH)],
        help_text=_('v2 token identification key'),
    )
    pepper_id = models.PositiveSmallIntegerField(
        verbose_name=_('pepper ID'),
        blank=True,
        null=True,
        help_text=_('ID of the cryptographic pepper used to hash the token (v2 only)'),
    )
    hmac_digest = models.CharField(
        verbose_name=_('digest'),
        max_length=64,
        blank=True,
        null=True,
        help_text=_('SHA256 hash of the token and pepper (v2 only)'),
    )
    allowed_ips = ArrayField(
        base_field=IPNetworkField(),
@@ -72,29 +110,113 @@ class Token(models.Model):
    objects = RestrictedQuerySet.as_manager()

    class Meta:
        ordering = ('-created',)
        verbose_name = _('token')
        verbose_name_plural = _('tokens')
        ordering = ('-created',)
        constraints = [
            models.CheckConstraint(
                name='enforce_version_dependent_fields',
                condition=(
                    Q(
                        version=1,
                        key__isnull=True,
                        pepper_id__isnull=True,
                        hmac_digest__isnull=True,
                        plaintext__isnull=False
                    ) |
                    Q(
                        version=2,
                        key__isnull=False,
                        pepper_id__isnull=False,
                        hmac_digest__isnull=False,
                        plaintext__isnull=True
                    )
                ),
            ),
        ]

    def __init__(self, *args, token=None, **kwargs):
        super().__init__(*args, **kwargs)

        # This stores the initial plaintext value (if given) on the creation of a new Token. If not provided, a
        # random token value will be generated and assigned immediately prior to saving the Token instance.
        self.token = token

    def __str__(self):
        return self.key if settings.ALLOW_TOKEN_RETRIEVAL else self.partial
        return self.key if self.v2 else self.partial

    def get_absolute_url(self):
        return reverse('users:token', args=[self.pk])

    @property
    def v1(self):
        return self.version == 1

    @property
    def v2(self):
        return self.version == 2

    @property
    def partial(self):
        return f'**********************************{self.key[-6:]}' if self.key else ''
        """
        Return a sanitized representation of a v1 token.
        """
        return f'**********************************{self.plaintext[-6:]}' if self.plaintext else ''

    @property
    def token(self):
        return self._token

    @token.setter
    def token(self, value):
        if not self._state.adding:
            raise ValueError("Cannot assign a new plaintext value for an existing token.")
        self._token = value
        if value is not None:
            if self.v1:
                self.plaintext = value
            elif self.v2:
                self.key = self.key or self.generate_key()
                self.update_digest()

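The string representation now depends on the version: v2 tokens display their non-secret key, while v1 tokens fall back to the masked partial value. A small illustration of that masking (the plaintext below is made up):

plaintext = '0123456789abcdef0123456789abcdef01234567'   # hypothetical 40-character v1 token
partial = f'**********************************{plaintext[-6:]}'
print(partial)   # **********************************234567
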
    def clean(self):
        if self._state.adding:
            if self.pepper_id is not None and self.pepper_id not in settings.API_TOKEN_PEPPERS:
                raise ValidationError(_(
                    "Invalid pepper ID: {id}. Check configured API_TOKEN_PEPPERS."
                ).format(id=self.pepper_id))

    def save(self, *args, **kwargs):
        if not self.key:
            self.key = self.generate_key()
        # If creating a new Token and no token value has been specified, generate one
        if self._state.adding and self.token is None:
            self.token = self.generate()

        return super().save(*args, **kwargs)

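Taken together with the token setter above, creating a token from a Django shell might look like the sketch below (assuming API_TOKEN_PEPPERS is configured); the username and description are illustrative, and the plaintext is only available on the in-memory instance that created it:

from users.models import Token, User

user = User.objects.get(username='admin')            # any existing user

token = Token(user=user, description='automation')   # version defaults to 2
token.full_clean()                                    # runs clean(), validating pepper_id if one is set
token.save()                                          # generates a plaintext, key, pepper_id and hmac_digest

print(token.key)     # short lookup key stored in the database
print(token.token)   # full plaintext value, retrievable only from this instance
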
    @classmethod
    def generate_key(cls):
        """
        Generate and return a random alphanumeric key for v2 tokens.
        """
        return cls.generate(length=TOKEN_KEY_LENGTH)

    @staticmethod
    def generate_key():
        # Generate a random 160-bit key expressed in hexadecimal.
        return binascii.hexlify(os.urandom(20)).decode()
    def generate(length=TOKEN_DEFAULT_LENGTH):
        """
        Generate and return a random token value of the given length.
        """
        return ''.join(random.choice(TOKEN_CHARSET) for _ in range(length))

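TOKEN_CHARSET, TOKEN_DEFAULT_LENGTH, TOKEN_KEY_LENGTH and TOKEN_PREFIX come from users.constants, which is not shown in this diff. A self-contained sketch of the generation logic with placeholder values — the real charset, lengths and prefix may differ, although TOKEN_KEY_LENGTH is evidently 12 given the migration above:

import random
import string

# Placeholder values; the real constants live in users/constants.py.
TOKEN_CHARSET = string.ascii_letters + string.digits
TOKEN_DEFAULT_LENGTH = 40
TOKEN_KEY_LENGTH = 12

def generate(length=TOKEN_DEFAULT_LENGTH):
    """Return a random token value drawn from the allowed character set."""
    return ''.join(random.choice(TOKEN_CHARSET) for _ in range(length))

plaintext = generate()                     # full secret value shown to the user once
key = generate(length=TOKEN_KEY_LENGTH)    # short non-secret identifier stored alongside the digest
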
    def update_digest(self):
        """
        Recalculate and save the HMAC digest using the currently defined pepper and token values.
        """
        self.pepper_id, pepper = get_current_pepper()
        self.hmac_digest = hmac.new(
            pepper.encode('utf-8'),
            self.token.encode('utf-8'),
            hashlib.sha256
        ).hexdigest()

    @property
    def is_expired(self):
@@ -102,6 +224,26 @@ class Token(models.Model):
            return False
        return True

    def validate(self, token):
        """
        Validate the given plaintext against the token.

        For v1 tokens, check that the given value is equal to the stored plaintext. For v2 tokens, calculate an HMAC
        from the Token's pepper ID and the given plaintext value, and check whether the result matches the recorded
        digest.
        """
        if self.v1:
            return token == self.token
        if self.v2:
            token = token.removeprefix(TOKEN_PREFIX)
            try:
                pepper = settings.API_TOKEN_PEPPERS[self.pepper_id]
            except KeyError:
                # Invalid pepper ID
                return False
            digest = hmac.new(pepper.encode('utf-8'), token.encode('utf-8'), hashlib.sha256).hexdigest()
            return digest == self.hmac_digest

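For v2 tokens, the digest computation in update_digest() and the comparison in validate() are symmetric. A standalone sketch of that round trip — the pepper and token strings below are illustrative, and get_current_pepper() is assumed to return a (pepper_id, pepper) pair as suggested by its usage above:

import hashlib
import hmac

# Illustrative values; in NetBox these come from API_TOKEN_PEPPERS and Token.generate().
pepper = 'server-side-secret-pepper'
plaintext = 'q3v8example-token-value'

# What update_digest() stores on the Token row:
stored_digest = hmac.new(pepper.encode('utf-8'), plaintext.encode('utf-8'), hashlib.sha256).hexdigest()

# What validate() recomputes from an incoming request:
candidate = hmac.new(pepper.encode('utf-8'), plaintext.encode('utf-8'), hashlib.sha256).hexdigest()
assert candidate == stored_digest

# A wrong plaintext (or a pepper belonging to a different pepper ID) yields a different digest:
wrong = hmac.new(pepper.encode('utf-8'), b'not-the-token', hashlib.sha256).hexdigest()
assert wrong != stored_digest
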
    def validate_client_ip(self, client_ip):
        """
        Validate the API client IP address against the source IP restrictions (if any) set on the token.
Some files were not shown because too many files have changed in this diff.