Mirror of https://github.com/netbox-community/netbox.git (synced 2025-07-21 11:37:21 -06:00)

commit 6638fd88b4
.github/ISSUE_TEMPLATE/bug_report.yaml (2 changes; vendored)

@@ -14,7 +14,7 @@ body:
     attributes:
       label: NetBox version
       description: What version of NetBox are you currently running?
-      placeholder: v3.4.4
+      placeholder: v3.4.5
     validations:
       required: true
   - type: dropdown
.github/ISSUE_TEMPLATE/feature_request.yaml (2 changes; vendored)

@@ -14,7 +14,7 @@ body:
     attributes:
       label: NetBox version
      description: What version of NetBox are you currently running?
-      placeholder: v3.4.4
+      placeholder: v3.4.5
     validations:
       required: true
   - type: dropdown
.github/workflows/stale.yml (2 changes; vendored)

@@ -24,7 +24,7 @@ jobs:
          necessary.
        close-pr-message: >
          This PR has been automatically closed due to lack of activity.
-        days-before-stale: 60
+        days-before-stale: 90
        days-before-close: 30
        exempt-issue-labels: 'status: accepted,status: blocked,status: needs milestone'
        operations-per-run: 100
@@ -5,6 +5,7 @@ NetBox includes a `housekeeping` management command that should be run nightly.
 * Clearing expired authentication sessions from the database
 * Deleting changelog records older than the configured [retention time](../configuration/miscellaneous.md#changelog_retention)
 * Deleting job result records older than the configured [retention time](../configuration/miscellaneous.md#jobresult_retention)
+* Check for new NetBox releases (if [`RELEASE_CHECK_URL`](../configuration/miscellaneous.md#release_check_url) is set)

 This command can be invoked directly, or by using the shell script provided at `/opt/netbox/contrib/netbox-housekeeping.sh`. This script can be linked from your cron scheduler's daily jobs directory (e.g. `/etc/cron.daily`) or referenced directly within the cron configuration file.
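The documented `housekeeping` command can also be invoked from Python rather than via cron and the provided shell script; a minimal sketch, assuming NetBox's Django settings are already loaded (e.g. from `manage.py shell` or a custom scheduler wrapper):

```python
# Minimal sketch: run NetBox's nightly housekeeping tasks programmatically.
# Assumes the Django environment is already configured for the NetBox project.
from django.core import management

management.call_command('housekeeping')
```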
@@ -69,6 +69,14 @@ By default, NetBox will permit users to create duplicate prefixes and IP address

 ---

+## `FILE_UPLOAD_MAX_MEMORY_SIZE`
+
+Default: `2621440` (2.5 MB).
+
+The maximum amount (in bytes) of uploaded data that will be held in memory before being written to the filesystem. Changing this setting can be useful for example to be able to upload files bigger than 2.5MB to custom scripts for processing.
+
+---
+
 ## GRAPHQL_ENABLED

 !!! tip "Dynamic Configuration Parameter"
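As a sketch of the new parameter documented above, the threshold can be raised in `configuration.py`, which `settings.py` reads via `getattr()` later in this commit; the 5 MB value below is illustrative, not a NetBox default:

```python
# Sketch for configuration.py: hold uploads of up to ~5 MB in memory before
# writing them to disk. 5242880 bytes is an illustrative value, not a default.
FILE_UPLOAD_MAX_MEMORY_SIZE = 5242880
```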
@@ -54,15 +54,19 @@ Each model should have a corresponding FilterSet class defined. This is used to

 Create a table class for the model in `tables.py` by subclassing `utilities.tables.BaseTable`. Under the table's `Meta` class, be sure to list both the fields and default columns.

-## 9. Create the object template
+## 9. Create a SearchIndex subclass
+
+If this model will be included in global search results, create a subclass of `netbox.search.SearchIndex` for it and specify the fields to be indexed.
+
+## 10. Create the object template

 Create the HTML template for the object view. (The other views each typically employ a generic template.) This template should extend `generic/object.html`.

-## 10. Add the model to the navigation menu
+## 11. Add the model to the navigation menu

 Add the relevant navigation menu items in `netbox/netbox/navigation/menu.py`.

-## 11. REST API components
+## 12. REST API components

 Create the following for each model:
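A minimal sketch of the new step 9 above, modeled on the `SearchIndex` subclasses changed elsewhere in this commit (e.g. `AggregateIndex`); the `Widget` model, its field names, and the weights are hypothetical placeholders:

```python
# Hypothetical example for "9. Create a SearchIndex subclass".
# `Widget` and the indexed fields/weights are placeholders, not part of this commit.
from netbox.search import SearchIndex, register_search

from . import models


@register_search
class WidgetIndex(SearchIndex):
    model = models.Widget
    fields = (
        ('name', 100),
        ('description', 500),
        ('comments', 5000),
    )
```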
@@ -71,13 +75,13 @@ Create the following for each model:
 * API view in `api/views.py`
 * Endpoint route in `api/urls.py`

-## 12. GraphQL API components
+## 13. GraphQL API components

 Create a Graphene object type for the model in `graphql/types.py` by subclassing the appropriate class from `netbox.graphql.types`.

 Also extend the schema class defined in `graphql/schema.py` with the individual object and object list fields per the established convention.

-## 13. Add tests
+## 14. Add tests

 Add tests for the following:

@@ -85,7 +89,7 @@ Add tests for the following:
 * API views
 * Filter sets

-## 14. Documentation
+## 15. Documentation

 Create a new documentation page for the model in `docs/models/<app_label>/<model_name>.md`. Include this file under the "features" documentation where appropriate.
@@ -4,7 +4,7 @@

 NetBox was originally developed by its lead maintainer, [Jeremy Stretch](https://github.com/jeremystretch), while he was working as a network engineer at [DigitalOcean](https://www.digitalocean.com/) in 2015 as part of an effort to automate their network provisioning. Recognizing the new tool's potential, DigitalOcean agreed to release it as an open source project in June 2016.

-Since then, thousands of organizations around the world have embraced NetBox as their central network source of truth to empower both network operators and automation.
+Since then, thousands of organizations around the world have embraced NetBox as their central network source of truth to empower both network operators and automation. Today, the open source project is stewarded by [NetBox Labs](https://netboxlabs.com/) and a team of volunteer maintainers. Beyond the core product, myriad [plugins](https://netbox.dev/plugins/) have been developed by the NetBox community to enhance and expand its feature set.

 ## Key Features

@@ -17,6 +17,7 @@ NetBox was built specifically to serve the needs of network engineers and operat
 * AS number (ASN) management
 * Rack elevations with SVG rendering
 * Device modeling using pre-defined types
 * Virtual chassis and device contexts
 * Network, power, and console cabling with SVG traces
 * Power distribution modeling
 * Data circuit and provider tracking

@@ -29,12 +30,13 @@
 * Tenant ownership assignment
 * Device & VM configuration contexts for advanced configuration rendering
 * Custom fields for data model extension
-* Support for custom validation rules
+* Custom validation rules
 * Custom reports & scripts executable directly within the UI
 * Extensive plugin framework for adding custom functionality
 * Single sign-on (SSO) authentication
 * Robust object-based permissions
 * Detailed, automatic change logging
 * Global search engine
 * NAPALM integration

 ## What NetBox Is Not
@@ -97,7 +97,7 @@ Multiple conditions can be combined into nested sets using AND or OR logic. This

 ### Examples

-`status` is "active" and `primary_ip` is defined _or_ the "exempt" tag is applied.
+`status` is "active" and `primary_ip4` is defined _or_ the "exempt" tag is applied.

 ```json
 {

@@ -109,8 +109,8 @@ Multiple conditions can be combined into nested sets using AND or OR logic. This
             "value": "active"
         },
         {
-            "attr": "primary_ip",
-            "value": "",
+            "attr": "primary_ip4",
+            "value": null,
             "negate": true
         }
     ]
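The corrected example above can be exercised directly with the `ConditionSet` class from `extras.conditions`; a minimal sketch mirroring the null-value test added later in this commit (the "exempt" tag clause is omitted for brevity, and the evaluated data dicts are illustrative):

```python
# Minimal sketch: evaluate "status is active AND primary_ip4 is defined" with
# extras.conditions.ConditionSet, as exercised by the tests added in this commit.
from extras.conditions import ConditionSet

rule = ConditionSet({
    'and': [
        {'attr': 'status', 'value': 'active'},
        {'attr': 'primary_ip4', 'value': None, 'negate': True},
    ]
})

print(rule.eval({'status': 'active', 'primary_ip4': None}))         # False: no primary IPv4
print(rule.eval({'status': 'active', 'primary_ip4': '192.0.2.1'}))  # True
```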
@@ -1,5 +1,30 @@
 # NetBox v3.4

+## v3.4.5 (2023-02-21)
+
+### Enhancements
+
+* [#11110](https://github.com/netbox-community/netbox/issues/11110) - Add `start_address` and `end_address` filters for IP ranges
+* [#11592](https://github.com/netbox-community/netbox/issues/11592) - Introduce `FILE_UPLOAD_MAX_MEMORY_SIZE` configuration parameter
+* [#11685](https://github.com/netbox-community/netbox/issues/11685) - Match on containing prefixes and aggregates when querying for IP addresses using global search
+* [#11787](https://github.com/netbox-community/netbox/issues/11787) - Upgrade script will automatically rebuild missing search cache
+
+### Bug Fixes
+
+* [#11032](https://github.com/netbox-community/netbox/issues/11032) - Fix false custom validation errors during component creation
+* [#11226](https://github.com/netbox-community/netbox/issues/11226) - Ensure scripts and reports within submodules are automatically reloaded
+* [#11459](https://github.com/netbox-community/netbox/issues/11459) - Enable evaluating null values in custom validation rules
+* [#11473](https://github.com/netbox-community/netbox/issues/11473) - GraphQL requests specifying an invalid filter should return an empty queryset
+* [#11582](https://github.com/netbox-community/netbox/issues/11582) - Ensure form validation errors are displayed when adding virtual chassis members
+* [#11601](https://github.com/netbox-community/netbox/issues/11601) - Fix partial matching of start/end addresses for IP range search
+* [#11683](https://github.com/netbox-community/netbox/issues/11683) - Fix CSV header attribute detection when auto-detecting import format
+* [#11711](https://github.com/netbox-community/netbox/issues/11711) - Fix CSV import for multiple-object custom fields
+* [#11723](https://github.com/netbox-community/netbox/issues/11723) - Circuit terminations should link to their associated circuits (rather than site or provider network)
+* [#11775](https://github.com/netbox-community/netbox/issues/11775) - Skip checking for old search cache records when creating a new object
+* [#11786](https://github.com/netbox-community/netbox/issues/11786) - List only applicable object types in form widget when filtering custom fields
+
+---
+
 ## v3.4.4 (2023-02-02)

 ### Enhancements
@@ -196,12 +196,10 @@ class CircuitTermination(
     )

     def __str__(self):
-        return f'Termination {self.term_side}: {self.site or self.provider_network}'
+        return f'{self.circuit}: Termination {self.term_side}'

     def get_absolute_url(self):
-        if self.site:
-            return self.site.get_absolute_url()
-        return self.provider_network.get_absolute_url()
+        return self.circuit.get_absolute_url()

     def clean(self):
         super().clean()
@@ -44,7 +44,8 @@ class Condition:
         bool: (EQ, CONTAINS),
         int: (EQ, GT, GTE, LT, LTE, CONTAINS),
         float: (EQ, GT, GTE, LT, LTE, CONTAINS),
-        list: (EQ, IN, CONTAINS)
+        list: (EQ, IN, CONTAINS),
+        type(None): (EQ,)
     }

     def __init__(self, attr, value, op=EQ, negate=False):
netbox/extras/fields.py (new file, 8 lines)

@@ -0,0 +1,8 @@
+from django.db.models import TextField
+
+
+class CachedValueField(TextField):
+    """
+    Currently a dummy field to prevent custom lookups being applied globally to TextField.
+    """
+    pass
@@ -38,8 +38,7 @@ class CustomFieldFilterForm(SavedFiltersMixin, FilterForm):
         ('Attributes', ('type', 'content_type_id', 'group_name', 'weight', 'required', 'ui_visibility')),
     )
     content_type_id = ContentTypeMultipleChoiceField(
-        queryset=ContentType.objects.all(),
-        limit_choices_to=FeatureQuery('custom_fields'),
+        queryset=ContentType.objects.filter(FeatureQuery('custom_fields').get_query()),
         required=False,
         label=_('Object type')
     )

@@ -79,8 +78,7 @@ class JobResultFilterForm(SavedFiltersMixin, FilterForm):
     )
     obj_type = ContentTypeChoiceField(
         label=_('Object Type'),
-        queryset=ContentType.objects.all(),
-        limit_choices_to=FeatureQuery('job_results'),  # TODO: This doesn't actually work
+        queryset=ContentType.objects.filter(FeatureQuery('job_results').get_query()),
         required=False,
     )
     status = MultipleChoiceField(

@@ -135,8 +133,7 @@ class CustomLinkFilterForm(SavedFiltersMixin, FilterForm):
         ('Attributes', ('content_types', 'enabled', 'new_window', 'weight')),
     )
     content_types = ContentTypeMultipleChoiceField(
-        queryset=ContentType.objects.all(),
-        limit_choices_to=FeatureQuery('custom_links'),
+        queryset=ContentType.objects.filter(FeatureQuery('custom_links').get_query()),
        required=False
     )
     enabled = forms.NullBooleanField(

@@ -162,8 +159,7 @@ class ExportTemplateFilterForm(SavedFiltersMixin, FilterForm):
         ('Attributes', ('content_types', 'mime_type', 'file_extension', 'as_attachment')),
     )
     content_types = ContentTypeMultipleChoiceField(
-        queryset=ContentType.objects.all(),
-        limit_choices_to=FeatureQuery('export_templates'),
+        queryset=ContentType.objects.filter(FeatureQuery('export_templates').get_query()),
        required=False
     )
     mime_type = forms.CharField(

@@ -187,8 +183,7 @@ class SavedFilterFilterForm(SavedFiltersMixin, FilterForm):
         ('Attributes', ('content_types', 'enabled', 'shared', 'weight')),
     )
     content_types = ContentTypeMultipleChoiceField(
-        queryset=ContentType.objects.all(),
-        limit_choices_to=FeatureQuery('export_templates'),
+        queryset=ContentType.objects.filter(FeatureQuery('export_templates').get_query()),
        required=False
     )
     enabled = forms.NullBooleanField(

@@ -215,8 +210,7 @@ class WebhookFilterForm(SavedFiltersMixin, FilterForm):
         ('Events', ('type_create', 'type_update', 'type_delete')),
     )
     content_type_id = ContentTypeMultipleChoiceField(
-        queryset=ContentType.objects.all(),
-        limit_choices_to=FeatureQuery('webhooks'),
+        queryset=ContentType.objects.filter(FeatureQuery('webhooks').get_query()),
         required=False,
         label=_('Object type')
     )
@@ -1,4 +1,5 @@
-from django.db.models import CharField, Lookup
+from django.db.models import CharField, TextField, Lookup
+from .fields import CachedValueField


 class Empty(Lookup):

@@ -14,4 +15,18 @@ class Empty(Lookup):
         return 'CAST(LENGTH(%s) AS BOOLEAN) != %s' % (lhs, rhs), params


+class NetContainsOrEquals(Lookup):
+    """
+    This lookup has the same functionality as the one from the ipam app except lhs is cast to inet
+    """
+    lookup_name = 'net_contains_or_equals'
+
+    def as_sql(self, qn, connection):
+        lhs, lhs_params = self.process_lhs(qn, connection)
+        rhs, rhs_params = self.process_rhs(qn, connection)
+        params = lhs_params + rhs_params
+        return 'CAST(%s AS INET) >>= %s' % (lhs, rhs), params
+
+
 CharField.register_lookup(Empty)
+CachedValueField.register_lookup(NetContainsOrEquals)
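To show how the new lookup is consumed, here is a minimal sketch mirroring the `Q()` filter that `CachedValueSearchBackend.search()` builds later in this commit; the prefix value is illustrative:

```python
# Minimal sketch: find CIDR-typed search-cache entries whose value contains (or equals)
# a given prefix, via the net_contains_or_equals lookup registered on CachedValueField.
# Mirrors the Q() filter added to CachedValueSearchBackend.search(); the prefix is illustrative.
from django.db.models import Q

from extras.models import CachedValue
from netbox.search import FieldTypes

matches = CachedValue.objects.filter(
    Q(type=FieldTypes.CIDR) & Q(value__net_contains_or_equals='192.0.2.0/24')
)
```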
@@ -37,7 +37,7 @@ class Command(BaseCommand):
                    f"clearing sessions; skipping."
                )

-        # Delete expired ObjectRecords
+        # Delete expired ObjectChanges
         if options['verbosity']:
             self.stdout.write("[*] Checking for expired changelog records")
         if config.CHANGELOG_RETENTION:
@@ -15,6 +15,11 @@ class Command(BaseCommand):
             nargs='*',
             help='One or more apps or models to reindex',
         )
+        parser.add_argument(
+            '--lazy',
+            action='store_true',
+            help="For each model, reindex objects only if no cache entries already exist"
+        )

     def _get_indexers(self, *model_names):
         indexers = {}

@@ -60,14 +65,15 @@
             raise CommandError("No indexers found!")
         self.stdout.write(f'Reindexing {len(indexers)} models.')

-        # Clear all cached values for the specified models
-        self.stdout.write('Clearing cached values... ', ending='')
-        self.stdout.flush()
-        content_types = [
-            ContentType.objects.get_for_model(model) for model in indexers.keys()
-        ]
-        deleted_count = search_backend.clear(content_types)
-        self.stdout.write(f'{deleted_count} entries deleted.')
+        # Clear all cached values for the specified models (if not being lazy)
+        if not kwargs['lazy']:
+            self.stdout.write('Clearing cached values... ', ending='')
+            self.stdout.flush()
+            content_types = [
+                ContentType.objects.get_for_model(model) for model in indexers.keys()
+            ]
+            deleted_count = search_backend.clear(content_types)
+            self.stdout.write(f'{deleted_count} entries deleted.')

         # Index models
         self.stdout.write('Indexing models')

@@ -76,11 +82,18 @@
             model_name = model._meta.model_name
             self.stdout.write(f' {app_label}.{model_name}... ', ending='')
             self.stdout.flush()

+            if kwargs['lazy']:
+                content_type = ContentType.objects.get_for_model(model)
+                if cached_count := search_backend.count(object_types=[content_type]):
+                    self.stdout.write(f'Skipping (found {cached_count} existing).')
+                    continue
+
             i = search_backend.cache(model.objects.iterator(), remove_existing=False)
             if i:
                 self.stdout.write(f'{i} entries cached.')
             else:
-                self.stdout.write(f'None found.')
+                self.stdout.write(f'No objects found.')

         msg = f'Completed.'
         if total_count := search_backend.size:
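The new `--lazy` behavior can be driven from Python the same way the upgrade script and migrations drive `reindex`; a minimal sketch, where the `'ipam'` app label is just an example:

```python
# Minimal sketch: invoke the reindex command with the new lazy behavior from Python,
# mirroring how this commit's tooling calls it. 'ipam' is an illustrative app label;
# omit positional labels to reindex every registered model.
from django.core import management

management.call_command('reindex', 'ipam', lazy=True)
```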
@@ -1,25 +1,9 @@
-import sys
 import uuid

 import django.db.models.deletion
 import django.db.models.lookups
-from django.core import management
 from django.db import migrations, models


-def reindex(apps, schema_editor):
-    # Build the search index (except during tests)
-    if 'test' not in sys.argv:
-        management.call_command(
-            'reindex',
-            'circuits',
-            'dcim',
-            'extras',
-            'ipam',
-            'tenancy',
-            'virtualization',
-            'wireless',
-        )
+import extras.fields


 class Migration(migrations.Migration):

@@ -49,7 +33,7 @@ class Migration(migrations.Migration):
                 ('object_id', models.PositiveBigIntegerField()),
                 ('field', models.CharField(max_length=200)),
                 ('type', models.CharField(max_length=30)),
-                ('value', models.TextField()),
+                ('value', extras.fields.CachedValueField()),
                 ('weight', models.PositiveSmallIntegerField(default=1000)),
                 ('object_type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='+', to='contenttypes.contenttype')),
             ],

@@ -57,8 +41,4 @@
                 'ordering': ('weight', 'object_type', 'object_id'),
             },
         ),
-        migrations.RunPython(
-            code=reindex,
-            reverse_code=migrations.RunPython.noop
-        ),
     ]
@@ -20,10 +20,12 @@ from netbox.models import ChangeLoggedModel
 from netbox.models.features import CloningMixin, ExportTemplatesMixin, WebhooksMixin
 from netbox.search import FieldTypes
 from utilities import filters
-from utilities.forms import (
-    CSVChoiceField, CSVMultipleChoiceField, DatePicker, DynamicModelChoiceField, DynamicModelMultipleChoiceField,
-    JSONField, LaxURLField, StaticSelectMultiple, StaticSelect, add_blank_choice,
+from utilities.forms.fields import (
+    CSVChoiceField, CSVModelChoiceField, CSVModelMultipleChoiceField, CSVMultipleChoiceField, DynamicModelChoiceField,
+    DynamicModelMultipleChoiceField, JSONField, LaxURLField,
 )
+from utilities.forms.widgets import DatePicker, StaticSelectMultiple, StaticSelect
+from utilities.forms.utils import add_blank_choice
 from utilities.querysets import RestrictedQuerySet
 from utilities.validators import validate_regex

@@ -413,7 +415,8 @@ class CustomField(CloningMixin, ExportTemplatesMixin, WebhooksMixin, ChangeLogge
         # Object
         elif self.type == CustomFieldTypeChoices.TYPE_OBJECT:
             model = self.object_type.model_class()
-            field = DynamicModelChoiceField(
+            field_class = CSVModelChoiceField if for_csv_import else DynamicModelChoiceField
+            field = field_class(
                 queryset=model.objects.all(),
                 required=required,
                 initial=initial

@@ -422,10 +425,11 @@
         # Multiple objects
         elif self.type == CustomFieldTypeChoices.TYPE_MULTIOBJECT:
             model = self.object_type.model_class()
-            field = DynamicModelMultipleChoiceField(
+            field_class = CSVModelMultipleChoiceField if for_csv_import else DynamicModelMultipleChoiceField
+            field = field_class(
                 queryset=model.objects.all(),
                 required=required,
-                initial=initial
+                initial=initial,
             )

         # Text
@@ -4,6 +4,7 @@ from django.contrib.contenttypes.models import ContentType
 from django.db import models

 from utilities.fields import RestrictedGenericForeignKey
+from ..fields import CachedValueField

 __all__ = (
     'CachedValue',

@@ -36,7 +37,7 @@ class CachedValue(models.Model):
     type = models.CharField(
         max_length=30
     )
-    value = models.TextField()
+    value = CachedValueField()
     weight = models.PositiveSmallIntegerField(
         default=1000
     )
@@ -524,27 +524,39 @@ def get_scripts(use_names=False):
     defined name in place of the actual module name.
     """
    scripts = {}
-    # Iterate through all modules within the scripts path. These are the user-created files in which reports are
+
+    # Get all modules within the scripts path. These are the user-created files in which scripts are
     # defined.
-    for importer, module_name, _ in pkgutil.iter_modules([settings.SCRIPTS_ROOT]):
-        # Use a lock as removing and loading modules is not thread safe
-        with lock:
-            # Remove cached module to ensure consistency with filesystem
-            if module_name in sys.modules:
+    modules = list(pkgutil.iter_modules([settings.SCRIPTS_ROOT]))
+    modules_bases = set([name.split(".")[0] for _, name, _ in modules])
+
+    # Deleting from sys.modules needs to done behind a lock to prevent race conditions where a module is
+    # removed from sys.modules while another thread is importing
+    with lock:
+        for module_name in list(sys.modules.keys()):
+            # Everything sharing a base module path with a module in the script folder is removed.
+            # We also remove all modules with a base module called "scripts". This allows modifying imported
+            # non-script modules without having to reload the RQ worker.
+            module_base = module_name.split(".")[0]
+            if module_base == "scripts" or module_base in modules_bases:
                 del sys.modules[module_name]

-            module = importer.find_module(module_name).load_module(module_name)
+    for importer, module_name, _ in modules:
+        module = importer.find_module(module_name).load_module(module_name)

         if use_names and hasattr(module, 'name'):
             module_name = module.name

         module_scripts = {}
         script_order = getattr(module, "script_order", ())
         ordered_scripts = [cls for cls in script_order if is_script(cls)]
         unordered_scripts = [cls for _, cls in inspect.getmembers(module, is_script) if cls not in script_order]

         for cls in [*ordered_scripts, *unordered_scripts]:
+            # For scripts in submodules use the full import path w/o the root module as the name
+            script_name = cls.full_name.split(".", maxsplit=1)[1]
             module_scripts[script_name] = cls

         if module_scripts:
             scripts[module_name] = module_scripts
@@ -126,6 +126,16 @@ class ConditionSetTest(TestCase):
         with self.assertRaises(ValueError):
             ConditionSet({'foo': []})

+    def test_null_value(self):
+        cs = ConditionSet({
+            'and': [
+                {'attr': 'a', 'value': None, 'op': 'eq', 'negate': True},
+            ]
+        })
+        self.assertFalse(cs.eval({'a': None}))
+        self.assertTrue(cs.eval({'a': "string"}))
+        self.assertTrue(cs.eval({'a': {"key": "value"}}))
+
     def test_and_single_depth(self):
         cs = ConditionSet({
             'and': [
@@ -405,6 +405,14 @@ class IPRangeFilterSet(TenancyFilterSet, NetBoxModelFilterSet):
         field_name='start_address',
         lookup_expr='family'
     )
+    start_address = MultiValueCharFilter(
+        method='filter_address',
+        label=_('Address'),
+    )
+    end_address = MultiValueCharFilter(
+        method='filter_address',
+        label=_('Address'),
+    )
     contains = django_filters.CharFilter(
         method='search_contains',
         label=_('Ranges which contain this prefix or IP'),

@@ -441,9 +449,9 @@
     def search(self, queryset, name, value):
         if not value.strip():
             return queryset
-        qs_filter = Q(description__icontains=value)
+        qs_filter = Q(description__icontains=value) | Q(start_address__contains=value) | Q(end_address__contains=value)
         try:
-            ipaddress = str(netaddr.IPNetwork(value.strip()).cidr)
+            ipaddress = str(netaddr.IPNetwork(value.strip()))
             qs_filter |= Q(start_address=ipaddress)
             qs_filter |= Q(end_address=ipaddress)
         except (AddrFormatError, ValueError):

@@ -461,6 +469,12 @@
         except (AddrFormatError, ValueError):
             return queryset.none()

+    def filter_address(self, queryset, name, value):
+        try:
+            return queryset.filter(**{f'{name}__net_in': value})
+        except ValidationError:
+            return queryset.none()


 class IPAddressFilterSet(NetBoxModelFilterSet, TenancyFilterSet):
     family = django_filters.NumberFilter(
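A minimal sketch of the new filters in use, mirroring the filterset tests added later in this commit; the addresses are illustrative and the `ipam.filtersets` import path is assumed from the surrounding code:

```python
# Minimal sketch: filter IP ranges by exact start/end addresses using the new
# start_address / end_address filters. Addresses and import path are assumptions
# for illustration, following the pattern of the tests added in this commit.
from ipam.filtersets import IPRangeFilterSet
from ipam.models import IPRange

params = {'start_address': ['10.0.1.100'], 'end_address': ['10.0.1.199']}
matching_ranges = IPRangeFilterSet(params, IPRange.objects.all()).qs
```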
netbox/ipam/migrations/0064_clear_search_cache.py (new file, 31 lines)

@@ -0,0 +1,31 @@
+from django.db import migrations
+
+
+def clear_cache(apps, schema_editor):
+    """
+    Clear existing CachedValues referencing IPAddressFields or IPNetworkFields. (#11658
+    introduced new cache record types for these.)
+    """
+    ContentType = apps.get_model('contenttypes', 'ContentType')
+    CachedValue = apps.get_model('extras', 'CachedValue')
+
+    for model_name in ('Aggregate', 'IPAddress', 'IPRange', 'Prefix'):
+        try:
+            content_type = ContentType.objects.get(app_label='ipam', model=model_name.lower())
+            CachedValue.objects.filter(object_type=content_type).delete()
+        except ContentType.DoesNotExist:
+            pass
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('ipam', '0063_standardize_description_comments'),
+    ]
+
+    operations = [
+        migrations.RunPython(
+            code=clear_cache,
+            reverse_code=migrations.RunPython.noop
+        ),
+    ]
@@ -6,7 +6,7 @@ from netbox.search import SearchIndex, register_search
 class AggregateIndex(SearchIndex):
     model = models.Aggregate
     fields = (
-        ('prefix', 100),
+        ('prefix', 120),
         ('description', 500),
         ('date_added', 2000),
         ('comments', 5000),

@@ -70,7 +70,7 @@ class L2VPNIndex(SearchIndex):
 class PrefixIndex(SearchIndex):
     model = models.Prefix
     fields = (
-        ('prefix', 100),
+        ('prefix', 110),
         ('description', 500),
         ('comments', 5000),
     )
@@ -680,6 +680,14 @@ class IPRangeTestCase(TestCase, ChangeLoggedFilterSetTests):
         params = {'family': '6'}
         self.assertEqual(self.filterset(params, self.queryset).qs.count(), 4)

+    def test_start_address(self):
+        params = {'start_address': ['10.0.1.100', '10.0.2.100']}
+        self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
+
+    def test_end_address(self):
+        params = {'end_address': ['10.0.1.199', '10.0.2.199']}
+        self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
+
     def test_contains(self):
         params = {'contains': '10.0.1.150/24'}
         self.assertEqual(self.filterset(params, self.queryset).qs.count(), 1)
@@ -107,6 +107,9 @@ CORS_ORIGIN_REGEX_WHITELIST = [
     # r'^(https?://)?(\w+\.)?example\.com$',
 ]

+# The name to use for the CSRF token cookie.
+CSRF_COOKIE_NAME = 'csrftoken'
+
 # Set to True to enable server debugging. WARNING: Debugging introduces a substantial performance penalty and may reveal
 # sensitive information about your installation. Only enable debugging while performing testing. Never enable debugging
 # on a production system.

@@ -127,6 +130,9 @@ EMAIL = {
     'FROM_EMAIL': '',
 }

+# Localization
+ENABLE_LOCALIZATION = False
+
 # Exempt certain models from the enforcement of view permissions. Models listed here will be viewable by all users and
 # by anonymous users. List models in the form `<app>.<model>`. Add '*' to this list to exempt all models.
 EXEMPT_VIEW_PERMISSIONS = [

@@ -168,16 +174,6 @@ LOGOUT_REDIRECT_URL = 'home'
 # the default value of this setting is derived from the installed location.
 # MEDIA_ROOT = '/opt/netbox/netbox/media'

-# By default uploaded media is stored on the local filesystem. Using Django-storages is also supported. Provide the
-# class path of the storage driver in STORAGE_BACKEND and any configuration options in STORAGE_CONFIG. For example:
-# STORAGE_BACKEND = 'storages.backends.s3boto3.S3Boto3Storage'
-# STORAGE_CONFIG = {
-# 'AWS_ACCESS_KEY_ID': 'Key ID',
-# 'AWS_SECRET_ACCESS_KEY': 'Secret',
-# 'AWS_STORAGE_BUCKET_NAME': 'netbox',
-# 'AWS_S3_REGION_NAME': 'eu-west-1',
-# }
-
 # Expose Prometheus monitoring metrics at the HTTP endpoint '/metrics'
 METRICS_ENABLED = False

@@ -217,9 +213,6 @@ RQ_DEFAULT_TIMEOUT = 300
 # this setting is derived from the installed location.
 # SCRIPTS_ROOT = '/opt/netbox/netbox/scripts'

-# The name to use for the csrf token cookie.
-CSRF_COOKIE_NAME = 'csrftoken'
-
 # The name to use for the session cookie.
 SESSION_COOKIE_NAME = 'sessionid'

@@ -228,8 +221,15 @@ SESSION_COOKIE_NAME = 'sessionid'
 # database access.) Note that the user as which NetBox runs must have read and write permissions to this path.
 SESSION_FILE_PATH = None

-# Localization
-ENABLE_LOCALIZATION = False
+# By default, uploaded media is stored on the local filesystem. Using Django-storages is also supported. Provide the
+# class path of the storage driver in STORAGE_BACKEND and any configuration options in STORAGE_CONFIG. For example:
+# STORAGE_BACKEND = 'storages.backends.s3boto3.S3Boto3Storage'
+# STORAGE_CONFIG = {
+# 'AWS_ACCESS_KEY_ID': 'Key ID',
+# 'AWS_SECRET_ACCESS_KEY': 'Secret',
+# 'AWS_STORAGE_BUCKET_NAME': 'netbox',
+# 'AWS_S3_REGION_NAME': 'eu-west-1',
+# }

 # Time zone (default: UTC)
 TIME_ZONE = 'UTC'
@@ -60,6 +60,8 @@ class ObjectListField(DjangoListField):
         filterset_class = django_object_type._meta.filterset_class
         if filterset_class:
             filterset = filterset_class(data=args, queryset=queryset, request=info.context)
+            if not filterset.is_valid():
+                return queryset.none()
             return filterset.qs

         return queryset
@@ -257,6 +257,10 @@ class CustomValidationMixin(models.Model):
     def clean(self):
         super().clean()

+        # If the instance is a base for replications, skip custom validation
+        if getattr(self, '_replicated_base', False):
+            return
+
         # Send the post_clean signal
         post_clean.send(sender=self.__class__, instance=self)
@@ -2,6 +2,7 @@ from collections import namedtuple

 from django.db import models

+from ipam.fields import IPAddressField, IPNetworkField
 from netbox.registry import registry

 ObjectFieldValue = namedtuple('ObjectFieldValue', ('name', 'type', 'weight', 'value'))

@@ -11,6 +12,8 @@ class FieldTypes:
     FLOAT = 'float'
     INTEGER = 'int'
     STRING = 'str'
+    INET = 'inet'
+    CIDR = 'cidr'


 class LookupTypes:

@@ -43,6 +46,10 @@ class SearchIndex:
         field_cls = instance._meta.get_field(field_name).__class__
         if issubclass(field_cls, (models.FloatField, models.DecimalField)):
             return FieldTypes.FLOAT
+        if issubclass(field_cls, IPAddressField):
+            return FieldTypes.INET
+        if issubclass(field_cls, IPNetworkField):
+            return FieldTypes.CIDR
         if issubclass(field_cls, models.IntegerField):
             return FieldTypes.INTEGER
         return FieldTypes.STRING
@@ -3,10 +3,12 @@ from collections import defaultdict
 from django.conf import settings
 from django.contrib.contenttypes.models import ContentType
 from django.core.exceptions import ImproperlyConfigured
-from django.db.models import F, Window
+from django.db.models import F, Window, Q
 from django.db.models.functions import window
 from django.db.models.signals import post_delete, post_save
 from django.utils.module_loading import import_string
+import netaddr
+from netaddr.core import AddrFormatError

 from extras.models import CachedValue, CustomField
 from netbox.registry import registry

@@ -52,11 +54,11 @@ class SearchBackend:
         """
         raise NotImplementedError

-    def caching_handler(self, sender, instance, **kwargs):
+    def caching_handler(self, sender, instance, created, **kwargs):
         """
         Receiver for the post_save signal, responsible for caching object creation/changes.
         """
-        self.cache(instance)
+        self.cache(instance, remove_existing=not created)

     def removal_handler(self, sender, instance, **kwargs):
         """

@@ -78,7 +80,13 @@

     def clear(self, object_types=None):
         """
-        Delete *all* cached data.
+        Delete *all* cached data (optionally filtered by object type).
         """
         raise NotImplementedError

+    def count(self, object_types=None):
+        """
+        Return a count of all cache entries (optionally filtered by object type).
+        """
+        raise NotImplementedError
+

@@ -95,18 +103,24 @@ class CachedValueSearchBackend(SearchBackend):

     def search(self, value, user=None, object_types=None, lookup=DEFAULT_LOOKUP_TYPE):

         # Define the search parameters
-        params = {
-            f'value__{lookup}': value
-        }
+        query_filter = Q(**{f'value__{lookup}': value})

+        if object_types:
+            query_filter &= Q(object_type__in=object_types)

         if lookup in (LookupTypes.STARTSWITH, LookupTypes.ENDSWITH):
             # Partial string matches are valid only on string values
-            params['type'] = FieldTypes.STRING
-        if object_types:
-            params['object_type__in'] = object_types
+            query_filter &= Q(type=FieldTypes.STRING)

+        if lookup == LookupTypes.PARTIAL:
+            try:
+                address = str(netaddr.IPNetwork(value.strip()).cidr)
+                query_filter |= Q(type=FieldTypes.CIDR) & Q(value__net_contains_or_equals=address)
+            except (AddrFormatError, ValueError):
+                pass

         # Construct the base queryset to retrieve matching results
-        queryset = CachedValue.objects.filter(**params).annotate(
+        queryset = CachedValue.objects.filter(query_filter).annotate(
             # Annotate the rank of each result for its object according to its weight
             row_number=Window(
                 expression=window.RowNumber(),

@@ -210,6 +224,12 @@
         # Call _raw_delete() on the queryset to avoid first loading instances into memory
         return qs._raw_delete(using=qs.db)

+    def count(self, object_types=None):
+        qs = CachedValue.objects.all()
+        if object_types:
+            qs = qs.filter(object_type__in=object_types)
+        return qs.count()
+
     @property
     def size(self):
         return CachedValue.objects.count()
@@ -24,7 +24,7 @@ from netbox.constants import RQ_QUEUE_DEFAULT, RQ_QUEUE_HIGH, RQ_QUEUE_LOW
 # Environment setup
 #

-VERSION = '3.4.4'
+VERSION = '3.4.5'

 # Hostname
 HOSTNAME = platform.node()

@@ -91,6 +91,7 @@ DOCS_ROOT = getattr(configuration, 'DOCS_ROOT', os.path.join(os.path.dirname(BAS
 EMAIL = getattr(configuration, 'EMAIL', {})
 EXEMPT_VIEW_PERMISSIONS = getattr(configuration, 'EXEMPT_VIEW_PERMISSIONS', [])
 FIELD_CHOICES = getattr(configuration, 'FIELD_CHOICES', {})
+FILE_UPLOAD_MAX_MEMORY_SIZE = getattr(configuration, 'FILE_UPLOAD_MAX_MEMORY_SIZE', 2621440)
 HTTP_PROXIES = getattr(configuration, 'HTTP_PROXIES', None)
 INTERNAL_IPS = getattr(configuration, 'INTERNAL_IPS', ('127.0.0.1', '::1'))
 JINJA2_FILTERS = getattr(configuration, 'JINJA2_FILTERS', {})
@@ -384,8 +384,8 @@ class BulkImportView(GetReturnURLMixin, BaseMultiObjectView):
                     'data': record,
                     'instance': instance,
                 }
-                if form.cleaned_data['format'] == ImportFormatChoices.CSV:
-                    model_form_kwargs['headers'] = form._csv_headers
+                if hasattr(form, '_csv_headers'):
+                    model_form_kwargs['headers'] = form._csv_headers  # Add CSV headers
                 model_form = self.model_form(**model_form_kwargs)

                 # When updating, omit all form fields other than those specified in the record. (No
@@ -436,6 +436,10 @@ class ComponentCreateView(GetReturnURLMixin, BaseObjectView):
         form = self.initialize_form(request)
         instance = self.alter_object(self.queryset.model(), request)

+        # Note that the form instance is a replicated field base
+        # This is needed to avoid running custom validators multiple times
+        form.instance._replicated_base = hasattr(self.form, "replication_fields")
+
         if form.is_valid():
             new_components = []
             data = deepcopy(request.POST)
@@ -5,6 +5,8 @@

 {% block content %}
   <form action="" method="post" enctype="multipart/form-data" class="form-object-edit">
+    {% render_errors membership_form %}
+
     {% csrf_token %}
     <div class="card">
       <h5 class="card-header">Add New Member</h5>
@@ -8,6 +8,10 @@
 <div class="tab-content">
   <div class="tab-pane show active" id="edit-form" role="tabpanel" aria-labelledby="object-list-tab">
     <form action="" method="post" enctype="multipart/form-data" class="form-object-edit">
+      {% for form in formset %}
+        {% render_errors form %}
+      {% endfor %}
+
       {% csrf_token %}
       {{ pk_form.pk }}
       {{ formset.management_form }}
@@ -96,7 +96,7 @@ class LoginView(View):
             # Authenticate user
             auth_login(request, form.get_user())
             logger.info(f"User {request.user} successfully authenticated")
-            messages.info(request, f"Logged in as {request.user}.")
+            messages.success(request, f"Logged in as {request.user}.")

             # Ensure the user has a UserConfig defined. (This should normally be handled by
             # create_userconfig() on user creation.)
@@ -197,6 +197,8 @@ class ImportForm(BootstrapMixin, forms.Form):
             self.cleaned_data['data'] = self._clean_json(data)
         elif format == ImportFormatChoices.YAML:
             self.cleaned_data['data'] = self._clean_yaml(data)
+        else:
+            raise forms.ValidationError(f"Unknown data format: {format}")

     def _detect_format(self, data):
         """
@@ -1,5 +1,5 @@
 bleach==5.0.1
-Django==4.1.6
+Django==4.1.7
 django-cors-headers==3.13.0
 django-debug-toolbar==3.8.1
 django-filter==22.1

@@ -9,23 +9,23 @@ django-pglocks==1.0.4
 django-prometheus==2.2.0
 django-redis==5.2.0
 django-rich==1.4.0
-django-rq==2.6.0
-django-tables2==2.5.1
+django-rq==2.7.0
+django-tables2==2.5.2
 django-taggit==3.1.0
 django-timezone-field==5.0
 djangorestframework==3.14.0
-drf-yasg[validation]==1.21.4
+drf-yasg[validation]==1.21.5
 graphene-django==3.0.0
 gunicorn==20.1.0
 Jinja2==3.1.2
 Markdown==3.3.7
-mkdocs-material==9.0.10
+mkdocs-material==9.0.13
 mkdocstrings[python-legacy]==0.20.0
 netaddr==0.8.0
 Pillow==9.4.0
 psycopg2-binary==2.9.5
 PyYAML==6.0
-sentry-sdk==1.14.0
+sentry-sdk==1.15.0
 social-auth-app-django==5.0.0
 social-auth-core[openidconnect]==4.3.0
 svgwrite==1.4.3
@@ -103,6 +103,11 @@ COMMAND="python3 netbox/manage.py remove_stale_contenttypes --no-input"
 echo "Removing stale content types ($COMMAND)..."
 eval $COMMAND || exit 1

+# Rebuild the search cache (lazily)
+COMMAND="python3 netbox/manage.py reindex --lazy"
+echo "Rebuilding search cache ($COMMAND)..."
+eval $COMMAND || exit 1
+
 # Delete any expired user sessions
 COMMAND="python3 netbox/manage.py clearsessions"
 echo "Removing expired user sessions ($COMMAND)..."