
UI Racks Table View Performance Scaling Problem - Loading time increases with large number of racks - django N+1 query issue #18606

Open
marcusyuri opened this issue Feb 8, 2025 · 0 comments

Comments

@marcusyuri

Deployment Type

Self-hosted

NetBox Version

v4.2.3

Python Version

3.10

Steps to Reproduce

  1. Start with a clean NetBox database
  2. Create a site (e.g. "site") and a rack type (e.g. "Rack")
  3. Batch import 10,000 racks into this site (sample CSV import file attached below, with a generation sketch following it; you may split the upload into parts to avoid a server timeout)

Rack import.txt
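For convenience, here is a minimal sketch for generating such an import file with Python's standard library. The column set (site, name, status, width, u_height) and the example values are assumptions based on typical rack import fields; check the import form of your NetBox version and adjust accordingly.

```python
# generate_rack_import.py -- hypothetical helper, not part of NetBox.
# Column names and example values are assumptions; verify them against
# your version's rack import form before using.
import csv

SITE_NAME = 'site'   # the site created in step 2
NUM_RACKS = 10_000

with open('rack_import.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['site', 'name', 'status', 'width', 'u_height'])
    for i in range(1, NUM_RACKS + 1):
        writer.writerow([SITE_NAME, f'rack-{i:05d}', 'active', 19, 42])
```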

  4. Click on Racks -> Racks

The view takes several seconds to load. In our case, with about 20K rack instances in the NetBox database, it takes about 15 seconds.

  5. Enable database query debug logging in configuration.py (see the sketch after these steps)
  6. Click on Racks -> Racks
  7. Inspect the database log
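For step 5, one way to surface the per-request SQL is Django's django.db.backends logger. Below is a minimal sketch of the relevant configuration.py additions, assuming DEBUG mode is acceptable on the test instance (handler names are illustrative, not a NetBox default):

```python
# configuration.py (excerpt) -- a minimal sketch for query logging on a test
# instance. Django only emits SQL to this logger when DEBUG is enabled.
DEBUG = True

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'console': {'class': 'logging.StreamHandler'},
    },
    'loggers': {
        # Logs every executed SQL statement together with its duration
        'django.db.backends': {
            'handlers': ['console'],
            'level': 'DEBUG',
        },
    },
}
```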

Expected Behavior

The rack list view loads within a few milliseconds, and the database log shows fewer than ~10 SQL queries to render the view.

Observed Behavior

In the database debug log we can see that, for each rack row displayed in the NetBox UI, three additional SQL queries are performed to fetch dcim_device, dcim_devicetype, and dcim_rackreservation objects filtered by the current rack. It appears to be a Django N+1 query issue:

I assume those additional queries are used to compute the device count per rack and the available rack unit (U) space statistics.
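For context, this is the classic N+1 shape: one queryset fetches the page of racks, but each row's statistics are then computed by touching that rack's related objects individually. A simplified illustration only, not NetBox's actual table code (the related names are assumptions mirroring the query log below):

```python
# Simplified illustration of the suspected N+1 pattern; not NetBox's code.
# 'devices' and 'reservations' related names are assumptions.
racks = Rack.objects.all()                      # 1 query for the page of racks
for rack in racks:
    device_count = rack.devices.count()         # +1 query per rack
    reserved_units = rack.reservations.count()  # +1 query per rack
```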

Since the rack list view is only interested in the totals, I suggest that refactoring the query to use SQL COUNT/GROUP BY may solve this performance issue. There are several Django examples of mitigating the N+1 problem, e.g. https://dev.to/herchila/how-to-avoid-n1-queries-in-django-tips-and-solutions-2ajo
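As a rough sketch of the suggested refactoring (an assumption, not a patch): annotating the rack queryset pushes the counting into a single GROUP BY query, so rendering the table no longer issues per-rack queries. Related names below are illustrative and may differ from NetBox's models.

```python
# A minimal sketch of count annotations to avoid per-row queries; the related
# names ('devices', 'reservations') are assumptions.
from django.db.models import Count
from dcim.models import Rack

racks = Rack.objects.annotate(
    device_count=Count('devices', distinct=True),
    reservation_count=Count('reservations', distinct=True),
)
# Both counts are computed in one query with a GROUP BY on the rack ID.
```

The space-utilization figure is harder to express as a pure aggregate (it depends on device U heights and reserved units), but it could likewise be precomputed for the whole page with a single annotated or prefetched query rather than per row.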

Query #1
(0.006) SELECT "dcim_device"."id", "dcim_device"."created", "dcim_device"."last_updated", "dcim_device"."custom_field_data", "dcim_device"."description", "dcim_device"."comments", "dcim_device"."local_context_data", "dcim_device"."config_template_id", "dcim_device"."device_type_id", "dcim_device"."role_id", "dcim_device"."tenant_id", "dcim_device"."platform_id", "dcim_device"."name", "dcim_device"."serial", "dcim_device"."asset_tag", "dcim_device"."site_id", "dcim_device"."location_id", "dcim_device"."rack_id", "dcim_device"."position", "dcim_device"."face", "dcim_device"."status", "dcim_device"."airflow", "dcim_device"."primary_ip4_id", "dcim_device"."primary_ip6_id", "dcim_device"."oob_ip_id", "dcim_device"."cluster_id", "dcim_device"."virtual_chassis_id", "dcim_device"."vc_position", "dcim_device"."vc_priority", "dcim_device"."latitude", "dcim_device"."longitude", "dcim_device"."console_port_count", "dcim_device"."console_server_port_count", "dcim_device"."power_port_count", "dcim_device"."power_outlet_count", "dcim_device"."interface_count", "dcim_device"."front_port_count", "dcim_device"."rear_port_count", "dcim_device"."device_bay_count", "dcim_device"."module_bay_count", "dcim_device"."inventory_item_count" FROM "dcim_device" INNER JOIN "dcim_devicetype" ON ("dcim_device"."device_type_id" = "dcim_devicetype"."id") WHERE ("dcim_device"."rack_id" = 1327 AND "dcim_device"."position" >= 1 AND NOT ("dcim_devicetype"."exclude_from_utilization")) ORDER BY "dcim_device"."name" ASC, "dcim_device"."id" ASC; args=(1327, Decimal('1')); alias=default

Query #2
(0.001) SELECT "dcim_devicetype"."id", "dcim_devicetype"."created", "dcim_devicetype"."last_updated", "dcim_devicetype"."custom_field_data", "dcim_devicetype"."description", "dcim_devicetype"."comments", "dcim_devicetype"."weight", "dcim_devicetype"."weight_unit", "dcim_devicetype"."_abs_weight", "dcim_devicetype"."manufacturer_id", "dcim_devicetype"."model", "dcim_devicetype"."slug", "dcim_devicetype"."default_platform_id", "dcim_devicetype"."part_number", "dcim_devicetype"."u_height", "dcim_devicetype"."exclude_from_utilization", "dcim_devicetype"."is_full_depth", "dcim_devicetype"."subdevice_role", "dcim_devicetype"."airflow", "dcim_devicetype"."front_image", "dcim_devicetype"."rear_image", "dcim_devicetype"."console_port_template_count", "dcim_devicetype"."console_server_port_template_count", "dcim_devicetype"."power_port_template_count", "dcim_devicetype"."power_outlet_template_count", "dcim_devicetype"."interface_template_count", "dcim_devicetype"."front_port_template_count", "dcim_devicetype"."rear_port_template_count", "dcim_devicetype"."device_bay_template_count", "dcim_devicetype"."module_bay_template_count", "dcim_devicetype"."inventory_item_template_count" FROM "dcim_devicetype" WHERE "dcim_devicetype"."id" IN (504, 397); args=(504, 397); alias=default

Query #3
(0.001) SELECT "dcim_rackreservation"."id", "dcim_rackreservation"."created", "dcim_rackreservation"."last_updated", "dcim_rackreservation"."custom_field_data", "dcim_rackreservation"."comments", "dcim_rackreservation"."rack_id", "dcim_rackreservation"."units", "dcim_rackreservation"."tenant_id", "dcim_rackreservation"."user_id", "dcim_rackreservation"."description" FROM "dcim_rackreservation" WHERE "dcim_rackreservation"."rack_id" = 1327 ORDER BY "dcim_rackreservation"."created" ASC, "dcim_rackreservation"."id" ASC; args=(1327,); alias=default

marcusyuri added the "status: needs triage" and "type: bug" labels on Feb 8, 2025
arthanson added the "status: needs owner" and "severity: low" labels and removed the "status: needs triage" label on Feb 11, 2025