I host NetBox in a public cloud and query it to pull tenant data associated with specific prefixes to enrich log data in our SIEM. Even with a local in-memory key=value cache, this creates a lot of requests to my cloud-hosted NetBox install. I thought about keeping a local copy of NetBox that mirrors my cloud version and querying that instead. This would eliminate the WAN latency per request and keep me from burning through all my burst credits in AWS.
The problem comes with keeping the local copy in sync with the cloud copy. I initially considered replicating the DB, but I would have to make a large number of network changes to get that to work. Then I thought maybe I could use pynetbox to sync the two across each instance's API.
My thought process is to query the cloud version with .all() per endpoint, run the results through dict(), and somehow load that data into a bulk create or update against the local copy. I am probably overlooking a lot, but I want to see if there is a recommended approach before I start spinning my wheels, or if someone just has a general suggestion on how to accomplish this.
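Roughly what I have in mind is sketched below (just a sketch; the URLs and tokens are placeholders for my two instances). One wrinkle I already see: primary keys won't match across instances, so I'd have to match on natural keys like tenant slug or prefix string and remap foreign keys instead of copying IDs. pynetbox's .create() also accepts a list of dicts if batching the creates turns out to matter.

```python
import pynetbox

# Placeholders -- swap in the real cloud and local instance URLs/tokens.
cloud = pynetbox.api("https://netbox.example.com", token="CLOUD_TOKEN")
local = pynetbox.api("http://localhost:8000", token="LOCAL_TOKEN")

# Sync tenants first so prefixes can point at them. IDs differ between
# instances, so match on slug (unique in NetBox) rather than on ID.
tenant_ids = {}  # cloud tenant slug -> local tenant ID
for t in cloud.tenancy.tenants.all():
    existing = local.tenancy.tenants.get(slug=t.slug)
    if existing is None:
        existing = local.tenancy.tenants.create(name=t.name, slug=t.slug)
    tenant_ids[t.slug] = existing.id

# Then sync prefixes, remapping the tenant FK to the local tenant's ID.
# Note: .get(prefix=...) raises if the same prefix exists in multiple
# VRFs; a real version would have to key on (vrf, prefix).
for p in cloud.ipam.prefixes.all():
    data = {
        "prefix": p.prefix,
        "status": p.status.value,
        "tenant": tenant_ids.get(p.tenant.slug) if p.tenant else None,
    }
    existing = local.ipam.prefixes.get(prefix=p.prefix)
    if existing is None:
        local.ipam.prefixes.create(**data)
    else:
        existing.update(data)  # PATCHes only the supplied fields
```

This only covers tenants and prefixes and doesn't handle deletions on the cloud side, which is part of why I'm asking whether there's a better-trodden path.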