creates a departmental visualization based on d3.js and support for officer supervisor numbers which can be displayed in the graph. for issue #527 #630
Status
Ready for review / in progress
Description of Changes
Fixes #527, supersedes PR #541, which had become too out of date.
Changes proposed in this pull request:
Notes for Deployment
To set up a demo, clear out (most of) the randomly generated data to make room for @dismantl 's Baltimore dataset, since the supervisor information makes it more interesting to look at. Download it here and save it as baltimore_processed.csv. Then set NUM_OFFICERS from 15000 to 10 in OpenOversight/app/config.py (setting it to 0 breaks the mock dataset, and keeping all of the random data in the visualization makes it too slow).
Then rebuild; the visualizations will be available from the Browse page.
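For reference, a minimal sketch of that config tweak, assuming NUM_OFFICERS is a plain module-level constant in OpenOversight/app/config.py (the real file may structure its settings differently):

```python
# OpenOversight/app/config.py (excerpt; structure assumed, the real file
# may define this inside a config class instead)

# Shrink the randomly generated mock department so the Baltimore data
# dominates the graph. Don't use 0: that breaks the mock dataset, and
# the full 15000 random officers make the visualization too slow.
NUM_OFFICERS = 10  # was 15000
```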
Notes
Performance is much better on a smaller dataset; a loading animation could be a later PR. Another potential follow-up PR is to randomly generate supervisor relationships in test_data.py instead of the bulk upload procedure I use below (see the sketch after this paragraph). Unit information could also be incorporated.
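For the test_data.py idea, here is a minimal sketch of how random supervisor relationships could be generated. All names used (assign_random_supervisors, the id and supervisor_id attributes) are assumptions for illustration, not the actual OpenOversight models or helpers:

```python
import random


def assign_random_supervisors(officers, supervisor_fraction=0.1):
    """Pick a random subset of officers as supervisors and give every
    other officer one of them.

    Assumes each officer object has an integer ``id`` and a nullable
    ``supervisor_id`` attribute; the real OpenOversight model fields
    may be named differently.
    """
    if len(officers) < 2:
        return officers
    num_supervisors = max(1, int(len(officers) * supervisor_fraction))
    supervisors = random.sample(officers, num_supervisors)
    supervisor_ids = {o.id for o in supervisors}
    for officer in officers:
        # Supervisors keep supervisor_id = None (roots of the hierarchy).
        if officer.id not in supervisor_ids:
            officer.supervisor_id = random.choice(supervisors).id
    return officers
```

Leaving the supervisors' own supervisor_id unset would make them the roots of the hierarchy that the d3.js graph renders.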
Screenshots (if appropriate)
Tests and linting
- I have rebased my changes on current `develop`
- pytests pass in the development environment on my local machine
- `flake8` checks pass