All of the files, programs, and resources used to build the website and images in Campaign Trails:
- HTML, CSS, JS
- Used to produce the webpage and interactive elements
- R (ggplot2 and gganimate)
- Used to make the animated visuals of campaign visits
- Python (sklearn)
- Used to produce a logistic regression model of win probability (see the sketch after this list)
- Figma
- Used to make hand-drawn visuals that were uploaded to the webpage
- Excel
- Used for general organization of CSVs and quick analysis
- RAWGraphs
- Website used to produce reference bubble charts for campaign cities
- Google Newspaper Archive
- Archive used to obtain certain newspaper clippings featured on the webpage
- CLOC
- Command-line tool used to count the lines of code in the project
- Plotly: Used to obtain population statistics for 1,000 U.S. cities
- Campaign Visits Database: Resource created by Christopher J. Devine containing campaign visits from 2008 to 2020
- p2004.org: Website manually scraped to compile campaign visits for the 2004 election
- Vote Hub: Page manually scraped to compile campaign visits for the 2024 election
- Odyssee: Previous Cornell Data Journal project mainly used for webpage inspiration
- FiveThirtyEight: Visual inspiration for past-election animated maps
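
For reference, below is a minimal sketch of how a win-probability model like the one described above might be fit with sklearn. The file name (`campaign_visits.csv`), the columns (`visits`, `won`), and the single-feature setup are hypothetical placeholders for illustration, not the project's actual data or pipeline.

```python
# Minimal sketch of a logistic regression win-probability model with sklearn.
# NOTE: the file name and column names are hypothetical placeholders,
# not the project's actual data schema.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("campaign_visits.csv")   # e.g., one row per candidate-state pair
X = df[["visits"]]                        # predictor: number of campaign visits
y = df["won"]                             # outcome: 1 if the candidate won, else 0

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = LogisticRegression()
model.fit(X_train, y_train)

# Predicted win probabilities for the held-out observations
win_probability = model.predict_proba(X_test)[:, 1]
print(win_probability)
print("Accuracy:", model.score(X_test, y_test))
```
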
Project statistics:
- 2,216 lines of hand-written code
- 16 R files
- 30 CSVs
- 4,688 CSV rows
- 53 images
Project lead:
- Nikhil Chinchalkar
Team members:
- Ella Sanchez
- Natalie Miller
- Shashank Kalyanaraman
- Vivian Guo
Special thanks:
- Cornell Data Journal for the platform