
Commit 19a226b: readme and env template
1 parent: 632dd7c

2 files changed (+89, -1 lines)

video_archive/README.md
Lines changed: 82 additions & 1 deletion
# AI-Powered Video Archive

![Screenshot of AI Video Archive](./readme_pics/ui_preview.png)

This project builds an AI-powered, searchable video archive on Google Cloud. Upload your videos to [Cloud Storage](https://cloud.google.com/storage) and use the archive to quickly search and explore them. The tool analyzes videos for on-screen objects (e.g. "baby," "wedding," "snow"), audio transcripts, and on-screen text.

It's built with the Google Cloud [Video Intelligence API](https://cloud.google.com/video-intelligence), [Firebase](https://firebase.com), and [Algolia](https://algolia.com) (for search), behind a frontend built in [Flutter](https://flutter.dev).

For an in-depth overview, check out [this blog post](https://daleonai.com/building-an-ai-powered-searchable-video-archive).

To run this project yourself, you'll need to set up a couple of things.

## Setting Up an AI-Powered Video Archive

1. First, you'll need to sign up for a couple of accounts, all of which should be free:
   - [Google Cloud Platform](https://cloud.google.com)
   - [Firebase](https://firebase.com)
   - [Algolia](https://algolia.com)

2. [Create a new Firebase project](https://firebase.google.com/docs/storage/web/start) and set it up to support authentication, hosting, functions, and Firestore. Make sure to also download and install the Firebase CLI and find your Firebase config, which should look like this:

   ```
   var firebaseConfig = {
     apiKey: '<your-api-key>',
     authDomain: '<your-auth-domain>',
     databaseURL: '<your-database-url>',
     storageBucket: '<your-storage-bucket-url>'
   };
   ```

   More on Firebase in a second.

3. Make a copy of `functions/.env_template` called `functions/.env`:

   ```
   cp functions/.env_template functions/.env
   ```

4. Next, you'll need to create some [Storage Buckets](https://cloud.google.com/storage) for storing your videos, the video metadata extracted by the Video Intelligence API, and thumbnails, as described in `functions/.env`:

   ```
   VIDEO_BUCKET="BUCKET FOR UPLOADING VIDEOS"
   VIDEO_JSON_BUCKET="BUCKET FOR WRITING ANALYSIS JSON"
   THUMBNAIL_BUCKET="BUCKET FOR STORING THUMBNAILS"
   ```

   You can create a bucket called `my_sick_video_bucket` by running:

   `gsutil mb gs://my_sick_video_bucket`

   Once you've created three buckets, update the corresponding variables in your `.env` file.
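The three buckets can be created in one pass. Below is a small sketch that just prints a `gsutil mb` command for each bucket so you can review them first; the three bucket names are placeholders (Cloud Storage bucket names must be globally unique):

```shell
#!/bin/sh
# Placeholder bucket names: substitute your own globally unique names.
for BUCKET in my_video_bucket my_video_json_bucket my_thumbnail_bucket; do
  # Print each command for review rather than running it directly.
  echo "gsutil mb gs://${BUCKET}"
done
```

Once the names look right, pipe the output to `sh` to run the commands.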

5. Next, you'll need to set up [Algolia](https://algolia.com), which will handle search queries. If you haven't read [that blog post](https://daleonai.com/building-an-ai-powered-searchable-video-archive) yet, now would be a good time.
   - Create a new Algolia search index and name it something like "production_VIDEOS." Fill in that value on the corresponding line of your `functions/.env` file.
   - Find your Algolia authentication keys (your search key and your admin API key) and fill them in to the `.env` file accordingly.
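When this step is done, the Algolia lines of `functions/.env` should look something like the following; every quoted value here is a placeholder for your own credentials (`ALOGLIA_INDEX` is spelled exactly as it appears in the template file):

```
ALGOLIA_APPID="YOUR_APP_ID"
ALGOLIA_ADMIN_APIKEY="YOUR_ADMIN_API_KEY"
ALOGLIA_INDEX="production_VIDEOS"
ALGOLIA_SEARCH_KEY="YOUR_SEARCH_KEY"
```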

6. Enable the Video Intelligence API in the GCP console (just search for "Video Intelligence" in the search box [here](https://console.cloud.google.com/apis/library) and follow the prompts).
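If you prefer the command line, the same API can be enabled through the Cloud SDK. This sketch assumes `gcloud` is installed and authenticated against your project, and falls back to a hint otherwise:

```shell
#!/bin/sh
# Enable the Video Intelligence API for the active gcloud project.
if command -v gcloud >/dev/null 2>&1; then
  gcloud services enable videointelligence.googleapis.com
else
  echo "gcloud not found: install the Cloud SDK or enable the API in the console"
fi
```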

7. Almost done with the backend! If you've got the Firebase CLI installed locally, you can deploy the backend by running:

   `firebase deploy`
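While iterating on the backend, a full `firebase deploy` can be slow; the Firebase CLI's `--only` flag deploys a single piece instead. A sketch, assuming the CLI is installed (e.g. via `npm install -g firebase-tools`):

```shell
# Deploy only the Cloud Functions, skipping hosting and rules.
firebase deploy --only functions
```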

8. Woo hoo! Check that the backend is working by uploading some video files to your video bucket. You can do this through the Cloud Storage UI in the Google Cloud console, or from the command line by running:

   `gsutil cp path/to/your/video.mp4 gs://name_of_your_video_bucket_goes_here/userid`

   Note that, the way this project is structured, videos should be uploaded into a subfolder of your video bucket named with a user id; this is what supports multiple users.

   By default, the video archive doesn't recognize timestamps. However, if you know when a video was taken, you can pass that along as you upload it:

   ```
   gsutil -h "x-goog-meta-timestamp:2002-04-07 12_31_12" -m cp -r your_video gs://your_video_bucket/your_userid
   ```

   This attaches metadata to the video file indicating when it was filmed.
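To confirm the timestamp was recorded, `gsutil stat` prints an object's metadata, including custom `x-goog-meta-*` headers; the object path below is a placeholder:

```shell
# Show object metadata; custom metadata appears in the output.
gsutil stat gs://your_video_bucket/your_userid/your_video
```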
functions/.env_template
Lines changed: 7 additions & 0 deletions
VIDEO_BUCKET="BUCKET FOR UPLOADING VIDEOS"
VIDEO_JSON_BUCKET="BUCKET FOR WRITING ANALYSIS JSON"
THUMBNAIL_BUCKET="BUCKET FOR STORING THUMBNAILS"
ALGOLIA_APPID="JFROXWOP8P"
ALGOLIA_ADMIN_APIKEY="325f717de5091462ea33cc4f8110c894"
ALOGLIA_INDEX="prod_VIDEOS"
ALGOLIA_SEARCH_KEY="0edb186f86bab9a46534c8828cdf4659"
