Commit: more notebook cleanup

benofben committed Feb 14, 2023
1 parent 5db9099 commit eebac97
Showing 4 changed files with 35 additions and 194 deletions.
5 changes: 2 additions & 3 deletions Lab 6 - Graph Data Science/embedding.ipynb
@@ -11,7 +11,7 @@
"We're going to generate a graph embedding using Neo4j Graph Data Science (GDS). This will be an additional feature we can use to train our machine learning model later.\n",
"\n",
"## Install Prerequisites\n",
"First off, you'll also need to install a few packages."
"First off, you'll need to install the Neo4j Graph Data Science Package."
]
},
{
@@ -26,8 +26,7 @@
},
"outputs": [],
"source": [
"%pip install --quiet --upgrade graphdatascience\n",
"%pip install --quiet google-cloud-storage"
"%pip install --quiet --upgrade graphdatascience"
]
},
{
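The cleaned-up cell above now installs only the `graphdatascience` client. FastRP, the embedding algorithm this lab runs in GDS, boils down to propagating random projections over a normalized adjacency matrix. A toy numpy sketch of that idea — not the GDS implementation, and the five-node graph is made up for illustration:

```python
# Toy illustration of the idea behind FastRP (not the GDS implementation):
# build node embeddings by mixing random projections over graph neighborhoods.
import numpy as np

rng = np.random.default_rng(42)

# A small undirected graph as an adjacency matrix (5 nodes).
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)

# Degree-normalize so each row sums to 1.
A_norm = np.diag(1.0 / A.sum(axis=1)) @ A

# Random projection to 4 dimensions, then mix 1-hop and 2-hop neighborhoods.
R = rng.normal(size=(5, 4)) / np.sqrt(4)
embedding = 0.5 * (A_norm @ R) + 0.5 * (A_norm @ A_norm @ R)

print(embedding.shape)  # (5, 4)
```

In real FastRP the hop weights and dimension are tunable (`embeddingDimension`, `iterationWeights`); here they are fixed just to show the shape of the computation.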
104 changes: 12 additions & 92 deletions Lab 7 - Vertex AI AutoML/vertex_ai_embedding.ipynb
@@ -2,77 +2,24 @@
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"metadata": {
"id": "JAPoU8Sm5E6e"
},
"source": [
"<a href=\"https://colab.research.google.com/github/neo4j-partners/hands-on-lab-neo4j-and-vertex-ai/blob/main/Lab%206%20-%20Vertex%20AI/vertex_ai_embedding.ipynb\" target=\"_blank\">\n",
" <img src=\"https://cloud.google.com/ml-engine/images/colab-logo-32px.png\" alt=\"Colab logo\"> Run in Colab\n",
"</a>"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "BKipBL0kWY7w"
},
"source": [
"# Install Additional Packages\n",
"First off, you'll also need to install a few packages."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "tDipS8p-27qg",
"outputId": "7cc33a24-7505-47ec-be92-18346e02a506"
},
"outputs": [],
"source": [
"%pip install --quiet google.cloud.aiplatform"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "JBXAh7fVt9Ou"
},
"source": [
"# Restart the Kernel\n",
"After you install the additional packages, you need to restart the notebook kernel so it can find the packages. When you run this, you may get a notification that the kernel crashed. You can disregard that."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "ySSyV4T_3dQB",
"outputId": "8964ce8b-552f-4578-b3fc-b212fee2ade6"
},
"outputs": [],
"source": [
"import IPython\n",
"\n",
"app = IPython.Application.instance()\n",
"app.kernel.do_shutdown(True)"
"# Vertex AI Embedding\n",
"Now, let's build a similar classifier to the last one. But, this time we're going to use data from our graph embedding."
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {
"id": "Id6tjQDbgf2S"
},
"source": [
"# Authenticate your Google Cloud Account\n",
"These steps will authenticate the notebook using your Google Cloud credentials."
"## Setup Variables\n",
"First we need to set a few variables."
]
},
{
@@ -83,49 +30,22 @@
},
"outputs": [],
"source": [
"# Enter the inputs!\n",
"PROJECT_ID=''\n",
"while PROJECT_ID=='':\n",
" PROJECT_ID = input('Enter your GCP Project ID: ')\n",
"# Edit these variables!\n",
"PROJECT_ID = 'your-project-id'\n",
"REGION = 'us-west1'\n",
"\n",
"STORAGE_BUCKET = PROJECT_ID + '-form13'\n",
"REGION = 'us-east1'"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "1XoT1nT_JlYx"
},
"outputs": [],
"source": [
"import os\n",
"os.environ[\"GCLOUD_PROJECT\"] = PROJECT_ID"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "HucMnpmVgfmX"
},
"outputs": [],
"source": [
"try:\n",
" from google.colab import auth as google_auth\n",
" google_auth.authenticate_user()\n",
"except:\n",
" pass"
"# You can leave this default\n",
"STORAGE_BUCKET = PROJECT_ID + '-form13'"
]
},
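The commit replaces the interactive `input()` prompt with constants you edit in place. Since nothing now stops the notebook from running with the placeholder value, a small guard — a hypothetical helper, not part of the lab — could fail fast:

```python
def storage_bucket(project_id: str) -> str:
    """Derive the Form 13 bucket name, rejecting an unedited placeholder."""
    if project_id in ("", "your-project-id"):
        raise ValueError("Set PROJECT_ID to your real GCP project ID first.")
    return project_id + "-form13"

print(storage_bucket("my-gcp-project"))  # my-gcp-project-form13
```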
{
"attachments": {},
"cell_type": "markdown",
"metadata": {
"id": "ArK3cfKsdT1x"
},
"source": [
"# Train a Model on GCP\n",
"## Train a Model on GCP\n",
"We'll use the original and engineered features to train an AutoML model."
]
},
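The training cells themselves are collapsed in this diff. For orientation, AutoML tabular training with the `google.cloud.aiplatform` SDK looks roughly like the sketch below — the display names, target column, and training budget are illustrative assumptions, not the lab's actual values, and running it requires GCP credentials:

```python
def train_automl_classifier(project_id, region, gcs_csv_uri, target_column):
    """Sketch: create a tabular dataset and run AutoML classification on it."""
    from google.cloud import aiplatform  # deferred import: needs GCP auth

    aiplatform.init(project=project_id, location=region)

    dataset = aiplatform.TabularDataset.create(
        display_name="form13",        # illustrative name
        gcs_source=[gcs_csv_uri],
    )
    job = aiplatform.AutoMLTabularTrainingJob(
        display_name="form13-automl", # illustrative name
        optimization_prediction_type="classification",
    )
    # Blocks until training finishes; the budget is in milli node hours.
    return job.run(
        dataset=dataset,
        target_column=target_column,
        budget_milli_node_hours=1000,
    )
```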
117 changes: 20 additions & 97 deletions Lab 7 - Vertex AI AutoML/vertex_ai_raw.ipynb
@@ -2,78 +2,27 @@
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"metadata": {
"id": "JAPoU8Sm5E6e"
},
"source": [
"<a href=\"https://colab.research.google.com/github/neo4j-partners/hands-on-lab-neo4j-and-vertex-ai/blob/main/Lab%206%20-%20Vertex%20AI/vertex_ai_raw.ipynb\" target=\"_blank\">\n",
" <img src=\"https://cloud.google.com/ml-engine/images/colab-logo-32px.png\" alt=\"Colab logo\"> Run in Colab\n",
"</a>"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "BKipBL0kWY7w"
},
"source": [
"# Install Additional Packages\n",
"First off, you'll also need to install a few packages."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "tDipS8p-27qg",
"outputId": "94bca470-e7fc-4071-a016-22bd2f7be9f0"
},
"outputs": [],
"source": [
"%pip install --quiet google-cloud-storage\n",
"%pip install --quiet google.cloud.aiplatform"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "JBXAh7fVt9Ou"
},
"source": [
"# Restart the Kernel\n",
"After you install the additional packages, you need to restart the notebook kernel so it can find the packages. When you run this, you may get a notification that the kernel crashed. You can disregard that."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "ySSyV4T_3dQB",
"outputId": "2b646b87-fb36-468c-f23c-c85cef25a165"
},
"outputs": [],
"source": [
"import IPython\n",
"\n",
"app = IPython.Application.instance()\n",
"app.kernel.do_shutdown(True)"
"# Vertex AI Raw\n",
"First, we're going to work with the raw data set. We'll do the following:\n",
"* Pull it from a bucket\n",
"* Break it into train, test, and validation sets\n",
"* Train a classifier"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {
"id": "nRLomQ5ekAkE"
},
"source": [
"# Download and Split the Data\n",
"Now let's download the data set and split it into training, validation and test sets."
"## Download and Split the Data\n",
"Let's download the data set and split it into training, validation and test sets."
]
},
{
@@ -111,13 +60,14 @@
]
},
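The download-and-split cell is collapsed above. One common way to cut a frame into train, validation, and test partitions — an assumption for illustration, not necessarily the notebook's exact code — is a seeded shuffle followed by slicing:

```python
import numpy as np
import pandas as pd

def split_frame(df, train=0.8, validation=0.1, seed=42):
    """Shuffle df and return (train, validation, test) partitions."""
    shuffled = df.sample(frac=1.0, random_state=seed).reset_index(drop=True)
    n_train = int(len(shuffled) * train)
    n_val = int(len(shuffled) * validation)
    return (
        shuffled.iloc[:n_train],
        shuffled.iloc[n_train:n_train + n_val],
        shuffled.iloc[n_train + n_val:],
    )

df = pd.DataFrame({"x": np.arange(100), "y": np.arange(100) % 2})
train_df, val_df, test_df = split_frame(df)
print(len(train_df), len(val_df), len(test_df))  # 80 10 10
```

Seeding the shuffle keeps the split reproducible across notebook restarts.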
{
"attachments": {},
"cell_type": "markdown",
"metadata": {
"id": "Id6tjQDbgf2S"
},
"source": [
"# Authenticate your Google Cloud Account\n",
"These steps will authenticate the notebook using your Google Cloud credentials."
"## Setup Variables\n",
"Now we need to set a few variables."
]
},
{
@@ -128,50 +78,22 @@
},
"outputs": [],
"source": [
"# Enter the inputs!\n",
"PROJECT_ID=''\n",
"while PROJECT_ID=='':\n",
" PROJECT_ID = input('Enter your GCP Project ID: ')\n",
"# Edit these variables!\n",
"PROJECT_ID = 'your-project-id'\n",
"REGION = 'us-west1'\n",
"\n",
"# You can leave these defaults\n",
"STORAGE_BUCKET = PROJECT_ID + '-form13'\n",
"REGION = 'us-east1'"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "1XoT1nT_JlYx"
},
"outputs": [],
"source": [
"import os\n",
"os.environ['GCLOUD_PROJECT'] = PROJECT_ID"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "HucMnpmVgfmX"
},
"outputs": [],
"source": [
"try:\n",
" from google.colab import auth as google_auth\n",
" google_auth.authenticate_user()\n",
"except:\n",
" pass"
"# You can leave this default\n",
"STORAGE_BUCKET = PROJECT_ID + '-form13'"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {
"id": "FUU7z4FjJS90"
},
"source": [
"# Upload to a GCP Cloud Storage Bucket\n",
"## Upload to a GCP Cloud Storage Bucket\n",
"\n",
"To get the data into Vertex AI, we must first put it in a bucket as a CSV."
]
@@ -214,12 +136,13 @@
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {
"id": "ArK3cfKsdT1x"
},
"source": [
"# Train a Model on GCP\n",
"## Train a Model on GCP\n",
"We'll use the original features to train an AutoML model."
]
},
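Getting data into Vertex AI, as the upload section above notes, means serializing each split to CSV and writing it to the Cloud Storage bucket. A sketch with the serialization kept pure and the upload isolated behind a deferred import (bucket and blob names are placeholders you would substitute):

```python
import io

import pandas as pd

def frame_to_csv_bytes(df: pd.DataFrame) -> bytes:
    """Serialize a DataFrame to UTF-8 CSV bytes without the index column."""
    buf = io.StringIO()
    df.to_csv(buf, index=False)
    return buf.getvalue().encode("utf-8")

def upload_csv(bucket_name: str, blob_name: str, data: bytes) -> None:
    """Upload CSV bytes to gs://<bucket_name>/<blob_name> (needs GCP auth)."""
    from google.cloud import storage  # deferred import: needs credentials

    storage.Client().bucket(bucket_name).blob(blob_name).upload_from_string(
        data, content_type="text/csv"
    )

payload = frame_to_csv_bytes(pd.DataFrame({"a": [1, 2], "b": [3, 4]}))
print(payload.decode().splitlines()[0])  # a,b
```

Splitting the two steps makes the CSV encoding testable locally without a GCP project.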
3 changes: 1 addition & 2 deletions README.md
@@ -50,7 +50,6 @@ If you have a Google Cloud account with permissions that allow you to invoke Ver
* Exploration with Neo4j Bloom
* [Lab 5 - Vertex AI Workbench](Lab%205%20-%20Vertex%20AI%20Workbench/README.md) (10 min)
* to do

* Break (10 min)

### Part 3
@@ -69,7 +68,7 @@ If you have a Google Cloud account with permissions that allow you to invoke Ver
* Lecture - [Vertex AI](https://docs.google.com/presentation/d/19TewJE5YgESTmN9qW4MOtFP4m39uPhUaRXErkCzrdbE/edit?usp=sharing) (10 min)
* What is Vertex AI?
* Using Vertex AI with Neo4j
* [Lab 7 - Vertex AI](Lab%207%20-%20Vertex%20AI) (15 min)
* [Lab 7 - Vertex AI AutoML](Lab%207%20-%20Vertex%20AI%20AutoML) (15 min)
* Raw Data
* Data with Embedding
* [Lab 8 - Cleanup](Lab%208%20-%20Cleanup) (5 min)

0 comments on commit eebac97
