
Commit 36e13a1

Bring BigQuery samples up to standard.
1 parent ce94259 commit 36e13a1

8 files changed: +452 −318 lines changed

bigquery/README.md

+41 −38
@@ -42,21 +42,21 @@ __Usage:__ `node datasets --help`
 
 ```
 Commands:
-  create <datasetId>            Creates a new dataset.
-  delete <datasetId>            Deletes a dataset.
-  list [projectId]              Lists all datasets in the specified project or the current project.
-  size <datasetId> [projectId]  Calculates the size of a dataset.
+  create <datasetId>  Creates a new dataset.
+  delete <datasetId>  Deletes a dataset.
+  list                Lists datasets.
 
 Options:
-  --help  Show help  [boolean]
+  --projectId, -p  The Project ID to use. Defaults to the value of the GCLOUD_PROJECT or GOOGLE_CLOUD_PROJECT
+                   environment variables.  [string]
+  --help           Show help  [boolean]
 
 Examples:
-  node datasets create my_dataset                      Creates a new dataset named "my_dataset".
-  node datasets delete my_dataset                      Deletes a dataset named "my_dataset".
-  node datasets list                                   Lists all datasets in the current project.
-  node datasets list bigquery-public-data              Lists all datasets in the "bigquery-public-data" project.
-  node datasets size my_dataset                        Calculates the size of "my_dataset" in the current project.
-  node datasets size hacker_news bigquery-public-data  Calculates the size of "bigquery-public-data:hacker_news".
+  node datasets.js create my_dataset                       Creates a new dataset named "my_dataset".
+  node datasets.js delete my_dataset                       Deletes a dataset named "my_dataset".
+  node datasets.js list                                    Lists all datasets in the project specified by the
+                                                           GCLOUD_PROJECT or GOOGLE_CLOUD_PROJECT environment variables.
+  node datasets.js list --projectId=bigquery-public-data   Lists all datasets in the "bigquery-public-data" project.
 
 For more information, see https://cloud.google.com/bigquery/docs
 ```
@@ -77,14 +77,16 @@ Commands:
   shakespeare  Queries a public Shakespeare dataset.
 
 Options:
-  --help  Show help  [boolean]
+  --projectId, -p  The Project ID to use. Defaults to the value of the GCLOUD_PROJECT or GOOGLE_CLOUD_PROJECT
+                   environment variables.  [string]
+  --help           Show help  [boolean]
 
 Examples:
-  node queries sync "SELECT * FROM publicdata.samples.natality  Synchronously queries the natality dataset.
-  LIMIT 5;"
-  node queries async "SELECT * FROM                             Queries the natality dataset as a job.
+  node queries.js sync "SELECT * FROM      Synchronously queries the natality dataset.
   publicdata.samples.natality LIMIT 5;"
-  node queries shakespeare                                      Queries a public Shakespeare dataset.
+  node queries.js async "SELECT * FROM     Queries the natality dataset as a job.
+  publicdata.samples.natality LIMIT 5;"
+  node queries.js shakespeare              Queries a public Shakespeare dataset.
 
 For more information, see https://cloud.google.com/bigquery/docs
 ```
@@ -100,41 +102,42 @@ __Usage:__ `node tables --help`
 
 ```
 Commands:
-  create <datasetId> <tableId> <schema> [projectId]         Creates a new table.
-  list <datasetId> [projectId]                              Lists all tables in a dataset.
-  delete <datasetId> <tableId> [projectId]                  Deletes a table.
+  create <datasetId> <tableId> <schema>                     Creates a new table.
+  list <datasetId>                                          Lists all tables in a dataset.
+  delete <datasetId> <tableId>                              Deletes a table.
   copy <srcDatasetId> <srcTableId> <destDatasetId>          Makes a copy of a table.
-  <destTableId> [projectId]
-  browse <datasetId> <tableId> [projectId]                  Lists rows in a table.
-  import <datasetId> <tableId> <fileName> [projectId]       Imports data from a local file into a table.
+  <destTableId>
+  browse <datasetId> <tableId>                              Lists rows in a table.
+  import <datasetId> <tableId> <fileName>                   Imports data from a local file into a table.
   import-gcs <datasetId> <tableId> <bucketName> <fileName>  Imports data from a Google Cloud Storage file into a
-  [projectId]                                               table.
+                                                            table.
   export <datasetId> <tableId> <bucketName> <fileName>      Export a table from BigQuery to Google Cloud Storage.
-  [projectId]
-  insert <datasetId> <tableId> <json_or_file> [projectId]   Insert a JSON array (as a string or newline-delimited
+  insert <datasetId> <tableId> <json_or_file>               Insert a JSON array (as a string or newline-delimited
                                                             file) into a BigQuery table.
 
 Options:
-  --help  Show help  [boolean]
+  --projectId, -p  The Project ID to use. Defaults to the value of the GCLOUD_PROJECT or GOOGLE_CLOUD_PROJECT
+                   environment variables.  [string]
+  --help           Show help  [boolean]
 
 Examples:
-  node tables create my_dataset my_table "Name:string,         Createss a new table named "my_table" in "my_dataset".
+  node tables.js create my_dataset my_table "Name:string,      Creates a new table named "my_table" in "my_dataset".
   Age:integer, Weight:float, IsMagic:boolean"
-  node tables list my_dataset                                  Lists tables in "my_dataset".
-  node tables browse my_dataset my_table                       Displays rows from "my_table" in "my_dataset".
-  node tables delete my_dataset my_table                       Deletes "my_table" from "my_dataset".
-  node tables import my_dataset my_table ./data.csv            Imports a local file into a table.
-  node tables import-gcs my_dataset my_table my-bucket         Imports a GCS file into a table.
+  node tables.js list my_dataset                               Lists tables in "my_dataset".
+  node tables.js browse my_dataset my_table                    Displays rows from "my_table" in "my_dataset".
+  node tables.js delete my_dataset my_table                    Deletes "my_table" from "my_dataset".
+  node tables.js import my_dataset my_table ./data.csv         Imports a local file into a table.
+  node tables.js import-gcs my_dataset my_table my-bucket      Imports a GCS file into a table.
   data.csv
-  node tables export my_dataset my_table my-bucket my-file     Exports my_dataset:my_table to gcs://my-bucket/my-file
+  node tables.js export my_dataset my_table my-bucket my-file  Exports my_dataset:my_table to gcs://my-bucket/my-file
   as raw CSV.
-  node tables export my_dataset my_table my-bucket my-file -f  Exports my_dataset:my_table to gcs://my-bucket/my-file
-  JSON --gzip                                                  as gzipped JSON.
-  node tables insert my_dataset my_table json_string           Inserts the JSON array represented by json_string into
+  node tables.js export my_dataset my_table my-bucket my-file  Exports my_dataset:my_table to gcs://my-bucket/my-file
+  -f JSON --gzip                                               as gzipped JSON.
+  node tables.js insert my_dataset my_table json_string        Inserts the JSON array represented by json_string into
   my_dataset:my_table.
-  node tables insert my_dataset my_table json_file             Inserts the JSON objects contained in json_file (one per
+  node tables.js insert my_dataset my_table json_file          Inserts the JSON objects contained in json_file (one per
   line) into my_dataset:my_table.
-  node tables copy src_dataset src_table dest_dataset          Copies src_dataset:src_table to dest_dataset:dest_table.
+  node tables.js copy src_dataset src_table dest_dataset       Copies src_dataset:src_table to dest_dataset:dest_table.
   dest_table
 
 For more information, see https://cloud.google.com/bigquery/docs
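
The thread running through all three help screens: `projectId` is no longer a positional argument on individual commands but a global `--projectId`/`-p` flag whose default comes from the environment. A minimal sketch of that resolution order (illustration only; the real definition is the yargs `.options()` call in the datasets.js diff below):

```js
// Sketch of the precedence behind --projectId/-p: an explicit flag value
// wins, otherwise the environment variables are consulted in order.
function resolveProjectId (flagValue) {
  return flagValue ||
    process.env.GCLOUD_PROJECT ||
    process.env.GOOGLE_CLOUD_PROJECT;
}

// Flag overrides the environment:
console.log(resolveProjectId('bigquery-public-data'));
// Falls back to GCLOUD_PROJECT, then GOOGLE_CLOUD_PROJECT:
console.log(resolveProjectId(undefined));
```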

bigquery/datasets.js

+84 −78
@@ -15,123 +15,129 @@
 
 'use strict';
 
-const BigQuery = require('@google-cloud/bigquery');
+function createDataset (datasetId, projectId) {
+  // [START bigquery_create_dataset]
+  // Imports the Google Cloud client library
+  const BigQuery = require('@google-cloud/bigquery');
+
+  // The project ID to use, e.g. "your-project-id"
+  // const projectId = "your-project-id";
 
-// [START bigquery_create_dataset]
-function createDataset (datasetId) {
   // Instantiates a client
-  const bigquery = BigQuery();
+  const bigquery = BigQuery({
+    projectId: projectId
+  });
+
+  // The ID for the new dataset, e.g. "my_new_dataset"
+  // const datasetId = "my_new_dataset";
 
-  // Creates a new dataset, e.g. "my_new_dataset"
-  return bigquery.createDataset(datasetId)
+  // Creates a new dataset
+  bigquery.createDataset(datasetId)
     .then((results) => {
       const dataset = results[0];
       console.log(`Dataset ${dataset.id} created.`);
-      return dataset;
+    })
+    .catch((err) => {
+      console.error('ERROR:', err);
     });
+  // [END bigquery_create_dataset]
 }
-// [END bigquery_create_dataset]
 
-// [START bigquery_delete_dataset]
-function deleteDataset (datasetId) {
+function deleteDataset (datasetId, projectId) {
+  // [START bigquery_delete_dataset]
+  // Imports the Google Cloud client library
+  const BigQuery = require('@google-cloud/bigquery');
+
+  // The project ID to use, e.g. "your-project-id"
+  // const projectId = "your-project-id";
+
   // Instantiates a client
-  const bigquery = BigQuery();
+  const bigquery = BigQuery({
+    projectId: projectId
+  });
+
+  // The ID of the dataset to delete, e.g. "my_new_dataset"
+  // const datasetId = "my_new_dataset";
 
-  // References an existing dataset, e.g. "my_dataset"
+  // Creates a reference to the existing dataset
   const dataset = bigquery.dataset(datasetId);
 
   // Deletes the dataset
-  return dataset.delete()
+  dataset.delete()
     .then(() => {
       console.log(`Dataset ${dataset.id} deleted.`);
+    })
+    .catch((err) => {
+      console.error('ERROR:', err);
     });
+  // [END bigquery_delete_dataset]
 }
-// [END bigquery_delete_dataset]
 
-// [START bigquery_list_datasets]
 function listDatasets (projectId) {
+  // [START bigquery_list_datasets]
+  // Imports the Google Cloud client library
+  const BigQuery = require('@google-cloud/bigquery');
+
+  // The project ID to use, e.g. "your-project-id"
+  // const projectId = "your-project-id";
+
   // Instantiates a client
   const bigquery = BigQuery({
     projectId: projectId
   });
 
   // Lists all datasets in the specified project
-  return bigquery.getDatasets()
+  bigquery.getDatasets()
     .then((results) => {
       const datasets = results[0];
       console.log('Datasets:');
       datasets.forEach((dataset) => console.log(dataset.id));
-      return datasets;
+    })
+    .catch((err) => {
+      console.error('ERROR:', err);
     });
+  // [END bigquery_list_datasets]
 }
-// [END bigquery_list_datasets]
-
-// [START bigquery_get_dataset_size]
-function getDatasetSize (datasetId, projectId) {
-  // Instantiate a client
-  const bigquery = BigQuery({
-    projectId: projectId
-  });
-
-  // References an existing dataset, e.g. "my_dataset"
-  const dataset = bigquery.dataset(datasetId);
 
-  // Lists all tables in the dataset
-  return dataset.getTables()
-    .then((results) => results[0])
-    // Retrieve the metadata for each table
-    .then((tables) => Promise.all(tables.map((table) => table.get())))
-    .then((results) => results.map((result) => result[0]))
-    // Select the size of each table
-    .then((tables) => tables.map((table) => (parseInt(table.metadata.numBytes, 10) / 1000) / 1000))
-    // Sum up the sizes
-    .then((sizes) => sizes.reduce((cur, prev) => cur + prev, 0))
-    // Print and return the size
-    .then((sum) => {
-      console.log(`Size of ${dataset.id}: ${sum} MB`);
-      return sum;
-    });
-}
-// [END bigquery_get_dataset_size]
-
-// The command-line program
-const cli = require(`yargs`);
-
-const program = module.exports = {
-  createDataset: createDataset,
-  deleteDataset: deleteDataset,
-  listDatasets: listDatasets,
-  getDatasetSize: getDatasetSize,
-  main: (args) => {
-    // Run the command-line program
-    cli.help().strict().parse(args).argv; // eslint-disable-line
-  }
-};
-
-cli
+const cli = require(`yargs`)
   .demand(1)
-  .command(`create <datasetId>`, `Creates a new dataset.`, {}, (opts) => {
-    program.createDataset(opts.datasetId);
-  })
-  .command(`delete <datasetId>`, `Deletes a dataset.`, {}, (opts) => {
-    program.deleteDataset(opts.datasetId);
-  })
-  .command(`list [projectId]`, `Lists all datasets in the specified project or the current project.`, {}, (opts) => {
-    program.listDatasets(opts.projectId || process.env.GCLOUD_PROJECT);
-  })
-  .command(`size <datasetId> [projectId]`, `Calculates the size of a dataset.`, {}, (opts) => {
-    program.getDatasetSize(opts.datasetId, opts.projectId || process.env.GCLOUD_PROJECT);
+  .options({
+    projectId: {
+      alias: 'p',
+      default: process.env.GCLOUD_PROJECT || process.env.GOOGLE_CLOUD_PROJECT,
+      description: 'The Project ID to use. Defaults to the value of the GCLOUD_PROJECT or GOOGLE_CLOUD_PROJECT environment variables.',
+      requiresArg: true,
+      type: 'string'
+    }
   })
+  .command(
+    `create <datasetId>`,
+    `Creates a new dataset.`,
+    {},
+    (opts) => createDataset(opts.datasetId, opts.projectId)
+  )
+  .command(
+    `delete <datasetId>`,
+    `Deletes a dataset.`,
+    {},
+    (opts) => deleteDataset(opts.datasetId, opts.projectId)
+  )
+  .command(
+    `list`,
+    `Lists datasets.`,
+    {},
+    (opts) => listDatasets(opts.projectId)
+  )
   .example(`node $0 create my_dataset`, `Creates a new dataset named "my_dataset".`)
   .example(`node $0 delete my_dataset`, `Deletes a dataset named "my_dataset".`)
-  .example(`node $0 list`, `Lists all datasets in the current project.`)
-  .example(`node $0 list bigquery-public-data`, `Lists all datasets in the "bigquery-public-data" project.`)
-  .example(`node $0 size my_dataset`, `Calculates the size of "my_dataset" in the current project.`)
-  .example(`node $0 size hacker_news bigquery-public-data`, `Calculates the size of "bigquery-public-data:hacker_news".`)
+  .example(`node $0 list`, `Lists all datasets in the project specified by the GCLOUD_PROJECT or GOOGLE_CLOUD_PROJECT environment variables.`)
+  .example(`node $0 list --projectId=bigquery-public-data`, `Lists all datasets in the "bigquery-public-data" project.`)
   .wrap(120)
   .recommendCommands()
-  .epilogue(`For more information, see https://cloud.google.com/bigquery/docs`);
+  .epilogue(`For more information, see https://cloud.google.com/bigquery/docs`)
+  .help()
+  .strict();
 
 if (module === require.main) {
-  program.main(process.argv.slice(2));
+  cli.parse(process.argv.slice(2));
 }
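
Moving each `require` and the commented-out placeholder declarations inside the function bodies, between the `[START]`/`[END]` tags, makes every region self-contained when it is extracted for the docs. A sketch of how the `bigquery_create_dataset` region would read once extracted, with the placeholders uncommented (the literal values here are assumptions):

```js
// Imports the Google Cloud client library
const BigQuery = require('@google-cloud/bigquery');

// The project ID to use (assumed placeholder value)
const projectId = 'your-project-id';

// Instantiates a client
const bigquery = BigQuery({
  projectId: projectId
});

// The ID for the new dataset (assumed placeholder value)
const datasetId = 'my_new_dataset';

// Creates a new dataset
bigquery.createDataset(datasetId)
  .then((results) => {
    const dataset = results[0];
    console.log(`Dataset ${dataset.id} created.`);
  })
  .catch((err) => {
    console.error('ERROR:', err);
  });
```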

bigquery/package.json

+1 −1
@@ -27,7 +27,7 @@
     "yargs": "7.1.0"
   },
   "devDependencies": {
-    "@google-cloud/nodejs-repo-tools": "1.3.1",
+    "@google-cloud/nodejs-repo-tools": "1.3.2",
     "ava": "0.19.1",
     "proxyquire": "1.7.11",
     "sinon": "2.1.0",
