Closed
Description
Page Name: bigquery-usage
Release: 0.8.0
Looking at the section for inserting data asynchronously, the load_from_storage method doesn't exist in the Table class. There is a load_table_from_storage method on the Client class, but it doesn't take the same arguments. Is this feature not yet released, or is it just improperly documented?
Inserting data (asynchronous)
Start a job loading data asynchronously from a set of CSV files, located on Google Cloud Storage, appending rows into an existing table. First, create the job locally:
>>> from gcloud import bigquery
>>> client = bigquery.Client()
>>> table = dataset.table(name='person_ages')
>>> job = table.load_from_storage(bucket_name='bucket-name',
...                               object_name_glob='object-prefix*',
...                               source_format='CSV',
...                               skip_leading_rows=1,
...                               write_disposition='truncate')
>>> job.job_id
'e3344fba-09df-4ae0-8337-fddee34b3840'
>>> job.type
'load'
>>> job.created
None
>>> job.state
None