Add BigQuery, Source Repositories #113

Merged on Mar 1, 2019 (57 commits)

Commits:
e87bd0f  Add InSpec support for backend service (slevenick, Jan 25, 2019)
6a8728b  Merge pull request #87 from modular-magician/codegen-pr-1300 (slevenick, Jan 25, 2019)
9ecfaaa  Add HTTP health check for InSpec (slevenick, Jan 25, 2019)
0942279  Merge pull request #88 from modular-magician/codegen-pr-1303 (slevenick, Jan 25, 2019)
98a2d36  Add HTTPS health check to InSpec (slevenick, Jan 25, 2019)
f0be43a  Merge pull request #89 from modular-magician/codegen-pr-1305 (slevenick, Jan 25, 2019)
e4f5d3e  Add compute instance template for InSpec (slevenick, Jan 26, 2019)
2e55d45  Merge pull request #90 from modular-magician/codegen-pr-1308 (slevenick, Jan 28, 2019)
0a10a32  Add compute global address to InSpec (slevenick, Jan 28, 2019)
752a463  Merge pull request #91 from modular-magician/codegen-pr-1309 (slevenick, Jan 28, 2019)
2f6ded5  Inspec url map (slevenick, Jan 28, 2019)
9d387eb  Merge pull request #92 from modular-magician/codegen-pr-1310 (slevenick, Jan 28, 2019)
adb5a42  Add InSpec support for HTTP proxy (slevenick, Jan 28, 2019)
226fac4  Merge pull request #94 from modular-magician/codegen-pr-1314 (slevenick, Jan 28, 2019)
67a8582  Add global forwarding rule generation to InSpec (slevenick, Jan 29, 2019)
1753ef9  Merge pull request #95 from modular-magician/codegen-pr-1319 (slevenick, Jan 29, 2019)
9ce89f7  Add support for target TCP proxy in InSpec (slevenick, Jan 29, 2019)
2c51de4  Merge pull request #96 from modular-magician/codegen-pr-1321 (slevenick, Jan 30, 2019)
bf0e504  Inspec regional cluster (slevenick, Jan 30, 2019)
a562de7  Merge pull request #97 from modular-magician/codegen-pr-1295 (slevenick, Jan 30, 2019)
ec04e25  Add InSpec support for compute routes (slevenick, Jan 30, 2019)
c18dd70  Merge pull request #98 from modular-magician/codegen-pr-1331 (slevenick, Jan 31, 2019)
a167f4c  Update InSpec doc template to use underscored name in title box (slevenick, Jan 31, 2019)
7aceed0  Merge pull request #100 from modular-magician/codegen-pr-1333 (slevenick, Jan 31, 2019)
73aaadb  Add router support in InSpec (slevenick, Jan 31, 2019)
df79fb9  Merge pull request #99 from modular-magician/codegen-pr-1332 (slevenick, Jan 31, 2019)
507ad5c  Add support for InSpec disk snapshot (slevenick, Feb 1, 2019)
c3d9a69  Merge pull request #101 from modular-magician/codegen-pr-1343 (slevenick, Feb 1, 2019)
858fa89  Inspec ssl certificate (slevenick, Feb 2, 2019)
55558ec  Merge pull request #102 from modular-magician/codegen-pr-1347 (slevenick, Feb 5, 2019)
280de46  Fix InSpec pubsub subscription test (slevenick, Feb 6, 2019)
3608612  Merge pull request #103 from modular-magician/codegen-pr-1357 (slevenick, Feb 6, 2019)
ed63fb1  InSpec add support for BigQuery Dataset (slevenick, Feb 6, 2019)
e12467d  Merge pull request #104 from modular-magician/codegen-pr-1358 (slevenick, Feb 8, 2019)
a3bbe4b  Retrieve SOA record using DNS zone instead of building it from record… (matco, Feb 12, 2019)
ac3d1fd  Inspec nested refactor (slevenick, Feb 13, 2019)
8360494  Merge pull request #105 from modular-magician/codegen-pr-1368 (slevenick, Feb 13, 2019)
009f814  Remove old nested objects with bad namespaces (slevenick, Feb 13, 2019)
c268f98  Add VCR back for unit testing in InSpec (slevenick, Feb 13, 2019)
a8cc444  Merge branch 'master' of https://github.com/inspec/inspec-gcp (slevenick, Feb 13, 2019)
e24b30c  Merge pull request #107 from modular-magician/codegen-pr-1373 (slevenick, Feb 13, 2019)
519ebca  Add terraform upgrade to Rakefile (slevenick, Feb 15, 2019)
bf0cbf2  Templates, inspec.yaml for bigquery table (slevenick, Feb 15, 2019)
a9b2537  Merge pull request #110 from modular-magician/codegen-pr-1399 (slevenick, Feb 15, 2019)
2688372  Retrieve SOA record using DNS zone instead of building it from record… (rambleraptor, Feb 15, 2019)
fb2b900  Add InSpec support for source repositories (slevenick, Feb 19, 2019)
28ec6a7  Add labels to Pubsub Subscription/Topics (#109) (modular-magician, Feb 19, 2019)
f5b6860  Update display names across products based on cloud.google.com (#106) (modular-magician, Feb 19, 2019)
3cf9d74  Merge branch 'master' into codegen-pr-1411 (slevenick, Feb 20, 2019)
d627f42  Merge pull request #112 from modular-magician/codegen-pr-1411 (slevenick, Feb 20, 2019)
b68cb8b  Add convenience outputs for public/private IP in Cloud SQL (rileykarson, Feb 20, 2019)
c5b2dec  Merge pull request #116 from modular-magician/codegen-pr-1417 (nat-henderson, Feb 20, 2019)
1f43702  Merge remote-tracking branch 'origin/master' into gcp-master (slevenick, Feb 25, 2019)
e591cf4  Reset merge issues (slevenick, Feb 25, 2019)
bce4ef4  Add notes on API requirements to markdown docs for InSpec generated r… (slevenick, Feb 28, 2019)
a7b11d4  Merge pull request #119 from modular-magician/codegen-pr-1449 (slevenick, Feb 28, 2019)
0c000e9  Improve docs for Cloud Build (#118) (modular-magician, Mar 1, 2019)
1 change: 1 addition & 0 deletions Gemfile
@@ -13,5 +13,6 @@ group :development do
   gem 'passgen'
   gem 'pry-coolline'
   gem 'rake'
+  gem 'vcr'
   gem 'webmock'
 end
2 changes: 1 addition & 1 deletion Rakefile
@@ -43,7 +43,7 @@ namespace :test do

   task :init_workspace do
     # Initialize terraform workspace
-    cmd = format("cd %s/build/ && terraform init", integration_dir)
+    cmd = format("cd %s/build/ && terraform init -upgrade", integration_dir)
     sh(cmd)
   end
111 changes: 111 additions & 0 deletions docs/resources/google_bigquery_table.md
@@ -0,0 +1,111 @@
---
title: About the google_bigquery_table resource
platform: gcp
---

## Syntax
A `google_bigquery_table` is used to test a Google BigQuery Table resource

## Examples
```
describe google_bigquery_table(project: 'chef-gcp-inspec', dataset: 'inspec_gcp_dataset', name: 'inspec_gcp_bigquery_table') do
it { should exist }

its('expiration_time') { should cmp '1738882264000' }
its('time_partitioning.type') { should eq 'DAY' }
its('description') { should eq 'A BigQuery table' }
end

describe google_bigquery_table(project: 'chef-gcp-inspec', dataset: 'inspec_gcp_dataset', name: 'nonexistent') do
it { should_not exist }
end
```

## Properties
Properties that can be accessed from the `google_bigquery_table` resource:

* `table_reference`: Reference describing the ID of this table

  * `datasetId`: The ID of the dataset containing this table

  * `projectId`: The ID of the project containing this table

  * `tableId`: The ID of the table

* `creation_time`: The time when this table was created, in milliseconds since the epoch.

* `description`: A user-friendly description of the table

* `friendly_name`: A descriptive name for this table

* `id`: An opaque ID uniquely identifying the table.

* `labels`: The labels associated with this table. You can use these to organize and group your tables

* `last_modified_time`: The time when this table was last modified, in milliseconds since the epoch.

* `location`: The geographic location where the table resides. This value is inherited from the dataset.

* `name`: Name of the table

* `num_bytes`: The size of this table in bytes, excluding any data in the streaming buffer.

* `num_long_term_bytes`: The number of bytes in the table that are considered "long-term storage".

* `num_rows`: The number of rows of data in this table, excluding any data in the streaming buffer.

* `type`: Describes the table type

* `view`: The view definition.

  * `useLegacySql`: Specifies whether to use BigQuery's legacy SQL for this view

  * `userDefinedFunctionResources`: Describes user-defined function resources used in the query.

* `time_partitioning`: If specified, configures time-based partitioning for this table.

  * `expirationMs`: Number of milliseconds for which to keep the storage for a partition.

  * `type`: The only type supported is DAY, which will generate one partition per day.

* `streaming_buffer`: Contains information regarding this table's streaming buffer, if one is present. This field will be absent if the table is not being streamed to or if there is no data in the streaming buffer.

  * `estimatedBytes`: A lower-bound estimate of the number of bytes currently in the streaming buffer.

  * `estimatedRows`: A lower-bound estimate of the number of rows currently in the streaming buffer.

  * `oldestEntryTime`: Contains the timestamp of the oldest entry in the streaming buffer, in milliseconds since the epoch, if the streaming buffer is available.

* `schema`: Describes the schema of this table

  * `fields`: Describes the fields in a table.

* `encryption_configuration`: Custom encryption configuration

  * `kmsKeyName`: Describes the Cloud KMS encryption key that will be used to protect the destination BigQuery table. The BigQuery Service Account associated with your project requires access to this encryption key.

* `expiration_time`: The time when this table expires, in milliseconds since the epoch. If not present, the table will persist indefinitely.

* `external_data_configuration`: Describes the data format, location, and other properties of a table stored outside of BigQuery. By defining these properties, the data source can then be queried as if it were a standard BigQuery table.

  * `autodetect`: Try to detect schema and format options automatically. Any option specified explicitly will be honored.

  * `compression`: The compression type of the data source

  * `ignoreUnknownValues`: Indicates if BigQuery should allow extra values that are not represented in the table schema

  * `maxBadRecords`: The maximum number of bad records that BigQuery can ignore when reading data

  * `sourceFormat`: The data format

  * `sourceUris`: The fully-qualified URIs that point to your data in Google Cloud. For Google Cloud Storage URIs: each URI can contain one '*' wildcard character, and it must come after the 'bucket' name. Size limits related to load jobs apply to external data sources. For Google Cloud Bigtable URIs: exactly one URI can be specified, and it has to be a fully specified and valid HTTPS URL for a Google Cloud Bigtable table. For Google Cloud Datastore backups, exactly one URI can be specified; the '*' wildcard character is not allowed.

  * `schema`: The schema for the data. Schema is required for CSV and JSON formats

  * `googleSheetsOptions`: Additional options if sourceFormat is set to GOOGLE_SHEETS.

  * `csvOptions`: Additional properties to set if sourceFormat is set to CSV.

  * `bigtableOptions`: Additional options if sourceFormat is set to BIGTABLE.

* `dataset`: Name of the dataset
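
Nested objects are exposed as structs whose readers use snake_case versions of the camelCase names above. A hedged sketch of asserting nested fields, reusing the hypothetical project and table from the Examples section:

```
describe google_bigquery_table(project: 'chef-gcp-inspec', dataset: 'inspec_gcp_dataset', name: 'inspec_gcp_bigquery_table') do
  # Nested readers follow the snake_case naming of the generated property classes.
  its('table_reference.dataset_id') { should eq 'inspec_gcp_dataset' }
  its('time_partitioning.type') { should eq 'DAY' }
end
```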
47 changes: 47 additions & 0 deletions docs/resources/google_bigquery_tables.md
@@ -0,0 +1,47 @@
---
title: About the google_bigquery_tables resource
platform: gcp
---

## Syntax
A `google_bigquery_tables` is used to test Google BigQuery Table resources in bulk

## Examples
```
describe.one do
google_bigquery_tables(project: 'chef-gcp-inspec', dataset: 'inspec_gcp_dataset').table_references.each do |table_reference|
describe google_bigquery_table(project: 'chef-gcp-inspec', dataset: 'inspec_gcp_dataset', name: table_reference.table_id) do
its('expiration_time') { should cmp '1738882264000' }
its('description') { should eq 'A BigQuery table' }
end
end
end
```

## Properties
Properties that can be accessed from the `google_bigquery_tables` resource:

See [google_bigquery_table.md](google_bigquery_table.md) for more detailed information
* `table_references`: an array of `google_bigquery_table` table_reference
* `creation_times`: an array of `google_bigquery_table` creation_time
* `friendly_names`: an array of `google_bigquery_table` friendly_name
* `ids`: an array of `google_bigquery_table` id
* `labels`: an array of `google_bigquery_table` labels
* `last_modified_times`: an array of `google_bigquery_table` last_modified_time
* `locations`: an array of `google_bigquery_table` location
* `num_bytes`: an array of `google_bigquery_table` num_bytes
* `num_long_term_bytes`: an array of `google_bigquery_table` num_long_term_bytes
* `num_rows`: an array of `google_bigquery_table` num_rows
* `types`: an array of `google_bigquery_table` type
* `views`: an array of `google_bigquery_table` view
* `time_partitionings`: an array of `google_bigquery_table` time_partitioning
* `streaming_buffers`: an array of `google_bigquery_table` streaming_buffer
* `schemas`: an array of `google_bigquery_table` schema
* `encryption_configurations`: an array of `google_bigquery_table` encryption_configuration
* `expiration_times`: an array of `google_bigquery_table` expiration_time
* `external_data_configurations`: an array of `google_bigquery_table` external_data_configuration
* `datasets`: an array of `google_bigquery_table` dataset

## Filter Criteria
This resource supports all of the above properties as filter criteria, which can be used
with `where` as a block or a method.
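
As a sketch of the method form, assuming the singular field name `type` is accepted as a criterion (the `TABLE` and `VIEW` values come from the BigQuery API):

```
# Keep only plain tables, then assert that at least one remains.
describe google_bigquery_tables(project: 'chef-gcp-inspec', dataset: 'inspec_gcp_dataset').where(type: 'TABLE') do
  its('count') { should be >= 1 }
end
```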
2 changes: 1 addition & 1 deletion docs/resources/google_dns_resource_record_set.md
@@ -28,4 +28,4 @@ Properties that can be accessed from the `google_dns_resource_record_set` resource:

* `target`: As defined in RFC 1035 (section 5) and RFC 1034 (section 3.6.1)

-* `managed_zone`: Identifies the managed zone addressed by this request. Can be the managed zone name or id.
+* `managed_zone`: Identifies the managed zone addressed by this request.
4 changes: 4 additions & 0 deletions docs/resources/google_pubsub_subscription.md
@@ -24,8 +24,12 @@ Properties that can be accessed from the `google_pubsub_subscription` resource:

* `topic`: A reference to a Topic resource.

* `labels`: A set of key/value label pairs to assign to this Subscription.

* `push_config`: If push delivery is used with this subscription, this field is used to configure it. An empty pushConfig signifies that the subscriber will pull and ack messages using API methods.

  * `pushEndpoint`: A URL locating the endpoint to which messages should be pushed. For example, a Webhook endpoint might use "https://example.com/push".

  * `attributes`: Endpoint configuration attributes. Every endpoint has a set of API-supported attributes that can be used to control different aspects of the message delivery. The currently supported attribute is `x-goog-version`, which you can use to change the format of the pushed message. This attribute indicates the version of the data expected by the endpoint; this controls the shape of the pushed message (i.e., its fields and metadata). The endpoint version is based on the version of the Pub/Sub API. If not present during the subscriptions.create call, it will default to the version of the API used to make such a call. If not present during a subscriptions.modifyPushConfig call, its value will not be changed. subscriptions.get calls will always return a valid version, even if the subscription was created without this attribute. The possible values for this attribute are v1beta1 (uses the push format defined in the v1beta1 Pub/Sub API) and v1 or v1beta2 (uses the push format defined in the v1 Pub/Sub API).

* `ack_deadline_seconds`: This value is the maximum time after a subscriber receives a message before the subscriber should acknowledge the message. After message delivery but before the ack deadline expires and before the message is acknowledged, it is an outstanding message and will not be delivered again during that time (on a best-effort basis). For pull subscriptions, this value is used as the initial value for the ack deadline. To override this value for a given message, call subscriptions.modifyAckDeadline with the corresponding ackId if using pull. The minimum custom deadline you can specify is 10 seconds. The maximum custom deadline you can specify is 600 seconds (10 minutes). If this parameter is 0, a default value of 10 seconds is used. For push delivery, this value is also used to set the request timeout for the call to the push endpoint. If the subscriber never acknowledges the message, the Pub/Sub system will eventually redeliver the message.
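
A hedged example of checking the new `labels` property, assuming a hypothetical subscription created with an `env` label:

```
describe google_pubsub_subscription(project: 'chef-gcp-inspec', name: 'inspec-gcp-subscription') do
  it { should exist }
  # labels is returned as a key/value hash.
  its('labels') { should include('env' => 'test') }
end
```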
1 change: 1 addition & 0 deletions docs/resources/google_pubsub_subscriptions.md
@@ -25,6 +25,7 @@ Properties that can be accessed from the `google_pubsub_subscriptions` resource:
See [google_pubsub_subscription.md](google_pubsub_subscription.md) for more detailed information
* `names`: an array of `google_pubsub_subscription` name
* `topics`: an array of `google_pubsub_subscription` topic
* `labels`: an array of `google_pubsub_subscription` labels
* `push_configs`: an array of `google_pubsub_subscription` push_config
* `ack_deadline_seconds`: an array of `google_pubsub_subscription` ack_deadline_seconds

2 changes: 2 additions & 0 deletions docs/resources/google_pubsub_topic.md
@@ -21,3 +21,5 @@ end
Properties that can be accessed from the `google_pubsub_topic` resource:

* `name`: Name of the topic.

* `labels`: A set of key/value label pairs to assign to this Topic.
1 change: 1 addition & 0 deletions docs/resources/google_pubsub_topics.md
@@ -28,6 +28,7 @@ Properties that can be accessed from the `google_pubsub_topics` resource:

See [google_pubsub_topic.md](google_pubsub_topic.md) for more detailed information
* `names`: an array of `google_pubsub_topic` name
* `labels`: an array of `google_pubsub_topic` labels

## Filter Criteria
This resource supports all of the above properties as filter criteria, which can be used
with `where` as a block or a method.
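
A sketch of the block form of `where`, assuming `name` is available as a field inside the block and that at least one topic name contains "inspec":

```
describe google_pubsub_topics(project: 'chef-gcp-inspec').where { name =~ /inspec/ } do
  its('count') { should be >= 1 }
end
```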
31 changes: 31 additions & 0 deletions docs/resources/google_sourcerepo_repositories.md
@@ -0,0 +1,31 @@
---
title: About the google_sourcerepo_repositories resource
platform: gcp
---

## Syntax
A `google_sourcerepo_repositories` is used to test Google Repository resources in bulk

## Examples
```
repo_name = 'inspec-gcp-repository'
describe.one do
google_sourcerepo_repositories(project: 'chef-gcp-inspec').names.each do |name|
describe name do
it { should match /\/repos\/#{repo_name}$/ }
end
end
end
```

## Properties
Properties that can be accessed from the `google_sourcerepo_repositories` resource:

See [google_sourcerepo_repository.md](google_sourcerepo_repository.md) for more detailed information
* `names`: an array of `google_sourcerepo_repository` name
* `urls`: an array of `google_sourcerepo_repository` url
* `sizes`: an array of `google_sourcerepo_repository` size

## Filter Criteria
This resource supports all of the above properties as filter criteria, which can be used
with `where` as a block or a method.
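
A hedged sketch of the method form of `where`, filtering on the repository name from the Examples section:

```
describe google_sourcerepo_repositories(project: 'chef-gcp-inspec').where(name: /inspec-gcp-repository/) do
  its('count') { should cmp 1 }
end
```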
Review comment on docs/resources/google_sourcerepo_repositories.md:

@skpaterson (Feb 27, 2019): Could we add something to the documentation here to explain that the Cloud Source Repository API should be enabled? Might be worthwhile doing the same for other new services. Here's a similar example for Compute Engine:

## GCP Permissions

Ensure the [Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com/) is enabled for the current project.

@slevenick (Collaborator, Author): Good idea! Added this on all generated docs.

@skpaterson: Awesome, thanks @slevenick!
27 changes: 27 additions & 0 deletions docs/resources/google_sourcerepo_repository.md
@@ -0,0 +1,27 @@
---
title: About the google_sourcerepo_repository resource
platform: gcp
---

## Syntax
A `google_sourcerepo_repository` is used to test a Google Repository resource

## Examples
```
describe google_sourcerepo_repository(project: 'chef-gcp-inspec', name: 'inspec-gcp-repository') do
it { should exist }
end

describe google_sourcerepo_repository(project: 'chef-gcp-inspec', name: 'nonexistent') do
it { should_not exist }
end
```

## Properties
Properties that can be accessed from the `google_sourcerepo_repository` resource:

* `name`: Resource name of the repository, of the form projects/{{project}}/repos/{{repo}}. The repo name may contain slashes, e.g., projects/myproject/repos/name/with/slash

* `url`: URL to clone the repository from Google Cloud Source Repositories.

* `size`: The disk usage of the repo, in bytes.
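
A hedged illustration of asserting these properties, reusing the hypothetical repository above (the HTTPS check is an assumption about the clone URL format):

```
describe google_sourcerepo_repository(project: 'chef-gcp-inspec', name: 'inspec-gcp-repository') do
  its('name') { should match /\/repos\/inspec-gcp-repository$/ }
  its('url') { should match /^https:\/\// }
end
```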
@@ -0,0 +1,29 @@
# frozen_string_literal: false

# ----------------------------------------------------------------------------
#
# *** AUTO GENERATED CODE *** AUTO GENERATED CODE ***
#
# ----------------------------------------------------------------------------
#
# This file is automatically generated by Magic Modules and manual
# changes will be clobbered when the file is regenerated.
#
# Please read more about how to change this file in README.md and
# CONTRIBUTING.md located at the root of this package.
#
# ----------------------------------------------------------------------------
module GoogleInSpec
  module BigQuery
    module Property
      class TableEncryptionConfiguration
        attr_reader :kms_key_name

        def initialize(args = nil)
          return if args.nil?
          @kms_key_name = args['kmsKeyName']
        end
      end
    end
  end
end
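
These generated property classes simply map camelCase keys from the BigQuery API payload onto snake_case readers. A minimal usage sketch under that assumption (the hash literal stands in for a parsed API response, and the key name is illustrative):

```
require 'google/bigquery/property/table_encryption_configuration'

# Construct the property from an API-style hash, as the resource layer would.
config = GoogleInSpec::BigQuery::Property::TableEncryptionConfiguration.new(
  'kmsKeyName' => 'projects/my-project/locations/us/keyRings/my-ring/cryptoKeys/my-key'
)
config.kms_key_name # => "projects/my-project/locations/us/keyRings/my-ring/cryptoKeys/my-key"
```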
@@ -0,0 +1,62 @@
# frozen_string_literal: false

# ----------------------------------------------------------------------------
#
# *** AUTO GENERATED CODE *** AUTO GENERATED CODE ***
#
# ----------------------------------------------------------------------------
#
# This file is automatically generated by Magic Modules and manual
# changes will be clobbered when the file is regenerated.
#
# Please read more about how to change this file in README.md and
# CONTRIBUTING.md located at the root of this package.
#
# ----------------------------------------------------------------------------
require 'google/bigquery/property/table_external_data_configuration_bigtable_options'
require 'google/bigquery/property/table_external_data_configuration_bigtable_options_column_families'
require 'google/bigquery/property/table_external_data_configuration_csv_options'
require 'google/bigquery/property/table_external_data_configuration_google_sheets_options'
require 'google/bigquery/property/table_external_data_configuration_schema'
require 'google/bigquery/property/table_external_data_configuration_schema_fields'
module GoogleInSpec
  module BigQuery
    module Property
      class TableExternalDataConfiguration
        attr_reader :autodetect

        attr_reader :compression

        attr_reader :ignore_unknown_values

        attr_reader :max_bad_records

        attr_reader :source_format

        attr_reader :source_uris

        attr_reader :schema

        attr_reader :google_sheets_options

        attr_reader :csv_options

        attr_reader :bigtable_options

        def initialize(args = nil)
          return if args.nil?
          @autodetect = args['autodetect']
          @compression = args['compression']
          @ignore_unknown_values = args['ignoreUnknownValues']
          @max_bad_records = args['maxBadRecords']
          @source_format = args['sourceFormat']
          @source_uris = args['sourceUris']
          @schema = GoogleInSpec::BigQuery::Property::TableExternalDataConfigurationSchema.new(args['schema'])
          @google_sheets_options = GoogleInSpec::BigQuery::Property::TableExternalDataConfigurationGoogleSheetsOptions.new(args['googleSheetsOptions'])
          @csv_options = GoogleInSpec::BigQuery::Property::TableExternalDataConfigurationCsvOptions.new(args['csvOptions'])
          @bigtable_options = GoogleInSpec::BigQuery::Property::TableExternalDataConfigurationBigtableOptions.new(args['bigtableOptions'])
        end
      end
    end
  end
end