Update s3_file to 2.6.6, git: 23c88744be6069455a39a1bc7b88e44169fe8718
workeitel committed Jan 13, 2016
1 parent b826296 commit 191a0f7
Showing 12 changed files with 122 additions and 20 deletions.
5 changes: 5 additions & 0 deletions s3_file/.gitignore
Original file line number Diff line number Diff line change
@@ -1,2 +1,7 @@
.DS_Store
metadata.json
.kitchen.local.yml
.kitchen
.s3.yml
Berksfile.lock
*.iml
28 changes: 28 additions & 0 deletions s3_file/.kitchen.yml
@@ -0,0 +1,28 @@
---
driver:
name: vagrant

provisioner:
name: chef_solo
<% s3 = YAML.load_file('.s3.yml') %>
platforms:
<% %w(11.16.4 12.0.1).each do |chef_version|
%w(ubuntu-12.04 ubuntu-14.04 centos-6.6 centos-7.0).each do |platform|
%>
- name: <%= platform %>
driver_config:
require_chef_omnibus: <%= chef_version %>
<% end
end %>
suites:
- name: default
run_list:
- recipe[s3_file_test::default]
attributes:
s3_file_test:
file: <%= s3['file'] %>
bucket: <%= s3['bucket'] %>
region: <%= s3['region'] %>
access_key: <%= s3['access_key'] %>
secret_key: <%= s3['secret_key'] %>

6 changes: 6 additions & 0 deletions s3_file/Berksfile
@@ -0,0 +1,6 @@
source 'https://supermarket.chef.io'

metadata

cookbook 's3_file_test', :path => 'test/fixtures/cookbooks/s3_file_test'

23 changes: 17 additions & 6 deletions s3_file/README.rdoc → s3_file/README.md
@@ -1,11 +1,11 @@
= DESCRIPTION:
#DESCRIPTION
An LWRP that can be used to fetch files from S3.

I created this LWRP to solve the chicken-and-egg problem of fetching files from S3 on the first Chef run on a newly provisioned machine. Ruby libraries that are installed on that first run are not available to Chef during the run, so I couldn't use a library like Fog to get what I needed from S3.

This LWRP has no dependencies beyond the Ruby standard library, so it can be used on the first run of Chef.

= REQUIREMENTS:
#REQUIREMENTS
An Amazon Web Services account and something in S3 to fetch.

Multi-part S3 uploads do not put the MD5 of the content in the ETag header. If x-amz-meta-digest is provided in the User-Defined Metadata on the S3 object, it is processed as if it were a Digest header (RFC 3230).
@@ -14,7 +14,7 @@ The MD5 of the local file will be checked against the MD5 from x-amz-meta-digest

If credentials are not provided, s3_file will attempt to use the first instance profile associated with the instance. See documentation at http://docs.aws.amazon.com/IAM/latest/UserGuide/instance-profiles.html for more on instance profiles.

= USAGE:
#USAGE
s3_file acts like other file resources. The only supported action is :create, which is the default.

Attribute Parameters:
@@ -46,14 +46,14 @@ Example:
decrypted_file_checksum "SHA256 hex digest of decrypted file"
end
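For reference, a fuller sketch of the resource, with placeholder values throughout (the attribute names are taken from the test fixture recipe in this commit; paths, bucket, and keys are illustrative):

```ruby
# All values here are illustrative placeholders.
s3_file '/tmp/my-file' do
  remote_path '/my/s3/key'
  bucket 'my-s3-bucket'
  aws_access_key_id 'AKIA...'
  aws_secret_access_key 'SECRET'
  owner 'root'
  group 'root'
  mode 0600
  action :create
end
```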

= MD5 and Multi-Part Upload:
#MD5 and Multi-Part Upload
s3_file compares the MD5 hash of a local file, if present, with the ETag header of the S3 object. If they do not match, the remote object will be downloaded and notifications will be fired.

In most cases, the ETag of an S3 object will be identical to its MD5 hash. However, if the file was uploaded to S3 via multi-part upload, then the ETag will be set to the MD5 hash of the first uploaded part. In these cases, MD5 of the local file and remote object will never match.

To work around this issue, set an X-Amz-Meta-Digest tag on your S3 object with value set to `md5=MD5 of the entire object`. s3_file will then use that value in place of the ETag value, and will skip downloading in case the MD5 of the local file matches the value of the X-Amz-Meta-Digest header.
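The `md5=...` value this workaround expects can be sketched as follows (the file name and contents are illustrative, and the metadata must still be attached to the S3 object at upload time):

```ruby
require 'digest'
require 'tempfile'

# Build the md5=<hex> value that s3_file reads from X-Amz-Meta-Digest.
# The temp file stands in for the object being uploaded.
file = Tempfile.new('s3-object')
file.write('example object contents')
file.close

digest_value = "md5=#{Digest::MD5.file(file.path).hexdigest}"
puts digest_value
```

s3_file then compares this MD5 against the local file instead of the multi-part ETag.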

= USING ENCRYPTED S3 FILES:
#USING ENCRYPTED S3 FILES
s3_file can decrypt files that have been encrypted with an AES-256-CBC cipher. To use the decryption part of the resource, you must provide a decryption_key, which can be generated by following the instructions below. You can also include an optional decrypted_file_checksum, which lets Chef check whether it needs to re-download the encrypted file. Note that this checksum is different from the one in S3: because the comparison is against the already-decrypted file, a SHA256 checksum is used instead of the MD5. Instructions for generating the decrypted_file_checksum are below as well.

To use s3_file with encrypted files:
@@ -74,7 +74,7 @@ Try `bin/s3_crypto -g > my_new_key`.

You can use the utility `bin/s3_crypto` to encrypt files prior to uploading to S3, and to decrypt files to verify that the encryption is working.
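The cipher named above can be sketched with Ruby's standard OpenSSL bindings. This round trip illustrates AES-256-CBC plus the SHA256 checksum of the decrypted result; it is an illustration, not s3_file's exact implementation:

```ruby
require 'openssl'
require 'digest'

# Encrypt a payload with AES-256-CBC...
cipher = OpenSSL::Cipher.new('AES-256-CBC')
cipher.encrypt
key = cipher.random_key
iv  = cipher.random_iv
encrypted = cipher.update('secret payload') + cipher.final

# ...then decrypt it with the same key and IV.
decipher = OpenSSL::Cipher.new('AES-256-CBC')
decipher.decrypt
decipher.key = key
decipher.iv  = iv
decrypted = decipher.update(encrypted) + decipher.final

# decrypted_file_checksum compares against the decrypted bytes,
# so it is a SHA256, not an MD5:
checksum = Digest::SHA256.hexdigest(decrypted)
```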

= ChefSpec matcher
#ChefSpec matcher
s3_file comes with a matcher to use in {ChefSpec}[https://github.com/sethvargo/chefspec].

This spec checks the code from the USAGE example above:
@@ -84,3 +84,14 @@ This spec checks the code from the USAGE example above:
.with(bucket: "my-s3-bucket", remote_path: "/my/s3/key")
end

#Testing

This cookbook has Test Kitchen integration tests. To test, create a .s3.yml file with the following S3 details.

file: file
bucket: bucket
region: xx-xxxx-x
access_key: XXXXXXXXXXXXXXXXXXXX
secret_key: XXXXXXXXXXXXXXXXXXXX

If you're using the ChefDK then type `chef exec kitchen test`, otherwise `kitchen test`.
34 changes: 22 additions & 12 deletions s3_file/libraries/s3_file.rb
@@ -87,7 +87,15 @@ def self.with_region_detect(region = nil)
end

def self.do_request(method, url, bucket, path, aws_access_key_id, aws_secret_access_key, token, region)
url = "https://#{bucket}.s3.amazonaws.com" if url.nil?
if url.nil?
url = 'https://' + (
if bucket =~ /\./
"s3.amazonaws.com/#{bucket}"
else
"#{bucket}.s3.amazonaws.com"
end
)
end

with_region_detect(region) do |real_region|
client.reset_before_execution_procs
@@ -122,27 +130,27 @@ def self.get_from_s3(bucket, url, path, aws_access_key_id, aws_secret_access_key
for attempts in 0..retries
begin
response = do_request("GET", url, bucket, path, aws_access_key_id, aws_secret_access_key, token, region)
break
return response
# break
rescue client::MovedPermanently, client::Found, client::TemporaryRedirect => e
uri = URI.parse(e.response.header['location'])
path = uri.path
uri.path = ""
url = uri.to_s
retry
rescue => e
error = e.respond_to?(:response) ? e.response : e
if attempts < retries
Chef::Log.warn(error)
sleep 5
next
else
Chef::Log.fatal(error)
raise e
if e.respond_to? :response
msg = e.response
if attempts < retries
Chef::Log.warn msg
next
else
Chef::Log.fatal msg
end
end
raise e
end
end

return response
end

def self.aes256_decrypt(key, file)
@@ -201,6 +209,8 @@ def self.verify_md5_checksum(checksum, file)
def self.client
require 'rest-client'
RestClient.proxy = ENV['http_proxy']
RestClient.proxy = ENV['https_proxy']
RestClient.proxy = ENV['no_proxy']
RestClient
end
end
4 changes: 2 additions & 2 deletions s3_file/metadata.rb
@@ -3,5 +3,5 @@
maintainer_email "brandon.adams@me.com"
license "MIT"
description "Installs/Configures s3_file LWRP"
long_description IO.read(File.join(File.dirname(__FILE__), 'README.rdoc'))
version "2.5.4"
long_description IO.read(File.join(File.dirname(__FILE__), 'README.md'))
version "2.6.6"
1 change: 1 addition & 0 deletions s3_file/recipes/dependencies.rb
@@ -1,4 +1,5 @@
chef_gem 'rest-client' do
version node['s3_file']['rest-client']['version']
action :install
compile_time true if Chef::Resource::ChefGem.method_defined?(:compile_time)
end
12 changes: 12 additions & 0 deletions s3_file/test/fixtures/cookbooks/s3_file_test/README.md
@@ -0,0 +1,12 @@
#Description

This cookbook defines acceptance tests for `s3_file`. It simply attempts to fetch a file from S3.

##Attributes

- `node['s3_file_test']['bucket']` - The bucket where the test file resides
- `node['s3_file_test']['region']` - The AWS region for the bucket
- `node['s3_file_test']['file']` - The name of the test file
- `node['s3_file_test']['access_key']` - The AWS access key which allows us to fetch our test S3 file
- `node['s3_file_test']['secret_key']` - The AWS secret key which allows us to fetch our test S3 file

10 changes: 10 additions & 0 deletions s3_file/test/fixtures/cookbooks/s3_file_test/metadata.rb
@@ -0,0 +1,10 @@
name 's3_file_test'
maintainer 'Brandon Adams'
maintainer_email 'brandon.adams@me.com'
license 'MIT'
description 'Tests s3_file LWRP'
long_description IO.read(File.join(File.dirname(__FILE__), 'README.md'))
version '0.1.0'

depends 's3_file'

11 changes: 11 additions & 0 deletions s3_file/test/fixtures/cookbooks/s3_file_test/recipes/default.rb
@@ -0,0 +1,11 @@

s3_file '/root/kitchen-test' do
remote_path node['s3_file_test']['file']
bucket node['s3_file_test']['bucket']
aws_access_key_id node['s3_file_test']['access_key']
aws_secret_access_key node['s3_file_test']['secret_key']
mode 0600
owner 'root'
group 'root'
end

5 changes: 5 additions & 0 deletions s3_file/test/integration/default/serverspec/default_spec.rb
@@ -0,0 +1,5 @@
require 'spec_helper'

describe file('/root/kitchen-test') do
it { should be_file }
end
3 changes: 3 additions & 0 deletions s3_file/test/integration/default/serverspec/spec_helper.rb
@@ -0,0 +1,3 @@
require 'serverspec'

set :backend, :exec
