
Commit

Update README.md
taupirho authored Jun 21, 2018
1 parent 5bf48e9 commit 2eaf2b9
Showing 1 changed file with 1 addition and 4 deletions.
README.md: 5 changes (1 addition & 4 deletions)
@@ -17,10 +17,7 @@ After the stream was created it was just a matter of creating two Python lambdas
 and I have put plenty of comments in, so I won't discuss them further here. A slightly unusual feature is that neither lambda is triggered
 by an event - although they can be, and usually will be. They are stand-alone and can be run manually as and when required, or more likely as
 part of an AWS Step Functions process __(see my article on using Step Functions [here](https://github.com/taupirho/using-aws-step))__.
-I haven't included any error/retry processing in my examples, but in production you obviously would include this. Also, for asynchronous -
-i.e. event-based - running, you would set up DLQs for the reading/writing processes to send failed messages to, either an SNS topic or
-SQS, for further investigation and/or processing. Keep an eye on the DeadLetterErrors CloudWatch metric though, as writes
-to DLQs can fail too! The only other thing to note is that the lambdas obviously need permission to read and write to Kinesis. I took the
+The only other thing to note is that the lambdas obviously need permission to read and write to Kinesis. I took the
 easy option and extended the default lambda-execution-role to allow all access to Kinesis, but again, in a production system you would want
 to nail this down to very specific permissions.
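As a rough illustration of the read/write pattern the README describes (this is not the repository's actual lambda code; the stream name, partition key and payload below are placeholders), the two handlers might look something like this with boto3:

```python
import json
import boto3

# Placeholder name - the real lambdas in the repo use their own stream name.
STREAM_NAME = "my-example-stream"

kinesis = boto3.client("kinesis")


def writer_handler(event, context):
    """Write a single record to the Kinesis stream."""
    response = kinesis.put_record(
        StreamName=STREAM_NAME,
        Data=json.dumps({"message": "hello from lambda"}),
        PartitionKey="example-partition-key",
    )
    return response["SequenceNumber"]


def reader_handler(event, context):
    """Read whatever records are currently available from the first shard."""
    shard_id = kinesis.describe_stream(StreamName=STREAM_NAME)[
        "StreamDescription"]["Shards"][0]["ShardId"]
    iterator = kinesis.get_shard_iterator(
        StreamName=STREAM_NAME,
        ShardId=shard_id,
        ShardIteratorType="TRIM_HORIZON",
    )["ShardIterator"]
    records = kinesis.get_records(ShardIterator=iterator, Limit=100)["Records"]
    return [record["Data"].decode("utf-8") for record in records]
```

Because neither handler depends on the incoming event, both can be invoked manually or chained from an AWS Step Functions state machine, as the README notes.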

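The paragraph removed by this commit talked about wiring up DLQs for event-based invocation and keeping an eye on the DeadLetterErrors metric. A minimal sketch of that setup, assuming an SQS queue already exists and using made-up function and queue names, could be:

```python
import boto3

lambda_client = boto3.client("lambda")
cloudwatch = boto3.client("cloudwatch")

# Hypothetical names - substitute your own function name and queue ARN.
FUNCTION_NAME = "kinesis-reader-lambda"
DLQ_ARN = "arn:aws:sqs:us-east-1:123456789012:kinesis-lambda-dlq"

# Send failed asynchronous invocations to the SQS dead-letter queue.
# Note: the execution role also needs sqs:SendMessage on this queue.
lambda_client.update_function_configuration(
    FunctionName=FUNCTION_NAME,
    DeadLetterConfig={"TargetArn": DLQ_ARN},
)

# Alarm on DeadLetterErrors, since writes to the DLQ can themselves fail.
cloudwatch.put_metric_alarm(
    AlarmName="kinesis-reader-dlq-write-failures",
    Namespace="AWS/Lambda",
    MetricName="DeadLetterErrors",
    Dimensions=[{"Name": "FunctionName", "Value": FUNCTION_NAME}],
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=0,
    ComparisonOperator="GreaterThanThreshold",
)
```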
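On the permissions point, one way to tighten the blanket Kinesis access mentioned above is an inline policy scoped to the specific stream and to only the actions the lambdas need. The role, policy and stream names here are invented for the sketch:

```python
import json
import boto3

iam = boto3.client("iam")

# Made-up names - replace with the execution role and stream your lambdas use.
ROLE_NAME = "lambda-execution-role"
STREAM_ARN = "arn:aws:kinesis:us-east-1:123456789012:stream/my-example-stream"

# Allow only the read/write actions the lambdas actually need, on one stream,
# instead of kinesis:* on all resources.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "kinesis:PutRecord",
                "kinesis:PutRecords",
                "kinesis:GetRecords",
                "kinesis:GetShardIterator",
                "kinesis:DescribeStream",
            ],
            "Resource": STREAM_ARN,
        }
    ],
}

iam.put_role_policy(
    RoleName=ROLE_NAME,
    PolicyName="kinesis-read-write-single-stream",
    PolicyDocument=json.dumps(policy),
)
```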
