This repository was archived by the owner on Jun 18, 2020. It is now read-only.
4 changes: 2 additions & 2 deletions samples/dynamo-db-export-as-csv/readme.md
@@ -5,6 +5,6 @@ Steps to run the pipeline using the cli.
  1) aws datapipeline create-pipeline --name ddb-backup --unique-id some-unique-id
  => Returns a pipeline-id

- 2) aws datapipeline put-pipeline-definition --pipeline-id <pipeline-id> --pipeline-definition file:///home/user/ddb-to-csv.json
+ 2) aws datapipeline put-pipeline-definition --pipeline-id &lt;pipeline-id&gt; --pipeline-definition file:///home/user/ddb-to-csv.json

- 3) aws datapipeline activate-pipeline --pipeline-id <pipeline-id>
+ 3) aws datapipeline activate-pipeline --pipeline-id &lt;pipeline-id&gt;
22 changes: 14 additions & 8 deletions samples/dynamo-db-export/DynamoDB-export.json
@@ -40,9 +40,9 @@
    "masterInstanceType": "m1.medium",
    "coreInstanceType": "#{myInstanceType}",
    "coreInstanceCount": "#{myInstanceCount}",
-   "region" : "#{myRegion}",
-   "terminateAfter" : "12 hours",
-   "keyPair" : "ramsug-test-desktop"
+   "region" : "#{myRegion}",
+   "terminateAfter" : "12 hours",
+   "keyPair" : "#{myKeyPair}"
  }
],
"parameters": [
@@ -71,15 +71,21 @@
  },
  {
    "description": "Instance Count",
-   "watermark" : " (IOPS / 300) for m1.medium.(IOPS / 1500) for m3.xlarge",
+   "watermark" : " (IOPS / 300) for m1.medium.(IOPS / 1500) for m3.xlarge",
    "id": "myInstanceCount",
    "type": "Integer"
  },
- {
-   "description" : "Region",
-   "watermark" : "Region of DynamoDB Table/EMR cluster",
+ {
+   "description" : "Region",
+   "watermark" : "Region of DynamoDB Table/EMR cluster",
    "id" : "myRegion",
    "type" : "String"
- }
+ },
+ {
+   "description" : "KeyPair",
+   "watermark" : "KeyPair for EC2 instances",
+   "id" : "myKeyPair",
+   "type" : "String"
+ }
  ]
}
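The `myInstanceCount` parameter's watermark above encodes a sizing rule: roughly one m1.medium core node per 300 provisioned IOPS, or one m3.xlarge per 1500. A minimal sketch of that rule, assuming the count is rounded up and never below one (the rounding and the `core_instance_count` helper are illustrative, not part of the samples repo):

```python
import math

# Per-instance DynamoDB read-throughput capacity, taken from the
# parameter watermark: (IOPS / 300) for m1.medium, (IOPS / 1500) for m3.xlarge.
IOPS_PER_INSTANCE = {"m1.medium": 300, "m3.xlarge": 1500}

def core_instance_count(provisioned_iops, instance_type):
    """Suggest a core instance count for the export cluster.

    Rounds up and returns at least 1 (assumption: a fractional
    instance is not possible and the cluster needs one core node).
    """
    per_instance = IOPS_PER_INSTANCE[instance_type]
    return max(1, math.ceil(provisioned_iops / per_instance))

# e.g. a table with 900 provisioned read IOPS on m1.medium nodes:
count = core_instance_count(900, "m1.medium")
```

The resulting number is what you would pass as `myInstanceCount` when putting the pipeline definition.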
3 changes: 2 additions & 1 deletion samples/dynamo-db-export/example-parameters.json
@@ -6,6 +6,7 @@
  "myDDBTableName" : "dynamo-table-name",
  "myInstanceType" : "m1.medium",
  "myInstanceCount" : "1",
- "myRegion" : "eu-west-1"
+ "myRegion" : "eu-west-1",
+ "myKeyPair" : "key-for-ddb-backup"
  }
}
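This is why the hardcoded key pair could be replaced by `#{myKeyPair}`: Data Pipeline substitutes each `#{...}` reference in the definition with the matching entry from the `values` map in this file. A rough sketch of that substitution, assuming unknown references are left untouched (the expansion itself happens inside the service; `resolve` is an illustrative helper, not a repo or AWS function):

```python
import re

def resolve(text, values):
    """Replace #{name} references with entries from a values map.

    Unknown names are left as-is (assumption for illustration;
    the real service validates parameters instead).
    """
    return re.sub(r"#\{(\w+)\}",
                  lambda m: values.get(m.group(1), m.group(0)),
                  text)

# The "values" map mirrors example-parameters.json above.
values = {
    "myRegion": "eu-west-1",
    "myKeyPair": "key-for-ddb-backup",
}
resolved = resolve('"keyPair" : "#{myKeyPair}"', values)
```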
8 changes: 5 additions & 3 deletions samples/dynamo-db-export/readme.md
@@ -2,9 +2,11 @@ This pipeline exports data from a Dynamo DB Table to a S3 location using an EMR

  Steps to run the pipeline using the cli.

- 1) aws datapipeline create-pipeline --name ddb-backup --unique-id some-unique-id
+ 1) aws ec2 create-key-pair --key-name key-for-ddb-backup
+
+ 2) aws datapipeline create-pipeline --name ddb-backup --unique-id some-unique-id
  => Returns a pipeline-id

- 2) aws datapipeline put-pipeline-definition --pipeline-id <pipeline-id> --pipeline-definition file:///home/user/DynamoDB-export.json --parameter-values-uri file:///home/user/example-parameters.json
+ 3) aws datapipeline put-pipeline-definition --pipeline-id &lt;pipeline-id&gt; --pipeline-definition file:///home/user/DynamoDB-export.json --parameter-values-uri file:///home/user/example-parameters.json

- 3) aws datapipeline activate-pipeline --pipeline-id <pipeline-id>
+ 4) aws datapipeline activate-pipeline --pipeline-id &lt;pipeline-id&gt;
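The `--parameter-values-uri` file wraps its entries in a `values` map, while the underlying PutPipelineDefinition API call takes a list of `{"id": ..., "stringValue": ...}` entries. A sketch of that conversion as a pure transform (a boto3 client would perform the actual call; `to_parameter_values` is an illustrative helper, not an AWS CLI or SDK function):

```python
def to_parameter_values(values_doc):
    """Convert a {"values": {...}} parameter file into the
    parameterValues list shape used by PutPipelineDefinition."""
    return [{"id": key, "stringValue": value}
            for key, value in values_doc["values"].items()]

# Mirrors example-parameters.json from this sample.
doc = {"values": {"myInstanceCount": "1",
                  "myKeyPair": "key-for-ddb-backup"}}
param_values = to_parameter_values(doc)
```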
4 changes: 2 additions & 2 deletions samples/dynamo-db-to-redshift/readme.md
@@ -4,6 +4,6 @@ Steps to run the pipeline using the cli.
  1) aws datapipeline create-pipeline --name ddb-backup --unique-id some-unique-id
  => Returns a pipeline-id

- 2) aws datapipeline put-pipeline-definition --pipeline-id <pipeline-id> --pipeline-definition file:///home/user/dynamo-db-to-redshift.json
+ 2) aws datapipeline put-pipeline-definition --pipeline-id &lt;pipeline-id&gt; --pipeline-definition file:///home/user/dynamo-db-to-redshift.json

- 3) aws datapipeline activate-pipeline --pipeline-id <pipeline-id>
+ 3) aws datapipeline activate-pipeline --pipeline-id &lt;pipeline-id&gt;
4 changes: 2 additions & 2 deletions samples/rds-to-rds-copy/readme.md
@@ -7,6 +7,6 @@ Steps to run the pipeline using the cli.
  1) aws datapipeline create-pipeline --name ddb-backup --unique-id some-unique-id
  => Returns a pipeline-id

- 2) aws datapipeline put-pipeline-definition --pipeline-id <pipeline-id> --pipeline-definition file:///home/user/rds-to-rds-copy.json
+ 2) aws datapipeline put-pipeline-definition --pipeline-id &lt;pipeline-id&gt; --pipeline-definition file:///home/user/rds-to-rds-copy.json

- 3) aws datapipeline activate-pipeline --pipeline-id <pipeline-id>
+ 3) aws datapipeline activate-pipeline --pipeline-id &lt;pipeline-id&gt;