Replaced /mybucket/ with amzn-s3-demo-bucket #9220

Open
wants to merge 2 commits into base: develop
2 changes: 1 addition & 1 deletion awscli/examples/cloudformation/_package_description.rst
@@ -40,7 +40,7 @@ For example, if your AWS Lambda function source code is in the
``/home/user/code/lambdafunction/`` folder, specify
``CodeUri: /home/user/code/lambdafunction`` for the
``AWS::Serverless::Function`` resource. The command returns a template and replaces
the local path with the S3 location: ``CodeUri: s3://mybucket/lambdafunction.zip``.
the local path with the S3 location: ``CodeUri: s3://amzn-s3-demo-bucket/lambdafunction.zip``.

If you specify a file, the command directly uploads it to the S3 bucket. If you
specify a folder, the command zips the folder and then uploads the .zip file.
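As a local illustration of the substitution described above (a sketch only; the real upload and rewrite are performed by ``aws cloudformation package``, and the path and bucket names are the placeholders from this example):

```shell
# Illustrative only: mimic how the package command rewrites a local
# CodeUri folder into the S3 location of the uploaded .zip artifact.
local_path="/home/user/code/lambdafunction"
bucket="amzn-s3-demo-bucket"
key="$(basename "$local_path").zip"
echo "CodeUri: s3://$bucket/$key"
# prints: CodeUri: s3://amzn-s3-demo-bucket/lambdafunction.zip
```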
16 changes: 8 additions & 8 deletions awscli/examples/emr/add-steps.rst
@@ -2,7 +2,7 @@

- Command::

aws emr add-steps --cluster-id j-XXXXXXXX --steps Type=CUSTOM_JAR,Name=CustomJAR,ActionOnFailure=CONTINUE,Jar=s3://mybucket/mytest.jar,Args=arg1,arg2,arg3 Type=CUSTOM_JAR,Name=CustomJAR,ActionOnFailure=CONTINUE,Jar=s3://mybucket/mytest.jar,MainClass=mymainclass,Args=arg1,arg2,arg3
aws emr add-steps --cluster-id j-XXXXXXXX --steps Type=CUSTOM_JAR,Name=CustomJAR,ActionOnFailure=CONTINUE,Jar=s3://amzn-s3-demo-bucket/mytest.jar,Args=arg1,arg2,arg3 Type=CUSTOM_JAR,Name=CustomJAR,ActionOnFailure=CONTINUE,Jar=s3://amzn-s3-demo-bucket/mytest.jar,MainClass=mymainclass,Args=arg1,arg2,arg3

- Required parameters::

@@ -25,7 +25,7 @@

- Command::

aws emr add-steps --cluster-id j-XXXXXXXX --steps Type=STREAMING,Name='Streaming Program',ActionOnFailure=CONTINUE,Args=[-files,s3://elasticmapreduce/samples/wordcount/wordSplitter.py,-mapper,wordSplitter.py,-reducer,aggregate,-input,s3://elasticmapreduce/samples/wordcount/input,-output,s3://mybucket/wordcount/output]
aws emr add-steps --cluster-id j-XXXXXXXX --steps Type=STREAMING,Name='Streaming Program',ActionOnFailure=CONTINUE,Args=[-files,s3://elasticmapreduce/samples/wordcount/wordSplitter.py,-mapper,wordSplitter.py,-reducer,aggregate,-input,s3://elasticmapreduce/samples/wordcount/input,-output,s3://amzn-s3-demo-bucket/wordcount/output]

- Required parameters::

@@ -40,7 +40,7 @@
[
{
"Name": "JSON Streaming Step",
"Args": ["-files","s3://elasticmapreduce/samples/wordcount/wordSplitter.py","-mapper","wordSplitter.py","-reducer","aggregate","-input","s3://elasticmapreduce/samples/wordcount/input","-output","s3://mybucket/wordcount/output"],
"Args": ["-files","s3://elasticmapreduce/samples/wordcount/wordSplitter.py","-mapper","wordSplitter.py","-reducer","aggregate","-input","s3://elasticmapreduce/samples/wordcount/input","-output","s3://amzn-s3-demo-bucket/wordcount/output"],
"ActionOnFailure": "CONTINUE",
"Type": "STREAMING"
}
@@ -72,15 +72,15 @@ NOTE: JSON arguments must include options and values as their own items in the list
"ActionOnFailure": "CONTINUE",
"Args": [
"-files",
"s3://mybucket/mapper.py,s3://mybucket/reducer.py",
"s3://amzn-s3-demo-bucket/mapper.py,s3://amzn-s3-demo-bucket/reducer.py",
"-mapper",
"mapper.py",
"-reducer",
"reducer.py",
"-input",
"s3://mybucket/input",
"s3://amzn-s3-demo-bucket/input",
"-output",
"s3://mybucket/output"]
"s3://amzn-s3-demo-bucket/output"]
}
]

@@ -109,7 +109,7 @@ NOTE: JSON arguments must include options and values as their own items in the list

- Command::

aws emr add-steps --cluster-id j-XXXXXXXX --steps Type=HIVE,Name='Hive program',ActionOnFailure=CONTINUE,Args=[-f,s3://mybucket/myhivescript.q,-d,INPUT=s3://mybucket/myhiveinput,-d,OUTPUT=s3://mybucket/myhiveoutput,arg1,arg2] Type=HIVE,Name='Hive steps',ActionOnFailure=TERMINATE_CLUSTER,Args=[-f,s3://elasticmapreduce/samples/hive-ads/libs/model-build.q,-d,INPUT=s3://elasticmapreduce/samples/hive-ads/tables,-d,OUTPUT=s3://mybucket/hive-ads/output/2014-04-18/11-07-32,-d,LIBS=s3://elasticmapreduce/samples/hive-ads/libs]
aws emr add-steps --cluster-id j-XXXXXXXX --steps Type=HIVE,Name='Hive program',ActionOnFailure=CONTINUE,Args=[-f,s3://amzn-s3-demo-bucket/myhivescript.q,-d,INPUT=s3://amzn-s3-demo-bucket/myhiveinput,-d,OUTPUT=s3://amzn-s3-demo-bucket/myhiveoutput,arg1,arg2] Type=HIVE,Name='Hive steps',ActionOnFailure=TERMINATE_CLUSTER,Args=[-f,s3://elasticmapreduce/samples/hive-ads/libs/model-build.q,-d,INPUT=s3://elasticmapreduce/samples/hive-ads/tables,-d,OUTPUT=s3://amzn-s3-demo-bucket/hive-ads/output/2014-04-18/11-07-32,-d,LIBS=s3://elasticmapreduce/samples/hive-ads/libs]


- Required parameters::
@@ -134,7 +134,7 @@ NOTE: JSON arguments must include options and values as their own items in the list

- Command::

aws emr add-steps --cluster-id j-XXXXXXXX --steps Type=PIG,Name='Pig program',ActionOnFailure=CONTINUE,Args=[-f,s3://mybucket/mypigscript.pig,-p,INPUT=s3://mybucket/mypiginput,-p,OUTPUT=s3://mybucket/mypigoutput,arg1,arg2] Type=PIG,Name='Pig program',Args=[-f,s3://elasticmapreduce/samples/pig-apache/do-reports2.pig,-p,INPUT=s3://elasticmapreduce/samples/pig-apache/input,-p,OUTPUT=s3://mybucket/pig-apache/output,arg1,arg2]
aws emr add-steps --cluster-id j-XXXXXXXX --steps Type=PIG,Name='Pig program',ActionOnFailure=CONTINUE,Args=[-f,s3://amzn-s3-demo-bucket/mypigscript.pig,-p,INPUT=s3://amzn-s3-demo-bucket/mypiginput,-p,OUTPUT=s3://amzn-s3-demo-bucket/mypigoutput,arg1,arg2] Type=PIG,Name='Pig program',Args=[-f,s3://elasticmapreduce/samples/pig-apache/do-reports2.pig,-p,INPUT=s3://elasticmapreduce/samples/pig-apache/input,-p,OUTPUT=s3://amzn-s3-demo-bucket/pig-apache/output,arg1,arg2]


- Required parameters::
10 changes: 5 additions & 5 deletions awscli/examples/emr/create-cluster-examples.rst
@@ -369,7 +369,7 @@ The following ``create-cluster`` examples add a streaming step to a cluster that
The following example specifies the step inline. ::

aws emr create-cluster \
--steps Type=STREAMING,Name='Streaming Program',ActionOnFailure=CONTINUE,Args=[-files,s3://elasticmapreduce/samples/wordcount/wordSplitter.py,-mapper,wordSplitter.py,-reducer,aggregate,-input,s3://elasticmapreduce/samples/wordcount/input,-output,s3://mybucket/wordcount/output] \
--steps Type=STREAMING,Name='Streaming Program',ActionOnFailure=CONTINUE,Args=[-files,s3://elasticmapreduce/samples/wordcount/wordSplitter.py,-mapper,wordSplitter.py,-reducer,aggregate,-input,s3://elasticmapreduce/samples/wordcount/input,-output,s3://amzn-s3-demo-bucket/wordcount/output] \
--release-label emr-5.3.1 \
--instance-groups InstanceGroupType=MASTER,InstanceCount=1,InstanceType=m4.large InstanceGroupType=CORE,InstanceCount=2,InstanceType=m4.large \
--auto-terminate
@@ -397,7 +397,7 @@ Contents of ``multiplefiles.json``::
"-input",
"s3://elasticmapreduce/samples/wordcount/input",
"-output",
"s3://mybucket/wordcount/output"
"s3://amzn-s3-demo-bucket/wordcount/output"
],
"ActionOnFailure": "CONTINUE",
"Type": "STREAMING"
@@ -409,7 +409,7 @@ Contents of ``multiplefiles.json``::
The following example adds Hive steps when creating a cluster. Hive steps require the ``Type`` and ``Args`` parameters; ``Name`` and ``ActionOnFailure`` are optional. ::

aws emr create-cluster \
--steps Type=HIVE,Name='Hive program',ActionOnFailure=CONTINUE,ActionOnFailure=TERMINATE_CLUSTER,Args=[-f,s3://elasticmapreduce/samples/hive-ads/libs/model-build.q,-d,INPUT=s3://elasticmapreduce/samples/hive-ads/tables,-d,OUTPUT=s3://mybucket/hive-ads/output/2014-04-18/11-07-32,-d,LIBS=s3://elasticmapreduce/samples/hive-ads/libs] \
--steps Type=HIVE,Name='Hive program',ActionOnFailure=TERMINATE_CLUSTER,Args=[-f,s3://elasticmapreduce/samples/hive-ads/libs/model-build.q,-d,INPUT=s3://elasticmapreduce/samples/hive-ads/tables,-d,OUTPUT=s3://amzn-s3-demo-bucket/hive-ads/output/2014-04-18/11-07-32,-d,LIBS=s3://elasticmapreduce/samples/hive-ads/libs] \
--applications Name=Hive \
--release-label emr-5.3.1 \
--instance-groups InstanceGroupType=MASTER,InstanceCount=1,InstanceType=m4.large InstanceGroupType=CORE,InstanceCount=2,InstanceType=m4.large
@@ -419,7 +419,7 @@ The following example add Hive steps when creating a cluster. Hive steps require
The following example adds Pig steps when creating a cluster. Pig steps require the ``Type`` and ``Args`` parameters; ``Name`` and ``ActionOnFailure`` are optional. ::

aws emr create-cluster \
--steps Type=PIG,Name='Pig program',ActionOnFailure=CONTINUE,Args=[-f,s3://elasticmapreduce/samples/pig-apache/do-reports2.pig,-p,INPUT=s3://elasticmapreduce/samples/pig-apache/input,-p,OUTPUT=s3://mybucket/pig-apache/output] \
--steps Type=PIG,Name='Pig program',ActionOnFailure=CONTINUE,Args=[-f,s3://elasticmapreduce/samples/pig-apache/do-reports2.pig,-p,INPUT=s3://elasticmapreduce/samples/pig-apache/input,-p,OUTPUT=s3://amzn-s3-demo-bucket/pig-apache/output] \
--applications Name=Pig \
--release-label emr-5.3.1 \
--instance-groups InstanceGroupType=MASTER,InstanceCount=1,InstanceType=m4.large InstanceGroupType=CORE,InstanceCount=2,InstanceType=m4.large
@@ -429,7 +429,7 @@ The following example adds Pig steps when creating a cluster. Pig steps required
The following ``create-cluster`` example runs two bootstrap actions defined as scripts that are stored in Amazon S3. ::

aws emr create-cluster \
--bootstrap-actions Path=s3://mybucket/myscript1,Name=BootstrapAction1,Args=[arg1,arg2] Path=s3://mybucket/myscript2,Name=BootstrapAction2,Args=[arg1,arg2] \
--bootstrap-actions Path=s3://amzn-s3-demo-bucket/myscript1,Name=BootstrapAction1,Args=[arg1,arg2] Path=s3://amzn-s3-demo-bucket/myscript2,Name=BootstrapAction2,Args=[arg1,arg2] \
--release-label emr-5.3.1 \
--instance-groups InstanceGroupType=MASTER,InstanceCount=1,InstanceType=m4.large InstanceGroupType=CORE,InstanceCount=2,InstanceType=m4.large \
--auto-terminate
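Each ``Path`` above points at a script stored in S3 that EMR downloads and runs on every node, passing through the values in ``Args``. A minimal, entirely hypothetical sketch of what such a script (``myscript1``) might contain:

```shell
#!/bin/bash
# Hypothetical bootstrap script: record the arguments that EMR
# passes through from Args=[arg1,arg2] when each node provisions.
set -eu
printf 'bootstrap ran with args: %s\n' "$*"
```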
44 changes: 22 additions & 22 deletions awscli/examples/rds/cancel-export-task.rst
@@ -1,23 +1,23 @@
**To cancel a snapshot export to Amazon S3**

The following ``cancel-export-task`` example cancels an export task in progress that is exporting a snapshot to Amazon S3. ::

aws rds cancel-export-task \
--export-task-identifier my-s3-export-1

Output::

{
"ExportTaskIdentifier": "my-s3-export-1",
"SourceArn": "arn:aws:rds:us-east-1:123456789012:snapshot:publisher-final-snapshot",
"SnapshotTime": "2019-03-24T20:01:09.815Z",
"S3Bucket": "mybucket",
"S3Prefix": "",
"IamRoleArn": "arn:aws:iam::123456789012:role/service-role/export-snap-S3-role",
"KmsKeyId": "arn:aws:kms:us-east-1:123456789012:key/abcd0000-7bfd-4594-af38-aabbccddeeff",
"Status": "CANCELING",
"PercentProgress": 0,
"TotalExtractedDataInGB": 0
}
**To cancel a snapshot export to Amazon S3**

The following ``cancel-export-task`` example cancels an export task in progress that is exporting a snapshot to Amazon S3. ::

aws rds cancel-export-task \
--export-task-identifier my-s3-export-1

Output::

{
"ExportTaskIdentifier": "my-s3-export-1",
"SourceArn": "arn:aws:rds:us-east-1:123456789012:snapshot:publisher-final-snapshot",
"SnapshotTime": "2019-03-24T20:01:09.815Z",
"S3Bucket": "amzn-s3-demo-bucket",
"S3Prefix": "",
"IamRoleArn": "arn:aws:iam::123456789012:role/service-role/export-snap-S3-role",
"KmsKeyId": "arn:aws:kms:us-east-1:123456789012:key/abcd0000-7bfd-4594-af38-aabbccddeeff",
"Status": "CANCELING",
"PercentProgress": 0,
"TotalExtractedDataInGB": 0
}

For more information, see `Canceling a snapshot export task <https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_ExportSnapshot.html#USER_ExportSnapshot.Canceling>`__ in the *Amazon RDS User Guide* or `Canceling a snapshot export task <https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/USER_ExportSnapshot.html#USER_ExportSnapshot.Canceling>`__ in the *Amazon Aurora User Guide*.
80 changes: 40 additions & 40 deletions awscli/examples/rds/describe-export-tasks.rst
@@ -1,40 +1,40 @@
**To describe snapshot export tasks**

The following ``describe-export-tasks`` example returns information about snapshot exports to Amazon S3. ::

aws rds describe-export-tasks

Output::

{
"ExportTasks": [
{
"ExportTaskIdentifier": "test-snapshot-export",
"SourceArn": "arn:aws:rds:us-west-2:123456789012:snapshot:test-snapshot",
"SnapshotTime": "2020-03-02T18:26:28.163Z",
"TaskStartTime": "2020-03-02T18:57:56.896Z",
"TaskEndTime": "2020-03-02T19:10:31.985Z",
"S3Bucket": "mybucket",
"S3Prefix": "",
"IamRoleArn": "arn:aws:iam::123456789012:role/service-role/ExportRole",
"KmsKeyId": "arn:aws:kms:us-west-2:123456789012:key/abcd0000-7fca-4128-82f2-aabbccddeeff",
"Status": "COMPLETE",
"PercentProgress": 100,
"TotalExtractedDataInGB": 0
},
{
"ExportTaskIdentifier": "my-s3-export",
"SourceArn": "arn:aws:rds:us-west-2:123456789012:snapshot:db5-snapshot-test",
"SnapshotTime": "2020-03-27T20:48:42.023Z",
"S3Bucket": "mybucket",
"S3Prefix": "",
"IamRoleArn": "arn:aws:iam::123456789012:role/service-role/ExportRole",
"KmsKeyId": "arn:aws:kms:us-west-2:123456789012:key/abcd0000-7fca-4128-82f2-aabbccddeeff",
"Status": "STARTING",
"PercentProgress": 0,
"TotalExtractedDataInGB": 0
}
]
}

For more information, see `Monitoring Snapshot Exports <https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_ExportSnapshot.html#USER_ExportSnapshot.Monitoring>`__ in the *Amazon RDS User Guide*.
**To describe snapshot export tasks**

The following ``describe-export-tasks`` example returns information about snapshot exports to Amazon S3. ::

aws rds describe-export-tasks

Output::

{
"ExportTasks": [
{
"ExportTaskIdentifier": "test-snapshot-export",
"SourceArn": "arn:aws:rds:us-west-2:123456789012:snapshot:test-snapshot",
"SnapshotTime": "2020-03-02T18:26:28.163Z",
"TaskStartTime": "2020-03-02T18:57:56.896Z",
"TaskEndTime": "2020-03-02T19:10:31.985Z",
"S3Bucket": "amzn-s3-demo-bucket",
"S3Prefix": "",
"IamRoleArn": "arn:aws:iam::123456789012:role/service-role/ExportRole",
"KmsKeyId": "arn:aws:kms:us-west-2:123456789012:key/abcd0000-7fca-4128-82f2-aabbccddeeff",
"Status": "COMPLETE",
"PercentProgress": 100,
"TotalExtractedDataInGB": 0
},
{
"ExportTaskIdentifier": "my-s3-export",
"SourceArn": "arn:aws:rds:us-west-2:123456789012:snapshot:db5-snapshot-test",
"SnapshotTime": "2020-03-27T20:48:42.023Z",
"S3Bucket": "amzn-s3-demo-bucket",
"S3Prefix": "",
"IamRoleArn": "arn:aws:iam::123456789012:role/service-role/ExportRole",
"KmsKeyId": "arn:aws:kms:us-west-2:123456789012:key/abcd0000-7fca-4128-82f2-aabbccddeeff",
"Status": "STARTING",
"PercentProgress": 0,
"TotalExtractedDataInGB": 0
}
]
}

For more information, see `Monitoring Snapshot Exports <https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_ExportSnapshot.html#USER_ExportSnapshot.Monitoring>`__ in the *Amazon RDS User Guide*.