What is @aws-cdk/aws-s3?
@aws-cdk/aws-s3 is an AWS Cloud Development Kit (CDK) library that allows you to define Amazon S3 buckets and related resources using code. This package provides a high-level, object-oriented abstraction to create and manage S3 buckets, configure bucket policies, set up event notifications, and more.
What are @aws-cdk/aws-s3's main functionalities?
Create an S3 Bucket
This code sample demonstrates how to create a versioned S3 bucket with a removal policy that destroys the bucket when the stack is deleted.
const s3 = require('@aws-cdk/aws-s3');
const cdk = require('@aws-cdk/core');

class MyStack extends cdk.Stack {
  constructor(scope, id, props) {
    super(scope, id, props);
    new s3.Bucket(this, 'MyFirstBucket', {
      versioned: true,
      removalPolicy: cdk.RemovalPolicy.DESTROY,
    });
  }
}

const app = new cdk.App();
new MyStack(app, 'MyStack');
Add Bucket Policy
This code sample shows how to add a bucket policy to an S3 bucket, allowing any principal to perform the 's3:GetObject' action on all objects in the bucket.
const s3 = require('@aws-cdk/aws-s3');
const iam = require('@aws-cdk/aws-iam');
const cdk = require('@aws-cdk/core');

class MyStack extends cdk.Stack {
  constructor(scope, id, props) {
    super(scope, id, props);
    const bucket = new s3.Bucket(this, 'MyBucket');
    // Allow any principal to read all objects in the bucket
    bucket.addToResourcePolicy(new iam.PolicyStatement({
      actions: ['s3:GetObject'],
      resources: [bucket.arnForObjects('*')],
      principals: [new iam.AnyPrincipal()],
    }));
  }
}

const app = new cdk.App();
new MyStack(app, 'MyStack');
Enable Event Notifications
This code sample demonstrates how to enable event notifications for an S3 bucket. It sets up a notification to an SNS topic whenever an object is created in the bucket.
const s3 = require('@aws-cdk/aws-s3');
const cdk = require('@aws-cdk/core');
const sns = require('@aws-cdk/aws-sns');
const s3n = require('@aws-cdk/aws-s3-notifications');

class MyStack extends cdk.Stack {
  constructor(scope, id, props) {
    super(scope, id, props);
    const bucket = new s3.Bucket(this, 'MyBucket');
    const topic = new sns.Topic(this, 'MyTopic');
    bucket.addEventNotification(s3.EventType.OBJECT_CREATED, new s3n.SnsDestination(topic));
  }
}

const app = new cdk.App();
new MyStack(app, 'MyStack');
Other packages similar to @aws-cdk/aws-s3
aws-sdk
The aws-sdk package is the official AWS SDK for JavaScript, which provides low-level APIs for interacting with AWS services, including S3. Unlike @aws-cdk/aws-s3, which is used for defining infrastructure as code, aws-sdk is used for making API calls to AWS services at runtime.
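For contrast, here is a minimal sketch of what runtime access with aws-sdk looks like (assumptions: the AWS SDK for JavaScript v2, credentials and region already configured, and 'my-existing-bucket' as a placeholder bucket name):
const AWS = require('aws-sdk');

// Runtime API call against an existing bucket, executed when the script runs
// rather than at deployment time.
const s3Client = new AWS.S3();
s3Client.listObjectsV2({ Bucket: 'my-existing-bucket', MaxKeys: 10 }).promise()
  .then(res => res.Contents.forEach(obj => console.log(obj.Key)))
  .catch(err => console.error(err));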
serverless
The serverless package is a framework for building and deploying serverless applications on AWS and other cloud providers. It allows you to define S3 buckets and other resources in a serverless.yml configuration file. While it provides similar functionalities, it is more focused on serverless architectures and deployments.
terraform
Terraform is an open-source infrastructure as code software tool that provides a consistent CLI workflow to manage hundreds of cloud services. It allows you to define S3 buckets and other AWS resources using HashiCorp Configuration Language (HCL). Terraform is cloud-agnostic and can manage resources across multiple cloud providers.
AWS S3 Construct Library
Define an unencrypted S3 bucket.
new Bucket(this, 'MyFirstBucket');
Bucket constructs expose the following deploy-time attributes (a short usage sketch follows the list):
- bucketArn - the ARN of the bucket (i.e. arn:aws:s3:::bucket_name)
- bucketName - the name of the bucket (i.e. bucket_name)
- bucketUrl - the URL of the bucket (i.e. https://s3.us-west-1.amazonaws.com/onlybucket)
- arnForObjects(...pattern) - the ARN of an object or objects within the bucket (i.e. arn:aws:s3:::my_corporate_bucket/exampleobject.png or arn:aws:s3:::my_corporate_bucket/Development/*)
- urlForObject(key) - the URL of an object within the bucket (i.e. https://s3.cn-north-1.amazonaws.com.cn/china-bucket/mykey)
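A minimal sketch of how these attributes can be used inside a stack (the 'Development/*' pattern and 'mykey' key are placeholders taken from the examples above):
const bucket = new Bucket(this, 'MyBucket');

// These are deploy-time tokens: they resolve to the real strings during
// deployment, so they can be passed to other constructs or used in policies.
const bucketArn = bucket.bucketArn;
const devObjectsArn = bucket.arnForObjects('Development/*');
const myKeyUrl = bucket.urlForObject('mykey');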
Encryption
Define a KMS-encrypted bucket:
const bucket = new Bucket(this, 'MyUnencryptedBucket', {
  encryption: BucketEncryption.Kms
});
assert(bucket.encryptionKey instanceof kms.EncryptionKey);
You can also supply your own key:
const myKmsKey = new kms.EncryptionKey(this, 'MyKey');
const bucket = new Bucket(this, 'MyEncryptedBucket', {
  encryption: BucketEncryption.Kms,
  encryptionKey: myKmsKey
});
assert(bucket.encryptionKey === myKmsKey);
Use BucketEncryption.ManagedKms to use the S3 master KMS key:
const bucket = new Bucket(this, 'Buck', {
  encryption: BucketEncryption.ManagedKms
});
assert(bucket.encryptionKey == null);
Bucket Policy
By default, a bucket policy will be automatically created for the bucket upon the first call to addToPolicy(statement):
const bucket = new Bucket(this, 'MyBucket');
bucket.addToPolicy(statement);
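Here, statement is an IAM policy statement. A minimal sketch of one way to build it, using the props-style iam.PolicyStatement shown earlier on this page (the exact construction API depends on your @aws-cdk/aws-iam version, so treat this as an assumption):
// Sketch only: assumes @aws-cdk/aws-iam is imported as iam and supports the
// props-style constructor used in the earlier example on this page.
const statement = new iam.PolicyStatement({
  actions: ['s3:GetObject'],
  resources: [bucket.arnForObjects('*')],
  principals: [new iam.AnyPrincipal()],
});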
You can bring your own policy as well:
const policy = new BucketPolicy(this, 'MyBucketPolicy');
const bucket = new Bucket(this, 'MyBucket', { policy });
Buckets as sources in CodePipeline
This package also defines an Action that allows you to use a Bucket as a source in CodePipeline:
import codepipeline = require('@aws-cdk/aws-codepipeline');
import s3 = require('@aws-cdk/aws-s3');
const sourceBucket = new s3.Bucket(this, 'MyBucket', {
  versioned: true,
});
const pipeline = new codepipeline.Pipeline(this, 'MyPipeline');
const sourceStage = new codepipeline.Stage(this, 'Source', {
  pipeline,
});
const sourceAction = new s3.PipelineSource(this, 'S3Source', {
  stage: sourceStage,
  bucket: sourceBucket,
  bucketKey: 'path/to/file.zip',
  artifactName: 'SourceOutput',
});
You can also add the Bucket to the Pipeline directly:
const sourceAction = sourceBucket.addToPipeline(sourceStage, 'CodeCommit', {
  bucketKey: 'path/to/file.zip',
  artifactName: 'SourceOutput',
});
Importing and Exporting Buckets
You can create a Bucket construct that represents an external/existing/unowned bucket by using the Bucket.import factory method. This method accepts an object that adheres to BucketRef, which is essentially a set of tokens for the bucket's attributes. This means that you can define a BucketRef using token literals:
const bucket = Bucket.import(this, {
  bucketArn: new BucketArn('arn:aws:s3:::my-bucket')
});
bucket.grantReadWrite(user);
The bucket.export() method can be used to "export" the bucket from the current stack. It returns a BucketRef object that can later be used in a call to Bucket.import in another stack.
Here's an example. Let's define a stack with an S3 bucket and export it using bucket.export().
class Producer extends Stack {
  public readonly myBucketRef: BucketRef;

  constructor(parent: App, name: string) {
    super(parent, name);
    const bucket = new Bucket(this, 'MyBucket');
    this.myBucketRef = bucket.export();
  }
}
Now let's define a stack that requires a BucketRef as an input and uses Bucket.import to create a Bucket object that represents this external bucket. We then grant a user principal created within this consuming stack read/write permissions to this bucket and its contents.
interface ConsumerProps {
  userBucketRef: BucketRef;
}

class Consumer extends Stack {
  constructor(parent: App, name: string, props: ConsumerProps) {
    super(parent, name);
    const user = new User(this, 'MyUser');
    const userBucket = Bucket.import(this, props.userBucketRef);
    userBucket.grantReadWrite(user);
  }
}
Now, let's define our CDK app to bind these together:
const app = new App(process.argv);
const producer = new Producer(app, 'produce');
new Consumer(app, 'consume', {
  userBucketRef: producer.myBucketRef
});
process.stdout.write(app.run());
Bucket Notifications
The Amazon S3 notification feature enables you to receive notifications when
certain events happen in your bucket as described under S3 Bucket
Notifications of the S3 Developer Guide.
To subscribe for bucket notifications, use the bucket.onEvent method. The bucket.onObjectCreated and bucket.onObjectRemoved methods can also be used for these common use cases.
The following example will subscribe an SNS topic to be notified of all s3:ObjectCreated:* events:
const myTopic = new sns.Topic(this, 'MyTopic');
bucket.onEvent(s3.EventType.ObjectCreated, myTopic);
This call will also ensure that the topic policy can accept notifications for
this specific bucket.
The following destinations are currently supported (a short sketch follows the list):
- sns.Topic
- sqs.Queue
- lambda.Function
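As a quick sketch of the onObjectCreated shorthand with an SQS destination (assumptions: onObjectCreated accepts a destination the same way onEvent does, @aws-cdk/aws-sqs is imported as sqs, and 'MyQueue' is a placeholder id):
const myQueue = new sqs.Queue(this, 'MyQueue');
// Notify the queue whenever any object is created in the bucket.
bucket.onObjectCreated(myQueue);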
It is also possible to specify S3 object key filters when subscribing. The following example will notify myQueue when objects with the foo/ prefix and the .jpg suffix are removed from the bucket:
bucket.onEvent(s3.EventType.ObjectRemoved, myQueue, { prefix: 'foo/', suffix: '.jpg' });