
# @dotcom-tool-kit/upload-assets-to-s3
Upload files to a configured AWS S3 bucket.
Install `@dotcom-tool-kit/upload-assets-to-s3` as a devDependency in your app:

```shell
npm install --save-dev @dotcom-tool-kit/upload-assets-to-s3
```
Add the plugin to your Tool Kit configuration:

```yaml
plugins:
  - '@dotcom-tool-kit/upload-assets-to-s3'
```
## UploadAssetsToS3

Upload files to an AWS S3 bucket.
| Property | Description | Type | Default |
|---|---|---|---|
| `accessKeyIdEnvVar` | Name of the environment variable holding the project's AWS access key ID. If uploading to multiple buckets, the same credentials must work for all of them. | `string` | `'AWS_ACCESS_HASHED_ASSETS'` |
| `secretAccessKeyEnvVar` | Name of the environment variable holding the project's AWS secret access key. | `string` | `'AWS_SECRET_HASHED_ASSETS'` |
| `directory` | The folder in the project whose contents will be uploaded to S3. | `string` | `'public'` |
| `reviewBucket` | The development or test S3 bucket(s). | `Array<string>` | `["ft-next-hashed-assets-preview"]` |
| `prodBucket` | Production S3 bucket(s). The same files will be uploaded to each. Note: most Customer Products buckets that have a prod and prod-us version are already configured in AWS to replicate file changes from one to the other, so you don't need to specify both here. Also, if multiple buckets are specified, the same credentials must be valid for all of them for the upload to succeed. | `Array<string>` | `["ft-next-hashed-assets-prod"]` |
| `region` | The AWS region your buckets are stored in (let the Platforms team know if you need to upload to buckets in multiple regions). | `string` | `'eu-west-1'` |
| `destination` | The destination folder for uploaded assets. Set to `''` to upload assets to the top level of the bucket. | `string` | `'hashed-assets/page-kit'` |
| `extensions` | File extensions to be uploaded to S3. | `string` | `'js,css,map,gz,br,png,jpg,jpeg,gif,webp,svg,ico,json'` |
| `cacheControl` | `Cache-Control` header value, controlling how long your files stay in a CloudFront cache before CloudFront forwards another request to your origin. | `string` | `'public, max-age=31536000, stale-while-revalidate=60, stale-if-error=3600'` |
All properties are optional.
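As a sketch of how these defaults might be overridden, the options from the table above could be set in `.toolkitrc.yml` along these lines (the `directory`, `destination`, and `extensions` values here are hypothetical, and the exact shape of the options block may vary between Tool Kit versions — check your project's existing config):

```yaml
# .toolkitrc.yml — hypothetical overrides; any option omitted here
# falls back to the default shown in the table above.
plugins:
  - '@dotcom-tool-kit/upload-assets-to-s3'
options:
  '@dotcom-tool-kit/upload-assets-to-s3':
    directory: 'dist'                    # upload build output instead of 'public'
    destination: 'hashed-assets/my-app'  # hypothetical destination folder
    extensions: 'js,css,map'             # only upload these file types
```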
You can test uploads to your review bucket locally to check that you are happy with the configuration. To do this, set your `NODE_ENV` to `branch`:

```shell
$ export NODE_ENV=branch
```
If the AWS key names for accessing the review bucket are different from those for the prod bucket, update them in your `.toolkitrc.yml`.
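For example, if the review bucket uses its own credentials, the env var name options could be pointed at them (the variable names below are hypothetical, and the exact shape of the options block may vary between Tool Kit versions):

```yaml
# Hypothetical override: read review-bucket credentials from
# differently named environment variables.
options:
  '@dotcom-tool-kit/upload-assets-to-s3':
    accessKeyIdEnvVar: 'AWS_ACCESS_REVIEW_ASSETS'      # hypothetical name
    secretAccessKeyEnvVar: 'AWS_SECRET_REVIEW_ASSETS'  # hypothetical name
```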
The `UploadAssetsToS3` task can run on any hook, so you can configure it to run on a local hook to test deployment from the command line. For example, it could be added to your `build:local` hook as follows:

```yaml
plugins:
  - '@dotcom-tool-kit/webpack'
  - '@dotcom-tool-kit/upload-assets-to-s3'
hooks:
  'build:local':
    - WebpackDevelopment
    - UploadAssetsToS3
```
Then running `npm run build` will run the `UploadAssetsToS3` task on your review bucket.
| Name | Description | Preconfigured Hook |
|---|---|---|
| `UploadAssetsToS3` | Uploads provided files to a given S3 bucket | `release:remote` |