
express-gcs-uploader
This is an Express upload plugin that automatically copies uploaded files to Google Cloud Storage. It also offers a choice of download strategies: proxying downloads through your server, or serving files directly from a public bucket URL via the cdn_url option.
Note: The plugin is built on multer; if you want to pass additional options, refer to multer's documentation.
npm install express-gcs-uploader --save
var gcsuplder = require('express-gcs-uploader');
gcsuplder.auth({
  rootdir: __dirname,
  upload_url: '/uploads',
  download_url: '/download',
  tmpFolder: '/tmp',
  cdn_url: 'http://your.bucket.com.tw', // optional: public base URL, e.g. for a public-read GCS bucket
  keep_filename: true, // optional: keep the original file name in the remote bucket
  cache: true, // optional: write a local copy every time a file is read from GCS
  bucket: 'your.bucket.com.tw', // GCS bucket name
  projectId: 'your-project-id', // GCS project id
  keyFilename: '/path/to/your/key.json' // path to your GCS service account key
});
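Because the plugin is built on multer, the note above suggests extra multer options can be mixed into the same object passed to auth(). Whether auth() actually forwards unknown keys to multer is an assumption here, so treat this as a hypothetical sketch; 'limits' itself is a standard multer option.

gcsuplder.auth({
  // ...the options shown above, plus a standard multer option
  // (assumption: auth() forwards extra keys through to multer):
  limits: { fileSize: 5 * 1024 * 1024 } // cap uploads at 5 MB
});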
Each option is described in the comments above. Next, register the download proxy route:
app.use('/downloads/:id', gcsuplder.downloadproxy);
With this route in place, a request such as http://localhost:3000/downloads/e13b6a98a0d50b5990123a83eb87f2a8.png will be served through the proxy. The ":id" segment is the filename returned by the upload.
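Putting the pieces together, a minimal server might look like the sketch below. Only auth() and downloadproxy come from this package; the rest of the wiring is ordinary Express and is illustrative.

var express = require('express');
var gcsuplder = require('express-gcs-uploader');

var app = express();

// configure the uploader (same options as shown earlier)
gcsuplder.auth({
  rootdir: __dirname,
  upload_url: '/uploads',
  download_url: '/download',
  tmpFolder: '/tmp',
  bucket: 'your.bucket.com.tw',
  projectId: 'your-project-id',
  keyFilename: '/path/to/your/key.json'
});

// proxy GET requests for uploaded files through this server
app.use('/downloads/:id', gcsuplder.downloadproxy);

app.listen(3000);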
If you want to use the "cdn_url" option so that a Cloud Storage website bucket acts as your CDN, set a default ACL on the bucket so that uploaded objects are publicly readable; objects can then be served directly from the bucket's website URL. (For details on website buckets, see: https://cloud.google.com/storage/docs/website-configuration)
gsutil defacl ch -u AllUsers:R gs://your.bucket.com.tw
Create a route to receive uploads:
router.post('/uploadtest', function(req, res, next) {
  // the plugin copies the uploaded file to GCS automatically;
  // this handler just acknowledges the request
  res.end('done...');
});
Upload using curl:
curl -F "image=@/Users/peihsinsu/Pictures/pic2.png" http://localhost:3000/uploadtest -X POST
Upload using curl with a subfolder:
curl -F "image=@/Users/peihsinsu/Pictures/pic2.png" http://localhost:3000/uploadtest -X POST -H 'subfolder:myfolder'
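The same upload can be scripted from Node.js. The sketch below uses the form-data npm package, which is an assumption on my part; any multipart/form-data client works.

var FormData = require('form-data');
var fs = require('fs');

var form = new FormData();
form.append('image', fs.createReadStream('/Users/peihsinsu/Pictures/pic2.png'));
form.submit({
  host: 'localhost',
  port: 3000,
  path: '/uploadtest',
  headers: { subfolder: 'myfolder' } // optional: same effect as the curl -H example above
}, function(err, res) {
  if (err) throw err;
  console.log('upload finished with status', res.statusCode);
  res.resume(); // drain the response
});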
Upload using an HTML form:
<form method="post" action="/uploadtest" name="submit" enctype="multipart/form-data">
  <input type="file" name="fileField"><br /><br />
  <input type="submit" name="submit" value="Submit">
</form>
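For a script-driven page, a browser-side equivalent of the form above could use fetch with FormData; the selector and field name here simply mirror the form markup:

var input = document.querySelector('input[type="file"]');
var body = new FormData();
body.append('fileField', input.files[0]); // same field name as the form above
fetch('/uploadtest', { method: 'POST', body: body })
  .then(function(res) { return res.text(); })
  .then(function(text) { console.log(text); }); // prints 'done...'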