express-gcs-uploader
This is an Express upload plugin that automatically copies uploaded data to Google Cloud Storage. It also offers a choice of download strategies: proxying files back through your Express server (with optional local caching), or serving them directly from a publicly readable bucket / CDN-style URL.
Note: the plugin is based on multer; if you want to pass additional options, refer to multer's option documentation.
npm install express-gcs-uploader --save
var gcsuplder = require('express-gcs-uploader');
gcsuplder.auth({
rootdir: __dirname,
upload_url: '/uploads',
download_url: '/download',
tmpFolder: '/tmp',
cdn_url: 'http://your.bucket.com.tw', // optional: public CDN-style URL, for a bucket with public read access
keep_filename: true, // optional: keep the original file name on the remote object
cache: true, // optional: write a local copy every time a file is read from GCS
bucket: 'your.bucket.com.tw',
projectId: 'your-project-id',
keyFilename: '/path/to/your/key.json'
});
The configuration options are described in the inline comments above. Next, register the download proxy route:
app.use('/downloads/:id', gcsuplder.downloadproxy);
In this case, a route like http://localhost:3000/downloads/e13b6a98a0d50b5990123a83eb87f2a8.png will serve GET requests for the resource, and ":id" is the filename returned from the upload.
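As a quick sketch (the filename below is only a placeholder for whatever your own upload returned), fetching a file back through the proxy route might look like this:

```js
// Sketch: request a previously uploaded file through the download proxy route.
// The filename is a placeholder; use the name produced by your own upload.
var http = require('http');

var filename = 'e13b6a98a0d50b5990123a83eb87f2a8.png';

http.get('http://localhost:3000/downloads/' + filename, function (res) {
  console.log('status:', res.statusCode); // 200 when the object exists in GCS (or the local cache)
  res.resume(); // drain the response body
});
```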
If you want to use "cdn_url" so that a Cloud Storage website bucket serves as your CDN path, set a default ACL on the bucket so that uploaded objects are granted public read permission. (For website buckets, see the doc: https://cloud.google.com/storage/docs/website-configuration )
gsutil defacl ch -u AllUsers:R gs://your.bucket.com.tw
Make a route to receive uploads:
router.post('/uploadtest', function(req, res, next) {
res.end('done...');
});
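Putting the pieces together, here is a minimal wiring sketch that uses only the calls shown above; the bucket, project ID, and key file path are placeholders:

```js
// Minimal wiring sketch: configure the uploader, expose the download proxy,
// and add the upload route, using only the API shown in this README.
var express = require('express');
var gcsuplder = require('express-gcs-uploader');

var app = express();

gcsuplder.auth({
  rootdir: __dirname,
  upload_url: '/uploads',
  download_url: '/download',
  tmpFolder: '/tmp',
  bucket: 'your.bucket.com.tw',          // placeholder bucket
  projectId: 'your-project-id',          // placeholder project
  keyFilename: '/path/to/your/key.json'  // placeholder service-account key
});

// Serve uploaded files back through the proxy route.
app.use('/downloads/:id', gcsuplder.downloadproxy);

// Receive uploads; the plugin copies the uploaded data to Cloud Storage.
app.post('/uploadtest', function (req, res) {
  res.end('done...');
});

app.listen(3000, function () {
  console.log('listening on http://localhost:3000');
});
```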
Upload using curl:
curl -F "image=@/Users/peihsinsu/Pictures/pic2.png" http://localhost:3000/uploadtest -X POST
Upload using curl with subfolder:
curl -F "image=@/Users/peihsinsu/Pictures/pic2.png" http://localhost:3000/uploadtest -X POST -H 'subfolder:myfolder'
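If you prefer to script the upload in Node.js instead of curl, a sketch using the form-data package (the file path and subfolder name are placeholders) might look like this:

```js
// Sketch: programmatic multipart upload with the form-data package instead of curl.
var FormData = require('form-data');
var fs = require('fs');

var form = new FormData();
form.append('image', fs.createReadStream('/path/to/pic2.png'));

form.submit({
  host: 'localhost',
  port: 3000,
  path: '/uploadtest',
  headers: { subfolder: 'myfolder' } // same effect as curl's -H 'subfolder:myfolder'
}, function (err, res) {
  if (err) throw err;
  console.log('upload status:', res.statusCode);
  res.resume();
});
```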
Upload using upload form:
<form method="post" action="/uploadtest" name="submit" enctype="multipart/form-data">
<input type="file" name="fileField"><br /><br />
<input type="submit" name="submit" value="Submit">
</form>