@adobe/aem-upload
In AEM Assets 6.5 and earlier, a single POST request to a servlet that manages asset binaries is enough to upload files. Newer versions of AEM can be configured to use direct binary upload, which means that asset binaries are no longer uploaded straight to AEM; instead, a more complex algorithm must be followed when uploading them. This library checks the configuration of the target AEM instance and uses either the direct binary upload algorithm or the create asset servlet, depending on that configuration.
This tool is provided to make uploading easier, and it can be used as a command line executable or required as a Node.js module.
A command line tool for uploading assets to an AEM instance is available as a plugin for the Adobe I/O CLI. Please see the plugin repository for more information.
This library supports uploading files to a target instance, including support for monitoring transfer progress, cancelling transfers, and other features.
This project uses Node.js and npm. Go check them out if you don't have them installed locally.
It can be installed like any other Node.js module.
$ npm install @adobe/aem-upload
To add the module to your Node.js project:
const DirectBinary = require('@adobe/aem-upload');
Following is the minimum amount of code required to upload files to a target AEM instance.
const DirectBinary = require('@adobe/aem-upload');

// URL to the folder in AEM where assets will be uploaded. Folder
// must already exist.
const targetUrl = 'http://localhost:4502/content/dam/target';

// list of all local files that will be uploaded.
const uploadFiles = [
    {
        fileName: 'file1.jpg', // name of the file as it will appear in AEM
        fileSize: 1024, // total size, in bytes, of the file
        filePath: '/Users/me/Documents/my_file.jpg' // Full path to the local file
    },
    {
        fileName: 'file2.jpg',
        fileSize: 512,
        filePath: '/Users/me/Documents/file2.jpg'
    }
];

const upload = new DirectBinary.DirectBinaryUpload();
const options = new DirectBinary.DirectBinaryUploadOptions()
    .withUrl(targetUrl)
    .withUploadFiles(uploadFiles);

// this call will upload the files. The method returns a Promise, which will be resolved
// when all files have uploaded.
upload.uploadFiles(options)
    .then(result => {
        // "result" contains various information about the upload process, including
        // performance metrics and errors that may have occurred for individual files

        // at this point, assuming no errors, there will be two new assets in AEM:
        //   http://localhost:4502/content/dam/target/file1.jpg
        //   http://localhost:4502/content/dam/target/file2.jpg
    })
    .catch(err => {
        // the Promise will reject if something causes the upload process to fail at
        // a high level. Note that individual file failures will NOT trigger this

        // "err" will be an instance of UploadError. See "Error Handling"
        // for more information
    });
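When running in a browser, a file's content can also be supplied through the `blob` property instead of a local `filePath` (see the upload files option below). A minimal sketch, assuming an HTML form named `formName` with a file input named `fileInputName` (both hypothetical names):

const fileInput = document.forms['formName']['fileInputName']; // hypothetical form and input names
const browserFile = fileInput.files[0];

const upload = new DirectBinary.DirectBinaryUpload();
const options = new DirectBinary.DirectBinaryUploadOptions()
    .withUrl('http://localhost:4502/content/dam/target')
    .withUploadFiles([
        {
            fileName: browserFile.name, // name of the asset as it will appear in AEM
            fileSize: browserFile.size, // size, in bytes, reported by the browser
            blob: browserFile           // provided instead of filePath
        }
    ]);

upload.uploadFiles(options)
    .then(result => { /* inspect result as in the example above */ })
    .catch(err => { /* handle overall process failure */ });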
The `DirectBinaryUploadOptions` class supports the following options. Items marked with * are required.

- **URL** * (string): Full, absolute URL to the folder in the target instance where the specified files will be uploaded. This value is expected to be URI encoded. Example: `options.withUrl('http://localhost:4502/content/dam/target');`
- **Upload Files** * (Array): List of files that will be uploaded to the target URL. Each item in the array should be an object consisting of the following properties: `fileName` (name of the file as it will appear in AEM), `fileSize` (total size, in bytes, of the file), and either `filePath` (full path to the local file) or `blob` (a browser file object, such as an entry from a file input's `files` list). Example:

  options.withUploadFiles([
      {
          fileName: 'file1.jpg',
          fileSize: 1024,
          filePath: '/Users/me/Documents/file1.jpg'
      },
      {
          fileName: 'file2.jpg',
          fileSize: 2048,
          // note that this assumes HTML similar to:
          // <form name="formName">
          //   <input type="file" name="fileInputName" />
          // </form>
          blob: document.forms['formName']['fileInputName'].files[0]
      }
  ]);

- **fetch options** (object): Consumers can control the options that the library will provide to the Fetch API when submitting HTTP requests. These options are passed as-is to Fetch, so consumers should reference the Fetch API documentation to determine which values are applicable. Example:

  options.withHttpOptions({
      agent: myAgent,
      headers: {
          'authorization': '12345',
          'cookie': 'mycookie=myvalue'
      }
  });

- **concurrent** (boolean): If `true`, multiple files in the supplied list of upload files will transfer simultaneously. If `false`, only one file will transfer at a time, and the next file will not begin transferring until the current file finishes. Default: `false`. Example: `options.withConcurrent(true);`
- **max concurrent requests** (number): The maximum number of concurrent HTTP requests that are allowed at any one time. As explained in the `concurrent` option, the library will upload multiple files at once; this value essentially indicates the maximum number of files that the process will upload concurrently. A value less than 2 will instruct the library not to upload more than one file concurrently. Default: `5`. Example: `options.withMaxConcurrent(2);`
- **http retry count** (number): The number of times that the process will retry a failed HTTP request before giving up. For example, if the retry count is 3 then the process will submit the same HTTP request up to 3 times if the response indicates a failure. Default: `3`. Example: `options.withHttpRetryCount(5);`
- **http retry delay** (number): The amount of time, in milliseconds, that the process will wait before retrying a failed HTTP request. With each additional retry, the delay increases by this value. For example, if the delay is 5000 then the first retry will wait 5 seconds, the second 10 seconds, the third 15 seconds, etc. Default: `5000`. Example: `options.withHttpRetryDelay(3000);`
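For reference, several of these options can be chained on a single `DirectBinaryUploadOptions` instance. The following sketch uses only the `with...()` methods described above; the specific values are illustrative, and `uploadFiles` is assumed to be an array shaped like the one in the earlier example.

const DirectBinary = require('@adobe/aem-upload');

// illustrative values only; see the option descriptions above
const options = new DirectBinary.DirectBinaryUploadOptions()
    .withUrl('http://localhost:4502/content/dam/target')
    .withUploadFiles(uploadFiles) // same array shape as in the earlier example
    .withConcurrent(true)         // transfer multiple files at the same time...
    .withMaxConcurrent(2)         // ...but no more than 2 at once
    .withHttpRetryCount(5)        // retry a failed HTTP request up to 5 times
    .withHttpRetryDelay(3000);    // wait 3s before the first retry, 6s before the second, etc.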
If a file fails to upload, the process will move to the next file in the list. The overall process itself will only fail if something catastrophic happens that prevents it from continuing to upload files. It's left up to the consumer to determine if there were individual file upload failures and react to them accordingly.
All errors reported by the upload process will be instances of `UploadError`, which are standard javascript `Error` instances with an additional `code` value that indicates the type of error. Specific codes are available in `DirectBinary.DirectBinaryUploadErrorCodes`.
The following is an example of handling errors, at either the process or file level.
const codes = DirectBinary.DirectBinaryUploadErrorCodes;
const upload = new DirectBinary.DirectBinaryUpload();
upload.uploadFiles(options) // assume that options is defined previously
    .then(result => {
        // use this method to retrieve ALL errors during the process
        const { errors = [] } = result;
        errors.forEach(error => {
            if (error.code === codes.ALREADY_EXISTS) {
                // handle case where a file already exists
            }
        });

        const { detailedResult = [] } = result;
        // or retrieve individual file errors
        detailedResult.forEach((fileResult) => {
            const { result = {} } = fileResult;
            const { errors = [] } = result;
            errors.forEach((fileErr) => {
                // content of fileErr may vary
            });
        });
    })
    .catch(err => {
        if (err.code === codes.NOT_SUPPORTED) {
            // handle case where direct binary access is not enabled
            // on the target instance
        }
    });
Another way of handling individual file errors is to listen for the upload process's Events.
The process implements automatic HTTP retry handling: if an HTTP request fails, the process will wait for a specified interval and retry the same HTTP request a given number of times. If the request still fails after the given number of retries, the process reports the failure as normal, using the last error received. Any errors that caused a retry, whether the overall upload ultimately succeeds or fails, will be reported in a dedicated structure in the result.
As the upload process moves through individual files, it will send events as it goes through the stages of uploading a file. These events are listed below.
Event | Description | Data
---|---|---
`fileuploadstart` | Indicates an upload of one or more files is starting. | A simple javascript object describing the overall upload.
`fileuploadend` | Indicates an upload of one or more files has finished. | The same data as the `fileuploadstart` event, with additional elements describing the outcome.
`filestart` | Indicates that a file has started to upload. | A simple javascript object describing the file, including its `fileName`.
`fileprogress` | Sent periodically and includes information about how much of the file has uploaded. | The same properties as `filestart`, in addition to progress information such as the number of bytes `transferred`.
`fileend` | Indicates that a file has uploaded successfully. This event will not be sent if the file failed to upload, or if the file upload was cancelled. | The same properties as `filestart`.
`fileerror` | Sent if a file fails to upload. This event will not be sent if the file uploads successfully, or if the file upload was cancelled. | The same properties as `filestart`, in addition to the `errors` that occurred for the file.
`filecancelled` | Indicates that a file upload was cancelled. | The same properties as `filestart`.
`foldercreated` | Indicates that the upload process created a new folder in the target. | A simple javascript object describing the folder that was created.
The following is an example of how to handle various events.
const upload = new DirectBinary.DirectBinaryUpload();

upload.on('filestart', data => {
    const { fileName } = data;
    // specific handling that should occur when a file begins uploading
});

upload.on('fileprogress', data => {
    const { fileName, transferred } = data;
    // specific handling that should occur as a file uploads
});

upload.on('fileend', data => {
    const { fileName } = data;
    // specific handling that should occur when a file finishes uploading successfully
});

upload.on('fileerror', data => {
    const { fileName, errors } = data;
    // specific handling that should occur when a file fails to upload
});

// assume options has been declared previously
upload.uploadFiles(options);
The library supports uploading local files and folders. For folders, the tool will include all files immediately inside the folder; it will not process sub-folders unless the "deep upload" option is specified.
When deep uploading, the library will create a folder structure in the target that mirrors the folder being uploaded. The title of the newly created folders will match the name of the folder as it exists on the local filesystem. The path of the target may be modified depending on path character restrictions in AEM, and depending on the options provided in the upload (see "Function for processing folder node names" for more information).
Whenever the library creates a new folder, it will emit the `foldercreated` event. See the event documentation for details.
The following example illustrates how to upload local files.
const {
    FileSystemUploadOptions,
    FileSystemUpload
} = require('@adobe/aem-upload');

// configure options to use basic authentication
const options = new FileSystemUploadOptions()
    .withUrl('http://localhost:4502/content/dam/target-folder')
    .withHttpOptions({
        headers: {
            Authorization: `Basic ${Buffer.from('admin:admin').toString('base64')}`
        }
    });

// upload a single asset and all assets in a given folder
const fileUpload = new FileSystemUpload();
await fileUpload.upload(options, [
    '/Users/me/myasset.jpg',
    '/Users/me/myfolder'
]);
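To include sub-folders in the upload, enable the deep upload option (`withDeepUpload`, described below). The following sketch is a variation on the example above; it assumes that a `FileSystemUpload` instance emits the library's events, such as `foldercreated`, through the same `on()` interface shown earlier for `DirectBinaryUpload`.

const {
    FileSystemUploadOptions,
    FileSystemUpload
} = require('@adobe/aem-upload');

// include all descendent folders and files of the uploaded folder
const options = new FileSystemUploadOptions()
    .withUrl('http://localhost:4502/content/dam/target-folder')
    .withDeepUpload(true);

const fileUpload = new FileSystemUpload();

// assumption: FileSystemUpload exposes the same on() event interface as DirectBinaryUpload
fileUpload.on('foldercreated', data => {
    // invoked each time the process creates a folder in the target instance
    console.log('folder created', data);
});

await fileUpload.upload(options, ['/Users/me/myfolder']);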
There is a set of options, `FileSystemUploadOptions`, that are specific to uploading local files. In addition to the default options, the following options are available.
- **Maximum number of files** (number): The maximum number of files that the library will attempt to upload. If the target upload exceeds this number then the process will fail with an exception. Default: `1000`. Example: `options.withMaxUploadFiles(100);`
- **Perform deep upload** (boolean): If `true`, the process will include all descendent folders and files when given a folder to upload. If `false`, the process will only upload those files immediately inside the folder to upload. Default: `false`. Example: `options.withDeepUpload(true);`
- **Function for processing folder node names** (function): When performing a deep upload, the tool will create folders in AEM that match the local folders being uploaded. The tool will "clean" the folder names of certain characters when creating node names for each folder; the unmodified folder name will become the node's title. This option allows customization of the functionality that cleans the folder's name. The option should be a `function` that receives a single argument value: the name of the folder to be cleaned. The return value of the function should be a `Promise` that resolves with the clean folder name. The default functionality will convert the folder name to lower case and replace whitespace and any of the characters `%;#,+?^{}` with the replacement value specified in the options. Regardless of this function, the library will always replace any of the characters `./:[]|*\` with the replacement value specified in the options. Example: `// This example will skip any special processing`
- **Function for processing asset node names** (function): When performing a deep upload, the tool will create assets in AEM that match the local files being uploaded. The tool will "clean" the file names of certain characters when creating node names for each asset. This option allows customization of the functionality that cleans the file's name. The option should be a `function` that receives a single argument value: the name of the file to be cleaned. The return value of the function should be a `Promise` that resolves with the clean asset name. The default functionality will replace any of the characters `#%{}?&` with the replacement value specified in the options. Regardless of this function, the library will always replace any of the characters `./:[]|*\` with the replacement value specified in the options. Example: `// This example will skip any special processing`
- **Replacement value for invalid node characters** (string): Specifies the value to use when replacing invalid characters in folder and asset node names. This value is used in the default functions that clean folder/asset names, and is always used when replacing any of the characters `./:[]|*\`; the value of this option cannot contain any of those characters. Default: `-`. For example, assume the folder name `My Test Folder #2`. With the default settings, the folder's node would be `my-test-folder--2`. Example: `options.withInvalidCharacterReplaceValue('_');`
- **Upload file options** (object): Specifies the options to use when uploading each file as part of the file system upload. Most of the options provided when using `DirectBinaryUploadOptions.withUploadFiles()` are valid; the exceptions are `fileName`, `fileSize`, `filePath`, and `blob`, which will be ignored. These options are provided via `options.withUploadFileOptions()`, which accepts an object.
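As a minimal sketch, several of the filesystem-specific options above can be combined on one `FileSystemUploadOptions` instance; the values shown are illustrative only.

const { FileSystemUploadOptions } = require('@adobe/aem-upload');

// illustrative values only; see the option descriptions above
const options = new FileSystemUploadOptions()
    .withUrl('http://localhost:4502/content/dam/target-folder')
    .withMaxUploadFiles(100)                // fail if the upload would include more than 100 files
    .withDeepUpload(true)                   // include descendent folders and files
    .withInvalidCharacterReplaceValue('_'); // replace invalid node name characters with "_"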
The library will log various messages as it goes through the process of uploading items. It will use whichever logger it's given, as long as the object supports the methods `debug()`, `info()`, `warn()`, and `error()`. For maximum detail, the library also assumes that each of these methods can accept formatted messages: `log.info('message with %s', 'formatting');`. The logging will work regardless of formatting support, but there will be more information when formatting works correctly.
To provide a logger to the library, pass a `log` element in the options sent into the `DirectBinaryUpload` constructor. Here is a simple example that will log all the library's messages to `console`:
const upload = new DirectBinary.DirectBinaryUpload({
    log: {
        debug: (...theArguments) => console.log.apply(null, theArguments),
        info: (...theArguments) => console.log.apply(null, theArguments),
        warn: (...theArguments) => console.log.apply(null, theArguments),
        error: (...theArguments) => console.log.apply(null, theArguments),
    }
});
Note that this will also work with the `FileSystemUpload` constructor.
The library utilizes the Fetch API, so when running in a browser, proxy settings are detected and applied by the browser. In Node.js, all HTTP requests are sent directly to the target without going through a proxy. Auto-detecting a system's proxy settings is not supported in Node.js, but consumers can use `DirectBinaryUploadOptions.withHttpOptions()` to provide an `agent` value as recommended by `node-fetch`.
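For example, requests could be routed through a proxy in Node.js by supplying an agent via `withHttpOptions()`. The sketch below assumes the third-party `https-proxy-agent` package and a placeholder proxy URL; any agent implementation accepted by `node-fetch` should work the same way.

const { HttpsProxyAgent } = require('https-proxy-agent'); // assumed third-party dependency
const DirectBinary = require('@adobe/aem-upload');

// route the library's HTTP requests through a proxy by providing a Fetch-compatible agent
const options = new DirectBinary.DirectBinaryUploadOptions()
    .withUrl('http://localhost:4502/content/dam/target')
    .withHttpOptions({
        agent: new HttpsProxyAgent('http://my-proxy.example.com:8080') // placeholder proxy URL
    });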
This module uses semantic-release when publishing new versions. The process is initiated upon merging commits to the `master` branch. Review semantic-release's documentation for the commit message format. PRs whose messages do not meet semantic-release's format will not generate a new release. Release notes are generated based on git commit messages and will appear in `CHANGELOG.md`.
Contributions are welcomed! Read the Contributing Guide for more information.
This project is licensed under the Apache V2 License. See LICENSE for more information.