
node-red-contrib-spark
Node-RED Nodes to integrate with the Cisco Spark API.
The Spark Parse Node is similar to the map functionality provided in many programming languages, and additionally provides options when dealing with input that is in the form of a collection (array of objects).

Installation

The simplest installation method is via Node-RED itself. In the "Manage Palette" menu dropdown, search for this module by name. Alternate methods are outlined below.
Via the node-red-admin CLI tool:
# install node-red-admin if needed
npm install -g node-red-admin
# authenticate if your Node-RED administration has been secured
node-red-admin login
# install the module
node-red-admin install node-red-contrib-spark
Via source:
# clone repo
git clone https://github.com/nmarus/node-red-contrib-spark
cd node-red-contrib-spark
# get submodule
git submodule init
git submodule update
# from your node-red installation directory
npm install /path/to/node-red-contrib-spark
Via NPM repository:
# from your node-red installation directory
npm install node-red-contrib-spark
The Spark API Node sends REST queries via messages received by the input connector in the msg.payload object. Results of the API call are provided at the output in the msg.payload object.
The Spark API request is passed as a JSON object in msg.payload. The msg.payload object can be an empty object {} or contain optional path, query-string, or body variables provided as "key": "value" object parameters. Depending on the particular API method, the "key": "value" properties are defined in either the msg.payload object or the msg.payload.body object.
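This top-level vs. body split can be assembled in a function node placed ahead of the Spark API Node. A minimal sketch; the helper name buildRequest is illustrative and not part of this module's API:

```javascript
// Sketch of a function node body that prepares a request for a Spark API
// Node. Path/query parameters live at the top level of msg.payload;
// request-body parameters go under msg.payload.body.
function buildRequest(pathParams, bodyParams) {
  const payload = Object.assign({}, pathParams);
  if (bodyParams && Object.keys(bodyParams).length > 0) {
    payload.body = bodyParams;
  }
  return payload;
}

// e.g. for Rooms.updateRoom: roomId is a path parameter, title a body field
const msg = {};
msg.payload = buildRequest(
  { roomId: "someSparkRoomIdString" },
  { title: "My Renamed Room" }
);
// In a function node you would finish with: return msg;
```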
By convention, the output from the Spark API call has a msg.payload property containing the results of the API call in JSON format. The format of this JSON object is the same as documented at developer.ciscospark.com for the responses from the API call.
Additionally, the following properties are defined as part of the msg object:

msg.status : HTTP status code
msg.error : error object (evaluates to null when no error is present)
msg.error.message : error message
msg.error.description : error description (only available for certain errors)
msg.error.trackingId : tracking id (only available for certain errors)
msg.headers : response headers object
msg._msgid : unique message identifier

Example: Get Person by Email
The following object would be sent in the msg.payload input to a Spark API Node set up for People.getPeople:
{
"email": "person@example.com"
}
Example: Get Rooms by Type
The following object would be sent in the msg.payload input to a Spark API Node set up for Rooms.getRooms:
{
"body": {
"type": "group"
}
}
Example: Update a Room Title
The following object would be sent in the msg.payload input to a Spark API Node set up for Rooms.updateRoom:
{
"roomId": "someSparkRoomIdString",
"body": {
"title": "My Renamed Room"
}
}
Example: Create a New Message
The following object would be sent in the msg.payload input to a Spark API Node set up for Messages.createMessage:
{
"body": {
"roomId": "someSparkRoomIdString",
"text": "Hello World!"
}
}
Example: Add Person by Email to a Room
The following object would be sent in the msg.payload input to a Spark API Node set up for Memberships.createMembership:
{
"body": {
"roomId": "someSparkRoomIdString",
"personEmail": "person@example.com"
}
}
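After any of the calls above, the msg.status and msg.error properties described earlier can be checked in a downstream function node. A minimal sketch; the helper name describeResult and the sample message contents are hypothetical:

```javascript
// Sketch: inspecting the result properties set by a Spark API Node.
// msg.error evaluates to null when no error is present.
function describeResult(msg) {
  if (msg.error) {
    return "error " + msg.status + ": " + msg.error.message;
  }
  return "ok " + msg.status;
}

// Example messages (contents hypothetical, shapes follow the list above)
const success = { status: 200, error: null, payload: { items: [] } };
const failure = { status: 404, error: { message: "Not Found" } };
```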
The Spark Webhook Node is triggered when a resource event is matched. When the Node is deployed, it automatically creates the associated Cisco Spark Webhook. When the Node is removed, the Webhook reference is automatically removed in the Spark API.
Example Output: msg.payload
{
"id": "Y2lzY29zcGFyazovL3VzL1dFQkhPT0svNzJkYzlhNTctYmY4MC00OTdjLWFhM2MtNjMyYzUyOThkMTFk",
"name": "Test",
"targetUrl": "http://myhost.com:3000/spark388d9fffca49b8",
"resource": "messages",
"event": "created",
"ownedBy": "creator",
"created": "2016-10-23T19:50:23.484Z",
"data": {
"id": "Y2lzY29zcGFyazovL3VzL01FU1NBR0UvZjgyOGM3YTAtOTk1OS0xMWU2LTk5ODYtYzc4MTAwYzIyYTJm",
"roomId": "Y2lzY29zcGFyazovL3VzL1JPT00vNjNhYzQ3MzAtOTUzYy0xMWU2LWEwZmQtNDcxNWExOWY2ZDJi",
"roomType": "group",
"personId": "Y2lzY29zcGFyazovL3VzL1BFT1BMRS8zYzNlZmYxOS04Njg1LTQ2OTEtODViOS1lZjRmMTViZDk2ZDQ",
"personEmail": "nmarus@gmail.com",
"created": "2016-10-23T19:50:37.594Z"
}
}
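A downstream function node can filter on the resource and event fields of this output before acting on it. A minimal sketch, assuming the payload shape shown above; the helper name isNewMessage is illustrative:

```javascript
// Sketch: pass through only "messages created" webhook events
// from the Spark Webhook Node output.
function isNewMessage(payload) {
  return payload.resource === "messages" && payload.event === "created";
}

const msg = {
  payload: {
    resource: "messages",
    event: "created",
    data: { roomId: "someSparkRoomIdString" }
  }
};
// In a function node: return isNewMessage(msg.payload) ? msg : null;
```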
The Webhook Node listens at a configured base URL in the format http(s)://domain.tld:<port>. The Webhook Node will dynamically publish web routes under this URL as /spark<node-uuid>. Note that the Webhook is automatically created in the Spark API after deploying, and automatically removed if the Node is deleted and the flow re-deployed.

The Spark Parse Node allows parsing of messages received from either the Webhook or API Node. The parsed value is placed in the msg.payload of the first output. The value of the "parse" field is delivered in msg.topic for use with supporting functions such as join. The original msg.payload is passed through to the second output.
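Because the parse field name arrives in msg.topic, a downstream function node can route on it. A minimal sketch; the routing logic and message contents are hypothetical:

```javascript
// Sketch: route Parse Node output by the msg.topic it sets from its
// "parse" field. Returning a two-element array sends the message to the
// corresponding output of a two-output function node.
function route(msg) {
  if (msg.topic === "text") {
    return [msg, null]; // first output: parsed message text
  }
  return [null, msg]; // second output: everything else
}

const parsed = { topic: "text", payload: "Hello World!" };
```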
The output setting specifies how the msg.payload is formatted.

If the parser input receives an array, each element of the array is parsed individually. The results are returned as multiple sequential messages at the output, each with its own msg.payload and msg.topic property.
Example Output (Individual): msg.payload
"Test Room 1"
"Test Room 2"
"Test Room 3"
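The per-element behavior above can be pictured as a map over the collection. A minimal sketch of the idea, not the module's internals; the helper name parseCollection is illustrative:

```javascript
// Sketch: each element of an array input is parsed individually and
// emitted as its own message with msg.payload and msg.topic.
function parseCollection(items, field) {
  return items.map(item => ({ payload: item[field], topic: field }));
}

const rooms = [
  { title: "Test Room 1" },
  { title: "Test Room 2" },
  { title: "Test Room 3" }
];
const msgs = parseCollection(rooms, "title");
```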
MIT License

Copyright (c) 2016 Nicholas Marus
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
1.1.0: Feature Release