node-red-contrib-facial-recognition
Provides a Node-RED node for Facial Detection & Facial Recognition.
Table of Contents
- Install
- About
- Usage
- Node_Properties
- Example_Flows
- Heavy_image_processing_or_mjpeg_video_stream
- Bugs_Feature_request
- License
- Work
- Contributor
- release notes
Install
Install with the Node-RED Palette Manager, or run the following command in your Node-RED user directory - typically ~/.node-red:
npm install node-red-contrib-facial-recognition
About
At its core it uses @vladmandic/face-api and @tensorflow/tfjs-node ver. 2.7.x, and it can use @tensorflow/tfjs-node-gpu for the CUDA-crazy amongst you.
vladmandic was a big help for us Node.js guys. After finding a bug and fielding questions, he took the time to make a Node.js build specific to tfjs-node. If you like node-red-contrib-facial-recognition, I highly recommend you take the time to go to vladmandic's GitHub page https://github.com/vladmandic/face-api and throw his repo a star.
Usage
Takes a buffered image and runs TensorFlow Facial Detection and/or Facial Recognition to detect:
- Faces in an image
- Facial landmarks
- Compute Face Descriptors
- Face Expressions
- Age & Gender Recognition
- Face Recognition by Matching Descriptors
Node_Properties
Name
Set this if you wish to change the name displayed on the node.
Image
You can change the msg property that the node reads the buffered image from.
Example: msg.NameOfYourChoice
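For example, if your flow delivers the buffered image on msg.payload but you have configured this node to read a different property, a Function node can copy it over. A minimal sketch; "myImage" is just a hypothetical property name:

// Copy the buffered image from msg.payload to the custom property
// this node is configured to read. "myImage" is a hypothetical name.
msg.myImage = msg.payload;
return msg;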
Settings
This is optional; you do not have to send it anything. It is used to override settings in the node's config Properties menu.
You can change the msg property that the node reads this settings object from.
Example: msg.NameOfYourChoice
Sending an object to this msg property will override any settings in the node's config Properties menu. Great for using input messages to change settings on the fly.
Please see/import the flows in the Example_Flows section of this documentation for a better understanding.
Example:
msg.settings = {
    FaceDetector: {
        SsdMobilenetv1: {
            maxResults: 4,
            minConfidence: 0.6
        }
    },
    Tasks: {
        detectAllFaces: {
            withFaceLandmarks: true,
            withFaceExpressions: true,
            withAgeAndGender: true,
            withFaceDescriptors: true
        }
    },
    FaceRecognition: {
        enabled: {
            KnownFacesPath: "/example/known_face",
            distanceThreshold: 0.6,
            ReInitializeFaceMatcher: false
        }
    }
};
return msg;
You do not have to fill out every option. You can omit any object key and its value; this node will then use the settings found in the node's config Properties menu for that omitted key.
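For example, a Function node placed in front of this node could override only the SsdMobilenetv1 minConfidence value; every other setting still comes from the node's config Properties menu. A minimal sketch using the same structure as the example above:

// Partial override: only minConfidence is changed,
// all omitted keys fall back to the node's config Properties menu.
msg.settings = {
    FaceDetector: {
        SsdMobilenetv1: {
            minConfidence: 0.7
        }
    }
};
return msg;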
Note: ReInitializeFaceMatcher
Set this value to true if you have changed, edited, or added images or image folders in your KnownFacesPath, to re-initialize the FaceMatcher. It processes all the images into Labeled Face Descriptors for each directory name, with individual descriptors for each image. Do not leave it set to true! It takes significant time to process. Once it has run after you have made changes to images or image folders, the result is saved to context and used for Facial Recognition.
Alternatively, you can just re-deploy Node-RED and ReInitializeFaceMatcher will run one time only, on the first image you send.
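A minimal sketch of a one-shot override you could inject after adding or editing images in your KnownFacesPath; send it once, then stop sending it so the FaceMatcher is not rebuilt on every image:

// One-time override after changing images in KnownFacesPath.
// Do not leave this flowing on every message - rebuilding the
// FaceMatcher takes significant time.
msg.settings = {
    FaceRecognition: {
        enabled: {
            ReInitializeFaceMatcher: true
        }
    }
};
return msg;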
Bindings
By default this is set to CPU - @tensorflow/tfjs-node - which uses your CPU to process images. You may instead choose to install @tensorflow/tfjs-node-gpu to use your video card to process images. Getting CUDA working is not an easy process; however, if you go down this rabbit hole, the reduction in time to process images is significant.
Good luck.
FaceDetector
SsdMobilenetv1 - A Single Shot Multibox Detector; based on MobileNetV1.
Computes the location of each face in an image and returns the bounding boxes together with its probability for each face. High accuracy in detecting face bounding boxes, at the cost of time to compute.
- maxResults - The max number of faces to return
- minConfidence - returns results for face(s) in an image above this confidence threshold
tinyFaceDetector - a fast, realtime face detector that consumes fewer resources than the SSD Mobilenet V1 face detector. It is poor at detecting small faces. Best face detector on resource-limited devices (see the msg.settings sketch after this list).
- inputSize - size at which the image is processed; the smaller, the faster. The number must be divisible by 32. Common sizes are 128, 160, 224, 320, 416, 512, 608
- scoreThreshold - returns results for face(s) in an image above this score threshold
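A minimal msg.settings sketch for selecting the tinyFaceDetector. It assumes tinyFaceDetector follows the same nesting as the SsdMobilenetv1 example shown above:

// Assumed structure: tinyFaceDetector nested under FaceDetector,
// mirroring the SsdMobilenetv1 example.
msg.settings = {
    FaceDetector: {
        tinyFaceDetector: {
            inputSize: 320,        // must be divisible by 32
            scoreThreshold: 0.5
        }
    }
};
return msg;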
Tasks
detectAllFaces - uses the selected FaceDetector to detect multiple faces in the buffered image sent in the message
detectSingleFace - uses the selected FaceDetector to detect a single face in the buffered image sent in the message. If the image contains multiple faces it will only detect one of them, hopefully the one with the highest probability.
- withFaceLandmarks - computes landmarks for each detected face
- withFaceExpressions - recognizes the face expressions of each detected face
- withAgeAndGender - estimates the age and recognizes the gender of each detected face
- withFaceDescriptors - computes the face descriptor for each detected face
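A minimal sketch that switches the task to detectSingleFace via msg.settings. This assumes detectSingleFace accepts the same with* options shown for detectAllFaces in the earlier example; check the node's config Properties menu if in doubt.

// Assumption: detectSingleFace takes the same with* options as detectAllFaces.
msg.settings = {
    Tasks: {
        detectSingleFace: {
            withFaceLandmarks: true,
            withFaceExpressions: false,
            withAgeAndGender: false,
            withFaceDescriptors: true
        }
    }
};
return msg;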
FaceRecognition
disabled - Don't use any facial recognition
enabled - Performs face recognition by comparing reference face descriptor(s) to query face descriptor(s) to determine their similarity.
- KnownFacesPath - The location of the main folder that contains all the subfolders, each labeled with a person's name. Subfolders should contain close-up face images of that individual. The actual name of the files in these folders does not matter. Please look at the structure of the example folder for more understanding. The name of the subfolder is what is used to label the faces for facial recognition.
Note: If you have changed, edited, or added images or image folders in your KnownFacesPath, you must redeploy the node to re-initialize the FaceMatcher. On the first image you send, the FaceMatcher is run to process all the images into Labeled Face Descriptors for each directory name, with individual descriptors for each image. The result is then saved to context and used for Facial Recognition for all future images you send in a message.
- distanceThreshold - a match is returned when the Euclidean distance between the face descriptor of the submitted image and a face descriptor from a labeled subfolder falls within this threshold. Simply put: return a person's name only if the faces are at least this similar. The higher the distanceThreshold, the more likely you are to get an incorrect match; the lower the distanceThreshold, the more likely it is that a known person will not be recognized. (A sketch for reading the matched names out of the node's output follows this list.)
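Once recognition is enabled, the matched names can be pulled out of the node's output with a Function node. A minimal sketch based on the Function node in the Advanced example flow below; the Result / match / _label property names are taken from that flow, not from a formal spec:

// Works for both detectAllFaces (array) and detectSingleFace (single object).
var result = msg.payload.Result;
var faces = Array.isArray(result) ? result : [result];
// Collect the matched label for each face, or "unknown" when no match object exists.
msg.names = faces.map(function (face) {
    return face.match ? face.match._label : "unknown";
});
return msg;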
Example_Flows
BASIC:
[{"id":"461f9a48.bcb9dc","type":"tab","label":"basic - face","disabled":false,"info":""},{"id":"253c371.ef72948","type":"inject","z":"461f9a48.bcb9dc","name":"","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"","payloadType":"date","x":160,"y":100,"wires":[["780b7c48.705bb4"]]},{"id":"780b7c48.705bb4","type":"file in","z":"461f9a48.bcb9dc","name":"","filename":".node-red/node_modules/node-red-contrib-facial-recognition/example/unknown_face/sample (1).jpg","format":"","chunk":false,"sendError":false,"encoding":"none","x":420,"y":140,"wires":[["723b347d.de38c4"]]},{"id":"ce4c6bba.732628","type":"debug","z":"461f9a48.bcb9dc","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":330,"y":180,"wires":[]},{"id":"723b347d.de38c4","type":"facial-recognition","z":"461f9a48.bcb9dc","image":"payload","settings":"settings","name":"","bindings":"CPU","FaceDetector":"SsdMobilenetv1","FaceDetector_SsdMobilenetv1_maxResults":3,"FaceDetector_SsdMobilenetv1_minConfidence":"0.6","FaceDetector_tinyFaceDetector_inputSize":"416","FaceDetector_tinyFaceDetector_scoreThreshold":".4","Tasks":"detectAllFaces","FaceLandmarks":true,"FaceExpressions":true,"AgeAndGender":true,"FaceDescriptors":true,"Face_Recognition":"Face_Recognition_disabled","Face_Recognition_enabled_path":"/example/labeled_face","Face_Recognition_distanceThreshold":0.7,"x":170,"y":180,"wires":[["ce4c6bba.732628"]]},{"id":"e8f9b7f9.d068c","type":"comment","z":"461f9a48.bcb9dc","name":"Detect all faces in image","info":"","x":150,"y":60,"wires":[]},{"id":"8dfa5c87.07ff88","type":"inject","z":"461f9a48.bcb9dc","name":"","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"","payloadType":"date","x":160,"y":280,"wires":[["436bc2a1.d01c0c"]]},{"id":"436bc2a1.d01c0c","type":"file in","z":"461f9a48.bcb9dc","name":"","filename":".node-red/node_modules/node-red-contrib-facial-recognition/example/unknown_face/sample (1).jpg","format":"","chunk":false,"sendError":false,"encoding":"none","x":420,"y":320,"wires":[["9141c2c6.98d8"]]},{"id":"e60e1e9a.162488","type":"debug","z":"461f9a48.bcb9dc","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":330,"y":360,"wires":[]},{"id":"9141c2c6.98d8","type":"facial-recognition","z":"461f9a48.bcb9dc","image":"payload","settings":"settings","name":"","bindings":"CPU","FaceDetector":"SsdMobilenetv1","FaceDetector_SsdMobilenetv1_maxResults":3,"FaceDetector_SsdMobilenetv1_minConfidence":"0.6","FaceDetector_tinyFaceDetector_inputSize":"416","FaceDetector_tinyFaceDetector_scoreThreshold":".4","Tasks":"detectSingleFace","FaceLandmarks":true,"FaceExpressions":true,"AgeAndGender":true,"FaceDescriptors":true,"Face_Recognition":"Face_Recognition_disabled","Face_Recognition_enabled_path":"/example/labeled_face","Face_Recognition_distanceThreshold":0.7,"x":170,"y":360,"wires":[["e60e1e9a.162488"]]},{"id":"78c855b8.487054","type":"comment","z":"461f9a48.bcb9dc","name":"Detect a single face in image","info":"","x":160,"y":240,"wires":[]},{"id":"89e26069.bd2b28","type":"inject","z":"461f9a48.bcb9dc","name":"","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"","payloadType":"date","x":160,"y":800,"wires":[["f11f6aa8.b70e28"]]},{"id":"f11f6aa8.b70e28","type":"file 
in","z":"461f9a48.bcb9dc","name":"","filename":".node-red/node_modules/node-red-contrib-facial-recognition/example/unknown_face/sample (1).jpg","format":"","chunk":false,"sendError":false,"encoding":"none","x":420,"y":840,"wires":[["a5c2774c.45ee18"]]},{"id":"b2275b16.4bf098","type":"debug","z":"461f9a48.bcb9dc","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":330,"y":880,"wires":[]},{"id":"a5c2774c.45ee18","type":"facial-recognition","z":"461f9a48.bcb9dc","image":"payload","settings":"settings","name":"","bindings":"CPU","FaceDetector":"SsdMobilenetv1","FaceDetector_SsdMobilenetv1_maxResults":3,"FaceDetector_SsdMobilenetv1_minConfidence":"0.6","FaceDetector_tinyFaceDetector_inputSize":"416","FaceDetector_tinyFaceDetector_scoreThreshold":".4","Tasks":"detectSingleFace","FaceLandmarks":true,"FaceExpressions":true,"AgeAndGender":true,"FaceDescriptors":true,"Face_Recognition":"Face_Recognition_enabled","Face_Recognition_enabled_path":"/example/labeled_face","Face_Recognition_distanceThreshold":0.7,"x":170,"y":880,"wires":[["b2275b16.4bf098"]]},{"id":"a61a85b5.9ca8a8","type":"comment","z":"461f9a48.bcb9dc","name":"Recognize a single face in image","info":"","x":170,"y":680,"wires":[]},{"id":"6014abc6.1b1e44","type":"inject","z":"461f9a48.bcb9dc","name":"","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"","payloadType":"date","x":160,"y":540,"wires":[["9fdd93b3.5bae78"]]},{"id":"9fdd93b3.5bae78","type":"file in","z":"461f9a48.bcb9dc","name":"","filename":".node-red/node_modules/node-red-contrib-facial-recognition/example/unknown_face/sample (1).jpg","format":"","chunk":false,"sendError":false,"encoding":"none","x":420,"y":580,"wires":[["674fa2c4.b218a4"]]},{"id":"9e1ff55a.983518","type":"debug","z":"461f9a48.bcb9dc","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":330,"y":620,"wires":[]},{"id":"674fa2c4.b218a4","type":"facial-recognition","z":"461f9a48.bcb9dc","image":"payload","settings":"settings","name":"","bindings":"CPU","FaceDetector":"SsdMobilenetv1","FaceDetector_SsdMobilenetv1_maxResults":3,"FaceDetector_SsdMobilenetv1_minConfidence":"0.6","FaceDetector_tinyFaceDetector_inputSize":"416","FaceDetector_tinyFaceDetector_scoreThreshold":".4","Tasks":"detectAllFaces","FaceLandmarks":true,"FaceExpressions":true,"AgeAndGender":true,"FaceDescriptors":true,"Face_Recognition":"Face_Recognition_enabled","Face_Recognition_enabled_path":"/example/labeled_face","Face_Recognition_distanceThreshold":0.7,"x":170,"y":620,"wires":[["9e1ff55a.983518"]]},{"id":"8cf4cfb4.ed002","type":"comment","z":"461f9a48.bcb9dc","name":"Recognize all faces in image","info":"","x":160,"y":420,"wires":[]},{"id":"c5fd45b7.420ca","type":"comment","z":"461f9a48.bcb9dc","name":"Note: you will notice it takes longer the first time as it has to load all images","info":"","x":340,"y":460,"wires":[]},{"id":"ef0f1da5.894ba","type":"comment","z":"461f9a48.bcb9dc","name":"the next time you run it it should take less time","info":"","x":250,"y":500,"wires":[]},{"id":"34375cfb.87a60c","type":"comment","z":"461f9a48.bcb9dc","name":"Note: you will notice it takes longer the first time as it has to load all images","info":"","x":340,"y":720,"wires":[]},{"id":"58e76f00.7b4288","type":"comment","z":"461f9a48.bcb9dc","name":"the next time you run it it should take less 
time","info":"","x":250,"y":760,"wires":[]}]
Advanced:
NOTE: other node-red nodes required
node-red-node-annotate-image
node-red-contrib-image-output
[{"id":"63368d69.d11d5c","type":"tab","label":"Advanced - face","disabled":false,"info":""},{"id":"bdc0b3ae.b9a248","type":"inject","z":"63368d69.d11d5c","name":"","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"","payloadType":"date","x":180,"y":220,"wires":[["f42eabc3.f6e658"]]},{"id":"f42eabc3.f6e658","type":"file in","z":"63368d69.d11d5c","name":"","filename":".node-red/node_modules/node-red-contrib-facial-recognition/example/unknown_face/sample (1).jpg","format":"","chunk":false,"sendError":false,"encoding":"none","x":620,"y":220,"wires":[["510508cd.bc342"]]},{"id":"da4443a6.9f59d8","type":"function","z":"63368d69.d11d5c","name":"Prep rects for annotate image node","func":"//global vars\nvar the_rects;\n\n//was detectAllFaces or detectSingleFace used\n//check to see if payload.Result is an array (detectAllFaces)\nvar Result = msg.payload.Result;\nif ( Array.isArray(Result) ) {\n // get just the rect values and place in array\n the_rects = Result.map(x => {\n //check for label from FaceRecognition\n var match_label;\n if ( x.match ) {\n match_label = x.match._label;\n }\n else {\n match_label = \"\";\n }\n var result = {\n type: \"rect\",\n x: x.detection._box._x,\n y: x.detection._box._y,\n w: x.detection._box._width,\n h: x.detection._box._height,\n label: match_label\n }\n return result;\n });\n msg.annotations = the_rects;\n}\n//else detectSingleFace was used\nelse {\n //check for label from FaceRecognition\n var match_label;\n if ( Result.match ) {\n match_label = Result.match._label;\n }\n else {\n match_label = \"\";\n }\n the_rects = [{\n type: \"rect\",\n x: Result.detection._box._x,\n y: Result.detection._box._y,\n w: Result.detection._box._width,\n h: Result.detection._box._height,\n label: match_label.match._label\n }]\n msg.annotations = the_rects;\n}\n\n\n\n//var xx = msg.payload.Result[0].detection._box._x;\n//var yy = msg.payload.Result[0].detection._box._y;\n//var ww = msg.payload.Result[0].detection._box._width;\n//var hh = msg.payload.Result[0].detection._box._height;\n\n//msg.annotations = [ {\n// type: \"rect\",\n// //bbox: [ xx, yy, ww, hh],\n// x: xx, y: yy, w: ww, h: hh,\n// //bbox: [ 100, 100, 100, 100],\n// label: \"Tara Sanford\"\n//}]\n\nmsg.payload = msg.payload.OriginalBufferedImg;\n\nreturn 
msg;\n\n","outputs":1,"noerr":0,"initialize":"","finalize":"","x":500,"y":300,"wires":[["c4b929f7.c31d78"]]},{"id":"c4b929f7.c31d78","type":"annotate-image","z":"63368d69.d11d5c","name":"","fill":"","stroke":"#0070c0","lineWidth":"20","fontSize":"48","fontColor":"#0070c0","x":800,"y":300,"wires":[["1504ab31.07575d"]]},{"id":"1504ab31.07575d","type":"image","z":"63368d69.d11d5c","name":"","width":"800","data":"payload","dataType":"msg","thumbnail":false,"active":true,"pass":false,"outputs":0,"x":180,"y":300,"wires":[]},{"id":"1bc415e3.ed6bea","type":"debug","z":"63368d69.d11d5c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":410,"y":260,"wires":[]},{"id":"510508cd.bc342","type":"facial-recognition","z":"63368d69.d11d5c","image":"payload","settings":"settings","name":"","bindings":"CPU","FaceDetector":"SsdMobilenetv1","FaceDetector_SsdMobilenetv1_maxResults":"5","FaceDetector_SsdMobilenetv1_minConfidence":"0.6","FaceDetector_tinyFaceDetector_inputSize":"416","FaceDetector_tinyFaceDetector_scoreThreshold":"0.5","Tasks":"detectAllFaces","FaceLandmarks":true,"FaceExpressions":true,"AgeAndGender":true,"FaceDescriptors":true,"Face_Recognition":"Face_Recognition_enabled","Face_Recognition_enabled_path":"/example/labeled_face","Face_Recognition_distanceThreshold":0.7,"x":190,"y":260,"wires":[["1bc415e3.ed6bea","da4443a6.9f59d8"]]},{"id":"ee417043.f6b968","type":"comment","z":"63368d69.d11d5c","name":"Recognize all faces in image","info":"","x":180,"y":100,"wires":[]},{"id":"5bc97bde.44ff3c","type":"comment","z":"63368d69.d11d5c","name":"Note: you will notice it takes longer the first time as it has to load all images","info":"","x":360,"y":140,"wires":[]},{"id":"d0d6d5db.66c4b","type":"comment","z":"63368d69.d11d5c","name":"the next time you run it it should take less time","info":"","x":270,"y":180,"wires":[]}]
Heavy_image_processing_or_mjpeg_video_stream
Every output message from this node includes sec_to_complete, the amount of time it took to process the image.
The time it takes to process an image will vary based on your device, processing speed, and CPU or CUDA bindings.
So if it takes 0.623 seconds to do a facial recognition and you're sending it 15 frames a second, you will create a backlog of work and overflow the node.
The best thing to do is use multiple facial-recognition nodes to process the images as individual workers,
then check to see if it is keeping up.
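A minimal sketch of a Function node that watches the reported processing time and warns when it cannot keep up with your frame rate. It assumes the timing value ends up at msg.payload.sec_to_complete; check your debug output and adjust the property path if the node places it elsewhere. frameRate is a hypothetical value you set for your stream.

// Hypothetical frame rate of the incoming mjpeg stream.
var frameRate = 15;                  // frames per second
var secondsPerFrame = 1 / frameRate; // seconds available per frame
// Assumed location of the timing value - adjust to match your debug output.
var secToComplete = msg.payload.sec_to_complete;
if (secToComplete > secondsPerFrame) {
    node.warn("Backlog risk: " + secToComplete + " s per image vs " +
        secondsPerFrame.toFixed(3) + " s per frame - add more facial-recognition worker nodes");
}
return msg;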
Example Flow:
NOTE: other node-red nodes required
node-red-node-loadbalance
[{"id":"97c3c9fa.50bc1","type":"tab","label":"Heavy image processing","disabled":false,"info":""},{"id":"a9703d55.c50da8","type":"file in","z":"97c3c9fa.50bc1","name":"","filename":".node-red/node_modules/node-red-contrib-facial-recognition/example/unknown_face/sample (1).jpg","format":"","chunk":false,"sendError":false,"encoding":"none","x":420,"y":180,"wires":[["b3fccdb8.e5a4c8"]]},{"id":"7f3eb574.32703c","type":"comment","z":"97c3c9fa.50bc1","name":"With out load balance","info":"","x":140,"y":60,"wires":[]},{"id":"1f0c5c85.0cffab","type":"file in","z":"97c3c9fa.50bc1","name":"","filename":".node-red/node_modules/node-red-contrib-facial-recognition/example/unknown_face/sample (2).jpg","format":"","chunk":false,"sendError":false,"encoding":"none","x":420,"y":220,"wires":[["b3fccdb8.e5a4c8"]]},{"id":"3c447a94.e1678e","type":"file in","z":"97c3c9fa.50bc1","name":"","filename":".node-red/node_modules/node-red-contrib-facial-recognition/example/unknown_face/sample (3).jpg","format":"","chunk":false,"sendError":false,"encoding":"none","x":420,"y":260,"wires":[["b3fccdb8.e5a4c8"]]},{"id":"62895aa0.7571fc","type":"file in","z":"97c3c9fa.50bc1","name":"","filename":".node-red/node_modules/node-red-contrib-facial-recognition/example/unknown_face/sample (4).jpg","format":"","chunk":false,"sendError":false,"encoding":"none","x":420,"y":300,"wires":[["b3fccdb8.e5a4c8"]]},{"id":"d8d0a0a4.f0dac","type":"file in","z":"97c3c9fa.50bc1","name":"","filename":".node-red/node_modules/node-red-contrib-facial-recognition/example/unknown_face/sample (5).jpg","format":"","chunk":false,"sendError":false,"encoding":"none","x":420,"y":340,"wires":[["b3fccdb8.e5a4c8"]]},{"id":"6c6e6a3.30c7814","type":"comment","z":"97c3c9fa.50bc1","name":"imagine all these images are a mjpg video stream","info":"","x":260,"y":140,"wires":[]},{"id":"bd1e05b6.13c8e8","type":"comment","z":"97c3c9fa.50bc1","name":"using load balance","info":"","x":150,"y":460,"wires":[]},{"id":"22ad7a58.06e9e6","type":"file in","z":"97c3c9fa.50bc1","name":"","filename":".node-red/node_modules/node-red-contrib-facial-recognition/example/unknown_face/sample (1).jpg","format":"","chunk":false,"sendError":false,"encoding":"none","x":440,"y":580,"wires":[["86af086c.50ba2"]]},{"id":"af08979a.dc12b8","type":"file in","z":"97c3c9fa.50bc1","name":"","filename":".node-red/node_modules/node-red-contrib-facial-recognition/example/unknown_face/sample (2).jpg","format":"","chunk":false,"sendError":false,"encoding":"none","x":440,"y":620,"wires":[["86af086c.50ba2"]]},{"id":"62b5fde3.d4e5b4","type":"file in","z":"97c3c9fa.50bc1","name":"","filename":".node-red/node_modules/node-red-contrib-facial-recognition/example/unknown_face/sample (3).jpg","format":"","chunk":false,"sendError":false,"encoding":"none","x":440,"y":660,"wires":[["86af086c.50ba2"]]},{"id":"e1ce11a1.066b38","type":"file in","z":"97c3c9fa.50bc1","name":"","filename":".node-red/node_modules/node-red-contrib-facial-recognition/example/unknown_face/sample (4).jpg","format":"","chunk":false,"sendError":false,"encoding":"none","x":440,"y":700,"wires":[["86af086c.50ba2"]]},{"id":"8830dbc5.a4edd","type":"file in","z":"97c3c9fa.50bc1","name":"","filename":".node-red/node_modules/node-red-contrib-facial-recognition/example/unknown_face/sample (5).jpg","format":"","chunk":false,"sendError":false,"encoding":"none","x":440,"y":740,"wires":[["86af086c.50ba2"]]},{"id":"c2d315cd.4e7ef8","type":"comment","z":"97c3c9fa.50bc1","name":"imagine all these images are a mjpg video 
stream","info":"","x":280,"y":540,"wires":[]},{"id":"86af086c.50ba2","type":"Load Balance","z":"97c3c9fa.50bc1","name":"","routes":"5","outputs":6,"selection":"next","noavailability":"discard","nocapacity":"admin","defaultcapacity":100,"sticky":"","dynamic":"73e6b3d7.113a9c","mps":"","x":180,"y":880,"wires":[[],["46023eb1.06714"],["399db620.ef7e4a"],["9eb0dd84.1b5a4"],["142b2637.ece30a"],["b2d5aa86.9b40d"]]},{"id":"84949984.565528","type":"inject","z":"97c3c9fa.50bc1","name":"","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"","payloadType":"date","x":180,"y":500,"wires":[["22ad7a58.06e9e6","af08979a.dc12b8","62b5fde3.d4e5b4","e1ce11a1.066b38","8830dbc5.a4edd"]]},{"id":"73e6b3d7.113a9c","type":"inject","z":"97c3c9fa.50bc1","name":"","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"","payloadType":"date","x":160,"y":100,"wires":[["a9703d55.c50da8","1f0c5c85.0cffab","3c447a94.e1678e","62895aa0.7571fc","d8d0a0a4.f0dac"]]},{"id":"d9539e90.8e1cd","type":"debug","z":"97c3c9fa.50bc1","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":350,"y":380,"wires":[]},{"id":"b3fccdb8.e5a4c8","type":"facial-recognition","z":"97c3c9fa.50bc1","image":"payload","settings":"settings","name":"","bindings":"CPU","FaceDetector":"SsdMobilenetv1","FaceDetector_SsdMobilenetv1_maxResults":3,"FaceDetector_SsdMobilenetv1_minConfidence":"0.6","FaceDetector_tinyFaceDetector_inputSize":"416","FaceDetector_tinyFaceDetector_scoreThreshold":".4","Tasks":"detectAllFaces","FaceLandmarks":true,"FaceExpressions":true,"AgeAndGender":true,"FaceDescriptors":true,"Face_Recognition":"Face_Recognition_disabled","Face_Recognition_enabled_path":"/example/labeled_face","Face_Recognition_distanceThreshold":0.7,"x":170,"y":380,"wires":[["d9539e90.8e1cd"]]},{"id":"300fe892.a26758","type":"debug","z":"97c3c9fa.50bc1","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":710,"y":880,"wires":[]},{"id":"46023eb1.06714","type":"facial-recognition","z":"97c3c9fa.50bc1","image":"payload","settings":"settings","name":"","bindings":"CPU","FaceDetector":"SsdMobilenetv1","FaceDetector_SsdMobilenetv1_maxResults":3,"FaceDetector_SsdMobilenetv1_minConfidence":"0.6","FaceDetector_tinyFaceDetector_inputSize":"416","FaceDetector_tinyFaceDetector_scoreThreshold":".4","Tasks":"detectAllFaces","FaceLandmarks":true,"FaceExpressions":true,"AgeAndGender":true,"FaceDescriptors":true,"Face_Recognition":"Face_Recognition_disabled","Face_Recognition_enabled_path":"/example/labeled_face","Face_Recognition_distanceThreshold":0.7,"x":470,"y":800,"wires":[["300fe892.a26758"]]},{"id":"399db620.ef7e4a","type":"facial-recognition","z":"97c3c9fa.50bc1","image":"payload","settings":"settings","name":"","bindings":"CPU","FaceDetector":"SsdMobilenetv1","FaceDetector_SsdMobilenetv1_maxResults":3,"FaceDetector_SsdMobilenetv1_minConfidence":"0.6","FaceDetector_tinyFaceDetector_inputSize":"416","FaceDetector_tinyFaceDetector_scoreThreshold":".4","Tasks":"detectAllFaces","FaceLandmarks":true,"FaceExpressions":true,"AgeAndGender":true,"FaceDescriptors":true,"Face_Recognition":"Face_Recognition_disabled","Face_Recognition_enabled_path":"/example/labeled_face","Face_Recognition_distanceThreshold":0.7,"x":470,"y":840,"wires":[["300fe892.a26758"]]},{"id":"9eb0dd84.1b5a4","type":"facial-recogn
ition","z":"97c3c9fa.50bc1","image":"payload","settings":"settings","name":"","bindings":"CPU","FaceDetector":"SsdMobilenetv1","FaceDetector_SsdMobilenetv1_maxResults":3,"FaceDetector_SsdMobilenetv1_minConfidence":"0.6","FaceDetector_tinyFaceDetector_inputSize":"416","FaceDetector_tinyFaceDetector_scoreThreshold":".4","Tasks":"detectAllFaces","FaceLandmarks":true,"FaceExpressions":true,"AgeAndGender":true,"FaceDescriptors":true,"Face_Recognition":"Face_Recognition_disabled","Face_Recognition_enabled_path":"/example/labeled_face","Face_Recognition_distanceThreshold":0.7,"x":470,"y":880,"wires":[["300fe892.a26758"]]},{"id":"142b2637.ece30a","type":"facial-recognition","z":"97c3c9fa.50bc1","image":"payload","settings":"settings","name":"","bindings":"CPU","FaceDetector":"SsdMobilenetv1","FaceDetector_SsdMobilenetv1_maxResults":3,"FaceDetector_SsdMobilenetv1_minConfidence":"0.6","FaceDetector_tinyFaceDetector_inputSize":"416","FaceDetector_tinyFaceDetector_scoreThreshold":".4","Tasks":"detectAllFaces","FaceLandmarks":true,"FaceExpressions":true,"AgeAndGender":true,"FaceDescriptors":true,"Face_Recognition":"Face_Recognition_disabled","Face_Recognition_enabled_path":"/example/labeled_face","Face_Recognition_distanceThreshold":0.7,"x":470,"y":920,"wires":[["300fe892.a26758"]]},{"id":"b2d5aa86.9b40d","type":"facial-recognition","z":"97c3c9fa.50bc1","image":"payload","settings":"settings","name":"","bindings":"CPU","FaceDetector":"SsdMobilenetv1","FaceDetector_SsdMobilenetv1_maxResults":3,"FaceDetector_SsdMobilenetv1_minConfidence":"0.6","FaceDetector_tinyFaceDetector_inputSize":"416","FaceDetector_tinyFaceDetector_scoreThreshold":".4","Tasks":"detectAllFaces","FaceLandmarks":true,"FaceExpressions":true,"AgeAndGender":true,"FaceDescriptors":true,"Face_Recognition":"Face_Recognition_disabled","Face_Recognition_enabled_path":"/example/labeled_face","Face_Recognition_distanceThreshold":0.7,"x":470,"y":960,"wires":[["300fe892.a26758"]]}]
Bugs_Feature_request
Please report bugs and feel free to ask for new features directly on GitHub.
License
This project is licensed under the Apache 2.0 license.
All photos are free to use and provided by Unsplash.
Work
Need a node?
Need automation work?
Need computers to flip switches?
Contact me at meeki007@gmail.com
Contributor
Thanks to:
The @tensorflow/tfjs-node team for supporting and maintaining a repo that allows us JS guys to create cool stuff.
vladmandic and his @vladmandic/face-api for help and support in creating a nodeJS specific build for the face-api.
protocolus for his work on finding images for the user example.
Joshua Rondeau for free use of his photos.
Brandon Atchison for free use of his photos.
release notes
0.0.0 = (majorchange) . (new_feature) . (bugfix-simple_mod)
version 0.24.88 - typo fix in documentation
version 0.24.87 - added Heavy_image_processing_or_mjpeg_video_stream section and example for it
version 0.24.86 - Fix documentation, added examples and intro image to top of page.
version 0.24.85 - Fix documentation, updating examples!
version 0.24.84 - First Public release