github.com/AlexxIT/go2rtc
Ultimate camera streaming application with support for RTSP, WebRTC, HomeKit, FFmpeg, RTMP, etc.
Download the binary for your OS from the latest release:
- go2rtc_win64.zip - Windows 64-bit
- go2rtc_win32.zip - Windows 32-bit
- go2rtc_win_arm64.zip - Windows ARM 64-bit
- go2rtc_linux_amd64 - Linux 64-bit
- go2rtc_linux_i386 - Linux 32-bit
- go2rtc_linux_arm64 - Linux ARM 64-bit (ex. Raspberry 64-bit OS)
- go2rtc_linux_arm - Linux ARM 32-bit (ex. Raspberry 32-bit OS)
- go2rtc_linux_armv6 - Linux ARMv6 (for old Raspberry 1 and Zero)
- go2rtc_linux_mipsel - Linux MIPS (ex. Xiaomi Gateway 3, Wyze cameras)
- go2rtc_mac_amd64.zip - Mac Intel 64-bit
- go2rtc_mac_arm64.zip - Mac ARM 64-bit

Don't forget to fix the rights with chmod +x go2rtc_xxx_xxx on Linux and Mac.
Container alexxit/go2rtc with support for amd64, 386, arm64 and arm. This container is the same as the Home Assistant Add-on, but can be used separately from Home Assistant. The container has FFmpeg, Ngrok and Python preinstalled.
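A minimal docker-compose sketch for running this container (the host volume path and the in-container /config location are assumptions; adjust to wherever your go2rtc.yaml lives):

services:
  go2rtc:
    image: alexxit/go2rtc
    network_mode: host # recommended so WebRTC and ONVIF autodiscovery work (see notes below)
    restart: always
    volumes:
      - ~/go2rtc:/config # assumed folder containing go2rtc.yaml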
https://github.com/AlexxIT/hassio-addons
WebRTC Camera custom component can be used on any Home Assistant installation, including HassWP on Windows. It can automatically download and use the latest version of go2rtc. Or it can connect to an existing version of go2rtc. Addon installation in this case is optional.
Latest, but maybe unstable version:
- Docker: alexxit/go2rtc:master or alexxit/go2rtc:master-hardware versions
- Binary: go2rtc master or go2rtc master hardware versions

Default settings:
- config file go2rtc.yaml in the current working directory
- the api server will start on the default port 1984 (TCP) - WebUI at http://localhost:1984/
- the rtsp server will start on the default port 8554 (TCP)
- webrtc will use port 8555 (TCP/UDP) for connections
- ffmpeg will use the default transcoding options

Configuration options and a complete list of settings can be found in the wiki.
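A minimal go2rtc.yaml sketch (stream name and camera URL are placeholders), with everything else left on the defaults above:

streams:
  camera1: rtsp://username:password@192.168.1.123/stream1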
Available modules:
go2rtc supports different stream source types. You can configure one or multiple links of any type as a stream source.
Available source types:
- RTSP and RTSPS cameras with two way audio support
- RTMP streams
- HTTP-FLV, MPEG-TS, JPEG (snapshots) and MJPEG streams
- RTSP link and snapshot link using the ONVIF protocol
- FFmpeg integration (HLS, files and many others)

Read more about incoming sources.
Two way audio is supported for these sources: RTSP/ONVIF cameras, TP-Link Tapo, Hikvision ISAPI, Roborock vacuums and the browser source.
Two way audio can be used in the browser with WebRTC technology. The browser will give access to the microphone only for HTTPS sites (read more).
go2rtc can also play audio files and live streams on these cameras.
streams:
sonoff_camera: rtsp://rtsp:12345678@192.168.1.123/av_stream/ch0
dahua_camera:
- rtsp://admin:password@192.168.1.123/cam/realmonitor?channel=1&subtype=0&unicast=true&proto=Onvif
- rtsp://admin:password@192.168.1.123/cam/realmonitor?channel=1&subtype=1
amcrest_doorbell:
- rtsp://username:password@192.168.1.123:554/cam/realmonitor?channel=1&subtype=0#backchannel=0
unifi_camera: rtspx://192.168.1.123:7441/fD6ouM72bWoFijxK
glichy_camera: ffmpeg:rtsp://username:password@192.168.1.123/live/ch00_1
Recommendations
- Add #backchannel=0 to the end of your RTSP link in the YAML config file to disable two way audio for cameras that have problems with it (see the amcrest_doorbell example above)
- For Ubiquiti UniFi cameras use the rtspx:// prefix instead of rtsps://, and don't use the ?enableSrtp suffix

Other options
Format: rtsp...#{param1}#{param2}#{param3}
- #timeout=30 - custom timeout (in seconds)
- #media=video - ignore audio, or #media=audio - ignore video
- #backchannel=0 - important for some glitchy cameras
- #transport=ws... - RTSP over WebSocket
streams:
# WebSocket with authorization, RTSP - without
axis-rtsp-ws: rtsp://192.168.1.123:4567/axis-media/media.amp?overview=0&camera=1&resolution=1280x720&videoframeskipmode=empty&Axis-Orig-Sw=true#transport=ws://user:pass@192.168.1.123:4567/rtsp-over-websocket
# WebSocket without authorization, RTSP - with
dahua-rtsp-ws: rtsp://user:pass@192.168.1.123/cam/realmonitor?channel=1&subtype=1&proto=Private3#transport=ws://192.168.1.123/rtspoverwebsocket
You can get a stream from an RTMP server, for example from Frigate.
streams:
rtmp_stream: rtmp://192.168.1.123/live/camera1
Supported Content-Types:
- HTTP-FLV (video/x-flv) - same as RTMP, but over HTTP
- JPEG (image/jpeg) - camera snapshot link, can be converted by go2rtc to an MJPEG stream
- MJPEG (multipart/x) - simple MJPEG stream over HTTP
- MPEG-TS (video/mpeg) - legacy streaming format

The source also supports HTTP and TCP streams with autodetection of different formats: MJPEG, H.264/H.265 bitstream, MPEG-TS.
streams:
# [HTTP-FLV] stream in video/x-flv format
http_flv: http://192.168.1.123:20880/api/camera/stream/780900131155/657617
# [JPEG] snapshots from Dahua camera, will be converted to MJPEG stream
dahua_snap: http://admin:password@192.168.1.123/cgi-bin/snapshot.cgi?channel=1
# [MJPEG] stream will be proxied without modification
http_mjpeg: https://mjpeg.sanford.io/count.mjpeg
# [MJPEG or H.264/H.265 bitstream or MPEG-TS]
tcp_magic: tcp://192.168.1.123:12345
# Add custom header
custom_header: "https://mjpeg.sanford.io/count.mjpeg#header=Authorization: Bearer XXX"
PS. Dahua cameras have a bug: if you select the MJPEG codec for the second RTSP stream, snapshots won't work.
This source (ONVIF) is not very useful if you already know the RTSP and snapshot links for your camera. But it can be useful if you don't.
The WebUI > Add webpage supports ONVIF autodiscovery. Your server must be on the same subnet as the camera. If you use Docker, you must use "network host".
streams:
dahua1: onvif://admin:password@192.168.1.123
reolink1: onvif://admin:password@192.168.1.123:8000
tapo1: onvif://admin:password@192.168.1.123:2020
You can get any stream or file or device via FFmpeg and push it to go2rtc. The app will automatically start FFmpeg with the proper arguments when someone starts watching the stream.
Format: ffmpeg:{input}#{param1}#{param2}#{param3}
. Examples:
streams:
# [FILE] all tracks will be copied without transcoding codecs
file1: ffmpeg:/media/BigBuckBunny.mp4
# [FILE] video will be transcoded to H264, audio will be skipped
file2: ffmpeg:/media/BigBuckBunny.mp4#video=h264
# [FILE] video will be copied, audio will be transcoded to pcmu
file3: ffmpeg:/media/BigBuckBunny.mp4#video=copy#audio=pcmu
# [HLS] video will be copied, audio will be skipped
hls: ffmpeg:https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/gear5/prog_index.m3u8#video=copy
# [MJPEG] video will be transcoded to H264
mjpeg: ffmpeg:http://185.97.122.128/cgi-bin/faststream.jpg#video=h264
# [RTSP] video with rotation, should be transcoded, so select H264
rotate: ffmpeg:rtsp://rtsp:12345678@192.168.1.123/av_stream/ch0#video=h264#rotate=90
All transcoding formats have built-in templates: h264, h265, opus, pcmu, pcmu/16000, pcmu/48000, pcma, pcma/16000, pcma/48000, aac, aac/16000.
But you can override them via YAML config. You can also add your own formats to config and use them with source params.
ffmpeg:
bin: ffmpeg # path to ffmpeg binary
h264: "-codec:v libx264 -g:v 30 -preset:v superfast -tune:v zerolatency -profile:v main -level:v 4.1"
mycodec: "-any args that supported by ffmpeg..."
myinput: "-fflags nobuffer -flags low_delay -timeout 5000000 -i {input}"
myraw: "-ss 00:00:20"
Some useful params:
- you can use a go2rtc stream name as the FFmpeg input (ex. ffmpeg:camera1#video=h264)
- you can use the video and audio params multiple times (ex. #video=copy#audio=copy#audio=pcmu)
- you can use the rotate param with 90, 180, 270 or -90 values, important with transcoding (ex. #video=h264#rotate=90)
- you can use the width and/or height params, important with transcoding (ex. #video=h264#width=1280)
- you can use drawtext to add a timestamp (ex. drawtext=x=2:y=2:fontsize=12:fontcolor=white:box=1:boxcolor=black)
- you can use the raw param for any additional FFmpeg arguments (ex. #raw=-vf transpose=1)
- you can use the input param to override the default input template (ex. #input=rtsp/udp will change RTSP transport from TCP to UDP+TCP)
- you can also pass a full input template directly (ex. #input=-timeout 5000000 -i {input})

Read more about hardware acceleration.
PS. It is recommended to check the available hardware in the WebUI add page.
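For example, several of the params above can be combined in a single source (the camera URL and values are placeholders):

streams:
  rotated_cam: ffmpeg:rtsp://admin:password@192.168.1.123/stream1#video=h264#rotate=90#width=1280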
You can get video from any USB camera or webcam as a RTSP or WebRTC stream. This is part of the FFmpeg integration.
The video_size and framerate must be supported by your camera!
Format: ffmpeg:device?{input-params}#{param1}#{param2}#{param3}
streams:
linux_usbcam: ffmpeg:device?video=0&video_size=1280x720#video=h264
windows_webcam: ffmpeg:device?video=0#video=h264
macos_facetime: ffmpeg:device?video=0&audio=1&video_size=1280x720&framerate=30#video=h264#audio=pcma
PS. It is recommended to check the available devices in the WebUI add page.
Exec source can run any external application and expect data from it. Two transports are supported - pipe and RTSP.
If you want to use RTSP transport - the command must contain the {output}
argument in any place. On launch, it will be replaced by the local address of the RTSP server.
The pipe transport reads data from the app's stdout in different formats: MJPEG, H.264/H.265 bitstream, MPEG-TS.
The source can be used, for example, with FFmpeg or Raspberry Pi camera tools (libcamera-vid), as shown below:
streams:
stream: exec:ffmpeg -re -i /media/BigBuckBunny.mp4 -c copy -rtsp_transport tcp -f rtsp {output}
picam_h264: exec:libcamera-vid -t 0 --inline -o -
picam_mjpeg: exec:libcamera-vid -t 0 --codec mjpeg -o -
Some sources may have a dynamic link, and you will need to get it using a bash or python script. Your script should echo a link to the source: RTSP, FFmpeg or any of the supported sources.
Docker and Hass Add-on users have python3, curl and jq preinstalled.
Check examples in wiki.
streams:
apple_hls: echo:python3 hls.py https://developer.apple.com/streaming/examples/basic-stream-osx-ios5.html
Important:
- go2rtc supports importing paired HomeKit devices from Home Assistant, so you can use a HomeKit camera with Hass and go2rtc simultaneously. If you are using Hass, I recommend pairing devices with it; this will give you more options.

You can pair a device with go2rtc on the HomeKit page. If you can't see your devices - reload the page. Also try rebooting your HomeKit device (power off). If you still can't see it - you have a problem with mDNS.
If you see a device but it does not have a pair button - it is paired to some ecosystem (Apple Home, Home Assistant, HomeBridge, etc.). You need to delete the device from that ecosystem, and it will become available for pairing. If you cannot unpair the device, you will have to reset it.
Important:
- HomeKit camera audio can't be played in VLC and probably any other player

Recommended settings for using a HomeKit camera with WebRTC, MSE, MP4, RTSP:
streams:
aqara_g3:
- hass:Camera-Hub-G3-AB12
- ffmpeg:aqara_g3#audio=aac#audio=opus
RTSP link with "normal" audio for any player: rtsp://192.168.1.123:8554/aqara_g3?video&audio=aac
This source is in active development! Tested only with Aqara Camera Hub G3 (both EU and CN versions).
Other names: ESeeCloud, dvr163.
You can skip username, password, port, ch and stream if they are the defaults.
streams:
camera1: bubble://username:password@192.168.1.123:34567/bubble/live?ch=0&stream=0
Other names: DVR-IP, NetSurveillance, Sofia protocol (NETsurveillance ActiveX plugin XMeye SDK).
You can skip username, password, port, channel and subtype if they are the defaults. Use subtype=0 for the Main stream and subtype=1 for the Extra1 stream.
streams:
camera1: dvrip://username:password@192.168.1.123:34567?channel=0&subtype=0
TP-Link Tapo proprietary camera protocol with two way audio support.
Use the cloud password without a username, or the admin username with the UPPERCASE MD5 hash of your cloud password.
streams:
# cloud password without username
camera1: tapo://cloud-password@192.168.1.123
# admin username and UPPERCASE MD5 cloud-password hash
camera2: tapo://admin:MD5-PASSWORD-HASH@192.168.1.123
TP-Link Kasa non-standard protocol more info.
streams:
kasa: kasa://user:pass@192.168.1.123:19443/https/stream/mixed
Supports public cameras from the Ivideon service.
streams:
quailcam: ivideon:100-tu5dkUPct39cTp9oNEN2B6/0
Supports importing camera links from Home Assistant config files:
hass:
config: "/config" # skip this setting if you are a Hass Add-on user
streams:
generic_camera: hass:Camera1 # Settings > Integrations > Integration Name
aqara_g3: hass:Camera-Hub-G3-AB12
WebRTC Cameras
Any cameras in WebRTC format are supported. But at the moment Home Assistant only supports some Nest cameras in this format.
The Nest API only allows you to get a link to a stream for 5 minutes. So every 5 minutes the stream will be reconnected.
streams:
# link to Home Assistant Supervised
hass-webrtc1: hass://supervisor?entity_id=camera.nest_doorbell
# link to external Hass with Long-Lived Access Tokens
hass-webrtc2: hass://192.168.1.123:8123?entity_id=camera.nest_doorbell&token=eyXYZ...
RTSP Cameras
By default, the Home Assistant API does not allow you to get a dynamic RTSP link to a camera stream. So more cameras, like Tuya, and possibly others, can also be imported using this method.
This source type only supports backchannel audio for the Hikvision ISAPI protocol, so it should be used as a second source in addition to the RTSP protocol.
streams:
hikvision1:
- rtsp://admin:password@192.168.1.123:554/Streaming/Channels/101
- isapi://admin:password@192.168.1.123:80/
Currently only WebRTC cameras are supported. The stream reconnects every 5 minutes.
For simplicity, it is recommended to connect the Nest/WebRTC camera to Home Assistant. But if you can somehow get the parameters below, the Nest/WebRTC source will work without Hass.
streams:
nest-doorbell: nest:?client_id=***&client_secret=***&refresh_token=***&project_id=***&device_id=***
This source type supports Roborock vacuums with cameras. Known working models:
The source supports loading Roborock credentials from the Home Assistant custom integration. Otherwise, you need to log in to your Roborock account (a MiHome account is not supported). Go to: go2rtc WebUI > Add webpage, copy the roborock://... source for your vacuum and paste it into your go2rtc.yaml config.
If you have a graphic pin for your vacuum - add it as a numeric pin (lines: 123, 456, 678) to the end of the roborock link.
This source type supports four connection formats.
whep
WebRTC/WHEP is an unapproved standard for WebRTC video/audio viewers. But it may already be supported in some third-party software. It is supported in go2rtc.
go2rtc
This format is only supported in go2rtc. Unlike WHEP it supports asynchronous WebRTC connection and two way audio.
openipc
Supports connection to OpenIPC cameras.
wyze
Supports connection to Wyze cameras, using the WebRTC protocol. You can use the docker-wyze-bridge project to get connection credentials.
kinesis
Supports Amazon Kinesis Video Streams, using the WebRTC protocol. You need to specify the signalling WebSocket URL with all credentials in query params, the client_id, and the ice_servers list in JSON format.
streams:
webrtc-whep: webrtc:http://192.168.1.123:1984/api/webrtc?src=camera1
webrtc-go2rtc: webrtc:ws://192.168.1.123:1984/api/ws?src=camera1
webrtc-openipc: webrtc:ws://192.168.1.123/webrtc_ws#format=openipc#ice_servers=[{"urls":"stun:stun.kinesisvideo.eu-north-1.amazonaws.com:443"}]
webrtc-wyze: webrtc:http://192.168.1.123:5000/signaling/camera1?kvs#format=wyze
webrtc-kinesis: webrtc:wss://...amazonaws.com/?...#format=kinesis#client_id=...#ice_servers=[{...},{...}]
PS. For kinesis sources you can use the echo source to get connection params using bash/python or any other scripting language.
This source can get a stream from another go2rtc via WebTorrent protocol.
streams:
webtorrent1: webtorrent:?share=huofssuxaty00izc&pwd=k3l2j9djeg8v8r7e
By default, go2rtc establishes a connection to the source when any client requests it. Go2rtc drops the connection to the source when it has no clients left.
Examples
ffmpeg -re -i BigBuckBunny.mp4 -c copy -rtsp_transport tcp -f rtsp rtsp://localhost:8554/camera1
ffmpeg -re -i BigBuckBunny.mp4 -c mjpeg -f mpjpeg http://localhost:1984/api/stream.mjpeg?dst=camera1
ffmpeg -re -i BigBuckBunny.mp4 -c copy -f flv http://localhost:1984/api/stream.flv?dst=camera1
ffmpeg -re -i BigBuckBunny.mp4 -c copy -f mpegts http://localhost:1984/api/stream.ts?dst=camera1
You can turn the browser of any PC or mobile into an IP camera with support for video and two way audio. Or even broadcast your PC screen:
- add a stream to your go2rtc.yaml config
- open the links page for your stream in the go2rtc WebUI
- select the camera+microphone or display+speaker option
- open the webrtc local page (your go2rtc should work over HTTPS!) or the share link via WebTorrent technology (works over HTTPS by default)

You can use OBS Studio or any other broadcast software with WHIP protocol support. This standard has not yet been approved. But you can download the OBS Studio dev version.
go2rtc supports playing audio files (ex. music or TTS) and live streams (ex. radio) on cameras with two way audio support (RTSP/ONVIF cameras, TP-Link Tapo, Hikvision ISAPI, Roborock vacuums, any browser).
API example:
POST http://localhost:1984/api/streams?dst=camera1&src=ffmpeg:http://example.com/song.mp3#audio=pcma#input=file
- some cameras support only the low quality PCMA/8000 codec (ex. Tapo)
- choose a higher quality format if your camera supports it (ex. PCMA/48000 for some Dahua cameras)
- if you play files, add the #input=file param for transcoding, so the file will be transcoded and played in real time
- if you play live streams, skip the #input param, because the stream is already in real time
- you can stop active playback by calling the API with an empty src parameter

The HTTP API is the main part for interacting with the application. Default address: http://localhost:1984/.
Module config
- you can disable the HTTP API with listen: "" and use, for example, only the RTSP client/server protocol
- you can enable the HTTP API only on localhost with the listen: "127.0.0.1:1984" setting
- you can change the API base_path and host go2rtc on a suburl of your main app webserver
- all files from static_dir are hosted on the root path: /
api:
listen: ":1984" # default ":1984", HTTP API port ("" - disabled)
username: "admin" # default "", Basic auth for WebUI
password: "pass" # default "", Basic auth for WebUI
base_path: "/rtc" # default "", API prefix for serve on suburl (/api => /rtc/api)
static_dir: "www" # default "", folder for static files (custom web interface)
origin: "*" # default "", allow CORS requests (only * supported)
tls_listen: ":443" # default "", enable HTTPS server
tls_cert: | # default "", PEM-encoded fullchain certificate for HTTPS
-----BEGIN CERTIFICATE-----
...
-----END CERTIFICATE-----
tls_key: | # default "", PEM-encoded private key for HTTPS
-----BEGIN PRIVATE KEY-----
...
-----END PRIVATE KEY-----
You can get any stream as RTSP-stream: rtsp://192.168.1.123:8554/{stream_name}
You can enable external password protection for your RTSP streams. Password protection is always disabled for localhost calls (ex. FFmpeg or Hass on the same server).
rtsp:
listen: ":8554" # RTSP Server TCP port, default - 8554
username: "admin" # optional, default - disabled
password: "pass" # optional, default - disabled
default_query: "video&audio" # optional, default codecs filters
By default go2rtc provides the RTSP stream with only the first video and the first audio track. You can change this with the default_query setting:
- default_query: "mp4" - MP4 compatible codecs (H264, H265, AAC)
- default_query: "video=all&audio=all" - all tracks from all sources (not all players can handle this)
- default_query: "video=h264,h265" - only one video track (H264 or H265)
- default_query: "video&audio=all" - only the first video and all audio as separate tracks

Read more about codecs filters.
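For example, a sketch that makes every RTSP restream default to MP4-compatible codecs (useful for recorders like Hass or Frigate, see the filter examples later on); the setting goes under the rtsp module:

rtsp:
  listen: ":8554"
  default_query: "mp4" # H264/H265/AAC selected by default for all RTSP clients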
In most cases WebRTC uses a direct peer-to-peer connection from your browser to go2rtc and sends media data via UDP. It can't pass media data through your Nginx, Cloudflare or Nabu Casa HTTP TCP connection! go2rtc automatically detects your external IP via a public STUN server and can establish an external direct connection via UDP hole punching, even if you have not opened your server to the world.
But about 10-20% of users may need to configure additional settings for external access, if the mobile phone or the go2rtc server is behind a Symmetric NAT.
webrtc:
listen: ":8555" # address of your local server and port (TCP/UDP)
Static public IP
webrtc:
candidates:
- 216.58.210.174:8555 # if you have static public IP-address
Dynamic public IP
Add the stun word and an external port to the YAML config:
webrtc:
candidates:
- stun:8555 # if you have dynamic public IP-address
Private IP
Use the Ngrok integration (see the Ngrok module below):
ngrok:
  command: ...
Hard tech way 1. Own TCP-tunnel
If you have a personal VPS, you can create a TCP tunnel and set it up the same way as "Static public IP". But use your VPS IP address in the YAML config.
Hard tech way 2. Using TURN-server
If you have a personal VPS, you can install a TURN server (e.g. coturn, config example).
webrtc:
ice_servers:
- urls: [stun:stun.l.google.com:19302]
- urls: [turn:123.123.123.123:3478]
username: your_user
credential: your_pass
The HomeKit module can work in two modes: as a viewer for HomeKit cameras (the homekit source described above), or as a server that shares your streams to the Apple HomeKit ecosystem (this module).
Important
- HomeKit supports only H264 video and OPUS audio (see the transcoding lines in the full config below)
Minimal config
streams:
dahua1: rtsp://admin:password@192.168.1.123/cam/realmonitor?channel=1&subtype=0
homekit:
dahua1: # same stream ID from streams list, default PIN - 19550224
Full config
streams:
dahua1:
- rtsp://admin:password@192.168.1.123/cam/realmonitor?channel=1&subtype=0
- ffmpeg:dahua1#video=h264#hardware # if your camera doesn't support H264, important for HomeKit
- ffmpeg:dahua1#audio=opus # only OPUS audio supported by HomeKit
homekit:
dahua1: # same stream ID from streams list
pin: 12345678 # custom PIN, default: 19550224
name: Dahua camera # custom camera name, default: generated from stream ID
device_id: dahua1 # custom ID, default: generated from stream ID
device_private: dahua1 # custom key, default: generated from stream ID
Proxy HomeKit camera
streams:
aqara1:
- homekit://...
- ffmpeg:aqara1#audio=aac#audio=opus # optional audio transcoding
homekit:
aqara1: # same stream ID from streams list
This module supports sharing streams via the WebTorrent protocol - securely and for free. You do not need to open public access to the go2rtc server. But in some cases (Symmetric NAT) you may need to set up external access for the WebRTC module.
To generate a sharing link or an incoming link - go to the go2rtc WebUI (stream links page). This link is temporary and will stop working after go2rtc is restarted!
You can create permanent external links in go2rtc config:
webtorrent:
shares:
super-secret-share: # share name, should be unique among all go2rtc users!
pwd: super-secret-password
src: rtsp-dahua1 # stream name from streams section
Link example: https://alexxit.github.io/go2rtc/#share=02SNtgjKXY&pwd=wznEQqznxW&media=video+audio
TODO: article how it works...
With the Ngrok integration you can get external access to your streams in situations where your Internet connection has a private IP address.
Ngrok free subscription limitations:
go2rtc will automatically get your external TCP address (if you enable it in the ngrok config) and use it for the WebRTC connection (if you enable it in the webrtc config).
You need to manually download the Ngrok agent app for your OS and register with the Ngrok service.
Tunnel for only WebRTC Stream
You need to add your Ngrok token and WebRTC TCP port to YAML:
ngrok:
command: ngrok tcp 8555 --authtoken eW91IHNoYWxsIG5vdCBwYXNzCnlvdSBzaGFsbCBub3QgcGFzcw
Tunnel for WebRTC and Web interface
You need to create ngrok.yaml
config file and add it to go2rtc config:
ngrok:
command: ngrok start --all --config ngrok.yaml
Ngrok config example:
version: "2"
authtoken: eW91IHNoYWxsIG5vdCBwYXNzCnlvdSBzaGFsbCBub3QgcGFzcw
tunnels:
api:
addr: 1984 # use the same port as in go2rtc config
proto: http
basic_auth:
- admin:password # you can set login/pass for your web interface
webrtc:
addr: 8555 # use the same port as in go2rtc config
proto: tcp
The best and easiest way to use go2rtc inside Home Assistant is to install the WebRTC Camera custom integration and custom lovelace card.
But go2rtc is also compatible and can be used with RTSPtoWebRTC built-in integration.
You have several options on how to add a camera to Home Assistant:
- point a supporting integration (e.g. WebRTC Camera or RTSPtoWebRTC) at the local go2rtc server: 127.0.0.1, Port: 1984
- Generic Camera with the stream source rtsp://127.0.0.1:8554/camera1 (change to your stream name, leave everything else as is)

You have several options on how to watch the stream from the cameras in Home Assistant:
- Camera Entity => Picture Entity Card => Technology: HLS, codecs: H264/H265/AAC, poor latency.
- Camera Entity => RTSPtoWebRTC => Picture Entity Card => Technology: WebRTC, codecs: H264/PCMU/PCMA/OPUS, best latency.
  - server URL: http://127.0.0.1:1984/
  - STUN server: stun.l.google.com:19302
- Camera Entity or Camera URL => WebRTC Camera => Technology: WebRTC/MSE/MP4/MJPEG, codecs: H264/H265/AAC/PCMU/PCMA/OPUS, best latency, best compatibility.
You can add camera entity_id
to go2rtc config if you need transcoding:
streams:
"camera.hall": ffmpeg:{input}#video=copy#audio=opus
PS. Default Home Assistant lovelace cards don't support 2-way audio. You can use 2-way audio from the Add-on Web UI. But you need to use HTTPS to access the microphone. This is a browser restriction and cannot be avoided.
PS. There is also another nice card with go2rtc support - Frigate Lovelace Card.
Provides several features:
API examples:
- MP4 snapshot: http://192.168.1.123:1984/api/frame.mp4?src=camera1 (H264, H265)
- MP4 stream: http://192.168.1.123:1984/api/stream.mp4?src=camera1 (H264, H265, AAC)
- MP4 stream with the codec filter params below: http://192.168.1.123:1984/api/stream.mp4?src=camera1 (H264, H265*, AAC, OPUS, MP3, PCMA, PCMU, PCM)

Params:
- mp4, mp4=flac and mp4=all params for codec filters
- duration param in seconds (ex. duration=15)
- filename param (ex. filename=record.mp4)
- rotate param with 90, 180 or 270 values
- scale param with positive integer values (ex. scale=4:3)

Read more about codecs filters.
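For example, several of the params above can be combined in one request to save a short rotated recording (host, stream name and values are placeholders):
http://192.168.1.123:1984/api/stream.mp4?src=camera1&mp4=flac&duration=15&filename=record.mp4&rotate=90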
PS. Rotate and scale params don't use transcoding and change video using metadata.
HLS is the worst technology for real-time streaming. It can only be useful on devices that do not support more modern technology, like WebRTC, MSE/MP4.
The go2rtc implementation differs from the standards and may not work with all players.
API examples:
- HLS/TS stream: http://192.168.1.123:1984/api/stream.m3u8?src=camera1 (H264)
- HLS/fMP4 stream: http://192.168.1.123:1984/api/stream.m3u8?src=camera1&mp4 (H264, H265, AAC)

Read more about codecs filters.
Important. To stream in MJPEG format, your source MUST contain the MJPEG codec. If your stream has an MJPEG codec - you can get an MJPEG stream or JPEG snapshots via the API.
You can receive an MJPEG stream in several ways:
With this example, your stream will have both H264 and MJPEG codecs:
streams:
camera1:
- rtsp://rtsp:12345678@192.168.1.123/av_stream/ch0
- ffmpeg:camera1#video=mjpeg
API examples:
- MJPEG stream: http://192.168.1.123:1984/api/stream.mjpeg?src=camera1
- JPEG snapshot: http://192.168.1.123:1984/api/frame.jpeg?src=camera1

Params:
- width/w and/or height/h params
- rotate param with 90, 180, 270 or -90 values
- hardware/hw param (read more)

You can set different log levels for different modules.
log:
level: info # default level
api: trace
exec: debug
ngrok: info
rtsp: warn
streams: error
webrtc: fatal
By default go2rtc starts the Web interface on port 1984 and RTSP on port 8554, and uses port 8555 for WebRTC connections. All three ports are accessible from your local network, so anyone on your local network can watch video from your cameras without authorization. The same applies to the Home Assistant Add-on.
This is not a problem if you trust your local network as much as I do. But you can change this behaviour with a go2rtc.yaml
config:
api:
listen: "127.0.0.1:1984" # localhost
rtsp:
listen: "127.0.0.1:8554" # localhost
webrtc:
listen: ":8555" # external TCP/UDP port
If you need Web interface protection without the Home Assistant Add-on - you need to use a reverse proxy, like Nginx, Caddy, Ngrok, etc.
PS. Additionally WebRTC will try to use UDP port 8555 to transmit encrypted media. It works without problems on the local network. And sometimes it also works for external access, even if you haven't opened this port on your router (read more). But for stable external WebRTC access, you need to open port 8555 on your router for both TCP and UDP.
go2rtc can automatically detect which codecs your device supports for WebRTC and MSE technologies.
But this cannot be done for the RTSP, HTTP progressive streaming and HLS technologies. You can manually add a codec filter when you create a link to a stream. The filters work the same for all three technologies. Filters do not create new codecs; they only select suitable codecs from the existing sources. You can add new codecs to a stream using FFmpeg transcoding.
Without filters, the default codec selection applies: for RTSP - only the first video and the first audio track (see the default_query description in the RTSP module above).
Some examples:
- rtsp://192.168.1.123:8554/camera1?mp4 - useful for recording as MP4 files (e.g. Hass or Frigate)
- rtsp://192.168.1.123:8554/camera1?video=h264,h265&audio=aac - full version of the filter above
- rtsp://192.168.1.123:8554/camera1?video=h264&audio=aac&audio=opus - H264 video codec and two separate audio tracks
- rtsp://192.168.1.123:8554/camera1?video&audio=all - any video codec and all audio codecs as separate tracks
- http://192.168.1.123:1984/api/stream.m3u8?src=camera1&mp4 - HLS stream with MP4 compatible codecs (HLS/fMP4)
- http://192.168.1.123:1984/api/stream.m3u8?src=camera1&mp4=flac - HLS stream with PCMA/PCMU/PCM audio support (HLS/fMP4), won't work on old devices
- http://192.168.1.123:1984/api/stream.mp4?src=camera1&mp4=flac - MP4 file with PCMA/PCMU/PCM audio support, won't work on old devices (ex. iOS 12)
- http://192.168.1.123:1984/api/stream.mp4?src=camera1&mp4=all - MP4 file with non-standard audio codecs, won't work on some players

AVC/H.264 video can be played almost anywhere. But HEVC/H.265 has a lot of limitations in support across different devices and browsers. It's all about patents and money; you can't do anything about it.
Device | WebRTC | MSE | HTTP | HLS |
---|---|---|---|---|
latency | best | medium | bad | bad |
Desktop Chrome 107+ | H264, OPUS, PCMU, PCMA | H264, H265*, AAC, FLAC*, OPUS | H264, H265*, AAC, FLAC*, OPUS, MP3 | no |
Desktop Edge | H264, OPUS, PCMU, PCMA | H264, H265*, AAC, FLAC*, OPUS | H264, H265*, AAC, FLAC*, OPUS, MP3 | no |
Android Chrome 107+ | H264, OPUS, PCMU, PCMA | H264, H265*, AAC, FLAC*, OPUS | H264, H265*, AAC, FLAC*, OPUS, MP3 | no |
Desktop Firefox | H264, OPUS, PCMU, PCMA | H264, AAC, FLAC*, OPUS | H264, AAC, FLAC*, OPUS | no |
Desktop Safari 14+ | H264, H265*, OPUS, PCMU, PCMA | H264, H265, AAC, FLAC* | no! | H264, H265, AAC, FLAC* |
iPad Safari 14+ | H264, H265*, OPUS, PCMU, PCMA | H264, H265, AAC, FLAC* | no! | H264, H265, AAC, FLAC* |
iPhone Safari 14+ | H264, H265*, OPUS, PCMU, PCMA | no! | no! | H264, H265, AAC, FLAC* |
macOS Hass App | no | no | no | H264, H265, AAC, FLAC* |
HTTP* - HTTP Progressive Streaming, not related to Progressive Download, because the file has no size and no end.

Audio:
- go2rtc converts the PCMA/PCMU/PCM codecs to FLAC for MSE/MP4/HLS so they will work almost anywhere
- WebRTC audio codecs: PCMU/8000, PCMA/8000, OPUS/48000/2
- OPUS and MP3 inside MP4 are part of the standard, but some players do not support them anyway (especially Apple)

Codec names:
- PCMA (alaw)
- PCMU (mulaw)
- PCM (s16be)
)There are no plans to embed complex transcoding algorithms inside go2rtc. FFmpeg source does a great job with this. Including hardware acceleration support.
But go2rtc has some simple algorithms. They are turned on automatically, you do not need to set them up additionally.
PCM for MSE/MP4/HLS
go2rtc can pack the PCMA, PCMU and PCM codecs into an MP4 container so that they work in all browsers and all built-in players on modern devices, including Apple QuickTime:
PCMA/PCMU => PCM => FLAC => MSE/MP4/HLS
Resample PCMA/PCMU for WebRTC
By default WebRTC supports only PCMA/8000 and PCMU/8000. But go2rtc can automatically resample PCMA and PCMU codecs with a different sample rate. go2rtc can also transcode the PCM codec to PCMA/8000, so WebRTC can play it:
PCM/xxx => PCMA/8000 => WebRTC
PCMA/xxx => PCMA/8000 => WebRTC
PCMU/xxx => PCMU/8000 => WebRTC
Important
For example, you want to watch the RTSP stream from a Dahua IPC-K42 camera in your Chrome browser.
- you select H264 in the camera settings for video
- you select AAC/16000 in the camera settings for audio
- the browser can't play AAC/16000 directly, so the audio is transcoded to the OPUS/48000/2 codec, because it is higher quality than PCMU/8000 or PCMA/8000
Now you have stream with two sources - RTSP and FFmpeg:
streams:
dahua:
- rtsp://admin:password@192.168.1.123/cam/realmonitor?channel=1&subtype=0&unicast=true&proto=Onvif
- ffmpeg:rtsp://admin:password@192.168.1.123/cam/realmonitor?channel=1&subtype=0#audio=opus
go2rtc automatically matches codecs between your browser and all your stream sources. This is called multi-source 2-way codecs negotiation, and it is one of the main features of this app.
PS. You can select the PCMU or PCMA codec in the camera settings and not use transcoding at all. Or you can select the AAC codec for the main stream and the PCMU codec for the second stream and add both RTSP links to the YAML config; this will also work fine.

Cameras with /live/ch00_1 in the RTSP URL - awful but usable RTSP protocol realisation, low stream quality, few settings, packet loss?

Using apps for low RTSP delay
ffplay -fflags nobuffer -flags low_delay "rtsp://192.168.1.123:8554/camera1"
Snapshots to Telegram
Q. What's the difference between go2rtc, WebRTC Camera and RTSPtoWebRTC?
go2rtc is a new version of the server-side WebRTC Camera integration, completely rewritten from scratch, with a number of fixes and a huge number of new features. It is compatible with native Home Assistant RTSPtoWebRTC integration. So you can use default lovelace Picture Entity or Picture Glance.
Q. Should I use go2rtc addon or WebRTC Camera integration?
go2rtc is more than just viewing your stream online with WebRTC/MSE/HLS/etc. You can use it all the time for your various tasks. But every time Hass is rebooted - all integrations are also rebooted. So your streams may be interrupted if you use them in additional tasks.
Basic users can use WebRTC Camera integration. Advanced users can use go2rtc addon or Frigate 12+ addon.
Q. Which RTSP link should I use inside Hass?
You can use a direct link to your cameras there (as you always do). go2rtc supports a zero-config feature: you may leave the streams config section empty, and your streams will be created on the fly on first use from Hass. In this case your cameras will have multiple connections - some from Hass directly and one from go2rtc.
Also you can specify your streams in the go2rtc config file and use RTSP links to this add-on (with additional features: multi-source codecs negotiation or FFmpeg transcoding for unsupported codecs), or use them as sources for Frigate. Then your cameras will have one connection from go2rtc, and go2rtc will have multiple connections - some from Hass via the RTSP protocol, and some from your browser via the WebRTC/MSE/HLS protocols.
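A minimal sketch of the second approach (stream name, credentials and camera URL are placeholders): define the camera once in go2rtc, then point Hass (Generic Camera) or Frigate at the local restream rtsp://127.0.0.1:8554/front_door.

streams:
  front_door: rtsp://admin:password@192.168.1.123/cam/realmonitor?channel=1&subtype=0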
Use whichever config you like.
Q. What about a lovelace card with 2-way audio support?
At this moment I am focused on improving stability and adding new features to go2rtc. Maybe someone could write such a card themselves. It's not difficult, I have some sketches.