Simulated Video Sources

Often it is desirable to have a simulated video source at hand if no actual video feed is easily available, for instance for a quick evaluation of the Nabto Edge platform in video scenarios (i.e. tunnelled RTSP or Nabto Edge WebRTC scenarios), or to enable client application development before integration is done on the camera. This document outlines different approaches to creating simulated video streams.

At the core of all examples in this guide is the GStreamer framework, used for creating a video stream with a test pattern and a timestamp overlay.

Installing GStreamer is a prerequisite for all the examples below, except the Docker-based demo, which is completely self-contained. GStreamer is typically available through platform standard package managers; for instance, install it using brew on macOS.
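For reference, typical install commands are sketched below; exact package names may vary with your platform and package manager version, so treat these as a starting point rather than canonical instructions. On macOS with Homebrew:

brew install gstreamer

On Debian/Ubuntu, the core tools and the plugin sets used by the pipelines in this guide (x264enc, avdec_h264, etc.) can typically be installed with:

sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good \
  gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly gstreamer1.0-libav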

RTSP Docker Container

The simplest way to obtain a simulated video source is using our RTSP service Docker container. To summarize the build instructions in the repo README, you build and run the container as follows:

git clone https://github.com/nabto/rtsp-demo-container
cd rtsp-demo-container
docker build -t rtsp-demo-server .
docker run --rm -it -p 8554:8554 rtsp-demo-server

Then you can use the RTSP feed from your RTSP client. For instance, test it using VLC:

vlc --rtsp-tcp rtsp://127.0.0.1:8554/video

Or ffplay:

ffplay -probesize 32 -sync ext -rtsp_flags prefer_tcp rtsp://127.0.0.1:8554/video

RTSP Service Using the Command Line

If it is not feasible to use Docker, you can run GStreamer on the command line instead.

First, install GStreamer, typically available through your platform's package manager, e.g. using brew on macOS.

Next, build and install the Nabto GStreamer RTSP test server by following the build instructions. Alternatively, study the Dockerfile for the RTSP demo container described above, which serves as a compact set of build instructions.

After successfully building the tool, run it as follows to get the test pattern with a clock overlay as shown above:

./src/gst-rtsp-launch "( videotestsrc is-live=1 ! clockoverlay halignment=right \
  valignment=bottom font-desc=\"Sans, 36\" ! x264enc speed-preset=ultrafast \
  tune=zerolatency ! rtph264pay name=pay0 pt=96 )"
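
Once the server is running, it can be tested with an RTSP client just like the Docker-based demo. Assuming the tool serves on its default port 8554 with the /video mount point (check the tool's console output or README for the exact URL), the feed can be opened for instance with:

vlc --rtsp-tcp rtsp://127.0.0.1:8554/video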

RTP Video Feed Generated Using GStreamer

An RTP feed not wrapped in an RTSP service can be created directly using the following command:

gst-launch-1.0 videotestsrc ! clockoverlay ! video/x-raw,width=640,height=480 ! videoconvert ! queue ! \
  x264enc tune=zerolatency bitrate=1000 key-int-max=30 ! video/x-h264, profile=constrained-baseline ! \
  rtph264pay pt=96 mtu=1200 ! udpsink host=127.0.0.1 port=6000

It yields the same test pattern with timestamp as above.
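
Since this is a bare RTP stream with no RTSP signalling, a player needs an out-of-band description of the stream to consume it. As a sketch, the parameters from the pipeline above (H.264, payload type 96, UDP port 6000) can be written to a small SDP file, here hypothetically named test.sdp:

v=0
o=- 0 0 IN IP4 127.0.0.1
s=GStreamer test feed
c=IN IP4 127.0.0.1
t=0 0
m=video 6000 RTP/AVP 96
a=rtpmap:96 H264/90000

The feed can then be inspected with e.g. ffplay:

ffplay -protocol_whitelist file,udp,rtp test.sdp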

RTP Video Feed Using Webcam with GStreamer

If you have a webcam available, you can get a video stream from it and use it with GStreamer to serve an RTP feed instead of the test pattern.

In the following examples, adjust the bitrate=... parameter to a value that works well in your environment; the suggested value of 4000 gives moderate quality for an HD camera.

Note: If using 2-way video, the client will try to use the webcam. This will fail if the webcam feed is already in use by GStreamer. For testing 2-way video, it is recommended to use the generated video feed described above.

Linux

On Linux, you can use the following, assuming a v4l2 video device at /dev/video0:

gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480 ! videoconvert ! queue ! \
  x264enc tune=zerolatency bitrate=4000 key-int-max=30 ! video/x-h264, profile=constrained-baseline ! \
  rtph264pay pt=96 mtu=1200 ! udpsink host=127.0.0.1 port=6000
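
If you are unsure which device to pass to v4l2src, the available capture devices can be listed with v4l2-ctl from the v4l-utils package (or simply with ls /dev/video*):

v4l2-ctl --list-devices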

macOS

On macOS, you can use the following:

gst-launch-1.0 avfvideosrc device-index=0 ! videoconvert ! queue ! \
   x264enc tune=zerolatency bitrate=4000 key-int-max=30 ! video/x-h264, profile=constrained-baseline ! \
   rtph264pay pt=96 mtu=1200 ! udpsink host=127.0.0.1 port=6000

Adjust the parameter 0 in device-index=0 to match your Mac's webcam index. You can enumerate all input sources using:

gst-device-monitor-1.0
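
The output can be narrowed down to video capture devices by passing a device class filter, for example:

gst-device-monitor-1.0 Video/Source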

RTP Video Sink Using GStreamer - Show Incoming RTP Feed

To test two-way video, for instance using Nabto Edge WebRTC, a video sink can be started using GStreamer. It listens for RTP video on a UDP socket and shows the incoming feed in a platform-specific view. If using the Nabto edge-device-webrtc example application, note that two-way feeds are not supported if you specify an RTSP URL.

gst-launch-1.0 udpsrc uri=udp://127.0.0.1:6001 ! application/x-rtp, payload=96  ! rtph264depay ! \
  h264parse ! avdec_h264 ! videoconvert ! autovideosink
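
To verify the sink locally without a WebRTC client, you can point the generated test feed from the earlier section at port 6001 instead of 6000. The sketch below also adds config-interval=-1 to rtph264pay so that SPS/PPS are resent with every keyframe, allowing the viewer to be started after the sender:

gst-launch-1.0 videotestsrc ! clockoverlay ! video/x-raw,width=640,height=480 ! videoconvert ! queue ! \
  x264enc tune=zerolatency bitrate=1000 key-int-max=30 ! video/x-h264, profile=constrained-baseline ! \
  rtph264pay pt=96 mtu=1200 config-interval=-1 ! udpsink host=127.0.0.1 port=6001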

RTP Audio Feed Generated Using GStreamer

An audio feed can be started as shown below, using a simple sine wave as the source. As with the video feed, GStreamer can also use your microphone as the source (e.g. using the pulsesrc plugin). However, when using 2-way audio, the web browser will use the microphone as its source, so using a sine wave here helps distinguish the feeds when testing locally.

gst-launch-1.0 -v audiotestsrc wave=sine freq=220 volume=0.01 ! audioconvert ! opusenc ! \
  rtpopuspay name=pay0 pt=111 ! udpsink host=127.0.0.1 port=6002

RTP Audio Sink Using GStreamer - Play Incoming Audio

Below is shown how to make GStreamer listen for RTP audio on a UDP socket and play it on your speakers. Testing this requires the client to send audio to the device.

gst-launch-1.0 -v udpsrc uri=udp://127.0.0.1:6003 \
  caps="application/x-rtp,media=(string)audio,clock-rate=(int)48000,encoding-name=(string)X-GST-OPUS-DRAFT-SPITTKA-00" ! \
  rtpopusdepay ! opusdec ! audioconvert ! autoaudiosink sync=false
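
To try this locally without a client, the sine-wave feed from the audio feed section above can be pointed at port 6003 instead of 6002, which the pipeline above will then pick up and play:

gst-launch-1.0 -v audiotestsrc wave=sine freq=220 volume=0.01 ! audioconvert ! opusenc ! \
  rtpopuspay name=pay0 pt=111 ! udpsink host=127.0.0.1 port=6003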