I would like to send video over a satellite modem, but these things are as slow as 1990s-era dial-up modems. HD video with the H.264 codec streams at 2 to 3 Mbps, so the amount of data must be reduced by a factor of ten or twenty for low-speed satcom.
Instead of running at 30 fps, one should stream at 1 or 2 fps, but most off-the-shelf video encoder/decoder devices don't know how to do that. So I dug a Raspberry Pi v3 and a v1.2 camera out of my toy box, installed gstreamer and started tinkering on my Mac to gain some experience with the problem.
Of course one can do the exact same thing on a Linux laptop PC, but what would be the fun in that?
With the gstreamer videorate plugin, one can change the frame rate to almost any value and cranking it down to 1 or 2 fps is no problem. One could go down to a few frames per minute, but super slow streaming could cause a playback synchronization issue, because the player error handler may time out before it manages to synchronize.
Also note that satcom systems spoof the TCP ACK packets locally, to speed things up a bit. This means that TCP and UDP work the same over a satcom link.
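To put a number on the reduction, here is a back-of-envelope calculation using the round figures above (the linear scaling with frame rate is a simplification, not a measurement):

```shell
# 30 fps H.264 at ~2500 kbps, scaled linearly down to 2 fps
awk 'BEGIN { kbps=2500; fps=30; low=2; printf "%.0f kbps\n", kbps*low/fps }'
# prints: 167 kbps
```

That lands comfortably inside a dial-up-class satcom link, which is the whole point of the exercise.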
Bake a Pi
Get your RPi3 from here: https://www.sparkfun.com/products/13826
Download a Raspbian image zip file from here:
https://www.raspberrypi.org/downloads/
Open a terminal and unzip with:
$ ark --batch filename.zip
Write the image to an SD card:
$ sudo dd if=filename.img of=/dev/mmcblk0 bs=4M
Mount the rootfs partition on the SD card, then enable sshd in /etc/rc.local, using the nano editor:
$ nano rc.local
Add the following line at the bottom of rc.local, just before the exit statement:
systemctl restart ssh
You can also add a static IP address, so that you can easily reference the device again over SSH:
ip addr add 192.168.1.200/24 dev eth0
Now you can exit nano and save the rc.local file.
Mount the boot partition on the SD card, then configure the file cmdline.txt to use traditional ethernet device names, so that the ethernet device will be named eth0, the way the UNIX gods intended:
Add "net.ifnames=0 biosdevname=0" to the end of cmdline.txt.
Boot Up
Put the SD card in the Pi, boot up and configure it:
$ sudo raspi-config
Update
Expand root file system
Set hostname to videopi
Enable camera
Add a static IP address in addition to that assigned by DHCP:
$ sudo ip addr add 192.168.1.10/24 dev eth0
Install video utilities:
$ sudo apt-get install ffmpeg
$ sudo apt-get install gstreamer1.0-plugins-*
The RPi Camera
The default kernel includes the v4l2 driver, and the latest Raspbian image includes the v4l2 utilities (e.g. v4l2-ctl).
Camera Test:
$ raspistill -o pic.jpg
Load the V4L2 module
$ sudo modprobe bcm2835-v4l2
Add the above to /etc/rc.local to enable the /dev/video0 device at startup.
Multicast Configuration
http://unixadminschool.com/blog/2014/03/rhel-what-is-multicast-and-how-to-configure-network-interface-with-multicast-address/
Enable multicasting:
$ sudo ifconfig eth0 multicast
Add a multicast route, since without it, multicasting will not work:
$ sudo route add -net 224.0.0.0 netmask 240.0.0.0 dev eth0
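On recent images the net-tools commands above (ifconfig, route) may be absent. A sketch of the iproute2 equivalents, with the same interface and group range, would be:

```shell
# iproute2 equivalents of the ifconfig/route commands above
sudo ip link set dev eth0 multicast on
sudo ip route add 224.0.0.0/4 dev eth0
```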
$ sudo ip address show
2: enxb827eb9bb3e0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc pfifo_fast state UP group default qlen 1000
link/ether b8:27:eb:9b:b3:e0 brd ff:ff:ff:ff:ff:ff
inet 192.168.1.104/24 brd 192.168.1.255 scope global enxb827eb9bb3e0
valid_lft forever preferred_lft forever
inet 224.0.1.10/32 scope global enxb827eb9bb3e0
valid_lft forever preferred_lft forever
inet 192.168.1.4/24 scope global secondary enxb827eb9bb3e0
valid_lft forever preferred_lft forever
inet6 fe80::e774:95b:c83c:6e32/64 scope link
valid_lft forever preferred_lft forever
And check the route settings with
$ route -n
Kernel IP routing table
Destination Gateway Genmask Flags Metric Ref Use Iface
192.168.1.0 0.0.0.0 255.255.255.0 U 0 0 0 eth0
224.0.0.0 0.0.0.0 240.0.0.0 U 0 0 0 eth0
# netstat -g
IPv6/IPv4 Group Memberships
Interface RefCnt Group
--------------- ------ ---------------------
lo 1 224.0.0.1
enxb827eb9 1 224.0.0.251
enxb827eb9 1 224.0.0.1
wlan0 1 224.0.0.1
lo 1 ip6-allnodes
lo 1 ff01::1
eth0 1 ff02::fb
eth0 1 ff02::1:ff3c:6e32
eth0 1 ip6-allnodes
eth0 1 ff01::1
wlan0 1 ip6-allnodes
wlan0 1 ff01::1
Configuration of rc.local
The bottom of /etc/rc.local should look like this:
# Start the SSH daemon
systemctl start ssh
# Load the V4L2 camera device driver
sudo modprobe bcm2835-v4l2
# Add a static IP address in addition to that assigned by DHCP
ip addr add 192.168.1.10/24 dev eth0
# Enable multicasting:
ifconfig eth0 multicast
# Add a multicast route
route add -net 224.0.0.0 netmask 240.0.0.0 dev eth0
exit 0
Now, when you restart the Pi, it should be ready to stream video. You could edit the above on the Pi with the nano editor, or move the SD card back to your desktop computer, mount the rootfs partition and edit the file there. This is a MAJOR advantage of the Pi architecture: if you mess something up and the Pi won't boot, then you can fix the SD-card-based system on another machine. Also, to replicate a Pi, just make a backup and copy the SD card.
Test The Video Camera Setup
Show a video test pattern on a screen plugged into the RPi:
$ gst-launch-1.0 videotestsrc ! autovideosink
Test Pattern
You can also do this without a screen, remotely from another computer, using the Secure Shell X forwarding:
$ ssh -X pi@192.168.1.10 "gst-launch-1.0 videotestsrc ! autovideosink"
Show camera low rate video at 2 fps on a screen plugged into the RPi:
$ gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,framerate=2/1 ! autovideosink
Show camera low rate video at 2 fps over SSH, with X forwarding, from another computer:
$ ssh -X pi@192.168.1.10 "gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,framerate=2/1 ! autovideosink"
Obviously, you have to solve whatever problems you encounter before continuing, else you'll go nowhere fast...
Streaming Video
The problem with setting up a streaming system is that there are many pads and each pad has many options. These options don't necessarily work together, and finding a combination that does approximately what you need can be very time-consuming.
However, the defaults usually work. So the best approach is to make a basic stream, get it to work and only then start to experiment, while keeping careful notes of what works and what doesn't.
Howtos:
http://www.einarsundgren.se/gstreamer-basic-real-time-streaming-tutorial/
https://cgit.freedesktop.org/gstreamer/gst-plugins-good/tree/gst/rtp/README#n251
If necessary, install gstreamer:
$ sudo apt-get install gstreamer1.0-tools
Install all gstreamer plugins:
$ sudo apt-get install gst*plugin*
The simplest test:
$ gst-launch-1.0 videotestsrc ! autovideosink
and
$ gst-launch-1.0 v4l2src device=/dev/video0 ! autovideosink
A simple raw stream gives me 'message too long' errors. The solution is the 'chopmydata' plugin.
MJPEG Streaming Examples
To verify that the Pi is streaming, run tcpdump on a second terminal:
$ sudo tcpdump -nlX port 5000
Stream a test pattern with motion jpeg:
$ gst-launch-1.0 videotestsrc ! jpegenc ! chopmydata max-size=9000 ! udpsink host=224.0.1.10 port=5000
If you enable the WiFi access point feature with raspi-config, then the Pi can make a fairly decent home security or toy drone camera, which only needs a power cable. With multicast streaming, you can connect from multiple computers on the LAN simultaneously.
Stream the Pi Camera with motion jpeg:
$ gst-launch-1.0 v4l2src device=/dev/video0 ! jpegenc ! chopmydata max-size=9000 ! udpsink host=224.0.1.10 port=5000
If tcpdump shows that the Pi is streaming, then start a player on another machine.
Play the video with ffplay on another machine:
$ ffplay -f mjpeg udp://224.0.1.10:5000
The VLC player should also work, but simple players like MS Media Player or Apple QuickTime will not be able to figure out what to do with a simple raw UDP stream. These players need RTP to tell them what to do.
Also note that RTP is supposed to use an EVEN port number, with RTCP on the next (odd) port, while everything else doesn't care. I am not to reason why...
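For players that insist on RTP, a sketch using the standard rtpjpegpay/rtpjpegdepay elements from gst-plugins-good could look like this (the port number is an arbitrary even choice):

```shell
# Sender: wrap each JPEG frame in RTP packets
gst-launch-1.0 v4l2src device=/dev/video0 ! jpegenc ! rtpjpegpay ! \
    udpsink host=224.0.1.10 port=5004

# Receiver: the caps tell udpsrc what the payload is
gst-launch-1.0 udpsrc address=224.0.1.10 port=5004 \
    caps="application/x-rtp,media=video,encoding-name=JPEG,payload=26" ! \
    rtpjpegdepay ! jpegdec ! autovideosink
```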
MJPEG Low Frame Rate Examples
https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-plugins/html/gst-plugins-base-plugins-videorate.html
https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-good/html/gst-plugins-good-plugins-jpegenc.html
The basic framerate pad for 10 fps is:
! video/x-raw,framerate=10/1 !
You can also scale it at the same time:
! video/x-raw,width=800,height=600,framerate=10/1 !
When you slow things down a lot, every frame is different. Consequently the H.264 codec will not work well, so I selected the Motion JPEG jpegenc codec instead.
This low rate videorate stream works:
$ gst-launch-1.0 v4l2src device=/dev/video0 ! \
  video/x-raw,width=800,height=600,framerate=10/1 ! jpegenc ! \
  chopmydata max-size=9000 ! udpsink host=224.0.1.10 port=5000
Play it with ffplay on another machine:
$ ffplay -f mjpeg udp://224.0.1.10:5000
You may have to install FFMPEG on the player machines:
$ sudo apt-get install ffmpeg
There is a statically compiled version of FFMPEG for Windows. Search online for "zeranoe ffmpeg" to find it.
Note: If you make the frame rate very slow, then it will take ffplay a very long time to synchronize, but it should eventually pop up and play.
Example H.264 MPEG-2 TS Pipelines
The H.264 codec should only be used raw, or with Matroska, MP4/QuickTime or MPEG-2 TS encapsulation:
https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-ugly-plugins/html/gst-plugins-ugly-plugins-x264enc.html
This raw x264 stream works:
$ gst-launch-1.0 videotestsrc num-buffers=1000 ! x264enc ! udpsink host=224.0.1.10 port=5000
It plays with this:
$ ffplay -f h264 udp://224.0.1.10:5000
This x264 MPEG-2 TS encapsulated stream works, but with too much latency:
$ gst-launch-1.0 videotestsrc ! x264enc ! mpegtsmux ! udpsink host=224.0.1.10 port=5000
or
$ gst-launch-1.0 v4l2src device=/dev/video0 ! x264enc ! mpegtsmux ! udpsink host=224.0.1.10 port=5000
It plays with this:
$ ffplay udp://224.0.1.10:5000
There are various ways to optimize:
bitrate=128 - for low rate coding
tune=zerolatency - to prevent look-ahead and get frames out ASAP
An optimized x264 pad could look like this:
! x264enc bitrate=512 speed-preset=superfast tune=zerolatency !
The zerolatency parameter actually helps:
$ gst-launch-1.0 v4l2src device=/dev/video0 ! x264enc speed-preset=superfast tune=zerolatency ! udpsink host=224.0.1.10 port=5000
With playback like this:
$ ffplay -f h264 -vf "setpts=PTS/4" udp://224.0.1.10:5000
However, the moment I add the mpegtsmux, it is too much for the Pi to handle. One would need to hook up a second Pi to convert the raw stream to an encapsulated MPEG-2 TS stream, or use a faster computer.
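Alternatively, the Pi v3 has a hardware H.264 encoder on the GPU. I have not profiled it here, but if the gstreamer1.0-omx package is available, a hardware-encoded sketch of the same pipeline (the bitrate is an arbitrary choice) could look like this:

```shell
# Hardware H.264 encode on the Pi GPU (assumes gstreamer1.0-omx is installed)
gst-launch-1.0 v4l2src device=/dev/video0 ! \
    video/x-raw,width=800,height=600 ! \
    omxh264enc target-bitrate=512000 control-rate=variable ! h264parse ! \
    mpegtsmux ! udpsink host=224.0.1.10 port=5000
```

Offloading the encode to the GPU should leave the ARM cores free for the mpegtsmux step that overwhelms them with x264enc.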
Processor Load
I found that the processor load is about 30% with MJPEG, so a little Pi is perfectly fine for streaming video from a single camera, if one uses a simple codec.
The x264 codec is a hungry beast and consumes 360% CPU according to top, which means all 4 cores are running balls to the wall, indicating that this codec is not really suitable for a little Pi v3 processor. Nevertheless, it shows that one doesn't have to do H.264 encoding in a FPGA. It can be done on a small embedded processor.
Multicast Routing
Note that multicast routing is completely different from unicast routing. A multicast packet has no unicast destination address. Instead, it has a group address, and a link-layer address concocted from the group address. To receive a stream, a host has to subscribe to the group with IGMP.
Here, there be dragons.
If you need to route video between two subnets, then you should consider sparing yourself the head-ache and rather use unicast streaming. Otherwise, you would need an expensive switch from Cisco, or HPE, or OpenBSD with dvmrpd.
Linux multicast routing is not recommended: it is undocumented, and the router code is unsupported and buggy. Windows cannot route multicast at all, and FreeBSD needs to be recompiled for multicast routing. Only OpenBSD supports multicast routing out of the box.
Do not meddle in the affairs of dragons,
for you are crunchy
and taste good with ketchup.
Also consider that UDP multicast packets have a default Time To Live of 1, meaning that they will be dropped at the first router. Therefore the sender has to increase the TTL (the udpsink ttl-mc property) before a stream can cross a multicast router.
If you need to use OpenBSD, do get a copy of Absolute OpenBSD - UNIX for the Practically Paranoid, by M.W. Lucas.
Five Ways To Play Video With Low Latency
Note that if you don't put a framerate pad in a simple stream, then the presentation time stamps in the stream are wrong/missing, causing the video to play back in slow motion, which can be very befuddling to the uninitiated.
This workaround can make a simple stream play in real-time:
$ ffplay -f mjpeg -vf "setpts=PTS/2" udp://224.0.1.10:5000
Sometimes, ffplay is not part of the FFMPEG installation. If you have this problem and don't want to compile it from source, then you can use ffmpeg with SDL as below, which is what ffplay does also.
Play a stream using FFMPEG and SDL to render it to the default screen:
$ ffmpeg -i udp://224.0.1.10:5000 -f sdl -
You could also play the video with mplayer:
$ mplayer -benchmark udp://224.0.1.10:5000
Of course you can use gstreamer to play it, but I prefer using a different tool for playback as a kind of error check.
You can easily play video with gst-play, same idea as ffplay:
$ gst-play-1.0 udp://224.0.1.10:5000
or with gst-launch, noting that udpsrc takes an address parameter and that the MJPEG stream needs a decoder:
$ gst-launch-1.0 udpsrc address=224.0.1.10 port=5000 ! jpegdec ! autovideosink
Virtual Video Device
Sometimes it can be very useful to set up a virtual camera device using ffmpeg and v4l2loopback.
# ffmpeg -re -i input.mp4 -map 0:v -f v4l2 /dev/video2
You can also make the input a FIFO for better control over the producer and consumer.
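A sketch of the FIFO variant, assuming the v4l2loopback-dkms package is installed (the video_nr value, FIFO path and capture.ts file name are all arbitrary; MPEG-TS is used because it can be demuxed from a pipe, unlike MP4):

```shell
# Load the loopback module; video_nr=2 creates /dev/video2
sudo modprobe v4l2loopback video_nr=2
# A named pipe lets the producer and consumer run independently
mkfifo /tmp/video_fifo
ffmpeg -re -f mpegts -i /tmp/video_fifo -map 0:v -f v4l2 /dev/video2 &
cat capture.ts > /tmp/video_fifo
```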
La Voila!
Herman