I wanted to throw my Nokia high into the sky and see what we look like from a bird's-eye view.
How to throw it is clear: the easiest option is to take a big enough kite.
How to watch, though, turned out to be a bit of a puzzle.
Here is what I found:
- gstreamer provides a set of plugins that can be linked into chains, feeding the output of one element into the input of the next. Among them are simple basic elements: input/output from files and file descriptors ({file,fd}{src,sink}) and over the network ({tcp{client,server},udp}{src,sink}), video capture via the Video4Linux2 interface (v4l2src), and video output via X (ximagesink). There are encoders and decoders for many multimedia formats (I experimented with mpeg4video, h263 and theora). There are tools for building and parsing RTP streams, and for automatic detection of a stream's format and decoding it (decodebin). A minimal example chain is sketched right after this list.
- A reasonably fresh gstreamer is included in the standard OS2008 firmware. For full console happiness you need to install the gstreamer-tools package, which contains the gst-launch and gst-inspect utilities.
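To get a feel for how such chains are put together, here is a minimal sketch of local playback, with decodebin picking a suitable decoder automatically (video.avi is just a placeholder name for any file with a video track):
$ gst-launch filesrc location=video.avi ! decodebin ! autovideosink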
Fine.
First try
[n810] $ gst-launch -v v4l2src ! \
capsfilter caps="video/x-raw-yuv, format=(fourcc)UYVY, framerate=(fraction)8/1, width=(int)640, height=(int)480" ! \
autovideosink
The capsfilter element sets the video capture parameters. They can be varied within reasonable limits; if video cannot be captured with the requested parameters, gstreamer reports the nearest valid ones.
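If you are not sure what the camera will actually agree to, a quick check (a sketch, nothing device-specific assumed) is to let the pipeline negotiate on its own and read the caps that -v prints for the v4l2src source pad:
$ gst-launch -v v4l2src ! fakesink
gst-inspect v4l2src additionally lists the element's properties.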
Network Transfer
Now it would be nice to send this over the network. The simplest option looks like this (the desktop's IP is 192.168.1.254):
[desktop] $ gst-launch -v tcpserversrc host=0.0.0.0 protocol=gdp ! autovideosink
[n810] $ gst-launch -v v4l2src ! \
capsfilter caps="video/x-raw-yuv, format=(fourcc)UYVY, framerate=(fraction)8/1, width=(int)320, height=(int)240" ! \
tcpclientsink host=192.168.1.254 protocol=gdp
The protocol=gdp parameter adds the stream format to the data sent over the network: the point is that only elements with compatible output and input formats can be linked into a chain. With this parameter, tcp*src on the receiving side produces exactly the same format that tcp*sink consumed on the sending side.
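As an aside, the formats an element can accept and produce are easy to check: gst-inspect prints them under the Pad Templates / Capabilities sections of its output, for example:
$ gst-inspect v4l2src
$ gst-inspect tcpclientsink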
A simple solution, but over wifi it is heavy going: 7 megabits and 600 packets per second is a noticeable load, and 640x480 is already visibly sluggish.
Obviously, the next step is to add compression.
Let it be mpeg4:
[desktop] $ gst-launch -v tcpserversrc host=0.0.0.0 protocol=gdp ! decodebin ! autovideosink
[n810] $ gst-launch -v v4l2src ! \
capsfilter caps="video/x-raw-yuv, format=(fourcc)UYVY, framerate=(fraction)8/1, width=(int)320, height=(int)240" ! \
hantro4200enc ! tcpclientsink host=192.168.1.254 protocol=gdp
Great, the picture is all MPEG blockiness, but it is only 110 kilobits and 30 packets per second (:
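hantro4200enc is the tablet-side encoder used here. If you want to try the same scheme from an ordinary machine that lacks it, an untested sketch would be to swap in theoraenc, with ffmpegcolorspace in front since theoraenc does not accept UYVY; the receiving pipeline with decodebin stays the same, or theoradec can be named explicitly. The [laptop] label is just illustrative:
[laptop] $ gst-launch -v v4l2src ! ffmpegcolorspace ! theoraenc ! \
tcpclientsink host=192.168.1.254 protocol=gdp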
What other significant drawbacks does this scheme have? TCP/IP: a lost packet causes retransmission of an already stale picture, which means delays. A dropped connection can only be cured by restarting both the server and the client.
RTP
So we need to bolt RTP onto this:
[desktop] $ gst-launch -v gstrtpbin name=rtpbin \
udpsrc caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263" port=5000 ! \
rtpbin.recv_rtp_sink_0 rtpbin. ! \
rtph263depay ! decodebin ! autovideosink
[n810] $ gst-launch -v gstrtpbin name=rtpbin \
v4l2src ! \
capsfilter caps="video/x-raw-yuv, format=(fourcc)UYVY, framerate=(fraction)8/1, width=(int)320, height=(int)240" ! \
hantro4200enc stream-type=5 bit-rate=512 ! rtph263pay ! \
rtpbin.send_rtp_sink_0 rtpbin.send_rtp_src_0 ! \
udpsink port=5000 host=192.168.1.254
What changed?
- mpeg4 turned into h263. For some reason the desktop's ffmpeg-based decoder could not decode the frames hantro4200 sent. On the sending side the switch shows up as the stream-type=5 parameter. To get fewer blocky artifacts I added bit-rate=512.
- Added gstrtpbin, used via its named inputs and outputs; our video goes through channel zero. Converting the compressed video stream to RTP and back is done by rtph263pay and rtph263depay.
- tcp became udp. On the receiving side, caps="..." supplies the otherwise missing information needed to reconstruct the stream; that line was printed by rtph263pay on the sending side (thanks to -v).
What is the effect?
- A temporary loss of communication does not break the connection: there is no connection to break (:
- Viewing can be stopped and restarted. This does not affect the video server.
- Late frames do not hold up the display of frames that arrived on time (see the note on the jitter buffer below).
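That last point is handled by gstrtpbin's internal jitter buffer. In the 0.10 series it exposes a latency property in milliseconds (200 by default, if memory serves), so the trade-off between smoothness and delay can be tuned; a sketch with the same receiving chain as above:
[desktop] $ gst-launch -v gstrtpbin name=rtpbin latency=100 \
udpsrc caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263" port=5000 ! \
rtpbin.recv_rtp_sink_0 rtpbin. ! \
rtph263depay ! decodebin ! autovideosink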
Where to read more?
Reference on gstreamer plugins on the official site: gstreamer.freedesktop.org/documentation
The list of installed plugins and help on their parameters: gst-inspect
P.S. This has turned into something of an ode to gstreamer. The tool genuinely made me happy (: