Discussion:
Synchronizing multiple RTP sources
Peter Maersk-Moller
2015-06-22 22:00:53 UTC
Permalink
Hi

Would it be possible with GStreamer to play out multiple RTP sources
synchronized if the playout for each source takes place in its own process?

Assume you have multiple independent systems with NTP each running the
following pipeline:

gst-launch-1.0 -v -e \
uvch264src SOME_SETTINGS ! h264parse ! queue ! \
mpegtsmux name=muxer ! queue ! rtpmp2tpay ! \
.send_rtp_sink rtpsession name=session .send_rtp_src ! \
udpsink host=$host port=$rtp_port session.send_rtcp_src ! \
udpsink host=$host port=$rtcp_port sync=false async=false \
alsasrc device=$alsadev do-timestamp=true ! \
audio/x-raw,rate=44100,channels=1 ! \
audioconvert ! queue ! avenc_aac compliance=-2 ! aacparse ! queue ! \
muxer.

On the server side we need something like this:

caps='application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T,payload=33'

gst-launch-1.0 -v -e rtpbin name=rtpbin \
udpsrc port=$rtp_port caps="$caps" ! \
rtpbin.recv_rtp_sink_0 rtpbin. ! \
decodebin name=decoder ! queue ! \
SOMEVIDEOSINK sync=true decoder. ! queue ! \
SOMEAUDIOSINK sync=true udpsrc port=$rtcp_port ! \
rtpbin.recv_rtcp_sink_0

This ought to work. The server side should receive RTCP sender reports with
the NTP wallclock to RTP timestamp mapping. One can hope decodebin gets this
information and uses it when demuxing the TS, decodes the two elementary
streams, displays the decoded images and plays the audio.

BUT on the receiving side, in this setup we receive multiple streams and
run several processes with the above pipeline. How does one go about
synchronizing this?

I could try to run ALL RTP streams and all RTCP sender reports into same
rtpbin and as such run one single pipeline for all receiving streams. Would
that work?

However, such a single-pipeline solution would have a problem. If not all
sources are sending streams initially, I suspect the big pipeline with all
the streams would never get past prerolling to PLAYING, because some of the
pipeline elements would be missing data for the streams not currently
running. And it may or may not stall if some of the pipeline elements later
miss data. Is this a correct assumption?

Best regards
Peter
Sebastian Dröge
2015-06-23 07:28:41 UTC
Permalink
Post by Peter Maersk-Moller
Hi
Would it be possible with GStreamer to play out multiple RTP sources
synchronized if the playout for each source takes place in its own process?
Short answer: yes.

See the example applications here:
http://cgit.freedesktop.org/gstreamer/gst-rtsp-server/tree/examples/test-netclock-client.c
http://cgit.freedesktop.org/gstreamer/gst-rtsp-server/tree/examples/test-netclock.c

Instead of sharing a network clock, you could also share a GStreamer
NTP or PTP clock, or alternatively set the system clock with wall clock
time on the pipelines and synchronize the system clocks outside
GStreamer.


These examples are doing RTSP, but only to make the setup of the rtpbin
simpler. What matters are the properties that are set on rtpbin and the
proper selection of the pipeline clock.
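
For reference, the clock-sharing part of those examples boils down to
roughly the following untested C sketch (the function names and the port
8554 are just placeholders I picked for illustration):

#include <gst/gst.h>
#include <gst/net/gstnet.h>

/* Sender: serve the pipeline's clock on the network. Keep the returned
 * provider alive for as long as the pipeline runs. */
GstNetTimeProvider *
share_clock (GstPipeline *pipeline)
{
  GstClock *clock = gst_system_clock_obtain ();
  GstNetTimeProvider *provider;

  gst_pipeline_use_clock (pipeline, clock);
  provider = gst_net_time_provider_new (clock, NULL, 8554);
  gst_object_unref (clock);
  return provider;
}

/* Receiver: slave the pipeline to the sender's clock instead of letting
 * it pick a clock automatically. */
void
use_senders_clock (GstPipeline *pipeline, const gchar *sender_host)
{
  GstClock *clock = gst_net_client_clock_new ("net-clock", sender_host, 8554, 0);

  gst_pipeline_use_clock (pipeline, clock);
  gst_object_unref (clock);
}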
--
Sebastian Dröge, Centricular Ltd · http://www.centricular.com
Peter Maersk-Moller
2015-06-23 10:17:20 UTC
Permalink
Hi Sebastian.

Thanks for the answer/example. So what I read from the example is that
nothing needs to be changed for the sender (live source):

gst-launch-1.0 -v -e \
uvch264src SOME_SETTINGS ! h264parse ! queue ! \
mpegtsmux name=muxer ! queue ! rtpmp2tpay ! \
.send_rtp_sink rtpsession name=session .send_rtp_src ! \
udpsink host=$host port=$rtp_port session.send_rtcp_src ! \
udpsink host=$host port=$rtcp_port sync=false async=false \
alsasrc device=$alsadev do-timestamp=true ! \
audio/x-raw,rate=44100,channels=1 ! \
audioconvert ! queue ! avenc_aac compliance=-2 ! aacparse ! queue ! \
muxer.

But each receiver needs to have a pipeline like this:

caps='application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T,payload=33'

gst-launch-1.0 -v -e rtpbin name=rtpbin latency=40 ntp-time-source=3 \
buffer-mode=synced ntp-sync=true \
udpsrc port=$rtp_port caps="$caps" ! \
rtpbin.recv_rtp_sink_0 rtpbin. ! \
decodebin name=decoder ! queue ! \
SOMEVIDEOSINK sync=true decoder. ! queue ! \
SOMEAUDIOSINK sync=true udpsrc port=$rtcp_port ! \
rtpbin.recv_rtcp_sink_0

Assuming each source of the RTP/RTCP streams is in agreement about time,
using ntpd to set/adjust the clock locally, should this pipeline play
the content synchronized, not only with audio/video synchronized but
also synchronized with other similar pipelines?

A couple of things though:

1. ntp-time-source does not seem to be a settable parameter for rtpbin
(currently using GStreamer 1.4.5) when listing the element with
gst-inspect-1.0. So how do I set it through the CLI using gst-launch-1.0?
2. The latency (PLAYBACK_DELAY_MS) in the example is set to 40 ms, which I
guess is used to set the latency of the built-in rtpjitterbuffer. Is this
correct? If yes, is this the parameter that controls the same overall fixed
delay for all instances of the pipelines that need to be synchronized in
this context?
3. Will two pipelines with the same configuration, but with different
codecs for the stream, produce an identical synchronized delay in the
playout? I can live with the need for identical codecs, but of course it
would be nice.

I guess the overall principle of synchronizing the playout of multiple
sources, each in its own process, is that the playout takes place with a
fixed delay relative to the NTP timestamps, i.e. an identical fixed delay
relative to the wall clock for all processes. Is that right?
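In other words (just my own mental model, not any actual API): a buffer
captured at sender wall-clock time T would be presented by every receiver
at its local wall-clock time T + L, where L is the same fixed latency
configured on all the playout processes.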

Best regards and thanks for the info.
Peter
Sebastian Dröge
2015-06-23 10:27:20 UTC
Permalink
Post by Peter Maersk-Moller
Hi Sebastian.
Thanks for the answer/example. So what I read from the example is
No, you need to use "use-pipeline-clock" at least. And ensure that both
pipelines (sender and receiver) are using the same clock. You can't do
that with gst-launch!
Post by Peter Maersk-Moller
Assuming each source of the RTP/RTCP streams is in agreement about time,
using ntpd to set/adjust the clock locally, should this pipeline play
the content synchronized, not only with audio/video synchronized but
also synchronized with other similar pipelines?
No, the important part here is that via RTCP some clock time exchange
will happen and then the timestamps on the receiver side are exactly
the same clock time as on the sender side.

Regarding your later questions: codec and stuff does not matter; what
matters are the timestamps (the buffer timestamps in clock time must be
the same in the end! PTS or running time don't matter), the same clock
and that all receivers use the same latency.
Post by Peter Maersk-Moller
ntp-time-source does not seem to be a settable parameter for rtpbin
(currently using GStreamer 1.4.5) when listing the element with
gst-inspect-1.0. So how do I set it through the CLI using gst-launch-1.0?
You need 1.5.1 or GIT master. And you can't use gst-launch for all of
this, see above. Please write some proper code instead of using
gst-launch; gst-launch is only a testing tool.

You will most likely also need the new gst_pipeline_set_latency() API
on the receivers.
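
To make that concrete, a receiver could look roughly like this in C
(untested sketch; the ports, the auto sinks and the 500 ms value are
placeholders, and it assumes the rtpbin properties we discussed):

#include <gst/gst.h>

int
main (int argc, char **argv)
{
  GstElement *pipeline;
  GstClock *clock;

  gst_init (&argc, &argv);

  /* Same elements as the gst-launch line, just built in code so that we
   * can choose the clock and the latency ourselves. */
  pipeline = gst_parse_launch (
      "rtpbin name=rtpbin ntp-sync=true buffer-mode=synced ntp-time-source=clock-time "
      "udpsrc port=5000 caps=\"application/x-rtp,media=video,clock-rate=90000,"
      "encoding-name=MP2T,payload=33\" ! rtpbin.recv_rtp_sink_0 "
      "rtpbin. ! decodebin name=decoder "
      "decoder. ! queue ! autovideosink sync=true "
      "decoder. ! queue ! audioconvert ! autoaudiosink sync=true "
      "udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0", NULL);

  /* Force the wall-clock based system clock so that all receivers (and
   * the senders) measure time the same way. */
  clock = gst_system_clock_obtain ();
  g_object_set (clock, "clock-type", GST_CLOCK_TYPE_REALTIME, NULL);
  gst_pipeline_use_clock (GST_PIPELINE (pipeline), clock);
  gst_object_unref (clock);

  /* The same fixed latency on every receiver; it must cover the worst-case
   * network + decoding delay of any of them. */
  gst_pipeline_set_latency (GST_PIPELINE (pipeline), 500 * GST_MSECOND);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_main_loop_run (g_main_loop_new (NULL, FALSE));
  return 0;
}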
--
Sebastian Dröge, Centricular Ltd · http://www.centricular.com
Peter Maersk-Moller
2015-06-23 17:18:18 UTC
Permalink
Hi Sebastian.

Please see comments in-line.

Sorry for being difficult. I'm probably missing an important point here.
Post by Sebastian Dröge
Post by Peter Maersk-Moller
Hi Sebastian.
Thanks for the answer/example. So what I read from the example is
No, you need to use "use-pipeline-clock" at least. And ensure that both
pipelines (sender and receiver) are using the same clock. You can't do
that with gst-launch!
I probably misunderstand. Are you telling me to use pipeline-clock in the
sender?
Isn't the pipeline-clock a clock that starts at 0 when the pipeline starts?

Will that work when we have two independent senders, each on their own
system and we eventually want to play-out the two sender streams
synchronized, but each stream in its own playout process? I'm probably
missing the point here.
Post by Sebastian Dröge
Post by Peter Maersk-Moller
Assuming each source of RTP/RTCP streams are in agreement about time
using ntpd to set/adjust the clock locally, should this pipeline play
the content synchronized not only with audio/video synchronized but
also synchronized for other similar pipelines?
No, the important part here is that via RTCP some clock time exchange
will happen and then the timestamps on the receiver side are exactly
the same clock time as on the sender side.
OK, but I'm still confused. The two senders may have the same understanding
of time through NTP-synchronized system time, but they will usually have
started at different times. Still, since they have the same system time, the
timestamps together with each stream's RTCP sender reports will provide each
separate rtpbin at the playout end with information about how the timestamps
relate to absolute system time and, as a consequence of NTP, also to the
wall clock.
Post by Sebastian Dröge
Related your later questions: codec and stuff does not matter, what
matters are the timestamps (the buffer timestamps in clock time must be
the same in the end! PTS or running time don't matter), the same clock
and that all receivers use the same latency.
Got it. So two end-to-end systems (producer and playout) will play out in
sync if the system time on each producer is the same, because the two
systems will have the same end-to-end delay. At least that is what I think
you are writing. I assume that only works if one of them does not have a
considerably longer network delay than the other?
Post by Sebastian Dröge
Post by Peter Maersk-Moller
ntp-time-source does not seem to be a settable parameter for rtpbin
for (Currently using GStreamer 1.4.5) when listing the module with
gst-inspect-1.0. So how do I set it through CLI using gst-launch-1.0?
You need 1.5.1 or GIT master. And you can't use gst-launch for all of
this, see above. Please write some proper code instead of using gst
-launch, gst-launch is only a testing tool.
So you keep telling me :-) I think gst-launch is a very useful application
:-)))) for which I can only be thankful.

Okay, I have tried to avoid it for years after looking at the documentation,
but there seems to be no other way around it. It's a little odd that somebody
has not already written a CLI GStreamer-based app that can play out (to your
choice of sink) through multiple independent pipelines synchronously.
Post by Sebastian Dröge
You will most likely also need the new gst_pipeline_set_latency() API
on the receivers.
Got it. Have to go to 1.5.1 or GIT master. Will update and start
experimenting with some code. Synchronized source playout will make life
easier for me in the Snowmix Video Mixer I develop. Mixing is a lot harder
when things are not perfectly in sync.

Thanks for patience and help so far.

Regards
Peter MM
Sebastian Dröge
2015-06-23 17:33:20 UTC
Permalink
Post by Peter Maersk-Moller
Hi Sebastian.
Please see comments in-line.
Sorry for being difficult. I'm probably missing an important point here.
On Tue, Jun 23, 2015 at 12:27 PM, Sebastian Dröge wrote:
Post by Sebastian Dröge
Post by Peter Maersk-Moller
Hi Sebastian.
Thanks for the answer/example. So what I read from the example is
No, you need to use "use-pipeline-clock" at least. And ensure that both
pipelines (sender and receiver) are using the same clock. You can't do
that with gst-launch!
I probably misunderstand. Are you telling me to use pipeline-clock in
the sender?
Isn't the pipeline-clock a clock that starts at 0 when the pipeline starts?
Will that work when we have two independent senders, each on their
own system and we eventually want to play-out the two sender streams
synchronized, but each stream in its own playout process? I'm probably
missing the point here.
That's why you need to ensure that the same clock is used on sender and
receiver. Both should report the same times.

This could be done with the GStreamer network clock, a GStreamer PTP or
NTP clock. Or by using the GStreamer system clock (make sure to set it
to wall clock time) and outside GStreamer ensuring that the system
clocks are synchronized.


Alternatively, you can also not use ntp-time-source=3 on sender *and*
receiver (and leave it at its default of 0), in which case you only need to
ensure that the wall-clock time on senders and receivers reports the same
time. This can be less accurate though.
I didn't really test the case where the pipeline clocks on sender and
receiver are different, as it's much easier to ensure synchronization
of the clock inside GStreamer than making sure the system time is
properly synchronized. See the example applications I linked.


(Don't use use-pipeline-clock anymore, it's deprecated in favour of
ntp-time-source)
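
For the NTP/PTP variants, a rough sketch of what that could look like (the
server address, port and domain are placeholders; the PTP clock needs
gst_ptp_init() first and, like the NTP clock, is only in the newer releases):

#include <gst/gst.h>
#include <gst/net/gstnet.h>

/* Slave the pipeline to an NTP server. */
static void
use_ntp_clock (GstPipeline *pipeline, const gchar *ntp_server)
{
  GstClock *clock = gst_ntp_clock_new ("ntp-clock", ntp_server, 123, 0);

  gst_pipeline_use_clock (pipeline, clock);
  gst_object_unref (clock);
}

/* Slave the pipeline to a PTP domain. */
static void
use_ptp_clock (GstPipeline *pipeline)
{
  GstClock *clock;

  gst_ptp_init (GST_PTP_CLOCK_ID_NONE, NULL);
  clock = gst_ptp_clock_new ("ptp-clock", 0 /* domain */);
  /* In real code, wait for the clock to synchronize before starting the
   * pipeline, e.g. with gst_clock_wait_for_sync(). */
  gst_pipeline_use_clock (pipeline, clock);
  gst_object_unref (clock);
}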
Post by Peter Maersk-Moller
Post by Sebastian Dröge
Post by Peter Maersk-Moller
Assuming each source of the RTP/RTCP streams is in agreement about time,
using ntpd to set/adjust the clock locally, should this pipeline play
the content synchronized, not only with audio/video synchronized but
also synchronized with other similar pipelines?
No, the important part here is that via RTCP some clock time exchange
will happen and then the timestamps on the receiver side are exactly
the same clock time as on the sender side.
OK, but I'm still confused. The two senders may have the same understanding
of time through NTP-synchronized system time, but they will usually have
started at different times. Still, since they have the same system time, the
timestamps together with each stream's RTCP sender reports will provide each
separate rtpbin at the playout end with information about how the timestamps
relate to absolute system time and, as a consequence of NTP, also to the
wall clock.
The GStreamer clocks would report the same times on sender and receiver
if you make sure to use the same clock in both pipelines. What is
different is the base time and thus the running time.

The use of ntp-sync=true, buffer-mode=synced and ntp-time-source=3
would make sure that the different base times are calculated away again
based on the NTP times from the RTCP SR.
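
(In application code, assuming the rtpbin instance is called rtpbin, that
combination would presumably be set along these lines:

g_object_set (rtpbin,
    "ntp-sync", TRUE,       /* sync playout against the RTCP SR NTP times */
    "buffer-mode", 4,       /* "synced": sender and receiver clocks are synchronized */
    "ntp-time-source", 3,   /* "clock-time": take the NTP time from the pipeline clock */
    NULL);
)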
Post by Peter Maersk-Moller
Post by Sebastian Dröge
Regarding your later questions: codec and stuff does not matter; what
matters are the timestamps (the buffer timestamps in clock time must be
the same in the end! PTS or running time don't matter), the same clock
and that all receivers use the same latency.
Got it. So two end-to-end systems (producer and playout) will play out in
sync if the system time on each producer is the same, because the two
systems will have the same end-to-end delay. At least that is what I think
you are writing. I assume that only works if one of them does not have a
considerably longer network delay than the other?
Yes, see also my remark about gst_pipeline_set_latency(). There you would
have to configure the biggest latency of all receivers (including
network and decoder latency!).
Post by Peter Maersk-Moller
Post by Sebastian Dröge
Post by Peter Maersk-Moller
ntp-time-source does not seem to be a settable parameter for rtpbin
(currently using GStreamer 1.4.5) when listing the element with
gst-inspect-1.0. So how do I set it through the CLI using gst-launch-1.0?
You need 1.5.1 or GIT master. And you can't use gst-launch for all of
this, see above. Please write some proper code instead of using
gst-launch; gst-launch is only a testing tool.
So you keep telling me :-) I think gst-launch is a very useful
application :-)))) for which I can only be thankful.
Okay, I have tried to avoid it for years after looking at the documentation,
but there seems to be no other way around it. It's a little odd that
somebody has not already written a CLI GStreamer-based app that can
play out (to your choice of sink) through multiple independent
pipelines synchronously.
I linked you to one in my previous mail :) That does exactly what we're
talking about here all the time, and uses RTSP to exchange the stream
parameters.
--
Sebastian Dröge, Centricular Ltd · http://www.centricular.com
Peter Maersk-Moller
2015-06-25 08:03:30 UTC
Permalink
Hi Sebastian.

Thanks for the hints and tips. I will try both your RTSP example and a
manual setup with just RTCP and rtpsession/rtpbin with the suggested
parameters for version 1.5.2. The goal is sync and minimal delay. Perhaps
both solutions will provide the same delay.

Best regards
Peter MM