Maxim Ershtein
2018-09-20 18:08:37 UTC
Hello,
I am trying to record a VP9/Opus-encoded live stream into an .mkv file. The
live stream is encoded by the Chrome browser and sent via WebRTC to
Unreal Media Server.
Unreal Media Server allows installing a WebRTC DirectShow Source filter:
http://umediaserver.net/components/index.html
This filter allows connecting to Unreal Media Server and receiving the
original stream.
It works fine in GraphEdit - I can render and play the stream.
But my goal is to record it with ffmpeg to an .mkv file with no transcoding.
So I am doing:
*ffmpeg -f dshow -show_video_device_dialog true -i video="Unreal WebRTC
Client Source":audio="Unreal WebRTC Client Source" -c:a copy -c:v copy
c:\temp\output.mkv*
According to the documentation, this should work.
Notice that I am doing *-show_video_device_dialog true*
This is because, like all Unreal source filters, this filter needs to be
initialized with the URL to pull the stream from. The URL is not persisted
anywhere, so each time you instantiate the filter you have to initialize it
with the URL again; the dialog lets you enter it.
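(In case the audio pin needs its own initialization as well, I assume the
equivalent on the audio side would be to also pass the dshow option
-show_audio_device_dialog true, something like:

ffmpeg -f dshow -show_video_device_dialog true -show_audio_device_dialog true
-i video="Unreal WebRTC Client Source":audio="Unreal WebRTC Client Source"
-c:a copy -c:v copy c:\temp\output.mkv

but I have not verified whether the audio side actually requires it.)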
But when I do it, ffmpeg reports *"Could not get media type"*.
*What am I doing wrong?*
Again, using the same dialog in GraphEdit works fine: I can render the pins
and play.
I am also looking for some way to automate that dialog - how do I pass the
URL to the filter from the command line? The filter's readme says:
"Configuration parameters can be provided via property page or via
IFileSourceFilter interface exposed by the filter. When configured via
IFileSourceFilter interface, initialization string needs to be provided as
first parameter of IFileSourceFilter::Load()
method."
So I guess ffmpeg could check whether the IFileSourceFilter interface is
supported and then call Load() with a command-line-supplied parameter. If
ffmpeg doesn't support that, any ideas on how to pass the URL to the filter
programmatically?
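By "programmatically" I mean something along the lines of the following
untested sketch (the CLSID and the URL string are placeholders - the real
GUID comes from the filter's registration, and only the standard COM and
IFileSourceFilter calls are assumed here):

#include <dshow.h>   // link with strmiids.lib and ole32.lib

// Placeholder GUID - substitute the CLSID the Unreal filter registers.
static const GUID CLSID_UnrealWebRTCSource =
    { 0x00000000, 0x0000, 0x0000, { 0, 0, 0, 0, 0, 0, 0, 0 } };

int main()
{
    CoInitialize(nullptr);

    IBaseFilter *source = nullptr;
    HRESULT hr = CoCreateInstance(CLSID_UnrealWebRTCSource, nullptr,
                                  CLSCTX_INPROC_SERVER, IID_IBaseFilter,
                                  reinterpret_cast<void **>(&source));
    if (SUCCEEDED(hr)) {
        IFileSourceFilter *fsf = nullptr;
        hr = source->QueryInterface(IID_IFileSourceFilter,
                                    reinterpret_cast<void **>(&fsf));
        if (SUCCEEDED(hr)) {
            // The per-instance URL goes in as the "file name" - this is the
            // initialization string the readme refers to.
            hr = fsf->Load(L"<stream URL here>", nullptr);
            fsf->Release();
        }
        // ...then add the filter to a graph, connect pins, run, etc.
        source->Release();
    }
    CoUninitialize();
    return SUCCEEDED(hr) ? 0 : 1;
}

But building a wrapper graph like this is only a fallback - the whole point
is to keep using ffmpeg's dshow input and remux with ffmpeg directly.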
Note that this is a pretty generic situation - I might have 10 ffmpeg
instances loading this filter, each using a different URL. So the URL is not
static and cannot sit in the registry or a config file. The same problem
exists with the other network source filters from Unreal, but I don't need
those, since ffmpeg can receive rtsp, rtmp and mpeg-ts streams by itself.
But ffmpeg, of course, cannot receive browser-encoded WebRTC streams with
VP9/Opus, hence my attempt to use the Unreal WebRTC DirectShow Source filter.