[Theora-dev] Problems with Theora DirectShow filters
theora-dev at huitl.de
Thu Sep 16 10:47:11 PDT 2004
First of all, thank you very much for the help. I was able to achieve some good
results.
On Wednesday 15 September 2004 10:40, Illiminable wrote:
> You should build a graph something like this if you want to capture...
> Audio Input Source --> Speex Encoder  --> }
>                                            } Ogg Mux Filter
> Video Input Source --> Theora Encoder --> }
I built my application to use a filter graph like that to capture A/V:
The problem is that I cannot play the recorded file. A bug in my application is
unlikely, because the same symptoms come up when I use GraphBuilder to
construct the playback graph.
This is how I play the stream: http://www.huitl.de/avcapdec.png
OGG Mux (Speex+Theora):
- BSPlayer: black window, doesn't advance
- theora decoder and speex decoder => black video window, doesn't advance,
timeout when trying to stop (cancel => stopped)
- only speex decoder, theora demux pin unattached => plays, sound is correct,
crashes at end
- only speex decoder, theora demux pin attached to null renderer => doesn't
advance, timeout when trying to stop (cancel => stopped)
- only theora decoder, speex demux pin unattached => video window opens,
doesn't advance
- only theora decoder, speex demux pin attached to null renderer => black
video window, doesn't advance, timeout when trying to stop (cancel =>
stopped)
As I didn't know whether it's okay to have a demux pin unattached, I attached
the null renderer. It looks as though it's not supposed to be used this way.
OGG Mux (only theora stream):
- BSPlayer shows first frame, doesn't advance, each time play is clicked, the
image gets more distorted (screenshot below)
- graphedit: ogg demux => theora decode => video renderer works, although much
too fast; timeout when the video stops (cancel => stopped)
OGG Mux (only speex stream):
- plays with BSPlayer, sound is correct
- graphedit: ogg demux => speex decode => default directsound device: works,
sound is correct
Is this what "Cannot handle any chained multiplexed files or streams" on
www.illiminable.org/ogg means? This wouldn't be a problem for me, as I'm
probably not going to use the Ogg mux filter (apart from testing) and will
rather keep audio and video as separate streams.
But then there's still the playback which is too fast. Could this be related
to "When transcoding theora, if a media decoder (WMV usually) doesn't
advertise its framerate, output may be time distorted"?
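To convince myself that a missing or wrong advertised framerate alone can
explain "too fast" playback, I sketched the timestamp arithmetic (illustrative
Python only, not the filter's actual code; `presentation_times` is a name I
made up):

```python
# Toy illustration: frame timestamps are typically frame_index / fps.
# If the upstream filter advertises a higher rate than the true one,
# every timestamp shrinks and playback runs proportionally too fast.

def presentation_times(frame_count, fps):
    """Presentation timestamp (seconds) of each frame at a given rate."""
    return [i / fps for i in range(frame_count)]

true_fps = 15.0      # e.g. a webcam actually capturing at 15 fps
assumed_fps = 30.0   # a decoder falling back to a default of 30 fps

correct = presentation_times(5, true_fps)
too_fast = presentation_times(5, assumed_fps)

# With double the assumed rate, each timestamp halves: one second of
# video gets scheduled into half a second of wall-clock time.
print(correct[4], too_fast[4])
```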
This is the timeout that appears quite often: http://www.huitl.de/timeout.png
Distorted video in BSPlayer: http://www.huitl.de/bsplayer_theoraonly.jpg
> >1. The webcam provides the color formats RGB24, I420 and IYUV. The Theora
> >encode filter only accepts YV12, so I can't just connect the webcam
> > capture pin with theora's yv12-in.
> Yes... this is a problem at the moment... probably in the next release i'll
> include a conversion from RGB types to YUV types.
That would be really nice :-)
> One thing... you should never "Render" encoding graphs... the automatic
> graph building is for decoding graphs only.
A'right, I'm not gonna do this again :-)
> >3. Feed data received over a network to the Theora decoder, which in turn
> >should pass it to a video renderer.
> This should already work... if you open a http resource (via full url) with
> the ogg demux source and render it... it should work fine. But again if you
> want to do packet level, and not page level network transport with custom
> or non-http protocols, you'll need to write a custom filter (or use
> something like the sample grabber filter) and do the protocol/network
> operations in the application code.
The sample grabber filter looks suitable for what I want to do. I haven't
tested it yet; that's the next step. But it should allow me to extract data at
the end of the filter chains (can I feed data into a chain's beginning, too?):
audio capture => encode => sample grabber
video capture => encode => sample grabber
sample grabber (/feeder?) => decode => audio out
sample grabber (/feeder?) => decode => video out
As these would be two separate filter chains (in the same graph), is there
some facility for A/V-sync? If I could extract packets this way, I would need
timestamps to synchronize both streams during playback.
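If each grabbed sample carries its media time (IMediaSample::GetTime, as far
as I understand the DirectShow docs), re-syncing might come down to merging
the two packet streams by timestamp. A toy sketch of that idea (all names are
made up, and real samples would be byte buffers, not strings):

```python
import heapq

def interleave(audio_packets, video_packets):
    """Merge two streams of (timestamp, payload) tuples by timestamp.

    Assumes both input streams are already sorted by timestamp, which
    holds if packets are taken from each chain in capture order.
    """
    return list(heapq.merge(audio_packets, video_packets, key=lambda p: p[0]))

# Timestamps in seconds; 0.02 s Speex frames vs. ~0.033 s Theora frames.
audio = [(0.00, "a0"), (0.02, "a1"), (0.04, "a2")]
video = [(0.00, "v0"), (0.033, "v1")]

for ts, payload in interleave(audio, video):
    print(ts, payload)
```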
What is the difference between packet and page level?
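From skimming RFC 3533, my current understanding is that a packet is the
codec's unit (e.g. one Theora frame) while a page is the container's framing
unit, whose segment table carries pieces of one or more packets. A quick
sketch of the 27-byte page header to check that understanding (illustrative
code; the dict keys are my own names):

```python
import struct

# Ogg page header per RFC 3533 (little-endian):
# capture(4s) version(B) header_type(B) granulepos(q)
# serial(I) page_seq(I) crc(I) n_segments(B) = 27 bytes,
# followed by the segment table of n_segments lacing values.

def parse_ogg_page_header(data):
    (capture, version, header_type, granulepos, serial,
     page_seq, checksum, n_segments) = struct.unpack_from("<4sBBqIIIB", data, 0)
    assert capture == b"OggS"
    segment_table = data[27:27 + n_segments]
    return {
        "version": version,
        "header_type": header_type,     # bit flags: continued / BOS / EOS
        "granulepos": granulepos,       # codec-defined stream position
        "serial": serial,               # identifies the logical stream
        "page_seq": page_seq,
        "body_len": sum(segment_table), # total bytes of packet data on page
    }

# Synthetic page header: BOS flag set, one segment of 30 bytes.
hdr = struct.pack("<4sBBqIIIB", b"OggS", 0, 0x02, 0, 0x1234, 0, 0, 1) + bytes([30])
print(parse_ogg_page_header(hdr))
```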
I'm sorry to bother you with these probably very basic DirectShow questions...
> ...i'm more focussed at the moment on getting the core of the code working
> for the majority of the uses... specifics for particular styles of
> application should probably be written by the application developer as they
> will be application specific. I don't really want to spend time at this
> stage writing custom network protocol filters for individual applications.
No need to do so as I will try this myself. Keep up the good work! :-)