[Theora-dev] Theora decode problem
Shana Cooke (Gitnick)
scooke at viack.com
Fri Oct 1 14:48:51 PDT 2004
I am trying to write a sample app that uses Theora for encoding/decoding
a network video stream. Unfortunately, I keep running into this error:
Run-Time Check Failure #3 - The variable 'MVect' is being used without
being defined.
It always happens during decoding, at line 467 in decode.c. The issue is
that CodingMethod is equal to 5 (CODE_GOLDENFRAME), which doesn't
initialize the motion vectors - and yet the code enters this branch anyway
and promptly blows up because the motion vectors are garbage.
I'm not sure what I'm doing wrong. Here's my encoder initialization:
theora_info_init(&m_ti);
m_ti.width=176;
m_ti.height=144;
m_ti.frame_width=176;
m_ti.frame_height=144;
m_ti.offset_x=0;
m_ti.offset_y=0;
m_ti.fps_denominator=(ogg_uint32_t)1000000;
m_ti.fps_numerator=10 * m_ti.fps_denominator; // 10 frames per second
m_ti.aspect_numerator=1;
m_ti.aspect_denominator=1;
m_ti.colorspace=OC_CS_UNSPECIFIED;
m_ti.target_bitrate=0;      // 0 if we are specifying quality
m_ti.quality=5;             // from 0 to 10; 10 is highest quality but larger files
m_ti.dropframes_p=0;
m_ti.quick_p=1; // should always be 1
m_ti.keyframe_auto_p=1;
m_ti.keyframe_frequency=64;
m_ti.keyframe_frequency_force=64;
m_ti.keyframe_data_target_bitrate=(ogg_uint32_t)(m_ti.target_bitrate*1.5);
m_ti.keyframe_auto_threshold=80;
m_ti.keyframe_mindistance=8;
m_ti.noise_sensitivity=1;
theora_encode_init(&m_td,&m_ti);
theora_info_clear(&m_ti);
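For context, here is the per-frame path I'm using after that init (a minimal sketch against the legacy libtheora API; the function name and the hard-coded 176x144 buffer layout are assumptions, not code from my app):

```c
#include <theora/theora.h>

/* Sketch: submit one YUV 4:2:0 frame to the encoder and pull out the
   resulting packet(s). Assumes td was set up with theora_encode_init()
   exactly as in the init code above. */
static int encode_frame(theora_state *td,
                        unsigned char *y, unsigned char *u, unsigned char *v)
{
    yuv_buffer yuv;
    ogg_packet op;

    yuv.y_width  = 176; yuv.y_height  = 144; yuv.y_stride  = 176;
    yuv.uv_width =  88; yuv.uv_height =  72; yuv.uv_stride =  88;
    yuv.y = y; yuv.u = u; yuv.v = v;

    if (theora_encode_YUVin(td, &yuv) != 0)
        return -1;                       /* encoder rejected the frame */

    /* Loop in case the encoder emits more than one packet. */
    while (theora_encode_packetout(td, 0, &op) > 0) {
        /* op.packet / op.bytes is what goes on the wire, wrapped in
           whatever Ogg/network framing the stream uses. */
    }
    return 0;
}
```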
Here is my decoder initialization. Since I can join the stream of Theora
packets (enclosed in Ogg packets) at any time, I don't get headers, but I
know what format they're in anyway, since I'm the one sending them in the
first place.
theora_info_init(&m_ti);
m_ti.width=176;
m_ti.height=144;
m_ti.frame_width=176;
m_ti.frame_height=144;
m_ti.offset_x=0;
m_ti.offset_y=0;
m_ti.fps_denominator=(ogg_uint32_t)1000000;
m_ti.fps_numerator=10 * m_ti.fps_denominator; // 10 frames per second
m_ti.aspect_numerator=1;
m_ti.aspect_denominator=1;
m_ti.colorspace=OC_CS_UNSPECIFIED;
m_ti.target_bitrate=0;      // 0 if we are specifying quality
m_ti.quality=5;             // from 0 to 10; 10 is highest quality but larger files
m_ti.dropframes_p=0;
m_ti.quick_p=1; // should always be 1
m_ti.keyframe_auto_p=1;
m_ti.keyframe_frequency=64;
m_ti.keyframe_frequency_force=64;
m_ti.keyframe_data_target_bitrate=(ogg_uint32_t)(m_ti.target_bitrate*1.5);
m_ti.keyframe_auto_threshold=80;
m_ti.keyframe_mindistance=8;
m_ti.noise_sensitivity=1;
theora_decode_init(&m_td,&m_ti);
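On the receive side, since the stream is joined mid-flight, the first packet the decoder sees may be an interframe that references frames it has never decoded. A sketch of how I'm gating on that (again using the legacy libtheora API; the wrapper function and the got_keyframe flag are my own assumptions):

```c
#include <theora/theora.h>

/* Sketch: feed a received packet to the decoder, dropping interframes
   until the first keyframe arrives so the decoder never dereferences
   reference frames it has not seen. *got_keyframe persists across calls. */
static int decode_packet(theora_state *td, ogg_packet *op, int *got_keyframe)
{
    yuv_buffer yuv;

    if (!*got_keyframe) {
        if (theora_packet_iskeyframe(op) != 1)
            return 0;                    /* drop until a keyframe shows up */
        *got_keyframe = 1;
    }

    if (theora_decode_packetin(td, op) != 0)
        return -1;                       /* corrupt or out-of-order packet */

    theora_decode_YUVout(td, &yuv);      /* yuv now points at the frame */
    return 1;
}
```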
Any advice? I know the packets are received intact on the other end, but
I keep hitting this run-time check failure on the same line. There must be
something wrong with my encoder/decoder initialization that causes it to
enter this wrong code segment.
Shana