I posted a while back "complaining" about the lack of a Theora player on the iPhone. Porting libtheora (and libogg/libvorbis) was relatively painless and appears to be working. I'm up against a tougher challenge now... rendering the video!
I tried using the brand-new SoC-funded SDL port to the iPhone to get the player_example.c code up and running, but ran into a major roadblock:

The SDL port to the iPhone uses an OpenGL ES driver, and the OpenGL drivers for SDL don't support YUV overlays! So the problem (at least for now) is how I can get the YUV-encoded frames from a Theora movie rendered on the iPhone.
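For context, this is roughly the path player_example.c takes with desktop SDL 1.2 (the names and the flat memcpys here are just illustrative; real code has to respect the decoder and overlay strides):

#include <SDL.h>
#include <string.h>

static SDL_Overlay *yuv_overlay;

/* Push one decoded 4:2:0 frame to the screen via an SDL 1.2 YUV overlay. */
void show_frame(SDL_Surface *screen,
                const unsigned char *y, const unsigned char *u,
                const unsigned char *v, int w, int h)
{
    SDL_Rect rect;

    if (!yuv_overlay)
        yuv_overlay = SDL_CreateYUVOverlay(w, h, SDL_YV12_OVERLAY, screen);
    if (!yuv_overlay)
        return;

    SDL_LockYUVOverlay(yuv_overlay);
    /* YV12 plane order: pixels[0]=Y, pixels[1]=V, pixels[2]=U. This assumes
     * the source strides match the overlay pitches, which is a shortcut. */
    memcpy(yuv_overlay->pixels[0], y, (size_t)yuv_overlay->pitches[0] * h);
    memcpy(yuv_overlay->pixels[1], v, (size_t)yuv_overlay->pitches[1] * (h / 2));
    memcpy(yuv_overlay->pixels[2], u, (size_t)yuv_overlay->pitches[2] * (h / 2));
    SDL_UnlockYUVOverlay(yuv_overlay);

    rect.x = 0;
    rect.y = 0;
    rect.w = (Uint16)w;
    rect.h = (Uint16)h;
    SDL_DisplayYUVOverlay(yuv_overlay, &rect);
}

It's that last SDL_DisplayYUVOverlay() call that has no equivalent in the GL ES backend, so the YUV-to-RGB conversion has to happen somewhere else.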
The options I've considered are:
a) convert from YUV to RGB in code (this probably won't be fast enough for realtime playback; there's a rough sketch of what I mean at the end of this mail)
b) convert from YUV to RGB on the GPU using a pixel shader (if it sounds like I don't know what I'm talking about, that's correct... I've simply seen smarter people talk about doing the conversion on the GPU with pixel shaders)
c) convert the frames to still images in a format the iPhone can display natively (doesn't JPEG use the same YUV encoding that Theora does?) and render them as a series of images (I have no idea whether this is feasible, or whether the conversion/decoding cost would be worse than just doing a or b above)
d) something else?
e) give up!

So any feedback would be great!

On a positive note, the iPhone SDK's NDA should be lifted very soon.
On a negative note, the guy who ported VLC to the iPhone (google vlc4iphone) hasn't released any of the source, so I can't even cheat to see how he's doing it, and asking him directly hasn't worked.
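In case it helps make option (a) concrete, this is the kind of conversion loop I have in mind. The plane pointers, strides, and the RGB565 output are my own assumptions; the inputs would be whatever theora_decode_YUVout() hands back:

#include <stdint.h>

static inline uint8_t clamp255(int v) {
    return v < 0 ? 0 : (v > 255 ? 255 : (uint8_t)v);
}

/* 4:2:0 YCbCr -> RGB565 using the usual BT.601 integer approximation
 * (video-range luma). One output pixel per luma sample; dst must hold
 * width*height 16-bit pixels. */
void yuv420_to_rgb565(const uint8_t *yp, int y_stride,
                      const uint8_t *up, const uint8_t *vp, int uv_stride,
                      int width, int height, uint16_t *dst)
{
    for (int row = 0; row < height; row++) {
        const uint8_t *y = yp + row * y_stride;
        const uint8_t *u = up + (row / 2) * uv_stride;  /* chroma is half-res */
        const uint8_t *v = vp + (row / 2) * uv_stride;
        for (int col = 0; col < width; col++) {
            int c = y[col] - 16;
            int d = u[col / 2] - 128;
            int e = v[col / 2] - 128;
            int r = clamp255((298 * c + 409 * e + 128) >> 8);
            int g = clamp255((298 * c - 100 * d - 208 * e + 128) >> 8);
            int b = clamp255((298 * c + 516 * d + 128) >> 8);
            /* pack into 5-6-5 */
            *dst++ = (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
        }
    }
}

The idea would then be to glTexSubImage2D that buffer into a power-of-two GL_RGB / GL_UNSIGNED_SHORT_5_6_5 texture each frame and draw a textured quad with it. Option (b) would be the same math moved onto the GPU, assuming the iPhone's GL ES stack exposes programmable shaders at all.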
Steve