[Icecast-dev] /* Check for midi header in logical stream */
Rücker Thomas
thomas.ruecker at tieto.com
Sat Jun 23 11:50:10 PDT 2012
Hi,
On 21/06/12 16:46, Marc wrote:
>
> 2012/6/21 Andrés González <acandido at hi-iberia.es>
>
> On 20/06/12 15:01, Marc wrote:
>
>> as a long-time Macintosh user and musician/producer/programmer, I
>> am very upset that another great technology (DSS) vanished
>> because of HTTP streaming, so I turned my interest
>> towards Icecast, which seems a fantastic and evolved media
>> streaming server.
>>
>> I am very interested in MIDI, especially the possibility to *sync
>> Audio with MIDI*.
>> So my question: would it be possible to stream a vocal track, for
>> example, via Icecast and synchronise a receiver via MIDI clock?
>
> If I understood you correctly, Icecast will do the job on the server
> side; the complex part is the receiver. I would try GStreamer for
> that.
>
>
> Thanks for the reply. I was looking around on the interwebs and I
> found nothing other than "subscription"-based services like
> *ejamming.com*; it's expensive and difficult to
> handle, and a musician friend did not get it to work (port settings and
> mapping, no UPnP).
>
> Imagine one lead vocal track could trigger several sound sources
> around the world. Do you think the *latency* from MIDI would be a
> problem, for drums for example?
>
I have this underlying feeling that we're not clear about how this is
supposed to fit together.
a) HTTP streaming can have very significant latency due to buffering in
the listener software (only a little is added by Icecast itself). In total
it will easily be on the order of 1-30 s; see the receiver sketch after
this list.
b) two separate clients will never be exactly in sync, streaming-wise (see a).
c) I'm not aware of any currently existing software that would mux
Vorbis and MIDI in an Ogg container.
d) nor am I aware of software that can demux and use Vorbis and MIDI.
(I'm not saying c/d does not exist, but I haven't heard of it)
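To illustrate (a) on the receiving end: most of the end-to-end delay sits in
the player's prebuffer, not in Icecast. Below is a minimal listener sketch,
assuming a GStreamer 1.x install with PyGObject; the stream URL and the
2-second buffer value are placeholders of my own, not anything from this
thread.

  import gi
  gi.require_version("Gst", "1.0")
  from gi.repository import Gst, GLib

  Gst.init(None)

  # playbin pulls the Ogg/Vorbis mount over HTTP, decodes and plays it.
  player = Gst.ElementFactory.make("playbin", "icecast-listener")
  player.set_property("uri", "http://stream.example.org:8000/vocals.ogg")
  # Shrink the client-side prebuffer (value is in nanoseconds); this buffer,
  # plus the server's burst-on-connect, is where most of the 1-30 s comes from.
  player.set_property("buffer-duration", 2 * Gst.SECOND)

  player.set_state(Gst.State.PLAYING)
  loop = GLib.MainLoop()
  bus = player.get_bus()
  bus.add_signal_watch()
  bus.connect("message::eos", lambda *_: loop.quit())
  bus.connect("message::error", lambda *_: loop.quit())
  try:
      loop.run()
  finally:
      player.set_state(Gst.State.NULL)

Even with the buffer turned down, the network and the server's burst still add
their own share, and nothing here brings two listeners into exact sync (point b).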
So how would this work?
The singer would just sing along to a metronome or a prerecorded MIDI track,
and that would be muxed into a stream that goes through Icecast to several
musicians?
The singer would then not have the chance to hear what anyone plays 'out
there', as that would arrive with far too much delay and not synchronized.
How do all the instruments fit in then? Are they supposed to be
'hearing' each other (in terms of some sort of global MIDI network)?
Or does everyone just jam along to the lead singer plus metronome, and
afterwards everything is fitted together (thanks to the MIDI time code)?
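For the source side of that first scenario (singer plus metronome going up to
the server), here is a rough sketch of what I would expect, again assuming
GStreamer with PyGObject and its shout2send element; host, password and mount
are placeholders. Note there is no MIDI anywhere in this pipeline, only audio,
which is exactly the gap that c/d above describe.

  import gi
  gi.require_version("Gst", "1.0")
  from gi.repository import Gst, GLib

  Gst.init(None)

  # Capture the vocal mic, encode to Ogg/Vorbis and push it to an Icecast mount.
  pipeline = Gst.parse_launch(
      "autoaudiosrc ! audioconvert ! vorbisenc ! oggmux ! "
      "shout2send ip=icecast.example.org port=8000 "
      "password=hackme mount=/vocals.ogg"
  )
  pipeline.set_state(Gst.State.PLAYING)

  loop = GLib.MainLoop()
  try:
      loop.run()
  finally:
      pipeline.set_state(Gst.State.NULL)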
I may be missing something obvious, but I would like to avoid a long
discussion only to find out afterwards that each of us was talking about
something completely different.
Cheers
Thomas
PS: Excuse my MIDI illiteracy; it's been 10 years since I had my
keyboard hooked up via MIDI to my PC with tracking software.