Simple low latency audio streaming server?

Short question: Does anyone have any suggestions for a low-CPU/RAM, live (line/mic in), audio-only, one-way, low-latency streaming server that can be listened to via a web browser?

Longer story:
My church has a little Orange Pi in a 3D printed case mounted under the sound desk. This model of OPi has a balanced mic input, which is pretty rare but very handy in this case since we have a balanced output to give it.
I set it up hastily during coronavirus with Icecast to allow sick people to listen to the service, since we don’t do any other sort of live streaming. It worked fairly well for this, but Icecast seems to create an ever-increasing latency: starting around 4-5 seconds, growing steadily to around 30 seconds after an hour, and reaching several minutes if you leave it long enough. This is at the server side - disconnecting and reconnecting gives you the same long delay.
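(As far as I can tell, the initial few seconds come from Icecast's burst-on-connect and listener queue, which can be shrunk in icecast.xml - values below are illustrative, not tuned - but that wouldn't explain the delay growing over time.)

```xml
<!-- icecast.xml excerpt: shrink the connect burst so new listeners start
     closer to live. Values are illustrative, not tuned. -->
<limits>
    <queue-size>65536</queue-size>        <!-- max bytes queued for lagging listeners -->
    <burst-on-connect>0</burst-on-connect>
    <burst-size>8192</burst-size>         <!-- only used if burst is on -->
</limits>
```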

Recently I’ve been looking for something better… Ideally around 100ms latency when on the LAN, and accessible via a simple web browser so we can allow anyone to just enter the URL. It obviously also needs to be able to take live audio from the sound card.

I’d prefer to apt-get something from the Debian repos, but don’t mind manual installs of stuff if necessary. Open to pretty much any self-hosted solution at this stage. I’ve found a few projects on GitHub that sound like they might work, but haven’t had a chance to try them yet.

The issue here is caused by buffering and the protocol you’re using. WebRTC is your best bet for keeping latency down with “standard tools”.

GitHub - bluenviron/mediamtx (ready-to-use SRT / WebRTC / RTSP / RTMP / LL-HLS media server and proxy for reading, publishing, proxying, recording and playing back video and audio streams) and/or GitHub - porjo/babelcast (a WebRTC audio broadcast server) might be of interest, along with ffmpeg.
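Roughly, the chain would be: ffmpeg captures the line/mic input from ALSA, encodes it to Opus, and publishes to mediamtx over RTSP; listeners just open mediamtx's built-in WebRTC player in a browser. A sketch, assuming mediamtx's default ports (8554 RTSP, 8889 WebRTC) - the ALSA device and the stream name "service" are placeholders:

```sh
# Capture from the sound card, encode to Opus, publish to mediamtx via RTSP.
# "hw:1,0" and "service" are placeholders - check your card with `arecord -l`.
ffmpeg -f alsa -i hw:1,0 \
       -ar 48000 -c:a libopus -b:a 64k -application lowdelay \
       -f rtsp rtsp://localhost:8554/service

# Listeners then browse to mediamtx's WebRTC page:
#   http://<server-ip>:8889/service
```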


Buffering was definitely an issue with VLC accessing the Icecast stream - by default it would be over a minute behind with its fixed buffer and the low bit rate I was running. I did bring that right down (see the VLC line below) and it got much better, but it still worsened over time at the server end.
Also, telling other people how to adjust the buffers in clients wouldn’t be fun, so switching to WebRTC and browsers is definitely a better option.
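In case it helps anyone else stuck on Icecast for now, the VLC-side tweak was just the network cache (in milliseconds), roughly like this - the mount name is a placeholder:

```sh
# Tell VLC to buffer ~500 ms instead of its much larger default
vlc --network-caching=500 http://server:8000/stream.ogg
```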

That first link looks good. I had stumbled across that when first looking but lost it and couldn’t find it again :rofl:
And they provide a docker setup I can test with… I think I know what I’ll be doing later.
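If anyone else wants to try the same quick test, the one-liner from their README is roughly this (host networking keeps the RTSP/WebRTC ports simple):

```sh
# Quick test run of mediamtx with its default config
docker run --rm -it --network=host bluenviron/mediamtx:latest
```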
