Using video streaming formats with VideoPlayer

Hi! What low-latency stream formats are supported by the video player?

Original Post on Discord

by user 726710936823398470


Hello Jochri! The VideoPlayer component in Needle Engine supports media track constraints following the MediaTrackConstraints API, which is used to specify a variety of video and audio requirements and settings (see the documentation).

For low-latency stream formats specifically, the Needle Engine documentation doesn't provide a list; the engine aligns with web standards, so any format that web browsers support through these APIs can be used, such as WebRTC for real-time communication. For more advanced or specific requirements, you would typically need to ensure that both the client's browser and the server-side technology support low-latency streaming for the chosen format.
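
For reference, this is how MediaTrackConstraints are used on the web platform, independent of Needle Engine; a minimal sketch where the concrete values (resolution, frame rate) are just examples:

```ts
// Minimal sketch: requesting a video track with MediaTrackConstraints.
// The browser picks the best matching device/settings for these constraints.
async function getCameraStream(): Promise<MediaStream> {
  const constraints: MediaStreamConstraints = {
    video: {
      width: { ideal: 1280 },
      height: { ideal: 720 },
      frameRate: { ideal: 30, max: 60 },
    },
    audio: false,
  };
  return navigator.mediaDevices.getUserMedia(constraints);
}
```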


Is there a list of supported formats? Where do I find documentation about MediaTrackConstraints?

by user 726710936823398470

I found information related to ScreenCaptureOptions and the ScreenCapture component, but nothing specific about the list of supported formats.

As for MediaTrackConstraints, the Needle Engine documentation doesn't contain specific details about it; refer to the general web documentation on MediaTrackConstraints (e.g. on MDN) for more information. If there's anything else you'd like me to look for, please let me know!


@herbst🌵 Can you assist?

by user 726710936823398470

Please don’t ping team members. Time zones and work hours are a thing.

Sry :pleading_face:

by user 726710936823398470

The bot above is right – we're using the underlying web APIs directly, so anything that can be played back in a browser's <video> element should work.
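
A quick way to verify this for a given source is to check whether the URL plays in a plain <video> element, outside the engine; a sketch (the URL is a placeholder):

```ts
// Sanity check: if a URL plays in a plain <video> element,
// it should also be usable as a video source in the engine.
const video = document.createElement("video");
video.src = "https://example.com/stream.mp4"; // placeholder URL
video.muted = true;       // autoplay policies usually require muted playback
video.playsInline = true;
document.body.appendChild(video);
video.play().catch((err) => console.warn("Playback failed:", err));
```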

What are you specifically trying to do?

I'm trying to find the best way to convert an NDI stream into Needle Engine for synced distribution, currently using the screen share sample.

by user 726710936823398470

NDI is a UDP-based protocol as far as I'm aware. I don't believe it can be received in a web browser directly.

The NDI Tools can convert the stream to a virtual webcam, for example, and you can receive that in the browser; I believe the tool is called "NDI Webcam Input".
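
To pick the NDI virtual webcam among several inputs, you can enumerate devices and select by label; a sketch (matching on the label "NDI" is an assumption about how the driver names itself):

```ts
// Sketch: find the NDI virtual webcam among the available video inputs.
// Note: device labels are only populated after camera permission is granted.
async function getNdiWebcamStream(): Promise<MediaStream> {
  // Ask for any camera first so that labels become readable, then release it.
  const probe = await navigator.mediaDevices.getUserMedia({ video: true });
  probe.getTracks().forEach((t) => t.stop());

  const devices = await navigator.mediaDevices.enumerateDevices();
  const ndi = devices.find(
    (d) => d.kind === "videoinput" && d.label.includes("NDI")
  );
  if (!ndi) throw new Error("No NDI virtual webcam found");

  // Request that specific device via its deviceId.
  return navigator.mediaDevices.getUserMedia({
    video: { deviceId: { exact: ndi.deviceId } },
  });
}
```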

Yes, that method works, but it's limited to one camera input and is browser-based. It would be easier and more scalable to display a stream directly for everyone in the room. Imagine a few hundred people with XR glasses at a concert.

by user 726710936823398470

For that you’d typically use WebRTC and have a proper server that can serve that many people. NDI isn’t really designed for hundreds of endpoints.

So from an architecture standpoint, if you want to serve video for a virtual concert, you’d basically pipe the data via NDI (your production stream) into a server that turns it into WebRTC or RTMP/HLS.
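
On the client side, receiving such a WebRTC stream might look roughly like this; the signaling call (sendOfferToServer) is a hypothetical placeholder for whatever your media server expects:

```ts
// Receive-only WebRTC sketch. sendOfferToServer is a hypothetical
// placeholder: a real setup needs a signaling channel to an SFU/media server.
declare function sendOfferToServer(
  offer: RTCSessionDescription
): Promise<RTCSessionDescriptionInit>;

async function watchStream(video: HTMLVideoElement): Promise<void> {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
  });

  // We only want to receive video, not send it.
  pc.addTransceiver("video", { direction: "recvonly" });

  pc.ontrack = (event) => {
    video.srcObject = event.streams[0];
    void video.play();
  };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  const answer = await sendOfferToServer(pc.localDescription!);
  await pc.setRemoteDescription(answer);
}
```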

One simple way to test would probably be to use OBS for conversion – it can take NDI and send the data to an RTMP server. There are a few RTMP servers that you can also run locally for testing (e.g. MonaServer).

That’s what I’m trying 😃

by user 726710936823398470

Which software is that?

by user 726710936823398470

:+1: that looks similar to what OBS does. Which RTMP server are you using?

https://cph-msl.akamaized.net/hls/live/2000341/test/master.m3u8 from players.akamai.com gives me "VideoPlayer renderTexture not implemented yet. Please use material override instead." – but that’s also not working.

by user 726710936823398470
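
For reference: HLS (.m3u8) only plays natively in Safari; other browsers need a helper library such as hls.js. Whether the VideoPlayer component handles this internally isn't covered above, but this is what plain HLS playback in a browser looks like; a minimal sketch:

```ts
import Hls from "hls.js";

// Minimal HLS playback sketch using the hls.js library.
function playHls(video: HTMLVideoElement, src: string): void {
  if (video.canPlayType("application/vnd.apple.mpegurl")) {
    // Safari can play HLS natively.
    video.src = src;
    void video.play();
  } else if (Hls.isSupported()) {
    // Everywhere else, hls.js feeds the stream through Media Source Extensions.
    const hls = new Hls();
    hls.loadSource(src);
    hls.attachMedia(video);
    hls.on(Hls.Events.MANIFEST_PARSED, () => void video.play());
  } else {
    console.warn("HLS is not supported in this browser");
  }
}
```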