RTSP is a very common IP camera protocol. If Azure added the capability to directly ingest RTSP, it would greatly improve the ability to take advantage of Azure's services for IP cameras. Today these cameras always require the additional overhead of an encoder/transcoder just to convert from RTSP to RTMP, which adds delay and processing requirements and is zero value add from a user's or developer's perspective.
63 votes
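Until direct RTSP ingest exists, the workaround is exactly the relay described above. A minimal sketch of building that relay as an FFmpeg command (the camera and ingest URLs are hypothetical placeholders); stream copy avoids a transcode but still adds the extra hop the post complains about:

```python
# Sketch of the RTSP -> RTMP relay workaround described above.
# The camera and ingest URLs below are placeholders, not real endpoints.

def build_relay_command(rtsp_url: str, rtmp_url: str) -> list:
    """Build an ffmpeg argv that repackages RTSP to RTMP without re-encoding."""
    return [
        "ffmpeg",
        "-rtsp_transport", "tcp",   # TCP is more reliable than UDP over the internet
        "-i", rtsp_url,             # IP camera source
        "-c", "copy",               # remux only: no transcode, but still an extra hop
        "-f", "flv",                # RTMP carries FLV
        rtmp_url,                   # live channel ingest URL
    ]

cmd = build_relay_command(
    "rtsp://camera.local:554/stream1",
    "rtmp://example.channel.mediaservices.windows.net:1935/live/stream1",
)
print(" ".join(cmd))
```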
AMS is great, but it's intended for one-to-many applications.
I need to create a one-to-one (one-way) streaming application with very low latency (~2 seconds).
But at the moment only adaptive protocols are available for the output stream, so this is not possible.
If you could add an RTMP output stream and reduce the latency, it would be wonderful.
40 votes
We recently added a Low Latency option on our v3 Live Event API to bring latency down to 8-10 seconds e2e.
See the blog post here – https://azure.microsoft.com/en-us/blog/what-s-new-in-azure-media-services-video-processing/
We will next begin work on Ultra Low Latency (and CMAF low latency), which targets the ~2 second range. Keep in mind that latencies at that level do not scale well to large audiences with high stability on the client side; the client still needs to maintain about a 1-second buffer for the stream to stay stable.
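The Low Latency option mentioned above is set when the v3 Live Event is created. A sketch of the request body, as a plain dict; the field layout follows the 2018-era v3 REST API and should be treated as an assumption to verify against the current reference:

```python
# Sketch of a v3 Live Event request body enabling the Low Latency option
# discussed above. Field names follow the 2018-era v3 REST API; treat the
# exact layout as an assumption and check the current documentation.

def live_event_body(location: str, low_latency: bool = True) -> dict:
    return {
        "location": location,
        "properties": {
            "input": {"streamingProtocol": "RTMP"},
            # "LowLatency" trades buffer depth for the 8-10 s e2e figure above
            "streamOptions": ["LowLatency"] if low_latency else ["Default"],
        },
    }

body = live_event_body("westus2")
print(body["properties"]["streamOptions"])
```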
On a live encoder channel, when you stop pushing a live feed to the ingest points, the encoded feeds show a default slate image.
Could this default slate be replaced by a custom image per channel?
29 votes
Thanks for the feedback, this has been added to our backlog for the live encoder.
Currently, we cannot use existing uploaded assets for "live" streaming. This would be useful when rerunning a broadcast.
One scenario: viewers can watch the same recorded conference video and chat about it from anywhere, on their own devices — or simply rerun a program as in traditional broadcasting.
21 votes
Add support for the Media Indexer. Bonus points for live audio translation, like Skype Translator.
14 votes
Multiple cameras are set up on the same field, and we would like users to be able to select each camera's video.
In this situation we must prepare a channel for each camera on Azure Media Services. The timestamp information exactly synchronizes the video from all the cameras.
We would like a function to synchronize data between channels on Azure Media Services.
13 votes
Add support for playing a playlist of VOD streams through a live streaming channel, to emulate the classic TV format. Thank you.
6 votes
I am currently using live encoding in Azure Media Services. Traditionally, if I encode the feed on premises with Wirecast, I can add my own logo in Wirecast. If I use live encoding instead, can I add my own logo without Wirecast?
6 votes
We will review the request to add overlay support during live encoding.
A key rotation feature would be helpful for 24x365 live streaming when the content holder requires it; this is supported by PlayReady 3.0 or later. We hope this can be provided as an option.
5 votes
Currently, multiple audio streams (multi-language) for live encoding ingest only work for RTP single-bitrate video streams. Please add support for multiple language tracks in live multi-bitrate Smooth streams pushed from a client encoder. Scenario: an Elemental Live encoder creates a multi-language, multi-bitrate Smooth stream that gets pushed to an AMS live channel over a public internet connection. This is needed because the RTP (UDP) protocol is not reliable for pushing over public internet connections, and the Azure Media Encoder encoding profiles do not always meet the specific scenarios or encoding options an on-premises live encoder can offer.
5 votes
Video-based examples of live streaming, with C# code as an MVC project, breaking down the API functionality step by step. This would probably involve a series of videos. Examples would include connecting to the account, creating the asset, allowing the broadcast to be started remotely, running the broadcast, ending the broadcast, and exporting the resulting asset to a storage account.
4 votes
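The lifecycle the requested series would cover can be sketched as an ordered sequence of steps. A hypothetical sketch (the actual AMS API calls are stubbed; only the ordering from the post above is shown):

```python
# Hypothetical sketch of the broadcast lifecycle the requested video series
# would walk through. The Azure calls are stubbed; only the ordering of the
# steps (connect -> create asset -> start -> run -> stop -> export) is shown.

class BroadcastSession:
    STEPS = ["connect", "create_asset", "start", "run", "stop", "export"]

    def __init__(self):
        self.completed = []

    def advance(self, step: str) -> None:
        expected = self.STEPS[len(self.completed)]
        if step != expected:
            raise RuntimeError(f"expected {expected!r}, got {step!r}")
        # Real code would call the AMS API here (stubbed in this sketch).
        self.completed.append(step)

session = BroadcastSession()
for step in BroadcastSession.STEPS:
    session.advance(step)
print(session.completed)
```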
Today, when we try to use a live encoder to ingest into an AMS live channel and it does not work, we have very limited information on why the ingest was refused. It would be great if an option were provided to expose the detailed exception info via the API, portal, etc.
4 votes
In live streaming we have a bug with Flash Media Live Encoder when encoding to channels while trying to keep the frame rate. I think the focus should be on solving these issues and improving compatibility with other programs too; this also happens with OBS and FFmpeg.
4 votes
As an advisory member of the SRT Alliance, Microsoft should consider adding SRT support to allow direct ingest of SRT video streams into Azure Media Services. This would allow very low latency, secure, internet-resilient video (up to 2% sustained packet loss) to be sent for playback.
3 votes
For live broadcasting with Media Services, we hope you will provide iOS and Android SDKs.
Only a little software is available at present.
3 votes
Right now we cannot record a live stream for longer than the maximum archive window length of 25 hours. If the live stream goes on for longer than 25 hours, we begin to lose data.
It would be useful to have a function that enables recording of a live stream without being restricted to a certain duration. For example, we want to continuously record a security camera stream that is active 24/7. The recorded assets could be in chunks of 1-hour duration.
3 votes
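The chunking the post suggests can be sketched as slicing the continuous timeline into fixed windows, each comfortably under the 25-hour archive limit:

```python
# Sketch of slicing a continuous stream into 1-hour archive chunks, as the
# request above suggests, to stay well under the 25-hour archive window.

def hourly_chunks(start_s: int, end_s: int, chunk_s: int = 3600) -> list:
    """Return (start, end) second offsets covering [start_s, end_s) in fixed chunks."""
    chunks = []
    t = start_s
    while t < end_s:
        chunks.append((t, min(t + chunk_s, end_s)))
        t += chunk_s
    return chunks

# A 3.5-hour stream becomes three full hours plus one half-hour tail chunk.
print(hourly_chunks(0, 12600))
```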
Currently, live streaming programs have to be started and stopped manually. It would be useful to be able to schedule when programs start and stop; this allows for precise timing and helps transition seamlessly between live events.
3 votes
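The scheduling above can be sketched as a polling loop that, on each tick, computes which programs are due to start or stop from their planned windows. Times here are plain epoch seconds and the actual AMS start/stop calls are out of scope:

```python
# Sketch of the program scheduler requested above: each tick computes which
# programs are due to start or stop. Times are plain epoch seconds; the
# actual AMS start/stop API calls are stubbed out of this sketch.

def due_transitions(now, programs):
    """programs: dicts with 'name', 'start', 'stop', 'running'.
    Returns (to_start, to_stop) name lists for one scheduler tick."""
    to_start = [p["name"] for p in programs
                if not p["running"] and p["start"] <= now < p["stop"]]
    to_stop = [p["name"] for p in programs
               if p["running"] and now >= p["stop"]]
    return to_start, to_stop

programs = [
    {"name": "morning-show", "start": 100, "stop": 200, "running": True},
    {"name": "evening-show", "start": 200, "stop": 300, "running": False},
]
# At t=200 the morning show stops and the evening show starts back-to-back,
# giving the seamless transition the request describes.
print(due_transitions(200, programs))
```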
Some high-end encoders support MPEG-DASH streaming. It would be valuable for Azure Media Services to support MPEG-DASH as an ingest format, feeding dynamic packaging, encryption, and manifest generation.
3 votes
We are working on a new DASH ingest protocol with the broader industry.
You can monitor work and participate here:
Once the standards work is complete, we will implement the solution.
Use 3 to 4+ fragmented ingest streams, embedded with VITC (SMPTE) timecode. Buffer at the AMS server for several seconds (this should be configurable, to allow extra time when satellite and terrestrial internet redundancy are utilized). Then analyze the 3+ streams for similarity; both video and audio would be cool. When the stream being pushed out to clients differs greatly from the others, automatically switch over. It would also be nice to get notifications — not necessarily to the encoders themselves, but maybe to a configurable HTTPS endpoint.
3 votes
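The switchover logic described above can be sketched as a majority check over per-window stream fingerprints: if the active stream disagrees with most of the redundant ones, fail over to a stream matching the majority. The fingerprint is a placeholder for whatever video/audio similarity metric is actually used:

```python
# Sketch of the redundancy check described above: compare fingerprints of
# the redundant ingest streams for one buffered window, and fail over when
# the active stream diverges from the majority. The fingerprint strings are
# placeholders for a real video/audio similarity metric.

def diverges(active: str, streams: dict) -> bool:
    """True if the active stream's fingerprint disagrees with most others."""
    others = [fp for name, fp in streams.items() if name != active]
    disagree = sum(1 for fp in others if fp != streams[active])
    return disagree > len(others) / 2

def pick_stream(active: str, streams: dict) -> str:
    """Keep the active stream unless it diverges; then pick a majority match."""
    if not diverges(active, streams):
        return active
    fingerprints = list(streams.values())
    majority = max(set(fingerprints), key=fingerprints.count)
    for name, fp in streams.items():
        if fp == majority:
            return name              # real code would also notify an HTTPS endpoint
    return active

streams = {"sat": "x", "fiber": "y", "lte": "y", "backup": "y"}
print(pick_stream("sat", streams))   # "sat" diverges, so a majority stream wins
```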
This is now under review by the Streaming team. Thank you for your feedback.
The use case is sending in-band metadata about what is currently being streamed, used by clients as cue points: for information display, to invoke time-aligned actions, and so on. Because the data is embedded within the stream, the application playing the video doesn't need to load data from two separate sources and doesn't need to do anything to keep the two data streams in sync.
3 votes
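Client-side handling of such cue points can be sketched as follows: each metadata event carries a presentation time and a payload, and time-aligned actions fire as playback crosses each cue. The event shape is an assumption for illustration:

```python
# Sketch of client-side cue point dispatch for the in-band metadata use
# case above: actions fire as playback time crosses each cue. The event
# shape ({"time", "payload"}) is an assumption for illustration.

def fire_cues(cues, prev_t, now_t):
    """Return payloads of cues whose time falls in (prev_t, now_t]."""
    return [c["payload"] for c in cues if prev_t < c["time"] <= now_t]

cues = [
    {"time": 5.0, "payload": "show-title"},
    {"time": 12.0, "payload": "ad-break"},
]
print(fire_cues(cues, 0.0, 10.0))   # only the first cue has elapsed by t=10
```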