Azure Media Services
-
Directly ingesting RTSP
RTSP is a very common IP camera protocol. If Azure added the capability to directly ingest RTSP, it would greatly improve the ability to take advantage of Azure's services for IP cameras. Today these cameras always require the additional overhead of an encoder/transcoder just to convert from RTSP to RTMP, which adds delay and processing requirements and is zero value add from a user's/developer's perspective. A sketch of the typical relay workaround is shown below.
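For context, here is a minimal sketch of the relay workaround described above, assuming FFmpeg is installed and using hypothetical camera and ingest URLs; it only re-wraps the RTSP feed as RTMP (stream copy assumes the camera already emits H.264/AAC, otherwise a transcode step is needed).

```csharp
using System.Diagnostics;

class RtspToRtmpRelay
{
    static void Main()
    {
        // Hypothetical endpoints: replace with the camera's RTSP URL and the
        // RTMP ingest URL of the AMS channel / live event.
        const string rtspSource = "rtsp://camera.local:554/stream1";
        const string rtmpIngest = "rtmp://example.channel.media.azure.net:1935/live/stream";

        // '-c copy' only re-wraps the stream (no transcode), yet it still means an
        // extra process, an extra hop, and extra latency: the overhead this request
        // asks Azure to remove by ingesting RTSP directly.
        var ffmpeg = Process.Start(new ProcessStartInfo
        {
            FileName = "ffmpeg",
            Arguments = $"-rtsp_transport tcp -i {rtspSource} -c copy -f flv {rtmpIngest}",
            UseShellExecute = false
        });

        ffmpeg.WaitForExit();
    }
}
```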
49 votes -
Reduce live streaming latency for Azure Media Services live streaming
AMS is great, but it's intended for one-to-many applications.
I need to create a one-to-one (one-way) stream application with very low latency (~2 seconds).
But at the moment, only adaptive protocols are available to output the stream, so it is not possible.
If you could integrate an RTMP output stream and reduce the latency, it would be wonderful.
35 votes
We recently added a Low Latency option on our v3 Live Event API to bring latency down to 8-10 seconds end to end.
See the blog post here – https://azure.microsoft.com/en-us/blog/what-s-new-in-azure-media-services-video-processing/
We will next begin work on Ultra Low Latency (and CMAF low latency), which is targeted at the 2s range, but keep in mind that latencies at that level do not scale well to large audiences with high stability at the client side. The client side still needs to maintain about a 1-second buffer for the stream to stay stable.
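For reference, a minimal sketch of turning on that Low Latency option with the v3 .NET management SDK (Microsoft.Azure.Management.Media); the resource group, account, and event names are placeholders, and the client is assumed to be authenticated elsewhere.

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.Management.Media;
using Microsoft.Azure.Management.Media.Models;

public static class LowLatencyLiveEventSample
{
    // 'client' is assumed to be an already-authenticated AzureMediaServicesClient;
    // the resource group, account, and event names below are placeholders.
    public static async Task<LiveEvent> CreateAsync(IAzureMediaServicesClient client)
    {
        var liveEvent = new LiveEvent(
            location: "West US 2",
            input: new LiveEventInput(LiveEventInputProtocol.RTMP),
            // The LowLatency stream option is what the response above refers to
            // (roughly 8-10 seconds end to end with a compatible player).
            streamOptions: new List<StreamOptionsFlag?> { StreamOptionsFlag.LowLatency });

        return await client.LiveEvents.CreateAsync(
            "myResourceGroup", "myMediaAccount", "myLowLatencyEvent",
            liveEvent, autoStart: false);
    }
}
```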
-
Define custom slate image on a live encoder channel which is not receiving ingest feed
On a live encoder channel, when you stop pushing a live feed to the ingest points, the encoded feeds show the default slate image.
Would it be possible to replace this default slate with a custom image in a channel?
27 votes
Thanks for the feedback, this has been added to our backlog for the live encoder.
-
Support for 24/7 live streaming
The live streaming channels should be able to stay online for a longer time window to support 24/7 live streams. The current service works as long as you don't use the transcoding functionality, but it has some issues.
For this type of stream, a channel has to persist at all times and the player URL has to be the same... even when the ingest RTMP has some issues. A really nice feature would be a static slate/image that is shown in the live stream when there is no RTMP signal for some time.
22 votes -
Reuse Asset for livestreaming
Currently, we cannot use existing uploaded Assets for "live" streaming. This would be useful when we rerun a broadcast.
One scenario: people could watch the same recorded conference video and chat about it from any place on their own devices. Or simply what you would expect from rerunning a program in broadcast.
13 votes -
Add support for Media Indexer on Live streams
Add support for Media Indexer.... Bonus for Live Audio Translation, like Skype Translator
12 votes -
Multiple cameras sync live streaming
Multiple cameras are set on the same field.
We hope users can select each camera's video. In this situation, we must prepare a channel for each camera on Azure Media Services.
Of course, timestamp information exactly syncs the video from all the cameras. We hope for a function that synchronizes data between the channels on Azure Media Services.
9 votes -
Support key rotation services for Live Streaming
A key rotation feature would be helpful for 24x365 live streaming when the content holder requires it; this is supported by PlayReady 3.0 or later. We hope this can be provided as an option.
5 votes -
Live streaming support for VOD like a playlist or classic TV
Add support to play a playlist of VOD streams through a live streaming channel to emulate the classic TV format. Thank you.
5 votes -
How to add own logo during live encoding
I am currently using the live encoding of Azure Media Services. Traditionally, if I encode the file on premises with Wirecast, I can add my own logo in Wirecast; but if I use live encoding, can I add my own logo without Wirecast?
5 votes
Azure Media Player is written in JavaScript using HTML5 and CSS standards. As such you can overlay anything you want over the video element.
We will review the request to add overlaying when encoding.
-
Add multi-audio support for Smooth live stream multi-bitrate ingest from client encoders
Currently multiple audio streams (multi-language) for live encoding ingest only work for RTP single-bitrate video streams. Please add support for multiple language tracks in live multi-bitrate Smooth streams pushed from a client encoder. Scenario: an Elemental Live encoder creates a multi-language, multi-video-bitrate Smooth stream that gets pushed to an AMS live channel over a public internet connection. This is needed because the RTP (UDP) protocol is not reliable for pushing over public internet connections, and the Azure Media Encoder encoding profiles do not always meet the specific scenarios or encoding options an on-premises live encoder can offer.
5 votes -
Video-based examples of live streaming with code, with steps broken down by API functionality. This would probably involve a series of videos.
Video-based examples of live streaming with C# code as an MVC project, with steps broken down by API functionality. This would probably involve a series of videos. Examples would include connecting to the account, creating the asset, allowing the broadcast to be remotely started, running the broadcast, ending the broadcast, and exporting the resulting asset to a storage account. A rough code sketch of these steps is shown below.
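As a rough sketch of those steps against the v3 .NET management SDK (not an official sample): every resource name below is a placeholder, authentication is assumed to happen elsewhere, and error handling is omitted.

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Management.Media;
using Microsoft.Azure.Management.Media.Models;

public static class LiveBroadcastWalkthrough
{
    // 'client' is assumed to be an already-authenticated AzureMediaServicesClient;
    // every name below is a placeholder for this sketch.
    public static async Task RunAsync(IAzureMediaServicesClient client)
    {
        const string rg = "myResourceGroup", account = "myMediaAccount", evt = "myLiveEvent";

        // 1. Create the live event the remote encoder will push RTMP to.
        var liveEvent = await client.LiveEvents.CreateAsync(rg, account, evt,
            new LiveEvent(location: "West US 2",
                          input: new LiveEventInput(LiveEventInputProtocol.RTMP)));

        // 2. Create the asset that will hold the archive in the attached storage account.
        var asset = await client.Assets.CreateOrUpdateAsync(rg, account, "myArchiveAsset", new Asset());

        // 3. A live output records the live event into that asset.
        await client.LiveOutputs.CreateAsync(rg, account, evt, "myLiveOutput",
            new LiveOutput(assetName: asset.Name, archiveWindowLength: TimeSpan.FromHours(8)));

        // 4. Start the broadcast (this call could sit behind an MVC action for remote start).
        await client.LiveEvents.StartAsync(rg, account, evt);

        // Re-read the live event to get the RTMP ingest URL the encoder should push to.
        liveEvent = await client.LiveEvents.GetAsync(rg, account, evt);
        Console.WriteLine($"Push RTMP to: {liveEvent.Input.Endpoints[0].Url}");

        // 5. End the broadcast; the recording stays in the asset in the storage account.
        await client.LiveEvents.StopAsync(rg, account, evt);
    }
}
```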
4 votes -
Today, when we use a live encoder to ingest into an AMS Live channel, if it does not work, we have very limited info on why the ingest is refused.
It would be great if an option were provided to disclose the detailed exception info via the API, the portal, etc.
4 votes -
Flash Media Live Encoder, OBS: solve problems with encoding channels
In live streaming we have a bug with Flash Media Live Encoder when using encoding channels while trying to keep the frame rate. I think the focus should be on solving these issues and improving compatibility with other programs too; this happens with OBS and FFmpeg as well.
4 votes -
Use your cell phone to live broadcast.
Live broadcasting through the media service.
We hope iOS and Android SDKs will be provided.
Only a few software options are available at present.
3 votes -
Automatic continuous recording
Right now we cannot record a live stream for longer than the maximum archive window length of 25 hours. If the live stream goes on for longer than 25 hours, we begin to lose data.
It would be useful to have some function that enables recording of a live stream without being restricted to a certain duration. For example, we want to continuously record a security camera stream that is active 24/7. The recorded assets could be in chunks of 1-hour duration.
3 votes -
Scheduling program to start and stop at specific time
Currently, live streaming programs have to be started and stopped manually. It would be useful to be able to schedule when to start and stop programs; this would allow for precise timing as well as help transition between live events seamlessly.
3 votes -
Automatic failover for live streaming
Use 3 to 4+ fragmented ingest streams, embedded with VITC (SMPTE) timecode. Buffer at the AMS server for several seconds (this should be configurable, to allow for extra time when satellite and terrestrial internet redundancy are utilized). Then analyze the 3+ streams for similarity; both video and audio would be cool. When the stream that is being pushed out to clients differs greatly from the others, automatically switch over. It would also be nice to be able to get notifications, not necessarily to the encoders themselves, but maybe to a configurable HTTPS endpoint.
3 votes
This is now under review by the Streaming team. Thank you for your feedback.
-
Support timed ID3 metadata in HLS
The use case is sending in-band metadata about what's currently being streamed, used by clients as cue points, for information display, to invoke time-aligned actions, and so on. Because the data can be embedded within the stream, the application playing the video doesn't need to load data from two separate sources and doesn't need to do anything to keep the two data streams in sync.
3 votes -
Media encoder thumbnail job while live streaming
A thumbnail job with Media Encoder Standard while live streaming.
The job cannot follow an asset that has not been created yet.
The job finishes before the live broadcast is completed. Currently, we must run a polling thread for thumbnail creation while live streaming.
3 votes