Mayo

My feedback

  1. 5 votes

    2 comments  ·  Azure Functions
    Mayo commented  · 

    That's excellent news, Jeff. Thank you.

    Mayo supported this idea  · 
    Mayo commented  · 

    +1

    There are many documented performance and reliability issues on the Durable Functions GitHub repo where the root cause is Azure Storage. Making it easy to choose and use alternative data stores such as Redis or Cosmos DB would be a good thing, if only for the lower latencies on all the storage activities.
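    One way such pluggability could be exposed is a backend switch in host.json. A hypothetical sketch of what selecting an alternative storage provider might look like (the property names and the "Netherite" provider name here are illustrative assumptions, not an API confirmed by this thread):

    ```json
    {
      "extensions": {
        "durableTask": {
          "storageProvider": {
            "type": "Netherite"
          }
        }
      }
    }
    ```

    The appeal of a single switch like this is that function code and orchestrations would stay unchanged while the latency-sensitive bookkeeping moves off Azure Storage.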

  2. 38 votes
    2 comments  ·  Additional Services » ClearDB
    Mayo supported this idea  · 
  3. 23 votes

    2 comments  ·  Azure Functions

    This is an awesome idea, and we’re exploring a few options to make it a reality.
    However, the 600-connection limit per instance should be enough for most applications if you’re reusing or closing connections. If you truly need 600 connections open at once, you are likely to run into the 10-minute timeout per execution anyway.
    Even after we add this, you will still need to be mindful of your connection management.

    Keep the votes coming!
    —Alex
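    The connection-reuse advice above is usually implemented by creating clients once, at module scope, outside the function handler, so every invocation on an instance shares the same underlying connections. A minimal Python sketch of that pattern (the host name and `handler` function are illustrative, not from this thread):

    ```python
    import functools
    import http.client


    @functools.lru_cache(maxsize=None)
    def get_client(host: str) -> http.client.HTTPSConnection:
        """Return one shared client per host instead of a new one per invocation.

        Constructing the client does not open a socket yet; the connection is
        established lazily and then reused, which keeps the per-instance
        connection count far below limits like the 600 mentioned above.
        """
        return http.client.HTTPSConnection(host, timeout=10)


    def handler(event):
        # Hypothetical function body: every call reuses the cached client.
        client = get_client("example.com")
        return client  # a real handler would issue a request here
    ```

    Because the cache hands back the same object on every call, sockets are reused rather than opened (and leaked) per invocation.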

    Mayo supported this idea  · 
  4. 21 votes

    6 comments  ·  Azure Functions

    Update: Still planned!

    This is something that we have enabled internally, and we are now planning to surface TCP connections for customers in the “Diagnose and Solve problems” tab. However, we do not have an ETA yet.

    Thanks for the feedback!
    Alex
    Azure Functions Team

    Mayo commented  · 

    The TCP connections pane has disappeared in the new version of the "Diagnose & Solve Problems" page. Even then, the pane wasn't very useful in the first version of the page: it showed the number of connections (good) but gave absolutely no details to help you diagnose what was creating them, e.g. counts of connections per remote IP address, or counts of connections to other Azure services (usually the culprit).

    Exceeding the host connection thresholds is one of the most frequent and frustrating problems we run into when doing anything remotely high-volume with Azure Functions (we hit it on three consecutive projects with enterprise clients). There needs to be better diagnostics around this.
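    The kind of breakdown the pane lacked is simple to sketch: group the instance's open sockets by remote host and count them. A minimal Python illustration (the sample endpoints are made up; in practice you would feed this from netstat or psutil output):

    ```python
    from collections import Counter


    def connections_per_remote_host(remote_endpoints):
        """Count open connections per remote IP, the detail the pane never showed."""
        return Counter(ep.rsplit(":", 1)[0] for ep in remote_endpoints)


    # Made-up sample: two sockets to one service endpoint, one to a SQL endpoint.
    sample = ["10.0.0.5:443", "10.0.0.5:443", "52.1.2.3:1433"]
    counts = connections_per_remote_host(sample)
    # counts == Counter({"10.0.0.5": 2, "52.1.2.3": 1})
    ```

    A table like `counts`, refreshed over time, would immediately point at which dependency is eating the connection budget.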

    Mayo supported this idea  · 
  5. 274 votes
    5 comments  ·  Azure Cosmos DB
    Mayo supported this idea  · 
  6. 49 votes

    3 comments  ·  Azure Functions
    Mayo supported this idea  · 
  7. 167 votes

    under review  ·  14 comments  ·  Data Lake
    Mayo supported this idea  · 
  8. 13 votes
    0 comments  ·  Service Bus
    Mayo shared this idea  · 
  9. 25 votes

    under review  ·  4 comments  ·  Data Lake
    Mayo commented  · 

    Yes, please! This use case is vital: Event Hubs Archive → Data Factory → Data Lake Store ← U-SQL ingest ← Scheduler. Right now there are many blockers on the ADLA side, particularly around support for Avro and the empty Avro files (file header, no blocks) generated by Event Hub Capture.

    Even better, it would be a wow moment to extract this data (from Avro) into a table and automatically keep it up to date as new files arrive.

    Mayo supported this idea  · 
  10. 92 votes

    under review  ·  17 comments  ·  Data Lake
    Mayo supported this idea  · 
    Mayo commented  · 

    Using Data Lake Analytics to process Event Hub Capture files (Avro) is a huge use case, and right now it's a fairly awful experience on the Data Lake Analytics side.

    There are multiple versions of the MS Avro libraries floating around (with different bugs, e.g. seekable vs. non-seekable streams), and none of them currently handle the empty Avro file (header but no blocks) emitted by Event Hub Capture. It's a mess.

    We almost had a wow moment with Event Hub Capture → Data Lake → Data Lake Analytics. It fell apart on the Data Lake Analytics side.

    Please implement this. Please.
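    The "header but no blocks" files described above are detectable without any Avro library: an Avro object container file starts with the `Obj\x01` magic, then a metadata map, then a 16-byte sync marker, and only then zero or more data blocks. A stdlib-only Python sketch (my own illustration, not code from any of the libraries mentioned) that skips the header and reports whether any data block follows:

    ```python
    import io


    def _read_long(f):
        """Decode one Avro zig-zag varint long from the stream."""
        n = shift = 0
        while True:
            b = f.read(1)
            if not b:
                raise EOFError("truncated varint")
            n |= (b[0] & 0x7F) << shift
            if not (b[0] & 0x80):
                break
            shift += 7
        return (n >> 1) ^ -(n & 1)


    def has_data_blocks(f):
        """Return True if an Avro object container holds at least one data block."""
        if f.read(4) != b"Obj\x01":
            raise ValueError("not an Avro object container")
        while True:  # skip the metadata map (blocks of key/value byte strings)
            count = _read_long(f)
            if count == 0:
                break
            if count < 0:  # negative count is followed by a byte size; skip it
                _read_long(f)
                count = -count
            for _ in range(count):
                f.read(_read_long(f))  # key
                f.read(_read_long(f))  # value
        f.read(16)  # sync marker
        return bool(f.read(1))  # any byte after the header means a data block


    # An Event-Hub-Capture-style empty file: magic + empty metadata map + sync.
    empty = io.BytesIO(b"Obj\x01" + b"\x00" + b"\x00" * 16)
    # has_data_blocks(empty) is False
    ```

    With a check like this in front of the extractor, the zero-block files could be skipped instead of crashing the job.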
