Matt

My feedback

  1. 34 votes
    under review  ·  5 comments  ·  Data Lake
    Matt supported this idea
  2. 146 votes
    21 comments  ·  Data Lake
    Matt commented:

    I am in no way associated with Microsoft; take what I say as you wish, or as that of a random internet stranger. However, we have a significant contract with Microsoft and access to folks there. That being said...

    To be clear, it is absolutely not supported, per Microsoft employees and forum moderators. I mention this because I saw some older comments suggesting simply changing the protocol to abfss. They also WILL be starting to sunset ADLA within five years or so, per my dedicated MS engineers (I was told this about a year ago). This is really disappointing, because I don't want a managed instance of Hadoop/Databricks. I want quick-hitter, pay-per-job, easy-to-learn code that is massively scalable.

    As such, I see no reason this will EVER get implemented, as much as we all want it.

    Here is a forum link which, although older, still holds true: https://social.msdn.microsoft.com/Forums/en-US/5ce97eef-8940-4591-a19c-934f71825e7d/connect-data-lake-analytics-to-adls-gen-2

    Matt supported this idea
  3. 83 votes
    under review  ·  1 comment  ·  Azure Synapse Analytics » SQL/T-SQL
    Matt supported this idea
  4. 247 votes
    4 comments  ·  Data Factory
    Matt supported this idea
    Matt commented:

    This seems to have a lot of votes; are there any plans to change this value? Thanks.

  5. 20 votes
    7 comments  ·  Data Factory
    Matt supported this idea
  6. 17 votes
    2 comments  ·  Storage » Blobs
    Matt commented:

    This seems like a no-brainer. Why is this not a standard option? As it is, I process big data and several thousand CSV files per day. I have to delete them in a loop, and then I run into limitations on loop execution count and timeouts... really annoying.
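For anyone hitting the same wall, one workaround is to delete blobs in batches instead of one per loop iteration. This is a hedged sketch, not an official recipe: it assumes the azure-storage-blob Python package, and the container name, prefix, and connection string are placeholders.

```python
# Sketch of a batch-delete workaround for thousands of CSV blobs.
# Assumes the azure-storage-blob package; container name, prefix, and
# connection string below are hypothetical placeholders.

def chunked(items, size):
    """Yield successive fixed-size chunks from a list."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def delete_csvs(container_name, prefix, conn_str):
    # Imported here so the chunking helper above stays dependency-free.
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(conn_str)
    container = service.get_container_client(container_name)
    names = [b.name for b in container.list_blobs(name_starts_with=prefix)
             if b.name.endswith(".csv")]
    # ContainerClient.delete_blobs accepts up to 256 blobs per request,
    # so each request replaces up to 256 iterations of a delete loop.
    for batch in chunked(names, 256):
        container.delete_blobs(*batch)
```

One batched request per 256 files sidesteps the per-iteration loop limits, though a native "delete folder" option would still be simpler.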

    Matt supported this idea
  7. 127 votes
    1 comment  ·  Logic Apps » Connectors
    Matt commented:

    Currently, what I had to do is create stored procedures, then call them with some parameters. Those parameters are the values used when the sproc runs; the sproc essentially runs an update/insert with your values.

    That's my "hokey workaround".
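The sproc body in that workaround is essentially an upsert. A rough illustration of that logic is below; the table and column names are made up, and SQLite (from the standard library) stands in for SQL Server purely so the sketch is runnable — in the real setup this logic would live in a T-SQL stored procedure that the Logic Apps SQL connector calls with parameters.

```python
# Illustration of the sproc-as-upsert workaround. SQLite is a stand-in here;
# the "items" table and its columns are hypothetical.
import sqlite3

def upsert_item(conn, item_id, value):
    """Mimics the stored procedure body: update the row if it exists, else insert."""
    cur = conn.execute("UPDATE items SET value = ? WHERE id = ?", (value, item_id))
    if cur.rowcount == 0:
        conn.execute("INSERT INTO items (id, value) VALUES (?, ?)", (item_id, value))
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, value TEXT)")
upsert_item(conn, 1, "first")   # insert path: no row yet, so INSERT runs
upsert_item(conn, 1, "second")  # update path: row exists, so UPDATE wins
```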

    Matt supported this idea
  8. 79 votes
    2 comments  ·  Logic Apps » Connectors
    Matt commented:

    Along with U-SQL job submission would be the number of AUs (Analytics Units) and the location of the script you want to run. An option to "wait until the job is complete" would be nice as well, instead of timing out and needing some loop logic.
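The "wait until complete" behavior being asked for amounts to polling the job's state until it reaches a terminal value rather than letting the connector time out. A hedged sketch of that loop is below; the state names and the `get_state` callable are assumptions for illustration, not a real ADLA API.

```python
import time

# Sketch of "wait until the job is complete": poll a state-returning callable
# until it reports a terminal state or a deadline passes. State names here
# ("Succeeded", "Failed", "Cancelled") are assumed, not taken from a real SDK.

def wait_for_job(get_state, poll_seconds=30, timeout_seconds=3600, sleep=time.sleep):
    """Poll get_state() until it returns a terminal state or the timeout passes."""
    terminal = {"Succeeded", "Failed", "Cancelled"}
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        state = get_state()
        if state in terminal:
            return state
        sleep(poll_seconds)
    raise TimeoutError("job did not reach a terminal state in time")
```

Taking `sleep` as a parameter keeps the loop testable without real waiting; a connector-level option would simply run this polling on the service side.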

    Matt supported this idea
