
Raghavendra Narayana

My feedback

  1. 41 votes

    18 comments  ·  Azure Synapse Analytics » SQL/Loading
    Raghavendra Narayana commented  · 

    For loading data to Azure SQL DWH, one of the alternatives I am considering is creating the Azure SQL DWH table as a heap, which allows the VARCHAR(MAX), NVARCHAR(MAX), and VARBINARY(MAX) data types. Create the external table with multiple columns for any column that exceeds the allowed length. In the SELECT statement while exporting data from the source, use SUBSTRING to split the column data, load the text file into the external table with the same number of columns, and finally, while loading into the Azure SQL DWH table, merge the content of the multiple columns back into one column (a sketch of the split-and-merge step follows the snippet below). Let me know your views or other alternatives, please.

    ---
    CREATE TABLE play.table1 (c1 VARCHAR(MAX), c2 NVARCHAR(MAX), c3 VARBINARY(MAX))
    WITH (HEAP);
    -- This works in Azure SQL DWH.
    -- I tried loading data of 1 million characters in length, with a mix of alphanumeric and a few special characters; it works and stores the data. I am using a table with 50k-character-length values and am able to perform regular operations on it.
    -- Yes, only HEAP supports (MAX); for "CLUSTERED COLUMNSTORE INDEX" table types, (MAX) does not work.
    ---
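
    To illustrate the split-and-merge step, here is a minimal sketch. The external table play.ext_table1, its chunk columns, and the data source / file format names (my_source, my_format) are hypothetical assumptions; it assumes the source export used SUBSTRING to break the long column into 4,000-character chunks.

    ---
    -- Hypothetical external table: the long source column arrives split into fixed-size chunks
    CREATE EXTERNAL TABLE play.ext_table1 (
        c1_part1 VARCHAR(4000),
        c1_part2 VARCHAR(4000),
        c1_part3 VARCHAR(4000)
    )
    WITH (
        LOCATION = '/data/table1/',
        DATA_SOURCE = my_source,   -- assumed to already exist
        FILE_FORMAT = my_format    -- assumed to already exist
    );

    -- Merge the chunks back into the single VARCHAR(MAX) column on the heap table
    INSERT INTO play.table1 (c1)
    SELECT CONCAT(ISNULL(c1_part1, ''), ISNULL(c1_part2, ''), ISNULL(c1_part3, ''))
    FROM play.ext_table1;
    ---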

    Raghavendra Narayana supported this idea  · 
  2. 681 votes

    33 comments  ·  SQL Database
    Raghavendra Narayana commented  · 

    I agree, this adds extra difficulty. More than changing the code, it is the validation and testing that take time.
    For now, I am thinking of creating the function below and doing a find & replace of GETDATE() with udo.GETDATE(GETDATE()) in the code (a usage example follows the function). Let me know about any issues or challenges you see, or a better alternative if one is available. Note: udo is a schema name I used; it can be anything, such as dbo or your own schema name. I am adding 4 hours; you need to change this according to your time zone.

    -- Shift the given datetime by +4 hours (change the offset for your own time zone)
    CREATE FUNCTION udo.GETDATE (@d DATETIME) RETURNS DATETIME
    AS
    BEGIN
        RETURN DATEADD(HOUR, 4, @d);
    END;
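
    As a usage sketch (the table and column names here are hypothetical), the find & replace would turn a call such as GETDATE() into udo.GETDATE(GETDATE()):

    -- Before the replacement (hypothetical query)
    SELECT OrderID FROM dbo.Orders WHERE OrderDate >= GETDATE();

    -- After replacing GETDATE() with udo.GETDATE(GETDATE())
    SELECT OrderID FROM dbo.Orders WHERE OrderDate >= udo.GETDATE(GETDATE());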

    Raghavendra Narayana supported this idea  · 
  3. 39 votes

    6 comments  ·  Azure Synapse Analytics » SQL/T-SQL
    Raghavendra Narayana supported this idea  · 
  4. 41 votes

    0 comments  ·  Data Lake
    Raghavendra Narayana supported this idea  · 
