Will

My feedback

  1. 3 votes
    1 comment  ·  SQL Server » Suggestions

    Upvotes: 3

    <=-=Apr 24 2015 5:53AM=-=>

    I couldn’t agree more with Chandan. I’d add that you should remove some of the cache limitations too. Chandan’s experience is my exact experience too…my clients are moving to Postgres to save money. They start by moving non-critical, micro-service databases to Postgres. After they see how easy it is to do and how much money they save they quickly move all greenfield projects to Postgres. Then they start talking about migrating more stuff. You guys really screwed the pooch with your licensing changes in 2012.

    Here’s the rub…in the process of moving to a different data platform they are also making the move to different development stacks. So instead of just losing your SQL Server licensing $$$ you are also losing your MSDN subscribers. Whereas, if you considered giving Express away with larger db sizes you could at least retain the tools licensing. Generally…

    Will supported this idea  · 
  2. 7 votes
    1 comment  ·  SQL Database
    Will supported this idea  · 
  3. 30 votes
    5 comments  ·  SQL Managed Instance
    Will commented  · 

    It would be really useful if SQL Managed Instance could use OPENROWSET to read CSV files and Excel files. (It may be possible for SQL Managed Instance to read these from Azure blobs, I'm not sure). Can SQL Managed Instance use OPENROWSET to read from Azure 'Files', the new Azure implementation of SMB? This would make it really easy to upload Excel files into Azure 'Files' storage and then load them into SQL.
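    For the blob case, something along these lines already works on SQL Server 2017+ and, per the documentation, on SQL Managed Instance. All names below (data source, URL, file names) are placeholders, and non-public containers additionally need a database-scoped credential; Azure Files over SMB is a separate question.

    ```sql
    -- Sketch, not a tested Managed Instance deployment: read a CSV straight
    -- from Azure Blob Storage. 'MyAzureBlobs' and the file names are hypothetical.
    CREATE EXTERNAL DATA SOURCE MyAzureBlobs
    WITH (TYPE = BLOB_STORAGE,
          LOCATION = 'https://myaccount.blob.core.windows.net/uploads');

    SELECT *
    FROM OPENROWSET(
        BULK 'prices.csv',
        DATA_SOURCE = 'MyAzureBlobs',
        FORMAT = 'CSV',
        FORMATFILE = 'prices.fmt',            -- column layout comes from a format file
        FORMATFILE_DATA_SOURCE = 'MyAzureBlobs',
        FIRSTROW = 2                          -- skip the header row
    ) AS rows;
    ```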

    Will supported this idea  · 
  4. 20 votes
    6 comments  ·  SQL Server » Suggestions
    Will supported this idea  · 
  5. 16 votes
    under review  ·  4 comments  ·  SQL Server » Other
    Will supported this idea  · 
  6. 15 votes
    under review  ·  0 comments  ·  SQL Server » Suggestions
    Will supported this idea  · 
  7. 15 votes
    0 comments  ·  SQL Server » Suggestions
    Will supported this idea  · 
  8. 65 votes
    12 comments  ·  SQL Server » Other
    Will supported this idea  · 
  9. 21 votes
    1 comment  ·  SQL Server » Suggestions
    Will supported this idea  · 
  10. 233 votes
    77 comments  ·  SQL Server » Suggestions
    Will supported this idea  · 
  11. 3 votes
    0 comments  ·  SQL Server » Suggestions
    Will supported this idea  · 
  12. 1 vote
    0 comments  ·  SQL Server » Bugs

    Upvotes: 3

    <=-=Jan 11 2010 7:02AM=-=>

    ping!

    <=-=Jan 20 2010 3:53PM=-=>

    Harman, sorry for the late response. I tried the following for your scenario:

    data file
    1,1,1
    2,2,2
    3,3
    4,4,4

    BCP command and output as follows. You will note that BCP fails on the 3rd row. You can use a format file to skip column(s), but that would need to be done for all rows; your case is different. I am going to request someone from the SSIS team to comment on the regression in behavior from the DTS package in SQL 2000.

    c:\Temp>bcp bulktest..t1 in a.dat -c -Ssunila02\sql2008sp1 -t, -T

    Starting copy…
    SQLState = 22005, NativeError = 0
    Error = [Microsoft][SQL Server Native Client 10.0]Invalid character value for cast specification

    2 rows copied.
    Network packet size (bytes): 4096
    Clock Time (ms.) Total : 437 Average : (4.58 rows per sec.)

    <=-=Jan 20 2010 8:11PM=-=>

    Please test BULK INSERT as that’s what we tested…
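    For reference, a BULK INSERT variant of the same load can be told to tolerate the short row instead of aborting; the paths, table name, and error threshold below are illustrative:

    ```sql
    -- Sketch: same comma-delimited file as the bcp run above, but bad rows
    -- are diverted to an error file rather than failing the whole load.
    BULK INSERT bulktest..t1
    FROM 'c:\Temp\a.dat'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        MAXERRORS       = 10,               -- tolerate up to 10 unparseable rows
        ERRORFILE       = 'c:\Temp\a.err'   -- rejected rows are written here
    );
    ```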

    Will supported this idea  · 
  13. 2 votes
    0 comments  ·  SQL Server » Suggestions

    Upvotes: 21

    <=-=Mar 24 2006 10:28AM=-=>

    This is a reasonable suggestion and we will consider it in the future. Could you supply more information on why you need this enhancement? You can have commas in values that are quoted. Could you explain why this does not meet your needs?

    <=-=Mar 25 2006 3:23PM=-=>

    It’s a common question how to load files where some fields are enclosed in quotes and there is a header that is not. Example:

    col1,col2,col3,col4
    12,"This is col2","And col3",19

    This particular file is not loadable by BCP at all. On the other hand:

    col1,col2,col3,col4
    "12","This is col2","And col3",19

    this one is, because in the format file you can define an initial dummy column that will swallow the header.

    I have previously suggested that it should be possible to describe a header, so that BCP can skip it.

    Adding pre-defined support for CSV seems like a…
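    CSV support of this kind was later added in SQL Server 2017, and with it the first file above (quoted fields, unquoted header) loads directly; the table and path below are illustrative:

    ```sql
    -- Sketch using the FORMAT = 'CSV' support added in SQL Server 2017.
    BULK INSERT dbo.example
    FROM 'c:\Temp\example.csv'
    WITH (
        FORMAT     = 'CSV',   -- RFC 4180-style parsing of quoted fields
        FIELDQUOTE = '"',
        FIRSTROW   = 2        -- skip the col1,col2,col3,col4 header
    );
    ```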

    Will supported this idea  · 
  14. 2 votes
    0 comments  ·  SQL Server » Suggestions

    Upvotes: 8

    <=-=Apr 11 2008 9:46AM=-=>

    Thank you very much for your feedback. We have had other requests for compression and use of standard input and output by bcp recently, and we will take them all into consideration for a future release of SQL Server. Unfortunately it’s too late to consider this request for SQL Server 2008.

    <=-=Mar 23 2010 7:40AM=-=>

    It’s probably not the type of feature in scope for 2008 R2 either, but this would be an excellent feature for the next full release of SQL Server.

    <=-=Nov 23 2010 2:04PM=-=>

    Any movement on this in the last 2 1/2 years?

    Will supported this idea  · 
  15. 4 votes
    0 comments  ·  SQL Server » Suggestions

    Upvotes: 41

    <=-=Oct 4 2007 10:13AM=-=>

    Thank you for your feature suggestion. We are considering the improvement in our next release.

    <=-=Jan 23 2012 11:23AM=-=>

    It’s been three and a half years, and we’re headed toward our third release since this item was filed, but is this still being considered for “our next release”?

    <=-=Feb 22 2012 7:56AM=-=>

    One simple workaround to allow automation (a 3-step job):
    1. Create a file _header.txt that has the column headings
    i.e. copy the first row [with header] of a select * from table, paste into Excel, keep only the headings and save as .txt

    2. bcp out to Data.txt file
    i.e. bcp DB.dbo.table_name out c:\table_name_Data.txt -Sserver -T -c

    3. Concatenate files with the copy command
    i.e. copy /a /Y C:\tbl_header.txt + C:\tbl_Data.txt C:\tbl.txt

    Also, you can bcp out of a view which offers tremendous additional functionality… :-)
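    Another common variant of this workaround puts the header into the export itself with a UNION ALL in a queryout query, so no separate header file or copy step is needed; the table and column names here are illustrative:

    ```sql
    -- Sketch: run via
    --   bcp "<this query>" queryout c:\tbl.txt -Sserver -T -c -t,
    -- Everything is cast to character types so the UNION ALL is legal.
    SELECT col1, col2
    FROM (
        SELECT 'col1' AS col1, 'col2' AS col2, 0 AS ord    -- header row
        UNION ALL
        SELECT CAST(col1 AS VARCHAR(20)),
               CAST(col2 AS VARCHAR(50)), 1                -- data rows
        FROM dbo.table_name
    ) AS x
    ORDER BY ord;                                          -- header sorts first
    ```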

    <=-=May 28 2014 3:54AM=-=>

    Will supported this idea  · 
  16. 2 votes
    0 comments  ·  SQL Server » Suggestions

    Upvotes: 17

    <=-=Nov 20 2009 2:50PM=-=>

    Hi,
    Thank-you for your feedback and feature request for the BCP.exe utility. We are investigating the best way to address your scenario. We will post an update once we have something concrete.

    Regards,
    Jimmy Wu
    Microsoft SQL Server

    <=-=Nov 21 2009 12:48PM=-=>

    The ability to pipe input and output with the BCP command would greatly enhance automation for applications that I support. We use similar techniques for other DBMS systems, since their bulk loading utilities support the capability. The example above, which shows the use of compression tools in conjunction with BCP, is just the beginning.

    <=-=Nov 15 2012 9:05AM=-=>

    Any updates on this? Saving data to a slow disk completely defeats the purpose of a bulk copy facility. Postgres has had this feature for years.
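    For comparison, the PostgreSQL feature the commenter refers to is COPY ... TO STDOUT, which streams rows to the client so they can be piped straight into a compressor; the table name is illustrative:

    ```sql
    -- PostgreSQL sketch: stream a table as CSV to standard output, e.g.
    --   psql -c "COPY mytable TO STDOUT WITH (FORMAT csv)" | gzip > mytable.csv.gz
    COPY mytable TO STDOUT WITH (FORMAT csv);
    ```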

    <=-=Jun 26 2013 6:03AM=-=>

    1) Many GNU programs accept - (dash) as a filename, which means “use stdio”. bcp…

    Will supported this idea  · 
  17. 2 votes
    0 comments  ·  SQL Server » Suggestions

    Upvotes: 3

    <=-=Mar 17 2008 12:35PM=-=>

    Thank you very much for your feedback.
    These are all very reasonable suggestions. The feature set for SQL Server 2008 is now frozen but we will give them serious consideration for inclusion in a future release of SQL Server.

    <=-=Dec 16 2011 6:44PM=-=>

    Include the snapshot agent while you’re at it. It needs it just as much as any of the other tools.

    Will supported this idea  · 
  18. 3 votes
    0 comments  ·  SQL Server » Suggestions
    Will supported this idea  · 
  19. 15 votes
    1 comment  ·  SQL Server » Suggestions

    Upvotes: 8

    <=-=Mar 4 2017 9:33AM=-=>

    We understand the problem. XML is designed differently: the goal was to provide a rich query language with XPath support, which might require a lot of memory and processing logic. OPENJSON is designed to be more lightweight; it just scans the JSON text and returns values where it finds them. There are pros and cons to both approaches (similar to the pros and cons of DOM and SAX parsers).

    The key differentiator is that JSON is better for scan-based processing, where you just pick a few values out of the JSON text, and XML is better for rich querying and indexing.

    I will keep this item open and let people vote for it; however, we cannot guarantee that this kind of re-design of XML will be done in the near future.

    If you need to use shredded nodes in some…
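    The scan-style access described above looks like this in practice; the JSON payload and column names are made up for illustration (OPENJSON requires database compatibility level 130 or higher):

    ```sql
    -- Sketch: OPENJSON picks a few values out of the text in a single scan,
    -- with no parsed document tree kept around.
    DECLARE @j NVARCHAR(MAX) = N'{"id": 7, "name": "widget", "tags": ["a","b"]}';

    SELECT id, name
    FROM OPENJSON(@j)
    WITH (id INT '$.id', name NVARCHAR(50) '$.name');
    ```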

    Will supported this idea  · 
  20. 1 vote
    0 comments  ·  SQL Server » Bugs

    Upvotes: 1

    <=-=Jun 6 2017 11:19AM=-=>

    There are a number of workarounds available, using temp tables, nodes(), etc., that can improve this; however, I think this plan difference should be covered by legacy cardinality estimation or database compatibility level set to 2008 R2.
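    The two settings mentioned can be applied as follows (the database name is a placeholder; the scoped-configuration form needs SQL Server 2016 or later):

    ```sql
    -- Sketch of the two workarounds named above.
    ALTER DATABASE SCOPED CONFIGURATION SET LEGACY_CARDINALITY_ESTIMATION = ON;
    -- or pin the whole database to SQL Server 2008 R2 behavior:
    ALTER DATABASE MyDb SET COMPATIBILITY_LEVEL = 100;
    ```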

    Will supported this idea  · 