Your second case may be mitigated a few ways, and I think this may help you better understand the source of the conversion issue. I suspect this is the most common way:
1. Nesting your result set.
SELECT v.TableID, UserInput = CONVERT(int, v.UserInput)
FROM (
    SELECT tt.TableID, tt.UserInput
    FROM dbo.TheTable tt
    INNER JOIN dbo.Interpretation i
        ON tt.IntprID = i.IntprID
    WHERE i.Type = 'integer'
) v;
I believe that would mitigate your conversion error. I didn't try it on your example, but I have used a similar approach many, many times.
The cause, I believe, revolves around where the conversion happens: the optimizer drills down and applies the conversion as an expression on the read side of the plan, so it can run against rows the join has not yet filtered out. By nesting the result set, I believe we push the expression to the outer side (the result set), after the filter. I could be wrong, and I'm sure one of the engine guys and gals can correct me on this.
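As an aside, if you are on SQL Server 2012 or later, TRY_CONVERT sidesteps the ordering question entirely by returning NULL for any row that fails the conversion. A minimal sketch, assuming the same hypothetical dbo.TheTable / dbo.Interpretation shape as above:

```sql
-- TRY_CONVERT yields NULL instead of raising a conversion error when a row
-- that the join has not yet filtered out fails the cast.
SELECT tt.TableID, UserInput = TRY_CONVERT(int, tt.UserInput)
FROM dbo.TheTable tt
INNER JOIN dbo.Interpretation i
    ON tt.IntprID = i.IntprID
WHERE i.Type = 'integer';
```

The trade-off is that a genuinely bad value is silently returned as NULL rather than surfacing as an error.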
On your first example, I'd start by saying everything revolves around trade-offs, and an outside constraint, like having to author the query as a single statement, is often the root of the problem, not what the optimizer did with the mess it was sent. Mitigation then comes in the form of aggressive covering and filtered indexes or, as you noted, breaking the query into steps for processing.
Put another way, I agree CTEs can be hit and miss, particularly as more internal temporary objects are employed. Remember, a lot of sorting, spooling, and so on goes into the plan; the more of that there is, the more likely a CTE is going to get tripped up. I would humbly note that a plan whose performance differs dramatically as a CTE versus broken into steps points to an under-optimized structure (e.g., covering indexes should equalize the performance and may even make the CTE "faster").
I am finding that a reliable test of complicated (aberrant) construction is to rewrite the given query as a CTE and see just how messy it becomes. Once you see where the breakdown occurs, you may be able to design filtered/covering indexes to satisfy the request. Then you get to decide based on frequency, e.g., whether it's worth adding the covering index(es) or simply authoring the query in steps (divide and conquer).
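To make that concrete, here is a hypothetical covering index for the integer-interpretation lookup sketched earlier; the table and column names are assumptions carried over from that example, not your actual schema:

```sql
-- Hypothetical: index the join key and INCLUDE the selected columns so the
-- query can be satisfied from the index alone, avoiding key lookups.
CREATE NONCLUSTERED INDEX IX_TheTable_IntprID
    ON dbo.TheTable (IntprID)
    INCLUDE (TableID, UserInput);
```

If the same plan shape shows up whether the query is written as a CTE or in steps, the index is doing its job.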
Upvotes: 1 (Nov 28 2017 8:19AM)
Update: I did some review and confirmed that SQL 2016 SP1 CU5 Maintenance Plans
are ignorant of the "IgnoreDatabasesInNotOnlineState" setting. Per
https://msdn.microsoft.com/en-us/library/gg617041(v=sql.105).aspx, when set to
"True", offline databases should be ignored. Unfortunately, ignoring this
setting produces a significant number of 18456 events, which makes monitoring
for actual attack vectors more difficult.
Fyi: Connect #3136970 (note to self to keep track of this).
The Maintenance Plans are SSIS packages. You can export and import a plan by connecting to SSIS. The exported file is an XML file, and you have the option to encrypt sensitive details with a password on the extract. The only issue I have with this method is scheduling; e.g., you need to script the schedule addition to run the package (or manually implement the schedule creation).
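A sketch of scripting that schedule with the msdb Agent stored procedures; the job name, package path, and run time below are placeholders, so adjust for your environment:

```sql
USE msdb;
GO
-- Hypothetical job that runs the exported package daily at 02:00.
EXEC dbo.sp_add_job @job_name = N'Run exported maintenance plan';
EXEC dbo.sp_add_jobstep
     @job_name  = N'Run exported maintenance plan',
     @step_name = N'Execute package',
     @subsystem = N'SSIS',
     @command   = N'/FILE "C:\Packages\MaintenancePlan.dtsx"';
EXEC dbo.sp_add_schedule
     @schedule_name     = N'Daily 2am',
     @freq_type         = 4,        -- daily
     @freq_interval     = 1,
     @active_start_time = 020000;   -- HHMMSS
EXEC dbo.sp_attach_schedule
     @job_name      = N'Run exported maintenance plan',
     @schedule_name = N'Daily 2am';
EXEC dbo.sp_add_jobserver @job_name = N'Run exported maintenance plan';
```
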
Hopefully that helps.
The key here is to comply with the CSV rules, notably honoring CrLfs when they appear within a quoted block.
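For reference, those rules are codified in RFC 4180. A minimal T-SQL sketch of quoting one field on the way out (the variable and logic are illustrative only):

```sql
-- Sketch: wrap a field in quotes when it contains a comma, a quote, or CR/LF,
-- doubling any embedded quotes, per the usual CSV convention.
DECLARE @field nvarchar(200) = N'line one' + CHAR(13) + CHAR(10) + N'line two';
SELECT CASE
           WHEN @field LIKE N'%[",' + CHAR(13) + CHAR(10) + N']%'
               THEN N'"' + REPLACE(@field, N'"', N'""') + N'"'
           ELSE @field
       END AS CsvField;
```

The reader side is the harder half: a CrLf inside quotes continues the current field, so you cannot simply split on line breaks.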
I've never quite understood why SQL Server doesn't natively transform from and into CSV. Sure, BCP does it, sort of, and one can craft an XSL transformation or directly code an extract via DTS/SSIS, but these methods all suffer from externalizing the transformation.