SQL Server database truncated a big base64 string [duplicate] - c#

How do you view ALL text from an NTEXT or NVARCHAR(max) in SQL Server Management Studio? By default, it only seems to return the first few hundred characters (255?) but sometimes I just want a quick way of viewing the whole field, without having to write a program to do it. Even SSMS 2012 still has this problem :(

I was able to get the full text (99,208 chars) out of an NVARCHAR(MAX) column by selecting just that column (Results to Grid), then right-clicking on it and saving the result as a CSV file. To view the result, open the CSV file with a text editor (NOT Excel). Funnily enough, when I tried to run the same query with Results to File enabled, the output was truncated using the Results to Text limit.
The work-around that @MartinSmith described as a comment to the (currently) accepted answer didn't work for me (I got an error when trying to view the full XML result, complaining that "The '[' character, hexadecimal value 0x5B, cannot be included in a name").

Quick trick-
SELECT CAST('<A><![CDATA[' + CAST(LogInfo as nvarchar(max)) + ']]></A>' AS xml)
FROM Logs
WHERE IDLog = 904862629

In newer versions of SSMS it can be configured in the (Query/Query Options/Results/Grid/Maximum Characters Retrieved) menu:
Old versions of SSMS
Options (Query Results/SQL Server/Results to Grid Page)
To change the options for the current queries, click Query Options on the Query menu, or right-click in the SQL Server Query window and select Query Options.
...
Maximum Characters Retrieved
Enter a number from 1 through 65535 to specify the maximum number of characters that will be displayed in each cell.
The maximum is, as you can see, 64k. The default is much smaller.
BTW, Results to Text has an even more drastic limitation:
Maximum number of characters displayed in each column
This value defaults to 256. Increase this value to display larger result sets without truncation. The maximum value is 8,192.

I have written an add-in for SSMS and this problem is fixed there. You can use one of two ways:
You can use "Copy current cell 1:1" to copy the original cell data to the clipboard:
http://www.ssmsboost.com/Features/ssms-add-in-copy-results-grid-cell-contents-line-with-breaks
Or, alternatively, you can open the cell contents in an external text editor (Notepad++ or Notepad) using the "Cell visualizers" feature: http://www.ssmsboost.com/Features/ssms-add-in-results-grid-visualizers
(This feature allows you to open the contents of a field in any external application, so if you know that it is text, you use a text editor to open it; if the contents are binary data with a picture, you select "view as picture".)

Return data as XML
SELECT CONVERT(XML, [Data]) AS [Value]
FROM [dbo].[FormData]
WHERE [UID] LIKE '{my-uid}'
Make sure you set a reasonable limit in the SSMS options window, depending on the result you're expecting.
This will work as long as the text you're returning doesn't contain unencoded characters like a bare & instead of &amp;, which will cause the XML conversion to fail.
Returning data using PowerShell
For this you will need the PowerShell SQL Server module installed on the machine on which you'll be running the command.
If you're all set up, configure and run the following script:
Invoke-Sqlcmd -Query "SELECT [Data] FROM [dbo].[FormData] WHERE [UID] LIKE '{my-uid}'" -ServerInstance "database-server-name" -Database "database-name" -Username "user" -Password "password" -MaxCharLength 10000000 | Out-File -filePath "C:\db_data.txt"
Make sure you set the -MaxCharLength parameter to a value that suits your needs.

I was successful with this method today. It's similar to the other answers in that it also converts the contents to XML, just using a different method. As I didn't see FOR XML PATH mentioned amongst the answers, I thought I'd add it for completeness:
SELECT [COL_NVARCHAR_MAX]
FROM [SOME_TABLE]
FOR XML PATH(''), ROOT('ROOT')
This will deliver valid XML containing the contents of all rows, nested in an outer <ROOT></ROOT> element. The contents of the individual rows will each be contained within an element that, for this example, is called <COL_NVARCHAR_MAX>; that name can be changed using a column alias via AS.
Special characters like &, < or > will be converted to their respective entities (&amp;, &lt;, &gt;), so you may have to convert them back to the original characters, depending on what you need to do with the result.
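If you end up post-processing the copied XML in code rather than by hand, decoding those entities is a one-liner. A minimal C# sketch (the sample string is made up; WebUtility.HtmlDecode also covers the XML entities):
using System;
using System.Net;

class DecodeEntities
{
    static void Main()
    {
        // Text copied from the FOR XML result, with entity-encoded characters.
        string encoded = "Fish &amp; chips &lt;large&gt;";

        // HtmlDecode turns &amp;, &lt; and &gt; back into &, < and >.
        string decoded = WebUtility.HtmlDecode(encoded);

        Console.WriteLine(decoded);   // Fish & chips <large>
    }
}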
EDIT
I just realized that CDATA can be specified using FOR XML too. I find it a bit cumbersome though. This would do it:
SELECT 1 as tag, 0 as parent, [COL_NVARCHAR_MAX] as [COL_NVARCHAR_MAX!1!!CDATA]
FROM [SOME_TABLE]
FOR XML EXPLICIT, ROOT('ROOT')

PowerShell Alternative
This is an old post and I read through the answers. Still, I found it a bit too painful to output multi-line large text fields unaltered from SSMS. I ended up writing a small C# program for my needs, but got to thinking it could probably be done using the command line. Turns out, it is fairly easy to do so with PowerShell.
Start by installing the SqlServer module from an administrative PowerShell.
Install-Module -Name SqlServer
Use Invoke-Sqlcmd to run your query:
$Rows = Invoke-Sqlcmd -Query "select BigColumn from SomeTable where Id = 123" `
-MaxCharLength 2147483647 -ConnectionString $ConnectionString
This will return an array of rows that you can output to the console as follows:
$Rows[0].BigColumn
Or output to a file as follows:
$Rows[0].BigColumn | Out-File -FilePath .\output.txt -Encoding UTF8
The result is beautiful, un-truncated text written to a file for viewing/editing. I am sure there is a similar command to save the text back to SQL Server, although that seems like a different question.
EDIT: It turns out that there was already an answer by @dvlsc describing this approach as a secondary solution. I think the fact that it was listed as a secondary solution is the reason I missed it in the first place. I am going to leave my answer, which focuses on the PowerShell approach, but wanted to at least give credit where it was due.

If you only have to view it, I've used this:
print cast(dbo.f_functiondeliveringbigformattedtext(seed) as text)
The end result is that I get line feeds and all the content in the Messages window of SSMS.
Of course, it only allows for a single cell - if you want to do a single cell from a number of rows, you could do this:
declare @T varchar(max)=''
select @T=@T
+ isnull(dbo.f_functiondeliveringbigformattedtext(x.a),'NOTHINGFOUND!')
+ replicate(char(13),4)
from x -- table containing multiple rows and a value in column a
print @T
I use this to validate JSON strings generated by SQL code. Too hard to read otherwise!

Use Visual Studio Code with the SQL Server plugin. Super useful for JSON.

Alternative 1: Right-click to copy the cell and paste it into a text editor (hopefully one with UTF-8 support)
Alternative 2: Right-click and export to a CSV file
Alternative 3: Use SUBSTRING function to visualize parts of the column. Example:
SELECT SUBSTRING(fileXml,2200,200) FROM mytable WHERE id=123456

The easiest way to quickly view a large varchar/text column:
declare #t varchar(max)
select #t = long_column from table
print #t

Related

Creating a DSL Parser for Haystack tags

I need to create a C# parser for tags mentioned in https://www.project-haystack.org/tag .
Basically, if the user inputs a string like 'Location=Singapore', the parser needs to generate the SQL query accordingly. I even tried to Google for some tutorials but couldn't find anything useful. Can someone help me build this parser? As a reference I am using source code I found on GitHub: https://github.com/Vanlightly/DslParser. But this doesn't fit my project 100%, as the user can enter multiple values in their input (e.g. Location=Singapore AND AHU=Type1 AND IBMSPoint=Point7) and so on. So, as mentioned above, the final SQL query needs to be generated according to what the user inputs (Select * From HaystackTags H where (H.TagName='Location' AND H.TagValue='Singapore') AND (H.TagName='AHU' AND H.TagValue='Type1') AND (H.TagName='IBMSPoint' AND H.TagValue='Point7')). I have attached a screenshot of how my database looks, since the final query will be executed against that table. [Database design screenshot]
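For what it's worth, a minimal sketch of the split-and-parameterize step the question describes (this is not the DslParser library from the linked repo; the HaystackQueryBuilder class and parameter names are made up for illustration):
using System;
using System.Collections.Generic;
using System.Linq;

public static class HaystackQueryBuilder
{
    // Turns "Location=Singapore AND AHU=Type1" into a parameterized WHERE clause.
    public static (string Sql, Dictionary<string, object> Parameters) Build(string input)
    {
        var pairs = input
            .Split(new[] { " AND " }, StringSplitOptions.RemoveEmptyEntries)
            .Select(part => part.Split(new[] { '=' }, 2))
            .Where(kv => kv.Length == 2)
            .Select(kv => (Tag: kv[0].Trim(), Value: kv[1].Trim()))
            .ToList();

        var parameters = new Dictionary<string, object>();
        var predicates = new List<string>();

        for (var i = 0; i < pairs.Count; i++)
        {
            // Parameterize the values instead of concatenating them into the SQL text.
            parameters["@tag" + i] = pairs[i].Tag;
            parameters["@val" + i] = pairs[i].Value;
            predicates.Add("(H.TagName = @tag" + i + " AND H.TagValue = @val" + i + ")");
        }

        var sql = "SELECT * FROM HaystackTags H WHERE " + string.Join(" AND ", predicates);
        return (sql, parameters);
    }
}
The returned SQL text and parameter dictionary can then be fed to a SqlCommand; whether the predicates should really be joined with AND or rewritten as a GROUP BY/HAVING over the tag table is a separate design question, the sketch only reproduces the shape of the query shown above.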

Having Trouble Pulling a Value with Quotes from SQL Server Table in C#

I'm building a dynamic SSIS package using C# EzAPI and I want to store the Meta Data Mapping in a SQL Server Table.
I am having trouble pulling in the delimiter ("\t" for example) and EOL Characters ("\r\n") from a table into C#.
Here is the line of code that I am trying to get to work:
flatFileColumn.ColumnDelimiter = columns.GetUpperBound(0) ==
Array.IndexOf(columns, column) ? SqlValuesDictionary["EndOfLineChar"] : SqlValuesDictionary["Delimiter"];
These dictionary values are strings I retrieved from a SQL Server data reader like this: SqlValuesDictionary["Delimiter"] = DataReader["Delimiter"].ToString(), where Delimiter is the column containing "\t".
So the problem is that when I read this value from SQL Server, it appears to escape every quote and backslash, changing the value, when I really want a literal "\t" for the tab delimiter.
So When I change the code to read:
flatFileColumn.ColumnDelimiter = columns.GetUpperBound(0) ==
Array.IndexOf(columns, column) ? "\r\n" : "\t";
It works fine.
The debugger is not helping much because it appears to display escapes that aren't even there, so trying to replace the extra backslashes does nothing.
In the debugger the value of SqlValuesDictionary["Delimiter"] appears as "\"\\t\"" when the literal table value is "\t". When I change the table value to \t (without quotes), the debugger shows it as "\\t" and it still fails.
Can anyone point me a better direction for how to resolve this? Perhaps I am going about it the wrong way. For those concerned with SQL Injection, these tables will be strictly controlled by internal employees only.
To Clarify and Summarize: What value can I put in a SQL table and how do I retrieve it in C# to be the equivalent of var tab = "\t"
I've had the same problem.
Found two reasonable ways to get around it:
A: Insert an actual <cr><lf> in the table:
UPDATE [EzAPI_Meta].[ezapi].[FlatFile]
SET [RowDelimiter] = CHAR(13)+CHAR(10)
I think it's a problem that you can't see those characters when you look at the table.
B: So I ended up replacing the characters in my C# code:
RowDelimiter.Replace("\\r\\n", "\r\n").Replace("{CR}{LF}", "\r\n");
With this one you can enter "\r\n" or "{CR}{LF}" as plain text in your table.
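If you'd rather not hard-code each replacement, a hedged variant of option B is to store the plain text \t or \r\n in the table and run it through Regex.Unescape, which converts those escape sequences into the real characters (the variable names here are just placeholders):
using System;
using System.Text.RegularExpressions;

class DelimiterDemo
{
    static void Main()
    {
        // Suppose the table column literally contains the two characters \t
        // (which the debugger displays as "\\t").
        string fromDatabase = @"\t";

        // Strip any literal surrounding quotes stored in the table,
        // then turn the escape sequence into the real character.
        string delimiter = Regex.Unescape(fromDatabase.Trim('"'));

        Console.WriteLine(delimiter == "\t");   // True: it is now a real tab
    }
}
The same call handles a stored \r\n for the row delimiter.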

How to store text with multi line separators into database?

I have some text,
e.g
line one \r\n
line two \r\n
line three.
In my database I have a column defined with type text; I use Entity Framework to map that column, and the code generated by Entity Framework uses type string.
I can successfully save that text into the column in the database. However, in Management Studio I can't see those line separators; when I do "Copy value" and paste into Notepad, the text has become one line.
Anyone know what the problem is?
Thanks.
The problem is simply that Management Studio doesn't support all characters. It removes the line breaks when you copy the value.
If you change the result from grid to text (Query -> Results To -> Results to text, or ctrl+T), and select the value, you will see that the text comes out as separate lines.
You should stick to nvarchar when it comes to storing text data (one of the reasons being that it supports more characters).
Here's an extract from MSDN:
"ntext, text, and image data types will be removed in a future version of Microsoft SQL Server. Avoid using these data types in new development work, and plan to modify applications that currently use them. Use nvarchar(max), varchar(max), and varbinary(max) instead".
More info:
ntext, text, and image deprecated: http://msdn.microsoft.com/en-us/library/ms187993.aspx
replace text: http://blog.sqlauthority.com/2007/05/26/sql-server-2005-replace-text-with-varcharmax-stop-using-text-ntext-image-data-types/
This could quite simply be a problem with Management Studio. Try querying the database from C# and printing the field's value; you will see immediately whether the CRLFs are there.
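For example, a minimal console check along these lines (the connection string, table and column names are placeholders) would do:
using System;
using System.Data.SqlClient;

class Program
{
    static void Main()
    {
        const string connectionString = "Server=.;Database=MyDb;Integrated Security=true";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT TextColumn FROM MyTable WHERE Id = @id", connection))
        {
            command.Parameters.AddWithValue("@id", 1);
            connection.Open();

            var value = (string)command.ExecuteScalar();

            // If the line breaks made it into the database, this prints multiple lines.
            Console.WriteLine(value);
            Console.WriteLine("Contains CRLF: " + value.Contains("\r\n"));
        }
    }
}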

knowing if a string will be truncated when updating database

I'm working on software that takes a CSV file and puts the data into SQL Server. I'm testing it with bad data now, and when I make a data string too long (in a line) to be imported into the database I get the error: "String or binary data would be truncated. The statement has been terminated." That's normal and that's what I should expect. Now I want to detect those errors before the update to the database. Is there any clever way to detect this?
The way my software works is that I import every line into a DataSet, then show the user the data that will be imported. They can then click a button to do the actual update; I then do a dataAdapter.Update(Dataset, "something") to make the update to the database.
The problem is that the error row terminates the whole update and reports the error. So I want to detect the error before I do the update to the server, so that the other rows will be inserted.
thanks
You will have to check the columns of each row. See if one exceeds the maximum specified in the database, and if yes, exclude it from being inserted.
A different solution would be to explicitly truncate the data and insert the truncated content, which could be done by using SubString.
The only way that I know of is to pre-check the information schema for the character limit:
Select
Column_Name,
Character_Maximum_Length
From
Information_Schema.Columns
Where
Table_Name = 'YourTableName'
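A rough sketch of how that pre-check could look from C#, reading Character_Maximum_Length once and then comparing each row of the DataSet before calling dataAdapter.Update (the method name and connection string are hypothetical, not a drop-in solution):
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static class TruncationCheck
{
    // Returns the rows whose string columns exceed the lengths declared in the database.
    public static List<DataRow> FindRowsThatWouldTruncate(DataTable table, string connectionString)
    {
        var maxLengths = new Dictionary<string, int>();

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            @"SELECT Column_Name, Character_Maximum_Length
              FROM Information_Schema.Columns
              WHERE Table_Name = @table
                AND Character_Maximum_Length > 0", connection))   // -1 means (max); NULL means non-character
        {
            command.Parameters.AddWithValue("@table", table.TableName);
            connection.Open();

            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                    maxLengths[reader.GetString(0)] = reader.GetInt32(1);
            }
        }

        var offending = new List<DataRow>();
        foreach (DataRow row in table.Rows)
        {
            foreach (DataColumn column in table.Columns)
            {
                if (maxLengths.TryGetValue(column.ColumnName, out var max) &&
                    row[column] is string s && s.Length > max)
                {
                    offending.Add(row);
                    break;
                }
            }
        }
        return offending;
    }
}
Rows returned by the check can then be excluded from the DataSet (or truncated with Substring, as suggested above) so the remaining rows insert cleanly.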
What you need is the column metadata.
MSDN: SqlConnection.GetSchema Method
Or, if you have opened a recordset on your database, another solution would be to browse the field object to use its length to truncate the string. For example, with ADO recordset/VB code, you could have some code like this one:
myRecordset.fields(myField) = left(myString, myRecordset.fields(myField).DefinedSize)

I need to parse an HTML formatted country list into SQL inserts. Is there an easier way to do this?

There are about 2000 lines of this, so doing it manually would probably take more work than figuring out a way to do this programmatically. It only needs to work once, so I'm not concerned with performance or anything.
<tr><td>Canada (CA)</td><td>Alberta (AB)</td></tr>
<tr><td>Canada (CA)</td><td>British Columbia (BC)</td></tr>
<tr><td>Canada (CA)</td><td>Manitoba (MB)</td></tr>
Basically it's formatted like this, and I need to divide it into 4 parts: country name, country abbreviation, division name and division abbreviation.
In keeping with my complete lack of efficiency, I was planning just to do a string.Replace on the HTML tags after I broke them up and then find the index of the opening brackets and grab the space-delimited strings that remain. Then I realized I have no way of keeping track of which is the country and which is the division, as well as figuring out how to group them by country.
So is there a better way to do this? Or better yet, an easier way to populate a database with countries and provinces/states? I looked around SO and the only readily available databases I can find don't provide the full names of the countries or the provinces/states, or use IPs instead of geographic names.
Paste it into a spreadsheet. Some spreadsheets will parse the HTML table for you.
Save it as a .CSV file and process it that way. Or. Add a column to the spreadsheet that says something like the following:
="INSERT INTO COUNTRY(CODE,NAME) VALUES=('" & A1 & "','" & B1 & "');"
Then you have a column of INSERT statements that you can cut, paste and execute.
Edit
Be sure to include the <table> tag when pasting into a spreadsheet.
<table><tr><th>country</th><th>name</th></tr>
<tr><td>Canada (CA)</td><td>Alberta (AB)</td></tr>
<tr><td>Canada (CA)</td><td>British Columbia (BC)</td></tr>
<tr><td>Canada (CA)</td><td>Manitoba (MB)</td></tr>
</table>
Processing a CSV file requires almost no parsing. It's got quotes and commas. Much easier to live with than XML/HTML.
/<tr><td>(.+?)\s*\(([^\)]+)\)<\/td><td>(.+?)\s*\(([^\)]+)\)<\/td><\/tr>/
Then you should have 4 captures with the 4 pieces of data from any PCRE engine :)
Alternatively, something like http://jacksleight.com/assets/blog/really-shiny/scripts/table-extractor.txt provides more completeness.
Sounds like a problem easily solved by a Regex.
I recently learned that if you open a URL from Excel it will try to parse out the table data.
If you are able to see this table in the browser (Internet explorer), you can select the entire table, right click & "Export to Microsoft Excel"
That should help you get data into separate columns, I guess.
Do you have to do this programmatically? If not, may I suggest just copying and pasting the table (from the browser) into MS Excel and then clearing all formats? This way you get a nice table that can then be imported into your database without a problem.
just a suggestion... hth
An assembly exists for .NET called System.Xml; you can just reference the assembly and load your HTML document into a System.Xml.XmlDocument. You can easily pinpoint the HTML node that contains your required data and use the child nodes to add to your data. This requires little string parsing on your part.
Load the HTML data as XElements, use LINQ to grab the values you need, and then INSERT.
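A small sketch of that XElement/LINQ approach against the rows shown above (the Division table name in the generated INSERT is made up, and real data would also need single-quote escaping):
using System;
using System.Linq;
using System.Text.RegularExpressions;
using System.Xml.Linq;

class CountryTableParser
{
    static void Main()
    {
        // The pasted rows, wrapped in a root element so they parse as XML.
        string html = @"<table>
            <tr><td>Canada (CA)</td><td>Alberta (AB)</td></tr>
            <tr><td>Canada (CA)</td><td>British Columbia (BC)</td></tr>
            <tr><td>Canada (CA)</td><td>Manitoba (MB)</td></tr>
        </table>";

        // Pulls "Canada" and "CA" out of "Canada (CA)".
        var pattern = new Regex(@"^(?<name>.+?)\s*\((?<abbr>[^)]+)\)$");

        var rows =
            from tr in XElement.Parse(html).Elements("tr")
            let cells = tr.Elements("td").Select(td => pattern.Match(td.Value.Trim())).ToList()
            select new
            {
                CountryName  = cells[0].Groups["name"].Value,
                CountryAbbr  = cells[0].Groups["abbr"].Value,
                DivisionName = cells[1].Groups["name"].Value,
                DivisionAbbr = cells[1].Groups["abbr"].Value
            };

        foreach (var row in rows)
        {
            Console.WriteLine(
                $"INSERT INTO Division (CountryName, CountryAbbr, DivisionName, DivisionAbbr) " +
                $"VALUES ('{row.CountryName}', '{row.CountryAbbr}', '{row.DivisionName}', '{row.DivisionAbbr}');");
        }
    }
}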
Blowing my own trumpet here but my FOSS tool CSVfix will do it with a combination of the read_xml and sql_insert commands.
