A database table stores a document in a varbinary column, so I can read it as a byte[] in C# code.
How can I export this byte[] to a field in a JSON file?
if (item.IS_VIDEO == 0)
{
    var content = ctx.DOCUMENT_TABLE.First(a => a.document_id == item.document_id).DOCUMENT_CONTENT;
    if (content != null)
    {
        publicationClass.document_content = System.Text.Encoding.Default.GetString(content); // for export to json field
    }
}
Is this the right way to export a byte[] to JSON?
Have you considered letting the JSON serializer deal with the problem?
byte[] file = File.ReadAllBytes("FilePath"); // replace with how you get your array of bytes
string str = JsonConvert.SerializeObject(file);
This can be then deserialized on the receiving end like this:
var xyz = JsonConvert.DeserializeObject<byte[]>(str);
This appears to work without any issues; however, there may be size limitations worth investigating before committing to this method.
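Note that Json.NET serializes a byte[] as a Base64 string, so in the code from the question you could also assign the JSON field explicitly. A minimal sketch, assuming document_content is a string property as shown above:

// Sketch: store the raw bytes as Base64 so the JSON field round-trips losslessly.
publicationClass.document_content = Convert.ToBase64String(content);

// On the receiving end, decode back to the original bytes:
byte[] restored = Convert.FromBase64String(publicationClass.document_content);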
Related
I am updating some XML via code in C#, and when I insert it into my database (a varchar column) I am seeing an extra ?. I think this has to do with the Encoding.UTF8 conversion I am doing. The ? is NOT there at any point in the process, right up to calling SaveChanges() in EF Core.
The database column is varchar and cannot be changed due to the way it was set up.
I'm not sure how I should convert the XML back to a string after saving it from the dashboard (which is DevExpress).
internal static async Task<string> ConvertXmlToNewConnectionStrings(string xml, List<ConnectionSourceViewModel> connectionSourceViews, int currentUserId)
{
    var connectionStringOptions = await GetConnectionStringOptions(currentUserId).ConfigureAwait(false);

    System.Xml.Linq.XDocument doc;
    using (StringReader s = new StringReader(xml))
    {
        doc = System.Xml.Linq.XDocument.Load(s);
    }

    Dashboard d = new Dashboard();
    d.LoadFromXDocument(doc);

    var sqlDataSources = d.DataSources.OfType<DashboardSqlDataSource>().ToList();
    foreach (var item in sqlDataSources)
    {
        // We do not want the properties in the data source anymore... this includes user name and password etc... the new way dev ex went with this functionality
        item.ConnectionParameters = null;

        // get the selected connection string from the end user
        var connectionStringId = connectionSourceViews.Where(x => x.SqlDataSourceComponentName == item.ComponentName).Select(x => x.SelectedConnectionStringId).FirstOrDefault();
        if (string.IsNullOrWhiteSpace(connectionStringId) == false)
        {
            item.ConnectionName = connectionStringOptions.Where(x => x.Value == connectionStringId).Select(x => x.Text).FirstOrDefault();
        }
    }

    MemoryStream ms = new MemoryStream();
    d.SaveToXml(ms);
    byte[] array = ms.ToArray();
    ms.Close();

    xml = Encoding.UTF8.GetString(array, 0, array.Length);
    return xml;
}
I expect the ? not to be added to the XML in the varchar database column after saving it via EF Core.
Image of ? at start:
https://ibb.co/LZ2QVtr
I ended up removing the code that goes through a MemoryStream() and switched to saving the dashboard back to an XDocument():
doc = d.SaveToXDocument();
Now everything works fine and no question mark (?) is put in the database!
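For reference, the leading ? is most likely the UTF-8 byte order mark that ends up at the start of the MemoryStream; varchar cannot represent U+FEFF, so it shows up as ?. If the MemoryStream route were still needed, a sketch (using the array variable from the method above) that strips the preamble before decoding:

// Requires System.Text. Skip a UTF-8 BOM (EF BB BF) if the buffer starts with one.
byte[] preamble = Encoding.UTF8.GetPreamble();
int offset = array.Length >= preamble.Length
             && array[0] == preamble[0]
             && array[1] == preamble[1]
             && array[2] == preamble[2]
    ? preamble.Length
    : 0;
xml = Encoding.UTF8.GetString(array, offset, array.Length - offset);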
I have created an object which contains data I want to insert/append to the data currently sitting in a JSON file.
I have succeeded in getting it to write the data to the file; however, it overwrites all the data that was there originally.
What I am trying to do is append this property to the JSON file while keeping all the original information.
This is what I have done so far:
string widthBox = Width.Text.ToString();
string heightBox = Height.Text.ToString();

string WindowSizejson = File.ReadAllText(DownloadConfigFilelocation);
dynamic WindowSizejsonObj = Newtonsoft.Json.JsonConvert.DeserializeObject(WindowSizejson);

JObject windowContent = new JObject(
    new JProperty("externalSite",
        new JObject(
            new JProperty("webLogin",
                new JObject(
                    new JProperty("window", "height=" + heightBox + ",width=" + widthBox + ",resizable,scrollbars")
                )
            )
        )
    )
);
This is the data currently in the JSON file that I need to append the above to.
(I have blurred out values due to company security reasons.)
You have two choices that I can think of:
1. Read the entire file into an object, add your object, and then rewrite the entire file (poor performance):
var filePath = @"path.json";

// Read existing json data
var jsonData = System.IO.File.ReadAllText(filePath);

// De-serialize to object or create new list
var SomeObjectList = JsonConvert.DeserializeObject<List<T>>(jsonData)
                     ?? new List<T>();

// Add any new items
SomeObjectList.Add(new T()
{
    Name = "..."
});
SomeObjectList.Add(new T()
{
    Name = "..."
});

// Edit an existing item
var first = SomeObjectList.FirstOrDefault();
first.Name = "...";

// Update json data string and write it back
jsonData = JsonConvert.SerializeObject(SomeObjectList);
System.IO.File.WriteAllText(filePath, jsonData);
2. Open the file read/write, parse through until you get to the closing curly brace, then write the remaining data, then write the closing curly brace (not trivial).
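For completeness, a rough sketch of option 2, assuming the file holds a single non-empty JSON object and that newFragment is already a valid JSON property string (filePath and newFragment are placeholder names):

// Requires System.IO and System.Text.
using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.ReadWrite))
{
    // Scan backwards for the closing curly brace of the root object.
    long pos = fs.Length - 1;
    while (pos >= 0)
    {
        fs.Seek(pos, SeekOrigin.Begin);
        if (fs.ReadByte() == '}')
            break;
        pos--;
    }
    if (pos < 0)
        throw new InvalidDataException("No closing brace found.");

    // Drop the closing brace, append the new property, then close the object again.
    fs.SetLength(pos);
    fs.Seek(0, SeekOrigin.End);
    byte[] bytes = Encoding.UTF8.GetBytes("," + newFragment + "}");
    fs.Write(bytes, 0, bytes.Length);
}

This avoids re-serializing everything, but it is fragile (empty objects, trailing whitespace, encodings), which is why option 1 or the JObject approach below is usually preferable.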
Instead of messing around with JProperty, deserialize your JSON and append your desired data:
JObject obj = JObject.Parse(jsontext);
obj["new_prop"] = "value"; // new property as per the hierarchy; same approach for replacing values
string newjson = obj.ToString();
It's much cleaner and easier to maintain.
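Applied to the structure in the question, a sketch could merge the windowContent object built above into the existing file content (DownloadConfigFilelocation and windowContent come from the question's code); JContainer.Merge keeps the original properties and adds the nested ones:

// Requires Newtonsoft.Json.Linq and System.IO.
JObject existing = JObject.Parse(File.ReadAllText(DownloadConfigFilelocation));

// Merge adds externalSite -> webLogin -> window without discarding what is already there.
existing.Merge(windowContent, new JsonMergeSettings
{
    MergeArrayHandling = MergeArrayHandling.Union
});

File.WriteAllText(DownloadConfigFilelocation, existing.ToString());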
One module reads some files using
File.ReadAllBytes("path")
and stores the result in a database table.
Later I take the result from the table and normally use
File.WriteAllBytes("outputPath", binary.Data)
to write the file back.
Now I have to change the content of the file. Of course I can write the file to a temp folder, read it back in as a File object, change it, write it back to the destination folder.
But is there a smarter way to do that? Create the File object directly out of the binary data?
I found a solution: it's the Encoding class.
var binary = GetBinaryFileFromDatabase();
var inputLines = Encoding.Default.GetString(binary).Split('\n');
var outputLines = new List<string>();

foreach (var line in inputLines)
{
    var columns = line.Split(';');
    if (columns.Length > 28 && columns[28] != null && columns[28].Length > 0)
    {
        columns[28] = columns[28].Replace(Path.GetFileName(columns[28]), "SomeValue");
    }
    outputLines.Add(string.Join(";", columns));
}

File.WriteAllLines(outputPath, outputLines);
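If you prefer to treat the bytes as a stream instead of decoding the whole array up front, a sketch along the same lines (same assumption that the blob is text in the default encoding):

// Sketch: wrap the database bytes in a MemoryStream and read them like a file.
var outputLines = new List<string>();
using (var reader = new StreamReader(new MemoryStream(binary), Encoding.Default))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        var columns = line.Split(';');
        if (columns.Length > 28 && columns[28].Length > 0)
        {
            columns[28] = columns[28].Replace(Path.GetFileName(columns[28]), "SomeValue");
        }
        outputLines.Add(string.Join(";", columns));
    }
}
File.WriteAllLines(outputPath, outputLines);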
Thanks everybody!
I have a C# project connected to a SQL Server database, and I'm doing format recognition for file uploads.
More specifically, I upload files as varbinary and store them in a column in the DB. When I download them, I want to recognize their format. I found some keywords (or metadata) on the web through which I can recognize whether they are docx, xlsx, or something else. My problem is that when I run this in SQL:
select * from table where convert(varchar(8000), objectFile) like '%docprop%'
it works and the database returns only Word files.
But how can I do the same in C# after reading the varbinary?
I tried this, but it doesn't work:
var item = context.tObject_M.SingleOrDefault(x => x.objectId == objectId);
var files = item.objectFile;
string filess = Convert.ToString(files);
byte[] itemi = item.objectFile;
string ciao = System.Text.Encoding.UTF8.GetString(itemi);
if (ciao.Contains("DOCPROPS"))
{
    filess = "ciao";
}
http://www.garykessler.net/library/file_sigs.html
Check this link; you should investigate magic numbers. This is the only reliable way to know the file type. In your case you can use the content type like I told you in the comments!
If you search for a sequence starting with [03] [04] [00]:
byte[] fileSign = { 0x03, 0x04, 0x00 };
for (int i = 0; i < fileSign.Length; i++)
{
    if (fileByteArray[i] != fileSign[i])
    {
        // not the same files
    }
}
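Note that docx and xlsx files are ZIP containers, so the full signature at the very start of the file is 50 4B 03 04 ("PK\x03\x04"). A sketch of a complete check, assuming fileByteArray holds the downloaded varbinary content:

// Sketch: check the ZIP magic number shared by docx/xlsx (Office Open XML) files.
byte[] zipSignature = { 0x50, 0x4B, 0x03, 0x04 }; // "PK\x03\x04"

bool isZipContainer = fileByteArray.Length >= zipSignature.Length;
for (int i = 0; isZipContainer && i < zipSignature.Length; i++)
{
    if (fileByteArray[i] != zipSignature[i])
    {
        isZipContainer = false;
    }
}
// isZipContainer tells you the blob is a ZIP archive; telling docx from xlsx
// still requires looking at the entries inside (e.g. word/ vs xl/ folders).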
I'm trying to store a file in a SQL Server database and am using ASP.NET MVC 3 with LINQ to SQL in C#. When I try to set the data type of a column to byte[] it says that this is an invalid data type. Using the varbinary(MAX) data type instead seems like the logical thing to do.
The goal is to store PDF files in a database that the user has elected to upload using an HTTP file uploader. The error occurs during assignment of the LINQ member variable that refers to the column where the file is to be stored.
Error 3 Cannot implicitly convert type 'System.Web.HttpPostedFileBase' to 'System.Data.Linq.Binary' C:\Users\Aaron Patten\Documents\Visual Studio 2010\Projects\MyApp\MyApp\Controllers\MyController.cs 66 31 MyApp
I haven't found much in the way of other people encountering this problem, so I think there must be something I'm missing that is too elementary for anyone to post.
My controller:
public ActionResult UploadPDF()
{
    List<ViewDataUploadFilesResult> r = new List<ViewDataUploadFilesResult>();
    HttpPostedFileBase hpf; // = Request.Files[file] as HttpPostedFileBase;

    foreach (string file in Request.Files)
    {
        hpf = Request.Files[file] as HttpPostedFileBase;
        if (hpf.ContentLength == 0)
            continue;

        string savedFileName = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "images\\" + Path.GetFileName(hpf.FileName));
        hpf.SaveAs(savedFileName);

        r.Add(new ViewDataUploadFilesResult()
        {
            Name = savedFileName,
            Length = hpf.ContentLength
        });
    }

    MyApp.Models.MyAppDataContext db = new MyApp.Models.MyAppDataContext();
    MyApp.Models.PDFFile myPDFFile = new MyApp.Models.PDFFile();
    myPDFFile.Content = hpf;
    db.questions.InsertOnSubmit(myPDFFile);
    db.SubmitChanges();

    return View("UploadPDF", r);
}
What's the proper way of going about all this?
P.S.: How do I display the PDF as an embedded object without saving it to the server?
D.A.P.
The error message tells you what's wrong: you are setting Content = hpf, but hpf is an HttpPostedFileBase, so it first needs to be converted to a byte[]. Do that by reading the bytes out of hpf's stream.
myPDFFile.Content = new BinaryReader(hpf.InputStream).ReadBytes((int)hpf.InputStream.Length);
In your LINQ to SQL entity designer change the field giving you trouble from System.Data.Linq.Binary to a type of System.Byte[]. The conversion will be implicit upon submission.
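For the P.S., a sketch of streaming the stored bytes back to the browser without writing them to disk; GetPdfBytes is a hypothetical lookup against your LINQ to SQL context, and the browser can render the result in an <embed> or <iframe>:

// Sketch: serve the stored PDF straight from the database to the client.
public ActionResult ViewPDF(int id)
{
    // Hypothetical helper: load the Content column for this id from the data context.
    byte[] pdfBytes = GetPdfBytes(id);

    // FileContentResult writes the bytes with the given MIME type; no temp file needed.
    return File(pdfBytes, "application/pdf");
}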