I have created code to upload an image into SQL Server.
Here is the code to convert the image into bytes:
//Use FileInfo object to get file size.
FileInfo fInfo = new FileInfo(p);
//Open FileStream to read file
FileStream fStream = new FileStream(p, FileMode.Open, FileAccess.Read);
byte[] numBytes = new byte[fStream.Length];
fStream.Read(numBytes, 0, Convert.ToInt32(fStream.Length));
//Use BinaryReader to read file stream into byte array.
//BinaryReader br = new BinaryReader(fStream);
//When you use BinaryReader, you need to supply number of bytes to read from file.
//In this case we want to read entire file. So supplying total number of bytes.
// data = br.ReadBytes((int)numBytes);
return numBytes;
And here is the code that adds the bytes to a SqlCommand parameter as the value:
objCmd.Parameters.Add("@bill_Image", SqlDbType.Binary).Value = imageData;
objCmd.ExecuteNonQuery();
But I am getting this error:
String or binary data would be truncated. The statement has been terminated
How can I overcome this problem?
The error clearly indicates that you're trying to save more bytes than the field definition allows.
Not sure what SQL type you're using for bill_Image, but an appropriate field definition for storing an image would be varbinary(MAX).
Check the definition of the bill_Image column in your database.
It should be something like
bill_Image varbinary(X)
Just increase X, or put MAX there instead of a number (if your image is larger than 8,000 bytes).
Info about the binary/varbinary types here
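Once the column is varbinary(MAX), you can also size the parameter to match. A minimal sketch, keeping the question's objCmd and imageData names (a size of -1 maps the parameter to varbinary(MAX)):
// Assumes bill_Image has been widened, e.g. ALTER TABLE <yourTable> ALTER COLUMN bill_Image varbinary(MAX)
// (the table name is yours). A size of -1 sends the value as varbinary(MAX).
objCmd.Parameters.Add("@bill_Image", SqlDbType.VarBinary, -1).Value = imageData;
objCmd.ExecuteNonQuery();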
I have a collection of pdf files saved to MongoDB. I'm attempting to convert all the files to base64 but I'm having trouble retrieving the actual files from the database. The file path is not what I'm expecting and I'm unsure of what else to try
Below is the code I have tried. The error message is:
"System.IO.IOException Message=The filename, directory name, or volume label syntax is incorrect. : 'C:\Users\Jabari.LAPTOP-FERO3UB5\Desktop\codebase\LMO\API\LMO.Api\https:\LMOdevstorage.blob.core.windows.net\documents\57a36b7e6789102ddf\J_Broomstick58059342787ba42f12345.pdf'
From what I can see it's trying to pull from my local codebase and not the actual database. I'm not sure why this is. I'm new to MongoDB operations.
Below is my code
var file = document.AzureDocumentUri;
if (file != null)
{
Byte[] bytes = File.ReadAllBytes(file);
String base64String = Convert.ToBase64String(bytes);
StreamWriter sw64 = new StreamWriter($"{filePath}test64.txt");
sw64.Write(base64String);
sw64.Close();
sw64.Dispose();
}
What I'm expecting is to be able to access the PDF file directly from the database, read it into memory, and then convert it to base64.
Why is it attempting to read from my local directory instead of from the collection in the database?
Byte[] bytes = File.ReadAllBytes(file);
is strange; you can't download a URI like this.
To see the best way to store files in MongoDB, look at GridFS.
MemoryStream ms = new MemoryStream(new byte[] { 0x00, 0x01, 0x02 });
ms.Position = 0;
// MemoryStream ms2 = new MemoryStream(await File.ReadAllBytesAsync("test.pdf"));
// Write to MongoDB
// There is a 16 MB document size limit in MongoDB; that is what GridFS handles, allowing unlimited file size.
// https://www.mongodb.com/docs/manual/core/document/#:~:text=Document%20Size%20Limit,MongoDB%20provides%20the%20GridFS%20API.
GridFSBucket gridFSBucket = new GridFSBucket(db);
ObjectId id = await gridFSBucket.UploadFromStreamAsync("exemple.bin", ms);
// Get the file
MemoryStream ms3 = new MemoryStream();
await gridFSBucket.DownloadToStreamAsync(id, ms3);
Full example here
https://github.com/iso8859/learn-mongodb-by-example/blob/main/dotnet/02%20-%20Intermediate/StoreFile.cs
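For the base64 part of the question, a short sketch of the read side (same gridFSBucket and id as above):
// Download the stored file back as a byte array, then convert it to base64.
byte[] pdfBytes = await gridFSBucket.DownloadAsBytesAsync(id);
string base64String = Convert.ToBase64String(pdfBytes);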
Hi,
I want to read an MP3 file using BinaryReader. My code is:
using (BinaryReader br = new BinaryReader(File.Open("Songs/testbinary.mp3", FileMode.Open)))
{
int length = (int)br.BaseStream.Length;
byte[] bytes = br.ReadBytes(length);
txtBinary.Text = bytes.ToString();
}
.......
When I execute this code, it throws an exception:
The process cannot access the file 'URL\testbinary.mp3' because it is being used by another process.
where "URL" is my actual file location.
You open the same file twice (without any sharing option). To read the content of a file as bytes, you can use File.ReadAllBytes:
byte[] bytes = File.ReadAllBytes("Songs/testbinary.mp3");
BTW: Don't forget that txtBinary.Text = bytes.ToString(); doesn't give you what you think (it just prints the type name). You will have to use BitConverter.ToString or Convert.ToBase64String.
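For example, a minimal sketch of the whole read-and-display step, keeping the question's txtBinary control and file path:
// Read the whole file in one call; nothing stays open afterwards.
byte[] bytes = File.ReadAllBytes("Songs/testbinary.mp3");
// Show the bytes as hex; use Convert.ToBase64String(bytes) if you prefer base64.
txtBinary.Text = BitConverter.ToString(bytes);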
I'm facing a problem: when I want to insert an image into the database but the path is null, I would like a default picture to be inserted instead. Is this possible?
if (imageName != "")
{
MessageBox.Show(imageName);
//Initialize a file stream to read the image file
FileStream fs = new FileStream(@imageName, FileMode.Open, FileAccess.Read);
//Initialize a byte array with size of stream
byte[] imgByteArr = new byte[fs.Length];
//Read data from the file stream and put into the byte array
fs.Read(imgByteArr, 0, Convert.ToInt32(fs.Length));
fs.Close();
cmdInsert.Parameters.AddWithValue("#fImage", imgByteArr);
}
else {
imageName = #"/HouseWivesSavior;component/images/unknown.jpg";
//Initialize a file stream to read the image file
FileStream fs = new FileStream(@imageName, FileMode.Open, FileAccess.Read);
//Initialize a byte array with size of stream
byte[] imgByteArr = new byte[fs.Length];
//Read data from the file stream and put into the byte array
fs.Read(imgByteArr, 0, Convert.ToInt32(fs.Length));
fs.Close();
cmdInsert.Parameters.AddWithValue("#fImage", imgByteArr);
cmdInsert.Parameters.AddWithValue("#fImage", "");
}
Don't do that. Why would you bother keeping a default image in the database? It is expensive in terms of storage and network traffic. Imagine having 1024 records with the same 200KB default image: you would be wasting 200MB of database storage. On top of that, retrieving BLOBs from a database is a very expensive operation.
Avoid it; if the path is null then insert exactly that, a null value. Then let your business logic decide what to do when there's no image.
And by the way, never check for empty strings the way you did in your question. Use the IsNullOrEmpty static method of the string class instead:
if(!string.IsNullOrEmpty(imageName))
That will evaluate for null and empty values more efficiently
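If you go that route, here's a rough sketch of the insert side, keeping the question's cmdInsert and imgByteArr names (the typed Add call with a -1 size is just one way to bind a varbinary(MAX) parameter):
if (!string.IsNullOrEmpty(imageName))
{
    byte[] imgByteArr = File.ReadAllBytes(imageName);
    cmdInsert.Parameters.Add("@fImage", SqlDbType.VarBinary, -1).Value = imgByteArr;
}
else
{
    // No image: store a real NULL and let the UI fall back to a default picture later.
    cmdInsert.Parameters.Add("@fImage", SqlDbType.VarBinary, -1).Value = DBNull.Value;
}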
Hope it helps
Leo
Instead of checking for an empty string, check for null.
var imagePath = string.Empty;
if (imageName == null)
imagePath = "pathToDefaultImage";
else
imagePath = "pathToNonDefaultImage"
... do your work here using imagePath as the path
But you could just save null in your database. And then when you're loading your images from your database, just set a default image when you encounter null. That way you're not saving the same default image over and over.
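For example, a rough sketch of that load-side check (the cmdSelect command and the fImage column name are placeholders, not from the question):
// Illustrative only: substitute a bundled default picture when the column is NULL.
using (SqlDataReader reader = cmdSelect.ExecuteReader())
{
    while (reader.Read())
    {
        int imageOrdinal = reader.GetOrdinal("fImage");
        byte[] imageBytes = reader.IsDBNull(imageOrdinal)
            ? File.ReadAllBytes(@"images/unknown.jpg")   // default image shipped with the app
            : (byte[])reader[imageOrdinal];
        // ... use imageBytes ...
    }
}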
I am receiving an ArgumentException ("Parameter is not valid") when trying to recreate an image from a memory stream. I have distilled it down to this example, where I load an image, save it into a stream, replicate the stream, and attempt to recreate the System.Drawing.Image object.
im1 can be saved back out fine, and after the copy the new MemoryStream is the same length as the original stream.
I am assuming that the ArgumentException means System.Drawing.Image doesn't think my stream is an image.
Why is the copy altering my bytes?
// open image
var im1 = System.Drawing.Image.FromFile(@"original.JPG");
// save into a stream
MemoryStream stream = new MemoryStream();
im1.Save(stream, System.Drawing.Imaging.ImageFormat.Jpeg);
// try saving - succeeds
im1.Save(#"im1.JPG");
// check length
Console.WriteLine(stream.Length);
// copy stream to new stream - this code seems to screw up my image bytes
byte[] allbytes = new byte[stream.Length];
using (var reader = new System.IO.BinaryReader(stream))
{
reader.Read(allbytes, 0, allbytes.Length);
}
MemoryStream copystream = new MemoryStream(allbytes);
// check length - matches im1.Length
Console.WriteLine(copystream.Length);
// reset position in case this is an issue (doesn't seem to make a difference)
copystream.Position = 0;
// recreate image - why does this fail with "Parameter is not valid"?
var im2 = System.Drawing.Image.FromStream(copystream);
// save out im2 - doesn't get to here
im2.Save(@"im2.JPG");
Before reading from the stream you need to rewind its position to zero. You are doing that for the copy right now, but you also need to do it for the original.
Also, you don't need to copy to a new stream at all.
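Applied to the distilled example, a sketch of the fix with the same names as the question:
var im1 = System.Drawing.Image.FromFile(@"original.JPG");
MemoryStream stream = new MemoryStream();
im1.Save(stream, System.Drawing.Imaging.ImageFormat.Jpeg);
// Image.Save leaves Position at the end of the stream, so the BinaryReader was
// reading zero bytes into allbytes; rewind before reading from it again.
stream.Position = 0;
// No intermediate copy needed: FromStream can consume the MemoryStream directly.
var im2 = System.Drawing.Image.FromStream(stream);
im2.Save(@"im2.JPG");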
I usually resolve such problems by stepping through the program and looking at runtime state to see whether it matches my expectation or not.
I downloaded the source project from the website and am using it as is, except that I changed the target file from upload.php to upload.aspx, which contains the following code to receive the file data:
int chunk = Request.QueryString["chunk"] != null ? int.Parse(Request.QueryString["chunk"]) : 0;
string fileName = Path.GetFileName(Request.Files[0].FileName);
// Read stream
BinaryReader br = new BinaryReader(Request.InputStream);
byte[] buffer = br.ReadBytes((int)br.BaseStream.Length);
br.Close();
//byte[] appended = buffer.Take(149).ToArray();
// Write stream
BinaryWriter bw = new BinaryWriter(File.Open(Server.MapPath("~/uploadfiles" + fileName), chunk == 0 ? FileMode.Create : FileMode.Append));
bw.Write(buffer);
bw.Close();
The problem is that when I upload a JPG file, or any other file, data is prepended and appended to every chunk, which obviously corrupts the file and increases its size. Any idea why that would happen?
You need to read from Request.Files[0], not from Request.InputStream.
See Marco's post here.
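A rough sketch of that change against the question's handler (HttpPostedFile exposes just the chunk's bytes, without the multipart headers that were being prepended and appended):
int chunk = Request.QueryString["chunk"] != null ? int.Parse(Request.QueryString["chunk"]) : 0;
HttpPostedFile upload = Request.Files[0];
string fileName = Path.GetFileName(upload.FileName);
// Read the chunk from the parsed file entry, not the raw request body.
using (BinaryReader br = new BinaryReader(upload.InputStream))
using (BinaryWriter bw = new BinaryWriter(File.Open(
    Server.MapPath("~/uploadfiles/" + fileName),   // note the trailing slash so files land inside the folder
    chunk == 0 ? FileMode.Create : FileMode.Append)))
{
    byte[] buffer = br.ReadBytes(upload.ContentLength);
    bw.Write(buffer);
}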