Certain files created from MemoryStream are corrupt - c#

I have code that passes a list of objects containing the filename and binary data from the DB to a loop that creates all the files. The problem I have is that the code below executes and appears to create the files correctly (the filename and size are as expected); however, most files are "corrupt" when opened. The file types vary from images (jpg/png) to Word documents, PowerPoint presentations and PDF files. What is strange is that the PDF files work perfectly; everything else is "corrupt".
My code is below (attachment is the object in the loop; the path has already been created at this stage):
if (Directory.Exists(attachmentPath))
{
    string absolutePath = attachmentPath + "\\importfiles\\" + parentfolders + "\\";
    // no need to check if it exists as it will ignore if it does
    Directory.CreateDirectory(absolutePath);
    absolutePath += filename;
    try
    {
        byte[] byteStream = null;
        object objSave = null;
        objSave = attachment.Image;
        BinaryFormatter tmpBinF = new BinaryFormatter();
        MemoryStream tmpMemStrm = new MemoryStream();
        tmpBinF.Serialize(tmpMemStrm, objSave);
        byteStream = tmpMemStrm.ToArray();
        // Delete the file if it exists.
        if (File.Exists(absolutePath))
        {
            File.Delete(absolutePath);
        }
        // Create the file.
        using (FileStream fs = File.Create(absolutePath))
        {
            fs.Write(byteStream, 0, byteStream.Length);
            fs.Dispose();
        }
    }
    catch (Exception ex)
    {
        Exceptions.Text += ex.ToString();
    }
}
I've used tips from MSDN and followed this tutorial but can't figure out why this is happening.
Thanks go to Amy for pointing out the issue with my approach. If anyone needs it, here's my updated code taking her answer into account. I've also extended it to add a log record to a table in the DB for later use.
if (Directory.Exists(attachmentPath))
{
    // build path from the parts
    string absolutePath = attachmentPath + "\\importfiles\\" + parentfolders + "\\";
    // no need to check if it exists as it will ignore if it does
    Directory.CreateDirectory(absolutePath);
    absolutePath += filename;
    byte[] file = attachment.Image;
    try
    {
        // Delete the file if it exists.
        if (File.Exists(absolutePath))
        {
            File.Delete(absolutePath);
        }
        // Create the file.
        using (FileStream fs = File.Create(absolutePath))
        {
            fs.Write(file, 0, file.Length);
        }
        // start logging to the database
        // add the stored procedure
        string SP = "sp_add_attachment";
        // create the connection & command objects
        MySqlConnection myConnection1 = new MySqlConnection(WPConnectionString);
        MySqlCommand cmd1;
        try
        {
            // open the connection
            myConnection1.Open();
            cmd1 = myConnection1.CreateCommand();
            // assign the stored procedure string to the command
            cmd1.CommandText = SP;
            // define the command type
            cmd1.CommandType = CommandType.StoredProcedure;
            // pass the parameters to the stored procedure
            cmd1.Parameters.AddWithValue("@AttachmentID", attachment.ID);
            cmd1.Parameters["@AttachmentID"].Direction = ParameterDirection.Input;
            cmd1.Parameters.AddWithValue("@subpath", parentfolders);
            cmd1.Parameters["@subpath"].Direction = ParameterDirection.Input;
            cmd1.Parameters.AddWithValue("@filename", filename);
            cmd1.Parameters["@filename"].Direction = ParameterDirection.Input;
            // execute the command
            int output = cmd1.ExecuteNonQuery();
            // close the connection
            myConnection1.Close();
        }
        catch (Exception ex)
        {
            Exceptions.Text += "MySQL exception when logging: " + ex.ToString();
        }
    }
    catch (Exception ex)
    {
        Exceptions.Text += ex.ToString();
    }
}

I don't think using the BinaryFormatter is appropriate here. If attachment.Image is a byte array, simply write it to the FileStream. Forget the memory stream and the binary formatter entirely.
The BinaryFormatter class is used to serialize a .NET object into a byte array. You already have a byte array, though, so that step is not needed and is the source of your problem. Using the binary formatter would be appropriate only if the same binary formatter had been used to create the blobs in the database. But you're storing files, not .NET objects, so it isn't useful here.
I'm not sure why PDFs would load when other files won't. You'd have to inspect the output in a hex editor to see what changed.
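If it helps, a minimal sketch of that suggestion, assuming attachment.Image is already a byte[] (as the question implies):
// Write the stored bytes straight to disk; no BinaryFormatter or MemoryStream needed.
byte[] fileBytes = attachment.Image;
File.WriteAllBytes(absolutePath, fileBytes);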

Related

File Still Locked Despite putting StreamReader in Using

My program goes through all the files in a folder, reads them, and then, without altering their contents, moves them to another location under a different name. However, I cannot use the File.Move method because I get the following IOException:
The process cannot access the file because it is being used by another process.
This is how I am reading the file and adding all its lines to a List<string>:
List<string> lines = null;
using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (var sr = new StreamReader(fs, Encoding.Default))
{
    lines = new List<string>();
    while (!sr.EndOfStream)
        lines.Add(sr.ReadLine());
}
And this is the function with which I move the file:
public static bool ArchiveFile(string filePath, string archiveFolderLocation)
{
    if (!Directory.Exists(archiveFolderLocation))
        Directory.CreateDirectory(archiveFolderLocation);
    try
    {
        string timestamp = string.Format("{0:yyyy-MM-dd HHmmss}", DateTime.Now);
        string newFileName = Path.GetFileNameWithoutExtension(filePath) + " " + timestamp;
        string destination = string.Format("{0}\\{1}{2}", archiveFolderLocation, newFileName, Path.GetExtension(filePath));
        File.Move(filePath, destination);
        return true;
    }
    catch (Exception ex)
    {
        return false;
    }
}
I thought the using statement was supposed to garbage-collect and release the file after being used. How can I release the file so I can move it, and why does my file stay locked?
Solved:
Got it. Somewhere between these two calls I was opening a TextReader object without disposing it.
I thought the using statement was supposed to garbage-collect and release the file after being used. How can I release the file so I can move it, and why does my file stay locked?
Not really. A using statement is nothing but:
var resource = new SomeResource();
try
{
    // use the resource
}
finally
{
    resource.Dispose(); // which is not GC.Collect();
}
Using works fine here, so it looks like your file is being opened from some other place in your code...
P.S.
By the way, you can just do:
List<string> lines = File.ReadAllLines(filePath).ToList();
You could use:
string dpath = "D:\\Destination\\";
string spath = "D:\\Source";
string[] flist = Directory.GetFiles(spath);
foreach (string item in flist)
{
    File.Move(item, dpath + new FileInfo(item).Name);
}
Replace D:\\Source & D:\\Destination\\ with the required source and destination paths, respectively.
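If it's useful, here is the same loop written with Path.Combine instead of string concatenation (the paths are illustrative):
// Move every file from the source folder to the destination folder, keeping its name.
string sourceFolder = @"D:\Source";
string destinationFolder = @"D:\Destination";
foreach (string sourceFile in Directory.GetFiles(sourceFolder))
{
    string destinationFile = Path.Combine(destinationFolder, Path.GetFileName(sourceFile));
    File.Move(sourceFile, destinationFile);
}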

How to clone a HttpPostedFile

I have an application that allows a user to upload a file, the file is scanned with Symantec protection engine before it is saved. The problem I'm encountering is that after the files are scanned with protection engine they have 0 bytes. I'm trying to come up with a solution to this problem.
I've tried the cloning solution mentioned here: Deep cloning objects, but my uploaded files are not all Serializable. I've also tried resetting the stream to 0 in the scan engine class before passing it back to be saved.
I have been in contact with Symantec and they said the custom connection class that was written for this application looks correct, and protection engine is not throwing errors.
I'm open to any solution to this problem.
Here is the code where the file is uploaded:
private void UploadFiles()
{
    System.Web.HttpPostedFile objFile;
    string strFilename = string.Empty;
    if (FileUpload1.HasFile)
    {
        objFile = FileUpload1.PostedFile;
        strFilename = FileUpload1.FileName;
        if (GetUploadedFilesCount() < 8)
        {
            if (IsDuplicateFileName(Path.GetFileName(objFile.FileName)) == false)
            {
                if (ValidateUploadedFiles(FileUpload1.PostedFile) == true)
                {
                    // stores full path of folder
                    string strFileLocation = CreateFolder();
                    // just to know the uploading folder
                    mTransactionInfo.FileLocation = strFileLocation.Split('\\').Last();
                    if (ScanUploadedFile(objFile) == true)
                    {
                        SaveFile(objFile, strFileLocation);
                    }
                    else
                    {
                        lblErrorMessage.Visible = true;
                        if (mFileStatus != null)
                        { lblErrorMessage.Text = mFileStatus.ToString(); }
I can provide the connection class code if anyone needs that, but it is quite large.
You could take a copy of the filestream before you pass it into the scanning engine.
byte[] fileData = null;
using (var binaryReader = new BinaryReader(Request.Files[0].InputStream))
{
    fileData = binaryReader.ReadBytes(Request.Files[0].ContentLength);
}
// pass to the scanning engine
StreamScanRequest scan = requestManagerObj.CreateStreamScanRequest(Policy.DEFAULT);
//...
Update
To copy the stream you can do this:
MemoryStream ms = new MemoryStream();
file.InputStream.CopyTo(ms);
file.InputStream.Position = ms.Position = 0;
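A hypothetical usage sketch of that copy (ScanStream and strFileLocation are assumptions, not from the original code): scan the in-memory copy, then save the original posted file, whose stream position was reset above.
// Scan the copy; if it is clean, persist the original upload.
bool isClean = ScanStream(ms); // assumed wrapper around the protection engine scan request
if (isClean)
{
    file.SaveAs(Path.Combine(strFileLocation, Path.GetFileName(file.FileName)));
}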

Using the UPDATE MySQL Command in C#

Suppose I have this method from one class:
private void btnChangeImage_Click(object sender, EventArgs e)
{
    using (var openFileDialogForImgUser = new OpenFileDialog())
    {
        string location = null;
        string fileName = null;
        openFileDialogForImgUser.Filter = "Image Files (*.jpg, *.png, *.gif, *.bmp)|*.jpg; *.png; *.gif; *.bmp|All Files (*.*)|*.*"; // filtering only picture file types
        var openFileResult = openFileDialogForImgUser.ShowDialog(); // show the file open dialog box
        if (openFileResult == DialogResult.OK)
        {
            using (var formSaveImg = new FormSave())
            {
                var saveResult = formSaveImg.ShowDialog();
                if (saveResult == DialogResult.Yes)
                {
                    imgUser.Image = new Bitmap(openFileDialogForImgUser.FileName); // showing the image opened in the picturebox
                    location = openFileDialogForImgUser.FileName;
                    fileName = openFileDialogForImgUser.SafeFileName;
                    FileStream fs = new FileStream(location, FileMode.Open, FileAccess.Read); // creating a filestream to open the image file
                    int fileLength = (int)fs.Length; // getting the length of the file in bytes
                    byte[] rawdata = new byte[fileLength]; // creating an array to store the image as bytes
                    fs.Read(rawdata, 0, (int)fileLength); // reading the image file into the byte array
                    MySQLOperations MySQLOperationsObj = new MySQLOperations("localhost", "root", "myPass");
                    MySQLOperationsObj.saveImage(rawdata);
                    fs.Close();
                }
                else
                    openFileDialogForImgUser.Dispose();
            }
        }
    }
}
And this method from another class (MySQLOperations):
public void saveImage(byte[] rawdata)
{
    try
    {
        string myConnectionString = "Data Source = " + server + "; User = " + user + "; Port = 3306; Password = " + password + ";";
        MySqlConnection myConnection = new MySqlConnection(myConnectionString);
        string currentUser = FormLogin.userID;
        string useDataBaseCommand = "USE " + dbName + ";";
        string updateTableCommand = "UPDATE tblUsers SET UserImage = @file WHERE Username = \'" + currentUser + "\';";
        MySqlCommand myCommand = new MySqlCommand(useDataBaseCommand + updateTableCommand, myConnection);
        myCommand.Parameters.AddWithValue("@file", rawdata);
        myConnection.Open();
        myCommand.ExecuteNonQuery();
        myConnection.Close();
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message, "Error!", MessageBoxButtons.OK, MessageBoxIcon.Error);
    }
}
If I must, this is my constructor for the MySQLOperations class:
public MySQLOperations(string server, string user, string password)
{
    this.server = server;
    this.user = user;
    this.password = password;
}
What I'm trying to do is save an image file (which the user selects through the open file dialog box) to the database. The problem is that I get this error: "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ';UPDATE tblUsers SET UserImage = _binary'?PNG ..." (and so on with some random characters). So I can't really save the file in the database. I would love to post a picture of how the error appears in the MessageBox, but I guess my account is not given the privilege to do so yet.
I'm not really sure where the syntax error is in that. I'm thinking it's in the @file - but that's just a guess. Your help would be very much appreciated.
And oh, the table column UserImage has a type of LONGBLOB.
Other things I'm interested to know about also:
- Is it necessary that I add another column to my table to store the size of the file (because I'm going to need to retrieve the file to display the image later on)?
- Is it okay that I used the using statement that way in the method btnChangeImage_Click?
Thank you very much.
EDIT: Got the problem solved; it was such a simple thing that I hadn't paid attention to. Thanks to everybody who tried to help. I'm still willing to hear your opinions on the questions at the bottom (the bulleted ones).
I think the problem is in the following line of code:
WHERE Username = \'" + currentUser + "\';"
It should change to the following one:
WHERE Username = " + currentUser;
Or better (to avoid SQL injection), to the following:
WHERE Username = @Username";
myCommand.Parameters.AddWithValue("@Username", currentUser);
Do not store binary files in a MySQL table. Instead, save the file to disk and store the path to the PNG file in the MySQL database. Also, follow Christos' advice to avoid SQL injection.
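For illustration, a rough sketch of that approach; the storage folder, the UserImagePath column and the connection handling here are assumptions, not part of the original code:
// Save the selected image to disk and store only its path in the database.
string imagesFolder = @"C:\MyApp\UserImages";            // hypothetical storage folder
Directory.CreateDirectory(imagesFolder);                 // no-op if it already exists
string savedPath = Path.Combine(imagesFolder, fileName);
File.Copy(location, savedPath, true);                    // copy the file chosen in the dialog

using (var connection = new MySqlConnection(myConnectionString))
using (var command = new MySqlCommand(
    "UPDATE tblUsers SET UserImagePath = @path WHERE Username = @Username;", connection))
{
    command.Parameters.AddWithValue("@path", savedPath);
    command.Parameters.AddWithValue("@Username", currentUser);
    connection.Open();
    command.ExecuteNonQuery();
}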

C# UnauthorizedAccessException error

I am transferring files through sockets. When I try to save files to a custom directory, I get this error using the BinaryWrite function.
private void downloadFromServer()
{
    try
    {
        byte[] buffer = new byte[5000 * 1024];
        byte[] incomingFile = new byte[5000 * 1024];
        buffer = Encoding.Default.GetBytes(getUserName.Text + "Download"
            + getFileName.Text + "end");
        clientSocket.Send(buffer);
        activityLog.AppendText("Preparing to download... \n");
        while (incomingFile != null)
        {
            clientSocket.Receive(incomingFile);
            int receivedBytesLen = incomingFile.Length;
            int fileNameLen = BitConverter.ToInt32(incomingFile, 0);
            File.WriteAllBytes(fileDir, incomingFile);
        }
        activityLog.AppendText("File saved to " + fileDir + "\n");
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}
File.WriteAllBytes(fileDir, incomingFile); requires a file name. From the variable name it looks like you are passing a folder name, i.e. it should be @"e:\tempfile.bin" instead of @"e:\".
File.WriteAllBytes
Given a byte array and a file path, this method opens the specified file, writes the contents of the byte array to the file, and then closes the file.
Note: if fileDir actually does hold a file name, then you should look for other misleading names throughout your code...
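A minimal sketch of the fix, assuming fileDir really is a folder and that getFileName.Text holds the name of the file being downloaded (both are assumptions, not from the original code):
// Build a full file path from the folder and the file name, then write the bytes.
string destinationPath = Path.Combine(fileDir, getFileName.Text);
File.WriteAllBytes(destinationPath, incomingFile);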

Efficient way to read large tab delimited txt file?

I have a tab-delimited txt file with 500K records. I'm using the code below to read the data into a DataSet. With 50K records it works fine, but with 500K it gives "Exception of type 'System.OutOfMemoryException' was thrown."
What is a more efficient way to read large tab-delimited data? Or how can I resolve this issue? Please give me an example.
public DataSet DataToDataSet(string fullpath, string file)
{
    string sql = "SELECT * FROM " + file; // Read all the data
    OleDbConnection connection = new OleDbConnection // Connection
        ("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + fullpath + ";"
        + "Extended Properties=\"text;HDR=YES;FMT=Delimited\"");
    OleDbDataAdapter ole = new OleDbDataAdapter(sql, connection); // Load the data into the adapter
    DataSet dataset = new DataSet(); // To hold the data
    ole.Fill(dataset); // Fill the dataset with the data from the adapter
    connection.Close(); // Close the connection
    connection.Dispose(); // Dispose of the connection
    ole.Dispose(); // Get rid of the adapter
    return dataset;
}
Use a stream approach with TextFieldParser - this way you will not load the whole file into memory in one go.
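For example, a minimal sketch (TextFieldParser lives in the Microsoft.VisualBasic.FileIO namespace; the per-record handling is left as a placeholder):
using Microsoft.VisualBasic.FileIO; // reference the Microsoft.VisualBasic assembly

using (var parser = new TextFieldParser(fullpath))
{
    parser.TextFieldType = FieldType.Delimited;
    parser.SetDelimiters("\t");
    while (!parser.EndOfData)
    {
        string[] fields = parser.ReadFields(); // one record at a time
        // process the fields here instead of loading everything into a DataSet
    }
}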
You really want to enumerate the source file and process one line at a time. I use the following extension method:
public static IEnumerable<string> EnumerateLines(this FileInfo file)
{
    using (var stream = File.Open(file.FullName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
    using (var reader = new StreamReader(stream))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            yield return line;
        }
    }
}
Then you can split each line on tabs and process it one record at a time. This keeps memory usage really low for the parsing; you only use memory if the application holds on to the data.
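A possible usage sketch, assuming fullpath points at the tab-delimited file:
// Enumerate the file line by line; only the current line is held in memory.
var file = new FileInfo(fullpath);
foreach (string line in file.EnumerateLines())
{
    string[] fields = line.Split('\t');
    // process one record here instead of filling a 500K-row DataSet
}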
Have you tried the TextReader?
using (TextReader tr = File.OpenText(YourFile))
{
    string strLine = string.Empty;
    string[] arrColumns = null;
    while ((strLine = tr.ReadLine()) != null)
    {
        arrColumns = strLine.Split('\t');
        // start filling your DataSet or whatever you want to do with your data
    }
    tr.Close();
}
I found FileHelpers
The FileHelpers are a free and easy to use .NET library to import/export data from fixed length or delimited records in files, strings or streams.
Maybe it can help.
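A rough sketch of what that might look like with FileHelpers (the record class and its fields are assumptions; they must mirror the columns of the actual file):
using FileHelpers;

[DelimitedRecord("\t")]
public class ImportRecord
{
    public string Column1;
    public string Column2;
    // ... one field per column in the file
}

// Stream the records instead of loading them all at once.
var engine = new FileHelperAsyncEngine<ImportRecord>();
using (engine.BeginReadFile(fullpath))
{
    foreach (ImportRecord record in engine)
    {
        // process one record at a time
    }
}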
