Storing files in SQL Server database - c#

I want to store files in my SQL Server database from C#, which I have done without problems.
This is my code:
byte[] file;
using (var stream = new FileStream(letter.FilePath, FileMode.Open, FileAccess.Read))
{
    using (var reader = new BinaryReader(stream))
    {
        file = reader.ReadBytes((int)stream.Length);
        letter.ltr_Image = file;
    }
}
LetterDB letterDB = new LetterDB();
id = letterDB.LetterActions(letter);
The SQL insert happens in the LetterActions module. But in order to reduce the size of the database (which grows daily), is there any solution for compressing the files before storing them in the database?

Yes, you can zip your files before storing them in the database, using the ZipFile class. Take a look here: https://msdn.microsoft.com/en-us/library/system.io.compression.zipfile(v=vs.110).aspx
Plenty of sample code out there too. See here: http://imar.spaanjaars.com/414/storing-uploaded-files-in-a-database-or-in-the-file-system-with-aspnet-20
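For example, here is a minimal sketch that zips the source file through a temporary archive and then reads the archive's bytes for the database column, reusing the letter object from the question. The temp-path handling is just an assumption, and it needs System.IO.Compression plus a reference to the System.IO.Compression.FileSystem assembly.

// Hypothetical sketch: zip the file to a temporary archive, then store the zip's bytes.
string zipPath = Path.Combine(Path.GetTempPath(), Path.GetFileName(letter.FilePath) + ".zip");

using (var archive = ZipFile.Open(zipPath, ZipArchiveMode.Create))
{
    archive.CreateEntryFromFile(letter.FilePath, Path.GetFileName(letter.FilePath));
}

letter.ltr_Image = File.ReadAllBytes(zipPath);
File.Delete(zipPath);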

You can compress the file and insert the compressed stream into the database, but when you read it back you need to decompress it.
If you really need to store the file in the database, I suggest you compress and decompress it on the client side.
A better way to handle files is to store them on disk and keep only the file path in the database; when the client needs a file, use the path to fetch it.
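If you go the compression route, here is a minimal sketch of compressing and decompressing a byte array in memory with GZipStream (System.IO.Compression); the method names are placeholders:

static byte[] Compress(byte[] data)
{
    using (var output = new MemoryStream())
    {
        // leaveOpen: true keeps the MemoryStream usable after the GZipStream is disposed
        using (var gzip = new GZipStream(output, CompressionMode.Compress, leaveOpen: true))
        {
            gzip.Write(data, 0, data.Length);
        }
        return output.ToArray();
    }
}

static byte[] Decompress(byte[] compressed)
{
    using (var input = new MemoryStream(compressed))
    using (var gzip = new GZipStream(input, CompressionMode.Decompress))
    using (var output = new MemoryStream())
    {
        gzip.CopyTo(output);
        return output.ToArray();
    }
}

Call Compress before the insert and Decompress after the select; how much this actually saves depends on the file type (already-compressed formats such as JPG or ZIP gain little).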

Related

Read and write to the same csv file

I have a CSV file (e.g. Directories.csv) which contains a huge list of directories. I am looping through the directories from the CSV using a StreamReader and performing some task on each one. I am adding the completed directories to a dictionary and am stuck at this step.
What I want: as the loop runs, I want to record in the same CSV which directories are complete, so that if the application crashes or the server reboots I don't have to re-process the directories that already finished. (Or) delete the completed directories' rows from the CSV.
I checked online for suggestions, and the advice was to create a temp file and then move the copy over. Will that still work if the server reboots or the application crashes? Please suggest how I can take this forward.
My code:
Dictionary<string, string> directoryDictionary = new Dictionary<string, string>();
string Directoryline;

using (FileStream fileStreamDirectory = File.Open(outputdir + "\\Directories.csv", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (BufferedStream bufferStreamDirectory = new BufferedStream(fileStreamDirectory))
using (StreamReader streamReaderDirectory = new StreamReader(bufferStreamDirectory))
{
    while ((Directoryline = streamReaderDirectory.ReadLine()) != null)
    {
        // Doing the task here
        directoryDictionary.Add(Directoryline, "Completed");
    }
}
You can't really insert data into the middle of a text file (unless it is a fixed-width format, which is not the case for CSV).
Two options:
read to memory, update the in-memory data, rewrite the whole table back to the file (you may need to keep the previous version in case of write failures); see the sketch below
use a database that satisfies your criteria and import the CSV there to work with.
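A minimal sketch of the first option, assuming the file name from the question and that the completed directories are the keys of directoryDictionary; File.Replace swaps the files and keeps a backup, so a crash mid-write does not lose the original:

// Requires System.Linq for Where/ToList
var remaining = File.ReadAllLines(outputdir + "\\Directories.csv")
                    .Where(line => !directoryDictionary.ContainsKey(line))
                    .ToList();

File.WriteAllLines(outputdir + "\\Directories.csv.tmp", remaining);

// Swap the temp file in, keeping the previous version as .bak
File.Replace(outputdir + "\\Directories.csv.tmp",
             outputdir + "\\Directories.csv",
             outputdir + "\\Directories.csv.bak");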

Saving File from web to Database c#

I'm making a website where the user has a form in which they can choose a file from their computer and upload it, but I don't know how I can use it. I'm using MVC and also a web service in C#, where I'm handling the connection to the database in which I have to save the file. The file can be a PDF, Word document or an image.
So the question is: how can I save it and also check its size? Thank you.
You can use the ContentLength property of HttpPostedFileBase to get the size of the file -
https://msdn.microsoft.com/en-us/library/system.web.httppostedfilebase.contentlength(v=vs.110).aspx
Then save the InputStream to a byte array
byte[] bytes;
using (MemoryStream stream = new MemoryStream())
{
    file.InputStream.CopyTo(stream);
    bytes = stream.ToArray();
}
Next assign the byte array to a SqlDbType.VarBinary stored procedure parameter to save it back to the database - assuming this is SQL Server.
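A rough sketch of that last step, assuming SQL Server and a stored procedure named dbo.SaveFile with a @FileData VARBINARY(MAX) parameter (both names and connectionString are hypothetical):

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("dbo.SaveFile", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    // -1 is the length to use for VARBINARY(MAX)
    command.Parameters.Add("@FileData", SqlDbType.VarBinary, -1).Value = bytes;

    connection.Open();
    command.ExecuteNonQuery();
}

You can reject oversized uploads before this point by checking file.ContentLength against whatever limit you choose.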

Out Of Memory Exception in Foreach

I am trying to create a function that will retrieve all the uploaded files (which are now saved as bytes in the database) and download them in a single zip file. I currently have 6000 files to download (and the number could grow).
The functionality already works (from retrieval to download) if I limit the number of files being downloaded; otherwise, I get an OutOfMemoryException in the foreach loop.
Here's pseudocode (the files variable is a list of byte arrays and file names):
var files = getAllFilesFromDB();

foreach (var file in files)
{
    var tempFilePath = Path.Combine(path, file.filename);
    using (FileStream stream = new FileStream(tempFilePath, FileMode.Create, FileAccess.ReadWrite))
    {
        stream.Write(file.fileData, 0, file.fileData.Length);
    }
}

private readonly IEntityRepository<File> fileRepository;

IEnumerable<FileModel> getAllFilesFromDb()
{
    return fileRepository.Select(f => new FileModel() { fileData = f.byteArray, filename = f.fileName });
}
My question is, is there any other way to do this to avoid getting such errors?
To avoid this problem, you could avoid loading the contents of all the files in one go. Most likely you will need to split your database call into two calls:
Retrieve a list of all the files without their contents, but with some identifier - like the PK of the table.
A method which retrieves the contents of an individual file.
Then your (pseudo)code becomes
get list of all files
for each file
get the file contents
write the file to disk
Another possibility is to alter the way your query works so that it uses deferred execution - this means it will not actually load all the files at once, but stream them one at a time from the database - but without seeing more code from your repository implementation, I cannot/will not guess the right solution for you. A sketch of the first approach follows.
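This sketch assumes the entity has an Id primary key and that the repository supports Where/Single in addition to the Select shown in the question; treat those as placeholders for whatever your repository actually exposes:

// 1. Get only the identifiers, not the contents
var fileIds = fileRepository.Select(f => f.Id).ToList();

foreach (var id in fileIds)
{
    // 2. Load a single file's bytes, so only one file is in memory at a time
    var file = fileRepository
        .Where(f => f.Id == id)
        .Select(f => new FileModel { fileData = f.byteArray, filename = f.fileName })
        .Single();

    System.IO.File.WriteAllBytes(Path.Combine(path, file.filename), file.fileData);
}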

Generate files and ZIP without memory stream

I'm looking for a way to store files in a zip file without a memory stream. My goal is to use as little system memory as possible, while direct disk IO is no problem.
I iterate over a database result set where I have collected some blobs. These are byte arrays.
What I do is the following (System.IO.Compression):
using (var archive = ZipFile.Open("data.zip", ZipArchiveMode.Update))
{
    foreach (var result in results)
    {
        string fileName = $"{result.Id}.bin";
        using (var fileStream = new FileStream(fileName, FileMode.Create, FileAccess.Write))
        {
            // write the blob data from result.Value
            fileStream.Write(result.Value, 0, result.Value.Length);
            fileStream.Close();
        }
        archive.CreateEntryFromFile(fileName, fileName);
    }
}
There are two problems with this implementation.
I end up with my *.bin files AND the one *.zip (I only need the zip).
I don't know why, but this uses a lot of RAM (~100 MB for 15 x 1.5 MB bin files).
Is there a way to completely bypass the memory usage?
UPDATE:
What I'm trying to achieve is to generate one ZIP file that contains individual binary files generated from database blobs. This should happen inside an ASP.NET Web API controller. A user can request the data, but instead of sending the whole payload in the HTTP response, I generate the ZIP file at the time of the request, save it to a local file server and send a download link back to the user.
I think your >100 MB is coming from:
the results object, which should contain at least 15 x 1.5 MB of blob data
holding the resulting data.zip open inside the foreach scope.
To minimize the RAM usage of the worker process:
create empty zip file
do {
    (single BLOB query from DB)
    (write blob to new or overwritten file)
    (open zip file for append)
    (append file to zip)
    (close and dispose **both** file handles / objects)
}
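An alternative sketch that skips the intermediate *.bin files entirely: write each blob straight into a zip entry using ZipArchiveMode.Create, which streams entries to disk instead of buffering the whole archive. The results objects with Id and Value are taken from the question; the output path is a placeholder.

using (var zipStream = new FileStream("data.zip", FileMode.Create, FileAccess.Write))
using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Create))
{
    foreach (var result in results)
    {
        var entry = archive.CreateEntry($"{result.Id}.bin", CompressionLevel.Optimal);
        using (var entryStream = entry.Open())
        {
            // Write the blob directly into the archive entry
            entryStream.Write(result.Value, 0, result.Value.Length);
        }
    }
}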

Compress picture and photos to store in SQL Server database

I want to save many pictures into a SQL Server database. How can I compress the pictures to store them in the database? What technologies can I use?
Please help me.
After edit:
How can I compress and resize pictures to store in a SQL Server database?
Pictures don't compress much if they are already in JPG or most other compressed formats. You could try compressing them further before sending them to SQL Server, but it probably isn't worth the effort unless you have BMPs.
I suggest you look at FILESTREAM to store them in SQL Server (assuming SQL Server 2008 or later), which is more efficient for this kind of data.
You could save them as JPEG and store the file. To do that, you can use the Image.Save overload that takes encoder parameters, and if you want to set the quality you can use Encoder.Quality (Encoder.Compression applies to formats like TIFF rather than JPEG).
You should also make sure that the resolution of the images is not higher than you need.
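A minimal sketch of re-encoding an image as JPEG at roughly 75% quality before storing it; the paths and the quality value are placeholders, and it requires System.Drawing, System.Drawing.Imaging and System.Linq:

static void SaveAsJpeg(string inputPath, string outputPath)
{
    using (var image = Image.FromFile(inputPath))
    {
        // Find the built-in JPEG encoder
        ImageCodecInfo jpegCodec = ImageCodecInfo.GetImageEncoders()
            .First(c => c.FormatID == ImageFormat.Jpeg.Guid);

        using (var parameters = new EncoderParameters(1))
        {
            // System.Drawing.Imaging.Encoder.Quality controls JPEG quality (0-100)
            parameters.Param[0] = new EncoderParameter(Encoder.Quality, 75L);
            image.Save(outputPath, jpegCodec, parameters);
        }
    }
}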
You can store pictures in a SQL database by converting the images to a byte array. You can do it with the code below:
byte[] ReadFile(string sPath)
{
    byte[] data;
    FileInfo fInfo = new FileInfo(sPath);
    long numBytes = fInfo.Length;

    using (FileStream fStream = new FileStream(sPath, FileMode.Open, FileAccess.Read))
    using (BinaryReader br = new BinaryReader(fStream))
    {
        data = br.ReadBytes((int)numBytes);
    }

    return data;
}
