I am making a program that needs to retrieve the local path of an image from a MySQL database table. For the moment, the path stored in the table is C:\Users\User\Pictures\PictureName.PNG. The method I have written retrieves this path from the database with a Select statement and sets the PictureBox image to a new Bitmap created from that path.
Most of the answers I found online relate to storing images in a MySQL BLOB field, which is not what I am doing. My question is: if I were to put my database on a server, with the pictures stored there too, would my method scale to accommodate those changes, or would I have to add other functionality for it to work on a server as well?
Here is my code below:
public void setImage() {
    string path = sql.RetrieveImage(SelectQuery.RetrievePicture());
    PictureBox1.Image = new Bitmap(path);
}

public string RetrieveImage(Query query) {
    string path = "";
    // OpenDatabaseConnection() opens a connection to the database
    // through 'connection' and returns true on success.
    if (this.OpenDatabaseConnection()) {
        // query.ToSql() returns the SELECT statement that retrieves
        // the local image path.
        using (MySqlCommand cmd = new MySqlCommand(query.ToSql(), connection))
        using (MySqlDataReader dataReader = cmd.ExecuteReader()) {
            if (dataReader.Read()) {
                path = dataReader.GetString(0);
            }
        }
        this.CloseDatabaseConnection();
    }
    return path;
}
Storing images in a DB is a bad idea. Storing paths to image files is a much better way to go, IMO.
This SO question has a few links to really good supporting information.
If you must go that route, however, then yes, a BLOB field (or VARBINARY(MAX) in SQL Server) is the way to do it. You can retrieve those images as byte[] data.
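For illustration, here is a minimal sketch of reading an image back out of a BLOB column with Connector/NET. The table and column names (files, id, bin) and the connection string are assumptions, not something from the question:

using System.Drawing;
using System.IO;
using MySql.Data.MySqlClient;

// Sketch: the BLOB comes back as a byte[], which can be wrapped
// in a stream and handed to GDI+.
public Image LoadImageFromBlob(string connectionString, int fileId)
{
    using (var connection = new MySqlConnection(connectionString))
    using (var cmd = new MySqlCommand("SELECT bin FROM files WHERE id = @id", connection))
    {
        cmd.Parameters.AddWithValue("@id", fileId);
        connection.Open();
        byte[] data = (byte[])cmd.ExecuteScalar();

        // GDI+ requires the stream to stay open for the lifetime of the
        // Image, so the MemoryStream is intentionally not disposed here.
        var ms = new MemoryStream(data);
        return Image.FromStream(ms);
    }
}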
Related
I am working on a table that has a LONGBLOB column and I need to SELECT/INSERT data.
At the moment the code to upload a file to the DB is the following:
using (connection = new MySqlConnection(connectionString))
{
    string query = "INSERT INTO files(name, uploader, bin) VALUES (@fileName, @uploader, @bin)";
    using (command = connection.CreateCommand())
    {
        command.CommandText = query;
        command.Parameters.AddWithValue("@fileName", Path.GetFileName(filePath));
        command.Parameters.AddWithValue("@uploader", "John Doe");
        command.Parameters.AddWithValue("@bin", File.ReadAllBytes(filePath));
        connection.Open();
        command.ExecuteNonQuery();
        connection.Close();
    }
}
The problem is that the whole file is loaded into RAM; is there a way to stream the data instead?
The code works, it is just a matter of understanding whether it can be optimized.
P.S. I am aware that storing big files directly into the database is bad practice, but this is legacy stuff.
In the case of a JPG that will be rendered in a web page, it is better to have it in a file on the server, then put the URL of that file in the database. When the page is rendered, it will fetch multiple images asynchronously -- making the page load seem faster to the end-user. And it avoids the need for discussing your Question.
If the BLOB is something else, please describe it and its usage.
It may be better to chunk it into multiple pieces, especially on a busy system that includes Replication.
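To make the chunking idea concrete, here is a sketch rather than a drop-in fix: Connector/NET does not stream parameter values to the server, so a common workaround is to insert the row with an empty blob and then append one chunk at a time with CONCAT, keeping only one chunk in RAM. The auto-increment id column is an assumption, and the chunk size must stay below the server's max_allowed_packet:

using System.IO;
using System.Linq;
using MySql.Data.MySqlClient;

public void UploadFileChunked(string connectionString, string filePath)
{
    const int chunkSize = 1024 * 1024; // 1 MB per round trip; tune as needed

    using (var connection = new MySqlConnection(connectionString))
    {
        connection.Open();

        // Create the row with an empty blob first.
        long id;
        using (var insert = new MySqlCommand(
            "INSERT INTO files(name, uploader, bin) VALUES (@fileName, @uploader, '')",
            connection))
        {
            insert.Parameters.AddWithValue("@fileName", Path.GetFileName(filePath));
            insert.Parameters.AddWithValue("@uploader", "John Doe");
            insert.ExecuteNonQuery();
            id = insert.LastInsertedId;
        }

        // Append the file one chunk at a time.
        using (var stream = File.OpenRead(filePath))
        using (var append = new MySqlCommand(
            "UPDATE files SET bin = CONCAT(bin, @chunk) WHERE id = @id", connection))
        {
            var chunk = append.Parameters.Add("@chunk", MySqlDbType.LongBlob);
            append.Parameters.AddWithValue("@id", id);

            var buffer = new byte[chunkSize];
            int read;
            while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Send exactly the bytes read so the final partial chunk is correct.
                chunk.Value = read == buffer.Length ? buffer : buffer.Take(read).ToArray();
                append.ExecuteNonQuery();
            }
        }
    }
}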
From a third party I am receiving a SQLite '.db3' database file. I wish to read the data stored within the database in an Azure function, but have no requirement to modify the database.
In my Azure function I have also uploaded a text file alongside the .db3 file, and I can read the contents of that text file just fine, meaning that I have the correct URI and read privileges on the folder.
When I try to read some data from the database I get the error 'database is locked'.
Code to read the database is below. This works locally.
List<object> obs = new List<object>();

using (var con = new SQLiteConnection($"URI=file:{fullUri};mode=ReadOnly"))
{
    con.Open();
    using (var cmd = new SQLiteCommand(sql, con))
    using (SQLiteDataReader rdr = cmd.ExecuteReader())
    {
        while (rdr.Read())
        {
            obs.Add(rdr.GetValues());
        }
    }
    con.Close();
}

return obs;
How do I read the database?
I've had this problem before, reading a local file ... more than once. These are the steps I took depending on the scenario:
If your Azure Function is deployed using zipdeploy then under the "Configuration" > "Application Settings", change "WEBSITE_RUN_FROM_PACKAGE" from 1 to 0.
If the file you are trying to reference is saved under wwwroot on the Azure Function, I would try putting it somewhere else, e.g. d:\local.
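When the deployed package is read-only, a common workaround along those lines is to copy the .db3 to a writable location first and open the copy. A minimal sketch, where sourceDb3Path is a placeholder for wherever the packaged file lives:

using System.Data.SQLite;
using System.IO;

// Copy the packaged database to the writable temp folder, then open
// the copy read-only; SQLite can take its locks on the copy.
string localCopy = Path.Combine(Path.GetTempPath(), "data.db3");
File.Copy(sourceDb3Path, localCopy, overwrite: true);

using (var con = new SQLiteConnection($"Data Source={localCopy};Read Only=True"))
{
    con.Open();
    using (var cmd = new SQLiteCommand(sql, con))
    using (SQLiteDataReader rdr = cmd.ExecuteReader())
    {
        while (rdr.Read())
        {
            obs.Add(rdr.GetValues());
        }
    }
}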
I am trying to figure out how to allow my users to save their Outlook files to a SQL database using C#/.NET. I am using a normal file upload control; my thought was that I could just save the file as a string and put it in a link button, but that didn't work for me.
This was my idea for the code:
string ImageName1 = string.Empty;
byte[] Image1 = null;

if (Images1.PostedFile != null && Images1.PostedFile.FileName != "")
{
    ImageName1 = Path.GetFileName(Images1.FileName);

    // Copy the uploaded stream once. A single Read() call is not guaranteed
    // to fill the buffer, and reading the stream a second time returns nothing,
    // so the original double read left the buffer incomplete.
    using (var ms = new MemoryStream())
    {
        Images1.PostedFile.InputStream.CopyTo(ms);
        Image1 = ms.ToArray();
    }
}
Any information on how to do that would be helpful!
You can store files as BLOBs in the database.
I don't quite follow your code; normally you communicate with SQL Server by calling a stored procedure and passing parameters, or by specifying the command as text in C#.
This link gives an example of how to insert a BLOB into a database.
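As a concrete illustration of the parameterized approach, here is a sketch of inserting the uploaded bytes into a VARBINARY(MAX) column. The table and column names (Attachments, FileName, Content) are assumptions:

using System.Data;
using System.Data.SqlClient;

// Sketch: store the uploaded file's bytes in a VARBINARY(MAX) column
// using a parameterized command (size -1 means MAX).
public void SaveUpload(string connectionString, string fileName, byte[] content)
{
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(
        "INSERT INTO Attachments (FileName, Content) VALUES (@FileName, @Content)",
        connection))
    {
        command.Parameters.Add("@FileName", SqlDbType.NVarChar, 255).Value = fileName;
        command.Parameters.Add("@Content", SqlDbType.VarBinary, -1).Value = content;
        connection.Open();
        command.ExecuteNonQuery();
    }
}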
I have a C# application that is connected to an Access database. I use an OpenFileDialog to find an image, display details about it, and then want to save those details to the database. But when I try to do this I get an error saying the data is too big. What is the way around this? I want the file path saved so that I can view the image later in my main window.
private void button2_Click(object sender, EventArgs e)
{
    DataRow drNewRow = m_dtMedia.NewRow();
    drNewRow["File_Path"] = textBox1.Text;
    drNewRow["Subject"] = textBox2.Text;
    drNewRow["Title"] = textBox3.Text;
    drNewRow["Keyword_Search"] = textBox4.Text;
    drNewRow["MediaType"] = textBox5.Text;
    m_dtMedia.Rows.Add(drNewRow);
    m_daDataAdapter.Update(m_dtMedia);
    m_rowPosition = m_dtMedia.Rows.Count - 1;
    this.ShowCurrentRecord();
    this.Close();
}
This works when I type in just letters, and it saves and updates the database; saving images, on the other hand, doesn't.
The error message:
OleDbException was unhandled
The field is too small to accept the amount of data you attempted to add. Try inserting or pasting less data.
Uploading and Downloading BLOBs to Microsoft Access explains how to store an image in an Access database.
Microsoft Access stores BLOB data in a field with an OLE Object data type. If you are creating the table from a SQL statement, the data type is IMAGE (similar to SQL Server).
For example:
CREATE TABLE File (
FileName VARCHAR(255),
Size INT,
Type VARCHAR(255),
DateUploaded DATETIME,
File IMAGE,
CONSTRAINT File_PK PRIMARY KEY(FileName)
)
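As a sketch of writing to that table from C# via OleDb (the connection string and file handling are assumptions; note that OleDb parameters are positional, so the Add order must match the ? placeholders):

using System;
using System.Data.OleDb;
using System.IO;

// Sketch: parameterized insert into the File table defined above.
// The byte[] goes into the OLE Object (IMAGE) column.
public void InsertFile(string connectionString, string filePath)
{
    byte[] bytes = File.ReadAllBytes(filePath);
    using (var connection = new OleDbConnection(connectionString))
    using (var command = new OleDbCommand(
        "INSERT INTO [File] (FileName, Size, Type, DateUploaded, [File]) " +
        "VALUES (?, ?, ?, ?, ?)", connection))
    {
        // Names are ignored by OleDb; only the order matters.
        command.Parameters.AddWithValue("@FileName", Path.GetFileName(filePath));
        command.Parameters.AddWithValue("@Size", bytes.Length);
        command.Parameters.AddWithValue("@Type", Path.GetExtension(filePath));
        command.Parameters.AddWithValue("@DateUploaded", DateTime.Now);
        command.Parameters.AddWithValue("@File", bytes);
        connection.Open();
        command.ExecuteNonQuery();
    }
}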
I ask this question as a follow-up to this question.
A solution that uses bcp and xp_cmdshell has been posted here, but that is not my desired solution.
I am new to C# (I am a Delphi developer), but I was able to create a simple CLR stored procedure by following a tutorial.
My task is to move a file from the client file system to the server file system (the server can be accessed using remote IP, so I cannot use a shared folder as destination, this is why I need a CLR stored procedure).
So I plan to:
store from Delphi the file in a varbinary(max) column of a temporary table
call the CLR stored procedure to create a file at the desired path using the data contained in the varbinary(max) field
Imagine I need to move C:\MyFile.pdf to Z:\MyFile.pdf, where C: is a hard drive on the local system and Z: is a hard drive on the server. C is in New York, Z is in London, and there is no VPN between them, just an HTTPS connection.
Can someone modify the code below (not working) to make it work? Here I assume a table called MyTable with two fields: ID (int) and DATA (varbinary(max)). Note that it makes no difference whether the table is a real temporary table or just a table where I temporarily store the data. I would appreciate some exception handling code (so that I can manage an "impossible to save file" exception).
I would like to be able to write a new file, or overwrite the file if it already exists.
[Microsoft.SqlServer.Server.SqlProcedure]
public static void VarbinaryToFile(int TableId)
{
using (SqlConnection connection = new SqlConnection("context connection=true"))
{
connection.Open();
SqlCommand command = new SqlCommand("select data from mytable where ID = #TableId", connection);
command.Parameters.AddWithValue("#TableId", TableId);
// This was the sample code I found to run a query
//SqlContext.Pipe.ExecuteAndSend(command);
// instead I need something like this (THIS IS META_SYNTAX!!!):
SqlContext.Pipe.ResultAsStream.SaveToFile('z:\MyFile.pdf');
}
}
(One sub-question is: is this approach correct, or is there a way to pass the data directly to the CLR stored procedure so I don't need to use a temp table?)
If the sub-question's answer is no, could you describe an approach that avoids the temp table? Is there a better way than the one I describe above (temp table + stored procedure), i.e. a way to pass the data stream directly from the client application to the CLR stored procedure? (My files can be any size, including very big.)
is [there] a way to directly pass the data to the CLR stored procedure so I don't need to use a temp table?
Yes, it is both possible and rather simple to pass a binary file to a SQLCLR stored procedure and have it write the contents to disk, and not require first placing those contents into a table--temporary or real.
using System;
using System.Data.SqlTypes;
using System.IO;
using Microsoft.SqlServer.Server;

[Microsoft.SqlServer.Server.SqlProcedure]
public static void SaveFileToLocalDisk([SqlFacet(MaxSize = -1)] SqlBytes FileContents,
                                       SqlString DestinationPath)
{
    if (FileContents.IsNull || DestinationPath.IsNull)
    {
        throw new ArgumentException("Seriously?");
    }

    File.WriteAllBytes(DestinationPath.Value, FileContents.Buffer);
    return;
}
Or, since you said that the files are sometimes large, the following should be much easier on memory usage as it makes use of the streaming functionality:
[Microsoft.SqlServer.Server.SqlProcedure]
public static void SaveFileToLocalDiskStreamed(
    [SqlFacet(MaxSize = -1)] SqlBytes FileContents, SqlString DestinationPath)
{
    if (FileContents.IsNull || DestinationPath.IsNull)
    {
        throw new ArgumentException("Seriously?");
    }

    int _ChunkSize = 1024;
    byte[] _Buffer = new byte[_ChunkSize];

    using (FileStream _File = new FileStream(DestinationPath.Value, FileMode.Create))
    {
        long _Position = 0;
        long _BytesRead = 0;

        while (true)
        {
            _BytesRead = FileContents.Read(_Position, _Buffer, 0, _ChunkSize);
            _File.Write(_Buffer, 0, (int)_BytesRead);
            _Position += _ChunkSize;

            if (_BytesRead < _ChunkSize || (_Position >= FileContents.Length))
            {
                break;
            }
        }

        _File.Close();
    }

    return;
}
The assembly containing this code will, of course, need to have a PERMISSION_SET of EXTERNAL_ACCESS.
In both cases, you would execute them in the following manner:
EXEC dbo.SaveFileToLocalDiskStreamed 0x2A20202A, N'C:\TEMP\SaveToDiskTest.txt';
And "0x2A20202A" should give you a file containing the following 4 characters (asterisk, space, space, asterisk):
*  *
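As for passing the data directly from the client: since .NET 4.5, SqlClient can stream a Stream-valued parameter to the server, so the client never has to hold the whole file in memory either. A sketch of calling the streamed procedure that way (connection string and paths are illustrative):

using System.Data;
using System.Data.SqlClient;
using System.IO;

// Sketch: assigning a FileStream to a VarBinary(MAX) parameter makes
// SqlClient stream the file to the server instead of buffering it.
public static void SendFileToServer(string connectionString,
                                    string localPath, string serverPath)
{
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("dbo.SaveFileToLocalDiskStreamed", connection))
    using (var file = File.OpenRead(localPath))
    {
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.Add("@FileContents", SqlDbType.VarBinary, -1).Value = file;
        command.Parameters.Add("@DestinationPath", SqlDbType.NVarChar, 4000).Value = serverPath;
        connection.Open();
        command.ExecuteNonQuery();
    }
}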
See http://www.mssqltips.com/tip.asp?tip=1662
Why are you putting these files into the database? If you have an http/https connection, you can upload the file to the server, write it into a protected directory, and create a page that lists those files with a link to download each one. If you want to store some extra information, you can write it into the database. You just need to change the name of the file on the server side (use a unique name).
After some research I conclude that it makes no sense; it is better to drop support for 2005 and use the 2008 FILESTREAM feature. I can have conditional logic to choose between 2005 and 2008 and use FILESTREAM only for 2008.