Best way to retrieve PDF attachments saved as varbinary(max) from SQL Server and save to a local drive using C#

I have a few thousand attachments saved in a SQL Server database in a column of datatype varbinary(max). I want to retrieve all these files and save them to a local drive.
What is the best way to achieve that? I am not looking for code specifically but trying to understand all options so I can do this in C#.
Suggestions are appreciated. Thank you

You can do any of the following:
Write a SQL script which will read the data from the table and save the files to disk.
Write a C# program which will connect to the database, read the data, and save each row's contents as a file on disk.
Use the table's Id as part of the file name to keep it unique if you are storing all the files in a single folder.

Here's example code.
Note that the System.Data.SqlClient namespace is not referenced by a .NET Core project by default, unlike .NET Framework; you have to add the System.Data.SqlClient NuGet package to the project manually.
using System;
using System.Data.SqlClient;
using System.IO;

var connectionString = "Data Source=localhost;Initial Catalog=MyDatabase;Integrated Security=True;";
var outputFolder = @"C:\temp\";

using var conn = new SqlConnection(connectionString);
conn.Open();

var query = "select DocumentId, Contents from DocumentFile where ID >= 1234";
using var cmd = new SqlCommand(query, conn);

using var reader = cmd.ExecuteReader();
if (!reader.HasRows) throw new Exception("No rows!");

while (reader.Read())
{
    // Use the document id as the file name so each file is unique.
    var fileName = $"{reader["DocumentId"]}.pdf";

    if (reader["Contents"] is DBNull) throw new Exception("Contents is null");
    var data = (byte[])reader["Contents"];

    // Write the varbinary(max) contents out as a .pdf file.
    using var writer = new BinaryWriter(File.OpenWrite(Path.Combine(outputFolder, fileName)));
    writer.Write(data);
}
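If some of the PDFs are large, a variation worth considering is to stream each document to disk instead of loading it into a byte[] first. A minimal sketch, assuming the same table and connection details as above and that DocumentId is an integer key:

using System.Data;
using System.Data.SqlClient;
using System.IO;

// Sketch only: stream each varbinary(max) value straight to disk so large PDFs are
// never held fully in memory. Table and column names follow the example above.
var connectionString = "Data Source=localhost;Initial Catalog=MyDatabase;Integrated Security=True;";
var outputFolder = @"C:\temp\";

using var conn = new SqlConnection(connectionString);
conn.Open();

using var cmd = new SqlCommand("select DocumentId, Contents from DocumentFile where ID >= 1234", conn);

// SequentialAccess tells the reader not to buffer whole rows, which enables streaming of BLOB columns.
using var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess);
while (reader.Read())
{
    var fileName = $"{reader.GetInt32(0)}.pdf";   // assumes DocumentId is an int key
    if (reader.IsDBNull(1)) continue;             // skip rows without content

    using var blob = reader.GetStream(1);         // streams the Contents column in chunks
    using var file = File.Create(Path.Combine(outputFolder, fileName));
    blob.CopyTo(file);
}

CommandBehavior.SequentialAccess tells the reader not to buffer whole rows, and SqlDataReader.GetStream copies the varbinary(max) column to the file in chunks.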

Related

SQL Server CE: Suppress modifications of database file?

We have an application that has a local SQL Server CE database file. When we open the database but don't make any changes to it, the database file is changed anyway:
using (var connection = new SqlCeConnection("Data Source='data.sdf';File Mode='Shared Read';Encrypt=FALSE;LCID=1033"))
{
connection.Open();
using (var context = new DataContext(connection))
{
}
}
This changes some bytes at the very beginning of the sdf-file.
Is there any way to prevent this?
Yes, you can enable read-only mode in the connection string. You may also need to specify a temp path in this case:
string connectionString = ...;Mode = Read Only;Temp Path= ...;
More info.
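For completeness, a minimal sketch of what the read-only connection could look like in code; the data source and temp path below are placeholders:

using System.Data.SqlServerCe;

// Sketch only: open the .sdf in read-only mode so the engine does not modify the file;
// 'Temp Path' (a placeholder below) gives SQL CE somewhere writable for temporary data.
var connectionString =
    "Data Source='data.sdf';Mode='Read Only';Temp Path='C:\\temp';Encrypt=FALSE;LCID=1033";

using (var connection = new SqlCeConnection(connectionString))
{
    connection.Open();
    // read-only queries only; any write attempt will fail in this mode
}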

Connecting to mysql on 000webhost using C#

I'm simply trying to read what is in the database onto the console, but I always get an exception on the conn.Open() line. Here is all the code:
SqlConnectionStringBuilder conn_string = new SqlConnectionStringBuilder();
conn_string.DataSource = "mysql14.000webhost.com"; // Server
conn_string.UserID = "a7709578_codecal";
conn_string.Password = "xxxxx";
conn_string.InitialCatalog = "a7709578_codecal"; // Database name
SqlConnection conn = new SqlConnection(conn_string.ToString());
conn.Open();
SqlCommand cmd = new SqlCommand("Select name FROM Users");
SqlDataReader reader = cmd.ExecuteReader();
while (reader.Read())
{
Console.WriteLine("{1}, {0}", reader.GetString(0), reader.GetString(1));
}
reader.Close();
conn.Close();
if (Debugger.IsAttached)
{
Console.ReadLine();
}
You need to build the connection string manually or use MySqlConnectionStringBuilder. MySQL uses a different format than SQL Server and the SqlConnectionStringBuilder that you're using. You also need to use a MySQL library; SqlConnection, SqlCommand, etc. are all built specifically for SQL Server.
MySQL connectors
For a MySQL database you are using the wrong provider. The classes used in the posted code are for SQL Server. Your code should look like the below, with the MySQL provider classes:
MySqlConnectionStringBuilder conn_string = new MySqlConnectionStringBuilder();
conn_string.Server = "mysql14.000webhost.com";
conn_string.UserID = "a7709578_codecal";
conn_string.Password = "xxxxxxx";
conn_string.Database = "a7709578_codecal";
using (MySqlConnection conn = new MySqlConnection(conn_string.ToString()))
{
    conn.Open();
    // use MySqlCommand / MySqlDataReader here instead of SqlCommand / SqlDataReader
}
Check the related post on SO.
Also, to point out: you are selecting only one column from your table, as can be seen here:
new SqlCommand("Select name FROM Users");
whereas you are trying to retrieve two column values, which is not correct:
Console.WriteLine("{1}, {0}", reader.GetString(0), reader.GetString(1))
000webhost free servers do not allow external connections to the server database.
You can only use your database from PHP scripts stored on the server.
You can get data from the database using PHP and return it, so I advise calling a PHP script from C#, like an API.
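If you go that route, the C# side is just an ordinary HTTP call. A minimal sketch, where the URL and the response format are purely hypothetical:

using System;
using System.Net.Http;

// Sketch only (top-level statements): a PHP script on the 000webhost server queries
// MySQL and returns the result, for example as JSON; C# just consumes that HTTP endpoint.
// The URL below is a placeholder, not a real endpoint.
using var client = new HttpClient();
string json = await client.GetStringAsync("https://yoursite.000webhostapp.com/get_users.php");
Console.WriteLine(json);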

Visual studio mdf database structure modified then lost after publish

Service-based database is new to me. I would like to create a simple database application with:
Service-based database -> Dataset (mdf)
LINQ to SQL (L2S) Classes
This application will be installed on a lot of individual machines; every instance has its own mdf database.
Installation is done by Clickonce.
My problem is:
I publish my application and install it on user machines
Users put some data into the database
Turns out that we need another table or column
Publish the application again with the extended database and install on user machines
Users start with a new database and the original data is lost!
(If I do not modify the database structure, then all the data is still in the database after the next ClickOnce update.)
Questions:
If I make only ALTER TABLE or ADD TABLE-like modifications, is there any way to preserve the data during the next ClickOnce update?
Thank you in advance!
Dave
I have found a solution.
If you modify your database model in Visual Studio, ClickOnce will automatically recreate all tables after installing the newly published application with the modified tables.
ClickOnce saves the old database to a specific location:
if (ApplicationDeployment.IsNetworkDeployed && ApplicationDeployment.CurrentDeployment.DataDirectory != null)
{
    string preDatabase = Path.GetFullPath(Path.Combine(ApplicationDeployment.CurrentDeployment.DataDirectory, @".pre\sampledatabase.mdf"));
}
This .pre directory is created by ClickOnce. Always check whether this file exists. If it exists, you have to copy the data from the old tables into the new tables, OR you will lose all the data from the old tables!
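In code, that startup check might look roughly like the sketch below; CopyOldDataToNewDatabase is a hypothetical wrapper around the SqlBulkCopy routine shown further down:

// Sketch only: run the migration only when ClickOnce has left a previous database behind.
// preDatabase is the path computed above; CopyOldDataToNewDatabase is a hypothetical
// helper wrapping the SqlBulkCopy code that follows.
if (File.Exists(preDatabase))
{
    CopyOldDataToNewDatabase(preDatabase);
}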
How do you copy data from one database to another, very similar, database? My answer is the following: copy each table into the corresponding new table with SqlBulkCopy.
// Create source connection
using (
var source =
new SqlConnection(
String.Format(
#"Data Source=(LocalDB)\v11.0;AttachDbFilename={0};Integrated Security=True;",
preDatabase)))
{
source.Open();
// Create destination connection
using (var destination = new SqlConnection(Settings.Default.mdcdbConnectionString))
{
destination.Open();
DataTable dt = source.GetSchema("Tables");
foreach (string tablename in from DataRow row in dt.Rows select (string) row[2])
{
App.Logger.LogText(String.Format("Copying table {0}", tablename));
using (var cmd = new SqlCommand(String.Format("TRUNCATE TABLE {0}", tablename), destination))
{
cmd.ExecuteNonQuery();
//App.Logger.LogText(String.Format("truncate table {0}", tablename));
}
using (var cmd = new SqlCommand(String.Format("SELECT * FROM {0}", tablename), source))
{
using (SqlDataReader reader = cmd.ExecuteReader())
{
var bulkData = new SqlBulkCopy(destination)
{
DestinationTableName = tablename
};
// Set destination table name
bulkData.WriteToServer(reader);
// Close objects
bulkData.Close();
//App.Logger.LogText(String.Format("Copy success {0}", tablename));
}
}
}
destination.Close();
}
source.Close();
}
Wish you good coding!
Dave

Use OLEDB to read AccessFile from Stream to DataSet

I use OleDb to read an Access file (.accdb) into a DataSet. I don't know the table names or columns in advance. The regular implementation is:
public void GetAccessDB(string filepath){
this.ConnectionString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source = " + filepath;
// get Table Names
this.TableNames = new List<string>();
using (System.Data.OleDb.OleDbConnection oledbConnection = new System.Data.OleDb.OleDbConnection(this.ConnectionString))
{
oledbConnection.Open();
System.Data.DataTable dt = null;
dt = oledbConnection.GetOleDbSchemaTable(System.Data.OleDb.OleDbSchemaGuid.Tables, null);
foreach (System.Data.DataRow row in dt.Rows)
{
string strSheetTableName = row["TABLE_NAME"].ToString();
if (row["TABLE_TYPE"].ToString() == "TABLE")
this.TableNames.Add(strSheetTableName);
}
oledbConnection.Close();
}
this.Dataset = new System.Data.DataSet();
using (System.Data.OleDb.OleDbConnection oledbConnection = new System.Data.OleDb.OleDbConnection(this.ConnectionString))
{
foreach (string table in this.TableNames)
{
string command = string.Format("SELECT * FROM {0};", table);
using (System.Data.OleDb.OleDbCommand cmd = new System.Data.OleDb.OleDbCommand(command, oledbConnection))
{
cmd.CommandType = System.Data.CommandType.Text;
oledbConnection.Open();
System.Data.OleDb.OleDbDataReader dr = cmd.ExecuteReader();
this.Dataset.Load(dr, System.Data.LoadOption.OverwriteChanges, table);
oledbConnection.Close();
}
}
}
}
But I need to get the Access file from a Stream, and I can't write it to disk temporarily, so what is your suggestion?
I need an overload along the lines of GetAccessDB(Stream AccessFile).
I searched and found this, but it's not clear to me; in the end I need to get a DataSet containing all the tables in the Access file.
Does anyone know about this?
I don't know of any API in OleDb for working with in-memory databases. Maybe you could install a RAM disk?
If you have control over the MS SQL Server, that's good news. I currently see 2 alternatives:
Create a CLR assembly that will process the file (asynchronously is a good idea) once the insert is made in the uploaded-files table. It would create a temporary MS Access file on the server using the content of the uploaded file, then open it with OleDb, parse it, and insert the extracted information into a second SQL table that maps it to the uploaded-file record in the first table. You could then go and look for the data in this second table.
Another option would be to send to the SQL a command which will do the following:
Use the uploaded file bytes to create a file on the filesystem.
Then, use the file as a linked server
Use SELECT to query the Access database
You may have noticed that both options involve creating a (at least temporary) file on the SQL Server.
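Whichever option you pick, the core step is the same: turn the uploaded bytes back into a physical .accdb file that the ACE provider can open. A minimal sketch, where GetUploadedFileBytes is a hypothetical helper returning the uploaded content:

using System;
using System.Data.OleDb;
using System.IO;

// Sketch only: the ACE OLE DB provider can only open a physical file, so wherever this
// runs (client, or a CLR assembly on the server) the uploaded bytes have to be written
// to a temporary .accdb file first. GetUploadedFileBytes is a hypothetical helper that
// returns the file content from the stream / uploaded-files table.
byte[] uploadedBytes = GetUploadedFileBytes();
string tempPath = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName() + ".accdb");

File.WriteAllBytes(tempPath, uploadedBytes);
try
{
    string connectionString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + tempPath;
    using (var connection = new OleDbConnection(connectionString))
    {
        connection.Open();
        // enumerate tables and load the DataSet exactly as in GetAccessDB above
    }
}
finally
{
    File.Delete(tempPath);   // remove the temporary copy
}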

Reading large volume of data from oracle database and export it as .dat file using C#

We have a query that will be executed on a monthly basis and returns data of size 1GB.
Query used here is just a select query with inner joins, no cursor involved.
Currently they are executing this query in Toad and exporting the data from the output window as a .dat file.
Please note that doing this manually using Toad takes 2 hours.
After that they change the header text in the .dat file to use meaningful names before sharing it with our clients.
I want to automate this process by creating an exe that will do this.
The code snippet looks like the below:
using (OracleConnection conn = new OracleConnection())
{
conn.ConnectionString = ConfigurationManager.ConnectionStrings["connectionString"].ConnectionString;
conn.Open();
using (OracleCommand cmd = new OracleCommand(commandText))
{
cmd.Connection = conn;
using (OracleDataReader dtReader = cmd.ExecuteReader())
{
outputContent = new StringBuilder();
while (dtReader != null && dtReader.Read())
{
for (int i = 0; i < dtReader.FieldCount; i++)
{
outputContent.Append(dtReader[i]);
outputContent.Append(delimiter);
}
outputContent = outputContent.Replace(delimiter, Environment.NewLine, outputContent.Length - 1, 1);
}
}
}
}
outputPath = string.Format(ConfigurationManager.AppSettings["OutputPath"], DateTime.Now.Ticks);
outputStream = new StreamWriter(outputPath, true);
//Export
outputStream.Write(outputContent.ToString());
outputStream.Close();
From the log, I got to know that the ExecuteReader statement completes within seconds.
But reading the data from the data reader throws "ORA-03113: end-of-file on communication channel
at System.Data.OracleClient.OracleConnection.CheckError(OciErrorHandle errorHandle, Int32 rc)" after 8 hours.
Could anyone please let me know whether the above approach is good for handling data of 1 GB in size,
or is there any better way of doing this?
Thanks,
Gayathri
Maybe you can try CommandBehavior.SequentialAccess.
From MSDN: "Use SequentialAccess to retrieve large values and binary data."
A sample of how to use it.
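Combined with writing each row out as it is read, instead of collecting everything in a StringBuilder first, that could look roughly like the sketch below; commandText and delimiter are placeholders standing in for the values used in the question's code:

using System;
using System.Configuration;
using System.Data;
using System.Data.OracleClient;   // same (now deprecated) provider the question uses; ODP.NET would be the modern choice
using System.IO;

// Sketch only: write each row to the .dat file as it is read, so the 1 GB result set is
// never accumulated in memory, and request sequential access for large column values.
string commandText = "...";     // placeholder for the actual select query
string delimiter = "|";         // placeholder delimiter
string outputPath = string.Format(ConfigurationManager.AppSettings["OutputPath"], DateTime.Now.Ticks);

using (var conn = new OracleConnection(ConfigurationManager.ConnectionStrings["connectionString"].ConnectionString))
using (var cmd = new OracleCommand(commandText, conn))
using (var writer = new StreamWriter(outputPath, true))
{
    conn.Open();
    using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        while (reader.Read())
        {
            for (int i = 0; i < reader.FieldCount; i++)
            {
                if (i > 0) writer.Write(delimiter);
                writer.Write(reader[i]);
            }
            writer.WriteLine();
        }
    }
}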
You can export the data directly from a PL/SQL procedure and have a shell script (instead of an exe) that launches it from SQL*Plus.
See this question on SO for what to put in the procedure to export the data.
