I use OleDb to read an Access file (.accdb) into a DataSet. I don't know the table names or columns in advance. The regular implementation is:
public void GetAccessDB(string filepath)
{
    this.ConnectionString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + filepath;

    // get table names
    this.TableNames = new List<string>();
    using (System.Data.OleDb.OleDbConnection oledbConnection = new System.Data.OleDb.OleDbConnection(this.ConnectionString))
    {
        oledbConnection.Open();
        System.Data.DataTable dt = oledbConnection.GetOleDbSchemaTable(System.Data.OleDb.OleDbSchemaGuid.Tables, null);
        foreach (System.Data.DataRow row in dt.Rows)
        {
            string strSheetTableName = row["TABLE_NAME"].ToString();
            if (row["TABLE_TYPE"].ToString() == "TABLE")
                this.TableNames.Add(strSheetTableName);
        }
        oledbConnection.Close();
    }

    // load each table into the DataSet
    this.Dataset = new System.Data.DataSet();
    using (System.Data.OleDb.OleDbConnection oledbConnection = new System.Data.OleDb.OleDbConnection(this.ConnectionString))
    {
        foreach (string table in this.TableNames)
        {
            string command = string.Format("SELECT * FROM [{0}];", table);
            using (System.Data.OleDb.OleDbCommand cmd = new System.Data.OleDb.OleDbCommand(command, oledbConnection))
            {
                cmd.CommandType = System.Data.CommandType.Text;
                oledbConnection.Open();
                using (System.Data.OleDb.OleDbDataReader dr = cmd.ExecuteReader())
                {
                    this.Dataset.Load(dr, System.Data.LoadOption.OverwriteChanges, table);
                }
                oledbConnection.Close();
            }
        }
    }
}
But I need to get the Access file from a Stream, and I can't write it to disk temporarily, so what do you suggest?
I need an overload like GetAccessDB(Stream accessFile).
I searched and found this, but it isn't clear to me. In the end I need a DataSet containing all the tables in the Access file.
Does anyone know how to do this?
I don't know of any API in OleDb for working with in-memory databases. Maybe you could install a RAM disk?
If you have control over the MS SQL Server, that's good news. I currently see two alternatives:
Create a CLR assembly that will process the file (asynchronously is a good idea) once the insert is made in the uploaded-files table. It would create a temporary MS Access file on the server from the content of the uploaded file, open it with OleDb, parse it, and insert the extracted information into a second SQL table that maps it back to the uploaded-file record in the first table. You could then query the data from this second table.
The other option would be to send SQL Server a command that does the following:
Use the uploaded file bytes to create a file on the filesystem.
Then, use that file as a linked server.
Use SELECT to query the Access database.
You may have noticed that both options involve creating an (at least temporary) file on the SQL Server.
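If a temporary file turns out to be acceptable after all (the point above being that some file is unavoidable), the application-side version of that step could look roughly like this, reusing the GetAccessDB(string) method from the question; the temp-file location and cleanup are assumptions, not part of either option above:

public void GetAccessDB(System.IO.Stream accessFile)
{
    // Sketch: materialize the stream as a temporary .accdb file so the ACE
    // OLE DB provider can open it, then reuse the file-based overload.
    string tempPath = System.IO.Path.Combine(
        System.IO.Path.GetTempPath(),
        System.IO.Path.GetRandomFileName() + ".accdb");
    try
    {
        using (var fileStream = System.IO.File.Create(tempPath))
        {
            accessFile.CopyTo(fileStream);
        }
        GetAccessDB(tempPath);   // file-based overload from the question
    }
    finally
    {
        if (System.IO.File.Exists(tempPath))
            System.IO.File.Delete(tempPath);
    }
}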
I have seen an example of PowerShell exporting data to CSV. I'm wondering if there is any way to do the same in C#. Below is the code with which I can get the database and table names; I'm not sure how to access the data and export it to CSV.
Reference: https://learn.microsoft.com/en-us/analysis-services/tom/tom-pbi-datasets?view=asallproducts-allversions
// create the connection string
string workspaceConnection = "powerbi://api.powerbi.com/v1.0/myorg/workspace";
string connectString = $"DataSource={workspaceConnection};";
// connect to the Power BI workspace referenced in connection string
Server server = new Server();
server.Connect(connectString);
// enumerate through datasets in workspace to display their names
foreach (Database database in server.Databases)
{
//get all the Databases and Tables
}
The code snippet below creates the data adapter, which we can then use to fill a DataTable:
AdomdDataAdapter adapter = new AdomdDataAdapter($"EVALUATE '{table}'", connectString);
System.Data.DataTable tbldata = new System.Data.DataTable();
adapter.Fill(tbldata);
However, it is not working. Please help.
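For the CSV part itself, once tbldata is filled I assume the export would look something like this rough sketch (the output path is just a placeholder and the quoting is naive):

using System.Data;
using System.Linq;
using System.Text;

// Sketch: dump a filled DataTable to a CSV file. No escaping of embedded
// quotes or commas is done here, so harden this for real data.
static void WriteDataTableToCsv(DataTable table, string path)
{
    var sb = new StringBuilder();

    // header row from the column names
    sb.AppendLine(string.Join(",", table.Columns.Cast<DataColumn>().Select(c => c.ColumnName)));

    // one CSV line per data row
    foreach (DataRow row in table.Rows)
        sb.AppendLine(string.Join(",", row.ItemArray.Select(v => "\"" + v + "\"")));

    System.IO.File.WriteAllText(path, sb.ToString());
}

// e.g. WriteDataTableToCsv(tbldata, @"C:\temp\dataset_table.csv");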
I have a few thousand attachments saved in a SQL Server database in a column of type varbinary(max). I want to retrieve all these files and save them to a local drive.
What is the best way to achieve that? I am not looking for code specifically, but trying to understand all the options so I can do this in C#.
Suggestions are appreciated. Thank you
You can do either of the following:
Write a SQL script that reads the data from the table and saves the files to disk.
Write a C# program that connects to the database, reads the data, and stores it as files on disk.
Use the row Id as part of the file name to keep the names unique if you are storing all the files in a single folder.
Here's example code.
Note: the System.Data.SqlClient namespace is not referenced by default in a .NET Core project (unlike .NET Framework); you have to add the System.Data.SqlClient NuGet package to the project manually.
using System.Data.SqlClient;
using System.IO;

var connectionString = "Data Source=localhost;Initial Catalog=MyDatabase;Integrated Security=True;";
var outputFolder = @"C:\temp\";

using var conn = new SqlConnection(connectionString);
conn.Open();

var query = "select DocumentId, Contents from DocumentFile where ID >= 1234";
using var cmd = new SqlCommand(query);
cmd.Connection = conn;

using var reader = cmd.ExecuteReader();
if (!reader.HasRows) throw new Exception("No rows!");

while (reader.Read())
{
    var fileName = $"{reader["DocumentId"]}.pdf";
    if (reader["Contents"] is DBNull) throw new Exception("Contents is null");

    var data = (byte[])reader["Contents"];
    using var writer = new BinaryWriter(File.OpenWrite(Path.Combine(outputFolder, fileName)));
    writer.Write(data);
}
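One design note: the loop above pulls each attachment into memory as a byte[]. If some attachments are very large, a variation worth considering (a sketch only; column ordinals follow the query above) is to pass CommandBehavior.SequentialAccess and stream the column with SqlDataReader.GetStream, so each file is copied to disk without being buffered whole:

using System.Data;
using System.Data.SqlClient;
using System.IO;

// Sketch: stream each varbinary(max) value straight to disk instead of
// materializing it as a byte[]. Uses the same table/columns as the example above.
using var conn = new SqlConnection(connectionString);
conn.Open();

using var cmd = new SqlCommand("select DocumentId, Contents from DocumentFile where ID >= 1234", conn);

// SequentialAccess tells the reader not to buffer whole rows;
// columns must then be read in order (0 before 1).
using var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess);
while (reader.Read())
{
    var fileName = $"{reader.GetValue(0)}.pdf";
    if (reader.IsDBNull(1)) continue;           // skip rows with no content

    using var blob = reader.GetStream(1);       // streams the varbinary(max) column
    using var file = File.Create(Path.Combine(outputFolder, fileName));
    blob.CopyTo(file);
}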
I have a large Excel file (530K rows with a lot of columns); it ends up being 247 MB in .xlsb format. I am attempting to import it into SQL Server using SqlBulkCopy in C#, but the data reader runs out of memory before it even starts reading the file, as soon as I call ExecuteReader().
string exlConnString = $"Provider=Microsoft.ACE.OLEDB.12.0;Data Source={_filepath};Extended Properties=\"Excel 12.0;HDR=YES;\"";
string sqlQuery = $"SELECT * FROM [{SheetName}]";

using (OleDbConnection conn = new OleDbConnection(exlConnString))
{
    OleDbCommand exlCmd = new OleDbCommand(sqlQuery, conn);
    conn.Open();

    OleDbDataReader dr = exlCmd.ExecuteReader(); // <-- NEVER GETS PAST THIS LINE BEFORE RUNNING OUT OF MEMORY

    SqlBulkCopy bulkCopy = new SqlBulkCopy(sqlConnString);
    bulkCopy.DestinationTableName = TableName;
    while (dr.Read())
    {
        bulkCopy.WriteToServer(dr);
    }
    dr.Close();
}
I am running in x86 mode because I was getting an error that the ACE database engine was not installed on my machine, and corporate policy restrictions prevent me from downloading and installing what is needed to run in x64 mode.
The code works perfectly fine when I test it on smaller files, but not on this bigger file, so it is definitely the file size causing the issue. Any suggestions or help would be appreciated. It doesn't make much sense that a bulk copy runs out of memory when it is meant for handling large sets of data, which also means the file size is going to be large...
And yes, I know I SHOULD be able to import this using OPENROWSET or OPENDATASOURCE in SQL Server, but THAT is ALSO turned off and they will not enable it, so that is not an option.
Your problem is this:
When you call ExecuteReader, the data reader attempts to read all of the data from your Excel file into memory. You can think of this as a quirk of working with Excel through the OLE DB provider.
So my suggestion is to work with CSV files instead of Excel, because a CSV file can be read and parsed line by line. For this I would recommend the CsvHelper library.
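For illustration, a rough sketch of that approach, assuming the CsvHelper NuGet package and reusing sqlConnString and TableName from the question; the file path, batch size and all-string columns are assumptions:

using System.Data;
using System.Data.SqlClient;
using System.Globalization;
using System.IO;
using CsvHelper;   // CsvHelper NuGet package

// Sketch: stream a CSV file into SQL Server in fixed-size batches so the
// whole file never has to sit in memory at once.
const int batchSize = 10000;

using var reader = new StreamReader(@"C:\temp\export.csv");
using var csv = new CsvReader(reader, CultureInfo.InvariantCulture);

csv.Read();
csv.ReadHeader();

// build a DataTable whose (string) columns mirror the CSV header
var table = new DataTable();
foreach (var name in csv.HeaderRecord)
    table.Columns.Add(name, typeof(string));

using var bulkCopy = new SqlBulkCopy(sqlConnString) { DestinationTableName = TableName };

while (csv.Read())
{
    var row = table.NewRow();
    for (int i = 0; i < table.Columns.Count; i++)
        row[i] = csv.GetField(i);
    table.Rows.Add(row);

    if (table.Rows.Count >= batchSize)
    {
        bulkCopy.WriteToServer(table);   // push this batch, then start a new one
        table.Clear();
    }
}

if (table.Rows.Count > 0)
    bulkCopy.WriteToServer(table);       // final partial batch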
Refer to this code. Here dtExcelData is a DataTable variable and da is an OleDbDataAdapter variable.
string excelConnectionString = $"Provider=Microsoft.ACE.OLEDB.12.0;Data Source={_filepath};Extended Properties='Excel 12.0;HDR=YES';";

// Create connection to the Excel workbook
using (OleDbConnection connection = new OleDbConnection(excelConnectionString))
{
    connection.Open();
    da = new OleDbDataAdapter("SELECT * FROM [Sheet1$]", connection);
    da.Fill(dtExcelData);

    // store the data in a SQL Server database table
    // the "conString" connection string below is defined in app.config
    // (the SQL Server connection string used to store the data)
    string str = ConfigurationManager.ConnectionStrings["conString"].ConnectionString;
    using (SqlConnection con = new SqlConnection(str))
    {
        // bulk copy to SQL Server
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(con))
        {
            bulkCopy.DestinationTableName = "TableName";
            con.Open();
            bulkCopy.WriteToServer(dtExcelData);
            con.Close();
        }
    }
    connection.Close();
}
Mark it as an answer if it is useful to you. :)
I'm trying to restore a database from a .bak file. I found some code on how to do it programmatically, but I'm not sure what I'm doing wrong. I'm getting this error:
Error:
Restore failed for Server 'www.freegamedata.com'.
I assume it's because I'm remotely connected? I'm not sure. The .bak file is not on the server machine. I'm trying to build a desktop application that will install my database on the user's server using my file. Here is my code:
private void Restore_Database()
{
    try
    {
        Server server = new Server(Properties.Settings.Default.SQL_Server);
        string filename = "Test.bak";
        string filepath = System.IO.Directory.GetCurrentDirectory() + "\\file\\" + filename;

        Restore res = new Restore();
        res.Database = Properties.Settings.Default.SQL_Database;
        res.Action = RestoreActionType.Database;
        res.Devices.AddDevice(filepath, DeviceType.File);
        res.PercentCompleteNotification = 10;
        res.ReplaceDatabase = true;
        res.PercentComplete += new PercentCompleteEventHandler(res_PercentComplete);
        res.SqlRestore(server);
    }
    catch (Exception ex)
    {
    }
}
I'm not sure if I'm going about this the correct way. I'd like to add my database with my data to the user's server as a base database. Am I doing something wrong? My connection string is good, so I know it's not a connection issue.
I have found a workaround for those who do not have local access to the server. This is a bit involved, so I hope I explain it correctly and that it makes sense.
Also note that you will need to export your data to an Excel spreadsheet before you do the steps listed below.
Exporting Data
Part 1:
Backup Your DATA!
This is a pretty simple process. Open SQL Server Management Studio and right-click on your database. Choose export data and export it as an Excel 2007 spreadsheet. I'm not going to give detailed steps on this part because it's pretty basic and you can Google it. Sorry for the inconvenience.
Part 2:
Delete your database for testing purposes, but make sure you have a working backup before you do.
Importing Data
Part 1:
You need to create a script that will build your database for you automatically. You can do this by logging into SQL Server Management Studio, right-clicking on the database, and choosing:
Tasks -> Generate Scripts
You should only need the default options. If you're like me, you'll want to exclude the users from the list, as I did. This will generate a large SQL script.
Part 2:
Next you will want to store this file in your solution/project. Make sure you right-click it and choose Copy always or Copy if newer (I think those are the options). Basically it just copies your file when you debug or build. This is critical because you will need to access this file to execute the script. Next you need to make a SQL function similar to mine to execute the script:
public bool SQLScript_ExecuteSQLScript(string ScriptLocation)
{
    try
    {
        // 5 minute timeout
        SqlConnection SQLConn = new SqlConnection(cn + "; Connection Timeout = 300;");
        string script = File.ReadAllText(ScriptLocation);
        Server server = new Server(new ServerConnection(SQLConn));
        server.ConnectionContext.ExecuteNonQuery(script);
        return true;
    }
    catch (Exception ex)
    {
        return false;
    }
}
In my code sample, note that I changed the timeout to 5 minutes. If you have a large script, you may need to adjust the timeout to make sure it fully executes.
Congrats you have rebuilt your database.
Part 3:
Load SQL Server Management Studio and make sure your database has been rebuilt successfully. You should see all your tables and stored procedures but no data. If so, great, you can continue. If not, go back and review your script. If you have SQL comments in your script, you may need to remove them; I had to in order for my script to execute without errors.
Part 4:
Now you need to import your data from the Excel spreadsheet you created earlier. If you're like me, you had multiple sheets. If you have multiple sheets, you will want to make a list and loop through each item in it to import the sheets (a sketch of that loop follows the import function below); if not, you can ignore my code on the list. I also put mine in a background worker, but you may not need to, depending on the size of your data. Also note I created a separate class containing my list, but you don't have to do that if you don't want to. My sheet names are Table_1, Table_2 and Table_3; yours will most likely be different.
Sample Sheet List:
public List<string> GetTestTableList()
{
    try
    {
        List<string> testlist = new List<string>();
        testlist.Add("Table_1");
        testlist.Add("Table_2");
        testlist.Add("Table_3");
        return testlist;
    }
    catch (Exception ex)
    {
        return null;
    }
}
Part 5:
Next we will import the data from Excel into SQL. This is a function I made, but you can modify it to meet your needs.
Function:
private bool Import_Data_Into_SQL(string filepath, string SheetName, string Database, string Schema)
{
    try
    {
        // sql table should match your sheet name in excel
        string sqltable = SheetName;

        // select all data from the sheet by name
        string exceldataquery = "select * from [" + SheetName + "$]";

        // create our connection strings - Excel 2007 - this may differ based on the Excel spreadsheet used
        string excelconnectionstring = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source='" + filepath + "';Extended Properties=Excel 8.0;";
        string sqlconnectionstring = Properties.Settings.Default.SQL_Connection;

        // series of commands to bulk copy data from the excel file into our sql table
        OleDbConnection oledbconn = new OleDbConnection(excelconnectionstring);
        OleDbCommand oledbcmd = new OleDbCommand(exceldataquery, oledbconn);
        oledbconn.Open();
        OleDbDataReader dr = oledbcmd.ExecuteReader();
        SqlBulkCopy bulkcopy = new SqlBulkCopy(sqlconnectionstring);
        bulkcopy.DestinationTableName = Database + "." + Schema + "." + sqltable;
        while (dr.Read())
        {
            bulkcopy.WriteToServer(dr);
        }
        dr.Close();
        oledbconn.Close();
        return true;
    }
    catch (Exception ex)
    {
        return false;
    }
}
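And to tie Parts 4 and 5 together, this is roughly the loop that feeds each sheet from the list into the function above (the file path, database and schema names are just examples):

// Sketch: import every sheet named in the list using the function above.
string excelPath = System.IO.Directory.GetCurrentDirectory() + "\\file\\Backup.xlsx";

foreach (string sheet in GetTestTableList())
{
    bool ok = Import_Data_Into_SQL(excelPath, sheet, "MyDatabase", "dbo");
    if (!ok)
    {
        // the import function returns false on failure; log or surface it here
    }
}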
I hope this helps. This was my workaround solution. Originally I wanted to import my data using the .bak file, but as pointed out above you can only do that if the SQL Server is local. So I hope this workaround helps those who are faced with a similar issue. I'm not marking this as the answer because the post above answers the question, but I'm posting it in case someone else needs the workaround. Thanks.
The restore file must be on the server. For installation, use a SQL script instead; it can be generated by SQL Server Management Studio (including the data).
Right-click on the database and choose "Tasks" - "Generate Scripts". On the second page of the wizard choose "Advanced" and find "Types of data to script". Select "Schema and data" and save the script to a file.
Then use this code to run the script against the database:
string scriptText = File.ReadAllText(scriptFile, Encoding.Default);
ExecuteBatch executeBatch = new ExecuteBatch();
StringCollection commandTexts = executeBatch.GetStatements(scriptText);

using (SqlConnection sqlConnection = new SqlConnection(conn))
{
    sqlConnection.InfoMessage += SqlConnection_InfoMessage;
    sqlConnection.Open();
    for (int i = 0; i < commandTexts.Count; i++)
    {
        try
        {
            log.InfoFormat("Executing statement {0}", i + 1);
            string commandText = commandTexts[i];
            using (SqlCommand sqlCommand = sqlConnection.CreateCommand())
            {
                log.Debug(commandText);
                sqlCommand.CommandText = commandText;
                sqlCommand.CommandTimeout = 300;
                int r = sqlCommand.ExecuteNonQuery();
                log.DebugFormat("{0} rows affected", r);
            }
        }
        catch (Exception ex)
        {
            log.Warn("Executing command failed", ex);
            try
            {
                sqlConnection.Open();
            }
            catch (Exception ex2)
            {
                log.Error("Cannot reopen connection", ex2);
            }
        }
    }
    sqlConnection.Close();
}
The Problem:
I have a web application where people can upload xml, xlsx, and csv files.
I then take their content and insert it into my Oracle DB.
Technical details:
I recently had a problem where I got an OutOfMemoryException when trying to use the data.
The previous developer created a list of lists over the data in order to manage it. However, this is what is giving us the OutOfMemoryException.
We are using the LinqToExcel library.
Sample code:
excel = new ExcelQueryFactory(excelFile);
IEnumerable<RowNoHeader> data = from row in excel.WorksheetNoHeader(sheetName)
                                select row;

List<List<string>> d = new List<List<string>>(data.Count());
foreach (RowNoHeader row in data)
{
    List<string> list = new List<string>();
    foreach (Cell cell in row)
    {
        string cellValue = cell.Value.ToString().Trim(' ').Trim(null);
        list.Add(cellValue);
    }
    d.Add(list);
}
I tried to change the code and did this instead:
string connectionstring = string.Format(@"Provider=Microsoft.ACE.OLEDB.12.0;Data Source={0};Extended Properties='Excel 12.0;HDR=YES;';", excelFile);
OleDbConnection connection = new OleDbConnection();
connection.ConnectionString = connectionstring;

OleDbCommand excelCommand = new OleDbCommand();
excelCommand.Connection = connection;
excelCommand.CommandText = String.Format("Select * FROM [{0}$]", sheetName);
connection.Open();

DataTable dtbl = CreateTable(TableColumns);
OleDbDataReader reader = excelCommand.ExecuteReader();
while (reader.Read())
{
    DataRow row = dtbl.NewRow();
    dtbl.Rows.Add(row);
}

using (OracleCommand command = new OracleCommand(selectCommand, _oracleConnection))
{
    using (OracleDataAdapter adapter = new OracleDataAdapter(command))
    {
        using (OracleCommandBuilder builder = new OracleCommandBuilder(adapter))
        {
            OracleTransaction trans = _oracleConnection.BeginTransaction();
            command.Transaction = trans;
            adapter.InsertCommand = builder.GetInsertCommand(true);
            adapter.Update(dtbl);
            trans.Commit();
        }
    }
}
However, I still get the same OutOfMemoryException.
I have read online that I should make my project x64 and use the following:
<runtime>
<gcAllowVeryLargeObjects enabled="true" />
</runtime>
However, I can't change my web application to run on x64.
My solution was to do this in batches, like this:
int rowCount = 0;
while (reader.Read())
{
    DataRow row = dtbl.NewRow();
    dtbl.Rows.Add(row);
    rowCount++;

    if (rowCount % _batches == 0 && rowCount != 0)
    {
        DBInsert(dtbl, selectCommand);
        dtbl = CreateTable(TableColumns);
    }
}

private void DBInsert(DataTable dt, string selectCommand)
{
    using (OracleCommand command = new OracleCommand(selectCommand, _oracleConnection))
    {
        using (OracleDataAdapter adapter = new OracleDataAdapter(command))
        {
            using (OracleCommandBuilder builder = new OracleCommandBuilder(adapter))
            {
                OracleTransaction trans = _oracleConnection.BeginTransaction();
                command.Transaction = trans;
                adapter.InsertCommand = builder.GetInsertCommand(true);
                adapter.Update(dt);
                trans.Commit();
            }
        }
    }
}
It works, however it is very slow. I was wondering if there is a way to either solve the memory problem serially or write the data in parallel.
I have tried to insert the data in parallel using threads, but this takes a lot of memory and throws an OutOfMemoryException as well.
Just don't load 1M rows into a DataTable. Use whatever bulk import mechanism is available to load a stream of rows. Oracle, like SQL Server, offers several ways to bulk import data.
Collections like List or DataTable use an internal buffer to store their data, which they reallocate at twice the size whenever it fills up. With 1M rows that leads to a lot of reallocations and a lot of memory fragmentation. The runtime may no longer be able to find a contiguous block of memory large enough to store 2M entries. That's why it's important to set the capacity parameter when creating a new List.
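As a small illustration of the capacity point (the sizes here are only indicative):

using System.Collections.Generic;

// Sketch: presizing the outer list avoids repeated grow-and-copy cycles.
var grown = new List<List<string>>();               // starts small and doubles many times on the way to 1M entries
var presized = new List<List<string>>(1_000_000);   // backing array allocated once up front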
Apart from that, it serves no purpose to load everything in memory and then send it to the database. It's actually faster to send the data as soon as each file is read, or as soon as a sufficiently large number of rows is loaded. Instead of trying to load 1M rows at once, read 500 or 1000 of them at a time and send them to the database.
Furthermore, Oracle's ADO.NET provider includes the OracleBulkCopy class, which works much like SqlBulkCopy for SQL Server. Its WriteToServer method can accept a DataTable or a DataReader. You can use the DataTable overload to send batches of items. An even better idea is to use the overload that accepts a reader and let the class collect the batches and send them to the database.
E.g.:
using (var bcp = new OracleBulkCopy(connectionString))
{
    bcp.BatchSize = 5000;
    bcp.DestinationTableName = "MyTable";

    // for each source/target column pair, add a mapping
    bcp.ColumnMappings.Add("ColumnA", "ColumnA");

    var reader = excelCommand.ExecuteReader();
    bcp.WriteToServer(reader);
}