How can I save a DataTable to a .DBF? - c#

I've been working on a program to read a dbf file, mess around with the data, and save it back to dbf. The problem that I am having is specifically to do with the writing portion.
private const string constring = "Driver={Microsoft dBASE Driver (*.dbf)};"
+ "SourceType=DBF;"
+ "DriverID=277;"
+ "Data Source=¿;"
+ "Extended Properties=dBASE IV;";
private const string qrystring = "SELECT * FROM [¿]";
public static DataTable loadDBF(string location)
{
string filename = ConvertLongPathToShort(Path.GetFileName(location));
DataTable table = new DataTable();
using(OdbcConnection conn = new OdbcConnection(RTN(constring, filename)))
{
conn.Open();
table.Load(new OdbcCommand(RTN(qrystring, filename), conn).ExecuteReader());
conn.Close();
}
return table;
}
private static string RTN(string stmt, string tablename)
{ return stmt.Replace("¿", tablename); }
[DllImport("Kernel32", CharSet = CharSet.Auto)]
static extern Int32 GetShortPathName(
String path, // input string
StringBuilder shortPath, // output string
Int32 shortPathLength); // StringBuilder.Capacity
public static string ConvertLongPathToShort(string longPathName)
{
StringBuilder shortNameBuffer;
int size;
shortNameBuffer = new StringBuilder();
size = GetShortPathName(longPathName, shortNameBuffer, shortNameBuffer.Capacity);
if (size >= shortNameBuffer.Capacity)
{
shortNameBuffer.Capacity = size + 1;
GetShortPathName(longPathName, shortNameBuffer, shortNameBuffer.Capacity);
}
return shortNameBuffer.ToString();
}
This is what I'm working with. I've tried a number of methods to write a new file, none of them productive. To be honest, while normally I would be an advocate of form and function, I just want the damn thing to work; this app is supposed to do one very specific thing, it's not going to simulate weather.
-=# Edit #=-
I've since discontinued the app due to time pressure, but before I scrapped it I realised that the particular format of dbf I was working with had no primary key information. This of course meant that I had to essentially read the data out to a DataTable, mess with it, then wipe all the records in the dbf and insert everything from scratch.
Screw that for a lark.
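Before shelving it, this is roughly the shape the wipe-and-reinsert would have taken. Untested sketch only, reusing the constring, RTN() and ConvertLongPathToShort() helpers above; the positional-parameter handling is an assumption, not something I verified against the dBASE driver.
public static void SaveDBF(string location, DataTable table)
{
    string filename = ConvertLongPathToShort(Path.GetFileName(location));
    using (OdbcConnection conn = new OdbcConnection(RTN(constring, filename)))
    {
        conn.Open();
        // Wipe every existing record first.
        using (OdbcCommand wipe = new OdbcCommand(RTN("DELETE FROM [¿]", filename), conn))
        {
            wipe.ExecuteNonQuery();
        }
        // Build "INSERT INTO [table] VALUES (?,?,...)" with one positional marker per column.
        string[] markers = new string[table.Columns.Count];
        for (int i = 0; i < markers.Length; i++) markers[i] = "?";
        string insertSql = RTN("INSERT INTO [¿] VALUES (" + string.Join(",", markers) + ")", filename);
        foreach (DataRow row in table.Rows)
        {
            using (OdbcCommand insert = new OdbcCommand(insertSql, conn))
            {
                for (int i = 0; i < table.Columns.Count; i++)
                {
                    // Parameter names are ignored by ODBC; binding is positional.
                    insert.Parameters.AddWithValue("@p" + i, row[i]);
                }
                insert.ExecuteNonQuery();
            }
        }
    }
}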

For people coming here in the future: I wrote this today and it works well. The filename is without the extension (.dbf). The path (used for the connection) is the directory path only (no file). You can add your DataTable to a DataSet and pass it in. Also, some of my data types are FoxPro data types and may not be compatible with all DBF files. Hope this helps.
public static void DataSetIntoDBF(string fileName, DataSet dataSet)
{
ArrayList list = new ArrayList();
if (File.Exists(Path + fileName + ".dbf"))
{
File.Delete(Path + fileName + ".dbf");
}
string createSql = "create table " + fileName + " (";
foreach (DataColumn dc in dataSet.Tables[0].Columns)
{
string fieldName = dc.ColumnName;
string type = dc.DataType.ToString();
switch (type)
{
case "System.String":
type = "varchar(100)";
break;
case "System.Boolean":
type = "varchar(10)";
break;
case "System.Int32":
type = "int";
break;
case "System.Double":
type = "Double";
break;
case "System.DateTime":
type = "TimeStamp";
break;
}
createSql = createSql + "[" + fieldName + "]" + " " + type + ",";
list.Add(fieldName);
}
createSql = createSql.Substring(0, createSql.Length - 1) + ")";
OleDbConnection con = new OleDbConnection(GetConnection(Path));
OleDbCommand cmd = new OleDbCommand();
cmd.Connection = con;
con.Open();
cmd.CommandText = createSql;
cmd.ExecuteNonQuery();
foreach (DataRow row in dataSet.Tables[0].Rows)
{
string insertSql = "insert into " + fileName + " values(";
for (int i = 0; i < list.Count; i++)
{
insertSql = insertSql + "'" + ReplaceEscape(row[list[i].ToString()].ToString()) + "',";
}
insertSql = insertSql.Substring(0, insertSql.Length - 1) + ")";
cmd.CommandText = insertSql;
cmd.ExecuteNonQuery();
}
con.Close();
}
private static string GetConnection(string path)
{
return "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + path + ";Extended Properties=dBASE IV;";
}
public static string ReplaceEscape(string str)
{
str = str.Replace("'", "''");
return str;
}
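For clarity, a hypothetical usage sketch; the static Path property holding the output directory and the names below are my own assumptions, not part of the snippet:
// Hypothetical usage only: "Path" is the directory property described above; names are made up.
DataTable myTable = LoadMyData();   // however you built/modified your DataTable
DataSet ds = new DataSet();
ds.Tables.Add(myTable);
Path = @"C:\exports\";              // directory only, no file name
DataSetIntoDBF("customers", ds);    // writes C:\exports\customers.dbf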

What kind of dbf file are you working with? (There are several, e.g. dBase, FoxPro, etc., that are not 100% compatible.) I have gotten this to work with the Microsoft Visual FoxPro OleDB Provider from C#; you might give that a shot instead of using the dBase ODBC driver.
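For example, a rough, untested sketch of writing a free-table DBF through that provider (assumes VFPOLEDB is installed; the folder, table and column names are made up):
// Sketch only: create a free-table DBF and insert a row via the Visual FoxPro OLE DB provider.
using (OleDbConnection conn = new OleDbConnection(@"Provider=VFPOLEDB.1;Data Source=C:\data\;"))
{
    conn.Open();
    using (OleDbCommand create = new OleDbCommand(
        "CREATE TABLE mytable (name C(100), amount N(10,2))", conn))
    {
        create.ExecuteNonQuery();   // produces C:\data\mytable.dbf
    }
    using (OleDbCommand insert = new OleDbCommand(
        "INSERT INTO mytable (name, amount) VALUES (?, ?)", conn))
    {
        // OLE DB parameters are positional; the names are just labels.
        insert.Parameters.AddWithValue("@name", "example");
        insert.Parameters.AddWithValue("@amount", 12.34m);
        insert.ExecuteNonQuery();
    }
}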

Using ADO.Net to read and write dbf files turns out to be really slow so I would suggest you use an alternative approach.
One option would be to use the old DAO 3.6 library. This is much faster and just as compatible, but it depends on a COM object to work.
A better approach would be to use the open source DBFExporter component. It may require some code to set up (you need a class with properties that describe your recordset, and the properties must have certain attributes set), but after that it works really well. It is fast to use, but it doesn't read dbf files. The component is licensed under the LGPL, so you should be able to use it in commercial code.

Related

Can't export large data from oracle to excel file using c#

I have a problem extracting a large amount of data from an Oracle table into C#, and I couldn't find the solution myself.
For this task I wrote C# code that loads data from an Oracle procedure (which returns a cursor) into an Excel file, and at first it worked.
But when I tried to load a bigger table (about 20 columns and 90,000 rows), it just didn't work.
The script doesn't fail with an error, but the data is not inserted into the Excel file.
I tried to load 10,000 rows at a time and then save the results, but again only 30,000 rows were inserted.
I monitored the counter in the loop: it runs correctly and reaches the needed 90,000, and ExecuteNonQuery() always returned the value 10,000. But when I open the Excel file, there are only 30,000 rows there.
Can you please help me catch the error? Or maybe somebody has met the same problem and can advise me what to do or what to read.
Thank you for any help!
I haven't included the connection string, but I think it's correct, because the script works correctly with a small DataTable.
public static void Main()
{
string datetime = DateTime.Now.ToString("yyyy-MM-dd HH-mm-ss");
System.Threading.Thread.CurrentThread.CurrentUICulture = new System.Globalization.CultureInfo("en-US");
try
{
OleDbConnection Excel_OLE_Con = new OleDbConnection();
OleDbCommand Excel_OLE_Cmd = new OleDbCommand();
string qwe_constr = "connection string";
OracleConnection myADONETConnection = new OracleConnection(qwe_constr);
string connstring = "Provider=Microsoft.ACE.OLEDB.12.0;" + "Data Source=" + "E:\\qaz\\15.07.2016\\qwe" +
";" + "Extended Properties=\"Excel 12.0 Xml;HDR=YES;\"";
File.Delete("E:\\qaz\\15.07.2016\\qwe.xlsx");
//fill datatable with data for insert
myADONETConnection.Open();
OracleCommand cmd_proc = new OracleCommand();
cmd_proc.Connection = myADONETConnection;
cmd_proc.CommandType = System.Data.CommandType.StoredProcedure;
cmd_proc.CommandText = "procedure_name";
cmd_proc.Parameters.Add("p_show_del", OracleDbType.Int16).Value = 0;
cmd_proc.Parameters.Add("p_type", OracleDbType.Varchar2, 3).Value = "INV";
cmd_proc.Parameters.Add("p_errno", OracleDbType.Int16).Value = 157;
cmd_proc.Parameters.Add("outcur", OracleDbType.RefCursor).Direction = ParameterDirection.Output;
DataTable dt_with_data = new DataTable();
dt_with_data.Load(cmd_proc.ExecuteReader());
myADONETConnection.Close();
//string with column headers
string TableColumns = "";
foreach (DataColumn column in dt_with_data.Columns)
{
TableColumns += column + "],[";
}
// Replace most right comma from Columnlist
TableColumns = ("[" + TableColumns.Replace(",", " Text,").TrimEnd(','));
TableColumns = TableColumns.Remove(TableColumns.Length - 2);
//Use OLE DB Connection and Create Excel Sheet
Excel_OLE_Con.ConnectionString = connstring;
Excel_OLE_Con.Open();
Excel_OLE_Cmd.Connection = Excel_OLE_Con;
Excel_OLE_Cmd.CommandText = "Create table [sheet1] (" + TableColumns + ")";
Excel_OLE_Cmd.ExecuteNonQuery();
Excel_OLE_Con.Close();
//Write Data to Excel Sheet from DataTable dynamically
//string with command
Excel_OLE_Con.Open();
String sqlCommandInsert = "";
String sqlCommandValue = "";
foreach (DataColumn dataColumn in dt_with_data.Columns)
{
sqlCommandValue += dataColumn + "],[";
}
sqlCommandValue = "[" + sqlCommandValue.TrimEnd(',');
sqlCommandValue = sqlCommandValue.Remove(sqlCommandValue.Length - 2);
sqlCommandInsert = "INSERT into [sheet1] (" + sqlCommandValue + ") VALUES(";
int columnCount = dt_with_data.Columns.Count;
int i_qaz = 0;
foreach (DataRow row in dt_with_data.Rows)
{
i_qaz++;
Console.WriteLine(i_qaz.ToString());
string columnvalues = "";
for (int i = 0; i < columnCount; i++)
{
int index = dt_with_data.Rows.IndexOf(row);
columnvalues += "'" + dt_with_data.Rows[index].ItemArray[i].ToString().Replace("'", "''") + "',";
}
columnvalues = columnvalues.TrimEnd(',');
var command = sqlCommandInsert + columnvalues + ")";
Excel_OLE_Cmd.CommandText = command;
Excel_OLE_Cmd.ExecuteNonQuery();
}
}
catch (Exception exception)
{
// Create Log File for Errors
using (StreamWriter sw = File.CreateText("E:\\qaz\\15.07.2016\\qwe_" + datetime + ".log"))
{
sw.WriteLine(exception.ToString());
}
}
}
PS: Same question in Russian.

bulk upload text file using Microsoft.Jet.OLEDB

I have the following code. I tried everything mentioned online and still couldn't read data from the text file.
string path = @"D:\New folder\abc.txt";
string pathOnly = Path.GetDirectoryName(path);
string fileName = Path.GetFileName(path);
string excelConnectionString = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + pathOnly + @";Extended Properties=""text;HDR=YES;FMT=TabDelimited""";
OleDbConnection excelConnection = new OleDbConnection(excelConnectionString);
excelConnection.Open();
OleDbCommand cmd = excelConnection.CreateCommand();
cmd.CommandText = String.Format("SELECT * FROM [{0}]", fileName);
OleDbDataReader dReader;
dReader = cmd.ExecuteReader();
DateTime UploadedDate = DateTime.Now;
DataTable sourceData = new DataTable();
sourceData.Load(dReader);
DataColumn col = new DataColumn("UploadedDateCol", typeof(DateTime));
col.DefaultValue = UploadedDate;
sourceData.Columns.Add(col);
int x = sourceData.Rows.Count;
This x value is always 0. My PC is a 64-bit PC.
Otherwise, is there another library that I can use for bulk upload?
My .txt file is as below; the values are separated by a tab or a pipe (|):
0421230424 3391542691 5295963551 2755344586 12345678
As far as tab-delimited files are concerned, you should have a look at the FileHelpers library.
Don't reinvent the wheel; this library is very mature.
FileHelpers is a free and easy-to-use .NET library to
import/export data from fixed-length or delimited records in files,
strings or streams.
The idea is pretty simple:
You can strongly type your flat file (fixed or delimited) simply by
describing a class that maps to each record, and later read/write your
file as a strongly typed .NET array.
The library also has support for importing/exporting data from different
storages like Excel, Access, SqlServer, etc.
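To make that concrete, here is a hedged sketch of the strongly typed record class and the read call; the field names are invented and should be matched to the actual columns in abc.txt:
// Hedged FileHelpers sketch: one attribute-decorated class per record layout,
// then read the whole file back as a typed array.
using FileHelpers;

[DelimitedRecord("|")]
public class PhoneRecord
{
    public string Phone1;
    public string Phone2;
    public string Phone3;
    public string Phone4;
    public string Account;
}

// Reading the file:
var engine = new FileHelperEngine<PhoneRecord>();
PhoneRecord[] records = engine.ReadFile(@"D:\New folder\abc.txt");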
To overcome this problem, I defined the details in a Schema.ini file. If the file name is abc.txt, then you need to create a schema.ini file in the same location with a section named [abc.txt].
string iniFileMsg = "[" + newfileName + ".txt]";
StreamWriter sw = new StreamWriter(newFilePath + "/schema.ini", false);
sw.WriteLine(iniFileMsg);
sw.WriteLine("Format=Delimited(|)");
sw.WriteLine("ColNameHeader=True");
sw.Flush();
If you have other columns, then they need to be added as follows:
List<string> columns = excelobj.GetCSVColumnNames(excelUploadedFullPath);
// sw.WriteLine("Col1=Phone Text Width 10");
for (int i = 0; i < columns.Count; i++)
{
sw.WriteLine("Col" + (i + 1) + "=" + columns[i].Replace(" ", "_") + " Text Width 100");
}
sw.Flush();
sw.Close();
From GetCSVColumnNames() I got the column names. After that I uploaded the data using the method above.
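Putting the pieces together, a hedged sketch of writing a complete schema.ini next to the text file and then reading it through the Jet text driver; the column names are placeholders. (Note also that the Jet 4.0 provider is 32-bit only, so on a 64-bit machine the process needs to run as x86 or use the ACE provider instead.)
// Sketch: generate schema.ini beside abc.txt, then load the file into a DataTable via Jet.
// The section name in [ ] must match the text file's name exactly; column names are placeholders.
string folder = @"D:\New folder";
string file = "abc.txt";
using (StreamWriter sw = new StreamWriter(System.IO.Path.Combine(folder, "schema.ini"), false))
{
    sw.WriteLine("[" + file + "]");
    sw.WriteLine("Format=Delimited(|)");
    sw.WriteLine("ColNameHeader=True");
    sw.WriteLine("Col1=Phone1 Text Width 100");
    sw.WriteLine("Col2=Phone2 Text Width 100");
}
string connStr = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + folder +
                 @";Extended Properties=""text;HDR=YES;FMT=Delimited""";
using (OleDbConnection conn = new OleDbConnection(connStr))
using (OleDbCommand cmd = new OleDbCommand("SELECT * FROM [" + file + "]", conn))
{
    conn.Open();
    DataTable sourceData = new DataTable();
    sourceData.Load(cmd.ExecuteReader());
}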

C# Fatal error encountered during data read

I'm selecting about 20,000 records from the database and then I update them one by one.
I looked for this error and I saw that setting the CommandTimeout will help, but not in my case.
public void Initialize()
{
MySqlConnectionStringBuilder SQLConnect = new MySqlConnectionStringBuilder();
SQLConnect.Server = SQLServer;
SQLConnect.UserID = SQLUser;
SQLConnect.Password = SQLPassword;
SQLConnect.Database = SQLDatabase;
SQLConnect.Port = SQLPort;
SQLConnection = new MySqlConnection(SQLConnect.ToString());
}
public MySqlDataReader SQL_Query(string query)
{
MySqlCommand sql_command;
sql_command = SQLConnection.CreateCommand();
sql_command.CommandTimeout = int.MaxValue;
sql_command.CommandText = query;
MySqlDataReader query_result = sql_command.ExecuteReader();
return query_result;
}
public void SQL_NonQuery(string query)
{
MySqlCommand sql_command;
sql_command = SQLConnection.CreateCommand();
sql_command.CommandTimeout = int.MaxValue;
sql_command.CommandText = query;
sql_command.ExecuteNonQuery();
}
And here is my method which makes the select query:
public void CleanRecords()
{
SQLActions.Initialize();
SQLActions.SQL_Open();
MySqlDataReader cashData = SQLActions.SQL_Query("SELECT `cash`.`id`, SUM(`cash`.`income_money`) AS `income_money`, `cash_data`.`total` FROM `cash_data` JOIN `cash` ON `cash`.`cash_data_id` = `cash_data`.`id` WHERE `user`='0' AND `cash_data`.`paymentterm_id`='0' OR `cash_data`.`paymentterm_id`='1' GROUP BY `cash_data_id`");
while(cashData.Read()){
if(cashData["income_money"].ToString() == cashData["total"].ToString()){
UpdateRecords(cashData["id"].ToString());
}
}
SQLActions.SQL_Close();
}
And here is the method which makes the update:
public void UpdateRecords(string rowID)
{
SQLActions.Initialize();
SQLActions.SQL_Open();
SQLActions.SQL_NonQuery("UPDATE `cash_data` SET `end_date`='" + GetMeDate() + "', `user`='1' WHERE `id`='" + rowID + "'");
SQLActions.SQL_Close();
}
Changing the database structure is not an option for me.
I thought that setting the timeout to the max value of int would solve my problem, but it looks like this won't work in my case.
Any ideas? :)
EDIT:
The error which I get is "Fatal error encountered during data read".
UPDATE:
public void CleanRecords()
{
StringBuilder dataForUpdate = new StringBuilder();
string delimiter = "";
SQLActions.Initialize();
SQLActions.SQL_Open();
MySqlDataReader cashData = SQLActions.SQL_Query("SELECT `cash`.`id`, SUM(`cash`.`income_money`) AS `income_money`, `cash_data`.`total` FROM `cash_data` JOIN `cash` ON `cash`.`cash_data_id` = `cash_data`.`id` WHERE `user`='0' AND `cash_data`.`paymentterm_id`='0' OR `cash_data`.`paymentterm_id`='1' GROUP BY `cash_data_id`");
while (cashData.Read())
{
if (cashData["income_money"].ToString() == cashData["total"].ToString())
{
dataForUpdate.Append(delimiter);
dataForUpdate.Append("'" + cashData["id"].ToString() + "'");
delimiter = ",";
}
}
SQLActions.SQL_Close();
UpdateRecords(dataForUpdate.ToString());
}
public void UpdateRecords(string rowID)
{
SQLActions.Initialize();
SQLActions.SQL_Open();
SQLActions.SQL_NonQuery("UPDATE `cash_data` SET `end_date`='" + GetMeDate() + "', `user`='1' WHERE `id` IN (" + rowID + ")");
SQLActions.SQL_Close();
}
You may be able to use
UPDATE cash_data .... WHERE id IN (SELECT ....)
and do everything in one go. Otherwise, you could do it in two steps: first the select collects all the ids and the connection is closed, and then the update is done in one go with all the ids.
The code for the second option might look something like this:
public void CleanRecords()
{
StringBuilder builder = new StringBuilder();
string delimiter = "";
SQLActions.Initialize();
SQLActions.SQL_Open();
MySqlDataReader cashData = SQLActions.SQL_Query("SELECT `cash`.`id`, SUM(`cash`.`income_money`) AS `income_money`, `cash_data`.`total` FROM `cash_data` JOIN `cash` ON `cash`.`cash_data_id` = `cash_data`.`id` WHERE `user`='0' AND `cash_data`.`paymentterm_id`='0' OR `cash_data`.`paymentterm_id`='1' GROUP BY `cash_data_id`");
while(cashData.Read()){
if(cashData["income_money"].ToString() == cashData["total"].ToString()){
builder.Append(delimiter);
builder.Append("'" + cashData["id"].ToString() + "'");
delimiter = ",";
}
}
SQLActions.SQL_Close();
UpdateRecords(builder.ToString());
}
public void UpdateRecords(string rowIDs)
{
SQLActions.Initialize();
SQLActions.SQL_Open();
SQLActions.SQL_NonQuery("UPDATE `cash_data` SET `end_date`='" + GetMeDate() + "', `user`='1' WHERE `id` IN (" + rowIDs + ")";
SQLActions.SQL_Close();
}
There are multiple problems:
First: you are reading around 20K records using a data reader and then doing the updates one by one inside the reader loop itself. The reader holds the connection open until you are finished, so this is not a good way to do it. Solution: read the information using a data adapter instead.
Second: rather than updating row by row, you can update in bulk in one go. There are multiple options for bulk operations. In SQL Server you can do it either by sending the information in XML format or by using a Table-Valued Parameter (TVP) (http://www.codeproject.com/Articles/22205/ADO-NET-and-OPENXML-to-Perform-Bulk-Database-Opera); for MySQL see http://dev.mysql.com/doc/refman/5.5/en/load-xml.html.
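A hedged sketch of the first point, filling a DataTable up front with MySqlDataAdapter so no reader keeps the connection busy while the updates run (the query is the one from the question; MySql.Data.MySqlClient assumed):
// Sketch: read everything into memory first, then run the updates separately.
public DataTable LoadCashData(string connectionString)
{
    const string selectSql =
        "SELECT `cash`.`id`, SUM(`cash`.`income_money`) AS `income_money`, `cash_data`.`total` " +
        "FROM `cash_data` JOIN `cash` ON `cash`.`cash_data_id` = `cash_data`.`id` " +
        "WHERE `user`='0' AND `cash_data`.`paymentterm_id`='0' OR `cash_data`.`paymentterm_id`='1' " +
        "GROUP BY `cash_data_id`";
    DataTable table = new DataTable();
    using (MySqlConnection conn = new MySqlConnection(connectionString))
    using (MySqlDataAdapter adapter = new MySqlDataAdapter(selectSql, conn))
    {
        adapter.Fill(table);   // Fill opens and closes the connection itself
    }
    return table;
}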

How can I read the rows of an Excel csv or xls file into an ASP.NET app from a remote location using C#?

Here's my situation. I'm designing a program that takes Excel files (which may be in csv, xls, or xlsx format) from a remote network drive, processes the data, then outputs and stores the results of that process. The program provides a listbox of filenames that are obtained from the remote network drive folder using the method detailed in the accepted answer here. Once the user selects a filename from the listbox, I want the program to find the file and obtain the information from it to do the data processing. I have tried using this method to read the data from the Excel file while in a threaded security context, but that method just fails without giving any kind of error. It seems to not terminate. Am I going about this the wrong way?
Edit - (Final Notes: I have taken out the OleDbDataAdapter and replaced it with EPPlus handling.)
I was able to scrub sensitive data from the code, so here it is:
protected void GetFile(object principalObj)
{
if (principalObj == null)
{
throw new ArgumentNullException("principalObj");
}
IPrincipal principal = (IPrincipal)principalObj;
Thread.CurrentPrincipal = principal;
WindowsIdentity identity = principal.Identity as WindowsIdentity;
WindowsImpersonationContext impersonationContext = null;
if (identity != null)
{
impersonationContext = identity.Impersonate();
}
try
{
string fileName = string.Format("{0}\\" + Files.SelectedValue, @"RemoteDirectoryHere");
string connectionString = string.Format("Provider=Microsoft.ACE.OLEDB.14.0; data source={0}; Extended Properties=Excel 14.0;", fileName);
OleDbDataAdapter adapter = new OleDbDataAdapter("SELECT * FROM Sheet1", connectionString);
DataSet ds = new DataSet();
adapter.Fill(ds, "Sheet1");
dataTable = ds.Tables["Sheet1"];
}
finally
{
if (impersonationContext != null)
{
impersonationContext.Undo();
}
}
}
Additional Edit
Now xlsx files have been added to the mix.
Third Party
Third party solutions are not acceptable in this case (unless they allow unrestricted commercial use).
Attempts - (Final Notes: Ultimately I had to abandon OleDb connections.)
I have tried all of the different connection strings offered, and I have tried them with just one file type at a time. None of the connection strings worked with any of the file types.
Permissions
The User does have access to the file and its directory.
Your connection string might be the issue here. As far as I know, there isn't one that can read all of xls, csv, and xlsx. I think you're using the XLSX connection string.
When I read xls, I use the following connection string:
@"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + sFilePath + ";Extended Properties='Excel 8.0;HDR=YES;IMEX=1;'"
Having said that, I recommend using a 3rd party file reader/parser to read XLS and CSV since, from my experience, OleDbDataAdapter is wonky depending on the types of data that's being read (and how mixed they are within each column).
For XLS, try NPOI https://code.google.com/p/npoi/
For CSV, try http://www.codeproject.com/Articles/9258/A-Fast-CSV-Reader
For XLSX, try EPPlus http://epplus.codeplex.com/
I've had great success with the above libraries.
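Since the question's final note says the OleDbDataAdapter was eventually replaced with EPPlus handling, here is a hedged EPPlus sketch of pulling the first worksheet of an xlsx into a DataTable (the 1-based indexing shown is the classic EPPlus style and may differ in newer versions):
// Hedged EPPlus sketch (xlsx only): copy the first worksheet into a DataTable.
using OfficeOpenXml;

public static DataTable ReadXlsx(string filePath)
{
    DataTable table = new DataTable();
    using (ExcelPackage package = new ExcelPackage(new FileInfo(filePath)))
    {
        ExcelWorksheet sheet = package.Workbook.Worksheets[1];
        int rows = sheet.Dimension.End.Row;
        int cols = sheet.Dimension.End.Column;
        for (int c = 1; c <= cols; c++)
        {
            table.Columns.Add(sheet.Cells[1, c].Text);   // first row as headers
        }
        for (int r = 2; r <= rows; r++)
        {
            DataRow row = table.NewRow();
            for (int c = 1; c <= cols; c++)
            {
                row[c - 1] = sheet.Cells[r, c].Text;
            }
            table.Rows.Add(row);
        }
    }
    return table;
}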
Is it really important that you use an OleDb interface for this? I've always done it with Microsoft.Office.Interop.Excel, to wit:
using System;
using Microsoft.Office.Interop.Excel;
namespace StackOverflowExample
{
class Program
{
static void Main(string[] args)
{
var app = new Application();
var wkbk = app.Workbooks.Open(@"c:\data\foo.xls") as Workbook;
var wksht = wkbk.Sheets[1] as Worksheet; // not zero-based!
for (int row = 1; row <= 100; row++) // not zero-based!
{
Console.WriteLine("This is row #" + row.ToString());
for (int col = 1; col <= 100; col++)
{
Console.WriteLine("This is col #" + col.ToString());
var cell = wksht.Cells[row][col] as Range;
if (cell != null)
{
object val = cell.Value;
if (val != null)
{
Console.WriteLine("The value of the cell is " + val.ToString());
}
}
}
}
}
}
}
As you will be dealing with the xlsx extension, you should opt for the newer connection string.
public static string getConnectionString(string fileName, bool HDRValue, bool WriteExcel)
{
string hdrValue = HDRValue ? "YES" : "NO";
string writeExcel = WriteExcel ? string.Empty : "IMEX=1";
return "Provider=Microsoft.ACE.OLEDB.12.0;" + "Data Source=" + fileName + ";" + "Extended Properties=\"Excel 12.0 xml;HDR=" + hdrValue + ";" + writeExcel + "\"";
}
Above is the code for getting the connection string. The first argument expects the actual file path. The second argument decides whether to treat the first row's values as column headers. The third argument decides whether you want to open the connection to create and write data or simply read data; to read data, set it to false.
public static DataTable ReadData(string filePath, string sheetName, List<string> fieldsToRead, int startPoint, int endPoint)
{
DataTable dt = new DataTable();
try
{
string ConnectionString = ProcessFile.getConnectionString(filePath, false, false);
using (OleDbConnection cn = new OleDbConnection(ConnectionString))
{
cn.Open();
DataTable dbSchema = cn.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, null);
if (dbSchema == null || dbSchema.Rows.Count < 1)
{
throw new Exception("Error: Could not determine the name of the first worksheet.");
}
StringBuilder sb = new StringBuilder();
sb.Append("SELECT *");
sb.Append(" FROM [" + sheetName + fieldsToRead[0].ToUpper() + startPoint + ":" + fieldsToRead[1].ToUpper() + endPoint + "] ");
OleDbDataAdapter da = new OleDbDataAdapter(sb.ToString(), cn);
dt = new DataTable(sheetName);
da.Fill(dt);
if (dt.Rows.Count > 0)
{
foreach (DataRow row in dt.Rows)
{
string i = row[0].ToString();
}
}
cn.Dispose();
return dt;
}
}
catch (Exception)
{
}
return dt;
}
This is for reading 2007 Excel into dataset
DataSet ds = new DataSet();
try
{
string myConnStr = "";
myConnStr = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=MyDataSource;Extended Properties=\"Excel 12.0;HDR=YES\"";
OleDbConnection myConn = new OleDbConnection(myConnStr);
OleDbCommand cmd = new OleDbCommand("select * from [Sheet1$] ", myConn);
OleDbDataAdapter adapter = new OleDbDataAdapter();
adapter.SelectCommand = cmd;
myConn.Open();
adapter.Fill(ds);
myConn.Close();
}
catch
{ }
return ds;

Export SQL DataBase to WinForm DataSet and then to MDB Database using DataSet

My application is a WinForms app that relies on a database. At startup the application connects to an SQL database on the server and puts this information in a DataSet/DataTable.
If for some reason the database on the server is not accessible, the application has a built-in failover and it will get its information from the local database.
If, in a normal scenario, I start the tool, it will read from the SQL database, and if that has been updated on the server (a separate snippet checks this), it should make sure the local database is brought up to date, and this is where the problem starts (see below).
This part works fine and is added as context - this is where we connect to the SQL Database
public static DataSet dtsTableContents;
public static DataTable CreateDatabaseSQLConnection()
{
try
{
string strSqlConnectionString = "Data Source=MyLocation;Initial Catalog=MyCatalog;User=MyUser;Password=MyPassword;";
SqlCommand scoCommand = new SqlCommand();
scoCommand.Connection = new SqlConnection(strSqlConnectionString);
scoCommand.Connection.Open();
string strQueryToTable = "SELECT * FROM " + strTableName;
dtsTableContents = new DataSet();
SqlCommand scmTableInformation = new SqlCommand(strQueryToTable, scnConnectionToDatabase);
SqlDataAdapter sdaTableInformation = new SqlDataAdapter(scmTableInformation);
scnConnectionToDatabase.Open();
sdaTableInformation.Fill(dtsTableContents, strTableName);
DataTable dttTableInformation = dtsTableContents.Tables[strTableName];
scnConnectionToDatabase.Close();
return dttTableInformation;
}
catch
{
return null;
}
}
This snippet is part of the failover method that reads from my local database...
This part works fine and is added as context - this is where we connect to the MDB Database
public static DataTable CreateDatabaseConnection()
{
try
{
string ConnectionString = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=MyLocation;Persist Security Info=True;JET OLEDB:Database Password=MyPassword;";
odcConnection = new OleDbConnection(ConnectionString);
odcConnection.Open();
string strQueryToTable = "SELECT * FROM " + strTableName;
DataSet dtsTableContents = new DataSet();
OleDbCommand ocmTableInformation = new OleDbCommand(strQueryToTable, ocnConnectionToDatabase);
OleDbDataAdapter odaTableInformation = new OleDbDataAdapter(ocmTableInformation);
ocnConnectionToDatabase.Open();
odaTableInformation.Fill(dtsTableContents, strTableName);
DataTable dttTableInformation = dtsTableContents.Tables[strTableName];
ocnConnectionToDatabase.Close();
return dttTableInformation;
}
catch
{
return null;
}
}
From my CreateDatabaseSQLConnection() I have a DataSet. This DataSet is verified to contain all the information from the server database. Now I have been googling around and found myself trying to use this code to update the local database based on this article:
http://msdn.microsoft.com/en-us/library/system.data.common.dataadapter.update(v=vs.71).aspx
public static void UpdateLocalDatabase(string strTableName)
{
try
{
if (CreateDatabaseConnection() != null)
{
string strQueryToTable = "SELECT * FROM " + strTableName;
OleDbDataAdapter odaTableInformation = new OleDbDataAdapter();
odaTableInformation.SelectCommand = new OleDbCommand(strQueryToTable, odcConnection);
OleDbCommandBuilder ocbCommand = new OleDbCommandBuilder(odaTableInformation);
odcConnection.Open();
odaTableInformation.Update(dtsTableContents, strTableName);
odcConnection.Close();
}
}
catch { }
}
This snippet runs error-free, but it does not seem to change anything. Also, this step takes only milliseconds to run, and I would expect it to take longer.
I am using the DataSet I obtained from my SQL Connection, this DataSet I am trying to write to my local database.
Might it be the fact that this is a DataSet from an SQL connection and that I can't write this to my mdb connection via my OleDbAdapter or am I just missing the obvious here?
Any help is appreciated.
Thanks,
Kevin
For a straight backup from the external database to the internal backup
I've just been messing around with an sdf and a server-based SQL database, and it has worked. It's by no means the finished product, but for one column I've got the program to read from the external database and then write straight away to the local .sdf in around 15 lines of code.
SqlConnection sqlCon = new SqlConnection( ExternalDatabaseConnectionString );
SqlCeConnection sqlCECon = new SqlCeConnection( BackUpConnectionString );
using ( sqlCon )
{
using ( sqlCECon )
{
sqlCon.Open( );
sqlCECon.Open( );
SqlCommand get = new SqlCommand( "Select * from [TableToRead]", sqlCon );
SqlCeCommand save = new SqlCeCommand( "Update [BackUpTable] set InfoColumn = @info where ID = @id", sqlCECon );
SqlDataReader reader = get.ExecuteReader( );
if ( reader.HasRows )
{
reader.Read( );
save.Parameters.AddWithValue("#id", reader.GetString(0));
save.Parameters.AddWithValue( "#info", reader.GetString( 1 ));
save.ExecuteNonQuery( );
}
}
}
For one row of a database, backing up one column, it works. I'm assuming you will have some sort of auto-incremented key like ID?
I think a first step would be to reduce your reliance on static methods and fields.
If you look at your UpdateLocalDatabase method you'll see that you are passing in strTableName which is used by that method but a method that UpdateLocalDatabase calls (CreateDatabaseConnection) refers to a different global static variable named the same. Most likely the two strTableName variables contain different values and you are not seeing that they are not the same variable.
Also you are trying to write out the global static data set dtsTableContents in UpdateLocalDatabase but if you then look at CreateDatabaseConnection it actually creates a local version of that variable -- again, you have two variables named the same thing where one is global and one is local.
I suspect that the two variables of dtsTableContents is the problem.
My suggestion, again, would be to not have any static methods or variables and, for what you're doing here, try not to use any global variables. Also, refactor and/or rename your methods to match more what they are actually doing.
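As a rough illustration of that shape (names invented, just a sketch rather than a drop-in fix), everything the methods need is passed in explicitly, so there is only ever one data set and one table name in play:
// Sketch of a non-static shape: the connection strings, table name and DataSet are all explicit.
public class DatabaseSync
{
    private readonly string _sqlConnectionString;
    private readonly string _oleDbConnectionString;

    public DatabaseSync(string sqlConnectionString, string oleDbConnectionString)
    {
        _sqlConnectionString = sqlConnectionString;
        _oleDbConnectionString = oleDbConnectionString;
    }

    public DataSet LoadFromServer(string tableName)
    {
        DataSet ds = new DataSet();
        using (var conn = new SqlConnection(_sqlConnectionString))
        using (var adapter = new SqlDataAdapter("SELECT * FROM " + tableName, conn))
        {
            adapter.Fill(ds, tableName);
        }
        return ds;
    }

    public void WriteToLocal(DataSet ds, string tableName)
    {
        using (var conn = new OleDbConnection(_oleDbConnectionString))
        using (var adapter = new OleDbDataAdapter("SELECT * FROM " + tableName, conn))
        using (var builder = new OleDbCommandBuilder(adapter))
        {
            // Update only writes rows the DataSet has marked as added/modified/deleted,
            // which is also why an unchanged DataSet appears to do nothing in milliseconds.
            adapter.Update(ds, tableName);
        }
    }
}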
After endlessly trying to use the DataAdapter.Update() method, I gave up. I settled for using an sdf file instead of an mdb.
If I need to update my database, I remove the local database and create a new one, using the same connection string. Then I loop over the tables in my DataSet, read the column names and types from them, and create tables based on that.
After this I loop over my DataSet and insert its contents, which I filled with the information from the server. Below is the code; this is just 'quick and dirty' as a proof of concept, but it works for my scenario.
public static void RemoveAndCreateLocalDb(string strLocalDbLocation)
{
try
{
if (File.Exists(strLocalDbLocation))
{
File.Delete(strLocalDbLocation);
}
SqlCeEngine sceEngine = new SqlCeEngine(@"Data Source= " + strLocalDbLocation + ";Persist Security Info=True;Password=MyPass");
sceEngine.CreateDatabase();
}
catch
{ }
}
public static void UpdateLocalDatabase(String strTableName, DataTable dttTable)
{
try
{
// Opening the Connection
sceConnection = CreateDatabaseSQLCEConnection();
sceConnection.Open();
// Creating tables in sdf file - checking headers and types and adding them to a query
StringBuilder stbSqlGetHeaders = new StringBuilder();
stbSqlGetHeaders.Append("create table " + strTableName + " (");
int z = 0;
foreach (DataColumn col in dttTable.Columns)
{
if (z != 0) stbSqlGetHeaders.Append(", ");
String strName = col.ColumnName;
String strType = col.DataType.ToString();
if (strType.Equals("")) throw new ArgumentException("DataType Empty");
if (strType.Equals("System.Int32")) strType = "int";
if (strType.Equals("System.String")) strType = "nvarchar (100)";
if (strType.Equals("System.Boolean")) strType = "nvarchar (15)";
if (strType.Equals("System.DateTime")) strType = "datetime";
if (strType.Equals("System.Byte[]")) strType = "nvarchar (100)";
stbSqlGetHeaders.Append(strName + " " + strType);
z++;
}
stbSqlGetHeaders.Append(" )");
SqlCeCommand sceCreateTableCommand;
string strCreateTableQuery = stbSqlGetHeaders.ToString();
sceCreateTableCommand = new SqlCeCommand(strCreateTableQuery, sceConnection);
sceCreateTableCommand.ExecuteNonQuery();
StringBuilder stbSqlQuery = new StringBuilder();
StringBuilder stbFields = new StringBuilder();
StringBuilder stbParameters = new StringBuilder();
stbSqlQuery.Append("insert into " + strTableName + " (");
foreach (DataColumn col in dttTable.Columns)
{
stbFields.Append(col.ColumnName);
stbParameters.Append("#" + col.ColumnName.ToLower());
if (col.ColumnName != dttTable.Columns[dttTable.Columns.Count - 1].ColumnName)
{
stbFields.Append(", ");
stbParameters.Append(", ");
}
}
stbSqlQuery.Append(stbFields.ToString() + ") ");
stbSqlQuery.Append("values (");
stbSqlQuery.Append(stbParameters.ToString() + ") ");
string strTotalRows = dttTable.Rows.Count.ToString();
foreach (DataRow row in dttTable.Rows)
{
SqlCeCommand sceInsertCommand = new SqlCeCommand(stbSqlQuery.ToString(), sceConnection);
foreach (DataColumn col in dttTable.Columns)
{
if (col.ColumnName.ToLower() == "ssma_timestamp")
{
sceInsertCommand.Parameters.AddWithValue("#" + col.ColumnName.ToLower(), "");
}
else
{
sceInsertCommand.Parameters.AddWithValue("#" + col.ColumnName.ToLower(), row[col.ColumnName]);
}
}
sceInsertCommand.ExecuteNonQuery();
}
}
catch { }
}
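For completeness, a hypothetical call sequence tying the two methods together; it assumes the question's CreateDatabaseSQLConnection() and the CreateDatabaseSQLCEConnection() used above behave as described:
// Hypothetical usage: rebuild the local .sdf, then copy the server-side table into it.
RemoveAndCreateLocalDb(@"C:\MyApp\local.sdf");
DataTable dttFromServer = CreateDatabaseSQLConnection();   // fills the table from the SQL server
if (dttFromServer != null)
{
    UpdateLocalDatabase("MyTable", dttFromServer);          // creates the table in the .sdf and inserts the rows
}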
