SQLite DataAdapter isn't updating when called - C#

I have to write identical tables to two different SQLite databases. The tables rewrite existing data and ids, so I currently just delete the entire table and rewrite it. Depending on the order in which I write to the databases, the C# SQLiteCommandBuilder does not update the second one.
If I call Db_A before Db_B, then A gets written and B only gets deleted, and vice versa. Can anyone tell me why the second table enters the code and gets deleted, but the SQLite adapter never updates it? It doesn't throw an error either.
public static bool WriteDt_Name(DataTable dt)
{
using (connA = GetDbAConn())
{
SaveDataTable(connA, dt);
}
using (connB = GetDbBConn())
{
SaveDataTable(connB, dt);
}
return true;
}
public static void SaveDataTable(SQLiteConnection conn, DataTable dt)
{
string table = dt.TableName;
var cmd = conn.CreateCommand();
cmd.CommandText = string.Format("DELETE FROM {0}", table);
int val = cmd.ExecuteNonQuery();
cmd.CommandText = string.Format("SELECT * FROM {0}", table);
using (var adapter = new SQLiteDataAdapter(cmd))
{
using (SQLiteCommandBuilder builder = new SQLiteCommandBuilder(adapter))
{
adapter.Update(dt);
conn.Close();
}
}
}
public static SQLiteConnection GetDbAConn()
{
string base_dir = System.AppDomain.CurrentDomain.BaseDirectory;
string path = Directory.GetParent(Directory.GetParent(Directory.GetParent(Directory.GetParent(Directory.GetParent(base_dir).ToString()).ToString()).ToString()).ToString()).ToString();
path = path + "\\db\\DbA.sqlite;";
SQLiteConnection conn = new SQLiteConnection("Data Source=" + path + "Version=3;");
return conn;
}
I have tried splitting SaveDataTable into a SaveDt_A and SaveDt_B and calling it that way. I still get the same result.
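One likely explanation (an assumption; the post does not confirm it): DbDataAdapter.Update calls AcceptChanges on the rows it writes, so after the first SaveDataTable call every row in dt is flagged Unchanged and the second Update finds nothing to send, even though the DELETE still runs. Two hedged workarounds, keeping the method names above:
// Option 1: hand each database its own copy, so the first Update
// cannot clear the row states the second Update depends on.
using (var connA = GetDbAConn())
{
    SaveDataTable(connA, dt.Copy());   // Copy() preserves the row states
}
using (var connB = GetDbBConn())
{
    SaveDataTable(connB, dt.Copy());
}

// Option 2: inside SaveDataTable, tell the adapter not to touch the row states.
using (var adapter = new SQLiteDataAdapter(cmd))
using (var builder = new SQLiteCommandBuilder(adapter))
{
    adapter.AcceptChangesDuringUpdate = false;   // rows stay Added/Modified after Update
    adapter.Update(dt);
}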

Related

Connection to SQL Server through ADO.NET - Empty Listbox

Trying to set up a connection to my local SQL Server Express instance so that I can display columns in a listbox. The build runs fine and I don't see any errors, but there is no data in the listbox. I have tested the query and it is fine. I am using NT Authentication to the database. Any ideas where I might have gone wrong?
public partial class Form1 : Form
{
public Form1()
{
InitializeComponent();
}
private void customers_SelectedIndexChanged(object sender, EventArgs e)
{
string commstring = "Driver ={SQL Server}; Server = DESKTOP-5T4MHHR\\SQLEXPRESS; Database = AdventureWorks2014; Trusted_Connection = Yes;";
string connstring = "SELECT FirstName, LastName FROM Person.Person";
SqlDataAdapter customerDataAdapater = new SqlDataAdapter(commstring, connstring);
DataSet customerDataSet = new DataSet();
customerDataAdapater.Fill(customerDataSet, "Person.Person");
DataTable customerDataTable = new DataTable();
customerDataTable = customerDataSet.Tables[0];
foreach (DataRow dataRow in customerDataTable.Rows)
{
customers.Items.Add(dataRow["FirstName"] + " (" + dataRow["LastName"] + ")");
}
}
}
I tested your code in a sample project and realized that you passed the parameters to the SqlDataAdapter constructor in the wrong order.
After changing the following line:
SqlDataAdapter customerDataAdapater = new SqlDataAdapter(commstring, connstring);
to
SqlDataAdapter customerDataAdapater = new SqlDataAdapter(connstring, commstring);
the listbox was filled successfully.
Your connection string seems weird.....
Could you try using just this:
string commstring = "Server=DESKTOP-5T4MHHR\\SQLEXPRESS;Database=AdventureWorks2014;Trusted_Connection=Yes;";
Also: why are you first creating a DataSet, filling in a single set of data, and then extracting a DataTable from it?? This is unnecessarily complicated code - just use this instead:
SqlDataAdapter customerDataAdapater = new SqlDataAdapter(connstring, commstring); // (select statement, connection string)
// if you only ever need *one* set of data - just use a DataTable directly!
DataTable customerDataTable = new DataTable();
// Fill DataTable with the data from the query
customerDataAdapater.Fill(customerDataTable);
Update: I would really rewrite your code to something like this:
// create a separate class - call it whatever you like
public class DataProvider
{
// define a method to provide that data to you
public List<string> GetPeople()
{
// define connection string (I'd really load that from CONFIG, in real world)
string connstring = "Server=MSEDTOP;Database=AdventureWorks2014;Trusted_Connection=Yes;";
// define your query
string query = "SELECT FirstName, LastName FROM Person.Person";
// prepare a variable to hold the results
List<string> entries = new List<string>();
// put your SqlConnection and SqlCommand into "using" blocks
using (SqlConnection conn = new SqlConnection(connstring))
using (SqlCommand cmd = new SqlCommand(query, conn))
{
conn.Open();
// iterate over the results using a SqlDataReader
using (SqlDataReader rdr = cmd.ExecuteReader())
{
while (rdr.Read())
{
// get first and last name from current row of query
string firstName = rdr.GetFieldValue<string>(0);
string lastName = rdr.GetFieldValue<string>(1);
// add entry to result list
entries.Add(firstName + " " + lastName);
}
rdr.Close();
}
conn.Close();
}
// return the results
return entries;
}
}
And in your code-behind, you only need to do something like this:
protected override void OnLoad(EventArgs e)
{
base.OnLoad(e);
List<string> people = new DataProvider().GetPeople();
customers.DataSource = people;
}
but I still don't understand what you were trying to do when the SelectedIndexChanged event fires...

C# Mysql multiple queries

I'm trying to build a little status tool. I need to get the results of multiple queries (about 4-5). The general connection setup and 'how to read data' is already done, but I can't figure out how to execute another query.
Everything I found while searching is for SqlClient. I'm totally overwhelmed by this.
Here is my code so far (be patient, I'm a newbie at this):
private void button1_Click(object sender, EventArgs e)
{
if(listView1.Items.Count > 1)
{
listView1.Items.Clear();
}
var listMember = new List<string>{};
var listOnline = new List<string>{};
// SQL PART //
string connString = "Server=10*****;Port=3306;Database=e***;Uid=e***;password=********************;";
MySqlConnection conn = new MySqlConnection(connString);
MySqlCommand command = conn.CreateCommand();
command.CommandText = "SELECT fullname,online FROM member WHERE active = '1' ORDER BY online DESC";
try
{
conn.Open();
}
catch (Exception ex)
{
listView1.Items.Add("Error: " + ex);
}
MySqlDataReader reader = command.ExecuteReader();
while(reader.Read())
{
listMember.Add(reader["fullname"].ToString());
listOnline.Add(reader["online"].ToString());
}
conn.Close();
// SQL ENDING //
// SET ENTRIES TO LISTVIEW //
int counter = 0;
foreach(string member in listMember)
{
ListViewItem item = new ListViewItem(new[] { member, listOnline.ElementAt(counter) });
item.ForeColor = Color.Green;
listView1.Items.Add(item);
counter++;
}
}
I'm not really sure what the design/layout will look like in the end, so I would like to just append the results to lists in the SQL part and process the data later from the lists.
Do I really have to set up a completely new connection after conn.Close()? Or is there another way? I can just imagine it: 5 queries, each with their own connection, try/catch and 2 loops... that would end up around 100-200 lines just to get the results of 5 queries. Isn't that a bit much for such a simple thing?
Hoping for some help.
Greetings.
Following the comments, here is my latest code:
Top:
public partial class Form1 : Form
{
public static string connString = "Server=10****;Port=3306;Database=e****;Uid=e****;password=****;";
public Form1()
{
InitializeComponent();
MySqlConnection conn = new MySqlConnection(connString); // Error gone!
}
Body part:
public void QueryTwoFields(string s, List<string> S1, List<string> S2)
{
try
{
MySqlCommand cmd = conn.CreateCommand(); // ERROR: conn does not exist in the current context.
cmd.CommandType = CommandType.Text;
string command = s;
cmd.CommandText = command;
MySqlDataReader sqlreader = cmd.ExecuteReader();
while (sqlreader.Read())
{
S1.Add(sqlreader[0].ToString());
S2.Add(sqlreader[1].ToString());
}
sqlreader.Close();
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
}
}
private void button1_Click(object sender, EventArgs e)
{
if(listView1.Items.Count > 1)
{
listView1.Items.Clear();
}
var listMember = new List<string>{};
var listOnline = new List<string>{};
using (conn) // ERROR: conn does not exist in the current context.
{
conn.Open();
///...1st Query
QueryTwoFields("SELECT fullname,online FROM member WHERE active = '1' ORDER BY online DESC",listMember,listOnline);
//...2nd query
//QueryTwoFields("your new Select Statement", otherList, otherList);
}
}
You don't have to close the connection every time you execute a query; rather, close the SqlDataReader assigned to that connection. Finally, when all of your queries have been executed, close the connection. Consider also the use of using.
You can also define a method for executing your query so that your code is not repetitive:
public void QueryTwoFields(string s, List<string> S1, List<string> S2)
///Select into List S1 and List S2 from Database (2 fields)
{
try
{
MySqlCommand cmd = conn.CreateCommand();
cmd.CommandType = CommandType.Text;
string command = s;
cmd.CommandText = command;
MySqlDataReader sqlreader = cmd.ExecuteReader();
while (sqlreader.Read())
{
S1.Add(sqlreader[0].ToString());
S2.Add(sqlreader[1].ToString());
}
sqlreader.Close();
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
}
}
private void button1_Click(object sender, EventArgs e)
{
if(listView1.Items.Count > 1)
{
listView1.Items.Clear();
}
var listMember = new List<string>{};
var listOnline = new List<string>{};
// SQL PART //
using (conn)
{
conn.Open();
///...1st Query
QueryTwoFields("SELECT fullname,online FROM member WHERE active = '1' ORDER BY online DESC",listmember,listonline)
//...2nd query
QueryTwoFields("your new Select Statement",myOtherList1,myOtherlist2)
....
}
}
EDIT:
Bear in mind that you can't define the QueryTwoFields method inside the button handler; you must define it outside (see the code above).
Also define your connection data at the start of the program:
namespace MyProject
{
/// <summary>
/// Define your connection string and connection
/// </summary>
///
public partial class Form1 : Form
{
public static string connString = "Server=10*****;Port=3306;Database=e***;Uid=e***;password=********************;";
MySqlConnection conn = new MySqlConnection(connString);
.........
Datatables are fantastic
Using a data table is a nice way to do both reads and writes. And it comes with the luxury of everything you can do with a DataTable - like assigning it directly to a datagrid control, sorting, selecting and deleting while disconnected.
The sample below assumes a MySqlConnection connection property managed by calls to your own OpenConnection() and CloseConnection() methods, not shown.
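OpenConnection() and CloseConnection() are left to you; purely as a sketch of what such helpers might look like (the connection string is a placeholder):
private MySqlConnection connection;

private bool OpenConnection()
{
    try
    {
        // Reuse one connection object for all queries in this class.
        if (connection == null)
            connection = new MySqlConnection("Server=...;Database=...;Uid=...;Pwd=...;");
        if (connection.State != ConnectionState.Open)
            connection.Open();
        return true;
    }
    catch (MySqlException ex)
    {
        MessageBox.Show("Could not open connection: " + ex.Message);
        return false;
    }
}

private void CloseConnection()
{
    if (connection != null && connection.State != ConnectionState.Closed)
        connection.Close();
}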
Simple datatable read demo:
public DataTable Select(string query = "")
{
//Typical sql: "SELECT * FROM motorparameter"
DataTable dt = new DataTable();
//Open connection
if (this.OpenConnection() == true)
{
//Create Command
MySqlCommand cmd = new MySqlCommand(query, connection);
//Create a data reader and Execute the command
MySqlDataReader dataReader = cmd.ExecuteReader();
dt.Load(dataReader);
//close Data Reader
dataReader.Close();
//close Connection
this.CloseConnection();
//return data table
return dt;
}
else
{
return dt;
}
}
In case of writing the datatable back to the database, supply the SQL you used in the read (or would have used to read into the data table):
public void Save(DataTable dt, string DataTableSqlSelect)
{
//Typically "SELECT * FROM motorparameter"
string query = DataTableSqlSelect;
//Open connection
if (this.OpenConnection() == true)
{
//Create Command
MySqlCommand mySqlCmd = new MySqlCommand(query, connection);
MySqlDataAdapter adapter = new MySqlDataAdapter(mySqlCmd);
MySqlCommandBuilder myCB = new MySqlCommandBuilder(adapter);
adapter.UpdateCommand = myCB.GetUpdateCommand();
adapter.Update(dt);
//close Connection
this.CloseConnection();
}
else
{
}
}
The neat thing is that the DataTable is extremely flexible. You can run your own selects against the table once it contains data, and before writing back you can set or reset which rows need updating; by default the DataTable keeps track of which rows you change. Do not forget primary key column(s) for all tables in the db.
For multiple queries, consider using a join between the database tables (or the same table) if the data is related, or use a UNION if the column count and data types are the same. You can always "create" an extra column in the select to tell which data comes from which part of the UNION.
Also consider using CASE WHEN syntax to conditionally select data from different sources.
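For instance (a hypothetical sketch - the guest table and its columns are made up), two of the status queries could be collapsed into one round trip with a UNION that tags each row with its source:
// Pull members and guests in one query, with an extra "source" column telling the rows apart.
string query =
    "SELECT fullname, online, 'member' AS source FROM member WHERE active = '1' " +
    "UNION ALL " +
    "SELECT fullname, online, 'guest' AS source FROM guest WHERE active = '1' " +
    "ORDER BY online DESC";

DataTable dt = Select(query);   // the Select() helper shown above

// Split the combined result again on the client side if needed.
DataRow[] members = dt.Select("source = 'member'");
DataRow[] guests = dt.Select("source = 'guest'");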

Simple Database Comparison

I am trying to compare two databases from different dates. I want to load my 'Orders' table from the ending database into a DataTable, then convert that DataTable to a Dictionary<string,int> where string = OrderNumber and int = ClientNumber.
Once I have the Dictionary in memory, I will run my function that goes out to the beginning database and performs an atrOrdersEND.ContainsKey(ordernumber) check. If the key exists, nothing will be done; if it does not exist, I want the information of the DataRow to be saved somewhere.
Is this the best way to go about this?
If not, how should I do it?
static public DataTable getDatabaseInfo(string DB)
{
DB = #"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + DB;
OleDbConnection conn = new OleDbConnection(DB);
string query = "SELECT * FROM ORDERS;";
OleDbCommand cmd = new OleDbCommand(query, conn);
OleDbDataAdapter da = new OleDbDataAdapter(cmd);
DataTable dt = new DataTable();
conn.Open();
try
{
da.Fill(dt);
}
catch (OleDbException ex)
{
}
return dt;
}
static Dictionary<string,int> ToDictionary(DataTable DT)
{ //this doesn't work, I get the 'An item with the same key has already been added' error
var Dict = (from order in DT.AsEnumerable()
select new
{
OrderNumber = order.Field<string>("OrderNumber"),
ClientNumber = order.Field<int>("Client")
}).AsEnumerable()
.ToDictionary(k => k.OrderNumber, v => v.ClientNumber);
return Dict;
}
static public void CheckHistoryBEG(string DB)
{
DB = #"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + DB;
OleDbConnection conn = new OleDbConnection(DB);
string query = "SELECT * " +
"FROM Orders";
OleDbDataAdapter adapter = new OleDbDataAdapter(query, conn);
DataTable dt = new DataTable();
try
{
adapter.Fill(dt);
for (int i = 0; i <= dt.Rows.Count - 1; i++)
{
string ordernumber = dt.Rows[i]["OrderNumber"].ToString();
if (atrOrdersEND.ContainsKey(ordernumber))
{
atrOrdersEND[ordernumber] = atrOrdersEND[ordernumber] + 1;
}
else
{
atrOrdersEND.Add(ordernumber, 1);
}
}
}
catch (OleDbException ex)
{
}
}
public Main()
{
DataTable EndDB_Data = getDatabaseInfo(endDBLocation);
Dictionary<string, int> atrOrdersBEG = ToDictionary(EndDB_Data);
CheckHistoryBEG(beginningDBLocation);
}
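Regarding the "An item with the same key has already been added" error noted in ToDictionary above: it means the Orders table contains the same OrderNumber more than once. A minimal sketch that groups first and keeps the first client number per order (assuming that is acceptable for your data):
// Requires System.Linq and System.Data.DataSetExtensions.
static Dictionary<string, int> ToDictionary(DataTable DT)
{
    return DT.AsEnumerable()
             .GroupBy(order => order.Field<string>("OrderNumber"))
             .ToDictionary(g => g.Key,
                           g => g.First().Field<int>("Client"));
}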
You can avoid writing any C# code by using linked tables in your MS Access database. From one DB you can reference the table in the other database. Then you can do joins or existence queries between the two databases.
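As a hedged illustration of that approach: if the ending database's Orders table is linked into the beginning database under a name such as Orders_End (the link name is an assumption), a single query returns the beginning-database orders that are missing from the ending database:
// Orders_End is assumed to be a linked table pointing at the ending database.
string query =
    "SELECT b.* " +
    "FROM Orders AS b LEFT JOIN Orders_End AS e ON b.OrderNumber = e.OrderNumber " +
    "WHERE e.OrderNumber IS NULL";

using (OleDbConnection conn = new OleDbConnection(
    @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + beginningDBLocation))
using (OleDbDataAdapter da = new OleDbDataAdapter(query, conn))
{
    DataTable missingFromEnd = new DataTable();
    da.Fill(missingFromEnd);   // the rows to be "saved somewhere", per the question
}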

Dealing with a huge amount of data when inserting into a SQL database

In my code the user can upload an Excel document which contains their phone contact list. As the developer, I have to read that Excel file, turn it into a DataTable and insert it into the database.
The problem is that some clients have a huge number of contacts, say 5000 or more, and when I try to insert that amount of data into the database it crashes and gives me a timeout exception.
What would be the best way to avoid this kind of exception, and is there any way to reduce the time the insert takes so the user doesn't wait too long?
the code
public SqlConnection connection = new SqlConnection(System.Configuration.ConfigurationManager.ConnectionStrings["ConnectionString"].ConnectionString);
public void Insert(string InsertQuery)
{
SqlDataAdapter adp = new SqlDataAdapter();
adp.InsertCommand = new SqlCommand(InsertQuery, connection);
if (connection.State == System.Data.ConnectionState.Closed)
{
connection.Open();
}
adp.InsertCommand.ExecuteNonQuery();
connection.Close();
}
protected void submit_Click(object sender, EventArgs e)
{
string UploadFolder = "Savedfiles/";
if (Upload.HasFile) {
string fileName = Upload.PostedFile.FileName;
string path=Server.MapPath(UploadFolder+fileName);
Upload.SaveAs(path);
Msg.Text = "successfully uploaded";
DataTable ValuesDt = new DataTable();
ValuesDt = ConvertExcelFileToDataTable(path);
Session["valuesdt"] = ValuesDt;
Excel_grd.DataSource = ValuesDt;
Excel_grd.DataBind();
}
}
protected void SendToServer_Click(object sender, EventArgs e)
{
DataTable Values = Session["valuesdt"] as DataTable ;
if(Values.Rows.Count>0)
{
DataTable dv = Values.DefaultView.ToTable(true, "Mobile1", "Mobile2", "Tel", "Category");
double Mobile1,Mobile2,Tel;string Category="";
for (int i = 0; i < Values.Rows.Count; i++)
{
Mobile1 =Values.Rows[i]["Mobile1"].ToString()==""?0: double.Parse(Values.Rows[i]["Mobile1"].ToString());
Mobile2 = Values.Rows[i]["Mobile2"].ToString() == "" ? 0 : double.Parse(Values.Rows[i]["Mobile2"].ToString());
Tel = Values.Rows[i]["Tel"].ToString() == "" ? 0 : double.Parse(Values.Rows[i]["Tel"].ToString());
Category = Values.Rows[i]["Category"].ToString();
Insert("INSERT INTO client(Mobile1,Mobile2,Tel,Category) VALUES(" + Mobile1 + "," + Mobile2 + "," + Tel + ",'" + Category + "')");
Msg.Text = "Submitied successfully to the server ";
}
}
}
You can try SqlBulkCopy to insert the DataTable into the database table.
Something like this:
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(sqlConnection, SqlBulkCopyOptions.KeepIdentity, null)) // (connection, options, transaction) overload; null = no external transaction
{
bulkCopy.DestinationTableName = DestTableName;
string[] DtColumnName = YourDataTableColumns; // placeholder: the column names of your DataTable
foreach (string dbcol in DbColumnName) // placeholder: the destination table's column names; map DataTable columns to them
{
foreach (string dtcol in DtColumnName)
{
if (dbcol.ToLower() == dtcol.ToLower())
{
SqlBulkCopyColumnMapping mapID = new SqlBulkCopyColumnMapping(dtcol, dbcol);
bulkCopy.ColumnMappings.Add(mapID);
break;
}
}
}
bulkCopy.WriteToServer(YourDataTableName.CreateDataReader());
bulkCopy.Close();
}
For more, read http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx
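The YourDataTableColumns and DbColumnName placeholders above are just the source and destination column names. One way you might obtain them (a sketch, assuming sqlConnection is already open and the destination table is client, as in the question):
// Source column names come straight from the DataTable (here: the question's Values table).
string[] DtColumnName = Values.Columns.Cast<DataColumn>()
    .Select(c => c.ColumnName)
    .ToArray();

// Destination column names can be read from the table's schema with an empty SELECT.
string[] DbColumnName;
using (SqlCommand schemaCmd = new SqlCommand("SELECT TOP 0 * FROM client", sqlConnection))
using (SqlDataReader schemaReader = schemaCmd.ExecuteReader(CommandBehavior.SchemaOnly))
{
    DbColumnName = Enumerable.Range(0, schemaReader.FieldCount)
        .Select(schemaReader.GetName)
        .ToArray();
}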
You are inserting one row at a time, which is very expensive for this amount of data.
In these cases you should use a bulk insert, so the round trip to the DB happens only once; if you need to roll back, it is all one transaction.
You can use SqlBulkCopy, which is more work, or you can use the batch update feature of the SqlDataAdapter. Instead of creating your own insert statement, building an adapter and manually executing it, create a DataSet, fill it, create one SqlDataAdapter, set the number of inserts in a batch, then execute the adapter once.
I could repeat the code, but this article shows exactly how to do it: http://msdn.microsoft.com/en-us/library/kbbwt18a%28v=vs.80%29.aspx
protected void SendToServer_Click(object sender, EventArgs e)
{
DataTable Values = Session["valuesdt"] as DataTable ;
if(Values.Rows.Count>0)
{
DataTable dv = Values.DefaultView.ToTable(true, "Mobile1", "Mobile2", "Tel", "Category");
//Fix up default values
for (int i = 0; i < Values.Rows.Count; i++)
{
Values.Rows[i]["Mobile1"] =Values.Rows[i]["Mobile1"].ToString()==""?0: double.Parse(Values.Rows[i]["Mobile1"].ToString());
Values.Rows[i]["Mobile2"] = Values.Rows[i]["Mobile2"].ToString() == "" ? 0 : double.Parse(Values.Rows[i]["Mobile2"].ToString());
Values.Rows[i]["Tel"] = Values.Rows[i]["Tel"].ToString() == "" ? 0 : double.Parse(Values.Rows[i]["Tel"].ToString());
Values.Rows[i]["Category"] = Values.Rows[i]["Category"].ToString();
}
BatchUpdate(dv,1000);
}
}
public static void BatchUpdate(DataTable dataTable,Int32 batchSize)
{
// Assumes GetConnectionString() returns a valid connection string.
string connectionString = GetConnectionString();
// Connect to the database.
using (SqlConnection connection = new SqlConnection(connectionString))
{
// Create a SqlDataAdapter.
SqlDataAdapter adapter = new SqlDataAdapter();
// Set the INSERT command and parameter.
adapter.InsertCommand = new SqlCommand(
"INSERT INTO client(Mobile1,Mobile2,Tel,Category) VALUES(#Mobile1,#Mobile2,#Tel,#Category);", connection);
adapter.InsertCommand.Parameters.Add("#Mobile1",
SqlDbType.Float);
adapter.InsertCommand.Parameters.Add("#Mobile2",
SqlDbType.Float);
adapter.InsertCommand.Parameters.Add("#Tel",
SqlDbType.Float);
adapter.InsertCommand.Parameters.Add("#Category",
SqlDbType.NVarchar, 50);
adapter.InsertCommand.UpdatedRowSource = UpdateRowSource.None;
// Set the batch size.
adapter.UpdateBatchSize = batchSize;
// Execute the update.
adapter.Update(dataTable);
}
}
I know this is a super old post, but you should not need to use the bulk operations explained in the existing answers for 5000 inserts. Your performance is suffering so much because you close and reopen the connection for each row insert. Here is some code I have used in the past that keeps one connection open and executes as many commands as needed to push all the data to the DB:
public static class DataWorker
{
public static Func<IEnumerable<T>, Task> GetStoredProcedureWorker<T>(Func<SqlConnection> connectionSource, string storedProcedureName, Func<T, IEnumerable<(string paramName, object paramValue)>> parameterizer)
{
if (connectionSource is null) throw new ArgumentNullException(nameof(connectionSource));
SqlConnection openConnection()
{
var conn = connectionSource() ?? throw new ArgumentNullException(nameof(connectionSource), $"Connection from {nameof(connectionSource)} cannot be null");
var connState = conn.State;
if (connState != ConnectionState.Open)
{
conn.Open();
}
return conn;
}
async Task DoStoredProcedureWork(IEnumerable<T> workData)
{
using (var connection = openConnection())
using (var command = connection.CreateCommand())
{
command.CommandType = CommandType.StoredProcedure;
command.CommandText = storedProcedureName;
command.Prepare();
foreach (var thing in workData)
{
command.Parameters.Clear();
foreach (var (paramName, paramValue) in parameterizer(thing))
{
command.Parameters.AddWithValue(paramName, paramValue ?? DBNull.Value);
}
await command.ExecuteNonQueryAsync().ConfigureAwait(false);
}
}
}
return DoStoredProcedureWork;
}
}
This was actually from a project where I was gathering emails for a restriction list, so it's a somewhat relevant example of what a parameterizer argument might look like and how to use the above code:
IEnumerable<(string,object)> RestrictionToParameter(EmailRestriction emailRestriction)
{
yield return ("#emailAddress", emailRestriction.Email);
yield return ("#reason", emailRestriction.Reason);
yield return ("#restrictionType", emailRestriction.RestrictionType);
yield return ("#dateTime", emailRestriction.Date);
}
var worker = DataWorker.GetStoredProcedureWorker<EmailRestriction>(ConnectionFactory, #"[emaildata].[AddRestrictedEmail]", RestrictionToParameter);
await worker(emailRestrictions).ConfigureAwait(false);

Export SQL DataBase to WinForm DataSet and then to MDB Database using DataSet

My application is a WinForms app that relies on a database. At startup the application connects to a SQL database on the server and puts this information in a DataSet/DataTable.
If for some reason the database on the server is not accessible, the application has a built-in failover and will get its information from the local database.
If, in a normal scenario, I start the tool, it reads from the SQL database, and if the server database has been updated (a separate snippet checks this), it should make sure the local database is up to date - and this is where the problem starts (see below).
This part works fine and is added as context - this is where we connect to the SQL Database
public static DataSet dtsTableContents;
public static DataTable CreateDatabaseSQLConnection()
{
try
{
string strSqlConnectionString = "Data Source=MyLocation;Initial Catalog=MyCatalog;User=MyUser;Password=MyPassword;";
SqlCommand scoCommand = new SqlCommand();
scoCommand.Connection = new SqlConnection(strSqlConnectionString);
scoCommand.Connection.Open();
string strQueryToTable = "SELECT * FROM " + strTableName;
dtsTableContents = new DataSet();
SqlCommand scmTableInformation = new SqlCommand(strQueryToTable, scnConnectionToDatabase);
SqlDataAdapter sdaTableInformation = new SqlDataAdapter(scmTableInformation);
scnConnectionToDatabase.Open();
sdaTableInformation.Fill(dtsTableContents, strTableName);
DataTable dttTableInformation = dtsTableContents.Tables[strTableName];
scnConnectionToDatabase.Close();
return dttTableInformation;
}
catch
{
return null;
}
}
This snippet is part of the failover method that reads from my local database...
This part works fine and is added as context - this is where we connect to the MDB Database
public static DataTable CreateDatabaseConnection()
{
try
{
string ConnectionString = #"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=MyLocation;Persist Security Info=True;JET OLEDB:Database Password=MyPassword;"
odcConnection = new OleDbConnection(ConnectionString);
odcConnection.Open();
string strQueryToTable = "SELECT * FROM " + strTableName;
DataSet dtsTableContents = new DataSet();
OleDbCommand ocmTableInformation = new OleDbCommand(strQueryToTable, ocnConnectionToDatabase);
OleDbDataAdapter odaTableInformation = new OleDbDataAdapter(ocmTableInformation);
ocnConnectionToDatabase.Open();
odaTableInformation.Fill(dtsTableContents, strTableName);
DataTable dttTableInformation = dtsTableContents.Tables[strTableName];
ocnConnectionToDatabase.Close();
return dttTableInformation;
}
catch
{
return null;
}
}
From my CreateDatabaseSQLConnection() I have a DataSet. This DataSet is verified to contain all the information from the server database. Now I have been googling around and found myself trying to use this code to update the local database based on this article:
http://msdn.microsoft.com/en-us/library/system.data.common.dataadapter.update(v=vs.71).aspx
public static void UpdateLocalDatabase(string strTableName)
{
try
{
if (CreateDatabaseConnection() != null)
{
string strQueryToTable = "SELECT * FROM " + strTableName;
OleDbDataAdapter odaTableInformation = new OleDbDataAdapter();
odaTableInformation.SelectCommand = new OleDbCommand(strQueryToTable, odcConnection);
OleDbCommandBuilder ocbCommand = new OleDbCommandBuilder(odaTableInformation);
odcConnection.Open();
odaTableInformation.Update(dtsTableContents, strTableName);
odcConnection.Close();
}
}
catch { }
}
This snippet runs error-free but it does not seem to change anything. Also, this step takes only milliseconds, and I would expect it to take longer.
I am using the DataSet I obtained from my SQL connection; this is the DataSet I am trying to write to my local database.
Might it be that this is a DataSet from a SQL connection and I can't write it to my mdb connection via my OleDbDataAdapter, or am I just missing the obvious here?
Any help is appreciated.
Thanks,
Kevin
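One note on the "runs error-free but changes nothing" symptom: DataAdapter.Fill calls AcceptChanges on the rows it loads (AcceptChangesDuringFill is true by default), so every row in dtsTableContents ends up Unchanged and the OleDb adapter's Update has nothing to write. Two hedged ways around that, assuming the local table should simply receive all rows (and has been emptied of conflicting keys first):
// Either fill without accepting changes, so the rows arrive flagged as Added...
sdaTableInformation.AcceptChangesDuringFill = false;
sdaTableInformation.Fill(dtsTableContents, strTableName);

// ...or flag the already-loaded rows as Added just before the OleDb update.
foreach (DataRow row in dtsTableContents.Tables[strTableName].Rows)
{
    row.SetAdded();   // only valid for rows currently in the Unchanged state
}
odaTableInformation.Update(dtsTableContents, strTableName);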
For a straight backup from the external database to the internal backup:
I've just been messing around with an .sdf and a server-based SQL database and it has worked. It's by no means the finished product, but for one column I've got the program to read from the external database and then write straight away to the local .sdf in around 15 lines of code.
SqlConnection sqlCon = new SqlConnection( ExternalDatabaseConnectionString );
SqlCeConnection sqlCECon = new SqlCeConnection( BackUpConnectionString );
using ( sqlCon )
{
using ( sqlCECon )
{
sqlCon.Open( );
sqlCECon.Open( );
SqlCommand get = new SqlCommand( "Select * from [TableToRead]", sqlCon );
SqlCeCommand save = new SqlCeCommand( "Update [BackUpTable] set InfoColumn = @info where ID = @id", sqlCECon );
SqlDataReader reader = get.ExecuteReader( );
if ( reader.HasRows )
{
reader.Read( );
save.Parameters.AddWithValue("#id", reader.GetString(0));
save.Parameters.AddWithValue( "#info", reader.GetString( 1 ));
save.ExecuteNonQuery( );
}
}
}
For one row of the database, backing up one column, it works. I'm assuming you will have some sort of auto-incremented key like ID?
I think a first step would be to reduce your reliance on static methods and fields.
If you look at your UpdateLocalDatabase method you'll see that you are passing in strTableName which is used by that method but a method that UpdateLocalDatabase calls (CreateDatabaseConnection) refers to a different global static variable named the same. Most likely the two strTableName variables contain different values and you are not seeing that they are not the same variable.
Also you are trying to write out the global static data set dtsTableContents in UpdateLocalDatabase but if you then look at CreateDatabaseConnection it actually creates a local version of that variable -- again, you have two variables named the same thing where one is global and one is local.
I suspect that the two variables of dtsTableContents is the problem.
My suggestion, again, would be to not have any static methods or variables and, for what you're doing here, try not to use any global variables. Also, refactor and/or rename your methods to match more what they are actually doing.
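A minimal sketch of that suggestion (names are illustrative), passing the data, table name and connection string explicitly instead of going through static fields:
public static void UpdateLocalDatabase(DataSet data, string tableName, string mdbConnectionString)
{
    using (OleDbConnection connection = new OleDbConnection(mdbConnectionString))
    using (OleDbDataAdapter adapter = new OleDbDataAdapter("SELECT * FROM " + tableName, connection))
    using (OleDbCommandBuilder builder = new OleDbCommandBuilder(adapter))
    {
        connection.Open();
        // The caller decides which DataSet and which connection string to use,
        // so there is only one copy of each and no hidden globals.
        adapter.Update(data, tableName);
    }
}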
After endlessly trying to use the DataAdapter.Update method I gave up and settled for using an .sdf file instead of an .mdb.
If I need to update my database, I remove the local database and create a new one using the same connection string. Then I loop over the tables in my dataset, read the column names and types from them, and create tables based on that.
After this I loop over my dataset and insert its contents, which I filled with the information from the server. Below is the code; this is just 'quick and dirty' as a proof of concept, but it works for my scenario.
public static void RemoveAndCreateLocalDb(string strLocalDbLocation)
{
try
{
if (File.Exists(strLocalDbLocation))
{
File.Delete(strLocalDbLocation);
}
SqlCeEngine sceEngine = new SqlCeEngine(@"Data Source= " + strLocalDbLocation + ";Persist Security Info=True;Password=MyPass");
sceEngine.CreateDatabase();
}
catch
{ }
}
public static void UpdateLocalDatabase(String strTableName, DataTable dttTable)
{
try
{
// Opening the Connection
sceConnection = CreateDatabaseSQLCEConnection();
sceConnection.Open();
// Creating tables in sdf file - checking headers and types and adding them to a query
StringBuilder stbSqlGetHeaders = new StringBuilder();
stbSqlGetHeaders.Append("create table " + strTableName + " (");
int z = 0;
foreach (DataColumn col in dttTable.Columns)
{
if (z != 0) stbSqlGetHeaders.Append(", ");
String strName = col.ColumnName;
String strType = col.DataType.ToString();
if (strType.Equals("")) throw new ArgumentException("DataType Empty");
if (strType.Equals("System.Int32")) strType = "int";
if (strType.Equals("System.String")) strType = "nvarchar (100)";
if (strType.Equals("System.Boolean")) strType = "nvarchar (15)";
if (strType.Equals("System.DateTime")) strType = "datetime";
if (strType.Equals("System.Byte[]")) strType = "nvarchar (100)";
stbSqlGetHeaders.Append(strName + " " + strType);
z++;
}
stbSqlGetHeaders.Append(" )");
SqlCeCommand sceCreateTableCommand;
string strCreateTableQuery = stbSqlGetHeaders.ToString();
sceCreateTableCommand = new SqlCeCommand(strCreateTableQuery, sceConnection);
sceCreateTableCommand.ExecuteNonQuery();
StringBuilder stbSqlQuery = new StringBuilder();
StringBuilder stbFields = new StringBuilder();
StringBuilder stbParameters = new StringBuilder();
stbSqlQuery.Append("insert into " + strTableName + " (");
foreach (DataColumn col in dttTable.Columns)
{
stbFields.Append(col.ColumnName);
stbParameters.Append("#" + col.ColumnName.ToLower());
if (col.ColumnName != dttTable.Columns[dttTable.Columns.Count - 1].ColumnName)
{
stbFields.Append(", ");
stbParameters.Append(", ");
}
}
stbSqlQuery.Append(stbFields.ToString() + ") ");
stbSqlQuery.Append("values (");
stbSqlQuery.Append(stbParameters.ToString() + ") ");
string strTotalRows = dttTable.Rows.Count.ToString();
foreach (DataRow row in dttTable.Rows)
{
SqlCeCommand sceInsertCommand = new SqlCeCommand(stbSqlQuery.ToString(), sceConnection);
foreach (DataColumn col in dttTable.Columns)
{
if (col.ColumnName.ToLower() == "ssma_timestamp")
{
sceInsertCommand.Parameters.AddWithValue("#" + col.ColumnName.ToLower(), "");
}
else
{
sceInsertCommand.Parameters.AddWithValue("#" + col.ColumnName.ToLower(), row[col.ColumnName]);
}
}
sceInsertCommand.ExecuteNonQuery();
}
}
catch { }
}
