MySQL -> Transaction Context -> Code Review - C#

I am hoping someone can check how I am using transactions with MySQL. I believe this should work as outlined below. Could someone look at my code and tell me if I am doing it correctly? Thank you.
I believe this should:
Instantiate the db connection.
Iterate through the rows of the given DataTable.
Check whether the table exists and, if it does not, execute the CREATE TABLE.
Execute the INSERT command, with parameters, against the newly created or existing table.
Commit the transaction and then close the connection.
//Open the SQL Connection
var dbConnection = new MySqlConnection(GetConnectionString(WowDatabase));
dbConnection.Open();
//Instantiate the Command
using (var cmd = new MySqlCommand())
{
//Create a new Transaction
using (var transaction = dbConnection.BeginTransaction())
{
uint lastId = 999999;
for (int i = 0; i < dt.Rows.Count; i++)
{
//var identifier = dt.Rows[i].Field<int>("Identifier");
var id = dt.Rows[i].Field<uint>("Entry");
var name = dt.Rows[i].Field<string>("Name");
var zone = dt.Rows[i].Field<uint>("ZoneID");
var map = dt.Rows[i].Field<uint>("MapID");
var state = dt.Rows[i].Field<Enums.ItemState>("State");
var type = dt.Rows[i].Field<Enums.ObjectType>("Type");
var faction = dt.Rows[i].Field<Enums.FactionType>("Faction");
var x = dt.Rows[i].Field<float>("X");
var y = dt.Rows[i].Field<float>("Y");
var z = dt.Rows[i].Field<float>("Z");
string dataTableName = "entry_" + id;
//Create Table if it does not exist.
if (id != lastId)
{
cmd.CommandText = $"CREATE TABLE IF NOT EXISTS `{dataTableName}` (" +
"`identifier` int NOT NULL AUTO_INCREMENT COMMENT 'Auto Increment Identifier' ," +
"`zone_id` int NULL COMMENT 'Zone Entry' ," +
"`x_axis` float NULL COMMENT 'X Axis on Map' ," +
"`y_axis` float NULL COMMENT 'Y Axis on Map' ," +
"`z_axis` float NULL COMMENT 'Z Axis on Map' ," +
"`situation` enum('') NULL COMMENT 'Location of the item (Underground, Indoors, Outdoors)' ," +
"`faction` enum('') NULL COMMENT 'Specifies the Faction which can safely access the item.' ," +
"PRIMARY KEY(`identifier`)" +
")";
cmd.ExecuteNonQuery();
lastId = id;
}
//Create command to execute the insertion of Data into desired Table
cmd.CommandText = $"INSERT INTO `{dataTableName}` " +
"(`identifier`, `zone_id`, `x_axis`, `y_axis`, `z_axis`, `situation`, `faction`, `Create_Date`, `Update_Date`) " +
"VALUES (@Identifier, @Zone_Id, @X_Axis, @Y_Axis, @Z_Axis, @Situation, @Faction, @Create_Date, @Update_Date)";
//Add data value with Parameters.
cmd.CommandType = CommandType.Text;
//cmd.Parameters.AddWithValue("@Identifier", identifier);
cmd.Parameters.Clear(); //Clear the previous iteration's parameters before re-adding.
cmd.Parameters.AddWithValue("@Identifier", id);
cmd.Parameters.AddWithValue("@Zone_Id", zone);
cmd.Parameters.AddWithValue("@X_Axis", x);
cmd.Parameters.AddWithValue("@Y_Axis", y);
cmd.Parameters.AddWithValue("@Z_Axis", z);
cmd.Parameters.AddWithValue("@Situation", state);
cmd.Parameters.AddWithValue("@Faction", faction);
cmd.Parameters.AddWithValue("@Create_Date", DateTime.Now.Date);
cmd.Parameters.AddWithValue("@Update_Date", DateTime.Now.Date);
cmd.ExecuteNonQuery();
} //for (int i = 0; i < dt.Rows.Count; i++)
//Commit the Transaction
transaction.Commit();
} //using (var transaction = dbConnection.BeginTransaction())
} //using (var cmd = new MySqlCommand())
//Close the Connection
dbConnection.Close();

I don't think this will work (as expected) with MySQL. There are a few statements that cause an implicit commit - CREATE TABLE is one of them.
http://dev.mysql.com/doc/refman/5.7/en/implicit-commit.html
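One way to work with that restriction, assuming the set of target tables can be computed up front, is to run all of the CREATE TABLE statements before the transaction begins, so that the implicit commits happen outside it. A sketch (the helper and the two-phase split are mine, not the original code):

```csharp
// Helper: distinct target table names for the rows in dt.
static IEnumerable<string> DistinctTableNames(DataTable dt) =>
    dt.AsEnumerable()
      .Select(r => r.Field<uint>("Entry"))
      .Distinct()
      .Select(id => $"entry_{id}");

// Phase 1: DDL outside any transaction (the implicit commit is harmless here).
foreach (var table in DistinctTableNames(dt))
{
    cmd.CommandText = $"CREATE TABLE IF NOT EXISTS `{table}` (/* columns as above */)";
    cmd.ExecuteNonQuery();
}

// Phase 2: only the INSERTs run inside the transaction.
using (var transaction = dbConnection.BeginTransaction())
{
    cmd.Transaction = transaction;
    // ... INSERT loop from the question ...
    transaction.Commit();
}
```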

Consider a using statement
You can actually wrap your existing dbConnection in a using statement to ensure that it is safely disposed of (similar to how you are handling your transactions, commands, etc.):
//Open the SQL Connection
using (var dbConnection = new MySqlConnection(GetConnectionString(WowDatabase)))
{
// Other code omitted for brevity
}
Consistent String Interpolation
You have a few spots where you simply concatenate strings via +, but you are mostly taking advantage of C# 6's string interpolation feature. You might want to consider using it everywhere:
string dataTableName = $"entry_{id}";
No Need for Setting CommandType
Additionally, you could remove the setting of the CommandType property on your cmd object, as CommandType.Text is the default:
//cmd.CommandType = CommandType.Text;
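Putting those three suggestions together, the skeleton might look like this (a sketch, with the loop body elided):

```csharp
using (var dbConnection = new MySqlConnection(GetConnectionString(WowDatabase)))
using (var cmd = dbConnection.CreateCommand())
{
    dbConnection.Open();
    using (var transaction = dbConnection.BeginTransaction())
    {
        cmd.Transaction = transaction;
        foreach (DataRow row in dt.Rows)
        {
            string dataTableName = $"entry_{row.Field<uint>("Entry")}"; // interpolation throughout
            // ... CREATE TABLE / INSERT as in the question; no CommandType assignment needed ...
        }
        transaction.Commit();
    }
} // connection disposed (and therefore closed) here
```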

Related

Read Entity objects from database using EntityDataReader

For some reason I need to read entity objects directly from the database using ADO.NET.
I found the snippet below in the Microsoft documentation. I want to know whether there are any methods to read a whole row into an object ('contact' in this sample) using EntityDataReader, instead of mapping every single field to every property. I mean, instead of reading Contact.Id, Contact.Name, and the other fields one by one, is there a method which reads one row into one object?
using (EntityConnection conn =
new EntityConnection("name=AdventureWorksEntities"))
{
conn.Open();
string esqlQuery = @"SELECT VALUE contacts FROM
AdventureWorksEntities.Contacts AS contacts
WHERE contacts.ContactID = @id";
// Create an EntityCommand.
using (EntityCommand cmd = conn.CreateCommand())
{
cmd.CommandText = esqlQuery;
EntityParameter param = new EntityParameter();
param.ParameterName = "id";
param.Value = 3;
cmd.Parameters.Add(param);
// Execute the command.
using (EntityDataReader rdr =
cmd.ExecuteReader(CommandBehavior.SequentialAccess))
{
// The result returned by this query contains
// Address complex Types.
while (rdr.Read())
{
// Display CustomerID
Console.WriteLine("Contact ID: {0}",
rdr["ContactID"]);
// Display Address information.
DbDataRecord nestedRecord =
rdr["EmailPhoneComplexProperty"] as DbDataRecord;
Console.WriteLine("Email and Phone Info:");
for (int i = 0; i < nestedRecord.FieldCount; i++)
{
Console.WriteLine(" " + nestedRecord.GetName(i) +
": " + nestedRecord.GetValue(i));
}
}
}
}
conn.Close();
}
Your cleanest option is to execute your query using Entity Framework, as suggested by @herosuper.
In your example, you'd need to do something like this:
EntityContext ctx = new EntityContext();
var contacts= ctx.Contacts
.SqlQuery("SELECT * FROM AdventureWorksEntities.Contacts AS contacts "
+ "WHERE contacts.ContactID = @id", new SqlParameter("@id", 3)).ToList();
From here, you would be able to:
var myvariable = contacts[0].ContactID; // zero is the index into the list; you could also use a foreach loop.
var mysecondvariable = contacts[0].EmailPhoneComplexProperty;
Alternatively, you might skip the whole SQL string by doing this:
EntityContext ctx = new EntityContext();
var contact= ctx.Contacts.Where(a=> a.ContactID ==3).ToList();
I'm assuming the query returns more than one record, otherwise you would just use FirstOrDefault() instead of Where()

Import Text File data to PostgreSQL database using C#

private void GetTextFile()
{
NpgsqlConnection npgsqlConnection = new NpgsqlConnection();
npgsqlConnection.ConnectionString = "Server=127.0.0.1;Port=5432;User Id=postgres;Password=rutuparna;Database=Employee";
npgsqlConnection.Open();
NpgsqlCommand command = new NpgsqlCommand("Select * from employee_details", npgsqlConnection);
NpgsqlDataReader dataReader = command.ExecuteReader();
using (System.IO.StreamWriter writer = new System.IO.StreamWriter(@"D:\Rutu\txtfile.txt", false, Encoding.UTF8))
{
while (dataReader.Read())
{
writer.WriteLine(dataReader[0] + "; " + dataReader[1] + ";" + dataReader[2] + ";" + dataReader[3]);
}
}
MessageBox.Show("Data fetched Properly");
}
Here I have shown how I export the data to a text file.
Can somebody please give me code for importing the text file's data into the SQL database using C#?
Why not something like this
using (SqlConnection con = new SqlConnection(@"your connection string"))
{
con.Open();
using(StreamReader file = new StreamReader(@"D:\Rutu\txtfile.txt"))
{
string line;
while((line = file.ReadLine()) != null)
{
string[] fields = line.Split(',');
SqlCommand cmd = new SqlCommand("INSERT INTO employee_details(column1, column2, column3, column4) VALUES (@value1, @value2, @value3, @value4)", con);
cmd.Parameters.AddWithValue("@value1", fields[0].ToString());
cmd.Parameters.AddWithValue("@value2", fields[1].ToString());
cmd.Parameters.AddWithValue("@value3", fields[2].ToString());
cmd.Parameters.AddWithValue("@value4", fields[3].ToString());
cmd.ExecuteNonQuery();
}
}
}
I don't know the names of your columns, so you'll need to replace them.
If speed is a consideration (and IMHO it always should be) and you are using PostgreSQL (which you seem to be), then you should take a look at the COPY function. Single inserts are always slower than a bulk operation.
It is relatively easy to wrap the COPY function in C#. The following is a cut-down version of a method, to illustrate the point. My method loops through the rows of a DataTable, but it is easy to adapt to a file situation (ReadLine() etc.).
using (var pgConn = new NpgsqlConnection(myPgConnStr))
{
using (var writer = pgConn.BeginBinaryImport("COPY " + destinationTableName + " (" + commaSepFieldNames + ") FROM STDIN (FORMAT BINARY)"))
{
//Loop through data
for (int i = 0; i < endNo; i++)
{
writer.StartRow();
//inner loop through fields
for (int j = 0; j < fieldNo; j++)
{
//test for null (value comes from the current row/field in the full method)
if (value == null || value == DBNull.Value)
{
writer.WriteNull();
}
else
{
//Write data using column types
writer.Write(value, type);
}
}
}
writer.Complete();
}
}
WriteNull() is a special method for correctly adding NULL to the stream; otherwise you use Write<T>(). The latter has three overloads: just the value, value with a type enumeration, or value with a type name. I prefer to use the enumeration. commaSepFieldNames is a comma-separated list of the field names. The other variables should (I hope) be self-explanatory.
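For the text-file case from the earlier question, the same pattern might look like the sketch below; the table name, semicolon delimiter, and text-typed columns are assumptions carried over from that example, not tested code:

```csharp
using (var pgConn = new NpgsqlConnection(myPgConnStr))
{
    pgConn.Open();
    using (var writer = pgConn.BeginBinaryImport(
        "COPY employee_details (column1, column2, column3, column4) FROM STDIN (FORMAT BINARY)"))
    using (var file = new StreamReader(@"D:\Rutu\txtfile.txt"))
    {
        string line;
        while ((line = file.ReadLine()) != null)
        {
            var fields = line.Split(';');
            writer.StartRow();
            foreach (var field in fields)
            {
                var value = field.Trim();
                if (value.Length == 0)
                    writer.WriteNull();          // empty field -> NULL
                else
                    writer.Write(value, NpgsqlDbType.Text);
            }
        }
        writer.Complete();
    }
}
```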

SQLiteException constraint failed

I have a SQLite table which I've created, and it works fine when inserting data which is non-zero. However, I need to insert some zero default values, and the SQLiteParameter seems to be converting the zero values to null.
Can someone explain why I'm getting @xxxx3=null instead of @xxxx3=0, and also how to fix it?
This appears to happen for any numeric field (INTEGER/NUMERIC).
I've put together a simplified example that shows the problem
class Program
{
private static List<SQLiteParameter> DefaultSystemParameters()
{
List<SQLiteParameter> sp = new List<SQLiteParameter>()
{
new SQLiteParameter("@xxxx2", 60),
//new SQLiteParameter("@xxxx3", 1), // Works fine
new SQLiteParameter("@xxxx3", 0), // Throws 'System.Data.SQLite.SQLiteException' NOT NULL constraint failed: tblxxxx.xxxx3
};
return sp;
}
static void Main(string[] args)
{
//Add Nuget package - System.Data.SQLite v 1.0.99
string baseDir = AppDomain.CurrentDomain.BaseDirectory + AppDomain.CurrentDomain.RelativeSearchPath + "db\\";
string fileName = "test.db";
string sqlCreateTable = "CREATE TABLE IF NOT EXISTS tblxxxx (" +
"xxxx1 INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT," +
"xxxx2 INTEGER NOT NULL," +
"xxxx3 INTEGER NOT NULL" +
")";
string sqlInsert = "INSERT INTO tblxxxx (xxxx2, xxxx3) VALUES (@xxxx2, @xxxx3)";
if (!Directory.Exists(baseDir))
Directory.CreateDirectory(baseDir);
DataTable dt = new DataTable();
string connectionString = $"Data Source={baseDir + fileName};Version=3;";
using (var connection = new SQLiteConnection(connectionString))
{
connection.Open();
using (var transaction = connection.BeginTransaction())
{
//CREATE
using (SQLiteCommand command = new SQLiteCommand(sqlCreateTable, connection))
{
command.CommandType = CommandType.Text;
command.ExecuteNonQuery();
//INSERT
command.CommandText = sqlInsert;
command.Parameters.AddRange(DefaultSystemParameters().ToArray());
command.ExecuteNonQuery();
}
transaction.Commit();
}
}
}
}
From https://msdn.microsoft.com/en-us/library/0881fz2y(v=vs.110).aspx:
Use caution when you use this overload of the SqlParameter constructor
to specify integer parameter values. Because this overload takes a
value of type Object, you must convert the integral value to an Object
type when the value is zero, as the following C# example demonstrates.
Parameter = new SqlParameter("@pname", (object)0);
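Applied to the example above: with a literal 0, the `SQLiteParameter(string, int)` call binds to the `(string, DbType)` constructor overload, because a constant 0 implicitly converts to any enum type, so the parameter gets a type but never a value. Boxing the zero (or assigning `Value` explicitly) forces the `(string, object)` overload - a sketch:

```csharp
List<SQLiteParameter> sp = new List<SQLiteParameter>()
{
    new SQLiteParameter("@xxxx2", 60),
    // Boxing forces the (string, object) constructor overload:
    new SQLiteParameter("@xxxx3", (object)0),
    // or, equivalently, set the type and the value explicitly:
    // new SQLiteParameter("@xxxx3", DbType.Int32) { Value = 0 },
};
```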

Fastest way to update more than 50,000 rows in an .mdb database in C#

I searched the net but nothing really helped me. I want to update a database from a list of articles, but the way I've found is really slow.
This is my code:
List<Article> costs = GetIdCosts(); //here there are 70.000 articles
conn = new OleDbConnection(string.Format(MDB_CONNECTION_STRING, PATH, PSW));
conn.Open();
transaction = conn.BeginTransaction();
using (var cmd = conn.CreateCommand())
{
cmd.Transaction = transaction;
cmd.CommandText = "UPDATE TABLE_RO SET TABLE_RO.COST = ? WHERE TABLE_RO.ID = ?;";
for (int i = 0; i < costs.Count; i++)
{
double cost = costs[i].Cost;
int id = costs[i].Id;
cmd.Parameters.AddWithValue("data", cost);
cmd.Parameters.AddWithValue("id", id);
if (cmd.ExecuteNonQuery() != 1) throw new Exception();
}
}
transaction.Commit();
But this way takes many minutes, something like 10 minutes or more. Is there another way to speed up this update? Thanks.
Try modifying your code to this:
List<Article> costs = GetIdCosts(); //here there are 70.000 articles
// Setup and open the database connection
conn = new OleDbConnection(string.Format(MDB_CONNECTION_STRING, PATH, PSW));
conn.Open();
// Setup a command
OleDbCommand cmd = new OleDbCommand();
cmd.Connection = conn;
cmd.CommandText = "UPDATE TABLE_RO SET TABLE_RO.COST = ? WHERE TABLE_RO.ID = ?;";
// Setup the parameters and prepare the command to be executed
cmd.Parameters.Add("?", OleDbType.Currency, 255);
cmd.Parameters.Add("?", OleDbType.Integer, 8); // Assuming your ID is never longer than 8 digits
cmd.Prepare();
OleDbTransaction transaction = conn.BeginTransaction();
cmd.Transaction = transaction;
// Start the loop
for (int i = 0; i < costs.Count; i++)
{
cmd.Parameters[0].Value = costs[i].Cost;
cmd.Parameters[1].Value = costs[i].Id;
try
{
cmd.ExecuteNonQuery();
}
catch (Exception ex)
{
// handle any exception here
}
}
transaction.Commit();
conn.Close();
The cmd.Prepare method will speed things up since it creates a compiled version of the command on the data source.
Small change option:
Using StringBuilder and string.Format, construct one big command text.
var sb = new StringBuilder();
for(....){
sb.AppendLine(string.Format("UPDATE TABLE_RO SET TABLE_RO.COST = '{0}' WHERE TABLE_RO.ID = '{1}';",cost, id));
}
Even faster option:
As in the first example, construct the SQL, but this time make it look (in the result) like:
-- declare a table variable
declare @data table (id int primary key, cost decimal(10,8))
-- insert union-selected values into the table
insert into @data
select 1121 as id, 10.23 as cost
union select 1122 as id, 58.43 as cost
union select ...
-- update TABLE_RO using update-join syntax, joining on @data
-- and copying the value from the column in @data to the column in TABLE_RO
update dest
set dest.cost = source.cost
from TABLE_RO dest
inner join @data source on dest.id = source.id
This is the fastest you can get without using bulk inserts.
Performing mass updates with ADO.NET and OleDb is painfully slow. If possible, you could consider performing the update via DAO. Just add a reference to the DAO library (COM object) and use something like the following code (caution -> untested):
// Import Reference to "Microsoft DAO 3.6 Object Library" (COM)
string TargetDBPath = "insert Path to .mdb file here";
DAO.DBEngine dbEngine = new DAO.DBEngine();
DAO.Database daodb = dbEngine.OpenDatabase(TargetDBPath, false, false, "MS Access;pwd="+"insert your db password here (if you have any)");
DAO.Recordset rs = daodb.OpenRecordset("insert target Table name here", DAO.RecordsetTypeEnum.dbOpenDynaset);
if (rs.RecordCount > 0)
{
rs.MoveFirst();
while (!rs.EOF)
{
// Load id of row
int rowid = rs.Fields["Id"].Value;
// Iterate List to find entry with matching ID
for (int i = 0; i < costs.Count; i++)
{
double cost = costs[i].Cost;
int id = costs[i].Id;
if (rowid == id)
{
// Save changed values
rs.Edit();
rs.Fields["Cost"].Value = cost;
rs.Update();
}
}
rs.MoveNext();
}
}
rs.Close();
Note the fact that we are doing a full table scan here. But, unless the total number of records in the table is many orders of magnitude bigger than the number of updated records, it should still outperform the ADO.NET approach significantly...
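If the cost list is large, the inner for-loop over costs for every record becomes its own bottleneck (an O(n*m) scan). Building a dictionary keyed by Id first keeps the whole pass linear - a sketch, under the same untested caveat as the code above (the "Cost" column name is an assumption):

```csharp
// Build the lookup once, instead of scanning the list per record.
var costById = costs.ToDictionary(a => a.Id, a => a.Cost);

rs.MoveFirst();
while (!rs.EOF)
{
    int rowid = (int)rs.Fields["Id"].Value;
    if (costById.TryGetValue(rowid, out double cost))
    {
        rs.Edit();
        rs.Fields["Cost"].Value = cost;
        rs.Update();
    }
    rs.MoveNext();
}
```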

UPDATE faster in SQLite + BEGIN TRANSACTION

This one is related to Spatialite as well (not only SQLite).
I have a file database (xyz.db) which I am using through SQLiteConnection (SQLiteConnection extends to Spatialite).
I have many records that need to be updated in the database.
for (int y = 0; y < castarraylist.Count; y++)
{
string s = Convert.ToString(castarraylist[y]);
string[] h = s.Split(':');
SQLiteCommand sqlqctSQL4 = new SQLiteCommand("UPDATE temp2 SET GEOM = " + h[0] + " WHERE " + dtsqlquery2.Columns[0] + "=" + h[1] + "", con);
sqlqctSQL4.ExecuteNonQuery();
x = x + 1;
}
In the logic above, castarraylist is an ArrayList which contains the values that need to be processed into the database.
When I checked, the above code updates around 400 records per minute.
Is there any way I can improve the performance?
NOTE: the file database is not thread-safe.
2. BEGIN TRANSACTION
Let's suppose I'd like to run two (or millions of) update statements in a single transaction in Spatialite. Is that possible?
I read up online and prepared the statements below (but did not get them to work):
BEGIN TRANSACTION;
UPDATE builtuparea_luxbel SET ADMIN_LEVEL = 6 where PK_UID = 2;
UPDATE builtuparea_luxbel SET ADMIN_LEVEL = 6 where PK_UID = 3;
COMMIT TRANSACTION;
The statements above do not update the records in my database.
Does SQLite not support BEGIN TRANSACTION?
Is there anything I am missing?
And if I run the statements individually, updating takes too much time, as said above...
SQLite supports transactions; you can try the code below.
using (var cmd = new SQLiteCommand(conn))
using (var transaction = conn.BeginTransaction())
{
for (int y = 0; y < castarraylist.Count; y++)
{
//Add your query here.
cmd.CommandText = "INSERT INTO TABLE (Field1,Field2) VALUES ('A', 'B');";
cmd.ExecuteNonQuery();
}
transaction.Commit();
}
The primary goal of a database transaction is to get everything done, or nothing if something fails inside.
Reusing the same SQLiteCommand object by changing its CommandText property and executing it again and again might be faster, but it leads to memory overhead: if you have a large number of queries to perform, it is best to dispose of the object after use and create a new one.
A common pattern for an ADO.NET transaction is:
using (var tra = cn.BeginTransaction())
{
try
{
foreach(var myQuery in myQueries)
{
using (var cd = new SQLiteCommand(myQuery, cn, tra))
{
cd.ExecuteNonQuery();
}
}
tra.Commit();
}
catch(Exception ex)
{
tra.Rollback();
Console.Error.WriteLine("I did nothing, because something wrong happened: {0}", ex);
throw;
}
}
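Applied to the UPDATE loop from the question, the transaction plus a reusable parameterised command might look like the sketch below. Two assumptions: the key column is simply called id here (the question builds it from dtsqlquery2.Columns[0]), and GEOM is bound as a plain value - if h[0] is actually an SQL expression such as a Spatialite function call, it cannot be passed as a parameter and must stay in the command text:

```csharp
using (var tra = con.BeginTransaction())
using (var cmd = new SQLiteCommand("UPDATE temp2 SET GEOM = @geom WHERE id = @id", con, tra))
{
    cmd.Parameters.Add("@geom", DbType.String);
    cmd.Parameters.Add("@id", DbType.String);
    for (int y = 0; y < castarraylist.Count; y++)
    {
        string[] h = Convert.ToString(castarraylist[y]).Split(':');  // "geom:id", as in the question
        cmd.Parameters["@geom"].Value = h[0];
        cmd.Parameters["@id"].Value = h[1];
        cmd.ExecuteNonQuery();
    }
    tra.Commit();
}
```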
