How to copy large datatable to MySql table? - c#

I am trying to copy a large DataTable (several columns, more than 1000 rows) created dynamically in the application to a MySQL table using C# and WPF.
I have searched for various ways to do this but couldn't get any of them to work. I think the MySqlDataAdapter class is what I should use, but I can't make it work. This is what I tried to do:
MySqlConnection con = new MySqlConnection(MyConString);
MySqlCommand comm = new MySqlCommand("Select * From kinectdata", con);
MySqlDataAdapter test1 = new MySqlDataAdapter(comm);
test1.Update(skelData);
Speed of this transfer is also important, so I preferred not to call an INSERT or UPDATE statement 1000 times.
Many thanks for your feedback!
M

You can build a single INSERT statement which inserts all 1000 rows.
INSERT INTO table VALUES (1,2,3), (4,5,6), (7,8,9);
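For illustration, a minimal sketch of building such a statement in C# from the question's DataTable (assumes MySql.Data.MySqlClient, a three-column kinectdata table, and the con/skelData names from the question; adjust to your schema):
var valueGroups = new List<string>();
var cmd = new MySqlCommand { Connection = con };
int p = 0;
foreach (DataRow row in skelData.Rows)
{
    // one parenthesized value group per row, e.g. (@p0, @p1, @p2)
    valueGroups.Add(string.Format("(@p{0}, @p{1}, @p{2})", p, p + 1, p + 2));
    cmd.Parameters.AddWithValue("@p" + p++, row[0]);
    cmd.Parameters.AddWithValue("@p" + p++, row[1]);
    cmd.Parameters.AddWithValue("@p" + p++, row[2]);
}
cmd.CommandText = "INSERT INTO kinectdata VALUES " + string.Join(",", valueGroups);
con.Open();
cmd.ExecuteNonQuery();
con.Close();
Note that a very large statement can exceed MySQL's max_allowed_packet setting, so you may need to send the rows in batches.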

1000 rows is not that much; in database terms it's nothing, and using INSERT should be very fast, no more than 2 seconds.
In your example, you do have to declare the command type and set your query and command text; the select command alone is not enough for Update to insert anything.
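If you would rather stay with MySqlDataAdapter.Update as in the question, a minimal sketch (assuming MySql.Data.MySqlClient and the names from the question; the rows in skelData must be in the Added state for Update to insert them):
MySqlDataAdapter adapter = new MySqlDataAdapter("SELECT * FROM kinectdata", con);
MySqlCommandBuilder builder = new MySqlCommandBuilder(adapter); // generates the missing InsertCommand
adapter.Update(skelData); // inserts rows whose RowState is Added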

Related

OleDbDataAdapter Sporadic Missing Records

In my application I'm getting some data out of a local MS Access database file. I'm puzzled by a sporadic issue where my query for all records of a specific table sometimes returns all the records, and sometimes returns all but the last record. I'm using the following code
string resourceConStr = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:/FileName.mdb";
OleDbConnection resourceCon = new OleDbConnection(resourceConStr);
OleDbDataAdapter personnelAdapter = new OleDbDataAdapter("Select * From Personnel", resourceCon);
DataTable personnel = new DataTable();
personnelAdapter.Fill(personnel);
When I look at the personnel DataTable, sometimes I have the correct number of records and sometimes I'm missing the last record from the Access table. I haven't been able to find any pattern as to when it works successfully and when it does not. Any idea what could be the reason for this, or suggestions for a way to validate that all records were copied into the DataTable successfully? Thanks
Any ... suggestions or a way to validate that all records were copied into the DataTable successfully?
One way to do it would be to execute a SELECT COUNT(*) AS n FROM Personnel, and compare that number (assuming that you get one back) with the number of rows in the DataTable after it gets filled.
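A minimal sketch of that check, reusing the names from the question (this only validates the copy; it does not fix the underlying issue):
using (OleDbCommand countCmd = new OleDbCommand("SELECT COUNT(*) AS n FROM Personnel", resourceCon))
{
    resourceCon.Open();
    int n = Convert.ToInt32(countCmd.ExecuteScalar());
    resourceCon.Close();
    if (n != personnel.Rows.Count)
        Console.WriteLine("Mismatch: " + n + " rows in Access, " + personnel.Rows.Count + " in the DataTable.");
}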

Correct work with MS SQL

I have a program that gets data from SQL server table. The code is the following:
SqlConnection conn=new SqlConnection(...);//correct
conn.Open();
DataTable dt=new DataTable();
SqlCommand selectCMD = new SqlCommand("SELECT * FROM TABLE WHERE Condition", conn);
SqlDataAdapter custDA = new SqlDataAdapter();
custDA.SelectCommand = selectCMD;
custDA.Fill(dt);
Datagridview1.DataSource=dt;
Datagridview1.DataBind();
But the problem is that executing the same query in SQL Server Management Studio takes less than a second, while the program takes half a minute to get the result. Using the debugger I can see that the line where the program "thinks" the longest is the one where the data adapter fills the DataTable. Any suggestions on how to reduce the time? What's wrong in my code?
Management Studio just displays text results, while SqlDataAdapter must map each result column value to a DataGridView column value; one will take much more time than the other. Management Studio also virtualizes the results: it doesn't show all the results at once, and for large result sets more data is retrieved as you scroll down.
Check whether you need proper indexing on the required columns. When you run the query from SSMS it might be using an execution plan that is more optimized than the one used when the .Fill method executes.
Can you try cleaning out the procedure cache and memory buffers using SSMS:
DBCC DROPCLEANBUFFERS
DBCC FREEPROCCACHE
Doing so before you test your query prevents the use of cached execution plans and previously cached results.
You can create an index and set options to accelerate query execution. You can also use this approach to load the data: SqlDataAdapter.Fill - Asynchronous approach.
create index condition_idx on table (condition)
new SqlCommand("set nocount on;SELECT * FROM TABLE WHERE Condition", conn);
Thanks everyone for the help. I used parameters in the SqlCommand object; unfortunately I hadn't mentioned that, so you couldn't help me. But thanks to the link James posted, I found that when a SqlCommand is used with parameters, the execution goes through sp_executesql. Because the server has to compile it, that's why it took so long. After the parameters were removed, everything works well. Alternatively, you can turn off the automatic recompilation that happens each time the stored procedure executes. Thanks everyone again. +1 for everyone.
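For reference, a parameterized ad-hoc SqlCommand like the sketch below is sent to the server wrapped in sp_executesql, which is the behavior described above (table and parameter names are placeholders):
SqlCommand cmd = new SqlCommand("SELECT * FROM TABLE WHERE Condition = @value", conn);
cmd.Parameters.AddWithValue("@value", 42);
// The server receives roughly:
// EXEC sp_executesql N'SELECT * FROM TABLE WHERE Condition = @value', N'@value int', @value = 42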

SqlDataAdapter.Fill() - Conversion overflow

All,
I am encountering "Conversion overflow" exceptions in one of the SqlDataAdapter.Fill() usages for a decimal field. The error occurs for values of 10 billion and above, but not for values up to 1 billion. Here is the code:
DataSet ds = new DataSet();
SqlDataAdapter adapter = new SqlDataAdapter();
adapter.SelectCommand = <my SQL Command instance>;
adapter.Fill(ds);
I have read about using SqlDataReader as an alternative, but then the data type and precision have to be set explicitly. I am fetching at least 70 columns, and I don't want to set all of them just for the one decimal field in error.
Can anyone suggest alternate approaches?
Thank you.
Although a DataSet can be the target when filling from a data adapter, I've typically done this with a DataTable instead, since when querying I'm only expecting one result set. Having said that, I would pre-query the table just to get its structure... something like
select whatever from yourTable(s) where 1=2
This will get the expected result columns when you do a
DataTable myTable = new DataTable();
YourAdapter.Fill( myTable );
Now that you have a local table that will not fail on content size (because no records have been returned), you can explicitly go to the one column in question and set its data type / size information as you need...
myTable.Columns["NameOfProblemColumn"].WhateverDataType/Precision = Whatever you need...
Now your local schema is legitimate, and the problem column has been identified with its precision. Next, put in your proper query, with the real WHERE clause instead of 1=2, to actually return data. Since no actual rows came back in the first pass, you don't even need to call myTable.Clear(); just re-run the query and dataAdapter.Fill().
I haven't actually tried it, as I don't have your data to simulate the same problem, but the theoretical process should get you by without having to explicitly go through all the columns... just the few that may pose a problem.
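Put together, the two-pass idea might look like the sketch below (untested, as noted above; all names are placeholders):
SqlDataAdapter adapter = new SqlDataAdapter("SELECT * FROM YourTable WHERE 1=2", conn);
DataTable myTable = new DataTable();
adapter.Fill(myTable); // zero rows, but the columns come back typed
// With no rows loaded, the problem column's type can still be changed:
myTable.Columns["NameOfProblemColumn"].DataType = typeof(string);
// Second pass: run the real query into the adjusted schema.
adapter.SelectCommand.CommandText = "SELECT * FROM YourTable WHERE RealCondition = 1";
adapter.Fill(myTable);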
I had the same problem, and the reason is that my stored procedure returned a decimal(38,20) field. I changed it to decimal(20,10) and everything works fine. It seems to be a limitation of ADO.NET: .NET's System.Decimal only holds 28-29 significant digits, so a decimal(38,20) value can overflow it.
CREATE PROCEDURE FOOPROCEDURE AS
BEGIN
DECLARE @A DECIMAL(38,20) = 999999999999999999.99999999999999999999;
SELECT @A;
END
GO
string connectionString ="";
SqlConnection conn = new SqlConnection(connectionString);
conn.Open();
SqlCommand cmd = new SqlCommand("EXEC FOOPROCEDURE", conn);
SqlDataAdapter adt = new SqlDataAdapter(cmd);
DataSet ds = new DataSet();
adt.Fill(ds); //exception thrown here

Data Adapter Vs Sql Command

Which one is better for executing an insert statement against an MS SQL database: a SqlDataAdapter or a SqlCommand object?
Which of them is better when inserting only one row, and when inserting multiple rows?
A simple example of code usage:
SQL Command
string query = "insert into Table1 (col1, col2, col3) values (@value1, @value2, @value3)";
int i;
SqlCommand cmd = new SqlCommand(query, connection);
// add parameters...
cmd.Parameters.Add("@value1", SqlDbType.VarChar).Value = txtBox1.Text;
cmd.Parameters.Add("@value2", SqlDbType.VarChar).Value = txtBox2.Text;
cmd.Parameters.Add("@value3", SqlDbType.VarChar).Value = txtBox3.Text;
connection.Open();
i = cmd.ExecuteNonQuery();
connection.Close();
SQL Data Adapter
DataSet dsTab = new DataSet("Table1");
SqlDataAdapter adp = new SqlDataAdapter("Select * from Table1", connection);
adp.Fill(dsTab, "Table1");
DataRow dr = dsTab.Tables["Table1"].NewRow();
dr["col1"] = txtBox1.Text;
dr["col2"] = txtBox5.Text;
dr["col3"] = "text";
dsTab.Tables["Table1"].Rows.Add(dr);
SqlCommandBuilder projectBuilder = new SqlCommandBuilder(adp);
DataSet newSet = dsTab.GetChanges(DataRowState.Added);
adp.Update(newSet, "Table1");
Updating a data source is much easier using DataAdapters. It's easier to make changes since you just have to modify the DataSet and call Update.
There is probably no (or very little) difference in the performance between using DataAdapters vs Commands. DataAdapters internally use Connection and Command objects and execute the Commands to perform the actions (such as Fill and Update) that you tell them to do, so it's pretty much the same as using only Command objects.
I would use LINQ to SQL with a DataSet for single inserts and most database CRUD requests. It is type-safe and relatively fast for uncomplicated queries such as the one above.
If you have many rows to insert (1000+) and you are using SQL Server 2008, I would use SqlBulkCopy, as sketched below. You can feed your DataSet into a stored procedure and merge it into your destination.
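A minimal SqlBulkCopy sketch (assumes the DataTable's columns line up with the destination table; names reuse the adapter example above):
using (SqlBulkCopy bulk = new SqlBulkCopy(connection))
{
    bulk.DestinationTableName = "dbo.Table1";
    connection.Open();
    bulk.WriteToServer(dsTab.Tables["Table1"]); // one bulk operation instead of row-by-row inserts
    connection.Close();
}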
For complicated queries I recommend using dapper in conjunction with stored procedures.
I suggest you keep some kind of control over your communication with the database. That means abstracting away some code, and for that the CommandBuilder automatically generates the CUD statements for you.
Even better is to use that technique together with a typed DataSet; then you have IntelliSense and compile-time checking on all your columns.

Does DataSet occupy too much space?

If I wish to add some information to my SQL Server database, must I do it through a DataSet and a DataAdapter?
The idea is that if my database has 1-2 million entries, isn't my memory going to be occupied unnecessarily by the 1-2 million rows in the DataSet, considering that I want to add only one row? Is there an alternative?
If you're only inserting a row, that needn't fetch anything into the DataSet/DataAdapter. You add the row, submit the changes, and the relevant INSERT command will be executed.
You could always create a plain old ADO.NET parameterized SqlCommand holding a simple SQL INSERT statement, provide the parameters, and load the data that way (nothing needs to be loaded first, and it doesn't matter how many rows you already have; it will just work):
string insertStmt = "INSERT INTO dbo.YourTable(col1, col2, ...., colN) " +
"VALUES(@Value1, @Value2, ...., @ValueN)";
using (SqlConnection _con = new SqlConnection("-your-connection-string-here-"))
using (SqlCommand _cmdInsert = new SqlCommand(insertStmt, _con))
{
// define the parameters for your query
_cmdInsert.Parameters.Add("@Value1", SqlDbType.Int);
.......
// set the values
_cmdInsert.Parameters["@Value1"].Value = 4711;
.....
_con.Open();
int rowsInserted = _cmdInsert.ExecuteNonQuery();
_con.Close();
}
If you have multiple rows to insert, you could loop over e.g. a list of objects, set the values on _cmdInsert for each object, and execute _cmdInsert.ExecuteNonQuery() for each row, as in the sketch below.
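A sketch of that loop (the items list and its properties are placeholders):
_con.Open();
foreach (var item in items)
{
_cmdInsert.Parameters["@Value1"].Value = item.Value1;
// ... set the remaining parameters from the current object ...
_cmdInsert.ExecuteNonQuery();
}
_con.Close();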
Of course, if you use something like an ORM (NHibernate, LINQ to SQL, Entity Framework), that work gets a great deal easier: you just insert new objects into your collection and save them, and the ORM deals with all the nitty-gritty details (essentially generating and executing the code shown above, more or less).
