I'm inserting a large number of records from C# into a SQL Server 2008 Express database using LINQ to SQL, and the insertion seems very slow. Here is the code snippet:
public void InsertData(int id)
{
    MyDataContext dc = new MyDataContext();
    List<Item> result = GetItems(id);
    foreach (var item in result)
    {
        DbItem dbItem = new DbItem() { ItemNo = item.No, ItemName = item.Name };
        dc.Items.InsertOnSubmit(dbItem);
    }
    dc.SubmitChanges();
}
Am I doing anything wrong? Or is using LINQ to insert a large number of records a bad choice?
Update: Thanks for all the answers.
@p.campbell: Sorry about the record count; it was a typo. It is actually around 100,000, and can range up to 200k.
Following the suggestions (plus a requirement change and a design decision), I split this operation into parts: I now retrieve the data in small chunks and insert it into the database as it arrives. I moved this InsertData() method into a thread operation and use SmartThreadPool to create a pool of 25 threads doing the same work, inserting only 100 records at a time. In this scenario, LINQ versus a plain SQL query made no difference in the time taken.
Per my requirements, this operation is scheduled to run every hour and fetches records for around 4k-6k users. So I now queue each user's work (retrieving and inserting into the DB) as one task assigned to one thread. This entire process takes around 45 minutes for around 250k records.
Is there a better way to do this kind of task? Can anyone suggest how I can improve this process?
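As a sketch of the chunked approach described in the update, a small batching helper keeps each insert call bounded. This is a minimal sketch, not the asker's actual code; the batch size of 100 mirrors the update, and the InsertOnSubmit/SubmitChanges calls in the usage comment are from the question:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class Batcher
{
    // Split a list into consecutive batches of at most 'size' items so that
    // each batch can be inserted and submitted on its own.
    public static IEnumerable<List<T>> InBatchesOf<T>(this List<T> source, int size)
    {
        for (int i = 0; i < source.Count; i += size)
            yield return source.Skip(i).Take(size).ToList();
    }
}

// Usage sketch: submit each batch in its own SubmitChanges() call.
// foreach (var batch in GetItems(id).InBatchesOf(100))
// {
//     foreach (var item in batch) { /* dc.Items.InsertOnSubmit(...) */ }
//     dc.SubmitChanges();
// }
```

Smaller batches keep each transaction short; the trade-off is more round trips, which is why the bulk-copy approaches below still win for very large loads.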
For inserting a massive amount of data into SQL Server in one go:
Neither LINQ nor SqlCommand is designed for bulk-copying data into SQL Server.
You can use the SqlBulkCopy class, which provides managed access to the bcp utility for bulk-loading data into SQL Server from pretty much any data source.
The SqlBulkCopy class can be used to write data only to SQL Server tables. However, the data source is not limited to SQL Server; any data source can be used, as long as the data can be loaded into a DataTable instance or read with an IDataReader instance.
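For in-memory objects like the question's items, the usual route is to copy them into a DataTable and hand that to SqlBulkCopy. A minimal sketch, assuming the ItemNo/ItemName columns and a dbo.Items destination table based on the question (both are assumptions, not confirmed schema):

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static class BulkLoader
{
    // Build a DataTable whose columns mirror the destination table's schema.
    public static DataTable ToItemTable(IEnumerable<(int No, string Name)> items)
    {
        var table = new DataTable();
        table.Columns.Add("ItemNo", typeof(int));
        table.Columns.Add("ItemName", typeof(string));
        foreach (var (no, name) in items)
            table.Rows.Add(no, name);
        return table;
    }

    // Push the whole table to SQL Server in a single bulk operation.
    public static void BulkInsert(DataTable table, string connectionString)
    {
        using (var copy = new SqlBulkCopy(connectionString))
        {
            copy.DestinationTableName = "dbo.Items"; // assumed table name
            copy.WriteToServer(table);
        }
    }
}
```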
Performance comparison
SqlBulkCopy is by far the fastest, even when loading data from a simple CSV file.
LINQ will just generate a load of INSERT statements and send them to your SQL Server. This is no different from using ad hoc queries with SqlCommand; the performance of SqlCommand vs. LINQ is virtually identical.
The Proof
(SQL Express 2008, .Net 4.0)
SqlBulkCopy
Using SqlBulkCopy to load 100000 rows from a CSV file (including loading the data)
using (SqlConnection conn = new SqlConnection("Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=EffectCatalogue;Data Source=.\\SQLEXPRESS;"))
{
    conn.Open();
    Stopwatch watch = Stopwatch.StartNew();

    // Load the CSV into a DataTable via the Jet OLE DB text driver.
    string csvConnString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\\data\\;Extended Properties='text;'";
    OleDbDataAdapter oleda = new OleDbDataAdapter("SELECT * FROM [test.csv]", csvConnString);
    DataTable dt = new DataTable();
    oleda.Fill(dt);

    using (SqlBulkCopy copy = new SqlBulkCopy(conn))
    {
        copy.ColumnMappings.Add(0, 1);
        copy.ColumnMappings.Add(1, 2);
        copy.DestinationTableName = "dbo.Users";
        copy.WriteToServer(dt);
    }
    Console.WriteLine("SqlBulkCopy: {0}", watch.Elapsed);
}
SqlCommand
using (SqlConnection conn = new SqlConnection("Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=TestDb;Data Source=.\\SQLEXPRESS;"))
{
    conn.Open();
    Stopwatch watch = Stopwatch.StartNew();
    SqlCommand comm = new SqlCommand("INSERT INTO Users (UserName, [Password]) VALUES ('Simon', 'Password')", conn);
    for (int i = 0; i < 100000; i++)
    {
        comm.ExecuteNonQuery();
    }
    Console.WriteLine("SqlCommand: {0}", watch.Elapsed);
}
LinqToSql
using (SqlConnection conn = new SqlConnection("Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=TestDb;Data Source=.\\SQLEXPRESS;"))
{
    conn.Open();
    Stopwatch watch = Stopwatch.StartNew();
    EffectCatalogueDataContext db = new EffectCatalogueDataContext(conn);
    for (int i = 0; i < 100000; i++)
    {
        User u = new User();
        u.UserName = "Simon";
        u.Password = "Password";
        db.Users.InsertOnSubmit(u);
    }
    db.SubmitChanges();
    Console.WriteLine("Linq: {0}", watch.Elapsed);
}
Results
SqlBulkCopy: 00:00:02.90704339
SqlCommand: 00:00:50.4230604
Linq: 00:00:48.7702995
If you are inserting a large amount of data, you can try BULK INSERT.
To my knowledge, there is no equivalent of BULK INSERT in LINQ to SQL.
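BULK INSERT is plain T-SQL, so it can be issued through ADO.NET like any other statement. A hedged sketch: the table name, delimiters, and file path below are placeholders, and the path must be visible to the SQL Server service itself, not just to the client:

```csharp
using System.Data.SqlClient;

static class BulkInsertHelper
{
    // Compose a BULK INSERT statement; table and serverPath are caller-supplied.
    public static string BuildBulkInsertSql(string table, string serverPath) =>
        $"BULK INSERT {table} FROM '{serverPath}' " +
        "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n', FIRSTROW = 2);";

    // Execute it like any other non-query command.
    public static void Run(string connectionString, string table, string serverPath)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(BuildBulkInsertSql(table, serverPath), conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```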
You've got the SubmitChanges() being called once, which is good. This means that only one connection and transaction are being used.
Consider refactoring your code to use InsertAllOnSubmit() instead.
List<DbItem> newItems = GetItems(id).Select(x => new DbItem { ItemNo = x.No,
                                                              ItemName = x.Name })
                                    .ToList();
dc.Items.InsertAllOnSubmit(newItems);
dc.SubmitChanges();
The INSERT statements are still sent one-by-one as before, but perhaps this is more readable?
Some other things to ask/consider:
* What's the state of the indexes on the target table? Too many will slow down the writes.
* Is the database in the Simple or Full recovery model?
Capture the SQL statements going across the wire, then replay those statements in an ad hoc query against your SQL Server database. I realize you're using SQL Express and likely don't have SQL Profiler; use context.Log = Console.Out; to output your LINQ to SQL statements to the console, though SQL Profiler is more convenient if you have it.
Do the captured SQL statements perform the same as your client code? If so, then the perf problem is at the database side.
Here's a nice walk-through of how to add a Bulk-Insert class to your application, which hugely improves the performance of inserting records using LINQ.
(All source code is provided, ready to be added to your own application.)
http://www.mikesknowledgebase.com/pages/LINQ/InsertAndDeletes.htm
You would just need to make three changes to your code, and link in the class provided.
Good luck !
Related
I am experimenting with Parallel.ForEach in C# together with SQL Server connectivity and table creation: broadly, I am trying to create a large number of tables using Parallel.ForEach. Although the code runs, I notice data loss, in that not all tables are created. I am reading data from a Microsoft.Data.Analysis DataFrame, which is IEnumerable. I would therefore like to ask for a skeleton of how to run Parallel.ForEach with SQL connections and commands, so that multiple table creations run on parallel threads against the same database at the same time, while making sure all tables are created. I am not an experienced C# user, so thank you in advance for your help.
Here is some code as well; sorry for the bad formatting (first time...).
Task task_111 = Task.Factory.StartNew(() =>
{
    Parallel.ForEach<string>(list, iter =>
    {
        DataFrame loadExample = DataFrame.LoadCsv(/* csv path for this iteration */);
        // ......
        string sql = /* create table statement for this iteration */;
        using (var myConn = new SqlConnection(/* connection string */))
        using (var command = new SqlCommand(sql, myConn))
        {
            myConn.Open();
            command.ExecuteNonQuery();
        }
        // .......
    });
});
I have an idea (I don't know if it's good or bad).
I have a utility which connects to a SQL server on a schedule and fetches some data into an application. The data is simple (2 varchar text attributes), but the row count is around 3 million, so my application uses the network very intensively.
Can I programmatically decrease (limit, throttle, etc.) the network bandwidth used by SqlDataReader? Let it work more slowly, but stress neither the server nor the client. Is this a good idea? If not, what should I do instead?
Here is code, so far:
using (SqlConnection con = new SqlConnection("My connection string here"))
{
con.Open();
using (SqlCommand command = new SqlCommand(query, con))
{
using (SqlDataReader reader = command.ExecuteReader())
{
while (reader.Read())
{
yield return new MyDBObject()
{
Date = (DateTime)reader["close_date"],
JsonResult = (string)reader["json_result"]
};
}
}
}
}
Making the server buffer data or hold an open query longer could actually be significantly increasing load on the server, but ultimately the only way to do what you're after would be to apply "paging" to your query, and access the data in successive pages, perhaps with pauses between pages. The pages could still be pretty big - 100k for example. You can achieve this relatively easily with OFFSET/FETCH in SQL Server.
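A paging sketch along those lines; the column names are taken from the reader code above, the dbo.Results table name is an assumption, OFFSET/FETCH requires SQL Server 2012 or later, and the ORDER BY must give a stable ordering for pages to be consistent:

```csharp
using System;

static class Pager
{
    // Build the query for one page; OFFSET/FETCH needs a deterministic ORDER BY.
    public static string PageQuery(int pageIndex, int pageSize) =>
        "SELECT close_date, json_result FROM dbo.Results " +  // assumed table name
        "ORDER BY close_date, json_result " +
        $"OFFSET {pageIndex * pageSize} ROWS FETCH NEXT {pageSize} ROWS ONLY;";
}

// Usage sketch: run page after page, pausing between pages to ease the load.
// for (int page = 0; ; page++)
// {
//     var rows = RunQuery(Pager.PageQuery(page, 100000)); // hypothetical helper
//     if (rows.Count == 0) break;
//     Thread.Sleep(TimeSpan.FromSeconds(1));
// }
```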
I have been searching to find out whether there is any easy way to get the execution time of an ADO.NET command object.
I know I can manually start and stop a Stopwatch, but I wanted to know if there is an easier way built into ADO.NET.
There is a way, but it uses the SqlConnection, not the command object. Example:
using (var c = new SqlConnection(connectionString)) {
// important
c.StatisticsEnabled = true;
c.Open();
using (var cmd = new SqlCommand("select * from Error", c)) {
cmd.ExecuteReader().Dispose();
}
var stats = c.RetrieveStatistics();
var firstCommandExecutionTimeInMs = (long) stats["ExecutionTime"];
// reset for next command
c.ResetStatistics();
using (var cmd = new SqlCommand("select * from Code", c))
{
cmd.ExecuteReader().Dispose();
}
stats = c.RetrieveStatistics();
var secondCommandExecutionTimeInMs = (long)stats["ExecutionTime"];
}
Here you can find what other values are contained in the dictionary returned by RetrieveStatistics.
Note that those values represent client-side statistics (the internals of ADO.NET measure them), but since you asked for an analog of Stopwatch, I think that's fine.
The approach from @Evk's answer is very interesting and smart: it works client-side, and one of the main keys in such statistics is in fact NetworkServerTime, which
Returns the cumulative amount of time (in milliseconds) that the
provider spent waiting for replies from the server once the
application has started using the provider and has enabled statistics.
so it includes the network time from the DB server to the ADO.NET client.
An alternative, more DB server oriented, would be running SET STATISTICS TIME ON and then retrieve the InfoMessage.
A draft of the delegate code (here I'm simply writing to the debug console, but you may want to replace that with a StringBuilder.Append):
internal static void TrackInfo(object sender, SqlInfoMessageEventArgs e)
{
Debug.WriteLine(e.Message);
foreach (var element in e.Errors) {
Debug.WriteLine(element.ToString());
}
}
and usage
conn.InfoMessage += TrackInfo;
using (var cmd = new SqlCommand("SET STATISTICS TIME ON", conn)) {
    cmd.ExecuteNonQuery();
}
using (var cmd = new SqlCommand(yourQuery, conn)) {
    using (var RD = cmd.ExecuteReader()) {
        while (RD.Read()) {
            // read the columns
        }
    }
}
I suggest you move to SQL Server 2016 and use the Query Store feature. It will track execution time and performance changes over time for each query you submit, requires no changes in your application, tracks all queries (including those executed inside stored procedures) from any application, not only your own, and is available in all editions, including Express and the Azure SQL DB service.
If you track on the client side, you must measure the time yourself, using a wall clock. I would add and expose performance counters and then use the performance counters infrastructure to capture and store the measurements.
As a side note, simply tracking the execution time of a batch sent to SQL Server yields very coarse performance info and is seldom actionable. Read How to analyse SQL Server performance.
I have recently changed my web app to create a database connection per command instead of creating one connection and reusing it for all commands. When I used this code this afternoon, my database memory usage went up to 24GB while performing about 8k inserts. My code is like this (semi pseudo code):
public int ExecSQL(string SQLStr)
{
using (SqlConnection Con = new SqlConnection(MyConStr))
{
using (SqlCommand Cmd = new SqlCommand(SQLStr, Con))
{
return Cmd.ExecuteNonQuery();
}
}
}
using (TransactionScope TX = new TransactionScope(TransactionScopeOption.RequiresNew))
{
//Loop and perform 8000 x
    int ID = ExecSQL("insert into something (column) output inserted.unique_id values ('data')");
    // I also perform 1 or 2 selects per insert, based on the ID returned from the insert.
    // I don't use TransactionScopeOption.Suppress for my inserts.
}
Could this have caused the high database memory usage? I was under the impression it would create up to 100 connections (the default) and then just keep re-using them, but I am guessing I am missing something.
Answered: Ran the following SQL:
SELECT
DB_NAME(dbid) as DBName,
COUNT(dbid) as NumberOfConnections,
loginame as LoginName
FROM
sys.sysprocesses
WHERE
dbid > 0
GROUP BY
dbid, loginame
and there is only one open connection for my database so this isn't causing the issue. Now to find out what is ..
ADO.NET uses connection pooling, so multiple SqlConnection objects with the same connection string reuse the same physical database connection. It's unlikely your memory increase was caused by using new SqlConnection().
I want to write code that transfers data from one server to my SQL Server. Before I put the data in, I want to delete the current data, then move the data from one to the other. How do I do that? These are snippets from the code I have so far.
string SQL = ConfigurationManager.ConnectionStrings["SQLServer"].ToString();
string OLD = ConfigurationManager.ConnectionStrings["Server"].ToString();

SqlConnection SQLconn = new SqlConnection(SQL);
string SQLstatement = "DELETE FROM Data";      // DELETE takes no column list
SqlCommand SQLcomm = new SqlCommand(SQLstatement, SQLconn);
SQLconn.Open();
SQLcomm.ExecuteNonQuery();                     // clear the existing data first

OdbcConnection conn = new OdbcConnection(OLD);
string statement = "SELECT * FROM BILL.TRANSACTIONS ";
statement += "WHERE (TRANSACTION='NEW') ";
OdbcCommand comm = new OdbcCommand(statement, conn);
comm.CommandTimeout = 0;
conn.Open();

// Read from the source connection, not from the DELETE command.
OdbcDataReader myDataReader = comm.ExecuteReader();
while (myDataReader.Read())
{
    //...
}
conn.Close();
SQLconn.Close();
SQLconn.Dispose();
Depending on which version of SQL Server you are using, the standard solution here is to use either DTS (2000 and before) or SSIS (2005 and on). You can turn it into an executable if you need to, schedule it straight from SQL Server, or run it manually. Both tools are fairly robust (although SSIS much more so) with methods to clear existing data, rollback in case of errors, transform data if necessary, write out exceptions, etc.
If at all possible, I'd try to do it all in SQL Server. You can create a linked server to your other database server, then simply use T-SQL to copy the data across. It would look something like...
INSERT INTO new_server_table (field1, field2)
SELECT x, y
FROM mylinkedserver.myolddatabase.myoldtable
If you need to do this on a regular basis or clear out the data first you can do this as part of a scheduled task using the SQL Agent.
If you only need to import the data once, and you have a lot of data, why not use the "BULK INSERT" command? Link
T-SQL allows you to insert data from a select query. It would look something like this:
insert into Foo
select * from Bar;
As long as the field types align this will work - otherwise you will have to massage the data from Bar to fit the fields from Foo.
When you only need to do this once, take a look at the Database Publishing Wizard (just Google it) and generate a script that does everything.