I am experimenting with Parallel.ForEach in C#, SQL Server connectivity, and table creation. In short, I am trying to create a large number of tables using Parallel.ForEach. The code runs, but I am losing data: not all tables get created. I am reading data from a Microsoft.Data.Analysis DataFrame, which is IEnumerable. I would therefore like to ask just for a skeleton of how to run Parallel.ForEach with SQL connections and commands, so that multiple table creations run on parallel threads against the same database at the same time, while making sure all tables are actually created. I am not an experienced C# user, so thank you in advance for your help.
Here is some code as well; sorry for the bad formatting (first time posting...)
Task task_111 = Task.Factory.StartNew(() =>
{
    Parallel.ForEach<string>(list, iter =>
    {
        // load this iteration's data
        DataFrame loadExample = DataFrame.LoadCsv(iter);

        // ......

        string sql = ""; // build the CREATE TABLE statement here

        // each iteration gets its own connection and command;
        // SqlConnection is not thread-safe, so never share one across threads
        using (var myConn = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, myConn))
        {
            myConn.Open();
            command.ExecuteNonQuery();
        }
    });
});
Related
I'm creating a simple C# program to compare a bunch of log files against our SQL Server, to see which ones have been processed.
The issue is the following:
The SQL table has more than 2,000,000 rows, so even a plain SELECT would take too long to load.
My idea is the following: I have a custom object that loads all the data from the logs, and then a for loop searches for a match in SQL. What happens is that the query inside the for loop doesn't find any matches.
I've already tried SELECT * FROM, but there are too many records to load that way.
This is my C# Loop:
for (int i = 0; i < registros.Count; i++) {
command = new($"Select DateTime, P15, Reference from ProductionDay where DateTime = Convert(datetime, '{registros[i].day.Year}-{registros[i].day.Day}-{registros[i].day.Month}')", cnn);
Console.WriteLine($"{i:D5} - Editing {registros[i].part_number}");
using (SqlDataReader reader = command.ExecuteReader()) {
while (reader.Read()) {
Console.WriteLine("SQL Matched!");
try {
if (reader["Reference"].ToString().Trim() == registros[i].part_number) {
registros[i].sql_part_number = registros[i].part_number;
registros[i].quantity += Convert.ToInt32(reader["P15"].ToString());
}
} catch {
}
}
}
}
This code only writes SQL Matched! in the last iteration of the loop.
Is this the best way to do a query in a loop? I'm just now learning about C# and SQL connections.
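One thing worth checking in the loop above: the date is built by hand as year-day-month inside the SQL string, which SQL Server may parse differently than intended. A minimal sketch of the same lookup using a typed parameter instead (assuming the same `cnn`, `registros`, and table/column names as in the question):

```csharp
// Hedged sketch: same per-row lookup, but parameterized instead of
// string-interpolated, so the date is passed as a typed value.
using var command = new SqlCommand(
    "SELECT DateTime, P15, Reference FROM ProductionDay WHERE DateTime = @day",
    cnn);
command.Parameters.Add("@day", System.Data.SqlDbType.DateTime);

for (int i = 0; i < registros.Count; i++)
{
    // no year/day/month string formatting; the driver handles the conversion
    command.Parameters["@day"].Value = registros[i].day.Date;

    using SqlDataReader reader = command.ExecuteReader();
    while (reader.Read())
    {
        // compare reader["Reference"] against part_number as before
    }
}
```

The parameter is added once and only its value changes per iteration, which also lets SQL Server reuse the query plan.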
If you want to load the full table, can you try querying it in chunks? It reduces the stress on the server and preserves client memory for further processing. Also try to filter on the server side, so that you are not transferring everything to the client at once (I learned this with PostgreSQL, but I think SQL Server supports it as well).
There is an example for this:
SQL Server : large DB Query In Chunks
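A minimal sketch of what one chunk can look like (the `Logs` table, its columns, and the indexed key `Id` are assumptions; keyset paging like this tends to be cheaper than OFFSET on very large tables):

```sql
-- Fetch the next chunk of 100,000 rows after the last key already seen.
-- @lastId starts at 0 and is updated client-side after each chunk.
SELECT TOP (100000) Id, DateTime, Reference
FROM Logs
WHERE Id > @lastId
ORDER BY Id;
```

The client repeats the query, feeding back the largest `Id` of the previous chunk, until a chunk comes back short.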
I have an idea (I don't know if it is good or bad).
I have a utility which connects to a SQL server on a schedule and fetches some data into an application. The data is simple (2 varchar text attributes), but there are about 3 million rows, so my application uses the network very intensively.
Can I programmatically decrease (limit, throttle, etc.) the network bandwidth used by SqlDataReader? Let it work more slowly, but stress neither the server nor the client. Is this idea good? If not, what should I do instead?
Here is code, so far:
using (SqlConnection con = new SqlConnection("My connection string here"))
{
con.Open();
using (SqlCommand command = new SqlCommand(query, con))
{
using (SqlDataReader reader = command.ExecuteReader())
{
while (reader.Read())
{
yield return new MyDBObject()
{
Date = (DateTime)reader["close_date"],
JsonResult = (string)reader["json_result"]
};
}
}
}
}
Making the server buffer data or hold an open query longer could actually be significantly increasing load on the server, but ultimately the only way to do what you're after would be to apply "paging" to your query, and access the data in successive pages, perhaps with pauses between pages. The pages could still be pretty big - 100k for example. You can achieve this relatively easily with OFFSET/FETCH in SQL Server.
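A minimal sketch of that paging approach, reusing the `close_date`/`json_result` columns from the code above (the table name `results` and the one-second pause are placeholders):

```csharp
const int pageSize = 100_000;
int offset = 0;
bool more = true;

using var con = new SqlConnection(connectionString);
con.Open();

while (more)
{
    using var command = new SqlCommand(
        @"SELECT close_date, json_result
          FROM results
          ORDER BY close_date
          OFFSET @offset ROWS FETCH NEXT @pageSize ROWS ONLY", con);
    command.Parameters.AddWithValue("@offset", offset);
    command.Parameters.AddWithValue("@pageSize", pageSize);

    int rows = 0;
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            rows++;
            // materialize MyDBObject here, as in the original loop
        }
    }

    more = rows == pageSize;   // a short page means we reached the end
    offset += pageSize;
    Thread.Sleep(1000);        // pause between pages to ease the load
}
```

Note that OFFSET paging requires a stable ORDER BY, and each page re-runs the sort, so an index on the ordering column matters.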
I have been searching to find out whether there is an easy way to get the execution time of an ADO.NET command object.
I know I can manually start and stop a Stopwatch, but I wanted to know if there is any easier way to do it in ADO.NET.
There is a way, but using SqlConnection, not command object. Example:
using (var c = new SqlConnection(connectionString)) {
// important
c.StatisticsEnabled = true;
c.Open();
using (var cmd = new SqlCommand("select * from Error", c)) {
cmd.ExecuteReader().Dispose();
}
var stats = c.RetrieveStatistics();
var firstCommandExecutionTimeInMs = (long) stats["ExecutionTime"];
// reset for next command
c.ResetStatistics();
using (var cmd = new SqlCommand("select * from Code", c))
{
cmd.ExecuteReader().Dispose();
}
stats = c.RetrieveStatistics();
var secondCommandExecutionTimeInMs = (long)stats["ExecutionTime"];
}
Here you can find what other values are contained in the dictionary returned by RetrieveStatistics.
Note that those values represent client-side statistics (basically internals of ADO.NET measure them), but seems you asked for analog of Stopwatch - I think that's fine.
The approach in @Evk's answer is very interesting and smart: it works client side, and one of the main keys of such statistics is in fact NetworkServerTime, which
Returns the cumulative amount of time (in milliseconds) that the
provider spent waiting for replies from the server once the
application has started using the provider and has enabled statistics.
so it includes the network time from the DB server to the ADO.NET client.
An alternative, more DB-server oriented, would be running SET STATISTICS TIME ON and then retrieving the InfoMessage.
A draft of the delegate code (here I simply write to the debug console, but you may want to replace that with a StringBuilder Append):
internal static void TrackInfo(object sender, SqlInfoMessageEventArgs e)
{
Debug.WriteLine(e.Message);
foreach (var element in e.Errors) {
Debug.WriteLine(element.ToString());
}
}
and usage
conn.InfoMessage += TrackInfo;
using (var cmd = new SqlCommand(@"SET STATISTICS TIME ON", conn)) {
    cmd.ExecuteNonQuery();
}
using (var cmd = new SqlCommand(yourQuery, conn))
using (var RD = cmd.ExecuteReader()) {
    while (RD.Read()) {
        // read the columns
    }
}
I suggest you move to SQL Server 2016 and use the Query Store feature. It tracks execution time and performance changes over time for each query you submit, requires no changes in your application, tracks all queries (including those executed inside stored procedures) from any application, not only your own, and is available in all editions, including Express and the Azure SQL DB service.
If you track on the client side, you must measure the time yourself, using a wall clock. I would add and expose performance counters and then use the performance counters infrastructure to capture and store the measurements.
As a side note, simply tracking the execution time of a batch sent to SQL Server yields very coarse performance info and is seldom actionable. Read How to analyse SQL Server performance.
I am trying to access a very large MySQL table (17.6M rows x 60 columns) in a .NET application and perform some simple analytics within the application. I have the following code using the .NET MySQL connector, where `query` is a simple SELECT ... FROM X query:
using (MySqlConnection client = new MySqlConnection(connectionString))
{
client.Open();
using(MySqlCommand command = new MySqlCommand(query, client))
{
using (MySqlDataReader reader = command.ExecuteReader())
{
while (reader.Read())
{
DoSomething(reader);
}
}
}
}
I am testing this application on the database server itself, to try to eliminate network delays.
Profiling this approach, the total sum of the DoSomething() calls is about 20 ms, while the total time spent in the inner while loop is ~8 minutes.
Is there a faster approach to transferring a large amount of data from MySQL to a client application?
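One thing that sometimes helps with wide rows is reading columns by ordinal rather than by name, which avoids a name lookup per field per row. A hedged sketch, assuming the same connection as above and placeholder column positions:

```csharp
// Hedged sketch: same streaming loop, but with ordinal column access.
// `query` and the column layout (long at 0, string at 1) are placeholders.
using (MySqlConnection client = new MySqlConnection(connectionString))
{
    client.Open();
    using (MySqlCommand command = new MySqlCommand(query, client))
    using (MySqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // GetInt64/GetString by ordinal instead of reader["name"]
            long id = reader.GetInt64(0);
            string value = reader.GetString(1);
            DoSomething(id, value);
        }
    }
}
```

Whether this closes an 8-minute gap is doubtful; if most of the time is spent transferring 17.6M x 60 values, selecting only the columns DoSomething() actually needs is likely to matter more.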
I'm inserting a large number of records from C# into a SQL Server 2008 Express DB using LINQ to SQL. The insertion looks very slow. Here is the code snippet:
public void InsertData(int id)
{
MyDataContext dc = new MyDataContext();
List<Item> result = GetItems(id);
foreach (var item in result)
{
DbItem dbItem = new DbItem(){ItemNo = item.No, ItemName=item.Name};
dc.Items.InsertOnSubmit(dbItem);
}
dc.SubmitChanges();
}
Am I doing anything wrong? Or using Linq to insert large number of records is a bad choice?
Update: Thanks for all the answers.
@p.campbell: Sorry about the record count; it was a typo. It is actually around 100,000, and can range up to 200k as well.
Following the suggestions, I split this operation into parts (also a requirement change and design decision): I now retrieve data in small chunks and insert it into the database as it arrives. I put this InsertData() method into a thread operation and use SmartThreadPool to create a pool of 25 threads doing the same work, inserting only 100 records at a time. In this scenario, LINQ versus a plain SQL query made no difference in the time taken.
Per my requirements, this operation is scheduled to run every hour and fetches records for around 4k-6k users. So now I treat each user's data operation (retrieving and inserting into the DB) as one task assigned to one thread. The entire process takes around 45 minutes for about 250k records.
Is there a better way to do this kind of task? Can anyone suggest how I can improve this process?
For inserting a massive amount of data into SQL in one go:
Neither LINQ nor SqlCommand is designed for bulk-copying data into SQL.
You can use the SqlBulkCopy class, which provides managed, bcp-style bulk loading of data into SQL Server from pretty much any data source.
The SqlBulkCopy class can be used to write data only to SQL Server tables. However, the data source is not limited to SQL Server; any data source can be used, as long as the data can be loaded into a DataTable instance or read with an IDataReader instance.
Performance comparison
SqlBulkCopy is by far the fastest, even when loading data from a simple CSV file.
Linq will just generate a load of Insert statements in SQL and send them to your SQL Server. This is no different than you using Ad-hoc queries with SqlCommand. Performance of SqlCommand vs. Linq is virtually identical.
The Proof
(SQL Express 2008, .Net 4.0)
SqlBulkCopy
Using SqlBulkCopy to load 100,000 rows from a CSV file (timing includes loading the data):
using (SqlConnection conn = new SqlConnection("Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=EffectCatalogue;Data Source=.\\SQLEXPRESS;"))
{
conn.Open();
Stopwatch watch = Stopwatch.StartNew();
string csvConnString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\\data\\;Extended Properties='text;'";
OleDbDataAdapter oleda = new OleDbDataAdapter("SELECT * FROM [test.csv]", csvConnString);
DataTable dt = new DataTable();
oleda.Fill(dt);
using (SqlBulkCopy copy = new SqlBulkCopy(conn))
{
copy.ColumnMappings.Add(0, 1);
copy.ColumnMappings.Add(1, 2);
copy.DestinationTableName = "dbo.Users";
copy.WriteToServer(dt);
}
Console.WriteLine("SqlBulkCopy: {0}", watch.Elapsed);
}
SqlCommand
using (SqlConnection conn = new SqlConnection("Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=TestDb;Data Source=.\\SQLEXPRESS;"))
{
conn.Open();
Stopwatch watch = Stopwatch.StartNew();
SqlCommand comm = new SqlCommand("INSERT INTO Users (UserName, [Password]) VALUES ('Simon', 'Password')", conn);
for (int i = 0; i < 100000; i++)
{
comm.ExecuteNonQuery();
}
Console.WriteLine("SqlCommand: {0}", watch.Elapsed);
}
LinqToSql
using (SqlConnection conn = new SqlConnection("Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=TestDb;Data Source=.\\SQLEXPRESS;"))
{
conn.Open();
Stopwatch watch = Stopwatch.StartNew();
EffectCatalogueDataContext db = new EffectCatalogueDataContext(conn);
for (int i = 0; i < 100000; i++)
{
User u = new User();
u.UserName = "Simon";
u.Password = "Password";
db.Users.InsertOnSubmit(u);
}
db.SubmitChanges();
Console.WriteLine("Linq: {0}", watch.Elapsed);
}
Results
SqlBulkCopy: 00:00:02.90704339
SqlCommand: 00:00:50.4230604
Linq: 00:00:48.7702995
If you are inserting a large amount of data, you can try BULK INSERT.
As far as I know, there is no equivalent of bulk insert in LINQ to SQL.
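A minimal sketch of what a BULK INSERT statement looks like (the file path, table name, and delimiters are assumptions; the file must be readable by the SQL Server service account, since the server reads it, not the client):

```sql
BULK INSERT dbo.Users
FROM 'C:\data\test.csv'
WITH (
    FIELDTERMINATOR = ',',   -- column delimiter in the file
    ROWTERMINATOR = '\n',    -- row delimiter
    FIRSTROW = 2             -- skip a header row
);
```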
You've got the SubmitChanges() being called once, which is good. This means that only one connection and transaction are being used.
Consider refactoring your code to use InsertAllOnSubmit() instead.
List<DbItem> newItems = GetItems(id).Select(x => new DbItem { ItemNo = x.No,
                                                              ItemName = x.Name })
                                    .ToList();
dc.Items.InsertAllOnSubmit(newItems);
dc.SubmitChanges();
The INSERT statements are still sent one-by-one as before, but perhaps this is more readable?
Some other things to ask/consider:
What's the state of the indexes on the target table? Too many will slow down the writes.
Is the database in the Simple or Full recovery model?
Capture the SQL statements going across the wire and replay them in an ad-hoc query against your SQL Server database. I realize you're using SQL Express and likely don't have SQL Profiler; you can use context.Log = Console.Out; to output your LINQ to SQL statements to the console, though SQL Profiler is more convenient when available.
Do the captured SQL statements perform the same as your client code? If so, then the perf problem is at the database side.
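For example, capturing the generated SQL is a one-liner (assuming `dc` is the same data context as in the question):

```csharp
// Echo every SQL statement LINQ to SQL generates to the console,
// so the batch can be replayed and timed directly against the server.
dc.Log = Console.Out;
dc.SubmitChanges();
```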
Here's a nice walk-through of how to add a Bulk-Insert class to your application, which hugely improves the performance of inserting records using LINQ.
(All source code is provided, ready to be added to your own application.)
http://www.mikesknowledgebase.com/pages/LINQ/InsertAndDeletes.htm
You would just need to make three changes to your code, and link in the class provided.
Good luck!