SQL DataReader network usage limit - c#

I have an idea (I don't know whether it's good or bad).
I have a utility that connects to SQL Server on a schedule and fetches some data into an application. The data is simple (2 varchar text attributes), but there are about 3 million rows, so my application uses the network very intensively.
Can I programmatically decrease (limit, throttle, etc.) the network bandwidth used by SqlDataReader? Let it work more slowly, but stress neither the server nor the client. Is this a good idea? If not, what should I do instead?
Here is the code so far:
using (SqlConnection con = new SqlConnection("My connection string here"))
{
    con.Open();
    using (SqlCommand command = new SqlCommand(query, con))
    {
        using (SqlDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                yield return new MyDBObject()
                {
                    Date = (DateTime)reader["close_date"],
                    JsonResult = (string)reader["json_result"]
                };
            }
        }
    }
}

Making the server buffer data or hold an open query longer could actually increase the load on the server significantly. Ultimately, the only way to do what you're after is to apply "paging" to your query and access the data in successive pages, perhaps with pauses between pages. The pages can still be pretty big, 100k rows for example. You can achieve this relatively easily with OFFSET/FETCH in SQL Server.
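That paging loop might look like the following sketch. The table name dbo.Results, the ORDER BY column, and the pause length are assumptions for illustration, not from the question; OFFSET/FETCH needs SQL Server 2012 or later, and the ORDER BY should ideally be a unique key so pages are stable.

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Threading;

public class MyDBObject
{
    public DateTime Date { get; set; }
    public string JsonResult { get; set; }
}

public static class PagedFetch
{
    // Builds the query for one page; table and column names are assumptions.
    public static string PageQuery(int offset, int pageSize) =>
        "SELECT close_date, json_result FROM dbo.Results " +
        "ORDER BY close_date " +  // ideally a unique key, so pages don't overlap
        $"OFFSET {offset} ROWS FETCH NEXT {pageSize} ROWS ONLY";

    public static IEnumerable<MyDBObject> ReadPaged(string connectionString, int pageSize = 100000)
    {
        for (int offset = 0; ; offset += pageSize)
        {
            int rowsInPage = 0;
            using (var con = new SqlConnection(connectionString))
            {
                con.Open();
                using (var command = new SqlCommand(PageQuery(offset, pageSize), con))
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        rowsInPage++;
                        yield return new MyDBObject
                        {
                            Date = (DateTime)reader["close_date"],
                            JsonResult = (string)reader["json_result"]
                        };
                    }
                }
            }
            if (rowsInPage < pageSize) yield break; // short page => no more rows
            Thread.Sleep(500);                      // deliberate pause between pages
        }
    }
}
```

Because each page is a separate, short-lived query, the server never has to hold a 3-million-row result open, and the pause between pages spreads the network usage out over time.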

Related

Is there a method other than using SSIS to get data from SQL Server to Oracle?

The problem is that, using SSIS, I do an ADO source to ADO destination. This method only writes about 88 rows per second, which is very slow.
using Oracle.ManagedDataAccess.Client;
using System;
using System.Data;
using System.Data.SqlClient;

namespace SQLconnection
{
    internal static class Program
    {
        private static void Main(string[] args)
        {
            SqlConnection conn = new SqlConnection("Data Source=;Database=;Integrated Security=yes");
            conn.Open();
            SqlCommand cmd = new SqlCommand("SELECT * FROM TABLE", conn);
            SqlDataReader reader = cmd.ExecuteReader();
            while (reader.Read())
            {
                Console.WriteLine(reader.GetString(0) + ", " + reader.GetString(19));
            }
            conn.Close();
            conn.Dispose();
            Console.ReadLine();

            OracleConnection con = new OracleConnection("User Id=;Password=;Data Source=;");
            con.Open();
            OracleCommand cmd2 = con.CreateCommand();
            cmd2.CommandText = "SELECT 'Hello World!' FROM dual";
            OracleDataReader reader2 = cmd2.ExecuteReader();
            reader2.Read();
            Console.WriteLine(reader2.GetString(0));
            Console.WriteLine(con.ServiceName);
            Console.WriteLine(con.ServerVersion);
            Console.WriteLine(con.HostName);
            con.Close();
            Console.ReadLine();
        }
    }
}
Is there any way I can make a connection and pass the data via a console application? I feel that would be faster than 88 rows per second.
Yes, you can write a console app that uses a native Oracle provider to pass data to Oracle.
https://www.oracle.com/webfolder/technetwork/tutorials/obe/db/dotnet/GettingStartedNETVersion/GettingStartedNETVersion.htm
I have found that file based operations are much quicker when doing bulk data transfers.
I would investigate using the BCP out utility to generate delimited text files from SQL server. Have a read: https://learn.microsoft.com/en-us/sql/tools/bcp-utility?view=sql-server-2017
Getting it into Oracle could possibly be a little harder (I have very limited Oracle experience). As per the following question, you can investigate using SQL Loader scripts:
Oracle: Import CSV file
There are a couple gotchas when using BCP, though:
Depending on the makeup of the data (do you have commas and carriage returns in text fields in your data?), consider using custom delimiters for both fields and records. This can be easily specified in your BCP command by using the -t and -r options
Obviously, make sure the fields and the data matches (or is at least comparable) between formats. You can use the QUERYOUT option in BCP to create custom queries, which should give you the ability to cast if you need to and order the columns the way you want.
It might not be the sexiest solution, but it can work, can be very repeatable and can see a high throughput of data. We have been doing this for a Sybase ASE to SQL Server ETL process and saw processing times drop to 10% of what they were using other database to database methods.
Obviously, though, YMMV, so test first.

Inner Join two databases to import data - SQL C#

I've created a WinForms app in C#. I have both of my data sources listed as DataSets:
LOTSDataSet = source info
webbitdbdataset = destination dataset
These are connected with LOTSConnectionString and WebbitConnectionString.
Anyway, I have the code shown below, which gives a connection error when I try to import data from LOTS to Webbit.
SqlConnection lotscon = new SqlConnection(PackChecker.Properties.Settings.Default["LOTSConnectionString"].ToString());
using (OleDbConnection con = new OleDbConnection(PackChecker.Properties.Settings.Default["WebbitConnectionString"].ToString()))
{
    string strgetloc = @"INSERT INTO tblinstitution (
            dispenseinstid, institutionname )
        SELECT NEWinstitution.institutionid, NEWinstitution.institutionname
        FROM NEWinstitution LEFT JOIN tblinstitution ON NEWinstitution.institutionid
            = tblinstitution.dispenseinstid
        WHERE (((tblinstitution.institutionid) Is Null));";
    using (OleDbCommand cmd = new OleDbCommand(strgetloc, con))
    {
        lotscon.Open();
        con.Open();
        cmd.ExecuteNonQuery();
        con.Close();
        lotscon.Close();
    }
}
Trying to "OPEN CONNECTION" for both connections was just something I tried; I guess I knew it was going to fail, but I wanted to try before asking here.
The command is attached to the OleDb connection to Webbit, so I'm getting a 'cannot find LOTS server' exception.
Prior to running the query I run a connection checker, which opens both connections inside a try/catch to make sure they are valid.
Using Access the query does work, so I know the issue is 100% down to trying to open connections to two databases!
Any direction would be appreciated.
Gangel

Get the execution time of an ADO.NET SQL Command

I have been searching around to find out whether there is an easy way to get the execution time of an ADO.NET command object.
I know I can manually start and stop a Stopwatch, but I wanted to know whether there is an easier way to do it in ADO.NET.
There is a way, but it uses the SqlConnection, not the command object. Example:
using (var c = new SqlConnection(connectionString))
{
    // important
    c.StatisticsEnabled = true;
    c.Open();
    using (var cmd = new SqlCommand("select * from Error", c))
    {
        cmd.ExecuteReader().Dispose();
    }
    var stats = c.RetrieveStatistics();
    var firstCommandExecutionTimeInMs = (long)stats["ExecutionTime"];
    // reset for next command
    c.ResetStatistics();
    using (var cmd = new SqlCommand("select * from Code", c))
    {
        cmd.ExecuteReader().Dispose();
    }
    stats = c.RetrieveStatistics();
    var secondCommandExecutionTimeInMs = (long)stats["ExecutionTime"];
}
The provider statistics documentation lists the other values contained in the dictionary returned by RetrieveStatistics.
Note that those values are client-side statistics (the internals of ADO.NET measure them), but since you asked for an analog of Stopwatch, I think that's fine.
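If you just want to see everything the connection collected, you can enumerate the dictionary yourself. A small sketch (the helper name is mine, not part of ADO.NET):

```csharp
using System.Collections;
using System.Text;

public static class StatsDump
{
    // Renders every counter in the dictionary returned by
    // SqlConnection.RetrieveStatistics(), one "Name = Value" per line.
    public static string FormatStatistics(IDictionary stats)
    {
        var sb = new StringBuilder();
        foreach (DictionaryEntry entry in stats)
        {
            sb.AppendLine($"{entry.Key} = {entry.Value}");
        }
        return sb.ToString();
    }
}

// usage, right after disposing the reader:
//   Console.Write(StatsDump.FormatStatistics(c.RetrieveStatistics()));
```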
The approach from the answer of @Evk is very interesting and smart: it works client side, and one of the main keys of such statistics is in fact NetworkServerTime, which:
Returns the cumulative amount of time (in milliseconds) that the
provider spent waiting for replies from the server once the
application has started using the provider and has enabled statistics.
so it includes the network time from the DB server to the ADO NET client.
An alternative, more DB-server oriented, would be running SET STATISTICS TIME ON and then retrieving the timings from the InfoMessage event.
A draft of the delegate code (here I'm simply writing to the debug console, but you may want to replace that with a StringBuilder.Append):
internal static void TrackInfo(object sender, SqlInfoMessageEventArgs e)
{
    Debug.WriteLine(e.Message);
    foreach (var element in e.Errors)
    {
        Debug.WriteLine(element.ToString());
    }
}
and usage
conn.InfoMessage += TrackInfo;
using (var cmd = new SqlCommand(@"SET STATISTICS TIME ON", conn))
{
    cmd.ExecuteNonQuery();
}
using (var cmd = new SqlCommand(yourQuery, conn))
{
    using (var RD = cmd.ExecuteReader())
    {
        while (RD.Read())
        {
            // read the columns
        }
    }
}
I suggest you move to SQL Server 2016 and use the Query Store feature. It tracks execution time and performance changes over time for each query you submit, requires no changes to your application, tracks all queries (including those executed inside stored procedures) and any application (not only your own), and is available in all editions, including Express and the Azure SQL DB service.
If you track on the client side, you must measure the time yourself, using a wall clock. I would add and expose performance counters and then use the performance counters infrastructure to capture and store the measurements.
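As a sketch of that client-side wall-clock measurement, a small Stopwatch helper keeps the timing separate from the data access code (the helper is my own, not part of ADO.NET):

```csharp
using System;
using System.Diagnostics;

public static class WallClock
{
    // Times an arbitrary piece of work with a wall clock, as seen by the client.
    public static TimeSpan Time(Action work)
    {
        var watch = Stopwatch.StartNew();
        work();
        watch.Stop();
        return watch.Elapsed;
    }
}

// usage around a command (includes network and server time, as the client sees it):
//   var elapsed = WallClock.Time(() =>
//   {
//       using (var cmd = new SqlCommand(yourQuery, conn))
//       using (var reader = cmd.ExecuteReader())
//       {
//           while (reader.Read()) { /* drain the result set */ }
//       }
//   });
```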
As a side note, simply tracking the execution time of a batch sent to SQL Server yields very coarse performance information and is seldom actionable. Read How to analyse SQL Server performance.

Fastest way to query a MySQL database via .NET?

I am trying to access a very large MySQL table (17.6M rows x 60 columns) in a .NET application and perform some simple analytics within the application. I have the following code using the .NET MySQL connector, where the query is a simple SELECT ... FROM X query:
using (MySqlConnection client = new MySqlConnection(connectionString))
{
    client.Open();
    using (MySqlCommand command = new MySqlCommand(td, client))
    {
        using (MySqlDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                DoSomething(reader);
            }
        }
    }
}
I am testing this application on the database server itself, to try to eliminate network delays.
Profiling this approach, the total time across all the DoSomething() calls is about 20 ms, while the total time spent in the inner while loop is about 8 minutes.
Is there a faster approach to transferring a large amount of data from MySQL to a client application?

Very slow insert process using Linq to Sql

I'm inserting a large number of records from C# into a SQL Server 2008 Express DB using LINQ to SQL, and the insertion looks very slow. Here is the code snippet:
public void InsertData(int id)
{
    MyDataContext dc = new MyDataContext();
    List<Item> result = GetItems(id);
    foreach (var item in result)
    {
        DbItem dbItem = new DbItem() { ItemNo = item.No, ItemName = item.Name };
        dc.Items.InsertOnSubmit(dbItem);
    }
    dc.SubmitChanges();
}
Am I doing anything wrong? Or is using LINQ to insert a large number of records a bad choice?
Update: Thanks for all the answers.
@p.campbell: Sorry about the record count; it was a typo. It is actually around 100,000, and records can range up to 200k as well.
Following the suggestions (plus a requirement change and design decision), I split this operation into parts: I retrieve data in small chunks and insert it into the database as it arrives. I've put this InsertData() method into a thread operation and now use SmartThreadPool to create a pool of 25 threads doing the same work. In this scenario I insert only 100 records at a time. Whether I use LINQ or a SQL query, it makes no difference to the time taken.
The operation is scheduled to run every hour and fetches records for around 4k-6k users, so I'm now treating each user's data (retrieving and inserting into the DB) as one task assigned to one thread. The entire process takes around 45 minutes for around 250k records.
Is there a better way to do this kind of task? Can anyone suggest how I could improve this process?
For inserting a massive amount of data into SQL in one go:
Neither LINQ nor SqlCommand is designed for bulk copying data into SQL.
You can use the SqlBulkCopy class, which provides managed bulk loading of data into SQL Server (similar in function to the bcp utility) from pretty much any data source.
The SqlBulkCopy class can be used to write data only to SQL Server tables. However, the data source is not limited to SQL Server; any data source can be used, as long as the data can be loaded into a DataTable instance or read with an IDataReader instance.
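Since WriteToServer also accepts an IDataReader, you can stream rows from a source query straight into the destination table without building a DataTable in memory first. A sketch, where the connection strings, table names, and column names are placeholders rather than anything from the question:

```csharp
using System.Data.SqlClient;

public static class BulkStream
{
    // Streams rows from a source query directly into a destination table via
    // SqlBulkCopy.WriteToServer(IDataReader); nothing is materialized in memory.
    public static void Copy(string sourceConnStr, string destConnStr)
    {
        using (var source = new SqlConnection(sourceConnStr))
        using (var dest = new SqlConnection(destConnStr))
        {
            source.Open();
            dest.Open();
            using (var cmd = new SqlCommand("SELECT ItemNo, ItemName FROM dbo.SourceItems", source))
            using (var reader = cmd.ExecuteReader())
            using (var copy = new SqlBulkCopy(dest))
            {
                copy.DestinationTableName = "dbo.Items";
                copy.BatchSize = 10000; // commit in batches rather than one huge transaction
                copy.WriteToServer(reader);
            }
        }
    }
}
```

This avoids the Fill-a-DataTable step shown below, which matters when the source result set is too large to hold in memory.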
Performance comparison
SqlBulkCopy is by far the fastest, even when loading data from a simple CSV file.
LINQ will just generate a load of INSERT statements and send them to your SQL Server, which is no different from you using ad-hoc queries with SqlCommand. Performance of SqlCommand vs. LINQ is virtually identical.
The Proof
(SQL Express 2008, .NET 4.0)
SqlBulkCopy
Using SqlBulkCopy to load 100,000 rows from a CSV file (including the time to load the data):
using (SqlConnection conn = new SqlConnection("Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=EffectCatalogue;Data Source=.\\SQLEXPRESS;"))
{
    conn.Open();
    Stopwatch watch = Stopwatch.StartNew();
    string csvConnString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\\data\\;Extended Properties='text;'";
    OleDbDataAdapter oleda = new OleDbDataAdapter("SELECT * FROM [test.csv]", csvConnString);
    DataTable dt = new DataTable();
    oleda.Fill(dt);
    using (SqlBulkCopy copy = new SqlBulkCopy(conn))
    {
        copy.ColumnMappings.Add(0, 1);
        copy.ColumnMappings.Add(1, 2);
        copy.DestinationTableName = "dbo.Users";
        copy.WriteToServer(dt);
    }
    Console.WriteLine("SqlBulkCopy: {0}", watch.Elapsed);
}
SqlCommand
using (SqlConnection conn = new SqlConnection("Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=TestDb;Data Source=.\\SQLEXPRESS;"))
{
    conn.Open();
    Stopwatch watch = Stopwatch.StartNew();
    SqlCommand comm = new SqlCommand("INSERT INTO Users (UserName, [Password]) VALUES ('Simon', 'Password')", conn);
    for (int i = 0; i < 100000; i++)
    {
        comm.ExecuteNonQuery();
    }
    Console.WriteLine("SqlCommand: {0}", watch.Elapsed);
}
LinqToSql
using (SqlConnection conn = new SqlConnection("Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=TestDb;Data Source=.\\SQLEXPRESS;"))
{
    conn.Open();
    Stopwatch watch = Stopwatch.StartNew();
    EffectCatalogueDataContext db = new EffectCatalogueDataContext(conn);
    for (int i = 0; i < 100000; i++)
    {
        User u = new User();
        u.UserName = "Simon";
        u.Password = "Password";
        db.Users.InsertOnSubmit(u);
    }
    db.SubmitChanges();
    Console.WriteLine("Linq: {0}", watch.Elapsed);
}
Results
SqlBulkCopy: 00:00:02.90704339
SqlCommand: 00:00:50.4230604
Linq: 00:00:48.7702995
If you are inserting a large amount of data, you can try BULK INSERT.
As far as I know, there is no equivalent of bulk insert in LINQ to SQL.
You've got SubmitChanges() being called once, which is good: it means only one connection and transaction are used.
Consider refactoring your code to use InsertAllOnSubmit() instead.
List<DbItem> newItems = GetItems(id).Select(x => new DbItem { ItemNo = x.No, ItemName = x.Name })
                                    .ToList();
dc.Items.InsertAllOnSubmit(newItems);
dc.SubmitChanges();
The INSERT statements are still sent one-by-one as before, but perhaps this is more readable?
Some other things to ask/consider:
* What's the state of the indexes on the target table? Too many will slow down writes.
* Is the database in the Simple or Full recovery model?
* Capture the SQL statements going across the wire and replay them as an ad-hoc query against your SQL Server database. I realize you're using SQL Express and likely don't have SQL Profiler; use context.Log = Console.Out; to output your LINQ to SQL statements to the console. SQL Profiler is preferable for convenience, though.
* Do the captured SQL statements perform the same as your client code? If so, the perf problem is on the database side.
Here's a nice walk-through of how to add a Bulk-Insert class to your application, which hugely improves the performance of inserting records using LINQ.
(All source code is provided, ready to be added to your own application.)
http://www.mikesknowledgebase.com/pages/LINQ/InsertAndDeletes.htm
You would just need to make three changes to your code and link in the class provided.
Good luck!
