How can I log query time? - C#

I have a few OleDb connections like this:
try
{
    OleDbConnection Connection8;
    using (Connection8 = new OleDbConnection("Provider=MSDAORA.1;Data Source=DATABASE:1521/orcl;Persist Security Info=True;Password=PASSWORD;User ID=USERNAME;"))
    {
        string sqlQuery = "select * from TABLE";
        using (OleDbDataAdapter cmd = new OleDbDataAdapter(sqlQuery, Connection8))
        {
            Connection8.Open();
            DataTable dt = new DataTable();
            cmd.Fill(dt);
            GridView5.DataSource = dt;
            GridView5.DataBind();
            v8 = 1;
            Connection8.Close();
        }
    }
}
catch (Exception)
{
    v8 = 0;
}
Some of these connections take a long time, but I can't tell which one.
How can I log or see the query time for every connection? Any suggestions? Thank you.

You can use Stopwatch:
var stopwatch = new Stopwatch();
DataTable dt = new DataTable();
stopwatch.Start();
Connection8.Open();
cmd.Fill(dt);
stopwatch.Stop();
var timeElapsed = stopwatch.ElapsedMilliseconds;
Note that in this sample the time to open the connection is included in the measured time. If you don't need that and want "pure" query execution time, just swap the order of the lines where the connection is opened and the stopwatch is started.
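To see the time for every connection, you can wrap each of your blocks the same way and append the result to a log. Here is a minimal sketch; the log path and the "Query8" label are placeholders, and connectionString stands for the connection string you already use:

var stopwatch = System.Diagnostics.Stopwatch.StartNew();
using (var connection = new OleDbConnection(connectionString))
using (var adapter = new OleDbDataAdapter("select * from TABLE", connection))
{
    connection.Open();
    long openMs = stopwatch.ElapsedMilliseconds; // time spent opening the connection
    var dt = new DataTable();
    adapter.Fill(dt);
    stopwatch.Stop();
    // Log both numbers so a slow connect can be told apart from a slow query.
    System.IO.File.AppendAllText(@"C:\logs\query-times.txt",
        string.Format("Query8: open={0} ms, total={1} ms{2}",
            openMs, stopwatch.ElapsedMilliseconds, Environment.NewLine));
    GridView5.DataSource = dt;
    GridView5.DataBind();
}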

I don't know if this will work because you are using an OleDbConnection, but one thing you may be able to do is open the "ODBC Administrator" in Control Panel (be sure to check whether you want the 32-bit or 64-bit version). There is a "Tracing" tab you can turn on, which gives you a log file of all ODBC requests that are processed.
But remember, as I said, because you are using an OleDbConnection it may not log anything.

I've used Glimpse in the past; it might be useful for you as well:
http://getglimpse.com/

Related

C# MYSQL query every second

So, I have a program I'm working on where more than one client is connected to a MySQL database and I want to keep them all current. When one client updates the database, the information in all clients should update. I'm new and still studying in college, so the only way I could think to do this is to make a column that holds each record's update time and then, each second, run this check:
if (sqlHandler.userLastUpdate < sqlHandler.LastUpdate())
{
    // Loads the products from the database
    sqlHandler.ReadDB();
    // Sets this client's last update time to now so it doesn't keep refreshing
    sqlHandler.userLastUpdate = DateTimeOffset.UtcNow.ToUnixTimeSeconds();
}
public double LastUpdate()
{
    using var con = new MySqlConnection(CONNECTION_STRING);
    con.Open();
    string sql = "SELECT MAX(LastUpDate) FROM products";
    using var cmd = new MySqlCommand(sql, con);
    using MySqlDataReader rdr = cmd.ExecuteReader();
    rdr.Read();
    double time = rdr.GetDouble(0);
    return time;
}
This seems horribly inefficient. Is there a better way to do this? I've had a couple of clients running against the same MySQL server and it seemed to be fine, but it feels like there should be a better way.
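One small thing that can be trimmed regardless: the query returns a single value, so ExecuteScalar is enough and avoids setting up a reader. A minimal sketch of the same LastUpdate check, assuming the same MySQL connector API used above:

public double LastUpdate()
{
    using var con = new MySqlConnection(CONNECTION_STRING);
    con.Open();
    // One scalar result, so ExecuteScalar is sufficient here.
    using var cmd = new MySqlCommand("SELECT MAX(LastUpDate) FROM products", con);
    return Convert.ToDouble(cmd.ExecuteScalar());
}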

SSRS ReportExecutionService.LoadReport gets stuck

Consider the following example code:
using System.Data.SqlClient;

namespace ReportLoadTest
{
    class Program
    {
        static void Main(string[] args)
        {
            using (var con = new SqlConnection("...your connection string here..."))
            {
                con.Open();
                var trans = con.BeginTransaction();
                var cmd = con.CreateCommand();
                cmd.Transaction = trans;
                cmd.CommandText = @"insert SomeTable(...columns...) values (...); select scope_identity()";
                var rows = cmd.ExecuteScalar();

                var rs = new SSRS.ReportExecutionService();
                rs.Credentials = System.Net.CredentialCache.DefaultCredentials;
                rs.Url = "http://localhost/ReportServer/ReportExecution2005.asmx";
                var ei = rs.LoadReport("/Folder/Folder/Some report", null);
            }
        }
    }
}
Under what conditions would this program "get stuck" at the call to ReportExecutionService.LoadReport?
By stuck, I mean 0 CPU, 0 I/O - no progress being made at all by the calling program, Reporting Services or SQL Server.
This program will get stuck if the report that's being loaded contains a dataset that's used to populate the available values for a parameter and that dataset is based on a query that reads rows from SomeTable.
LoadReport will eventually time out and there will be zero helpful information left lying around to help you figure out what happened.
Possible solutions:
Change the report to do "dirty reads" on SomeTable
Change the database to snapshot isolation mode to avoid the lock on the table.
I ran into this as an actual production issue with a system that runs reports on a schedule, and the report being run was the "scheduled reports history" report.
The subtlety is that LoadReport runs queries - in retrospect, it's obvious that it must run queries since the available values for parameters are contained in the ExecutionInfo that's returned by LoadReport.

Timeout expired with SQL Server insert

I am trying to insert 200,000 documents from a folder into a varbinary column in a SQL Server database. I get a timeout expiration message after inserting 80,000 documents. The average file size is about 250 KB and the maximum file size is 50 MB. I am running this C# program on the server where the database is located.
Please suggest.
The error:
The timeout period elapsed prior to completion of the operation or the server is not responding.
The code:
string spath = @"c:\documents";
string[] files = Directory.GetFiles(spath, "*.*", SearchOption.AllDirectories);
Console.Write("Files Count:" + files.Length);
using (SqlConnection con = new SqlConnection(connectionString))
{
    con.Open();
    string insertSQL = "INSERT INTO table_Temp(doc_content, doc_path) values(@File, @path)";
    SqlCommand cmd = new SqlCommand(insertSQL, con);
    var pFile = cmd.Parameters.Add("@File", SqlDbType.VarBinary, -1);
    var pPath = cmd.Parameters.Add("@path", SqlDbType.Text);
    var tran = con.BeginTransaction();
    var fn = 0;
    foreach (string docPath in files)
    {
        string newPath = docPath.Remove(0, spath.Length);
        string archive = new DirectoryInfo(docPath).Parent.Name;
        fn += 1;
        using (var stream = new FileStream(docPath, FileMode.Open, FileAccess.Read))
        {
            pFile.Value = stream;
            pPath.Value = newPath;
            cmd.Transaction = tran;
            cmd.ExecuteNonQuery();
            if (fn % 10 == 0)
            {
                tran.Commit();
                tran = con.BeginTransaction();
                Console.Write("|");
            }
            Console.Write(".");
        }
    }
    tran.Commit();
}
For this, I would suggest using SqlBulkCopy, since it can handle data insertion much more easily. Further, as others have pointed out, you might want to increase the timeout for your command.
While I would agree this may be best done as a bulk copy of some sort, if you must do this in the C# program, your only option is probably to increase the timeout value. You can do this after your SqlCommand object has been created via cmd.CommandTimeout = <new timeout>;. The CommandTimeout property is an integer representing the number of seconds for the timeout, or zero if you never want it to time out.
See the MSDN docs for details
You should be able to set the timeout for the transaction directly on that object in your application code; this way you are not changing SQL Server settings.
Secondly, you can build batches in your application as well. You say you can get 80k docs in before a timeout, so set your batch size at 50k, process them, commit them, and grab the next batch. Having your application manage batching also allows you to catch SQL errors, such as timeouts, and then dynamically adjust the batch size and retry without ever crashing (see the sketch below). This is the entire reason for writing your application in the first place; otherwise you could just use the wizard in Management Studio and manually insert your files.
I highly recommend batching over other options.
@Shubham Pandey also provides a great link to SQL bulk copy info, which itself has links to more information. You should definitely experiment with the bulk copy class and see if you can get additional gains with it as well.
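For reference, a minimal sketch of that batching idea applied to the loop from the question; the batch size and timeout are illustrative values, not tuned numbers, and pFile, pPath, cmd, con, spath and files are the variables already declared above:

const int batchSize = 500;
cmd.CommandTimeout = 300; // seconds; 0 would mean "wait indefinitely"
var tran = con.BeginTransaction();
cmd.Transaction = tran;
int inBatch = 0;
foreach (string docPath in files)
{
    using (var stream = new FileStream(docPath, FileMode.Open, FileAccess.Read))
    {
        pFile.Value = stream;
        pPath.Value = docPath.Remove(0, spath.Length);
        cmd.ExecuteNonQuery();
    }
    if (++inBatch == batchSize)
    {
        tran.Commit();                 // flush this batch
        tran = con.BeginTransaction(); // start the next one
        cmd.Transaction = tran;
        inBatch = 0;
    }
}
tran.Commit(); // commit whatever is left in the last batch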

Million inserts: SqlBulkCopy timeout

We already have a running system that handles all connection strings (DB2, Oracle, SQL Server).
Currently, we are using ExecuteNonQuery() to do some inserts.
We want to improve the performance by using SqlBulkCopy() instead of ExecuteNonQuery(). We have some clients that have more than 50 million records.
We don't want to use SSIS, because our system supports multiple databases.
I created a sample project to test the performance of SqlBulkCopy(). I created a simple read and insert function for SQL Server.
Here's the small function:
public void insertIntoSQLServer()
{
    using (SqlConnection SourceConnection = new SqlConnection(_sourceConnectionString))
    {
        // Open the connection to get the data from the source table
        SourceConnection.Open();
        using (SqlCommand command = new SqlCommand("select * from " + _sourceSchemaName + "." + _sourceTableName + ";", SourceConnection))
        {
            // Read from the source table
            command.CommandTimeout = 2400;
            SqlDataReader reader = command.ExecuteReader();
            using (SqlConnection DestinationConnection = new SqlConnection(_destinationConnectionString))
            {
                DestinationConnection.Open();
                // Clean the destination table
                new SqlCommand("delete from " + _destinationSchemaName + "." + _destinationTableName + ";", DestinationConnection).ExecuteNonQuery();
                using (SqlBulkCopy bc = new SqlBulkCopy(DestinationConnection))
                {
                    bc.DestinationTableName = string.Format("[{0}].[{1}]", _destinationSchemaName, _destinationTableName);
                    bc.NotifyAfter = 10000;
                    //bc.SqlRowsCopied += bc_SqlRowsCopied;
                    bc.WriteToServer(reader);
                }
            }
        }
    }
}
When I have fewer than 200,000 records in my dummyTable, the bulk copy works fine. But when there are more than 200,000 records, I get the following errors:
Attempt to invoke bulk copy on an object that has a pending operation.
OR
The wait operation timed out (for the IDataReader)
I increased the CommandTimeout for the reader, and it seems that this has solved the timeout issue related to the IDataReader.
Am I doing something wrong in the code?
Can you try adding the following before the call to WriteToServer ...
bc.BatchSize = 10000;
bc.BulkCopyTimeout = 0;
I don't know what the default batch size or timeout is, but I suspect this might be your issue.
Hope that helps
Also, you can try playing with different batch sizes for optimal performance.
You can try this:
bc.BatchSize = 100000; // how many rows you want to insert at a time
bc.BulkCopyTimeout = 60; // time in seconds; if you want an infinite wait, assign 0
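In the context of the method from the question, those two properties sit next to the other SqlBulkCopy settings, something like this (the numbers are starting points to experiment with, not tuned values):

using (SqlBulkCopy bc = new SqlBulkCopy(DestinationConnection))
{
    bc.DestinationTableName = string.Format("[{0}].[{1}]", _destinationSchemaName, _destinationTableName);
    bc.BatchSize = 10000;    // rows copied per batch; the default of 0 sends everything as one batch
    bc.BulkCopyTimeout = 0;  // seconds; 0 disables the timeout (the default is 30 seconds)
    bc.NotifyAfter = 10000;
    bc.WriteToServer(reader);
}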

Unusual SQL/Data issues

We have a report that has been giving us some serious issues, so I decided to put it into a console application in order to troubleshoot the issues.
The report is just a simple single select from SQL, returning approximately 25 columns, and our date range can be 3-6 months, returning around 10k rows, so we are not talking about a lot of data.
Here is what's happening: when the report runs, it times out from our website. In the console, it takes anywhere from 13-18 minutes to finish, and the wait seems to happen at da.Fill(ds);
Now here is the strange thing: it runs in approximately 1-3 seconds in SQL Server Management Studio, and when our Delphi developers create a similar application it also takes only a few seconds to run. This only happens with .NET.
We tried changing from a DataSet to loading into a DataReader, using this code:
using (var dr = _command.ExecuteReader())
{
    if (dr.HasRows)
    {
        int i = 0;
        while (dr.Read())
        {
            var startRead = DateTime.Now;
            Console.Write("{2}\t{0}\t{1}\t", dr.GetInt32(0), dr.GetString(1), i);
            var tookRead = DateTime.Now.Subtract(startRead);
            Console.WriteLine("Took: " + tookRead);
            i++;
        }
    }
}
However, it did not help at all; it just displays in chunks with frequent delays. I'm thinking it's SQL, but I can't explain why it works fine in Delphi and in SQL Server Management Studio.
I've tried using .NET 2.0, 3.5 and 4; it happens on all frameworks.
Here is my code
public static DataSet GetData()
{
    var now = DateTime.Now;
    var _command = new SqlCommand();
    var _connection = new SqlConnection();
    try
    {
        _connection.ConnectionString = connectionString;
        _command.Connection = _connection;
        _command.CommandText = storedProcedure;
        _command.CommandType = CommandType.StoredProcedure;
        _command.CommandTimeout = 60;
        if (string.IsNullOrEmpty(_connection.ConnectionString)) { throw new Exception("Connection String was not supplied"); }
        _command.Parameters.Add(new SqlParameter("DateFrom", dateFrom));
        _command.Parameters.Add(new SqlParameter("DateTo", dateTo));
        SqlDataAdapter da;
        var ds = new DataSet();
        _connection.Open();
        var done = DateTime.Now;
        da = new SqlDataAdapter(_command);
        da.Fill(ds);
        if (ds == null) { throw new Exception("DataSet is null."); }
        if (ds.Tables.Count == 0) { throw new Exception("Table count is 0"); }
        var took = done.Subtract(now);
        return ds;
    }
    catch (Exception ex)
    {
        File.WriteAllText(Path.Combine(Application.StartupPath, String.Format("Exception{0:MMddyyyy_HHmmss}.log", DateTime.Now)), ex.ToString());
    }
    finally
    {
        if (_connection.State != ConnectionState.Closed) { _connection.Close(); }
    }
    return null;
}
Any ideas? Our DBA is blaming the framework; I'm actually blaming something in SQL (maybe statistics, or a corrupted db).
Differences in SQL performance between .NET and other clients (SQL Server Management Studio) are usually down to the connections being configured differently - frequent culprits are ANSI_NULLS and ANSI_PADDING.
Try looking at how the connection is configured in SQL Management Studio, then replicate the same thing in your .NET application.
The information you give doesn't contain enough details to really help...
If SSMS is really that much faster, then the reason could be some session/connection setting - SSMS uses subtly different settings compared to .NET.
For some explanation and hints on what could be different or wrong, see http://www.sommarskog.se/query-plan-mysteries.html
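One cheap way to test the session-setting theory from that article: force the option that most often differs (ARITHABORT, which SSMS turns ON while ADO.NET leaves it OFF) on the .NET connection before running the procedure. A hedged sketch using the objects from the GetData method above:

_connection.Open();
// Match the setting SSMS uses so both clients hit the same cached plan.
// If the .NET timings suddenly drop to the SSMS numbers, the difference is
// coming from session settings and plan-cache entries, not from the framework.
using (var set = _connection.CreateCommand())
{
    set.CommandText = "SET ARITHABORT ON;";
    set.ExecuteNonQuery();
}
// ...then build the SqlDataAdapter and call da.Fill(ds) exactly as before.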
