I have the below code:
using (System.Data.OracleClient.OracleConnection dataConn = new System.Data.OracleClient.OracleConnection(_connectionString))
{
    using (System.Data.OracleClient.OracleCommand cmd = new System.Data.OracleClient.OracleCommand())
    {
        cmd.Connection = dataConn;
        cmd.CommandText = "DELETE FROM Employees WHERE LOCATIONID = :LOCATIONID";
        cmd.Parameters.AddWithValue(":LOCATIONID", locationId);
        dataConn.Open();
        retVal += cmd.ExecuteNonQuery();
        dataConn.Close();
    }
    using (System.Data.OracleClient.OracleCommand cmd = new System.Data.OracleClient.OracleCommand())
    {
        cmd.Connection = dataConn;
        cmd.CommandText = "DELETE FROM Locations WHERE LocationId = :LOCATIONID";
        cmd.Parameters.AddWithValue(":LOCATIONID", locationId);
        dataConn.Open();
        retVal += cmd.ExecuteNonQuery();
        dataConn.Close();
    }
}
Just FYI, I am calling the above block in a loop of, say, 50 iterations, passing a new locationId in each iteration.
The first query, in each iteration, potentially deletes about 500 records on average, as one location is assigned to 500+ employees.
As per this link, I think I am doing things correctly, so can anyone please point out why I am still getting the ORA-01000: maximum open cursors exceeded error?
Any help will be highly appreciated. Thanks.
As per the accepted answer in the link in your post (ORA-01000: maximum open cursors exceeded in asp.net), when you call dataConn.Close(), the connection is not really closed; it is left open in the connection pool. This is a hidden optimisation that makes it faster to open additional connections, but it can cause problems with Oracle when you exceed some of its limits. I suggest you investigate ways of limiting the size of the connection pool--this depends on what is hosting your code (IIS? Something else?).
You could also change your SQL to "DELETE FROM TABLE WHERE key IN (...list of values...)". This removes the need to open 50 logical connections (and who knows how many physical connections--potentially lots).
Or do the looping within a single dataConn.Open()...dataConn.Close() pair--just reuse the same open connection for all the commands, as in the sketch below.
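A minimal sketch of that restructuring (assuming retVal and _connectionString are defined as in your code; locationIds is a hypothetical list holding the ~50 ids you loop over):

using (var dataConn = new System.Data.OracleClient.OracleConnection(_connectionString))
{
    dataConn.Open(); // open once, outside the loop

    foreach (int locationId in locationIds) // hypothetical list of ids
    {
        // using ensures each command (and its server-side cursor) is released
        using (var cmd = dataConn.CreateCommand())
        {
            cmd.CommandText = "DELETE FROM Employees WHERE LOCATIONID = :LOCATIONID";
            cmd.Parameters.AddWithValue(":LOCATIONID", locationId);
            retVal += cmd.ExecuteNonQuery();
        }
        using (var cmd = dataConn.CreateCommand())
        {
            cmd.CommandText = "DELETE FROM Locations WHERE LocationId = :LOCATIONID";
            cmd.Parameters.AddWithValue(":LOCATIONID", locationId);
            retVal += cmd.ExecuteNonQuery();
        }
    }
} // one logical connection opened and closed exactly once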
Edit: depending on which data provider you are using, the size of the connection pool may be controllable from within the connection string. See https://msdn.microsoft.com/en-us/library/ms254502(v=vs.110).aspx for an example.
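For instance, a hypothetical connection string capping the pool (the exact keywords depend on the provider; "Min Pool Size"/"Max Pool Size" are the ones SqlClient-style providers, including System.Data.OracleClient, understand--verify against your provider's documentation):

string _connectionString =
    "Data Source=MyOracleDb;User Id=scott;Password=tiger;" + // placeholder credentials
    "Min Pool Size=1;Max Pool Size=20;";                     // cap the pool size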
I have an idea (I don't know whether it's good or bad).
I have a utility that connects to SQL Server on a schedule and fetches some data into an application. The data is simple (two varchar text attributes), but there are about 3 million rows, so my application uses the network very intensively.
Can I programmatically decrease (limit, throttle, etc.) the network bandwidth used by SqlDataReader? Let it work more slowly, but stress neither the server nor the client. Is this a good idea? If not, what should I do instead?
Here is the code so far:
using (SqlConnection con = new SqlConnection("My connection string here"))
{
    con.Open();
    using (SqlCommand command = new SqlCommand(query, con))
    {
        using (SqlDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                yield return new MyDBObject()
                {
                    Date = (DateTime)reader["close_date"],
                    JsonResult = (string)reader["json_result"]
                };
            }
        }
    }
}
Making the server buffer data or hold an open query longer could actually increase the load on the server significantly, so ultimately the only way to do what you're after is to apply "paging" to your query and access the data in successive pages, perhaps with pauses between pages. The pages can still be pretty big--100k rows, for example. You can achieve this relatively easily with OFFSET/FETCH in SQL Server.
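A minimal sketch of that paging approach (a sketch only, assuming SQL Server 2012+ for OFFSET/FETCH, a sortable key such as close_date, and the MyDBObject type from your code; my_table is a placeholder name):

using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Threading;

public IEnumerable<MyDBObject> ReadPaged(string connectionString)
{
    const int pageSize = 100000; // pages can still be big
    for (int page = 0; ; page++)
    {
        // buffer one page so no connection is held open during the pause
        var buffer = new List<MyDBObject>(pageSize);
        using (var con = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            @"SELECT close_date, json_result
              FROM my_table              -- placeholder table name
              ORDER BY close_date
              OFFSET @offset ROWS FETCH NEXT @pageSize ROWS ONLY", con))
        {
            cmd.Parameters.AddWithValue("@offset", page * pageSize);
            cmd.Parameters.AddWithValue("@pageSize", pageSize);
            con.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    buffer.Add(new MyDBObject
                    {
                        Date = (DateTime)reader["close_date"],
                        JsonResult = (string)reader["json_result"]
                    });
                }
            }
        }
        foreach (var item in buffer) yield return item;
        if (buffer.Count < pageSize) yield break;  // last page reached
        Thread.Sleep(TimeSpan.FromSeconds(1));     // breathe between pages
    }
}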
I have been searching to find out whether there is an easy way to get the execution time of an ADO.NET command object.
I know I can manually start and stop a Stopwatch, but I wanted to know if there is an easier way built into ADO.NET.
There is a way, but it uses the SqlConnection rather than the command object. Example:
using (var c = new SqlConnection(connectionString))
{
    // important: must be enabled before the commands run
    c.StatisticsEnabled = true;
    c.Open();

    using (var cmd = new SqlCommand("select * from Error", c))
    {
        cmd.ExecuteReader().Dispose();
    }
    var stats = c.RetrieveStatistics();
    var firstCommandExecutionTimeInMs = (long)stats["ExecutionTime"];

    // reset for next command
    c.ResetStatistics();

    using (var cmd = new SqlCommand("select * from Code", c))
    {
        cmd.ExecuteReader().Dispose();
    }
    stats = c.RetrieveStatistics();
    var secondCommandExecutionTimeInMs = (long)stats["ExecutionTime"];
}
Here you can find the other values contained in the dictionary returned by RetrieveStatistics.
Note that those values represent client-side statistics (the internals of ADO.NET measure them), but since you asked for an analog of Stopwatch, I think that's fine.
The approach in @Evk's answer is very interesting and smart: it works client side, and one of the main keys in such statistics is in fact NetworkServerTime, which
Returns the cumulative amount of time (in milliseconds) that the
provider spent waiting for replies from the server once the
application has started using the provider and has enabled statistics.
so it includes the network time from the DB server to the ADO.NET client.
An alternative, more DB-server oriented, would be to run SET STATISTICS TIME ON and then retrieve the InfoMessage.
A draft of the delegate code (here I simply write to the debug console, but you may want to replace that with a StringBuilder.Append):
internal static void TrackInfo(object sender, SqlInfoMessageEventArgs e)
{
Debug.WriteLine(e.Message);
foreach (var element in e.Errors) {
Debug.WriteLine(element.ToString());
}
}
and usage
conn.InfoMessage += TrackInfo;
using (var cmd = new SqlCommand("SET STATISTICS TIME ON", conn))
{
    cmd.ExecuteNonQuery();
}
using (var cmd = new SqlCommand(yourQuery, conn))
using (var rd = cmd.ExecuteReader())
{
    while (rd.Read())
    {
        // read the columns
    }
}
I suggest you move to SQL Server 2016 and use the Query Store feature. It tracks execution time and performance changes over time for each query you submit, requires no changes to your application, tracks all queries (including those executed inside stored procedures) from any application, not only your own, and is available in all editions, including Express and the Azure SQL DB service.
If you track on the client side, you must measure the time yourself using a wall clock. I would add and expose performance counters and then use the performance-counter infrastructure to capture and store the measurements.
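If a wall-clock measurement is acceptable after all, here is a minimal Stopwatch sketch (the query and connection string are placeholders):

using System;
using System.Data.SqlClient;
using System.Diagnostics;

var sw = Stopwatch.StartNew();
using (var conn = new SqlConnection(connectionString)) // placeholder
using (var cmd = new SqlCommand("select * from Error", conn))
{
    conn.Open();
    cmd.ExecuteReader().Dispose(); // consume the command, as in the example above
}
sw.Stop();
Console.WriteLine($"Command took {sw.ElapsedMilliseconds} ms"); // wall-clock time, network included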
As a side note, simply tracking the execution time of a batch sent to SQL Server yields very coarse performance information and is seldom actionable. Read How to analyse SQL Server performance.
I have a webmethod that fails to fire a stored procedure during the first attempt only (I guess while establishing a new connection). I get an "Internal server error" Ajax error message. However, if I hit return in the URL address bar, the stored procedure is executed and the pages work perfectly well, as they should. Then again, if I leave the page inactive for a while and then try to establish a new connection, the same problem occurs.
I spent the last 2 days trying to identify the problem by checking the parameters and removing most of the code from the webmethod, and finally I was able to trace the problem to this stored procedure.
CREATE PROCEDURE [dbo].[spUnionServices]
    @freetext NVARCHAR(50),
    @offset int,
    @fetch int
AS
BEGIN
    SET NOCOUNT ON;
    SELECT IDphoto AS IDservice, photos.IDuser, photoCaption AS serviceTitle, photoDate AS serviceDate,
           photoFileName AS servicePhoto, 'Photo' AS service, photoPublished AS servicePublished,
           inap, hashtags, KEY_TBL.RANK, screenName AS IDname
    FROM photos
    INNER JOIN FREETEXTTABLE(photos, (photoCaption), @freetext) AS KEY_TBL ON photos.IDphoto = KEY_TBL.[KEY]
    INNER JOIN editorDetail ON editorDetail.IDuser = photos.IDuser
    ORDER BY RANK DESC OFFSET @offset ROWS FETCH NEXT @fetch ROWS ONLY
END
And here is how I connect to the stored procedure from the webmethod
StringBuilder SBsmallSquares = new StringBuilder();
SqlConnection sqlConnection1 = new SqlConnection(System.Configuration.ConfigurationManager.ConnectionStrings["DefaultConnection"].ConnectionString);
using (sqlConnection1)
{
    SqlCommand cmd = new SqlCommand();
    SqlDataReader ReaderPopup;
    cmd.CommandText = "spUnionServices";
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Connection = sqlConnection1;
    cmd.Parameters.AddWithValue("@offset", offset);
    cmd.Parameters.AddWithValue("@fetch", fetch);
    cmd.Parameters.AddWithValue("@freetext", fts);
    sqlConnection1.Open();
    ReaderPopup = cmd.ExecuteReader();
    if (ReaderPopup.HasRows)
    {
        while (ReaderPopup.Read())
        {
            // creating the string to return. Here there is no problem.
        }
        return SBsmallSquares.ToString();
    }
    else return string.Empty;
}
I would appreciate it if someone could find out why I'm having this problem during the first attempt to run the stored procedure. Thanks!
You should retry on this error. During availability events (upgrades, for example) a DB can move to a different machine, and fdhost needs to pair with the SQL instance again. Usually the pairing happens on the first full-text query, or when needed, so you may be seeing timeouts in that process the first time; retrying should help. If this is very consistent, there could be a bug in the service; let us know and I will help you debug this further.
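A minimal retry sketch of that suggestion (maxAttempts and the backoff are assumptions; executeQuery stands in for whatever opens the connection and runs spUnionServices):

using System;
using System.Data.SqlClient;
using System.Threading;

static string ExecuteWithRetry(Func<string> executeQuery, int maxAttempts = 3)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            return executeQuery(); // e.g. the webmethod body that calls the proc
        }
        catch (SqlException) when (attempt < maxAttempts)
        {
            // transient failure (e.g. full-text pairing timeout): back off and retry
            Thread.Sleep(TimeSpan.FromSeconds(2 * attempt));
        }
    }
}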
I would put this in a comment but it's easier to write the code here in an answer.
I realize you said you are getting the error in Ajax, but it is occurring server side and being propagated back to your Ajax call. Because you do not have any exception handling in your code, IIS is wrapping the exception for you in a generic exception with minimal detail.
You need to catch the exception server side and examine it, this should give you more detail and hopefully with that insight figuring out why it is occurring and then fixing it should be much easier.
StringBuilder SBsmallSquares = new StringBuilder();
SqlConnection sqlConnection1 = null;
try
{
sqlConnection1 = new SqlConnection(System.Configuration.ConfigurationManager.ConnectionStrings["DefaultConnection"].ConnectionString);
SqlCommand cmd = new SqlCommand();
SqlDataReader ReaderPopup;
cmd.CommandText = "spUnionServices";
cmd.CommandType = CommandType.StoredProcedure;
cmd.Connection = sqlConnection1;
cmd.Parameters.AddWithValue("#offset", offset);
cmd.Parameters.AddWithValue("#fetch", fetch);
cmd.Parameters.AddWithValue("#freetext", fts);
sqlConnection1.Open();
ReaderPopup = cmd.ExecuteReader();
if (ReaderPopup.HasRows)
{
while (ReaderPopup.Read())
{
//creating the string to return. Here there is no problem.
}
return SBsmallSquares.ToString();
}
else return string.Empty;
}
catch (Exception ex)
{
// write this out to a text file OR (if you can) examine the exception in the debugger
// what you need are
// 0. ex.StackTrace <- the full call stack that generated the exception, now you know what method call caused it
// 1. ex.GetType().FullName <- the full type of the exception
// 2. ex.Message <- the message in the exception
// 3. ex.InnerException <- if this is not null then we need the type and message (1 and 2 above) again, this can recurse as many times as needed (the ex.InnerException can have an ex.InnerException.InnerException and that one can also have an InnerException, and so on)
// 4. some exceptions have additional details in other properties depending on the type, if you see anything useful get that too
throw; // rethrow the exception because you do not want to ignore it and pretend everything succeeded (it didn't)
}
finally
{
if(sqlConnection1 != null) sqlConnection1.Close();
}
EDIT
It is also possible to update the web site configuration in IIS (and the web.config, I believe) to pass through the complete exception and its details. I am not sure whether it is just SQL Server that is hosted for you in Azure, or IIS as well; in the latter case I am not familiar with how to configure it to allow this, but it might be possible. You can sometimes also specify that complete exception/error details are only shown when running on localhost, which is great when you are debugging: you want to see exception information when troubleshooting, but you would prefer that the outside world (outside of your company) did not see any sensitive internal details.
You can google this for more information on how to enable it; if you are running IIS in Azure, add a keyword about Azure to the search. Google: iis show exception
My guess is that there is a session authentication scheme that is not being initiated. You should try to perform a GET first to allow cookies to be set in the browser. Alternatively, take a closer look at the cookies that are set on the successful attempt and work to align them, if the information is available at the point of page generation.
I have recently changed my web app to create a database connection per command instead of creating one connection and reusing it for all commands. I used this code this afternoon and my database memory usage went up to 24GB while performing about 8k inserts. My code is like this (semi pseudo-code):
public int ExecSQL(string SQLStr)
{
    using (SqlConnection Con = new SqlConnection(MyConStr))
    {
        Con.Open();
        using (SqlCommand Cmd = new SqlCommand(SQLStr, Con))
        {
            return Cmd.ExecuteNonQuery();
        }
    }
}
using (TransactionScope TX = new TransactionScope(TransactionScopeOption.RequiresNew))
{
    // Loop and perform 8000 x
    int ID = ExecSQL("insert into something (column) output unique_id values ('data')");
    // I also perform 1 or 2 selects per insert based on the ID returned from the insert. I don't use a .Suppress for my inserts.
}
Could this have caused the high database memory usage? I was under the impression that it should create 100 connections (the default pool size) and then just keep re-using them, but I am guessing I am missing something.
Answered: I ran the following SQL:
SELECT
DB_NAME(dbid) as DBName,
COUNT(dbid) as NumberOfConnections,
loginame as LoginName
FROM
sys.sysprocesses
WHERE
dbid > 0
GROUP BY
dbid, loginame
and there is only one open connection for my database, so this isn't causing the issue. Now to find out what is...
ADO.NET uses connection pools, so multiple SqlConnection objects with the same connection string reuse the same physical database connection. It is unlikely that your memory increase was caused by using new SqlConnection().
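A quick way to observe the pooling (a sketch, assuming a reachable server and a valid connStr): the server session id returned by @@SPID typically stays the same across "new" connections because the physical connection is being reused.

using (var con = new SqlConnection(connStr))
using (var cmd = new SqlCommand("SELECT @@SPID", con))
{
    con.Open();
    Console.WriteLine(cmd.ExecuteScalar()); // e.g. 57
} // Dispose/Close returns the physical connection to the pool

using (var con = new SqlConnection(connStr)) // same connection string
using (var cmd = new SqlCommand("SELECT @@SPID", con))
{
    con.Open();
    Console.WriteLine(cmd.ExecuteScalar()); // usually the same SPID: pooled connection reused
}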
I've got two processes with connections to the same SQL CE .sdf database file. One inserts items into a table and the other reads all the records from the table. After the insert, I can confirm the rows are there with the Server Explorer, but my query from the second process does not show them:
this.traceMessages.Clear();
SqlCeCommand command = new SqlCeCommand("SELECT AppName, Message, TraceId FROM Messages", this.connection);
using (var reader = command.ExecuteReader())
{
    while (reader.Read())
    {
        this.traceMessages.Add(
            new TraceMessage
            {
                AppName = reader.GetString(reader.GetOrdinal("AppName")),
                Message = reader.GetString(reader.GetOrdinal("Message")),
                TraceId = reader.GetString(reader.GetOrdinal("TraceId"))
            });
    }
}
It can generally load correctly the first time but doesn't pick up updates, even after restarting the process. The connection string just has a simple Data Source that I've confirmed points to the same file in both processes.
Does anyone know why this is happening? Is there some setting I can enable so that updates from separate processes work?
This is because, unlike "traditional" databases, the data that you write is not flushed to disk immediately; the flush is deferred and happens some time later.
You have two choices in the writing program (a sketch of both follows after this list):
1) Add the Flush Interval parameter to your connection string and set it to 1. This will have a lag of up to a second before the data is flushed to the sdf.
2) When you call Commit, use the parameterized overload that allows you to specify CommitMode.Immediate. This will flush data to disk immediately.
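A minimal sketch of both options (the data source path and table are placeholders; CommitMode.Immediate is the overload mentioned in option 2):

using System.Data.SqlServerCe;

// Option 1: ask the engine to flush at most one second after data is written.
var connWithFlush = new SqlCeConnection("Data Source=MyData.sdf;Flush Interval=1;");

// Option 2: flush to disk immediately when committing.
using (var conn = new SqlCeConnection("Data Source=MyData.sdf"))
{
    conn.Open();
    using (SqlCeTransaction tx = conn.BeginTransaction())
    using (var cmd = new SqlCeCommand(
        "INSERT INTO Messages (AppName, Message, TraceId) VALUES (@a, @m, @t)", conn, tx))
    {
        cmd.Parameters.AddWithValue("@a", "MyApp");
        cmd.Parameters.AddWithValue("@m", "hello");
        cmd.Parameters.AddWithValue("@t", "42");
        cmd.ExecuteNonQuery();
        tx.Commit(CommitMode.Immediate); // data hits the .sdf right away
    }
}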