C# SqlDataAdapter Fill Inconsistent Results

I would greatly appreciate some help with my SqlDataAdapter code below.
I am getting inconsistent results and am at a loss as to what the issue is.
SCENARIO:
This is a Windows Forms project.
I have a routine that loops through a list of stations and calls the code below for each station.
The problem code executes the query (SQL Server) and fills a dataset with the returned rows.
Depending on what the station is up to, the returned row count will be >= 0 (typically 1 or 2).
PROBLEM:
The code does not always fill the dataset. Sometimes the dataset fills and sometimes it does not; more accurately, sometimes the row count is correct and sometimes the row count is 0.
Currently, it is the same subset of stations that are not filling correctly.
There are no issues when the row count actually is 0.
TROUBLESHOOTING SO FAR:
The code does not throw any exceptions.
Looking at the parameters and variables in the Locals window while stepping through, I do not see any differences between stations that return correctly and those that do not.
The query itself works without issue. I can execute the query in SSMS with the same parameters and I get the expected, correct results.
I do not see any difference in the SSMS query results between stations that return/fill correctly and those that do not (columns all have data, etc.).
My first thought was that I must have something amiss with these stations in the underlying tables, but again I see nothing out of place (no missing columns, nulls, etc.).
I have looked at quite a few posts regarding SqlDataAdapter and fill, but they appear to be dealing with all or nothing problems.
I would think that if there was a problem with the code, the fill would fail all of the time.
I would also think that if there was a problem with the query, the problem would be present all of the time.
I would further think that if there was a problem with the data, I would see something in the query results in SSMS.
CODE:
string strSql =
@"SELECT
ISNULL(a.ToolGroupId, 0) AS ToolGroupId, ISNULL(a.PartId, 0) AS PartId,
ISNULL(a.CycleCount, 0) AS CycleCount, ISNULL(a.BDT, 0) AS BDT,
start.starttime, endx.endtime,
ISNULL(start.ProgName, 'none') AS Program, ISNULL(a.gName, 'none') AS gName,
ISNULL(a.ProductionMetrics, 1) AS ProductionMetrics
FROM
(SELECT
MIN(datetime) AS starttime, ProgName
FROM
V_CycleTime
WHERE
eocr_no = @stationId
AND datetime >= @startTime
AND datetime < @endTime
GROUP BY
ProgName) AS start
LEFT JOIN
(SELECT
MAX(datetime) AS endtime, ProgName
FROM
V_CycleTime
WHERE
eocr_no = @stationId
AND datetime >= @startTime
AND datetime < @endTime
GROUP BY
ProgName) AS endx ON (endx.ProgName = start.ProgName)
LEFT JOIN
(SELECT
ISNULL(p.ToolGroupId, 0) AS ToolGroupId,
ISNULL(p.PartId, 0) AS PartId,
COUNT(ID) AS CycleCount,
AVG(Expected_Ct) AS BDT,
Program, p.gName, p.ProductionMetrics
FROM
V_CycleTime
LEFT JOIN
(SELECT
ToolGroupId, Program, PartId, t.gName, t.ProductionMetrics
FROM
PartsToPrograms
LEFT JOIN
(SELECT ID, gName, ProductionMetrics
FROM ToolGroups) t ON (t.ID = PartsToPrograms.ToolGroupId)
WHERE
StationId = @stationId
AND IsActive = 1) p ON (p.Program = V_CycleTime.ProgName)
WHERE
eocr_no = @stationId
AND datetime >= @startTime
AND datetime < @endTime
GROUP BY
p.ToolGroupId, p.PartId, Program, p.gName, p.ProductionMetrics) AS a ON (a.Program = start.ProgName)
WHERE
a.ToolGroupId IS NOT NULL
ORDER BY
start.starttime;";
// retrieve the dataset
DataSet ds = new DataSet();
string connectionString = ConfigurationManager.ConnectionStrings["connString"].ConnectionString;
using (var adapter = new SqlDataAdapter(strSql, connectionString))
{
try
{
adapter.SelectCommand.Parameters.AddWithValue("@stationId", GlobalVars.stationId);
adapter.SelectCommand.Parameters.AddWithValue("@startTime", GlobalVars.currentHourStartTime);
adapter.SelectCommand.Parameters.AddWithValue("@endTime", GlobalVars.currentHourEndTime);
adapter.Fill(ds);
}
catch (SqlException ex)
{
SimpleLogger.SimpleLog.Log(ex);
return;
}
}
// test row count
int rowCount = ds.Tables[0].Rows.Count;
if (rowCount == 0)
{
//do this
}
else
{
//do that
}


OracleDataReader returning only last row on pagination query

I'm using Oracle.ManagedDataAccess to return data from my database, and I really need to page the results because there are lots of records in this table.
So I'm using the second answer from this post for paging, and it works when I run it in an Oracle client.
The final query looks like this:
select *
from (
select rownum as rn, a.*
from (
Select u.*
From users u
order by u.user_code
) a
)
where rownum <= :myReturnSize
and rn > (:myReturnPage-1) * :myReturnSize;
But when I call it from the .NET code below, it returns only the last record of the 100 I asked for.
OracleParameter[] parameters = new OracleParameter[]{
new OracleParameter("myReturnPage", page), //1
new OracleParameter("myReturnSize", size) //100
};
List<User> usersList = new List<User>();
using (OracleConnection conn = new OracleConnection(connString))
{
using (OracleCommand cmd = new OracleCommand(sbSelect.ToString(), conn))
{
conn.Open();
cmd.CommandType = CommandType.Text;
cmd.Parameters.AddRange(parameters);
using (OracleDataReader odr = cmd.ExecuteReader())
{
if (!odr.IsClosed && odr.HasRows)
{
while (odr.Read())
{
User userToReturn = new User();
FillUserEntity(userToReturn, odr);
usersList.Add(userToReturn);
}
}
}
}
}
return usersList.AsQueryable();
Even more bizarre is that when I run this query without pagination in the same method, it returns all records, more than 723,000.
Any help would be appreciated.
Thanks a lot.
By default, ODP.NET binds parameters by position, not by name. So you either need to invert the order when creating the OracleParameter array, or set the BindByName property to true, like this:
cmd.BindByName = true;
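Put together, a minimal sketch of that fix against the code in the question (same names as above; nothing new beyond BindByName):

```csharp
using (OracleCommand cmd = new OracleCommand(sbSelect.ToString(), conn))
{
    // With BindByName = true, ODP.NET matches :myReturnPage and
    // :myReturnSize by name, so the order of the parameter array
    // no longer matters.
    cmd.BindByName = true;
    cmd.Parameters.Add(new OracleParameter("myReturnPage", page));
    cmd.Parameters.Add(new OracleParameter("myReturnSize", size));
    // ...ExecuteReader as before
}
```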
Oracle tends to prefer stored procedures over direct text (because reasons). I've had more than a few "it works in SQL Developer but not .Net!" situations that were solved by putting it all together in a stored proc within a package on the database side. That also decouples your query from your application, so if the query has to change you don't have to recompile the app. Your app then just makes the same call as before, but to the stored procedure, probably using an OracleDataAdapter.
Can you confirm whether your query gives the correct output from the Oracle client?
The problem is with
where rownum <= :myReturnSize
It will always return the value rownum = :myReturnSize
One possible solution can be
select *
from (
select rownum as rnum, a.*
from (
Select rownum as rn, u.*
From users u
order by u.user_code
) a
)
where rnum <= :myReturnSize
and rn > (:myReturnPage-1) * :myReturnSize;
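As an aside (not part of the answer above): if the database is Oracle 12c or later, the row-limiting clause sidesteps the nested ROWNUM pattern entirely. A sketch with the same bind variables:

```sql
SELECT u.*
FROM users u
ORDER BY u.user_code
OFFSET (:myReturnPage - 1) * :myReturnSize ROWS
FETCH NEXT :myReturnSize ROWS ONLY
```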

How to update a db column comparing dates after checking a condition C#

I have a table column status (varchar, records "Paid"/"Unpaid" only) and another column nextDateOfPay (datetime, records a date). I need to update the status column to Unpaid for the records that are Paid once the present date has passed the date in the nextDateOfPay column. And I need to repeat this update for every row in the table. The primary key is the Id column (integer).
I have no clue where to start. Do I need to use T-SQL, or do I have to use a SQL Agent job? I'd like to know if there is a simple solution that I can write in the program itself. Thank you so much in advance!
The code so far is:
dateTimePicker3.Value = DateTime.Now;
textBox12.Enabled = false;
string newstat;
string query = "select Id,nextDateOfPay from gymTb where status='Paid'";
SqlConnection cn = new SqlConnection(cs);
SqlCommand cmd = new SqlCommand(query, cn);
DateTime x = DateTime.Now;
DateTime y;
if (cn.State.ToString() == "Closed")
{
cn.Open();
}
try
{
object dtx = cmd.ExecuteScalar();
y = Convert.ToDateTime(dtx);
int result = DateTime.Compare(x, y);
if (result >= 0)
newstat = "Unpaid";
}
catch (Exception ex)
{
MessageBox.Show(ex.Message);
}
A simple UPDATE query would do:
UPDATE gymTb SET status='Unpaid'
WHERE status='Paid' AND nextDateOfPay <= getdate()
You could call this from your application, triggered by manually pressing a button or perhaps using timer logic.
To make it run daily, and independent from your application, you can put the UPDATE statement in a recurring job in SQL Server Agent. For more on that see here:
how to schedule a job for sql query to run daily?
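If you do trigger it from the application instead, a minimal sketch (reusing the `cs` connection string from the question; table and column names as above):

```csharp
// Runs the same UPDATE from the app; ExecuteNonQuery returns the
// number of rows that were flipped from Paid to Unpaid.
string update = "UPDATE gymTb SET status='Unpaid' " +
                "WHERE status='Paid' AND nextDateOfPay <= GETDATE()";
using (SqlConnection cn = new SqlConnection(cs))
using (SqlCommand cmd = new SqlCommand(update, cn))
{
    cn.Open();
    int updated = cmd.ExecuteNonQuery();
}
```

This replaces the row-by-row ExecuteScalar/Compare logic in the question entirely; the database does the date comparison for every row in one statement.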

C# - SqlDataAdapter doesn't fill DataTable no matter what

So I'm trying to fill a DataTable with data from an MSSQL query, but for some reason it completely refuses to work, and it's making me very annoyed.
When I iterate through the results with a SqlDataReader and Read(), I get the results. But when I attempt to fill the DataTable with the SqlDataAdapter, the query appears in SQL Profiler and yet doesn't return any data.
I have no idea what has possessed my code, but maybe you can figure it out:
try
{
// Global variables
var connectionString = System.Configuration.ConfigurationManager.ConnectionStrings["DefaultConnection"].ConnectionString;
var textString = "Pasākums {0} sākas pēc {1}!";
var linkString = @"/Event/Index/{0}";
using (SqlConnection conn = new SqlConnection(connectionString))
{
// Set variables
var findIn24HrsEventsCmd = new SqlCommand(@"
SELECT adm.UserID, adm.EventID FROM [dbo].[EventAdmissions] AS adm WHERE EventID IN
(
SELECT EventID FROM [dbo].[Events]
WHERE DATEDIFF(hour, @date, StartTime) BETWEEN 0 AND 24
)
AND
(
SELECT COUNT(URL) FROM [dbo].[Notifications]
WHERE Type = 1 AND UserID = adm.UserID
AND URL LIKE '/Event/Index/'+CAST(adm.EventID AS VARCHAR(36))
) = 0", conn);
findIn24HrsEventsCmd.Parameters.Add(new SqlParameter("date", "2015-05-31 02:17:28.727"));
var test = new SqlCommand(@"SELECT * FROM [dbo].[EventAdmissions]", conn);
var findIn1HrEventsCmd = new SqlCommand(@"
SELECT adm.UserID, adm.EventID FROM [dbo].[EventAdmissions] AS adm WHERE EventID IN
(
SELECT EventID FROM [dbo].[Events]
WHERE DATEDIFF(minute, @date, StartTime) BETWEEN 0 AND 60
)
AND
(
SELECT COUNT(URL) FROM [dbo].[Notifications]
WHERE Type = 1 AND UserID = adm.UserID
AND URL LIKE '/Event/Index/'+CAST(adm.EventID AS VARCHAR(36))
) < 2", conn);
findIn1HrEventsCmd.Parameters.Add(new SqlParameter("date", "2015-05-31 02:17:28.727"));
var t = findIn1HrEventsCmd.CommandTimeout;
// Retrieve data
conn.Open();
log.Debug("Starting with the events that are on in an hour.");
// Do it first for events within an hour
var oneHrDataAdapter = new SqlDataAdapter(test);
var oneHrDt = new DataTable();
oneHrDataAdapter.Fill(oneHrDt);
findIn1HrEventsCmd.Dispose();
findIn24HrsEventsCmd.Dispose();
oneHrDataAdapter.Dispose();
}
} catch (Exception e)
{
log.Fatal("Fatal error!" + e.Message);
}
Note how I've replaced the complex queries with a very simple test query that definitely returns results in Management Studio and with the DataReader, but doesn't work with a DataTable for some reason. Note that it isn't timing out; the server is located on the same machine and the query runs for maybe 1-2 seconds at most.
The connection works, because as I mentioned before the DataReader approach works and also there are no exceptions thrown.
God damn, I never bothered to check the Rows property of the DataTable, turns out it did work.
I thought it didn't because, while in debugging mode, Visual Studio is very misleading: when you hover over the DataTable variable it just shows "{}", which usually would mean that the thing is empty.

In C#, is "SELECT TOP 0 * FROM (/* ... */) s" used in conjunction with ADO.NET a good way to determine the column information in a SELECT statement?

I have a SQL SELECT statement which will not be known until runtime, and which could contain JOINs and inner selects. I need to determine the names and data types of each of the columns of the returned result of the statement from within C#. I am inclined to do something like:
string originalSelectStatement = "SELECT * FROM MyTable";
string selectStatement = string.Format("SELECT TOP 0 * FROM ({0}) s", originalSelectStatement);
SqlConnection connection = new SqlConnection(@"MyConnectionString");
SqlDataAdapter adapter = new SqlDataAdapter(selectStatement, connection);
DataTable table = new DataTable();
adapter.Fill(table);
foreach (DataColumn column in table.Columns)
{
Console.WriteLine("Name: {0}; Type: {1}", column.ColumnName, column.DataType);
}
Is there a better way to do what I am trying to do? By "better" I mean either a less resource-intensive way of accomplishing the same task or a more sure way of accomplishing the same task (i.e. for all I know the code snippet I just gave will fail in some situations).
SOLUTION:
First of all, my TOP 0 hack is bad, namely for something like this:
SELECT TOP 0 * FROM (SELECT 0 AS A, 1 AS A) S
In other words, in a sub-select, if two things are aliased to the same name, that throws an error. So it is out of the picture. However, for completeness sake, I went ahead and tested it, along with the two proposed solutions: SET FMTONLY ON and GetSchemaTable.
Here are the results (in milliseconds for 1,000 queries, each):
Schema Time: 3130
TOP 0 Time: 2808
FMTONLY ON Time: 2937
My recommendation would be GetSchemaTable since it's more likely to be future-proofed by a removal of the SET FMTONLY ON as valid SQL and it solves the aliasing problem, even though it is slightly slower. However, if you "know" that duplicate column names will never be an issue, then TOP 0 is faster than GetSchemaTable and is more future-proofed than SET FMTONLY ON.
Here is my experimental code:
int schemaTime = 0;
int topTime = 0;
int fmtOnTime = 0;
SqlConnection connection = new SqlConnection(@"MyConnectionString");
connection.Open();
SqlCommand schemaCommand = new SqlCommand("SELECT * FROM MyTable", connection);
SqlCommand topCommand = new SqlCommand("SELECT TOP 0 * FROM (SELECT * FROM MyTable) S", connection);
SqlCommand fmtOnCommand = new SqlCommand("SET FMTONLY ON; SELECT * FROM MyTable", connection);
for (int i = 0; i < 1000; i++)
{
{
DateTime start = DateTime.Now;
using (SqlDataReader reader = schemaCommand.ExecuteReader(CommandBehavior.SchemaOnly))
{
DataTable table = reader.GetSchemaTable();
}
DateTime stop = DateTime.Now;
TimeSpan span = stop - start;
schemaTime += (int)span.TotalMilliseconds;
}
{
DateTime start = DateTime.Now;
DataTable table = new DataTable();
SqlDataAdapter adapter = new SqlDataAdapter(topCommand);
adapter.Fill(table);
DateTime stop = DateTime.Now;
TimeSpan span = stop - start;
topTime += (int)span.TotalMilliseconds;
}
{
DateTime start = DateTime.Now;
DataTable table = new DataTable();
SqlDataAdapter adapter = new SqlDataAdapter(fmtOnCommand);
adapter.Fill(table);
DateTime stop = DateTime.Now;
TimeSpan span = stop - start;
fmtOnTime += (int)span.TotalMilliseconds;
}
}
Console.WriteLine("Schema Time: " + schemaTime);
Console.WriteLine("TOP 0 Time: " + topTime);
Console.WriteLine("FMTONLY ON Time: " + fmtOnTime);
connection.Close();
You could use GetSchemaTable to do what you want.
There is an example of how to use it here.
If using SQL Server, I would try using SET FMTONLY ON
Returns only metadata to the client. Can be used to test the format of
the response without actually running the query.
Apparently on SQL Server 2012, there's a better way. All is specified in the linked MSDN article.
BTW, this technique is what LINQ To SQL uses internally to determine the result set returned by a stored procedure, etc.
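The linked article isn't reproduced here, but the SQL Server 2012 feature being referred to is presumably sp_describe_first_result_set, which returns one metadata row per column without executing the query:

```sql
-- Returns name, system_type_name, is_nullable, etc. for each column
-- of the first result set the batch would produce.
EXEC sys.sp_describe_first_result_set
    @tsql = N'SELECT * FROM MyTable';
```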
Dynamic SQL is always a bit of a minefield, but you could use SET FMTONLY ON on your query - this means the query will only return metadata, the same as if no results were returned. So:
string selectStatement = string.Format("SET FMTONLY ON; {0}", originalSelectStatement);
Alternatively, if you aren't tied to ADO, could you not go down the Linq-to-SQL route and generate a data context which will map out all of your database schemas in to code and their relevant types? You could also have a look at some of the Micro ORMs out there, such as Dapper.Net
There are plenty of other ORMs out there too.

Segmented Data Load: Table records to xml

I have a sequence of sql queries that result in very large datasets that I have to query against a database and write them to files. I have about 80 queries and each one produces somewhere between 1000 records to 10,000,000 records. I cannot change the queries themselves. What I'm trying to do is read 500,000 records at a time for each query and write to a file. Here's what I have so far
void WriteXml(string tableName, string queryString)
{
int pageSize = 500000;
int currentIndex = 0;
using (
SqlConnection connection =
new SqlConnection(CONNECTION_STRING))
{
using (SqlCommand command = new SqlCommand(queryString, connection))
{
try
{
connection.Open();
SqlDataAdapter dataAdapter = new SqlDataAdapter(command);
int rowsRead = 0, count = 0, index = 0;
do
{
DataSet dataSet = new DataSet("SomeDatasetName");
rowsRead = dataAdapter.Fill(dataSet, currentIndex, pageSize, tableName);
currentIndex += rowsRead;
if (dataSet.Tables.Count > 0 && rowsRead > 0)
{
dataSet.Tables[0].WriteXml(string.Format(@"OutputXml\{0}_{1}.xml", tableName, index++),
XmlWriteMode.WriteSchema);
}
}
while (rowsRead > 0);
}
catch (Exception e)
{
Log(e);
}
}
}
}
This works, but it's very, very slow. I'm pretty sure I'm doing something wrong here because when I run it, the application hogs most of my memory (I have 6GB) and it takes forever to run. I started it last night and it is still running. I understand I'm dealing with a lot of records, but I don't think it's something that should take so many hours to run.
Is this the right way to do paged/segmented data read from a database? Is there any way this method could be optimized or is there any other way I can approach this?
Do let me know if I'm not clear on anything and I'll try to provide clarification.
The paging overloads for DataAdapter.Fill still get the entire result set beneath the covers. Read here:
http://msdn.microsoft.com/en-us/library/tx1c9c2f%28vs.71%29.aspx
the part that pertains to your question:
The DataAdapter provides a facility for returning only a page of data,
through overloads of the Fill method. However, this might not be the
best choice for paging through large query results because, while the
DataAdapter fills the target DataTable or DataSet with only the
requested records, the resources to return the entire query are still
used. To return a page of data from a data source without using the
resources required to return the entire query, specify additional
criteria for your query that reduces the rows returned to only those
required.
In Linq2Sql, there are convenient methods Skip and Take for paging through data. You could roll your own by using a parameterized query constructed to do the same thing. Here is an example to skip 100, and take 20 rows:
SELECT TOP 20 [t0].[CustomerID], [t0].[CompanyName]
FROM [Customers] AS [t0]
WHERE (NOT (EXISTS(
SELECT NULL AS [EMPTY]
FROM (
SELECT TOP 100 [t1].[CustomerID]
FROM [Customers] AS [t1]
WHERE [t1].[City] = @p0
ORDER BY [t1].[CustomerID]
) AS [t2]
WHERE [t0].[CustomerID] = [t2].[CustomerID]
))) AND ([t0].[City] = @p1)
ORDER BY [t0].[CustomerID]
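For what it's worth, on SQL Server 2012 and later the same skip-100/take-20 shape can be written directly with OFFSET/FETCH, which keeps the parameterized-query approach but is much easier to generate by hand:

```sql
SELECT [t0].[CustomerID], [t0].[CompanyName]
FROM [Customers] AS [t0]
WHERE [t0].[City] = @p0
ORDER BY [t0].[CustomerID]
OFFSET 100 ROWS FETCH NEXT 20 ROWS ONLY
```

As with the Skip/Take pattern, the database only returns the requested page, so the adapter never has to materialize the full result set.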
