I have C# code that performs a SQL UPDATE, and it may perform many updates at a time. The table I am updating has a column called SortOrder, so while doing these multiple updates I would like them to run in order of the SortOrder column. Is this even possible?
Here is my code:
public void PostScheduledTasks(List<CellModel> cells)
{
conn = new SqlConnection(connectionString);
cmd = new SqlCommand(
@"UPDATE ScheduleTasks_Copy
SET
ActualStart=@actualStart,
ActualFinish=@actualFinish,
ActualEndDate=@actualEndDate,
UserDate1=@userDateOne,
IsCompleted=@isCompleted
WHERE ScheduleTaskID = @scheduleTaskID");
cmd.Parameters.Add("@isCompleted", System.Data.SqlDbType.Bit);
cmd.Parameters.Add("@userDateOne", System.Data.SqlDbType.DateTime);
cmd.Parameters.Add("@actualStart", System.Data.SqlDbType.DateTime);
cmd.Parameters.Add("@actualFinish", System.Data.SqlDbType.DateTime);
cmd.Parameters.Add("@actualEndDate", System.Data.SqlDbType.DateTime);
cmd.Parameters.Add("@scheduleTaskID", System.Data.SqlDbType.Int);
cmd.Connection = conn;
conn.Open();
for (int i = 0; i < cells.Count; i++)
{
cmd.Parameters["@isCompleted"].Value = (cells[i].selected == true) ? 1 : 0;
cmd.Parameters["@userDateOne"].Value = !string.IsNullOrEmpty(cells[i].scheduledDate) ? cells[i].scheduledDate : (object)DBNull.Value;
cmd.Parameters["@actualStart"].Value = !string.IsNullOrEmpty(cells[i].actualDate) ? cells[i].actualDate : (object)DBNull.Value;
cmd.Parameters["@actualFinish"].Value = !string.IsNullOrEmpty(cells[i].finishedDate) ? cells[i].finishedDate : (object)DBNull.Value;
cmd.Parameters["@actualEndDate"].Value = !string.IsNullOrEmpty(cells[i].finishedDate) ? cells[i].finishedDate : (object)DBNull.Value;
cmd.Parameters["@scheduleTaskID"].Value = cells[i].scheduleTaskID;
cmd.ExecuteNonQuery();
}
conn.Close();
}
If "SortOrder" can be ascertained from the source object "cells", like other attributes:
Then the most efficient approach is to sort cells by SortOrder before you start iterating. Since cells is a List&lt;CellModel&gt;, List&lt;T&gt;.Sort or LINQ's OrderBy will do the job, provided CellModel exposes (or can be given) the sort key.
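For example (a sketch; the question doesn't show CellModel, so the sortOrder member here is an assumption), sorting in place with List&lt;T&gt;.Sort keeps the existing update loop unchanged:

```csharp
using System;
using System.Collections.Generic;

public class CellModel
{
    // Hypothetical members; the real CellModel presumably also carries
    // selected, the date strings, etc.
    public int scheduleTaskID;
    public int sortOrder;
}

public static class SortDemo
{
    public static void Main()
    {
        var cells = new List<CellModel>
        {
            new CellModel { scheduleTaskID = 7, sortOrder = 3 },
            new CellModel { scheduleTaskID = 4, sortOrder = 1 },
            new CellModel { scheduleTaskID = 9, sortOrder = 2 },
        };

        // Sort in place so the existing for-loop issues its UPDATEs in
        // SortOrder sequence; no SQL changes needed.
        cells.Sort((a, b) => a.sortOrder.CompareTo(b.sortOrder));

        foreach (var c in cells)
            Console.WriteLine(c.scheduleTaskID); // 4, then 9, then 7
    }
}
```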
If SortOrder can only be ascertained by querying the database:
Then, not surprisingly, you'll need to query the database:
SELECT ScheduleTaskID, SortOrder FROM ScheduleTasks_Copy ORDER BY SortOrder
You iterate through that rowset, grabbing the ScheduleTaskID each time. For each ScheduleTaskID, iterate through "cells" until you find the matching task (cells[i].scheduleTaskID == TaskID), and THEN do the database update using the matching task from the source table.
Here is very rough code, I haven't written C# in a while:
using (var connection = new SqlConnection(connectionString))
{
SqlCommand command = new SqlCommand("SELECT ScheduleTaskID, SortOrder FROM ScheduleTasks_Copy ORDER BY SortOrder;", connection);
connection.Open();
SqlDataReader reader = command.ExecuteReader();
while (reader.Read())
{
int taskid = reader.GetInt32(0);
for (int i = 0; i < cells.Count; i++)
{
if (cells[i].scheduleTaskID == taskid)
{
// "cmd" is the parameterized UPDATE command from the question. It must
// run on a second connection (or with MARS enabled), because this
// connection is busy serving the data reader.
cmd.Parameters["@isCompleted"].Value = (cells[i].selected == true) ? 1 : 0;
cmd.Parameters["@userDateOne"].Value = !string.IsNullOrEmpty(cells[i].scheduledDate) ? cells[i].scheduledDate : (object)DBNull.Value;
cmd.Parameters["@actualStart"].Value = !string.IsNullOrEmpty(cells[i].actualDate) ? cells[i].actualDate : (object)DBNull.Value;
cmd.Parameters["@actualFinish"].Value = !string.IsNullOrEmpty(cells[i].finishedDate) ? cells[i].finishedDate : (object)DBNull.Value;
cmd.Parameters["@actualEndDate"].Value = !string.IsNullOrEmpty(cells[i].finishedDate) ? cells[i].finishedDate : (object)DBNull.Value;
cmd.Parameters["@scheduleTaskID"].Value = cells[i].scheduleTaskID;
cmd.ExecuteNonQuery();
}
}
}
reader.Close();
}
Taking a step back and ignoring your actual question about running the updates in a particular order, let's assume the problem isn't "can I run a bunch of single updates concurrently?" but rather "how do I update each of the items in my List&lt;CellModel&gt; cells as quickly and efficiently as possible?"
Running single updates sequentially clearly doesn't scale as cells.Count gets large, because the cost of each update is dwarfed by the overhead of a network round trip per row. It follows that running a bunch of single updates concurrently probably isn't the most efficient solution either.
Instead, let's move the whole cells list to the server in one go, so the server has everything it needs to process the request. If cells.Count is large, SqlBulkCopy (as suggested by Will) will be hard to beat; for smaller sets a Table-Valued Parameter works fine.
Once the data is on the server, all you need is a single UPDATE joining against the temp table/variable and you're done (MSDN has good examples of this technique). Do note, however, that if cells contains duplicate scheduleTaskID values you may need a little extra pre-processing (ensuring that the highest sortOrder wins, etc.)
Example:
create table #temp(scheduleTaskID int, isCompleted bit, userDateOne datetime, actualStart datetime, actualFinish datetime, actualEndDate datetime)
--populate #temp via SqlBulkCopy, etc
update st
set st.ActualStart = t.actualStart
, st.ActualFinish = t.actualFinish
, st.ActualEndDate = t.actualEndDate
, st.UserDate1 = t.userDateOne
, st.IsCompleted = t.isCompleted
from dbo.ScheduleTasks_Copy st
inner join #temp t on st.ScheduleTaskID = t.scheduleTaskID
drop table #temp
The join syntax for updates isn't as well known as it probably should be - it becomes ridiculously useful once you get used to the slight awkwardness...
You would have to grab the values of the SortOrder column initially, and then iterate through them updating with SortOrder = X in the WHERE clause:
var dt = new DataTable();
// Fill dt from a query like: SELECT SortOrder FROM ScheduleTasks_Copy ORDER BY SortOrder
var conn = new SqlConnection(connectionString);
conn.Open();
foreach (DataRow dr in dt.Rows)
{
cmd = new SqlCommand(
@"UPDATE ScheduleTasks_Copy
SET
ActualStart=@actualStart,
ActualFinish=@actualFinish,
ActualEndDate=@actualEndDate,
UserDate1=@userDateOne,
IsCompleted=@isCompleted
WHERE ScheduleTaskID = @scheduleTaskID
AND SortOrder = @sortOrder", conn);
cmd.Parameters.Add("@isCompleted", System.Data.SqlDbType.Bit);
cmd.Parameters.Add("@userDateOne", System.Data.SqlDbType.DateTime);
cmd.Parameters.Add("@actualStart", System.Data.SqlDbType.DateTime);
cmd.Parameters.Add("@actualFinish", System.Data.SqlDbType.DateTime);
cmd.Parameters.Add("@actualEndDate", System.Data.SqlDbType.DateTime);
cmd.Parameters.Add("@scheduleTaskID", System.Data.SqlDbType.Int);
cmd.Parameters.Add("@sortOrder", System.Data.SqlDbType.Int).Value = dr["SortOrder"];
// ...set the remaining parameter values from the matching cell, then:
cmd.ExecuteNonQuery();
}
conn.Close();
(I've not tested this code for syntax, but it gets across the general idea)
Though it depends on your reason for wanting this; I can see only a limited number of cases where it would actually be required.
Does CellModel have a sortOrder property? It should, and in that case you can add this line:
cells = cells.OrderBy(o => o.sortOrder).ToList(); // order your cells first
for (int i = 0; i < cells.Count; i++) {
...
}
If CellModel doesn't have a sortOrder property, you'll need another way to sort the collection.
Another solution would be to get the ScheduleTaskID's that you expect to update ordered by sortOrder using:
select ScheduleTaskID from ScheduleTasks_Copy order by SortOrder
Then iterate over those IDs, in that order, and for each one find the item in cells with the matching ScheduleTaskID and perform its update.
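Scanning cells for every ID makes that matching quadratic; a sketch (assuming unique scheduleTaskID values, and a pared-down CellModel since the question doesn't show the full type) that indexes cells by ID first keeps it linear:

```csharp
using System.Collections.Generic;
using System.Linq;

public class CellModel
{
    public int scheduleTaskID; // plus the other members from the question
}

public static class OrderedMatch
{
    // orderedTaskIds: the IDs as returned by
    //   select ScheduleTaskID from ScheduleTasks_Copy order by SortOrder
    // Yields the matching cells in that same order; assumes each
    // scheduleTaskID appears at most once in cells.
    public static IEnumerable<CellModel> InSortOrder(
        IEnumerable<int> orderedTaskIds, List<CellModel> cells)
    {
        var byId = cells.ToDictionary(c => c.scheduleTaskID);
        foreach (int id in orderedTaskIds)
        {
            if (byId.TryGetValue(id, out var cell))
                yield return cell; // run the parameterized UPDATE for this cell
        }
    }
}
```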
I believe what you could do is upload your data into a temp table, then use a server-side cursor to roll through and update all your rows in the specified order. This would probably be easier if you are able to use SQL user-defined types and stored procedures.
var conn = new SqlConnection(connectionString);
conn.Open();
SqlCommand cmd = new SqlCommand("create table ##Tasks(id int, isCompleted bit, userDateOne datetime, actualStart datetime, actualFinish datetime, actualEndDate datetime)", conn);
cmd.ExecuteNonQuery();
DataTable localTempTable = new DataTable("Tasks");
localTempTable.Columns.Add("id", typeof(int));
localTempTable.Columns.Add("isCompleted", typeof(bool));
localTempTable.Columns.Add("userDateOne", typeof(DateTime));
localTempTable.Columns.Add("actualStart", typeof(DateTime));
localTempTable.Columns.Add("actualFinish", typeof(DateTime));
localTempTable.Columns.Add("actualEndDate", typeof(DateTime));
for (int i = 0; i < cells.Count; i++)
{
var row = localTempTable.NewRow();
row["id"] = cells[i].scheduleTaskID;
row["isCompleted"] = (cells[i].selected == true);
row["userDateOne"] = !string.IsNullOrEmpty(cells[i].scheduledDate) ? cells[i].scheduledDate : (object)DBNull.Value;
row["actualStart"] = !string.IsNullOrEmpty(cells[i].actualDate) ? cells[i].actualDate : (object)DBNull.Value;
row["actualFinish"] = !string.IsNullOrEmpty(cells[i].finishedDate) ? cells[i].finishedDate : (object)DBNull.Value;
row["actualEndDate"] = !string.IsNullOrEmpty(cells[i].finishedDate) ? cells[i].finishedDate : (object)DBNull.Value;
localTempTable.Rows.Add(row); // the row must actually be added to the table
}
localTempTable.AcceptChanges();
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn))
{
bulkCopy.DestinationTableName = "##Tasks";
bulkCopy.WriteToServer(localTempTable);
}
cmd = new SqlCommand(@"
declare orderedUpdate cursor
for
select tempTasks.* from ##Tasks tempTasks
inner join ScheduleTasks_Copy tasks on tempTasks.id = tasks.ScheduleTaskID
order by tasks.SortOrder;
declare @id int, @isCompleted bit, @userDateOne datetime, @actualStart datetime, @actualFinish datetime, @actualEndDate datetime;
open orderedUpdate;
fetch next from orderedUpdate into @id, @isCompleted, @userDateOne, @actualStart, @actualFinish, @actualEndDate;
while @@fetch_status = 0
begin
UPDATE ScheduleTasks_Copy
SET
ActualStart=@actualStart,
ActualFinish=@actualFinish,
ActualEndDate=@actualEndDate,
UserDate1=@userDateOne,
IsCompleted=@isCompleted
WHERE ScheduleTaskID = @id;
-- fetch the next row here, or the loop never terminates
fetch next from orderedUpdate into @id, @isCompleted, @userDateOne, @actualStart, @actualFinish, @actualEndDate;
end
close orderedUpdate;
deallocate orderedUpdate;", conn);
cmd.ExecuteNonQuery();
While it is still unclear why there is a desire to update the records in any particular order (is there a trigger that causes some downstream effect that would benefit from being processed in sortOrder order?), it should be pointed out that doing all of the updates in a single UPDATE statement renders the ordering irrelevant. A few others have already noted that a set-based approach will be much faster (and it certainly will be), but being done in a single UPDATE means that, transactionally, there is no order on a practical level.
Making this set-based via SqlBulkCopy has already been mentioned, but my preference is for using Table-Valued Parameters (TVPs), as they allow for streaming the Cells collection straight to the stored procedure that will do the UPDATE (well, technically via [tempdb], but close enough). Conversely, using SqlBulkCopy means first making a copy of the existing Cells collection in the form of a DataTable.
Below is both the T-SQL and C# code for converting the existing process to using a TVP. It should be noted that if there is still some reason for doing this in a particular order, the only thing to change would be how the table variable in the stored procedure is handled.
T-SQL code
-- First: You need a User-Defined Table Type
CREATE TYPE dbo.ScheduleTasksImport AS TABLE
(
ScheduleTaskID INT NOT NULL, -- optionally mark this field as PRIMARY KEY
IsCompleted BIT NOT NULL,
ActualStart DATETIME NULL,
ActualFinish DATETIME NULL,
ActualEndDate DATETIME NULL,
UserDate1 DATETIME NULL
);
GO
GRANT EXECUTE ON TYPE::[dbo].[ScheduleTasksImport] TO [user_or_role];
GO
-- Second: Use the UDTT as an input param to an import proc.
-- Hence "Table-Valued Parameter" (TVP)
CREATE PROCEDURE dbo.ImportData (
@ImportTable dbo.ScheduleTasksImport READONLY
)
AS
SET NOCOUNT ON;
UPDATE stc
SET stc.ActualStart = imp.ActualStart,
stc.ActualFinish = imp.ActualFinish,
stc.ActualEndDate = imp.ActualEndDate,
stc.UserDate1 = imp.UserDate1,
stc.IsCompleted = imp.IsCompleted
FROM ScheduleTasks_Copy stc
INNER JOIN @ImportTable imp
ON imp.ScheduleTaskID = stc.ScheduleTaskID
GO
GRANT EXECUTE ON dbo.ImportData TO [user_or_role];
C# code
Part 1: Define the method that will take the collection and return it as IEnumerable<SqlDataRecord>
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using Microsoft.SqlServer.Server;
private static IEnumerable<SqlDataRecord> SendRows(List<CellModel> Cells)
{
SqlMetaData[] _TvpSchema = new SqlMetaData[] {
new SqlMetaData("ScheduleTaskID", SqlDbType.Int),
new SqlMetaData("IsCompleted", SqlDbType.Bit),
new SqlMetaData("ActualStart", SqlDbType.DateTime),
new SqlMetaData("ActualFinish", SqlDbType.DateTime),
new SqlMetaData("ActualEndDate", SqlDbType.DateTime),
new SqlMetaData("UserDate1", SqlDbType.DateTime)
};
SqlDataRecord _DataRecord = new SqlDataRecord(_TvpSchema);
// read a row, send a row
for (int _Index = 0; _Index < Cells.Count; _Index++)
{
// Unlike BCP and BULK INSERT, you have the option here to create an
// object, do manipulation(s) / validation(s) on the object, then pass
// the object to the DB or discard via "continue" if invalid.
_DataRecord.SetInt32(0, Cells[_Index].scheduleTaskID);
_DataRecord.SetBoolean(1, Cells[_Index].selected); // IsCompleted
// CellModel's date members are strings that may be empty, so parse them
// and send NULL when missing (SetValue accepts DBNull.Value):
_DataRecord.SetValue(2, string.IsNullOrEmpty(Cells[_Index].actualDate) ? (object)DBNull.Value : DateTime.Parse(Cells[_Index].actualDate)); // ActualStart
_DataRecord.SetValue(3, string.IsNullOrEmpty(Cells[_Index].finishedDate) ? (object)DBNull.Value : DateTime.Parse(Cells[_Index].finishedDate)); // ActualFinish
_DataRecord.SetValue(4, string.IsNullOrEmpty(Cells[_Index].finishedDate) ? (object)DBNull.Value : DateTime.Parse(Cells[_Index].finishedDate)); // ActualEndDate
_DataRecord.SetValue(5, string.IsNullOrEmpty(Cells[_Index].scheduledDate) ? (object)DBNull.Value : DateTime.Parse(Cells[_Index].scheduledDate)); // UserDate1
yield return _DataRecord;
}
}
Part 2: Replace your current PostScheduledTasks method with the following, which just executes the ImportData stored procedure. When the stored procedure is executed, it will ask for the value of the @ImportTable input parameter, which will start the process of streaming the records across.
public static void PostScheduledTasks(List<CellModel> Cells)
{
SqlConnection _Connection = new SqlConnection(connectionString);
SqlCommand _Command = new SqlCommand("ImportData", _Connection);
_Command.CommandType = CommandType.StoredProcedure;
SqlParameter _TVParam = new SqlParameter();
_TVParam.ParameterName = "@ImportTable";
_TVParam.SqlDbType = SqlDbType.Structured;
_TVParam.Value = SendRows(Cells); // method return value is streamed data
_Command.Parameters.Add(_TVParam);
try
{
_Connection.Open();
// Send the data and process the UPDATE
_Command.ExecuteNonQuery();
}
finally
{
_Connection.Close();
}
return;
}
Please note that the two C# listings really belong in the same file; they were broken apart here for readability. If they are put into separate .cs files, the using statements will need to be copied into the second listing.
Ignoring the obvious question about why you would want to do this, you can try this hack:
Insert the data into a temp table (using SqlBulkCopy), then use a MERGE with an ORDER BY clause in the source. In this example I'm using ScheduleTasks_Copy as the "temp" table and ScheduleTasks as the destination.
MERGE ScheduleTasks AS T
USING (SELECT TOP 99999999 C.*
FROM ScheduleTasks_Copy C
JOIN ScheduleTasks S
ON C.ID = S.ID
ORDER BY S.SortOrder ASC) AS S
ON S.ID = T.ID
WHEN MATCHED THEN
UPDATE SET T.ActualStart = S.ActualStart,
T.ActualFinish = S.ActualFinish,
T.ActualEndDate = S.ActualEndDate,
T.UserDate1 = S.UserDate1,
T.IsCompleted = S.IsCompleted,
T.UpdatedOn = dbo.GetDateValue();
Don't worry about the silly GetDateValue function and UpdatedOn column. I just included them so that you could see in which order the updates were performed. If you want to update the records in reverse order, just change the sort order to DESC.
http://sqlfiddle.com/#!6/dbc5f/16
Personally, I suggest avoiding things like this and redesigning your solution so that it avoids the issue altogether.
Related
I have a piece of C# code which updates two specific columns for ~1000x20 records in a database on the localhost. As far as I know (though I am really far from being a database expert), it should not take long, but it takes more than 5 minutes.
I tried SQL transactions, with no luck. SqlBulkCopy seems a bit overkill, since it's a large table with dozens of columns and I only have to update one or two columns for a set of records, so I would like to keep it simple. Is there a better approach to improve efficiency?
The code itself:
public static bool UpdatePlayers(List<Match> matches)
{
using (var connection = new SqlConnection(Database.myConnectionString))
{
connection.Open();
SqlCommand cmd = connection.CreateCommand();
foreach (Match m in matches)
{
cmd.CommandText = "";
foreach (Player p in m.Players)
{
// Some player specific calculation, which takes almost no time.
p.Morale = SomeSpecificCalculationWhichMilisecond();
p.Condition = SomeSpecificCalculationWhichMilisecond();
cmd.CommandText += "UPDATE [Players] SET [Morale] = @morale, [Condition] = @condition WHERE [ID] = @id;";
cmd.Parameters.AddWithValue("@morale", p.Morale);
cmd.Parameters.AddWithValue("@condition", p.Condition);
cmd.Parameters.AddWithValue("@id", p.ID);
}
cmd.ExecuteNonQuery();
}
}
return true;
}
Updating 20,000 records one at a time is a slow process, so taking over 5 minutes is to be expected.
From your query, I would suggest putting the data into a temp table, then joining the temp table to the update. This way it only has to scan the table to update once, and update all values.
Note: it could still take a while to do the update if you have indexes on the fields you are updating and/or there is a large amount of data in the table.
Example update query:
UPDATE P
SET [Morale] = TT.[Morale], [Condition] = TT.[Condition]
FROM [Players] AS P
INNER JOIN #TempTable AS TT ON TT.[ID] = P.[ID];
Populating the temp table
How to get the data into the temp table is up to you. I suspect you could use SqlBulkCopy but you might have to put it into an actual table, then delete the table once you are done.
If possible, I recommend putting a Primary Key on the ID column in the temp table. This may speed up the update process by making it faster to find the related ID in the temp table.
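A sketch of that approach (untested against a live server; it assumes the DataTable's columns are named and typed to match the temp table, in the same order, and that Morale/Condition are floats): create the temp table and bulk copy into it on the same open connection, since a #TempTable is only visible to the session that created it:

```csharp
using System.Data;
using System.Data.SqlClient;

public static class TempTableUpdate
{
    // Sketch: push (ID, Morale, Condition) rows into a session-scoped
    // temp table, then update Players with a single join.
    public static void BulkUpdate(DataTable rows, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Primary key on ID, as recommended above.
            new SqlCommand(
                "CREATE TABLE #TempTable (ID int PRIMARY KEY, Morale float, [Condition] float);",
                conn).ExecuteNonQuery();

            using (var bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "#TempTable";
                bulk.WriteToServer(rows);
            }

            // One scan, one set-based update.
            new SqlCommand(@"
                UPDATE P
                SET [Morale] = TT.[Morale], [Condition] = TT.[Condition]
                FROM [Players] AS P
                INNER JOIN #TempTable AS TT ON TT.[ID] = P.[ID];",
                conn).ExecuteNonQuery();
        } // connection closes; #TempTable is dropped automatically
    }
}
```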
Minor improvements;
use a string builder for the command text
ensure your parameter names are actually unique
clear your parameters for the next use
depending on how many players in each match, batch N commands together rather than 1 match.
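The first four points together might look something like this sketch (untested; the batch size is arbitrary, the tuple stands in for the question's Player objects, and double is a guess at the Morale/Condition types):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Text;

public static class BatchedUpdates
{
    // Sketch: batch N single-row UPDATEs into one command, with unique
    // parameter names per statement, clearing parameters between batches.
    public static void UpdatePlayers(
        SqlCommand cmd,
        IEnumerable<(int Id, double Morale, double Condition)> players,
        int batchSize = 50)
    {
        var sql = new StringBuilder();
        cmd.Parameters.Clear();
        int n = 0;

        foreach (var p in players)
        {
            // Suffix every parameter name with the statement index so
            // names stay unique within the batch.
            sql.AppendFormat(
                "UPDATE [Players] SET [Morale] = @morale{0}, [Condition] = @condition{0} WHERE [ID] = @id{0};", n);
            cmd.Parameters.AddWithValue("@morale" + n, p.Morale);
            cmd.Parameters.AddWithValue("@condition" + n, p.Condition);
            cmd.Parameters.AddWithValue("@id" + n, p.Id);

            if (++n == batchSize)
            {
                cmd.CommandText = sql.ToString();
                cmd.ExecuteNonQuery(); // one round trip per N rows
                sql.Clear();
                cmd.Parameters.Clear();
                n = 0;
            }
        }

        if (n > 0) // flush the final partial batch
        {
            cmd.CommandText = sql.ToString();
            cmd.ExecuteNonQuery();
        }
    }
}
```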
Bigger improvement;
use a table-valued parameter and a MERGE statement, which should look something like this (untested):
CREATE TYPE [MoraleUpdate] AS TABLE (
[Id] ...,
[Condition] ...,
[Morale] ...
)
GO
MERGE [dbo].[Players] AS [Target]
USING @Updates AS [Source]
ON [Target].[Id] = [Source].[Id]
WHEN MATCHED THEN
UPDATE SET [Morale] = [Source].[Morale],
[Condition] = [Source].[Condition];
DataTable dt = new DataTable();
dt.Columns.Add("Id", typeof(...));
dt.Columns.Add("Morale", typeof(...));
dt.Columns.Add("Condition", typeof(...));
foreach(...){
dt.Rows.Add(p.Id, p.Morale, p.Condition);
}
SqlParameter sqlParam = cmd.Parameters.AddWithValue("@Updates", dt);
sqlParam.SqlDbType = SqlDbType.Structured;
sqlParam.TypeName = "dbo.[MoraleUpdate]";
cmd.ExecuteNonQuery();
You could also implement a DbDataReader to stream the values to the server while you are calculating them.
I've read several dozen posts, many dating back years, and cannot come up with a modern, safe and reliable way to update a special value in several thousand records as a single query.
I loop over all the records in the table, determine a DateTime value based on some special logic and then run this simple query to update that value... over 3500 times. That's a lot of trips over the wire.
UPDATE ScheduleTickets
SET ScheduledStartUTC = @ScheduledStartUTC
WHERE ScheduleId = @ScheduleId AND PatchSessionId = @PatchSessionId
I've seen comments to not waste memory by saving to and using a DataTable. I've seen solutions that use a StringBuilder to dynamically create an update query but that feels insecure/dirty. Sure, the entire process takes less than a minute but there must be a better way.
So, after figuring out the DateTime value, I call...
UpdateScheduleTicketStart(ScheduleId, PatchSessionId, scheduledDateTime);
Which looks like this...
private static void UpdateScheduleTicketStart(long scheduleId, long patchSessionId, DateTime scheduledStartUTC)
{
using (SqlConnection c = ConnectVRS())
{
SqlCommand cmd = new SqlCommand(@"
UPDATE ScheduleTickets
SET ScheduledStartUTC = @ScheduledStartUTC
WHERE ScheduleId = @ScheduleId AND PatchSessionId = @PatchSessionId
", c);
cmd.Parameters.Add("@ScheduleId", SqlDbType.BigInt).Value = scheduleId;
cmd.Parameters.Add("@PatchSessionId", SqlDbType.BigInt).Value = patchSessionId;
cmd.Parameters.Add("@ScheduledStartUTC", SqlDbType.DateTime).Value = scheduledStartUTC;
cmd.ExecuteNonQuery();
}
}
How can I pass all the values to SQL Server in one call or how can I create a single SQL query to do the updates in one fell swoop?
Many people have suggested using a TableValueParameter, and I agree it would be a good method. Here is an example of how you could do that:
First Create a TVP and Stored Proc in SQL Server
CREATE TYPE [dbo].[ScheduleTicketsType] As Table
(
ScheduledStartUTC DATETIME NOT NULL
, ScheduleId INT NOT NULL
, PatchSessionId INT NOT NULL
)
GO
CREATE PROCEDURE [dbo].[usp_UpdateTickets]
(
@ScheduleUpdates As [dbo].[ScheduleTicketsType] Readonly
)
AS
Begin
UPDATE t1
SET t1.ScheduledStartUTC = t2.ScheduledStartUTC
FROM ScheduleTickets AS t1
INNER JOIN @ScheduleUpdates AS t2
ON t1.ScheduleId = t2.ScheduleId AND
t1.PatchSessionId = t2.PatchSessionId
End
Next Modify your code to populate a table and pass that as a parameter to the stored proc:
private void Populate()
{
DataTable dataTable = new DataTable("ScheduleTicketUpdates");
//we create column names as per the type in DB
dataTable.Columns.Add("ScheduledStartUTC", typeof(DateTime));
dataTable.Columns.Add("ScheduleId", typeof(Int32));
dataTable.Columns.Add("PatchSessionId", typeof(Int32));
//write you loop to populate here
//call the stored proc
using (var conn = new SqlConnection(connString))
{
var command = new SqlCommand("[usp_UpdateTickets]", conn);
command.CommandType = CommandType.StoredProcedure;
var parameter = new SqlParameter();
//The parameter for the SP must be of SqlDbType.Structured
parameter.ParameterName = "@ScheduleUpdates";
parameter.SqlDbType = System.Data.SqlDbType.Structured;
parameter.Value = dataTable;
command.Parameters.Add(parameter);
conn.Open();
command.ExecuteNonQuery();
}
}
If the values are in another table, use a join:
UPDATE st
SET ScheduledStartUTC = ot.ScheduledStartUTC
FROM ScheduleTickets st JOIN
OtherTable ot
ON st.ScheduleId = ot.ScheduleId AND st.PatchSessionId = ot.PatchSessionId;
You don't specify the special logic but you can probably express it in SQL.
I have a SQL Server database which has a lot of information inside.
I want to select the top 50 rows in a single query (which I did, with no problem), but then I want to update a column from false to true so that next time I select, I won't get the same rows. My code looks like this:
string Command = "UPDATE HubCommands SET [Alreadytaken] = 'true' FROM (SELECT TOP 50 [CommandId],[DeviceId],[Commandtext], [HashCommand],[UserId] FROM HubCommands) I WHERE [HubId] = '18353fe9-82fd-4ac2-a078-51c199d9072b'";
using (SqlConnection myConnection = new SqlConnection(SqlConnection))
{
using (SqlDataAdapter myDataAdapter = new SqlDataAdapter(Command, myConnection))
{
DataTable dtResult = new DataTable();
myDataAdapter.Fill(dtResult);
foreach (DataRow row in dtResult.Rows)
{
Guid CommandId, DeviceId, UserId;
Guid.TryParse(row["CommandId"].ToString(), out CommandId);
Guid.TryParse(row["DeviceId"].ToString(), out DeviceId);
Guid.TryParse(row["UserId"].ToString(), out UserId);
Console.WriteLine("CommandId" + CommandId);
}
}
}
This code does work and updates what I ask it to update, but I get nothing back in the data table; it is always updating but never selecting.
If I do a normal SELECT, it does return data.
Does anyone have any idea how to update and get some data back in a single query?
So your question is:
How can I update a table in SQL Server using C# and return the truly updated rows as a DataTable?
First, you have multiple issues in your query.
You should use 1 and 0, not 'true' or 'false'. SQL Server has a bit datatype, not a Boolean.
Second, this is how you could construct your query:
DECLARE @IDs TABLE
(
[CommandId] uniqueidentifier
);
INSERT INTO @IDs
SELECT [CommandId] FROM HubCommands
WHERE [HubId] = '18353fe9-82fd-4ac2-a078-51c199d9072b' AND [Alreadytaken] = 0;
UPDATE HubCommands
SET [Alreadytaken] = 1
WHERE CommandId IN
(
SELECT [CommandId] FROM @IDs
);
SELECT * FROM HubCommands
WHERE CommandId IN
(
SELECT [CommandId] FROM @IDs
);
Wrap all the above in a single string and use a SqlDataReader. No need for an adapter in your case (since we're mixing commands, unlike what the adapter usually does):
var sqlCommand = new SqlCommand(Command, myConnection);
DataTable dtResult = new DataTable();
using (SqlDataReader dataReader = sqlCommand.ExecuteReader())
{
dtResult.Load(dataReader);
}
I highly advise you to create a stored procedure accepting HubId as a parameter that does all the above work. It is neater and better for maintenance.
I have a list called ListTypes that holds 10 types of products. The code below loops over it and, for each product type, executes a stored procedure and stores the returned records in the list ListIds. This is killing my SQL box, since I have over 200 users executing this constantly, all day.
I know it's not good architecture to loop a SQL statement, but this is the only way I got it to work. Any ideas how I can do this without looping? Maybe a LINQ statement? I've never used LINQ at this magnitude. Thank you.
protected void GetIds(string Type, string Sub)
{
LinkedIds.Clear();
using (SqlConnection cs = new SqlConnection(connstr))
{
for (int x = 0; x < ListTypes.Count; x++)
{
cs.Open();
SqlCommand select = new SqlCommand("spUI_LinkedIds", cs);
select.CommandType = System.Data.CommandType.StoredProcedure;
select.Parameters.AddWithValue("@Type", Type);
select.Parameters.AddWithValue("@Sub", Sub);
select.Parameters.AddWithValue("@TransId", ListTypes[x]);
SqlDataReader dr = select.ExecuteReader();
while (dr.Read())
{
ListIds.Add(Convert.ToInt32(dr["LinkedId"]));
}
cs.Close();
}
}
}
Not a full answer, but this wouldn't fit in a comment. You can at least update your existing code to be more efficient like this:
protected List<int> GetIds(string Type, string Sub, IEnumerable<int> types)
{
var result = new List<int>();
using (SqlConnection cs = new SqlConnection(connstr))
using (SqlCommand select = new SqlCommand("spUI_LinkedIds", cs))
{
select.CommandType = System.Data.CommandType.StoredProcedure;
//Don't use AddWithValue! Be explicit about your DB types
// I had to guess here. Replace with the actual types from your database
select.Parameters.Add("@Type", SqlDbType.VarChar, 10).Value = Type;
select.Parameters.Add("@Sub", SqlDbType.VarChar, 10).Value = Sub;
var TransID = select.Parameters.Add("@TransId", SqlDbType.Int);
cs.Open();
foreach(int type in types)
{
TransID.Value = type;
using (SqlDataReader dr = select.ExecuteReader())
{
while (dr.Read())
{
result.Add((int)dr["LinkedId"]);
}
}
}
}
return result;
}
Note that this way you only open and close the connection once. Normally in ADO.Net it's better to use a new connection and re-open it for each query. The exception is in a tight loop like this. Also, the only thing that changes inside the loop this way is the one parameter value. Finally, it's better to design methods that don't rely on other class state. This method no longer needs to know about the ListTypes and ListIds class variables, which makes it possible to (among other things) do better unit testing on the method.
Again, this isn't a full answer; it's just an incremental improvement. What you really need to do is write another stored procedure that accepts a table valued parameter, and build on the query from your existing stored procedure to JOIN with the table valued parameter, so that all of this will fit into a single SQL statement. But until you share your stored procedure code, this is about as much help as I can give you.
Besides the improvements others have suggested:
You could insert your ID's into a temp table and then make one
SELECT * from WhatEverTable WHERE transid in (select transid from #tempTable)
On MS SQL Server this is really fast.
If you're not using MS SQL Server, it's possible that one big SELECT with joins is faster than a SELECT ... IN. You have to test such cases yourself on your DBMS.
According to your comment:
The idea is lets say I have a table and I have to get all records from the table that has this 10 types of products. How can I get all of this products? But this number is dynamic.
So... why use a stored procedure at all? Why not query the table?
//If [Type] and [Sub] arguments are external inputs - as in, they come from a user request or something - they should be sanitized. (remove or escape '\' and apostrophe signs)
//create connection
string queryTmpl = "SELECT LinkedId FROM [yourTable] WHERE [TYPE] = '{0}' AND [SUB] = '{1}' AND [TRANSID] IN ({2})";
string query = string.Format(queryTmpl, Type, Sub, string.Join(", ", ListTypes));
SqlCommand select = new SqlCommand(query, cs);
//and so forth
To use Linq-to-SQL you would need to map the table to a class. This would make the query simpler to perform.
I have a SQL SELECT statement which will not be known until runtime, and which could contain JOINs and inner selects. I need to determine the names and data types of each of the columns of the returned result of the statement from within C#. I am inclined to do something like:
string orginalSelectStatement = "SELECT * FROM MyTable";
string selectStatement = string.Format("SELECT TOP 0 * FROM ({0}) s", orginalSelectStatement);
SqlConnection connection = new SqlConnection(@"MyConnectionString");
SqlDataAdapter adapter = new SqlDataAdapter(selectStatement, connection);
DataTable table = new DataTable();
adapter.Fill(table);
foreach (DataColumn column in table.Columns)
{
Console.WriteLine("Name: {0}; Type: {1}", column.ColumnName, column.DataType);
}
Is there a better way to do what I am trying to do? By "better" I mean either a less resource-intensive way of accomplishing the same task or a more sure way of accomplishing the same task (i.e. for all I know the code snippet I just gave will fail in some situations).
SOLUTION:
First of all, my TOP 0 hack is bad, namely for something like this:
SELECT TOP 0 * FROM (SELECT 0 AS A, 1 AS A) S
In other words, in a sub-select, if two things are aliased to the same name, that throws an error. So it is out of the picture. However, for completeness sake, I went ahead and tested it, along with the two proposed solutions: SET FMTONLY ON and GetSchemaTable.
Here are the results (in milliseconds for 1,000 queries, each):
Schema Time: 3130
TOP 0 Time: 2808
FMTONLY ON Time: 2937
My recommendation would be GetSchemaTable since it's more likely to be future-proofed by a removal of the SET FMTONLY ON as valid SQL and it solves the aliasing problem, even though it is slightly slower. However, if you "know" that duplicate column names will never be an issue, then TOP 0 is faster than GetSchemaTable and is more future-proofed than SET FMTONLY ON.
Here is my experimental code:
int schemaTime = 0;
int topTime = 0;
int fmtOnTime = 0;
SqlConnection connection = new SqlConnection(@"MyConnectionString");
connection.Open();
SqlCommand schemaCommand = new SqlCommand("SELECT * FROM MyTable", connection);
SqlCommand topCommand = new SqlCommand("SELECT TOP 0 * FROM (SELECT * FROM MyTable) S", connection);
SqlCommand fmtOnCommand = new SqlCommand("SET FMTONLY ON; SELECT * FROM MyTable", connection);
for (int i = 0; i < 1000; i++)
{
{
DateTime start = DateTime.Now;
using (SqlDataReader reader = schemaCommand.ExecuteReader(CommandBehavior.SchemaOnly))
{
DataTable table = reader.GetSchemaTable();
}
DateTime stop = DateTime.Now;
TimeSpan span = stop - start;
schemaTime += (int)span.TotalMilliseconds; // TotalMilliseconds, not the Milliseconds component
}
{
DateTime start = DateTime.Now;
DataTable table = new DataTable();
SqlDataAdapter adapter = new SqlDataAdapter(topCommand);
adapter.Fill(table);
DateTime stop = DateTime.Now;
TimeSpan span = stop - start;
topTime += (int)span.TotalMilliseconds;
}
{
DateTime start = DateTime.Now;
DataTable table = new DataTable();
SqlDataAdapter adapter = new SqlDataAdapter(fmtOnCommand);
adapter.Fill(table);
DateTime stop = DateTime.Now;
TimeSpan span = stop - start;
fmtOnTime += (int)span.TotalMilliseconds;
}
}
Console.WriteLine("Schema Time: " + schemaTime);
Console.WriteLine("TOP 0 Time: " + topTime);
Console.WriteLine("FMTONLY ON Time: " + fmtOnTime);
connection.Close();
You could use GetSchemaTable to do what you want.
There is an example of how to use it here.
If using SQL Server, I would try using SET FMTONLY ON
Returns only metadata to the client. Can be used to test the format of
the response without actually running the query.
Apparently on SQL Server 2012, there's a better way. All is specified in the linked MSDN article.
BTW, this technique is what LINQ To SQL uses internally to determine the result set returned by a stored procedure, etc.
Dynamic SQL is always a bit of a minefield, but you could use SET FMTONLY ON on your query; this means the query will only return metadata, the same as if no results were returned. So:
string selectStatement = string.Format("SET FMTONLY ON; {0}", orginalSelectStatement);
Alternatively, if you aren't tied to ADO.NET, you could go down the Linq-to-SQL route and generate a data context which maps all of your database schema into code with the relevant types. You could also have a look at some of the micro ORMs out there, such as Dapper.NET.
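For example, Dapper's untyped Query overload returns rows that can be cast to IDictionary&lt;string, object&gt;, which exposes the column names and the runtime types of non-null values (a sketch, assuming the Dapper NuGet package and the same hypothetical connection string and table as above):

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Linq;
using Dapper;

public static class DapperSchemaPeek
{
    public static void Main()
    {
        using (var connection = new SqlConnection(@"MyConnectionString"))
        {
            // Dapper opens the connection itself if it is still closed.
            var firstRow = connection.Query("SELECT TOP 1 * FROM MyTable").FirstOrDefault();
            if (firstRow != null)
            {
                // Each DapperRow is also an IDictionary<string, object>.
                foreach (var kvp in (IDictionary<string, object>)firstRow)
                    Console.WriteLine("Name: {0}; Type: {1}", kvp.Key, kvp.Value?.GetType());
            }
        }
    }
}
```

Note that a NULL value carries no type information this way, so GetSchemaTable remains the more robust option for full metadata.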
There are plenty of other ORMs out there too.