Batching Stored Procedure Commands in EF 4.2 - c#

I've got a call to a stored procedure that is basically an INSERT stored procedure. It inserts into Table A, then into Table B with the identity from Table A.
Now, I need to call this stored procedure N times from my application code.
Is there any way I can batch this? At the moment it's doing N round trips to the DB; I would like it to be one.
The only approach I can think of is to pass the entire list of items across the wire via a User-Defined Table Type.
But the problem with this approach is that I will need a CURSOR in the sproc to loop through each item in order to do the insert (because of the identity field).
Basically, can we batch DbCommand.ExecuteNonQuery() with EF 4.2?
Or can we do it with something like Dapper?

You can keep it like that and, in the stored procedure, just do a MERGE between your target table and the table parameter. Because you are always coming in with new records, the MERGE will only ever take the INSERT branch.
Used like this, MERGE is an easy way of doing batch inserts without a cursor.
Another way that avoids a cursor is to use an INSERT ... SELECT statement in the SP.
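As a rough sketch of that shape (all object names here are made up, not taken from the question): a user-defined table type carries the rows over in one call, and MERGE's OUTPUT clause, unlike a plain INSERT's, may reference source columns, which is what lets you map each generated identity back to its source row without a cursor:

```sql
-- Hypothetical names throughout: dbo.ItemList, dbo.TableA (identity PK Id),
-- dbo.TableB referencing TableA.
CREATE TYPE dbo.ItemList AS TABLE (SourceKey int, Name nvarchar(100));
GO
CREATE PROCEDURE dbo.InsertItems @Items dbo.ItemList READONLY
AS
BEGIN
    DECLARE @Map TABLE (SourceKey int, TableAId int);

    -- ON 1 = 0 never matches, so every source row takes the INSERT branch;
    -- OUTPUT here can pair src.SourceKey with the new identity, which a
    -- plain INSERT ... OUTPUT cannot do.
    MERGE dbo.TableA AS t
    USING @Items AS src ON 1 = 0
    WHEN NOT MATCHED THEN INSERT (Name) VALUES (src.Name)
    OUTPUT src.SourceKey, inserted.Id INTO @Map;

    INSERT INTO dbo.TableB (TableAId)
    SELECT TableAId FROM @Map;
END
```

From C#, the whole list then crosses the wire in a single ExecuteNonQuery by passing a DataTable as a Structured parameter typed as dbo.ItemList.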

Related

Inserting multiple rows in multiple tables in a single SQL query

I want to insert some rows in a table, then select specific data from the newly added record, and then insert that data into another table. The following statement is obviously not working, but it should give an idea of what I am trying to accomplish here.
cmd = new SqlCommand(@"INSERT INTO SalesDetails.sale(sale_date) VALUES (@sale_date);
SELECT sale_id FROM SalesDetails.sale WHERE sale_date=@sale_date;
SELECT stock_id FROM StockDetails.stock_item WHERE item_id=@item_id;
INSERT INTO SalesDetails.payment(payment_method,sale_id)
VALUES (@payment_method, sale_id);
INSERT INTO SalesDetails.dispatch_out_item(stock_id,sale_id,qty)
VALUES (stock_id,sale_id,@qty);", con);
Rather than writing everything into one single SQL command, I would suggest you write a stored procedure for this.
The reason for using a stored procedure is that you can handle the multiple-table work more cleanly, and you can implement transaction logic to ensure data consistency across all the tables where changes take place. Since you are manipulating multiple tables here, you need to make sure that the change is preserved in all of the tables or in none of them.
Check this link as a reference : Using Stored Procedures with Transactions
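For illustration, the transactional stored procedure being suggested might look like this (the procedure name is made up; the tables and columns are the ones from the question):

```sql
CREATE PROCEDURE SalesDetails.usp_RecordSale
    @sale_date datetime, @item_id int, @payment_method varchar(20), @qty int
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        BEGIN TRANSACTION;

        INSERT INTO SalesDetails.sale (sale_date) VALUES (@sale_date);
        DECLARE @sale_id int = SCOPE_IDENTITY();   -- identity of the new sale

        DECLARE @stock_id int =
            (SELECT stock_id FROM StockDetails.stock_item WHERE item_id = @item_id);

        INSERT INTO SalesDetails.payment (payment_method, sale_id)
        VALUES (@payment_method, @sale_id);

        INSERT INTO SalesDetails.dispatch_out_item (stock_id, sale_id, qty)
        VALUES (@stock_id, @sale_id, @qty);

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
        THROW;  -- re-raise (SQL Server 2012+); use RAISERROR on older versions
    END CATCH
END
```

If any of the three inserts fails, the CATCH block rolls everything back, so the tables stay consistent.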
Hope this helps.

In MySQL, how to pass a set of objects to a stored procedure?

In my C# code, I will be populating a Dictionary.
I need to get that data into a MySQL table in the most efficient way possible.
Is it possible to pass that to a MySQL stored procedure? I guess I could pass it in some sort of string with commas, etc, so that the stored procedure could then call a function to parse the string and populate the table, but that's a pain.
Any other ideas?
Thanks!
Based on the comments so far, let me try to show the code/pseudocode I'm working on.
The code that builds the dictionary will look something like this:
private void DistributeCallsToReschedule()
{
    CallTimeSpacing = GetNumMinutesInNextCallWindow() / callsToReschedule.Count;
    DateTime currTimeToCall = new DateTime();
    foreach (int id in callsToReschedule)
    {
        CallIdTimeToCallMap.Add(id, currTimeToCall);
        // DateTime is immutable: AddMinutes returns a new value,
        // so the result must be assigned back.
        currTimeToCall = currTimeToCall.AddMinutes(CallTimeSpacing);
    }
}
So, the dictionary can contain my entries.
What I HOPE I can do is to pass the dictionary to a stored procedure as shown below.
If this isn't possible, what's the most efficient way to do what the stored procedure indicates? That is,
the stored procedure wants a table to JOIN to that holds the data from the dictionary populated in the C# code. In other words, what's the most efficient way to get the dictionary's data into a table in MySQL? If this isn't possible and I have to loop, what's the most efficient way to do that: iteratively call a stored procedure, or build a prepared statement that has all the values (built via StringBuilder, I suppose)?
PARAMETERS PASSED TO STORED PROCEDURE BY C# CODE:
@CallIdTimeToCallMap
Put @CallIdTimeToCallMap into CallIdTimeToCallMapTable;
update cr
set cr.TimeToCall = map.TimeToCall
from callRequest cr
inner join CallIdTimeToCallMapTable map on
cr.id = map.id
You have to map objects to tables and columns before any relational database can do anything with them. Objects are not relations.
You don't say what the parameters are that the stored procedure is expecting.
If it's an INSERT or UPDATE that expects a large set of objects, I'd wonder if a stored procedure is the right answer. You'd have to call it repeatedly, once to write the row for each object in the set. I'd consider a prepared statement, binding variables, and batching so you can do it in one round trip.
Is the set a single unit of work? Have you thought about transactional behavior?
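The prepared-statement-and-batching suggestion could look roughly like this in C# (a sketch only: it assumes the MySQL Connector/NET provider and a hypothetical call_request table with an id primary key and a time_to_call column):

```csharp
using System;
using System.Collections.Generic;
using MySql.Data.MySqlClient;

static void UpdateTimesToCall(MySqlConnection con, Dictionary<int, DateTime> map)
{
    var values = new List<string>();
    var cmd = new MySqlCommand { Connection = con };
    int i = 0;
    foreach (var pair in map)
    {
        // One parameter pair per dictionary entry, all bound into a
        // single multi-row INSERT.
        values.Add(string.Format("(@id{0}, @t{0})", i));
        cmd.Parameters.AddWithValue(string.Format("@id{0}", i), pair.Key);
        cmd.Parameters.AddWithValue(string.Format("@t{0}", i), pair.Value);
        i++;
    }
    cmd.CommandText =
        "INSERT INTO call_request (id, time_to_call) VALUES " +
        string.Join(",", values) +
        " ON DUPLICATE KEY UPDATE time_to_call = VALUES(time_to_call);";
    cmd.ExecuteNonQuery();   // one round trip for the whole dictionary
}
```

Because the ids already exist, ON DUPLICATE KEY UPDATE turns each row into the UPDATE the pseudocode's JOIN was after, still in a single round trip.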

How to know how many persistent objects were deleted using Session.Delete(query);

We are refactoring a project from plain MySQL queries to the usage of NHibernate.
In the MySQL connector there is the ExecuteNonQuery function that returns the rows affected. So
int RowsDeleted = ExecuteNonQuery("DELETE FROM `table` WHERE ...");
would show me how many rows were effectively deleted.
How can I achieve the same with NHibernate? So far I can see it is not possible with Session.Delete(query);.
My current workaround is first loading all of the objects that are about to be deleted and deleting them one by one, incrementing a counter on each delete. But I assume that will cost performance.
If you don't mind that nHibernate will create delete statements for each row, and maybe additional statements for orphans and/or other relationships, you can use session.Delete.
For better performance I would recommend to do batch deletes (see example below).
session.Delete
If you delete many objects with session.Delete, nHibernate makes sure that integrity is preserved; it will load everything into the session if needed anyway. So there is no real reason to count your objects or to have a method that retrieves the number of objects deleted, because you could simply run a query before the delete to determine the number of objects that will be affected...
The following statement will delete all entities of type Post by Id.
The SELECT will query the database only for the Ids, so it is actually very performant...
var idList = session.Query<Post>().Select(p => p.Id).ToList<int>();
session.Delete(string.Format("from Post where Id in ({0})", string.Join(",", idList.ToArray())));
The number of objects deleted will be equal to the number of Ids in the list...
This is actually the same (in terms of the queries nHibernate will fire against your database) as if you were to query<T>, loop over the result, and delete the objects one by one...
Batch delete
You can use session.CreateSqlQuery to run native SQL commands. It also allows you to have input and output parameters.
The following statement would simply delete everything from the table as you would expect
session.CreateSQLQuery(@"Delete from MyTableName");
To retrieve the number of rows deleted, we'll use the normal T-SQL @@ROWCOUNT variable and output it via SELECT. To read the selected row count on the client, we add an output scalar to the created query via AddScalar, and UniqueResult simply returns the integer:
var rowsAffected = session.CreateSQLQuery(@"
Delete from MyTableName;
Select @@ROWCOUNT as NumberOfRows")
.AddScalar("NumberOfRows", NHibernateUtil.Int32)
.UniqueResult();
To pass input variables, use .SetParameter(<name>, <value>):
var rowsAffected = session.CreateSQLQuery(@"
DELETE from MyTableName where ColumnName = :val;
select @@ROWCOUNT as NumberOfRows;")
.AddScalar("NumberOfRows", NHibernateUtil.Int32)
.SetParameter("val", 1)
.UniqueResult();
I'm not so comfortable with MySQL; the example I wrote is for MSSQL. I think the MySQL equivalent of @@ROWCOUNT would be SELECT ROW_COUNT()?
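For reference, the MySQL counterpart of the parameterised example above would look roughly like this (same hypothetical table name; ROW_COUNT() returns the rows affected by the preceding statement):

```sql
DELETE FROM MyTableName WHERE ColumnName = 1;
SELECT ROW_COUNT() AS NumberOfRows;
```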

asp.net stored procedure insert multiple pairs of data into the DB

I have a method that accepts an int[] (userIDs) and an int (groupID) as parameters. It then runs the stored procedure and inserts the data into the DB.
For example:
if userIDs=[1,2,3] and groupID=4,
then I want the following data to be inserted into the DB
userID | groupID
-------|--------
1      | 4
2      | 4
3      | 4
I have 2 solutions to this problem. The first one is to write a stored procedure that inserts a single record into the DB. In the method(), I will loop through the int[] and call the stored procedure n times:
method()
{
    for (int i = 0; i < userIDs.Length; i++)
    {
        // call stored procedure to insert a single record
    }
}
The second solution is to pass the int[] and the int as parameters to the stored procedure and do the looping in the stored procedure.
Which is the better solution? (If the 2nd solution is better, can someone provide guidance on handling an int[] in a stored procedure?)
Is there a good reason why you don't want to use an O/R mapper?
Your example looks like server / code-behind and you can use Entity Framework (or other) to insert your new values.
If you can't use them, then I would use the second approach from your posting. But it's dangerous, because you are not within any transaction.
You can start your Entity Framework investigation here : http://www.asp.net/entity-framework.
If you, for any reason, are not able to use EF, consider using a transaction scope for your SQL commands (see http://msdn.microsoft.com/en-us/library/777e5ebh.aspx to start reading).
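A minimal sketch of that TransactionScope suggestion, assuming the per-row stored procedure from the question's first solution (the procedure and parameter names here are made up): either every call commits, or, if any call throws, all of them roll back together.

```csharp
using System.Data;
using System.Data.SqlClient;
using System.Transactions;

static void InsertAll(string connectionString, int[] userIds, int groupId)
{
    using (var scope = new TransactionScope())
    using (var con = new SqlConnection(connectionString))
    {
        con.Open();   // the connection enlists in the ambient transaction
        foreach (int userId in userIds)
        {
            using (var cmd = new SqlCommand("dbo.InsertUserGroup", con))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("@userID", userId);
                cmd.Parameters.AddWithValue("@groupID", groupId);
                cmd.ExecuteNonQuery();
            }
        }
        scope.Complete();   // commit only if every insert succeeded
    }
}
```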
Geeeeeeeeeeeenerally speaking, code is faster for this stuff. You're better off iterating through your array, C# side, then calling a stored procedure to handle each specific record.
Now of course specifics will change, and this certainly sounds best handled in one big shot. Convert your int array to a datatable and then you can have some real fun...
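The "one big shot" via a DataTable could be sketched like this (it assumes a SQL Server user-defined table type dbo.IntList with a single Id column and a stored procedure dbo.AddUsersToGroup; both names are hypothetical):

```csharp
using System.Data;
using System.Data.SqlClient;

static void AddUsersToGroup(SqlConnection con, int[] userIds, int groupId)
{
    // Shape the int[] into a DataTable matching the table type's columns.
    var table = new DataTable();
    table.Columns.Add("Id", typeof(int));
    foreach (int id in userIds)
        table.Rows.Add(id);

    using (var cmd = new SqlCommand("dbo.AddUsersToGroup", con))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        var p = cmd.Parameters.AddWithValue("@UserIds", table);
        p.SqlDbType = SqlDbType.Structured;
        p.TypeName = "dbo.IntList";        // the table type defined on the server
        cmd.Parameters.AddWithValue("@GroupId", groupId);
        cmd.ExecuteNonQuery();             // one round trip for all rows
    }
}
```

Inside the procedure no loop is needed; a single set-based statement such as `INSERT INTO UserGroup (userID, groupID) SELECT Id, @GroupId FROM @UserIds;` covers every row.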
If all you're doing is adding rows to a database, I don't see the need for a Stored Procedure (unless that's a requirement coming from a DBA or policy).
Just loop through your items and add the entries to the database using ADO.NET or Entity Framework.

SQL Server : delay before output (a print statement) is produced

I may have a slight feeling as to what is going on, but I thought I'd ask to get confirmation, and potentially look at an alternative.
As a slight background, I've written a C# application that is providing a small front end to a stored procedure. The procedure contains a number of temporary table inserts from other stored procedures (and one table valued udf), and some xml processing.
In order to get a sense of how far along I am in the stored procedure, I'm subscribing to the InfoMessage event (using an SqlInfoMessageEventHandler) on an SqlConnection. I've put some informative print statements in various places throughout the SP, so I can get a sense of the processing which has completed and update a status bar accordingly.
The rough structure of the SP is along these lines:
Print 'Beginning processing'
Create Temp Table
Insert into Temp Table from Table Valued UDF
Print 'Creating working Tables'
Create more Temp Tables
Insert into Temp Tables from SPs (each SP contains a print statement e.g 'Starting SP1').
All the messages are received and processed successfully, but there is a delay of a few seconds before any messages are returned from the server, then the first few arrive all at once (as though they were all processed but the output was held back for a while).
I had naively assumed (I still have a lot to learn about DB mechanics) that my initial print statement would be returned before any of the other processing instructions within the SP took place.
I assume that the server is doing something regarding fetching execution plans and/or recalculating plans, or is it the case that the query optimiser does some preprocessing before any results at all are returned?
Hopefully my question is reasonably understandable from that mess of text. Essentially, would the query optimiser make the server perform some of the selects/inserts before the procedure actually begins sequentially encountering my print statements?
I also tried doing some small temp table manipulation before my initial print statement, so some rows would be returned before the slow operations began but the result was much the same.
Thanks for any responses.
As far as I can judge from SSMS, PRINT statements arrive late because their output is only flushed once the buffer is full. If you would like a message output straight away, use:
RAISERROR('Message', 0, 0) WITH NOWAIT
It works like PRINT but is output immediately.
Hope this helps.
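Applied to the procedure outline from the question, the progress messages would become something like this (a sketch; severity 0 keeps them informational, and WITH NOWAIT flushes each one to the client immediately):

```sql
RAISERROR('Beginning processing', 0, 1) WITH NOWAIT;
-- ... create temp table, insert from the table-valued UDF ...
RAISERROR('Creating working tables', 0, 1) WITH NOWAIT;
-- ... create more temp tables, insert from the sub-procedures ...
```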
I would prefer to use a SELECT statement instead of PRINT -
SELECT 'Beginning processing'
Create Temp Table
Insert into Temp Table from Table Valued UDF
SELECT 'Creating working Tables'
Create more Temp Tables
Insert into Temp Tables from SPs (each SP contains a print statement e.g 'Starting SP1').