Dapper to insert multiple rows into two tables using stored procedure - c#

I am using Dapper (https://github.com/StackExchange/Dapper) in an ASP.NET Core Web API project. I have a requirement to create a master record and insert a set of rows into a child table.
For example:
OrdersMaster table - create an order id in this table
OrderDetails table - ordermasterid, order items
How can I do this using Dapper? Please share some code snippets if possible.

I would implement the insert operations as a transactional stored procedure, and then call that from your .NET application.
You may need a table-valued type to pass in a list of data, like this:
CREATE TYPE List_Of_Items AS TABLE (
    ItemID INT NOT NULL,
    Quantity INT NOT NULL
)
The procedure might look like this:
CREATE PROC Insert_Order_With_Details (
    @CustomerId INT,
    @Items List_Of_Items READONLY
) AS
BEGIN
    BEGIN TRANSACTION
    INSERT INTO OrdersMaster (CustomerId) VALUES (@CustomerId)
    DECLARE @OrderID INT
    SET @OrderID = SCOPE_IDENTITY() --last assigned id in this scope
    INSERT INTO OrderDetails (OrderId, CustomerId, ItemId, Quantity)
    SELECT @OrderID, @CustomerId, ItemID, Quantity
    FROM @Items
    COMMIT
END
Then in C#, I would suggest creating helper methods for building your TVP. It is not as simple as you might like: you will need using Microsoft.SqlServer.Server and using Dapper.Tvp.
//This is a shell to create any kind of TVP
private static void AddTableCore<T>(
    this DynamicParametersTvp dp,
    string tvpTypeName,
    Func<T, SqlDataRecord> valueProjection,
    IEnumerable<T> values,
    string parameterTableName)
{
    var tvp = values
        .Select(valueProjection)
        .ToList();
    //If you pass a TVP with 0 rows to SQL Server it will error; you must pass null instead.
    if (!tvp.Any()) tvp = null;
    dp.Add(new TableValueParameter(parameterTableName, tvpTypeName, tvp));
}
//This will create your specific Items TVP
public static void AddItemsTable(this DynamicParametersTvp dp, IEnumerable<Item> items, string parameterTableName = "Items")
{
    var columns = new[]
    {
        new SqlMetaData("ItemID", SqlDbType.Int),
        new SqlMetaData("Quantity", SqlDbType.Int)
    };
    var projection = new Func<Item, SqlDataRecord>(item =>
    {
        var record = new SqlDataRecord(columns);
        record.SetInt32(0, item.Id);
        record.SetInt32(1, item.Quantity);
        return record;
    });
    //The type name here must match the SQL type created above
    AddTableCore(dp, "List_Of_Items", projection, items, parameterTableName);
}
and then where you need to query you might do:
using (var cn = new SqlConnection(myConnectionString))
{
    var p = new DynamicParametersTvp(new {
        CustomerId = myCustomerId
    });
    p.AddItemsTable(items);
    cn.Execute("Insert_Order_With_Details", p, commandType: CommandType.StoredProcedure);
}
The commandType argument is super important. It defaults to plain SQL text and will error if you send the name of a proc.
If you want to put in multiple orders at once, you'll need to use table-valued parameters and the Dapper.Tvp package.
See this SO question, Using Dapper.TVP TableValueParameter with other parameters, as well as this documentation on TVPs from Microsoft: https://learn.microsoft.com/en-us/sql/relational-databases/tables/use-table-valued-parameters-database-engine. Note that not all SQL vendors support TVPs.
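If you would rather avoid the Dapper.Tvp dependency, plain Dapper can also send a TVP built from a DataTable via its AsTableValuedParameter extension. Below is a minimal sketch against the List_Of_Items type and procedure above (it assumes your Item class exposes Id and Quantity, and that items, myCustomerId and myConnectionString are in scope):
var itemsTable = new DataTable();
itemsTable.Columns.Add("ItemID", typeof(int));
itemsTable.Columns.Add("Quantity", typeof(int));
foreach (var item in items)
{
    // Column order must match the List_Of_Items type definition
    itemsTable.Rows.Add(item.Id, item.Quantity);
}
var p = new DynamicParameters(new { CustomerId = myCustomerId });
p.Add("Items", itemsTable.AsTableValuedParameter("List_Of_Items"));
using (var cn = new SqlConnection(myConnectionString))
{
    cn.Execute("Insert_Order_With_Details", p, commandType: CommandType.StoredProcedure);
}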

Using a stored procedure as mentioned in the other answer is a good solution. My answer implements the same thing without a stored procedure.
You have to use a transaction; this way, either all the changes are committed or all are rolled back. The code below assumes you are using an identity column as the primary key. Please refer to this question for a discussion of @@IDENTITY, which I have used in the code below.
Though the code is not complete, I have put in detailed comments to explain the steps.
using (var connection = new SqlCeConnection("connection_string"))
{
    connection.Open();
    //Begin the transaction
    using (var transaction = connection.BeginTransaction())
    {
        //Create and fill-up master table data
        var paramMaster = new DynamicParameters();
        paramMaster.Add("@XXX", ...);
        ....
        //Insert record in master table. Pass transaction parameter to Dapper.
        var affectedRows = connection.Execute("insert into OrdersMaster....", paramMaster, transaction: transaction);
        //Get the Id newly created for master table record.
        //If this is not an Identity, use different method here
        newId = Convert.ToInt64(connection.ExecuteScalar<object>("SELECT @@IDENTITY", null, transaction: transaction));
        //Create and fill-up detail table data
        //Use suitable loop as you want to insert multiple records.
        //for(......)
        foreach (OrderItem item in orderItems)
        {
            var paramDetails = new DynamicParameters();
            paramDetails.Add("@OrderMasterId", newId);
            paramDetails.Add("@YYY", ...);
            ....
            //Insert record in detail table. Pass transaction parameter to Dapper.
            var affectedRows = connection.Execute("insert into OrderDetails....", paramDetails, transaction: transaction);
        }
        //Commit transaction
        transaction.Commit();
    }
}
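For a more concrete picture of the same pattern, here is a minimal sketch against regular SQL Server; the column names (CustomerId, OrderDate, OrderMasterId, ProductId, Quantity) and the locals customerId, orderDate and orderItems are assumptions for illustration only. It folds the identity lookup into the insert using SCOPE_IDENTITY(), which, unlike @@IDENTITY, is not affected by triggers:
using (var connection = new SqlConnection("connection_string"))
{
    connection.Open();
    using (var transaction = connection.BeginTransaction())
    {
        // Insert the master row and read back its identity in one round trip
        var newId = connection.ExecuteScalar<long>(
            "INSERT INTO OrdersMaster (CustomerId, OrderDate) VALUES (@CustomerId, @OrderDate); " +
            "SELECT CAST(SCOPE_IDENTITY() AS bigint);",
            new { CustomerId = customerId, OrderDate = orderDate },
            transaction: transaction);

        // Insert each detail row with the new foreign key
        foreach (OrderItem item in orderItems)
        {
            connection.Execute(
                "INSERT INTO OrderDetails (OrderMasterId, ProductId, Quantity) VALUES (@OrderMasterId, @ProductId, @Quantity)",
                new { OrderMasterId = newId, ProductId = item.ProductId, Quantity = item.Quantity },
                transaction: transaction);
        }

        transaction.Commit();
    }
}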

Related

insert and select all data entity framework and SP

I have a stored procedure that inserts data into a table. The table has pk_id, doc_id and other fields. After the insert I want to get all records with the same doc_id.
ALTER PROCEDURE [dbo].[insertTable]
    @doc_id int
    ,......
AS
BEGIN
    -- Insert
    INSERT INTO [dbo].[Table]
    (
        doc_id
        ,..........
    )
    VALUES
    (
        @doc_id
        ,.......
    )
    -- Select all records based on doc_id
    select * from [dbo].[Table] where doc_id = @doc_id
END
After creating the SP I updated my EF model with it. In my context.cs file I found that the return type is int, which would make sense if my SP only did the insert.
I followed this article to change the return type using complex types and function imports, but when I try to "Get column information" I get "The selected SP or function returns no columns".
I want something like this, but instead of returning two lists I want the 1st to be an int and the 2nd a list.
What changes do I need to make in my code?
Here is how I am calling my SP in my controller:
List<Table> List = new List<Table>();
using (Entities entities = new Entities())
    List = (entities.insertTable(1, ......)).ToList();
var result = new { List = List != null ? List : new List<Table>() };
return Json(result, JsonRequestBehavior.AllowGet);
but this does not work. I get an error on the line List = (entities.insertTable(1, ......)).ToList(); which I think is obvious since I get two result sets.
Any help will be greatly appreciated.
You can call Database.SqlQuery on your dbcontext:
context.Database.SqlQuery<DtoType>("insertTable @param1",
    new SqlParameter("param1", 1));
You just need to provide a DTO type for the result set; in this case, your Table entity.
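To make that concrete, a small sketch using the Table entity from the question as the result type (the proc inserts and then selects by doc_id, so the SELECT's columns must map onto the type you pass to SqlQuery):
using (Entities entities = new Entities())
{
    var rows = entities.Database
        .SqlQuery<Table>("insertTable @doc_id", new SqlParameter("doc_id", 1))
        .ToList(); // enumerating executes the proc: the insert runs, then the select is materialized
    var result = new { List = rows };
    return Json(result, JsonRequestBehavior.AllowGet);
}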

Pass table value type to SQL Server stored procedure via Entity Framework

I created a user-defined table type in SQL Server:
CREATE TYPE dbo.TestType AS TABLE
(
    ColumnA int,
    ColumnB nvarchar(500)
)
And I'm using a stored procedure to insert records into the database:
create procedure [dbo].[sp_Test_CustomType]
    @testing TestType READONLY
as
    insert into [dbo].[myTable]
    select ColumnA, ColumnB
    from @testing
And I would like to use EF to execute this stored procedure, but here's the problem: how can I pass a user-defined table to the stored procedure?
I tried adding the stored procedure to the model, but I'm unable to find the desired stored procedure in the updated context.
What I'm trying to do is execute a bulk insert into a table; here's the method that I'm currently using:
List<items> itemToInsertToDB = //fetchItems;
foreach (items i in itemToInsertToDB)
{
    context.sp_InsertToTable(i.ColumnA, i.ColumnB);
}
Currently I use a foreach loop to insert each item into the DB, but if the list has a lot of items there will be a performance issue, so I'm thinking of passing the whole list to the stored procedure and doing the insert there.
So how can I solve this problem? Or are there better ways to do this?
Let's say you want to send a table with a single column of GUIDs.
First we need to create a structure using SqlMetaData which represents the schema of the table (its columns).
The code below declares one column named "Id" of type uniqueidentifier (GUID), matching the SQL table type used as the stored procedure parameter.
var tableSchema = new List<SqlMetaData>(1)
{
    new SqlMetaData("Id", SqlDbType.UniqueIdentifier)
}.ToArray();
Next you create a list of records that match the schema, using SqlDataRecord.
The code below shows how to add items to such a list using the schema created above. Create a new SqlDataRecord for each item, replacing SetGuid with the setter for the corresponding type and Guid.NewGuid() with the corresponding value.
Repeat new SqlDataRecord for each item and add them all to a List.
var tableRow = new SqlDataRecord(tableSchema);
tableRow.SetGuid(0, Guid.NewGuid());
var table = new List<SqlDataRecord>(1)
{
    tableRow
};
Then create the SqlParameter:
var parameter = new SqlParameter();
parameter.SqlDbType = SqlDbType.Structured;
parameter.ParameterName = "@UserIds"; // @UserIds is the stored procedure parameter name
parameter.TypeName = "{Your user-defined table type name}";
parameter.Value = table;
var parameters = new SqlParameter[1]
{
    parameter
};
Then simply call the stored procedure using Database.SqlQuery.
IEnumerable<User> result;
using (var myContext = new DbContext())
{
    result = myContext.Database.SqlQuery<User>("GetUsers @UserIds", parameters)
        .ToList(); // calls the stored procedure
        // ToListAsync(); // Async
}
In SQL Server, create your User-Defined Table Type (I suffix them with TTV, Table Typed Value):
CREATE TYPE [dbo].[UniqueidentifiersTTV] AS TABLE(
    [Id] [uniqueidentifier] NOT NULL
)
GO
Then specify the type as a parameter (don't forget, table-valued parameters have to be READONLY!):
CREATE PROCEDURE [dbo].[GetUsers] (
    @UserIds [UniqueidentifiersTTV] READONLY
) AS
BEGIN
    SET NOCOUNT ON
    SELECT u.* -- Just an example :P
    FROM [dbo].[Users] u
    INNER JOIN @UserIds ids ON u.Id = ids.Id
END
I suggest not using a stored procedure to insert bulk data, and instead relying on the Entity Framework insert mechanism.
List<items> itemToInsertToDB = //fetchItems;
foreach (items i in itemToInsertToDB)
{
    TestType t = new TestType() { ColumnA = i.ColumnA, ColumnB = i.ColumnB };
    context.TestTypes.Add(t);
}
context.SaveChanges();
Entity Framework will perform those insertions within a single transaction and, depending on the EF version, can batch them into fewer round trips, which comes close to executing a stored procedure. This is better than relying on a stored procedure just to insert a bulk of data.
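On newer versions of EF (EF6 or EF Core), DbSet.AddRange tidies this up a little; a small sketch of the same idea, assuming the same entity and context as above:
var rows = itemToInsertToDB
    .Select(i => new TestType { ColumnA = i.ColumnA, ColumnB = i.ColumnB })
    .ToList();
context.TestTypes.AddRange(rows); // the change tracker picks up the whole batch at once
context.SaveChanges();            // one transaction; EF Core additionally batches the INSERT statements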

SqlBulkCopy Ignore Duplicate Records of Datatable From DataBase

Can I make SqlBulkCopy ignore records in the DataTable I am passing that already exist in the SQL database? If yes, how? If no, please explain the other options.
No, that's not built-in. You need to clean the data on the client or insert into a staging table first.
As the previous poster said, this is not built in. I achieve something similar using the following:
A SQL stored procedure that accepts a table-valued parameter with the data you require.
In the stored proc, I INSERT all the records into a temp table. Once the data is there, you can use SQL's MERGE statement in the stored proc to insert only the rows that don't already exist.
So, let us assume that our data is simply people's names: we hold only an ID and a name, in a table called 'people'.
Here's how I create my Table Valued Parameter type (created in SQL Server):
CREATE TYPE udt_person AS TABLE(
    [id] [int] NOT NULL,
    [name] [nvarchar](50) NULL
)
GO
I now create the stored procedure:
CREATE PROCEDURE SaveNewPeople
    @pPeople udt_person READONLY
AS
BEGIN
    -- Create temp table
    CREATE TABLE #tmpPeople (id INT, name VARCHAR(50))
    -- We will stage all data passed in into the temp table
    INSERT INTO #tmpPeople
    SELECT id, name FROM @pPeople
    -- NB: you will need to think about locking strategy a bit here
    MERGE people AS p
    USING #tmpPeople AS t
    ON p.id = t.id
    WHEN NOT MATCHED BY TARGET THEN
        -- We want to insert the new person
        INSERT (id, name) VALUES (t.id, t.name)
    WHEN MATCHED THEN
        -- you may not need this; assume updating the name, for example
        UPDATE SET p.name = t.name;
END
Now we have the SQL in place.
Let us create the bulk of data in C#:
DataTable ppl = new DataTable();
ppl.Columns.Add("id", typeof(int));
ppl.Columns.Add("name", typeof(string));
// table is created, let's add some people
var bob = ppl.NewRow();
bob["id"] = 1;
bob["name"] = "Bob";
ppl.Rows.Add(bob);
var jim = ppl.NewRow();
jim["id"] = 2;
jim["name"] = "Jim";
ppl.Rows.Add(jim);
// that's enough people for now, let's call the stored procedure
using (var conn = new SqlConnection("YourConnStringHere"))
{
    using (var cmd = new SqlCommand("SaveNewPeople", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        var tvp = new SqlParameter
        {
            ParameterName = "@pPeople",
            SqlDbType = SqlDbType.Structured,
            Value = ppl,
            TypeName = "udt_person"
        };
        cmd.Parameters.Add(tvp);
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}
Hopefully this gives you the idea. If you then modify the C# DataTable, you should see rows inserted, updated or ignored accordingly.
Good luck.
Another way to do it is to create a database trigger to replace the inserts initiated by SqlBulkCopy. The performance will be impeded, depending on, among other things, the size of the batch, but it works nonetheless.
CREATE TABLE [dbo].[TempTable] (
    [Id] INT IDENTITY PRIMARY KEY,
    [Val] NVARCHAR(20)
)
GO
CREATE OR ALTER TRIGGER [IgnoreDuplicates] ON [dbo].[TempTable]
INSTEAD OF INSERT
AS
BEGIN
    SET NOCOUNT ON
    INSERT INTO [dbo].[TempTable]([Val])
    SELECT [Val] FROM [INSERTED] WHERE [Val] NOT IN (
        SELECT [Val] FROM [dbo].[TempTable]
    )
END
GO
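One caveat with the trigger approach: SqlBulkCopy does not fire triggers by default, so you have to opt in with SqlBulkCopyOptions.FireTriggers. A minimal client-side sketch (connectionString and dataTable are placeholders for your own connection string and source DataTable):
using (var bulkCopy = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.FireTriggers))
{
    bulkCopy.DestinationTableName = "dbo.TempTable";
    bulkCopy.ColumnMappings.Add("Val", "Val"); // source column -> destination column
    bulkCopy.WriteToServer(dataTable);         // the INSTEAD OF trigger filters out duplicates
}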

Dapper: Help me run stored procedure with multiple user defined table types

I have a stored procedure with 3 input parameters.
... PROCEDURE [dbo].[gama_SearchLibraryDocuments]
    @Keyword nvarchar(160),
    @CategoryIds [dbo].[IntList] READONLY,
    @MarketIds [dbo].[IntList] READONLY ...
where IntList is a user-defined table type:
CREATE TYPE [dbo].[IntList]
AS TABLE ([Item] int NULL);
My goal is to call this stored procedure with dapper.
I have found some examples of passing user-defined table types with Dapper.
One of them is the TableValuedParameter class implemented in the Dapper.Microsoft.Sql NuGet package.
var list = conn.Query<int>("someSP", new
{
    Keyword = (string)null,
    CategoryIds = new TableValuedParameter<int>("@CategoryIds", "IntList", new List<int> { }),
    MarketIds = new TableValuedParameter<int>("@MarketIds", "IntList", new List<int> { 541 })
}, commandType: CommandType.StoredProcedure).ToList();
The code above throws
An exception of type 'System.NotSupportedException' occurred in Dapper.dll but was not handled in user code
Additional information: The member CategoryIds of type Dapper.Microsoft.Sql.TableValuedParameter`1[System.Int32] cannot be used as a parameter value
I have tested my stored procedure with one user defined table type and it worked fine.
conn.Query<int>("someSP", new TableValuedParameter<int>("#MarketIds", "IntList", new List<int> { 541 }), commandType: CommandType.StoredProcedure).ToList();
I need help with running original stored procedure.
Thank you.
Dapper has an extension method for working with table-valued parameters:
public static SqlMapper.ICustomQueryParameter AsTableValuedParameter(this DataTable table, string typeName = null)
You can use Dapper in the following way:
var providersTable = new DataTable();
providersTable.Columns.Add("value", typeof(Int32));
foreach (var value in filterModel.Providers)
{
    providersTable.Rows.Add(value);
}
var providers = providersTable.AsTableValuedParameter("[dbo].[tblv_int_value]");
var filters = new
{
    campaignId = filterModel.CampaignId,
    search = filterModel.Search,
    providers = providers,
    pageSize = requestContext.PageSize,
    skip = requestContext.Skip
};
using (var query = currentConnection.QueryMultiple(StoredProcedureTest, filters, nhTransaction, commandType: CommandType.StoredProcedure))
{
    var countRows = query.Read<int>().FirstOrDefault();
    var temp = query.Read<CategoryModel>().ToList();
    return new Result<IEnumerable<CategoryModel>>(temp, countRows);
}
It will be translated into SQL:
declare @p3 dbo.tblv_int_value
insert into @p3 values(5)
insert into @p3 values(34)
insert into @p3 values(73)
insert into @p3 values(14)
exec [dbo].[StoredProcedureTest] @campaignId=123969,@search=NULL,@providers=@p3,@pageSize=20,@skip=0
I answered a similar post, Call stored procedure from dapper which accept list of user defined table type, with the following, and I think it may help you as well!
I know this is a little old, but I thought I would post on this anyway since I sought out to make this a little easier. I hope I have done so with a NuGet package I created that allows for code like:
public class IntList
{
    public int Item { get; set; }
}
var list1 = new List<IntList> { new IntList { Item = 1 }, new IntList { Item = 2 } };
var list2 = new List<IntList> { new IntList { Item = 3 }, new IntList { Item = 4 } };
var parameters = new DynamicParameters();
parameters.Add("@Keyword", "stringValue");
parameters.AddTable("@CategoryIds", "IntList", list1);
parameters.AddTable("@MarketIds", "IntList", list2);
var result = con.Query<int>("someSP", parameters, commandType: CommandType.StoredProcedure);
NuGet package: https://www.nuget.org/packages/Dapper.ParameterExtensions/0.2.0
Still in its early stages so may not work with everything!
Please read the README and feel free to contribute on GitHub: https://github.com/RasicN/Dapper-Parameters
This is a bit of a necro-post, but I believe there is a better way to do this: specifically, a way that allows simple and complex types to be inserted in a single statement, which may be important if you have a lot of indexes.
In this example I'm adding a Product that has a Name, ImageUrl and Year using a Stored Procedure called ProductAdd and a User-Defined Table Type called ProductList. You can simplify this to be a list of numbers or strings.
DataTable productList = new DataTable();
productList.Columns.Add(new DataColumn("Name", typeof(string)));
productList.Columns.Add(new DataColumn("ImageUrl", typeof(string)));
productList.Columns.Add(new DataColumn("Year", typeof(Int32)));
foreach (var product in products)
{
    // The order of these must match the order of columns for the Type in SQL!
    productList.Rows.Add(product.Name, product.ImageUrl, product.Year);
}
var result = new List<Product>();
using (var connection = new SqlConnection(_appSettings.ConnectionString))
{
    await connection.OpenAsync();
    try
    {
        result = (await connection.QueryAsync<Product>("[dbo].[ProductAdd]",
            new { product = productList },
            commandType: CommandType.StoredProcedure
        )).ToList();
    }
    catch (SqlException)
    {
        // Permissions and other SQL issues are easy to miss, so catching here helps when debugging
        throw;
    }
}
return result;
The User-Defined Type is defined with this:
CREATE TYPE [dbo].[ProductList] AS TABLE(
    [Name] [nvarchar](100) NOT NULL,
    [ImageUrl] [nvarchar](125) NULL,
    [Year] [int] NOT NULL
)
And the stored procedure uses the list passed in as a table. If you are doing the same thing for a GET/SELECT where you would normally use an IN, you can apply the same pattern: a SQL IN is essentially a join under the covers, so we are just being more explicit about that by joining on a table in place of the IN.
CREATE PROCEDURE ProductAdd
    @product ProductList READONLY
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO [Product] ([Name], [ImageUrl], [Year])
    SELECT paramProduct.[Name]
         , paramProduct.[ImageUrl]
         , paramProduct.[Year]
    FROM @product AS paramProduct
    LEFT JOIN [Product] AS existingProduct
        ON existingProduct.[Name] = paramProduct.[Name]
        AND existingProduct.[Year] = paramProduct.[Year]
    WHERE existingProduct.[Id] IS NULL
END
GO

Application of Primary Key from Parent Table to Child Table

I have the following tables in my SQL Server CE database:
ORDERS
OrderID (handled by the DBMS)
CustomerID
OrderDate
ORDER_DETAILS
OrderID (from the ORDERS table)
ProductID
OrderQTY
I currently use two insert queries to add a new order to the database: the first inserts the order into the ORDERS table and lets the DBMS create the OrderID, and the second uses that OrderID to insert into the ORDER_DETAILS table.
The approach I have used below seems very clunky and potentially vulnerable to concurrency issues. Is there a way to have the DBMS handle applying the correct OrderID to the ORDER_DETAILS rows when a new record is inserted into the ORDERS table?
This is the C# I use to run the insert queries:
public int InsertOrder(Order order)
{
    DBConnection connection = DBConnection.getInstance();
    connection.conn.Open();
    using (SqlCeCommand query = new SqlCeCommand(OrderCommandList.cmdInsertOrderHeader, connection.conn))
    {
        query.Parameters.AddWithValue("@CustomerID", order.CustomerID);
        query.Parameters.AddWithValue("@OrderDate", order.OrderDate);
        query.ExecuteNonQuery();
    }
    //retrieves the PK for the recently inserted record
    int newOrderPK = 0;
    using (SqlCeCommand cmdGetIdentity = new SqlCeCommand("SELECT @@IDENTITY", connection.conn))
    {
        newOrderPK = Convert.ToInt32(cmdGetIdentity.ExecuteScalar());
    }
    connection.conn.Close();
    InsertOrderDetails(order, newOrderPK);
    return newOrderPK;
}
//inserts all the order details associated with the Order object
private void InsertOrderDetails(Order order, int orderForeignKey)
{
    foreach (OrderDetail od in order.OrderLineItems)
    {
        DBConnection connection = DBConnection.getInstance();
        connection.conn.Open();
        using (SqlCeCommand query = new SqlCeCommand(OrderCommandList.cmdInsertOrderDetails, connection.conn))
        {
            query.Parameters.AddWithValue("@OrderID", orderForeignKey);
            query.Parameters.AddWithValue("@ProductID", od.ProductID);
            query.Parameters.AddWithValue("@OrderQty", od.QtyOrdered);
            query.ExecuteNonQuery();
        }
        connection.conn.Close();
    }
}
Change your stored procedure to select the generated identity value, and have it use SELECT SCOPE_IDENTITY(); rather than SELECT @@IDENTITY;. Actually, preferable would be to use an OUTPUT parameter, but for now I'll assume you're going to keep using SELECT. e.g.
INSERT dbo.Orders(CustomerID, OrderDate) SELECT @CustomerID, @OrderDate;
SELECT ID = SCOPE_IDENTITY();
Then, instead of calling ExecuteNonQuery() for the insert followed by a separate command for SELECT @@IDENTITY, you just need one ExecuteScalar to perform both.
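In code, that looks something like the following sketch. Note this assumes you follow the advice below and move off SQL Server CE (which supports neither multi-statement batches nor SCOPE_IDENTITY()) to SQL Server Express/LocalDB:
int newOrderPK;
using (var query = new SqlCommand(
    "INSERT INTO dbo.Orders (CustomerID, OrderDate) VALUES (@CustomerID, @OrderDate); " +
    "SELECT CAST(SCOPE_IDENTITY() AS int);",
    connection.conn))
{
    query.Parameters.AddWithValue("@CustomerID", order.CustomerID);
    query.Parameters.AddWithValue("@OrderDate", order.OrderDate);
    newOrderPK = (int)query.ExecuteScalar(); // insert and identity lookup in a single round trip
}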
You can do even better than this using TVPs, where you could send the order and all of the detail rows with a single stored procedure. However, CE is a little passé - not sure if it supports TVPs. You should consider using Express / LocalDB.
