Application of Primary Key from Parent Table to Child Table - c#

I have the following tables in my SQL Server CE database:
ORDERS
OrderID (handled by the DBMS)
CustomerID
OrderDate
ORDER_DETAILS
OrderID (from the ORDERS table)
ProductID
OrderQTY
I currently use two insert queries to add new orders to the database. The first inserts the order into the ORDERS table and allows the DBMS to create the OrderID; the second uses that OrderID to insert into the ORDER_DETAILS table.
The approach I have used below seems very clunky and potentially vulnerable to concurrency issues. Is there a way to have the DBMS handle the creation of the correct OrderID for the ORDER_DETAILS table when a new record is inserted into the ORDERS table?
This is the C# I use to run the insert queries:
public int InsertOrder(Order order)
{
    DBConnection connection = DBConnection.getInstance();
    connection.conn.Open();

    using (SqlCeCommand query = new SqlCeCommand(OrderCommandList.cmdInsertOrderHeader, connection.conn))
    {
        query.Parameters.AddWithValue("@CustomerID", order.CustomerID);
        query.Parameters.AddWithValue("@OrderDate", order.OrderDate);
        query.ExecuteNonQuery();
    }

    // retrieves the PK for the recently inserted record
    int newOrderPK = 0;
    using (SqlCeCommand cmdGetIdentity = new SqlCeCommand("SELECT @@IDENTITY", connection.conn))
    {
        newOrderPK = Convert.ToInt32(cmdGetIdentity.ExecuteScalar());
    }

    connection.conn.Close();
    InsertOrderDetails(order, newOrderPK);
    return newOrderPK;
}

// inserts all the order details associated with the Order object
private void InsertOrderDetails(Order order, int orderForeignKey)
{
    foreach (OrderDetail od in order.OrderLineItems)
    {
        DBConnection connection = DBConnection.getInstance();
        connection.conn.Open();

        using (SqlCeCommand query = new SqlCeCommand(OrderCommandList.cmdInsertOrderDetails, connection.conn))
        {
            query.Parameters.AddWithValue("@OrderID", orderForeignKey);
            query.Parameters.AddWithValue("@ProductID", od.ProductID);
            query.Parameters.AddWithValue("@OrderQty", od.QtyOrdered);
            query.ExecuteNonQuery();
        }

        connection.conn.Close();
    }
}

Change your stored procedure to select the generated identity value, and have it use SELECT SCOPE_IDENTITY(); rather than SELECT @@IDENTITY;. An OUTPUT parameter would actually be preferable, but for now I'll assume you're going to keep using SELECT, e.g.
INSERT dbo.Orders(CustomerID, OrderDate) SELECT @CustomerID, @OrderDate;
SELECT ID = SCOPE_IDENTITY();
Then, instead of calling ExecuteNonQuery() for the insert followed by a separate command for SELECT @@IDENTITY, you only need one ExecuteScalar call to perform both.
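For example, here is a minimal sketch of the calling code with that combined batch, reusing the order object and a connection string from the question (this assumes full SQL Server with SqlCommand rather than SQL Server CE, since SCOPE_IDENTITY() and multi-statement batches are not available in CE):
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(
    "INSERT dbo.Orders(CustomerID, OrderDate) SELECT @CustomerID, @OrderDate; " +
    "SELECT ID = SCOPE_IDENTITY();", conn))
{
    cmd.Parameters.AddWithValue("@CustomerID", order.CustomerID);
    cmd.Parameters.AddWithValue("@OrderDate", order.OrderDate);
    conn.Open();

    // One round trip: the batch inserts the row and returns the new identity value.
    int newOrderPK = Convert.ToInt32(cmd.ExecuteScalar());
}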
You can do even better than this using TVPs, where you could send the order and all of the detail rows with a single stored procedure. However, CE is a little passé - not sure if it supports TVPs. You should consider using Express / LocalDB.

Related

How to check if a record with composite key already exists in mysql database before inserting it using Dapper in C#

Supplier and Invoice form a composite key in a table which consists of 9 fields.
You can simply perform a SELECT query first and then insert, or you can do it in a single trip as shown below (SQL Server). You can use one of the recommended Dapper extensions listed here.
string cmd = @"IF NOT EXISTS (SELECT * FROM Table1 WHERE SupplierId = @SupplierId AND InvoiceId = @InvoiceId)
               BEGIN
                   INSERT INTO Table1(SupplierId, InvoiceId) VALUES(@SupplierId, @InvoiceId)
               END";

using (var dbConn = new SqlConnection("Server=(local);Database=MyDatabase;Integrated Security=true"))
{
    dbConn.Execute(cmd, new { SupplierId = 1, InvoiceId = 2 });
}
Console.ReadKey();
UPDATE
For MySQL, you can write something like
INSERT INTO Table1 (SupplierId, InvoiceId) VALUES(@SupplierId, @InvoiceId) ON DUPLICATE KEY UPDATE SupplierId = SupplierId
https://dev.mysql.com/doc/refman/5.7/en/insert-on-duplicate.html
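A usage sketch of the MySQL variant with Dapper (this assumes the MySqlConnection class from the MySql.Data or MySqlConnector package, and the same Table1 as above):
string mySqlCmd = @"INSERT INTO Table1 (SupplierId, InvoiceId) VALUES(@SupplierId, @InvoiceId)
                    ON DUPLICATE KEY UPDATE SupplierId = SupplierId";

using (var dbConn = new MySqlConnection("Server=localhost;Database=MyDatabase;Uid=myUser;Pwd=myPassword"))
{
    // The ON DUPLICATE KEY UPDATE clause turns the insert into a no-op when the composite key already exists.
    dbConn.Execute(mySqlCmd, new { SupplierId = 1, InvoiceId = 2 });
}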

Dapper to insert multiple rows into two tables using stored procedure

I am using Dapper (https://github.com/StackExchange/Dapper) in an ASP.NET Core Web API project. I have a requirement to create a master record and insert a set of rows into a child table.
Ex. OrdersMaster table
create an order id in this table
OrderDetails table
ordermasterid
order items
How can I do this using Dapper? Please share some code snippets if possible.
I would implement the insert operations as a transactional stored procedure, and then call that from your .NET application.
You may need a table-valued type to pass in a list of data, like this:
CREATE TYPE List_Of_Items AS TABLE (
    ItemID INT NOT NULL,
    Quantity INT NOT NULL
)
The procedure might look like this
CREATE PROC Insert_Order_With_Details (
    @CustomerId INT,
    @Items List_Of_Items READONLY
) AS
BEGIN
    BEGIN TRANSACTION

    INSERT INTO OrdersMaster (CustomerId) VALUES (@CustomerId)

    DECLARE @OrderID INT
    SET @OrderID = SCOPE_IDENTITY() -- last assigned id

    INSERT INTO OrderDetails (OrderId, CustomerId, ItemId, Quantity)
    SELECT @OrderID, @CustomerId, ItemID, Quantity
    FROM @Items

    COMMIT
END
Then in C#, I would suggest creating methods for building your TVP. It is not as simple as you might like. This requires the using Microsoft.SqlServer.Server and using Dapper.Tvp directives.
// This is a shell to create any kind of TVP
private static void AddTableCore<T>(
    this DynamicParametersTvp dp,
    string tvpTypeName,
    Func<T, SqlDataRecord> valueProjection,
    IEnumerable<T> values,
    string parameterTableName)
{
    var tvp = values
        .Select(valueProjection)
        .ToList();

    // If you pass a TVP with 0 rows to SQL Server it will error; you must pass null instead.
    if (!tvp.Any()) tvp = null;

    dp.Add(new TableValueParameter(parameterTableName, tvpTypeName, tvp));
}
// This will create your specific Items TVP
public static void AddItemsTable(this DynamicParametersTvp dp, IEnumerable<Item> items, string parameterTableName = "Items")
{
    var columns = new[]
    {
        new SqlMetaData("ItemID", SqlDbType.Int),
        new SqlMetaData("Quantity", SqlDbType.Int)
    };

    var projection = new Func<Item, SqlDataRecord>(item =>
    {
        var record = new SqlDataRecord(columns);
        record.SetInt32(0, item.Id);
        record.SetInt32(1, item.Quantity);
        return record;
    });

    AddTableCore(dp, "List_Of_Items", projection, items, parameterTableName);
}
and then where you need to query you might do:
using (var cn = new SqlConnection(myConnectionString))
{
    var p = new DynamicParametersTvp(new
    {
        CustomerId = myCustomerId
    });
    p.AddItemsTable(items);

    cn.Execute("Insert_Order_With_Details", p, commandType: CommandType.StoredProcedure);
}
The commandType argument is super important. It defaults to plain SQL text and will error if you send the name of a proc.
If you want to put in multiple orders at once, you'll need to use table-valued parameters and the Dapper.Tvp package.
See this SO question, Using Dapper.TVP TableValueParameter with other parameters, as well as this documentation on TVPs from Microsoft: https://learn.microsoft.com/en-us/sql/relational-databases/tables/use-table-valued-parameters-database-engine. I don't think all SQL vendors support TVPs.
Using a stored procedure as mentioned in the other answer is a good solution. My answer implements the same thing without a stored procedure.
You have to use a transaction. That way, either all the changes are committed or all are rolled back. The code below assumes you are using an identity column as the primary key. Please refer to this question for a discussion of the @@IDENTITY I have used in the code below.
Though the code is not complete, I have added detailed comments to explain the steps.
using (var connection = new SqlCeConnection("connection_string"))
{
    connection.Open();

    // Begin the transaction
    using (var transaction = connection.BeginTransaction())
    {
        // Create and fill up master table data
        var paramMaster = new DynamicParameters();
        paramMaster.Add("@XXX", ...);
        ....

        // Insert record in master table. Pass transaction parameter to Dapper.
        var affectedRows = connection.Execute("insert into OrdersMaster....", paramMaster, transaction: transaction);

        // Get the Id newly created for the master table record.
        // If this is not an Identity, use a different method here.
        newId = Convert.ToInt64(connection.ExecuteScalar<object>("SELECT @@IDENTITY", null, transaction: transaction));

        // Create and fill up detail table data.
        // Use a suitable loop as you want to insert multiple records.
        foreach (OrderItem item in orderItems)
        {
            var paramDetails = new DynamicParameters();
            paramDetails.Add("@OrderMasterId", newId);
            paramDetails.Add("@YYY", ...);
            ....

            // Insert record in detail table. Pass transaction parameter to Dapper.
            var affectedRows = connection.Execute("insert into OrderDetails....", paramDetails, transaction: transaction);
        }

        // Commit the transaction
        transaction.Commit();
    }
}

Edit existing data in SQL Server database

I have a table SupplierMaster in a SQL Server database with a column SUPPLIERNAME.
I want to edit a saved supplier name using a stored procedure with the query below
ALTER PROCEDURE [dbo].[usp_SupplierMasterUpdateDetails]
(
    @SUPPLIERNAME NVARCHAR(50)
)
AS
BEGIN
    UPDATE [dbo].[SupplierMaster]
    SET [SUPPLIERNAME] = @SUPPLIERNAME
    WHERE [SUPPLIERNAME] = @SUPPLIERNAME
END
and I run the code below from an "UPDATE" button to update the data.
string connString = ConfigurationManager.ConnectionStrings["dbx"].ConnectionString;

using (SqlConnection conn = new SqlConnection(connString))
{
    using (SqlCommand cmd = new SqlCommand("usp_SupplierMasterUpdateDetails", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;

        // Parameter
        cmd.Parameters.AddWithValue("SUPPLIERNAME", AddSupplierTextBox.Text);

        // Open Connection
        conn.Open();

        // ExecuteReader (Select Statement)
        // ExecuteScalar (Select Statement)
        // ExecuteNonQuery (Insert, Update or Delete)
        cmd.ExecuteNonQuery();

        MessageBox.Show("SUCCESSFULLY UPDATED", "Successful", MessageBoxButtons.OK, MessageBoxIcon.Information);
    }
}
But it's not updating the selected data.
Please advise and help me correct the code so it works properly.
You have multiple issues there.
First, you need to fix your update query just as Thomas Levesque suggested.
A SQL Server table needs a primary key to be able to uniquely identify a record, for updates for example.
The easiest thing you could do is make that primary key an identity column of type int so it is generated automatically. Your supplier table could look like this:
SupplierID int, Primary Key, identity
SupplierName nvarchar(100)
Now, when you do an update, you would do it like this:
Update SupplierMaster
Set SupplierName = @supplierName
Where SupplierID = @supplierID
Executing such a SQL statement with ExecuteNonQuery returns an int value telling you how many rows the update statement changed. If it returns 0, the statement could not find the id you passed in and nothing changed. If it returns 1, the record was found and updated. If you get more than 1, you have an issue with the SQL statement and multiple rows were updated.
In your code, check this return value; that is how you determine whether your update statement was successful.
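A minimal sketch of that check, reusing the cmd object from the code in the question:
// Sketch only: ExecuteNonQuery returns the number of rows affected by the UPDATE.
int rowsAffected = cmd.ExecuteNonQuery();

if (rowsAffected == 1)
{
    MessageBox.Show("SUCCESSFULLY UPDATED", "Successful", MessageBoxButtons.OK, MessageBoxIcon.Information);
}
else if (rowsAffected == 0)
{
    // Nothing matched the WHERE clause, so nothing was updated.
    MessageBox.Show("No matching supplier was found.", "Not updated", MessageBoxButtons.OK, MessageBoxIcon.Warning);
}
else
{
    // More than one row changed: the WHERE clause is not selective enough.
    MessageBox.Show("More than one row was updated - check the WHERE clause.", "Warning", MessageBoxButtons.OK, MessageBoxIcon.Warning);
}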

How can I use more than 2100 values in an IN clause using Dapper?

I have a List containing ids that I want to insert into a temp table using Dapper in order to avoid the SQL limit on parameters in the 'IN' clause.
So currently my code looks like this:
public IList<int> LoadAnimalTypeIdsFromAnimalIds(IList<int> animalIds)
{
    using (var db = new SqlConnection(this.connectionString))
    {
        return db.Query<int>(
            @"SELECT a.animalID
              FROM
                  dbo.animalTypes [at]
                  INNER JOIN animals [a] on a.animalTypeId = at.animalTypeId
                  INNER JOIN edibleAnimals e on e.animalID = a.animalID
              WHERE
                  at.animalId in @animalIds", new { animalIds }).ToList();
    }
}
The problem I need to solve is that when there are more than 2100 ids in the animalIds list then I get a SQL error "The incoming request has too many parameters. The server supports a maximum of 2100 parameters".
So now I would like to create a temp table populated with the animalIds passed into the method. Then I can join the animals table on the temp table and avoid having a huge "IN" clause.
I have tried various combinations of syntax but haven't gotten anywhere.
This is where I am now:
public IList<int> LoadAnimalTypeIdsFromAnimalIds(IList<int> animalIds)
{
    using (var db = new SqlConnection(this.connectionString))
    {
        db.Execute(@"SELECT INTO #tempAnmialIds @animalIds");

        return db.Query<int>(
            @"SELECT a.animalID
              FROM
                  dbo.animalTypes [at]
                  INNER JOIN animals [a] on a.animalTypeId = at.animalTypeId
                  INNER JOIN edibleAnimals e on e.animalID = a.animalID
                  INNER JOIN #tempAnmialIds tmp on tmp.animalID = a.animalID").ToList();
    }
}
I can't get the SELECT INTO working with the list of IDs. Am I going about this the wrong way? Maybe there is a better way to avoid the IN clause limit.
I do have a backup solution in that I can split the incoming list of animalIDs into blocks of 1000, but I've read that a large IN clause suffers a performance hit and joining a temp table will be more efficient; it also means I don't need extra 'splitting' code to batch the ids into blocks of 1000.
Ok, here's the version you want. I'm adding this as a separate answer, as my first answer using SP/TVP utilizes a different concept.
public IList<int> LoadAnimalTypeIdsFromAnimalIds(IList<int> animalIds)
{
    using (var db = new SqlConnection(this.connectionString))
    {
        // This Open() call is vital! If you don't open the connection, Dapper will
        // open/close it automagically, which means that you'll lose the created
        // temp table directly after the statement completes.
        db.Open();

        // This temp table is created having a primary key. So make sure you don't pass
        // any duplicate IDs.
        db.Execute("CREATE TABLE #tempAnimalIds(animalId int not null primary key);");

        while (animalIds.Any())
        {
            // Build the statements to insert the Ids. For this, we need to split animalIds
            // into chunks of 1000, as this flavour of INSERT INTO is limited to 1000 values
            // at a time.
            var ids2Insert = animalIds.Take(1000);
            animalIds = animalIds.Skip(1000).ToList();

            StringBuilder stmt = new StringBuilder("INSERT INTO #tempAnimalIds VALUES (");
            stmt.Append(string.Join("),(", ids2Insert));
            stmt.Append(");");

            db.Execute(stmt.ToString());
        }

        return db.Query<int>(@"SELECT animalID FROM #tempAnimalIds").ToList();
    }
}
To test:
var ids = LoadAnimalTypeIdsFromAnimalIds(Enumerable.Range(1, 2500).ToList());
You just need to amend your select statement to what it originally was. As I don't have all your tables in my environment, I just selected from the created temp table to prove it works the way it should.
Pitfalls, see comments:
Open the connection at the beginning, otherwise the temp table will be gone after Dapper automatically closes the connection right after creating the table.
This particular flavour of INSERT INTO is limited to 1000 values at a time, so the passed IDs need to be split into chunks accordingly.
Don't pass duplicate keys, as the primary key on the temp table will not allow that.
Edit
It seems Dapper supports a set-based operation which will make this work too:
public IList<int> LoadAnimalTypeIdsFromAnimalIdsV2(IList<int> animalIds)
{
    // This creates an IEnumerable of an anonymous type containing an Id property. This seems
    // to be necessary to be able to grab the Id by its name via Dapper.
    var namedIDs = animalIds.Select(i => new { Id = i });

    using (var db = new SqlConnection(this.connectionString))
    {
        // This is vital! If you don't open the connection, Dapper will open/close it
        // automagically, which means that you'll lose the created temp table directly
        // after the statement completes.
        db.Open();

        // This temp table is created having a primary key. So make sure you don't pass
        // any duplicate IDs.
        db.Execute("CREATE TABLE #tempAnimalIds(animalId int not null primary key);");

        // Using one of Dapper's convenient features, the INSERT becomes:
        db.Execute("INSERT INTO #tempAnimalIds VALUES(@Id);", namedIDs);

        return db.Query<int>(@"SELECT animalID FROM #tempAnimalIds").ToList();
    }
}
I don't know how well this will perform compared to the previous version (i.e. 2500 single inserts instead of three inserts with 1000, 1000, and 500 values each). But the doc suggests that it performs better if used together with async, MARS and pipelining.
In your example, what I can't see is how your list of animalIds is actually passed to the query to be inserted into the #tempAnimalIDs table.
There is a way to do it without using a temp table, utilizing a stored procedure with a table value parameter.
SQL:
CREATE TYPE [dbo].[udtKeys] AS TABLE([i] [int] NOT NULL)
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE [dbo].[myProc](@data as dbo.udtKeys readonly) AS
BEGIN
    select i from @data;
END
GO
This will create a user defined table type called udtKeys which contains just one int column named i, and a stored procedure that expects a parameter of that type. The proc does nothing else but to select the IDs you passed, but you can of course join other tables to it. For a hint regarding the syntax, see here.
C#:
var dataTable = new DataTable();
dataTable.Columns.Add("i", typeof(int));
foreach (var animalId in animalIds)
dataTable.Rows.Add(animalId);
using(SqlConnection conn = new SqlConnection("connectionString goes here"))
{
var r=conn.Query("myProc", new {data=dataTable},commandType: CommandType.StoredProcedure);
// r contains your results
}
The parameter within the procedure gets populated by passing a DataTable, and that DataTable's structure must match the one of the table type you created.
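If your Dapper version supports it, you can also name the table type explicitly via the AsTableValuedParameter extension on DataTable instead of relying on the parameter being matched automatically; a small sketch using the same proc and type as above:
using (SqlConnection conn = new SqlConnection("connectionString goes here"))
{
    // AsTableValuedParameter tells Dapper which user-defined table type the DataTable maps to.
    var r = conn.Query(
        "myProc",
        new { data = dataTable.AsTableValuedParameter("dbo.udtKeys") },
        commandType: CommandType.StoredProcedure);
}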
If you really need to pass more than 2100 values, you may want to consider indexing your table type to increase performance. You can actually give it a primary key if you don't pass any duplicate keys, like this:
CREATE TYPE [dbo].[udtKeys] AS TABLE(
    [i] [int] NOT NULL,
    PRIMARY KEY CLUSTERED
    (
        [i] ASC
    ) WITH (IGNORE_DUP_KEY = OFF)
)
GO
You may also need to assign execute permissions for the type to the database user you execute this with, like so:
GRANT EXEC ON TYPE::[dbo].[udtKeys] TO [User]
GO
See also here and here.
For me, the best way I was able to come up with was turning the list into a comma-separated string in C# and then using string_split in SQL to insert the data into a temp table. There are probably upper limits to this, but in my case I was only dealing with 6,000 records and it worked really fast.
public IList<int> LoadAnimalTypeIdsFromAnimalIds(IList<int> animalIds)
{
    using (var db = new SqlConnection(this.connectionString))
    {
        return db.Query<int>(
            @"-- Created a temp table to join to later. An index on this would probably be good too.
              CREATE TABLE #tempAnimals (Id INT)

              INSERT INTO #tempAnimals (Id)
              SELECT value FROM string_split(@animalIdStrings, ',')

              SELECT at.animalTypeID
              FROM dbo.animalTypes [at]
              JOIN animals [a] ON a.animalTypeId = at.animalTypeId
              JOIN #tempAnimals temp ON temp.Id = a.animalID -- <-- added this
              JOIN edibleAnimals e ON e.animalID = a.animalID",
            new { animalIdStrings = string.Join(",", animalIds) }).ToList();
    }
}
It might be worth noting that string_split is only available in SQL Server 2016 or higher (including Azure SQL), and only when the database compatibility level is 130 or higher. https://learn.microsoft.com/en-us/sql/t-sql/functions/string-split-transact-sql?view=sql-server-ver15
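If you are unsure what compatibility level the database is running at, you can check it up front; a small sketch (assuming the same connection setup as the method above):
using (var db = new SqlConnection(this.connectionString))
{
    // sys.databases exposes the compatibility level of each database.
    int level = db.QuerySingle<int>(
        "SELECT CAST(compatibility_level AS int) FROM sys.databases WHERE name = DB_NAME();");

    // string_split requires a compatibility level of 130 or higher.
    bool canUseStringSplit = level >= 130;
}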

ConstraintException when querying SQlite database with C#

I’m hoping somebody will be able to help with my SQLite database problem.
I’m receiving a ConstraintException when querying my SQLite database with C#. The full exception message is “Failed to enable constraints. One or more rows contain values violating non-null, unique, or foreign-key constraints.” I originally built this database using Access, which worked fine, but for various reasons I had to recreate it using SQLite.
To give a bit of background - this is a simple status scheduling program. Each Status has an associated Account and Schedule. I realise Statuses and Schedule have a 1:1 relationship and could be in the same table, but to allow the program to develop further I have split them into two tables.
See below for a cut down version of my table script (this is enough to recreate the problem).
PRAGMA foreign_keys = ON;
CREATE TABLE Accounts
(ID INTEGER PRIMARY KEY AUTOINCREMENT,
Name char(100));
CREATE TABLE Statuses
(ID INTEGER PRIMARY KEY AUTOINCREMENT,
AccountId INTEGER REFERENCES Accounts(ID) ON DELETE CASCADE,
Text char(140));
CREATE TABLE Schedule
(ID INTEGER PRIMARY KEY REFERENCES Statuses(ID) ON DELETE CASCADE,
StartDate char(255),
Frequency INT);
I did not have any issues until I created two Statuses and associated them with the same Account.
Accounts
ID Name
1 Fred Blogs
Statuses
ID AccountId Text
1 1 “Some text”
2 1 “Some more text”
Schedule
ID StartDate Frequency
1 16/02/2011 1
2 16/02/2011 1
The select statement I’m using which throws the exception is:
SELECT Statuses.Id, Statuses.Text, Accounts.Id, Accounts.Name, Schedule.StartDate, Schedule.Frequency
FROM [Statuses], [Accounts], [Schedule]
WHERE Statuses.AccountId = Accounts.Id AND Statuses.Id = Schedule.Id
If I run the same query, but remove the ‘Accounts.Id’ column the query works fine.
See below for the C# code I’m using, but I don’t think this is the problem.
public DataTable Query(string commandText)
{
    SQLiteConnection sqliteCon = new SQLiteConnection(ConnectionString);
    SQLiteCommand sqliteCom = new SQLiteCommand(commandText, sqliteCon);
    DataTable sqliteResult = new DataTable("Query Result");

    try
    {
        sqliteCon.Open();
        sqliteResult.Load(sqliteCom.ExecuteReader());
    }
    catch (Exception)
    {
        throw;
    }
    finally
    {
        sqliteCon.Close();
    }

    return sqliteResult;
}
Any help will be appreciated. Thanks.
The error is occurring due to the ID columns in the Statuses and Schedule tables. If they are not important, delete the columns from the two tables.
I have found a way around this problem. If I select the AccountId from the Statuses table rather than the Accounts table, no exception is thrown. It seems I was unable to run a SELECT statement that contained two unique primary key columns.
So instead of
SELECT Statuses.Id, Statuses.Text, Accounts.Id, Accounts.Name, Schedule.StartDate, Schedule.Frequency
FROM [Statuses], [Accounts], [Schedule]
WHERE Statuses.AccountId = Accounts.Id AND Statuses.Id = Schedule.Id
I run
SELECT Statuses.Id, Statuses.Text, Statuses.AccountId, Accounts.Name, Schedule.StartDate, Schedule.Frequency
FROM [Statuses], [Accounts], [Schedule]
WHERE Statuses.AccountId = Accounts.Id AND Statuses.Id = Schedule.Id
I fixed the issue by reading the schema only at first, then clearing the constraints of the DataTable, and then reading the data again, like this:
DataSet DS = new DataSet();
mytable = new DataTable();
DS.Tables.Add(mytable);
DS.EnforceConstraints = false;

SQLiteCommand command = DBconnection.CreateCommand();
command.CommandText = "select * from V_FullView";

SQLiteDataReader reader = command.ExecuteReader(CommandBehavior.SchemaOnly);
mytable.Load(reader);
mytable.Constraints.Clear();

reader = command.ExecuteReader();
mytable.Load(reader);
reader.Close();
My V_FullView is a view over 4 different merged tables. It seems that the constraints picked up are those of the first merged table (the name was unique in that table, but is replicated multiple times in the view).
