Pass param into SQL checksum function with Dapper - c#

I'm faced with a strange problem:
string sql = $@"SELECT * FROM SearchLogs
WHERE CHECKSUM(@query) = cs_SearchTerm
AND Website = @website";
return await Connection.QueryFirstOrDefaultAsync<SearchLog>(sql,
param: new { query, website });
The record definitely exists in the database, but I get null. cs_SearchTerm is a computed, nullable int column. Then I tried:
DECLARE @term nvarchar(500)
SET @term = @query
SELECT * FROM SearchLogs
WHERE CHECKSUM(@term) = cs_SearchTerm AND Website = @website
But I got the same result. Then I tried splitting it into two operations:
private async Task<int> SqlCheckSumAsync(string query)
{
string sql = @"SELECT CHECKSUM(@query)";
return await Connection.ExecuteScalarAsync<int>(sql, param: new { query }, transaction: Transaction);
}
string sql = $@"SELECT * FROM Search_Master
WHERE cs_SearchTerm = @checksum
AND Website = @website";
int checksum = await SqlCheckSumAsync(query);
return (await Connection.QueryFirstOrDefaultAsync<Search_Master>(sql,
param: new { checksum, website },
transaction: Transaction));
But still no positive result. What am I doing wrong? Why can't I pass a parameter into a SQL scalar function?

From the comments, this works:
SELECT * FROM SearchLogs WHERE cs_SearchTerm = CHECKSUM('abc') AND Website = 'ICF'
So: that tells me that you computed your checksums using varchar inputs. This is very important, because CHECKSUM('abc') gives a completely different answer than CHECKSUM(N'abc'):
select checksum('abc') as [Ansi], checksum(N'abc') as [Unicode]
gives:
Ansi Unicode
----------- -----------
34400 1132495864
By default, Dapper uses nvarchar (because .NET strings are UTF-16). So we need to tell Dapper to pass that as an ANSI string; fortunately this is simple:
return await Connection.ExecuteScalarAsync<int>(sql,
new { query = new DbString { Value = query, IsAnsi = true} },
transaction: Transaction);
Dapper's DbString type allows fine-grained control over how strings are sent, including both whether they are unicode or ANSI, and whether they are fixed width (and if so: what) or variable width.
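As a quick illustration of that fixed-width control, here is a sketch with a hypothetical fixed-width Code char(10) column (not from the original answer):

```csharp
// Hypothetical: the Code column is char(10) (fixed-width ANSI).
// IsAnsi avoids the nvarchar default; IsFixedLength + Length make
// Dapper send the parameter as char(10) instead of varchar.
var row = await Connection.QueryFirstOrDefaultAsync<SearchLog>(
    "SELECT * FROM SearchLogs WHERE Code = @code",
    new
    {
        code = new DbString
        {
            Value = "ABC",
            IsAnsi = true,
            IsFixedLength = true,
            Length = 10
        }
    });
```

Matching the parameter type to the column type this exactly lets SQL Server compare values without implicit conversions.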

Related

Dapper SQL too slow when IN parameter is a list of strings

I use raw SQL with parameters in Dapper. Query speed is normal.
Something like this:
string sql = "SELECT * FROM SomeTable WHERE messages IN ('Volvo', 'BMW', 'Ford', 'Mazda')";
var results = conn.Query(sql);
When I replace the literal values with a @cars parameter, the speed is too slow:
string sql = "SELECT * FROM SomeTable WHERE messages IN (@cars)";
string[] cars = { "Volvo", "BMW", "Ford", "Mazda" };
var results = conn.Query(sql, new {cars});
PS: string[] size is almost 300.
If you pass a string as a parameter, Dapper treats it as nvarchar.
So if your messages column is varchar and SomeTable is huge, then even if the column is indexed the query can be really slow, because comparing a varchar column to an nvarchar parameter forces an implicit conversion. DbString may solve this problem:
var dbCars = cars.Select(x => new DbString { Value = x, IsFixedLength = false, IsAnsi = true });
var results = conn.Query(sql, new { cars = dbCars });
(I think there might be a syntax error in the second snippet, and there shouldn't be parentheses.)
The two queries generate different SQL on the wire; e.g., with MS SQL Server they look like:
First one:
SELECT * FROM SomeTable WHERE messages IN ('Volvo', 'BMW', 'Ford', 'Mazda')
Second one:
exec sp_executesql N'SELECT * FROM SomeTable WHERE messages IN (@cars1, @cars2, @cars3, @cars4)',
N'@cars1 nvarchar(4000),@cars2 nvarchar(4000),@cars3 nvarchar(4000),@cars4 nvarchar(4000)',
@cars1=N'Volvo',@cars2=N'BMW',@cars3=N'Ford',@cars4=N'Mazda'
IMHO there are better ways of doing IN queries depending on your backend.
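For example, on SQL Server 2016+ one alternative (a sketch, using the question's hypothetical table) is to send the whole list as a single delimited varchar parameter and split it server-side, so the statement text stays stable and plan-cacheable:

```csharp
// Sketch: one parameter instead of ~300 generated @carsN parameters.
string sql = @"SELECT * FROM SomeTable
WHERE messages IN (SELECT value FROM STRING_SPLIT(@cars, ','))";

string[] cars = { "Volvo", "BMW", "Ford", "Mazda" };
var results = conn.Query(sql, new
{
    // DbString keeps the parameter varchar, matching an ANSI column
    cars = new DbString { Value = string.Join(",", cars), IsAnsi = true }
});
```

This only works when the values themselves cannot contain the delimiter; for structured data, a table-valued parameter is the more robust option.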

Insert large array into a SQL Server table variable with Dapper

I have an integer list and I'm trying to insert those values into a table variable, declared with a DECLARE statement, using Dapper. I've tried several combinations, but it ultimately leads to an Incorrect syntax near ',' error.
Can Dapper even differentiate between a local SQL variable and a Dapper query parameter, both having the @ prefix?
Fiddle
List<int> output = null;
List<int> input = new List<int>
{
1, 2, 3
};
var sql = @"DECLARE @tempTable TABLE (Id INT)
INSERT INTO @tempTable VALUES (@Ids);
SELECT * FROM @tempTable;";
using (var connection = new SqlConnection(FiddleHelper.GetConnectionStringSqlServer()))
{
output = connection.Query<int>(sql, new { Ids = input }).ToList();
}
Note that the input list can be bigger than 1000.
Dapper will expand the list into separate parameters, suitable for inclusion in an IN predicate, but not in an INSERT ... VALUES query. Instead pass the values as JSON, which is also much, much cheaper for large lists than using separate parameters (note that OPENJSON requires SQL Server 2016+ and database compatibility level 130). E.g.:
List<int> output = null;
List<int> input = new List<int>
{
1, 2, 3
};
var sql = @"
DECLARE @tempTable TABLE (Id INT)
INSERT INTO @tempTable(Id) SELECT value FROM OPENJSON(@Ids);
SELECT * FROM @tempTable;";
var inputJson = System.Text.Json.JsonSerializer.Serialize(input);
using (var con = new SqlConnection("server=localhost;database=tempdb;integrated security=true;trust server certificate=true"))
{
output = con.Query<int>(sql, new { Ids = inputJson }).ToList();
}

How do I read multiple results sets from a PostgreSQL function using Dapper?

I am trying to understand how to use Dapper to make a call to a PostgreSQL function that returns multiple result sets. My understanding is that in PostgreSQL, the best (only?) way to currently achieve this is to declare that the function RETURNS SETOF REFCURSOR.
Example PostgreSQL Function that Returns Multiple REFCURSORs
CREATE OR REPLACE FUNCTION public.testmultiplerefcursorfunc()
RETURNS SETOF REFCURSOR
LANGUAGE 'plpgsql'
STABLE
AS $BODY$
DECLARE
ref1 REFCURSOR;
ref2 REFCURSOR;
BEGIN
OPEN ref1 FOR
SELECT *
FROM characters;
RETURN NEXT ref1;
OPEN ref2 FOR
SELECT *
FROM planets;
RETURN NEXT ref2;
END;
$BODY$;
Broken Dapper+PostgreSQL with Multiple REFCURSORs Example
[Test]
public void UsingDapper_QueryMultiple_CallFunctionThatReturnsMultipleRefCursors_ReadsMultipleResultSetsViaMultipleRefCursors()
{
// Arrange
using (var conn = new NpgsqlConnection(_getConnectionStringToDatabase()))
{
var funcName = "testmultiplerefcursorfunc";
var expect1 = CharacterTestData;
var expect2 = PlanetTestData;
conn.Open();
using (var transaction = conn.BeginTransaction())
{
// Act
using (var results = conn.QueryMultiple(
funcName,
commandType: CommandType.StoredProcedure,
transaction: transaction))
{
var result1 = results.Read<Character>().AsList();
var result2 = results.Read<Planet>().AsList();
// Assert
CollectionAssert.AreEquivalent(expect1, result1);
CollectionAssert.AreEquivalent(expect2, result2);
}
}
}
}
The problem that I'm having with the code above is that when I make the first results.Read<T>() call, it attempts to return both REFCURSORs cast as T. This cast then results in a T with null values for all of the properties. Then the next call to results.Read<T>() throws the following exception:
System.ObjectDisposedException: 'The reader has been disposed; this can happen after all data has been consumed
Object name: 'Dapper.SqlMapper+GridReader'.'
So, how does Dapper work with multiple PostgreSQL REFCURSORs? Is there a way to read the results without manually dereferencing the cursors?
I've got a working vanilla example (without Dapper) that returns multiple REFCURSORs, where I manually dereference the cursors and read the results, and I've also got working examples against a SQL Server stored procedure that returns multiple result sets.
I haven't (yet) found any particular documentation that points to a specific difference of how QueryMultiple should be called for PostgreSQL vs SQL Server, but such documentation would be greatly appreciated.
Even when calling a PostgreSQL function that returns single REFCURSOR using Dapper, I've found it necessary to manually handle the cursor dereferencing like the example below.
But from what I've read so far, this doesn't seem like it's supposed to be necessary, although I've had trouble finding specific documentation/examples for Dapper+PostgreSQL that show how this should otherwise work.
Working Dapper+PostgreSQL with Single REFCURSOR Example
[Test]
public void UsingDapper_Query_CallFunctionThatReturnsRefCursor_ReadsRowsViaRefCursor()
{
// Arrange
using (var conn = new NpgsqlConnection(_getConnectionStringToDatabase()))
{
var procName = "testrefcursorfunc";
var expect = CharacterTestData;
conn.Open();
using (var transaction = conn.BeginTransaction())
{
// Act
var cursorResult = (IDictionary<string, object>)conn
.Query<dynamic>(procName, commandType: CommandType.StoredProcedure, transaction: transaction)
.Single();
var cursorSql = $@"FETCH ALL FROM ""{(string)cursorResult[procName]}""";
var result = conn.Query<Character>(
cursorSql,
commandType: CommandType.Text,
transaction: transaction);
// Assert
CollectionAssert.AreEquivalent(expect, result);
}
}
}
So, with Dapper + PostgreSQL + REFCURSOR, is it always necessary to manually dereference the cursor to read the results? Or can Dapper handle that for you?
Coming from a SQL Server background, where it's just a question of a list of SELECT statements in your stored proc and you're "good to go", using PostgreSQL with the requirement to use refcursors and "fetch" those refcursors on the other side can be quite painful.
What I can suggest is:
1.) Use a postgresql Procedure with refcursors as INOUT parameters.
CREATE OR REPLACE PROCEDURE public.proc_testmultiplerefcursor(INOUT ref1 refcursor, INOUT ref2 refcursor)
2.) Call the procedure and then fetch the refcursors for the returned data using "FETCH ALL".
Fill the INOUT parameters with names for the refcursors so they are recoverable; in this case I have used 'ref1' and 'ref2'.
var sql = "BEGIN;CALL public.proc_testmultiplerefcursor('ref1','ref2');" +
"FETCH ALL FROM ref1; " +
"FETCH ALL FROM ref2;" +
"COMMIT;";
3.) Then your usual Dapper QueryMultiple and reads:
var multi = await conn.QueryMultipleAsync(sql);
var result1 = (await multi.ReadAsync<Character>()).AsList();
var result2 = (await multi.ReadAsync<Planet>()).AsList();
This is untested but I hope it can be of help. PostgreSQL is painful but brilliant.
I ran into this problem lately, and to imitate the multiple result set behavior of SQL Server, I used a function that returns SETOF REFCURSOR. I made my own simple library to solve this problem so I can reuse it in my other projects. See my GitHub repository.
Assuming you have 5 tables in the database:
User: Id, Username
Role: Id, Name
Permission: Id, Name
UserRole: UserId (FK: User.Id), RoleId (FK: Role.Id)
RolePermission: RoleId (FK: Role.Id), PermissionId (FK: Permission.Id)
PostgreSQL (PL/PGSQL)
CREATE OR REPLACE FUNCTION "get_user_with_roles_and_permissions"("user__id" INTEGER)
RETURNS SETOF REFCURSOR AS
$BODY$
DECLARE
-- Refcursor declarations
"ref__user" REFCURSOR;
"ref__roles" REFCURSOR;
"ref__permissions" REFCURSOR;
BEGIN
-- Select User
-- NOTE: this only query for exactly 1 row
OPEN "ref__user" FOR
SELECT "User"."Id", "User"."Username"
FROM "User"
WHERE "User"."Id" = "user__id"
LIMIT 1;
RETURN NEXT "ref__user";
-- Select Roles
OPEN "ref__roles" FOR
SELECT "Role"."Id", "Role"."Name"
FROM "Role"
INNER JOIN "UserRole" ON "Role"."Id" = "UserRole"."RoleId"
WHERE "UserRole"."UserId" = "user__id";
RETURN NEXT "ref__roles";
-- Select Permissions
-- NOTE: There's a chance that user has many roles which have same permission, we use DISTINCT to eliminate duplicates
OPEN "ref__permissions" FOR
SELECT DISTINCT "Permission"."Id", "Permission"."Name"
FROM "Permission"
INNER JOIN "RolePermission" ON "Permission"."Id" = "RolePermission"."PermissionId"
INNER JOIN "UserRole" ON "RolePermission"."RoleId" = "UserRole"."RoleId"
WHERE "UserRole"."UserId" = "user__id";
RETURN NEXT "ref__permissions";
END;
$BODY$
C#
/* Entity models */
record User
{
public int Id { get; set; }
public string Username { get; set; }
public IEnumerable<Role> Roles { get; set; }
public IEnumerable<Permission> Permissions { get; set; }
}
record Role
{
public int Id { get; set; }
public string Name { get; set; }
}
record Permission
{
public int Id { get; set; }
public string Name { get; set; }
}
// Calling the database function in C# code
// Create an instance of NpgsqlConnection
using var connection = new NpgsqlConnection(connectionString);
// Try to open a database connection
connection.Open();
// Begin a database transaction
using var transaction = connection.BeginTransaction();
// Query refcursors
// Call a function with name 'get_user_with_roles_and_permissions' with parameter 'user_id' = 1
var refcursors = connection.QueryRefcursor("get_user_with_roles_and_permissions", transaction, new { user__id = 1 });
// we use ReadSingleOrDefault because we're sure that there is only one user that has an id of 1 (or none if the user with id = 1 doesn't exists)
var user = refcursors.ReadSingleOrDefault<User>();
// Check if user with id = 1 exists
if (user is not null)
{
// Query for roles
user.Roles = refcursors.Read<Role>();
// Query for permissions
user.Permissions = refcursors.Read<Permission>();
}
Try with conn.QueryMultipleAsync. In the current source you are using conn.QueryMultiple. Here is the complete guide for it: Dapper Multiple.
You can use it like this; I'm sure it will work!
public DataSet Manage_user_profiledata(string _prof_code)
{
string query = string.Format(@"select * from Function_Name(@prof_code, @first_tbl, @second_tbl)");
NpgsqlParameter[] sqlParameters = new NpgsqlParameter[3];
sqlParameters[0] = new NpgsqlParameter("@prof_code", NpgsqlDbType.Varchar);
sqlParameters[0].Value = Convert.ToString(_prof_code);
//
sqlParameters[1] = new NpgsqlParameter("@first_tbl", NpgsqlTypes.NpgsqlDbType.Refcursor);
sqlParameters[1].Value = Convert.ToString("Avilable");
sqlParameters[1].Direction = ParameterDirection.InputOutput;
sqlParameters[1].NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Refcursor;
//
sqlParameters[2] = new NpgsqlParameter("@second_tbl", NpgsqlTypes.NpgsqlDbType.Refcursor);
sqlParameters[2].Value = Convert.ToString("Assigned");
sqlParameters[2].Direction = ParameterDirection.InputOutput;
sqlParameters[2].NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Refcursor;
return conn.executeMultipleSelectQuery(query, sqlParameters);
}
public DataSet executeMultipleSelectQuery(string _query, NpgsqlParameter[] sqlParameter)
{
// NgpSql Init //
npg_connection = new NpgsqlConnection(connstr);
npg_command = new NpgsqlCommand(_query, npg_connection);
// NgpSql Init //
i = 0;
try
{
ds = new DataSet();
npg_connection.Open();
NpgsqlTransaction tran = npg_connection.BeginTransaction();
npg_command.CommandType = CommandType.Text;
npg_command.Parameters.AddRange(sqlParameter);
npg_command.ExecuteNonQuery();
foreach (NpgsqlParameter parm in sqlParameter)
{
if (parm.NpgsqlDbType == NpgsqlTypes.NpgsqlDbType.Refcursor)
{
if (parm.Value.ToString() != "null" && parm.Value.ToString() != "NULL" && parm.Value.ToString() != "")
{
string parm_val = string.Format("FETCH ALL IN \"{0}\"", parm.Value.ToString());
npg_adapter = new NpgsqlDataAdapter(parm_val.Trim().ToString(), npg_connection);
ds.Tables.Add(parm.Value.ToString());
npg_adapter.Fill(ds.Tables[i]);
i++;
}
}
}
tran.Commit();
return ds;
}
catch (Exception ex)
{
ds_ERROR.Tables[0].Rows.Add(ex.ToString(), ex.Message.ToString());
return ds_ERROR;
}
finally
{
npg_connection.Close();
}
}

How do I translate a table definition from MS Access to Oracle using C# and OleDbConnection?

I need to copy a table from MS Access to Oracle without using OracleBulkCopy (using Managed Data Access).
First step is to create the table.
As can be seen in the code below I am now converting the fields manually while querying for the column length.
I would like to use a statement that converts the fields for me without having to add rules for all types manually since there are a lot of types.
Or better, just extract some sort of DDL that I can execute in Oracle.
Is this possible?
private int GetLength(string accessTableName, string columnName, OleDbConnection accessConnection)
{
columnName = $"[{columnName}]";
var accessCommand = new OleDbCommand($"select max(len({columnName})) from {accessTableName}", accessConnection);
return int.Parse(accessCommand.ExecuteScalar().ToString());
}
private void CreateEmptyTable(DataTable schemaTable, string tableName, OracleConnection oracleConnection, string accessTableName, OleDbConnection accessConnection)
{
var columnSpecs = new string[schemaTable.Rows.Count];
for (int i = 0; i < schemaTable.Rows.Count; ++i)
{
var name = schemaTable.Rows[i].ItemArray[0].ToString();
var dataType = schemaTable.Rows[i].ItemArray[5];
//var length = schemaTable.Rows[i].ItemArray[2];
var length = GetLength(accessTableName, name.ToString(), accessConnection);
var precision = schemaTable.Rows[i].ItemArray[3];
var scale = schemaTable.Rows[i].ItemArray[4];
var oracleDt = "";
switch (dataType.ToString())
{
case "System.String":
oracleDt = $"nchar({length})";
break;
case "System.Int32":
case "System.Int16":
var iLng = int.Parse(length.ToString()) * 2;
oracleDt = $"number({iLng},0)";
break;
case "System.Double":
case "System.Decimal":
oracleDt = $"number({length},{precision})";
break;
default:
throw new Exception();
}
name = name.ToString().ToUpper().Replace(' ', '_');
columnSpecs[i] = $"{name} {oracleDt}";
}
var query = $"create table MDB_{tableName.ToUpper().Replace(' ', '_')} ( {string.Join(",", columnSpecs)} )";
var oracleCommand = new OracleCommand(query, oracleConnection);
oracleCommand.ExecuteNonQuery();
}
Perhaps try executing the command on the base table in place of the query.
So try something like this:
DoCmd.TransferDatabase acExport, "ODBC Database", strCon, _
acTable, "tblHotels2", "tblHotelEXPORT", True
Now, the above is from Access, but your posted answer is on the right track.
When I send the above to SQL Server, long, money, memo, and text fields all get sent out and created on SQL Server. I suppose you could use the above and then execute a series of ALTER TABLE statements to change the types, but that would be painful.
So before the pain approach, I would try the above. Note that:
I used the transfer command on the base table, not a query.
I used "True" as that last value - that means no data copy, just structure.
You could also try/test a different Oracle ODBC driver. I don't have a test Oracle server around, but sending the above to SQL Server, it did a VERY nice job of creating the data types on the server side.
This option uses Interop instead, but the result is disappointing: numeric fields are converted to VARCHAR2 (I haven't tested other types yet). Note that it also requires a DSN file and the Oracle ODBC drivers.
For the people who could use this, here is the code:
public void CreateTableDefsUsingInterop(string accessFilePath, string accessTable, string oracleUser, string oraclePassword, string oracleTable, string oracleDSNFilePath)
{
var strConn = $"ODBC;FILEDSN={oracleDSNFilePath};UID={oracleUser};PWD={oraclePassword}";
var sTypExprt = "ODBC Database";
var interop = new Application();
interop.OpenCurrentDatabase(accessFilePath);
var db = interop.CurrentDb();
var emptyTable = $"n{accessTable}";
QueryDef qd;
qd = db.CreateQueryDef("access2ora1", $"select top 1 * into [{emptyTable}] from [{accessTable}]");
qd.Execute();
qd = db.CreateQueryDef("access2ora2", $"delete from [{emptyTable}]");
qd.Execute();
interop.DoCmd.TransferDatabase(AcDataTransferType.acExport, sTypExprt, strConn, AcObjectType.acTable, emptyTable, oracleTable);
interop.Quit(AcQuitOption.acQuitSaveNone);
}

How do I perform an insert and return inserted identity with Dapper?

How do I perform an insert to database and return inserted identity with Dapper?
I've tried something like this:
string sql = "DECLARE @ID int; " +
"INSERT INTO [MyTable] ([Stuff]) VALUES (@Stuff); " +
"SELECT @ID = SCOPE_IDENTITY()";
var id = connection.Query<int>(sql, new { Stuff = mystuff}).First();
But it didn't work.
@Marc Gravell thanks for the reply.
I've tried your solution, but I still get the same exception; the trace is below:
System.InvalidCastException: Specified cast is not valid
at Dapper.SqlMapper.<QueryInternal>d__a`1.MoveNext() in (snip)\Dapper\SqlMapper.cs:line 610
at System.Collections.Generic.List`1..ctor(IEnumerable`1 collection)
at System.Linq.Enumerable.ToList[TSource](IEnumerable`1 source)
at Dapper.SqlMapper.Query[T](IDbConnection cnn, String sql, Object param, IDbTransaction transaction, Boolean buffered, Nullable`1 commandTimeout, Nullable`1 commandType) in (snip)\Dapper\SqlMapper.cs:line 538
at Dapper.SqlMapper.Query[T](IDbConnection cnn, String sql, Object param) in (snip)\Dapper\SqlMapper.cs:line 456
It does support input/output parameters (including RETURN value) if you use DynamicParameters, but in this case the simpler option is simply:
var id = connection.QuerySingle<int>(@"
INSERT INTO [MyTable] ([Stuff]) VALUES (@Stuff);
SELECT CAST(SCOPE_IDENTITY() as int)", new { Stuff = mystuff });
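The input/output parameter route via DynamicParameters mentioned above could look like this sketch (not from the original answer; requires System.Data for DbType and ParameterDirection):

```csharp
// Sketch: bind @ID as an output parameter and read it back afterwards.
var p = new DynamicParameters(new { Stuff = mystuff });
p.Add("@ID", dbType: DbType.Int32, direction: ParameterDirection.Output);

connection.Execute(@"
    INSERT INTO [MyTable] ([Stuff]) VALUES (@Stuff);
    SET @ID = SCOPE_IDENTITY();", p);

int id = p.Get<int>("@ID");
```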
Note that on more recent versions of SQL Server (2005+) you can use the OUTPUT clause:
var id = connection.QuerySingle<int>(@"
INSERT INTO [MyTable] ([Stuff])
OUTPUT INSERTED.Id
VALUES (@Stuff);", new { Stuff = mystuff });
A late answer, but here is an alternative to the SCOPE_IDENTITY() answers that we ended up using: OUTPUT INSERTED
Return only ID of inserted object:
It allows you to get all or some attributes of the inserted row:
string insertUserSql = @"INSERT INTO dbo.[User](Username, Phone, Email)
OUTPUT INSERTED.[Id]
VALUES(@Username, @Phone, @Email);";
int newUserId = conn.QuerySingle<int>(
insertUserSql,
new
{
Username = "lorem ipsum",
Phone = "555-123",
Email = "lorem ipsum"
},
tran);
Return inserted object with ID:
If you wanted, you could get Phone and Email, or even the whole inserted row:
string insertUserSql = @"INSERT INTO dbo.[User](Username, Phone, Email)
OUTPUT INSERTED.*
VALUES(@Username, @Phone, @Email);";
User newUser = conn.QuerySingle<User>(
insertUserSql,
new
{
Username = "lorem ipsum",
Phone = "555-123",
Email = "lorem ipsum"
},
tran);
Also, with this you can return data from deleted or updated rows. Just be careful if you are using triggers, because (from the link mentioned before):
Columns returned from OUTPUT reflect the data as it is after the
INSERT, UPDATE, or DELETE statement has completed but before triggers
are executed.
For INSTEAD OF triggers, the returned results are generated as if the
INSERT, UPDATE, or DELETE had actually occurred, even if no
modifications take place as the result of the trigger operation. If a
statement that includes an OUTPUT clause is used inside the body of a
trigger, table aliases must be used to reference the trigger inserted
and deleted tables to avoid duplicating column references with the
INSERTED and DELETED tables associated with OUTPUT.
More on it in the docs: link
See KB 2019779, "You may receive incorrect values when using SCOPE_IDENTITY() and @@IDENTITY".
The OUTPUT clause is the safest mechanism:
string sql = @"
DECLARE @InsertedRows AS TABLE (Id int);
INSERT INTO [MyTable] ([Stuff]) OUTPUT Inserted.Id INTO @InsertedRows
VALUES (@Stuff);
SELECT Id FROM @InsertedRows";
var id = connection.Query<int>(sql, new { Stuff = mystuff }).Single();
The InvalidCastException you are getting is because SCOPE_IDENTITY() returns a decimal(38,0).
You can return it as an int by casting it as follows:
string sql = @"
INSERT INTO [MyTable] ([Stuff]) VALUES (@Stuff);
SELECT CAST(SCOPE_IDENTITY() AS INT)";
int id = connection.Query<int>(sql, new { Stuff = mystuff }).Single();
Not sure if it was because I'm working against SQL Server 2000 or not, but I had to do this to get it to work.
string sql = "DECLARE @ID int; " +
"INSERT INTO [MyTable] ([Stuff]) VALUES (@Stuff); " +
"SET @ID = SCOPE_IDENTITY(); " +
"SELECT @ID";
var id = connection.Query<int>(sql, new { Stuff = mystuff }).Single();
There is a great library to make your life easier, Dapper.Contrib.Extensions. After including it you can just write:
public int Add(Transaction transaction)
{
using (IDbConnection db = Connection)
{
return (int)db.Insert(transaction);
}
}
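For Insert to map the entity, Dapper.Contrib relies on its mapping attributes; here is a sketch (the table and column names are assumptions, not from the original answer):

```csharp
using Dapper.Contrib.Extensions;

[Table("Transactions")] // assumed table name
public class Transaction
{
    [Key] // identity column; Insert returns the generated value as a long
    public int Id { get; set; }
    public decimal Amount { get; set; }
}
```

Without an explicit [Table] attribute, Dapper.Contrib guesses the table name from the class name.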
I was using .NET Core 3.1 with PostgreSQL 12.3. Building on the answer from Tadija Bagarić, I ended up with:
using (var connection = new NpgsqlConnection(AppConfig.CommentFilesConnection))
{
string insertUserSql = @"INSERT INTO mytable(comment_id,filename,content)
VALUES( @commentId, @filename, @content) returning id;";
int newUserId = connection.QuerySingle<int>(
insertUserSql,
new
{
commentId = 1,
filename = "foobar!",
content = "content"
}
);
}
where AppConfig is my own class which simply gets a string set for my connection details. This is set within the Startup.cs ConfigureServices method.
I see answers for SQL Server; here it is for MySQL, using a transaction:
Dim sql As String = "INSERT INTO Empleado (nombres, apepaterno, apematerno, direccion, colonia, cp, municipio, estado, tel, cel, correo, idrol, relojchecadorid, relojchecadorid2, `activo`,`extras`,`rfc`,`nss`,`curp`,`imagen`,sueldoXHra, IMSSCotiza, thumb) VALUES (@nombres, @apepaterno, @apematerno, @direccion, @colonia, @cp, @municipio, @estado, @tel, @cel, @correo, @idrol, @relojchecadorid, @relojchecadorid2, @activo, @extras, @rfc, @nss, @curp, @imagen,@sueldoXHra,@IMSSCotiza, @thumb)"
Using connection As IDbConnection = New MySqlConnection(getConnectionString())
connection.Open()
Using transaction = connection.BeginTransaction
Dim res = connection.Execute(sql, New With {reg.nombres, reg.apepaterno, reg.apematerno, reg.direccion, reg.colonia, reg.cp, reg.municipio, reg.estado, reg.tel, reg.cel, reg.correo, reg.idrol, reg.relojchecadorid, reg.relojchecadorid2, reg.activo, reg.extras, reg.rfc, reg.nss, reg.curp, reg.imagen, reg.thumb, reg.sueldoXHra, reg.IMSSCotiza}, commandTimeout:=180, transaction:=transaction)
lastInsertedId = connection.ExecuteScalar("SELECT LAST_INSERT_ID();", transaction:=transaction)
If res > 0 Then
transaction.Commit()
return true
end if
End Using
End Using
If you're using Dapper.SimpleSave:
//no safety checks
public static int Create<T>(object param)
{
using (SqlConnection conn = new SqlConnection(GetConnectionString()))
{
conn.Open();
conn.Create<T>((T)param);
return (int)(((T)param).GetType().GetProperties().Where(
    x => x.CustomAttributes.Where(
        y => y.AttributeType == typeof(Dapper.SimpleSave.PrimaryKeyAttribute)).Count() == 1).First().GetValue(param));
}
}
