I have trouble inserting into a PostgreSQL database with the EF Core methods ExecuteSqlInterpolatedAsync/ExecuteSqlInterpolated.
Table schema:
Column | Type | Collation | Nullable | Default
----------+---------+-----------+----------+------------------------------------------
Id | integer | | not null | nextval('"LookupRows_Id_seq"'::regclass)
Index | integer | | not null |
RowData | text | | not null |
LookupId | integer | | not null |
Insert method:
FormattableString str = $"INSERT into \"LookupRows\" (\"Index\", \"RowData\", \"LookupId\") VALUES {q.First()};";
await _context.Database.ExecuteSqlRawAsync(str.ToString(),cancellationToken);
await _context.Database.ExecuteSqlInterpolatedAsync($"INSERT into \"LookupRows\" (\"Index\", \"RowData\", \"LookupId\") VALUES {q.First()};",cancellationToken);
q.First() = (0, '{{"0":["0","Bucharest","0"]}}', 115)
ExecuteSqlRawAsync works fine; the entry is successfully inserted.
But ExecuteSqlInterpolatedAsync always returns an error:
MessageText: syntax error at or near "$1"
I've run out of ideas. What am I doing wrong?
Neither option works.
In the first case the code uses plain old string manipulation to construct a SQL query, allowing SQL injection and conversion errors. If q.First() returned a string with the value ('','',''); drop table Users;--, you'd end up with a dropped table.
In the second case the syntax is simply invalid. Instead of supplying the three expected values through separate parameters, a single value is passed.
The correct syntax for ExecuteSqlRawAsync is:
var paramObject = new { index = 1, row = "abc", lookup = 100 };
var sql = @"INSERT INTO ""LookupRows"" (""Index"", ""RowData"", ""LookupId"")
            VALUES (@index, @row, @lookup);";
await _context.Database.ExecuteSqlRawAsync(sql,
    new object[] { new NpgsqlParameter("index", paramObject.index), new NpgsqlParameter("row", paramObject.row), new NpgsqlParameter("lookup", paramObject.lookup) },
    cancellationToken);
The DbParameter names must match the parameter placeholders in the SQL text.
The correct syntax for ExecuteSqlInterpolatedAsync is:
await _context.Database.ExecuteSqlInterpolatedAsync(
    $@"INSERT INTO ""LookupRows"" (""Index"", ""RowData"", ""LookupId"")
       VALUES ({paramObject.index},{paramObject.row},{paramObject.lookup});",
    cancellationToken);
or
FormattableString sql = $@"INSERT INTO ""LookupRows"" (""Index"", ""RowData"", ""LookupId"")
                           VALUES ({paramObject.index},{paramObject.row},{paramObject.lookup});";
await _context.Database.ExecuteSqlInterpolatedAsync(sql, cancellationToken);
ExecuteSqlInterpolatedAsync will inspect the FormattableString and generate a new parameterized query, using the interpolation holes as positional parameters and their values as parameter values. It's the equivalent of:
var sql = @"INSERT INTO ""LookupRows"" (""Index"", ""RowData"", ""LookupId"")
            VALUES ({0},{1},{2});";
await _context.Database.ExecuteSqlRawAsync(sql, 1, "abc", 100);
Using ExecuteSqlInterpolatedAsync is rather risky because it's way too easy to forget to explicitly specify FormattableString and end up with:
var sql = $@"INSERT INTO ""LookupRows"" (""Index"", ""RowData"", ""LookupId"")
             VALUES ({paramObject.index},{paramObject.row},{paramObject.lookup});";
Which is just a string constructed from data, and once again vulnerable to SQL injection.
Inserting 100K items
Looks like the actual problem is inserting 100K rows. ORMs are the wrong tool for this job. In this case, instead of a single graph of objects there are 100K rows with no business logic.
Executing 100K INSERT statements will take a long time and flood the database's transaction log. The solution is to use SqlBulkCopy to insert the rows using bulk operations and minimal logging.
SqlBulkCopy.WriteToServer expects a DataTable or an IDataReader. To use an IEnumerable&lt;T&gt; we can use FastMember's ObjectReader to wrap it:
IEnumerable<SomeType> data = ...;
using (var bcp = new SqlBulkCopy(connection))
using (var reader = ObjectReader.Create(data, "Id", "Name", "Description"))
{
    bcp.DestinationTableName = "SomeTable";
    bcp.WriteToServer(reader);
}
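Note that SqlBulkCopy is specific to SQL Server. Since the original question targets PostgreSQL, the rough equivalent there is Npgsql's binary COPY. A minimal sketch, assuming connection is an open NpgsqlConnection and rows is an IEnumerable of tuples matching the question's columns:
// Bulk insert via PostgreSQL's COPY protocol (Npgsql)
using (var writer = connection.BeginBinaryImport(
    "COPY \"LookupRows\" (\"Index\", \"RowData\", \"LookupId\") FROM STDIN (FORMAT BINARY)"))
{
    foreach (var (index, rowData, lookupId) in rows)
    {
        writer.StartRow();
        writer.Write(index);
        writer.Write(rowData);
        writer.Write(lookupId);
    }
    // Complete() commits the import; disposing without calling it rolls the COPY back
    writer.Complete();
}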
Importing CSVs
To import CSV files one can use CsvHelper's CsvDataReader:
using (var reader = new StreamReader("path\\to\\file.csv"))
using (var csv = new CsvReader(reader, CultureInfo.InvariantCulture))
{
// Do any configuration to `CsvReader` before creating CsvDataReader.
using (var dr = new CsvDataReader(csv))
using(var bcp = new SqlBulkCopy(connection))
{
bcp.DestinationTableName = "SomeTable";
bcp.WriteToServer(dr);
}
}
Related
I am getting these records from a MySQL 8.0.17 database:
+-------------------------+
| TABLE_NAME |
+-------------------------+
| t_contents_s300_1_2021 |
| t_contents_s34d_1_2021 |
| t_contents_s34g_1_2021 |
| t_contents_s3sv_1_2021 |
+-------------------------+
and I used MySqlDataReader to read those records as follows
MySqlDataReader reader = cmd.ExecuteReader();
// in reader, I have records which comes from database.
while(reader.Read())
{
string [] arpp_pro = new string[] {reader["TABLE_NAME"].ToString()};
}
everything works fine...
But I need to assign these TABLE_NAME values to an array, so that for each value I can execute a single INSERT INTO query on a new table.
How do I solve this problem?
How can I get all the records from TABLE_NAME into an array?
Thanks in advance for any help.
I think you want to construct a list:
MySqlDataReader reader = cmd.ExecuteReader();
List<string> arpp_pro = new List<string>(); // define a list outside of the loop
while(reader.Read())
{
// for each row from the database, add the retrieved table name to the list
arpp_pro.Add(reader["TABLE_NAME"].ToString());
}
// code to do something with arpp_pro here.
I also recommend using the using keyword with your reader to ensure that it's closed/disposed when you are done with it. Example:
List<string> arpp_pro = new List<string>(); // define a list outside of the loop
using(MySqlDataReader reader = cmd.ExecuteReader())
{
while(reader.Read())
{
// for each row from the database, add the retrieved table name to the list
arpp_pro.Add(reader["TABLE_NAME"].ToString());
}
}
If you really need it as an array, you can call string[] arpp_pro_array = arpp_pro.ToArray(); to convert the list to an array. ToArray is an instance method on List&lt;T&gt;, so no extra using directive is needed.
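To then run one INSERT per table name from the list: table names cannot be passed as SQL parameters, so they have to be spliced into the SQL text. That is only safe here because the names come from the database itself rather than user input. A rough sketch, where connection is the same open MySqlConnection used for cmd, and new_table and the SELECT list are hypothetical:
foreach (string tableName in arpp_pro)
{
    // Identifiers can't be parameterized; quote them with backticks instead
    string sql = $"INSERT INTO new_table SELECT * FROM `{tableName}`;";
    using (var insertCmd = new MySqlCommand(sql, connection))
    {
        insertCmd.ExecuteNonQuery();
    }
}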
I want to get the max code (I mean the last id in the column) with a SQL query.
That's my code:
var qre = dbms.Database.SqlQuery<TCOD_BANKS>("SELECT MAX(CODE) FROM TCOD_BANKS");
var data_main11 = qre.ToList();
foreach (var x1 in qre)
{
var aasd = x1.CODE;
}
When I run my code I see this error:
System.Data.Entity.Core.EntityCommandExecutionException: 'The data reader is incompatible with the specified 'DENAF1399Model.TCOD_BANKS'. A member of the type, 'CODE', does not have a corresponding column in the data reader with the same name.'
My table is TCOD_BANKS and my id column is CODE (an int).
Please help me.
The query SELECT MAX(CODE) FROM TCOD_BANKS returns a single number, so you can't read the result set as items of type TCOD_BANKS. Read a single number from the result instead, like this:
var qre = dbms.Database.SqlQuery<int>("SELECT MAX(CODE) FROM TCOD_BANKS").First();
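One caveat: if TCOD_BANKS can be empty, MAX(CODE) returns NULL and reading an int will throw. In that case you can read a nullable int instead; a small sketch, using the same dbms context as above:
var maxCode = dbms.Database.SqlQuery<int?>("SELECT MAX(CODE) FROM TCOD_BANKS").First() ?? 0;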
I'm faced with a strange problem:
string sql = $#"SELECT * FROM SearchLogs
WHERE CHECKSUM(#query) = cs_SearchTerm
AND Website = #website";
return await Connection.QueryFirstOrDefaultAsync<SearchLog>(sql,
param: new { query, website });
The record 100% exists in the database, but I get null. cs_SearchTerm is a computed, nullable int field. Then I tried:
DECLARE @term nvarchar(500)
SET @term = @query
SELECT * FROM SearchLogs
WHERE CHECKSUM(@term) = cs_SearchTerm AND Website = @website
But I got the same result. Then I tried splitting it into two operations:
private async Task<int> SqlCheckSumAsync(string query)
{
string sql = #"SELECT CHECKSUM(#query)";
return await Connection.ExecuteScalarAsync<int>(sql, param: new { query }, transaction: Transaction);
}
string sql = $#"SELECT * FROM Search_Master
WHERE cs_SearchTerm = #checksum
AND Website = #website";
int checksum = await SqlCheckSumAsync(query);
return (await Connection.QueryFirstOrDefaultAsync<Search_Master>(sql,
param: new { checksum, website },
transaction: Transaction));
But still no positive result. I wonder what I'm doing wrong? Why can't I pass a parameter into a SQL scalar function?
From the comments, this works:
SELECT * FROM SearchLogs WHERE cs_SearchTerm = CHECKSUM('abc') AND Website = 'ICF'
So: that tells me that you computed your checksums using varchar inputs. This is very important, because CHECKSUM('abc') gives a completely different answer than CHECKSUM(N'abc'):
select checksum('abc') as [Ansi], checksum(N'abc') as [Unicode]
gives:
Ansi Unicode
----------- -----------
34400 1132495864
By default, Dapper uses nvarchar (because .NET strings are UTF-16). So we need to tell Dapper to pass the value as an ANSI string; fortunately this is simple:
return await Connection.ExecuteScalarAsync<int>(sql,
new { query = new DbString { Value = query, IsAnsi = true} },
transaction: Transaction);
Dapper's DbString type allows fine-grained control over how strings are sent, including both whether they are unicode or ANSI, and whether they are fixed width (and if so: what) or variable width.
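Applied to the original lookup, a sketch might look like this (the Length of 500 matches the nvarchar(500) declared earlier; everything else is taken from the question's code):
var result = await Connection.QueryFirstOrDefaultAsync<SearchLog>(
    @"SELECT * FROM SearchLogs
      WHERE CHECKSUM(@query) = cs_SearchTerm AND Website = @website",
    new { query = new DbString { Value = query, IsAnsi = true, Length = 500 }, website },
    transaction: Transaction);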
Is there any way to parameterize an SQL INSERT statement (in C#) which inserts multiple rows? Currently I can think of only one way to generate a statement for inserting multiple rows, but it is quite open to SQL injection:
string sql = " INSERT INTO my_table"
+ " (a, b, c)"
+ " VALUES";
// Add each row of values to the statement
foreach (var item in collection) {
sql = sql
+ String.Format(" ({0}, {1}, {2}),",
aVal, bVal, cVal);
}
// Remove the excessive comma
sql = sql.Remove(sql.Length - 1);
What is the smarter/safer way to do this?
You could add parameters inside the loop, like:
using (var comm = new SqlCommand()) {
    var counter = 0;
    foreach (var item in collection) {
        sql = sql + String.Format(" (@a{0}, @b{0}, @c{0}),", counter);
        comm.Parameters.AddWithValue("@a" + counter, aVal);
        comm.Parameters.AddWithValue("@b" + counter, bVal);
        comm.Parameters.AddWithValue("@c" + counter, cVal);
        counter++;
    }
}
But I really wouldn't do a multi-row insert like this. IIRC the maximum number of parameters in a query is about 2100, and this could get very big very fast. As you're looping through a collection anyway, you could just send each row to the database in your loop, something like:
using (var con = new SqlConnection("connectionString here"))
{
    con.Open();
    var sql = "INSERT INTO my_table (a, b, c) VALUES (@a,@b,@c);";
    using (var comm = new SqlCommand(sql, con))
    {
        comm.Parameters.Add("@a", SqlDbType.Int);
        comm.Parameters.Add("@b", SqlDbType.NVarChar);
        comm.Parameters.Add("@c", SqlDbType.Int);
        foreach (var item in collection)
        {
            comm.Parameters["@a"].Value = aVal;
            comm.Parameters["@b"].Value = bVal;
            comm.Parameters["@b"].Size = bVal.Length;
            comm.Parameters["@c"].Value = cVal;
            comm.ExecuteNonQuery();
        }
    }
}
The statement is prepared only once (and faster than a huge statement with 100's of parameters), and it doesn't fail all records when one record fails (add some exception handling for that). If you want to fail all when one record fails, you could wrap the thing up in a transaction.
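For the all-or-nothing variant, a minimal sketch using the same table and parameters as above:
using (var con = new SqlConnection("connectionString here"))
{
    con.Open();
    using (var tran = con.BeginTransaction())
    using (var comm = new SqlCommand("INSERT INTO my_table (a, b, c) VALUES (@a,@b,@c);", con, tran))
    {
        try
        {
            // add the three parameters and loop over the collection exactly as above
            tran.Commit();
        }
        catch
        {
            tran.Rollback();  // one failed row undoes all previous inserts
            throw;
        }
    }
}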
Edit:
Of course, when you regularly have to insert thousands of rows, this approach isn't the most efficient either, and your DBA might start to complain.
There are other approaches to this problem that remove the strain from the database: for example, create a stored procedure in your database that inserts the data from an XML document, or use Table-Valued Parameters.
NYCdotNet wrote 2 nice blogs about these options, which I won't recreate here, but they're worth exploring (I'll paste some code below from the blog, as per guidelines, but credit where it's due: NYCdotNet)
XML document approach
Table Valued Parameters
The "meat" from the blog about TVP (in VB.NET but that shouldn't matter):
So I created this "generic" table-valued type:
CREATE TYPE dbo.UniqueIntegerList AS TABLE
(
TheInteger INT NOT NULL
PRIMARY KEY (TheInteger)
);
Creating the Save Stored Procedure
Next, I created a new stored procedure which would accept my new
Table-Valued Type as a parameter.
CREATE PROC DoTableValuedParameterInsert(@ProductIDs
dbo.UniqueIntegerList READONLY)
AS BEGIN
INSERT INTO ProductsAccess(ProductID)
SELECT TheInteger AS [ProductID]
FROM @ProductIDs;
END
In this procedure, I am passing in a parameter called @ProductIDs.
This is of type "dbo.UniqueIntegerList" which I just created in the
previous step. SQL Server looks at this and says "oh I know what this
is - this type is actually a table". Since it knows that the
UniqueIntegerList type is a table, I can select from it just like I
could select from any other table-valued variable. You have to mark
the parameter as READONLY because SQL 2008 doesn't support updating
and returning a passed table-valued parameter.
Creating the Save Routine
Then I had to create a new save routine on my business object that
would call the new stored procedure. The way you prepare the
Table-Valued parameter is to create a DataTable object with the same
column signature as the Table-Valued type, populate it, and then pass
it inside a SqlParameter object as SqlDbType.Structured.
Public Sub SaveViaTableValuedParameter()
'Prepare the Table-valued Parameter'
Dim objUniqueIntegerList As New DataTable
Dim objColumn As DataColumn =
objUniqueIntegerList.Columns.Add("TheInteger", _
System.Type.GetType("System.Int32"))
objColumn.Unique = True
'Populate the Table-valued Parameter with the data to save'
For Each Item As Product In Me.Values
objUniqueIntegerList.Rows.Add(Item.ProductID)
Next
'Connect to the DB and save it.'
Using objConn As New SqlConnection(DBConnectionString())
objConn.Open()
Using objCmd As New SqlCommand("dbo.DoTableValuedParameterInsert")
objCmd.CommandType = CommandType.StoredProcedure
objCmd.Connection = objConn
objCmd.Parameters.Add("ProductIDs", SqlDbType.Structured)
objCmd.Parameters(0).Value = objUniqueIntegerList
objCmd.ExecuteNonQuery()
End Using
objConn.Close()
End Using
End Sub
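Since the rest of this thread is in C#, here is a sketch of the same call in C# (productIds stands in for the IDs collected from the business object; the type and stored procedure are the ones created above; requires System.Data and System.Data.SqlClient):
var table = new DataTable();
table.Columns.Add("TheInteger", typeof(int));
foreach (var id in productIds)
    table.Rows.Add(id);

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.DoTableValuedParameterInsert", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    // SqlDbType.Structured is how ADO.NET marks a table-valued parameter
    cmd.Parameters.Add("@ProductIDs", SqlDbType.Structured).Value = table;
    conn.Open();
    cmd.ExecuteNonQuery();
}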
I have a simple query where in the WHERE clause I've got something like:
WHERE ([Comment] LIKE '%something%'
OR [Comment] LIKE '%aaaa%'
OR [Comment] LIKE '%ccc')
AND Replace([Number], ' ', '') = @number
Unfortunately this is now hardcoded, so if anything changes (more ORs on the [Comment] field) I have to go and change it in the code. Is it possible to pass [Comment] as a parameter with multiple values (unknown to me)? I would create a SQL table with every comment pattern I want to use in the query, and users would be able to add new ones from within the program, so it would be taken care of automatically.
using (var varConnection = Locale.sqlConnectOneTime(Locale.sqlDataConnectionDetails))
using (var sqlQuery = new SqlCommand(preparedCommand, varConnection)) {
    sqlQuery.Parameters.AddWithValue("@number", varNumber);
    using (var sqlQueryResult = sqlQuery.ExecuteReader())
        while (sqlQueryResult.Read()) {
            string varID = sqlQueryResult["ID"].ToString();
        }
}
You can use table-valued parameters in SQL Server 2008.
For earlier versions, you could pass in an XML parameter and parse the data with XPath.
For good in depth analysis of the different options, read this article by Erland Sommarskog (SQL Server 2005) and the additional article about SQL Server 2008.
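As a rough sketch of how the table-valued parameter route could look for this exact query (the type dbo.CommentPatternList, the table name MyTable, and the column Pattern are all made up for illustration):
// One-time setup in SQL Server:
//   CREATE TYPE dbo.CommentPatternList AS TABLE (Pattern nvarchar(100) NOT NULL);
var patterns = new DataTable();
patterns.Columns.Add("Pattern", typeof(string));
patterns.Rows.Add("something");   // in practice, load these from your patterns table
patterns.Rows.Add("aaaa");

string sql = @"SELECT * FROM MyTable t
               WHERE EXISTS (SELECT 1 FROM @patterns p
                             WHERE t.[Comment] LIKE '%' + p.Pattern + '%')
               AND Replace(t.[Number], ' ', '') = @number";

using (var sqlQuery = new SqlCommand(sql, varConnection))
{
    var p = sqlQuery.Parameters.Add("@patterns", SqlDbType.Structured);
    p.TypeName = "dbo.CommentPatternList";   // required for ad-hoc SQL (not a stored proc)
    p.Value = patterns;
    sqlQuery.Parameters.AddWithValue("@number", varNumber);
    // ... ExecuteReader as in the question ...
}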
Even though the solution from Oded seems to be the best way to go (which I also didn't know until now), I currently build up the command and parameters automatically, like this (which doesn't exactly match your problem, since you're using LIKE):
// Some sql statement with a placeholder for all parameters (@ID0, @ID1, @ID2, ...)
string sql = "select * from table where ID in ( {0} )";
// Create a list of items: @ID0, 3; @ID1, 8; ...
var parameters = myList.Where(item => item.MatchesSomeCondition())
                       .Select((item, index) => new
                       {
                           Name = "@ID" + index,
                           Value = item.ID
                       });
// Add all parameters to the sqlCmd
foreach (var parameter in parameters)
{
    sqlCmd.Parameters.AddWithValue(parameter.Name, parameter.Value);
}
// Insert all @IDx into the sql statement
// Result: "select * from table where ID in ( @ID0, @ID1, @ID2, ... )"
sqlCmd.CommandText = String.Format(sql, String.Join(", ", parameters.Select(parameter => parameter.Name).ToArray()));