I am using System.Data.SQLite.Core nuget package in my asp.net core 3 app.
I have a SQLite DB with my table, assuming something like this:
+-----+------------+-----+
| ... | Phone | ... |
+-----+------------+-----+
| --- | 0123456789 | --- |
+-----+------------+-----+
| --- | 9876543210 | --- |
+-----+------------+-----+
I do a simple query
command = new SQLiteCommand("SELECT * FROM Table", connection);
var reader = await command.ExecuteReaderAsync();
and get data in my reader.
But when I try to map the values into an object, I can read and convert every value to its type except the phone value, which I can't read as a string even though I declared the column as a string in my DB. I can, however, get it with an explicit cast:
var p = reader[index2];
var t = p.GetType().Name; //--> String
var myobj = new MyObj(othervalues: reader.GetString(index1), phone: (string)reader[index2]);
That works, but why can't I simply do this?
... phone: reader.GetString(index2)...
And is there a way to do it?
Thanks for your answer!
Update:
So apparently changing the column definition from STRING to TEXT makes reader.GetString(index) work, and it also allows storing phone numbers like '0000000001' without dropping the leading zeros.
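For context, this matches SQLite's type affinity rules: a column declared STRING gets NUMERIC affinity, so '0123456789' can end up stored as the number 123456789 and GetString then fails, while a TEXT column keeps the value as text. A minimal sketch of the working setup, assuming a hypothetical Contacts table with a Phone TEXT column:
// Minimal sketch (hypothetical table/column names), assuming:
//   CREATE TABLE Contacts (Phone TEXT);   -- TEXT affinity preserves "0123456789"
using var connection = new SQLiteConnection("Data Source=contacts.db");
await connection.OpenAsync();

using var command = new SQLiteCommand("SELECT Phone FROM Contacts", connection);
using var reader = await command.ExecuteReaderAsync();

while (await reader.ReadAsync())
{
    // Works directly for a TEXT column; with a STRING column (NUMERIC affinity)
    // the stored value may come back as a number and GetString would throw.
    string phone = reader.GetString(0);
}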
I have trouble inserting into a PostgreSQL DB with the EF Core methods ExecuteSqlInterpolatedAsync/ExecuteSqlInterpolated.
Table schema:
Column | Type | Collation | Nullable | Default
----------+---------+-----------+----------+------------------------------------------
Id | integer | | not null | nextval('"LookupRows_Id_seq"'::regclass)
Index | integer | | not null |
RowData | text | | not null |
LookupId | integer | | not null |
Insert method:
FormattableString str = $"INSERT into \"LookupRows\" (\"Index\", \"RowData\", \"LookupId\") VALUES {q.First()};";
await _context.Database.ExecuteSqlRawAsync(str.ToString(),cancellationToken);
await _context.Database.ExecuteSqlInterpolatedAsync($"INSERT into \"LookupRows\" (\"Index\", \"RowData\", \"LookupId\") VALUES {q.First()};",cancellationToken);
q.First() = (0, '{{"0":["0","Bucharest","0"]}}', 115)
ExecuteSqlRawAsync works fine; the entry is inserted successfully.
But ExecuteSqlInterpolatedAsync always returns an error:
MessageText: syntax error at or near "$1"
I ran out of ideas. What am I doing wrong?
Neither option works.
In the first case the code is using plain old string manipulation to construct a SQL query, allowing SQL injection and conversion errors. If that q.First() is a string whose value is ('','',''); drop table Users;--, you'd end up with a dropped table.
In the second case the syntax is simply invalid. ExecuteSqlInterpolatedAsync turns each interpolation hole into one database parameter, so the entire q.First() string is sent as a single parameter and PostgreSQL effectively receives VALUES $1, which is why it reports a syntax error at or near "$1" instead of the three expected values.
The correct syntax for ExecuteSqlRawAsync is:
var paramObject = new { index = 1, row = "abc", lookup = 100 };
var sql = @"INSERT INTO ""LookupRows"" (""Index"", ""RowData"", ""LookupId"")
            VALUES ({0}, {1}, {2});";
await _context.Database.ExecuteSqlRawAsync(sql,
    new object[] { paramObject.index, paramObject.row, paramObject.lookup },
    cancellationToken);
The values are bound as positional database parameters, in the same order as the {0}, {1}, {2} placeholders.
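If you prefer named parameters with ExecuteSqlRawAsync, you can also pass DbParameter instances and reference them by name. A sketch assuming the Npgsql provider (NpgsqlParameter comes from the Npgsql package), since the question targets PostgreSQL:
var sql = @"INSERT INTO ""LookupRows"" (""Index"", ""RowData"", ""LookupId"")
            VALUES (@index, @row, @lookup);";
await _context.Database.ExecuteSqlRawAsync(sql,
    new object[]
    {
        // DbParameter instances are matched to the @placeholders by name
        new NpgsqlParameter("index", 1),
        new NpgsqlParameter("row", "abc"),
        new NpgsqlParameter("lookup", 100)
    },
    cancellationToken);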
The correct syntax for ExecuteSqlInterpolatedAsync is:
await _context.Database.ExecuteSqlInterpolatedAsync(
    @$"INSERT INTO ""LookupRows"" (""Index"", ""RowData"", ""LookupId"")
       VALUES ({paramObject.index},{paramObject.row},{paramObject.lookup});",
    cancellationToken);
or
FormattableString sql = @$"INSERT INTO ""LookupRows"" (""Index"", ""RowData"", ""LookupId"")
    VALUES ({paramObject.index},{paramObject.row},{paramObject.lookup});";
await _context.Database.ExecuteSqlInterpolatedAsync(sql, cancellationToken);
ExecuteSqlInterpolatedAsync will inspect the FormattableString and generate a new parameterized query using the string placeholders as positional parameters and their values as parameter values. It's the equivalent of:
var sql = @"INSERT INTO ""LookupRows"" (""Index"", ""RowData"", ""LookupId"")
            VALUES ({0}, {1}, {2});";
await _context.Database.ExecuteSqlRawAsync(sql, 1, "abc", 100);
Using ExecuteSqlInterpolatedAsync is rather risky because it's way too easy to forget to explicitly specify FormattableString and end up with:
var sql = @$"INSERT INTO ""LookupRows"" (""Index"", ""RowData"", ""LookupId"")
             VALUES ({paramObject.index},{paramObject.row},{paramObject.lookup});";
Which is just a string constructed from data, and once again vulnerable to SQL injection.
Inserting 100K items
Looks like the actual problem is inserting 100K rows. ORMs are the wrong tool for this job. In this case, instead of a single graph of objects there are 100K rows with no business logic.
Executing 100K INSERT statements will take a long time and flood the database's transaction log. The solution is to use SqlBulkCopy to insert the rows using bulk operations and minimal logging.
SqlBulkCopy.WriteToServer expects a DataTable or DataReader. To use an IEnumerable<T> we can use FastMember's ObjectReader to wrap it:
IEnumerable<SomeType> data = ...

using(var bcp = new SqlBulkCopy(connection))
using(var reader = ObjectReader.Create(data, "Id", "Name", "Description"))
{
    bcp.DestinationTableName = "SomeTable";
    bcp.WriteToServer(reader);
}
Importing CSVs
To import CSV files one can use CsvHelper's CsvDataReader:
using (var reader = new StreamReader("path\\to\\file.csv"))
using (var csv = new CsvReader(reader, CultureInfo.InvariantCulture))
{
    // Do any configuration to `CsvReader` before creating CsvDataReader.
    using (var dr = new CsvDataReader(csv))
    using (var bcp = new SqlBulkCopy(connection))
    {
        bcp.DestinationTableName = "SomeTable";
        bcp.WriteToServer(dr);
    }
}
I am getting these records from a MySQL 8.0.17 database:
+-------------------------+
| TABLE_NAME |
+-------------------------+
| t_contents_s300_1_2021 |
| t_contents_s34d_1_2021 |
| t_contents_s34g_1_2021 |
| t_contents_s3sv_1_2021 |
+-------------------------+
and I used MySqlDataReader to read those records as follows
MySqlDataReader reader = cmd.ExecuteReader();
// in reader, I have the records which come from the database.
while(reader.Read())
{
string [] arpp_pro = new string[] {reader["TABLE_NAME"].ToString()};
}
Everything works fine...
But I need to collect these values into the string[] arpp_pro array so that I can execute a single INSERT INTO query on a new table for each value from TABLE_NAME.
How can I solve this problem?
How can I get all the records from TABLE_NAME into an array?
Thanks in advance for any help
I think you want to construct a list:
MySqlDataReader reader = cmd.ExecuteReader();
List<string> arpp_pro = new List<string>(); // define a list outside of the loop
while(reader.Read())
{
// for each row from the database, add the retrieved table name to the list
arpp_pro.Add(reader["TABLE_NAME"].ToString());
}
// code to do something with arpp_pro here.
I also recommend using the using keyword with your reader to ensure that it's closed/disposed when you are done with it. Example:
List<string> arpp_pro = new List<string>(); // define a list outside of the loop
using(MySqlDataReader reader = cmd.ExecuteReader())
{
while(reader.Read())
{
// for each row from the database, add the retrieved table name to the list
arpp_pro.Add(reader["TABLE_NAME"].ToString());
}
}
If you really need it as an array, you can call string[] arpp_pro_array = arpp_pro.ToArray(); to convert the list to an array. List<T> has its own ToArray method, so no extra using directive is needed for this.
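Since the stated goal was to run an INSERT on a new table for each collected table name, here is a rough follow-up sketch. The destination table t_contents_archive is a made-up name, and because table names cannot be bound as query parameters, the sketch validates each name before embedding it in the SQL:
foreach (var tableName in arpp_pro)
{
    // Table names can't be passed as parameters, so only accept plain identifiers.
    if (!System.Text.RegularExpressions.Regex.IsMatch(tableName, @"^[A-Za-z0-9_]+$"))
        continue;

    string sql = $"INSERT INTO t_contents_archive SELECT * FROM `{tableName}`;";
    using (var insertCmd = new MySqlCommand(sql, connection))
    {
        insertCmd.ExecuteNonQuery();
    }
}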
I wrote this code:
SqlConnection conn = new SqlConnection("ConnectionString");
conn.Open();
SqlCommand cmd = new SqlCommand("select * from tablename", conn);
SqlDataReader dr = cmd.ExecuteReader();
DataTable dt = new DataTable();
dt.Load(dr);
But I get the exception shown below when I load the data into the DataTable, because the table has an XML column with very large values (102 MB).
Exception of type 'System.OutOfMemoryException' was thrown.
I'd be very grateful if someone could give me a solution for this exception.
Solutions to your problem:
Either normalize your database so that an appropriate relational model is stored instead of raw XML. That is how SQL works in its essentials, and hopefully it will solve your problem.
However, no matter how well your database is normalized, there are limits beyond which the data simply does not fit in available memory. If that is your case, you need to abandon the brute-force select * approach and re-implement it to fetch in batches: repeatedly fetch a batch of a fixed, predefined size, process it, mark it as processed somewhere, and continue.
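A rough sketch of that batch-by-batch idea using keyset paging, assuming the table has an increasing int key column named Id and that the XML column is called XmlColumn (both names are placeholders), reusing the conn connection from above:
const int batchSize = 50;
int lastId = 0;

while (true)
{
    using (var batchCmd = new SqlCommand(
        @"SELECT TOP (@batch) Id, XmlColumn
          FROM tablename
          WHERE Id > @lastId
          ORDER BY Id", conn))
    {
        batchCmd.Parameters.AddWithValue("@batch", batchSize);
        batchCmd.Parameters.AddWithValue("@lastId", lastId);

        int rows = 0;
        using (var batchReader = batchCmd.ExecuteReader())
        {
            while (batchReader.Read())
            {
                rows++;
                lastId = batchReader.GetInt32(0);
                // Process the row here instead of buffering everything in one DataTable.
            }
        }

        if (rows == 0) break; // no more rows to fetch
    }
}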
Here is a conceptual example.
The T-SQL shreds the XML data type column into a rectangular (relational) format.
The C# side of the equation will not have any problem.
SQL
-- DDL and sample data population, start
DECLARE @tbl TABLE (ID INT IDENTITY PRIMARY KEY, product VARCHAR(20), xmldata XML);
INSERT INTO @tbl (product, xmldata) VALUES
('vase', N'<root>
<r>red</r>
<r>blue</r>
<r>yellow</r>
</root>'),
('phone', N'<root>
<r>black</r>
<r>white</r>
</root>');
-- DDL and sample data population, end
SELECT id, product
, c.value('(./text())[1]', 'VARCHAR(20)') AS color
FROM @tbl CROSS APPLY xmldata.nodes('/root/r') AS t(c);
Output
+----+---------+--------+
| id | product | color |
+----+---------+--------+
| 1 | vase | red |
| 1 | vase | blue |
| 1 | vase | yellow |
| 2 | phone | black |
| 2 | phone | white |
+----+---------+--------+
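Since the answer says the C# side is straightforward, here is a minimal sketch of consuming the shredded resultset, assuming the shredding query targets a real table named dbo.tbl (rather than the table variable above) and that connectionString is defined:
var shredSql = @"SELECT id, product,
       c.value('(./text())[1]', 'VARCHAR(20)') AS color
FROM dbo.tbl CROSS APPLY xmldata.nodes('/root/r') AS t(c);";

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(shredSql, conn))
{
    conn.Open();
    using (var dr = cmd.ExecuteReader())
    {
        while (dr.Read())
        {
            int id = dr.GetInt32(0);
            string product = dr.GetString(1);
            string color = dr.GetString(2);
            // Each row is now a small, flat record instead of one huge XML value.
        }
    }
}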
We have in-house software and I'm trying to connect to it via OdbcDataAdapter. It's an in-house database as well.
I have managed to connect to the DB via Excel, but I'm having problems in C#.
I'm not sure how to phrase the table names correctly. I got it working in Excel by trial and error.
String from Excel:
queryString = "SELECT * FROM ADI.\"kzn-57 | 600 | Survey Disabled | Realtime\"";
ADI = databasename
Table = kz - 50 | 600 | Data Disabled | Realtime
I'm getting errors as below:
Unknown Table Name.
string connectionString = "dsn=int_db";
DataSet dataSet = new DataSet();
OdbcConnection connection =
new OdbcConnection(connectionString);
string queryString = "SELECT * FROM ADI.\"kz-50 | 0600 | Data Disabled | Realtime\"";
OdbcDataAdapter adapter =
new OdbcDataAdapter(queryString, connection);
connection.Open(); // Connection established ok
adapter.Fill(dataSet); // unknown tables error
Console.WriteLine(dataSet.GetXml());
I have this query:
string query = "SELECT afspraak_locatie FROM Afspraak WHERE date(datum) = '" + datum +"'";
The final query will look like this:
SELECT afspraak_locatie FROM Afspraak WHERE date(datum) = '2016-06-16'
When I execute the query in phpMyAdmin it returns the row, but when I do it in C# my MySqlDataReader is empty.
Here is the code I use for that:
MySqlCommand cmd1 = new MySqlCommand(query1, connection);
cmd1.CommandType = CommandType.Text;
using (MySqlDataReader reader1 = cmd1.ExecuteReader())
{
while (reader1.Read())
{
result1.Add(reader1.GetString(0));
}
reader1.Close();
}
cmd1.Cancel();
When this gets executed it throws a System.NullReferenceException on the while (reader1.Read()) part. Any solutions?
Schema and data loaded:
create table Afspraak
(
id int auto_increment primary key,
afspraak_locatie varchar(100) not null, -- just an example (we don't know your datatype)
datum datetime not null -- you said it was a datetime in a comment under your question
);
insert Afspraak (afspraak_locatie,datum) values
('Rome','2016-06-14 13:55:55'),
('London','2016-06-15 15:12:12'),
('Cairo','2016-06-16 07:00:33'),
('Boston','2016-06-17 01:30:00');
select * from afspraak;
+----+------------------+---------------------+
| id | afspraak_locatie | datum |
+----+------------------+---------------------+
| 1 | Rome | 2016-06-14 13:55:55 |
| 2 | London | 2016-06-15 15:12:12 |
| 3 | Cairo | 2016-06-16 07:00:33 |
| 4 | Boston | 2016-06-17 01:30:00 |
+----+------------------+---------------------+
GUI Layer:
private void button1_Click(object sender, EventArgs e)
{
myDB.FindThatRow("2016-06-16"); // get data
}
DB Layer:
public void FindThatRow(string theDate)
{ // or all those rows
//
using (MySqlConnection lconn = new MySqlConnection(connString))
{
lconn.Open();
using (MySqlCommand cmd = new MySqlCommand())
{ //
cmd.Connection = lconn;
cmd.CommandText = @"select id,afspraak_locatie FROM Afspraak WHERE date(datum) = @pTheDate";
cmd.Prepare();
cmd.Parameters.AddWithValue("@pTheDate", theDate);
using (MySqlDataReader rs = cmd.ExecuteReader())
{ //
while (rs.Read())
{
int qId = (int)rs.GetInt32("id");
string sViewIt = rs.GetString("afspraak_locatie");
}
}
}
}
}
It found the data.
Use the using blocks as recommended by everyone. Bind your parameters.
The reasons to steer toward parameter binding, rather than the string concatenation seen in your attempt, include the functionality that binding offers, as described in Configuring Parameters and Parameter Data Types and related topics. Concatenation also turns querying into the kind of mess seen in old PHP code, which is what pushed modern PHP usage toward parameter binding too.
Imagine how difficult and debug-intensive a large query with many concatenated values would be without bindings.
SQL injection attacks:
Parameter binding protects you from such attacks, unlike concatenation. Related questions and answers cover stored procedure usage as well.