Keep sequence value the same for a list of items - C#

In my orders table I have a PK, INVOICE ID, for each order item within the invoice,
and an FK, CUSTOMER ID. I want to keep the CUSTOMER ID sequence value the same for every item within the same invoice. The code below is doing the job, but I just want to know whether I am doing it the right way or whether there is a better way to do it.
string connstr = "Data Source=JDT; User Id=admin; password=admin;";
string seqcmdtxt = @"SELECT CUSTOMER_ID_SEQ.NEXTVAL AS CUSTSEQ FROM DUAL";
string insertcmdtxt = @"INSERT INTO ORDERS (ORDER_ID,
CUSTOMER_ID,
PRODUCT_ID,
QUANTITY,
UNIT_PRICE,
ORDER_STATUS,
NOTES,
CREATED_BY,
CREATED_ON,
UPDATE_BY,
UPDATE_ON)
VALUES
(ORDER_ID_SEQ.NEXTVAL, --ORDER_ID
:TB_INVOICE_ID, --CUSTOMER_ID
:DGV_PRODUCT_DESC, --PRODUCT_ID
:DGV_QUANTITY, --QUANTITY
:DGV_UNIT_PRICE, --UNIT_PRICE
NULL, --ORDER_STATUS
:DGV_NOTES, --NOTES
'SYSTEM', --CREATED_BY
SYSDATE, --CREATED_ON
NULL, --UPDATE_BY
NULL) --UPDATE_ON
RETURNING ORDER_ID INTO :OUT_ORDER_ID"; //~ Returning VALUES ~//
using (OracleConnection conn = new OracleConnection(connstr))
using (OracleCommand cmd = new OracleCommand(insertcmdtxt, conn))
{
try
{
conn.Open();
cmd.CommandText = seqcmdtxt;
OracleDataReader reader = cmd.ExecuteReader();
while (reader.Read())
{
TB_INVOICE_ID.Text = reader["CUSTSEQ"].ToString();
}
cmd.CommandText = insertcmdtxt;
for (int i = 0; i < DGV_INVOICE.Rows.Count; i++)
{
//~ refreshing parameters ~//
cmd.Parameters.Clear();
cmd.Parameters.Add(new OracleParameter("TB_INVOICE_ID", TB_INVOICE_ID.Text));
cmd.Parameters.Add(new OracleParameter("DGV_PRODUCT_DESC", DGV_INVOICE.Rows[i].Cells[1].Value));
cmd.Parameters.Add(new OracleParameter("DGV_QUANTITY", DGV_INVOICE.Rows[i].Cells[2].Value));
cmd.Parameters.Add(new OracleParameter("DGV_UNIT_PRICE", DGV_INVOICE.Rows[i].Cells[3].Value));
cmd.Parameters.Add(new OracleParameter("DGV_NOTES", DGV_INVOICE.Rows[i].Cells[5].Value));
cmd.Parameters.Add(":OUT_ORDER_ID", OracleDbType.Decimal, ParameterDirection.Output);
//cmd.Parameters.Add(":OUT_CUSTOMER_ID", OracleDbType.Decimal, ParameterDirection.Output);
cmd.ExecuteNonQuery();
}
TB_INVOICE_ID.Text = (cmd.Parameters[":OUT_ORDER_ID"].Value).ToString();
//TB_NOTES.Text = (cmd.Parameters[":OUT_CUSTOMER_ID"].Value).ToString();
}
catch (Exception EX)
{ MessageBox.Show(EX.Message, "error msg", MessageBoxButtons.OK, MessageBoxIcon.Error); }
}

A few pointers.
Consider separating the code that gets the Customer ID from code that adds the invoice rows
int GetCustomerId();
void AddInvoice(int customerId);
You seem to be storing the CUSTSEQ in what looks like a global variable (input box). I would consider storing that in a local variable:
string customerSequenceId = reader["CUSTSEQ"].ToString();
cmd.Parameters.Add(new OracleParameter("TB_INVOICE_ID", customerSequenceId));
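A minimal sketch of the first half of that separation might look like this (method name is illustrative; assumes ODP.NET and the CUSTOMER_ID_SEQ sequence from the question):

```csharp
using System;
using Oracle.ManagedDataAccess.Client;

static class InvoiceData
{
    // Fetch the next CUSTOMER_ID once, up front, and return it as a value.
    public static int GetCustomerId(OracleConnection conn)
    {
        using (var cmd = new OracleCommand(
            "SELECT CUSTOMER_ID_SEQ.NEXTVAL FROM DUAL", conn))
        {
            // Oracle NUMBER comes back as decimal; convert to int.
            return Convert.ToInt32(cmd.ExecuteScalar());
        }
    }
}
```

The caller can then pass the returned value into the insert loop as an ordinary int parameter instead of round-tripping it through TB_INVOICE_ID.Text.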

Typically you don't need to be too concerned with the order in which records in the database are stored. (It makes some difference with regard to performance, but from the perspective of your application it doesn't matter.) What matters is the order in which records are sorted when you retrieve them. And that can vary. In one scenario you might want them ordered by date, in another by quantity, etc.
So what matters first is your 'ORDER BY' clause in your SQL query. Even though a database might appear to return records in some predictable sequence, it's important to always specify the order unless it absolutely doesn't matter.
Then if you build collections like List<Order> from the results of that query in your application you'll want to also ensure that if the sequence matters, you also specify an .OrderBy or .OrderByDescending in your LINQ expressions. Many operations, like .Select, will always return records in their original sequence, so if you know the sort order when you start you don't need to specify it again and again with every operation. But if you're selecting distinct groups or joining, or passing collections from one function to another so that the order may be uncertain, then you may need to sort them again if the sort order matters.
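For example, with a hypothetical Order class, the explicit sort goes in at the point where order starts to matter, before the projection:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Order
{
    public int Id;
    public DateTime Created;
    public int Quantity;
}

class Demo
{
    static void Main()
    {
        var orders = new List<Order>
        {
            new Order { Id = 2, Created = new DateTime(2023, 1, 5), Quantity = 3 },
            new Order { Id = 1, Created = new DateTime(2023, 1, 1), Quantity = 7 },
        };

        // .Select preserves the incoming sequence, so sort first, then project.
        var idsByDate = orders
            .OrderBy(o => o.Created)
            .Select(o => o.Id)
            .ToList();

        Console.WriteLine(string.Join(",", idsByDate)); // 1,2
    }
}
```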


How do I delete an entire row from a database

How would I delete a row from a SQL database, either with stored procedures or without? Right now I have tried without, using a button press.
This is what I have so far; _memberId has been sent over from a different form from the database (for context).
private void btnDelete_Click(object sender, EventArgs e)
{
SqlCommand cmd = new SqlCommand();
cmd.Connection = Lib.SqlConnection;
cmd.CommandType = CommandType.Text;
cmd.CommandText = "Delete * From Members where MemberId = " + _memberId;
SqlDataAdapter adapter = new SqlDataAdapter();
adapter.DeleteCommand = cmd;
adapter.Fill(MembersDataTable); // Im fairly sure this is incorrect but i used it from old code
DialogResult = DialogResult.OK;
}
If you're trying to do a simple ADO.NET-based delete, then it would be something like this:
private void DeleteById(int memberId)
{
// or pull the connString from config somewhere
const string connectionString = "[your connection string]";
using (var connection = new SqlConnection(connectionString))
{
connection.Open();
using (var command = new SqlCommand("DELETE FROM Members WHERE MemberId = @memberId", connection))
{
command.Parameters.AddWithValue("@memberId", memberId);
command.ExecuteNonQuery();
}
}
}
Use parameters to prevent SQL injection.
There are essentially three main things I'm seeing...
One
You don't need the * in the query. DELETE affects the whole row, so there's no need to specify columns. So just something like:
DELETE FROM SomeTable WHERE SomeColumn = 123
Two
There's no need for a SqlDataAdapter here, all you need to do is execute the query. For example:
cmd.ExecuteNonQuery();
The "non query" is basically a SQL command which doesn't query data for results. Inserts, updates, and deletes are generally "non queries" in this context. What it would return is simply the number of rows affected, which you can use to double-check that it matches what you expect if necessary.
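If that double-check is useful, a sketch of it might look like this (assumes MemberId is unique, so exactly one row should go away):

```csharp
using System;
using System.Data.SqlClient;

static class MemberDeleter
{
    static void DeleteMember(SqlConnection conn, int memberId)
    {
        using (var cmd = new SqlCommand(
            "DELETE FROM Members WHERE MemberId = @memberId", conn))
        {
            cmd.Parameters.AddWithValue("@memberId", memberId);

            // ExecuteNonQuery returns the number of rows affected.
            int rowsAffected = cmd.ExecuteNonQuery();
            if (rowsAffected != 1)
            {
                // Zero usually means no member had that id; more than one
                // would indicate MemberId is not actually unique.
                throw new InvalidOperationException(
                    $"Expected to delete 1 row, deleted {rowsAffected}.");
            }
        }
    }
}
```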
Three
Don't do this:
cmd.CommandText = "Delete From Members where MemberId = " + _memberId;
This kind of string concatenation leads to SQL injection. While it looks intuitively like you're using _memberId as a query value, technically you're using it as executable code. It's less likely (though not impossible) to be a problem for numeric values, but it's a huge problem for string values because it means the user can send you any string and you'll execute it as code.
Instead, use query parameters. For example, you might do something like this:
cmd.CommandText = "Delete From Members where MemberId = @memberId";
cmd.Parameters.Add("@memberId", SqlDbType.Int);
cmd.Parameters["@memberId"].Value = _memberId;
This tells the database engine itself that the value is a value and not part of the executing query, and the database engine knows how to safely handle values.
You could use a DataAdapter, but since you aren't using a datatable, it's just easier to do it without like this:
var sql = "DELETE FROM Members WHERE MemberId=@MemberId";
using(var cmd = new SqlCommand(sql, Lib.SqlConnection))
{
cmd.Connection.Open();
cmd.Parameters.Add("@MemberId", SqlDbType.Int).Value = _memberId;
cmd.ExecuteNonQuery();
}
And if you are using Dapper, you can do this:
Lib.SqlConnection.Execute("DELETE FROM Members WHERE MemberId=@MemberId", new {MemberId=_memberId});
If you are still using DataTables, I would highly recommend you look into using this (or something like this) to simplify your database access. It'll make CRUD logic on a database a breeze, and your code will be a lot more maintainable because you can get rid of all the odd casting and boxing/unboxing, and reduce the chance of runtime bugs caused by the magic strings (column names) that are so common with DataTables. Once you start working with POCO classes, you'll hate having to use DataTables. That said, there are a few places where DataTables are a better solution (unknown data structures, etc.), but those are usually pretty rare.
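As a sketch of what the POCO approach looks like with Dapper (class and column names are assumed to mirror the Members table; requires the Dapper NuGet package):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using Dapper;

// Plain old CLR object; Dapper maps columns to properties by name.
public class Member
{
    public int MemberId { get; set; }
    public string Name { get; set; }
}

public static class MemberRepository
{
    // Typed query: no DataTable, no casting, no magic strings for values.
    public static IEnumerable<Member> GetByName(SqlConnection conn, string name)
    {
        return conn.Query<Member>(
            "SELECT MemberId, Name FROM Members WHERE Name = @Name",
            new { Name = name });
    }
}
```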

Optimize sqlite query and being able to compare object and database

I'm developing a front end for retro gaming. Upon boot I'm parsing a lot of xml files from another application so I end up with a List of "systems" each containing a list of "games". Parsing the xml data is really fast, but writing all this to the sqlite database is not. Currently it takes ~25 seconds (20.000 records) which may not be too bad, but hopefully I can get some ideas on how to make it even faster.
Ideally I want to be able to compare my object containing all the data with the sqlite database. If anything has changed in the xml files/parsed object, the data should be updated or deleted from the database. Is there a better way of solving this than doing an initial import and then dumping the whole database and comparing it to the object? Basically the whole import code backwards...
This is my current code:
public static void PopulateDatabase(List<RetroBoxSystem> systems)
{
using (SQLiteConnection con = new SQLiteConnection("Data Source=RetroBox.db;Version=3;"))
{
con.Open();
using (SQLiteTransaction tr = con.BeginTransaction())
{
using (SQLiteCommand cmd = con.CreateCommand())
{
foreach (var system in systems)
{
cmd.CommandText = @"INSERT OR IGNORE INTO systems(system_id, name)
VALUES ((SELECT system_id FROM systems WHERE name = @name), @name)";
cmd.Parameters.Add(new SQLiteParameter("@name", system.Name));
cmd.ExecuteNonQuery();
}
}
using (SQLiteCommand cmd = con.CreateCommand())
{
cmd.CommandText = @"INSERT OR IGNORE INTO games(game_id, system_id, name, description, cloneof, manufacturer, genre, rating, year)
VALUES ((SELECT game_id FROM games WHERE name = @name), (SELECT system_id FROM systems WHERE name = @system), @name, @description, @cloneof, @manufacturer, @genre, @rating, @year)";
foreach (var system in systems)
{
foreach (var g in system.GameList)
{
cmd.Parameters.Add(new SQLiteParameter("@system", system.Name));
cmd.Parameters.Add(new SQLiteParameter("@name", g.Name));
cmd.Parameters.Add(new SQLiteParameter("@description", g.Description));
cmd.Parameters.Add(new SQLiteParameter("@cloneof", g.CloneOf));
cmd.Parameters.Add(new SQLiteParameter("@manufacturer", g.Manufacturer));
cmd.Parameters.Add(new SQLiteParameter("@genre", g.Genre));
cmd.Parameters.Add(new SQLiteParameter("@rating", g.Rating));
cmd.Parameters.Add(new SQLiteParameter("@year", g.Year));
cmd.ExecuteNonQuery();
}
}
}
tr.Commit();
}
}
}
Your best bet would be the Entity Framework for comparing objects without coding everything yourself, however depending on your project type you might not have access to it (such as Windows Phone projects).
Your insert query seems pretty optimized, but maybe you can make it faster using async inserts with ConfigureAwait(false).
More details about ConfigureAwait here: How to massively improve SQLite Performance (using SqlWinRT)
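Separately, one thing worth checking in the games loop in the question: parameters are added on every iteration without being cleared, so they accumulate on the command. Creating each parameter once outside the loop and only updating its value per row is both correct and faster. A sketch of that shape (abbreviated to two columns for illustration; con and systems are the variables from the question, using System.Data.SQLite):

```csharp
using (SQLiteCommand cmd = con.CreateCommand())
{
    cmd.CommandText = @"INSERT OR IGNORE INTO games(system_id, name)
VALUES ((SELECT system_id FROM systems WHERE name = @system), @name)";

    // Create each parameter once, outside the loop.
    var pSystem = cmd.Parameters.Add("@system", System.Data.DbType.String);
    var pName = cmd.Parameters.Add("@name", System.Data.DbType.String);

    foreach (var system in systems)
    {
        foreach (var g in system.GameList)
        {
            // Only the values change per row.
            pSystem.Value = system.Name;
            pName.Value = g.Name;
            cmd.ExecuteNonQuery();
        }
    }
}
```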

Convert C# SQL Loop to Linq

I have a list called ListTypes that holds 10 types of products. Below, the stored procedure loops and gets every record for the product that is looping, and stores it in the list ListIds. This is killing my SQL box since I have over 200 users executing this constantly all day.
I know it's not good architecture to loop a SQL statement, but this is the only way I made it work. Any ideas how I can do this without looping? Maybe a LINQ statement? I have never used LINQ at this magnitude. Thank you.
protected void GetIds(string Type, string Sub)
{
LinkedIds.Clear();
using (SqlConnection cs = new SqlConnection(connstr))
{
for (int x = 0; x < ListTypes.Count; x++)
{
cs.Open();
SqlCommand select = new SqlCommand("spUI_LinkedIds", cs);
select.CommandType = System.Data.CommandType.StoredProcedure;
select.Parameters.AddWithValue("@Type", Type);
select.Parameters.AddWithValue("@Sub", Sub);
select.Parameters.AddWithValue("@TransId", ListTypes[x]);
SqlDataReader dr = select.ExecuteReader();
while (dr.Read())
{
ListIds.Add(Convert.ToInt32(dr["LinkedId"]));
}
cs.Close();
}
}
}
Not a full answer, but this wouldn't fit in a comment. You can at least update your existing code to be more efficient like this:
protected List<int> GetIds(string Type, string Sub, IEnumerable<int> types)
{
var result = new List<int>();
using (SqlConnection cs = new SqlConnection(connstr))
using (SqlCommand select = new SqlCommand("spUI_LinkedIds", cs))
{
select.CommandType = System.Data.CommandType.StoredProcedure;
//Don't use AddWithValue! Be explicit about your DB types
// I had to guess here. Replace with the actual types from your database
select.Parameters.Add("@Type", SqlDbType.VarChar, 10).Value = Type;
select.Parameters.Add("@Sub", SqlDbType.VarChar, 10).Value = Sub;
var TransID = select.Parameters.Add("@TransId", SqlDbType.Int);
cs.Open();
foreach(int type in types)
{
TransID.Value = type;
using (SqlDataReader dr = select.ExecuteReader())
{
while (dr.Read())
{
result.Add((int)dr["LinkedId"]);
}
}
}
}
return result;
}
Note that this way you only open and close the connection once. Normally in ADO.Net it's better to use a new connection and re-open it for each query. The exception is in a tight loop like this. Also, the only thing that changes inside the loop this way is the one parameter value. Finally, it's better to design methods that don't rely on other class state. This method no longer needs to know about the ListTypes and ListIds class variables, which makes it possible to (among other things) do better unit testing on the method.
Again, this isn't a full answer; it's just an incremental improvement. What you really need to do is write another stored procedure that accepts a table valued parameter, and build on the query from your existing stored procedure to JOIN with the table valued parameter, so that all of this will fit into a single SQL statement. But until you share your stored procedure code, this is about as much help as I can give you.
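A sketch of what the table-valued parameter call might look like from C# (the proc name spUI_LinkedIdsBulk and the server-side type are assumptions; the type would be created with something like CREATE TYPE dbo.IntList AS TABLE (TransId INT)):

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static class LinkedIdQueries
{
    // Pass all TransIds to the server in one round trip via a TVP.
    public static List<int> GetLinkedIds(
        string connStr, string type, string sub, IEnumerable<int> transIds)
    {
        // Load the ids into a DataTable matching the dbo.IntList shape.
        var tvp = new DataTable();
        tvp.Columns.Add("TransId", typeof(int));
        foreach (int id in transIds) tvp.Rows.Add(id);

        var result = new List<int>();
        using (var cs = new SqlConnection(connStr))
        using (var cmd = new SqlCommand("spUI_LinkedIdsBulk", cs)) // hypothetical proc
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@Type", SqlDbType.VarChar, 10).Value = type;
            cmd.Parameters.Add("@Sub", SqlDbType.VarChar, 10).Value = sub;
            var p = cmd.Parameters.Add("@TransIds", SqlDbType.Structured);
            p.TypeName = "dbo.IntList";
            p.Value = tvp;

            cs.Open();
            using (var dr = cmd.ExecuteReader())
                while (dr.Read())
                    result.Add((int)dr["LinkedId"]);
        }
        return result;
    }
}
```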
Besides the improvements others wrote.
You could insert your ID's into a temp table and then make one
SELECT * from WhatEverTable WHERE transid in (select transid from #tempTable)
On MSSQL this works really fast.
If you're not using MSSQL, it's possible that one big SQL SELECT with joins is faster than a SELECT ... IN. You have to test these cases yourself on your DBMS.
According to your comment:
The idea is lets say I have a table and I have to get all records from the table that has this 10 types of products. How can I get all of this products? But this number is dynamic.
So... why use a stored procedure at all? Why not query the table?
//If [Type] and [Sub] arguments are external inputs - as in, they come from a user request or something - they should be sanitized. (remove or escape '\' and apostrophe signs)
//create connection
string queryTmpl = "SELECT LinkedId FROM [yourTable] WHERE [TYPE] = '{0}' AND [SUB] = '{1}' AND [TRANSID] IN ({2})";
string query = string.Format(queryTmpl, Type, Sub, string.Join(", ", ListTypes));
SqlCommand select = new SqlCommand(query, cs);
//and so forth
To use Linq-to-SQL you would need to map the table to a class. This would make the query simpler to perform.

Sql query: use where in or foreach?

I'm using query, where the piece is:
...where code in ('va1','var2'...')
I have about 50k of this codes.
It was working when I had 30k codes, but now I get:
The query processor ran out of internal resources and could not produce a query plan. This is a rare event and only expected for extremely complex queries or queries that reference a very large number of tables or partition
I think the problem is related to IN...
So now I'm planning to use foreach (string code in codes) with
...where code = @code
Is it a good idea?
I suggest you create a temporary table containing all the codes you want to match, and join against that instead.
However, we haven't really seen enough of your code to comment for sure.
First, create a temp table, call it #tmp0, with just one column. Then:
SqlConnection conn = new SqlConnection(connection_string);
SqlCommand cmd = new SqlCommand("INSERT INTO #tmp0 (code) VALUES (@code)", conn);
conn.Open();
foreach (string s in bigListOfCodes)
{
cmd.Parameters.Clear();
cmd.Parameters.AddWithValue("@code", s);
cmd.ExecuteNonQuery();
}
cmd = new SqlCommand("SELECT * FROM tbl " +
"WHERE code IN (SELECT CODE FROM #tmp0)", conn);
SqlDataReader rs = cmd.ExecuteReader();
while (rs.Read())
{
/* do stuff */
}
cmd = new SqlCommand("DROP TABLE #tmp0", conn);
cmd.ExecuteNonQuery();
conn.Close();
I know this seems like a lot of work for the server, but it's really pretty fast.
I'm not sure where you got those 50k values, but if it is from another query, just JOIN to this table and get all the data at one time from one query.

How do I count the number of rows returned in my SQLite reader in C#?

I'm working in Microsoft Visual C# 2008 Express and with SQLite.
I'm querying my database with something like this:
SQLiteCommand cmd = new SQLiteCommand(conn);
cmd.CommandText = "select id from myTable where word = '" + word + "';";
cmd.CommandType = CommandType.Text;
SQLiteDataReader reader = cmd.ExecuteReader();
Then I do something like this:
if (reader.HasRows == true) {
while (reader.Read()) {
// I do stuff here
}
}
What I want to do is count the number of rows before I do "reader.Read()" since the number returned will affect what I want/need to do. I know I can add a count within the while statement, but I really need to know the count before.
Any suggestions?
The DataReader runs lazily, so it doesn't pick up the entirety of the rowset before beginning. This leaves you with two choices:
Iterate through and count
Count in the SQL statement.
Because I'm more of a SQL guy, I'll do the count in the SQL statement:
cmd.CommandText = "select count(id) from myTable where word = '" + word + "';";
cmd.CommandType = CommandType.Text;
int RowCount = 0;
RowCount = Convert.ToInt32(cmd.ExecuteScalar());
cmd.CommandText = "select id from myTable where word = '" + word + "';";
SQLiteDataReader reader = cmd.ExecuteReader();
//...
Note the difference between count(id) and count(*): count(id) ignores rows where id is NULL, while count(*) counts every row. If you have no null ids, then use count(id) (it's a tad bit faster, depending on your table size).
Update: Changed to ExecuteScalar, and also count(id), based on comments.
What you request is not feasible -- to quote Igor Tandetnik, my emphasis:
SQLite produces records one by one, on request, every time you call sqlite3_step.
It simply doesn't know how many there are going to be, until on some sqlite3_step
call it discovers there are no more.
(sqlite3_step is the function in SQLite's C API that the C# interface is calling here for each row in the result).
You could rather do a "SELECT COUNT(*) from myTable where word = '" + word + "';" first, before your "real" query -- that will tell you how many rows you're going to get from the real query.
Do a second query:
cmd.CommandText = "select count(id) from myTable where word = '" + word + "';";
cmd.CommandType = CommandType.Text;
SQLiteDataReader reader = cmd.ExecuteReader();
Your reader will then contain a single row with one column containing the number of rows in the result set. The count will have been performed on the server, so it should be nicely quick.
If you are only loading an id column from the database, would it not be easier to simply load into a List<string> and then work from there in memory?
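That suggestion might look like this sketch (parameterized, using System.Data.SQLite; conn is an open connection):

```csharp
using System.Collections.Generic;
using System.Data.SQLite;

static class WordLookup
{
    static List<long> LoadIds(SQLiteConnection conn, string word)
    {
        var ids = new List<long>();
        using (var cmd = new SQLiteCommand(
            "SELECT id FROM myTable WHERE word = @word", conn))
        {
            cmd.Parameters.AddWithValue("@word", word);
            using (SQLiteDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    ids.Add((long)reader["id"]); // SQLite INTEGER reads back as long
            }
        }
        // ids.Count is now the row count, and the values are in memory to work with.
        return ids;
    }
}
```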
Normally i would do
"select count(1) from myTable where word = '" + word + "';"
to get the result as fast as possible. In the case where id is an int then it won't make much difference. If it was something a bit bigger like a string type then you'll notice a difference over a large dataset.
My reasoning is that count(1) will include the null rows, but I'm prepared to be corrected if I'm wrong about that.
Try this,
SQLiteCommand cmd = new SQLiteCommand(conn);
cmd.CommandText = "select id from myTable where word = '" + word + "';";
SQLiteDataReader reader = cmd.ExecuteReader();
while (reader.Read()) { }
int total_rows_in_resultset = reader.StepCount;
total_rows_in_resultset gives you the number of rows in the result set after processing the query.
Remember that if you want to use the same reader again, close this reader and start it again.
Here is my full implementation in a static method.
You should be able to plug this into your class (replace _STR_DB_FILENAME & STR_TABLE_NAME with your database file name and table name).
/// <summary>
/// Returns a count of the number of items in the database.
/// </summary>
/// <returns></returns>
public static int GetNumberOfItemsInDB()
{
// Initialize the count variable to be returned
int count = 0;
// Start connection to db
using (SqliteConnection db =
new SqliteConnection("Filename=" + _STR_DB_FILENAME))
{
// open connection
db.Open();
SqliteCommand queryCommand = new SqliteCommand();
queryCommand.Connection = db;
// Use parameterized query to prevent SQL injection attacks
queryCommand.CommandText = "SELECT COUNT(*) FROM " + _STR_TABLE_NAME;
// Execute command and convert response to Int
count = Convert.ToInt32(queryCommand.ExecuteScalar());
// Close connection
db.Close();
}
// return result(count)
return count;
}
Note: To improve performance, you can replace * in "SELECT COUNT(*)..." with the column name of the primary key in your table for much faster performance on larger datasets.
but I really need to know the count before
Why is that? This is usually not necessary if you use adequate in-memory data structures (DataSet, List...). There is probably a way to do what you want that doesn't require counting the rows beforehand.
You do have to count with select count... from...
This will make your application slower. However there is an easy way to make your app faster and that way is using parameterized queries.
See here: How do I get around the "'" problem in sqlite and c#?
(So besides speed parameterized queries have 2 other advantages too.)
SQLiteCommand cmd = new SQLiteCommand(conn);
cmd.CommandText = "select id from myTable where word = '" + word + "';";
SQLiteDataReader reader = cmd.ExecuteReader();
while (reader.Read())
{
total_rows_in_resultset++;
}
Surely a better way to get a row count would be something like this:-
SQLiteDataReader reader = SendReturnSQLCommand(dbConnection, "SELECT COUNT(*) AS rec_count FROM table WHERE field = 'some_value';");
if (reader.HasRows) {
reader.Read();
int count = Convert.ToInt32(reader["rec_count"].ToString());
...
}
That way you don't have to iterate over the rows
