C# MySQL query every second

So, I have a program I'm working on where more than one client is connected to a MySQL database, and I want to keep them all current: when one client updates the database, the information in all clients should update. I'm new and still studying in college, so the only way I could think of was to add a column holding each record's update time and then run this check every second:
if (sqlHandler.userLastUpdate < sqlHandler.LastUpdate())
{
    // Loads the products from the database
    sqlHandler.ReadDB();
    // Sets this client's last update time to now so it doesn't keep refreshing
    sqlHandler.userLastUpdate = DateTimeOffset.UtcNow.ToUnixTimeSeconds();
}
public double LastUpdate()
{
    using var con = new MySqlConnection(CONNECTION_STRING);
    con.Open();

    string sql = "SELECT MAX(LastUpDate) FROM products";
    using var cmd = new MySqlCommand(sql, con);

    // ExecuteScalar is enough for a single value; guard against an empty
    // table, where MAX() returns NULL.
    object result = cmd.ExecuteScalar();
    return result is null || result is DBNull ? 0 : Convert.ToDouble(result);
}
This seems horribly inefficient. Is there a better way to do this? I've had a couple of clients running against the same MySQL server and it seemed to be fine, but it feels like there should be a better way.
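For what it's worth, the compare-and-reload logic can be separated from the database access, which makes it testable without a server and lets the polling query change later. A minimal sketch (the `UpdatePoller` name and the delegates are mine, not from the original code; `fetchLatest` would wrap `LastUpdate()` and `reload` would wrap `ReadDB()`):

```csharp
using System;

// Hypothetical helper: remembers the last-seen timestamp and reloads
// only when the database reports a newer one.
class UpdatePoller
{
    private double _lastSeen;
    private readonly Func<double> _fetchLatest; // e.g. a wrapper around LastUpdate()
    private readonly Action _reload;            // e.g. a wrapper around ReadDB()

    public UpdatePoller(Func<double> fetchLatest, Action reload)
    {
        _fetchLatest = fetchLatest;
        _reload = reload;
    }

    // Call this from a timer, e.g. once per second.
    public void Poll()
    {
        double latest = _fetchLatest();
        if (latest > _lastSeen)
        {
            _reload();
            // Store the database's timestamp rather than the client clock,
            // so clock skew between machines can't cause missed updates.
            _lastSeen = latest;
        }
    }
}

class Demo
{
    static void Main()
    {
        int reloads = 0;
        double dbTime = 0;
        var poller = new UpdatePoller(() => dbTime, () => reloads++);

        poller.Poll();   // nothing newer yet: no reload
        dbTime = 100;
        poller.Poll();   // newer timestamp: reloads once
        poller.Poll();   // unchanged: no extra reload

        Console.WriteLine(reloads); // prints 1
    }
}
```

Note that storing the server-side `MAX(LastUpDate)` instead of `DateTimeOffset.UtcNow` also sidesteps the case where the client clock runs ahead of the database.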

Related

Is it possible to use SqlDataAdapter in a Client/Server scenario on the server side?

First of all: we have an application that is built heavily around the legacy DataTable type. Because of this, we cannot switch to e.g. EF now; that will be a future project. In the meantime we need to build a new server-side REST-based solution as a replacement for the legacy server logic.
Problem: SqlDataAdapter.Update(DataTable) does not update the data in the database:
New records: get inserted successfully in DB
Modified records: above Update() method returns correct count, but the change is not in DB
Deleted records: above Update() method returns 0 count and therefore throws concurrency exception (which is by design of the data adapter and not correct here)
Supposed Cause: The DataTable is fetched by the server application at the request of a client, but is then transmitted to the client and back to the server before it gets written to the DB, so SqlDataAdapter does not seem to detect the changes properly:
Client requests data
Server fetches data from database
Data is transmitted serialized via REST to the client
Client works on data
Changed data is transmitted serialized via REST to server
Server instantiates a new instance of SqlDataAdapter and makes SqlDataAdapter.Update() on this received data
Data integrity:
the correct RowState of each record is present on the server side when it calls SqlDataAdapter.Update()
the client transmits changed records only to the server, for efficiency reasons
all of the tables have a PK
none of the tables have FK relations (this is/was the legacy design rule)
Is it possible to somehow achieve a (server-side) SqlDataAdapter.Update() on "foreign" data, or is this method designed only for direct (client) updates to the database the data originally came from?
Common Errors: Of course I heavily searched for this issue already and took care of correctly populating the SQL command properties.
Server-sided code part:
public override int[] SaveDataTable(IEnumerable<DataTable> dataTables)
{
    var counts = new Queue<int>();
    using (_connection = new SqlConnection(ConnectionString))
    {
        _connection.Open();
        var transaction = _connection.BeginTransaction();
        try
        {
            foreach (var table in dataTables)
            {
                //var command = new SqlCommand();
                var command = _connection.CreateCommand();
                using (command)
                {
                    command.Connection = _connection;
                    command.Transaction = transaction;
                    command.CommandText = Global.GetSelectStatement(table);
                    var dataAdapter = new SqlDataAdapter(command);
                    var cmdBuilder = new SqlCommandBuilder(dataAdapter);
                    dataAdapter.UpdateCommand = cmdBuilder.GetUpdateCommand();
                    dataAdapter.InsertCommand = cmdBuilder.GetInsertCommand();
                    dataAdapter.DeleteCommand = cmdBuilder.GetDeleteCommand();
                    //dataAdapter.SelectCommand = command;
                    //var dSet = new DataSet();
                    //dataAdapter.Fill(dSet);
                    //dataAdapter.Fill(table);
                    //dataAdapter.Fill(new DataTable());
                    //var clone = table.Copy();
                    //clone.AcceptChanges();
                    //dataAdapter.Fill(clone);
                    counts.Enqueue(dataAdapter.Update(table));
                }
            }
            transaction.Commit();
        }
        catch (Exception)
        {
            transaction.Rollback(); //this may throw also
            throw;
        }
    }
    return counts.ToArray();
}
OK, so the quest is solved. There was nothing wrong with the implementation of the SqlDataAdapter (except for the improvement advice from the comments, of course).
The problem was in the client application code, which always called AcceptChanges() to reduce the amount of data. Prior to sending changed data to the data access layer, the RowState of each row was "restored" with DataRow.SetModified(), etc.
This is what breaks SqlDataAdapter.Update().
Of course this is logical, as the original DataRowVersion is lost at that point. But it wasn't easy to identify.
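The failure mode can be reproduced without a database at all, since it is purely about DataRowVersion bookkeeping. A minimal sketch (table and column names are made up):

```csharp
using System;
using System.Data;

class Demo
{
    static void Main()
    {
        var table = new DataTable("products");
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));

        DataRow row = table.Rows.Add(1, "old");
        table.AcceptChanges();   // baseline: Original version is "old"

        row["Name"] = "new";     // RowState = Modified, Original still "old"
        table.AcceptChanges();   // what the client did: Original becomes "new"
        row.SetModified();       // restores RowState, but NOT the Original version

        // SqlCommandBuilder generates WHERE clauses that compare against the
        // Original version -- which no longer matches what is in the database.
        Console.WriteLine(row["Name", DataRowVersion.Original]); // prints "new", not "old"
    }
}
```

Once `AcceptChanges()` has run, the pre-edit values are unrecoverable; `SetModified()` only flips the RowState flag, which is why the adapter's UPDATE/DELETE statements stop matching rows.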

Why is my math not working on my SQL Server database?

I am developing an asp.net web application and I am trying to add a user xp system to it. I have a SQL Server database connected to it and I am trying to make a function that will give 5 experience points to the user.
I query the logged-in user, access the user_xp column, add 5 to the old session variable for XP, and then send that back to the database to be stored. Here is my code; I am not sure what is wrong with it.
void generateXp()
{
    try
    {
        SqlConnection con = new SqlConnection(strcon);
        if (con.State == ConnectionState.Closed)
        {
            con.Open();
        }
        SqlCommand cmd = new SqlCommand("UPDATE member_master_tbl SET user_xp = @user_xp WHERE " +
            "user_name = '" + Session["username"].ToString().Trim() + "'", con);
        int xp = 5;
        int current_xp = Convert.ToInt32(Session["user_xp"]);
        int new_xp = xp + current_xp;
        string new_xp2 = Convert.ToString(new_xp);
        cmd.Parameters.AddWithValue("user_xp", new_xp2);
    }
    catch (Exception ex)
    {
    }
}
Try renaming the SQL parameter to @user_xp:
cmd.Parameters.AddWithValue("@user_xp", new_xp2);
I don't have an accessible database to test against. Also, you need to execute the query at the end:
cmd.ExecuteNonQuery();
That being said, it's a good practice to learn to separate DB queries to stored procedures or functions.
As others noted, you simply forgot to call ExecuteNonQuery to run the command that you set up.
However, you can write things this way. You don't mention what data type the experience-points column is; I assumed int.
So, your code block can be written this way:
using (SqlCommand cmd = new SqlCommand("UPDATE member_master_tbl SET user_xp = @user_xp WHERE user_name = @user",
    new SqlConnection(strcon)))
{
    cmd.Parameters.Add("@user_xp", SqlDbType.Int).Value = 5 + Convert.ToInt32(Session["user_xp"]);
    cmd.Parameters.Add("@user", SqlDbType.NVarChar).Value = Session["username"];
    cmd.Connection.Open();
    cmd.ExecuteNonQuery();
}
Note how the command object carries its own connection object (so we don't need a separate one).
And while several people here "lamented" the string concatenation used to build the SQL and warned about SQL injection?
Actually, the introduction of @ parameters for both values cleans up the code. You get nice parameters with type checking, and you don't have to remember to put quotes around strings but not around numbers.
Note the explicit Convert.ToInt32 on the session value; session entries come back as object, so the conversion is needed before the arithmetic.
Also, the using block correctly cleans up the command object; ideally the connection created inline here would get its own using block so it is disposed as well.

Pulling in Access Database Entry using C#

I'm trying to figure out how to pull specific entry rows from an Access database into a C# program.
I'm working with a friend to make a sudoku game. We want to pull different levels of difficulty of puzzles from an access database and into a C# program.
Now my question is: is there a way to have the program pull specific rows from the database, or would we need to load them all into the program and then select from there? These would be put into a two-dimensional array.
What would be the best way to go about this?
I'm not sure what sudoku is, but I'm thinking that you need to query your Access DB. Something like this should get you started.
class BusLogic
{
    public List<string> ListboxItems = new List<string>();

    public void PopulateListBoxItems(string userName)
    {
        string connString = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Users\redgabanan\Desktop\Gabanan_Red_dbaseCon\Red_Database.accdb";
        using (OleDbConnection connection = new OleDbConnection(connString))
        {
            connection.Open();
            // OLE DB parameters are positional; the ? placeholder must not be quoted.
            OleDbCommand command = new OleDbCommand("SELECT * FROM Users WHERE LastName = ?", connection);
            command.Parameters.AddWithValue("?", userName);
            OleDbDataReader reader = command.ExecuteReader();
            while (reader.Read())
            {
                ListboxItems.Add(reader[1].ToString() + "," + reader[2].ToString());
            }
        }
    }
}
You could use a DataReader as well.
http://www.akadia.com/services/dotnet_data_reader.html
You definitely don't want to pull in all the data from a table; you need to query for just the subset you need.
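To get a fetched puzzle into the two-dimensional array the question mentions, one common approach is to store each puzzle as an 81-character string column and parse it after the query. A minimal sketch (the string format, the `PuzzleLoader` name, and the column layout are assumptions, not from the question):

```csharp
using System;

class PuzzleLoader
{
    // Parses an 81-character puzzle string ('0' or '.' = empty cell)
    // into a 9x9 grid, row by row.
    public static int[,] Parse(string puzzle)
    {
        if (puzzle.Length != 81)
            throw new ArgumentException("Expected 81 characters.", nameof(puzzle));

        var grid = new int[9, 9];
        for (int i = 0; i < 81; i++)
        {
            char c = puzzle[i];
            grid[i / 9, i % 9] = (c == '.' || c == '0') ? 0 : c - '0';
        }
        return grid;
    }
}

class Demo
{
    static void Main()
    {
        // First row filled in, the rest empty -- just to show the index mapping.
        string sample = "123456789" + new string('0', 72);
        int[,] grid = PuzzleLoader.Parse(sample);
        Console.WriteLine(grid[0, 2]); // prints 3
        Console.WriteLine(grid[1, 0]); // prints 0
    }
}
```

The SELECT would then filter server-side on a difficulty column (e.g. `SELECT Puzzle FROM Puzzles WHERE Difficulty = ?`), so only the rows actually needed come over the wire.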

Npgsql C# throwing Operation is not valid due to the current state of the object suddenly

I have a C# WinForms application and everything was working fine, but after prolonged use of the app I started getting this error:
Operation is not valid due to the current state of the object
The error comes from a function that executes every 5 seconds to get a list of names from the database:
NpgsqlCommand cmd = new NpgsqlCommand(
    String.Format("select pid,name from queue order by id"), conn);
NpgsqlDataReader reader = cmd.ExecuteReader();
while (reader.Read())
{
    queue[(int)reader["pid"]] = (string)reader["name"];
}
This list contains names in a queue and needs to be updated in as short a time as possible.
From what I have read, it seems like a new limitation in the .NET Framework.
Is there a better way to do this, or a workaround to avoid this error?
Edit: by the way, I don't understand the limitation! I added a function that inserts more than 100,000 entries into the database and I didn't get this error!
Do you dispose the reader and cmd after use? This could be leak-related, where the Postgres provider ends up running out of an internal resource after some time.
You should follow a using pattern like the one described on their homepage: http://www.npgsql.org/doc/
using (NpgsqlCommand cmd = new NpgsqlCommand(
    String.Format("select pid,name from queue order by id"), conn))
{
    using (NpgsqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            queue[(int)reader["pid"]] = (string)reader["name"];
        }
    }
}

Why does my application hang while trying to close a SqlConnection object?

I am trying to get column information in C# from a SQL table on SQL Server. I am following the example in this link: http://support.microsoft.com/kb/310107 My program strangely hangs when it tries to close the connection. If the connection is not closed, the program exits without any exceptions. Here's my code:
SqlConnection connection = new SqlConnection(@"MyConnectionString");
connection.Open();
SqlCommand command = new SqlCommand("SELECT * FROM MyTable", connection);
SqlDataReader reader = command.ExecuteReader(CommandBehavior.KeyInfo); // If this is changed to CommandBehavior.SchemaOnly, the program runs fast.
DataTable table = reader.GetSchemaTable();
Console.WriteLine(table.Rows.Count);
connection.Close(); // Alternatively If this line is commented out, the program runs fast.
Putting the SqlConnection inside a using block also causes the application to hang unless CommandBehavior.KeyInfo is changed to CommandBehavior.SchemaOnly.
using (SqlConnection connection = new SqlConnection(@"MyConnectionString"))
{
    connection.Open();
    SqlCommand command = new SqlCommand("SELECT * FROM MyTable", connection);
    SqlDataReader reader = command.ExecuteReader(CommandBehavior.KeyInfo); // If this is changed to CommandBehavior.SchemaOnly, the program runs fast even here in the using
    DataTable table = reader.GetSchemaTable();
    Console.WriteLine(table.Rows.Count);
}
The table in question has over 3 million rows, but since I am only obtaining the Schema information, I would think this wouldn't be an issue. My question is: Why does my application get stuck while trying to close a connection?
SOLUTION: Maybe this isn't optimal, but it does work; I inserted a command.Cancel(); statement right before Close is called on the connection:
SqlConnection connection = new SqlConnection(@"MyConnectionString");
connection.Open();
SqlCommand command = new SqlCommand("SELECT * FROM MyTable", connection);
SqlDataReader reader = command.ExecuteReader(CommandBehavior.KeyInfo); // If this is changed to CommandBehavior.SchemaOnly, the program runs fast.
DataTable table = reader.GetSchemaTable();
Console.WriteLine(table.Rows.Count);
command.Cancel(); // <-- This is it.
connection.Close(); // Alternatively If this line is commented out, the program runs fast.
I saw something like this, long ago. For me, it was because I did something like:
SqlCommand command = new SqlCommand("SELECT * FROM MyTable", connection);
SqlDataReader reader = command.ExecuteReader();
// here, I started looping, reading one record at a time
// and after reading, say, 100 records, I'd break out of the loop
connection.Close(); // this would hang
The problem is that the command appears to want to complete. That is, go through the entire result set. And my result set had millions of records. It would finish ... eventually.
I solved the problem by adding a call to command.Cancel() before calling connection.Close().
See http://www.informit.com/guides/content.aspx?g=dotnet&seqNum=610 for more information.
It looks right to me overall, and I think you need a little optimization. In addition to the above suggestion about avoiding the DataReader, I recommend using connection pooling. You can get the details from here:
http://www.techrepublic.com/article/take-advantage-of-adonet-connection-pooling/6107854
Could you try this?
DataTable dt = new DataTable();
using (SqlConnection conn = new SqlConnection("yourConnectionString"))
{
    SqlCommand cmd = new SqlCommand("SET FMTONLY ON; " + yourQueryString + "; SET FMTONLY OFF;", conn);
    conn.Open();
    dt.Load(cmd.ExecuteReader());
}
SET FMTONLY ON/OFF from MSDN seems the way to go
There is a specific way to do this, using SMO (SQL Server Management Objects).
You can get the collection of tables in the database, and then read the properties of the table you're interested in (columns, keys, and every imaginable property).
This is what SSMS uses to get and set properties of all database objects.
Look at this references:
Database.Tables Property
Table class
This is a full example of how to get table properties:
Retrieving SQL Server 2005 Database Info Using SMO: Database Info, Table Info
This will allow you to get all the possible information from the database in a very easy way. There are plenty of samples in VB.NET and C#.
I would try something like this. It ensures all items are cleaned up and avoids the DataReader, which you don't need unless you have unusually large amounts of data that would cause memory issues.
public void DoWork(string connectionstring)
{
    DataTable dt = new DataTable("MyData");
    using (var connection = new SqlConnection(connectionstring))
    {
        connection.Open();
        string commandtext = "SELECT * FROM MyTable";
        using (var adapter = new SqlDataAdapter(commandtext, connection))
        {
            adapter.Fill(dt);
        }
        connection.Close(); // redundant inside the using block, but harmless
    }
    Console.WriteLine(dt.Rows.Count);
}
