Inserting not committed to database - c#

I'm having some trouble getting my DataSet to work.
I have an MDB database in the background and created a DataSet from it. Now I've created a new method that is supposed to add a new user to the table.
But when I call it, nothing happens. No exceptions, no errors, and I even get "1" back as the number of affected rows. But when I look in the database, no user has been added.
I have the feeling I'm missing the step that tells the DataSet to operate on the database itself rather than just on its in-memory copy... How can I achieve this?
// DataSet1 uses the connection to the Database.mdb
// Created by the Designer
// The Users table has 3 columns, id, name and password
DataSet1 set = new DataSet1();
UsersTableAdapter adap = new UsersTableAdapter();
DataSet1.UsersRow row = set.Users.AddUsersRow("asd", "asd");
int count = adap.Insert("das", "das");
MessageBox.Show(row.RowState.ToString() + ": " + count.ToString());
count = adap.Update(set.Users);
set.AcceptChanges();
MessageBox.Show(row.RowState.ToString() + ": " + count.ToString());
// The Messagebox shows: "Added: 1"
// The second one: "Unchanged: 1"
MessageBox.Show(set.Users.Rows.Count.ToString());
// Returns "2"...

Not knowing too much about your code, try reading this how-to from MSDN:
http://msdn.microsoft.com/en-us/library/ms233812(v=VS.80).aspx
Update: with data sets, what would normally happen (assuming the data set has been populated) is that rows are added, edited, or deleted in code. Each of those rows has a corresponding RowState marking it as added, modified, deleted, or unchanged. During .Update, the associated table adapter, complete with its insert/update/delete commands, iterates the rows and executes the appropriate command on each row depending on its state. Once the rows have been iterated, .AcceptChanges is called, resetting the row states to their default.
If you call AcceptChanges before updating, then nothing will happen: the RowState of each row is lost, so .Update no longer has the information it needs to push the changes to the database. If you have custom code wrapping it all in a transaction, you also need to make sure you commit the transaction.
In your example, you seem to imply you are using the table adapter directly rather than going through the data set. I would advise against this - the generated code places the methods you should use on the tables in the data set.
Update 2: the code looks OK to me, though you don't need to call Insert on the adapter, or AcceptChanges after the update (the latter is done automatically). This leads me to believe there is a bug in your insert command's SQL - try extracting the SQL used by the command and running it manually against the database.
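One way to get at that SQL, if you are using the designer-generated typed TableAdapter, is to expose the underlying adapter from a partial class. The sketch below is only an assumption based on the usual designer-generated code; the namespace is a guess and must match the one in your generated DataSet1.Designer.cs:

namespace YourProject.DataSet1TableAdapters
{
    public partial class UsersTableAdapter
    {
        // The generated Adapter property is not public, so a partial class of the
        // same TableAdapter can read the commands the designer built for it.
        public string InsertCommandText
        {
            get { return this.Adapter.InsertCommand.CommandText; }
        }
    }
}

You can then show or log InsertCommandText and run it by hand against the .mdb file to see whether it behaves as expected.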
Update 3: the following code works fine for me:
static void Main(string[] args)
{
    db1DataSet set = new db1DataSet();
    set.Users.AddUsersRow("asd", "asd");

    foreach (DataRow row in set.Users.Rows)
    {
        object foo = row.RowState; // Confirm row state in debugger.
    }

    UsersTableAdapter adap = new UsersTableAdapter();
    adap.Update(set.Users);

    Console.Read();
}


How to read all new rows from database?

I am trying to read, on a timer, all new rows that are added to the database.
First I read the entire database and save it to a local data table, but after that I only want to read the rows that have been newly added. Here is how I'm trying to read the new rows:
string accessDB1 = string.Format("SELECT * FROM {0} ORDER BY ID DESC", tableName);
setupaccessDB(accessDB1);
int dTRows = localDataTable.Rows.Count + 1;
localDataTable.Rows.Add();
using (readNext = command.ExecuteReader())
{
    while (readNext.Read())
    {
        for (int xyz = 0; xyz < localDataTable.Columns.Count; xyz++)
        {
            // Code
        }
        break;
    }
}
If only one row is added between timer ticks this works fine, but when multiple rows are added it only reads the latest one.
So is there any way I can read all of the added rows?
I am using OleDbDataReader.
Thanks in advance.
For most tables the primary key is based on an incrementing value. This can be a very simple integer that is incremented by one, but it could also be a datetime-based guid.
Anyway, if you know the id of the last record, you can simply ask for all records that have a 'higher' id. That way you get the new records, but what about updated records? If you also want those, you might want to use a column that contains a datetime value.
A little trickier are records that are deleted from the database. You can't retrieve those with a basic query. You could solve that by setting a TTL for each record you retrieve from the database, much like a cache. When a record 'expires', you try to retrieve it again.
Some databases, like Microsoft SQL Server, also provide more advanced options in this regard. You can use query notifications via Service Broker or enable change tracking on your database. The latter can even tell you what the last action was per record (insert, update or delete).
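A minimal sketch of the incremental-id idea, using the OleDb classes the question mentions. The table name "MyTable" and the integer "ID" column are assumptions, and lastSeenId would be initialised from the first full read:

// Sketch: fetch only rows added since the last poll, assuming an
// auto-incrementing integer "ID" column. "MyTable" is a placeholder name.
private int lastSeenId;   // set this to the highest ID seen in the initial full read

private void ReadNewRows(OleDbConnection conn, DataTable localDataTable)
{
    const string sql = "SELECT * FROM MyTable WHERE ID > ? ORDER BY ID";
    using (OleDbCommand cmd = new OleDbCommand(sql, conn))
    {
        cmd.Parameters.AddWithValue("?", lastSeenId);
        using (OleDbDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                object[] values = new object[reader.FieldCount];
                reader.GetValues(values);          // copy every column of this row
                localDataTable.Rows.Add(values);   // append to the local table
                lastSeenId = reader.GetInt32(reader.GetOrdinal("ID"));
            }
        }
    }
}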
Your immediate problem lies here:
while (readNext.Read())
{
    doSomething();
    break;
}
This is what your loop basically boils down to. That break is going to exit the loop after processing the first item, regardless of how many items there are.
The first item, in this case, will probably be the last one added (as you state it is) since you're sorting by descending ID.
In terms of reading only newly added rows, there are a variety of ways to do it, some which will depend on the DBMS that you're using.
Perhaps the simplest and most portable would be to add an extra column processed which is set to false when a row is first added.
That way, you can simply have a query that looks for those records and, for each, process them and set the column to true.
In fact, you could use triggers to do this (force the flag to false on insertion) which opens up the possibility for doing it with updates as well.
Tracking deletions is a little more difficult but still achievable. You could have a trigger which actually writes the record to a separate table before deleting it so that your processing code has access to those details as well.
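As a rough sketch of the 'processed' flag approach with OleDb (as in the question); the table name, ID column, and processed column are assumed names, and the boolean literal syntax depends on the DBMS behind the connection:

// Sketch: poll for unprocessed rows, handle them, then flag them as done.
// "MyTable", "ID" and "processed" are placeholder names, not from the question.
private void ProcessNewRows(OleDbConnection conn)
{
    List<int> handledIds = new List<int>();

    using (OleDbCommand select = new OleDbCommand(
        "SELECT * FROM MyTable WHERE processed = false", conn))
    using (OleDbDataReader reader = select.ExecuteReader())
    {
        while (reader.Read())
        {
            // ... copy the row into the local DataTable / process it ...
            handledIds.Add(reader.GetInt32(reader.GetOrdinal("ID")));
        }
    }

    foreach (int id in handledIds)
    {
        using (OleDbCommand update = new OleDbCommand(
            "UPDATE MyTable SET processed = true WHERE ID = ?", conn))
        {
            update.Parameters.AddWithValue("?", id);
            update.ExecuteNonQuery();    // mark the row so it is not picked up again
        }
    }
}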
The following works:
using (OleDbDataReader readNext = command.ExecuteReader())
{
    while (readNext.Read())
    {
        int columnCount = readNext.FieldCount;
        // Note: starting at 1 skips the first column (presumably the ID).
        for (int s = 1; s < columnCount; s++)
        {
            var nextValue = readNext.GetValue(s);
            // ... copy nextValue into the local DataTable ...
        }
    }
}
The for loop reads the columns of the current row, and the while loop then moves on to the next row.

Database changes are not pushing to server

I have a WinForms application that uses the data source and typed DataSet generated by the IDE.
This is the code block I am using:
dsParcelBatch.BC_cpo_PARCELRow pr = dsParcelBatch.BC_cpo_PARCEL.FindByISN(int.Parse(activeParcelID));
pr.BeginEdit();
pr.NODE_ISN = 6;
pr.EndEdit();
pr.AcceptChanges();
dsParcelBatch.AcceptChanges();
I can read the correct row in the first line, and it populates pr with the correct values.
I call BeginEdit, and dsParcelBatch has not updated yet.
I change the value of NODE_ISN to the new value, and it sticks.
I end the edit and accept the changes on the row.
I can look into the data source (dsParcelBatch) and the changes are in there - yea!
I check dsParcelBatch, and the changed value is there...
but when I view the database, the value is back to the original value (5).
What am I missing?
Before calling AcceptChanges you need to use a DataAdapter to Update the database.
AcceptChanges only changes the state of the rows in the DataTable, not in the database.
MSDN:
When AcceptChanges is called, any DataRow object still in edit mode
successfully ends its edits. The DataRowState also changes: all Added
and Modified rows become Unchanged, and Deleted rows are removed.
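Applied to the code in the question, that means pushing the edit through the generated table adapter before (or instead of) calling AcceptChanges. A sketch, where the adapter type name is a guess based on the usual designer naming:

dsParcelBatch.BC_cpo_PARCELRow pr = dsParcelBatch.BC_cpo_PARCEL.FindByISN(int.Parse(activeParcelID));
pr.BeginEdit();
pr.NODE_ISN = 6;
pr.EndEdit();                                    // the row is now in the Modified state

// Write the change to the database via the generated adapter
// (type name assumed from the usual <DataSetName>TableAdapters convention).
var adapter = new dsParcelBatchTableAdapters.BC_cpo_PARCELTableAdapter();
adapter.Update(dsParcelBatch.BC_cpo_PARCEL);

// No explicit AcceptChanges is needed afterwards; Update resets the row states itself.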

Issue with Simultaneous users making changes in a databound DataGridView

The current situation is that there is a databound DataGridView that many people need to make changes to simultaneously on a daily basis. The grid has to be able to save and update, but it is not working correctly because it is so heavily used.
Here's the scenario:
3 people open the grid form at the same time.
They make changes to 3 different rows of data.
Person 1 saves changes and is successful.
Person 2 saves changes and is successful, but person 1's changes are now gone since person 2's grid was not synced with the data that person 1 just submitted.
Person 3 saves changes and wipes out anything persons 1 and 2 did because person 3's data grid view was not synced with the updated data.
I tried this approach first:
private MySqlDataAdapter da;
private MySqlConnection conn;
BindingSource bs = new BindingSource();
DataSet ds = null;
string qry;
string ConnString = System.Configuration.ConfigurationManager.AppSettings["ConnectionString"];

// THE LOAD METHOD
private void LoadDataToGrid(string srcTable, string query)
{
    conn = new MySqlConnection(ConnString);
    // ADD ANY QUERY
    qry = query;
    da = new MySqlDataAdapter(qry, conn);
    conn.Open();
    ds = new DataSet();
    MySqlCommandBuilder cb = new MySqlCommandBuilder(da);
    // USE TABLE NAME
    da.Fill(ds, srcTable);
    // USE TABLE NAME
    bs.DataSource = ds.Tables[srcTable];
    dataGridView1.DataSource = bs;
    // CUSTOMIZE GRID
    txtRows.Text = dataGridView1.Rows.Count.ToString();
    dataGridView1.AutoResizeColumns();
    dataGridView1.AllowUserToDeleteRows = false;
}

// THE SAVE METHOD
private void SaveDataFromGrid(string srcTable)
{
    // USE TABLE NAME
    DataTable dt = ds.Tables[srcTable];
    this.dataGridView1.BindingContext[dt].EndCurrentEdit();
    this.da.Update(dt);
    txtRows.Text = dataGridView1.Rows.Count.ToString();
}
This did not work out for me because of the reasoning above. Data just wasn't getting saved correctly.
Here was my second thought, but it still suffers from the same issue:
// Load event
dataGridView1.DataSource = context.TableName;

// btnSave_Click event
connection.Open();
// loop through cells in the current row and apply changes by id
context.TableName.Attach(DataFromGrid);
context.ObjectStateManager.ChangeObjectState(DataFromGrid, System.Data.EntityState.Modified);
context.SaveChanges();
Basically, how can I solve this disconnection problem? Has anyone had this issue before?
This issue is probably more complicated than one code snippet is going to solve. It seems like you have a couple of things to resolve before you get into the implementation.
Firstly, do each person's changes actually have to commit the entire grid? According to your scenario, each person changes a different row of data. Modifying your code to commit only the row that was modified will cut down your conflict scenarios considerably.
Secondly, and regardless of whether the conflicts occur in one row or many, you need to decide how to manage conflicts (you've not noted what you expect to happen - e.g. database wins, user's choice, etc.).
EF uses an optimistic concurrency model (no database locks); however, by default it will simply overwrite database changes when you commit your local changes. You need to modify your EF model to set ConcurrencyMode=Fixed so that it will not automatically overwrite changes when you have not 'gotten latest' before committing rows.
See here for a bit more info on how EF handles concurrency: http://msdn.microsoft.com/en-us/library/bb738618.aspx
Once you've modified the EF configuration, you'll start getting exceptions when conflicts are detected, which you'll need to handle by either refreshing the database with your local changes or refreshing the local data with the database changes.
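Once ConcurrencyMode is set to Fixed on the relevant entity properties (done in the model designer, not in code), SaveChanges will throw when the underlying row has changed. A rough sketch of handling that with the ObjectContext API the question appears to use (DataFromGrid is the entity name from the question's pseudocode):

try
{
    context.SaveChanges();
}
catch (System.Data.OptimisticConcurrencyException)
{
    // Another user changed this row since it was loaded into the grid.
    // StoreWins discards the local edit and reloads the database values;
    // ClientWins keeps the local values so a second SaveChanges overwrites the row.
    context.Refresh(System.Data.Objects.RefreshMode.StoreWins, DataFromGrid);
    // Re-display the refreshed data, or let the user decide and retry SaveChanges().
}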

Save dataset data to other database

I have a DataSet connected to a database. The DataSet and TableAdapterManager were auto-created by VS based on the connection.
I can do this.tableAdapterManager.UpdateAll(testcheckerDataSet); and it works fine.
Next, using a third-party database editor, I copied the .db file to "blank_testing.db" (the 2nd db) and then deleted all the data from it, so it has just the schema without any data.
Now, in code, I fill the dataset with data from "testchecker.db" (the 1st db), change the data via form elements, and want to save the data to the 2nd db and then to the 1st db. I try to do this by changing the connection string for each adapter.
string originalPath = tableAdapterManager.Connection.ConnectionString;
string ns = tableAdapterManager.Connection.ConnectionString.Replace("testchecker.db", "blank_testing.db");
tableAdapterManager.Connection.ConnectionString = ns;
group_testingTableAdapter.Connection.ConnectionString = ns;
groupsTableAdapter.Connection.ConnectionString = ns;
testingTableAdapter.Connection.ConnectionString = ns;
this.tableAdapterManager.UpdateAll((testcheckerDataSet) testcheckerDataSet.Copy() ); // here error
tableAdapterManager.Connection.ConnectionString = originalPath;
group_testingTableAdapter.Connection.ConnectionString = originalPath;
groupsTableAdapter.Connection.ConnectionString = originalPath;
testingTableAdapter.Connection.ConnectionString = originalPath;
this.tableAdapterManager.UpdateAll( testcheckerDataSet );
But I get the error "Concurrency violation: the UpdateCommand affected 0 of the expected 1 records". I think it happens because the 2nd db has no data at all.
So could somebody advise me how I can save the current dataset to another db file and then again to the first (original) db?
Thanks.
The problem here is that when you call the UpdateAll method, it tries to find the existing record in the database and update it, but in the second db there is no such record. What you can try is to iterate through all the DataRows in your DataTables and use the DataRow.SetAdded() method to change each row's DataRowState; the records will then be treated as newly added.
One more thing:
SetAdded can only be invoked on a DataRow whose RowState is Unchanged. You can call AcceptChanges on the DataSet first; the RowState of each DataRow then changes accordingly: Added and Modified rows become Unchanged, and Deleted rows are removed. After that you can call DataRow.SetAdded().
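Putting that together with the code from the question, a rough sketch (untested, names taken from the question) might look like this:

// Work on a copy so the original DataSet's row states stay intact for the
// later write-back to testchecker.db.
testcheckerDataSet copyForBlankDb = (testcheckerDataSet)testcheckerDataSet.Copy();
copyForBlankDb.AcceptChanges();                 // every row becomes Unchanged
foreach (DataTable table in copyForBlankDb.Tables)
{
    foreach (DataRow row in table.Rows)
    {
        row.SetAdded();                         // row is now treated as newly added
    }
}
// With the adapters' connection strings pointed at blank_testing.db
// (as in the question), UpdateAll now issues INSERTs instead of UPDATEs.
this.tableAdapterManager.UpdateAll(copyForBlankDb);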

.Net Simple Data File usage

I've created a new project in .NET (VS 2010, .NET 4.0) and added an SDF data file. I've generated a DataSet and created a table in it (and I believe generated the Fill and other methods).
In code, I'm trying to add a row to the database.
eBureauScrubber.App_Data.matchingtempDataSet ds = new App_Data.matchingtempDataSet();
eBureauScrubber.App_Data.matchingtempDataSet.ctfFileRow row = ds.ctfFile.NewctfFileRow();
row.Address = "123 Main St.";
row.City = "Overland Park";
row.FirstName = "Matt";
row.LastName = "Dawdy";
row.rownum = 1;
EDIT: Added the next bit of code.
ds.ctfFile.Rows.Add(row);
ds.ctfFile.AcceptChanges();
ds.AcceptChanges();
eBureauScrubber.App_Data.matchingtempDataSetTableAdapters.ctfFileTableAdapter ctfa = new App_Data.matchingtempDataSetTableAdapters.ctfFileTableAdapter();
ctfa.Update(ds.ctfFile);
This runs fine. However, after the program completes, the data is not persisted in the database. What am I missing?
EDIT: I've tried all sorts of combinations of AcceptChanges() on the DataTable and on the DataSet, and of running Update() before, after, etc. I'm missing something huge here. I'm not even sure it is connecting to the "right" database. Maybe that's my problem.
EDIT 2: Here's what I did to get this to work (it's still funky, though).
Change the properties of my DB file in App_Data to "Do Not Copy"
Manually copy that db file to bin\debug\app_data
Use the data adapter's fill method to fill the ds.ctfFile data table.
Create a row (.NewctfFileRow())
Set values on that row.
ds.ctfFile.Rows.Add(row)
ds.ctfFile.AcceptChanges();
ds.AcceptChanges();
Call the adapater's update method.
Now, the data is in my database file (in bin\debug\app_data), but I can't see it through the Data Sources connection in the IDE. I'm still trying to find out how to do that.
It should have generated a TableAdapter class with a .Update() method that you have to call to save data in your database. See MSDN for some examples.
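One thing that stands out in the code from the question is that AcceptChanges() is called before Update(), which resets every row to Unchanged so the adapter has nothing left to save (the same pitfall described in the first answer on this page). A sketch of an order that should persist the new row, reusing the types from the question:

var ds = new eBureauScrubber.App_Data.matchingtempDataSet();
var adapter = new eBureauScrubber.App_Data.matchingtempDataSetTableAdapters.ctfFileTableAdapter();

adapter.Fill(ds.ctfFile);                   // optional: load the existing rows first

var row = ds.ctfFile.NewctfFileRow();
row.Address = "123 Main St.";
row.City = "Overland Park";
row.FirstName = "Matt";
row.LastName = "Dawdy";
row.rownum = 1;
ds.ctfFile.Rows.Add(row);                   // RowState is now Added

adapter.Update(ds.ctfFile);                 // writes the new row to the .sdf file
// Do not call AcceptChanges() before Update(); it clears the Added state.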
