I have a DataSet containing several tables, which is populated from a stored procedure. I want to make it nested for the GetXml() method.
I added the relation:
set.Relations.Add(
    new DataRelation("Author_Document",
        new DataColumn[] { set.Tables["Author"].Columns["lngDocumentSeriesId"],
                           set.Tables["Author"].Columns["strAuthorName"] },
        new DataColumn[] { set.Tables["Document"].Columns["lngDocumentSeriesId"],
                           set.Tables["Document"].Columns["strAuthorName"] },
        true));
I made it nested:
foreach (DataRelation relation in set.Relations)
{
relation.Nested = true;
}
And enforced:
set.EnforceConstraints = true;
All of which run fine, with no errors. The problem is when I call set.GetXml(), which throws a DataException: "Cannot proceed with serializing DataTable 'Document'. It contains a DataRow which has multiple parent rows on the same Foreign Key".
Upon inspection, the tables in question each have just a single row. The columns lngDocumentSeriesId and strAuthorName match. Even if there were a data integrity problem, it should have caused the exception on the set.EnforceConstraints = true; line, as I understand it.
What could cause this error (when all tables have just a single row), and how can it be fixed?
Are there any other relations on the DataSet? ("There are two different tables that are each the parent, and each has one row.")
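As a quick check for extra relations pulled in alongside the stored procedure results (a hypothetical diagnostic, reusing the question's set variable), every relation and its Nested flag can be dumped before calling GetXml():

// List every relation so unexpected parents of "Document" stand out.
foreach (DataRelation r in set.Relations)
{
    Console.WriteLine("{0}: {1} -> {2} (Nested = {3})",
        r.RelationName, r.ParentTable.TableName, r.ChildTable.TableName, r.Nested);
}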
I have a DataSet and read data from two sources into it: one table from an XML file and another table from a Firebird SQL database. What I am trying to get is a single table that has all columns from the XML file AND a few fields from the SQL data. Both tables have the same unique key field in them, so they could be merged easily. I would like to bind all fields of this table to fields on a form.
Is this possible as described, or am I missing a simpler solution to my problem?
Edit:
To show what I try to do a bit of extra code.
DataSet dataSet = new DataSet();
DataTable table1 = new DataTable("test1", "test1");
table1.Columns.Add("id");
table1.Columns.Add("name");
table1.Columns[0].Unique = true;
table1.Rows.Add(new object[2] { 1, "name1" });
table1.Rows.Add(new object[2] { 2, "name2" });
DataTable table2 = new DataTable("test2", "test2");
table2.Columns.Add("id");
table2.Columns.Add("thing");
table2.Columns[0].Unique = true;
table2.Rows.Add(new object[2] { 1, "thing1" });
table2.Rows.Add(new object[2] { 2, "thing2" });
dataSet.Tables.Add(table1);
dataSet.Tables[0].Merge(table2, false);
When I run this code I get a ConstraintException. When I remove the Unique constraint on the id fields, I get a table with all the needed columns, but one row with data from table1 and another row with data from table2. How can I merge them?
Edit 2:
I tried to use the PrimaryKey solution as follows on live data.
xmlData.Tables[0].PrimaryKey = new[] { xmlData.Tables[0].Columns["usnr"] };
dbData.PrimaryKey = new[] { dbData.Columns["usid"] };
xmlData.Tables[0].Merge(dbData, true, MissingSchemaAction.Add);
xmlData is a DataSet that comes from an XML file. It has id, usnr, and a few other fields in it. dbData is a DataTable with data from the database; it has id, usid, name, and a few other fields. The id fields are not relevant to my data. Both usnr and usid are strings in the tables, as I verified with GetType().
When I now call xmlData.Tables[0].Merge(dbData, true, MissingSchemaAction.Add); it throws a DataException:
<target>.ID and <source>.ID have conflicting properties: DataType property mismatch.
While writing this I realized that the id fields were different in the two tables, but I don't need them anyway, so I removed the column before setting the PrimaryKey entries and merging. Now I get a NullReferenceException with no further information in it. The tables are all fine and have data in them, so where could the exception come from now?
Instead of ...
table1.Columns[0].Unique = true;
table2.Columns[0].Unique = true;
... add these lines:
table1.PrimaryKey = new[] { table1.Columns[0] };
table2.PrimaryKey = new[] { table2.Columns[0] };
The Merge command needs to know the primary keys of the data tables; just indicating which columns are unique is not enough. With the primary keys in place, the output will be:
id  name   thing
==  =====  ======
1   name1  thing1
2   name2  thing2
Note that for this to work properly, the primary key fields must have matching data types and names. If the data types don't match, you get a decent error message. However, if the names don't match, a nondescript NullReferenceException is thrown. Microsoft could have done a better job there.
That means that in your case, I'd recommend renaming either usnr or usid before merging the data tables.
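For example, a small sketch using the asker's column names (either column could be renamed; here dbData is adjusted to match the XML side):

// Rename the database key column so it matches the XML table's key column.
dbData.Columns["usid"].ColumnName = "usnr";
dbData.PrimaryKey = new[] { dbData.Columns["usnr"] };
xmlData.Tables[0].Merge(dbData, true, MissingSchemaAction.Add);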
You can use LINQ for this purpose and join your two DataTables like this:
.........
.........
dataSet.Tables.Add(table1);
//dataSet.Tables[0].Merge(table2, false);
var collection = from t1 in dataSet.Tables[0].AsEnumerable()
                 join t2 in table2.AsEnumerable()
                     on t1["id"] equals t2["id"]
                 select new
                 {
                     ID = t1["id"],
                     Name = t1["name"],
                     Thing = t2["thing"]
                 };

DataTable result = new DataTable("Result");
result.Columns.Add("ID", typeof(string));
result.Columns.Add("Name", typeof(string));
result.Columns.Add("Thing", typeof(string));

foreach (var item in collection)
{
    result.Rows.Add(item.ID, item.Name, item.Thing);
}
The result in a DataGridView will be what you want as shown below:
dataGridView1.DataSource = result;
You cannot simply merge those two data tables together here; you need to merge the data by iterating over each table.
Create a new data table containing all the columns (Id, Name, Thing), then populate it by reading the other two, as sketched below.
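A minimal sketch of that manual approach, assuming the table1 and table2 from the question (joined on the unique id column; requires System.Linq):

// Build the combined table and copy matching rows from both sources.
DataTable combined = new DataTable("combined");
combined.Columns.Add("id");
combined.Columns.Add("name");
combined.Columns.Add("thing");

foreach (DataRow r1 in table1.Rows)
{
    // Find the row in table2 with the same id (assumes ids are unique).
    DataRow r2 = table2.Rows.Cast<DataRow>().FirstOrDefault(r => Equals(r["id"], r1["id"]));
    combined.Rows.Add(r1["id"], r1["name"], r2 == null ? null : r2["thing"]);
}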
I am trying to merge data from two separate queries using C#. The data is located on separate servers or I would just combine the queries. I want to update the data in one of the columns of the first data set with the data in one of the columns of the second data set, joining on a different column.
Here is what I have so far:
ds.Tables[3].Columns[2].ReadOnly = false;
List<object> table = new List<object>();
table = ds.Tables[3].AsEnumerable().Select(r => r[2] = reader.AsEnumerable().Where(s => r[3] == s[0])).ToList();
The ToList() is just for debugging. To summarize, column index 2 of ds.Tables[3] (r[2]) is the column I want to update, and column index 3 (r[3]) contains the key I want to join on.
In the reader, the first column contains the key matching column 3 of ds.Tables[3], and the second column contains the data with which I want to update column 2.
The error I keep getting is
Unable to cast object of type 'WhereEnumerableIterator`1[System.Data.IDataRecord]' to type 'System.IConvertible'. Couldn't store <System.Linq.Enumerable+WhereEnumerableIterator`1[System.Data.IDataRecord]> in Quoting Dealers Column. Expected type is Int32.
Where am I going wrong with my LINQ?
EDIT:
I updated the line where the update happens
table = ds.Tables[3].AsEnumerable().Select(r => r[2] = reader.AsEnumerable().First(s => r[3] == s[0])[1]).ToList();
but now I keep getting
Sequence contains no matching element
For the record, the sequence does contain a matching element.
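An editorial note, not part of the original thread: r[3] and s[0] are both typed as object, so == compares references of two boxed values rather than the values themselves, which would explain why no element appears to match:

// Why First() finds nothing: with object operands, == is reference equality.
//     r[3] == s[0]                 // compares the boxes; almost always false
//     Equals(r[3], s[0])           // value comparison
//     (Guid)r[3] == s.GetGuid(0)   // what the foreach version below effectively does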
You can use the following sample to achieve the join and update operation. Suppose there are two DataTables, tbl1 and tbl2 (their contents are shown in the original post). The sample joins the two tables on ID and updates the value of column "name1" in tbl1 from column "name2" in tbl2.
public DataTable JoinAndUpdate(DataTable tbl1, DataTable tbl2)
{
    // For demo purposes this clones tbl1's schema;
    // you can define a custom schema if needed.
    DataTable dtResult = tbl1.Clone();

    var result = from dataRows1 in tbl1.AsEnumerable()
                 join dataRows2 in tbl2.AsEnumerable()
                     on dataRows1.Field<int>("ID") equals dataRows2.Field<int>("ID") into lj
                 from reader in lj
                 select new object[]
                 {
                     dataRows1.Field<int>("ID"),     // ID from table 1
                     reader.Field<string>("name2"),  // updated column value from table 2
                     dataRows1.Field<int>("age")
                     // ... the rest of the fields from table 1 go here.
                 };

    // Load the results into the result table.
    result.ToList().ForEach(row => dtResult.LoadDataRow(row, false));

    return dtResult;
}
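A hypothetical call site (variable names assumed to match the sample above):

// The returned copy has tbl1's schema with "name1" values replaced by tbl2's "name2" values.
DataTable updated = JoinAndUpdate(tbl1, tbl2);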
Here's the result (shown in the original post).
After considering what #DStanley said about LINQ, I abandoned it and went with a foreach statement. See code below:
ds.Tables[3].Columns[2].ReadOnly = false;
while (reader.Read())
{
foreach (DataRow item in ds.Tables[3].Rows)
{
if ((Guid)item[3] == reader.GetGuid(0))
{
item[2] = reader.GetInt32(1);
}
}
}
I think it best I explain my scenario first.
I bring all my data back from a SQL database, using SqlDataAdapters, within one transaction.
As an example, I have a college. I want to open this college and add modules, and at the same time I wish to add students to these new modules.
These modules and students are saved to their respective DataTables, and the student table has a column relating to its parent module, "moduleid".
My problem is that I need a way to save both of these in the same transaction, adding the new moduleid to its child rows. I can create the new modules, and their moduleid in the module DataTable is updated; however, when I then save the students for a new module, I need to add its moduleid, otherwise the student is added to the database without one.
This is my effort so far but I feel I'm barking up the wrong tree.
DataTable dt_new_modules = ds_College.Tables["module"].GetChanges(DataRowState.Added);

da_modules.Update(ds_College.Tables["module"]);
ds_College.Tables["module"].AcceptChanges();

DataTable dt_added = ds_College.Tables["student"].GetChanges(DataRowState.Added);

if (dt_added != null)
{
    if (dt_new_modules != null)
    {
        foreach (DataRow new_module in dt_new_modules.Rows)
        {
            foreach (DataRow updated_module in ds_College.Tables["module"].Rows)
            {
                if (updated_module.Equals(new_module))
                {
                    foreach (DataRow new_student in dt_added.Rows)
                    {
                        if ((int)new_student["moduleid"] == (int)new_module["moduleid"])
                            new_student["moduleid"] = (int)updated_module["moduleid"];
                    }
                }
            }
        }
    }

    da_student.Update(dt_added);
    dt_added.AcceptChanges();
}

DataTable dt_modified = ds_College.Tables["student"].GetChanges(DataRowState.Modified);

if (dt_modified != null)
{
    da_student.Update(dt_modified);
    dt_modified.AcceptChanges();
}
I am trying to loop through all the added students and, if the DataRow is the same one as before it was given its new moduleid, get the new id and give it to the student; however, I feel there must be a more efficient way to do this.
If I understand it right, your problem is with inserting child records for a parent module that has not yet been inserted into the DB. I had much the same issue, and using SqlCommandBuilder instead made it work.
Create a SqlCommandBuilder (System.Data.SqlClient) object for each table you are changing, passing the corresponding SqlDataAdapter as a parameter to the constructor.
It creates the INSERT, UPDATE and DELETE commands automatically, and will handle writing all the changes made in memory back to the database.
After the command builder objects are created, just call Update on the data adapter you have for the parent table (modules), and afterwards Update on the child table's data adapter.
Hope this solves it.
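A rough sketch of that setup, reusing the adapter and table names from the question (how the adapters were originally constructed is an assumption):

// The builders generate INSERT/UPDATE/DELETE commands from each adapter's SELECT command.
var moduleBuilder = new SqlCommandBuilder(da_modules);
var studentBuilder = new SqlCommandBuilder(da_student);

// Update the parent table first so the new module rows exist in the database,
// then update the child rows that reference them.
da_modules.Update(ds_College.Tables["module"]);
da_student.Update(ds_College.Tables["student"]);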
What is the purpose in life of the AcceptRejectRule property of ForeignKeyConstraint class in ADO.Net?
The MSDN documentation doesn't carry sufficient explanation (for me) to make its purpose clear. After reading it, I thought that setting the property to None would prevent cascading of any changes from the parent table to the child table. But this assumption was proved wrong after running the following code:
DataTable table1 = new DataTable("Customers");
table1.Columns.Add(new DataColumn("CustomerID", typeof(int)));
table1.Columns.Add(new DataColumn("CustomerName", typeof(string)));
DataTable table2 = new DataTable("Orders");
table2.Columns.Add(new DataColumn("OrderID", typeof(int)));
table2.Columns.Add(new DataColumn("CustomerID", typeof(int)));
DataSet dataSet = new DataSet();
dataSet.Tables.AddRange(new DataTable[] { table1, table2 });
dataSet.EnforceConstraints = true;
DataRelation dataRelation = new DataRelation("CustomerOrders", table1.Columns["CustomerID"],
table2.Columns["CustomerID"], true);
dataSet.Relations.Add(dataRelation);
Debug.WriteLine("No. of constaints in the child table = {0}", table2.Constraints.Count);
dataRelation.ChildKeyConstraint.AcceptRejectRule = AcceptRejectRule.None;
dataRelation.ChildKeyConstraint.DeleteRule = Rule.Cascade;
dataRelation.ChildKeyConstraint.UpdateRule = Rule.Cascade;
table1.Rows.Add(new object[] { 11, "ABC" });
table1.Rows.Add(new object[] { 12, "XYZ" });
table2.Rows.Add(new object[] { 51, 12 });
table2.Rows.Add(new object[] { 52, 11 });
table2.Rows.Add(new object[] { 53, 11 });
table1.Rows.RemoveAt(0);
table1.AcceptChanges();
table2.AcceptChanges();
Debug.WriteLine("No of rows in the parent table = {0}", table1.Rows.Count);
Debug.WriteLine("No of rows in the child table = {0}", table2.Rows.Count);
The output of the above code is:
No. of constraints in the child table = 1
No of rows in the parent table = 1
No of rows in the child table = 1
Thanks,
Dinesh
To avoid cascading, you need to set DeleteRule and UpdateRule to Rule.None.
I'm not certain but I believe AcceptRejectRule only affects whether the accept/reject command itself is cascaded or not. In your code, I would guess that the changes have been cascaded (since that's how DeleteRule and UpdateRule have been set) but only the changes on table1 have been accepted; the changes on table2 have not been accepted.
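To see the difference concretely, here is a small sketch that could be appended to the question's code (it reuses table1, table2, and dataRelation; the new row values are made up):

dataRelation.ChildKeyConstraint.AcceptRejectRule = AcceptRejectRule.Cascade;

table1.Rows.Add(new object[] { 13, "DEF" });   // new parent row
table2.Rows.Add(new object[] { 54, 13 });      // new child row for customer 13

table1.AcceptChanges();

// Should print "Unchanged" with Cascade; with AcceptRejectRule.None it would still read "Added".
Debug.WriteLine(table2.Rows[table2.Rows.Count - 1].RowState);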
Here is my understanding. Let's say you have a parent/child relationship and you set the relation's AcceptRejectRule to Cascade. You also have the DataAdapter's AcceptChangesDuringUpdate set to true. If you modify the parent and child records and then call DataAdapter.Update on the parent first, the parent AND child records get "accepted", and a subsequent DataAdapter.Update on the child records will update nothing (because the "accept" was cascaded from the parent records to the child records). So your parent got updated but not the child records.

Possible solutions are to update the child records first and then the parent, or simply to set the AcceptRejectRule to None. That way the "accept" will not cascade down to the child records and you will be able to update them. When you do an update on the child records they, too, will be "accepted", because AcceptChangesDuringUpdate is set to true. Of course, you can set AcceptChangesDuringUpdate to false and do a manual DataSet.AcceptChanges to accept all the changes in the dataset. But if you do this, make sure you have done updates on all tables in the dataset.
I'm only stating here what I think is happening based on my testing. If anybody else knows differently please jump in.
This test code is really straightforward:
var addedRows1 = (securityDataTable.GetChanges(DataRowState.Added));
MessageBox.Show(addedRows1.Rows[1].RowState.ToString());
MessageBox.Show(addedRows1.Rows.Count.ToString());
addedRows1.Rows[1].AcceptChanges();
var addedRows2 = (securityDataTable.GetChanges(DataRowState.Added));
MessageBox.Show(addedRows2.Rows[1].RowState.ToString());
MessageBox.Show(addedRows2.Rows.Count.ToString());
The four MessageBoxes show, in order, the following messages:
Added
3
Added
3
I would expect the count to be 2 in the last message. Why isn't that the case, and can this be fixed by any means? Note: the DataTable is not linked to a database table or to any particular data source.
EDIT: Note that the RowState is fine (set to Unchanged) if I don't re-query with GetChanges() the second time.
GetChanges returns a copy of the rows, so calling AcceptChanges on that copy does not affect the original table. Are you using a data adapter to fill your data table? MSDN recommends passing the changed rows to the DataAdapter's Update method:
private void UpdateDataTable(DataTable table,
OleDbDataAdapter myDataAdapter)
{
DataTable xDataTable = table.GetChanges();
// Check the DataTable for errors.
if (xDataTable.HasErrors)
{
// Insert code to resolve errors.
}
// After fixing errors, update the database with the DataAdapter
myDataAdapter.Update(xDataTable);
}
Edit
Since you are just using a DataTable, you can query for the rows that have been added and call AcceptChanges on each of those rows:
DataRow[] addedRows = datatable.Select(null, null, DataViewRowState.Added);
foreach (DataRow _ddr in addedRows)
{
_ddr.AcceptChanges();
}
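Unlike GetChanges, Select returns references to the rows in securityDataTable itself, so the accepted state sticks. A quick check, assuming the same securityDataTable as in the question:

// After the loop above, re-querying for added rows should come back empty (null).
var remaining = securityDataTable.GetChanges(DataRowState.Added);
MessageBox.Show(remaining == null ? "no added rows" : remaining.Rows.Count.ToString());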