I'm using table adapters and datasets in .NET (version 2.0).
I have two tables like this:
Table1
------
...
TypeId
Table2
------
Id
Name
where Table1.TypeId is linked to Table2.Id.
I've strongly generated these types by using the wizard so now I can do things like this:
Table1Adapter adapter = new Table1Adapter();
Table1DataSet data = adapter.GetData();
foreach(Table1Row row in data) { ... }
// Can now iterate through and display the data
This all works fine. But now I also want the data from Table2. I've noticed that row has a generated Table2Row property, which seems ideal, but it is null. How do I populate this dataset properly?
Each DataTable in a typed dataset has its own TableAdapter, so you'll have to repeat the same step for each DataTable in the dataset.
There is no API that auto-fills the entire typed dataset, and no such code is generated within the typed dataset either. It is also difficult to write this yourself because TableAdapters do not share a common base class.
If you really need to do this, you'll have to maintain a collection of DataTable type names and TableAdapter type names and iterate over that collection to perform the fill.
So I recommend filling the dataset table by table in a 'hard-coded' manner.
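For the two tables in the question that boils down to something like the sketch below, assuming the wizard also generated a Table2Adapter and that the Table1/Table2 relation is defined in the designer (the exact type and property names depend on what the wizard produced):
Table1DataSet data = new Table1DataSet();

// Fill each DataTable with its own adapter.
new Table1Adapter().Fill(data.Table1);
new Table2Adapter().Fill(data.Table2);

foreach (Table1Row row in data.Table1)
{
    // With both tables filled and the relation in place,
    // row.Table2Row is now populated instead of null.
    string name = row.Table2Row.Name;
}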
Actually there is a Microsoft data provider called `MSDataShape` which provides this functionality. It was flagged for deprecation last time I checked, and I don't know what its current status and future plans are, or what replaces it.
Here's an example of a single "SQL command" that will return, I believe, a DataSet with two nicely related DataTables:
SHAPE {select * from customers}
APPEND ({select * from orders} AS rsOrders
RELATE customerid TO customerid)
A potential replacement to look at is a well-crafted FOR XML query, but I have not played with that.
EDIT:
I think this SO answer does exactly that using XML.
Related
I need to cache some look up tables in memory from sql database. I have hundreds of them. The tables are pretty simple with the following structure.
Tablename= "l_lookupobjectname"
column1Name: ID
Column2Name: Code
Code is mostly a string but can also be an integer in a few cases.
I use entity framework and would like a generic way to load those tables into my web application memory. I do not want to individually load each table by specifying its name.
I'm thinking along the lines of having a list of Dictionary<int id, dynamic code>.
My problem is:
How do I generate the data access code that will pull all the data into my list of dictionaries without having to write repetitive code for all my hundreds of tables?
Something like "select ID, Code from all the tables" instead of calling this statement for each table.
I'm not concerned about the code for caching the data. This is quite trivial.
Your issue might be types, unless you declare everything as a string or an object (and cast as needed).
Other than that, going with some nested dictionaries seems like your best bet. You can build SQL queries ("select * from {0}") and just provide a list of tables, then read each one into a dictionary.
You could use a DataSet, but that is quite cumbersome; a SqlDataReader is probably a better bet.
You can get column names from it by:
var reader = cmd.ExecuteReader();
var columns = new List<string>();
for (int i = 0; i < reader.FieldCount; i++)
{
    columns.Add(reader.GetName(i));
}
and then just read it all as strings or objects.
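Putting that together, a rough sketch of the whole loop might look like this; the table names and connectionString are placeholders, and Code is read as an object since it can be either a string or an int:
var tableNames = new List<string> { "l_country", "l_status" }; // your lookup tables
var cache = new Dictionary<string, Dictionary<int, object>>();

using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    foreach (var table in tableNames)
    {
        // Table names cannot be parameterized, so only use trusted, known names here.
        var sql = string.Format("SELECT ID, Code FROM {0}", table);
        using (var cmd = new SqlCommand(sql, conn))
        using (var reader = cmd.ExecuteReader())
        {
            var lookup = new Dictionary<int, object>();
            while (reader.Read())
            {
                lookup[reader.GetInt32(0)] = reader.GetValue(1);
            }
            cache[table] = lookup;
        }
    }
}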
Constructing a BindingSource.Filter string feels like an ugly, manual, forced way of filtering already retrieved data.
No explicit type checking
No explicit column name checking
No explicit SQL syntax checking
Requires manual ToString() formatting
DataSet design changes not propagated to Filter
Managing a Filter with multiple criteria from multiple controls quickly becomes tedious, error-prone, and unwieldy.
Using a typedTableAdapter.FillBy(typedDataSet.typedTable, #params ...) is a powerful, easy, and straightforward method for "filtering" between the database and the DataSet.
Does .NET provide any strongly-typed filtering between a strongly-typed DataSet and Form controls (perhaps through a BindingSource)?
Initial Bounty:
The initial bounty was awarded for a proof of concept (using a LINQ query as the DataSource). However, it does not demonstrate how to actually access the strongly-typed typedTableRow to perform the filtering.
Additional Bounty:
All of the casting through IList, DataView, DataRowView, DataRow, and typedTableRow has proven to be quite confusing.
Object                                      Generic   Watch List Type
------                                      --------  ---------------
myBindingSource.List                        IList     {System.Data.DataView}
myBindingSource.List[0]                     object    {System.Data.DataRowView}
((DataRowView)myBindingSource.List[0]).Row  DataRow   typedTableRow
Demonstrate a BindingSource with a DataSource using a strongly-typed LINQ query (ie: with typedTableRow fields accessible in .Where( ... )).
Notes (for shriop):
The form controls are bound to myBindingSource.
myBindingSource.DataSource: typedDataSet
myBindingSource.DataMember: typedTable
Filter Code (applied in FilterBtn_Click()):
myBindingSource.DataSource
= typedDataSet.typedTable.Where( x => x.table_id > 3).ToList();
After filtering, the BindingNavigator shows the appropriate number of records. However, if I navigate to any record which contains a null value, I get a StrongTypingException thrown in typedDataSet.typedTableRow.get_FIELDNAME(). Since this behavior only happens after filtering, I assume the LINQ filtering breaks something in the data binding.
Ok, I think this is what you want. I created a typed DataSet called AdventureWorks and added the Product table to it. I then added a DataGridView and a TextBox to a Form. I added an instance of the typed DataSet to the form. I added a BindingSource to the form. I set the DataSource of the BindingSource to the DataSet instance on the form. I set the DataMember to the Product table, which generated a ProductTableAdapter on the form. I set the DataSource of the DataGridView to the BindingSource. I bound the Text property of the TextBox to the Name property of the BindingSource, which resolves to the Name column of the Product table. In OnLoad, it had already generated a Fill for me using the TableAdapter and the Product DataTable. I then just had to add a single line to set my typed filter:
this.bindingSource.DataSource = this.adventureWorks.Product.Where(p => !p.IsProductSubcategoryIDNull()).ToList();
I then ran the form and was able to see only the filtered set of rows, and as I clicked through them, the text of the TextBox would change to match the name of the product of the selected row.
The ToList is key, because the BindingSource does something goofy when binding or supplying values out to the bound controls; without it you will get an exception that says
The method or operation is not implemented.
at System.Linq.Enumerable.Iterator`1.System.Collections.IEnumerator.Reset()
...
You also have to remember to watch out for the nullable fields when applying your filter criteria and make sure you're using the typed Is*Null() methods.
While this is a fairly straightforward way, it throws exceptions when it displays column values that have nulls, unless you go into the DataSet designer and change the option for handling nulls to return a null instead of throwing an exception. This works for string columns, but not so well for other column types like DateTime.
After a lot of research into how DataView (which DataTable uses internally) implements this, I can't find a simple, fully functional implementation, but I did find this answer, which best describes the pain: Data binding dynamic data.
I did find a pretty simple solution if you're OK with binding to a copy of the data, using some logic from Simple way to convert datarow array to datatable. This gets you back to a DataTable, and uses its implementation, but with only your rows after filtering.
DataRow[] rows = this.adventureWorks.Product.Where(p => !p.IsProductSubcategoryIDNull()).ToArray();
if (rows.Length > 0)
{
    this.bindingSource.DataSource = rows.CopyToDataTable();
}
else
{
    this.bindingSource.DataSource = rows;
}
Now you should still be able to use this copy of the data to send updates back to the database if you get the DataTable back out of the DataSource, making sure that it's a DataTable and not a DataRow[], and send that DataTable into the TableAdapter's Update method. Then, depending on how you're doing things, you could refill your original table, and reapply your filter.
You can use LINQ on the BindingSource:
this.BindingSource.DataSource = ((IList<T>)this.BindingSource.List).Where( ... );
I have done the same thing in VB.NET and share it here in case it is useful for someone:
I have created an extension for filtering typed tables and filling my BindingSource.DataSource from a filtered view of an existing, already-filled typed table, keeping the original table intact and preserving the schema in the returned table (it returns a typed table instead of a plain DataTable):
Imports System.Data
Imports System.Linq
Imports System.Runtime.CompilerServices
Imports System.Windows.Forms

Public Module DB_Extensions

    <Extension()> _
    Public Function LINQ_Filter(Of RowT As DataRow)(ByRef table As TypedTableBase(Of RowT), predicate As System.Func(Of RowT, Boolean)) As TypedTableBase(Of RowT)
        ' Clone the table structure (without data) so the original table keeps its data
        ' and the filtered rows are returned in a new table instead.
        Dim ret As TypedTableBase(Of RowT) = DirectCast(table.Clone(), TypedTableBase(Of RowT))
        ' Using .ImportRow() guarantees the original table schema is not lost after the return.
        For Each r As RowT In table.Where(predicate)
            ret.ImportRow(r)
        Next
        Return ret
    End Function

End Module
and use it simply this way (you need to import the DB_Extensions module for it to work):
myBindingSource.DataSource = someTypedTable.LINQ_Filter(your_filters)
May I tentatively offer an alternative to this approach (of LINQ directly into the Bindingsource filter).
I have wrestled with this problem for filtering data with complex sort criteria, especially where data is matched against other sources. I also need to keep the update feature of the TableAdapter working, which seems to preclude some strategies. In the past I dynamically created an ad-hoc filter column and used that, but the upkeep of keeping the DataTable straight was rather tedious, and it isn't particularly fast or pretty.
Recently I realised that the BindingSource filter has an option similar to SQL's IN command.
An example (sorry, it's VB) would be bindingsource.Filter = "[columnname] IN ('n','n','n','n')" etc., where each n is one of a list of values to match.
It is quite a simple matter to create a bindingsource extension to take a list and return the compiled filter string.
By utilising this method you can use LINQ (or any other method) to create your inclusion list. For my part, I use the unique ID keys of the records I want to include as the contents of my list (of integer, usually).
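For what it's worth, such an extension could look roughly like the C# sketch below (my own code is VB; the names here are illustrative): it joins the keys into an IN clause and assigns it to the Filter.
using System.Collections.Generic;
using System.Linq;
using System.Windows.Forms;

public static class BindingSourceExtensions
{
    // Build an "[column] IN (...)" filter string from a list of integer keys.
    public static void FilterByIds(this BindingSource source, string columnName, IEnumerable<int> ids)
    {
        var list = string.Join(",", ids.Select(i => i.ToString()).ToArray());
        source.Filter = list.Length == 0
            ? "1 = 0"                                            // empty list: match nothing
            : string.Format("[{0}] IN ({1})", columnName, list);
    }
}

// Usage: gather the keys with LINQ, then apply them as a Filter.
var keys = typedDataSet.typedTable.Where(r => r.table_id > 3).Select(r => r.table_id);
myBindingSource.FilterByIds("table_id", keys);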
There seems to be scant documentation on the facilities or limits of the BindingSource filter; however, I have seen reports of people using IN with very large strings. I personally don't deal with massive data, so it's not a problem for me, but maybe, if you consider this approach useful, you would want to test those limits (and performance, of course).
I hope this helps spark someone's imagination - but if you do want to shoot me down - please - be gentle :)
I'm using Entity Framework and SQL Server 2008 with the Database First approach.
My problem is:
I have some tables that hold many, many columns (~100), and when I try to retrieve a lot of rows it takes a significant time to return the results, even though sometimes I need only 3 or 4 columns from that table.
I spent half a day on Stack Overflow trying to find a way to solve this problem, and I came up with two solutions:
Using stored procedures to retrieve data with the columns I want.
Edit the .edmx (xml) and the .cs files to remove the columns that I won't use.
My problem again is:
If I use stored procedures to retrieve the data with the columns that I want, Entity Framework loses its benefit, and I might as well use ADO.NET instead and call the stored procedures directly ...
I can't take the second solution, because every time I make a change in the database I'm obliged to regenerate the .edmx file, and I lose the changes I made before :'(
Is there a way to do this somehow in Entity Framework? Is that even possible?
I know that other ORMs exist like NHibernate or Dapper, but I don't know if they can offer this feature without causing a lot of pain.
You don't have to return every column each time. You can specify which columns you need.
var query = from t in db.Table
            select new { t.Column1, t.Column2, t.Column3 };
Normally, if you project the data into a different POCO, it will do this automatically in EF / L2S etc.:
var slim = from row in db.Customers
           select new CustomerViewModel { Name = row.Name, Id = row.Id };
I would expect that to only read 2 columns.
For tools like Dapper: since you control the SQL, only specify the columns you want - don't use *.
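With Dapper that might look something like this (CustomerViewModel, the connection variable, and the table/column names are just placeholders):
using Dapper;
using System.Linq;

var slim = connection.Query<CustomerViewModel>(
    "SELECT Id, Name FROM Customers").ToList();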
You can create a second project with a code-first DbContext, POCOs and maps that return the subset of columns that you require.
This is a case of cut and paste code but it will get you what you need.
You can just create classes and project the data into them but I'm not sure you can make updates using this method. You can use anonymous types within a single method but you'll need actual classes to pass around between methods.
Another option would be to move to a code first development.
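A rough sketch of that slim code-first context, under the assumption that the wide table is called Customers and you only need two of its columns (all names here are illustrative):
using System.Data.Entity;

public class CustomerSlim
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class SlimContext : DbContext
{
    public DbSet<CustomerSlim> Customers { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Map the slim POCO onto the existing, much wider Customers table;
        // EF will only select the columns declared on CustomerSlim.
        modelBuilder.Entity<CustomerSlim>().ToTable("Customers");
    }
}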
I am developing an HRM application to import and export XML data from a database. The application receives exported XML data for the employee entries. I imported the XML file using LINQ to XML and converted the XML into the respective objects. Then I want to attach (update) the employee objects.
I tried to use
// linqoper is a class for importing the XML data; it converts it into an IEnumerable of Employee objects.
var emp = linqoper.importxml("filename.xml");
using (EmployeeDataContext db = new EmployeeDataContext())
{
    db.Employees.AttachAll(emp, true);  // attach as modified
    db.SubmitChanges();
}
But I got this error:
“An entity can only be attached as modified without original state if it declares a version member or doesn't have an update check policy.”
I also have the option of retrieving each employee and assigning values to it from the XML data, using this format:
// Import an IEnumerable of Employee objects.
var employees = linqoper.importxml("filename.xml");
using (EmployeeDataContext db = new EmployeeDataContext())
{
    foreach (var empobj in employees)
    {
        Employee emp = db.Employees.Single(m => m.Id == empobj.Id);
        emp.FirstName = empobj.FirstName;
        emp.BirthDate = empobj.BirthDate;
        // ... continue
    }
    db.SubmitChanges();
}
But the problem with the above is that I have to iterate through all the employee objects, which is very tiresome.
So is there any other way I could attach (update) the employee entities in the database using LINQ to SQL?
I have seen some similar links on SO, but none of them seems to help.
https://stackoverflow.com/questions/898267/linq-to-sql-attach-refresh-entity-object
When LINQ to SQL saves the changes to the database, it has to know which properties of the object have been changed. It also checks whether a potentially conflicting update to the database has been made in the meantime (optimistic concurrency).
To handle those cases LINQ-to-SQL needs two copies of the object when attaching. One with the original values (as present in the DB) and one with the new, changed values. There is also a more advanced mechanism involving a version member which is mapped to a rowversion column.
The LINQ-to-SQL way to update a set of data is to first read all the data from the database, then update the objects retrieved from the database, and finally call SubmitChanges(). That would be my first approach in your situation.
If you experience performance problems, then it's time to go outside of linq-to-sql's toolbox. A solution with better performance is to load the new data into a separate staging table (for best performance, use bulk insert). Then run a SQL command or Stored Procedure that does the actual merging of data. The SQL Merge clause is excellent for this kind of updates.
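A rough sketch of that staging-table approach, assuming a staging table named EmployeeStaging and reusing the imported employees collection from the question (all table, column, and connection names here are illustrative):
using System.Data;
using System.Data.SqlClient;

var staging = new DataTable("EmployeeStaging");
staging.Columns.Add("Id", typeof(int));
staging.Columns.Add("FirstName", typeof(string));
staging.Columns.Add("BirthDate", typeof(DateTime));
foreach (var e in employees)
{
    staging.Rows.Add(e.Id, e.FirstName, e.BirthDate);
}

using (var conn = new SqlConnection(connectionString))
{
    conn.Open();

    // Bulk insert the imported rows into the staging table.
    using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "EmployeeStaging" })
    {
        bulk.WriteToServer(staging);
    }

    // Let a single MERGE statement do the actual update of the real table.
    const string mergeSql = @"
        MERGE Employee AS target
        USING EmployeeStaging AS source ON target.Id = source.Id
        WHEN MATCHED THEN
            UPDATE SET FirstName = source.FirstName, BirthDate = source.BirthDate;";
    using (var cmd = new SqlCommand(mergeSql, conn))
    {
        cmd.ExecuteNonQuery();
    }
}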
LINQ to SQL is a proper ORM, but if you want to take control of create/update/delete into your own hands, you can try some simpler ORMs that just provide ways to do CRUD operations. I can recommend one, http://crystalmapper.codeplex.com; it is simple yet powerful.
Why CrystalMapper?
I built this for a large financial transaction system with lots of insert and update operations. What I needed was speed and control of insert/update for complex business scenarios ... hitting multiple tables for just one transaction.
When I put it to use in a social text-processing platform, it served very well there too.
I create a custom dataset that I pass off to a black boxed component. The dataset consists of usually 5-6 tables (with unique names assigned by me). The component takes the dataset and builds a drop down combo box based off the table names. What I am needing to do though is to change the ordering of the tables within the dataset. I need to do this so I first offer up to the user the appropriate selection in the drop down (based off what section in the application they are in). So for instance...if they are in "Section A" then that is the first table name shown in the drop down list...if the user goes to "Section F" then that is what is shown in the list first...so on and so forth.
The more code intensive way is of course to just change the ordering in which I add the tables to the dataset. This would work, but I thought there had to be some way to do this more elegantly and with less code.
I am working in C# with the 3.5 framework.
Remember that DataSets and their contents are stored on the heap, so you can have a DataTable object in more than one place in the DataSet.
Simply create your DataSet with a dummy DataTable in position zero. Then, based on whatever section they're in, you put the corresponding table in position zero. Your table name will appear twice in your DropDownBox, once as the 'default' and again below in its proper context and order with the other tables.
public class ThisThing
{
    private DataSet myDS = new DataSet();

    // Populate your DataSet as normal.

    public DataSet ChangeLocation(int currentSectionNumber)
    {
        // Put the current section's table in position zero.
        myDS.Tables[0] = myDS.Tables[currentSectionNumber];
        return myDS;
    }
}
I'm not sure trying to force your ordering information into the DataSet's data structure is the most intuitive approach. You might consider passing an ordered list of DataTables instead of (or in addition to) the DataSet.
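For example, a small sketch of that idea, assuming the black-boxed component can accept a list of tables rather than (or alongside) the DataSet:
using System.Collections.Generic;
using System.Data;
using System.Linq;

public static List<DataTable> OrderForSection(DataSet ds, string currentSectionTableName)
{
    // Leave the DataSet untouched; just hand back the tables with the
    // current section's table moved to the front (OrderBy is stable,
    // so the remaining tables keep their original order).
    return ds.Tables.Cast<DataTable>()
             .OrderBy(t => t.TableName == currentSectionTableName ? 0 : 1)
             .ToList();
}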