Are .NET DataTables useful beyond being a means of interfacing with databases? - c#

I often use the DataTable class in my .NET WCF services, since many of our SPs require TVPs. As far as I know, DataTables are the only way of passing TVPs to SPs.
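For reference, this is roughly how we pass the table today (dbo.ProcessIds, dbo.IdList and connectionString are just placeholders for our real names):
// Rough sketch: build the rows client-side, then pass the whole table in one
// Structured parameter (requires System.Data and System.Data.SqlClient).
var ids = new DataTable();
ids.Columns.Add("Id", typeof(int));
ids.Rows.Add(1);
ids.Rows.Add(2);

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("dbo.ProcessIds", connection))
{
    command.CommandType = CommandType.StoredProcedure;

    SqlParameter tvp = command.Parameters.AddWithValue("@Ids", ids);
    tvp.SqlDbType = SqlDbType.Structured;
    tvp.TypeName = "dbo.IdList";   // the user-defined table type on the server

    connection.Open();
    command.ExecuteNonQuery();
}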
It just occurred to me that, in the same way that tables storing information in rows and columns are generally useful, the DataTable class may be useful beyond just being a means of interfacing with SQL Server TVPs.
Actually... thinking about this, I have previously written code that iterated over a DataTable's rows, building up an HTML string. However, the main reason we used a DataTable was that the same table could be passed to SQL Server as a TVP.
Looking at the docs: https://msdn.microsoft.com/en-us/library/system.data.datatable%28v=vs.110%29.aspx?f=255&MSPPError=-2147217396, it looks like you can effectively create relational object models using DataTables.
Would using DataTables be an effective way of caching data retrieved from a SQL Server in a service?
Another potential use-case that comes to mind... Would there be any benefit of using a DataTable for a collection instead of List<MyType>?

DataTables are slower than Lists/Enumerables, and it's better to read the data with a DataReader if you really care about performance.
But DataTables can be really useful as an item source for grids, where you just want to publish the whole table's data on the UI with no need to specify each column individually, as you would with a List.
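For example, a rough sketch of filling a DataTable and handing it straight to a grid (the connection string, table and grid names are placeholders):
// Rough sketch: the grid generates its columns from the table schema, so
// nothing has to be declared per column (requires System.Data.SqlClient).
var table = new DataTable();
using (var adapter = new SqlDataAdapter("SELECT * FROM dbo.Customers", connectionString))
{
    adapter.Fill(table);
}
grid.DataSource = table;                      // WinForms DataGridView
// or: grid.ItemsSource = table.DefaultView;  // WPF DataGrid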

Related

Small in-memory database for performing temporary operations on a table

I'm building a small part of my ASP.NET MVC website that will require the following steps:
Query my SQL Server DB for data from a particular table.
For each data row returned take that data and run another query (stored procedure) that returns some values.
Take those values and run a calculation.
Now that I have that calculation, I will need to store it in memory (or not, if you think otherwise) along with some other data items from the first query, then filter and sort. After filtering and sorting, display the results to the user.
What do you guys recommend doing for such a scenario where you need to have an in-memory data representation that will have to be manipulated? Should I just stick with DataTable? I found a component called QueryADataSet which allows running queries against .NET DataSets and DataTables - does anyone know it? How about using that kind of solution? Is it recommended?
Would love to hear your thoughts...
Thanks
Change the website to behave as follows:
Send a single query to SQL that uses set operations to apply the calculations to the relevant data and return the result.
This is not a joke or irony. The app server is not the proper place for doing 'sort' and 'filter'. Aside from the lack of an adequate toolset, there are issues around consistency/caching behavior. Data manipulation belongs in the back end; this is why you use SQL and not a key-value store. Not to mention the anti-pattern of 'retrieve a set and then call the DB for each row'.
If the processing cannot be performed on the database server, then LINQ is a good toolset to filter and sort data in the application.
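For example, a minimal sketch of filtering and sorting the calculated results in memory with LINQ (the collection, property and threshold names are placeholders for whatever your queries return):
// Rough sketch: rows is whatever in-memory collection holds the first query's
// data plus the calculated value (requires System.Linq).
var results = rows
    .Where(r => r.Calculated >= threshold)            // filter
    .OrderByDescending(r => r.Calculated)             // sort
    .Select(r => new { r.Id, r.Name, r.Calculated })  // keep only what the view needs
    .ToList();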

Should I persist consolidated sums in a separate table?

I am developing a C# application working with millions of records retrieved from a relational database (SQL Server). My main table "Positions" contains the following columns:
PositionID, PortfolioCode, SecurityAccount, Custodian, Quantity
Users must be able to retrieve Quantities consolidated by some predefined sets of columns, e.g. {PortfolioCode, SecurityAccount}, {PortfolioCode, Custodian}.
First, I simply used dynamic queries in my application code but, as the database grew, the queries became slower.
I wonder if it would be a good idea to add another table that will contain the consolidated quantities. I guess it depends on the distribution of those groups?
Besides, how would I synchronize the source table with the consolidated one?
In SQL Server you could use indexed views to do this; they'd keep the aggregates synchronised with the underlying table, but would slow down inserts to it:
http://technet.microsoft.com/en-us/library/ms191432.aspx
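For illustration only, a rough sketch of what such an indexed view could look like, run here from C# (view, index and connection names are placeholders; the usual indexed-view restrictions such as SCHEMABINDING and COUNT_BIG(*) apply):
// Rough sketch: an indexed view that keeps SUM(Quantity) per
// {PortfolioCode, SecurityAccount} maintained by SQL Server itself.
// Requires System.Data.SqlClient; each statement runs in its own batch.
const string createView = @"
    CREATE VIEW dbo.PositionTotals WITH SCHEMABINDING AS
    SELECT PortfolioCode, SecurityAccount,
           SUM(Quantity) AS TotalQuantity,
           COUNT_BIG(*)  AS RowCnt
    FROM dbo.Positions
    GROUP BY PortfolioCode, SecurityAccount;";

const string createIndex = @"
    CREATE UNIQUE CLUSTERED INDEX IX_PositionTotals
    ON dbo.PositionTotals (PortfolioCode, SecurityAccount);";

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (var command = new SqlCommand(createView, connection)) command.ExecuteNonQuery();
    using (var command = new SqlCommand(createIndex, connection)) command.ExecuteNonQuery();
}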
If it's purely a count of grouped rows in a single table, would standard indexing not suffice here? More info on your structure would be useful.
Edit: Also, it sounds a little like you're using your OLTP server as a reporting server? If so, have you considered whether a data warehouse and an ETL process might be appropriate?

Connecting to different databases using .NET DataSets

I have many databases with the same structure, and I have designed a DataSet that matches the database design. It is easy to connect to a database using the connection strings asked for at design time and defined in app.config. But the problem arises when trying to change the database at runtime. I cannot find any non-reflection solution to handle it. Is there any other way to change the connection string of a DataSet dynamically at run time, or at least to create the DataSet with a different connection string?
You are filling the DataSet using a TableAdapter, and you can easily modify the TableAdapter's connection string like this:
myTableAdapter.Connection.ConnectionString = connectionString;
Hope this helps :)
gzaxx's answer will not work, simply because different DBMSs work with different ADO.NET providers, which may or may not be compatible with each other. There's a lot of theory behind it and I won't type all of that in this textbox, but you need to understand that it is the TableAdapters that are the main issue, not the DataTable. Your business and UI layers normally only talk to DataTables, which will mostly have the same structure for almost any DBMS, given that you have correctly used the corresponding data types when creating table columns. So, in theory, if Typed DataSets could provide a way to attach multiple adapters per DataTable, you could add one adapter for each DBMS you support while keeping the DataTable structure the same.
I myself had to deal with this issue in a somewhat large project and the only workable solution for me was to separate my Data Access into a separate project (a class lib) and then create one such DLL for each DBMS I was supporting. Hope that helps you get started with this.
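To give you an idea of where to start, a rough sketch of the provider-factory route such a data access library could use (the provider name, connection string and query would come from your configuration; all names here are placeholders):
using System.Data;
using System.Data.Common;

// Rough sketch: resolve the ADO.NET provider at runtime and fill the same
// DataTable shape regardless of which DBMS is behind it.
public static class GenericDataAccess
{
    public static DataTable Load(string providerName, string connectionString, string query)
    {
        DbProviderFactory factory = DbProviderFactories.GetFactory(providerName);

        using (DbConnection connection = factory.CreateConnection())
        using (DbDataAdapter adapter = factory.CreateDataAdapter())
        {
            connection.ConnectionString = connectionString;
            adapter.SelectCommand = connection.CreateCommand();
            adapter.SelectCommand.CommandText = query;

            var table = new DataTable();
            adapter.Fill(table);   // Fill opens and closes the connection itself
            return table;
        }
    }
}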

Dynamic query data from WCF Data Service to Silverlight Application?

I have a Silverlight application that contains a window that lists several values in multiple columns. The first column contains fields to be retrieved from a database, the second column contains table names.
Multiple fields can be selected from the first column and only one field can be selected from the second column. So, the idea is to build a query that can select multiple columns from one of several tables (assume the column names are the same for each table).
My question is how do I pass these values into a WCF Data Service method and return an untyped dataset back to the calling Silverlight application? I will have no way of knowing the columns to fetch or the table to use until run-time. That means I cannot define a class to be used to return the data back from the WCF data service to Silverlight.
Any ideas on how to accomplish this?
Thanks
Silverlight doesn't have a DataTable or DataSet construct. However, you can fake it with nested lists. This guy put together the Silverlight DataTable code, and also shows how it can be serialized and sent over WCF:
http://blogs.telerik.com/blogs/posts/10-01-22/how_to_serialize_your_datatable_to_silverlight_using_wcf_service.aspx
As for getting the query to the server, you have a few options. You can build LINQ expression trees dynamically (the same way that DomainDataSource does under the covers) and use that to query right from the client. You could also send your search parameters in a serialized form over to the server and construct the query there. Again, you'll either have to build LINQ expression trees if you want to use LINQ to Entities for the query, or you could go old school and just build up a SQL query. If you go the SQL route, make sure you protect against SQL injection.
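For example, a rough sketch of the 'build the SQL on the server' option with a whitelist, returning a DataTable you can then serialize as in the linked post (table and column names are placeholders):
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;

public static class DynamicQueryService
{
    // Rough sketch: only names found in these whitelists ever reach the SQL
    // text, so client input is never concatenated directly.
    private static readonly HashSet<string> AllowedTables =
        new HashSet<string> { "Orders", "Invoices" };
    private static readonly HashSet<string> AllowedColumns =
        new HashSet<string> { "Id", "CustomerName", "Total" };

    public static DataTable Query(string connectionString, string table, IEnumerable<string> columns)
    {
        var selected = columns.Where(AllowedColumns.Contains).ToList();
        if (!AllowedTables.Contains(table) || selected.Count == 0)
            throw new ArgumentException("Table or columns not permitted.");

        string sql = "SELECT " + string.Join(", ", selected) + " FROM dbo." + table;

        using (var adapter = new SqlDataAdapter(sql, connectionString))
        {
            var result = new DataTable(table);
            adapter.Fill(result);
            return result;   // serialize this back to Silverlight as in the linked post
        }
    }
}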
I could also suggest that you vote for the DataSet/DataTable feature to be added to Silverlight, which would make your solution a little easier to develop. You can vote for it here.

DataSet or Reader or What?

When using a class to get one row of data from the database, what is best to use:
A DataSet?
A Reader? And then what, store the data in a structure?
What else?
Thanks for your time, Nathan
A DataReader is always your best choice--provided that it is compatible with your usage. DataReaders are very fast, efficient, and lightweight--but they carry the requirement that you maintain an active/open db connection for their lifecycle, which means they can't be marshalled across AppDomains (or across web services, etc).
DataSets are actually populated by DataReaders--they are eager-loaded (all data is populated before any is accessed) and are therefore less performant, but they have the added benefit of being serializable (they're essentially just a DTO) and that means they're easy to carry across AppDomains or webservices.
The difference is sometimes summed up by saying "DataReaders are ideal for ADO.NET ONLINE (implying that it's fine to keep the db connection open), whereas DataSets are ideal for ADO.NET OFFLINE (where the consumer can't necessarily connect directly to the database)."
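For example, a minimal sketch of the 'online' DataReader case for a single row (the table, columns and Customer type are placeholders):
using System.Data;
using System.Data.SqlClient;

// Rough sketch: open, read the single row into a plain object, close immediately.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class CustomerRepository
{
    public static Customer GetById(string connectionString, int id)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT Id, Name FROM dbo.Customers WHERE Id = @Id", connection))
        {
            command.Parameters.Add("@Id", SqlDbType.Int).Value = id;
            connection.Open();

            using (SqlDataReader reader = command.ExecuteReader(CommandBehavior.SingleRow))
            {
                if (!reader.Read())
                    return null;

                return new Customer
                {
                    Id = reader.GetInt32(0),
                    Name = reader.GetString(1)
                };
            }
        }
    }
}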
A DataAdapter (which fills a DataSet) uses a DataReader to do so.
So a DataReader is always more lightweight and easier to use than a DataAdapter. DataSets and DataTables always have a huge overhead in terms of memory usage. It makes no difference if you are fetching a single row, but it makes a huge difference for bigger result sets.
If you are fetching a fixed number of items, in MS SQL Server, output variables from a stored proc (or parameterized command) usually perform best.
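For example, a rough sketch of the output-parameter approach (the procedure, parameter names and connectionString are placeholders):
// Rough sketch: a fixed set of values comes back through output parameters
// instead of a result set (requires System.Data and System.Data.SqlClient).
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("dbo.GetCustomerName", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    command.Parameters.Add("@Id", SqlDbType.Int).Value = 42;

    SqlParameter nameParam = command.Parameters.Add("@Name", SqlDbType.NVarChar, 100);
    nameParam.Direction = ParameterDirection.Output;

    connection.Open();
    command.ExecuteNonQuery();

    string name = (string)nameParam.Value;
}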
If you use a reader, you must have an open connection to your database. Generally a DataReader is used to fetch data for a combo box or data grid, but if you want to keep your data in memory after closing your database connection, you must use a DataTable.
Note: excuse my English level.
If you just want read-only access to the data, then go with a raw DataReader; it's the fastest and most lightweight data access method.
However, if you intend to alter the data and save back to the database, then I would recommend using a DataAdapter and a DataSet (even a typed DataSet) because the DataSet class takes care of tracking changes, additions and deletions to the set which makes saves much easier. Additionally, if you have multiple tables in the dataset, you can model the referential constraints between them in the dataset.
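For example, a rough sketch of that edit-then-save round trip (table, column and connection names are placeholders):
// Rough sketch: the DataSet tracks the change and the adapter writes only the
// changed rows back (requires System.Data and System.Data.SqlClient).
using (var adapter = new SqlDataAdapter("SELECT Id, Name FROM dbo.Customers WHERE Id = @Id", connectionString))
using (var builder = new SqlCommandBuilder(adapter))        // generates the UPDATE/INSERT/DELETE commands
{
    adapter.SelectCommand.Parameters.Add("@Id", SqlDbType.Int).Value = 42;

    var data = new DataSet();
    adapter.Fill(data, "Customers");

    data.Tables["Customers"].Rows[0]["Name"] = "New name";  // row is now flagged as Modified

    adapter.Update(data, "Customers");                      // only the modified row goes back
}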
