I know this is an age-old question, but this is my scenario.
This is in C# 2.0.
I have a Windows application with a DataGridView control. It needs to be populated by making a web service call.
I want the same functionality on the returned data as if I were using a direct connection and DataSets, namely paging and applying filters. I know returning DataSets is a bad idea and am looking for a good solution.
I might look at ADO.NET Data Services, aka Astoria, in VS2008 SP1.
This lets you expose data over a web service (WCF exposing ATOM, IIRC), but you don't need to know those details: the tooling worries about that for you, and you just get regular IQueryable<T> sources on a data context (not quite the same as the LINQ-to-SQL data context, but the same concept).
The good thing here is that a LINQ query (such as filtering (Where), paging (Skip/Take) etc) can get composed all the way from the client, through the web-service, and down to the LINQ-enabled data store (LINQ-to-SQL or Entity Framework, etc). So only the right data comes over the wire: if you ask for the first 10 rows (of 20000) ordered by Name, then that is what you get: 10 rows out of the database; 10 rows over the wire, no messing.
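For illustration, the client query might look something like this (MyDataContext, Customers and the URI are placeholders for what the Add Service Reference tooling generates; needs System.Linq):

    MyDataContext ctx = new MyDataContext(new Uri("http://server/MyData.svc"));

    // Where/OrderBy/Skip/Take are composed into the request URI
    // ($filter/$orderby/$skip/$top) and executed on the server, so
    // only one page of rows crosses the wire.
    var page = ctx.Customers
                  .Where(c => c.Country == "UK")
                  .OrderBy(c => c.Name)
                  .Skip(0)
                  .Take(10)
                  .ToList();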
Write a custom class (MyDataItem) that'll hold your data. Then you can pass a List<MyDataItem> or some collection of MyDataItem and bind to your grid.
Paging, filtering, etc. would need to be implemented by you.
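A minimal sketch of that approach, with paging done on the client (type and member names are invented; written in C# 2.0 syntax, so no auto-properties):

    using System;
    using System.Collections.Generic;

    // Hypothetical data item returned by the web service.
    public class MyDataItem
    {
        private int id;
        private string name;

        public int Id
        {
            get { return id; }
            set { id = value; }
        }

        public string Name
        {
            get { return name; }
            set { name = value; }
        }
    }

    public static class Paging
    {
        // Client-side paging over the full list returned by the service.
        public static List<MyDataItem> GetPage(List<MyDataItem> all,
                                               int pageIndex, int pageSize)
        {
            int start = pageIndex * pageSize;
            if (start >= all.Count)
                return new List<MyDataItem>();
            return all.GetRange(start, Math.Min(pageSize, all.Count - start));
        }
    }

    // Usage: grid.DataSource = Paging.GetPage(items, 0, 20);

Note this pages in memory, so it only helps once the full result has already come over the wire.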
There is no way to get the binding behavior you automatically get with a DataSet if you are going through a Web Services data layer. You would have to create your own proxy class that supports all the databinding functions and persists them through your web service calls. Depending on your application's environment, you may want to batch up modifications to avoid excess round trips to the web services.
fallen888 has it right - you will need to create a collection class (a List or a DataTable), fill it with the output from the web service data stream, and handle paging and filtering yourself.
I'm building a small part of my ASP.NET MVC website that will require the following steps:
Query my SQL Server DB for data from a particular table.
For each data row returned, take that data and run another query (a stored procedure) that returns some values.
Take those values and run a calculation.
Now that I have that calculation, I will need to store it, along with some other data items from the first query, in memory (or not, if you think otherwise), then filter and sort, and after filtering and sorting display the results to the user.
What do you recommend for a scenario like this, where you need an in-memory data representation that will have to be manipulated? Should I just stick with DataTable? I found a component called QueryADataSet which allows running queries against .NET DataSets and DataTables; does anyone know it? How about using that kind of solution? Is it recommended?
Would love to hear your thoughts...
Thanks
Change the website to behave as follows:
Send a single query to SQL that uses set operations to apply the calculations to the relevant data and return the result.
This is not a joke, nor irony. The app server is not the proper place for doing 'sort' and 'filter'. Aside from the lack of an adequate toolset, there are issues around consistency/caching behavior. Data manipulation belongs in the back end; this is why you use SQL and not a key-value store. Not to mention the anti-pattern of 'retrieve a set and then call the DB for each row'.
If the processing cannot be performed on the database server, then LINQ is a good toolset to filter and sort data in the application.
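If the per-row work genuinely cannot be pushed into SQL, the in-memory step described above might look like this (all type and member names are invented; Calculate stands in for the stored-procedure call plus the arithmetic):

    using System.Collections.Generic;
    using System.Linq;

    // All names here are invented for illustration.
    public class SourceRow { public int Id; public string Name; }
    public class ResultRow { public int Id; public string Name; public decimal Score; }

    public class ResultBuilder
    {
        // Stands in for the stored-procedure call plus the calculation.
        private decimal Calculate(SourceRow r) { return 0m; /* placeholder */ }

        public List<ResultRow> Build(IEnumerable<SourceRow> rows, decimal threshold)
        {
            return rows
                .Select(r => new ResultRow { Id = r.Id, Name = r.Name,
                                             Score = Calculate(r) })
                .Where(x => x.Score > threshold)     // filter in memory
                .OrderByDescending(x => x.Score)     // sort in memory
                .ToList();
        }
    }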
In our web application we have some data-intensive UI pages that are quite slow, and the client-side sorting and filtering we're using now is forcing these pages to load all the data from the API (in JSON) and then perform the operations in the browser which usually means the page freezes for several seconds before it's ready for use.
The solution we're looking into is to provide server-side paging with support for filtering and sorting.
For paging, we are sending the pageIndex and pageSize parameters in the URL, and it works fine. For sorting (and filtering) we are thinking of sending the names of the properties to sort (and filter) by inside a JSON object that the API can parse to generate a proper query for the data. The problem I have here is that, to keep things as simple and fast as possible, the UI pages would have to “know” the names of the data columns by which they want to sort (or filter), which is not desirable.
For example, the API sends users' first names in a property called “FN” (in a JSON object); however, to ask the API to sort users by first name, the UI has to send “FirstName” as the sorting property, which is the name of the column in my database. This method lets me pass the property name directly to my data layer and get things done quickly. However, the UI needing to recognize the column names of the data tables in the database pretty much undoes the whole idea of separating concerns between layers (which was a major motivation for us to break our web app into a separate API and UI).
Is there a clean way to provide the server-side paging with filtering and sorting without creating this dependency between the UI and the backend?
Thank you.
I don't see much of a problem here. I don't know exactly how your app is structured, but let's assume we have a table with two columns (Acol, Bcol). The display name of each column is written in the column's attributes (Acol: "First name", "Last name", or in Italian "Nome", "Cognome", who cares), so the moment the user requests a sort or a filter, you don't pass a fixed column name but the string present in that column's attributes. And who sets the attribute on the column? The server.
So the server sets aliases for the columns (real names, possibly translated to fit the localized context of the client), and all the client does is pass that string back, without even understanding what it's for, because the server knows what to do with it.
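A rough sketch of that mapping on the server side (the keys and column names are invented examples):

    using System;
    using System.Collections.Generic;

    public class SortKeyMapper
    {
        // Map from the opaque key the API hands to the client to the
        // real column name; keys and columns are invented examples.
        static readonly Dictionary<string, string> SortKeyToColumn =
            new Dictionary<string, string>
            {
                { "FN", "FirstName" },
                { "LN", "LastName" }
            };

        // The client echoes back the key it was given; it never sees
        // "FirstName", so the DB schema stays hidden from the UI.
        public static string ResolveSortColumn(string clientKey)
        {
            string column;
            if (!SortKeyToColumn.TryGetValue(clientKey, out column))
                throw new ArgumentException("Unknown sort key: " + clientKey);
            return column;
        }
    }

The client only ever round-trips the alias, so renaming a database column touches the map, not the UI.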
I've been reviewing examples of 3-layer design on the web, and I've noticed that most samples return either DataSets or DataTables. What confuses me is: what if you would rather return a generic List<T>, so you can utilize properties or methods of the type your list is based on? For example, a Name property that concatenates various fields in a specific way depending on the data; if the List is bound to a control on a form, the Name property can be used as the data field. To accomplish the same thing with a DataSet or DataTable, you'd have to return the concatenated value from the database (I try not to use DataSets or DataTables, so I'm probably very wrong about this statement :)).
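For example, something like this (the field names are just placeholders):

    // A hypothetical type; the field names are placeholders.
    public class Person
    {
        public string Title { get; set; }
        public string FirstName { get; set; }
        public string LastName { get; set; }

        // Concatenates fields in a data-dependent way; when a
        // List<Person> is bound to a grid, "Name" can be the data field.
        public string Name
        {
            get
            {
                return string.IsNullOrEmpty(Title)
                    ? FirstName + " " + LastName
                    : Title + " " + LastName;
            }
        }
    }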
The part that is really confusing me is reusing code. It seems the only way to reuse code is to retrieve the data into either a DataSet or DataTable and then loop through the data and add it to a List. Is this generally the best practice for 3-layer design, or is there a way to do this without DataSets and DataTables?
The example in the link below essentially demonstrates using DataSets or DataTables and then adding the data to an object, but I have to ask: is this the best practice?
http://www.codeproject.com/Articles/36847/Three-Layer-Architecture-in-C-NET
Thanks
Using DataTables is a specific dotnetism. The reason behind it is that they contain metadata about the structure of the data, which lets DataGrid (and other such components) display the data automatically without using reflection or so. My guess is this is amongst other things a heritage of the MS Access approach to RAD, where the intent was enabling "business people" to create apps by generating the user interface directly from a SQL schema, essentially doing the opposite of a tiered design. This heritage then seems to have leaked into the hivemind.
There's nothing wrong with using "plain" data structures, as long as you're willing to give up the RAD features, and the trend lately seems to have been to get rid of this tradeoff too (for instance with Web Forms' strongly typed data controls, and MVC's model binding features).
Also, speaking more generally, Code Project articles from before MVC was established are not really a good source of wisdom on general software architecture.
What you should carry your data in depends entirely on your needs.
If you retrieve data from the DB and bind it to a DataGrid, DataSets might be the perfect solution. If you want data that tracks its own update status, you should look into Entity Framework. If you retrieve data and send it through a web service for cross-platform or cross-domain processing, you need to load your data into serializable classes of your own.
Take a look at the article below. It is a little old and targeted at EF4, but it summarizes the pros and cons of the different strategies very well. (There are three articles in the series; I suggest you read them all.)
http://msdn.microsoft.com/en-us/magazine/ee335715.aspx
I think the samples you're finding use DataTables and DataSets because it's a simple way to show 3-tier design. Nowadays Entity Framework has largely replaced the "data access layer" mentioned in the sample.
Before Entity Framework, when I wrote a data access layer I would return a generic list that I built from the database. To run an update, delete, or insert, I would pass an object as the parameter to the method, then use the object's properties as the values in the SQL statement. I preferred doing it that way for the reasons you mentioned, but also because it allowed me to change the object definitions or DB schema (or even use a different DB altogether) independently of each other.
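A rough sketch of what that kind of pre-EF data access layer looked like (the table, columns and type names are invented for the example):

    using System.Collections.Generic;
    using System.Data.SqlClient;

    // Hypothetical domain type; names are placeholders.
    public class Person
    {
        public int Id { get; set; }
        public string FirstName { get; set; }
        public string LastName { get; set; }
    }

    public class PersonRepository
    {
        private readonly string connectionString;

        public PersonRepository(string connectionString)
        {
            this.connectionString = connectionString;
        }

        // The DAL returns a generic list built from the database.
        public List<Person> GetAll()
        {
            List<Person> people = new List<Person>();
            using (SqlConnection conn = new SqlConnection(connectionString))
            using (SqlCommand cmd = new SqlCommand(
                "SELECT Id, FirstName, LastName FROM People", conn))
            {
                conn.Open();
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        Person p = new Person();
                        p.Id = reader.GetInt32(0);
                        p.FirstName = reader.GetString(1);
                        p.LastName = reader.GetString(2);
                        people.Add(p);
                    }
                }
            }
            return people;
        }

        // The object's properties become the SQL parameter values.
        public void Update(Person p)
        {
            using (SqlConnection conn = new SqlConnection(connectionString))
            using (SqlCommand cmd = new SqlCommand(
                "UPDATE People SET FirstName = @fn, LastName = @ln WHERE Id = @id",
                conn))
            {
                cmd.Parameters.AddWithValue("@fn", p.FirstName);
                cmd.Parameters.AddWithValue("@ln", p.LastName);
                cmd.Parameters.AddWithValue("@id", p.Id);
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }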
I need to show the total number of rows in the grid title.
The grid also has to deal with a large number of records,
so I decided to use the grid's custom paging feature.
I know how to do server-side paging with SQL 2005 ROW_NUMBER etc.,
but my difficulty is with the complex row-based filtering done in the business logic layer.
I think that doing the complex filtering first (in order to know the item count) on the large number of records will be inefficient and may even cause an out-of-memory exception.
Right now this project (an ASP.NET web app) is in production on .NET Framework 1.1 and SQL 2005.
The next production version will be on .NET Framework 4.0;
after that we will upgrade to SQL 2008.
Please help me find a solution to this problem.
Thanks.
I would say if you are afraid of out-of-memory exceptions in production, either the hardware is undersized for the amount of data you have or your code is doing something really wrong :)
I would have everything done in a stored procedure, including filtering, paging and sorting. Once you have this sorted out on the server and you have specified the page size and page index you need to retrieve, the stored proc simply returns the single page of records you are looking for, already sorted, and you can bind this to your UI controls.
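Calling such a proc from C# might look like this (the proc name and parameters are invented; the total row count comes back as an output parameter for your grid title):

    using System.Data;
    using System.Data.SqlClient;

    public static class GridData
    {
        // Hypothetical call to a proc that filters, sorts and pages on
        // the server; proc and parameter names are invented.
        public static DataTable GetPage(string connectionString,
                                        int pageIndex, int pageSize,
                                        out int totalRows)
        {
            using (SqlConnection conn = new SqlConnection(connectionString))
            using (SqlCommand cmd = new SqlCommand("dbo.GetOrdersPage", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("@PageIndex", pageIndex);
                cmd.Parameters.AddWithValue("@PageSize", pageSize);

                SqlParameter total =
                    cmd.Parameters.Add("@TotalRows", SqlDbType.Int);
                total.Direction = ParameterDirection.Output;

                DataTable page = new DataTable();
                conn.Open();
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    page.Load(reader);         // one page of rows only
                }
                totalRows = (int)total.Value;  // for the grid title
                return page;
            }
        }
    }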
Is this what you wanted or did I get you wrong?
If you are using .NET 4.0, IQueryable is a viable option. See here for details. Basically, IQueryable delays the execution of the query, so you can apply the business logic and then fetch only the relevant data from the underlying data store (SQL Server in your case).
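A rough sketch of the idea (db stands for a LINQ to SQL or EF context; Order, Status and the filter are invented):

    // IQueryable composes the filter before execution, so Count() and
    // the page fetch both run in SQL Server, not in memory.
    IQueryable<Order> query = db.Orders;               // nothing executed yet
    query = query.Where(o => o.Status == "Open");      // business-rule filter

    int totalRows = query.Count();                     // SELECT COUNT(*) in SQL
    List<Order> page = query.OrderBy(o => o.Id)
                            .Skip(pageIndex * pageSize)
                            .Take(pageSize)
                            .ToList();                 // fetches one page only

The caveat is that the business-rule Where must be translatable to SQL; if the filter only exists as arbitrary C# code, it will still run in memory.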
But I would do some micro-benchmarking of the query performance before going down this route.
Application Type:
3-Tier web application with RDBMS at backend
Development Platform:
Client: Silverlight 3 / WPF
Services: WCF web services with BasicHttpBinding
Problem Definition:
I'm trying to develop an application that has client-side business handling, with data-intensive objects being passed to the client. Once the objects are viewed and edited on the client screen, they should be passed to services on the server side to be saved. The issue is that, since the data is of sizable volume, I don't want to pass the entire object back to the services. E.g., if I have a collection of 10 rows with 10 columns each and only 2 columns are updated, I should be able to pass only the changed data.
Question:
Is this good practice, and if so, what's the best way to achieve it?
Tried solutions:
I have tried two solutions:
1: Have setters with an event delegate that raise change notifications (sketched below)
2: Use a custom data type
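For option 1, a minimal sketch of the change-notification idea (type and property names are invented):

    using System.Collections.Generic;

    // Tracks which properties changed so only those go back to the service.
    public class CustomerDto
    {
        private string firstName;
        private readonly List<string> dirty = new List<string>();

        public string FirstName
        {
            get { return firstName; }
            set
            {
                if (firstName != value)
                {
                    firstName = value;
                    if (!dirty.Contains("FirstName"))
                        dirty.Add("FirstName");    // record the change
                }
            }
        }

        // Names of the properties modified since the object was loaded;
        // only these need to go back over the wire.
        public IList<string> DirtyProperties
        {
            get { return dirty; }
        }
    }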
If you're using WCF web services as automagically generated by VS, then you're pretty much constrained to transmitting classes that are known; thus to transmit smaller chunks of info you will need to define some new classes specifically for that purpose. Such objects are, I believe, commonly called DTOs (Data Transfer Objects). So, for your scenario with the 10 x 10 matrix, your DTO would perhaps contain a list of {x, y, value} triples.
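For example, the DTO and service operation might be as small as this (all names are invented):

    using System.Collections.Generic;
    using System.Runtime.Serialization;
    using System.ServiceModel;

    // Hypothetical DTO carrying only the changed cells.
    [DataContract]
    public class CellDelta
    {
        [DataMember] public int Row { get; set; }
        [DataMember] public int Column { get; set; }
        [DataMember] public string Value { get; set; }
    }

    // Hypothetical service contract accepting just the deltas.
    [ServiceContract]
    public interface ISaveService
    {
        [OperationContract]
        void SaveChanges(List<CellDelta> changes);
    }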
If you're using a REST web service (and composing your own), then you might avoid the DTO classes entirely and just create an XML schema adequate to convey the info; e.g., a top-level element with subelements of the form:
<Deltas>
  <Delta x="3" y="9"> ...value subelement goes here </Delta>
  ... more Delta elements
</Deltas>
Your REST service would then have to do the work of incrementally updating the server-side database records. You'd probably need a distinct REST URL for each data type.
HTH
Bill