Silverlight/C# - Best way to dynamically load WCF data?

I would like to be able to load data into a DataGrid in Silverlight as it becomes available. Here is the scenario:
My Silverlight client fires a WCF call to the server.
The server takes about 1 to 2 seconds to respond.
The response is between 1 MB and 4 MB (quite large).
This data is loaded into a DataGrid.
Although the server responds quickly, the data is not seen by the user until all 1 MB to 4 MB has been downloaded.
What is the best (or most efficient) way to load this data into the DataGrid as it is being downloaded by the client? As opposed to waiting for the download to complete?

A way of handling this is to implement custom virtualization:
Add a web service method that returns only IDs.
Add a web service method that returns a single object by ID.
Retrieve the IDs and load only the visible objects (and perhaps a few more to allow scrolling).
Retrieve more objects as they are needed for scrolling.
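A minimal sketch of the two service operations; all type and member names here are assumptions, not taken from the question:

using System.Collections.Generic;
using System.Runtime.Serialization;
using System.ServiceModel;

// Hypothetical contract illustrating the idea; names are assumptions.
[ServiceContract]
public interface IRecordService
{
    [OperationContract]
    IList<int> GetRecordIds();      // cheap call: only the keys come across the wire

    [OperationContract]
    Record GetRecord(int id);       // one full record per call, fetched on demand
}

[DataContract]
public class Record
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }
}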

The problem is part of what I was trying to get at with my comment (you still didn't really specify the data type of the return), and what Erno gave you a workable solution for. The web service serializes whatever return type you are sending, and will not give you partial results. It's not a question of how you interface with the grid; it's a question of when the web service call on the client says "OK, I received the data you need, now continue processing." For instance, if you are gathering up a data table on the server side with 4 MB worth of records, and then in your service do a:
return MyMassiveDatatable;
then you are going to have to wait for the entire data table to be serialized and pumped across the wire.
His solution was to break the transfer up into atomic units, i.e. query first for the IDs of the records in one web service call, then iterate through those IDs and request the record for each ID one at a time. As you receive each record back, add it to your client-side table so that your display shows each record as you get it.
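A rough Silverlight-side sketch of that loop, assuming a generated async proxy named RecordServiceClient for the contract above and a DataGrid named myDataGrid (all hypothetical). Because the grid is bound to an ObservableCollection, each row appears as soon as its record arrives:

// Hypothetical Silverlight code-behind; the completed events fire on the UI thread.
private readonly ObservableCollection<Record> records = new ObservableCollection<Record>();

private void LoadRecords()
{
    myDataGrid.ItemsSource = records;                 // rows appear as they are added

    var client = new RecordServiceClient();
    client.GetRecordCompleted += (s, e) => records.Add(e.Result);
    client.GetRecordIdsCompleted += (s, e) =>
    {
        foreach (var id in e.Result)
            client.GetRecordAsync(id);                // one small request per record
    };
    client.GetRecordIdsAsync();
}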

Related

How to page an API response to another system expecting only one response?

I'm working with the integration between two systems (A and B), and I'm responsible for the A side (C#).
In that scenario, the B client consumes the A API, where each endpoint delivers all data related to that endpoint (for example, the Clients endpoint delivers all clients in a single response).
The issue is that for some endpoints the amount of data may (and will) grow considerably, and I need to find some way of sending it in chunks, preferably without needing to change the B implementation (there are many Bs).
One big detail in this challenge is that the B side can't click Next or anything similar to get to the next page; it's a system.
Any idea/library/tip that can help me?
Actual scenario: the ClientController delivers all 10,000 clients in one response.
Expected scenario: send 100 clients at a time, sending the next 100 only after a response confirming that the previous 100 arrived OK.
It depends.
If you have an object to share, like a big video, you can use "range" requests and return pieces of the total bytes. This won't work with a request whose results change, because it obviously expects multiple piece requests and the result won't be understandable until you have every bit of the response.
Another idea is to have an ID for every object in the response and ask for a response starting after the last known ID, given as a parameter of the request.
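For the last-known-ID idea, a minimal LINQ sketch; db, Clients and lastKnownId are assumptions:

// Hypothetical keyset query; db is an assumed EF/LINQ-to-SQL data context.
var page = db.Clients
             .Where(c => c.Id > lastKnownId)   // resume right after the last ID the caller received
             .OrderBy(c => c.Id)
             .Take(100)
             .ToList();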
You can use a query string:
/your_api/clients?limit=100&start=0
So on the system B side you will iterate over that and pull as much as you need at that moment, by changing start= and/or limit=.
Edit:
There is no need to do any kind of mumbo jumbo; a query string is a good thing. If you change it, there is no need to update consumers the way there is when you change the API itself, so if there is some third system that doesn't need your change, it doesn't need to be updated.
If you need to update system B in order to optimize it, do it.
Mumbo jumbo is not a good thing for a quality, sustainable API.
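A minimal ASP.NET Web API sketch of the /your_api/clients?limit=100&start=0 idea; the controller and MyDbContext names are assumptions. System B keeps incrementing start until it receives fewer than limit items:

using System.Linq;
using System.Web.Http;

// Hypothetical controller; MyDbContext stands in for whatever data access you already have.
public class ClientsController : ApiController
{
    private readonly MyDbContext db = new MyDbContext();

    [HttpGet]
    public IHttpActionResult Get(int start = 0, int limit = 100)
    {
        var page = db.Clients
                     .OrderBy(c => c.Id)       // a stable order matters for offset paging
                     .Skip(start)
                     .Take(limit)
                     .ToList();
        return Ok(page);                       // B asks again with a larger start until the page is short
    }
}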

How to show real time counter in asp.net

I want to process some records one by one and need to show the record number on the .aspx page, i.e. that we are currently processing the nth record.
Setting the value of the nth record in a label does not work, as it is only reflected after the response comes back from the server (it does not get refreshed for every record, so the page looks frozen).
You can use different techniques:
Do server polling via AJAX requests at some interval. A well-answered question on this topic: Server polling with JavaScript
Use SignalR to push changes back to the client when there are changes on the server. You can read more about it here: How SignalR works internally?
Choose the one that is applicable in your case.
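A minimal SignalR 2.x sketch of the second option, assuming a hub named ProgressHub and a client-side updateProgress handler (both hypothetical names):

using System.Collections.Generic;
using Microsoft.AspNet.SignalR;

// Empty hub: the server only pushes, the page's JavaScript only listens.
public class ProgressHub : Hub { }

public class RecordProcessor
{
    public void ProcessAll(IList<string> records)
    {
        var hub = GlobalHost.ConnectionManager.GetHubContext<ProgressHub>();
        for (int i = 0; i < records.Count; i++)
        {
            Process(records[i]);                         // per-record work (hypothetical)
            hub.Clients.All.updateProgress(i + 1);       // client-side handler updates the label immediately
        }
    }

    private void Process(string record) { /* ... */ }
}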

send multiple responses using task based approach instead of events/callbacks

Recently, in C# 4.0, the task-based approach to async programming was unveiled, so we have been trying to rework some of our functions that previously used callbacks.
The problem we are facing is implementing multiple responses for these functions using tasks. For example, we have a function that fetches some data from a third-party API. But before fetching the data from the API, we first check whether we already have it in our in-memory cache or in the DB; only then do we go to the API. The main client application sends a request with a list of symbols for which data should be fetched. If we find data for some symbols in the cache or in the DB, we send it immediately via the callback. For the remaining symbols we request the API.
This gives a feeling of real-time processing in the client application for some symbols, and for the other symbols the user gets to know that it will take time. If I do not send responses to the client instantly and instead first collect all the data and only then send a response for the whole list, the user will be stuck waiting on 99 symbols even if only 1 symbol has to be fetched from the API.
How can I send multiple responses using the task-based approach?
It seems like you want to have an async method that returns more than once. The answer is you can't.
What you can do is:
Call two different methods with the same symbols: the first only checks the cache and DB and returns what it can, and the second only calls the remote API. This way you get what you can quickly from the cache and the rest more slowly.
Keep using callbacks in a "mini producer-consumer" design so you can invoke them as many times as you like.
I could try for a more concrete answer if you post the code you're using.
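A rough sketch of the first option. GetFromCacheOrDbAsync, GetFromApiAsync, Quote and DisplayQuotes are all hypothetical names; the point is that the caller updates the UI as each task completes instead of waiting for one combined response:

// Sketch only: helper methods and types are assumptions.
public async Task LoadQuotesAsync(IList<string> symbols)
{
    // Fast path: whatever is already in the cache or DB comes back almost immediately.
    var cached = await GetFromCacheOrDbAsync(symbols);
    DisplayQuotes(cached);                                    // the user sees these right away

    // Slow path: only the symbols that were missing go to the third-party API.
    var missing = symbols.Except(cached.Select(q => q.Symbol)).ToList();
    if (missing.Count > 0)
    {
        var fetched = await GetFromApiAsync(missing);
        DisplayQuotes(fetched);                               // second update when the API answers
    }
}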

WCF GridView paging and caching

Is there a way to cache user-requested data depending on the user/connection? My situation is that I display the data returned from WCF in an ASP.NET GridView using paging. The GridView paging displays only 10 items at a time. Whenever the next page is clicked, the service gets called again (which takes time). After each call the WCF connection is closed. Is there a way to fix this? I read up on the WCF/ASP.NET caching mechanism that caches function-call data and expires it after a certain time. My main goal is per-user/per-call caching: return 10 items at a time without calling the long-running function again for each set of 10 items. Is there a way to do this?
Basically, call the function the first time and get 100 items, then return them 10 at a time, without ever running the function (which gets 100 items) again?
I believe that the ASP.NET GridView always does a postback, which results in a function call every time a user changes page. But I may be proven wrong by someone.
That being said, this may not be the best solution for you, but you could avoid using the ASP.NET GridView in your scenario. When you run a WCF call and get 100 items back, keep them on the client side as JSON stored in memory, then use jQuery or some other form of JavaScript to let users page through them without making another function call.
Edited
This may prove useful in helping you do paging on the client side without multiple repeated calls to the server:
http://www.smallworkarounds.net/2009/02/jquery-aspnet-how-to-implement.html
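If you would rather keep the ASP.NET GridView, a minimal sketch of the per-user caching the question asks about, using session state. MyWcfServiceClient, GetAllItems and Item are assumed names; the expensive WCF call runs once per user, and page changes are served from the cached list:

// Hypothetical code-behind: fetch the full result once per user and page the GridView from Session.
private List<Item> GetItems()
{
    var items = Session["AllItems"] as List<Item>;
    if (items == null)
    {
        using (var client = new MyWcfServiceClient())        // assumed generated proxy
        {
            items = client.GetAllItems().ToList();           // the one expensive call (~100 items)
        }
        Session["AllItems"] = items;
    }
    return items;
}

protected void GridView1_PageIndexChanging(object sender, GridViewPageEventArgs e)
{
    GridView1.PageIndex = e.NewPageIndex;
    GridView1.DataSource = GetItems();                       // no WCF call on page changes
    GridView1.DataBind();
}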

Update client-PCs with each others activities (insert, update, delete)

I am developing a client-server application using C# .NET WinForms with SQL Server 2008. The client PCs connect to the database server via the LAN.
The task I want to achieve is that when an insert (or update or delete) is performed on one client PC, all other clients must get that update in real time.
I am currently using timers, so each client queries the database every 15 seconds and then refreshes the grid views, combo boxes and list boxes. But this makes the application slow and bulky to use.
What is the correct method to use in such a scenario? What are such operations called (correct terminology)? Should I use Windows services, or the same application with background threads?
First of all, it's Windows, so it cannot ever be truly real-time.
The solution that Igby Largeman suggests is quite possible. It does have the disadvantage that it can cause very heavy network traffic, because every time something changes in the database, it is broadcast to all the clients.
You also have to consider the possibility that something clogs up the communication between the server and one or more clients, so real-time is out of the question.
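One common way to get change notifications from SQL Server 2008 without constant polling is query notifications via SqlDependency (this requires Service Broker to be enabled on the database). A minimal sketch; the connection string, table and column names are assumptions:

using System.Data.SqlClient;

// Hypothetical watcher: each client re-runs the query and refreshes its grids when notified.
public class ClientsWatcher
{
    private readonly string connectionString;   // assumed to point at the SQL Server 2008 DB

    public ClientsWatcher(string connectionString)
    {
        this.connectionString = connectionString;
        SqlDependency.Start(connectionString);   // once per app; pair with SqlDependency.Stop on shutdown
    }

    public void SubscribeAndLoad()
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT Id, Name FROM dbo.Clients", connection))
        {
            var dependency = new SqlDependency(command);
            dependency.OnChange += (s, e) =>
            {
                // Fires once when the result set changes: marshal to the UI thread,
                // refresh the grids, then call SubscribeAndLoad() again to re-register.
            };

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                // Load rows into whatever backs your grid views, combo boxes and list boxes.
            }
        }
    }
}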
That's tricky!
If you really want a user with a grid open on PC A to see data inserted on PC B without performing any action, you will need a timer to refresh the grid. But I don't think this is a good approach; you can easily overload the system.
The good practice here is to display only the data that is about to be manipulated. So, for example, let's say you want to alter a client's name. You build a search form with a grid where the user can enter search parameters (to filter the data), and once the client is found and altered, you perform another search against the DB to get and display the new data.
But let's say another user had the same grid open, showing the client, before you performed the alteration. It will be showing the old value, but once they click on it to see the details, you'll perform another search against the DB to get the new data, so that would be OK.
