WebClient and caching in Silverlight + WP7 (C#)

I am using WebClient to download a JSON file every time my WP7 application loads. I load all the details in one shot from a server endpoint that serves this JSON. The obvious problem I ran into was caching: WebClient kept loading a stale copy. I tackled this by adding a dummy URL parameter to the end.
However, the JSON changes very rarely, so I would still like to take advantage of the caching that WebClient uses automatically. To do this, I first request the server's JSON version from something like http://myserver/JSONVersion. This JSONVersion is updated any time the JSON is updated.
Once I have it, I append it to my URL: http://myserver/myjson.json?v=(JSONVERSION). This has solved the problem entirely, but it feels ugly and leaves unnecessary code and logic floating around. I am hoping the HTTP cache headers offer something equivalent to what I am doing. If so, please let me know.
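For what it's worth, a minimal sketch of the version-check approach described above; the endpoint names come from the question, and ParseJson is a hypothetical handler for the downloaded string:

using System;
using System.Net;

// Runs from the app/page load handler.
void LoadJson()
{
    var versionClient = new WebClient();
    versionClient.DownloadStringCompleted += (s, e) =>
    {
        string version = e.Result.Trim();
        // The version acts as a cache key: the URL only changes when the
        // JSON does, so the cached copy is reused until JSONVersion changes.
        var dataClient = new WebClient();
        dataClient.DownloadStringCompleted += (s2, e2) => ParseJson(e2.Result);
        dataClient.DownloadStringAsync(new Uri("http://myserver/myjson.json?v=" + version));
    };
    versionClient.DownloadStringAsync(new Uri("http://myserver/JSONVersion"));
}

// Hypothetical stub: hand the JSON off to the app's model here.
void ParseJson(string json) { }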

In the end, I found no better solution than the one I currently have.

Related

React and .NET app data fetching in portions

I tried to find the answer to my question, but it seems I am either missing the correct terminology or it really is a bit tricky to do.
I am trying to see whether it is possible to use either lazy loading or data sent from the API in portions periodically, so that the time to first render is shorter. My current setup, in which an array of over 1000 objects is fetched from a .NET API into the React UI, just does not perform the way I would like.
I would like to skip pagination if possible.
Just implement an endpoint GET /data?show=X&skip=Y. For the first request, get data from /data?show=10; then, whenever you want (for example, when the user reaches the bottom of the page), call /data?show=10&skip=10.
I'm not sure if I understood you correctly, but I hope it helps somehow :)
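A minimal sketch of such an endpoint in ASP.NET Core; the controller, Item type, and IItemRepository are assumptions standing in for the real data source:

using System.Linq;
using Microsoft.AspNetCore.Mvc;

public class Item { public int Id { get; set; } public string Name { get; set; } }

// Hypothetical repository abstraction over the real data source.
public interface IItemRepository { IQueryable<Item> Query(); }

[ApiController]
[Route("data")]
public class DataController : ControllerBase
{
    private readonly IItemRepository _items;

    public DataController(IItemRepository items) { _items = items; }

    // GET /data?show=10&skip=20
    [HttpGet]
    public IActionResult Get(int show = 10, int skip = 0)
    {
        // Only materialize the requested window, not all 1000+ objects.
        var page = _items.Query().Skip(skip).Take(show).ToList();
        return Ok(page);
    }
}

On the React side, the client then simply repeats the fetch with an increasing skip value as the user scrolls.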

Using XmlDocument.Load vs. an HTTP GET for loading XML documents

So I have been reading around and I can't seem to find a simple enough answer. I have been doing a bit of work with web services and XML documents being sent around, but now I'm looking to understand something a little better:
XmlDocument.Load(url) versus myHttpWebRequest = (HttpWebRequest)HttpWebRequest.Create(inURL);
Now, there is obviously a little more code to each of these, but I am just giving a brief idea of both so we're all on the same page.
I have used both, and they both work perfectly well; I just don't want to sell myself short by using one over the other (.Load(url) involves WAY less code).
In my case (testing at the moment), I am using the former to get tiny amounts of data from my web service and the latter to post a fair bit of information back to it.
So my question is not really which is better, but when would it be desirable to use one over the other?
Does it make a big difference, or are they just two ways of doing the same thing without any negatives?
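For reference, a minimal sketch of the two approaches side by side (the URL is illustrative):

using System.Net;
using System.Xml;

// Approach 1: XmlDocument.Load performs the HTTP GET for you.
var doc = new XmlDocument();
doc.Load("http://example.com/service/data.xml");

// Approach 2: HttpWebRequest gives you control over the request
// (method, headers, timeouts, request body) before you parse.
var request = (HttpWebRequest)WebRequest.Create("http://example.com/service/data.xml");
request.Method = "GET";
using (var response = (HttpWebResponse)request.GetResponse())
using (var stream = response.GetResponseStream())
{
    var doc2 = new XmlDocument();
    doc2.Load(stream);
}

The trade-off in one line: .Load(url) is fine for a simple GET, while HttpWebRequest earns its extra code as soon as you need to POST data or shape the request.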

Increase MaxFieldLength and MaxRequestBytes in web.config

I have a GET request with a very large URL which works.
But every subsequent call fails: the large URL becomes the Referer, the whole header then exceeds 16 KB, and the request fails with:
The size of the request headers is too long.
I know I can fix this with the registry hack, but I need to change it in web.config. Is that possible at all?
Please don't advise me to shorten the URL.
No, you can't change it from web.config.
You can only do it via the registry, because this setting is used at kernel level (http.sys).
It will apply to all web sites, and it has many implications and concerns, so you have to decide whether you REALLY need this.
You say not to advise shortening the URL, but I don't think you're fully aware of the implications.
The problem is your too-long URLs, not this setting; you're just working around the problem by changing a setting, and the problem will recur in other forms and other places.
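For reference, the registry hack being discussed lives under the http.sys parameters key. A sketch as a .reg file, with illustrative values rather than recommendations; http.sys reads these at startup, so a restart is needed afterwards:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\HTTP\Parameters]
; Maximum size of an individual request header, in bytes.
"MaxFieldLength"=dword:0000fffe
; Maximum combined size of the request line and headers, in bytes.
"MaxRequestBytes"=dword:00100000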

Fastest way to code up hitting a URL

I need to login to a site, then hit a certain URL about a thousand times (with different params, of course).
The URL is something like this:
http://www.foo.com/bar.asp?id=x, where x is the ID.
Of course, if I simply hit the URL without being logged in, it will fail.
I am not very familiar with this type of work, but I would imagine that whatever method I choose would have to support cookies.
I was thinking that I could create a winform app with a browser control and somehow drive it, but that seems like a massive overkill.
Is there a better way?
If you are determined to do it in your own code, then I don't think anything is stopping you.
The HttpWebRequest and HttpWebResponse classes have pretty much everything you need for that.
Moreover, if you are concerned about cookies, you can always store the received cookies in a database or file and send them with every subsequent request.
If you want to know the structure of an HTTP request, such as a GET request, look here.
You can also make your request look like a request from a browser by specifying the proper request headers (however, that doesn't work every time).
And all of this can be done even in a console app.
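A minimal console sketch along these lines; the login URL, form field names, and ID range are assumptions:

using System;
using System.IO;
using System.Net;
using System.Text;

class UrlHitter
{
    static void Main()
    {
        // One CookieContainer shared across requests keeps the login session alive.
        var cookies = new CookieContainer();

        // Hypothetical login form post; the field names depend on the site.
        var login = (HttpWebRequest)WebRequest.Create("http://www.foo.com/login.asp");
        login.Method = "POST";
        login.ContentType = "application/x-www-form-urlencoded";
        login.CookieContainer = cookies;
        byte[] body = Encoding.UTF8.GetBytes("user=me&pass=secret");
        using (var s = login.GetRequestStream()) { s.Write(body, 0, body.Length); }
        login.GetResponse().Close();

        // Hit the target URL with different IDs, reusing the session cookies.
        for (int id = 1; id <= 1000; id++)
        {
            var req = (HttpWebRequest)WebRequest.Create("http://www.foo.com/bar.asp?id=" + id);
            req.CookieContainer = cookies;
            using (var resp = (HttpWebResponse)req.GetResponse())
            using (var reader = new StreamReader(resp.GetResponseStream()))
            {
                string page = reader.ReadToEnd();
                Console.WriteLine("id {0}: {1} bytes", id, page.Length);
            }
        }
    }
}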
You may want to look into WCAT if you are mainly interested in how your server performs under load.
With Python or PHP, you can use the libcurl library; I believe both languages have bindings for it. If not, just use the urllib2 module (for Python).

SQL 2008: returning data rows as JSON?

I think this question is like clay pigeon shooting: "pull... bang!"... shot down. Nevertheless, I believe it's worth asking.
Lots of JS frameworks use JSON these days, and for good reason, I know. The classic question is where to transform the data to JSON.
I understand that at some point in the pipeline you have to convert the data to JSON, be it in the data access layer (I am looking at JSON.NET); I also believe .NET 4.x has methods to output/serialize as JSON.
So the question is:
Is it really a bad idea to contemplate a SQL function to output as JSON?
Qualifier:
I understand that trying to output thousands of rows like that isn't a good idea; in fact, it's not really a good idea for web apps either way, unless you really have to.
For my requirement, I need possibly 100 rows at a time...
The answer really is: it depends.
If your application is a small one that doesn't receive much use, then by all means do it in the database. The thing to bear in mind, though, is: what happens when your application is being used by 10x as many users in 12 months' time?
If it makes it quick, simple and easy to implement JSON encoding in your stored procedures, rather than in your web code and allows you to get your app out and in use, then that's clearly the way to go. That said, it really doesn't take that much work to do it "properly" with solutions that have been suggested in other answers.
The long and short of it is, take the solution that best fits your current needs, whilst thinking about the impact it'll have if you need to change it in the future.
This is why [WebMethod] (WebMethodAttribute) exists.
Best to load the data into your program and then return it as JSON.
.NET 4 has support for returning JSON; I did it as part of an ASP.NET MVC site, and it was fairly simple and straightforward.
I recommend moving the transformation out of SQL Server.
I agree with the other respondents that this is better done in your application code. However, it is theoretically possible using SQL Server's ability to host CLR assemblies in the database via the CREATE ASSEMBLY syntax. The choice is really yours: you could create a .NET assembly to do the translation, register that assembly with SQL Server, and then use its method(s) to serialize to JSON as return values from your stored procedures...
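A sketch of what one such CLR method might look like; the hand-rolled escaping is illustrative, since SQL CLR restricts which framework assemblies you can reference, so a full JSON library may not be an option:

using System.Data.SqlTypes;
using System.Text;
using Microsoft.SqlServer.Server;

public static class JsonFunctions
{
    // Scalar UDF returning one name/value pair as a JSON fragment.
    // Registered via CREATE ASSEMBLY and CREATE FUNCTION.
    [SqlFunction]
    public static SqlString ToJsonPair(SqlString name, SqlString value)
    {
        if (name.IsNull || value.IsNull)
            return SqlString.Null;
        var sb = new StringBuilder();
        sb.Append('"').Append(Escape(name.Value)).Append("\":\"");
        sb.Append(Escape(value.Value)).Append('"');
        return new SqlString(sb.ToString());
    }

    private static string Escape(string s)
    {
        return s.Replace("\\", "\\\\").Replace("\"", "\\\"");
    }
}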
Better to load it using your standard data access technique and then convert to JSON. You can then use it in standard .NET objects as well as in your client-side JavaScript.
If you're using ASP.NET MVC, you serialize your results in the controller and return a JsonResult; the Controller.Json() method does this for you. If you're using WebForms, an HTTP handler plus the JavaScriptSerializer class would be the way to go.
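A minimal MVC sketch of that; the controller name and data are illustrative:

using System.Web.Mvc;

public class ProductsController : Controller
{
    // GET /Products/List
    public JsonResult List()
    {
        var rows = new[]
        {
            new { Id = 1, Name = "Widget" },
            new { Id = 2, Name = "Gadget" }
        };
        // Json() serializes the object graph and sets the JSON content type;
        // AllowGet is required when responding to GET requests.
        return Json(rows, JsonRequestBehavior.AllowGet);
    }
}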
Hey, thanks for all the responses; it still amazes me how many people out there have the time to help.
All very good points, and they certainly confirmed my feeling that the app layer should do the conversion work, as the glue between the actual data and the frontend. I guess I haven't kept up too much with MVC or SQL 2008, so I was unsure whether there were some nuggets worth tracking down.
As it worked out (following some links posted here and some further fishing), I have opted to do the following for the time being (stuck on .NET 3.5 and no MVC right now):
Getting the SQL data as a DataTable/DataReader
Converting the DataTable to a collection (a list of dictionaries) to get a serializable list
Because right now I am using an ASHX page to act as the broker to the JavaScript (i.e. via a jQuery AJAX call), within my ASHX page I have:
context.Response.ContentType = "application/json";
System.Web.Script.Serialization.JavaScriptSerializer json = new System.Web.Script.Serialization.JavaScriptSerializer();
I can then issue: json.Serialize(<>)
It might seem a bit backward, but it works fine, and the main caveat is that it never returns huge amounts of data at a time.
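Put together, a sketch of what the handler looks like; the GetData stub stands in for the real data access:

using System.Collections.Generic;
using System.Data;
using System.Web;
using System.Web.Script.Serialization;

public class DataBroker : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        DataTable table = GetData();

        // DataTable -> list of dictionaries, which JavaScriptSerializer handles natively.
        var rows = new List<Dictionary<string, object>>();
        foreach (DataRow row in table.Rows)
        {
            var dict = new Dictionary<string, object>();
            foreach (DataColumn col in table.Columns)
                dict[col.ColumnName] = row[col];
            rows.Add(dict);
        }

        context.Response.ContentType = "application/json";
        var json = new JavaScriptSerializer();
        context.Response.Write(json.Serialize(rows));
    }

    public bool IsReusable { get { return false; } }

    private DataTable GetData()
    {
        // Placeholder: filled from a SqlDataAdapter in the real handler.
        var t = new DataTable();
        t.Columns.Add("Id", typeof(int));
        t.Columns.Add("Name", typeof(string));
        t.Rows.Add(1, "Widget");
        return t;
    }
}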
Once again, thanks for all the responses!
