Using XmlDocument.Load vs. an HTTP GET for loading XML documents - C#

So I've been reading around and I can't seem to find a simple enough answer. I've been doing a bit of work with web services and XML documents being sent around, but now I'm looking to understand something a little better.
XmlDocument.Load(url) and myHttpWebRequest = (HttpWebRequest)HttpWebRequest.Create(inURL);
There is obviously a little more code to each of these, but I'm just giving a brief idea of both so we're all on the same page.
I have used both, and they both work perfectly well; I just don't want to sell myself short when using one over the other (.Load(url) takes WAY less code).
In my case (testing at the moment) I am using the former to get tiny amounts of data from my web service and the latter to post a fair bit of information back to my web service.
So my question isn't really which is better, but when would it be desirable to use one over the other?
Does it make a big difference, or are they just two ways to do the same thing without any downsides?
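For reference, here is a minimal sketch of both approaches side by side (the URL is just a placeholder); the HttpWebRequest version is simply the longer, more explicit way to do the same GET-and-parse, in exchange for control over headers, timeouts, credentials, and so on:
using System;
using System.Net;
using System.Xml;

class Demo
{
    static void Main()
    {
        var url = "http://example.com/service/data.xml"; // placeholder URL

        // Option 1: let XmlDocument issue the GET and parse in one call.
        var doc1 = new XmlDocument();
        doc1.Load(url);

        // Option 2: issue the GET yourself, then feed the response stream to XmlDocument.
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Timeout = 10000; // settings like this are what the longer form buys you
        using (var response = (HttpWebResponse)request.GetResponse())
        using (var stream = response.GetResponseStream())
        {
            var doc2 = new XmlDocument();
            doc2.Load(stream);
        }

        Console.WriteLine("Both documents loaded.");
    }
}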


ASP.NET Web API return data every second

First of all I am sorry for my poor question title, but I can't think of anything better right now.
What I want to do seems quite simple. I want my API to return, for example, a simple string every second, for like 10 times.
Is there any way to achieve this with ASP.NET?
And yes, obviously the actual use case is different. My API is scraping some data and returning it after that. The problem is the scraping takes around 70 seconds. I would like it to return the data in smaller packages, so my application can start processing the data earlier.
But the way to achieve this - if there is any - should be the same.
At least that's how I think of it.
Thanks for any help.
This is not possible with plain HTTP.
You must do long polling on the client side, or you can create a socket connection between the client and the server. The SignalR package is a great choice for that.
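As a rough sketch of the SignalR route (this assumes the classic Microsoft.AspNet.SignalR package; names like ScrapeHub and partialResult are made up for illustration), the server pushes each piece of data to the caller as soon as it is ready instead of returning a single response:
using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;

public class ScrapeHub : Hub
{
    // The client calls this once; the hub then pushes ten partial results, one per second.
    public async Task StartScrape()
    {
        for (var i = 0; i < 10; i++)
        {
            // In the real use case this would be the next chunk of scraped data.
            Clients.Caller.partialResult("chunk " + i);
            await Task.Delay(1000);
        }
    }
}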

Any tips on how I would go about extracting Pandora likes and putting them in a spreadsheet? (C++/C#)

I'm fairly new to coding and I want a project to work on that could help me advance my skills. I'm not sure which language would be best for this sort of undertaking, but I would definitely prefer to use C++ or C#.
For the first part of the program I would basically like to take all my Pandora likes and put them in a spreadsheet, with the song name in one column and the artist in the other. I don't see the formatting being too hard once I actually get the data I need, but I'm not really sure how to communicate with a server at all at this point. I'm guessing I probably won't be able to grab a raw list of likes, so I'm thinking my best course of action will be to first expand the likes list all the way, and then read the text on the screen or in the source code.
For the first step, expanding my likes, I found the HTML source code that actually does this:
<div class="show_more tracklike" data-nextLikeStartIndex="0" data-nextThumbStartIndex="5">Show more</div>
Not sure if this is something I can work with, but I was thinking that if I could set data-nextThumbStartIndex="5" equal to the number of likes minus 5 (the amount it shows by default), it would be fairly easy to expand the list. If not, I would probably have to click the "show more" link repeatedly until I have all the likes on the page.
For the next step, getting the data I want, I think my best option would be to basically just grab the text that I physically see on the screen and worry about filtering and manipulating the data afterwards. The other option is looking at the source code, where I actually found the pieces of code that hold the info I want. If I could retrieve the page's source code, I think it would be relatively easy to pick out the data I actually want.
So yeah, that's about it. I know I'm pretty much a noob at the moment and what I'm saying is probably wrong and/or much more complicated than I think, but I'm a pretty quick learner, and at the very least, if someone could point me in the right direction for communicating with a server, that would be much appreciated.
This question is quite "wide" (and I have absolutely no knowledge of Pandora itself - can't access it from where I live).
In general, there are several different ways to solve this type of problem:
Screen scraping - basically access the website as if you were a web browser, and from the HTML string that comes back, dig out the information you need. The problem here is that the data is not very suitable for machine reading, as it often has no distinct markers for the "reader" to find the relevant information, and it's difficult to separate the data you want from the chaff.
AJAX API - "Asynchronous JavaScript and XML", where the provider of the website has an interface for fetching certain data into the web browser - and of course you can "pretend" to be the web browser and request the same kind of information. You are relying on the website having such an interface, but if it exists, the data generally comes back in a form that is more suitable for machine reading (typically XML, but not always).
JSON API - "JavaScript Object Notation" is a similar solution to AJAX; like XML, JSON is a human- and machine-readable format.
The latter two are definitely preferable, as the data coming back is meant for machine reading. The drawback is that you need server-side cooperation. The good news is that Pandora does have a JSON API. The bad news is that it seems to be hard to use... Here's one discussion on the subject:
Making JSON calls to Unoffical Pandora API
The main principle here is that you send some stuff to the webserver, and receive a reply with the requested information. Exactly how this is done depends on the language/programming environment. A popular C++ solution is libcurl.
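In C# (the other language the question mentions), that principle looks roughly like this; the URL is a placeholder, not a real Pandora endpoint:
using System;
using System.Net;

class Demo
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            // Download the raw response body: HTML if you are screen scraping,
            // or JSON/XML if the site exposes an API.
            string body = client.DownloadString("http://example.com/likes");

            // From here you would parse the HTML or deserialize the JSON.
            Console.WriteLine(body.Length);
        }
    }
}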
There is a Ruby Client here, using the JSON interface
https://github.com/nixme/pandora_client
A C# implementation to interface with Pandora is here:
http://pandoraunleashed.googlecode.com/svn/trunk/PandoraUnleashed/Pandora.cs
Unfortunately, I can't find any direct reference to "listing likes".

Web API get dynamic Form Data

I'm in the process of converting an ASP.NET application from MVC controllers to ApiController. So far everything is going pretty smoothly, except I've had a few hiccups.
The problem I'm having right now is a few methods are having requests of the form:
sort:FieldName
dir:DESC
filter[0][field]:FieldName
filter[0][data][type]:string
filter[0][data][value]:deadeawd
(the content type is application/x-www-form-urlencoded;)
The sort and dir can easily be captured within a model class, and I've done so, but I don't know how to capture the filter[0] fields. (There could be filter[1] and so on as well; how many there are is not known ahead of time, i.e. the data structure is dynamic.)
Currently the application grabs the form data, and a method builds a query string based on the data there, but in Web API we no longer have access to the form data directly.
I could use a dynamic object, or a NameValueCollection, but I'm just trying to figure out what's the best option, what's the intended usage, and what's the best practice.
(in case you're wondering, the request data can't be changed, it's from a framework that we are using and don't have an easy way to override how it does things)
The answer we ended up going with was to change the framework so it sent a proper JSON object.
It was quite a bit of work, and it means it might be difficult to update it, but we're looking at moving away from the framework soon anyways, and we don't want the trivial little details of this framework to alter our API in a negative way.
If anyone is interested, the dynamic or NameValueCollection probably would've worked, and I don't think there is a best practice, because passing data like this is already bad practice.
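For anyone who does want to keep the urlencoded payload, here is a rough sketch of one route this thread didn't take: binding the body to FormDataCollection in classic ASP.NET Web API and walking the filter[n] keys. The controller name is made up, and the field names just mirror the example request above:
using System.Linq;
using System.Net.Http.Formatting;
using System.Web.Http;

public class GridController : ApiController
{
    // Web API binds the application/x-www-form-urlencoded body to FormDataCollection,
    // so the dynamic filter[n][...] keys can be read without a fixed model class.
    public IHttpActionResult Post(FormDataCollection form)
    {
        var sort = form.Get("sort");
        var dir = form.Get("dir");

        var filters = form
            .Where(pair => pair.Key.StartsWith("filter["))
            .ToDictionary(pair => pair.Key, pair => pair.Value);

        return Ok(new { sort, dir, filterCount = filters.Count });
    }
}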

SQL 2008: returning data rows as JSON?

I think this question is like clay pigeon shooting... "pull... bang!"... shot down... but nevertheless, I believe it's worth asking.
Lots of JS frameworks etc. use JSON these days, and for good reason, I know. The classic question is where to transform the data to JSON.
I understand that at some point in the pipeline you have to convert the data to JSON, be it in the data access layer (I am looking at JSON.NET), or, I believe, via the methods in .NET 4.x for outputting/serializing JSON.
So the question is:
Is it really a bad idea to contemplate a SQL function to output as JSON?
Qualifier:
I understand that trying to output thousands of rows like that isn't a good idea - in fact, not really a good idea for web apps either way unless you really have to.
For my requirement, I need possibly 100 rows at a time...
The answer really is: it depends.
If your application is a small one that doesn't receive much use, then by all means do it in the database. The thing to bear in mind, though, is: what happens when your application is being used by 10x as many users in 12 months' time?
If it makes it quick, simple and easy to implement JSON encoding in your stored procedures, rather than in your web code and allows you to get your app out and in use, then that's clearly the way to go. That said, it really doesn't take that much work to do it "properly" with solutions that have been suggested in other answers.
The long and short of it is, take the solution that best fits your current needs, whilst thinking about the impact it'll have if you need to change it in the future.
This is why [WebMethod] (WebMethodAttribute) exists.
It's best to load the data into the application layer and then return it as JSON.
.NET 4 has support for returning JSON; I did it as part of an ASP.NET MVC site and it was fairly simple and straightforward.
I recommend moving the transformation out of SQL Server.
I agree with the other respondents that this is better done in your application code. However... it is theoretically possible using SQL Server's ability to include CLR assemblies in the database via the CREATE ASSEMBLY syntax. The choice is really yours. You could create an assembly to do the translation in .NET, register that assembly with SQL Server, and then use the contained method(s) to serialize to JSON as return values from your stored procedures...
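For the curious, a bare-bones sketch of that SQLCLR route (the function name and the hand-rolled JSON are purely illustrative; a real version would use a proper serializer and still needs to be registered with CREATE ASSEMBLY / CREATE FUNCTION):
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public static class JsonFunctions
{
    // Scalar CLR function callable from T-SQL once the assembly is registered.
    [SqlFunction]
    public static SqlString RowToJson(SqlString name, SqlInt32 value)
    {
        // Hand-rolled JSON for illustration only; no escaping or null handling here.
        return new SqlString(
            "{\"name\":\"" + name.Value + "\",\"value\":" + value.Value + "}");
    }
}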
Better to load it using your standard data access technique and then convert it to JSON. You can then use it in standard objects in .NET as well as in your client-side JavaScript.
If you are using ASP.NET MVC, you serialize your results in your controllers and output a JsonResult; there's a Controller.Json() method that does this for you. If you are using WebForms, an HTTP handler and the JavaScriptSerializer class would be the way to go.
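A minimal sketch of that MVC route (the controller and action names are made up, and the rows would really come from your data access layer):
using System.Web.Mvc;

public class ReportController : Controller
{
    public JsonResult Rows()
    {
        var rows = new[] { new { Id = 1, Name = "example" } };

        // AllowGet is required when the JSON is requested via HTTP GET.
        return Json(rows, JsonRequestBehavior.AllowGet);
    }
}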
Hey, thanks for all the responses... it still amazes me how many people out there have the time to help.
All very good points, and they certainly confirmed my feeling of letting the app layer do the conversion work - as the glue between the actual data and the frontend. I guess I haven't kept up too much with MVC or SQL 2008, and so was unsure whether there were some nuggets worth tracking down.
As it worked out (following some links posted here, and further fishing), I have opted to do the following for the time being (stuck back on .NET 3.5 and no MVC right now):
Getting the SQL data as a datatable/datareader
Using a simple datatable > collection (dictionary) conversion for a serializable list
Because right now I am using an ASHX page to act as the broker to the JavaScript (i.e. via a jQuery AJAX call), within my ASHX page I have:
context.Response.ContentType = "application/json";
System.Web.Script.Serialization.JavaScriptSerializer json = new System.Web.Script.Serialization.JavaScriptSerializer();
I can then call json.Serialize(...) on that collection.
Might seem a bit backward, but it works fine.. and the main caveat is that it is not ever returning huge amounts of data at a time.
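Putting those pieces together, a rough sketch of the handler looks like this (class and variable names are placeholders; the row list stands in for the DataTable-to-dictionary conversion mentioned above):
using System.Collections.Generic;
using System.Web;
using System.Web.Script.Serialization;

public class RowsHandler : IHttpHandler
{
    public bool IsReusable { get { return false; } }

    public void ProcessRequest(HttpContext context)
    {
        // In the real app this list is built from the DataTable/DataReader.
        var rows = new List<Dictionary<string, object>>
        {
            new Dictionary<string, object> { { "Id", 1 }, { "Name", "example" } }
        };

        context.Response.ContentType = "application/json";
        var json = new JavaScriptSerializer();
        context.Response.Write(json.Serialize(rows));
    }
}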
Once again, thanks for all the responses!

Looking for the most painless non-RDBMS storage method in C#

I'm writing a simple program that will run entirely client-side. (Desktop programming? Do people still do that?) I need a simple way to store trivial amounts of data in a structured form, but I really don't see any need to use a database system. What's more, some of the data needs to be serialized and passed around to different users, like some kind of "file" or perhaps a "document". (Has anyone ever done that before?)
So, I've looked at using .NET DataSets, LINQ, and direct XML manipulation, and they all seem like they would get the job done, but I would like to know, before I dive into any of them, if there's one method that is generally regarded as easier to code than the others. As I said, the amount of data to be stored is trivial - even if one hundred people all used the same machine we're not talking about more than 10 MB - so performance is not as large a concern as codeability/maintainability. Thank you all in advance!
Sounds like LINQ to XML is a good option for this.
Link 1
Link 2
Tons of info out there on this.
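For example, a tiny LINQ to XML sketch of this kind of structured, file-based storage (the element and file names are made up):
using System;
using System.Linq;
using System.Xml.Linq;

class Demo
{
    static void Main()
    {
        // Write a small structured "document" to disk.
        var doc = new XDocument(
            new XElement("Documents",
                new XElement("Document",
                    new XAttribute("Owner", "alice"),
                    new XElement("Title", "Example title"))));
        doc.Save("documents.xml");

        // Read it back and query it with LINQ.
        var titles = XDocument.Load("documents.xml")
            .Descendants("Title")
            .Select(t => t.Value)
            .ToList();
        Console.WriteLine(titles.Count);
    }
}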
Without knowing anything else about your app, .NET DataSets would likely be your easiest option, because WriteXml and ReadXml already exist.
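For instance, a quick sketch of that WriteXml/ReadXml round trip (the table and file names are arbitrary):
using System;
using System.Data;

class Demo
{
    static void Main()
    {
        var ds = new DataSet("AppData");
        var table = ds.Tables.Add("Items");
        table.Columns.Add("Name", typeof(string));
        table.Rows.Add("example");

        // Persist to disk, schema included, then load it back.
        ds.WriteXml("appdata.xml", XmlWriteMode.WriteSchema);

        var loaded = new DataSet();
        loaded.ReadXml("appdata.xml");
        Console.WriteLine(loaded.Tables["Items"].Rows.Count);
    }
}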
Any serialization API should do fine here. I would recommend something that is contract based (not BinaryFormatter, which is type-based) as that will keep it usable over time (as your assembly changes).
So I would build a basic object model (DTO) and use any of:
XmlSerializer
DataContractSerializer
protobuf-net (you all knew it was coming...)
OO, simple, and easy. And easy to use for passing fragments of the data around (either between users or to a central server).
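As an illustration, here is a small DTO round-tripped with XmlSerializer, the first option above (the class shape and file name are made up):
using System;
using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

public class AppDocument
{
    public string Owner { get; set; }
    public List<string> Entries { get; set; }
}

class Demo
{
    static void Main()
    {
        var doc = new AppDocument { Owner = "alice", Entries = new List<string> { "example" } };
        var serializer = new XmlSerializer(typeof(AppDocument));

        // Serialize to a file; the same bytes could be handed to another user.
        using (var stream = File.Create("document.xml"))
            serializer.Serialize(stream, doc);

        // Deserialize it back into the DTO.
        using (var stream = File.OpenRead("document.xml"))
        {
            var roundTripped = (AppDocument)serializer.Deserialize(stream);
            Console.WriteLine(roundTripped.Owner);
        }
    }
}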
I would choose an embedded database. Using something like SQLite doesn't seem like overkill to me. You may even try its C# port (http://code.google.com/p/csharp-sqlite/).
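A rough sketch of the embedded-database route, here using the System.Data.SQLite ADO.NET provider rather than the managed port linked above (the table and file names are arbitrary):
using System.Data.SQLite;

class Demo
{
    static void Main()
    {
        using (var connection = new SQLiteConnection("Data Source=app.db"))
        {
            connection.Open();

            using (var command = connection.CreateCommand())
            {
                // Create the table on first run, then store a row.
                command.CommandText =
                    "CREATE TABLE IF NOT EXISTS Items (Name TEXT); " +
                    "INSERT INTO Items (Name) VALUES ('example');";
                command.ExecuteNonQuery();
            }
        }
    }
}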
