I think this question is like clay pigeon shooting... "pull... bang!"... shot down... but it's worth asking nevertheless, I believe.
Lots of JS frameworks etc use JSON these days, and for good reason I know. The classic question is "where to transform the data to JSON".
I understand that at some point in the pipeline you have to convert the data to JSON, be it in the data access layer (I am looking at JSON.NET), or, I believe, using the methods in .NET 4.x to output/serialize as JSON.
So the question is:
Is it really a bad idea to contemplate a SQL function to output as JSON?
Qualifier:
I understand that trying to output thousands of rows like that isn't a good idea - in fact, it's not really a good idea for web apps either way, unless you really have to.
For my requirement, I need possibly 100 rows at a time...
The answer really is: it depends.
If your application is a small one that doesn't receive much use, then by all means do it in the database. The thing to bear in mind, though, is: what happens when your application is being used by 10x as many users in 12 months' time?
If it makes it quick, simple, and easy to implement JSON encoding in your stored procedures rather than in your web code, and allows you to get your app out and in use, then that's clearly the way to go. That said, it really doesn't take that much work to do it "properly" with the solutions that have been suggested in other answers.
The long and short of it is, take the solution that best fits your current needs, whilst thinking about the impact it'll have if you need to change it in the future.
This is why [WebMethod] (WebMethodAttribute) exists.
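For instance, a minimal sketch of that route, assuming an .asmx script service (the service and method names here are made up); with [ScriptService], ASP.NET AJAX callers get JSON back automatically:

using System.Collections.Generic;
using System.Web.Script.Services;
using System.Web.Services;

[ScriptService]
public class DataService : WebService
{
    [WebMethod]
    public List<string> GetNames()
    {
        // Serialized to JSON automatically for script (AJAX) callers.
        return new List<string> { "first", "second" };
    }
}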
Best to load the data into the program and then return it as JSON.
.NET 4 has support for returning JSON, and I did it as part of an ASP.NET MVC site; it was fairly simple and straightforward.
I recommend moving the transformation out of SQL Server.
I agree with the other respondents that this is better done in your application code. However... it is theoretically possible using SQL Server's ability to include CLR assemblies in the database via the CREATE ASSEMBLY syntax. The choice is really yours. You could create an assembly to do the translation in .NET, register that assembly with SQL Server, and then use the contained method(s) to serialize to JSON as return values from your stored procedures...
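Purely for illustration, a rough sketch of that route (the class, function name, and hand-rolled string building are all made up, and real JSON escaping is glossed over):

using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public class JsonFunctions
{
    // Scalar UDF; after CREATE ASSEMBLY / CREATE FUNCTION it is callable from T-SQL.
    [SqlFunction]
    public static SqlString ToJson(SqlString name, SqlInt32 value)
    {
        // Naive concatenation; a real version must escape quotes, handle NULLs, etc.
        return new SqlString("{\"name\":\"" + name.Value + "\",\"value\":" + value.Value + "}");
    }
}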
Better to load it using your standard data access technique and then convert to JSON. You can then use it in standard objects in .NET as well as in your client-side JavaScript.
If you're using ASP.NET MVC, you serialize your results in your controllers and output a JsonResult; there's a method, Controller.Json(), that does this for you. If you're using WebForms, an HTTP handler and the JavaScriptSerializer class would be the way to go.
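To make the MVC route concrete, a minimal sketch (the controller and the data are made up):

using System.Web.Mvc;

public class ProductsController : Controller
{
    public JsonResult List()
    {
        var rows = new[] { new { Id = 1, Name = "Widget" } };
        // Json() serializes the object graph and sets the content type for you.
        return Json(rows, JsonRequestBehavior.AllowGet);
    }
}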
Hey, thanks for all the responses... it still amazes me how many people out there have the time to help.
All very good points, and certainly confirmed my feeling of letting the app/layer do the conversion work - as the glue between the actual data and frontend. I guess I haven't kept up too much with MVC or SQL-2008, and so was unsure if there were some nuggets worth tracking down.
As it worked out (following some links posted here, and further fishing) I have opted to do the following for the time being (stuck back using .NET 3.5 and no MVC right now..):
Getting the SQL data as a datatable/datareader
Using a simple datatable > collection (dictionary) conversion for a serializable list
Because right now I am using an ASHX page to act as the broker to the JavaScript (i.e. via a jQuery AJAX call), within my ASHX page I have:
context.Response.ContentType = "application/json";
System.Web.Script.Serialization.JavaScriptSerializer json = new System.Web.Script.Serialization.JavaScriptSerializer();
I can then issue: json.Serialize(<>)
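Put together, a minimal sketch of the whole handler (GetDataTable stands in for whatever data access you already have; the conversion loop is the datatable > dictionary step from above):

using System.Collections.Generic;
using System.Data;
using System.Web;
using System.Web.Script.Serialization;

public class DataHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "application/json";
        var json = new JavaScriptSerializer();
        context.Response.Write(json.Serialize(RowsToList(GetDataTable())));
    }

    public bool IsReusable { get { return true; } }

    // DataTable > list of dictionaries, so the serializer has plain objects to walk.
    private static List<Dictionary<string, object>> RowsToList(DataTable table)
    {
        var list = new List<Dictionary<string, object>>();
        foreach (DataRow row in table.Rows)
        {
            var dict = new Dictionary<string, object>();
            foreach (DataColumn col in table.Columns)
                dict[col.ColumnName] = row[col];
            list.Add(dict);
        }
        return list;
    }

    private static DataTable GetDataTable()
    {
        // ... your existing SQL query / data access goes here ...
        return new DataTable();
    }
}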
Might seem a bit backward, but it works fine.. and the main caveat is that it is not ever returning huge amounts of data at a time.
Once again, thanks for all the responses!
Related
So I have been reading around and I can't seem to find a simple enough answer. I have been doing a bit of work with web services and XML documents being sent around, but now I'm looking to understand something a little better.
xmlDocument.Load(url) and myHttpWebRequest = (HttpWebRequest) HttpWebRequest.Create(inURL);
Now, there is obviously a little more code to each of these, but I am just giving a brief idea of both so we're all on the same page.
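Fleshed out a little (a sketch only; url, inURL, and payloadXml are placeholder strings):

using System.IO;
using System.Net;
using System.Xml;

// 1) XmlDocument.Load: one call does the HTTP GET and the parsing.
var xmlDocument = new XmlDocument();
xmlDocument.Load(url);

// 2) HttpWebRequest: full control over method, headers, and the request body,
//    which is what you need when posting data back.
var myHttpWebRequest = (HttpWebRequest)HttpWebRequest.Create(inURL);
myHttpWebRequest.Method = "POST";
myHttpWebRequest.ContentType = "text/xml";
using (var writer = new StreamWriter(myHttpWebRequest.GetRequestStream()))
{
    writer.Write(payloadXml); // whatever XML you are sending
}
using (var response = (HttpWebResponse)myHttpWebRequest.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    string responseText = reader.ReadToEnd();
}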
I have used both, and they both work perfectly well; I just don't want to sell myself short when using one over the other (.Load(url) has WAY less code to it).
In my instance (testing at the moment), I am using the former to get tiny amounts of data from my web service, and the latter to post a fair bit of information back to my web service.
So my question actually is not which is better, but when would it be desirable to use one over the other?
Does it make a big difference, or are they just two ways to do the same thing without any negatives?
I'm in the process of converting an ASP.NET application from MVC controllers to ApiController. So far everything is going pretty smoothly, except I've had a few hiccups.
The problem I'm having right now is a few methods are having requests of the form:
sort:FieldName
dir:DESC
filter[0][field]:FieldName
filter[0][data][type]:string
filter[0][data][value]:deadeawd
(the content type is application/x-www-form-urlencoded;)
The sort and the dir can easily be captured within a model class, and I've done so, but I don't know how to capture the filter[0] fields. (There could be filter[1] and so on as well; just how many there are is not known ahead of time, i.e. the data structure is dynamic.)
Currently the application grabs the form data, and a method builds a query string based on the data there, but in Web API we no longer have access to the form data directly.
I could use a dynamic object, or a NameValueCollection, but I'm just trying to figure out what's the best option, what's the intended usage, and what's the best practice.
(in case you're wondering, the request data can't be changed, it's from a framework that we are using and don't have an easy way to override how it does things)
The answer we ended up going with was to change the framework so it sent a proper JSON object.
It was quite a bit of work, and it means it might be difficult to update it, but we're looking at moving away from the framework soon anyways, and we don't want the trivial little details of this framework to alter our API in a negative way.
If anyone is interested, the dynamic or NameValueCollection probably would've worked, and I don't think there is a best practice, because passing data like this is already bad practice.
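For anyone who does want to stay on the form-data route, a sketch of what that could look like with Web API's FormDataCollection (the controller name and parsing loop are illustrative; IHttpActionResult assumes Web API 2):

using System.Collections.Generic;
using System.Net.Http.Formatting;
using System.Web.Http;

public class GridQueryController : ApiController
{
    // Body arrives as application/x-www-form-urlencoded.
    public IHttpActionResult Post(FormDataCollection formData)
    {
        var form = formData.ReadAsNameValueCollection();
        string sort = form["sort"];
        string dir = form["dir"];

        // Walk filter[0], filter[1], ... until a key is missing.
        var filters = new List<Dictionary<string, string>>();
        for (int i = 0; form["filter[" + i + "][field]"] != null; i++)
        {
            var filter = new Dictionary<string, string>();
            filter["field"] = form["filter[" + i + "][field]"];
            filter["type"] = form["filter[" + i + "][data][type]"];
            filter["value"] = form["filter[" + i + "][data][value]"];
            filters.Add(filter);
        }

        // ... build the query from sort/dir/filters as before ...
        return Ok();
    }
}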
In my application we have multi-lingual language strings which are stored in custom tables, since the user can edit, delete, import new languages, etc. via a UI.
Currently, what I'm doing at the beginning of each request is going off and getting all the language strings (from our database) for the currently selected language and sticking them in a dictionary.
I then have a Html Helper extension method which I use in the razor views (See below), which fishes in the dictionary I got at the beginning of the request to pull out the correct language based on the key supplied in the helper.
Html.LanguageString("MyLanguage.KeyHere")
Now, this works fine. However, as the application gets bigger, we are getting more and more language strings. It's not an issue right now, as it's still very fast; there are only around 200 strings to get.
But this also means I'm getting all of them, even if a page has, say, one on it. I'd ideally like a way of processing the LanguageString("") calls beforehand, and doing a query at the beginning of the request to get just those that are needed. Or maybe my own LINQ-based language that can be processed to produce a more efficient call.
I'm looking for some advice on how to do this, as I'd like the application to be as efficient as possible. Any advice, help, or tips are gratefully received. Thanks.
I'd suggest caching the language strings at the application level rather than fetching them for every request. For example, this can be done by maintaining a static dictionary and invalidating the cache only when the user makes changes to these strings. This will make your application more responsive, and it saves you from implementing the (IMHO) rather more complex and not necessarily more efficient technique of loading this data on demand.
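A minimal sketch of that idea (LoadStringsFromDatabase stands in for the per-request query you already have):

using System.Collections.Generic;

public static class LanguageCache
{
    private static readonly object Sync = new object();
    private static readonly Dictionary<string, Dictionary<string, string>> Cache =
        new Dictionary<string, Dictionary<string, string>>();

    public static string Get(string language, string key)
    {
        lock (Sync)
        {
            Dictionary<string, string> strings;
            if (!Cache.TryGetValue(language, out strings))
            {
                strings = LoadStringsFromDatabase(language); // one DB hit per language
                Cache[language] = strings;
            }
            string value;
            return strings.TryGetValue(key, out value) ? value : key;
        }
    }

    // Call this from the edit/import UI so the next request reloads fresh strings.
    public static void Invalidate(string language)
    {
        lock (Sync) { Cache.Remove(language); }
    }

    private static Dictionary<string, string> LoadStringsFromDatabase(string language)
    {
        // ... your existing query against the custom tables goes here ...
        return new Dictionary<string, string>();
    }
}

Your Html.LanguageString helper then reads from LanguageCache.Get instead of hitting the database on every request.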
As a side note I'd add the following: it's usually a good practice to address these kinds of problems when they arise (rather than fixing something that is not broken) and focus on more important things. I totally agree that performance implications of a given solution must always be taken into consideration, I'm just saying that premature optimizations are not always a good idea.
I am in a situation where I can possibly influence a decision about some web service work on the C# side, and I need some nice info (ammunition) that I can use to argue for what I think would be better (JSON).
Could anyone give me any help as to the pros and cons of each? One of the things that I like about JSON is that it is much cleaner to maintain, it supports any web browser (I think), and it supports (I am pretty sure) the ability to send non-primitive objects across the pipe. If IIS can do all of those things, then please inform me otherwise. Thanks!
If the decision is about the data transport, it is better to use JSON than XML: the footprint is smaller, the translation from string to object is easily supported in all languages (JavaScript, C#), and you can use JSON services to communicate across domains (which fails if you are using an XML web service).
A non-primitive data type is transformed into a JSON string, so you can have really complex objects in JSON format. The only issue you may find is the Date transformation, but if the code is only for your company, that may not pose a problem, since the serialization and deserialization will be the same.
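For example (a sketch; the Person type is made up), JavaScriptSerializer handles a nested object fine, and you can see the Date quirk in the output:

using System;
using System.Collections.Generic;
using System.Web.Script.Serialization;

public class Person
{
    public string Name { get; set; }
    public DateTime Joined { get; set; }
    public List<string> Tags { get; set; }
}

var json = new JavaScriptSerializer().Serialize(new Person
{
    Name = "Ann",
    Joined = DateTime.UtcNow,
    Tags = new List<string> { "a", "b" }
});
// -> {"Name":"Ann","Joined":"\/Date(1300000000000)\/","Tags":["a","b"]}
// The "\/Date(...)\/" form is the Date transformation mentioned above; as long
// as both ends use the same serializer, it round-trips cleanly.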
You can find the description of JSON here.
To be able to get more help, you will have to be more specific about what you want to do.
I'm writing a simple program that will run entirely client-side. (Desktop programming? Do people still do that?) I need a simple way to store trivial amounts of data in a structured form, but I really don't see any need to use a database system. What's more, some of the data needs to be serialized and passed around to different users, like some kind of "file" or perhaps a "document". (Has anyone ever done that before?)
So, I've looked at using .Net DataSets, LINQ, direct XML manipulation, and they all seem like they would get the job done, but I would like to know before I dive into any of them if there's one method that is generally regarded as easier to code than others. As I said, the amount of data to be stored is trivial, even if one hundred people all used the same machine we're not talking about more than 10 MB, so performance is not as large a concern as is codeability/maintainability. Thank you all in advance!
Sounds like LINQ to XML is a good option for this.
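A minimal sketch (the Note element and notes.xml path are made up):

using System.Linq;
using System.Xml.Linq;

// Build and save a small structured document.
var doc = new XDocument(
    new XElement("Notes",
        new XElement("Note",
            new XAttribute("Author", "Ann"),
            new XElement("Text", "Hello"))));
doc.Save("notes.xml");

// Load it back and query it with LINQ.
var loaded = XDocument.Load("notes.xml");
var texts = loaded.Descendants("Note")
                  .Select(n => (string)n.Element("Text"))
                  .ToList();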
Tons of info out there on this.
Without knowing anything else about your app, the .Net DataSets would likely be your easiest option because WriteXml and ReadXml already exist.
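For instance (a sketch; the table and file name are made up):

using System.Data;

var ds = new DataSet("Store");
var table = ds.Tables.Add("Notes");
table.Columns.Add("Author", typeof(string));
table.Columns.Add("Text", typeof(string));
table.Rows.Add("Ann", "Hello");

// WriteXml/ReadXml give you the file round-trip for free.
ds.WriteXml("store.xml", XmlWriteMode.WriteSchema); // schema keeps column types
var loaded = new DataSet();
loaded.ReadXml("store.xml");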
Any serialization API should do fine here. I would recommend something that is contract based (not BinaryFormatter, which is type-based) as that will keep it usable over time (as your assembly changes).
So I would build a basic object model (DTO) and use any of:
XmlSerializer
DataContractSerializer
protobuf-net (you all knew it was coming...)
OO, simple, and easy. And easy to use for passing fragments of the data (either between users or to a central server).
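A sketch of the DTO idea with XmlSerializer (the Document type is made up; DataContractSerializer or protobuf-net slot in the same way):

using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

public class Document
{
    public string Title { get; set; }
    public List<string> Entries { get; set; }
}

var serializer = new XmlSerializer(typeof(Document));

// Save: contract-based, so internal refactoring later won't break old files.
using (var stream = File.Create("doc.xml"))
{
    serializer.Serialize(stream, new Document
    {
        Title = "Shared notes",
        Entries = new List<string> { "first", "second" }
    });
}

// Load (or receive from another user) and deserialize back to the DTO.
Document roundTripped;
using (var stream = File.OpenRead("doc.xml"))
{
    roundTripped = (Document)serializer.Deserialize(stream);
}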
I would choose an embedded database. Using something like SQLite doesn't seem like overkill to me. You may even try its C# port (http://code.google.com/p/csharp-sqlite/).