Web API get dynamic Form Data - c#

I'm in the process of converting an asp.net application from MVC controllers to ApiController. So far everything is going pretty smoothly, except I've had a few hiccups.
The problem I'm having right now is a few methods are having requests of the form:
sort:FieldName
dir:DESC
filter[0][field]:FieldName
filter[0][data][type]:string
filter[0][data][value]:deadeawd
(the content type is application/x-www-form-urlencoded;)
The sort and dir fields can easily be captured within a model class, and I've done so, but I don't know how to capture the filter[0] fields (there could be filter[1] and so on as well; how many there are is not known ahead of time, i.e. the data structure is dynamic).
Currently the application grabs the form data, and a method builds a query string based on the data there, but in Web API we no longer have access to the form data directly.
I could use a dynamic object, or a NameValueCollection, but I'm just trying to figure out what's the best option, what's the intended usage, and what's the best practice.
(in case you're wondering, the request data can't be changed, it's from a framework that we are using and don't have an easy way to override how it does things)

The answer we ended up going with was to change the framework so it sent a proper JSON object.
It was quite a bit of work, and it means it might be difficult to update it, but we're looking at moving away from the framework soon anyways, and we don't want the trivial little details of this framework to alter our API in a negative way.
If anyone is interested, the dynamic or NameValueCollection probably would've worked, and I don't think there is a best practice, because passing data like this is already bad practice.
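For anyone who lands here with the same problem, here is a minimal sketch of the NameValueCollection route, assuming the System.Net.Http.Formatting extension ReadAsFormDataAsync is available; the controller name and the query-building step are illustrative, not our actual code:

    using System.Collections.Generic;
    using System.Collections.Specialized;
    using System.Net.Http;
    using System.Threading.Tasks;
    using System.Web.Http;

    public class SearchController : ApiController
    {
        public async Task<IHttpActionResult> Post()
        {
            // Reads the raw body; works for application/x-www-form-urlencoded.
            NameValueCollection form = await Request.Content.ReadAsFormDataAsync();

            string sort = form["sort"];
            string dir = form["dir"];

            // Walk filter[0], filter[1], ... until an index is missing.
            var filters = new List<Dictionary<string, string>>();
            for (int i = 0; form[string.Format("filter[{0}][field]", i)] != null; i++)
            {
                filters.Add(new Dictionary<string, string>
                {
                    { "field", form[string.Format("filter[{0}][field]", i)] },
                    { "type",  form[string.Format("filter[{0}][data][type]", i)] },
                    { "value", form[string.Format("filter[{0}][data][value]", i)] }
                });
            }

            // ... build the query from sort/dir/filters as before ...
            return Ok();
        }
    }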

Related

React and .NET app data fetching in portions

I tried to find the answer to my question, but it seems like I am either missing the correct terminology or it really is a bit tricky to do.
I am trying to see if it is possible to use either lazy loading, or data sent from the API in portions periodically, so the first render doesn't take as long to reach. My current system, where an array of over 1000 objects is fetched from the .NET API into the React UI, just does not work as I would like it to.
I would like to skip pagination if possible.
Just implement the endpoint GET /data?show=X&skip=Y;
for the first request get data from /data?show=10,
then whenever you want (for example when the user reaches the bottom of the page), do /data?show=10&skip=10.
I'm not sure if I understood you correctly, but I hope it helps somehow :)
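In ASP.NET Core terms, such an endpoint could look something like this rough sketch; DataController, DataItem, and the static list are placeholders for the real data source:

    using System.Collections.Generic;
    using System.Linq;
    using Microsoft.AspNetCore.Mvc;

    [ApiController]
    [Route("data")]
    public class DataController : ControllerBase
    {
        // Placeholder for the real data source (DB, service, etc.).
        private static readonly List<DataItem> Items = new List<DataItem>();

        // GET /data?show=10&skip=10
        [HttpGet]
        public IEnumerable<DataItem> Get(int show = 10, int skip = 0)
        {
            return Items.Skip(skip).Take(show);
        }
    }

    public class DataItem
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

On the React side, show and skip just become query-string parameters on each successive fetch call.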

Best way to handle large amount of permanent data

I'm developing a PC app in Visual Studio where I'm showing the status of hundreds of sensors that are connected via WiFi. The thing is that I need to hold on to the sensor data even after I close the app, so I'm considering some form of permanent storage. These are the options I've considered:
1) My Sensor object is relatively compact with only a few properties. I could serialize all the objects before closing the app and load them every time the app starts anew.
2) I could throw all the properties (which are mostly strings and doubles) into a simple text file and create a custom protocol for storage and retrieval.
3) I could integrate a database with my app. Someone told me this is the best way to go about it, but I'm a bit hesitant seeing as I'm not familiar with DBs.
Which method would yield the best results in terms of resource usage and speed? Or is there some other, better way to go about this?
The first thing you need is to understand your problem. For example, when the program is running, do you need to have everything in memory at the same time, or do you work with your sensors one at a time?
What is a "large amount of data"? For example, to me that will never be less than a million records (or a billion in some cases).
Once you know that, you shouldn't be scared of using something just because you are not familiar with it. Otherwise you are not looking for the best solution to your problem; you are just hacking around it in a way that feels comfortable.
That being said, you have several ways of doing this. Like you said, you can serialize the data, use JSON for storage, and a few other alternatives, but if we are talking about a "large amount of data that we want to persist" I would always call for the use of a database (the name says a lot). If you don't need to have everything in memory at the same time, then I believe this is your best option.
I personally don't like them (again, personal choice), but one way of not learning much SQL while still working with your objects is to use an ORM like NHibernate (you will also need to learn how to use it so you don't make things slower).
If you need to have everything loaded at the same time (most often that is not the case, so be sure of this), you need to know what you want to keep and serialize it. If you want that data to be readable by another tool or organized in a given way, consider a data format like XML or JSON.
You can also use a memory-mapped file.
The file is permanent and keeps the data between program runs.
So you just keep your data structures in the mmap-ed area, and nothing more.
The MSDN manual is here:
https://msdn.microsoft.com/en-us/library/windows/desktop/aa366556%28v=vs.85%29.aspx
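A minimal sketch of that idea in C#, assuming a fixed-size array of blittable records; SensorRecord, the file name, and the capacity are all illustrative:

    using System;
    using System.IO;
    using System.IO.MemoryMappedFiles;
    using System.Runtime.InteropServices;

    // Hypothetical blittable record; real sensor properties would go here.
    [StructLayout(LayoutKind.Sequential)]
    struct SensorRecord
    {
        public int Id;
        public double Value;
    }

    class Program
    {
        static void Main()
        {
            int capacity = 1000; // illustrative fixed sensor count
            int recordSize = Marshal.SizeOf(typeof(SensorRecord));
            long bytes = (long)capacity * recordSize;

            // Backed by a real file, so the data survives restarts.
            using (var mmf = MemoryMappedFile.CreateFromFile(
                "sensors.dat", FileMode.OpenOrCreate, null, bytes))
            using (var view = mmf.CreateViewAccessor())
            {
                var record = new SensorRecord { Id = 42, Value = 21.5 };
                view.Write(42L * recordSize, ref record); // write slot 42

                SensorRecord back;
                view.Read(42L * recordSize, out back);    // read it back
                Console.WriteLine(back.Id + ": " + back.Value);
            }
        }
    }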
Since you need to load all the data once at the start of the program, the database case seems doubtful; a DB is necessary when you need to load a bit of data many times.
So the first two options seem preferable. I would advise hiding the specific solution behind an interface, so you can change it later.
Standard .NET serialization of the sensor array is probably simpler, and it will be easier to extend.
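A sketch of the hide-it-behind-an-interface advice, using in-box XML serialization; Sensor, the interface, and the file path are illustrative names:

    using System.Collections.Generic;
    using System.IO;
    using System.Xml.Serialization;

    // Illustrative sensor type; the real properties would go here.
    public class Sensor
    {
        public string Name { get; set; }
        public double LastReading { get; set; }
    }

    // The interface hides the storage choice so it can be swapped later.
    public interface ISensorStore
    {
        List<Sensor> Load();
        void Save(List<Sensor> sensors);
    }

    // Simple file-backed implementation; a DB-backed one could replace it.
    public class XmlFileSensorStore : ISensorStore
    {
        private readonly string _path;
        private static readonly XmlSerializer Serializer =
            new XmlSerializer(typeof(List<Sensor>));

        public XmlFileSensorStore(string path) { _path = path; }

        public List<Sensor> Load()
        {
            if (!File.Exists(_path)) return new List<Sensor>();
            using (var stream = File.OpenRead(_path))
                return (List<Sensor>)Serializer.Deserialize(stream);
        }

        public void Save(List<Sensor> sensors)
        {
            using (var stream = File.Create(_path))
                Serializer.Serialize(stream, sensors);
        }
    }

The rest of the app then talks only to ISensorStore, so moving to a database later touches a single class.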

Pre-process MVC Razor File For Multi-Lingual Language Strings?

In my application we have multi-lingual language strings which are stored in custom tables, as the user can edit, delete, import new languages, etc. via a UI.
Currently, at the beginning of each request I go off and get all the language strings (from our database) for the currently selected language and stick them in a dictionary.
I then have an Html Helper extension method which I use in the Razor views (see below); it fishes in the dictionary I got at the beginning of the request to pull out the correct language string based on the key supplied in the helper.
Html.LanguageString("MyLanguage.KeyHere")
Now this works fine. However, as the application gets bigger, we are getting more and more language strings. It's not an issue right now, as it's still very fast with only around 200 strings to fetch.
But it also means I'm getting all of them, even if a page uses, say, one. I'd ideally like a way of processing the LanguageString("") calls beforehand and doing a query at the beginning of the request to get just those that are needed. Or maybe my own LINQ-based language that can be processed to produce a more efficient call.
I'm looking for some advice on how to do this, as I'd like the application to be as efficient as possible. Any advice, help, or tips are gratefully received. Thanks.
I'd suggest caching language strings at the application level rather than fetching them for every request. For example, this can be done by maintaining a static dictionary and invalidating the cache only when the user makes changes to these strings. This will make your application more responsive as well as save you from implementing a (imho) rather more complex and not necessarily efficient technique of loading this data on demand.
As a side note I'd add the following: it's usually a good practice to address these kinds of problems when they arise (rather than fixing something that is not broken) and focus on more important things. I totally agree that performance implications of a given solution must always be taken into consideration, I'm just saying that premature optimizations are not always a good idea.
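A sketch of that application-level cache; LanguageCache is an illustrative name and LoadFromDatabase stands in for whatever data access the app already has:

    using System.Collections.Concurrent;
    using System.Collections.Generic;

    public static class LanguageCache
    {
        private static readonly ConcurrentDictionary<string, IDictionary<string, string>> Cache =
            new ConcurrentDictionary<string, IDictionary<string, string>>();

        public static string Get(string language, string key)
        {
            // Loads the whole language once; subsequent requests hit memory.
            IDictionary<string, string> strings = Cache.GetOrAdd(language, LoadFromDatabase);
            string value;
            return strings.TryGetValue(key, out value) ? value : key; // fall back to the key
        }

        // Call from the admin UI whenever strings are edited/imported.
        public static void Invalidate(string language)
        {
            IDictionary<string, string> removed;
            Cache.TryRemove(language, out removed);
        }

        private static IDictionary<string, string> LoadFromDatabase(string language)
        {
            // ... fetch all strings for the language from the custom tables ...
            return new Dictionary<string, string>();
        }
    }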

ASP.NET MVC 3 Razor View Restrictions

I apologize in advance for the generic nature of my question, but I was unable to find any helpful advice from people trying to do the same thing as me on the web. Let me describe my scenario:
I am providing end users/designers of a website the ability to customize their views by storing the views (using Razor) in the database. I have all of this working, but my question is the following: from a security standpoint, how can I ensure and enforce that unwanted code doesn't get executed in the user-defined view? There are two basic approaches that I think will work conceptually, but I am not sure which one is more feasible.
Option 1: Create a validation method in the administration tool that allows the user to input the view code. This would need to either take a whitelist or blacklist approach to what is allowable or not.
Option 2: Prevent unwanted code from being able to execute when rendering of the view occurs.
As a quick example of something that would need to be blocked, we wouldn't want to allow access to read or write files, access any data access functions, or even access configuration settings, etc. in the web.config. There will likely be a decently-sized list of things that probably shouldn't be allowable, but I'll need to sit down and try to think of as many security-related concerns as possible.
My question then is, which method would be the best bet? Also, can any direction be provided on how to go about either? I thought I might be able to make a trust-level-based change, which would be Option 2, but couldn't find any way to make that work on a per-view basis (the administration code is allowed to execute whatever it wants). I'm thinking Option 1 will end up being the best bet and I'll have to check the input for certain framework functions that shouldn't be allowed. Does anyone have any experience doing anything like what I'm trying to do? ANY feedback is much appreciated!
This would be extremely difficult.
You could run the template through the Razor preprocessor, then use Roslyn (still in early beta) to parse the generated file, look through all method calls (or constructors), and return an error if it calls something you don't like.
I strongly recommend that you use a whitelist for that, since the .Net framework is big enough that you are bound to overlook something in a blacklist.
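As a very rough sketch of that parse-and-inspect idea, using today's Roslyn API rather than the beta; the whitelist entries are illustrative, and a real validator would need semantic analysis to resolve what each call actually targets:

    using System.Linq;
    using Microsoft.CodeAnalysis.CSharp;
    using Microsoft.CodeAnalysis.CSharp.Syntax;

    static class TemplateValidator
    {
        // Illustrative whitelist: only the Razor output helpers allowed.
        private static readonly string[] Whitelist = { "Write", "WriteLiteral" };

        public static bool LooksSafe(string generatedCSharp)
        {
            var root = CSharpSyntaxTree.ParseText(generatedCSharp).GetRoot();

            // Reject the template if any invocation is not on the whitelist.
            return root.DescendantNodes()
                       .OfType<InvocationExpressionSyntax>()
                       .All(call => Whitelist.Contains(
                           call.Expression.ToString().Split('.').Last()));
        }
    }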
However, I would instead recommend that you not use Razor at all and instead use a templating engine that does not allow real C# code.

SQL 2008: returning data rows as JSON?

I think this question is like clay pigeon shooting.. "pull... bang!" .. shot down.. but nevertheless, it's worth asking I believe.
Lots of JS frameworks etc use JSON these days, and for good reason I know. The classic question is "where to transform the data to JSON".
I understand that at some point in the pipeline you have to convert the data to JSON, be it in the data access layer (I am looking at JSON.NET), or, I believe, via the methods .NET 4.x provides to output/serialize as JSON.
So the question is:
Is it really a bad idea to contemplate a SQL function to output as JSON?
Qualifier:
I understand trying to output thousands of rows like that isn't a good idea - in fact not really a good idea for web apps either way unless you really have to.
For my requirement, I need possibly 100 rows at a time...
The answer really is: it depends.
If your application is a small one that doesn't receive much use, then by all means do it in the database. The thing to bear in mind though is, what happens when your application is being used by 10x as many users in 12 months' time?
If it makes it quick, simple and easy to implement JSON encoding in your stored procedures, rather than in your web code and allows you to get your app out and in use, then that's clearly the way to go. That said, it really doesn't take that much work to do it "properly" with solutions that have been suggested in other answers.
The long and short of it is, take the solution that best fits your current needs, whilst thinking about the impact it'll have if you need to change it in the future.
This is why [WebMethod] (WebMethodAttribute) exists.
Best to load the data into the program and then return it as JSON.
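A hedged sketch of that suggestion as an ASP.NET page method; Row and LoadRows() are placeholders, and this assumes the ASP.NET AJAX script infrastructure (which serializes the return value to JSON for script callers):

    using System.Collections.Generic;
    using System.Web.Services;

    public partial class DataPage : System.Web.UI.Page
    {
        // Page methods must be public static; callable from client script
        // when page methods are enabled (e.g. via ScriptManager).
        [WebMethod]
        public static List<Row> GetRows()
        {
            return LoadRows(); // hypothetical data access call
        }

        private static List<Row> LoadRows() { return new List<Row>(); }
    }

    public class Row
    {
        public string Name { get; set; }
    }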
.NET 4 has support for returning JSON, and I did it as part of one ASP.NET MVC site; it was fairly simple and straightforward.
I recommend moving the transformation out of SQL Server.
I agree with the other respondents that this is better done in your application code. However... this is theoretically possible using SQL Server's ability to include CLR assemblies in the database using the create assembly syntax. The choice is really yours. You could create an assembly to do the translation in .NET, define that assembly to SQL Server, and then use the contained method(s) to serialize to JSON as return values from your stored procedures...
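For illustration only, a scalar CLR UDF along those lines might look like this; escaping, NULL handling, and the create assembly / CREATE FUNCTION registration are omitted, and the names are made up:

    using System.Data.SqlTypes;
    using Microsoft.SqlServer.Server;

    public class JsonUdf
    {
        // Emits a JSON fragment for one row; a real implementation would
        // handle escaping and arbitrary column sets.
        [SqlFunction]
        public static SqlString RowToJson(SqlString name, SqlDouble value)
        {
            return new SqlString(
                "{\"name\":\"" + name.Value + "\",\"value\":" + value.Value + "}");
        }
    }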
Better to load it using your standard data access technique and then convert to JSON. You can then use it in standard objects in .NET as well as in your client-side JavaScript.
If using ASP.NET MVC, you serialize your results in your controllers and output a JsonResult; there's a method Controller.Json() that does this for you. If using WebForms, an HTTP handler and the JavaScriptSerializer class would be the way to go.
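For the MVC case, a minimal sketch; ReportController and GetRows() are illustrative stand-ins:

    using System.Web.Mvc;

    public class ReportController : Controller
    {
        public JsonResult Rows()
        {
            var rows = GetRows(); // e.g. the ~100 rows mentioned above
            return Json(rows, JsonRequestBehavior.AllowGet);
        }

        private object GetRows() { return new object[0]; }
    }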
Hey thanks for all the responses.. it still amazes me how many people out there have the time to help.
All very good points, and certainly confirmed my feeling of letting the app/layer do the conversion work - as the glue between the actual data and frontend. I guess I haven't kept up too much with MVC or SQL-2008, and so was unsure if there were some nuggets worth tracking down.
As it worked out (following some links posted here, and further fishing) I have opted to do the following for the time being (stuck back using .NET 3.5 and no MVC right now..):
Getting the SQL data as a DataTable/DataReader
Using a simple DataTable > collection (dictionary) conversion to get a serializable list
Because right now I am using an ASHX page to act as the broker to the JavaScript (i.e. via a jQuery AJAX call), within my ASHX page I have:
context.Response.ContentType = "application/json";
System.Web.Script.Serialization.JavaScriptSerializer json = new System.Web.Script.Serialization.JavaScriptSerializer();
I can then issue: json.Serialize(<>)
Might seem a bit backward, but it works fine.. and the main caveat is that it is not ever returning huge amounts of data at a time.
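Put together, the handler looks roughly like this; LoadRows() stands in for the DataTable-to-dictionary conversion step above:

    using System.Web;
    using System.Web.Script.Serialization;

    public class DataHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            context.Response.ContentType = "application/json";
            var json = new JavaScriptSerializer();
            context.Response.Write(json.Serialize(LoadRows()));
        }

        public bool IsReusable { get { return false; } }

        private static object LoadRows() { return new object[0]; }
    }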
Once again, thanks for all the responses!
