animated chart control for asp.net - c#

I am trying to write an application (a benchmark app) that will display and update some performance statistics while a task is running. The data is system information, such as CPU usage and memory allocation, so I need to take those measurements and find a simple way to update a graph with the new numbers, translated visually, at any user- or programmer-defined refresh interval, or even at an interval calculated from overall system resources, so that if the system is not too busy the refresh rate is at maximum.
The question is: what should I start with? Using a WinForms progress bar was my first thought (thinking simply, using as few lines of code as I could for the specific task of displaying the data), but I am using C# .NET 4.0 ASP.NET Web Forms.
Can you please guide me to a simple way to implement this? It could be in C#/ASP.NET or mixed with JavaScript/jQuery.
PS: One approach I was thinking of is to have multiple images, from an empty bar to a full bar, and switch between them according to the calculation, though I need a faster response than replacing the src of an img. Maybe something pixel-wise, e.g. a div with a background color whose height changes, so it acts as a kind of bar meter. I really could think of many ways to try implementing the task at hand, but I thought there would be a known way (keeping it simple in mind).
Just for illustration, these are two options (vertical or horizontal) I thought of for displaying the statistics visually. I don't mind which of those graphs (LINK) is used, as long as the implementation is as simple as it can be and the response is fast enough that it does not become a heavy task.

I would consider FusionCharts. They have a nice assortment of chart types, very nice visuals, and very simple implementation. You can supply data in XML or JSON format, either from the server side or directly to the client, so real-time updates are supported as well.
Oh, and even though this is a commercial product, they do have a free version.
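On the server side, the CPU and memory figures themselves can be read from the standard Windows performance counters; a minimal sketch (the counter category and names are the stock Windows ones, and the values could then be handed to the chart as XML/JSON):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class SystemStatsSampler
{
    public static void Main()
    {
        // "_Total" aggregates all cores; the first read of a rate counter
        // always returns 0, so prime it, wait an interval, then read again.
        var cpu = new PerformanceCounter("Processor", "% Processor Time", "_Total");
        var mem = new PerformanceCounter("Memory", "Available MBytes");

        cpu.NextValue();        // prime the counter
        Thread.Sleep(1000);     // rate counters need time between reads

        Console.WriteLine("CPU: {0:F1}%", cpu.NextValue());
        Console.WriteLine("Available memory: {0} MB", mem.NextValue());
    }
}
```

The sleep interval is also the natural place to plug in the user-defined refresh frequency from the question.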

Related

Best way to handle large amount of permanent data

I'm developing a PC app in Visual Studio where I'm showing the status of hundreds of sensors that are connected via WiFi. The thing is that I need to hold on to the sensor data even after I close the app, so I'm considering some form of permanent storage. These are the options I've considered:
1) My Sensor object is relatively compact with only a few properties. I could serialize all the objects before closing the app and load them every time the app starts anew.
2) I could throw all the properties (which are mostly strings and doubles) into a simple text file and create a custom protocol for storage and retrieval.
3) I could integrate a database with my app. Someone told me this is the best way to go about it, but I'm a bit hesitant seeing as I'm not familiar with DBs.
Which method would yield the best results in terms of resource usage and speed? Or is there some other, better way to go about this?
The first thing you need to do is understand your problem. For example, while the program is running, do you need to have everything in memory at the same time, or do you work with your sensors one at a time?
What is a "large amount of data"? For example, to me that will never be less than a million records (or a billion in some cases).
Once you know that, you shouldn't be scared of using something just because you are not familiar with it. Otherwise you are not looking for the best solution to your problem; you are just hacking around it in a way that feels comfortable.
That being said, you have several ways of doing this. As you said, you can serialize the data, use JSON for storage, and a few other alternatives, but if we are talking about a "large amount of data that we want to persist", I would always call for the use of a database (the name says a lot). If you don't need to have everything in memory at the same time, then I believe this is your best option.
I personally don't like them (again, a personal choice), but one way of avoiding most SQL while still working with your objects is to use an ORM like NHibernate (you will also need to learn how to use it properly so you don't make things slower).
If you do need to have everything loaded at the same time (most often that is not the case, so be sure of this), you need to decide what you want to keep and serialize it. If you want that data to be readable by another tool or organized in a given way, consider a data format like XML or JSON.
Also, you can use a memory-mapped file. The file is permanent and keeps the data between program runs, so you simply keep your data structs in the mmap-ed area and nothing more.
The MSDN documentation is here:
https://msdn.microsoft.com/en-us/library/windows/desktop/aa366556%28v=vs.85%29.aspx
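A minimal sketch of that idea using the managed wrapper in System.IO.MemoryMappedFiles (.NET 4.0+); the file name and sensor struct here are hypothetical:

```csharp
using System;
using System.IO;
using System.IO.MemoryMappedFiles;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
struct SensorRecord   // hypothetical compact sensor struct
{
    public int Id;
    public double Value;
}

class MmapDemo
{
    public static void Main()
    {
        // Back the map with a real file so the data survives program restarts.
        using (var mmf = MemoryMappedFile.CreateFromFile(
                   "sensors.dat", FileMode.OpenOrCreate, null, 1024))
        using (var accessor = mmf.CreateViewAccessor())
        {
            var rec = new SensorRecord { Id = 1, Value = 36.6 };
            accessor.Write(0, ref rec);      // persist the struct into the file

            SensorRecord back;
            accessor.Read(0, out back);      // read it straight back
            Console.WriteLine("{0}: {1}", back.Id, back.Value);
        }
    }
}
```

The capacity (1024 bytes here) would in practice be sized to the sensor count times the struct size.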
Since you need to load all the data once at the start of the program, the database option seems doubtful; a database is necessary when you need to load small pieces of data many times.
So the first two options seem preferable. I would advise hiding the specific solution behind an interface; then you can change it later.
Standard .NET serialization of the sensor array is probably the simplest option, and it will be easier to extend.
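A sketch of that "hide it behind an interface" advice, with standard .NET binary serialization as one possible backing store (the interface and type names are hypothetical):

```csharp
using System.Collections.Generic;

// The rest of the app only sees this interface, so the backing store
// (serialization, text file, database) can be swapped later.
interface ISensorStore
{
    void Save(IList<Sensor> sensors);
    IList<Sensor> Load();
}

[System.Serializable]
class Sensor            // hypothetical compact sensor object
{
    public string Name;
    public double LastReading;
}

// One implementation using standard .NET binary serialization.
class BinaryFileSensorStore : ISensorStore
{
    private readonly string _path;
    public BinaryFileSensorStore(string path) { _path = path; }

    public void Save(IList<Sensor> sensors)
    {
        using (var fs = System.IO.File.Create(_path))
            new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter()
                .Serialize(fs, new List<Sensor>(sensors));
    }

    public IList<Sensor> Load()
    {
        using (var fs = System.IO.File.OpenRead(_path))
            return (List<Sensor>)new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter()
                .Deserialize(fs);
    }
}
```

A database- or mmap-backed class would then just be another ISensorStore implementation.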

A model defined in a view?

I'm working on a website which will be used all over the world and has to be highly available at any time, anywhere on the planet. That's why I am trying to use every possible trick to minimize the need to recompile/restart the website when minor maintenance must occur.
The ability in ASP.NET MVC to edit a view and have it automatically and dynamically recompiled by the framework without service interruption is really great and perfectly fits my needs. But its usefulness is strongly limited if I cannot edit the underlying model in a similar way and must recompile the whole thing.
So my question: is it possible in any way (even an awful, hacky one) to define the view model class right inside the view itself, in a code block?
Otherwise, what avenues could I explore to achieve a "hot-editable" website (I mean: one whose parts can be recompiled while the site is still live, with changes taken into account straight away)?
Thank you so much in advance ! :-)
If you are that concerned about performance and up time, consider using a server farm to host your site. When you need to make updates, you can take each server down separately so that your site is always available.
However, most deployments only take a few seconds. Your application may need more or less time to spin up (EF view generation may take 10-20 secs for example), but as long as you update during off peak hours you should be fine.
Also, I would NEVER EVER recommend changing code on a live server. You will break something eventually.
Eventually I managed to achieve the goal with another strategy.
All views share the same model, called DataSource, which is essentially a recordset opened before rendering and closed afterwards (if needed, reads are performed by the Razor code inside the view).
The column list of the recordset may change live without making the site crash.
For forms and validation, metadata about the underlying stored procedure, taken from the database, drives code emission that dynamically creates a C# type, and that's it. Although a new type is generated each time the stored procedure changes, the app-pool recycling rate prevents too many obsolete types from lingering in memory.

Modular programming to accommodate future changes (software for scraping websites)

I have developed software in C# using Windows Forms to scrape selected websites for images.
The first problem I have is that the websites I monitor constantly change their look and feel, which means my code needs updating. I switched to using XPaths to isolate the divs I look for, but the div ids change too. I have thought of using a text file with the div XPath for each site, which the software would read through, saving me the time to edit and recompile the code. Is there a better way to solve this problem? Maybe CodeDOM?
Secondly, since every website uses different formatting and encoding, I had to rewrite the parts of the code using HtmlDocument, HttpWebResponse, HtmlNode and others for each of them, which ended up accounting for nearly half of my code. I could not unify them, since some sites need extra scraping and pagination and some do not. Is there a way to simplify this?
Lastly, I have the whole code in one class file with around 600 lines of code. The only methods I have are the BackgroundWorkers, the UI event handlers, one scraping method per site, and one method to save the images. Is it all right to have the whole code in one class? When I used to write Java, I often made use of multiple classes and called them as objects, which made changes to particular sections easier. Can I do the same in C#?
Is there a more efficient approach to building the software? I was thinking of making a class for each site, so that modifications could be made directly to the class in question, but that would cause a lot of lines to be repeated in each class. Or is it okay to keep the whole thing in one class file?
Thanks.
PS: This software is for personal use, but I think it is a good opportunity to learn and apply good programming practice.
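The class-per-site idea from the question need not repeat code: the shared steps can live in an abstract base class, with each site class overriding only what differs. A sketch (all names are hypothetical, and the XPath would come from the text file mentioned above):

```csharp
using System.Collections.Generic;

// Common scraping pipeline lives here once.
abstract class SiteScraper
{
    // XPath loaded from a config/text file, so a site layout change
    // means editing the file, not recompiling.
    public string ImageContainerXPath { get; set; }

    public List<string> Scrape(string html)
    {
        // Shared pre/post steps (logging, saving images, ...) would go here.
        return ExtractImageUrls(html);
    }

    // Each site overrides only the part that differs
    // (extra scraping, pagination, encoding quirks).
    protected abstract List<string> ExtractImageUrls(string html);
}

class ExampleSiteScraper : SiteScraper   // hypothetical site
{
    protected override List<string> ExtractImageUrls(string html)
    {
        // e.g. parse html with HtmlAgilityPack using ImageContainerXPath
        return new List<string>();
    }
}
```

The form then just holds a `List<SiteScraper>` and calls `Scrape` on each, which also keeps the 600-line class from growing further.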

ECG digital signal processing in C#

I'm looking for a C# .NET library for digital filtering (lowpass, highpass, notch) to filter ECG waveforms in real-time. Any suggestions?
If this is for non-commercial use, I have heard good things about the Signal Lab library. It is free for non-commercial use and $570 for commercial use. It is a bit of overkill if you just need low-pass, high-pass, and band-pass filters, but it does come with controls for visualizing the data if you do not have any yet.
If you just need the filters, you may want to write your own code for the three of them. You can check the Wikipedia pages for pseudocode examples of a low-pass filter and a high-pass filter; I did not quickly find a code example of a notch filter.
Here are some C examples of various filters, to give you an idea of what you need to do.
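For reference, the single-pole recurrences from those Wikipedia pages are only a few lines each in C#; a sketch (alpha would be derived from your cutoff frequency and sampling rate):

```csharp
using System;

class SimpleFilters
{
    // y[i] = y[i-1] + alpha * (x[i] - y[i-1])  -- exponential low-pass
    public static double[] LowPass(double[] x, double alpha)
    {
        var y = new double[x.Length];
        y[0] = x[0];
        for (int i = 1; i < x.Length; i++)
            y[i] = y[i - 1] + alpha * (x[i] - y[i - 1]);
        return y;
    }

    // y[i] = alpha * (y[i-1] + x[i] - x[i-1])  -- simple high-pass
    public static double[] HighPass(double[] x, double alpha)
    {
        var y = new double[x.Length];
        y[0] = x[0];
        for (int i = 1; i < x.Length; i++)
            y[i] = alpha * (y[i - 1] + x[i] - x[i - 1]);
        return y;
    }

    public static void Main()
    {
        var step = new double[] { 0, 1, 1, 1 };
        // Low-pass of a step input approaches 1 gradually:
        // with alpha = 0.5 the output is 0, 0.5, 0.75, 0.875
        Console.WriteLine(string.Join(", ", SimpleFilters.LowPass(step, 0.5)));
    }
}
```

A notch filter needs a second-order (biquad) section rather than these one-liners, which is likely why pseudocode for it is harder to find.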
If your data is arriving in discrete chunks, I would use Reactive Extensions (Rx). This lets the input control what happens next (reacting to data) instead of using "pull" operations. You can then react to the data by passing it through filters, and react to that output by displaying it or performing additional calculations.
If you only need notch, high-pass, and low-pass filters, these are trivial to write. As each chunk of data arrives, you can decide whether to pass it to the next step (and whether to modify the data first). I would imagine you could write this whole section of code in fewer than 20 lines (maybe fewer than 10) using Rx, and it would result in some pretty elegant code for this use case.
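A sketch of that push-based pipeline, assuming the Rx (System.Reactive) package is referenced; the filter step here is the simple exponential low-pass recurrence, not any particular library's filter:

```csharp
using System;
using System.Reactive.Linq;
using System.Reactive.Subjects;

class RxFilterDemo
{
    public static void Main()
    {
        var samples = new Subject<double>();
        double y = 0, alpha = 0.5;

        // React to each arriving sample: run it through the low-pass
        // step; the display/calculation stage just subscribes to the result.
        samples.Select(x => y = y + alpha * (x - y))
               .Subscribe(filtered => Console.WriteLine(filtered));

        foreach (var x in new double[] { 0, 1, 1, 1 })
            samples.OnNext(x);   // push the data as it arrives
    }
}
```

In a real ECG app, `samples.OnNext` would be called from the acquisition callback, and further `Select`/`Subscribe` stages can be chained for the other filters and the display.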
As far as I know you can write your own, because I did.
This should be a good starting point for you (coded in C++, but you can easily convert the syntax to C#): http://www.codeproject.com/KB/cpp/ecg_dsp.aspx
Third-party libraries tend not to be very flexible with the filter equation parameters, and only you will know the characteristics of your signal (amplitudes, frequency band, sampling rate, etc.).
If your ECG sampling rate is low, I recommend applying a wave-shaping algorithm first to get a smooth signal on the C# side before you apply the filters.

Should I store localization content in the application state

I am developing my first multilingual C# site and everything is going OK except for one crucial aspect. I'm not 100% sure of the best option for storing strings (typically single words) that will be translated by code in my code-behind pages.
On the front end of the site I am going to use ASP.NET resource files for the wording on the pages. That part is fine. However, this site makes XML calls, and the XML responses are only ever in English. I have been given an Excel sheet with all the words that can be returned by the XML, broken into the different languages, but I'm not sure how best to store/access this information. There are roughly 80 words x 7 languages.
I am thinking of creating a dictionary object for each language in my global.asax file at application start and keeping it in memory. The upside is that each dictionary object only has to be created once (until IIS restarts) and can be accessed by any user without being rebuilt; the downside is that I have 7 dictionary objects permanently held in memory. The server is Windows 2008 64-bit with 4 GB of RAM, so should I even be concerned about the memory taken up by this method?
What do you guys think would be the best way to store/retrieve different language words that would be used by all users?
Thanks for your input.
Rich
From what you say, you are looking at 560 words which need to differ by locale. This is a drop in the ocean. The resource-file method you have contemplated is fit for purpose, and I would recommend using it; resource files integrate with controls, so you will get the most out of them.
If it did trouble you, you could put the values in a sliding cache (with a sliding expiration of 20 minutes, for example), but I do not see anything wrong with your chosen solution.
IMO
Cheers,
Andrew
P.S. Have a read through this to see how you can find and bind values in different resource files to controls and literals, and use them programmatically:
http://msdn.microsoft.com/en-us/magazine/cc163566.aspx
As long as you are aware of the impact of doing so, then yes, storing this data in memory would be fine (as long as you have enough memory to do so). Once you know what is appropriate for the current user, tossing it into memory is fine.
You might look at something like MemCached Win32 or Velocity, though, to offload the storage to another app server. Use this even in your local application for the time being; that way, when it is time to push this to another server or grow your app, you have a clear separation of concerns defined at your caching layer.
Keep in mind that the more languages you support, the more you are storing in memory, so keep an eye on the amount of data held on your lone app server, as this could become overwhelming in time. Also, make sure that the keys you use are language-specific; otherwise you might find that you are serving a German menu to an English user.
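A sketch of the dictionary-per-language idea from the question, with the language baked into the lookup so a German entry can never be served to an English user (the class and the sample entry are hypothetical; in the real app, Load would parse the Excel/CSV sheet):

```csharp
using System;
using System.Collections.Generic;

static class TranslationStore
{
    // Outer key: culture code; inner key: the English word from the XML.
    private static readonly Dictionary<string, Dictionary<string, string>> Translations =
        new Dictionary<string, Dictionary<string, string>>(StringComparer.OrdinalIgnoreCase);

    // Would be called once from Application_Start in global.asax.
    public static void Load()
    {
        Translations["de"] = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
        {
            { "menu", "Menü" }   // sample row from the translation sheet
        };
    }

    public static string Translate(string culture, string english)
    {
        Dictionary<string, string> table;
        string translated;
        // Fall back to the English word if either lookup misses.
        if (Translations.TryGetValue(culture, out table) &&
            table.TryGetValue(english, out translated))
            return translated;
        return english;
    }
}
```

At 80 words x 7 languages the whole structure is a few kilobytes, which supports the answers above: memory is not a concern here.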
