"being moved to page file" event - c#

Is there an event (or similar) in C# that tells you when the current process is about to be moved from memory to the page file? And, for that matter, an event for when it comes back from the page file?
Or if those events do not exist, perhaps a better way for this to happen or suggestions?
The reason I would like to accomplish this:
I have an inventory management application which lets you look through everything, and it mostly keeps all of that information in a large List. I would like to clear out that list before it gets written to disk. It just becomes stale information and slows down the program when it has to be resumed. I would rather query the database for fresh information than load stale data.

No, there is no such event. Even if there were, memory is paged out at the, err, page level, and there is no easy way to know what objects reside in which pages. Add to that the fact that even if you knew that object X is in page Y, X likely has references to other objects that may reside in other pages. So even if X is currently paged in, actually using it may require paging in other memory. And the garbage collector can move objects in memory over their lifetime, making the whole problem more complicated.
That said, for your use case it sounds like you shouldn't even be storing the data in a large List. Can you just use normal data binding?

There is no such event. However, if your program happens to be using ASP.Net then you can use System.Web.Caching.Cache: it will dump your cached objects when there's memory pressure.
I don't know of a good one outside of ASP.Net, but perhaps someone else can suggest something?
edit: meklarian suggests trying out System.Runtime.Caching, probably specifically MemoryCache. Thank you meklarian!
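Outside of ASP.NET, a rough sketch with System.Runtime.Caching.MemoryCache might look like the following (the key name, the 10-minute sliding expiration, the InventoryItem type and the LoadInventoryFromDatabase helper are all placeholders, not anything from the question):
using System;
using System.Collections.Generic;
using System.Runtime.Caching;   // reference System.Runtime.Caching.dll (.NET 4 and later)

// Load once and cache; MemoryCache trims entries under memory pressure or
// when the sliding expiration elapses.
ObjectCache cache = MemoryCache.Default;
List<InventoryItem> inventory = LoadInventoryFromDatabase();   // hypothetical query helper
cache.Set("inventory", inventory,
          new CacheItemPolicy { SlidingExpiration = TimeSpan.FromMinutes(10) });

// Later: if the entry has been evicted, query the database again instead of
// relying on stale data.
var cached = cache.Get("inventory") as List<InventoryItem>;
if (cached == null)
{
    cached = LoadInventoryFromDatabase();
    cache.Set("inventory", cached, new CacheItemPolicy { SlidingExpiration = TimeSpan.FromMinutes(10) });
}
This doesn't tell you when paging happens, but it means the large list is dropped under memory pressure instead of lingering as stale data.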


Free memory explicitly

I know this may seem like a duplicate, but I think this specific case may be a bit different.
In my MVC application, I have a ViewModel containing a big List<MyComplexClass> used to display a table similar to an Excel sheet. Each MyComplexClass contains a list of MyComplexClass and a list of MyComplexColumn.
This results in pretty big memory usage, and now I have a problem. I am supposed to write a method that clears my big table and loads a different set of data into it. I could write something like:
MyViewModel.Lines = new List<MyComplexClass>();
MyViewModel.LoadNewSetOfData(SetOfData);
But, coming from a C background, just dropping my reference to the old List is not something I can do and still look at my face in the mirror every morning.
I saw Here, Here, and Here, that I can set the reference to null, and then call
GC.Collect()
But I was told that this is not really best practice, especially because it may really affect performance, and because I have no way to know whether the GC has actually disposed of this specific memory allocation.
Isn't there any way I can just call something like Free() and live on?
Thank you
Don't worry about it! Trying to "encourage" the garbage collector to reclaim memory isn't a great idea - just leave it to do its work in the background.
Just load your different set of data. If memory is getting low the GC will kick in to reclaim the memory without you having to do anything, and if there's plenty of memory available then you'll be able to load the new set without taking the hit of a garbage collection.
The short answer - don't worry about garbage collection. The only time I'd be concerned is if you're putting massive amounts of objects and data in a place where they might not get properly garbage collected at all. For example, you store tons of per-user data in ASP.NET session, and it sits there for a long time even though the user visits a few pages and leaves.
But as long as references to objects are released (as when they go out of scope) then garbage collection will do its job. It might not be perfect, but it's really, really good. The work and thought that went into developing good garbage collection in most cases probably exceeds the work that we put into developing the applications themselves.
Although we have the ability to trigger it, the design of the .NET framework deliberately allows us to focus on higher-level abstractions without having to worry about smaller details like when the memory is freed.
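To make that concrete with the code from the question - no GC.Collect() is needed:
// The old list simply becomes unreachable when the reference is overwritten;
// the GC reclaims it whenever it actually needs the memory.
MyViewModel.Lines = new List<MyComplexClass>();
MyViewModel.LoadNewSetOfData(SetOfData);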

Client Side Caching C# Forms Application

I have an application which queries the database for records. The records can number in the thousands, and this can drive up the memory usage of the process and eventually lead to a crash or slow response.
A paginated query is one solution for this, but the information in the records keeps changing. Hence, to give a consistent experience, we have to show the information as it was at the time the user made the query.
With paging, the content would be dynamically updated as the user moves from page to page. I believe client-side caching could solve this problem.
One way I am considering is to store the results to disk in XML format and query them using LINQ to XML. Are there any proven client-side caching mechanisms that work with a desktop application (not web)?
Have a look at a pattern like http://msdn.microsoft.com/en-us/library/ff664753
It talks about using the Enterprise Library Caching Application Block, which lets developers incorporate a local cache in their applications.
Read also http://www.codeproject.com/Articles/8977/Using-Cache-in-Your-WinForms-Applications
Enterprise Library 5.0 can be found here http://msdn.microsoft.com/en-us/library/ff632023
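For a rough idea of the shape of that API (assuming a default cache manager has been configured in app.config; the key, the Record type and the QueryDatabase helper are placeholders):
using System.Collections.Generic;
using Microsoft.Practices.EnterpriseLibrary.Caching;

// Resolve the cache manager defined in configuration.
ICacheManager cache = CacheFactory.GetCacheManager();

// Cache the current page of results under a key of your choosing.
List<Record> records = QueryDatabase();   // hypothetical query method
cache.Add("recordsPage1", records);

// Later: GetData returns null if the item has expired or been scavenged.
var cached = (List<Record>)cache.GetData("recordsPage1");

// Evict explicitly when you know the snapshot is stale.
cache.Remove("recordsPage1");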
Memory usage shouldn't really be an issue unless you are letting your cache grow indefinitely. There is little benefit to pre-fetching too many pages the user may never see, or in holding on to pages that the user has not viewed for a long time. Dynamically fetching the next/previous page would keep performance high, but you should clear from the cache pages that have been edited or are older than a certain timespan. Clearing from the cache simply requires discarding all references to the page (e.g. removing it from any lists or dictionaries) and allowing the garbage collector to do its work.
You can also potentially store a WeakReference to your objects and let the garbage collector collect them if it needs to, but this gives you less control over what is and isn't cached.
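A sketch of that idea for page-by-page results (PageOfRecords, LoadPageFromDatabase and the class itself are hypothetical names, not a proven library):
using System;
using System.Collections.Generic;

class WeakPageCache
{
    private readonly Dictionary<int, WeakReference> pages = new Dictionary<int, WeakReference>();

    public PageOfRecords GetPage(int pageNumber, Func<int, PageOfRecords> loadPage)
    {
        WeakReference weak;
        if (pages.TryGetValue(pageNumber, out weak))
        {
            var alive = weak.Target as PageOfRecords;
            if (alive != null)
                return alive;                 // still in memory, no database hit
        }

        var page = loadPage(pageNumber);      // e.g. LoadPageFromDatabase
        pages[pageNumber] = new WeakReference(page);
        return page;
    }
}
Note the trade-off mentioned above: the GC may collect a page at any time once nothing else references it, so you can't rely on a cache hit.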
Alternatively there are some very good third-party solutions for this, especially if it's a grid control. The DevExpress grid controls have an excellent server mode that can handle very large data sets with good performance.

How to manually release resources from images downloaded via the web in WP7?

Hi, I'm bringing back some items from a web service; each contains three strings, one of which is the path to an image. Now, when I start loading the images into a listbox, the memory usage, as expected, starts to go up, which isn't bad. But when I hit the back button the memory usage is still very high.
I'm thinking it has to do with the fact that I'm not releasing the resources taken up by the images. The idea comes from this answer => Question.
Does anyone know how to manually release these resources?
There's no Dispose() method on the Image or BitmapImage classes, so the best you can do is dispose the stream you're getting the data from. But I'd personally look for the problem somewhere else, since images should be GC'd automatically (and in fact they are).
There might be event handlers bound to your page from outside that make the GC unable to collect it, e.g. you have a reference to the page in your application settings or something. Or the GC just doesn't collect the dropped objects right away but waits a while - try moving back and forward several times and see whether the memory keeps rising.
Anyway, there's no need for (or way of) freeing the resources held by Image/BitmapImage manually - only the corresponding Stream, which usually doesn't change much since the data is cached in the image.
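For what it's worth, a sketch of the stream approach on WP7 (imageUrl and imageControl are placeholders) - the stream is the only piece you can release explicitly; the BitmapImage itself is left to the GC:
using System;
using System.Net;
using System.Windows.Media.Imaging;

var client = new WebClient();
client.OpenReadCompleted += (s, e) =>
{
    if (e.Error != null) return;
    using (var stream = e.Result)
    {
        var bitmap = new BitmapImage();
        bitmap.SetSource(stream);        // BitmapImage copies the data out of the stream
        imageControl.Source = bitmap;    // hand the bitmap to the UI
    }                                    // the stream is released here
};
client.OpenReadAsync(new Uri(imageUrl));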

Help finding a leak with Remote Performance Monitor

I'm trying to find the source of a memory leak in a Compact Framework application using the Remote Performance Monitor. I managed to remove a few minor ones related to brushes and other graphic objects, but I still don't have a clear idea of what is causing the main problem.
At this point the only objects that never seem to go back to their original count are System.String objects. I find this very weird, since I would have thought that for these objects to remain uncollected, the objects that contain them would have to remain as well, and yet no other type of object seems to be increasing along with the System.String count.
I'm trying to find out which new String objects are the ones that remain after the application returns to its original state (i.e. the login screen). The problem is that the application originally loads with about 2200 string objects, and then after process "X" the count increases by another 70 or so that never get collected. I don't know how to identify those 70 new objects in order to find out who is holding on to them and make the appropriate corrections.
Does anyone have experience with String objects not being collected? Is there any way to separate the new objects created during process "X" from those that were originally required by the application, so that I can know which ones are leaking? Any advice would be appreciated.
Thanks
UPDATE
Ok... there is something very strange going on. I'm starting to doubt whether there is a leak at all.
Let's say that I take a memory snapshot at the Login screen which is the original starting point of the application. Imagine that there are 1000 string objects in memory at this point. Now, if I login and choose an option from the menu, I'll take a snapshot once the new screen is loaded. Let's say that loading this form increases the string count by say 50 objects. When I logout and take a snapshot again at the login screen, only 25 of those objects have been collected, the rest will stay in memory from then on.
The strange thing is that if I keep repeating this process, no further string objects will accumulate. Instead of the string count increasing by 50, only 25 will be added at this point, and those same 25 will be collected once I go back to the login screen. I'm thinking that if this were an actual leak, then each time that I opened that screen the string count would increase permanently by 25, but this only happens the first time.
This happens to me on every new screen that I open. At first there is a small permanent increase in the overall string count, but once I've loaded that particular screen, any increase in the string count during its execution will be collected once I go back to the login screen.
All of this has led me to believe that perhaps those strings are part of the inner workings of the CLR or something like that. Could it be some sort of caching done by the runtime? Perhaps it is storing my string constants for faster loading? Something like that? I hope this wasn't too confusing.
If you're using CF 3.5, use the CLR Profiler instead of RPM. It will show you all objects and the roots that spawned them, and it lets you walk back up the root tree to figure out where they were allocated.
EDIT
So you're saying that you aren't actually getting OOMs or any aberrant behavior? It sounds like you're trying to optimise when it's not necessary. Those strings are likely things like Form captions, etc. that the JITter creates, and they would get collected if/when the owning objects get collected. What they are, though, really isn't terribly important if you're not actually seeing memory problems.

How many DataTable objects should I use in my C# app?

I'm an experienced programmer in a legacy (yet object-oriented) development tool and am making the switch to C#/.NET. I'm writing a small single-user app using SQL Server CE 3.5. I've read the conceptual DataSet and related docs, and my code works.
Now I want to make sure that I'm doing it "right" and get some feedback from experienced .NET/SQL Server coders - the kind you don't get from reading the docs.
I've noticed that I have code like this in a few places:
var myTableDataTable = new MyDataSet.MyTableDataTable();
myTableTableAdapter.Fill(myTableDataTable);
... // other code
In a single-user app, would you typically just do this once when the app starts - instantiate a DataTable object for each table and then store a reference to it, so that you only ever use that single object, which is already filled with data? That way you would only ever read the data from the db once instead of potentially multiple times. Or is the overhead of this so small that it just doesn't matter (and could it even be counterproductive with large tables)?
For CE, it's probably a non issue. If you were pushing this app to thousands of users and they were all hitting a centralized DB, you might want to spend some time on optimization. In a single-user instance DB like CE, unless you've got data that says you need to optimize, I wouldn't spend any time worrying about it. Premature optimization, etc.
The way to decide comes down to two main things:
1. Is the data going to be accessed constantly?
2. Is there a lot of data?
If you are constantly using the data in the tables, then load them on first use.
If you only occasionally use the data, fill the table when you need it and then discard it.
For example, if you have 10 GUI screens and only use myTableDataTable on 1 of them, read it in only on that screen.
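For illustration, a minimal "load on first use" sketch using the names from the question (wrapping it in a property is just one way to do it):
private MyDataSet.MyTableDataTable myTableDataTable;

private MyDataSet.MyTableDataTable MyTable
{
    get
    {
        if (myTableDataTable == null)
        {
            myTableDataTable = new MyDataSet.MyTableDataTable();
            myTableTableAdapter.Fill(myTableDataTable);   // hits the database only on first access
        }
        return myTableDataTable;
    }
}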
The choice really doesn't depend on C# itself. It comes down to a balance between:
How often do you use the data in your code?
Does the data ever change (and do you care if it does)?
What's the relative (time) cost of getting the data again, compared to everything else your code does?
How much value do you put on performance, versus developer effort/time (for this particular application)?
As a general rule: for production applications, where the data doesn't change often, I would probably create the DataTable once and then hold onto the reference as you mention. I would also consider putting the data in a typed collection/list/dictionary, instead of the generic DataTable class, if nothing else because it's easier to let the compiler catch my typing mistakes.
For a simple utility you run for yourself that "starts, does its thing and ends", it's probably not worth the effort.
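If you do go the typed-collection route, a minimal sketch of copying a filled DataTable into a list (the Item class and the column names are hypothetical; AsEnumerable/Field need a reference to System.Data.DataSetExtensions):
using System.Collections.Generic;
using System.Data;
using System.Linq;

public class Item
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static List<Item> ToItems(DataTable table)
{
    return table.AsEnumerable()
                .Select(row => new Item
                {
                    Id = row.Field<int>("Id"),       // column names are placeholders
                    Name = row.Field<string>("Name")
                })
                .ToList();
}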
You are asking about Windows CE. In that particular case, I would most likely do the query only once and hold onto the results. Mobile OSs have extra constraints on battery and storage that desktop software doesn't have. Basically, a mobile OS makes bullet #4 much more important.
Every time you add another retrieval call to SQL, you call external libraries more often, which means you are probably running longer, allocating and releasing memory more often (which adds fragmentation), and possibly causing the database to be re-read from flash memory. It's most likely a lot better to hold onto the data once you have it, assuming that you can (see bullet #2).
It's easier to figure out the answer to this question when you think about datasets as being a "session" of data. You fill the datasets; you work with them; and then you put the data back or discard it when you're done. So you need to ask questions like this:
How current does the data need to be? Do you always need to have the very very latest, or will the database not change that frequently?
What are you using the data for? If you're just using it for reports, then you can easily fill a dataset, run your report, then throw the dataset away, and next time just make a new one. That'll give you more current data anyway.
Just how much data are we talking about? You've said you're working with a relatively small dataset, so there's not a major memory impact if you load it all in memory and hold it there forever.
Since you say it's a single-user app without a lot of data, I think you're safe loading everything in at the beginning, using it in your datasets, and then updating on close.
The main thing you need to be concerned with in this scenario is: What if the app exits abnormally, due to a crash, power outage, etc.? Will the user lose all his work? But as it happens, datasets are extremely easy to serialize, so you can fairly easily implement a "save every so often" procedure to serialize the dataset contents to disk so the user won't lose a lot of work.
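A minimal sketch of that "save every so often" idea, using DataSet's built-in XML support (the file name is an arbitrary choice):
using System.Data;
using System.IO;

void SaveSnapshot(DataSet data)
{
    // WriteSchema keeps the table definitions alongside the data.
    data.WriteXml("snapshot.xml", XmlWriteMode.WriteSchema);
}

DataSet LoadSnapshot()
{
    var data = new DataSet();
    if (File.Exists("snapshot.xml"))
        data.ReadXml("snapshot.xml", XmlReadMode.ReadSchema);
    return data;
}
You could call SaveSnapshot from a timer or whenever the user finishes an edit.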
