I have an application with around 20+ pages, and I create all of them when the application starts. Could this cause an out-of-memory exception in the future? Is creating everything up front a better idea, or should I create pages only as I need them?
If those pages are created only once, then the memory usage for them won't change. Any objects you create on those pages, however, will cause the memory consumption to increase.
As for your question, creating the pages at the start of the application should be fine. Just be aware that you will have to hold references to them in such a way that the garbage collector will not clear them out of memory. Also make sure you don't create new instances of them each time they are displayed :)
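One way to satisfy both constraints is to keep the single instances behind a small cache. A minimal sketch, assuming nothing about your page framework; the `PageCache` class and its usage are invented for illustration:

```csharp
using System;
using System.Collections.Generic;

// Illustrative cache: each page type is created exactly once, and the
// dictionary's strong references keep the instances from being collected.
public static class PageCache
{
    static readonly Dictionary<Type, object> _pages = new Dictionary<Type, object>();

    public static T Get<T>() where T : new()
    {
        if (!_pages.TryGetValue(typeof(T), out var page))
        {
            page = new T();           // constructed only on first request
            _pages[typeof(T)] = page; // strong reference keeps it alive
        }
        return (T)page;
    }
}
```

Calling `PageCache.Get<SettingsPage>()` then returns the same instance every time instead of constructing a new page per navigation.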
Related
Suppose I've got something like the following:
a bunch of data classes
a bunch of List collections of most of the data classes
loads of List&lt;List&lt;List&lt;Data&gt;&gt;&gt; objects to represent what are effectively 3D arrays
a good few App.MyViewModel view models used in different pages due to thread access
the view models are more complex than I'd like, with tonnes of properties linking back to point 3
in the end, each ListView template is created from an ObservableCollection&lt;String&gt; generated from one of the List&lt;Data&gt; collections
During the lifecycle, those lists might be renewed many times, which I would hope recycles the previously used memory? The ListView rows/cells are created as Grids.
On small list views of up to tens of rows it works well and fast, without increasing memory use too much.
However, on large data sets containing thousands of rows, even scrolling the ListView sometimes crashes the app, and memory increases dramatically with each portion of data.
So the question really is: from your own experience, what would you recommend for troubleshooting, and perhaps for redesigning the approach?
You should really look at the Xamarin Profiler.
Xamarin Profiler
The Xamarin Profiler has a number of instruments available for profiling: Allocations, Cycles, and the Time Profiler.
There could be so many problems that it's impossible to know where to start. As for design, it's again hard to know how to refactor your app, because we don't know what you are trying to achieve. If you need to use lists, you need to use them, and there isn't much you can do about that.
However, you need to start from first principles and make sure you are doing only what you need to do: instantiate only what you need to instantiate, keep your XAML and UI to the minimal amount of cyclic calculation possible, and, last of all, make sure your view models and objects are going out of scope and being garbage collected.
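One common reason view models never go out of scope is a lingering event subscription on a long-lived object. A minimal sketch of the pattern to check for; the `Publisher` and `MyViewModel` names here are invented for illustration:

```csharp
using System;

// Illustrative long-lived event source.
public class Publisher
{
    public event EventHandler DataChanged;
    public void Raise() => DataChanged?.Invoke(this, EventArgs.Empty);
}

// As long as this view model stays subscribed, the publisher's event
// holds a strong reference to it, so the GC can never collect it.
public class MyViewModel : IDisposable
{
    readonly Publisher _publisher;

    public MyViewModel(Publisher publisher)
    {
        _publisher = publisher;
        _publisher.DataChanged += OnDataChanged;
    }

    void OnDataChanged(object sender, EventArgs e) { /* refresh bindings */ }

    public void Dispose()
    {
        // Detach so the publisher no longer keeps this instance alive.
        _publisher.DataChanged -= OnDataChanged;
    }
}
```

If every page disposes its view model (or otherwise unsubscribes) when it is done, the objects can actually go out of scope as described above.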
I have developed a UWP application with nearly 20 windows in it.
Every window contains a lot of XAML controls. For some time it works fine.
But after some usage the application gets very slow.
After doing some R&amp;D I came to know that this is called a memory leak.
As far as I know, in .NET the garbage collector is supposed to take care of this, if I am not wrong. It seems like in a UWP application that is not happening. So I thought I should call GC.Collect() in the Page Unloaded event.
Is that the correct approach, or is there anything else I need to do to release the memory used by the window controls?
Performance optimization is a vast subject, and it may not be possible to answer an open-ended question (without knowledge of your environment, architecture, etc.).
However, you can use the Visual Studio profiler to measure and track performance and to find the areas where you need to take action; these can be:
Data binding
UI Virtualization
Image rendering
Visual Tree Size
For further reading, these URLs may also help you:
ms docs and this blog
The GC takes care of orphaned objects, i.e. objects that are no longer referenced by any other class. When the GC finds objects like these, it removes them from memory. Memory leaks happen when an object is still referenced by another class even after you are done with it. This means you need to look at your code and find where that is happening: you need to help the GC do its job by making sure you no longer hold references to objects you don't need.
I also wouldn't recommend calling GC.Collect() in the page Unloaded event, since the GC has to freeze threads in order to collect, which might hurt performance.
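Instead of forcing a collection, break the references in the Unloaded handler so the GC can reclaim the page on its own. A sketch, assuming a UWP page; the `DataService` class and its `Updated` event are hypothetical stand-ins for any long-lived object your page subscribes to:

```csharp
using System;
using Windows.UI.Xaml;
using Windows.UI.Xaml.Controls;

// Hypothetical long-lived service whose static event can keep pages alive.
public static class DataService
{
    public static event EventHandler Updated;
}

public sealed partial class DetailsPage : Page
{
    DispatcherTimer _timer;

    public DetailsPage()
    {
        InitializeComponent();
        Loaded += OnLoaded;
        Unloaded += OnUnloaded;
    }

    void OnLoaded(object sender, RoutedEventArgs e)
    {
        _timer = new DispatcherTimer();
        _timer.Tick += OnTick;
        _timer.Start();
        DataService.Updated += OnDataUpdated;  // static event: strong reference
    }

    void OnUnloaded(object sender, RoutedEventArgs e)
    {
        _timer.Stop();
        _timer.Tick -= OnTick;
        _timer = null;
        DataService.Updated -= OnDataUpdated;  // without this, the page leaks
        DataContext = null;                    // drop the view model reference
    }

    void OnTick(object sender, object e) { /* ... */ }
    void OnDataUpdated(object sender, EventArgs e) { /* ... */ }
}
```

With the references released like this, the collector reclaims the page whenever it decides to run, without any GC.Collect() call.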
Is there an event (or similar) in C# to tell when the current process is about to be moved from memory to the page file? And, for that matter, an event for coming back from the page file?
Or, if those events do not exist, is there a better way to make this happen? Any suggestions?
The reason I would like to accomplish this:
I have an inventory management application which allows you to look through everything, and it mostly keeps all that information in a large List. I would like to clear that list out before it gets written to disk: the information just becomes stale and slows down the program's return when it has to be resumed. I would rather query the database for fresh information instead of loading stale data.
No, there is no such event. Even if there were, memory is paged out at the, err, page level, and there is no easy way to know what objects reside in which pages. Add to that the fact that even if you knew that object X is in page Y, X likely has references to other objects that may reside in other pages. So even if X is currently paged in, actually using it may require paging in other memory. And the garbage collector can move objects in memory over their lifetime, making the whole problem more complicated.
That said, for your use case it sounds like you shouldn't even be storing the data in a large List. Can you just use normal data binding?
There is no such event. However, if your program happens to be using ASP.Net then you can use System.Web.Caching.Cache: it will dump your cached objects when there's memory pressure.
I don't know of a good one outside of ASP.Net, but perhaps someone else can suggest something?
edit: meklarian suggests trying out System.Runtime.Caching, probably specifically MemoryCache. Thank you, meklarian!
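For completeness, a minimal sketch of MemoryCache from System.Runtime.Caching; the "inventory" key and the `LoadInventory` helper are made up for illustration:

```csharp
using System;
using System.Runtime.Caching;

class Program
{
    static void Main()
    {
        var cache = MemoryCache.Default;

        // Expire the entry after five minutes of inactivity; under memory
        // pressure the cache may also evict entries on its own.
        var policy = new CacheItemPolicy
        {
            SlidingExpiration = TimeSpan.FromMinutes(5)
        };
        cache.Set("inventory", LoadInventory(), policy);

        // Later: fall back to the database if the entry was evicted.
        var items = cache.Get("inventory") as string[] ?? LoadInventory();
        Console.WriteLine(items.Length);
    }

    // Stand-in for the real database query.
    static string[] LoadInventory() => new[] { "widget", "gadget" };
}
```

The eviction-under-pressure behavior is roughly what the ASP.Net cache offers, but it works in any .NET process.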
I am currently working on performance tuning an existing C# website. There is a class, say MyUtil.cs, which has been used extensively across all web pages. On some pages around 10-12 instances of MyUtil are created. I ran the Redgate performance profiler, and according to it the object creation is a costly operation.
Please note that each instance sets specific properties and performs a specific operation, so I cannot reuse an object as-is; I would have to reset all the member variables.
I want to optimize this code and have thought about the following options. Kindly help me evaluate which is the better approach here:
(1) Create a "Reset" method in the MyUtil class which resets all the member variables (there are 167 of those :(..) so that I can reuse one single object in a page class.
(2) Continue with the multiple object creation (I do have a Dispose() method in MyUtil).
(3) I thought of object pooling, but again I will have to reset the members. I think it's better to pool objects at page level and release them, instead of keeping them alive at the project level.
Any reply on this will be appreciated. Thanks in advance!
Every app has multiple opportunities for speedup of different sizes, like kinds of food on your plate.
Creating and initializing objects can typically be one of these, and can typically be a large one.
Whenever I see that object creation/initialization is taking a large fraction of time, I recycle used objects.
It's a basic technique, and it can make a big difference.
But I only do it if I know that it will save a healthy fraction of time.
Don't just do it on general principles.
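If profiling does show creation dominating, the recycling can be as simple as this sketch; the pool below is illustrative only, it assumes MyUtil has a `Reset()` method, and a pool shared across requests would additionally need thread safety:

```csharp
using System.Collections.Generic;

// Stand-in for the class from the question.
public class MyUtil
{
    public void Reset() { /* clear the 167 members before reuse */ }
}

// Illustrative page-level pool: objects are reused instead of re-created.
public class MyUtilPool
{
    readonly Stack<MyUtil> _free = new Stack<MyUtil>();

    public MyUtil Rent() => _free.Count > 0 ? _free.Pop() : new MyUtil();

    public void Return(MyUtil item)
    {
        item.Reset();     // return the instance to a clean state
        _free.Push(item);
    }
}
```

The recycling only pays off if Rent/Return plus Reset is measurably cheaper than construction, which is exactly what the profiler should tell you.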
I would recommend that you always create new objects instead of resetting them, for the following reasons:
The GC is smart enough to classify objects and assign them a generation, depending on how the objects are used in your code; its tuning is based on the execution pattern of the code as well as its architecture.
You will get the optimum result from the hardware if you use the GC and let it manage the process, as it also decides the garbage collection threshold based on the hardware configuration and the available system resources.
Apart from that, your code will be much easier to manage. (Though this is not a direct benefit, you should give it at least some weight.)
Creating an object pool at page level is also not a good idea, because to reuse an object you have to do two things: fetch it from the pool and reset its properties. In other words, you also have to manage the pool, which is an additional burden.
Creating a single instance and reusing it after resetting its properties might also not be a good idea, because when you need more than one instance at a time, it will not work.
So the conclusion is: you should keep creating objects in the page and let the garbage collector do its job.
I have an application which queries the database for records. The records can number in the thousands, which can shoot up the memory of the process and eventually lead to a crash or slow response.
A paginated query is a solution for this, but the information in the records keeps changing. To give a consistent experience, we are forced to show the information as it was at the time the user made the query.
With plain paging, moving from page to page would dynamically update the content, so I believe client-side caching could solve this problem.
One option I am considering is to store the results to disk in XML format and query them using LINQ to XML. Are there any proven client-side caching mechanisms that work with a desktop application (not web)?
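The LINQ to XML idea from the question would look roughly like this sketch; the `Record` element shape, attribute names, and file name are made up for illustration:

```csharp
using System;
using System.Linq;
using System.Xml.Linq;

class Program
{
    static void Main()
    {
        // Snapshot the query results to disk once, at query time...
        var snapshot = new XElement("Records",
            new XElement("Record", new XAttribute("Id", 1), new XAttribute("Name", "bolt")),
            new XElement("Record", new XAttribute("Id", 2), new XAttribute("Name", "nut")));
        snapshot.Save("snapshot.xml");

        // ...then page through the file instead of re-querying the database,
        // so every page reflects the data as it was at query time.
        const int pageSize = 1;
        int pageIndex = 0;
        var page = XElement.Load("snapshot.xml")
            .Elements("Record")
            .Skip(pageIndex * pageSize)
            .Take(pageSize)
            .Select(r => (string)r.Attribute("Name"))
            .ToList();
        Console.WriteLine(page[0]); // "bolt"
    }
}
```

This gives the frozen-at-query-time view the question asks for, at the cost of reading the snapshot per page rather than holding it all in memory.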
See a pattern like http://msdn.microsoft.com/en-us/library/ff664753
It describes the use of the Enterprise Library Caching Application Block, which lets developers incorporate a local cache in their applications.
Read also http://www.codeproject.com/Articles/8977/Using-Cache-in-Your-WinForms-Applications
Enterprise Library 5.0 can be found here http://msdn.microsoft.com/en-us/library/ff632023
Memory usage shouldn't really be an issue unless you are letting your cache grow indefinitely. There is little benefit in pre-fetching too many pages the user may never see, or in holding on to pages that the user has not viewed for a long time. Dynamically fetching the next/previous page would keep performance high, but you should clear from the cache any pages that have been edited or are older than a certain timespan. Clearing from the cache simply requires discarding all references to the page (e.g. removing it from any lists or dictionaries) and allowing the garbage collector to do its work.
You can also potentially store a WeakReference to your objects and let the garbage collector collect them if it needs to, but this gives you less control over what is and isn't cached.
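A weak cache along those lines might look like this sketch; the `WeakPageCache` name and the `load` delegate are invented for illustration:

```csharp
using System;
using System.Collections.Generic;

// Illustrative weak cache: entries can be reclaimed by the GC under
// memory pressure, in which case we silently re-load them.
public class WeakPageCache<T> where T : class
{
    readonly Dictionary<int, WeakReference<T>> _pages =
        new Dictionary<int, WeakReference<T>>();

    public T GetPage(int index, Func<int, T> load)
    {
        if (_pages.TryGetValue(index, out var weak) && weak.TryGetTarget(out var page))
            return page;               // still alive: serve from the cache

        page = load(index);            // collected (or never cached): re-load
        _pages[index] = new WeakReference<T>(page);
        return page;
    }
}
```

The trade-off is exactly the one noted above: the GC, not you, decides when a cached page disappears.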
Alternatively there are some very good third party solutions for this, especially if its a grid control. The DevExpress grid controls have an excellent server mode that can handle very large data sets with good performance.