I am getting around 200K rows from the database. Every time I click the pagination, it fetches the records from the DB again. How do I avoid that? Caching?
public ActionResult Demo(int? page, string sortOrder, string recordSize)
{
    int pageSize = string.IsNullOrEmpty(recordSize) ? 20 : Convert.ToInt32(recordSize);
    int pageNumber = page ?? 1;
    var customers = DBHelper.GetCustomers(); // pulls all 200K rows on every request
    return View(customers.ToPagedList(pageNumber, pageSize));
}
You should never pull back all rows to cache or use. That is simply too much data, and you will take a massive performance hit doing so. Only ever pull back the data you actually intend to use.
If you are using LINQ to SQL, Entity Framework, or similar data access, you can use LINQ to retrieve just the page you want. This is the most efficient solution because you never pull back all 200K rows.
var customers = Customers.OrderBy(c => c.Id) // EF requires an ordering before Skip
                         .Skip(pageIndex * pageSize)
                         .Take(pageSize);
Caching could improve retrieval speed the second time around, but your initial speed problem is pulling all 200K rows from the database on every request. Yes, you could pull them all back, cache them, and page over the cache, but this is not an ideal approach. Ideal caching would be to cache each page, with a dependency on the previous page, so that you can properly invalidate your cache.
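As a rough sketch, combining the Skip/Take query with per-page caching might look like this (MyDbContext, the Customer key, the cache key format, and the 5-minute lifetime are all assumptions, not your actual code):

public ActionResult Demo(int? page, string recordSize)
{
    int pageSize = string.IsNullOrEmpty(recordSize) ? 20 : Convert.ToInt32(recordSize);
    int pageNumber = page ?? 1;
    string cacheKey = string.Format("customers-page-{0}-{1}", pageNumber, pageSize);

    var customers = HttpRuntime.Cache[cacheKey] as List<Customer>;
    if (customers == null)
    {
        using (var db = new MyDbContext()) // hypothetical EF context
        {
            customers = db.Customers
                          .OrderBy(c => c.Id) // stable order gives stable pages
                          .Skip((pageNumber - 1) * pageSize)
                          .Take(pageSize)
                          .ToList();
        }
        // cache this page for 5 minutes; each page gets its own cache entry
        HttpRuntime.Cache.Insert(cacheKey, customers, null,
            DateTime.Now.AddMinutes(5), System.Web.Caching.Cache.NoSlidingExpiration);
    }
    return View(customers);
}

If you still want the pager to show the total page count, query the count separately and, if you are using the PagedList package, build the model with new StaticPagedList<Customer>(customers, pageNumber, pageSize, totalCount) instead of calling ToPagedList over the full set.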
I have an MVC project with a WCF service.
When I display a list of data, I do not want to load everything from the database/service and page on the client. I want server-side paging: if I have 100 records and my page size is 10, then when a user clicks page 1 it should retrieve only the first 10 records from the database, and when a user clicks page 3 it should retrieve only the corresponding ten records.
I am not using Angular or any similar front-end framework.
Can someone guide me on how to do it?
public ActionResult Index(int pageNo = 1)
{
    // ...
    MyViewModel[] myViewModelListArray = MyService.GetData();
    // When I create this PagedList, MyService.GetData() has to retrieve all the
    // records in order to show more than a single page number. But if GetData()
    // is changed to retrieve a subset, it only shows a single page number.
    // What I want is to show the correct number of pages (if there are 50 records
    // and pageSize is 10, show pages 1,2,3,4,5) while retrieving only 10 records at a time.
    PagedList<MyViewModel> pageList = new PagedList<MyViewModel>(myViewModelListArray, pageNo, pageSizeListing);
    // ...
    return View(pageList);
}
The best approach is to use the LINQ to Entities operators Skip and Take.
For example, to page:
int items_per_page = 10;
MyViewModel[] myViewModelListArray = MyService.GetData()
    .OrderBy(p => p.ID)
    .Skip((pageNo - 1) * items_per_page)
    .Take(items_per_page)
    .ToArray();
NOTE: The data must be ordered so the pages are consistent (here I ordered by an arbitrary field, ID). Also, some databases require an ORDER BY to apply LIMIT or TOP (which is how Take/Skip are implemented).
I wrote it that way because I don't know how you are retrieving the data.
But instead of retrieving the full list with GetData and then filtering it, it is better to include the pagination in the query inside GetData, so you don't retrieve unnecessary data.
Add page-size and page-number parameters to your service method, and make the result an object that returns a TotalCount and an Items list (Items being the items on the current page). Then you can use those values to create the PagedList.
Inside your business logic code you will run two queries: one for the count of items and one for the items on the page.
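A minimal sketch of that shape (the DTO, the MyDbContext context, and the entity names are made up, not from your project):

// Hypothetical paged-result DTO and service method; adjust names to your project.
public class PagedResult<T>
{
    public int TotalCount { get; set; }
    public List<T> Items { get; set; }
}

public PagedResult<MyViewModel> GetData(int pageNo, int pageSize)
{
    using (var db = new MyDbContext()) // hypothetical EF context
    {
        var query = db.MyEntities.OrderBy(e => e.ID); // stable order for paging
        return new PagedResult<MyViewModel>
        {
            TotalCount = query.Count(),                 // query 1: total row count
            Items = query.Skip((pageNo - 1) * pageSize) // query 2: just one page
                         .Take(pageSize)
                         .Select(e => new MyViewModel { ID = e.ID }) // project what you need
                         .ToList()
        };
    }
}

If you are using the PagedList package, the controller can then build the page model with new StaticPagedList<MyViewModel>(result.Items, pageNo, pageSize, result.TotalCount), so the pager still shows the correct number of pages.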
Also, if you are starting the project now, do yourself a favor and remove the useless WCF service from your architecture.
I am storing images in SQL Server as byte[] and then retrieving them using ViewData as follows (there are nine images, stored as byte[], in the database that I am retrieving):
Controller action:
public ActionResult show_pics2()
{
    using (cygnussolutionEntities6 db = new cygnussolutionEntities6())
    {
        // db.CommandTimeout = int.MaxValue; // for testing
        var querylist = (from f in db.Images
                         select f.ImageContent);
        // store the (not yet executed) query in ViewBag
        ViewBag.DataList = querylist;
        // materialise the query and store the list in ViewData
        ViewData["images"] = querylist.ToList();
        return View();
    }
}
In the view, I loop over the images with a foreach over ViewData and display them, but the page takes very long to load in the browser. Does anyone know why this is taking so long?
It would be worth putting some timers through your code to see where the slowness is occurring: is the database query slow, is materialising the result into a list slow, or does the slowness occur in the view when processing the images?
Also worth considering: are you eager-loading any properties of the Images table? That will affect performance depending on how many extra navigation properties are being loaded.
Are you able to post the code of your view and your entity class?
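For example, a crude Stopwatch check (illustrative only) would separate the query cost from the render cost:

var sw = System.Diagnostics.Stopwatch.StartNew();
var images = db.Images.Select(f => f.ImageContent).ToList(); // executes the SQL
System.Diagnostics.Debug.WriteLine("DB fetch + materialise: " + sw.ElapsedMilliseconds + " ms");
// Compare this number with the total page time in the browser's network tab;
// the difference is roughly the cost of rendering the images in the view.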
If you are going to ToList() something and also use it elsewhere, make both uses use the result of that ToList() (unless you really need to keep the queryable interface). Then you only fetch the list once.
Don't obtain several blobs and use them all on one page; obtain several IDs and use them to reference other resources through <img src="theImage/#id"> or similar, then have that resource served by an action that retrieves only the single image. Each such resource then loads one image, and they can load in parallel with each other.
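A rough sketch of that pattern (the action name, route, and ImageId key are invented for illustration):

// GET /Images/Show/5 -- serves exactly one image, so the browser can
// request the nine images in parallel and cache each one separately.
public ActionResult Show(int id)
{
    using (var db = new cygnussolutionEntities6())
    {
        var content = db.Images
                        .Where(i => i.ImageId == id)   // "ImageId" is an assumed key name
                        .Select(i => i.ImageContent)
                        .FirstOrDefault();
        if (content == null)
            return HttpNotFound();
        return File(content, "image/jpeg"); // assumes JPEG; store the MIME type if it varies
    }
}

The view then only renders tags such as <img src="@Url.Action("Show", "Images", new { id = item.ImageId })" />.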
Unless the images are all small, use streams built on ADO.NET access to the blob and stream out a chunk of 4096 bytes at a time, rather than EF. While this means giving up a lot of what EF gives you (and a lot of what MVC gives you, since you'll have to do it in a result rather than a view), it allows memory-efficient streaming that can begin with the first loaded chunk.
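Sketched very roughly, that ADO.NET streaming approach could look like this (the connection string, table, and column names are assumptions):

// Streams the blob in 4096-byte chunks instead of loading it all into memory.
public void StreamImage(int id)
{
    using (var conn = new System.Data.SqlClient.SqlConnection(connectionString))
    using (var cmd = new System.Data.SqlClient.SqlCommand(
        "SELECT ImageContent FROM Images WHERE ImageId = @id", conn))
    {
        cmd.Parameters.AddWithValue("@id", id);
        conn.Open();
        // SequentialAccess reads the blob incrementally rather than buffering it whole.
        using (var reader = cmd.ExecuteReader(System.Data.CommandBehavior.SequentialAccess))
        {
            if (reader.Read())
            {
                Response.ContentType = "image/jpeg"; // assumed type
                byte[] buffer = new byte[4096];
                long offset = 0;
                long read;
                while ((read = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
                {
                    Response.OutputStream.Write(buffer, 0, (int)read);
                    offset += read;
                }
            }
        }
    }
}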
I want to build a grid. I get 1000 rows of data from SQL Server through WCF, and at first I put 10 rows in the grid in the view; when the user scrolls I get rows 10-20 from the controller, then rows 20-30 on the next scroll, and so on up to rows 990-1000. But I must go to SQL Server through WCF only once for all 1000 rows; I cannot go to SQL Server on every scroll (e.g. 0-10, 10-20, 20-30). I put 10 rows in the grid in the view; the problem is the other 990 rows in the controller.
How do I keep those 990 rows of data in the controller?
You can make use of caching for this: either use System.Web.Caching or use MemoryCache.
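For example, with MemoryCache you could fetch the 1000 rows once and serve every scroll request from the cached list (the cache key, the MyRow type, and MyWcfClient are placeholders, not real names from your project):

using System.Runtime.Caching;

public ActionResult GetGridPage(int skip, int take)
{
    var cache = MemoryCache.Default;
    var rows = cache["gridRows"] as List<MyRow>; // "MyRow" is a placeholder type
    if (rows == null)
    {
        rows = MyWcfClient.GetAllRows();         // the single WCF/SQL round trip
        cache.Set("gridRows", rows,
                  new CacheItemPolicy { AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(10) });
    }
    return Json(rows.Skip(skip).Take(take), JsonRequestBehavior.AllowGet);
}

Note that MemoryCache is shared across all users, so include a user identifier in the key if the data is per-user.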
Depending on your setup, you might also be able to use OutputCache:
[OutputCache(Duration = 10, VaryByParam = "none")]
public ActionResult Result()
{
    // Data() stands in for whatever produces the result; it only runs
    // once every 10 seconds, after which the cached output is served.
    return Data();
}
See http://www.asp.net/mvc/overview/older-versions-1/controllers-and-routing/improving-performance-with-output-caching-cs for more on this.
Your description is quite confusing, so sorry if I have misunderstood your requirement.
If it involves 1000+ records, session state is not a good option, especially if your application already makes other use of session.
Since you are using MVC, you can take advantage of options such as ViewData and TempData.
I have used TempData before and it can carry a large amount of data (I did not measure how much, but it was quite large), so it should be a much better option than session.
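A minimal sketch of the TempData approach (MyWcfClient and MyRow are placeholders; also bear in mind that MVC's default TempData provider stores its data in session state under the covers, and entries normally survive only one subsequent request unless you call Keep()):

// First request: fetch once and stash the rows.
TempData["gridRows"] = MyWcfClient.GetAllRows();

// Later scroll requests: read them back and keep them alive for the next request.
var rows = TempData["gridRows"] as List<MyRow>;
TempData.Keep("gridRows");
var page = rows.Skip(20).Take(10).ToList();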
I'm experiencing OutOfMemory exceptions in my application when fetching from the database. It is a C# .NET application using LINQ to SQL.
I have tried using GC.GetTotalMemory() to see how much memory is taken up before and after the call to the database. This gives a nice, although not quite accurate, picture of what is going on. In the Windows Task Manager I can see that the Peak Working Set is no smaller when fetching the data in a paged manner using the following code:
public static void PreloadPaged()
{
    int NoPoints = PointRepository.Count();
    int pagesize = 50000;
    int fetchedRows = 0;
    while (fetchedRows < NoPoints)
    {
        PreloadPointEntity.Points.AddRange(PointRepository.ReadPaged(pagesize, fetchedRows));
        PointRepository.ReadPointCollections();
        PreloadPointEntity.PointCollections.Count();
        fetchedRows += pagesize;
    }
}
private static List<PointEntity> ReadPaged(int pagesize, int fetchedRows)
{
    using (DataModel dataContext = InstantiateDataModel())
    {
        dataContext.ObjectTrackingEnabled = false; // read-only load: skip change tracking
        // Page in SQL first (Skip/Take translate to ROW_NUMBER/TOP),
        // then run the ToEntity() projection on the client.
        var points = dataContext.PointDatas.OrderBy(p => p.ID)
                                .Skip(fetchedRows).Take(pagesize)
                                .AsEnumerable().Select(p => p.ToEntity());
        return points.ToList();
    }
}
I guess it's the LINQ to SQL code that is using up the memory and not reusing or freeing it afterwards, but what can I do to get the memory footprint down?
I have observed that fetching the data uses about 10 times as much memory as storing it in my list of entities. I have considered invoking the garbage collector, but I would rather avoid it.
You are retrieving far too much data and storing it in memory; that's why you are getting an OOM exception.
One of two things is occurring:
you are loading an excessive amount of data when the user will only ever view a subset of the results, and/or this is a first attempt at "caching" data; or
you do need all this data, but are using the wrong technology (LINQ to SQL) to access it.
If it's the first, you need to either:
load smaller chunks of data (20-50 records, not 50K or everything), or
if this is only for display purposes, query a projection of just what's needed rather than the entity itself (see the sketch after this list).
If it's the second, use an ETL tool designed to manage large amounts of data. I prefer Rhino.ETL, but SSIS also works.
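For the projection option, a quick sketch (the PointListItem DTO and its Name property are invented for illustration):

// Pull only the columns the UI actually displays, not the whole entity.
var rows = dataContext.PointDatas
                      .OrderBy(p => p.ID)
                      .Skip(fetchedRows)
                      .Take(50)
                      .Select(p => new PointListItem // hypothetical display DTO
                      {
                          ID = p.ID,
                          Name = p.Name // assumed column
                      })
                      .ToList();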
I am retrieving a big list of news items.
I want to cache this list for obvious reasons (running the query more often than necessary won't do).
I bind this list to a repeater that I have extended to enable paging (flicking between pages causes a page load and so retrieves the list again).
The complication is that these news items can also be queried by date, via a query-string parameter 'year' that is present only when querying the news items by date.
Below is pseudo-code for what I have so far (pasting the real code here would mean too much time trimming out all the bits that just add confusion):
if (complete news items list is not cached || queryString["year"] != null)
{
    int? year = queryString["year"]; // may be null
    Cache.Add(GetNewsItems(year));
}
else
{
    return cached newsItems;
}
The problem is that when the news items page loads because of a paging control's postback, the query string's 'year' parameter is still populated, so GetNewsItems runs again. Also, even if the home URL (i.e. no query string) is then navigated to, there is in effect still a cached version of the news items, so it will not bother to retrieve them; but they may be cached for a particular year and so aren't relevant to an 'all years' search.
Do I add a new cache entry to flag what the most recent search was?
What considerations do I have to make here about the cache timing out?
I could add a new query string parameter if need be (preferably not); would that solve the problem?
Could you get the complete list on the first query and cache that (presuming it is not too enormous)? Then for any subsequent queries, get the data from the cache and filter out just the year you want.
Maybe you could store the data in the cache in the form of
IDictionary<int, IEnumerable<NewsItem>>
(assuming there is more than one NewsItem per year), where the key is the year, so you can retrieve a single key/value pair to get a year's data.
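Building that dictionary from the full list is straightforward with LINQ (this assumes NewsItem exposes a Year property, which is a guess):

IDictionary<int, IEnumerable<NewsItem>> newsByYear =
    allNewsItems.GroupBy(n => n.Year)
                .ToDictionary(g => g.Key, g => g.AsEnumerable());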
Alternatively, cache the data by year, as you do in your sample, and implement a mechanism to get each year's data from the cache if it exists. When getting all years, either fetch everything from the data store and cache that separately, or implement some mechanism to determine which years are missing from the cache and fetch just those. Personally, I would go with caching everything and filtering from the cache, so as not to clog up memory with too much cached data.
A useful TryGetValue pattern for reading from the cache is something like:
private bool TryGetValue<U>(string key, out U value)
{
object cachedValue = HttpContext.Current.Cache.Get(key);
if (cachedValue == null)
{
value = default(U);
return false;
}
else
{
try
{
value = (U)cachedValue;
return true;
}
catch
{
value = default(U);
return false;
}
}
}
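Putting it together for the news-items case might look like this (the cache key, GetAllNewsItems, and the nullable year variable are illustrative):

IDictionary<int, IEnumerable<NewsItem>> newsByYear;
if (!TryGetValue("newsByYear", out newsByYear))
{
    newsByYear = GetAllNewsItems()
        .GroupBy(n => n.Year)
        .ToDictionary(g => g.Key, g => g.AsEnumerable());
    HttpContext.Current.Cache.Insert("newsByYear", newsByYear);
}
// All years: flatten the dictionary; a specific year: index into it.
IEnumerable<NewsItem> items = year.HasValue
    ? newsByYear[year.Value]
    : newsByYear.Values.SelectMany(v => v);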
Not sure if this helps in your case, but you can put the repeater in a user control and enable OutputCache for it, invalidated by the POST/GET params:
<%@ OutputCache Duration="100" VaryByParam="year" %>