Blazor Server Side - Load 50 more data - c#

Good Day Everyone
I'm creating a Blazor Server-Side application with a card-style layout on my home page. I didn't use any JavaScript to load the data; it's just plain Razor and C#. Right now it loads 2000+ records at once, which makes my app slow. What I want to do is load the first 50 records, with a "load more" button below that loads the next 50. My idea is to append each new batch of 50 to the same collection as the first 50, but I think that will still pile up the data rendered on the page, and it could become a problem once it reaches 3000+. Is there a way to do this in Blazor Server-Side?
Thanks and regards

Blazor does not have built-in pagination. You'll have to do that yourself in the C# code that sends the data to your frontend/Razor component(s). There are Blazor component libraries that can handle it for you, of course (Google for MudBlazor or Radzen).

You can either create your own logic to load more results on each click of a button (not really hard to manage with a SQL query; a minimal sketch follows below the link).
Or you can try component virtualization, which I suspect is the closest built-in option to what you want:
https://learn.microsoft.com/en-us/aspnet/core/blazor/components/virtualization?view=aspnetcore-6.0
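For the first option, here's that minimal "load more" sketch, placed inside a component's @code block and assuming an injected EF Core context with a CardDetails set (PageSize, allCardDetails, and LoadMoreAsync are illustrative names, not from the question):

// Minimal "load more" sketch. Assumes Microsoft.EntityFrameworkCore and an
// injected DbContext exposing a CardDetails set; all names are illustrative.
private const int PageSize = 50;
private List<CardDetails> allCardDetails = new List<CardDetails>();

protected override async Task OnInitializedAsync() => await LoadMoreAsync();

private async Task LoadMoreAsync()
{
    var nextBatch = await DbContext.CardDetails
        .OrderBy(c => c.Id)            // a stable order is required for paging
        .Skip(allCardDetails.Count)    // skip everything already shown
        .Take(PageSize)                // fetch the next 50
        .ToListAsync();
    allCardDetails.AddRange(nextBatch);
}

Hook LoadMoreAsync up to a button's @onclick handler; the page then only renders what has been fetched so far, though everything fetched does stay in the circuit's memory, which is where virtualization helps.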

Edit Note:
I just noticed Bennyboy1973 mentioned this.
I suppose I just expounded on it for no reason, haha.
I know this is an old question, but this is for anyone else who may stumble across this problem.
Blazor has a "Virtualize" component that you can make use of. Simply put, it only renders the records that would be visible on screen. As you scroll through the list, it renders the next set of records, and so on, freeing up the resources that would normally be spent rendering the full dataset.
To complement the "Virtualize" component, Blazor has an ItemsProvider delegate which you can make use of, allowing you to set things up so that instead of loading the full 2000+ record set, it only loads the records needed for the current viewport of your app. Then, as with Virtualize on its own, scrolling queries your dataset for the next X records and renders them, and so on.
Setting up the initial "Virtualize" component is easy:
Let's say you load your cards as below
<div style="height:800px; overflow-y:scroll">
    @foreach (var card in allCardDetails)
    {
        <MyCard @key="card.Id" Title="@card.Title" Details="@card.Details" />
    }
</div>
What this results in is a Card being rendered for every data record, including Cards that aren't even visible on screen.
To implement the Virtualize component, you simply change up the code snippet to resemble the following
<div style="height:800px; overflow-y:scroll">
    <Virtualize Items="@allCardDetails" Context="card">
        <MyCard @key="card.Id" Title="@card.Title" Details="@card.Details" />
    </Virtualize>
</div>
Now, only the Cards visible within the region of the DIV will be rendered, and as you scroll down in the DIV it proceeds to render the next Cards as they come into view, and so on.
This greatly reduces the screen jitter and rendering lag. If you want to take it a step further and limit the amount of data queried from your server on the initial load, you can make use of the ItemsProvider delegate to achieve this.
Virtualize with ItemsProvider:
<div style="height:800px; overflow-y:scroll">
    <Virtualize Context="card" ItemsProvider="@loadCardDetails">
        <MyCard @key="card.Id" Title="@card.Title" Details="@card.Details" />
    </Virtualize>
</div>
We've removed the Items parameter and replaced it with ItemsProvider, since the data is now supplied on demand by the provider rather than held in a fixed collection. Lastly, we need to create the ItemsProvider method (in this case called "loadCardDetails"), which dynamically loads records as they are needed.
private async ValueTask<ItemsProviderResult<CardDetails>> loadCardDetails(ItemsProviderRequest request)
{
    // It would be a good idea, at page load, to get a count of all records
    // and store it in an accessible variable.
    // For the sake of this example, I'll include it here.
    // (CountAsync/ToListAsync come from Microsoft.EntityFrameworkCore.)
    var totalCount = await DbContext.CardDetails.CountAsync();

    // This portion determines how many records need to be loaded next.
    // Math.Min is used to ensure that we don't try to load more records than
    // are available.
    // E.g. we have 50 records, but Virtualize is requesting 60 records (more than
    // we have); instead it will only attempt to get the 50 we have and return them.
    var numCardDeets = Math.Min(request.Count, totalCount - request.StartIndex);

    // This portion gets the next set of data records, up to our "numCardDeets" value.
    // A stable OrderBy is needed so that Skip/Take always page over the same sequence.
    var results = await DbContext.CardDetails
        .OrderBy(c => c.Id)
        .Skip(request.StartIndex)
        .Take(numCardDeets)
        .ToListAsync();

    // Finally, return the result set to the Virtualize component to render.
    return new ItemsProviderResult<CardDetails>(results, totalCount);
}
And that's it. If everything is set up correctly, the Virtualize component will now only load the data that fits on your screen (from your datasource) and render it; then as you scroll down it loads the next set of data and renders it.
This example is made under the assumption that you use Entity Framework to retrieve data from a database. How you get the data will vary depending on what or where your datasource is.
I'll just note here:
request.StartIndex and request.Count are managed by the Virtualize component's ItemsProvider mechanism; it keeps track of its own current index and request count (the number of records being requested).

Related

ListView with 6000 rows c#

I have a small issue with one of my applications.
I have a vessel history that is shown in a ListView. Until now I have never needed to show all the data in this history (the user used a kind of filter to get what he needed), but now one of the managers wants to see all the data through the application (they were previously receiving the full data through an Excel report).
The biggest issue is that it's 6000 rows with 21 columns each, and when I try to select all the data it takes around 5 minutes to fully load. On top of that, the user needs to add, edit, or copy history entries, which triggers another update of the list and another 5 minutes of loading.
I don't quite know the best way to handle this and I wanted your help!
I would actually split the information into sections. Rather than loading 6,000 rows plus columns all at once, why don't you alphabetize the information? Use one ListView for A-G, another ListView for H-O, and so on. That way it would cut down the time it takes to query all of the information.
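A rough sketch of that partitioning, assuming the history rows expose a name-like property (HistoryRow, VesselName, and the bucket ranges are illustrative, and the binding call depends on whether this is WinForms or WPF):

// Hedged sketch: split the history into alphabetical buckets once, then bind each
// bucket to its own ListView instead of binding all 6,000 rows at once.
// Requires System.Linq; HistoryRow and VesselName are illustrative names.
private static List<HistoryRow> Bucket(List<HistoryRow> rows, char from, char to)
{
    return rows
        .Where(r => !string.IsNullOrEmpty(r.VesselName)
                    && char.ToUpperInvariant(r.VesselName[0]) >= from
                    && char.ToUpperInvariant(r.VesselName[0]) <= to)
        .ToList();
}

// Usage (the binding member depends on your UI stack, e.g. ItemsSource in WPF):
// listViewAtoG.ItemsSource = Bucket(allRows, 'A', 'G');
// listViewHtoO.ItemsSource = Bucket(allRows, 'H', 'O');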

how to track only the changed values in the repeater asp.net

I have an editable nested repeater containing the data. I use it for showing the data and also for saving the data updated by the user.
I want to detect the particular cell/row for which the data has been modified by the user, so that I can update only that row in the database instead of saving all the data again.
Is there a way to work this out? Which would be the better technique to use, JavaScript or server-side code?
Resource  Country (etc.) | Week1  Week2  Week3 | Total
ABC       XYZ            |    10     15     20 |    45
This is the repeater structure.
The middle part (the weeks, showing hours worked by the resource, which is editable) is the nested repeater.
Values can only be changed in the nested repeater.
I maintain unique IDs for each resource in a hidden field.
Can you suggest some ways to achieve this functionality?
This is normally handled by your ORM, but you can implement it yourself if you like. The idea is called 'change tracking': what's commonly done is to retrieve the record from the database, then compare the known retrieved values to the new values before saving again.
Depending on your concurrency strategy, you might also use a hash of the original values that were sent to the client, placed in a hidden field perhaps, which you can then check against what is currently in the database so you don't accidentally overwrite someone else's changes.
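A small sketch of that hash idea, one hash per nested-repeater row (the helper name and the fields being hashed are illustrative assumptions; the usings go at the top of the code-behind):

using System;
using System.Security.Cryptography;
using System.Text;

// Hedged sketch: hash the values sent to the client, store the hash in the row's
// hidden field, then recompute on postback and compare. Only rows whose hash
// differs need an UPDATE. Field names are illustrative.
private static string ComputeRowHash(string resourceId, int week1, int week2, int week3)
{
    string raw = resourceId + "|" + week1 + "|" + week2 + "|" + week3;
    using (SHA256 sha = SHA256.Create())
    {
        byte[] hash = sha.ComputeHash(Encoding.UTF8.GetBytes(raw));
        return Convert.ToBase64String(hash);
    }
}

On data bind, render ComputeRowHash(...) into the hidden field alongside the existing resource ID; on save, recompute it from the posted week values and skip any row whose hash still matches.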

Showing records on grid efficiently

I have a third-party grid component that I am using on my page.
It displays 20 records per page.
I am also using images to display certain columns; for example, comments and attachments have clickable images for every row.
The problem is that every time I load my page, the current logic goes to the database and checks every row in the table to see whether comments or attachments were added, and accordingly enables or disables the image for that particular record (row).
This takes too many database hits and the processing time increases.
Can you tell me another way to do this?
You could add "Number of comments" and "Number of attachments" columns to your rows.
You should enable some sort of VirtualMode in your grid (every grid has this function in some form) in order to feed only the items the user actually sees. This is always good practice.
Irrespective of the grid, you can build an object that holds the textual information you need to show together with a few flags that tell you about the comment and attachment state, and bind a collection of this custom object to your grid.
Use the flags in this object to put different images in your columns.
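A rough sketch of that kind of object, with illustrative property names; the key point is that the counts are filled by the same query that loads the 20 rows (e.g. a JOIN or sub-select), so there is no per-row database hit:

// Hedged sketch of a row object that carries display flags alongside the text.
// Populate CommentCount/AttachmentCount in the query that loads the page of rows.
public class GridRow
{
    public int Id { get; set; }
    public string Title { get; set; }
    public int CommentCount { get; set; }
    public int AttachmentCount { get; set; }

    public bool HasComments { get { return CommentCount > 0; } }
    public bool HasAttachments { get { return AttachmentCount > 0; } }
}

Bind a List<GridRow> to the grid and use HasComments / HasAttachments to decide which image (or enabled state) each row shows.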
HTH

ASP.NET MVC 3 WebGrid paging issue

My data access layer returns a collection containing the rows for a single page plus the total number of rows.
Unfortunately, the WebGrid component does not allow specifying the total number of rows or the total number of pages (these properties are read-only).
Has anyone had to deal with this issue before?
You can use the Bind method on the WebGrid to tell the grid to use server-side paging:
grdv.Bind(myData, rowCount: 10000, autoSortAndPage: false);
Setting autoSortAndPage to false tells the grid that myData is just a segment of the data. It will show all rows of this data regardless of your page-size setting, and the pager will be built using the rowCount you pass in rather than the number of records in myData.
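Put together, a hedged sketch of that call in context (pageOfData and totalRows are illustrative names; pageOfData holds only the current page's rows and totalRows is the count returned by the data access layer):

// Hedged sketch (System.Web.Helpers): the grid pages on the server because
// autoSortAndPage is false and the pager size comes from rowCount, not from
// the number of rows in pageOfData.
var grid = new WebGrid(canPage: true, rowsPerPage: 25);
grid.Bind(pageOfData, rowCount: totalRows, autoSortAndPage: false);
// Render in the view with grid.GetHtml(...) as usual.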
EDIT: I see what your question is now. Check out this article for not using the WebGrid.
Paging with WebGrid
From this page, it looks like you can specify rows per page:
var grid = new WebGrid(source, rowsPerPage: 25);
And this page (look at line 9 from the first code block).
rowsPerPage is only settable through the constructor. This was done to keep the helper simple and avoid handling complex states. Total rows comes from the data source.

is there a good way to display too much information in ASP.NET?

I find myself in a quandary that I think I know the solution to, but I'd like to ask the field. I have an ASP.NET (C# 2.0 framework) page within a site which is used as a lookup: a standard GridView control, 5 columns of data, with a hyperlink in the 6th column to do something with the record the user selects.
My question is about how best to display a possible 100k records in that GridView. As it stands right now, I'd sprout a few more gray hairs before it ever returns a rendered result. The GridView, for its real estate, can display about 20 rows of data on the screen at a time, so paging the data still gives me 5,000 pages. Adding a rolodex-type A-Z search, the largest return set, on 'J', gives me 35,000 records (whereas 'X' only has 54).
Do I just break the rolodex up smaller, or is there a better way to handle a situation like this?
Thanks in advance!
edit: I already have the stored procedure which populates this set up for paging like GenericTypeTea suggested; again, even with paging, 'J' would give me 1,750 pages. The reason I have that much data is the number of participants on the given auto policy. The admin needs to be able to search for a given name, or a partial one. 'Jones' has 1,209 records and 'Smith' has 2,918, so even that makes for a robust result set.
edit #2: added 'a possible' 100k; there is no guarantee that the account will have that many records, but on the other hand it could have more :(
AutoComplete is your friend :)
Just let people enter the first 2 or 3 characters, then filter your searches.
With a dataset that large, I don't think paging would make that much sense.
jQuery has a nice example page: AutoComplete Examples
Filters. Don't show that much data. Show the first X records; beyond that, the user will need to be more precise with their search. Nobody will look through 100k records for the one they want. I'd limit it to a couple hundred at most (10 pages, 20 per page).
Advise the user how many results there were, though, or give some clue so they know that there were many that aren't shown and that they need to be more specific in their search.
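A hedged sketch of that "require a more specific search, cap the results" idea for a .NET 2.0-era data layer (table, column, and parameter names are illustrative):

using System.Data;
using System.Data.SqlClient;

// Hedged sketch: refuse overly broad searches and cap the rows returned, so the
// GridView never binds more than a few hundred records at once.
public static DataTable SearchParticipants(string connectionString, string term, int maxResults)
{
    DataTable table = new DataTable();
    if (term == null || term.Trim().Length < 3)
        return table;   // too broad; ask the user to be more specific

    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand(
        "SELECT TOP (@max) * FROM Participants " +
        "WHERE LastName LIKE @term + '%' ORDER BY LastName", conn))
    {
        cmd.Parameters.AddWithValue("@max", maxResults);
        cmd.Parameters.AddWithValue("@term", term.Trim());
        new SqlDataAdapter(cmd).Fill(table);
    }
    return table;
}

Telling the user how many rows came back (and whether the cap was hit) covers the "advise the user" point above.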
It seems to me like adding search capabilities would be more efficient than filtering or paging.
