What is the best way to fill a combo box? - C#

I have 3 forms in my project.
When I open form 3, I fill a combo box with data (from the database),
and it takes time...
How can I fill this combo box only once, when the program opens
(in the first form, Form1)?
Thanks in advance.

There are a million ways to do this, and your question is pretty vague. Is it the same data in all three combo boxes? Regardless, you want to load the data and store the lists in memory when your application first initializes. There are a lot of good, and a lot of bad, ways to do this. Then when each form comes up, check whether the list in memory is filled; if it is, bind to that list. (If not, of course, fill the list from the database and then bind to it.)
The overall concept is to preload the data, and then always check your memory persistence before going to the database.
Edit
To quickly list a good and a bad way of storing these values in memory before I turn in for the night (I'll try to expand on this in the morning):
The best way would be to create a memory repository layer in your application, and have your business objects poll it before heading to the database, but there is some complexity in using this sort of model (mainly dealing with concurrency issues.)
The worst way would be to just declare some global collections of data somewhere, and pull them directly into your UI.
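As a rough sketch of the preload-and-cache idea, somewhere between those two extremes, a lazy-loaded static cache might look like the following. The names LookupCache and LoadCountriesFromDatabase are hypothetical; substitute your own query. Lazy<T> is thread-safe by default, which covers the concurrency concern mentioned above.

    using System;
    using System.Collections.Generic;

    // Hypothetical in-memory repository: the list is loaded from the database
    // once, on first use, and every form binds to the same cached copy.
    public static class LookupCache
    {
        private static readonly Lazy<List<string>> countries =
            new Lazy<List<string>>(LoadCountriesFromDatabase);

        public static List<string> Countries
        {
            get { return countries.Value; }
        }

        private static List<string> LoadCountriesFromDatabase()
        {
            var result = new List<string>();
            // ... replace with your actual database query ...
            return result;
        }
    }

Each form can then bind without hitting the database again, e.g. comboBox1.DataSource = LookupCache.Countries; only the very first access pays the query cost.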

Related

Binding data from dataset to gridview taking time

I have one Windows Form, which is a dynamic form used in many places.
In this form I have to bind data from a DataSet to a DataGridView, but it takes 3-4 minutes. The result of the executed query is placed in a DataSet, and this DataSet is then bound to the DataGridView as:
dgSearch.DataSource = ds.Tables[0];
but this statement takes time to execute, as my DataSet has more than 100K rows. How do I bind such a large amount of data to a DataGridView?
I have a C# Windows application.
First, the idea of binding that much data in an application suggests either a) bad architecture or b) bad business requirements. People cannot process 100,000 rows at one time. If this is a report, then the grid is not your best way of handling this.
If you are forced to head this direction, I would consider some form of paging and only show a portion of the "gridded" data at any one time. http://www.codeproject.com/Articles/16303/DataGrid-Paging-C-Windows-Forms
Heading this direction, you can control the amount of data that is bound at any one time. You can also implement sort on different columns so the user can refine what he or she is looking at. But you will dramatically speed up binding, as you are not binding 100K+ rows at a time.
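As a rough sketch of that approach, assuming the full result set already sits in a DataTable, a pager that copies only the current page into the grid could look like this (GridPager and the page size are invented for illustration):

    using System;
    using System.Data;
    using System.Windows.Forms;

    // Hypothetical pager: binds one page of an in-memory DataTable at a time.
    public class GridPager
    {
        private readonly DataTable fullTable;  // e.g. ds.Tables[0] with 100K rows
        private readonly DataGridView grid;
        private readonly int pageSize;
        private int currentPage;

        public GridPager(DataTable fullTable, DataGridView grid, int pageSize)
        {
            this.fullTable = fullTable;
            this.grid = grid;
            this.pageSize = pageSize;
            ShowPage(0);
        }

        public void ShowPage(int page)
        {
            currentPage = page;
            DataTable pageTable = fullTable.Clone();  // same columns, no rows
            int start = page * pageSize;
            int end = Math.Min(start + pageSize, fullTable.Rows.Count);
            for (int i = start; i < end; i++)
                pageTable.ImportRow(fullTable.Rows[i]);
            grid.DataSource = pageTable;              // binds 50 rows, not 100K
        }

        public void NextPage()     { ShowPage(currentPage + 1); }
        public void PreviousPage() { ShowPage(Math.Max(0, currentPage - 1)); }
    }

Hooked up as new GridPager(ds.Tables[0], dgSearch, 50), the grid binds quickly because only 50 rows are materialized at a time.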
If you have a large number of records, the best way to show them is to implement paging in your grid view.
Effective paging is important for applications that handle large numbers of records, in order to build scalable applications.
Here is a good article for that. It shows how to do it in ASP.NET Web Forms, but once you understand the approach you can apply it to a Windows Forms application.
There is also another question that may help you with this: What is the fastest way to load a big data set into a GridView? It's about a Windows Forms application, so you may want to read it too.
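In the same spirit, if you would rather not hold 100K rows in memory at all, you can let the database do the paging. Here is a sketch using the OFFSET/FETCH syntax, which requires SQL Server 2012 or later (on older versions a ROW_NUMBER() subquery serves the same purpose); the table and column names are invented:

    using System.Data;
    using System.Data.SqlClient;

    public static class CatalogPager
    {
        // Hypothetical server-side page fetch: only @PageSize rows cross the wire.
        public static DataTable FetchPage(string connectionString, int page, int pageSize)
        {
            const string sql = @"
                SELECT Id, Name, CreatedOn        -- only the columns the grid shows
                FROM dbo.SearchResults            -- hypothetical table
                ORDER BY Id
                OFFSET @Offset ROWS FETCH NEXT @PageSize ROWS ONLY;";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@Offset", page * pageSize);
                command.Parameters.AddWithValue("@PageSize", pageSize);

                var table = new DataTable();
                new SqlDataAdapter(command).Fill(table);  // Fill opens and closes the connection
                return table;
            }
        }
    }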
Hope it helps.

Ways to speed up queries in SQL Server 2008 R2 without a SqlDataSource object

I'm trying to build a product catalog application in ASP.NET and C# that will allow a user to select product attributes from a series of drop-down menus, with a list of relevant products appearing in a gridview.
On page load, the options for each of the drop-downs are queried from the database, as well as the entire product catalog for the gridview. Currently this catalog stands at over 6000 items, but we're looking at perhaps five or six times that when the application goes live.
The query that pulls this catalog runs in less than a second when executed in SQL Server Management Studio, but takes upwards of ten seconds to render on the web page. We've refined the query as much as we know how: pulling only the columns that will show in our gridview (as opposed to SELECT * FROM ...) and adding the WITH (NOLOCK) table hint to read data without waiting for updates, but it's still too slow.
I've looked into SqlCacheDependency, but all the directions I can find assume I'm using a SqlDataSource object. I can't do this because every time the user makes a selection from the menu, a new query is constructed and sent to the database to refine the list of displayed products.
I'm out of my depth here, so I'm hoping someone can offer some insight. Please let me know if you need further information, and I'll update as I can.
EDIT: FYI, paging is not an option here. The people I'm building this for are standing firm on that point. The best I can do is wrap the gridview in a div with overflow: auto set in the CSS.
The tables I'm dealing with aren't going to update more than once every few months, if that; is there any way to cache this information client-side and work with it that way?
Most of your solution will come in a few forms (none of which have to do with a Gridview):
Good indexes. Create good indexes for the tables that pull this data; good indexes are defined as:
Indexes that store as little information as is actually needed to display the product. The smaller the amount of data stored, the more data fits in each 8 KB page in SQL Server.
Covering indexes: Your SQL Query should match exactly what you need (not SELECT *) and your index should be built to cover that query (hence why it's called a 'covering index')
Good table structure: this goes along with the index. The fewer joins needed to pull the information, the faster you can pull it.
Paging. You shouldn't ever pull all 6000+ objects at once -- what user can view 6000 objects at once? Even if a theoretical superhuman could process that much data, that's never going to be your median use case. Pull 50 or so at a time (if you really even need that many), or structure your site such that you're always pulling what's relevant to the user instead of everything (keep in mind this is not a trivial problem to solve).
The beautiful part of paging is that your clients don't even need to know you've implemented paging. One such technique is called "Infinite Scrolling". With it, you can go ahead and fetch the next N rows while the customer is scrolling to them.
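For illustration, a "fetch the next N rows" query that works on SQL Server 2008 R2 (which predates OFFSET/FETCH) can be built with ROW_NUMBER(); the table, columns, and method name here are all hypothetical:

    using System.Data;
    using System.Data.SqlClient;

    public static class InfiniteScroll
    {
        // Hypothetical "next batch" fetch: the caller passes how many rows
        // the user has already seen, and gets the next batch back.
        public static DataTable FetchNextRows(string connectionString,
                                              int lastRowSeen, int batchSize)
        {
            const string sql = @"
                SELECT Id, ProductName, Price
                FROM (
                    SELECT Id, ProductName, Price,
                           ROW_NUMBER() OVER (ORDER BY Id) AS RowNum
                    FROM dbo.Products            -- hypothetical catalog table
                ) AS numbered
                WHERE RowNum > @LastRow AND RowNum <= @LastRow + @BatchSize;";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@LastRow", lastRowSeen);
                command.Parameters.AddWithValue("@BatchSize", batchSize);

                var table = new DataTable();
                new SqlDataAdapter(command).Fill(table);
                return table;
            }
        }
    }

An AJAX endpoint can call this as the user nears the bottom of the grid and append the returned rows client-side.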
If, as you say, paging really is not an option (although I really doubt it; please explain why you think so, and I'm pretty sure someone will find a solution), there's really no way to speed up this kind of operation.
As you noticed, it's not the query that's taking long, it's the data transfer. Copying the data from one memory space (SQL Server's) to another (your application's) is not that fast, and displaying this data is orders of magnitude slower.
Edit: why are your clients "firm on that point"? Why do they think it's not possible otherwise? Why do they think it's the best solution?
There are many options for showing a large data set in a grid, including third-party software.
Try jQuery/JavaScript grids with AJAX calls; they will help you render a large number of rows on the client. You can even use caching so that you don't query the database many times.
These are good grids that will help you show thousands of rows in a web browser:
http://www.trirand.com/blog/
https://github.com/mleibman/SlickGrid
http://demos.telerik.com/aspnet-ajax/grid/examples/overview/defaultcs.aspx
http://w2ui.com/web/blog/7/JavaScript-Grid-with-One-Million-Records
I hope it helps.
You can load all the rows into a DataTable on a background thread when the application (web page) starts, then use only that DataTable to populate your grids and so on, so you do not have to hit SQL Server again until you need to read or write different data. (All the other answers cover the other options.)
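A minimal sketch of that background load for a Windows Forms form, using a BackgroundWorker; the query, connection string, and dataGridView1 field are placeholders:

    using System.ComponentModel;
    using System.Data;
    using System.Data.SqlClient;
    using System.Windows.Forms;

    public partial class SearchForm : Form
    {
        private readonly BackgroundWorker loader = new BackgroundWorker();

        public SearchForm()
        {
            InitializeComponent();

            // Do the slow database work off the UI thread...
            loader.DoWork += (s, e) =>
            {
                var table = new DataTable();
                using (var adapter = new SqlDataAdapter(
                    "SELECT Id, Name FROM dbo.Products",   // hypothetical query
                    "your-connection-string"))
                {
                    adapter.Fill(table);
                }
                e.Result = table;
            };

            // ...and bind on the UI thread when it finishes.
            loader.RunWorkerCompleted += (s, e) =>
            {
                dataGridView1.DataSource = (DataTable)e.Result;
            };

            Load += (s, e) => loader.RunWorkerAsync();
        }
    }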

DataSet usage when data source is very large?

I have read several MS articles about when to use DataSets in conjunction with a database from within a WinForms application. I certainly like the ease of use DataSets offer, but I have a few concerns when using them with a large data source. I want to use a SQLite database to locally store processed web log information. Potentially this could result in tens of thousands of rows of data.
When a DataSet is filled via a database table, does it end up containing ALL the data from the database, or does it contain only a portion of data from the database?
Could I use a DataSet to add rows to the database (perform an Update, for example), somehow 'clear' what the DataSet is holding in memory, and then add more rows?
So is it possible to essentially manage what a DataSet is currently holding in memory? If a DataSet represents a table that contains 100,000 rows, does that mean all 100,000 rows need to be loaded from the database into memory before it is even usable?
Thanks.
You raise very important points here. These questions came up at the beginning of .NET, when we suddenly moved to the disconnected model it introduced.
The answer to your problem is paging. You need to manually code your grid or other display control so that it queries the database in chunks. For example, say you have a control (not a grid) that has fields and a scroll bar, and the scroll bar receives 201 clicks: for the first 200 clicks it scrolls through 200 records already in memory, and on click #201 it queries the database for 200 more. Maybe add some logic to drop 200 records once the number of them in the DataSet reaches 1000. This is just an example.
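A rough sketch of one such chunked query against SQLite, using the System.Data.SQLite provider (the table and column names are invented):

    using System.Data;
    using System.Data.SQLite;   // System.Data.SQLite provider

    // Hypothetical chunk loader: pulls 200 records at a time, not the whole table.
    public static class ChunkLoader
    {
        public static DataTable LoadChunk(string connectionString,
                                          int offset, int chunkSize)
        {
            const string sql = @"SELECT Id, Url, HitCount
                                 FROM LogEntries          -- hypothetical table
                                 ORDER BY Id
                                 LIMIT @ChunkSize OFFSET @Offset;";

            using (var connection = new SQLiteConnection(connectionString))
            using (var command = new SQLiteCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@ChunkSize", chunkSize);
                command.Parameters.AddWithValue("@Offset", offset);

                var table = new DataTable();
                new SQLiteDataAdapter(command).Fill(table);
                return table;
            }
        }
    }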
To save data, you can add it to this same DataSet/DataTable; there are a few ways of doing it. DataSet/DataTable have the capability to identify new or edited rows, relationships, etc. In serious systems, entity lists encapsulate DataTables and provide customizations.
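On the "add rows, update, then clear" part of the question: a DataAdapter's Update sends only the rows the DataTable has marked as changed, so you can flush and clear in a loop to keep memory flat. A sketch, assuming the adapter's InsertCommand is already configured:

    using System.Collections.Generic;
    using System.Data;
    using System.Data.SQLite;

    // Hypothetical batch writer: stage rows, push them, free the memory, repeat.
    public static class BatchWriter
    {
        public static void InsertInBatches(SQLiteDataAdapter adapter, DataTable table,
                                           IEnumerable<object[]> rows, int batchSize)
        {
            int staged = 0;
            foreach (object[] values in rows)
            {
                table.Rows.Add(values);             // new row is marked as Added
                if (++staged == batchSize)
                {
                    adapter.Update(table);          // writes only the pending changes
                    table.Clear();                  // drop the in-memory copies
                    staged = 0;
                }
            }
            if (staged > 0) adapter.Update(table);  // flush the remainder
        }
    }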
You may also want to look at what Entity Framework offers; I am not sure whether this functionality is included there.
Basically, for a simple application with small data it is OK to use out-of-the-box ADO.NET. But in a serious system there is normally a lot of groundwork with ADO.NET to provide a solid data access layer, and more work on top of that to create a favorable user experience. Here that means loading data in chunks, because if you load 100K records the user has to wait for the load, and then it is hard to scroll through all of them.
In the end, you need to look at what your application is for and what will or will not be satisfactory for the user.

Saving and Loading Dictionary. Improving Memory Footprint

Long time watcher and this is my first post so please go easy on me. I've looked around but my query is quite specific and I can't find it elsewhere.
I have an application which consists of 2 controls.
A control on the left contains a tree view which displays searches mapped from an XML file. Each Search has an associated GUID.
A control on the right displays a datagrid, the information of which I obtain from the tree view through a Dictionary (Guid<->Dataset).
When the user clicks on a node in the tree view, my application works out which GUID the search is linked with and presents the associated dataset which gets flattened and displays on the datagrid.
I save (serialise) and load (deserialise) the Dictionary when exiting and loading the application, respectively.
I did some memory profiling recently, and for larger searches the memory footprint of the Dictionary can be quite large (200 MB), which, given the limited capacity of the user machines, I really need to sort out.
I was wondering if anyone had any idea how to achieve this.
I was thinking about splitting the Serialisable Dictionary into constituent Datasets and storing each one individually on the hard drive (maybe with the GUID as the filename). The saved Datasets would then be deserialised with each Node.Click event and displayed in the grid view. My worry about this is the potential pause in between each click event as it saves the old search and loads a new one.
Simple answer: toss the dictionary and the datasets into a SQLite file (or another database). It will be a little slower, but I expect it will be faster than any code you can hand-write in a reasonable amount of time. Used correctly, a database layer will do memory buffering and caching to make the UI more responsive.
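A sketch of that idea with System.Data.SQLite, storing each search's DataSet as XML keyed by its GUID so that only the clicked search is ever in memory. The class and schema are invented; it assumes a table such as CREATE TABLE Searches (Id TEXT PRIMARY KEY, Payload TEXT):

    using System;
    using System.Data;
    using System.Data.SQLite;
    using System.IO;

    // Hypothetical store: one row per search, DataSet serialized as XML.
    public class SearchStore
    {
        private readonly string connectionString;

        public SearchStore(string connectionString)
        {
            this.connectionString = connectionString;
        }

        public void Save(Guid searchId, DataSet data)
        {
            using (var writer = new StringWriter())
            {
                data.WriteXml(writer, XmlWriteMode.WriteSchema);
                Execute("INSERT OR REPLACE INTO Searches (Id, Payload) VALUES (@Id, @Payload)",
                        searchId.ToString(), writer.ToString());
            }
        }

        public DataSet Load(Guid searchId)
        {
            using (var connection = new SQLiteConnection(connectionString))
            using (var command = new SQLiteCommand(
                "SELECT Payload FROM Searches WHERE Id = @Id", connection))
            {
                command.Parameters.AddWithValue("@Id", searchId.ToString());
                connection.Open();
                var xml = (string)command.ExecuteScalar();

                var data = new DataSet();
                data.ReadXml(new StringReader(xml), XmlReadMode.ReadSchema);
                return data;
            }
        }

        private void Execute(string sql, string id, string payload)
        {
            using (var connection = new SQLiteConnection(connectionString))
            using (var command = new SQLiteCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@Id", id);
                command.Parameters.AddWithValue("@Payload", payload);
                connection.Open();
                command.ExecuteNonQuery();
            }
        }
    }

Save on exit (or per search) and Load inside the Node.Click handler; the pause per click is one indexed lookup plus ReadXml, which should be far cheaper than holding 200 MB resident.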

How to deal with large objects?

I have 5 types of objects: place info (14 properties), owner company info (5 properties), picture, ratings (stores multiple vote results), and comments.
All five of those objects come together to make one object (Place), which has all the properties and information: the place's info, pictures, comments, etc.
What I'm trying to achieve is a page that displays the Place object and all its properties. Another issue: if I want to display the owner companies' profiles, I'll have an object for each owner company (but I'll add a sixth property, which is a list of all the places they own).
I've been practicing for a while, but I have no real implementation or performance experience, and I sense that this is too much!
What do you think ?
You have to examine the use case scenarios for your solution. Do you need to always show all of the data, or are you starting off with displaying only a portion of it? Are users likely to expand any collapsed items as part of regular usage or is this information only used in less common usages?
Depending on your answers it may be best to fetch and populate the entire page with all of the data at once, or it may be the case that only some data is needed to render the initial screen and the rest can be fetched on-demand.
In most cases the best solution is likely to involve fetching only the required data and to update the page dynamically using ajax queries as needed.
As for optimizing data access, you need to strike a balance between the number of database requests and the complexity of each individual request. Because of network latency it is often important to fetch as much as possible using as few queries as possible, even if this means you'll sometimes be fetching data that you do not always need. But if you include too much data in a single query, then computing all the joins may also be costly. It is quite rare to see a solution in which it is better to first fetch all root objects and then for every element go fetch some additional objects associated with that element. As such, design your solution to fetch all data at once, but include only what you really need and try to keep the number of involved tables to a minimum.
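To make that concrete, a single joined query that pulls places together with their owner companies in one round trip (instead of one query per place) might look like this; the table and column names are invented:

    using System.Data;
    using System.Data.SqlClient;

    public static class PlaceQueries
    {
        // One round trip instead of 1 + N: places and their owners together.
        public static DataTable LoadPlacesWithOwners(string connectionString)
        {
            const string sql = @"
                SELECT p.Id, p.Name, p.Address,    -- only what the page displays
                       o.CompanyName
                FROM dbo.Places p                   -- hypothetical tables
                JOIN dbo.OwnerCompanies o ON o.Id = p.OwnerCompanyId;";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                var table = new DataTable();
                new SqlDataAdapter(command).Fill(table);
                return table;
            }
        }
    }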
You really have three issues to deal with, and they are often split into DAL, BLL, and UI (data access, business logic, and user interface layers).
Your objects obviously belong in the BLL, and if you're considering performance then you need to consider how your objects will be created and how they interface with the DAL. I have many objects with 50-200 properties, so 14 properties is really no issue.
The UI side of it is separate, and if you're considering the performance of displaying a lot of information on a single page, you'll consider tabbed content, grids, etc.
Tackle it one thing at a time and see where your problems lie.
