Make application responsive and fast [closed] - C#

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 9 years ago.
I created a WPF browser application and hosted it on a live server. It is an inventory management application that uses the database heavily for saving, updating and selecting data. Save and select operations take a very long time to move data to and from the database, and while they run the application is unresponsive. How can I make the application fast when dealing with the database, and how can I keep it responsive during those operations? I'm totally stuck; I'd be grateful if anyone has an answer.
Does using stored procedures instead of raw SQL bring better performance or not?
Edit
My application interacts with the database using raw SQL statements.

Responsive Application
To make your application responsive, run every interaction with the database on a separate thread or task. You can also use the C# 5.0 async and await keywords.
Your application currently freezes because the main thread handles both the database operations and the user interface; those two responsibilities need to run on separate threads.
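A minimal sketch of the async/await approach, with Task.Delay standing in for the real database call so it runs without a server (the method and class names here are hypothetical, not from the question):

```csharp
using System;
using System.Threading.Tasks;

// Hypothetical sketch: the UI event handler awaits the database work, so
// the UI (dispatcher) thread stays free while the query runs.
public static class InventoryService
{
    // Stand-in for a real database call; Task.Delay simulates I/O latency.
    public static async Task<int> FetchItemCountAsync()
    {
        await Task.Delay(100);   // e.g. SqlCommand.ExecuteScalarAsync() in real code
        return 42;               // pretend this came from the database
    }
}

public static class SaveButtonHandlerSketch
{
    // In WPF this would be: private async void SaveButton_Click(object s, RoutedEventArgs e)
    public static async Task<string> OnSaveClickedAsync()
    {
        // The await returns control to the caller (the UI message loop)
        // until the simulated database work completes.
        int count = await InventoryService.FetchItemCountAsync();
        return $"Loaded {count} items";
    }
}
```

In a real handler the awaited call would be an async ADO.NET or ORM method; the key point is that nothing blocks the UI thread between starting the query and receiving the result.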
Fast Database Access
I don't know how you are dealing with the database right now, but you could definitely use an ORM such as Entity Framework or NHibernate.
For better performance, at the cost of much less readable and maintainable code (which really matters), you could use raw SQL.
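For illustration, here is the shape of a LINQ query of the kind an ORM like Entity Framework translates to SQL, with the rough raw-SQL equivalent as a comment. An in-memory list stands in for the DbSet so the sketch runs without a database; the entity and method names are made up:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative only: with Entity Framework, `items` would be a DbSet<Item>
// and the LINQ below would be translated into SQL for you.
public class Item
{
    public int Id { get; set; }
    public string Name { get; set; }
    public int Quantity { get; set; }
}

public static class InventoryQueries
{
    // ORM style: readable, composable, and parameterized automatically.
    public static List<Item> LowStock(IEnumerable<Item> items, int threshold) =>
        items.Where(i => i.Quantity < threshold)
             .OrderBy(i => i.Name)
             .ToList();

    // Roughly the raw SQL an ORM would generate for the query above:
    //   SELECT Id, Name, Quantity FROM Items
    //   WHERE Quantity < @threshold ORDER BY Name
}
```

The raw SQL version can be faster in hot paths, but the LINQ version is type-checked and survives schema refactoring, which is the trade-off the answer describes.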

Try this
To access the database faster in .NET: I'm sure the problem is not with C# itself, but you need a better mechanism for querying and retrieving data from the database.


How to maximize SQL retrieval of binary images [closed]

Closed 6 years ago.
I am creating a Zillow-like iOS app. I am using Azure SQL (S0 tier) and C# for my backend. I store user data, including each user's avatar, and the trips the person has been on (another table), where each user can have multiple trip pictures. I use varbinary(max) to store those images, so each user entity has an average of 3 pictures.
My C# app retrieves all users with all their trip info and images, using lazy loading in my LINQ to SQL query.
The issue is, naturally, the time it takes to load all that data before the user can consume it in the iPhone client (written in Swift). The end user needs all the data to scroll through the images: he first loads a user to see his images and read about him, then scrolls and sees the trips that user has gone on. It won't make sense for the user to see only avatars and then wait for all the trips, for example, on every call.
What strategy should I use to minimize the load time?
The thoughts I have had so far:
Use Azure Blob storage for the images and consume them directly from my Swift client to see if that is any faster. I ditched that option because of the extra-slow retrieval of files: I have to first read the image file from my C# code and turn it into binary so I can transmit it to my iPhone client app.
Instead of LINQ to SQL, use good old ADO.NET to remove the extra layer LINQ to SQL adds. I didn't see much difference, not even a second.
Optimize my LINQ query by pushing as much as possible into the where clause and avoiding loops, making it all one select statement. The SQL emitted by LINQ to SQL looks right. It didn't change much, because the real issue is the huge binary payloads being retrieved.
As mentioned above, use silent calls: load only the user's own image first and, while the user views the summary, start loading the trips in the background. But the user will quickly want to check the trips and will be confronted with waiting again, and I would have to initiate new calls to the database, which is expensive. Very undesirable!
What else should I try?

C# Where to cache big data tables [closed]

Closed 6 years ago.
I have to execute a long-running PL/SQL query, store the result set somewhere, and serve it to UI requests. After some time it needs to be refreshed, automatically or manually.
There will be multiple result sets handled the same way, and each result set will have millions of records.
My project is using AngularJS with Web API.
I am using ADO.NET with the Oracle client, not Entity Framework.
I feel that .NET MemoryCache is not suitable, because the number and size of the result sets will keep growing.
I am planning to cache it in MongoDB. Is there any other solution you would suggest?
Thanks
Your question says "each and every result set will have millions of records".
In that context:
Caching on the server is not a good idea; even distributed caching may cause performance issues.
Create dedicated physical tables in Oracle to store your results.
Load and refresh the results table whenever required.
Fetch and return results from the results table.
If you can't insert/delete records in your Oracle database, you may need a dedicated application database.
Even if you return millions of records from the Web API, the Angular application still has to process all that data, which can mean long-running JavaScript and poor client-side performance.
Redis is a good solution. Check out http://redis.io/
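The dedicated results-table steps above can be sketched as follows. An in-memory list stands in for the Oracle results table so the sketch is runnable; the class and method names are made up for illustration:

```csharp
using System;
using System.Collections.Generic;

// Sketch of the "dedicated results table" pattern: the slow PL/SQL query
// runs only inside Refresh(), which rewrites the results table; UI
// requests read the precomputed rows cheaply via Fetch().
public sealed class ResultTableCache
{
    private readonly Func<IEnumerable<string>> _slowQuery;  // the long-running PL/SQL query
    private List<string> _resultsTable = new List<string>();
    public DateTime LastRefreshed { get; private set; }

    public ResultTableCache(Func<IEnumerable<string>> slowQuery) => _slowQuery = slowQuery;

    // Called on a schedule or manually; this is the only expensive path.
    public void Refresh()
    {
        _resultsTable = new List<string>(_slowQuery());  // TRUNCATE + INSERT in Oracle
        LastRefreshed = DateTime.UtcNow;
    }

    // Served to the Web API / Angular client: no PL/SQL runs here.
    public IReadOnlyList<string> Fetch() => _resultsTable;
}
```

With real Oracle tables the same separation holds: the refresh job pays the query cost once, and every UI request is a plain indexed read.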

How to call functions on other computers? [closed]

Closed 6 years ago.
Is there a way to call a function to run on all instances of a Windows Forms application across a LAN?
I have an application which shows each user a dashboard of their own job list. I want a user on another PC to be able to create and allocate a job to this user; once it is created and saved, I would like the method GetJobs(); to run again so the list refreshes. I've not done anything this advanced yet, so please go easy :)
Chris Walsh's comment contains excellent advice. That said, it is possible for Windows Forms applications to communicate with each other, and the simplest method, for me anyway, is WCF with a self-hosted server. Typically the server code should not run on the UI thread; in fact, all WCF work is best kept on a background thread in a Windows Forms application to avoid blocking the UI. WCF has many error conditions you will need to handle.
Another thing you might want to look at is MSMQ, now called Message Queuing. It can store a queue of jobs for you, and it won't lose them if the power goes out.
I assume you have SQL Server Express Edition installed as the database backend.
That way you can connect to the database with some authentication and add the jobs directly there.
Then, on the other computer, add a refresh button or poll for changes. The advantage is that you don't need to write a service yourself, and jobs can be created even when the target user isn't around and his PC is switched off.
You need just one server to host the database.
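The polling side of this shared-database approach can be sketched as below. A list of job IDs stands in for the shared SQL Server table so the sketch runs standalone; the class name and the incremental-ID scheme are assumptions, not from the question:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Sketch of "poll the shared database for changes": the client remembers
// the highest job ID it has seen and asks only for newer jobs each poll
// (in SQL: SELECT * FROM Jobs WHERE Id > @lastSeenId).
public sealed class JobPoller
{
    private readonly List<int> _sharedJobTable;   // stand-in for the database table
    private int _lastSeenId;

    public JobPoller(List<int> sharedJobTable) => _sharedJobTable = sharedJobTable;

    // Called from a timer (e.g. System.Windows.Forms.Timer) or a refresh button;
    // returns only the jobs added since the previous poll.
    public List<int> Poll()
    {
        var newJobs = _sharedJobTable.Where(id => id > _lastSeenId).ToList();
        if (newJobs.Count > 0) _lastSeenId = newJobs.Max();
        return newJobs;
    }
}
```

In the real application, GetJobs(); would run the incremental query and merge the new rows into the dashboard.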

C# How to update data in datagridview [closed]

Closed 7 years ago.
I am building a C# WPF SQL Server application, and before I start there are a few things I want to ask.
First, the application will be used by several clients to write data into one SQL table. In the main window of the application I will have a DataGrid that displays the main table. Since data is going to be entered by more than one person, can I update the DataGrid dynamically, or should I use a timer and update it every few minutes?
Is it better to open one global SQL connection and keep it open while the application is running (and if so, how), or should I create a new connection each time I want to do something?
These are my questions for now; I'm sure I will have many more as I progress, since I am new to WPF and database programming. Thanks for understanding :).
I recommend building your WPF application using the MVVM architecture.
Personally, I think you should not use timers for this purpose. If your client uses filters or sorting to find particular rows, sudden background updates will change the displayed order, which is unpredictable behavior (it breaks UI design principles) and will annoy everyone. A better solution is for users to refresh the data only when they need it (manually, via a button click) or automatically when they navigate to the DataGrid control.
If a client takes a record for editing, you can store a record State field in the database and check it to avoid collisions (multiple users editing the same data at the same time).
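A minimal sketch of that State check. In the real application the check-and-set would be a single conditional UPDATE in SQL Server; here a dictionary stands in so the sketch runs standalone, and all names are hypothetical:

```csharp
using System;
using System.Collections.Generic;

// Sketch of the "State column" idea: before a user starts editing a
// record, flip its state from Free to Editing and refuse the edit if
// someone else got there first.
public enum RecordState { Free, Editing }

public sealed class RecordStore
{
    private readonly Dictionary<int, RecordState> _states = new Dictionary<int, RecordState>();

    public void Add(int recordId) => _states[recordId] = RecordState.Free;

    // Returns true if this user acquired the record, false if another
    // user is already editing it.
    public bool TryBeginEdit(int recordId)
    {
        if (_states[recordId] == RecordState.Editing) return false;
        _states[recordId] = RecordState.Editing;   // in SQL: UPDATE ... WHERE State = 'Free'
        return true;
    }

    public void EndEdit(int recordId) => _states[recordId] = RecordState.Free;
}
```

Against a real database, doing the test and the update in one statement (and checking the affected-row count) is what makes the check safe under concurrency.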
You should not keep the connection alive when you are not performing database operations. Since you're using WPF, the easiest approach to configure and manage is to install the Entity Framework NuGet package in your project and use Code First.
There are good tutorials on how to use it.
Working with your database through project models is much more reliable, and simpler to implement, than working through SqlConnection classes directly.
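The connection-per-operation pattern the answer recommends looks like this: open late, close early, and let `using` guarantee disposal. With ADO.NET the resource would be SqlConnection and pooling makes short-lived connections cheap; a stand-in class records the open/close calls here so the sketch runs without a database:

```csharp
using System;

// Stand-in for SqlConnection: records whether it is open so the
// dispose-on-exit behavior can be observed without a real database.
public sealed class FakeConnection : IDisposable
{
    public bool IsOpen { get; private set; }
    public void Open() => IsOpen = true;
    public void Dispose() => IsOpen = false;   // connection returned to the pool
}

public static class Repository
{
    public static FakeConnection LastConnection { get; private set; }

    public static int GetRowCount()
    {
        // One short-lived connection per operation; no global open connection.
        using (var conn = new FakeConnection())
        {
            conn.Open();
            LastConnection = conn;
            return 3;   // stand-in for ExecuteScalar() etc.
        }
    }
}
```

Entity Framework follows the same discipline internally: each DbContext operation opens and closes its connection for you.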

Performance evaluation of Web API calls with Database LINQ queries [closed]

Closed 7 years ago.
I have a UI which calls Web APIs (Web API 2.0); the Web APIs are basically LINQ queries (against an MS SQL database) plus some processing logic for the data. I want to do a performance evaluation of the complete flow (click in the UI, to the API, back to the UI to display the data) against a large database with 30K-60K records in it.
How can this be done? Let me know the methods/tools used for this.
Currently I am tracking the time taken in the Chrome debug window, which shows the total time for each network call.
Wow. This is a subject in its own right but here's an approach:
The pieces are independent, so break it down. Measure your LINQ queries without any of the logic or Web API machinery in the way. If the LINQ runs against stored procedures, measure those first. Then measure the cost of the logic, then the cost of sending X rows of data through the Web API; leave out the cost of actually retrieving the rows from the database there, so you're checking just the connectivity. I'd also consider writing a browserless test client (i.e. GETs/POSTs or whatever) to eliminate the browser as a variable.
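Timing one layer in isolation can be as simple as a Stopwatch wrapper like the sketch below. Thread.Sleep stands in for the LINQ query or stored procedure being measured, and the helper name is made up:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

// Sketch: wrap just the piece under test in a Stopwatch and repeat it
// enough times for a stable average. A warm-up run absorbs one-off costs
// (JIT compilation, connection pools, caches) that would skew the number.
public static class MicroTimer
{
    public static double AverageMilliseconds(Action workUnderTest, int iterations)
    {
        workUnderTest();                      // warm-up, not counted
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++) workUnderTest();
        sw.Stop();
        return sw.Elapsed.TotalMilliseconds / iterations;
    }
}
```

For example, `MicroTimer.AverageMilliseconds(() => Thread.Sleep(5), 10)` should report roughly 5 ms or slightly more, depending on timer granularity; swapping in a real query lambda measures that query the same way.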
Now you've got a fairly good picture of where the time gets spent. You know if you've got DB issues, query issues, network issues or application server issues.
Assuming that all goes well, now add a bunch of instances to your test harness so you're testing concurrent access, load and the like. Often you can't surface a problem with a single user, so this step is important.
Break it down into chunks and have a data set that you can consistently bring back to a known state.
As for tools, it really depends on what you use. Visual Studio comes with a bunch of useful things, but there are plenty of third-party options too. If you have a dedicated test team, this might be part of their setup. SQL Server has a huge amount of monitoring capability; ask your DBAs. If you have to find your own way, just keep in mind that you want to be able to run all of this by pressing a button, not by setting up a complex environment each time.
