Closed. This question is opinion-based. It is not currently accepting answers.
Closed 9 years ago.
I'm using Entity Framework and SQL Server 2012. I have to remove a lot of items from the database, about 200 GB, in one transaction. Deleting the data directly is very fast, but if the delete fails partway through, the database is left in an inconsistent state. That is why I'm thinking of using TransactionScope: if the process fails, I can roll the database back.
Is TransactionScope suitable for handling this much data, or are there performance considerations to keep in mind? Does anyone have benchmark data or experience with this problem?
Any help would be greatly appreciated.
As mentioned in the comments, it has some impact, but the overhead is trivial; if you decompile the TransactionScope class and look at the code, you will see what actually happens there.
It also depends heavily on your database settings and schema; you can tune the database to get the best results. This approach has been used for years, and I have used it for a long time without any problems.
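For reference, a minimal sketch of wrapping the delete in a TransactionScope. The context type `MyContext`, the table name, the `BatchId` filter, and the two-hour timeout are placeholder assumptions for illustration, not details from the question:

```csharp
using System;
using System.Transactions;

class BulkDelete
{
    // Sketch: a failure anywhere before Complete() rolls everything back.
    // "MyContext" and "dbo.BigTable" are placeholders for the real schema.
    static void DeleteBatch(int batchId)
    {
        TransactionOptions options = new TransactionOptions();
        options.IsolationLevel = IsolationLevel.ReadCommitted;
        options.Timeout = TimeSpan.FromHours(2); // large deletes take time

        using (TransactionScope scope = new TransactionScope(TransactionScopeOption.Required, options))
        using (MyContext context = new MyContext())
        {
            // Raw SQL avoids loading 200 GB of entities into the EF change tracker.
            context.Database.ExecuteSqlCommand(
                "DELETE FROM dbo.BigTable WHERE BatchId = @p0", batchId);

            scope.Complete(); // commit; disposing without Complete() rolls back
        }
    }
}
```

Note that a single 200 GB delete will still grow the transaction log regardless of TransactionScope; if partial progress is acceptable, deleting in smaller committed batches is often the more practical option.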
Closed 6 years ago.
I'm interested in your opinions and best thoughts.
Would you say it's better to query the database every time you want to retrieve data (even when you know it's static), or to store the data in the web server's session memory?
On the one hand, storing it in memory allows faster access to the data and reduces load on the SQL Server. On the other hand, it adds some complexity to the application code and consumes web server resources.
The answer depends (as most do) on your specific needs. If your database has spare capacity but your web server doesn't, do the work in SQL, and vice versa. There's no way to recommend anything to anyone without knowing their situation.
You can use static dictionaries to cache static data from the database.
The benefit is that there is only one instance of each dictionary, which reduces the number of requests to the database.
You can refer to this question to see how to implement this.
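A minimal sketch of the static-dictionary idea. The class name, the country data, and `LoadCountriesFromDatabase` are made-up placeholders; in a real application the loader would run a single SELECT:

```csharp
using System;
using System.Collections.Generic;

// Lazily-loaded static cache for lookup data: the database is hit at
// most once, and every caller shares the same dictionary instance.
public static class StaticDataCache
{
    private static readonly object _lock = new object();
    private static Dictionary<int, string> _countries;

    public static Dictionary<int, string> Countries
    {
        get
        {
            if (_countries == null)
            {
                lock (_lock) // double-checked locking: load only once
                {
                    if (_countries == null)
                        _countries = LoadCountriesFromDatabase();
                }
            }
            return _countries;
        }
    }

    private static Dictionary<int, string> LoadCountriesFromDatabase()
    {
        // Placeholder: would normally be a single query against the database.
        return new Dictionary<int, string> { { 1, "Germany" }, { 2, "France" } };
    }
}
```

The lock matters in a web application, where many requests may touch the cache concurrently; without it, several requests could each fire the same query on first access.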
Closed 8 years ago.
I'm going to write a new program.
Is it better to store my queries in views and call them from my C# code,
or should I just execute all SQL queries directly in my code?
Which is better for performance?
From a performance point of view there is no difference between running a query from C# and accessing a view based on the same query; the only difference arises if the view is indexed and therefore runs faster than your ad-hoc query.
As a good practice, it is best to use a SQL view when extracting data from multiple tables through joins, as putting the whole SQL statement in C# would look messy and be more prone to errors.
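To make the comparison concrete, here is a sketch of the view-based approach. The view name `vw_OrderSummary`, its columns, and the connection string are invented for illustration:

```csharp
using System;
using System.Data.SqlClient;

class OrderReport
{
    // The join lives in a view on the server, so the C# side is a plain
    // SELECT. Placeholder view, created once on the SQL Server side:
    //   CREATE VIEW dbo.vw_OrderSummary AS
    //     SELECT o.OrderId, c.Name, o.Total
    //     FROM dbo.Orders o
    //     JOIN dbo.Customers c ON c.CustomerId = o.CustomerId;
    static void Print(string connectionString)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT OrderId, Name, Total FROM dbo.vw_OrderSummary", conn))
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine("{0}: {1} ({2})",
                        reader.GetInt32(0), reader.GetString(1), reader.GetDecimal(2));
            }
        }
    }
}
```

The C# code stays a one-line SELECT even if the underlying join changes later, which is the maintainability argument for views.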
There are a lot of things to consider when undertaking this:
1. Volume of data (indexed views can help here)
2. Number of tables involved
3. How dynamic the database structure is (i.e. how often you change the tables)
have fun!
Closed 9 years ago.
I have been updating the database with SQL queries when we deploy the application on the client's machine. Now I want to update the database automatically. Is there any way to do this? I have heard of SQL Migrations, but they say they can only be used with the Code First approach. Can anyone shed some light on this topic?
You can use a Database Project in Visual Studio. With Database Projects, you can generate SQL Scripts for any existing database, you can create difference (update) scripts, you can add SQL scripts of your own, etc.
Database Projects are extremely handy in many scenarios. Check out this link on MSDN: http://msdn.microsoft.com/en-us/library/xee70aty.aspx
There is also guidance on CodeProject about this: http://www.codeproject.com/Articles/245612/Creating-a-Database-Project-with-Visual-Studio
You'll love it!
Closed 8 years ago.
I have been told to build a process that inserts data for clients using multithreading.
A client database needs to be updated in a short period of time. There is an application that does the job, but it's single-threaded, and I need to make it multithreaded.
The idea is to insert data in batches using the existing application, e.g.:
Process 50,000 records
Assign 5,000 records to each thread
The idea is to fire 10-20 threads, or even multiple instances of the same application, to do the job.
Any ideas, suggestions, or examples of how to approach this?
It's .NET 2.0, unfortunately.
Are there any good examples you have come across, e.g. using ThreadPool?
I'm reading up on multithreading in the meantime.
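A .NET 2.0-compatible sketch of the batching scheme described above, using ThreadPool (no Task Parallel Library in 2.0). `ProcessBatch` is a stand-in for the existing single-threaded insert routine:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

public class BatchRunner
{
    // Split "total" records into (startIndex, count) batches of batchSize.
    public static List<KeyValuePair<int, int>> Partition(int total, int batchSize)
    {
        List<KeyValuePair<int, int>> batches = new List<KeyValuePair<int, int>>();
        for (int start = 0; start < total; start += batchSize)
            batches.Add(new KeyValuePair<int, int>(start, Math.Min(batchSize, total - start)));
        return batches;
    }

    public static void Run(int total, int batchSize)
    {
        List<KeyValuePair<int, int>> batches = Partition(total, batchSize);
        using (ManualResetEvent done = new ManualResetEvent(false))
        {
            int pending = batches.Count;
            foreach (KeyValuePair<int, int> batch in batches)
            {
                KeyValuePair<int, int> b = batch; // capture a copy per iteration
                ThreadPool.QueueUserWorkItem(delegate
                {
                    ProcessBatch(b.Key, b.Value);
                    if (Interlocked.Decrement(ref pending) == 0)
                        done.Set(); // last batch finished
                });
            }
            done.WaitOne(); // block until every batch has completed
        }
    }

    static void ProcessBatch(int start, int count)
    {
        // Placeholder: call the existing insert logic for
        // records [start, start + count).
    }
}
```

Note that each thread needs its own database connection, and 10-20 threads hammering the same table can cause lock contention; measure before assuming more threads means more throughput.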
I'll bet dollars to donuts the problem is that the existing code just uses an absurdly inefficient algorithm. Making it multi-threaded won't help unless you fix the algorithm too. And if you fix the algorithm, it likely will not need to be multi-threaded. This doesn't sound like the type of problem that typically benefits from multi-threading itself.
The only possible scenario I could see where this matters is if latency to the database is an issue. But if it's on the same LAN or in the same datacenter, that won't be an issue.
Closed 8 years ago.
If any of you are using the new TVPs (table-valued parameters) to pass a collection to SQL Server, what issues have you encountered with that process? I am thinking about implementing them, but wanted to get some feedback on potential pitfalls of going with that solution. I like passing XML for its flexibility and the ability to pass collections, but I don't want to have to shred the XML on SQL Server (it doesn't seem like it will scale well). Anyway, I just wanted to get some feedback from some of the more experienced users... always really helpful.
Thanks,
S
I have been using TVPs for half a year now, and I have not had any issues at all. I do not send more than 1K rows at a time. Before TVPs were available, I packed numbers in binary format into an image column and cast parts of the image back into numbers on the server; that scaled very well and performed very fast for up to 100K numbers.
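For anyone landing here, a minimal sketch of passing a TVP from C#. The table type `dbo.IntList`, the query, and the connection string are placeholders; the table type must be created on the server first:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class TvpExample
{
    // Assumes this type already exists on the server:
    //   CREATE TYPE dbo.IntList AS TABLE (Id INT NOT NULL);
    static void SendIds(string connectionString, int[] ids)
    {
        // A DataTable whose columns match the table type carries the rows.
        DataTable table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        foreach (int id in ids)
            table.Rows.Add(id);

        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT t.Id FROM @Ids AS t", conn))
        {
            SqlParameter p = cmd.Parameters.AddWithValue("@Ids", table);
            p.SqlDbType = SqlDbType.Structured; // mark the parameter as a TVP
            p.TypeName = "dbo.IntList";         // must match the server-side type

            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader.GetInt32(0));
            }
        }
    }
}
```

The same `@Ids` parameter can be joined against real tables or passed to a stored procedure, which is what replaces the XML-shredding step.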