Create a data tier in a legacy C# application [closed]

Closed 9 years ago: this question needs to be more focused and is not currently accepting answers.
Subjective question I know....
We have an existing application with around 20 components that connect to a database. We are now clustering the application to increase scalability, but we're hitting limits: every process has its own small connection pool, which results in a large number of connections to the DB. There would also be some interesting options for caching across our cluster if we could centralise connections.
My question is: are there any low-cost/low-risk options for refactoring our solution from using SQL connections directly to using a middle tier? Is the pain worth rewriting a full unit-of-work + models style application tier and refactoring all our database connections into WCF business-logic calls?

As a longer-term solution, refactoring to a middle tier would help, since any common connection-handling logic can then be centralized. One common cause of the multiple-connection problem is that connections are not being disposed of correctly, but that is speculation without looking at the code.
See the following post from the SQL support team on diagnosing connection-pool issues: http://blogs.msdn.com/b/sql_pfe_blog/archive/2013/10/08/connection-pooling-for-the-sql-server-dba.aspx
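The disposal point is worth illustrating. A minimal sketch (the connection string, class, and table names here are hypothetical, not from the original question): wrapping the connection in a `using` block guarantees it is returned to the pool even when the query throws, which is the usual fix for connection counts creeping up.

```csharp
// Hypothetical repository illustrating "dispose connections promptly".
using System.Data.SqlClient;

public static class CustomerRepository
{
    // Placeholder connection string.
    private const string ConnectionString =
        "Server=.;Database=AppDb;Integrated Security=true";

    public static int CountCustomers()
    {
        // 'using' guarantees Dispose runs, returning the physical
        // connection to the pool even if ExecuteScalar throws.
        using (var connection = new SqlConnection(ConnectionString))
        using (var command = new SqlCommand(
            "SELECT COUNT(*) FROM Customers", connection))
        {
            connection.Open();
            return (int)command.ExecuteScalar();
        }
    }
}
```

If connections are opened like this everywhere, the per-process pool stays small on its own; auditing for code paths that open a connection without a `using`/`finally` is often cheaper than introducing a middle tier.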

Related

c# solution architecture for signal-r and angular [closed]

Closed 4 years ago: this question needs to be more focused and is not currently accepting answers.
I would like to build a C# SPA using Angular with real-time messaging via SignalR. The SignalR component should read data continuously from a data source, publish the updated data to users, and store the data in a database as well. It should also let users chat with each other. The expected number of users is around a hundred.
For such an application, what would be the best architectural structure for the solution? Should I implement two (or three?) projects, e.g. one for the web app and another for SignalR, running as two applications? In that case, how would I handle messaging between the applications? Or should I implement all of this in a single project? It would be best if you could give the pros and cons of these alternatives, or suggest any other option.
Start with one project.
For 100 simultaneous users, you aren't anywhere near needing to worry about load. Any simple hosting plan would handle it easily. If you get more, ASP.NET and SignalR work just fine behind a load balancer (though certain operations can get more complicated).
A properly architected application won't be difficult to split into multiple processes in the future if it ever comes to that, and doing so now just adds mounds of complexity for no appreciable benefit. This goes double since it sounds like you are just starting out.
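To make the single-project suggestion concrete, here is a minimal SignalR 2.x hub sketch (the hub, method, and client-callback names are illustrative, not from the question):

```csharp
// Minimal SignalR 2.x hub living in the same ASP.NET project
// as the Angular front end. Names here are hypothetical.
using Microsoft.AspNet.SignalR;

public class ChatHub : Hub
{
    // Broadcasts a chat message to every connected client;
    // 'addMessage' is whatever callback the Angular client registers.
    public void Send(string user, string message)
    {
        Clients.All.addMessage(user, message);
    }
}
```

In SignalR 2.x this is typically wired up in the same project with an OWIN `Startup` class calling `app.MapSignalR()`, so the web app, hub, and database access can share one process until scale actually demands otherwise.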

Is having many small WCF services better than one? [closed]

Closed 7 years ago: this question is opinion-based and is not currently accepting answers.
I'm developing software for taxi services. My software consists of a WCF service as the server and a WPF application as the client. The functionality is growing, and my WCF service now has more than 50 methods. I'm thinking about splitting my one big WCF service into a couple of smaller services.
Is it a good idea to do so?
I would say yes, if you can define a clear separation of responsibilities for each of your services. You should try to avoid or minimize coupling between services, though. Keep in mind you'll break existing clients if you change the contract, but it sounds like you are in control of those.
This is quite a common scenario, and you can help yourself by ensuring your service layer is essentially a facade to make it easier to move things around.
IME, if splitting the service up would just create a lot of replicated code for common functionality (like DB access), or would complicate things by needing to add stuff like additional functionality for the services to talk to each other, I would suggest no.
Another reason: your fault-tolerance scenarios now become more complicated. It is one thing if your entire monolithic service dies: then nothing works. But if you have 4 related services that need to work together, you now have to intelligently handle partial-failure scenarios, like what happens if service #3 goes down while the other services have half-done jobs that need it. Now you have to be able to get things back to a consistent state while waiting for #3 to come back up, or be able to persist work so you can get back to it when it does. Your number of error messages has just increased as well.
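The facade idea above can be sketched briefly. This is a hypothetical contract and implementation (none of these names come from the question): the client sees one WCF endpoint, while the work is delegated to internal components that could later become services of their own without touching the client-facing contract.

```csharp
// Sketch of a facade-style WCF service; all names are hypothetical.
using System.ServiceModel;

[ServiceContract]
public interface ITaxiService
{
    [OperationContract]
    void BookRide(int customerId, string pickupAddress);

    [OperationContract]
    decimal GetFareEstimate(string from, string to);
}

// The service implementation is a thin facade over internal
// components, so splitting later means moving a component out,
// not rewriting the contract.
public class TaxiService : ITaxiService
{
    private readonly BookingComponent _bookings = new BookingComponent();
    private readonly PricingComponent _pricing = new PricingComponent();

    public void BookRide(int customerId, string pickupAddress)
    {
        _bookings.Create(customerId, pickupAddress);
    }

    public decimal GetFareEstimate(string from, string to)
    {
        return _pricing.Estimate(from, to);
    }
}

// Placeholder internal components standing in for the real logic.
public class BookingComponent
{
    public void Create(int customerId, string pickupAddress) { /* ... */ }
}

public class PricingComponent
{
    public decimal Estimate(string from, string to) { return 0m; }
}
```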

Data Access Layer in ASP.NET, best practices for making connections [closed]

Closed 8 years ago: this question is opinion-based and is not currently accepting answers.
Which is considered a better practice when creating a data access layer for use in ASP.NET/Web Services:
1) Open and close a connection within each method that retrieves data from the database; or
2) Have a 'connection class' that deals with connecting to the database, and other separate classes that use the connection class, opening the connection in the constructor and closing it in the destructor.
(I tried to enter pseudo-code as an example, but Stack Exchange wouldn't accept it. Sorry for the lack of an example.)
I'm concerned that the first method is going to open a million unnecessary connections to the database over time. Or does ASP.NET just cache the connection and I don't have to worry about it?
Are there disadvantages or dangers in going about it the second way, which I guess would hold a connection to the database a bit longer?
Thanks in advance.
The problem with having a global connection class is that multiple web handlers would be calling it simultaneously. Sometimes you can have database objects with a lifetime and visibility scoped to that connection. (For example, a temporary table). If you have multiple simultaneous queries running on that same connection, then these queries might inadvertently interfere with one another.
What you want is a Pool of connections. Here, there are a number of connections kept open, but when a function accesses a connection it has exclusive usage of it.
If you are using SQL Server, then this is provided for you for free:
SQL Server Connection Pooling (ADO.NET)
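A short sketch of option 1, assuming ADO.NET's built-in pooling (the connection string, class, and schema here are placeholders): each method opens late and closes early, and "closing" only returns the physical connection to the pool, so no million-connection build-up occurs.

```csharp
// Hypothetical DAL class: open-late, close-early, backed by the
// ADO.NET connection pool. Connection string and table are placeholders.
using System.Data.SqlClient;

public class ProductDal
{
    private readonly string _connectionString;

    public ProductDal(string connectionString)
    {
        _connectionString = connectionString;
    }

    public string GetProductName(int id)
    {
        using (var connection = new SqlConnection(_connectionString))
        using (var command = new SqlCommand(
            "SELECT Name FROM Products WHERE Id = @id", connection))
        {
            command.Parameters.AddWithValue("@id", id);
            connection.Open();               // borrows a pooled connection
            return (string)command.ExecuteScalar();
        }                                    // Dispose returns it to the pool
    }
}
```

This also sidesteps the shared-connection hazards described above: each call gets exclusive use of its connection for the duration of the method, and nothing is held open across requests.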

Use Entity Framework for big projects [closed]

Closed 9 years ago: this question needs to be more focused and is not currently accepting answers.
Can I use Entity Framework 6 with .NET 4.5 for a big project?
The project contains many solutions, and some of them communicate with the database at a very high rate, such as sending SMS via SMTP with a huge volume of data.
Thanks.
// edit for more details
I'm starting a big project that has five sections:
1- sending SMS via SMTP, which creates a huge number of requests and heavy load on the database (for example, 1,500 inserts and selects per second);
2- payment requests;
3- many other requests ...
The default answer is certainly yes. This is where all ORMs shine: in large systems, writing a data access layer by hand is a big and error-prone task.
As Steve McConnell suggests in his great book you should never make speculations about performance. Therefore, if you have specific performance concerns, you should try benchmarking.
If you want my opinion, between clear code and performance, I choose clear code. It will allow you to implement a more robust and maintainable system. Then, if you identify performance problems, you could make the necessary minor changes. This is my default rule.
Hope I helped!
Can I use Entity Framework 6 with .NET 4.5 for a big project?
Yes
In this project I have many solutions, and some of them are in fast communication with the database, such as SMS via SMTP with a huge volume of data.
Yes, although it might depend on your database design. And even then, you could use stored procedures if necessary.
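For orientation, a minimal EF6 sketch (the entity, context, and property names are invented for illustration): short-lived contexts per operation are the usual pattern, and for the 1,500-rows-per-second insert path, dropping down to `SqlBulkCopy` or a stored procedure for that one hot spot is a common escape hatch while keeping EF everywhere else.

```csharp
// Minimal EF6 sketch; all names are hypothetical.
using System.Data.Entity;
using System.Linq;

public class SmsMessage
{
    public int Id { get; set; }
    public string Body { get; set; }
    public bool Sent { get; set; }
}

public class MessagingContext : DbContext
{
    public DbSet<SmsMessage> Messages { get; set; }
}

public static class MessageQueries
{
    // Keep DbContext instances short-lived: create, use, dispose.
    public static int CountUnsent()
    {
        using (var db = new MessagingContext())
        {
            return db.Messages.Count(m => !m.Sent);
        }
    }
}
```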

Multithreading advice on approach needed [closed]

Closed 8 years ago: this question needs details or clarity and is not currently accepting answers.
I have been told to make a process that inserts data for clients using multithreading. I need to update a client database in a short period of time. There is an application that does the job, but it's single-threaded, and I need to make it multithreaded.
The idea is to insert data in batches using the existing application, e.g.:
Process 50,000 records
Assign 5,000 records to each thread
The plan is to fire 10-20 threads, or even multiple instances of the same application, to do the job.
Any ideas, suggestions, or examples of how to approach this?
It's .NET 2.0, unfortunately.
Are there any good examples of how to do this that you have come across, e.g. ThreadPool?
I'm reading up on multithreading in the meantime.
I'll bet dollars to donuts the problem is that the existing code just uses an absurdly inefficient algorithm. Making it multi-threaded won't help unless you fix the algorithm too. And if you fix the algorithm, it likely will not need to be multi-threaded. This doesn't sound like the type of problem that typically benefits from multi-threading itself.
The only possible scenario I could see where this matters is if latency to the database is an issue. But if it's on the same LAN or in the same datacenter, that won't be an issue.
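If, after profiling, batching across threads still turns out to be warranted, the mechanics are straightforward even on .NET 2.0. A sketch under those assumptions (`ProcessBatch` is a hypothetical stand-in for the existing insert routine; here it just counts records so the sketch is verifiable):

```csharp
// .NET 2.0-compatible sketch: split records into batches and run
// each batch on the ThreadPool, waiting until all batches finish.
using System;
using System.Collections.Generic;
using System.Threading;

public static class BatchRunner
{
    // Running total of processed records (for demonstration only).
    public static int Processed;

    public static void Run(IList<int> records, int batchSize)
    {
        if (records.Count == 0) return;

        int batchCount = (records.Count + batchSize - 1) / batchSize;
        using (ManualResetEvent done = new ManualResetEvent(false))
        {
            int pending = batchCount;
            for (int i = 0; i < batchCount; i++)
            {
                // Fresh locals per iteration so each closure
                // captures its own batch boundaries.
                int start = i * batchSize;
                int length = Math.Min(batchSize, records.Count - start);
                ThreadPool.QueueUserWorkItem(delegate(object state)
                {
                    ProcessBatch(records, start, length);
                    if (Interlocked.Decrement(ref pending) == 0)
                        done.Set();   // last batch signals completion
                });
            }
            done.WaitOne();           // block until every batch finished
        }
    }

    private static void ProcessBatch(IList<int> records, int start, int length)
    {
        // Stand-in for the real insert logic; just tallies records.
        Interlocked.Add(ref Processed, length);
    }
}
```

Note that if the bottleneck is the database rather than the client, adding threads only adds contention; measuring first, as the answer above suggests, is the cheaper move.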
