C#: batch up operations to load into a remote database asynchronously

I have several front end web app servers, running ASP.NET MVC apps, which collect some analytics data on visitors.
I want to collect this data into a central database, via a Web API call.
I don't want to lose data, but I also don't want the front end servers to slow or stop in the event that the webapi/db server is not available.
I'm envisioning writing all these events to a log file and then running an async process to ship them to the central webapi/db, retrying in case the server is unavailable for a time.
Is there some standard library or method for performing this, without building a bunch of custom code around it?
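For reference, a minimal in-process sketch of the pattern being described (queue locally, ship in batches asynchronously, retry on failure). The AnalyticsShipper name, endpoint URL, batch size, and retry delay are all assumptions; a fully durable version would append each event to a local file or queue before acknowledging it, as the question envisions.

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

// Hypothetical sketch: events are queued in memory and shipped in batches.
// A durable variant would append to a local log file first and delete on success.
public class AnalyticsShipper
{
    private readonly BlockingCollection<string> _queue = new BlockingCollection<string>();
    private readonly HttpClient _client = new HttpClient();
    private readonly string _endpoint; // placeholder, e.g. the central web API URL

    public AnalyticsShipper(string endpoint)
    {
        _endpoint = endpoint;
        Task.Run(() => ShipLoopAsync()); // fire-and-forget consumer loop
    }

    // Called from the MVC app; never blocks the request thread.
    public void Enqueue(string jsonEvent) => _queue.Add(jsonEvent);

    private async Task ShipLoopAsync()
    {
        var batch = new List<string>();
        foreach (var evt in _queue.GetConsumingEnumerable())
        {
            batch.Add(evt);
            if (batch.Count < 100 && _queue.Count > 0) continue; // keep batching

            // Retry until the central server accepts the batch.
            while (true)
            {
                try
                {
                    // Assumes each event is already a serialized JSON object.
                    var body = new StringContent("[" + string.Join(",", batch) + "]",
                                                 Encoding.UTF8, "application/json");
                    var response = await _client.PostAsync(_endpoint, body);
                    if (response.IsSuccessStatusCode) break;
                }
                catch (HttpRequestException) { /* server unreachable; back off */ }
                await Task.Delay(TimeSpan.FromSeconds(30)); // wait, then retry
            }
            batch.Clear();
        }
    }
}
```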

Related

How to catch SQL Server trigger in C# code

I have to create an external service for an existing database that works with an ERP system.
Edit: service will be running on the same machine where SQL Server is running.
Service has to listen for a trigger on a table with documents.
Scenario:
User creates a document
Trigger calls service method
The method queries the database to get data, creates the document, and sends it to an external API.
Is it possible to catch trigger like that from a C# Worker Service?
It's technically possible to run arbitrary SQL CLR code in a trigger, and to make remote web service or RPC calls to code running in a separate service. But since the trigger runs during the transaction, and any code you run can delay or undo the database change, it's not recommended, and it probably would not be supported by the ERP system.
So the best pattern here is to have the trigger write a row into a local table, and then have an external process poll that table and do the rest of the work, or to configure Change Data Capture or Change Tracking and have the external program query that.
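For illustration, here is a minimal sketch of the polling half as a .NET Worker Service, assuming the trigger inserts DocumentId rows into a hypothetical dbo.DocumentQueue table:

```csharp
using System;
using System.Data.SqlClient;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

// Hypothetical worker: the trigger inserts rows into dbo.DocumentQueue;
// this BackgroundService drains the table and calls the external API.
public class DocumentQueueWorker : BackgroundService
{
    private const string ConnectionString = "..."; // placeholder

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            using (var conn = new SqlConnection(ConnectionString))
            {
                await conn.OpenAsync(stoppingToken);

                // Claim one queued row; READPAST lets several workers coexist.
                var cmd = new SqlCommand(
                    @"DELETE TOP (1) FROM dbo.DocumentQueue WITH (ROWLOCK, READPAST)
                      OUTPUT deleted.DocumentId", conn);

                var documentId = await cmd.ExecuteScalarAsync(stoppingToken);
                if (documentId == null)
                {
                    // Queue is empty; poll again shortly.
                    await Task.Delay(TimeSpan.FromSeconds(5), stoppingToken);
                    continue;
                }

                // Query the document data and push it to the external API here.
                Console.WriteLine($"Processing document {documentId}");
            }
        }
    }
}
```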

How to move ASP.NET MVC 5/SQL Server app from single server to multiple servers

I have a web application (ASP.NET C# MVC5 with SQL Server 2014 back end on a single server) that has become moderately successful. It is handling the load fine now, but I want to be prepared in the event usage explodes (as it may). The app takes a lot of data from the user in the form of images regularly captured from the user's webcam.
What would be the best way to begin moving from a single server to a setup that would be scalable? I'd ideally like to be able to simply add servers when we need to handle more traffic and processing (for facial/object recognition, etc).
I've done a little research, but haven't found anything I'd consider even a good starting point, so I'd appreciate even a point in the right direction.
You should consider moving your files from the filesystem to a CDN (Content Delivery Network). This will help assure high scalability. It could be, for example, Amazon CloudFront or Google Cloud Platform.
You can also consider moving your most resource-consuming code behind an ESB (Enterprise Service Bus), for example MassTransit (free) or NServiceBus.
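For example, a minimal MassTransit sketch of offloading the heavy recognition work to worker machines (the message type, consumer, and RabbitMQ transport are all assumptions):

```csharp
using System;
using System.Threading.Tasks;
using MassTransit;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

// Hypothetical message published by the web tier for each captured image.
public class ImageCaptured
{
    public Guid ImageId { get; set; }
    public string BlobUrl { get; set; }
}

// The consumer runs on worker machines; add more machines to scale out.
public class ImageCapturedConsumer : IConsumer<ImageCaptured>
{
    public Task Consume(ConsumeContext<ImageCaptured> context)
    {
        // Run the facial/object recognition here, off the web servers.
        return Task.CompletedTask;
    }
}

public static class Program
{
    public static Task Main() =>
        Host.CreateDefaultBuilder()
            .ConfigureServices(services =>
                services.AddMassTransit(x =>
                {
                    x.AddConsumer<ImageCapturedConsumer>();
                    // RabbitMQ is an assumption; any supported transport
                    // (Azure Service Bus, etc.) is wired the same way.
                    x.UsingRabbitMq((ctx, cfg) => cfg.ConfigureEndpoints(ctx));
                }))
            .Build()
            .RunAsync();
}
```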

updating aspx page from server

Currently I am working on a project where a web page starts a 3-hour data transfer process through the use of a web service. Basically, a list of IDs is used to tell the web service to transfer data objects from one database to another.
When I start the process, I would like to allow the user to remain in control. He must be able to see how far the current process is, and be able press a cancel button to stop the data transfer. All this is run from an ASPX web page.
I have discovered that trying to run the process asynchronously does not allow me to update the user interface as I had hoped; the UI only updates when the entire process is finished. Another problem I might face is that, since the process takes a lot of time, the server objects might get refreshed at some point (e.g. an application pool recycle), which could cause me to lose my progress.
I am currently at the point where I am deciding whether or not to use the web server to process the data transfer. If I am, my current solution is to use tasks to run the process asynchronously, and use client-side web calls (to the aspx page) to update the client interface. I am also implementing the IRegisteredObject interface for my work object (see the explanation of IRegisteredObject).
Any idea on how to best tackle this problem is most welcome. I really want to know if I'm heading in the right direction.
My suggestion would be for you to create a WCF service, which is not hosted within ASP.NET and to make your ASP.NET application call that service to trigger the long running job that performs the data transfer. Meanwhile your ASP.NET app notifies the user that it has triggered the job and you can expose other endpoints on your service which the ASP.NET application can query through a user request in order to extract and report progress.
The web server will only run one page at a time for each user, so if you want to communicate with the browser while the process is running, you can't run the process from a request. You need to start it as an independent background thread, so that you can finish the request that started the process.
After that you can send requests to the server either to do polling for status of the process, or to control the process.
If possible you should run the process entirely outside of IIS, for example as a console application. That way you only have to keep track of the fact that there is a process running in the web application, for example putting that in a database so that it survives IIS recycling.
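To illustrate the in-IIS variant, here is a minimal sketch using the IRegisteredObject approach the question mentions (the TransferJob class and its members are hypothetical). The page would poll PercentComplete via an AJAX call and invoke Cancel from the cancel button.

```csharp
using System;
using System.Threading;
using System.Web.Hosting;

// Hypothetical long-running job that outlives the request that started it
// and cooperates with IIS shutdown via IRegisteredObject.
public class TransferJob : IRegisteredObject
{
    private volatile bool _stopRequested;
    public int PercentComplete { get; private set; }

    public void Start(int[] documentIds)
    {
        HostingEnvironment.RegisterObject(this); // tell ASP.NET we exist
        ThreadPool.QueueUserWorkItem(_ =>
        {
            for (int i = 0; i < documentIds.Length && !_stopRequested; i++)
            {
                TransferOne(documentIds[i]); // placeholder for the real transfer
                PercentComplete = (i + 1) * 100 / documentIds.Length;
            }
            HostingEnvironment.UnregisterObject(this);
        });
    }

    public void Cancel() => _stopRequested = true;

    // Called by ASP.NET when the app domain shuts down (e.g. a recycle).
    public void Stop(bool immediate)
    {
        _stopRequested = true;
        HostingEnvironment.UnregisterObject(this);
    }

    private void TransferOne(int id) { /* call the web service here */ }
}
```

Note that even with IRegisteredObject the progress lives in memory, which is why the answer above recommends persisting the job state in a database if it must survive a recycle.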

Creating a Database Query Service

I've been asked to create a Service for our Parent Company. They don't care how I do it, as long as the data is sent to them.
We have an SQL 2000 Server that receives machine data via Data Transformation Services (DTS).
Our Parent Company wants me to create a Service that runs every 5 minutes or so to collect new data, summarize it, and forward it to them.
With my background in Windows Forms development, I naturally think that I should poll the database every 5 minutes using some type of Windows service, then send that data over to our Parent Company.
The machine housing this data is an old Windows 2000 machine, and our Network Administrator has recommended that I write this as a Web Service on our newer Web Server.
I created a Web Service a few months back for the Web Server to pull work order information from our Parent Company, but I do not know how to make this service execute a process every 5 minutes.
Yesterday, I learned how to create an AFTER INSERT trigger for when records are added to the table. Unfortunately, the triggers are not fired because this old server uses DTS jobs. I was able to learn about Controlling Trigger Execution When Bulk Importing Data, but there does not seem to be a way to modify our old DTS jobs to enable the BULK INSERT command, and it may not work on SQL Server 2000.
So, with this background, should I create a Windows Service or a Web Service?
How should I proceed?
I would not make a web service for a recurring task. A web service responds to incoming requests rather than running on a schedule, so it is not really comparable to a Windows service here.
By the way, a simpler alternative might be to create a command-line app that runs periodically via a scheduled task (read about the AT scheduler on Windows 2000). It is just easier to install and update, because it wouldn't require stopping and restarting a service each time you make an update.
If the web service has a method you can call that executes the data importing/converting exactly one time, you can use a Windows scheduled task or cron job to make a request to that method. You can either add this task to the server that is hosting the service, or to some other server, as long as it can access the web service.
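For illustration, a minimal sketch of the scheduled command-line approach (the connection string, table, and column names are all assumptions):

```csharp
using System;
using System.Data.SqlClient;

// Hypothetical console app run every 5 minutes by a scheduled task
// (the AT scheduler on Windows 2000, or Task Scheduler on newer servers).
public static class SummaryExporter
{
    public static void Main()
    {
        using (var conn = new SqlConnection("...")) // placeholder connection string
        {
            conn.Open();

            // Summarize rows added in the last 5 minutes; the dbo.MachineData
            // schema here is an assumption.
            var cmd = new SqlCommand(
                @"SELECT MachineId, COUNT(*) AS Readings, MAX(Value) AS Peak
                  FROM dbo.MachineData
                  WHERE CreatedAt >= DATEADD(MINUTE, -5, GETDATE())
                  GROUP BY MachineId", conn);

            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Forward each summary row to the Parent Company here.
                    Console.WriteLine("{0}: {1} readings",
                        reader["MachineId"], reader["Readings"]);
                }
            }
        }
    }
}
```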

C#, ASP.net application which calls executable to create output file

We are developing a web application in ASP.NET and C#. The requirement here is to interact with a third-party exe developed in Fortran 77. This third-party exe produces an output file after being provided with some inputs, and then shuts down. In a Windows desktop single-user application this is easily possible by using System.Diagnostics.Process and the events provided therein. But on the web there will be a multi-user environment, and many calls will be made to this exe. What are the best possible ways to handle such an exe in a web application?
Is it fine if we invoke the exe on each user request, since the exe shuts down after generating the output file? Or
Is it possible to use a Windows service? Or
Any other approach?
Typically, invoking a separate process to do some job (per request) does not scale well once your number of requests starts growing. That said, if the process invocation is not going to happen frequently, you should be OK. The number of concurrent requests, throughput, etc. will really depend on your server hardware, and the best bet would be to load-test the server. For the invocation itself, you should use the Process class to launch the exe and get the work done, as sketched below.
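A minimal sketch of per-request invocation (the exe path, argument convention, and timeout are assumptions). Giving each run its own working directory avoids concurrent runs clobbering each other's output file:

```csharp
using System;
using System.Diagnostics;
using System.IO;

// Hypothetical wrapper: each request gets its own working directory so that
// concurrent runs of the exe cannot overwrite each other's output file.
public static class FortranRunner
{
    public static string Run(string inputData)
    {
        string workDir = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString("N"));
        Directory.CreateDirectory(workDir);
        File.WriteAllText(Path.Combine(workDir, "input.dat"), inputData);

        var psi = new ProcessStartInfo
        {
            FileName = @"C:\tools\legacy.exe",  // placeholder path
            Arguments = "input.dat output.dat", // assumed argument convention
            WorkingDirectory = workDir,
            UseShellExecute = false,
            CreateNoWindow = true
        };

        using (var process = Process.Start(psi))
        {
            if (!process.WaitForExit(60000)) // don't hang the request forever
            {
                process.Kill();
                throw new TimeoutException("The legacy exe did not finish in time.");
            }
        }

        return File.ReadAllText(Path.Combine(workDir, "output.dat"));
    }
}
```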
Yet another possible issue is that your legacy executable may not support multiple instances. It's unlikely, but there are quite a few desktop Windows applications that check for an existing instance. In that case, you cannot launch the process concurrently, and the only way around it would be queuing logic: you can create an in-process queue (in your web application) or an external application (such as a Windows service) that does the queuing.
There is also an alternate approach that is useful when the process takes a long time to complete (so that you cannot block your web requests until the job is done) and/or you need to scale the app to support more load. Essentially, the idea is to use the producer-consumer pattern: your web server adds requests to a persisted queue (e.g. a table in a database), and then multiple machines/servers run a job/Windows service that reads from this queue and runs the process to generate the file.
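A sketch of that queue-table pattern (the dbo.JobQueue table and its columns are assumptions). The web tier calls Enqueue and returns immediately; the Windows service calls DequeueJobId in a loop and launches the exe for each claimed job:

```csharp
using System.Data.SqlClient;

// Hypothetical producer/consumer over a database-backed queue table.
public static class JobQueue
{
    // Producer side: the web request only records the job and returns.
    public static void Enqueue(string connectionString, string inputData)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            var cmd = new SqlCommand(
                "INSERT INTO dbo.JobQueue (InputData, Status) VALUES (@input, 'Pending')",
                conn);
            cmd.Parameters.AddWithValue("@input", inputData);
            cmd.ExecuteNonQuery();
        }
    }

    // Consumer side (runs in the Windows service): claim one pending job.
    // READPAST lets multiple service instances share the load without blocking.
    public static int? DequeueJobId(SqlConnection conn)
    {
        var cmd = new SqlCommand(
            @"UPDATE TOP (1) dbo.JobQueue WITH (ROWLOCK, READPAST)
              SET Status = 'Running'
              OUTPUT inserted.Id
              WHERE Status = 'Pending'", conn);
        return (int?)cmd.ExecuteScalar(); // null when the queue is empty
    }
}
```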
