Best Way to Aggregate Log Files from Multiple Remote Windows Servers - c#

I have a Windows desktop application written in C# that takes information from Windows services, web services, databases, and other components running on various remote Windows servers for specific stages in a process. Currently, the application reads a shared database to learn when to move on to the next step in the process, but the details and verbose information that each component writes to its own logs are not captured in that shared database. I want to capture this log information in a "master process steps detail" type list that the desktop application can display to users.
Is there a best method or practice for capturing logs in various formats from multiple Windows servers?
There are similar questions on Stack Overflow, but they seem to focus on Linux or other non-Windows servers:
Best way to aggregate multiple log files from several servers
I was told that WMI might be able to help here, but I would like confirmation; below is roughly what I understand that suggestion to mean.
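A minimal sketch of the WMI route as I understand it (requires a reference to System.Management.dll; REMOTESERVER is a placeholder, and you need admin rights on the target). Note that this only reaches entries written to the Windows Event Log, not arbitrary log files, so components would have to log there for this to capture them:

```csharp
using System;
using System.Management; // reference System.Management.dll

class RemoteEventLogReader
{
    static void Main()
    {
        // Connect to the WMI namespace on the remote server.
        var scope = new ManagementScope(@"\\REMOTESERVER\root\cimv2");
        scope.Connect();

        // Pull entries from the remote Application event log.
        var query = new ObjectQuery(
            "SELECT TimeGenerated, SourceName, Message FROM Win32_NTLogEvent " +
            "WHERE Logfile = 'Application'");

        using (var searcher = new ManagementObjectSearcher(scope, query))
        {
            foreach (ManagementObject evt in searcher.Get())
            {
                Console.WriteLine("{0}  {1}: {2}",
                    evt["TimeGenerated"], evt["SourceName"], evt["Message"]);
            }
        }
    }
}
```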
Thanks for any replies.

Several sources outside of Stack Overflow have recommended that I look into the Elastic Stack, so I'm moving forward with that for now.

Related

What is a good place to store production logs of an ASP.NET application?

Environment: Windows instance hosted in AWS. ASP.NET application logs events and errors using log4net to an SQL Server database.
Now I'm looking to offload that logging to another server, with the final goals of 1) reducing the load on SQL Server and the server itself and 2) having a better way to search and analyze those logs, which are currently stored in a non-indexed table.
I’m looking for something faster yet simple to install, configure and use. I've already investigated Seq, Logstash, and others, but none seem to fit my requirements very well. Can you suggest a solution?
The requirements:
Free (and Open Source if possible)
Runs on Windows
Can be configured with log4net
The log server can run on a separate machine (or has a minimal memory footprint)
Fast
The good to haves:
Minimal security to protect the server if it's public on the Internet.
Supports event forwarding, since the production server and the log server will be on loosely connected machines.
UI with basic search & analytics
One option is Elasticsearch and Kibana. Use Google, but a good starting point for you could be http://www.ben-morris.com/using-logstash-elasticsearch-and-log4net-for-centralized-logging-in-windows/
The idea behind putting the logs in a search engine instead of a database is to make them easier to search. You get indexing on the fields you choose and full-text search for free, and Kibana provides an excellent user interface for exploring and aggregating logs.
My personal experience is that this makes for a very good logging setup that will make it easier for you to spot errors and negative trends faster.
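If you follow the log4net → Logstash → Elasticsearch route from that link, the shipping side of the application can be as small as a UdpAppender configured in code. A sketch only: the host and port here are assumptions and must match a udp input on your Logstash instance.

```csharp
using System.Net;
using log4net;
using log4net.Appender;
using log4net.Config;
using log4net.Layout;

class Program
{
    static void Main()
    {
        var layout = new PatternLayout(
            "%date [%thread] %-5level %logger - %message%newline");
        layout.ActivateOptions();

        // Ship every log event over UDP to Logstash; host and port are
        // placeholders and must match the Logstash udp input.
        var appender = new UdpAppender
        {
            RemoteAddress = IPAddress.Parse("10.0.0.5"),
            RemotePort = 5960,
            Layout = layout
        };
        appender.ActivateOptions();

        BasicConfigurator.Configure(appender);

        LogManager.GetLogger(typeof(Program)).Info("centralized logging is up");
    }
}
```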

Connecting to Oracle from a Windows.Forms application

I'm working on a Random Moment Sampling desktop app. I haven't worked with Windows Forms in a long time, and I have the following questions.
I need to query data from Oracle 11g. If I remember right, my users need to install the Oracle client before they can start using the client application. Am I right, or has this changed?
If this is a problem, I can use web services to retrieve the data. If someone has recommendations, I'm open to alternatives; I'll have approximately 3000 users and I'm looking for the best option.
The application will run in the background, querying the database every minute to look for samples; the moment it finds one, a window comes up blocking the computer until the user fills in the sample.
Is a Windows Forms application the best option, or should I use a Windows Service? I've read a few threads, but I'm thinking about the installation process.
I currently have time, so I can try a few ideas.
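For illustration, here is roughly the polling loop I have in mind. Just a sketch: CheckForSample would hold the actual Oracle query, and SampleForm is a stand-in for the real data-entry window.

```csharp
using System;
using System.Windows.Forms;

// Stand-in for the real data-entry window.
public class SampleForm : Form { }

public class SamplerContext : ApplicationContext
{
    private readonly Timer _timer = new Timer();

    public SamplerContext()
    {
        _timer.Interval = 60 * 1000; // poll once a minute
        _timer.Tick += OnTick;
        _timer.Start();
    }

    private void OnTick(object sender, EventArgs e)
    {
        _timer.Stop(); // don't re-enter while the dialog is open
        try
        {
            if (CheckForSample())
                using (var form = new SampleForm())
                    form.ShowDialog(); // blocks until the user fills in the sample
        }
        finally
        {
            _timer.Start();
        }
    }

    private bool CheckForSample()
    {
        // The Oracle (or web service) query goes here.
        return false;
    }
}

static class Program
{
    [STAThread]
    static void Main()
    {
        Application.Run(new SamplerContext());
    }
}
```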
Yes, the Oracle software needs to be installed. There is an "Instant Client" package that is a little more lightweight than the normal client and still allows for connectivity.
Whether to use a service or not depends on the functionality of your system and how extensible you want it to be. You mentioned you will have 3000 users querying the data. If they are querying the same data, more than one user may end up responding to the same sample. I don't know if this is what is desired.
Edit: to tie the two points together, if the Oracle client software is a concern: if you create a service that serves up your data, the machine where the service runs is the only one that will require the Oracle client software.
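For completeness, once the (Instant) Client plus ODP.NET is installed, basic connectivity looks roughly like this. A sketch only: the connect descriptor, credentials, table, and query are placeholders.

```csharp
using System;
using Oracle.DataAccess.Client; // ODP.NET, installed with the Oracle client

class OracleProbe
{
    static void Main()
    {
        // Placeholder connect string and credentials.
        const string connString =
            "Data Source=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=dbhost)(PORT=1521))" +
            "(CONNECT_DATA=(SERVICE_NAME=ORCL)));User Id=app;Password=secret;";

        using (var conn = new OracleConnection(connString))
        using (var cmd = new OracleCommand(
            "SELECT sample_id, taken_at FROM samples", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine("{0} {1}", reader.GetDecimal(0), reader.GetDateTime(1));
            }
        }
    }
}
```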

Storing log files in the cloud

I have developed a desktop application that will be used without my presence, so my program records various events: input, executed operations, working time. Is it possible to modify the app to send its logs to some cloud service so that I can view reports remotely?
I have accounts on Wuala and Google Drive, but it's not a problem to register somewhere else (Dropbox? SkyDrive? I don't know which cloud service would be easiest to work with in this situation).
Or you can suggest another, more elegant solution.
I am using C# and .NET 2.0. The logs are simple .txt files.
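For reference, the simplest thing I can think of is pushing the day's .txt file to an HTTP endpoint I control. A .NET 2.0-compatible sketch, where the URL, credentials, and path are placeholders; Dropbox/SkyDrive/Google Drive would each need their own API instead.

```csharp
using System;
using System.Net;

class LogUploader
{
    static void Main()
    {
        string logPath = @"C:\MyApp\Logs\app-"
            + DateTime.Today.ToString("yyyy-MM-dd") + ".txt";

        using (WebClient client = new WebClient())
        {
            // Placeholder endpoint and credentials.
            client.Credentials = new NetworkCredential("user", "secret");
            client.UploadFile("http://logs.example.com/upload", "POST", logPath);
        }
    }
}
```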

Shaky connectivity - favor web or desktop app?

I'm a desktop application developer who is temporarily working on the web. I'm working with a client that wants me to build an app for use by locations all over the state; however, these locations have very shaky connectivity.
They really want a centralized web app and are suggesting I build a "lean" web app. I don't know what a "lean web app" means: many small HTTP requests, or a few large ones? I tend to favor chunky over chatty, but I've never had to worry about connectivity before.
Do I suggest a desktop app that replicates data when connectivity exists? If not, what's the best way to approach a web app when connectivity is shaky?
EDIT:
I must qualify my question with further information. Assuming the web option, they've disallowed the use of browser runtime technologies and anything that requires installation. Thus Silverlight is out, Flash is out, Gears is out; only ASP.NET and JavaScript are available to me. Having stated this, part of my question was whether to use a desktop app; I suppose that can be extended to "thicker technologies".
EDIT #2: Network is homogeneous - every node is Windows. This won't be changing.
You should get a definition of what the client means by "lean" so that you don't have confusion surrounding it. Maybe present them with several interpretations of "lean" that you think they might mean. One thing I've found is that it's no good at all to guess about client requirements; just get clarification before you waste a bunch of time.
Shaky connectivity definitely favors a desktop application. Web apps are great for users that have always-on Internet connections, and that might be using a variety of different browsers and operating systems.
Your client probably has locations that are all using Windows, so a desktop application is an appropriate choice. One other advantage of web applications is that they make the deployment issue easy to deal with. Auto-update technologies like ClickOnce make the deployment and update of desktop applications almost as easy.
And not to knock Google Gears, but it's relatively new and would have to be considered more risky than a tried-and-true desktop application.
Update: and if you're limited to just javascript on the client side, you definitely do not want to make this a web app. Your application simply will not be available whenever the Internet connection is down. There are ways to save stuff locally in javascript using cookies and user stores and whatnot, but you just don't want to do this.
If connectivity is so bad, I would suggest that you write a WinForm app that downloads information, locally edits it and then uploads it. This way, if your connection goes down, all you have to do is retry until it works.
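Something along these lines: a minimal sketch of the "retry until it works" loop, where the 30-second delay and the WebException filter are arbitrary choices.

```csharp
using System;
using System.Net;
using System.Threading;

static class Retry
{
    // Keep trying the given upload until the connection cooperates.
    public static void UntilItWorks(Action upload)
    {
        while (true)
        {
            try
            {
                upload();
                return;
            }
            catch (WebException)
            {
                // Connection is down again; wait and retry.
                Thread.Sleep(TimeSpan.FromSeconds(30));
            }
        }
    }
}
```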
They seem to be suggesting a plain vanilla web app that doesn't use AJAX or rely on .NET postbacks or do anything that might make it break down horribly if your connection goes away for a bit. Instead, it should be designed so that you can hit Refresh until it works. In other words, they seem to want the closest thing to a WinForm app, only uglier.
You may consider using a framework like Google Gears to help provide functionality during network down time. This allows users to connect to the web page once (with a functioning connection) and then be able to use the web app from then on, even without a connection.
When the network is restored, the framework can sync changes back with the central database.
There is even a tutorial for using Google Gears with the .Net Framework.
Gears with other languages
You mention that connectivity is shaky at these locations, but that the app needs to be centralized. One thing you might consider is using multiple decentralized read database servers and a single centralized write server. MySQL makes this possible and affordable if your app is small.
Have the main database server at the datacenter/central office. Put up small web/db servers at each location, with your app installed. You can even run them off a user computer if the remote location is not too big. Make the local database servers connect to the centralized database server as replication slaves. As changes come in to the centralized database, the slave servers will pull down the data and make it available locally. When the connection is unavailable, your app data is still at least available, if not up to date. When the connection is available, the database handles replicating all relevant data down.
Now all you have to do is make your app use two separate database handles: reads go to the local database, writes go to the central database.
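A sketch of the two-handle pattern (assumes MySQL Connector/NET; the connection strings and the settings table are placeholders): reads hit the local replication slave, writes go to the central master.

```csharp
using MySql.Data.MySqlClient; // MySQL Connector/NET

class ReadWriteSplit
{
    // Reads hit the local replication slave; writes go to the central master.
    const string ReadConn = "Server=localhost;Database=app;Uid=app;Pwd=secret;";
    const string WriteConn = "Server=central.example.com;Database=app;Uid=app;Pwd=secret;";

    public static object ReadSetting(string name)
    {
        using (var conn = new MySqlConnection(ReadConn))
        using (var cmd = new MySqlCommand(
            "SELECT value FROM settings WHERE name = @name", conn))
        {
            cmd.Parameters.AddWithValue("@name", name);
            conn.Open();
            return cmd.ExecuteScalar();
        }
    }

    public static void WriteSetting(string name, string value)
    {
        using (var conn = new MySqlConnection(WriteConn))
        using (var cmd = new MySqlCommand(
            "REPLACE INTO settings (name, value) VALUES (@name, @value)", conn))
        {
            cmd.Parameters.AddWithValue("@name", name);
            cmd.Parameters.AddWithValue("@value", value);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```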

Suggestions for developing Windows Service with C#

I am developing a distributed application where each distributed site will run a 2-tier WinForms-based client-server application (there are specific reasons for not going with a web architecture). The local application server will be responsible for communicating with a central server, and the local workstations will in turn communicate with the local server.
I am about to develop one or more Windows services to achieve the following objectives. I presume some of them will run on the local server while some will run on the local workstations. The central server is just a SQL Server database, so no services need to be developed on that end; DML statements will simply be fired from the local server.
Windows Service Objectives
Synchronise local data with data from central server (both end SQL Server) based on a schedule
Take automatic database backup based on a schedule
Automatically download software updates from vendor site based on a schedule
Enforce software license management by monitoring connected workstations
Intranet messaging between workstations and/or the local server (like a slightly richer NET SEND utility)
Manage deployment of updates from local server to workstations
Though I have something in mind, I am a little unsure about these decisions, and I need to cross-check my thoughts with experts. My queries follow:
How many services should run on the local server?
Please specify individual service roles.
How many services should run on the workstations?
Please specify individual service roles.
Since some services will need to query the local SQL database, where should the services read the connection string from? registry? Config file? Any other location?
If one service is allotted multiple tasks (e.g., a scheduler service), what is the best/optimal way, with respect to performance, to manage the scheduled tasks?
a. Build .NET class Library assemblies for the scheduled tasks and spin them up on separate threads
b. Build .NET executable assemblies for the scheduled tasks and fire the executables
Though it may seem like too many questions for a single post, they are all related. I will share my views with the forum later, as I do not want to bias the forum's suggestions in any way.
Thanks to all of you in advance.
How many services should run on the local server?
It depends on many factors, but I would generally go for as few services as possible, because maintaining and monitoring one service is less work for the admin than looking after many.
How many services should run on the workstations?
I would use just one, because that makes it a single point of failure: the user will notice if the service on the workstation is down, and in that case only one service needs to be restarted.
Since some services will need to query the local SQL database, where should the services read the connection string from? registry? Config file? Any other location?
I would generally put the connection string in the app.config. The .NET framework also offers facilities to encrypt the connection string.
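A sketch of both halves (requires a reference to System.Configuration.dll; the "LocalDb" name is a placeholder): reading the connection string from app.config and encrypting the section with the DPAPI provider the framework offers.

```csharp
using System.Configuration; // reference System.Configuration.dll

class ConnectionStringSetup
{
    public static string GetLocalDb()
    {
        // Reads <connectionStrings><add name="LocalDb" .../> from app.config.
        return ConfigurationManager.ConnectionStrings["LocalDb"].ConnectionString;
    }

    public static void ProtectOnce()
    {
        // Encrypts the connectionStrings section in place using DPAPI,
        // so the on-disk config no longer holds the plain-text string.
        var config = ConfigurationManager.OpenExeConfiguration(ConfigurationUserLevel.None);
        var section = config.GetSection("connectionStrings");
        if (!section.SectionInformation.IsProtected)
        {
            section.SectionInformation.ProtectSection("DataProtectionConfigurationProvider");
            config.Save(ConfigurationSaveMode.Modified);
        }
    }
}
```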
If one service is allotted multiple tasks, e.g. a scheduler service, what is the best/optimal way with respect to performance for managing the scheduled tasks?
a. Build .NET class library assemblies for the scheduled tasks and spin them up on separate threads
b. Build .NET executable assemblies for the scheduled tasks and fire the executables
Option b. is easier to design and implement, and it lets you use the Windows Task Scheduler. In that case you will need to think about what happens when the scheduler starts the executable while the previous run has not finished yet: you end up with two processes that may do the same work. If that is not a problem, stay with that design. If it is a problem, consider solution a.
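If overlapping runs are the concern but you still prefer option b., a named mutex is a common guard. A sketch, where the mutex name is a placeholder:

```csharp
using System;
using System.Threading;

static class ScheduledTask
{
    static void Main()
    {
        bool createdNew;
        // The Global\ prefix makes the mutex visible across sessions.
        using (var mutex = new Mutex(true, @"Global\MyCompany.SyncTask", out createdNew))
        {
            if (!createdNew)
                return; // the previous scheduled run is still in progress

            // ... do the scheduled work here ...
        }
    }
}
```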
For solution a., have a look at Quartz.NET, which offers a lot of advanced scheduling capabilities. Also consider using application domains instead of threads to make the service more robust.
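A minimal Quartz.NET sketch for solution a. (assuming the 2.x synchronous API; 3.x is Task-based, and the job body and 15-minute interval are placeholders):

```csharp
using Quartz;
using Quartz.Impl;

// Placeholder job: the actual sync-with-central-server logic goes here.
public class SyncJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // pull/push data with the central server
    }
}

static class SchedulerHost
{
    static void Main()
    {
        IScheduler scheduler = new StdSchedulerFactory().GetScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<SyncJob>().WithIdentity("sync").Build();
        ITrigger trigger = TriggerBuilder.Create()
            .StartNow()
            .WithSimpleSchedule(s => s.WithIntervalInMinutes(15).RepeatForever())
            .Build();

        scheduler.ScheduleJob(job, trigger);
        // In a real Windows service, keep the process alive and call
        // scheduler.Shutdown() in OnStop.
    }
}
```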
If you don't get admin rights on the local server, think about means to restart the service without the service control manager. Give some privileged user the ability to re-initialize the service from a client machine.
Also think about ways to restart just one part of a service if one service is doing multiple tasks. For instance, the service may behave strangely because the update task is running wrong. If you need to restart the whole service to repair this, all users may become aware of it. Provide some means to re-initialize only the update task.
Most important: don't follow any of my advice if you find an easier way to achieve your goals. Start with a simple design. Don't overengineer! Solutions with (multiple) services and scheduling tend to explode in complexity with each added feature, especially when you need to let the services talk to each other.
I don't think there is one answer to this; some would probably use just one service, while others would modularize every domain into a service and add enterprise transaction services, etc. The question is more of an SOA one than a C# one, and you might consider reading up on some SOA books to find your pattern.
This does not answer your questions specifically, but one thing to consider is that you can run multiple services in the same process. The ServiceBase.Run method accepts an array of ServiceBase instances to start. They are technically different services and appear as such in the service control manager, but they execute inside the same process. Again, just something to consider in the broader context of your questions.
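A sketch of that approach, where SyncService and BackupService are hypothetical stubs:

```csharp
using System.ServiceProcess;

public class SyncService : ServiceBase
{
    public SyncService() { ServiceName = "LocalSync"; }
    protected override void OnStart(string[] args) { /* start the sync scheduler */ }
}

public class BackupService : ServiceBase
{
    public BackupService() { ServiceName = "LocalBackup"; }
    protected override void OnStart(string[] args) { /* start the backup scheduler */ }
}

static class Program
{
    static void Main()
    {
        // Both services run inside this one process but appear separately
        // in the service control manager.
        ServiceBase.Run(new ServiceBase[] { new SyncService(), new BackupService() });
    }
}
```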
