I have a Windows network (not connected to a domain) and I need to provide some automation on each PC at a certain time of day. There are several tasks: launching executables, managing the file system, and transferring files. All these actions must be implemented via RDP, using C#. What is the common approach to achieve this? I don't have experience using RDP within software, so are there .NET classes or free libraries I can use to get RDP functionality in my software? Thank you!
All the tasks you have listed depend much more on the security settings of the machines within your network and on the logged-in user's privileges than on the use of RDP.
Within a Windows domain, tasks like yours are usually delegated to Active Directory administration and group policies.
Without a Windows domain, you will need a mechanism along the lines of the following configuration:
a client installed on each machine under the proper permissions; the client should implement the subscriber side of a publish/subscribe pattern.
a server installed on a "commander" machine; the server should implement the publisher side.
There are plenty of ready-made solutions that implement this concept of content distribution and remote script execution. I think your investment in researching and evaluating such tools will be much more time- and cost-effective than writing an app that "uses RDP functionality".
But if there is a reason that prevents the use of third-party tools, I would go for an implementation of a WCF service that is installed on all clients. This service should be "trained" to do all your stuff on the client. On the server side you will need an application or a service that publishes events for the clients or triggers known client methods.
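As a rough illustration of that last option, here is a minimal sketch of what the per-client WCF service contract could look like; the interface, operation names, and the assumption that plain byte arrays are fine for file transfer are all mine, not taken from any existing product:

```csharp
using System.ServiceModel;

// Hypothetical contract for the service installed on every client machine.
[ServiceContract]
public interface IClientAutomation
{
    [OperationContract]
    void LaunchExecutable(string path, string arguments);

    [OperationContract]
    void DeleteFile(string path);

    [OperationContract]
    void ReceiveFile(string destinationPath, byte[] contents);
}

public class ClientAutomation : IClientAutomation
{
    public void LaunchExecutable(string path, string arguments)
    {
        // The process starts under the account the service runs as.
        System.Diagnostics.Process.Start(path, arguments);
    }

    public void DeleteFile(string path)
    {
        System.IO.File.Delete(path);
    }

    public void ReceiveFile(string destinationPath, byte[] contents)
    {
        System.IO.File.WriteAllBytes(destinationPath, contents);
    }
}
```

The server-side "commander" would then simply call these operations on each known client endpoint at the scheduled time.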
I'm looking for a way to establish simple communication between a C# web application and the operating system.
Since I'm working in Silverlight, I have everything I need to create files in any folder on the C:\ drive. The problem is that we're going to migrate from Silverlight to HTML5 / C#.
So I'd need a way to create files FROM any browser TO any OS: Windows, Mac, Linux...
I thought about using Microsoft ActiveX, but that's not cross-platform.
I'm simply looking for a technology/plugin/software or anything that would allow me to do that; the less client interaction required, the better.
I think your need conflicts with any common sense about security. If there were a simple way to create any file on any computer that loads your web app, just imagine how quickly all sorts of malware would spread.
But going back to your question - I think it will not be simple (by the way, was it really that simple in Silverlight?). What I can imagine is having some kind of service running on the client PC (the user would have to install it, or it could be pushed by corporate policy if your web app targets corporate environments). The service would listen on some TCP port, and your web app could send requests to that port asking it to create a particular file with particular content. All the security concerns would then be handled in that service so that it doesn't get abused by hostile web apps.
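To make the idea concrete, here is a minimal sketch of such a client-side service, assuming a made-up one-line text protocol, a hypothetical port 8125, and a single whitelisted folder; a real implementation would need proper authentication and path validation:

```csharp
using System.IO;
using System.Net;
using System.Net.Sockets;
using System.Text;

class FileCreationService
{
    static void Main()
    {
        // Listen on loopback only, on an arbitrarily chosen port.
        var listener = new TcpListener(IPAddress.Loopback, 8125);
        listener.Start();

        while (true)
        {
            using (TcpClient client = listener.AcceptTcpClient())
            using (var reader = new StreamReader(client.GetStream(), Encoding.UTF8))
            {
                // Assumed wire format: first line = target path, rest = file content.
                string path = reader.ReadLine();
                string content = reader.ReadToEnd();

                // Real code must validate the path and the caller to avoid abuse;
                // this whitelist folder is just a placeholder.
                if (path != null && path.StartsWith(@"C:\AllowedFolder\"))
                    File.WriteAllText(path, content);
            }
        }
    }
}
```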
This has been asked a lot, but I'm after more specific answers.
I'm designing/developing a suite of applications that depend on communication between an agent and a manager. They will be communicating commands (from manager to agent) and statistics (agent to manager). At the moment, this suite of applications will only run on the Windows platform, but eventually it should scale out to other systems. Some places where this application would be used might not want to install a great load of applications across their systems, and some might not want the manager side sitting there sucking up juice (analysis!) and therefore might want it in the cloud.
So, I know that I can use .NET TCP sockets, which seem to have good raw performance and allow me a great deal of flexibility.
I also know that I could use Windows Communication Foundation, which naturally seems like a better choice.
But, seeing as I'm sending commands and receiving statistics, I could just use PowerShell to connect remotely and use the plethora of commands available from what would be the server, ruling out writing a client application full stop.
Bearing in mind that these applications (client especially) should just sit there, be quiet, do their job and not interfere with general operations, which would you suggest to be better?
If you need more clarification I'll be happy to do so!
Thanks.
The way you describe your application, I would reformulate it as an 'agent' located on the clients and a manager located on the server. If that's the case, the manager sends commands to the agents and the agents respond or store statistics that are available to the manager.
For such an architecture you could use WMI, with a WMI provider on the agent and whatever you want on the server. PowerShell can be used on the server to query your agents. In the near future you could use the same architecture, putting your agent on a Linux box with NanoWBEM on top of the WS-Man protocol (see Standards-based Management in Windows Server “8”).
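As a small illustration of the manager-queries-agent idea, here is a hedged sketch using System.Management from C#; the machine name is a placeholder, and a built-in WMI class stands in for whatever custom provider class the agent would actually expose:

```csharp
using System;
using System.Management;   // reference System.Management.dll

class WmiQueryExample
{
    static void Main()
    {
        // "AgentHost" is a placeholder for the remote agent machine.
        var scope = new ManagementScope(@"\\AgentHost\root\cimv2");
        scope.Connect();

        // A built-in class is used here for illustration; a custom WMI provider
        // on the agent would expose its own statistics class instead.
        var query = new ObjectQuery("SELECT FreePhysicalMemory FROM Win32_OperatingSystem");
        using (var searcher = new ManagementObjectSearcher(scope, query))
        {
            foreach (ManagementObject os in searcher.Get())
            {
                Console.WriteLine("Free physical memory (KB): {0}", os["FreePhysicalMemory"]);
            }
        }
    }
}
```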
They will be communicating commands (from server to client) and statistics (client to server). At the moment, this suite of applications will only run on the Windows platform, but eventually should scale out to other systems.
In your description there is one requirement that is not compatible with PowerShell: PowerShell only works in the Windows world.
A WCF service should not consume any resources if it is properly configured. IIS and WAS are able to load services on demand (upon receiving a request from a client) and unload them when they are no longer needed.
http://blogs.msdn.com/b/blambert/archive/2009/02/13/enable-iis.aspx
http://msdn.microsoft.com/en-us/library/ms731053.aspx
TCP sockets are the best from a performance point of view, but unfortunately the implementation will require a lot of extra development and QA work to build all the plumbing that already exists in WCF. When you finish that work you will have reinvented yet another wheel that is not compatible with industry standards.
My vote is WCF.
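For comparison, here is a rough sketch of how little plumbing the WCF-over-net.tcp route needs; the contract, addresses, and port are illustrative assumptions, not taken from the question:

```csharp
using System;
using System.ServiceModel;

[ServiceContract]
public interface IManagerService
{
    [OperationContract]
    void ReportStatistics(string agentId, string payload);
}

public class ManagerService : IManagerService
{
    public void ReportStatistics(string agentId, string payload)
    {
        Console.WriteLine("{0}: {1}", agentId, payload);
    }
}

class Demo
{
    static void Main()
    {
        // Manager side: self-host the endpoint over NetTcpBinding.
        var host = new ServiceHost(typeof(ManagerService),
            new Uri("net.tcp://localhost:9000/manager"));
        host.AddServiceEndpoint(typeof(IManagerService), new NetTcpBinding(), "");
        host.Open();

        // Agent side: call the manager without writing any socket plumbing.
        var factory = new ChannelFactory<IManagerService>(
            new NetTcpBinding(), "net.tcp://localhost:9000/manager");
        IManagerService proxy = factory.CreateChannel();
        proxy.ReportStatistics("agent-01", "cpu=12%");

        factory.Close();
        host.Close();
    }
}
```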
I am writing a Windows application using C#. I plan to later allow it to be controlled over the intranet from a browser as well. So in the future we should be able to control it both through the local interface and over the intranet from a browser.
Is there any pre-defined architecture which will allow me to do this? What are the methods of achieving this? I am new to C#/.NET.
EDIT:
The Windows application needs to access the communication ports extensively, needs to be pretty stable, and would probably run for several days at a stretch.
Thanks...
I can't tell you whether a specific package exists that would ease the development. But if I were to attempt it, after Googling and not finding something already available that meets my needs, I would likely make my application a WCF host and create service entry points to accept control messages remotely. You would also need some well-known location where your application registers itself so the remote system can find it. You should be sure to provide the user with a way of disabling the remote-control feature.
Your host interface will need to run on its own thread to keep the application responsive. Since you are new to C#, and presumably to Windows Forms development, you will need to read up on how to properly talk to the GUI controls from a non-GUI thread.
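A minimal sketch of that GUI-thread marshalling, assuming a hypothetical status label on the main form:

```csharp
using System;
using System.Windows.Forms;

public class MainForm : Form
{
    private readonly Label statusLabel = new Label { Dock = DockStyle.Top };

    public MainForm()
    {
        Controls.Add(statusLabel);
    }

    // Called from the WCF/host thread when a control message arrives.
    public void OnRemoteCommandReceived(string command)
    {
        if (InvokeRequired)
        {
            // Marshal the call onto the GUI thread before touching any control.
            Invoke(new Action<string>(OnRemoteCommandReceived), command);
            return;
        }
        statusLabel.Text = "Last remote command: " + command;
    }
}
```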
Alternatively, you may want to implement your application as two distinct units: one with a GUI that does all the user interaction and forms service requests to send to the host portion (which has no GUI). Your app could then operate locally or be controlled remotely.
One solution I have used in a similar situation has three parts :-
1) Win32 (local) Service
Manages the COM ports and does whatever is necessary with the attached hardware
2) WinForms/Console Application
Runs on the local machine and communicates with the local service via named pipes or TCP (see the sketch after this list).
3) Web Server + Web App
Runs on local or remote machine & communicates with local service.
The local user can shut the WinForms application down and log off without affecting the service or remote users.
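As an illustration of part 2 talking to part 1, here is a hedged sketch of a named-pipe call; the pipe name and the one-line text protocol are assumptions, not part of any existing code:

```csharp
using System.IO;
using System.IO.Pipes;

class LocalServiceClient
{
    public static string SendCommand(string command)
    {
        // The pipe name must match whatever the local service registers.
        using (var pipe = new NamedPipeClientStream(".", "ComPortServicePipe",
                                                    PipeDirection.InOut))
        {
            pipe.Connect(2000);   // wait up to 2 seconds for the service

            var writer = new StreamWriter(pipe) { AutoFlush = true };
            var reader = new StreamReader(pipe);

            writer.WriteLine(command);   // e.g. "STATUS COM3"
            return reader.ReadLine();    // the service is assumed to reply with one line
        }
    }
}
```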
The newest version of Silverlight (the version that ships with Visual Studio 2010) allows what Microsoft terms the "Out Of Browser Experience" (OOB for short).
This allows the user to set up the Silverlight application as a desktop application as well as running through a browser.
Rudi Grobbler has just blogged about how he went about setting this up on his PC.
I am developing a distributed application where each distributed site will run a two-tier WinForms-based client-server application (there are specific reasons for not going with a web architecture). The local application server will be responsible for communicating with a central server, and the local workstations will in turn communicate with the local server.
I am about to develop one or more Windows services to achieve the following objectives. I presume some of them will run on the local server while some will run on the local workstations. The central server is just a SQL Server database, so no services need to be developed on that end; only DML statements will be fired from the local server.
Windows Service Objectives
Synchronise local data with data from central server (both end SQL Server) based on a schedule
Take automatic database backup based on a schedule
Automatically download software updates from vendor site based on a schedule
Enforce software license management by monitoring connected workstations
Intranet messaging between workstations and/or local server (like a little richer NET SEND utility)
Manage deployment of updates from local server to workstations
Though I have something in mind, I am a little unsure how to decide, and I need to cross-check my thoughts with experts like you. My queries are the following:
How many services should run on the local server?
Please specify individual service roles.
How many services should run on the workstations?
Please specify individual service roles.
Since some services will need to query the local SQL database, where should the services read the connection string from? registry? Config file? Any other location?
If one service is allotted to do multiple tasks, e.g. a scheduler service, what would be the best/optimal way, with respect to performance, to manage the scheduled tasks?
a. Build .NET class Library assemblies for the scheduled tasks and spin them up on separate threads
b. Build .NET executable assemblies for the scheduled tasks and fire the executables
Though it may seem like too many questions for a single post, they are all related. I will share my views with the forum later, as I do not want to bias the forum's views/suggestions in any way.
Thanking all of you in advance.
How many services should run on the local server?
It depends on many factors, but I would generally go for as few services as possible, because maintaining and monitoring one service is less work for the admin than having many services.
How many services should run on the workstations?
I would use just one; that makes it a single point of failure, which actually helps here: the user will notice if the service on the workstation is down, and in that case only one service needs to be restarted.
Since some services will need to query the local SQL database, where should the services read the connection string from? registry? Config file? Any other location?
I would generally put the connection string in the app.config. The .NET framework also offers facilities to encrypt the connection string.
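For illustration, a minimal sketch of reading that connection string, assuming a connection string named "LocalDb" exists in the service's app.config:

```csharp
using System.Configuration;     // reference System.Configuration.dll
using System.Data.SqlClient;

class Db
{
    public static SqlConnection Open()
    {
        // Reads <connectionStrings><add name="LocalDb" ... /> from app.config.
        string cs = ConfigurationManager.ConnectionStrings["LocalDb"].ConnectionString;
        var connection = new SqlConnection(cs);
        connection.Open();
        return connection;
    }
}
```

The same configuration section can later be encrypted with the protected-configuration support in the framework, so the service code does not change.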
If one service is allotted to do multiple tasks, e.g. like a scheduler service, what should be the best/optimal way with respect to performance for managing the scheduled tasks
a. Build .NET class library assemblies for the scheduled tasks and spin them up on separate threads
b. Build .NET executable assemblies for the scheduled tasks and fire the executables
Option b. is easier to design and implement, and it gives you the possibility to use the Windows Task Scheduler. In this case you will need to think about what happens when the scheduler starts the executable while the previous run has not finished yet; this results in two processes that may do the same work. If that is not a problem, then stay with this design. If it is a problem, consider option a.
For option a., have a look at Quartz.NET, which offers a lot of advanced scheduling capabilities. Also consider using application domains instead of threads to make the service more robust.
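As a hedged example of option a. with Quartz.NET (this assumes the 2.x fluent API; the job and schedule are placeholders):

```csharp
using Quartz;
using Quartz.Impl;

// One scheduled task; the body is a placeholder for the real work.
public class BackupJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // run the nightly database backup here
    }
}

public static class BackupScheduler
{
    public static void Start()
    {
        IScheduler scheduler = StdSchedulerFactory.GetDefaultScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<BackupJob>()
            .WithIdentity("nightlyBackup")
            .Build();

        // Cron expression: every day at 02:00.
        ITrigger trigger = TriggerBuilder.Create()
            .WithIdentity("nightlyBackupTrigger")
            .WithCronSchedule("0 0 2 * * ?")
            .Build();

        scheduler.ScheduleJob(job, trigger);
    }
}
```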
If you don't get admin rights on the local server, think about a means to restart the service without the Service Control Manager. Give some privileged user the possibility to re-initialize the service from a client machine.
Also think about ways to restart just one part of a service if one service is doing multiple tasks. For instance, the service is behaving strangely because the update task is misbehaving. If you need to restart the whole service to fix this, all users may become aware of it. Provide some means to re-initialize only the update task.
Most important: don't follow any of my advice if you find an easier way to achieve your goals. Start with a simple design. Don't over-engineer! Solutions with (multiple) services and scheduling tend to explode in complexity with each added feature, especially when you need to let the services talk to each other.
I don't think there is one answer to this; some would probably use just one service, while others would modularize every domain into its own service and add enterprise transaction services, etc. The question is more of an SOA one than a C# one, and you might consider reading up on some SOA books to find your pattern.
This does not answer your questions specifically, but one thing to consider is that you can run multiple services in the same process. The ServiceBase.Run method accepts an array of ServiceBase instances to start. They are technically different services and appear as such in the Service Control Manager, but they are executed inside the same process. Again, just something to consider in the broader context of your questions.
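A minimal sketch of that approach; the two service classes are placeholders for whatever tasks you end up grouping:

```csharp
using System.ServiceProcess;

class SyncService : ServiceBase
{
    public SyncService() { ServiceName = "LocalDataSync"; }
    protected override void OnStart(string[] args) { /* start the sync timer */ }
    protected override void OnStop() { /* stop the sync timer */ }
}

class BackupService : ServiceBase
{
    public BackupService() { ServiceName = "LocalDbBackup"; }
    protected override void OnStart(string[] args) { /* start the backup scheduler */ }
    protected override void OnStop() { /* stop the backup scheduler */ }
}

static class Program
{
    static void Main()
    {
        // Both services appear separately in the SCM but share this single process.
        ServiceBase.Run(new ServiceBase[] { new SyncService(), new BackupService() });
    }
}
```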
I am going to be coding up a Windows service to add users to a computer (users with no rights, i.e. just for authentication). (As a side note, I plan to use this method.)
I want to be able to call this windows service from another computer.
How is this done? Is this a tall order? Would I be better off just creating a Web Service and hosting it in IIS?
I have some WCF services hosted in IIS on the calling computer (they will do the calling to the proposed Windows service). I have found that hosting in IIS is somewhat problematic, so I would rather not have a second IIS instance to manage unless I need to.
(I will be using Visual Studio 2008 SP1, C#, and Windows Server 2003 for both caller and service host.)
Thanks for the help
If you are thinking of hosting a web service in IIS just to communicate with an NT-service on that same machine, that is definitely more trouble than it is worth in this case.
As other answers have indicated you can make a WCF service with the operations you need and host that within the same NT-service that you want to interact with. You can easily secure this with certificates, or user accounts to make sure it is only controlled by the right people/machines.
If you need to control the NT-service itself, there are existing programs such as sc.exe to start, stop, configure, or query the status of your NT-service remotely.
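If you would rather do the same thing from C# than shell out to sc.exe, the ServiceController class wraps the same SCM calls; a hedged sketch, with the service and machine names as placeholders:

```csharp
using System;
using System.ServiceProcess;   // reference System.ServiceProcess.dll

class RemoteServiceControl
{
    static void Main()
    {
        // Second argument is the remote machine name; both names are placeholders.
        using (var sc = new ServiceController("UserProvisioningService", "TARGET-PC"))
        {
            Console.WriteLine("Current status: {0}", sc.Status);
            if (sc.Status == ServiceControllerStatus.Stopped)
            {
                sc.Start();
                sc.WaitForStatus(ServiceControllerStatus.Running, TimeSpan.FromSeconds(30));
            }
        }
    }
}
```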
However, you may want to consider seeking a solution without the overhead of creating a custom NT-service and a custom WCF service to interact with it. If you do, the Net User commands (sorry, no link - new user limitation) or the AddUsers utility (see KB 199878/en-us) may be sufficient. If your remote "controller" can invoke these commands directly against the target machine, you may not have to create any custom software to address this need. Additionally, you would have less software to maintain and administer on the target machine; you would just be using the built-in OS capabilities and admin utilities.
Finally, you will need to think about the security aspect: NT-services and IIS are usually run under very restricted accounts, and many auditors would flip out over any service running with sufficient permission to create or modify users locally, especially on other machines. You'll want to make sure that the service could never be used to create users that have more than the "authenticate" permission you indicated.
Edit: The net user command may not work against another machine's local users, but check out pspasswd; that, along with PsExec to create users, should do what you need remotely.
Simply host a WCF service in the Windows Service. You'll then be able to call it remotely.
You can host a WCF service inside a Windows service. Take a look at the TCP binding (NetTcpBinding class). Both client and server will have to use WCF, but that doesn't sound like it will be an issue with your implementation.
Also, the section entitled "Hosting in Windows Services" in this MSDN article provides a walk-through of the process.
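A minimal sketch of that walk-through's idea, with placeholder contract and service names; the WCF host is opened in OnStart and closed in OnStop:

```csharp
using System;
using System.ServiceModel;
using System.ServiceProcess;

[ServiceContract]
public interface IUserProvisioning
{
    [OperationContract]
    void CreateUser(string userName, string password);
}

public class UserProvisioningService : IUserProvisioning
{
    public void CreateUser(string userName, string password)
    {
        // create the local account here (e.g. via System.DirectoryServices)
    }
}

public class UserProvisioningWindowsService : ServiceBase
{
    private ServiceHost host;

    protected override void OnStart(string[] args)
    {
        // Open the WCF endpoint when the Windows service starts.
        host = new ServiceHost(typeof(UserProvisioningService),
            new Uri("net.tcp://localhost:9100/users"));
        host.AddServiceEndpoint(typeof(IUserProvisioning), new NetTcpBinding(), "");
        host.Open();
    }

    protected override void OnStop()
    {
        // Close the WCF endpoint when the Windows service stops.
        if (host != null)
        {
            host.Close();
            host = null;
        }
    }
}
```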
If the windows service publishes a remoting interface then it can be accessed via that remoting interface.
Otherwise it's the same as accessing any other process running on a remote machine except that there may be some tools (e.g., sc) with built in support for executing against a remote machine (barring firewall complications).
Any IPC mechanism applies: sockets, web services, remoting, etc.
You could expose a WCF service directly from your Windows service. When you start up your Windows service, in addition to spinning up any other background processes, you could create an instance of ServiceHost for your service implementation. This would allow you not only to provide WCF access, but also to avoid the extra IIS instance as you requested, and to provide TCP, named pipes, and WsHttp endpoints. This should give you some nice flexibility in the performance-tuning arena, since it sounds like this service will be consumed internally on your network rather than externally.
You could create a WCF service which will talk to your Windows service on the remote box. Host the WCF component in IIS (or however you'd like so that you can communicate with it) and then call the WCF component from your remote machine.