VMware ESX Server Backup with C# .NET

Does anybody know how I can back up a VMware ESXi server with C#/.NET?
Thanks

Not sure if you mean backing up VMs or the config. Since you say the "server", I'll go with the config. Not sure how C# should enter into this, either. But according to the PDF, you can use the vicfg-cfgbackup command to back up the configuration.
Frankly, I'm working with ESX and not ESXi, but all that command-line stuff is generally how things get done. I've written automation, including some utility apps in C#, to execute SSH commands against the ESX host, and found it to be a successful approach. Most of what I've done has been more around backing up and replicating VMs. This is generally a process of archiving the files to another drive (generally mounted as /backup), then push-copying to a centralized backup location on the network. From there, the backup can be picked up by another, larger system like NetBackup.
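For illustration, here is a minimal sketch of that SSH-from-C# approach, assuming the SSH.NET library (Renci.SshNet); the host name, credentials, and paths are placeholders, and the exact command will depend on your setup:

using System;
using Renci.SshNet;

class EsxBackup
{
    static void Main()
    {
        // Host and credentials are placeholders -- adapt to your environment.
        using (var client = new SshClient("esx-host.example.com", "root", "password"))
        {
            client.Connect();
            // Illustrative only: archive a VM directory to a /backup mount.
            var cmd = client.RunCommand(
                "tar czf /backup/myvm.tgz /vmfs/volumes/datastore1/myvm");
            Console.WriteLine(cmd.Result);
            Console.WriteLine("Exit status: " + cmd.ExitStatus);
            client.Disconnect();
        }
    }
}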

SFTP server or FTPS server?

I need to create a C# client with .NET Framework 4.6.2 to connect to a server.
My client offers me the possibility to connect to an SFTP server or to an FTPS server, but I don't know which one is best to connect with.
On this page, I have found this:
No built-in SSH/SFTP support in VCL and .NET frameworks
I need to connect to a server to upload and download files. I also need to monitor a directory on the server to know when a file arrives in that remote directory.
Searching the Internet, I'm not sure whether the .NET libraries (SSH.NET) that implement the SFTP protocol are good enough for a production environment.
I think SFTP is the best option to use, but it could be easier to implement a C# client for FTPS.
Or maybe I can use libssh2 to implement a C program that does the job: monitor a remote directory, download any new file in it, and upload the files that I need to upload.
Any advice?
If both protocols are fine for your actual needs (as suggested in your question and comments), and it's only a matter of "what's the easiest to use in .NET", I would simply go for FTPS.
It's very fast to implement, since you'll find everything you need in the framework (the FtpWebRequest class, or more recently WebClient, etc.), even on old versions of the framework.
You can find plenty of resources about this on the web or on SO.
You have mentioned that you need to "monitor" a folder on the remote server. There's no problem with FTPS retrieving the list of all files in a folder, but it will be in "pull" mode, as frequently as you wish; there's no way for the server itself to push you a notification every time a new file has been dropped. So if you need real-time notifications, it's not optimal.
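For instance, a minimal polling sketch with FtpWebRequest over explicit FTPS (host, credentials, folder, and interval are all placeholders):

using System;
using System.Collections.Generic;
using System.IO;
using System.Net;
using System.Threading;

class FtpsPoller
{
    static void Main()
    {
        var seen = new HashSet<string>();
        while (true)
        {
            var request = (FtpWebRequest)WebRequest.Create("ftp://ftp.example.com/inbox/");
            request.Method = WebRequestMethods.Ftp.ListDirectory;
            request.EnableSsl = true; // explicit FTPS
            request.Credentials = new NetworkCredential("user", "password");

            using (var response = (FtpWebResponse)request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                string name;
                while ((name = reader.ReadLine()) != null)
                    if (seen.Add(name))
                        Console.WriteLine("New file: " + name);
            }

            Thread.Sleep(TimeSpan.FromSeconds(30)); // "pull" mode: poll on a timer
        }
    }
}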

Standalone DirectoryServices.DirectoryEntry()

I'm on a standalone Windows 10 laptop and NOT running AD LDS or any other Active Directory services. I'm running IIS.
I'm trying to deeply understand what this line of code is doing and, more importantly, how.
DirectoryEntry e3 = new DirectoryEntry(@"IIS://localhost/W3SVC/1/Root");
Does the Windows OS fake some sort of resolution for this method in the absence of Active Directory?
1) First, take care to note whether you mean pre- or post-IIS 7. With IIS 7 and after, many things changed; they still look much alike, but there are important differences.
2) MAKE SURE you are at least running in administrative mode ("run as administrator") ~ or doing something better.
3) Look into .NET's DirectoryServices()/DirectoryEntry(), but also Microsoft.Web.Administration.ServerManager(). This is probably where you can do 90% of all you are attempting (see the sketch after this list).
4) There is a Windows command-line tool (%windir%\System32\inetsrv\appcmd.exe) that is wonderfully helpful ~ in fact, if it is an option for your needs/environment, you might prefer to create a cmd script for all that you are trying to do. I suggest you first learn this tool, then use it to extract a lot of the IIS/site metadata to explore what and where you are trying to get to. https://www.iis.net/configreference/system.applicationhost/applicationpools
5) PowerShell has a snap-in, certainly on servers with IIS installed, maybe on workstations. I don't use a lot of PowerShell, so the most I will say about it is that the snap-in is called WebAdministration and/or iisConsole. You may need or prefer to manually register the snap-in each time you run your script, or you might register it automatically by using the IIS PowerShell management console.
6) For any of the above options, always remember #2 ~ be certain you are at least running in administrative mode.
7) I know you certainly are playing in the land of IIS's metabase (its metadata database) ~ not the registry so much.
Local workstation: The exact mechanics when you are local to the IIS instance? I'm not sure. You might be accessing the metadata directly, you might be getting to it via the IIS service, via the Server service, or something else.
Remote server w/o LDAP: If you are querying a remote server not in an Active Directory, it is the same as a workstation.
Remote server w/ LDAP: If you are querying a server in an AD, you almost certainly are hitting the AD/LDAP service. Of course, how you do so might technically be via a surrogate such as the Server service running on that remote machine.
-- The end game: appcmd.exe, PowerShell, and C#'s Microsoft.Web.Administration and DirectoryServices()/DirectoryEntry() all probably come close to doing the same thing in the background. They are your interfaces to that background, so you don't need to think so much about the deeper implementation.
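As a small sketch of point 3, enumerating sites and application pools with Microsoft.Web.Administration might look like this on IIS 7+ (it needs a reference to Microsoft.Web.Administration.dll and, per point 2, an elevated process):

using System;
using Microsoft.Web.Administration;

class IisInventory
{
    static void Main()
    {
        using (var manager = new ServerManager())
        {
            // Walk the IIS configuration the supported (post-IIS 7) way.
            foreach (Site site in manager.Sites)
                Console.WriteLine("Site: {0} (state: {1})", site.Name, site.State);
            foreach (ApplicationPool pool in manager.ApplicationPools)
                Console.WriteLine("App pool: {0}", pool.Name);
        }
    }
}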
I hope this helps everyone!

Solution for a no-server multi-user application/database?

I am at a dead end and I could really use some help.
I intern for a huge company. My project involves creating an application to automate/simplify the work of a retiring employee.
The problem here lies in the strict company policies. I am a developer stuck at the business end of the company. Therefore, IT gives me nothing:
I don't have a server (neither web nor database).
I can't create a server, because no PC will be left running, and we can't keep them logged in due to single sign-on with company cards.
I can't install anything on the PCs in the network.
I can access a shared file server that is backed up every day.
The libraries involved have to be free.
A central database has to be accessed by a dozen users (at once).
The database will receive new data every day and will grow accordingly.
The users will both read from and write to the database.
Preferably a C#.NET or WPF solution.
The application needs to open files stored on the shared drive. (Only once; the important information will be extracted and stored in the database, and the file will then be removed.)
My initial idea was to use Silverlight (which runs standalone) in combination with SQLite. I ran a test, and Silverlight files stored on the shared drive work. (Silverlight is installed on every PC on the network.) This is my preferred front end. However (correct me if I'm wrong), when I tried sqlite-net I needed to add sqlite3.dll to my Windows/System32 folder, and on the network PCs I don't have access to the Windows folder, so this cannot be done.
I also read that SQLite databases, or files in general, can become corrupt when accessed by multiple users at once, so maybe locking is an option.
Which solutions are there to my problem?
I worked for a company for several years writing software for police departments to manage traffic collision reports. Police stations usually have little-to-no IT support, so we faced many similar limitations. The company actually did pretty well using Microsoft Access databases, with the setup looking something like this:
The shared drive had an Access database file (.mdb or .accdb) which was the actual "database".
Client computers (at the officers' desks) had Access applications with local "utility" tables for temporary storage, UI defined in forms, and logic defined in modules. Each of the client machines was connected to the repository on the shared drive using linked tables. Local client configuration was stored either in a config table in the Access application or in a text file on the machine.
It's not the cleanest solution, but it would allow you to create and maintain a unified solution using files that don't need to be installed and don't require any funny permissions, as long as everyone has read/write access to the shared drive.
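For what it's worth, a C# client (rather than Access itself) can read the same shared database file via OleDb. A minimal sketch, with a hypothetical \\fileserver\share\teamdata.accdb and Reports table; note the ACE OLE DB provider must be present on each machine, which may conflict with a strict no-install policy:

using System;
using System.Data.OleDb;

class SharedAccessDb
{
    static void Main()
    {
        // UNC path and table name are illustrative placeholders.
        var connStr = @"Provider=Microsoft.ACE.OLEDB.12.0;" +
                      @"Data Source=\\fileserver\share\teamdata.accdb;";
        using (var conn = new OleDbConnection(connStr))
        using (var cmd = new OleDbCommand("SELECT TOP 10 * FROM Reports", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
                while (reader.Read())
                    Console.WriteLine(reader[0]);
        }
    }
}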
Create a website. Today you can host ASP.NET web apps in a standalone .exe. By doing so you can make sure that the shared files are only accessed by one process. You can also limit the access to SQLite.
It also means that you do not have to distribute anything. Simply start your application and tell your users which URL and port they have to browse to.
As for permissions, only the account running your webhost requires access to shared files etc.
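A minimal sketch of that self-hosting idea, assuming the OWIN self-host packages (Microsoft.Owin.Hosting plus Microsoft.Owin.Host.HttpListener); the port is a placeholder:

using System;
using Microsoft.Owin.Hosting;
using Owin;

class Program
{
    static void Main()
    {
        using (WebApp.Start<Startup>("http://localhost:9000/"))
        {
            Console.WriteLine("Listening on http://localhost:9000/ - press Enter to quit.");
            Console.ReadLine();
        }
    }
}

class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // A single process serves all users; it alone touches the shared files.
        app.Run(context =>
        {
            context.Response.ContentType = "text/plain";
            return context.Response.WriteAsync("Hello from the self-hosted app");
        });
    }
}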
You should take a look at ScimoreDB. It's an embedded database that supports multi-process read/write access. If needed, it can also act as a client/server database, or even as a distributed database with multiple nodes.
It's free to use and deploy. It has support for C++ and .NET. The only disadvantage is that it only works on Windows.

Simulate a network share in order to share files

Often a program requires a file that happens to be in a network location. Take, for instance, Outlook: if I were to place an Outlook database (.pst file) in a network location, Windows would make that "transparent" to the user and Outlook would still work. Another example could be QuickBooks, and there are many more (as long as you have permission to read and write).
For this example let's use Microsoft Word. If I wanted to open a file on some other computer in the network, I would be able to navigate to it as:
and open the file that I want because we are on the same network.
Now my question is: how would I be able to simulate that? I want to have a virtual directory on the Internet where I can place, let's say, my .pst file and then select it from Windows Explorer as:
(this example obviously does not work)
Would it be possible to do that? I believe Windows uses a TCP connection with the host computer, and the host then responds with the files that it shares. I would like to implement a program that does that so I can avoid having to create a VPN. It would also be nice to have my .pst (Outlook database file) on the Internet so that all my computers open the same Outlook database.
Note: the purpose of this question is to open an Outlook database file in a network location. I would like to be able to select a file on the Internet from the Windows open-file dialog. Also, in today's world pretty much everything already exists; I would like to create it, lol.
Windows provides a network redirector for CIFS (Common Internet File System, formerly SMB, Server Message Block) resources. Writing a CIFS server is the easiest approach.
But you can also use one of the other existing redirectors, such as NFS, WebDAV, or NetWare. It's also possible to write new redirectors (though that requires kernel-mode code, there are development kits that provide the kernel code for you, similar to a Linux FUSE filesystem).
If you want to avoid writing code, WebDAV over HTTPS will provide you secure access (no need for a VPN layer) and software already exists.
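To illustrate how transparent a redirector is: once Windows' built-in WebDAV client (or a CIFS share) exposes the remote location, ordinary file APIs treat it like any other path. A tiny sketch with a placeholder server (\\server@SSL\path is the WebDAV redirector's UNC syntax for https://server/path):

using System;
using System.IO;

class RedirectorDemo
{
    static void Main()
    {
        // Placeholder WebDAV-over-HTTPS path handled by the Windows redirector.
        string path = @"\\files.example.com@SSL\dav\outlook\mail.pst";
        Console.WriteLine("Exists: " + File.Exists(path));
    }
}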
It depends on how the server on the Internet is set up to make its files available. Most often native Windows file sharing is not the protocol used for this; it is FTP, SFTP, HTTP, or something similar. I believe Windows Explorer uses RPC calls over a local network to accomplish this. I don't think you will be able to use the Open File dialog; you will have to write something similar that works over the protocol you need to use.

Shaky connectivity - favor web or desktop app?

I'm a desktop application developer who is temporarily working on the web. I'm working with a client that wants me to build an app for use by locations all over the state; however, these locations have very shaky connectivity.
They really want a centralized web app and are suggesting I build a "lean" web app. I don't know what a "lean web app" means: many small HTTP requests, or a few large ones? I tend to favor chunky over chatty, but I've never had to worry about connectivity before.
Do I suggest a desktop app that replicates data when connectivity exists? If not, what's the best way to approach a web app when connectivity is shaky?
EDIT:
I must qualify my question with further information. Assuming the web option, they've disallowed the use of browser runtime technologies and anything that requires installation. Thus Silverlight is out, Flash is out, Gears is out; only ASP.NET and JavaScript are available to me. Having stated this, part of my question was whether to use a desktop app; I suppose that can be extended to "thicker technologies".
EDIT #2: Network is homogeneous - every node is Windows. This won't be changing.
You should get a definition of what the client means by "lean" so that you don't have confusion surrounding it. Maybe present them with several options of lean that you think they might mean. One thing I've found is it's no good at all to guess about client requirements. Just get clarification before you waste a bunch of time.
Shaky connectivity definitely favors a desktop application. Web apps are great for users that have always-on Internet connections, and that might be using a variety of different browsers and operating systems.
Your client probably has locations that are all using Windows, so a desktop application is an appropriate choice. One other advantage of web applications is that they make the deployment issue easy to deal with. Auto-update technologies like ClickOnce make the deployment and update of desktop applications almost as easy.
And not to knock Google Gears, but it's relatively new and would have to be considered more risky than a tried-and-true desktop application.
Update: and if you're limited to just JavaScript on the client side, you definitely do not want to make this a web app. Your application simply will not be available whenever the Internet connection is down. There are ways to save stuff locally in JavaScript using cookies and user stores and whatnot, but you just don't want to do this.
If connectivity is that bad, I would suggest you write a WinForms app that downloads information, edits it locally, and then uploads it. That way, if your connection goes down, all you have to do is retry until it works.
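A sketch of that retry-until-it-works idea; the uploadChanges delegate is a stand-in for whatever actually sends the locally edited data:

using System;
using System.Net;
using System.Threading;

static class Uploader
{
    public static void UploadWithRetry(Action uploadChanges)
    {
        var delay = TimeSpan.FromSeconds(2);
        while (true)
        {
            try
            {
                uploadChanges(); // e.g. POST the locally edited data
                return;          // success: stop retrying
            }
            catch (WebException ex)
            {
                Console.WriteLine("Upload failed ({0}); retrying in {1}...", ex.Status, delay);
                Thread.Sleep(delay);
                // Back off up to a cap so a flaky link isn't hammered.
                if (delay < TimeSpan.FromMinutes(5))
                    delay = TimeSpan.FromTicks(delay.Ticks * 2);
            }
        }
    }
}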
They seem to be suggesting a plain-vanilla web app that doesn't use AJAX or rely on ASP.NET postbacks or do anything that might make it break down horribly if your connection goes away for a bit. Instead, it should be designed so that you can hit Refresh until it works. In other words, they seem to want the closest thing to a WinForms app, only uglier.
You may consider using a framework like Google Gears to help provide functionality during network down time. This allows users to connect to the web page once (with a functioning connection) and then be able to use the web app from then on, even without a connection.
When the network is restored, the framework can sync changes back with the central database.
There is even a tutorial for using Google Gears with the .NET Framework, and Gears can be used with other languages as well.
You mention that connectivity is shaky at these locations, but that the app needs to be centralized. One thing you might consider is using multiple decentralized read database servers and a single centralized write server. MySQL makes this possible and affordable if your app is small.
Have the main database server at the datacenter/central office. Put small web/DB servers at each location, with your app installed. You can even run them off a user's computer if the remote location is not too big. Make the local database servers connect to the centralized database server as replication slaves. As changes come in to the centralized database, the slave servers will pull the data down and make it available locally. When the connection is unavailable, your app's data is still at least available, if not up to date. When the connection is available, the database handles replicating all relevant data down.
Now all you have to do is make your app use two separate database handles: for reading data it uses the local database; for writing data it uses the central database.
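A minimal sketch of the two-handle pattern, assuming MySQL Connector/NET (MySql.Data); both connection strings are placeholders:

using MySql.Data.MySqlClient;

class SplitDataAccess
{
    // Reads hit the local replication slave; writes go to the central master.
    const string LocalRead    = "Server=localhost;Database=app;Uid=app;Pwd=secret;";
    const string CentralWrite = "Server=central.example.com;Database=app;Uid=app;Pwd=secret;";

    public static object ReadScalar(string sql)
    {
        using (var conn = new MySqlConnection(LocalRead))
        using (var cmd = new MySqlCommand(sql, conn))
        {
            conn.Open();
            return cmd.ExecuteScalar();
        }
    }

    public static int Write(string sql)
    {
        using (var conn = new MySqlConnection(CentralWrite))
        using (var cmd = new MySqlCommand(sql, conn))
        {
            conn.Open();
            return cmd.ExecuteNonQuery();
        }
    }
}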
