What is the possibility of making an ASP.NET web application update itself? - C#

I have a web application that I would like to check for updates, then download and install them.
I know there are already some updater frameworks that work for Windows applications, but is it possible for web applications?
The first things that came to my mind when thinking about this are:
File permissions (I might not be able to replace all my application files due to file permissions)
Also, touching the web.config or the bin folder will cause the application to restart.
I also thought about executing an exe from my web application that does the job, but I don't know whether it could get shut down by a restart of the web application.
I would appreciate any ideas or solutions for this case.
Thanks

Take a look at WebDeploy
It is meant to ease exactly such tasks, where you want to push a publish out to a production server.
Rather than having your server check for updates and update itself, it would be simpler to just push updates to it when you have them.
Web Deploy allows you to efficiently synchronize sites, applications
or servers across your IIS 7.0 server farm by detecting differences
between the source and destination content and transferring only those
changes which need synchronization. The tool simplifies the
synchronization process by automatically determining the
configuration, content and certificates to be synchronized for a
specific site. In addition to the default behavior, you still have the
option to specify additional providers for the synchronization,
including databases, COM objects, GAC assemblies and registry
settings.
Administrative privileges are not required in order to deploy Web
applications.
Server administrators have granular control over the operations that can be performed and can delegate tasks to non-administrators.
This requires you to be running IIS 7, though.
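If you would rather drive the synchronization from code than run msdeploy.exe by hand, Web Deploy also exposes a managed API (Microsoft.Web.Deployment). A minimal sketch, assuming the content lives in C:\inetpub\MyApp on both machines and the destination server is named WebB (all placeholder names):

    // Requires a reference to Microsoft.Web.Deployment.dll (installed with Web Deploy).
    using Microsoft.Web.Deployment;

    class SyncToFarm
    {
        static void Main()
        {
            var sourceOptions = new DeploymentBaseOptions();
            var destOptions = new DeploymentBaseOptions { ComputerName = "WebB" }; // placeholder destination server

            // Create a deployment object for the local content and push only the differences across.
            using (var source = DeploymentManager.CreateObject(
                DeploymentWellKnownProvider.ContentPath, @"C:\inetpub\MyApp", sourceOptions))
            {
                source.SyncTo(
                    DeploymentWellKnownProvider.ContentPath, @"C:\inetpub\MyApp",
                    destOptions, new DeploymentSyncOptions());
            }
        }
    }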

Related

File system issue in Azure app service - Unable to create Directories

We have a dotnet core application hosted in Azure app service (Windows machine) in our production environment. It consists of two components -
Email Service
Business Rules Engine
The Email Service first downloads all emails to an Attachments folder in the same directory where the application is hosted (D:\home\wwwroot\). For each email, a separate directory (named with a GUID value) is created under the Attachments directory.
The Business Rules Engine accesses that folder and uses the email and its attachments. Once done, we clear out all contents from the Attachments directory.
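To make the flow concrete, the directory handling is roughly the following (a simplified sketch; attachmentsRoot is just our name for the Attachments folder described above):

    using System;
    using System.IO;

    // attachmentsRoot stands in for the Attachments folder under D:\home\wwwroot\
    string attachmentsRoot = Path.Combine(AppContext.BaseDirectory, "Attachments");

    string emailDir = Path.Combine(attachmentsRoot, Guid.NewGuid().ToString()); // one directory per email
    Directory.CreateDirectory(emailDir);

    // ... the Email Service saves the message and its attachments into emailDir,
    //     and the Business Rules Engine reads them back ...

    // once processing is finished, clear out everything under Attachments
    foreach (string dir in Directory.GetDirectories(attachmentsRoot))
        Directory.Delete(dir, recursive: true);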
The problem we're seeing is that after a certain number of emails are processed, all of a sudden our application is unable to create directories under the Attachments folder. The statement
Directory.CreateDirectory({path})
throws an error saying the specified path could not be found.
The only way we've been able to resolve this is to restart the app service, after which it again happily goes on its way creating directories and processing emails until it fails again in a day or so 8-|
What we've tried -
Ours was a multithreaded app, so assuming that maybe one thread was holding a lock on the filesystem due to incorrect or incomplete disposal of resources, we changed it to single-threaded processing
Where the directories were being created we used DirectoryInfo, so we tried calling DirectoryInfo.Refresh() after every directory deletion, creation, etc.
Wherever FileStream was being used, we added explicit .Dispose() statements to dispose of the FileStream
Called GC.Collect() at the end of each run of our service
I suspect this issue is due to the Azure environment, but we've not been able to identify what is causing it. Has anybody had any such issues, and if so, how was it resolved?
I made some changes to my code based on what I read in the links below, which give a good summary of the storage system in Azure App Service -
https://www.thebestcsharpprogrammerintheworld.com/2017/12/13/how-to-manually-create-a-directory-on-your-azure-app-service/
https://github.com/projectkudu/kudu/wiki/Understanding-the-Azure-App-Service-file-system
The D:\local directory points to a folder that is accessible only to that instance of the service, unlike what I was using earlier, D:\home, which is shared among instances.
So I changed the code to resolve the %TEMP% environment variable, which resolves to D:\local\Temp, and then used that location to store the downloaded emails.
So far multiple testing runs have been executed without any exceptions related to the file system.
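In code, the fix boiled down to building the attachments root from the instance-local temp folder instead of the shared wwwroot. A sketch of the change (folder names are illustrative):

    using System;
    using System.IO;

    // Inside the App Service sandbox, %TEMP% resolves to D:\local\Temp,
    // which is private to the current instance (unlike D:\home).
    string tempRoot = Environment.GetEnvironmentVariable("TEMP") ?? Path.GetTempPath();
    string attachmentsRoot = Path.Combine(tempRoot, "Attachments");
    Directory.CreateDirectory(attachmentsRoot);

    string emailDir = Path.Combine(attachmentsRoot, Guid.NewGuid().ToString());
    Directory.CreateDirectory(emailDir);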
Yes, based on your issue description, it does look to be a sandbox restriction. To provide more context on this: the standard/native Azure Web Apps run in a secure environment called a sandbox. Each app runs inside its own sandbox, isolating its execution from other instances on the same machine as well as providing an additional degree of security and privacy which would otherwise not be available.
Azure App Service provides pre-defined application stacks on Windows like ASP.NET or Node.js, running on IIS. The preconfigured Windows environment locks down the operating system from administrative access, software installations, changes to the global assembly cache, and so on (see Operating system functionality on Azure App Service). If your application requires more access than the preconfigured environment allows, you can deploy a custom Windows container instead.
Symbolic link creation: While sandboxed applications can follow/open existing symbolic links, they cannot create symbolic links (or any other reparse point) anywhere.
Additionally, you can check whether the files have the read-only attribute. To check for this, go to the Kudu Console ({yoursite}.scm.azurewebsites.net), run attrib somefile.txt, and check if the output includes the R (read-only) attribute.

Assembly files locked by IIS and/or application provisioning

The Scenario
I'm using msdeploy to deploy files to Web Server A (let's call it WebA) and Web Farm Framework's Application Provisioning feature to synchronise to Web Server B (let's be imaginative and call it WebB).
The Problem
For just one specific WCF .NET web service, the msdeploy to WebA works okay, but the sync fails, reporting that a .NET assembly file is locked by the w3wp.exe process.
What have I tried?
Of course restarting IIS etc. will unlock it and allow the sync, but I'm struggling to work out why it's locked in the first place. I believe IIS doesn't use the deployed files directly, instead copying them to the Temporary ASP.NET Files directory and JIT-ing the svc file etc. in there, as it does with regular ASP.NET.
The Question
Where can I begin to work out why the file would be locked by w3wp.exe? I don't think it'll be the service itself because the msdeploy.exe to WebA works okay and it's only the sync to WebB that fails. Could it be the Application Provisioning "service" on WebB that's locking the file? Why might it do that?

Approaches for (re)deploying code/bin files to (multiple) Windows Azure Virtual Machines

This question may not relate specifically to Azure Virtual Machines, but I'm hoping maybe Azure provides an easier way of doing this than Amazon EC2.
I have long-running apps running on multiple Azure Virtual Machines (i.e. not Azure Web Sites or [Paas] Roles). They are simple Console apps/Windows Services. Occasionally, I will do a code refresh and need to stop these processes, update the code/binaries, then restart these processes.
In the past, I have attempted to use PSTools (psexec) to remotely do this, but it seems like such a hack. Is there a better way to remotely kill the app, refresh the deployment, and restart the app?
Ideally, there would be a "Publish Console App" equivalent from within Visual Studio that would allow me to deploy the code as if it were an Azure Web Site, but I'm guessing that's not possible.
Many thanks for any suggestions!
There are a number of "correct" ways to perform your task.
If you are running a Windows Azure application, there is a simple guide on MSDN.
But if you have to do this with a regular console app, you have a problem.
The Microsoft way is to use WMI - a good technology for any kind of management of remote Windows servers. I suppose WMI should be OK for your purposes.
And the last way: install Git on every Azure VM and write a simple server-side script, scheduled to run every 5 minutes, that updates the code from the repository, builds it, kills the old process and starts the new one. Publish your update to the repository, and that's all.
Definitely a hack, but it works even for non-Windows machines.
One common pattern is to store items, such as command-line apps, in Windows Azure Blob storage. I do this frequently (for instance: I store all MongoDB binaries in a blob, zip'd, with one zip per version number). Upon VM startup, I have a task that downloads the zip from blob to local disk, unzips it to a local folder, and starts the mongod.exe process (this applies equally well to other console apps). If you have a more complex install, you'd need to grab an MSI or other type of automated installer. Two nice things about storing these apps in blob storage:
Reduced deployment package size
No more need to redeploy entire cloud app just to change one component of it
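As a rough sketch of that startup task, using the classic Windows Azure Storage client library (connection string, container, blob and folder names below are only placeholders):

    using System.Diagnostics;
    using System.IO;
    using System.IO.Compression;
    using Microsoft.WindowsAzure.Storage;

    // Connection string, container, blob and paths are placeholders.
    var account = CloudStorageAccount.Parse("<storage connection string>");
    var blob = account.CreateCloudBlobClient()
                      .GetContainerReference("apps")
                      .GetBlockBlobReference("mongodb.zip");

    string zipPath = @"C:\apps\mongodb.zip";
    string installDir = @"C:\apps\mongodb";
    Directory.CreateDirectory(Path.GetDirectoryName(zipPath));

    blob.DownloadToFile(zipPath, FileMode.Create);          // pull the zip from blob storage
    ZipFile.ExtractToDirectory(zipPath, installDir);        // unzip to a local folder
    Process.Start(Path.Combine(installDir, "mongod.exe"));  // start the console app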
When updating the console app: You can upload a new version to blob storage. Now you have a few ways to signal the VMs to update. For example:
Modify my configuration file (maybe I have a key/value pair referring to my app name + version number). When this changes, I can handle the event in my web/worker role, allowing my code to take appropriate action. This action could be to stop the exe, grab the new one from blob storage, and restart. Or... if it's more complex than that, I could even let the VM instance simply restart itself, clearing memory/temp files/etc. and starting everything cleanly.
Send myself some type of command to update the app. I'd likely use a Service Bus topic to do this, since I can have multiple subscribers on my "software update" topic. Each instance could subscribe to it and, when an update message shows up, handle it accordingly (maybe the message contains the app name and version number, like our key/value pair in the config). I could also use a Windows Azure Storage queue for this, but then I'd probably need one queue per instance (I'm not a fan of this).
Create some type of WCF service that my role instances listen to for a command to update. Same problem as with Windows Azure Storage queues: it requires me to find a way to push the same message to every instance of my web/worker role.
These all apply well to standalone exe's (or xcopy-deployable exe's). For MSI's that require admin-level permissions, these need to run via startup script. In this case, you could have a configuration change event, which would be handled by your role instances (as described above), but you'd have the instances simply restart, allowing them to run the MSI via startup script.
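To make the Service Bus option above concrete, here is a minimal subscriber sketch using the current Azure.Messaging.ServiceBus package; the topic name, subscription name and message format are assumptions for illustration only:

    using System;
    using System.Threading.Tasks;
    using Azure.Messaging.ServiceBus;

    // Connection string, topic and subscription names are placeholders.
    await using var client = new ServiceBusClient("<service bus connection string>");
    ServiceBusProcessor processor = client.CreateProcessor("software-update", "instance-1");

    processor.ProcessMessageAsync += async args =>
    {
        // Assume the body carries "appName|version", mirroring the key/value pair idea above.
        string[] parts = args.Message.Body.ToString().Split('|');
        Console.WriteLine($"Update requested: {parts[0]} -> {parts[1]}");
        // ... stop the exe, download the new build from blob storage, restart ...
        await args.CompleteMessageAsync(args.Message);
    };
    processor.ProcessErrorAsync += args =>
    {
        Console.WriteLine(args.Exception);
        return Task.CompletedTask;
    };

    await processor.StartProcessingAsync();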
You could
1. build your sources and stash the package contents in a packaging folder
2. generate a package from the binaries in the packaging folder and upload it into Blob storage
3. use PowerShell Remoting to the host to pull down (and unpack) the package into a remote folder
4. use PowerShell Remoting to the host to run an install.ps1 from the package contents (i.e. download and configure) as desired.
The same approach can be used with Enter-PSSession -ComputerName $env:COMPUTERNAME to get a quick deploy-local-build strategy, which means you're using an identical strategy for dev, production and test, a la Continuous Delivery.
A potential optimization you can do later (if necessary) is, for a local build, to cut out steps 2 and 3, i.e. pretend you've packed, uploaded, downloaded and unpacked, and just supply the packaging folder to your install.ps1 as the remote folder, running install.ps1 interactively in a non-remoted session.
A common variation on the above theme is to use an efficient file transfer and versioning mechanism such as git (or (shudder) TFS!) to achieve the 'push somewhere at end of build' and 'pull at start of deploy' portions of the exercise (Azure Web Sites offers a built in TFS or git endpoint which makes each 'push' implicitly include a 'pull' on the far end).
If your code is xcopy deployable (and shadow copied), you could even have a full app image in git and simply do a git pull to update your site (with or without a step 4 comprised of a PowerShell Remoting execute of an install.ps1).

Can a web app access and modify the registry of Windows?

I've been writing desktop apps in C# for some time now but I'm increasingly getting frustrated with the fact that not everyone has .NET 2 or higher installed. I don't have the option of upgrading their systems to meet my needs. My apps are mostly utilities that run alongside the main program the company I work for has. They access the file system and the registry. Being relatively new to programming in general, I was wondering if moving these tools to the web would solve some of my problems. But I have no idea if web apps can have access to these parts of Windows. I was thinking of writing these web apps in either Rails or ASP.NET. So my question is this: can a web app access and modify the registry and file system of Windows?
Thanks.
Nope, "web apps" like ASP.NET or Rails apps run on the server alone and just serve HTML to the client. So all the client-side code can do is what JavaScript running in the browser sandbox can do, i.e. no file access or registry access.
You can, however, install an ActiveX control on the client computer that gets full access, but the user has to agree to install it, as it's a security risk.
Writing the apps as web apps instead (and Rails is cool to use) is a good option - your users don't need to install anything, upgrades are easy to do, and dependencies are no longer a problem.
However, you now need to start re-architecting your apps so they do not need to write anything to the client, except a cookie (that's stored in the browser). If you can do this, then migrating to a webapp will be great.
If you cannot, my advice is to learn the same language that your company's app is written in. Once you do that, the company app will have taken care of the dependencies already and you will just need to offer your utilities alongside the app, perhaps even in the installer, or just to copy the files into a subdirectory. If you're thinking of learning Ruby, then learning the corporate language will be just as difficult (only you'll be able to reuse a lot of code used in the main app)
No, a traditional ASP.NET application cannot access the file system or registry on the Windows box. Simply put, because it doesn't actually run on the client machine. Instead it runs on the server, where it does not have access to the local machine.
It is possible to have portions of the application run on the client machine - browser-hosted applications, for instance. However, these would require that the 2.0 framework be installed on the customer's machine, which puts you right back at square one.
No, this isn't possible. Web applications cannot modify the registry and/or file system on a user's machine because of the security implications. You would need to develop a Windows app to do these kind of changes. You could always make this tool available for download on your website though.
No, you can't do that with a web application. Besides what others have already said, a web application runs in a browser, not inside an operating system, so all you can do is what browsers allow you to do, not everything you want, and browsers don't allow you to take control of the host machine.
I'm guessing the desktop app used in your company uses the registry to store workstation- or user-specific (state) data.
Moving to a web-based app does not mean storing state data is no longer possible; just account for it by including a table in your database that can be used to save that same (state) data. The registry is no longer needed.
Another pro is that by moving to a fully web-based application, you never have to worry about your end users: because the code runs on the server, all the end user gets is the output in HTML :-D.
The only thing to keep in mind is cross-browser compatibility; don't create an app that works only in IE, for instance - it has to look and work the same in all major browsers.
There are a few products out there, such as Xenocode and VMWare's ThinApp, that allow you to virtualize your app's dependencies to the point where your .NET app can run on a machine without the .NET Framework installed. Just another option from left field.

Dynamically load .dlls from network share not browsable on client PC -- WCF?

I'm architecting a WPF application using the PnP Composite Application Guidance. The application will be run locally, within our intranet.
Modules will be loaded dynamically based on user roles. The modules must therefore be accessible to the application through a network share, thus accessible from the client machines.
What I'd like to do is keep all the module .dlls in a location not accessible to staff, but still be able to provide them to the composite application when demanded and when the current user is authenticated to use that module.
My thought is to load the .dlls by streaming them down from a WCF service, where the WCF service (on the server) can access the .dll repository, but none of the client machines can access it. Authentication would also be handled by the service.
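On the client side, the core of that idea would look roughly like the sketch below; IModuleService and GetModuleBytes are hypothetical names for the WCF contract, not an existing API:

    using System.Reflection;
    using System.ServiceModel;

    // Hypothetical WCF contract exposed by the server that owns the .dll repository.
    [ServiceContract]
    public interface IModuleService
    {
        [OperationContract]
        byte[] GetModuleBytes(string moduleName);   // returns the raw assembly image
    }

    public static class ModuleLoader
    {
        public static Assembly LoadModule(string moduleName)
        {
            var factory = new ChannelFactory<IModuleService>("ModuleServiceEndpoint"); // endpoint name from app.config
            IModuleService service = factory.CreateChannel();

            byte[] raw = service.GetModuleBytes(moduleName);

            // Loading from a byte array keeps the .dll off any browsable file share,
            // but note it loads the assembly without a load context (see the answer below).
            return Assembly.Load(raw);
        }
    }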
I suspect that I might be overcomplicating things somehow.
Is this something that can be done with a simple filesystem configuration and programmatically passing credentials when accessing the shared folder? If I do this, would access only be granted to the calling application, or would the logged-on user now be able to navigate to the shared folder?
Is this, in any way, a solved problem with MEF or any other project of which you're aware? (I hope this isn't LMGTFY-worthy -- I haven't been able to come up with anything.)
At Argonne National Laboratory we keep all sharable DLLs and other objects (.INI files, PowerBuilder PBD libraries, application software, etc.) on a simple and internally public file server, and objects are downloaded over the network on a per-need basis as defined by each client/server application. Thus we are minimizing the maintenance of middleware (Oracle Client, PowerBuilder, Java, Microsoft, ODBC, etc.) to a single file server location, with basically no software installed on the end-user PC. Typically we physically download less than a few KB of Registry Keys to the individual end-user PC; this includes the full Oracle Client, which if installed on the PC alone would take up 650+ MB of disk space and several thousand Registry Keys, and would be costly to maintain across the enterprise. Instead, our Oracle Client footprint on the PC is about 17 KB.
The only "software" on the client side is a set of Registry Keys containing variables pointing to server locations (e.g. ORACLE_HOME: \\<server name>\ORACLE\v10\Ora10g).
This has been a very cost effective solution we have been using for 10+ years, making all middleware and application software upgrades totally transparent to more than 2000 users Lab wide. Over the years we have done thousands of object upgrades on the central file server without ever having to install a single upgrade on the end user Desktop. Although this has some risks (“thou shall not copy DLLs over the network”, etc.) and is a heavily customized solution, it has worked flawlessly for us throughout for a large number of applications and middleware.
This is a surprisingly simple solution given today's advanced technology, but it has been totally efficient and cost-effective for us. Several vendors (Citrix and others) have looked at our solution somewhat perplexed, but every vendor of deployment techniques who has seen our deployment has come to the same conclusion, basically: "you do not need us".
When loading modules you need to keep in mind that:
Once loaded, an assembly can't be unloaded (unless you unload the entire application domain) - so if users can log in and out using the same instance, you may have a problem (see the sketch at the end of this answer).
"the load context" matters (see http://blogs.msdn.com/suzcook/archive/2003/05/29/57143.aspx) - this may cause problems if you have dependencies between modules or dependencies on assemblies that are not in the "load context"
If the restricted access to dlls is due to a licensing issue, maybe you need to refine the licensing mechanism somehow (not tie it to access to the actual code, but to some other checks)?
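If unloading matters (the first point above), the usual workaround is to load each module into its own AppDomain and unload that domain when you are done with it. A minimal sketch, with placeholder type names:

    using System;

    // The loader type must derive from MarshalByRefObject so it can be called across the domain boundary.
    public class ModuleHost : MarshalByRefObject
    {
        public void Run(string assemblyPath)
        {
            var asm = System.Reflection.Assembly.LoadFrom(assemblyPath); // loads into *this* domain
            // ... create and use the module's types here ...
        }
    }

    public static class ModuleSandbox
    {
        public static void RunIsolated(string assemblyPath)
        {
            AppDomain domain = AppDomain.CreateDomain("ModuleDomain");
            try
            {
                var host = (ModuleHost)domain.CreateInstanceAndUnwrap(
                    typeof(ModuleHost).Assembly.FullName, typeof(ModuleHost).FullName);
                host.Run(assemblyPath);
            }
            finally
            {
                AppDomain.Unload(domain);   // the only way to get the module's assemblies out of memory
            }
        }
    }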
