Extremely strange ClickOnce Web deployment behavior (caching) - c#

So I recently deployed my application via ClickOnce to a web server (WAMP, to be exact) and had VS2010 auto-generate the webpage and all that jazz. The users were able to download the application just fine.
The strangeness began when I pushed out my first update, and two different problems occurred. When users went to the website and hit Install, it always installed the first version and not the update. Also, I have a "Check for Updates" button in the app itself, and when they clicked on that it would say "No update available" (using a variation of this code).
On a hunch I had them clear their browser cache and try the "Check for Updates" button in-app again... and lo and behold it worked.
What's going on here? Is the browser caching the webpage and thus not seeing the updates? When users visit the page, its text has been updated to say it's the new version, yet they cannot install it until they clear the cache. Furthermore, is the check-for-updates code hitting the webpage too (how else would that fail as well)? Would placing a no-cache line in the auto-generated webpage's header fix this? Any suggestions/insights are welcome.

I'd look into how your Apache is set up for caching, like you said. Look into what headers it's sending out, and make sure it's serving the .application file with the correct MIME type, application/x-ms-application.
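For reference, here is a minimal Apache sketch that registers the standard ClickOnce MIME types and tells browsers and proxies not to cache the deployment manifests (it assumes mod_mime and mod_headers are enabled; adapt the extensions and scope to your layout):

```apache
# Register the standard ClickOnce MIME types (httpd.conf or .htaccess)
AddType application/x-ms-application .application
AddType application/x-ms-manifest .manifest
AddType application/octet-stream .deploy

# Don't let browsers/proxies cache the deployment manifests,
# so update checks always see the latest published version
<FilesMatch "\.(application|manifest)$">
    Header set Cache-Control "no-cache, no-store, must-revalidate"
    Header set Pragma "no-cache"
    Header set Expires "0"
</FilesMatch>
```

The large versioned payload files under the `Application Files` folder never change once published, so they can stay cacheable; only the manifests need the no-cache treatment.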
After the ClickOnce application is installed, it will always fetch the same URI (example.com/app/app.application) and compare the installed version number with the one it just downloaded. When you publish through Visual Studio, it overwrites the file at that location. So, yes, I could see it being a caching issue. It's odd to me that the ApplicationDeployment API would use the same browser cache, but who knows, maybe it uses IE internally.
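The in-app "Check for Updates" button almost certainly goes through that same manifest URI via the ApplicationDeployment API. A minimal sketch of what such a handler typically looks like (assuming a .NET Framework project with a reference to System.Deployment; the exact code in the question may differ):

```csharp
using System.Deployment.Application;

// Only valid when the app was actually launched via ClickOnce
if (ApplicationDeployment.IsNetworkDeployed)
{
    ApplicationDeployment deployment = ApplicationDeployment.CurrentDeployment;

    // This re-downloads the .application manifest from the deployment URL,
    // so a stale cached copy of that file makes it report "no update".
    UpdateCheckInfo info = deployment.CheckForDetailedUpdate();

    if (info.UpdateAvailable)
    {
        deployment.Update(); // the new version loads on the next restart
    }
}
```

If `CheckForDetailedUpdate` is reading the manifest through a cache that still holds the old copy, it would report "No update available" exactly as described, which fits the observation that clearing the browser cache fixed it.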
I have my own test ClickOnce application, written on top of MSDN's asynchronous example. There's a progress string where you can see it downloading the .application file on each run. I haven't seen the same issue you're having when hosting the deployments either on a UNC path or on AWS S3 with static web hosting enabled. That's why I think it may be something in Apache.

Related

ASP.NET website deploys but only I can see

I have a prototype .NET web site on Windows 10, created using C#. I am using IIS on the same machine to deploy from Visual Studio 2017. It uses SQL Server for back-end data. The site runs only on the intranet.
It deploys OK, and I can see all the pages from all three of my machines. But others cannot see anything; they get a runtime error that does not say anything specific.
My machine was re-imaged, hence the need to redo this.
I looked at the IIS log and it does not have any info. What else can I look into?
In web.config, turn off custom error mode to see the detailed error message on the client:
How to set web.config file to show full error message
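A minimal web.config fragment for that (remember to turn custom errors back on once you've seen the real error, since detailed errors can leak sensitive information to clients):

```xml
<configuration>
  <system.web>
    <!-- "Off" shows full error details to remote clients; debugging only -->
    <customErrors mode="Off" />
    <!-- optional: also surface compilation errors with full detail -->
    <compilation debug="true" />
  </system.web>
</configuration>
```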
If it works locally and not remotely, it's likely a permissions error (accessing a resource that the client doesn't have access to), or maybe a pathing issue (you are referring somewhere to something by disk or UNC instead of URL, or to a domain that only makes sense to you, like localhost).

IIS Does not refresh changes on web references ASP.NET

I am working on a virtual machine in Microsoft Azure, and I installed an app into IIS. However, when I replace the files I've transferred via FTP (from my local computer to a VM folder), IIS does not pick up the changes. These are the steps I've run:
Site -- Add new website
Fill up all the required fields
Start the app
But what I see in the browser is the old application; I tried modifying the code, but the changes are never displayed on the screen.
Note: it is a web service that I am trying to modify.
Note: I've tried iisreset, stopping the website, restarting it, restarting the server, and deleting and re-creating the website, but nothing works.
I am using .NET 4.5.0 in my web app (in Web.config).
Windows Server 2016.
IIS 10.
Is there something which I am doing incorrectly?
The issue was that something in the VS2015 solution does not allow the web service to refresh once new files are uploaded via FTP. What I did was install the remote debugging tools (https://learn.microsoft.com/en-us/visualstudio/debugger/remote-debugging-aspnet-on-a-remote-iis-7-5-computer) and configure Visual Studio to deploy the web service to IIS on the virtual machine. This was the link that helped a lot:
https://learn.microsoft.com/en-us/aspnet/web-forms/overview/deployment/visual-studio-web-deployment/deploying-to-iis
RECOMMENDATION
DO NOT deploy a web site or service using FTP; it is better to open the ports on the VM and run the deployment from VS properly.
Check to be absolutely sure that you are overwriting the original files. I have seen some FTP clients show a very small status window, and the flood of messages streaming by can make it easy to assume the files are transferring when, in reality, permissions issues are preventing you from overwriting them. Expand the logging window of whatever client you're using so that you can confirm your files are actually being transmitted. If they are, maybe you're dropping them in the wrong folder.

How to publish asp.net core app Dlls without having to stop the application

When I try to publish the .NET Core app DLLs using FTP via the FileZilla tool, it shows an error message saying the file is in use by another process.
That message is understandable, because the file is in use by dotnet.exe, which is a separate process.
To overcome the issue I have to stop the application in IIS each time, upload the DLLs, and then restart it. Because of this, a small amount of downtime is experienced, the ASP.NET Identity sessions expire, and an RDP session to the server is needed for every upload.
It's also not as smooth an experience as ASP.NET MVC, where we could publish the files directly without needing to RDP or do any manual action.
Any workaround or solution to overcome the above issues will be appreciated.
Got the solution.
It can be done via the app_offline.htm file (note: .htm, the filename IIS watches for).
You move app_offline.htm into the application root folder to stop the application and take it offline.
Then you can push your changes and remove app_offline.htm from the application root folder.
Note:
You can put an informative message in app_offline.htm, which will be shown to users to indicate that maintenance is in progress.
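The flow above can be sketched as a script. This simulation uses a temporary directory to stand in for the IIS site root; in a real deployment you would copy the files over FTP to the actual application folder instead:

```shell
# Stand-in for the IIS site root (a real deployment copies over FTP instead)
SITE=$(mktemp -d)

# 1. Drop app_offline.htm in the root: IIS stops the app and serves this page
echo '<html><body>Maintenance in progress...</body></html>' > "$SITE/app_offline.htm"

# 2. Push the new DLLs while nothing holds them open
touch "$SITE/MyApp.dll"   # placeholder for the real upload

# 3. Remove app_offline.htm: the app starts again on the next request
rm "$SITE/app_offline.htm"
```

While the file is present, IIS responds to every request with its contents, so users see the maintenance message instead of an error during the upload.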
Well, if you had two servers behind a load balancer, you could do (simplified):
Remove server 1 from LB
Stop app on server 1
Update server 1
Add server 1 back to the LB
Repeat 1-4 for server 2
This is called a rolling upgrade.

After code changes, .Net website won't come up without deleting cache first

I have a .Net website with C# code behind.
When I make code changes to the website (on my local machine) and copy the files (.dll and .aspx files, using FTP through Windows Explorer) to the server (hosted by GoDaddy), the site will sometimes not come up without clearing the browser cache first. It happens in IE, Firefox, and Chrome.
Does anyone know why this would happen and how to fix the issue?
(FYI - not sure if it matters, but the website has a SQL database and the site is http://www.fonyfacts.com/)
Thanks for your help!
As soon as you upload a new DLL, your website will recycle, and it can take anything from a few seconds to much longer to get going again; this is normal. This will also happen when certain other files are changed, such as web.config. As Stanislav says: build locally and only upload when you're ready to run it.

Running .net application over a network

I need to enable a .NET application to run over a network share, the problem is that this will be on clients' network shares and so the path will not be identical.
I've had a quick look at ClickOnce and the publish options in Visual Studio 2008, but it needs a specific network share location, and I'm assuming this location is stored somewhere when it does its thing.
At the moment the job is being done with an old VB6 application and so gets around all these security issues, but that application is poorly written and almost impossible to maintain so it really needs to go.
Is it possible for the domain controller to be set up to allow this specific .NET application to execute? Any other options would be welcome, as this little application is very business critical.
I ought to say that the client networks are schools, and thus are often quite locked down, as are the client machines, so manually adding exceptions to each client machine is a big no-no.
Apologies, I forgot to mention we're restricted to .NET 2.0 for the moment, we are planning to upgrade this to 4.0 but that won't happen immediately.
The deployment location in the manifest must match the location where it is deployed. You are going to HAVE to use a UNC path. There shouldn't be any problem with this. ClickOnce applications install under the user's profile, and require no administrative privileges. It only needs read access to the file share where the application is deployed.
The best answer is to create a deployment for each school, with you setting the UNC path, because then you can just send them a signed deployment and they can put it on the file share. But that's a major p.i.t.a. if there are a lot of schools involved.
The next answer is: Who actually deploys the application to each school, i.e. puts it on the file share? Is there some kind of administrator?
What I would recommend (depending on who it is) is giving them mageUI.exe and teaching them how to change the deployment URL and re-sign the manifest (it will prompt). The problem with them re-signing the deployment is they have to have a certificate. You could give each school their own certificate (created with the "create test certificate" button in VS, or use MakeCert to create one [ask if you want more info]) or give them all the same key (not very secure, but hey, it would work).
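If that administrator is comfortable with a command line, the same re-point-and-re-sign step can be scripted with mage.exe (the command-line counterpart of mageUI.exe). A sketch, where the file names, share path, and certificate are placeholders:

```
mage -Update MyApp.application -ProviderUrl "\\school-server\apps\MyApp.application"
mage -Sign MyApp.application -CertFile school.pfx -Password <password>
```

The first command rewrites the deployment provider URL in the manifest to that school's share; the second re-signs it, since any edit invalidates the existing signature.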
If you updated to at least .NET 3.5 SP1, you could deploy the application without signing it. (I'm not giving you a hard time about .NET 2.0; my company is in the same position, I'm just passing this information on.)
If the computers have internet access, you could probably find somewhere to host the deployment for $10/month, push it to a web server, and let them all install from there. Then everyone would get updates at the same time, and you would only have to deploy updates to one location. This would be the simplest solution, assuming they have internet access.
RobinDotNet
Visit my ClickOnce blog!
On the "Publish" tab of your project properties there is an "Installation Folder Url" textbox. Visual Studio requires you to put something there, so just put in any random UNC path (\\someserver\randomfolder).
Click the "Options" button, select "Manifests", and check "Exclude deployment provider URL". This will remove the path you were forced to add in the previous step.
This should allow clients to put your deployment wherever they want. When their users install, their start menu shortcut will point back to where they put the deployment.
Here's the description from MSDN about that checkbox...
Exclude deployment provider URL
Specifies whether to exclude the deployment provider URL from the deployment manifest. Starting in Visual Studio 2008 SP1, the deployment provider URL can be excluded from the manifest for scenarios in which application updates should come from a location unknown at the time of the initial publication. After the application is published, updates will be checked from wherever the application is installed.
Perhaps the link here could save you; if I am not mistaken, you are worried about the drive letter and handling UNC conventions? Take a look at this article on CodeGuru, which contains code showing how to map a UNC share dynamically at run-time.
The problem is security related to the .NET Framework. Unfortunately I don't have much experience in this area, but maybe one of these links will help:
Microsoft is aware of this problem
Hint about mscorcfg.msc
Another hint from ID Automation
Last but not least: a Google search
Can you use a UNC path?
\\server\folder\app.exe
