Copy files to the output directory on an Azure WebRole - C#

We have an Azure WebRole that we call as an API from other applications to convert web pages into a rendered byte array, which we then attach to emails as a PDF. We do this using ABCpdf.
For our latest project we have to use a second engine from ABC (ABCGecko) in order to render our pages correctly. The ABCGecko engine must be copied to the output directory manually after the build; it doesn't happen automatically.
For a normal application this is no issue: I simply copy the required folder (XULRunner_38, for anyone who uses ABC) into the release after building. But I can't figure out how to do this for an Azure WebRole, and there doesn't seem to be much help available from what I can see in Google searches.
I'm assuming I either have to build the role and then adjust the package before I deploy, or deploy the role and then copy the folder across afterwards. I can't figure out how to do either, though.
If anyone has any ideas, or has needed to manually copy files to an Azure WebRole in the past, I would greatly appreciate your help. I should also mention that we use Visual Studio as our IDE and publish from within there, in case that matters.

Azure web (and worker) roles let you define startup tasks, which allow you to call a script (e.g. a PowerShell script or batch file) that can then perform actions such as copying files.
Oh, and if you don't want it to attempt the copy again when a role instance reboots, you'd need to do something like leaving yourself a "breadcrumb" somewhere to signal that you've already done your init work.
What you never want to do is manually copy content to your role instances. The moment those instances are updated (e.g. a new Host OS update) or moved to another physical host, you'll lose any files you copied manually.
This is all independent of any IDE (aside from general support for the scripting language you're using), since your startup task executes on each web role instance when it starts up.
More details about startup tasks are here.
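As a rough sketch of that approach (the role name, script name, breadcrumb file, and destination path are all placeholders; the exact layout differs between web and worker roles, so adjust for your own package), the startup task is declared in ServiceDefinition.csdef:

```xml
<!-- ServiceDefinition.csdef (fragment) - names here are examples only -->
<WebRole name="MyWebRole">
  <Startup>
    <!-- taskType="simple" blocks the role start until the script exits -->
    <Task commandLine="Startup\CopyGecko.cmd" executionContext="elevated" taskType="simple" />
  </Startup>
</WebRole>
```

and the script does the copy, leaving a breadcrumb so reboots skip it:

```bat
rem CopyGecko.cmd - hypothetical startup task; paths are placeholders.
rem Skip the copy if the breadcrumb from a previous boot exists.
if exist "%RoleRoot%\gecko.breadcrumb" goto done

rem Copy the XULRunner_38 folder (packaged next to this script)
rem to wherever ABCGecko expects it relative to the deployed binaries.
xcopy /E /I /Y "%~dp0XULRunner_38" "%RoleRoot%\approot\bin\XULRunner_38"

rem Leave the breadcrumb so the next boot skips this block.
echo done > "%RoleRoot%\gecko.breadcrumb"

:done
exit /b 0
```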


Deploy - how can I efficiently deploy different exe based on parameters sent from web browser?

I have a webpage that offers an installer which adds a registry entry to the user's computer, based on the clients that the user has access to.
The installer is quite simple. It reads its app.config, gets the client key, and downloads the configuration file that is used to create the registry entry.
Here is the thing: I use ClickOnce to deploy the app. The main logic of the installer remains the same; the only thing that differs is the app.config key. If the user has 5 clients, I have to publish 5 times, since I separate the different installers by setting different publish/install URLs like below. I also have to define a different Assembly Name for each:
It's definitely not a good solution.
Is there any better ways that I can configure the installer to accept this parameter from the webpage, or other better ways to automate this process and then reduce the publish times?
Looking forward to any suggestions!
Thanks!
What I finally did was write a PowerShell script that modifies the related parameters in the .csproj and deploys them. But I failed to copy those deployed files to the remote server (AWS). If anyone has any experience with that, you're more than welcome to share your thoughts!
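For reference, a script-driven approach can avoid editing the .csproj at all if the per-client values are declared as overridable MSBuild properties. This is only a sketch, and the property name, URLs, and client key below are made up for illustration:

```xml
<!-- In the .csproj: give each per-client value a default that
     the command line can override with /p:ClientKey=... -->
<PropertyGroup>
  <ClientKey Condition="'$(ClientKey)' == ''">DefaultClient</ClientKey>
  <AssemblyName>Installer.$(ClientKey)</AssemblyName>
  <PublishUrl>https://example.com/installers/$(ClientKey)/</PublishUrl>
  <InstallUrl>https://example.com/installers/$(ClientKey)/</InstallUrl>
</PropertyGroup>
```

Each publish then becomes one call per client, e.g. `msbuild Installer.csproj /target:Publish /p:ClientKey=ClientA`, which a loop in the deployment script can drive instead of rewriting the project file.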

Debug ASP.NET CMS

We're developing a custom Content Management System in ASP.NET 4.0, using Team Foundation Server for source control. The database is hosted on a remote server, whereas debugging is done locally, so new content (aspx pages) created by each team member is stored on our local computers and unavailable to the other team members. I don't think adding those files to source control is the best approach, but the only other way I see is deploying to an external IIS for debugging.
Have you worked with this scenario before? What do you think is the best option? Thanks in advance
thus new content (aspx pages) created by each member of the team is stored in our local computers and unavailable for other team members ... I don't think adding those files to source control is the best approach
I really wonder why you think that is a bad idea... Adding created code to your source control system is the very best thing to do.
What do you think happens if a computer stops working or gets stolen? How do you get the files back? Store every file that is crucial to your system in source control.
I guess you can have a "tools" folder in the project structure where you keep all of your test pages. Then, when the project is built, that folder can be excluded from copying.
For example, when a Release build is executed the "tools" folder is excluded, while Debug builds leave it in the project.
It really depends on how you work, how many of you there are, and how often your delivery cycle runs, but generally a pile of test pages tends to grow all over the place (same as commented-out code) if there is no systematic approach and the whole project team chips in.
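One way to sketch that exclusion, assuming the project publishes through the Web Publishing Pipeline (the folder name "tools" is the example from above), is a conditional property in the web project's .csproj that strips the folder from the deployed output for Release builds only:

```xml
<!-- Keep the "tools" test pages out of Release deployments;
     Debug builds and local runs still see the folder. -->
<PropertyGroup Condition="'$(Configuration)' == 'Release'">
  <ExcludeFoldersFromDeployment>tools</ExcludeFoldersFromDeployment>
</PropertyGroup>
```

The folder stays under source control either way; it just never reaches the published site.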

Taking down multiple Azure Website Instances simultaneously

Quote from the Azure Web Jobs Documentation:
Persisted files
This is what you can view as your web site's files. They follow a structure described here. They are rooted in d:\home, which can also be found using the %HOME% environment variable.
These files are persistent, meaning that you can rely on them staying there until you do something to change them. Also, they are shared between all instances of your site (when you scale it up to multiple instances). Internally, the way this works is that they are stored in Azure Storage instead of living on the local file system.
Does that imply that dropping app_offline.htm into the site root folder should pretty much bring down all instances simultaneously?
Yes.
It doesn't bring them down exactly; it just redirects all traffic to that .htm file.
And it's easy to try for example using Visual Studio Online editor:
http://dotnet.dzone.com/articles/first-look-visual-studio
Or the DebugConsole:
http://azure.microsoft.com/blog/2014/03/04/windows-azure-websites-online-tools-you-should-know-about/
Just add the file to wwwroot and browse to your site.

How to load a picture into an <asp:Image> using a Windows path

I have a little payments web app; our customers can install it on their IIS and work with it. They can upload their own logotype.
We use wyBuild to update these apps, but it replaces all files in the web folder with the new version, so the logotypes are deleted. That's why we placed the customer's files in Program Files, where the updater can't delete them.
The problem is that I can't load the images from the following path:
C:\Program Files\MyApp\ImageFoder\logo.jpg
I don't know how to do it, and I'm almost sure it is not possible.
My web application is in
C:\inetpub\wwwroot\MyApp\
I can't keep the images in the web folder because wyBuild deletes them when I try to update. I have already tried paths like this (they don't work):
///file:c:/program files/ .... etc
So, the question is: how can I load an image into an asp:Image control using its Windows path?
You need to configure an IIS virtual directory pointing to the alternate location where the images are stored.
I wouldn't put them in Program Files, though; a sibling folder under wwwroot would be better.
Remember that NTFS permissions are easy to mess up, and it's easier to manage them in a single place.
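As a sketch of that setup (the site name, virtual directory name, and physical folder below are assumptions; substitute your own), a virtual directory can be created from an elevated command prompt with appcmd:

```bat
rem Map /CustomerImages in the site to a folder outside the web root
%windir%\system32\inetsrv\appcmd.exe add vdir /app.name:"Default Web Site/" /path:/CustomerImages /physicalPath:"C:\inetpub\CustomerData\Images"
```

The image is then referenced by its virtual path rather than a Windows path, e.g. `<asp:Image ID="Logo" runat="server" ImageUrl="~/CustomerImages/logo.jpg" />`, and the folder survives wyBuild updates because it is no longer inside the app's web folder.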
Update, for locally installed, localhost-only sites: alternatively (and this is only a good idea if you have minimal amounts of traffic, NOT for public websites), you can serve files from an arbitrary location using a VirtualPathProvider. It sounds like this 'web app' is installed like a desktop app for some reason? If you want to store user data externally, the user's App Data folder would be appropriate, but ONLY if the web app refuses external connections and can only be accessed from the machine.
Since you're dealing with images, I'd grab the imageresizing.net library and use the VirtualFolder plugin to serve the files dynamically. It's 200KB more in your project, but you get free dynamic image resizing and/or processing if you need it, and you save a few days making a VirtualPathProvider subclass work (they're a nightmare).
Wouldn't it be better to use isolated storage?
Added: I mean on the user's machine, uploading them again if they are not found. This takes away your overhead completely.

Using ClickOnce when i need to generate several large files?

I'm building a digital signage application and I want to deploy it using ClickOnce. (I feel this is the best approach.) When I start the application from Visual Studio (VS) it works great. The application downloads a lot of images from my web service and saves them to disk:
string saveDir = new FileInfo(Assembly.GetExecutingAssembly().Location).Directory.FullName;
When I start my deployed application, it shows the splash screen and then disappears. The process keeps running, but the UI doesn't display. I'm wondering if my saveDir as shown above is giving me trouble?
How do I locate my installed application? (I need to make license files, etc.)
I'm not sure if this is the root of your problem, but I highly recommend you change the structure of how you store your application information.
When an application is installed through ClickOnce, it is installed within the user's folder, and the path is considerably obfuscated. Furthermore, locations may change with subsequent application updates, so you cannot guarantee that any cached, downloaded file will still exist from update to update.
To solve this problem, ClickOnce provides a Data directory that is not obfuscated and can be used for caching local data. The only caveat is that this directory is not available to non-ClickOnce instances of your application (like the version that runs within the VS debugger).
To get around this, you can write a function that returns your data directory regardless of the method of distribution or execution. The following is a sample of what that function could look like:
// This reference is necessary to discover information about
// the current instance of a deployed application.
using System.Deployment.Application;

// Method to obtain your application's data directory
public static string GetAppDataDirectory()
{
    // The static IsNetworkDeployed property lets you know if
    // the application was deployed via ClickOnce.
    if (ApplicationDeployment.IsNetworkDeployed)
    {
        // For a ClickOnce install, return the deployed app's data
        // directory. (This lives within the user's folder, but differs
        // between versions of Windows.)
        return ApplicationDeployment.CurrentDeployment.DataDirectory;
    }

    // Otherwise, return another location.
    // (Application.StartupPath works well for debugging.)
    return Application.StartupPath;
}
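The saveDir line from the question would then go through this helper instead of the assembly location. Hypothetical usage (the license-file name is just an example):

```csharp
using System.IO;   // for Path and File

// Resolve writable paths through the helper rather than the
// executing assembly's location, which ClickOnce obfuscates.
string saveDir = GetAppDataDirectory();

// e.g. write a license file alongside the cached images
string licensePath = Path.Combine(saveDir, "license.lic");
File.WriteAllText(licensePath, "example license contents");
```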
