I want to send Trace output to Azure Blob storage, but the application can't be deployed to Azure; it must run as a standalone application. Is there a relatively easy way to do this? Everything I've seen covers logging in Azure when you're running as some sort of deployed Azure role. Ideally, I want a Trace.Listeners.Add call and/or something in app.config that solves this.
You can certainly create a trace listener that writes to Windows Azure Storage and use it from any application, as long as the application can reach the REST endpoints of the Windows Azure Storage service. In fact, Steve Marx posted something on his blog that could give you a head start.
For something like a trace writer, I would recommend writing the messages to Table storage (as in Steve's post) rather than blob storage, unless you want to batch up a lot of messages locally and then write them all as a single file to blob storage.
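For a rough idea of the shape, here's a minimal sketch of such a listener, assuming the Azure.Data.Tables package; the class name and partition/row key scheme are illustrative choices, not taken from Steve's post:

```csharp
using System;
using System.Diagnostics;
using Azure.Data.Tables;

// Hypothetical listener: each trace message becomes one table row.
public class TableStorageTraceListener : TraceListener
{
    private readonly TableClient _table;

    public TableStorageTraceListener(string connectionString, string tableName)
    {
        _table = new TableClient(connectionString, tableName);
        _table.CreateIfNotExists();
    }

    public override void Write(string message) => WriteLine(message);

    public override void WriteLine(string message)
    {
        // One partition per day keeps date-range queries cheap; adjust to taste.
        var entity = new TableEntity(
            DateTime.UtcNow.ToString("yyyyMMdd"), Guid.NewGuid().ToString())
        {
            { "Message", message }
        };
        _table.AddEntity(entity);
    }
}
```

You'd then hook it up exactly as you describe:

```csharp
Trace.Listeners.Add(new TableStorageTraceListener("<connection string>", "TraceLog"));
```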
Is there a way to write C# code in an Azure Function running in the Azure cloud to move files to and from an on-premises SharePoint instance? I have heard that to do this I need some sort of data gateway or connector in place.
Any guidance much appreciated, thanks.
Yes, it's possible, but there are many factors to take into account regarding your environment and how it is configured from a network perspective.
You are able to run Azure Functions within your on-premises environment. You can see some of the hybrid scenarios here:
https://learn.microsoft.com/en-us/azure/architecture/hybrid/azure-functions-hybrid#running-azure-functions-on-premises
There's also App Service Hybrid Connections which allow you to reach on-premises infrastructure:
https://learn.microsoft.com/en-us/azure/app-service/app-service-hybrid-connections
There's also a devblogs post that walks through a scenario:
https://devblogs.microsoft.com/premier-developer/using-azure-app-services-with-hybrid-connections/
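To make the shape of this concrete, here's a minimal sketch of a timer-triggered Function reaching an on-premises endpoint once a Hybrid Connection is configured for it; the hostname, schedule, and SharePoint URL are all illustrative. Note also that Hybrid Connections aren't available on the Consumption plan, so you'd need an App Service or Premium plan:

```csharp
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class MoveFilesFunction
{
    private static readonly HttpClient Http = new HttpClient();

    [FunctionName("MoveFiles")]
    public static async Task Run(
        [TimerTrigger("0 */15 * * * *")] TimerInfo timer, ILogger log)
    {
        // With a Hybrid Connection in place, the on-premises host resolves
        // as if the Function App sat on the corporate network.
        var response = await Http.GetAsync(
            "http://sharepoint.corp.local/sites/docs/_api/web/lists");
        log.LogInformation("SharePoint responded: {Status}", response.StatusCode);
    }
}
```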
Please suggest the best way to copy data from Azure File storage to local machines every 2 hours (periodically). Can we write a C# exe to do that and deploy it on the PC?
Write a desktop application in any language that has SDK support for Azure File storage. Within that application, create a timer that performs the download through the API.
If there are configurable settings or user interactions needed, I'd say go for a desktop application.
Otherwise, and if your clients are Windows PCs, the best approach would be to write a Windows service that does the job.
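For illustration, here's a minimal sketch of such a service's download loop, assuming the Azure.Storage.Files.Shares package on .NET 6+; the share name and local path are placeholders:

```csharp
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using Azure.Storage.Files.Shares;

class Program
{
    static async Task Main()
    {
        var share = new ShareClient("<connection string>", "myshare");
        using var timer = new PeriodicTimer(TimeSpan.FromHours(2));

        do
        {
            var dir = share.GetRootDirectoryClient();
            await foreach (var item in dir.GetFilesAndDirectoriesAsync())
            {
                if (item.IsDirectory) continue; // flat copy; recurse if you need subfolders

                var file = dir.GetFileClient(item.Name);
                var download = await file.DownloadAsync();
                using var local = File.Create(Path.Combine(@"C:\Downloads", item.Name));
                await download.Value.Content.CopyToAsync(local);
            }
        } while (await timer.WaitForNextTickAsync()); // fires every 2 hours
    }
}
```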
You could perhaps use Azure Logic Apps: if you set up a job to run periodically and copy files from File storage to OneDrive, for example, then OneDrive would replicate your files onto an on-premises server.
To start off with, I am aware of this question, which seems to ask the same thing. However, I'm going to ask it again with a slight modification.
I'm working on a project to print PDFs. We have a library for PDF manipulation. We can use it to render a PDF to a file that the System.Drawing.Printing.PrintDocument object can use and print in C#. We are going to have an Azure worker role that takes many one-page PDFs and turns them into one large PDF, and I would like another Azure worker role that then spools that large PDF to a Windows print server here locally.
Since the printing part is so much slower than the PDF creation/aggregation piece, I would like to be able to host it in Azure to scale easily.
My initial thought was, "I don't think this is even possible. How would Azure know anything about my local print server?" That is the basic answer from the similar question above. But after some searching, I found results suggesting that a site-to-site VPN tunnel or an ExpressRoute connection would let me do what I want. However, I'm relatively new to Azure, and the results I found are short on actual, useful detail.
If it can't be done, fine, I can set up an application server locally to do the printing. But if someone has ideas or more insight on how to accomplish this, I would love to hear it.
Basically, you could store the PDFs in Azure Blob storage, e.g.:
http://azureaccount.blob.core.windows.net/PDF/pdf1.pdf
Then you define an Azure queue message for each one, like:
MyQueue(Pdf_Id, Pdf_Blob_Url)
MyQueue(1, http://azureaccount.blob.core.windows.net/PDF/pdf1.pdf)
MyQueue(2, http://azureaccount.blob.core.windows.net/PDF/pdf2.pdf)
and submit them to the Azure queue.
Then, on your printing server, just set up an application that polls the Azure queue to process the PDFs. At that point, it can fetch the PDFs directly from the blob storage URLs and do whatever is needed: merging, printing, and so on.
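Here's a minimal sketch of that polling application, assuming the Azure.Storage.Queues and Azure.Storage.Blobs packages, and assuming the blob URLs in the messages carry a SAS token (or the container allows anonymous reads); the queue name and paths are placeholders:

```csharp
using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Queues;

class PrintWorker
{
    static async Task Main()
    {
        var queue = new QueueClient("<connection string>", "pdf-print-jobs");

        while (true)
        {
            var message = await queue.ReceiveMessageAsync();
            if (message.Value == null)
            {
                await Task.Delay(TimeSpan.FromSeconds(10)); // queue empty; back off
                continue;
            }

            // The message body carries the blob URL the worker role enqueued.
            var blob = new BlobClient(new Uri(message.Value.Body.ToString()));
            await blob.DownloadToAsync(@"C:\PrintSpool\job.pdf");

            // ... render and print the PDF here, then remove the message ...
            await queue.DeleteMessageAsync(
                message.Value.MessageId, message.Value.PopReceipt);
        }
    }
}
```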
Without getting into the VPN / site-to-site setups, here is an idea:
You could have a small application hosted on your network that uses Service Bus Relay to expose a WCF service (this allows incoming connections to the service from the role).
The worker role can consume this service and send it the PDF for printing.
Your app would then send the PDF to the printer via the PrintDocument object you mentioned.
See:
https://azure.microsoft.com/en-gb/documentation/articles/service-bus-dotnet-how-to-use-relay/
What is the Service Bus relay? The Service Bus relay service enables you to build hybrid applications that run in both an Azure datacenter and your own on-premises enterprise environment. The Service Bus relay facilitates this by enabling you to securely expose Windows Communication Foundation (WCF) services that reside within a corporate enterprise network to the public cloud, without having to open a firewall connection or require intrusive changes to a corporate network infrastructure.
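For flavor, here's a rough sketch of the on-premises side using the classic Microsoft.ServiceBus (WCF Relay) package; the namespace, key name, and IPrintService contract are all illustrative:

```csharp
using System;
using System.ServiceModel;
using Microsoft.ServiceBus;

[ServiceContract]
public interface IPrintService
{
    [OperationContract]
    void PrintPdf(byte[] pdfBytes);
}

public class PrintService : IPrintService
{
    public void PrintPdf(byte[] pdfBytes)
    {
        // Save pdfBytes locally, then print via your PrintDocument-based code.
    }
}

class Program
{
    static void Main()
    {
        // The relay address the worker role will call into.
        var address = ServiceBusEnvironment.CreateServiceUri("sb", "mynamespace", "print");

        var host = new ServiceHost(typeof(PrintService));
        var endpoint = host.AddServiceEndpoint(
            typeof(IPrintService), new NetTcpRelayBinding(), address);
        endpoint.EndpointBehaviors.Add(new TransportClientEndpointBehavior(
            TokenProvider.CreateSharedAccessSignatureTokenProvider(
                "RootManageSharedAccessKey", "<key>")));

        host.Open(); // outbound connection only; no inbound firewall holes needed
        Console.WriteLine("Listening on {0}; press Enter to exit.", address);
        Console.ReadLine();
        host.Close();
    }
}
```

The worker role would then use a matching ChannelFactory<IPrintService> with the same binding and address to send PDFs over.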
I am using the MongoDB worker role project to run MongoDB on Azure. I have two separate cloud services; in one of them everything works fine, but in the other, the MongoDB worker role is stuck in a Busy (Waiting for role to start... Calling OnRoleStart) state.
I connected to one of the MongoDB worker roles and accessed the MongoDB log file and found the following error:
[rsStart] replSet can't get local.system.replset config from self or any seed (EMPTYCONFIG)
There are threads on how to fix this normally, but not with Windows Azure. I did not configure anything for the MongoDB worker role (apart from Azure storage connection strings), and it works in another service, so I don't know why it isn't working for this service. Any idea?
Some time ago I was trying to host RavenDB in Azure as a worker role and had lots of issues with it as well.
Today, I believe it's better to run the database the "suggested" way for the target platform, which is as a Windows service according to this "Install MongoDB on Windows" guide. That way you won't have to deal with Azure-specific issues. To achieve this you can:
Use the Azure cmdlets along with CsPack.exe to create the package for MongoDB.
Use a solution similar to the RavenDB master-slave reads on Azure that I posted on GitHub.
Sign up for Virtual Machines (beta) on Azure, spin up a machine, and install MongoDB there manually.
But I guess the most important question when hosting a DB is: where do you plan to store the actual data?
Azure's CloudDrive, a VHD stored in cloud storage, has the worst I/O performance of the three; not sufficient for normal DB usage, I'd say.
Ephemeral storage, the cloud service's local disk, has excellent I/O, but you lose all data once the VM is deleted. This means you usually want to make continuous, or at least regular, backups to cloud storage, perhaps through CloudDrive.
An Azure VM attached disk has better I/O than CloudDrive, but still not as good as ephemeral storage.
As for actually troubleshooting your problem: I'd suggest wrapping the role's startup code in a try-catch, writing any exception to the log, enabling RDP on the box, and then connecting and looking into the actual issue in place. Another alternative is IntelliTrace, but you need VS Ultimate for that. Also, don't forget that Azure requires the use of Local Resources if your app needs to write to the disk.
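A minimal sketch of that wrapping, assuming the standard RoleEntryPoint.OnStart override from Microsoft.WindowsAzure.ServiceRuntime (the MongoDB startup call is a placeholder):

```csharp
using System;
using System.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public class MongoWorkerRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        try
        {
            // ... existing MongoDB startup logic goes here ...
            return base.OnStart();
        }
        catch (Exception ex)
        {
            // Surface the failure instead of leaving the role silently stuck.
            Trace.TraceError("OnStart failed: {0}", ex);
            throw;
        }
    }
}
```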
I have an old ASP.NET 2.0 site that I really have no interest in rebuilding or updating at this point. I'd like to move it to Windows Azure, but I'm not all that familiar with Azure, so I'm wondering if it's easily portable.
The biggest potential roadblock is that users can upload multiple photos. Upon upload, I create several copies of each image in pre-defined dimensions and store them on the local file system, using Server.MapPath("{location}") to indicate where they should be stored.
Can I have a site hosted on Azure (using their Free or Shared tier) and continue to use this method of uploading and storing files or do I have to switch to blob storage? There are only about 400MB of images.
Basically, I'm looking for a low/no-cost way to host this site on Azure that doesn't involve code changes (or at the very least only extremely minimal ones, such as changing Server.MapPath to some relative location), so that I can move my other, more current sites to Azure as well. If I can't move this site, it doesn't make sense to move the others, because I'll have to keep paying for the server for this one anyway (they're all hosted on the same server for now).
Azure drives are not guaranteed to persist between reboots of the virtual machine, so you will probably need to use blob storage. But you can mount a blob as an NTFS volume and store the images there, which would keep the code changes quite small.
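If you do decide to switch to blob storage directly, the change can also stay small. Here's a minimal sketch, assuming the Azure.Storage.Blobs package and an illustrative container name (for an older framework target like ASP.NET 2.0 you'd need the legacy storage SDK, which follows the same pattern):

```csharp
using System.IO;
using Azure.Storage.Blobs;

public static class ImageStore
{
    private static readonly BlobContainerClient Container =
        new BlobContainerClient("<connection string>", "images");

    // Before: File.WriteAllBytes(Server.MapPath("~/uploads/" + name), bytes);
    public static void Save(string name, Stream content)
    {
        Container.CreateIfNotExists();
        Container.GetBlobClient(name).Upload(content, overwrite: true);
    }
}
```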