I have this solution that works perfectly on my development workstation using Visual Studio 2012 and .NET 4.5, with no IIS changes. It's basically C# that grabs a webpage and turns it into a BMP. I'm not sure if my problem is that I'm trying to write to the filesystem.
This is what I get in Azure, and I'm reading it might be a limitation/restriction in Azure (very disappointed if that is the case):
502 - Web server received an invalid response while acting as a gateway or proxy server.
I figure my problem is with this code:
public Bitmap GenerateThumbnail()
{
    // WebBrowser requires an STA thread, so the work runs on a dedicated one.
    Thread thread = new Thread(new ThreadStart(GenerateThumbnailInternal));
    thread.SetApartmentState(ApartmentState.STA);
    thread.Start();
    thread.Join();
    return ThumbnailImage;
}

private void GenerateThumbnailInternal()
{
    WebBrowser webBrowser = new WebBrowser();
    webBrowser.ScrollBarsEnabled = false;
    // Attach the handler before navigating so the completion event cannot be missed.
    webBrowser.DocumentCompleted += new WebBrowserDocumentCompletedEventHandler(WebBrowser_DocumentCompleted);
    if (this.Method == ThumbnailMethod.Url)
        webBrowser.Navigate(this.Url);
    else
        webBrowser.DocumentText = this.Html;
    // Pump messages until the document finishes loading.
    while (webBrowser.ReadyState != WebBrowserReadyState.Complete)
        Application.DoEvents();
    webBrowser.Dispose();
}
or maybe with image.Save (which might be the next problem):
image.Save(Server.MapPath("~") + "/output/" + filename);
We are low budget and trying to keep our hosted site simple and inexpensive. I would rather not have to build a worker process or use Azure's storage features.
===
UPDATE:
I don't think my issue is the Save, but I do think that will be an issue on Azure. Additionally, as a test, I created an .ashx handler that returns an image stream to an HTML img tag. That too works great locally, but on Azure I just get red Xs. Not sure what's up with that.
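For context, a minimal sketch of what such an .ashx handler might look like (the class and helper names here are hypothetical, not the asker's actual code):

public class ThumbnailHandler : System.Web.IHttpHandler
{
    public void ProcessRequest(System.Web.HttpContext context)
    {
        // GenerateThumbnail() is the method from the question; streaming the
        // bitmap back avoids writing to the filesystem entirely.
        using (System.Drawing.Bitmap bmp = new WebsiteThumbnail().GenerateThumbnail())
        using (var ms = new System.IO.MemoryStream())
        {
            bmp.Save(ms, System.Drawing.Imaging.ImageFormat.Png);
            context.Response.ContentType = "image/png";
            context.Response.BinaryWrite(ms.ToArray());
        }
    }

    public bool IsReusable { get { return false; } }
}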
In Azure you basically don't have permission to write to the local directory without assigning it.
But there are many solutions to that problem.
Set up a startup task that assigns permissions to the directory. This is worth reading:
WindowsAzure: Is it possible to set directory permissions within the web.config?
If you need a temporary local drive, then this is the preferred method:
http://msdn.microsoft.com/en-us/library/windowsazure/ee758708.aspx
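A minimal sketch of that approach, assuming a LocalStorage resource named "TempImages" has been declared in ServiceDefinition.csdef (the name is an assumption):

// Requires Microsoft.WindowsAzure.ServiceRuntime.
// The role's declared local resource is writable without any extra ACLs.
LocalResource scratch = RoleEnvironment.GetLocalResource("TempImages");
string path = System.IO.Path.Combine(scratch.RootPath, filename);
image.Save(path);

Note that this is scratch space: it does not survive the instance being reimaged.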
If you need persistent storage, then Blob storage is the best bet:
http://www.windowsazure.com/en-us/develop/net/how-to-guides/blob-storage/
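A hedged sketch of uploading the generated image to a blob instead of local disk, using the storage SDK of that era (Microsoft.WindowsAzure.Storage); the container name and connection-string key are assumptions:

var account = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));
var container = account.CreateCloudBlobClient().GetContainerReference("thumbnails");
container.CreateIfNotExists();

var blob = container.GetBlockBlobReference(filename);
using (var ms = new System.IO.MemoryStream())
{
    // Save the Bitmap to memory, then push it to the blob.
    image.Save(ms, System.Drawing.Imaging.ImageFormat.Png);
    ms.Position = 0;
    blob.UploadFromStream(ms);
}

The blob can then be served straight from its URL, which takes the write-permission question off the web role entirely.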
I believe you can also attach a drive (a VHD) to the compute instance, but I am not sure whether two instances can both write to that drive at the same time, so if you need to keep the data you are probably best off with Blob storage.
Blob storage is super cheap.
HTH, James
Related
I have a question. I have this code:
foreach (string line in File.ReadLines(@"C:\fis32v6\fis32.ini"))
{
    if (line.Contains("TEST1"))
    {
        Label1.Text = "TEST1";
        PdLine = "1";
    }
}
DataSet ds = GetData(PdLine.ToString());
I want to read a specific line, under a condition, from a txt file on the client. When I develop and build the code it works: whatever I change in the txt file can be read from my PC. But when I run the website on a server, it reads the txt file on that server instead of on the client from which I opened the website.
Is there any possibility to make the path relative?
As John mentioned, that would be a huge security issue; browsers prevent it precisely so that a website can't dig around in your system.
However, it can be initiated from the client side.
Just search here on SO for 'upload file using asp.net'; there are loads of hits with answers.
You didn't mention which specific versions you use (MVC? ASP.NET? Core?) and gave no context as to what workflow your code runs in (does it run on connection or during a specific process? is it configuration for the web session itself?), but uploading is possible.
Should you require the file during startup that might be a bit trickier as you'd have to upload it somehow.
If however it is settings for the web session, why not save it in a cookie?
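If you go the upload route, a rough sketch of the shape it could take in WebForms (the question shows Label1, so WebForms is assumed; the FileUpload control and handler names are hypothetical):

// Assumes an <asp:FileUpload ID="IniUpload"> and a button wired to this handler.
protected void UploadButton_Click(object sender, EventArgs e)
{
    if (!IniUpload.HasFile) return;

    // Read the user's local fis32.ini from the uploaded stream rather than
    // from a path on the server.
    using (var reader = new System.IO.StreamReader(IniUpload.FileContent))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            if (line.Contains("TEST1"))
            {
                Label1.Text = "TEST1";
                PdLine = "1";
            }
        }
    }
}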
I have a simple Universal Windows Platform application that I wrote primarily to teach myself about UWP apps. It works fine, except for what appears to be a somewhat random "The RPC server is unavailable" error generated by the OpenReadAsync method on the StorageFile object.
The application simply builds a list of files to be displayed, then iterates through and displays each one on a timer. Most of the time it works fine. I can let it run for hours no problem. But every once in a while, I receive the RPC error. It seems to happen when I leave the machine for a long time so it logs me out, then I log back in and switch to my slideshow application. It doesn't happen every time, and if I try to reproduce, I can never get it to fail. It seems to be mainly when I leave the machine for a long time - like overnight.
Here is the code running when the error occurs. It happens on the OpenReadAsync method call.
StorageFile file = this._files.ElementAt(this._index++);
await this._dispatcher.RunAsync(CoreDispatcherPriority.Normal, async () =>
{
    var bitmap = new BitmapImage();
    var stream = await file.OpenReadAsync(); // "The RPC server is unavailable" is thrown here
    await bitmap.SetSourceAsync(stream);
    this.Image = bitmap;
});
If I try to open the file using the Windows Photo viewer, even while sitting at the error, it opens fine. But even re-trying the failing line of code will not allow my app to open the file. All I can do is shut down and restart the application.
Has anyone run across anything like this? Any hints on where I should look?
Thanks.
Edit: Oh, one more thing - the files are all on a local drive. No external servers or network involved.
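One speculative thing to try, in case the cached StorageFile's handle has gone stale across the logoff/logon: re-resolve the file by path and retry once. This is a guess, not a confirmed fix; 0x800706BA is the HRESULT for "The RPC server is unavailable", and GetFileFromPathAsync assumes the app has permission to that path.

// Inside the dispatcher lambda from the question:
const int RPC_S_SERVER_UNAVAILABLE = unchecked((int)0x800706BA);
try
{
    var stream = await file.OpenReadAsync();
    await bitmap.SetSourceAsync(stream);
}
catch (Exception ex) when (ex.HResult == RPC_S_SERVER_UNAVAILABLE)
{
    // Fetch a fresh handle and retry once.
    file = await StorageFile.GetFileFromPathAsync(file.Path);
    var stream = await file.OpenReadAsync();
    await bitmap.SetSourceAsync(stream);
}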
I am an experienced Windows C# developer, but new to the world of Azure, so I am trying to figure out a "best practice" as I implement one or more Azure Cloud Services.
I have a number of (external, and outside of my control) sources that can all save files to a folder (or possibly a set of folders). In the current state of my system under Windows, I have a FileSystemWatcher set up to monitor a folder and raise an event when a file appears there.
In the world of Azure, what is the equivalent way to do this? Or is there?
I am aware I could create a timer (or sleep) to pass some time (say 30 seconds), and poll the folder, but I'm just not sure that's the "best" way in a cloud environment.
It is important to note that I have no control over the inputs - in other words the files are saved by an external device over which I have no control; so I can't, for example, push a message onto a queue when the file is saved, and respond to that message...
Although, in the end, that's the goal... So I intend to have a "Watcher" service which will (via events or polling) detect the presence of one or more files, and push a message onto the appropriate queue for the next step in my workflow to respond to.
It should be noted that I am using VS2015, and the latest Azure SDK stuff, so I'm not limited by anything legacy.
What I have so far is basically this (a snippet of a larger code base):
storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));

// Create a CloudFileClient object for credentialed access to File storage.
fileClient = storageAccount.CreateCloudFileClient();

// Obtain the file share name from the config file.
string sharenameString = CloudConfigurationManager.GetSetting("NLRB.Scanning.FileSharename");

// Get a reference to the file share.
share = fileClient.GetShareReference(sharenameString);

// Ensure that the share exists.
if (share.Exists())
{
    Trace.WriteLine("Share exists.");

    // Get a reference to the root directory for the share.
    rootDir = share.GetRootDirectoryReference();

    // Here is where I want to start watching the folder represented by rootDir...
}
Thanks in advance.
If you're using an attached disk (or local scratch disk), the behavior would be like on any other Windows machine, so you'd just set up a file watcher accordingly with FileSystemWatcher and deal with callbacks as you normally would.
There's Azure File Service, which is SMB as-a-service and would support any actions you'd be able to do on a regular SMB volume on your local network.
There's Azure blob storage. Blobs cannot be watched directly; you'd have to poll for changes to, say, a blob container.
You could create a loop that polls the root directory periodically using the CloudFileDirectory.ListFilesAndDirectories method:
https://msdn.microsoft.com/en-us/library/dn723299.aspx
You could also write a small recursive method that calls this for subdirectories.
To detect differences you can build up an in-memory hash map of all files and directories. If you want something like a persistent distributed cache, you can use e.g. Redis to keep this list of files/directories. On every poll, if a file or directory is not in your list, you have detected a new file/directory under root; a rough sketch follows below.
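A sketch of that polling-plus-diff approach with the file storage SDK (Microsoft.WindowsAzure.Storage.File); OnNewFileDetected is a hypothetical callback, and this would be driven from a timer (say, every 30 seconds):

// Field and method inside your watcher class.
private readonly HashSet<string> _known = new HashSet<string>();

private void PollShare(CloudFileDirectory dir)
{
    foreach (IListFileItem item in dir.ListFilesAndDirectories())
    {
        var subDir = item as CloudFileDirectory;
        if (subDir != null)
        {
            PollShare(subDir); // recurse into subdirectories
            continue;
        }

        var file = (CloudFile)item;
        if (_known.Add(file.Uri.AbsoluteUri))
        {
            OnNewFileDetected(file); // first time this file has been seen
        }
    }
}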
You could also separate the responsibilities of detection and business logic: a worker role keeps polling the directory and writes the new files to a queue, and on the consuming end another worker role/web service does the processing with that information.
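The hand-off itself could be as simple as this sketch (reusing the storageAccount from the question; the queue name is an assumption):

// Watcher side: drop the new file's URI onto a queue for the consumer role.
var queue = storageAccount.CreateCloudQueueClient().GetQueueReference("new-files");
queue.CreateIfNotExists();
queue.AddMessage(new CloudQueueMessage(file.Uri.AbsoluteUri));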
Azure Blob Storage pushes events through Azure Event Grid. Blob storage has two event types, Microsoft.Storage.BlobCreated and Microsoft.Storage.BlobDeleted. So instead of long polling you can simply react to the created event.
See this link for more information:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-event-overview
I had a very similar requirement. I used the BOX application; it has a webhook feature for events occurring on files or folders, such as Add, Move, Delete, etc.
There are also some newer alternatives with Azure Automation.
I'm pretty new to Azure too, and actually I'm investigating a file-watcher type thing. I'm considering something involving Azure Functions because of this, which looks like a way of triggering some code when a blob is created or updated. There's a way of specifying a pattern too: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob
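As an illustration, a blob-triggered function of that era might look roughly like this (the function and container names are assumptions):

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class BlobWatcher
{
    // Fires whenever a blob lands in the "input" container.
    [FunctionName("BlobWatcher")]
    public static void Run([BlobTrigger("input/{name}")] Stream blob, string name, TraceWriter log)
    {
        log.Info($"New blob detected: {name} ({blob.Length} bytes)");
        // ...queue a message or process the file here...
    }
}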
I have an ASP.NET web application that uses System.Speech to transform text into a WAV file. It works fine locally, but when I deploy it to the server (Windows Server 2012, ASP.NET 4.5, IIS 8.5), I get the error message below:
Object reference not set to an instance of an object.
System.Speech
at System.Speech.Internal.ObjectTokens.RegistryDataKey..ctor(String fullPath, RegistryDataKey copyKey)
at System.Speech.Internal.ObjectTokens.SAPICategories.DefaultDeviceOut()
at System.Speech.Internal.Synthesis.VoiceSynthesis..ctor(WeakReference speechSynthesizer)
at System.Speech.Synthesis.SpeechSynthesizer.get_VoiceSynthesizer()
at QuinnSDS.handlerTransform.<>c__DisplayClass6.<ProcessRequest>b__1()
The code which is generating this error message runs on the server:
if (context.Request.ContentLength > 0)
{
    string line = new StreamReader(context.Request.InputStream).ReadToEnd();

    // ********* generate wav file voicing the response *****************
    // Using Microsoft voices.
    // Initiate a new instance of the speech synthesizer.
    Thread t = new Thread(() =>
    {
        try
        {
            // The object creation works fine.
            System.Speech.Synthesis.SpeechSynthesizer synth = new System.Speech.Synthesis.SpeechSynthesizer();
            if (synth != null)
            {
                // The code breaks at synth.GetInstalledVoices() below. It will
                // break any time I try to do anything with the synth object.
                foreach (System.Speech.Synthesis.InstalledVoice voice in synth.GetInstalledVoices())
                {
                    System.Speech.Synthesis.VoiceInfo info = voice.VoiceInfo;
                    string voiceName = info.Name;
                    ws.WriteLine(voiceName);
                }
            }
        }
        catch (Exception e)
        {
            ws.WriteLine(e.Message);
            ws.WriteLine(e.Source);
            ws.WriteLine(e.StackTrace);
        }
        // ... code continues ...
It does not break when the Speech Synthesis object is created; it breaks whenever I try to use that object in any way.
I'm not sure if it's an access issue, but I'm pretty new to ASP.NET and IIS, and I can't figure out how to give the web app access to the GAC, or whether that's even what the problem is. I tried changing the Copy Local property of the System.Speech reference to True in Visual Studio before deploying the app, but that hasn't worked. I searched online, and while "object reference not set to an instance of an object" seems fairly common, I cannot find any similar issue where it is caused by a .NET Framework class library. I have run the text-to-speech code locally on the server and it ran fine. I have not run the entire app locally on the server, because the web app requires speech input and there is no microphone on the server.
Any ideas of anything to try would be most welcome!
What user account is the code running under when executed from ASP.NET? If the Speech API is touching the registry like the call stack suggests, it possibly has different permissions than the account you used to run the code manually.
If you can't just make the application pool for your site run with the same account you log into the machine with, I've had some success using Process Monitor to track down this kind of problem before. Basically, execute the code that fails while Process Monitor is running and look for 'ACCESS DENIED' in the 'Result' column (or anything else that looks suspicious). Quickly switching the application pool to use your standard user account will be the fastest way to rule out security or permission related problems though.
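A quick way to see which account the code actually runs as is to log the current Windows identity from the failing handler (using the same ws writer the question already logs to):

// Temporary diagnostic: prints e.g. "IIS APPPOOL\DefaultAppPool".
ws.WriteLine("Running as: " + System.Security.Principal.WindowsIdentity.GetCurrent().Name);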
I am trying to send a push notification to an iOS device via PushSharp. For Android it works. For iOS, the call to StopAllServices() hangs forever, without calling any exception handlers.
Could the problem be that I was given a .pem certificate file, while PushSharp requires a .p12 file?
The code is the following:
var br = new PushBroker();
br.OnNotificationSent += br_OnNotificationSent;
br.OnNotificationFailed += br_OnNotificationFailed;
br.OnChannelException += br_OnChannelException;
br.OnServiceException += br_OnServiceException;
br.OnDeviceSubscriptionChanged += br_OnDeviceSubscriptionChanged;
br.OnDeviceSubscriptionExpired += br_OnDeviceSubscriptionExpired;
br.OnChannelCreated += br_OnChannelCreated;
br.OnChannelDestroyed += br_OnChannelDestroyed;
var appleCert = Resource1.ck; // this is a pem file, not a p12 file!!! could this be the problem?
var sandbox = true;
br.RegisterAppleService(new ApplePushChannelSettings(!sandbox, appleCert, "223684")); // password given to me by the iOS developer
var deviceIds = new string[] { "09eddcb8b89494adf802a0caf97d5daaa789a53f52d8c544dbdcf39f2c0b619a" };
foreach (var did in deviceIds)
{
    br.QueueNotification(
        new AppleNotification()
            .ForDeviceToken(did)                           // the recipient device id
            .WithAlert("test: " + DateTime.Now.ToString()) // the message
            .WithBadge(1)
            .WithSound("sound.caf"));
}
br.StopAllServices(waitForQueuesToFinish: true); // hangs forever, no callbacks are called
I am using PushSharp taken from Git and compiled by myself with Visual Studio 2013, as of yesterday.
The hang happens whether the code is in a console application or in an ASP.NET application.
I am using the sandbox because I was told to; if I use the production server, I get an exception telling me that the certificate is for the sandbox.
Thanks for any hint as to the cause of the freeze.
We spent a whole day trying to guess the problem!
In the end it was the wrong Newtonsoft.Json version.
Some of the projects in our solution were dependent on an older version of this library; as a result we had the bad luck to get the wrong version into the /bin folder of the Web project.
You can wait a few seconds for br_OnNotificationFailed or probably any other event; it should contain some error description.
Nevertheless, I've found out PushSharp has strict requirements about certificate usage. PEM should be OK, but it is not enough, even if you import it from a file - you should have all the necessary certificates in the Windows certificate store (the pem itself and its dependencies):
Import your PEM to Local Machine\Root storage and give read access rights of its private key to the user of your running application
Import from Apple site certificates Apple Worldwide Developer Relations Certification Authority and Apple Root CA into Local Machine\Trusted Root Certification Authorities
Import Entrust Secure CA certificate (for SSL as described in iOS Developer Library) into Local Machine\Trusted Root Certification Authorities
In the end it was a certificate problem. The .pem I was given is not accepted by PushSharp. The problem was only solved when I was given a .p12 created with this guide:
https://code.google.com/p/apns-sharp/wiki/HowToCreatePKCS12Certificate
However, PushSharp should have raised an exception instead of hanging.
An ASP.NET application is NOT the ideal place to use PushSharp. You'd be better off using a Windows Service, or some other infrastructure if at all possible. The reason is that in an ASP.NET application, the Application Pool (AppPool) can be restarted on you and is usually not under your direct control, which means all the Queued notifications that PushSharp may be in the process of sending could be lost if PushSharp is not cleaned up gracefully.
If you MUST run PushSharp in an ASP.NET application, the best way is to create a singleton PushBroker instance in your Global.asax file. You should keep this singleton instance around for the lifespan of your web application, including a call to pushBroker.StopAllServices() when your Application is ending (Application_End in global.asax).
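A sketch of that shape (registration details elided; wire up the same events and RegisterAppleService call as in the question):

public class Global : System.Web.HttpApplication
{
    // One broker for the lifetime of the application.
    public static PushBroker Broker { get; private set; }

    protected void Application_Start(object sender, EventArgs e)
    {
        Broker = new PushBroker();
        // hook events and call Broker.RegisterAppleService(...) here, once
    }

    protected void Application_End(object sender, EventArgs e)
    {
        // Flush queued notifications before the AppPool tears the app down.
        if (Broker != null)
            Broker.StopAllServices(waitForQueuesToFinish: true);
    }
}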
You can help mitigate losing messages due to unforeseen App Pool terminations or restarts by persisting notifications you want to send in some other way, and only removing them from that persistent storage once the OnNotificationSent event has fired. This is still not perfect (you may risk multiple notification deliveries), but it's probably adequate for most.
You should not be creating and destroying instances of PushBroker each time you send a notification, as this uses unnecessary resources, and if you're using Apple APNS, Apple requires you to keep the connection to their servers open as long as possible when sending notifications. Again, call pushBroker.StopAllServices() in the Application_End event in your Global.asax.
Ref: https://github.com/Redth/PushSharp/issues/240