CngKey.Import on azure - c#

var rawData = Convert.FromBase64String(_signingKey);
var cng = CngKey.Import(rawData, CngKeyBlobFormat.Pkcs8PrivateBlob);
I use this code to extract a key from an embedded base64 string.
It works fine when I test it locally, but when I publish to Azure I get the following exception:
WindowsCryptographicException: The system cannot find the file specified
(Once again, I'm not reading from any file.)
I need this to communicate with Apple APNs for push notifications. Is there any workaround?
This happens only on the Free service plan; if I switch to the Basic plan it works.

I ran into the same error after publishing an existing application to Azure. In my case the problem was solved after I set WEBSITE_LOAD_USER_PROFILE = 1 in App Services / App Name / Application Settings.

Setting WEBSITE_LOAD_USER_PROFILE to 1 in the Azure App Service configuration definitely got my remote iOS notifications working. Using dotAPNS for C#/.NET, I also needed to omit apns.UseSandbox().

It seems to be caused by no certificate being attached to your Azure Mobile App. If that is the case, you need to upload the "Development" or "Distribution" SSL certificate to the Web App. For more information about how to send push notifications to an iOS app, please refer to the Azure documentation.

I've had a similar error trying to construct an X509Certificate2 from a byte array - it worked fine locally, but once I deployed to an Azure Web App I got the same and VERY misleading file-not-found exception.
The real issue turned out to be that there was no user store associated with the web service account. You can also get a similar error if there are permission-related problems accessing the certificate store on Windows.
In any case, in my scenario I fixed the problem by using MachineKeySet:
new X509Certificate2(certRawBytes, default(string), X509KeyStorageFlags.MachineKeySet);
So, in your scenario, try something like:
var keyParams = new CngKeyCreationParameters
{
    KeyCreationOptions = CngKeyCreationOptions.MachineKey,
};
CngKey.Create(CngAlgorithm.Rsa, keyName, keyParams);
Note: You may have to set a few more parameters to get the above working. The Import method doesn't seem to support MachineKey, but you should be able to achieve a similar outcome by using the Create method.
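For example, something along these lines might work - a minimal sketch only, assuming an EC P-256 key (the kind used for APNs token-based auth); the key name is made up, and supplying the PKCS#8 blob as a creation property is my assumption, so verify it against your key type:
using System;
using System.Security.Cryptography;

var rawData = Convert.FromBase64String(_signingKey);
var keyParams = new CngKeyCreationParameters
{
    // Store the key in the machine store so no user profile is needed
    KeyCreationOptions = CngKeyCreationOptions.MachineKey,
    ExportPolicy = CngExportPolicies.AllowPlaintextExport
};
// Attach the PKCS#8 blob as a creation property so the key material is imported during Create
keyParams.Parameters.Add(new CngProperty(
    CngKeyBlobFormat.Pkcs8PrivateBlob.Format, rawData, CngPropertyOptions.None));
// "apns-signing-key" is a hypothetical key name
var cng = CngKey.Create(CngAlgorithm.ECDsaP256, "apns-signing-key", keyParams);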

To add to strohmsn's answer, you can also set this App Service setting directly from Visual Studio on the Publish page for web apps: right-click the web app and select Publish, then select App Service Settings, and you can add setting properties there - WEBSITE_LOAD_USER_PROFILE = 1 in this case.

To make this work, I needed TWO things in the Azure Web App.
So my code is:
// I load the private key bytes here
byte[] ReadedByte = System.IO.File.ReadAllBytes(strPathPrivateKey);
// Create the RSA instance
RSA rsa = System.Security.Cryptography.RSA.Create();
// Import the key. It crashed HERE with 'The system cannot find the file specified'
rsa.ImportPkcs8PrivateKey(source: ReadedByte, bytesRead: out int _);
It works perfectly locally. But to make it work on an Azure Web App, I had to meet these TWO requirements:
1 - WEBSITE_LOAD_USER_PROFILE = 1, as discussed above and below.
2 - The App Service plan must include "Custom domains / SSL"!
...so neither 'F1 Shared Infrastructure' nor 'D1 Shared Infrastructure'. The lowest service plan that worked for me was 'B1 - 100 Total ACU'.
Maybe I have something wrong somewhere else in my code, or my choice of RSA is bad... anyway...
It now works!
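For what it's worth, here is a minimal sketch of the same import reading the key from an app setting instead of a file, in the spirit of the original question; the setting name "SigningKey" is my own invention:
using System;
using System.Security.Cryptography;

// "SigningKey" is a hypothetical app setting holding the base64-encoded PKCS#8 key
string base64Key = Environment.GetEnvironmentVariable("SigningKey");
byte[] keyBytes = Convert.FromBase64String(base64Key);

using RSA rsa = RSA.Create();
rsa.ImportPkcs8PrivateKey(keyBytes, out _);

// Quick sanity check that the private key is usable
byte[] signature = rsa.SignData(new byte[] { 1, 2, 3 },
    HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);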

Related

CngKey System.Security.Cryptography.CryptographicException The system cannot find the file specified on Azure

I try to generate a key from this code:
CngKey key = CngKey.Import(Convert.FromBase64String(privateKey), CngKeyBlobFormat.Pkcs8PrivateBlob);
It works fine locally, but when I deploy to my Azure App Service it gives me this error:
System.Security.Cryptography.CryptographicException: The system cannot find the file specified.
at System.Security.Cryptography.NCryptNative.ImportKey(SafeNCryptProviderHandle provider, Byte[] keyBlob, String format)
at System.Security.Cryptography.CngKey.Import(Byte[] keyBlob, String curveName, CngKeyBlobFormat format, CngProvider provider)
I added WEBSITE_LOAD_USER_PROFILE in Configuration with the value '1', but it didn't make any difference.
Thanks
If WEBSITE_LOAD_USER_PROFILE = 1 does not work for you, here is a workaround for the same issue:
Open your Azure App Service (Azure Website) blade in portal.azure.com
Go to the Application settings page
Scroll to App settings
Add a new entry key: WEBSITE_LOAD_CERTIFICATES, and provide a dummy (fake, made-up, randomly-generated) value for it.
Check out the links below for similar issues, for reference:
X509Certificate2 on Azure App Services (Azure Websites) since mid-2017?
GH issue #16
This issue is not only on Azure; I had the same issue on my VPS as well, and this answer saved my life:
X509Certificate Constructor Exception
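For reference, the fix described there boils down to the same key-storage-flags idea mentioned earlier in this thread - roughly something like this sketch (certRawBytes and password are placeholders for your certificate bytes and its password):
using System.Security.Cryptography.X509Certificates;

// Use the machine key store and persist the key so no user profile is required
var cert = new X509Certificate2(certRawBytes, password,
    X509KeyStorageFlags.MachineKeySet |
    X509KeyStorageFlags.PersistKeySet |
    X509KeyStorageFlags.Exportable);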
Cheers,
Nick
I upgraded the service plan from Free to Basic, in addition to adding WEBSITE_LOAD_USER_PROFILE = 1 in the Azure Configuration.
The issue was that on the Free tier the App Service uses a shared VM, but after upgrading the App Service pricing tier to Basic it uses a private VM.
I have spent around 4 days looking into this as well, and I have found another reason. When you run Visual Studio as administrator, you automatically get the privileges to read something called the "user certificate store".
However, if you run it on another machine or on hosting where you do not have full privileges, you might run into this issue as well.
Hope this helps too.

Why would Azure Search Indexer stop working on a Live app

I have a .NET/C# app that calls the Azure Search service. It's been working fine to find a list of PDF files I have in Azure Storage based on the keywords submitted. But a couple of days ago, the live app on Azure stopped working - no documents are returned from a search. However, locally the app works fine with the same code. I suspect something may have changed with firewall rules, but I can't find where that may have occurred. Hopefully someone has had something similar happen and has a solution.
Here's the code that stopped working on Live.
var indexClient = GetIndexClient(); // sets up SearchIndexClient with uri, credentials, etc.
SearchParameters sp = new SearchParameters()
{
    Select = new[] { "metadata_storage_name" },
    SearchMode = SearchMode.Any
};
var docs = indexClient.Documents.Search(searchString, sp); // this line no longer works on Live
As it turns out, it had to do with Microsoft's decommissioning of TLS 1.0 and 1.1 over the last two weeks. I was able to add the following to my code to make it work again (added to my Page_Load procedure in the default template):
System.Net.ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
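If you'd rather not touch Page_Load, a minimal sketch of the same fix applied once at application startup (Global.asax.cs in an ASP.NET app) could look like this:
using System.Net;

public class Global : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        // Force TLS 1.2 for all outgoing connections made by this app
        ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
    }
}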
However, I'm still working on an issue where the PDF links listed in an editor window (using the CKEditor extension) no longer work. I'm assuming this is the same problem, as it works locally but not from the Azure web app.

Azure Function fails to find file

I have an Azure Function that uses the Azure context. When I execute my function from Visual Studio 2019 on my machine, it executes correctly. However, when I publish it to my Azure account, I get an error that the my.azureauth file cannot be found.
Could not find file 'D:\Program Files (x86)\SiteExtensions\Functions\2.0.12950\32bit\my.azureauth'
The code that is used:
var authFilePath = "my.azureauth";
Console.WriteLine($"Authenticating with Azure using credentials in file at {authFilePath}");
azure = Azure.Authenticate(authFilePath).WithDefaultSubscription();
sub = azure.GetCurrentSubscription();
Console.WriteLine($"Authenticated with subscription '{sub.DisplayName}' (ID: {sub.SubscriptionId})");
This is code that I found in one of the Microsoft tutorials. I have set my my.azureauth file to "Copy Always".
Could anyone point me in the right direction?
You get this file path because Directory.GetCurrentDirectory() returns D:\Program Files (x86)\SiteExtensions\Functions\2.0.12950\32bit instead of D:\home\site\wwwroot\ or D:\home\site\wwwroot\FunctionName.
If you want to get the wwwroot folder or the function app directory, you should use ExecutionContext. For more information you can refer to this wiki doc.
So the right file path should be context.FunctionDirectory + "\my.azureauth" or context.FunctionAppDirectory + "\my.azureauth"; which one to use depends on where your file is stored.
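A minimal sketch of what that looks like in an in-process function (the function name and timer trigger are placeholders I made up):
using System.IO;
using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class AuthExample
{
    [FunctionName("AuthExample")]
    public static void Run(
        [TimerTrigger("0 */5 * * * *")] TimerInfo timer,
        ILogger log,
        ExecutionContext context)
    {
        // Resolve the file relative to the deployed function app, not the current directory
        var authFilePath = Path.Combine(context.FunctionAppDirectory, "my.azureauth");
        log.LogInformation($"Authenticating with Azure using credentials in file at {authFilePath}");
        var azure = Azure.Authenticate(authFilePath).WithDefaultSubscription();
    }
}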
I have found that Kudu is extremely useful for seeing what has been deployed to Azure.
Navigate to your function in the Azure portal.
The instructions here will help you get to the Kudu console:
https://www.gslab.com/blogs/kudu-azure-web-app
From there you can browse the files which have been deployed into your function's file system.
If you add ", ExecutionContext context" at the end of the function's Run entry point parameters, you can then get the folder your function is running from with var path = context.FunctionAppDirectory;
PS: apologies for any formatting, I am editing this on my phone.
Welcome to Stack Overflow.
Firstly, I'd recommend strongly against using file-based authentication as shown in your question.
From the notes:
Note, file-based authentication is an experimental feature that may or may not be available in later releases. The file format it relies on is subject to change as well.
Instead, I would personally store the connection string details (AzureCredentials) in the config file (Web/SiteSettings) and use the provided constructor...
Again, the following is taken from the documentation notes:
Similarly to the file-based approach, this method requires a service principal registration, but instead of storing the credentials in a local file, the required inputs can be supplied directly via an instance of the AzureCredentials class:
var creds = new AzureCredentialsFactory().FromServicePrincipal(client, key, tenant, AzureEnvironment.AzureGlobalCloud);
var azure = Azure.Authenticate(creds).WithSubscription(subscriptionId);
or
var creds = new AzureCredentialsFactory().FromServicePrincipal(client, pfxCertificatePath, password, tenant, AzureEnvironment.AzureGlobalCloud);
var azure = Azure.Authenticate(creds).WithSubscription(subscriptionId);
where client, tenant, subscriptionId, and key or pfxCertificatePath and password are strings with the required pieces of information about your service principal and subscription. The last parameter, AzureEnvironment.AzureGlobalCloud represents the Azure worldwide public cloud. You can use a different value out of the currently supported alternatives in the AzureEnvironment enum.
The first example is most likely the one you should be looking at.
The notes I got this information from can be accessed here.
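Putting the first example together with settings-based configuration, a rough sketch might look like this (the setting names are my own invention; store the actual values in your App Service application settings):
using System;
using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent.Authentication;

// Hypothetical setting names, surfaced as environment variables in an App Service / Function App
var client = Environment.GetEnvironmentVariable("SP_CLIENT_ID");
var key = Environment.GetEnvironmentVariable("SP_CLIENT_SECRET");
var tenant = Environment.GetEnvironmentVariable("SP_TENANT_ID");
var subscriptionId = Environment.GetEnvironmentVariable("SUBSCRIPTION_ID");

var creds = new AzureCredentialsFactory()
    .FromServicePrincipal(client, key, tenant, AzureEnvironment.AzureGlobalCloud);
var azure = Azure.Authenticate(creds).WithSubscription(subscriptionId);
var sub = azure.GetCurrentSubscription();
Console.WriteLine($"Authenticated with subscription '{sub.DisplayName}'");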
If you have some problems with AAD, these screenshots may help you:
[Screenshots: where to find the Client ID and how to create a Key in the Azure portal]
Please note that the Key value can only be copied when it is created; after that it is hidden.
Hope this helps you get started with AAD quickly.

Unity Play Games can't authenticate

I have an issue with Play Games services and Unity.
I've done everything according to the documentation. I'm a tester, testing is allowed, and I've changed the SHA-1 in the API console to the one used by the app. I'm using code from the docs and examples, so here is a brief summary:
PlayGamesClientConfiguration conf = new PlayGamesClientConfiguration.Builder().Build();
PlayGamesPlatform.InitializeInstance(conf);
PlayGamesPlatform.DebugLogEnabled = true;
PlayGamesPlatform.Activate();

Debug.Log("Authenticating...");
Social.localUser.Authenticate((bool success) =>
{
    if (success)
    {
        Debug.Log("Welcome " + Social.localUser.userName);
    }
    else
    {
        Debug.Log("Authentication failed.");
    }
});
When I build the app in development mode, the Play Games popup appears, starts loading, then disappears and gives me an "Authentication failed" message. But when I build the app without development mode, nothing happens and I get the authentication-failed message instantly. And yes, I'm using the correct SHA-1 key.
Please help me
I did a number of things and finally got it to work. I cannot be sure that they all contributed to solving the issue, so here I will list what I did, from greatest to least relevance (by my guess):
Match the SHA-1 certificates. If you are using an app downloaded from the Play Store, use the "app signing certificate"; otherwise use the upload certificate. These are found in the Play Console under YourApp/Release Management/App Signing. As a note, if you are building from Unity directly to your device, make sure that you are building with the same key used to upload to Google Play. More info here.
If you are using a custom config and are requesting things such as .RequestServerAuthCode(false), you must create an additional Web App. Go to your console project and, under Create credentials, choose OAuth client ID, then select Web App (see the sketch after this list).
If using internal testing, make sure to authorize accounts in the Play Console under Game Services/Your App/Testing.
Try disabling Anti-Piracy in the Play Console under Game Services/Your App/Linked Apps/Your App. Only do this if you are testing the app outside Google Play. I think this doesn't matter if you are logging in using verified test accounts.
Edit: Publishing Game Services is required even for testing... at least that seemed to be the case for me to get it working.
Try clearing the cache of your App on your device, I ran into this problem again and this solved it for me.
I finally think I have got it fixed for good... after nearly a month. :o Hope this helps.
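Regarding the custom config point above, here is a minimal sketch of what that builder call looks like with the GPGS Unity plugin (the exact builder methods depend on your plugin version, so treat this as an assumption):
using GooglePlayGames;
using GooglePlayGames.BasicApi;
using UnityEngine;

public class PlayGamesInit : MonoBehaviour
{
    void Awake()
    {
        PlayGamesClientConfiguration config = new PlayGamesClientConfiguration.Builder()
            .RequestServerAuthCode(false) // needs the extra Web App OAuth client ID described above
            .Build();
        PlayGamesPlatform.InitializeInstance(config);
        PlayGamesPlatform.DebugLogEnabled = true;
        PlayGamesPlatform.Activate();
    }
}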
Google Services can be published/unpublished separately from your app.
Check if they are correctly published by doing the following:
Google Play Console
Game services
Select your game
Publishing
Check if Game Services are published.

Accessing Google Cloud Datastore locally from ASP.NET throws Grpc.Core.RpcException: "Missing or insufficient permissions."

I'm following the Using Cloud Datastore with .NET tutorial. At some point it says that you can run the provided project locally by just pressing F5. When I do that, I get the following exception:
Grpc.Core.RpcException: 'Status(StatusCode=PermissionDenied, Detail="Missing or insufficient permissions.")'
This exception is thrown exactly at the _db.RunQuery(query) line.
var query = new Query("Book") { Limit = pageSize };
if (!string.IsNullOrWhiteSpace(nextPageToken))
    query.StartCursor = ByteString.FromBase64(nextPageToken);
var results = _db.RunQuery(query);
If I deploy the application to the cloud, it works as expected with no error. I've given Datastore Owner permissions to all accounts in Cloud IAM (Identity and Access Management), but it still doesn't work. Does anybody have any ideas?
As Jon Skeet pointed out, I was using the wrong JSON key locally. I had previously created a Compute Engine instance, which has a separate service account. After I downloaded a new JSON key from Console -> IAM & Admin -> Service Accounts, it worked locally as well.
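In case it helps others, here is a minimal sketch of pointing the local process at the downloaded key before creating the client; the key path and project ID are placeholders:
using System;
using Google.Cloud.Datastore.V1;

// Hypothetical path to the service account key downloaded from IAM & Admin -> Service Accounts
Environment.SetEnvironmentVariable(
    "GOOGLE_APPLICATION_CREDENTIALS",
    @"C:\keys\my-service-account.json");

// "my-project-id" is a placeholder for your GCP project ID
DatastoreDb db = DatastoreDb.Create("my-project-id");
var query = new Query("Book") { Limit = 10 };
var results = db.RunQuery(query);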
