Convert NetworkCredentials to a PSCredential? - c#

Is it possible to convert NetworkCredentials (or a variant) to a PSCredential?
I want to transparently use the credentials of the current user, but don't want to prompt them.

$cred = new-object PSCredential($nc.UserName, (ConvertTo-SecureString $nc.Password -AsPlainText -Force))

[System.Reflection.Assembly]::Load("System.Management.Automation")
$dc = [System.Net.CredentialCache]::DefaultCredentials
$psc = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $dc.UserName, $dc.SecurePassword
That would do it, except that the default credentials are not set for the current application domain in which PowerShell is running.
At least this illustrates how you can create a PSCredential from a persisted username and password. The SecureString is always a bummer, but I believe NetworkCredential offers a plain-text constructor you can use and a SecurePassword property to read from.
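For example, a minimal sketch of that idea (the user name and password are placeholders): NetworkCredential's plain-text constructor hands the password back as a SecureString via SecurePassword, which is exactly what the PSCredential constructor expects.

# Placeholder values; any existing NetworkCredential instance would work the same way
$nc = New-Object System.Net.NetworkCredential("someUser", "somePlainTextPassword")
$psc = New-Object System.Management.Automation.PSCredential($nc.UserName, $nc.SecurePassword)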

Related

Did New-AzureADApplicationKeyCredential double base64 encode for CustomKeyIdentifier and Value?

I am using New-AzureADApplicationKeyCredential to create a KeyCredential for an application, following the documentation.
First, I base64-encode the thumbprint:
$cer = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
$cer.Import("C:\Users\PFuller\Desktop\abc.cer")
...
$bin = $cer.GetCertHash()
$base64Thumbprint = [System.Convert]::ToBase64String($bin)
...
New-AzureADApplicationKeyCredential -ObjectId <id> `
    -CustomKeyIdentifier $base64Thumbprint `
    -Type AsymmetricX509Cert `
    -Usage Verify -Value $base64Value
But the result in the AAD manifest is double base64-encoded:
"keyCredentials": [
{
"customKeyIdentifier": "base64(base64Thumbprint)",
...
}
],
According to Microsoft identity platform application authentication certificate credentials, the customKeyIdentifier should only be base64-encoded once and stored in the manifest's keyCredentials.
Did I misuse this cmdlet, or is something else wrong here? Because after I changed the manifest back to being base64-encoded only once, everything works properly.
Thanks for the help.
Based on my test, both customKeyIdentifier and value are double base64 encoded when we create a KeyCredential for an application using New-AzureADApplicationKeyCredential.
After that I use the following code to get the access token with this cert:
var app = ConfidentialClientApplicationBuilder.Create("{clientId}")
.WithAuthority(AzureCloudInstance.AzurePublic, "{tenantId}")
.WithCertificate(cer)
.Build();
var result = await app.AcquireTokenForClient(new[] { "https://graph.microsoft.com/.default" }).ExecuteAsync();
Console.WriteLine(result.AccessToken);
I find that we can successfully get the access token with the correct roles.
So I think New-AzureADApplicationKeyCredential base64-encodes customKeyIdentifier and value a second time, but it also handles them correctly when we use the X509Certificate to get an access token.

Downscale an SQL database hosted in Azure from Visual Studio Online

I need to downscale a database in Azure to its Basic plan.
I have created a task in my Release flow on VSO, an Azure PowerShell task with my subscription specified in it, and I've put this script in it:
Start-Sleep -s 60
$Credential = Get-Credential
$serverContext = New-AzureSqlDatabaseServerContext -ServerName "XXX-com-staging" -Credential $Credential
$db = Get-AzureSqlDatabase -ServerName "XXX-com-staging" -DatabaseName "YYY-com-staging"
$P1 = Get-AzureSqlDatabaseServiceObjective $serverContext -ServiceObjectiveName "P1"
Set-AzureSqlDatabase $serverContext -Database $db -ServiceObjective $P1 -Force -Edition Basic
It doesn't work: $Credential is either null, or it throws an error saying that the Credential parameter is mandatory.
Is there any simple way to achieve this, or a fix for my script?
It was easy; there is no need for credentials:
$P0 = Get-AzureSqlDatabaseServiceObjective -ServerName "XXX-com-staging" -ServiceObjectiveName "Basic"
Set-AzureSqlDatabase -DatabaseName "YYY-com-staging" -ServerName "XXX-com-staging" -ServiceObjective $P0 -Force -Edition Basic

Azure Function role-like permissions to stop Azure Virtual Machines

I'm hoping to manage some Azure resources using a scheduled C# Azure Function.
Currently, in a command-line application I've made, I've been using the 'Microsoft.IdentityModel.Clients.ActiveDirectory' library for token authorization and 'Microsoft.Azure.Management.Compute' for resource-management client calls, like so:
// ... var credential is generated by AD authentication and extends Microsoft.Rest.ServiceClientCredentials
using (var client = new ComputeManagementClient(credential)) {
    client.SubscriptionId = "[SOME_SUBSCRIPTION_ID]";
    client.VirtualMachines.BeginPowerOff("[RESOURCE_GROUP]", "[VM_NAME]");
}
Can my management client interact with Azure resources without providing a user credential or a key/secret style of credential establishment?
My previous experience is related to AWS and admittedly it has confused my view of Azure Resource Management.
Older posts I've looked at are: Start and Stop Azure Virtual Machine
and
Is it possible to stop/start an Azure ARM Virtual from an Azure Function?
-EDIT 1-
I was hoping for something similar to run-time credentials in AWS resource clients for Lambda based on an assigned role with a variety of permissions. I will have a look at certificates though.
There are a few resources online on using C# to make REST API calls to start and stop a VM. Here's a link to such a document:
https://msftstack.wordpress.com/2016/01/03/how-to-call-the-azure-resource-manager-rest-api-from-c/
You could use the above as a reference to create C# Functions to start/stop your VM.
However, using C# to make these REST calls requires pre-packaging the HTTP request and post-processing the HTTP response. If your use case just calls for starting/stopping a VM, an easier approach would be to use PowerShell in Azure Functions to call the Start-AzureRmVM and Stop-AzureRmVM cmdlets.
The following are steps on how to create HTTP-triggered PowerShell Functions to start and stop a VM:
1. Set up a service principal to obtain the username, password and tenant ID. This initial setup may be considered tedious by some users, but since it's a one-time task, I feel it is worth it in order to run Azure PowerShell in Functions. There are many docs online, but here are some links to documents on how to set up your service principal (see the sketch after these links for one possible way to script it):
i. http://blog.davidebbo.com/2014/12/azure-service-principal.html (I used this one)
ii. https://learn.microsoft.com/en-us/azure/azure-resource-manager/resource-group-create-service-principal-portal
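As a rough, hedged sketch (not taken from the documents above), the service principal could be created with the AzureRM cmdlets of that era; the display name, URIs and password below are placeholders, and parameter shapes vary between module versions:

# Placeholders throughout; older New-AzureRmADApplication versions take -Password as a plain string, newer versions differ
$app = New-AzureRmADApplication -DisplayName "MyFunctionSp" -HomePage "https://myfunctionsp" -IdentifierUris "https://myfunctionsp" -Password "SomeStrongPasswordHere"
New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId
# Give the service principal rights to manage the VM (e.g. Contributor on the subscription or resource group)
New-AzureRmRoleAssignment -RoleDefinitionName Contributor -ServicePrincipalName $app.ApplicationId
# SP_USERNAME is then the ApplicationId, SP_PASSWORD is the password above, and TENANTID comes from Get-AzureRmSubscription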
2. Log into the Functions portal to access your Function app.
3. Click on Function app settings->Configure app settings and add the key-value pairs for the settings SP_USERNAME, SP_PASSWORD, and TENANTID (you may use other desired key names).
4. Create an HTTP-triggered PowerShell Function named, e.g., StartVm, with the following content in its run.ps1 file:
$requestBody = Get-Content $req -Raw | ConvertFrom-Json
# Set Service Principal credentials
# SP_PASSWORD, SP_USERNAME, TENANTID are app settings
$secpasswd = ConvertTo-SecureString $env:SP_PASSWORD -AsPlainText -Force;
$mycreds = New-Object System.Management.Automation.PSCredential ($env:SP_USERNAME, $secpasswd)
Add-AzureRmAccount -ServicePrincipal -Tenant $env:TENANTID -Credential $mycreds;
$context = Get-AzureRmContext;
Set-AzureRmContext -Context $context;
# Start VM
Start-AzureRmVM -ResourceGroupName $requestBody.resourcegroup -Name $requestBody.vmname | Out-String
5. Click on the Save button.
6. Next, click on the Logs button to open the log viewer.
7. Click on the Test button to open the simple HTTP client. In the request body, provide the vmname and resourcegroup values for the VM, e.g.
{
"vmname": "testvm",
"resourcegroup": "testresourcegroup"
}
8. Click on the Run button and wait for a few seconds. It takes some time for the Start-AzureRmVM cmdlet to run to completion. When it does, you should see entries similar to the following in the log viewer.
2016-11-30T07:11:26.479 Function started (Id=1e38ae2c-3cca-4e2f-a85d-f62c0d565c34)
2016-11-30T07:11:28.276 Microsoft.Azure.Commands.Profile.Models.PSAzureContext
2016-11-30T07:11:28.276 Microsoft.Azure.Commands.Profile.Models.PSAzureContext
2016-11-30T07:11:59.312 RequestId IsSuccessStatusCode StatusCode ReasonPhrase
--------- ------------------- ---------- ------------
True OK OK
2016-11-30T07:11:59.327 Function completed (Success, Id=1e38ae2c-3cca-4e2f-a85d-f62c0d565c34)
Repeat steps 4-8 to create the StopVm Function with the following content in its run.ps1 file. If the execution succeeds, the log output should be similar to the log entries for the StartVm Function.
$requestBody = Get-Content $req -Raw | ConvertFrom-Json
# Set Service Principal credentials
# SP_PASSWORD, SP_USERNAME, TENANTID are app settings
$secpasswd = ConvertTo-SecureString $env:SP_PASSWORD -AsPlainText -Force;
$mycreds = New-Object System.Management.Automation.PSCredential ($env:SP_USERNAME, $secpasswd)
Add-AzureRmAccount -ServicePrincipal -Tenant $env:TENANTID -Credential $mycreds;
$context = Get-AzureRmContext;
Set-AzureRmContext -Context $context;
# Stop VM
Stop-AzureRmVM -ResourceGroupName $requestBody.resourcegroup -Name $requestBody.vmname -Force | Out-String
When the StopVm Function execution succeeds, you may also add another GetVm Function with the following content in its run.ps1 file to verify that the VM has indeed been stopped.
$requestBody = Get-Content $req -Raw | ConvertFrom-Json
# Set Service Principal credentials
# SP_PASSWORD, SP_USERNAME, TENANTID are app settings
$secpasswd = ConvertTo-SecureString $env:SP_PASSWORD -AsPlainText -Force;
$mycreds = New-Object System.Management.Automation.PSCredential ($env:SP_USERNAME, $secpasswd)
Add-AzureRmAccount -ServicePrincipal -Tenant $env:TENANTID -Credential $mycreds;
$context = Get-AzureRmContext;
Set-AzureRmContext -Context $context;
# Get VM
Get-AzureRmVM -ResourceGroupName $requestBody.resourcegroup -Name $requestBody.vmname -Status | Out-String
The log entries for the GetVm Function on a stopped VM would be similar to the following:
2016-11-30T07:53:59.956 Function started (Id=1841757f-bbb8-45cb-8777-80edb4e75ced)
2016-11-30T07:54:02.040 Microsoft.Azure.Commands.Profile.Models.PSAzureContext
2016-11-30T07:54:02.040 Microsoft.Azure.Commands.Profile.Models.PSAzureContext
2016-11-30T07:54:02.977 ResourceGroupName : testresourcegroup
Name : testvm
BootDiagnostics :
ConsoleScreenshotBlobUri : https://teststorage.blob.core.windows.net/boot
diagnostics-vmtest-[someguid]/testvm.[someguid].screenshot.bmp
Disks[0] :
Name : windowsvmosdisk
Statuses[0] :
Code : ProvisioningState/succeeded
Level : Info
DisplayStatus : Provisioning succeeded
Time : 11/30/2016 7:15:15 AM
Extensions[0] :
Name : BGInfo
VMAgent :
VmAgentVersion : Unknown
Statuses[0] :
Code : ProvisioningState/Unavailable
Level : Warning
DisplayStatus : Not Ready
Message : VM Agent is unresponsive.
Time : 11/30/2016 7:54:02 AM
Statuses[0] :
Code : ProvisioningState/succeeded
Level : Info
DisplayStatus : Provisioning succeeded
Time : 11/30/2016 7:15:15 AM
Statuses[1] :
Code : PowerState/deallocated
Level : Info
DisplayStatus : VM deallocated
2016-11-30T07:54:02.977 Function completed (Success, Id=1841757f-bbb8-45cb-8777-80edb4e75ced)
Note: FYI, while you may write a Function to create a VM by calling the New-AzureRmVM cmdlet, it will not run to completion in Azure Functions. VM creation in the Azure Functions infrastructure seems to take ~9 minutes to complete, but a Function's execution is terminated at 5 minutes. You may write another script to poll the results separately. This limitation will be lifted when we start supporting custom configuration for maximum execution time in one of our upcoming releases.
--Update--
I just realized you were trying to create scheduled Functions. In that case, you can use Timer-triggered PowerShell Functions and hard-code the vmname and resourcegroup.
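For instance, a minimal sketch of a timer-triggered run.ps1 with hard-coded values, reusing the same app settings as above (the resource group and VM name are placeholders; the schedule itself lives in the Function's timer-trigger binding):

# Authenticate with the service principal (SP_PASSWORD, SP_USERNAME, TENANTID are app settings)
$secpasswd = ConvertTo-SecureString $env:SP_PASSWORD -AsPlainText -Force
$mycreds = New-Object System.Management.Automation.PSCredential ($env:SP_USERNAME, $secpasswd)
Add-AzureRmAccount -ServicePrincipal -Tenant $env:TENANTID -Credential $mycreds
# Stop a hard-coded VM whenever the timer fires
Stop-AzureRmVM -ResourceGroupName "testresourcegroup" -Name "testvm" -Force | Out-String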
Well, I don't really understand how you expect to authenticate without authenticating; I guess your only option would be certificates?
https://azure.microsoft.com/en-us/resources/samples/active-directory-dotnet-daemon-certificate-credential/

Redirect digest request to active directory

I am trying to redirect an HTTP request with digest MD5 header information to an Active Directory to validate the credentials.
I do have the information given by the http header like nonce and username. My problem now is that I have no link to put this information into a PrincipalContext object.
I obviously can't use PrincipalContext.ValidateCredentials(username, password) because it requires the password in plain text.
The only validation I am able to use is UserPrincipal user = UserPrincipal.FindByIdentity(context, IdentityType.SamAccountName, username);, but this does not include the password.
I do have an HttpListenerContext object, but the user variable is null.
After I told my server to use AuthenticationSchemes.IntegratedWindowsAuthentication, it automatically delivers a WindowsPrincipal, which provides information from the AD.
Tim, once you get the information you can do something like this to check whether it is valid or not. If I am understanding what you want to test properly, then try something like this.
If you are running this via code or a service, you should have no issues with the password being exposed. If you are concerned about that, then you need to write something that will decrypt the MD5 header information where the password is.
using (PrincipalContext prContext = new PrincipalContext(ContextType.Domain, "Your Domain"))
{
    bool isValid = prContext.ValidateCredentials("Username", "Password");
}

X509Certificate Constructor Exception

//cert is an EF Entity and
// cert.CertificatePKCS12 is a byte[] with the certificate.
var certificate = new X509Certificate(cert.CertificatePKCS12, "SomePassword");
When loading a certificate from our database, on our staging server (Windows 2008 R2/IIS7.5) we get this exception:
System.Security.Cryptography.CryptographicException: An internal error occurred.
at System.Security.Cryptography.CryptographicException.ThrowCryptographicException(Int32 hr)
at System.Security.Cryptography.X509Certificates.X509Utils._LoadCertFromBlob(Byte[] rawData, IntPtr password, UInt32 dwFlags, Boolean persistKeySet, SafeCertContextHandle& pCertCtx)
at System.Security.Cryptography.X509Certificates.X509Certificate.LoadCertificateFromBlob(Byte[] rawData, Object password, X509KeyStorageFlags keyStorageFlags)
NOTE: This issue does not happen locally (Windows 7/Cassini).
Any insight is greatly appreciated.
Turns out there's a setting in the IIS Application Pool configuration (Application Pools > Advanced Settings) to load the user profile for the application pool identity user. When set to false, the key containers aren't accessible.
So just set the Load User Profile option to True.
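If you prefer to script that setting instead of clicking through the IIS UI, here is a minimal sketch using the WebAdministration module (the app pool name is a placeholder):

# Equivalent of Application Pools > Advanced Settings > Load User Profile = True
Import-Module WebAdministration
Set-ItemProperty "IIS:\AppPools\MyAppPool" -Name processModel.loadUserProfile -Value $true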
More than likely, when you are running from Visual Studio/Cassini, it is accessing your user certificate store, even though you're loading it from bytes. Could you please try this and see if it solves your issue:
var certificate = new X509Certificate(
cert.CertificatePKCS12, "SomePassword", X509KeyStorageFlags.MachineKeySet);
This will cause IIS (which runs as the ASP.NET user which likely doesn't have access to a user store) to use the Machine store.
This page explains the constructor in more detail, and this page explains the X509KeyStorageFlags enumeration.
Edit:
Based on the second link from cyphr, it looks like it might be a good idea (if the previous solution doesn't work) to combine some of the FlagsAttribute enumeration values like so:
var certificate = new X509Certificate(
cert.CertificatePKCS12, "SomePassword",
X509KeyStorageFlags.MachineKeySet
| X509KeyStorageFlags.PersistKeySet
| X509KeyStorageFlags.Exportable);
Additionally, if you have access, you may want to try changing your Application Pool setting to use LocalService (and then restart the AppPool). This may elevate your permissions to an appropriate level if that is the problem.
Finally, you can use File.WriteAllBytes to write out the CertificatePKCS12 contents to a pfx file and see if you can manually import it using the certificate console under MMC (you can delete after successful import; this is just to test). It could be that your data is getting munged, or the password is incorrect.
Use this code:
certificate = new X509Certificate2(System.IO.File.ReadAllBytes(p12File)
, p12FilePassword
, X509KeyStorageFlags.MachineKeySet |
X509KeyStorageFlags.PersistKeySet |
X509KeyStorageFlags.Exportable);
I had trouble on Windows Server 2012 R2 where my application could not load certificates from a PFX on disk. Things would work fine running my app as admin, and the exception said Access Denied, so it had to be a permissions issue. I tried some of the above advice, but I still had the problem. I found that specifying the following flags as the third parameter of the cert constructor did the trick for me:
X509KeyStorageFlags.UserKeySet |
X509KeyStorageFlags.PersistKeySet |
X509KeyStorageFlags.Exportable
To really solve your problem and not just guess what it could be, one needs to be able to reproduce it. If you can't provide a test PFX file that exhibits the same problem, you have to examine the problem yourself. The first important question is: does the exception "An internal error occurred" originate in the private-key part of the PKCS12, or in the public part of the certificate itself?
So I would recommend that you try to repeat the same experiment with the same certificate exported without the private key (as a .CER file):
var certificate = new X509Certificate(cert.CertificateCER);
or
var certificate = X509Certificate.CreateFromCertFile("My.cer");
It could help to verify whether the origin of your problem is the private key or some properties of the certificate.
If you have a problem with the CER file, you can safely post a link to the file because it contains public information only. Alternatively, you can at least execute
CertUtil.exe -dump -v "My.cer"
or
CertUtil.exe -dump -v -privatekey -p SomePassword "My.pfx"
(you can use some other options too) and post some parts of the output (for example properties of the private key without the PRIVATEKEYBLOB itself).
An alternative to changing the Load User Profile is to make the Application Pool use the Network Service Identity.
See also What exactly happens when I set LoadUserProfile of IIS pool?
On an application running on IIS 10, I managed to fix the access denied error by using the LocalSystem identity for the app pool with the following code:
new X509Certificate2(certificateBinaryData, "password"
, X509KeyStorageFlags.MachineKeySet |
X509KeyStorageFlags.PersistKeySet);
Enabling Load User Profile didn't work for me, and while there were many suggestions to do this, they did not indicate that setting 'Load User Profile' to True only works on user accounts and not on:
ApplicationPoolIdentity
NetworkService
You need to import a .cer certificate to your local machine keystore. There's no need to import your .p12 cert - instead use the second certificate issued to your account by Apple. I think it must be a valid pair of certificates (one in the filesystem, the second in the keystore). You'll have to set all 3 flags in the DLL of course.
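As a hedged illustration of that import step, on a reasonably recent Windows machine the .cer can be placed into the local machine store with PowerShell (the file path and store location below are placeholders):

# Import the public certificate into the Local Machine \ Personal store
Import-Certificate -FilePath "C:\certs\apple-issued.cer" -CertStoreLocation Cert:\LocalMachine\My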
The following code will help you; you can generate the algorithm using the Bouncy Castle library:
private static ECDsa GetEllipticCurveAlgorithm(string privateKey)
{
var keyParams = (ECPrivateKeyParameters)PrivateKeyFactory
.CreateKey(Convert.FromBase64String(privateKey));
var normalizedECPoint = keyParams.Parameters.G.Multiply(keyParams.D).Normalize();
return ECDsa.Create(new ECParameters
{
Curve = ECCurve.CreateFromValue(keyParams.PublicKeyParamSet.Id),
D = keyParams.D.ToByteArrayUnsigned(),
Q =
{
X = normalizedECPoint.XCoord.GetEncoded(),
Y = normalizedECPoint.YCoord.GetEncoded()
}
});
}
and generate the token in the following way:
var signatureAlgorithm = GetEllipticCurveAlgorithm(privateKey);
ECDsaSecurityKey eCDsaSecurityKey = new ECDsaSecurityKey(signatureAlgorithm)
{
KeyId = settings.Apple.KeyId
};
var handler = new JwtSecurityTokenHandler();
var token = handler.CreateJwtSecurityToken(
issuer: iss,
audience: AUD,
subject: new ClaimsIdentity(new List<Claim> { new Claim("sub", sub) }),
expires: DateTime.UtcNow.AddMinutes(5),
issuedAt: DateTime.UtcNow,
notBefore: DateTime.UtcNow,
signingCredentials: new SigningCredentials(eCDsaSecurityKey, SecurityAlgorithms.EcdsaSha256));
