X509Certificate Constructor Exception - c#

// cert is an EF entity and
// cert.CertificatePKCS12 is a byte[] with the certificate.
var certificate = new X509Certificate(cert.CertificatePKCS12, "SomePassword");
When loading a certificate from our database, on our staging server (Windows 2008 R2/IIS7.5) we get this exception:
System.Security.Cryptography.CryptographicException: An internal error occurred.
at System.Security.Cryptography.CryptographicException.ThrowCryptographicException(Int32 hr)
at System.Security.Cryptography.X509Certificates.X509Utils._LoadCertFromBlob(Byte[] rawData, IntPtr password, UInt32 dwFlags, Boolean persistKeySet, SafeCertContextHandle& pCertCtx)
at System.Security.Cryptography.X509Certificates.X509Certificate.LoadCertificateFromBlob(Byte[] rawData, Object password, X509KeyStorageFlags keyStorageFlags)
NOTE: This issue does not happen locally (Windows 7/Cassini).
Any insight is greatly appreciated.

Turns out there's a setting in the IIS Application Pool configuration (Application Pools > Advanced Settings) to load the user profile for the application pool identity user. When set to false, the key containers aren't accessible.
So just set the Load User Profile option to True.
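For reference, the same setting can also be flipped programmatically. A minimal sketch, assuming the Microsoft.Web.Administration assembly that ships with IIS and a hypothetical pool named "MyAppPool" (run with admin rights):
using Microsoft.Web.Administration;

// Enable Load User Profile for one application pool.
using (var serverManager = new ServerManager())
{
    var pool = serverManager.ApplicationPools["MyAppPool"]; // hypothetical pool name
    pool.ProcessModel.LoadUserProfile = true;
    serverManager.CommitChanges();
}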

More than likely, when you are running from Visual Studio/Cassini, it is accessing your user certificate store, even though you're loading it from bytes. Could you please try this and see if it solves your issue:
var certificate = new X509Certificate(
    cert.CertificatePKCS12, "SomePassword", X509KeyStorageFlags.MachineKeySet);
This will cause IIS (which runs as the ASP.NET user which likely doesn't have access to a user store) to use the Machine store.
This page explains the constructor in more detail, and this page explains the X509KeyStorageFlags enumeration.
Edit:
Based on the second link from cyphr, it looks like it might be a good idea (if the previous solution doesn't work) to combine some of the X509KeyStorageFlags enumeration values like so:
var certificate = new X509Certificate(
    cert.CertificatePKCS12, "SomePassword",
    X509KeyStorageFlags.MachineKeySet
    | X509KeyStorageFlags.PersistKeySet
    | X509KeyStorageFlags.Exportable);
Additionally, if you have access, you may want to try changing your Application Pool setting to use LocalService (and then restart the AppPool). This may elevate your permissions to an appropriate level if that is the problem.
Finally, you can use File.WriteAllBytes to write out the CertificatePKCS12 contents to a pfx file and see if you can manually import it using the certificate console under MMC (you can delete after successful import; this is just to test). It could be that your data is getting munged, or the password is incorrect.
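For example, a one-off sketch of that manual test (the output path is arbitrary):
// Dump the stored bytes to disk so the PFX can be imported by hand in the
// MMC certificates snap-in; delete the file once the test is done.
System.IO.File.WriteAllBytes(@"C:\temp\cert-under-test.pfx", cert.CertificatePKCS12);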

Use this code:
certificate = new X509Certificate2(System.IO.File.ReadAllBytes(p12File),
    p12FilePassword,
    X509KeyStorageFlags.MachineKeySet |
    X509KeyStorageFlags.PersistKeySet |
    X509KeyStorageFlags.Exportable);

I had trouble on Windows Server 2012 R2 where my application could not load certificates from a PFX on disk. Things would work fine running my app as admin, and the exception said Access Denied, so it had to be a permissions issue. I tried some of the above advice, but I still had the problem. I found that specifying the following flags as the third parameter of the cert constructor did the trick for me:
X509KeyStorageFlags.UserKeySet |
X509KeyStorageFlags.PersistKeySet |
X509KeyStorageFlags.Exportable

To really solve your problem, rather than just guess at what it could be, one needs to be able to reproduce it. If you can't provide a test PFX file that shows the same problem, you have to examine the problem yourself. The first important question is: does the "An internal error occurred" exception originate in the private key part of the PKCS12, or in the public part of the certificate itself?
So I would recommend repeating the same experiment with the same certificate, exported without the private key (as a .CER file):
var certificate = new X509Certificate(cert.CertificateCER);
or
var certificate = X509Certificate.CreateFromCertFile("My.cer"); // static factory method, so no "new"
It would help verify whether the origin of your problem is the private key or some property of the certificate.
If you have a problem with the CER file too, you can safely post a link to the file, because it contains public information only. Alternatively, you can at least execute
CertUtil.exe -dump -v "My.cer"
or
CertUtil.exe -dump -v -privatekey -p SomePassword "My.pfx"
(you can use some other options too) and post some parts of the output (for example properties of the private key without the PRIVATEKEYBLOB itself).

An alternative to changing Load User Profile is to make the application pool use the NetworkService identity.
See also What exactly happens when I set LoadUserProfile of IIS pool?

On an application running IIS 10, I managed to fix the access denied error by using LocalSystem identity for the app pool with the following code:
new X509Certificate2(certificateBinaryData, "password"
, X509KeyStorageFlags.MachineKeySet |
X509KeyStorageFlags.PersistKeySet);
Enabling Load User Profile didn't work for me, and while there were many suggestions to do this, they did not mention that setting 'Load User Profile' to True only works for user accounts and not for:
ApplicationPoolIdentity
NetworkService

You need to import a .cer certificate into your local machine keystore. There's no need to import your .p12 cert - instead use the second certificate issued to your account by Apple. I think it must be a valid pair of certificates (one in the filesystem, the second in the keystore). You'll have to set all 3 flags in the DLL, of course.

The following code will help you; you can create the signing algorithm using the Bouncy Castle library:
private static ECDsa GetEllipticCurveAlgorithm(string privateKey)
{
    // Bouncy Castle types: PrivateKeyFactory (Org.BouncyCastle.Security) and
    // ECPrivateKeyParameters (Org.BouncyCastle.Crypto.Parameters).
    var keyParams = (ECPrivateKeyParameters)PrivateKeyFactory
        .CreateKey(Convert.FromBase64String(privateKey));

    var normalizedECPoint = keyParams.Parameters.G.Multiply(keyParams.D).Normalize();

    return ECDsa.Create(new ECParameters
    {
        Curve = ECCurve.CreateFromValue(keyParams.PublicKeyParamSet.Id),
        D = keyParams.D.ToByteArrayUnsigned(),
        Q =
        {
            X = normalizedECPoint.XCoord.GetEncoded(),
            Y = normalizedECPoint.YCoord.GetEncoded()
        }
    });
}
and generate the token in the following way:
var signatureAlgorithm = GetEllipticCurveAlgorithm(privateKey);
ECDsaSecurityKey eCDsaSecurityKey = new ECDsaSecurityKey(signatureAlgorithm)
{
    KeyId = settings.Apple.KeyId
};

var handler = new JwtSecurityTokenHandler();
var token = handler.CreateJwtSecurityToken(
    issuer: iss,
    audience: AUD,
    subject: new ClaimsIdentity(new List<Claim> { new Claim("sub", sub) }),
    expires: DateTime.UtcNow.AddMinutes(5),
    issuedAt: DateTime.UtcNow,
    notBefore: DateTime.UtcNow,
    signingCredentials: new SigningCredentials(eCDsaSecurityKey, SecurityAlgorithms.EcdsaSha256));
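If you then need the compact serialized form (for example as the client secret Apple expects), a likely follow-up step, not shown above, is:
string clientSecret = handler.WriteToken(token); // compact "header.payload.signature" JWT string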

Related

C# X509 certificate validation, with Online CRL check, without importing root certificate to trusted root CA certificate store

I'm trying to validate an X509 certificate chain without importing the root CA certificate into the trusted root CA certificate store (in production this code will run in an Azure Function, and you can't add certificates to the trusted root CA certificate store on Azure App Services).
We also need to perform an online CRL check on this certificate chain.
I've searched on this and I see many others are facing the same problem, but none of the suggestions seem to work. I've followed the approach outlined in this SO post, which echoes the suggestions from issue #26449 on the dotnet/runtime GitHub. Here's a small console application (targeting .NET Core 3.1) reproducing the problem:
static void Main(string[] args)
{
    var rootCaCertificate = new X509Certificate2("root-ca-cert.cer");
    var intermediateCaCertificate = new X509Certificate2("intermediate-ca-cert.cer");
    var endUserCertificate = new X509Certificate2("end-user-cert.cer");

    var chain = new X509Chain();
    chain.ChainPolicy.ExtraStore.Add(rootCaCertificate);
    chain.ChainPolicy.ExtraStore.Add(intermediateCaCertificate);
    chain.ChainPolicy.VerificationFlags = X509VerificationFlags.AllowUnknownCertificateAuthority;
    chain.ChainPolicy.RevocationMode = X509RevocationMode.Online;

    chain.Build(endUserCertificate);
    chain.Build(new X509Certificate2(endUserCertificate));

    var errors = chain.ChainStatus.ToList();
    if (!errors.Any())
    {
        Console.WriteLine("Certificate is valid");
        return;
    }

    foreach (var error in errors)
    {
        Console.WriteLine($"{error.Status.ToString()}: {error.StatusInformation}");
    }
}
When run, this returns three errors:
UntrustedRoot: A certificate chain processed, but terminated in a root certificate which is not trusted by the trust provider.
RevocationStatusUnknown: The revocation function was unable to check revocation for the certificate.
OfflineRevocation: The revocation function was unable to check revocation because the revocation server was offline.
However, if I add the root CA certificate to the trusted root CA certificate store then all three errors disappear.
Questions
Is this something wrong with my implementation, or is what I'm trying to do not possible?
What are my options to try to achieve this? A bit of Googling suggests the X509ChainPolicy.CustomTrustStore offered in .NET 5 might save the day. Is Bouncy Castle another option for achieving this?
A bit of Googling suggests the X509ChainPolicy.CustomTrustStore offered in .NET 5 might save the day
Yep.
Instead of putting rootCaCertificate into ExtraStore, put it into CustomTrustStore, then set chain.ChainPolicy.TrustMode = X509ChainTrustMode.CustomRootTrust;. Now your provided root is the only root valid for the chain. You can also remove the AllowUnknownCertificateAuthority flag.
OfflineRevocation
This error is slightly misleading. It means "revocation was requested for the chain, but one or more revocation responses is missing". In this case it's missing because the builder didn't ask for it, because it didn't trust the root. (Once you don't trust the root you can't trust the CRLs/OCSP responses, so why ask for them at all?)
RevocationStatusUnknown
Again, the unknown is because it didn't ask for it. This code is different than OfflineRevocation because technically a valid OCSP response is (effectively) "I don't know". That'd be an online/unknown.
UntrustedRoot
Solved by the custom trust code above.
Other things of note: The correct way to determine the certificate is valid is to capture the boolean return value from chain.Build. For your current chain, if you had disabled revocation (chain.ChainPolicy.RevocationMode = X509RevocationMode.NoCheck) then Build would have returned true... but the UntrustedRoot error would still be present in the ChainStatus output. The boolean return from Build is false if there are any errors that the VerificationFlags didn't say to ignore.
You also only need to call Build once :).
static void Main(string[] args)
{
    var rootCaCertificate = new X509Certificate2("root-ca-cert.cer");
    var intermediateCaCertificate = new X509Certificate2("intermediate-ca-cert.cer");
    var endUserCertificate = new X509Certificate2("end-user-cert.cer");

    var chain = new X509Chain();
    chain.ChainPolicy.CustomTrustStore.Add(rootCaCertificate);
    chain.ChainPolicy.ExtraStore.Add(intermediateCaCertificate);
    chain.ChainPolicy.RevocationMode = X509RevocationMode.Online;
    chain.ChainPolicy.TrustMode = X509ChainTrustMode.CustomRootTrust;

    bool success = chain.Build(endUserCertificate);

    if (success)
    {
        return;
    }

    foreach (X509ChainStatus error in chain.ChainStatus)
    {
        Console.WriteLine($"{error.Status.ToString()}: {error.StatusInformation}");
    }
}
Well, not a full answer, but probably it will get you going.
We've previously built an Azure Web App (not a Function App, but I guess it wouldn't matter) which did exactly what you want. Took us a week or so. A full answer would be posting the code of the whole app, which I obviously cannot do. Some hints, though.
We worked around the certificate problem by uploading certificates to the app (TLS settings) and accessing them in code through the WEBSITE_LOAD_CERTIFICATES setting (you put the thumbprints there; it's also mentioned in the link you posted). Then you can get them in code and build your own certificate store:
var certThumbprintsString = Environment.GetEnvironmentVariable("WEBSITE_LOAD_CERTIFICATES");
var certThumbprints = certThumbprintsString.Split(",").ToList();

var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
store.Open(OpenFlags.ReadOnly);

for (var i = 0; i < certThumbprints.Count; i++)
{
    // Collect the matching certificates into your own collection / custom store.
    var matches = store.Certificates.Find(X509FindType.FindByThumbprint, certThumbprints[i], false);
}
Then we implemented a number of validators: for subject, time, certificate chain (you basically mark your root somehow, walk from the certificate you validate up the chain in your store, and see whether you end up at your root; see the sketch below) and CRL.
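A minimal sketch of that chain-walking idea, not the actual app code; caCerts and trustedRoot are placeholders for the certificates collected above (needs System.Linq):
static bool ChainsToTrustedRoot(X509Certificate2 leaf,
                                IReadOnlyList<X509Certificate2> caCerts,
                                X509Certificate2 trustedRoot)
{
    var current = leaf;
    for (var depth = 0; depth < 10; depth++)              // guard against malformed loops
    {
        if (current.Thumbprint == trustedRoot.Thumbprint)
            return true;                                  // reached the root we marked as trusted
        var issuer = caCerts.FirstOrDefault(c => c.Subject == current.Issuer);
        if (issuer == null)
            return false;                                 // chain broken before reaching our root
        current = issuer;
    }
    return false;
}
(Matching the issuer by subject string is a simplification; the real validators also covered the subject, time and CRL checks described above.)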
For CRL Bouncy Castle offers support:
// Get CRL from certificate, fetch it, cache it
var crlParser = new X509CrlParser();
var crl = crlParser.ReadCrl(data);
var isRevoked = crl.IsRevoked(cert);
Getting the CRL from the certificate is tricky, but doable (I more or less followed this for the purpose: https://learn.microsoft.com/en-us/archive/blogs/joetalksmicrosoft/pki-authentication-as-a-azure-web-app).
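Since that part isn't shown in the post, here is a rough sketch of one way to do it with Bouncy Castle: read the CRL Distribution Points extension, pick out the URL(s), then download and parse. Caching and error handling are omitted, and the helper name is made up:
using Org.BouncyCastle.Asn1.X509;
using Org.BouncyCastle.X509;
using Org.BouncyCastle.X509.Extension;

static IEnumerable<string> GetCrlUrls(System.Security.Cryptography.X509Certificates.X509Certificate2 dotNetCert)
{
    var bcCert = new X509CertificateParser().ReadCertificate(dotNetCert.RawData);
    var ext = bcCert.GetExtensionValue(X509Extensions.CrlDistributionPoints);
    if (ext == null) yield break;                                      // no CRL DP extension

    var distPoint = CrlDistPoint.GetInstance(X509ExtensionUtilities.FromExtensionValue(ext));
    foreach (var dp in distPoint.GetDistributionPoints())
    {
        var dpn = dp.DistributionPointName;
        if (dpn == null || dpn.PointType != DistributionPointName.FullName) continue;
        foreach (var general in GeneralNames.GetInstance(dpn.Name).GetNames())
        {
            if (general.TagNo == GeneralName.UniformResourceIdentifier)
                yield return general.Name.ToString();                  // e.g. an http:// CRL URL
        }
    }
}

// Usage, feeding the earlier snippet:
// var data = new HttpClient().GetByteArrayAsync(GetCrlUrls(dotNetCert).First()).Result;
// var crl = new X509CrlParser().ReadCrl(data);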

Not able to Set Password and Enable Account using C# and admin User

Using WPF & C#, I can set all the attributes in Active Directory, but can't do the following:
1) Can't Set User Password
2) Can't Enable User
However, I can do the same thing manually!
Approach Tried:
1.
DirectoryEntry directoryEntry=
directoryEntry.Invoke("SetPassword", new object[] {myPass#x6712}); // To set password
directoryEntry.Properties["userAcountControl"].Value=0x0200; //To Enable User
2.
DirectoryEntry uEntry = new DirectoryEntry(userDn);
uEntry.Invoke("SetPassword", new object[] { password });
uEntry.Properties["LockOutTime"].Value = 0; //unlock account
3.
using (var context = new PrincipalContext(ContextType.Domain))
{
    using (var user = UserPrincipal.FindByIdentity(context, IdentityType.SamAccountName, userName))
    {
        user.SetPassword("newpassword");
        // or
        user.ChangePassword("oldPassword", "newpassword");
        user.Save();
    }
}
ERROR ON PASSWORD SET: Exception has been thrown by the target invocation.
ERROR ON ENABLE USER: Access is denied.
NOTE: I'm using a Domain Admin User.
The program gives the exception in these above lines.
Please advise! Thanks in advance!
Maybe this is just a mistake in your question, but the code you show in your first example wouldn't compile because the password is not in quotes. It should be:
directoryEntry.Invoke("SetPassword", new object[] {"myPass#x6712"});
That code invokes IADsUser.SetPassword. The 'Remarks' in the documentation point to some prerequisites for it to work, namely, that it must be a secure connection. So it may have failed in setting up a secure connection. It would usually try Kerberos to do that, so something might have gone wrong there.
You can try specifically connecting via LDAPS (LDAP over SSL) by pointing it at port 636 (new DirectoryEntry("LDAP://example.com:636/CN=whatever,DC=example,DC=com")), but that requires that you trust the certificate that is served up. Sometimes it's a self-signed cert, so you would need to add the cert to the trusted certs on whichever computer you run this from.
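A hedged sketch of that LDAPS binding, with placeholder server name, DN and credentials (AuthenticationTypes lives in System.DirectoryServices):
using (var entry = new DirectoryEntry(
    "LDAP://example.com:636/CN=Some User,CN=Users,DC=example,DC=com",
    @"EXAMPLE\adminUser", "adminPassword",
    AuthenticationTypes.Secure | AuthenticationTypes.SecureSocketsLayer))
{
    entry.Invoke("SetPassword", new object[] { "newP@ssw0rd!" });
    entry.CommitChanges();
}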
Or, the account you are running it with does not have the 'Reset Password' permission on the account.
For enabling, the userAccountControl attribute is a bit flag, so you don't want to set it to 2: the value 2 (or more accurately, the second bit) means that the account is disabled. So you want to unset the second bit. You would do that like this:
directoryEntry.Properties["userAcountControl"].Value =
(int) directoryEntry.Properties["userAcountControl"].Value & ~2;
Most of the time that will result in a value of 512 (NORMAL_ACCOUNT), but not necessarily. The account could have other bits set that you don't want to inadvertently unset.
You also need to call .CommitChanges() for the changes to userAccountControl to take effect:
directoryEntry.CommitChanges();

Keyset does not exist when accessing certificate from azure web site

When I access a certificate from the file system, either locally, or on an azure website, with the following code, I have no problems:
X509Certificate2 certificate = new X509Certificate2(
keyFilePath,
"mysecret",
X509KeyStorageFlags.MachineKeySet
| X509KeyStorageFlags.PersistKeySet
| X509KeyStorageFlags.Exportable);
However, when I follow the instructions at https://azure.microsoft.com/en-us/blog/using-certificates-in-azure-websites-applications/ for utilizing the azure certificate store, everything works for the first 3 to 9 requests, and all subsequent calls fail on the following line
var rsa = certificate.PrivateKey as RSACryptoServiceProvider;
with the error "System.Security.Cryptography.CryptographicException: Keyset does not exist" until the site is restarted, and will then work for another at least 3 requests.
I'm baffled as to why it works for at least 3 and up to 9 requests, then always fails with the error. I would appreciate any advice.
When loading from the PFX you were specifying PersistKeySet, which should usually only be set when you are planning on persisting the certificate to a cert store. Though it's possible that some aspect of the code tried being clever and cleaned up the private key on its own, by setting an RSACryptoServiceProvider object's PersistKeyInCsp to false.
The reason I point this out is that the "Keyset does not exist" error almost always means "the cert store was told that a private key existed, but someone has since deleted it without informing the cert store". The most likely culprit is something somewhere setting PersistKeyInCsp to false (which means "delete the key file on Dispose/Finalize").
If you are setting PersistKeyInCsp to false but not disposing the object manually, you'd get a deferred cleanup due to the finalizer, which would explain why it's 3-9 successes instead of a deterministic number.
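In other words, if any code path does something like the following against the same key container, the key file disappears out from under later requests (a sketch of the failure mode described above, not something to copy):
var rsa = (RSACryptoServiceProvider)certificate.PrivateKey;
rsa.PersistKeyInCsp = false;   // "delete the key container when this object goes away"
rsa.Dispose();                 // or wait for the finalizer; either way the key file is removed,
                               // and later private key access fails with "Keyset does not exist"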
(I also feel compelled to point out that you should use cert.GetRSAPrivateKey() instead of cert.PrivateKey, because a) it's type-safe and b) it has a caller-owned lifetime (you're supposed to dispose it) instead of a shared/ambiguous lifetime. It makes things a bit more predictable, though it almost never returns an RSACryptoServiceProvider, so you shouldn't try to cast it.)
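A minimal sketch of that recommended pattern (dataToSign is a placeholder buffer):
using (RSA rsa = certificate.GetRSAPrivateKey())
{
    // Caller-owned lifetime: dispose the RSA object, not the shared PrivateKey instance.
    byte[] signature = rsa.SignData(dataToSign, HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);
}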
Looks like explicit disposal (with using keyword) solves the problem:
using (X509Certificate2 certificate = new X509Certificate2(keyFilePath, "mysecret",
    X509KeyStorageFlags.MachineKeySet |
    X509KeyStorageFlags.PersistKeySet |
    X509KeyStorageFlags.Exportable))
{
    // ... use cert
}
A singleton worked for me:
static class CertificateSingleton
{
    private static X509Certificate2 certificate;

    public static X509Certificate2 Get()
    {
        if (certificate == null)
        {
            byte[] certificateBytes = .....;
            string password = .....;
            certificate = new X509Certificate2(certificateBytes, password,
                X509KeyStorageFlags.MachineKeySet | X509KeyStorageFlags.PersistKeySet | X509KeyStorageFlags.Exportable);
        }
        return certificate;
    }
}
So every time I want to use it, I call CertificateSingleton.Get(), which guarantees I'm using the same instance every time.

"Keyset does not exist" when trying to instantiate a new RSACryptoServiceProvider

I am trying to automate deployment of certificates, including managing permissions on the private key. Using this question, I have cobbled together some code that should update permissions for a certificate:
public static SetPermissionsResult SetPermissions(X509Certificate2 certificate, string userName)
{
    var account = new SecurityIdentifier(WellKnownSidType.NetworkServiceSid, null);
    using (var store = new X509Store(StoreName.My, StoreLocation.LocalMachine))
    {
        store.Open(OpenFlags.MaxAllowed);
        var newCertificate = store.Certificates.Find(X509FindType.FindBySerialNumber, certificate.SerialNumber, false)[0];
        var rsa = newCertificate.PrivateKey as RSACryptoServiceProvider;
        if (rsa == null)
        {
            return SetPermissionsResult.Failure;
        }
        rsa.PersistKeyInCsp = true;

        var cspParams = new CspParameters(
            rsa.CspKeyContainerInfo.ProviderType,
            rsa.CspKeyContainerInfo.ProviderName,
            rsa.CspKeyContainerInfo.KeyContainerName)
        {
            Flags =
                CspProviderFlags.UseExistingKey
                | CspProviderFlags.UseMachineKeyStore,
            CryptoKeySecurity =
                rsa.CspKeyContainerInfo.CryptoKeySecurity,
            KeyNumber = (int)rsa.CspKeyContainerInfo.KeyNumber/*,
            KeyPassword = password*/
        };
        cspParams.CryptoKeySecurity.AddAccessRule(
            new CryptoKeyAccessRule(account, CryptoKeyRights.GenericRead, AccessControlType.Allow));

        using (var rsa2 = new RSACryptoServiceProvider(cspParams))
        {
        }

        return SetPermissionsResult.Success;
    }
}
On the line that reads using (var rsa2 = new RSACryptoServiceProvider(cspParams)) (where the new crypto provider is instantiated to persist the new access rule), I get a CryptographicException "Keyset does not exist".
I know from experience that this generally means that the current security context does not have permission to access the private key. To troubleshoot this possibility, I have done the following:
Figure out what the current user is with System.Security.Principal.WindowsIdentity.GetCurrent() in the Immediate window
Ensure that that user has Full control permission in the MMC snap-in for certificates (and double-checked that I'm looking at the right store for it)
Granted Full control to the Everyone user.
Tried creating the CspParameters object with and without the KeyPassword (you can see it commented there).
I am out of ideas. The certificate is a bogus self-signed test cert, so it's not a matter of other certs in the chain missing permissions. Any help would be appreciated.
UPDATE:
I've been able to get this code to execute by modifying some of the flags for the installation of the certificate that precedes this step. Now, the code executes, apparently successfully, but to no visible effect I can see in the MMC.
I resolved this issue, more or less. The code for managing permissions listed above is sound. What was problematic was the code for instantiating the certificate before adding it. The common-sense way to do that looks like this:
var certificate = new X509Certificate2(bytes, password);
The problem with this is that it doesn't cause the private key to be persisted in the store, which is what the permissions code relies on (duh). To cause this to happen, you need to instantiate the certificate to be added to the store like this:
var certificate = new X509Certificate2(bytes, "password123", X509KeyStorageFlags.Exportable | X509KeyStorageFlags.MachineKeySet | X509KeyStorageFlags.PersistKeySet);
And add it like this:
using (var store = new X509Store(storeName, StoreLocation.LocalMachine))
{
store.Open(OpenFlags.MaxAllowed);
store.Add(certificate);
}
FINAL POINT: the MMC snap-in is dicey about displaying the updated permissions, i.e., it may not display the new permissions even though they actually have changed. I reached a point where I had everything working perfectly, but didn't realize it because the MMC was showing me otherwise. Closing the MMC and reopening it forces a refresh if you're doubting what you're seeing.

X509 certificate not loading private key file on server

I'm using the Google Analytics API and I followed this SO question to set up the OAuth: https://stackoverflow.com/a/13013265/1299363
Here is my OAuth code:
public void SetupOAuth()
{
    var Cert = new X509Certificate2(
        PrivateKeyPath,
        "notasecret",
        X509KeyStorageFlags.Exportable);

    var Provider = new AssertionFlowClient(GoogleAuthenticationServer.Description, Cert)
    {
        ServiceAccountId = ServiceAccountUser,
        Scope = ApiUrl + "analytics.readonly"
    };

    var Auth = new OAuth2Authenticator<AssertionFlowClient>(Provider, AssertionFlowClient.GetState);
    Service = new AnalyticsService(Auth);
}
PrivateKeyPath is the path of the private key file provided by Google API Console. This works perfectly on my local machine, but when I push it up to our test server I get
System.Security.Cryptography.CryptographicException: An internal error occurred.
with the following stack trace (irrelevant parts removed):
System.Security.Cryptography.CryptographicException.ThrowCryptographicException(Int32 hr) +33
System.Security.Cryptography.X509Certificates.X509Utils._LoadCertFromFile(String fileName, IntPtr password, UInt32 dwFlags, Boolean persistKeySet, SafeCertContextHandle& pCertCtx) +0
System.Security.Cryptography.X509Certificates.X509Certificate.LoadCertificateFromFile(String fileName, Object password, X509KeyStorageFlags keyStorageFlags) +237
System.Security.Cryptography.X509Certificates.X509Certificate2..ctor(String fileName, String password, X509KeyStorageFlags keyStorageFlags) +140
Metrics.APIs.GoogleAnalytics.SetupOAuth() in <removed>\Metrics\APIs\GoogleAnalytics.cs:36
Metrics.APIs.GoogleAnalytics..ctor(String PrivateKeyPath) in <removed>\Metrics\APIs\GoogleAnalytics.cs:31
So it appears as if it is having trouble loading the file. I've checked the PrivateKeyPath that is passed in and it is pointing to the correct location.
Any ideas? I don't know if this is an issue with the server, the file, the code or what.
One of the things that comes to mind is the identity of your app pool; make sure that Load User Profile is turned on, otherwise the crypto subsystem does not work.
I'm loading my p12 file with
new X509Certificate2(
    HostingEnvironment.MapPath(@"~/App_Data/GoogleAnalytics-privatekey.p12"), ....
I actually got a FileNotFoundException even though File.Exists(filename) returned true.
As @Wiktor Zychla said, it's as simple as enabling Load User Profile.
Just right-click the app pool under 'Application Pools' in IIS and select 'Advanced Settings'; the setting you need is about halfway down.
Tip: I'd recommend commenting your code with this to prevent wasted time in the future, since the issue is so obscure if you've never come across it before.
// If this gives FileNotFoundException see
// http://stackoverflow.com/questions/14263457/
Also try specifying X509KeyStorageFlags
var certificate = new X509Certificate2(KeyFilePath, KeyFilePassword,
X509KeyStorageFlags.MachineKeySet | X509KeyStorageFlags.PersistKeySet |
X509KeyStorageFlags.Exportable);
As mentioned above, you need to configure IIS, but in our case we also needed to check the permissions on the following folder:
C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys
If you set the X509KeyStorageFlags parameter, a key file will be created in this folder. In my case the folder's permissions were the problem: the application pool account had not been granted access to it.
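If you'd rather grant access to the specific key file than loosen the whole folder, a sketch along these lines works on .NET Framework; the pool account name "IIS AppPool\MyAppPool" is a placeholder, and the path assumes the default CSP key directory above:
var rsa = (RSACryptoServiceProvider)certificate.PrivateKey;
string keyFileName = rsa.CspKeyContainerInfo.UniqueKeyContainerName;
string keyFilePath = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData),
    @"Microsoft\Crypto\RSA\MachineKeys", keyFileName);

// Grant the application pool account read access to the key file.
var fileInfo = new FileInfo(keyFilePath);
FileSecurity acl = fileInfo.GetAccessControl();
acl.AddAccessRule(new FileSystemAccessRule(@"IIS AppPool\MyAppPool",
    FileSystemRights.Read, AccessControlType.Allow));
fileInfo.SetAccessControl(acl);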
Nope, is "File.Exists(...)" also in advanced settings? I had 3 pools, all of them had true enabled for "Load User Profile". I'm thinking my problem might have something to do with dependencies and NuGet Packages as the code worked just fine as a Console App but gives me problem in MVC.
