Firestore C#: Missing or insufficient permissions

I am currently exploring Firebase's Cloud Firestore in C# and have encountered an error that I could not resolve after searching SO. Resources for Firestore in C# seem quite limited:
"Status(StatusCode=PermissionDenied, Detail=\"Missing or insufficient permissions.\")"
My code so far:
using System;
using System.Threading.Tasks;
using Google.Cloud.Firestore;

class Program
{
    static void Main(string[] args)
    {
        Console.WriteLine("Hello World!");
        Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", "FirebaseCsharpTest.json");
        string project = "MyProjectId";
        FirestoreDb db = FirestoreDb.Create(project);
        Console.WriteLine("Created Cloud Firestore client with project ID: {0}", project);
        AddPerson(db).GetAwaiter().GetResult();
    }

    public static async Task AddPerson(FirestoreDb db)
    {
        CollectionReference collection = db.Collection("users");
        DocumentReference document = await collection.AddAsync(new
        {
            Name = new { First = "Ada", Last = "Lovelace" },
            Born = 1815
        });
    }
}
I have checked on my Firebase console that the Firestore security rules are set to public (for now, for testing's sake). I have also ensured that the authentication JSON file is the right file generated from the Google Developer Console, as suggested in this post.
Is there something I'm missing?
EDIT:
My permissions on the Google Cloud console:

Do check the project ID that you have been using.
The project ID is supposed to be the one specified in your JSON key file.

This is a Cloud IAM issue, not a security rules issue. The C# server client library creates a privileged server environment that doesn't take Cloud Firestore security rules into account. It seems that either your key file's service account doesn't have the correct IAM role or your code is not finding your key file.
Try using the full path to your keyfile instead of the relative path.
You can also try setting up a new service account and key file as described here.
Also, make sure you are changing the project variable:
string project = "NewFirebaseCsharpTest";
FirestoreDb db = FirestoreDb.Create(project);

Change the project string to the ID of the project, not the name of the project. The ID will probably start with the name and have a numeric suffix.
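One way to rule out a project-ID mismatch entirely is to read the ID straight out of the key file, and to use a full path so the library can find the file regardless of the working directory. A minimal sketch, assuming the key file name from the question:

```
using System;
using System.IO;
using System.Text.Json;
using Google.Cloud.Firestore;

// Use the full path so the library finds the key file regardless of the
// working directory.
var keyPath = Path.GetFullPath("FirebaseCsharpTest.json");
Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", keyPath);

// Read the project ID out of the same key file, so the credentials and
// the project ID can never disagree.
using var doc = JsonDocument.Parse(File.ReadAllText(keyPath));
var projectId = doc.RootElement.GetProperty("project_id").GetString();

FirestoreDb db = FirestoreDb.Create(projectId);
```

Every service account key file contains a "project_id" field, so this also doubles as a quick check that the file you are loading belongs to the project you think it does.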
If that fails, go to the Logs Explorer and query the logs. You will see an error that explains why the permission was denied.
For security reasons, you will never get enough information from the client side: Google isn't going to tell someone trying to gain access to something they shouldn't have access to what exactly went wrong. An admin, however, can view the logs on the server side, where all the information is recorded.

Related

Service account is not authorized to manage project

I recently joined a team and am adding the Android Management API to an already existing project. I enabled the Management API, created service accounts with permissions, and am writing a .NET project to test it out.
I made a service account with Android Management User and Owner permissions. However, when I try to use the .NET library to create an enterprise, I get:
The service androidmanagement has thrown an exception. HttpStatusCode is Forbidden. Caller is not authorized to manage project.
If it helps:
The API key I'm using is allowed to call any API, and the application name is a temporary one that does NOT match the project name. As for the service account with private key, I am using a FileStream to read a .json file downloaded when the service account was created.
This is my code, based on the sample app: https://developers.google.com/android/management/sample-app
The error gets thrown on createRequest.Execute():
string CreateEnterprise()
{
    SignupUrlsResource.CreateRequest signupUrlRequest = managementService.SignupUrls.Create();
    signupUrlRequest.ProjectId = cloud_project_id;
    signupUrlRequest.CallbackUrl = "https://www.yahoo.com";
    var signupUrl = signupUrlRequest.Execute();
    string enterpriseToken = signupUrl.Url;
    Console.WriteLine("Signup: " + enterpriseToken);

    EnterprisesResource.CreateRequest createRequest = managementService.Enterprises.Create(new Enterprise());
    createRequest.ProjectId = "Test Project";
    createRequest.SignupUrlName = signupUrl.Name;
    createRequest.EnterpriseToken = enterpriseToken;
    var enterprise = createRequest.Execute();
    return enterprise.Name;
}
Turns out createRequest.ProjectId must match the ID of the project that has the Android Management API enabled, i.e. the project I'm working with.
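A sketch of the fix, reusing the question's managementService and signupUrl variables; the project ID below is a placeholder for the real cloud project ID (not a display name like "Test Project"):

```
// Placeholder: substitute the ID of the cloud project that has the
// Android Management API enabled.
string cloudProjectId = "my-amapi-project";

EnterprisesResource.CreateRequest createRequest =
    managementService.Enterprises.Create(new Enterprise());
createRequest.ProjectId = cloudProjectId;  // was "Test Project" before the fix
createRequest.SignupUrlName = signupUrl.Name;
createRequest.EnterpriseToken = enterpriseToken;
var enterprise = createRequest.Execute();
```

Using the same project-ID variable for both the signup URL request and the enterprise creation request avoids the two silently pointing at different projects.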

Cross Account Access to DynamoDb tables C#

I have been dealing with an issue for which I am not able to find a solution anywhere online.
I have code that connects to AWS DynamoDB and performs read/write operations on one or more tables. This worked fine as long as my code and the DynamoDB tables were in the same AWS account. The code uses the IAM role attached to the web server, and the role has all the necessary permissions assigned to it.
private AmazonDynamoDBClient GetDbClient(int ConnectionTimeOut, int ReadWriteTimeOut, int MaxRetry)
{
    AmazonDynamoDBConfig clientConfig = new AmazonDynamoDBConfig
    {
        Timeout = TimeSpan.FromMilliseconds(ConnectionTimeOut),
        ReadWriteTimeout = TimeSpan.FromMilliseconds(ReadWriteTimeOut),
        MaxErrorRetry = MaxRetry
    };
    return new AmazonDynamoDBClient(clientConfig);
}
Recently I needed to move my code to a different AWS account, and things started going crazy.
I have already taken the following steps:
VPC peering is done between the VPC in the old AWS account and the new AWS account.
Cross-account permissions on the DynamoDB tables are granted to the role used by the web server on the new AWS account.
With these changes I no longer see permission errors, but the code looks for the tables in the new AWS account.
It is clear from the code that the AWS account ID is not used anywhere while creating the DynamoDB client, so I assumed I should be able to tell the code where to look for the DynamoDB table. But the C# SDK for DynamoDB has no provision for supplying an AWS account ID when creating the client.
So my issue here is with the C# code connecting to the DynamoDB service, not with the IAM roles and permissions on AWS (for those I am able to find plenty of solutions).
Found this question aws cross account dynamodb access with IAM role with similar issue but it does not suggest the fix to do in the code.
One way to proceed is to use Security Token Service. First you need to assume a role and get temporary credentials:
Credentials GetCredentials(string roleArn)
{
    using (var stsClient = new AmazonSecurityTokenServiceClient())
    {
        try
        {
            // RoleSessionName is required by the API; the value is arbitrary.
            var response = stsClient.AssumeRole(new AssumeRoleRequest
            {
                RoleArn = roleArn,
                RoleSessionName = "cross-account-session"
            });
            return response.HttpStatusCode == System.Net.HttpStatusCode.OK
                ? response.Credentials
                : null;
        }
        catch (AmazonSecurityTokenServiceException)
        {
            return null;
        }
    }
}
You can then use the credentials to initiate your DynamoDB client.
See another example here.
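Putting the two steps together, a minimal sketch of a client that resolves tables in the other account; the role ARN is a placeholder and the region is an assumption:

```
using Amazon;
using Amazon.DynamoDBv2;
using Amazon.SecurityToken;
using Amazon.SecurityToken.Model;

static class CrossAccountDynamoDb
{
    public static AmazonDynamoDBClient Create(string roleArn)
    {
        using (var sts = new AmazonSecurityTokenServiceClient())
        {
            // RoleSessionName is required; the value is arbitrary.
            var response = sts.AssumeRole(new AssumeRoleRequest
            {
                RoleArn = roleArn,
                RoleSessionName = "cross-account-dynamodb"
            });

            // The temporary credentials identify the role in the table's
            // account, so the client now looks up tables there.
            return new AmazonDynamoDBClient(
                response.Credentials.AccessKeyId,
                response.Credentials.SecretAccessKey,
                response.Credentials.SessionToken,
                RegionEndpoint.USEast1); // region is an assumption
        }
    }
}
```

Note that the temporary credentials expire (one hour by default), so long-running services need to refresh them rather than caching the client forever.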
The AWS SDK and CLI (whether it's running locally or on (say) an EC2 instance) looks in the following locations for credentials:
Command line options
Environment variables
CLI credentials file
CLI configuration file
Container credentials
Instance profile credentials
If you have a credentials file configured then, assuming you are running under the default profile, this indirectly defines the account under which the code runs, via the access key provided.
You can also set AWS-specific environment variables, such as AWS_ACCESS_KEY_ID, which take precedence over the credentials file.
https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html

Access to path is denied in Production, works in Dev

This is frustratingly weird. I'm suddenly getting this error in Production, but not locally in development.
Specs:
.Net Core 3.1
IIS 10
Access to the path '\\[network share]\Files\Video' is denied.
It occurs when this method is called:
public Task<bool> UploadAsync()
{
    if (request.Metadata.ChunkIndex == 0)
    {
        if (!Directory.Exists(request.Metadata.UploadLocation))
        {
            Directory.CreateDirectory(request.Metadata.UploadLocation);
        }
    }
    var fixedFileName = FileManager.FixFileName(request.Metadata.FileName);
    var basePath = Path.Combine(request.Metadata.UploadLocation, PARTIALS);
    if (!Directory.Exists(basePath))
    {
        Directory.CreateDirectory(basePath);
    }
    var filePath = Path.Combine(basePath, fixedFileName);
    AppendToFile(filePath, request.File);
    return Task.FromResult(true);
}
... and this line is called:
Directory.CreateDirectory(request.Metadata.UploadLocation)
It seems like it's due to some missing permission between IIS and the web application, since it works when I run the application under IIS Express.
And this is important, the application in both cases is pointing to the same network share location to drop the video file.
So it seems like the permissions on the network share folder are correct, but IIS can't access it. That's my theory, but my network administrator says he hasn't changed anything with the application's app pool, so I don't have any other ideas.
Any ideas what is causing this issue?
According to your description and the error code, this is related to a permission issue.
To solve it, I suggest you first change the IIS application pool identity to a user which has enough permission to access the network share, and try again. If this works, the problem is the application pool identity account's permissions.
For more details about how to modify the application pool identity, you can refer to this answer.
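Before changing anything, it can help to confirm which identity the worker process actually runs under. A minimal diagnostic sketch (assumes Windows), which you could drop into application startup:

```
using System;
using System.Security.Principal;

// Under IIS this prints the application pool identity (e.g. "IIS APPPOOL\MyApp");
// under IIS Express it prints the developer's own Windows account, which is
// usually why the share is reachable in development but not in production.
var identityName = WindowsIdentity.GetCurrent().Name;
Console.WriteLine($"Worker process is running as: {identityName}");
```

If the logged identity is the default ApplicationPoolIdentity, the network share would need to grant access to the machine account (DOMAIN\MACHINENAME$), or the pool needs to run as a domain user that has rights on the share.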

Azure Function fails to find file

I have an Azure Function that uses the Azure context. When I execute my function from Visual Studio 2019 on my machine, it executes correctly. However, when I publish it to my Azure account, I get an error that my my.azureauth file cannot be found.
Could not find file 'D:\Program Files (x86)\SiteExtensions\Functions\2.0.12950\32bit\my.azureauth'
The code that is used:
var authFilePath = "my.azureauth";
Console.WriteLine($"Authenticating with Azure using credentials in file at {authFilePath}");
azure = Azure.Authenticate(authFilePath).WithDefaultSubscription();
sub = azure.GetCurrentSubscription();
Console.WriteLine($"Authenticated with subscription '{sub.DisplayName}' (ID: {sub.SubscriptionId})");
This is code that I found on one of the Microsoft tutorials. I have set my my.azureauth file to "Copy Always".
Could anyone point me in the right direction?
You get this file path because Directory.GetCurrentDirectory() returns D:\Program Files (x86)\SiteExtensions\Functions\2.0.12950\32bit instead of D:\home\site\wwwroot\ or D:\home\site\wwwroot\FunctionName.
If you want to get the wwwroot folder or the function app directory, you should use ExecutionContext. For more information you can refer to this wiki doc.
So the right file path should be context.FunctionDirectory + "\my.azureauth" or context.FunctionAppDirectory + "\my.azureauth"; which one to use depends on where your file is stored.
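A minimal sketch of how this looks in an in-process C# function; the function name, trigger, and file location at the app root are assumptions:

```
using System.IO;
using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class AuthenticateFunction
{
    [FunctionName("AuthenticateFunction")]
    public static void Run(
        [TimerTrigger("0 0 * * * *")] TimerInfo timer,
        ILogger log,
        ExecutionContext context)
    {
        // FunctionAppDirectory is the deployed wwwroot folder; the file is
        // assumed to sit at its root ("Copy Always" in the project file).
        var authFilePath = Path.Combine(context.FunctionAppDirectory, "my.azureauth");
        var azure = Azure.Authenticate(authFilePath).WithDefaultSubscription();
        log.LogInformation("Authenticated with subscription '{Name}'",
            azure.GetCurrentSubscription().DisplayName);
    }
}
```

Path.Combine is preferable to string concatenation here, since it handles the directory separator for you.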
I have found that Kudu is extremely useful in seeing what has been deployed to Azure.
Navigate to your function in the Azure portal.
The instructions here will help get to the kudu console.
https://www.gslab.com/blogs/kudu-azure-web-app
From there you can browse the files which have been deployed into your function's file system.
If you add ", ExecutionContext context" at the end of the function's run entry point parameters, you can then get the folder your function is running from with var path = context.FunctionAppDirectory;.
PS apologies for any formatting I am editing this on my phone.
Welcome to Stack Overflow.
Firstly, I'd recommend strongly against using file-based authentication as shown in your question.
From notes:
Note, file-based authentication is an experimental feature that may or may not be available in later releases. The file format it relies on is subject to change as well.
Instead, I would personally store the connection string details (AzureCredentials) in the config file (Web/SiteSettings) and use the provided constructor...
Again, the below are taken from the documentation notes:
Similarly to the file-based approach, this method requires a service principal registration, but instead of storing the credentials in a local file, the required inputs can be supplied directly via an instance of the AzureCredentials class:
var creds = new AzureCredentialsFactory().FromServicePrincipal(client, key, tenant, AzureEnvironment.AzureGlobalCloud);
var azure = Azure.Authenticate(creds).WithSubscription(subscriptionId);
or
var creds = new AzureCredentialsFactory().FromServicePrincipal(client, pfxCertificatePath, password, tenant, AzureEnvironment.AzureGlobalCloud);
var azure = Azure.Authenticate(creds).WithSubscription(subscriptionId);
where client, tenant, subscriptionId, and key or pfxCertificatePath and password are strings with the required pieces of information about your service principal and subscription. The last parameter, AzureEnvironment.AzureGlobalCloud represents the Azure worldwide public cloud. You can use a different value out of the currently supported alternatives in the AzureEnvironment enum.
The first example is most likely the one you should be looking at.
The notes I got this information from can be accessed here.
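Combining the two recommendations above, a sketch that builds the credentials from app settings instead of a file; the setting names are hypothetical, stand-ins for whatever keys you configure in your Function App's application settings:

```
using System;
using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent.Authentication;

// Hypothetical setting names: supply these through App Settings /
// environment variables rather than a checked-in credentials file.
var client = Environment.GetEnvironmentVariable("AZURE_CLIENT_ID");
var key = Environment.GetEnvironmentVariable("AZURE_CLIENT_SECRET");
var tenant = Environment.GetEnvironmentVariable("AZURE_TENANT_ID");
var subscriptionId = Environment.GetEnvironmentVariable("AZURE_SUBSCRIPTION_ID");

var creds = new AzureCredentialsFactory()
    .FromServicePrincipal(client, key, tenant, AzureEnvironment.AzureGlobalCloud);
var azure = Azure.Authenticate(creds).WithSubscription(subscriptionId);
```

This way the secret never ships with the deployment package, and rotating it is a configuration change rather than a redeploy.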
If you have some problems with AAD, these screenshots may help you.
Client ID:
Key:
Please note that the Key value can only be copied when it is created, after which it will be hidden.
Hope this helps you get started with AAD quickly.

Accessing Google Cloud Datastore locally from ASP.NET throws Grpc.Core.RpcException: "Missing or insufficient permissions."

I'm following the Using Cloud Datastore with .NET tutorial. At some point it says that you can run the provided project locally by just pressing F5. When I do that I get the following exception
Grpc.Core.RpcException: 'Status(StatusCode=PermissionDenied, Detail="Missing or insufficient permissions.")'
This exception is thrown exactly at the _db.RunQuery(query) line.
var query = new Query("Book") { Limit = pageSize };
if (!string.IsNullOrWhiteSpace(nextPageToken))
query.StartCursor = ByteString.FromBase64(nextPageToken);
var results = _db.RunQuery(query);
If I deploy the application to the cloud it works as expected, no error. I've given Datastore Owner permissions to all accounts in Cloud IAM (Identity and Access Management) but it still doesn't work. Does anybody have any ideas?
As Jon Skeet pointed out, I was using the wrong json key locally. Previously I created a compute engine that has a separate service account. After I downloaded a new json key from Console -> IAM & Admin -> Service Accounts it worked locally as well.
