Azure - Programmatically add new MySQL db instance - c#

I am trying to programmatically add a new MySQL database instance to Azure.
I looked at this library: https://github.com/Azure/azure-libraries-for-net but I only see an example of how to create a SQL Server instance, not a MySQL one.
var credentials = SdkContext.AzureCredentialsFactory
    .FromFile(Environment.GetEnvironmentVariable("AZURE_AUTH_LOCATION"));

var azure = Azure
    .Configure()
    .WithLogLevel(HttpLoggingDelegatingHandler.Level.Basic)
    .Authenticate(credentials)
    .WithDefaultSubscription();

var sqlServer = azure.SqlServers.Define(sqlServerName)
    .WithRegion(Region.USEast)
    .WithNewResourceGroup(rgName)
    .WithAdministratorLogin(AdministratorLogin)
    .WithAdministratorPassword(AdministratorPassword)
    .WithNewFirewallRule(FirewallRuleIPAddress)
    .WithNewFirewallRule(FirewallRuleStartIPAddress, FirewallRuleEndIPAddress)
    .Create();

var database = sqlServer.Databases
    .Define(DatabaseName)
    .Create();
Any idea if it's supported to programmatically create a MySQL server as well?

Looking at the release notes and the SDK code, managing MySQL databases is still not supported (as of version 1.3).
What you could do is consume the REST API for managing Azure MySQL databases directly. For creating a database, please look at the Create Or Update Database operation.
To get started with the Azure REST API, you may find this link useful: https://learn.microsoft.com/en-us/rest/api/.
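For completeness, a rough sketch of what that REST call could look like from C# with HttpClient; the subscription, resource group, server and database names are placeholders, the api-version may differ, and the bearer token is assumed to be acquired separately via Azure AD:
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

static async Task CreateMySqlDatabaseAsync(string accessToken)
{
    // Placeholder ARM resource path -- substitute your own identifiers.
    var url = "https://management.azure.com/subscriptions/{subscriptionId}" +
              "/resourceGroups/{resourceGroupName}" +
              "/providers/Microsoft.DBforMySQL/servers/{serverName}" +
              "/databases/{databaseName}?api-version=2017-12-01";

    using (var http = new HttpClient())
    {
        // Azure AD bearer token for ARM, acquired e.g. with a service principal.
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);

        var body = new StringContent(
            "{ \"properties\": { \"charset\": \"utf8\", \"collation\": \"utf8_general_ci\" } }",
            Encoding.UTF8, "application/json");

        var response = await http.PutAsync(url, body);
        response.EnsureSuccessStatusCode();
    }
}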

Related

Cross Account Access to DynamoDb tables C#

I have been dealing with an issue for which I am not able to find a solution anywhere online.
I have code that connects to AWS DynamoDB and performs read/write operations on one or more tables. This worked fine as long as my code and the DynamoDB table were in the same AWS account. The code uses the IAM role attached to the web server; the role has all the necessary permissions assigned to it.
private AmazonDynamoDBClient GetDbClient(int ConnectionTimeOut, int ReadWriteTimeOut, int MaxRetry)
{
    AmazonDynamoDBConfig clientConfig = new AmazonDynamoDBConfig
    {
        Timeout = TimeSpan.FromMilliseconds(ConnectionTimeOut),
        ReadWriteTimeout = TimeSpan.FromMilliseconds(ReadWriteTimeOut),
        MaxErrorRetry = MaxRetry
    };
    return new AmazonDynamoDBClient(clientConfig);
}
Recently I needed to move my code to a different AWS account, and things started going wrong.
I have already taken the following steps:
VPC peering is done between the VPC in the old AWS account and the new AWS account.
Cross-account permissions on the DynamoDB tables are granted to the role used by the web server in the new AWS account.
With this change I no longer see permission errors, but the code looks for the table in the new AWS account.
The code clearly does not use an AWS account ID anywhere while creating the DynamoDB client, so I assumed I should be able to tell the code where to look for the table. But the C# SDK for DynamoDB has no provision for supplying an AWS account ID when creating the client.
So my issue is with the C# code that connects to the DynamoDB service, not with the IAM roles and permissions on AWS (for that I am able to find plenty of solutions).
I found the question aws cross account dynamodb access with IAM role, which describes a similar issue, but it does not suggest a fix to make in the code.
One way to proceed is to use the Security Token Service (STS). First you assume the role to get temporary credentials:
Credentials GetCredentials(string roleArn)
{
    using (var stsClient = new AmazonSecurityTokenServiceClient())
    {
        try
        {
            // AssumeRole requires a session name in addition to the role ARN
            var response = stsClient.AssumeRole(new AssumeRoleRequest
            {
                RoleArn = roleArn,
                RoleSessionName = "cross-account-session"
            });
            if (response.HttpStatusCode == System.Net.HttpStatusCode.OK)
                return response.Credentials;
        }
        catch (AmazonSecurityTokenServiceException)
        {
            // swallow the error and fall through to return null
        }
        return null;
    }
}
You can then use the credentials to initialize your DynamoDB client.
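For example, a minimal sketch of wiring those temporary credentials into the client (the role ARN and region are placeholders):
// Wrap the temporary STS credentials for use by the SDK.
var stsCredentials = GetCredentials("arn:aws:iam::123456789012:role/CrossAccountRole");
var sessionCredentials = new Amazon.Runtime.SessionAWSCredentials(
    stsCredentials.AccessKeyId,
    stsCredentials.SecretAccessKey,
    stsCredentials.SessionToken);

// The client now calls DynamoDB as the assumed role in the other account.
var client = new AmazonDynamoDBClient(sessionCredentials, new AmazonDynamoDBConfig
{
    RegionEndpoint = Amazon.RegionEndpoint.USEast1 // adjust to your region
});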
See another example here.
The AWS SDK and CLI (whether running locally or on, say, an EC2 instance) look in the following locations for credentials, in order:
Command line options
Environment variables
CLI credentials file
CLI configuration file
Container credentials
Instance profile credentials
If you have a credentials file configured then, assuming you're running under the default profile, this will indirectly define the account via the access key provided.
You can also define AWS-specific environment variables, such as AWS_ACCESS_KEY_ID, which take precedence over the credentials file.
https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html
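For instance, a small sketch of resolving a named profile from the credentials file and passing it to the client explicitly (the profile name is a placeholder; CredentialProfileStoreChain lives in Amazon.Runtime.CredentialManagement in newer SDK versions):
using Amazon.DynamoDBv2;
using Amazon.Runtime;
using Amazon.Runtime.CredentialManagement;

var chain = new CredentialProfileStoreChain();
if (chain.TryGetAWSCredentials("cross-account-profile", out AWSCredentials credentials))
{
    // Credentials resolved from the shared credentials file, not the instance profile.
    var client = new AmazonDynamoDBClient(credentials);
}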

Mongo client API with resource token

I have been trying to use a Cosmos DB account with the Mongo API and multiple databases, and I want to generate resource tokens for the individual resources. I can see the implementation for DocumentDB, like below.
client = new DocumentClient(new Uri(endpointUrl), resourceToken);
However, I am looking for the equivalent implementation with MongoDB.Driver:
MongoClientSettings settings = new MongoClientSettings();
settings.Server = new MongoServerAddress(host, 10255);
settings.UseSsl = true;
settings.SslSettings = new SslSettings();
settings.SslSettings.EnabledSslProtocols = SslProtocols.Tls12;
MongoIdentity identity = new MongoInternalIdentity(dbName, userName);
MongoIdentityEvidence evidence = new PasswordEvidence(tokepass2);
settings.Credential = new MongoCredential("SCRAM-SHA-1", identity, evidence);
MongoClient client = new MongoClient(settings);
I am trying to replace "tokepass2" with the resource token that is generated, but that does not work; it fails with the exception:
One or more errors occurred. (Unable to authenticate using sasl protocol mechanism SCRAM-SHA-1.)
I know we can make a REST-based POST call with the token in the header, but I am looking for an implementation based on the Mongo client, in case someone has implemented one.
Unfortunately, I don't think it can be implemented with the C# MongoDB driver. Based on Wire protocol compatibility:
Azure Cosmos DB implements wire protocols of common NoSQL databases
including Cassandra, MongoDB, Gremlin, and Azure Tables Storage. By
providing a native implementation of the wire protocols directly and
efficiently inside Cosmos DB, it allows existing client SDKs, drivers,
and tools of the NoSQL databases to interact with Cosmos DB
transparently. Cosmos DB does not use any source code of the databases
for providing wire-compatible APIs for any of the NoSQL databases.
By default, new accounts created using Azure Cosmos DB's API for
MongoDB are compatible with version 3.6 of the MongoDB wire protocol.
Any MongoDB client driver that understands this protocol version
should be able to natively connect to Cosmos DB.
The Cosmos DB Mongo API only implements the MongoDB wire protocol; it does not have its own dedicated SDK. Other MongoDB drivers (the Mongo C# driver, mongoose, etc.) are built for MongoDB itself, not for the Cosmos DB Mongo API, so the resource token feature cannot be supported by those drivers directly. You can't replace the master key with a resource token.
If you do want to use resource tokens, you could:
1. Use the REST API, as you mentioned in your question.
2. Migrate the MongoDB data to the Cosmos DB SQL API. Please refer to this link: https://learn.microsoft.com/en-us/azure/cosmos-db/import-data
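If you go the SQL API route, a rough sketch of minting a resource token with the DocumentDB SDK could look like this (the database, user, and permission ids are placeholders; client, collection, and endpointUrl are assumed to already exist):
// Create a user in the database, then a permission scoped to one collection.
var user = await client.CreateUserAsync(
    UriFactory.CreateDatabaseUri("db"),
    new User { Id = "appUser" });

var permission = await client.CreatePermissionAsync(
    user.Resource.SelfLink,
    new Permission
    {
        Id = "readColl",
        PermissionMode = PermissionMode.Read,
        ResourceLink = collection.SelfLink
    });

// The permission's token is the resource token; use it instead of the master key.
string resourceToken = permission.Resource.Token;
var scopedClient = new DocumentClient(new Uri(endpointUrl), resourceToken);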

Is it possible to create a new Gremlin Graph DB and Graph using c# code

I have an Azure Cosmos DB account with the Gremlin API. I'm using Gremlin.Net to query the db.
var gremlinServer = new GremlinServer(hostname, port, enableSsl: true,
    username: "/dbs/" + database + "/colls/" + collectionName,
    password: authKey);
The username parameter takes the database name and the collection name.
Just wondering how to create a new graph database and a graph under the account using C# code.
Thanks
I searched the Azure Gremlin.Net source code; no methods for creating a database or a graph could be found. There is a statement mentioned in the above link:
This sample uses the open-source Gremlin.Net driver to connect to an
Azure Cosmos DB Graph API account and run some basic Create, Read,
Update, Delete Gremlin queries.
It seems that we can only execute Gremlin queries; we can't create, read, update, or delete the database itself.
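For example, a minimal sketch of what the driver does let you do, submitting a traversal against an existing graph (reusing the gremlinServer from the question; the serializer settings Cosmos DB expects may vary by driver version):
using (var gremlinClient = new GremlinClient(gremlinServer))
{
    // Queries against an existing graph work; there is no API surface
    // for creating the database or the graph itself.
    var vertexCount = await gremlinClient.SubmitWithSingleResultAsync<long>("g.V().count()");
}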
I also searched the Cosmos DB REST API; there is no special API for CRUD on the Graph API itself, only operations on databases and collections. So I tested the code below with the DocumentDB .NET SDK, and it works for me.
client = new DocumentClient(new Uri(ConfigurationManager.AppSettings["endpoint"]),
    ConfigurationManager.AppSettings["authKey"]);

await client.CreateDatabaseAsync(new Database { Id = "db" });

await client.CreateDocumentCollectionAsync(
    UriFactory.CreateDatabaseUri("db"),
    new DocumentCollection { Id = "coll" },
    new RequestOptions { OfferThroughput = 400 });
I know it is strange (there is no "collection" in the Cosmos DB Graph API; it is supposed to be a graph), but it works for me. You could try it on your side.
Hope it helps you.

Azure Database export with C#

My C# program works with an Azure database.
I'm using the Microsoft.Rest and Microsoft.Azure.Management libraries to do some stuff (DB copy, manipulation, deletion, etc.).
I am trying to export an Azure DB, but I can't find out how to do that in C#.
Does anyone know how I can do that, or direct me to an example?
Based on my understanding, you are talking about an Azure SQL database. I assume that you want to export your Azure SQL database to a BACPAC file.
I try to do an export of an Azure DB, but I can't find how to do that in C#.
According to your description, I checked the Microsoft Azure Management Libraries and found that you could refer to the following code snippet for exporting your Azure SQL database to Azure Blob storage:
CertificateCloudCredentials credential = new CertificateCloudCredentials(
    "{subscriptionId}", "{managementCertificate}");

var sqlManagement = new SqlManagementClient(credential);

var result = sqlManagement.Dac.Export("{serverName}", new DacExportParameters()
{
    BlobCredentials = new DacExportParameters.BlobCredentialsParameter()
    {
        StorageAccessKey = "{storage-account-accesskey}",
        Uri = new Uri("https://{storage-accountname}.blob.core.windows.net/{container-name}")
    },
    ConnectionInfo = new DacExportParameters.ConnectionInfoParameter()
    {
        DatabaseName = "{dbname}",
        ServerName = "{serverName}.database.windows.net",
        UserName = "{username}",
        Password = "{password}"
    }
});
And you could use sqlManagement.Dac.GetStatus to retrieve the status of the export operation.
Additionally, the Microsoft Azure Management Libraries use Export Database (classic); for the newer Resource Manager-based REST API, you could refer to here. Moreover, you could create a storage account and leverage Microsoft Azure Storage Explorer as a simple way to manage your storage resources; for more details, you could refer to here.
I've found the solution to my problem:
I had to update my Microsoft.Azure.Management.Sql library.
Now I can use this export method:
public static ImportExportResponse Export(this IDatabasesOperations operations,
    string resourceGroupName, string serverName, string databaseName,
    ExportRequest parameters);
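For reference, a rough sketch of calling it (the resource names, storage key, and admin credentials are placeholders; sqlManagementClient is assumed to be an authenticated SqlManagementClient):
var exportResult = sqlManagementClient.Databases.Export(
    "{resourceGroupName}", "{serverName}", "{databaseName}",
    new ExportRequest
    {
        // Destination blob and the key used to write the BACPAC there.
        StorageKeyType = StorageKeyType.StorageAccessKey,
        StorageKey = "{storage-account-accesskey}",
        StorageUri = "https://{storage-accountname}.blob.core.windows.net/{container}/{dbname}.bacpac",
        // SQL authentication against the source server.
        AdministratorLogin = "{username}",
        AdministratorLoginPassword = "{password}",
        AuthenticationType = AuthenticationType.SQL
    });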

Working with Azure local Development Storage via SSMS (sql server management studio)

I am trying to write code to access my Azure local development storage. I started off by creating a new storage for myself:
dsInit /forceCreate
I can now see the DevelopmentStorageDb20090919 database in SSMS with some pre-created tables such as dbo.TableContainer, dbo.TableRow, etc.
1. Now, can I simply add tables to this database via SSMS (e.g. an Employee table) and start accessing them via code? Is this the right way to do things?
For example:
var svc = CloudStorageAccount.DevelopmentStorageAccount
    .CreateCloudTableClient().GetDataServiceContext();

// "Employees" is the name of the table
svc.AddObject("Employees", new Employees("John Doe"));
svc.SaveChangesWithRetries();
2. Additionally, once I am all done, how do I port the table and the data to the actual cloud account? By running scripts there?
I think you're confusing Azure Table Storage with SQL Server or SQL Azure, which are completely different. You cannot access Azure Storage tables at all with SSMS. The code sample you provided is using the Azure SDK (which is using the Storage REST API underneath). That's the only way to access Azure Storage.
If you want to create / view tables in a more graphical way, try Cerebrata's Cloud Storage Studio, ClumsyLeaf's AzureXplorer, David Pallman's Azure Storage Explorer, or some other similar tool. These tools all rely on the SDK or direct API calls.
Now, regarding your code: You need to create your table before inserting objects. See CreateTablesFromModel() and CreateTableIfNotExist(). The Azure Platform Training Kit has a great intro/lab for creating and using tables, and shows how to use CreateTablesFromModel().
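For instance, a small sketch in the spirit of your snippet, with the table created first (using the same StorageClient-era API as the question):
var account = CloudStorageAccount.DevelopmentStorageAccount;
var tableClient = account.CreateCloudTableClient();

// No-op if the table already exists.
tableClient.CreateTableIfNotExist("Employees");

var svc = tableClient.GetDataServiceContext();
svc.AddObject("Employees", new Employees("John Doe"));
svc.SaveChangesWithRetries();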
As long as that table exists, then yes, the code you have written will add "John Doe" to the Employees table. While it can be interesting to look at data through SSMS, and you could try altering data that way, I wouldn't recommend it. Development storage is funny enough without poking it with a stick. There are differences between dev storage and actual cloud storage, so I have found that the sooner you can stop using dev storage, the better.
At the moment there is no fancy way of transferring data between Azure tables (be they in dev storage or in the cloud). It boils down to running a query that selects everything from the source table and then writing each individual item to the destination table. Depending on how your data is partitioned, you might be able to batch the writes or do them in parallel. If you're willing to use the REST API directly, you could avoid the storage library having to deserialise each item before it's written.
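A rough sketch of that copy loop, using the same DataServiceContext-style API as the question (the two storage accounts are placeholders, and Employees is assumed to be the entity type):
var source = sourceAccount.CreateCloudTableClient().GetDataServiceContext();
var destination = destinationAccount.CreateCloudTableClient().GetDataServiceContext();

// Select everything from the source table...
foreach (var employee in source.CreateQuery<Employees>("Employees"))
{
    // ...and write each item to the destination table (no batching here).
    destination.AddObject("Employees", employee);
}
destination.SaveChangesWithRetries();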
Even though it's best to use the APIs to talk to the DevStorage, there might be scenarios where the direct database access could prove beneficial. More specifically, it can be used to circumvent some SDK issues that are specific to DevStorage.
I once ran into a problem with renaming of large blobs - the operation would simply time out (note that blobs cannot be renamed - they first need to be copied using CopyFromBlob() and then deleted). I tried both in Azure Storage Explorer and by writing code and increasing all the timeouts. Solution? SQL to the rescue!
Here is an example of SQL that would rename a blob within a container or move it to a different one:
begin tran

alter table CommittedBlock nocheck constraint BlockBlob_CommittedBlock

update CommittedBlock set BlobName = @TargetBlobName, ContainerName = @TargetContainerName
    where BlobName = @SourceBlobName and ContainerName = @SourceContainerName
update BlockData set BlobName = @TargetBlobName, ContainerName = @TargetContainerName
    where BlobName = @SourceBlobName and ContainerName = @SourceContainerName
update Blob set BlobName = @TargetBlobName, ContainerName = @TargetContainerName
    where BlobName = @SourceBlobName and ContainerName = @SourceContainerName

alter table CommittedBlock with check check constraint BlockBlob_CommittedBlock

-- rollback keeps this a dry run; change to "commit tran" once you are happy with the result
rollback tran
Of course, use it at your own risk; this is a completely unsupported way of working with dev storage.
