I would like to connect to an existing Azure Storage Table and read (filtered) data from it.
I found some examples, but all of them used the deprecated WindowsAzure.Storage namespace. I cannot find any solution for Azure.Storage.Common or anything else from the current Microsoft namespaces. There are many examples for Azure.Storage.Blobs, but I need a solution for the Table Storage service. Thanks much ...
The Azure.Data.Tables package is the current-generation update to the Tables client library and is currently in beta. Samples can be found in the Azure SDK repository, in the Tables samples area.
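For example, reading filtered rows with Azure.Data.Tables could look like the following minimal sketch (the connection string, table name, and filter value are placeholders, and the API may still change while the package is in beta):

using System;
using Azure.Data.Tables;

// Placeholders: replace with your connection string and table name.
var tableClient = new TableClient("<connection-string>", "MyTable");

// Query with an OData filter string; here we filter on PartitionKey.
var entities = tableClient.Query<TableEntity>(filter: "PartitionKey eq 'Department01'");

foreach (TableEntity entity in entities)
{
    Console.WriteLine(entity.RowKey);
}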
Another suggested solution is to use the Microsoft.Azure.Cosmos.Table package. As stated in the description:
When used with Azure Table Storage service, this library offers similar APIs and functionalities as WindowsAzure.Storage 8.7.0.
Its full documentation can be found here.
Here is a code sample that gets a connection to a table so you can perform actions on it:
private async Task<CloudTable> GetTable()
{
    // Parse the connection string and create a table client.
    var storageAccount = CloudStorageAccount.Parse(StorageConnectionString);
    var tableClient = storageAccount.CreateCloudTableClient(new TableClientConfiguration());

    // Get a reference to the table and create it if it does not exist yet.
    var table = tableClient.GetTableReference(TableName);
    await table.CreateIfNotExistsAsync();
    return table;
}
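With that table reference, reading filtered data could look like this sketch (the entity type, partition key value, and table contents are assumptions for illustration):

// Hypothetical entity type for illustration; adjust to your schema.
public class EmployeeEntity : TableEntity
{
    public string Name { get; set; }
}

private async Task<List<EmployeeEntity>> GetEmployeesAsync(CloudTable table, string partitionKey)
{
    // Build an OData filter on PartitionKey.
    var query = new TableQuery<EmployeeEntity>()
        .Where(TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, partitionKey));

    // Page through the results with continuation tokens.
    var results = new List<EmployeeEntity>();
    TableContinuationToken token = null;
    do
    {
        var segment = await table.ExecuteQuerySegmentedAsync(query, token);
        results.AddRange(segment.Results);
        token = segment.ContinuationToken;
    } while (token != null);

    return results;
}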
I currently have a fully functioning Virtual Assistant Template-based chatbot with a skill attached to it. My goal is for the skill to work as a search function that can find resources in a Cosmos DB and pull them back for the user to use. After doing some research, I believe the best way to do this is to use Azure Search to retrieve that info. From what I've seen in the Virtual Assistant Template documentation, integration with Azure Search should definitely be possible... I just haven't found any examples or tutorials on how to do so. If anyone knows how to create an Azure Search resource and integrate it into a bot, or knows of a resource that explains how to do so, please let me know!
For your scenario, an outline of what to do is:
Create an Azure Search service.
In it, create an indexer that points to your Cosmos DB data source. Here is documentation specific to crawling your data in Cosmos DB: https://learn.microsoft.com/en-us/azure/search/search-howto-index-cosmosdb
Once your indexer has run and crawled your data, it should be searchable from your app through the search index.
There isn't an end-to-end tutorial about integrating with a bot, but here is an Azure Search tutorial that shows a complete scenario of crawling a SQL database and then searching it using full-text search.
https://learn.microsoft.com/en-us/azure/search/search-indexer-tutorial
You should be able to follow most of the guidance there; just replace the parts about the SQL indexer with details from the Cosmos DB indexer in the link above.
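As a rough sketch of those steps with the Microsoft.Azure.Search SDK (the service name, keys, connection string, and index name below are all placeholders, and DataSource.CosmosDb assumes a recent SDK version):

using Microsoft.Azure.Search;
using Microsoft.Azure.Search.Models;

// Admin client for the search service (placeholder name and key).
var serviceClient = new SearchServiceClient("my-search-service", new SearchCredentials("<admin-api-key>"));

// Data source pointing at the Cosmos DB collection to crawl (placeholder values).
var dataSource = DataSource.CosmosDb(
    "cosmos-datasource",
    "AccountEndpoint=https://...;AccountKey=...;Database=mydb",
    "resources");
serviceClient.DataSources.CreateOrUpdate(dataSource);

// Indexer that crawls the data source into an existing index, then run it.
var indexer = new Indexer("cosmos-indexer", "cosmos-datasource", "resources-index");
serviceClient.Indexers.CreateOrUpdate(indexer);
serviceClient.Indexers.Run("cosmos-indexer");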
I want to do a similar search (only in Azure Blob storage instead of Cosmos DB). I am using SDK v4 for my bot framework and Visual Studio 2019. I'm trying to call the service through the code below:
public ISearchIndexClient CreateSearchIndexClient()
{
    string searchServiceName = "MySearchServiceName";
    string queryApiKey = "MySearchServiceKey";
    string indexName = "MyIndexName";

    SearchIndexClient indexClient = new SearchIndexClient(searchServiceName, indexName, new SearchCredentials(queryApiKey));
    return indexClient;
}

public async Task StartAsync(ITurnContext turnContext, string searchText)
{
    ISearchIndexClient infoClient = CreateSearchIndexClient();
    string indexname = infoClient.IndexName;
    DocumentSearchResult<Document> results = infoClient.Documents.Search(searchText);
    await turnContext.SendActivityAsync(MessageFactory.Text($"Here should be the results: {results} \n...and then my index: {indexname}."));
}
It runs without errors, so one would think it works. But it never shows the message from StartAsync. If anyone sees what I am missing, thank you in advance.
I am trying to programmatically add a new MySQL database instance to Azure.
I looked at this library: https://github.com/Azure/azure-libraries-for-net but I only see an example of how to create a SQL Server, not a MySQL server.
// Authenticate with the fluent management SDK.
var credentials = SdkContext.AzureCredentialsFactory.FromFile(Environment.GetEnvironmentVariable("AZURE_AUTH_LOCATION"));
var azure = Azure
    .Configure()
    .WithLogLevel(HttpLoggingDelegatingHandler.Level.Basic)
    .Authenticate(credentials)
    .WithDefaultSubscription();

// Define and create a SQL Server instance with firewall rules...
var sqlServer = azure.SqlServers.Define(sqlServerName)
    .WithRegion(Region.USEast)
    .WithNewResourceGroup(rgName)
    .WithAdministratorLogin(AdministratorLogin)
    .WithAdministratorPassword(AdministratorPassword)
    .WithNewFirewallRule(FirewallRuleIPAddress)
    .WithNewFirewallRule(FirewallRuleStartIPAddress, FirewallRuleEndIPAddress)
    .Create();

// ...and a database on that server.
var database = sqlServer.Databases
    .Define(DatabaseName)
    .Create();
Any idea if it's supported to programmatically create a MySQL server as well?
Looking at the release notes and SDK code, it seems the ability to manage MySQL databases is still not supported (as of version 1.3).
What you could do is consume the REST API for managing Azure MySQL databases directly. For creating a database, see the Create Or Update Database operation.
To get started with Azure REST API, you may find this link useful: https://learn.microsoft.com/en-us/rest/api/.
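A rough sketch of calling that operation from C# (all identifiers are placeholders, token acquisition via Azure AD is omitted, and you should check the current api-version in the REST reference):

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static async Task CreateMySqlDatabaseAsync(string accessToken)
{
    // Placeholder resource identifiers; replace with your own values.
    var url = "https://management.azure.com/subscriptions/{subscriptionId}" +
              "/resourceGroups/{resourceGroupName}/providers/Microsoft.DBforMySQL" +
              "/servers/{serverName}/databases/{databaseName}?api-version=2017-12-01";

    using (var client = new HttpClient())
    {
        // The bearer token must be acquired separately (e.g. via Azure AD).
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

        // Minimal request body; charset/collation properties are optional.
        var body = new StringContent("{\"properties\":{\"charset\":\"utf8\"}}", Encoding.UTF8, "application/json");
        var response = await client.PutAsync(url, body);
        Console.WriteLine(response.StatusCode);
    }
}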
My C# program works with an Azure database.
I'm using the Microsoft.Rest and Microsoft.Azure.Management libraries to do some stuff (DB copy, manipulation, deletion, etc.).
I'm trying to do an export of an Azure DB, but I can't find how to do that in C#.
Does anyone know how I can do that, or can you direct me to an example?
Based on my understanding, you are talking about an Azure SQL database. I assume you want to export your Azure SQL database to a BACPAC file.
According to your description, I checked the Microsoft Azure Management Libraries and found the following code snippet for exporting your Azure SQL database into Azure Blob storage:
// Authenticate with a management certificate (classic deployment model).
CertificateCloudCredentials credential = new CertificateCloudCredentials("{subscriptionId}", "{managementCertificate}");
var sqlManagement = new SqlManagementClient(credential);

// Export the database as a BACPAC into the given blob container.
var result = sqlManagement.Dac.Export("{serverName}", new DacExportParameters()
{
    BlobCredentials = new DacExportParameters.BlobCredentialsParameter()
    {
        StorageAccessKey = "{storage-account-accesskey}",
        Uri = new Uri("https://{storage-accountname}.blob.core.windows.net/{container-name}")
    },
    ConnectionInfo = new DacExportParameters.ConnectionInfoParameter()
    {
        DatabaseName = "{dbname}",
        ServerName = "{serverName}.database.windows.net",
        UserName = "{username}",
        Password = "{password}"
    }
});
And you could use sqlManagement.Dac.GetStatus for retrieving the status of the export operation.
Additionally, the Microsoft Azure Management Libraries use the Export Database (classic) operation; for the newer Resource Manager-based REST API, you could refer to here. Moreover, you could create a storage account and leverage Microsoft Azure Storage Explorer as a simple way to manage your storage resources; for more details, you could refer to here.
I've found the solution to my problem:
I had to update my Microsoft.Azure.Management.Sql library.
Now, I can use this export method:
public static ImportExportResponse Export(this IDatabasesOperations operations,
    string resourceGroupName, string serverName, string databaseName,
    ExportRequest parameters);
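Calling it could look roughly like this (a hedged sketch; the ExportRequest member names are from memory and may differ slightly between library versions):

// All values in braces are placeholders.
var exportRequest = new ExportRequest
{
    StorageKeyType = StorageKeyType.StorageAccessKey,
    StorageKey = "{storage-account-accesskey}",
    StorageUri = "https://{storage-accountname}.blob.core.windows.net/{container}/{dbname}.bacpac",
    AdministratorLogin = "{username}",
    AdministratorLoginPassword = "{password}"
};

// sqlManagementClient is an authenticated SqlManagementClient from Microsoft.Azure.Management.Sql.
var response = sqlManagementClient.Databases.Export("{resourceGroupName}", "{serverName}", "{dbname}", exportRequest);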
I'm trying to list all my RDS instances on AWS, using the .NET SDK for AWS.
I was expecting the SDK to offer something similar to EC2's describe-instances; sure enough, that is part of the CLI, but it is not so straightforward to find in the SDK.
Does anyone know how to do this?
Solution
The AWS .NET SDK (v3) contains a similar construct for RDS as for EC2. I missed that somehow. See my answer with source-code below.
Thanks in advance
I think you are looking for DescribeDBInstances. The DescribeDBInstancesResult has a list of DBInstances. That's where you'll find the information on each RDS instance.
Edit: The function and object names are the same but here's the link for V3.
So it turns out that the procedure to get all RDS instances closely mimics the EC2 way of doing it.
You will need to install the AWSSDK.RDS NuGet package.
In the Package Manager Console in Visual Studio:
Install-Package AWSSDK.RDS
Once you have done that, you will need to add the necessary using directives:
using Amazon.RDS;
using Amazon.RDS.Model;
And then you can do something like this:
public static void ListAllRDSInstances(RegionEndpoint region)
{
    // Create an RDS client scoped to the given region.
    var client = new AmazonRDSClient(region);

    var request = new DescribeDBInstancesRequest();
    var response = client.DescribeDBInstances(request);

    response.DBInstances.ForEach(instance =>
    {
        // Do stuff for each instance in the region.
    });
}
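Usage is then as simple as passing a region, for example:

// List instances in a single region.
ListAllRDSInstances(RegionEndpoint.USEast1);

// Or walk every region known to the SDK version you reference.
foreach (var region in RegionEndpoint.EnumerableAllRegions)
{
    ListAllRDSInstances(region);
}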
I am trying to write code to access my azure local development storage. I started off by creating a new storage for myself:
dsInit /forceCreate
I can now see the DevelopmentStorageDb20090919 in SSMS with some precreated tables such as dbo.TableContainer, dbo.TableRow etc.
Now, can I simply add tables to this database via SSMS (e.g. an Employee table) and start accessing them via code? Is this the right way to do it?
For example:
var svc = CloudStorageAccount.DevelopmentStorageAccount
.CreateCloudTableClient().GetDataServiceContext();
//"Employees" is the name of the table
svc.AddObject("Employees", new Employees("John Doe"));
svc.SaveChangesWithRetries();
And additionally, once I am all done, how do I port the table and the data into the actual cloud account? By running scripts there?
I think you're confusing Azure Table Storage with SQL Server or SQL Azure, which are completely different. You cannot access Azure Storage tables at all with SSMS. The code sample you provided is using the Azure SDK (which is using the Storage REST API underneath). That's the only way to access Azure Storage.
If you want to create / view tables in a more graphical way, try Cerebrata's Cloud Storage Studio, ClumsyLeaf's AzureXplorer, David Pallman's Azure Storage Explorer, or some other similar tool. These tools all rely on the SDK or direct API calls.
Now, regarding your code: You need to create your table before inserting objects. See CreateTablesFromModel() and CreateTableIfNotExist(). The Azure Platform Training Kit has a great intro/lab for creating and using tables, and shows how to use CreateTablesFromModel().
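For example, with the storage client of that era, creating the table up front could look like this minimal sketch ("Employees" matches the table name in your snippet):

var account = CloudStorageAccount.DevelopmentStorageAccount;
var tableClient = account.CreateCloudTableClient();

// Create the table through the API, not through SSMS.
tableClient.CreateTableIfNotExist("Employees");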
As long as that table exists, then yes, the code you have written will add "John Doe" to the employees table. While it can be interesting to look at data through SSMS and you could try altering data that way, I wouldn't recommend trying it. Development storage is funny enough without poking it with a stick. There are differences between dev storage and actual cloud storage, so I have found that the sooner you can stop using dev storage the better.
At the moment there is no fancy way of transferring data between Azure tables (be they in dev storage or in the cloud). It boils down to running a query that selects everything from the source table, then writes each individual item to the destination table. Depending on how your data is partitioned you might be able to batch the writes or you might be able to do them in parallel. If you're willing to use the REST API directly you could avoid the storage library having to deserialise each item before it's written.
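A rough sketch of that copy loop with the same library (the Employees entity class is the one from your snippet, and "EmployeesCopy" is a hypothetical destination table):

var client = CloudStorageAccount.DevelopmentStorageAccount.CreateCloudTableClient();

// Separate contexts for reading and writing, so the destination
// context is not already tracking the queried entities.
var readCtx = client.GetDataServiceContext();
var writeCtx = client.GetDataServiceContext();

// AsTableServiceQuery() follows continuation tokens, so large tables are read in full.
var source = readCtx.CreateQuery<Employees>("Employees").AsTableServiceQuery();

foreach (var item in source)
{
    writeCtx.AddObject("EmployeesCopy", item);
}

// Batching only works when all entities in a batch share a partition key.
writeCtx.SaveChangesWithRetries();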
Even though it's best to use the APIs to talk to the DevStorage, there might be scenarios where the direct database access could prove beneficial. More specifically, it can be used to circumvent some SDK issues that are specific to DevStorage.
I once ran into a problem with renaming of large blobs - the operation would simply time out (note that blobs cannot be renamed - they first need to be copied using CopyFromBlob() and then deleted). I tried both in Azure Storage Explorer and by writing code and increasing all the timeouts. Solution? SQL to the rescue!
Here is an example of SQL that would rename a blob within a container or move it to a different one:
begin tran

-- Temporarily disable the FK check so the rows can be renamed.
alter table CommittedBlock nocheck constraint BlockBlob_CommittedBlock

-- The @Source/@Target variables must be declared and set beforehand.
update CommittedBlock set BlobName = @TargetBlobName, ContainerName = @TargetContainerName where BlobName = @SourceBlobName and ContainerName = @SourceContainerName
update BlockData set BlobName = @TargetBlobName, ContainerName = @TargetContainerName where BlobName = @SourceBlobName and ContainerName = @SourceContainerName
update Blob set BlobName = @TargetBlobName, ContainerName = @TargetContainerName where BlobName = @SourceBlobName and ContainerName = @SourceContainerName

-- Re-enable and re-validate the constraint.
alter table CommittedBlock with check check constraint BlockBlob_CommittedBlock

-- Left as rollback for a dry run; change to "commit tran" to apply the rename.
rollback tran
Of course, use it at your own risk - this is a completely unsupported way of working with dev storage.