Adding to an Azure CosmosDB from inside an Azure Function - c#

I've created a Cosmos DB database with a single table called MyTable. My idea is that I want to insert into this table from an Azure Function. Having had a look around, I've come up with this function:
public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log, out object tableOutput)
{
    log.Info("C# HTTP trigger function processed a request.");

    // parse query parameters
    string field1 = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "field1", true) == 0)
        .Value;
    string field2 = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "field2", true) == 0)
        .Value;

    var item = new
    {
        field1 = field1,
        field2 = field2
    };

    tableOutput = item;

    return req.CreateResponse(HttpStatusCode.OK);
}
The error that I get is this:
2017-12-07T15:52:44.066 Exception while executing function: Functions.MyFunc. Microsoft.Azure.WebJobs.Host: Error while handling parameter tableOutput after function returned:. Microsoft.Azure.WebJobs.Extensions.DocumentDB: The collection 'tableOutput' (in database 'myCosmosDB') does not exist. To automatically create the collection, set 'CreateIfNotExists' to 'true'. Microsoft.Azure.Documents.Client: Message: {"Errors":["Owner resource does not exist"]}
I have set up the output parameter, and I have seen the checkbox mentioned here (CreateIfNotExists); however, I have an existing Cosmos table set up and would like to write to that. So my question is: how can I access that table from within an Azure Function?
The function.json is below:
{
    "bindings": [
        {
            "authLevel": "function",
            "name": "req",
            "type": "httpTrigger",
            "direction": "in"
        },
        {
            "name": "$return",
            "type": "http",
            "direction": "out"
        },
        {
            "type": "documentDB",
            "name": "outputDocument",
            "databaseName": "mycosmosdb",
            "collectionName": "tableOutput",
            "createIfNotExists": false,
            "connection": "mycostmosdb_DOCUMENTDB",
            "direction": "out"
        }
    ],
    "disabled": false
}
EDIT:
Two people have implied that, for the purposes of this question, the terms Table and Collection are synonymous. If I have understood correctly, that appears not to be the case, as even when I change the collection name in the function.json to match the name of the table that I created, I get the same error.
To clarify the Cosmos DB table configuration: inside Data Explorer I am seeing a node entitled TablesDB, which has a sub-node for the table that I've created.

You need to update the function.json with the correct "collectionName" value for the Cosmos DB collection you're trying to save the new document to. Currently, you do NOT have a collection in your Cosmos DB database ("mycosmosdb") with the name "tableOutput".
Also, if you'd like the Azure Function output binding to create the collection automatically when it doesn't exist, set the function.json property "createIfNotExists" to "true".
Either approach will get your code working without error, but you'll need to make sure "collectionName" is set to the name of the Cosmos DB collection you have actually set up.
For example, try changing your Function.json to the following:
{
    "bindings": [
        {
            "authLevel": "function",
            "name": "req",
            "type": "httpTrigger",
            "direction": "in"
        },
        {
            "name": "$return",
            "type": "http",
            "direction": "out"
        },
        {
            "type": "documentDB",
            "name": "outputDocument",
            "databaseName": "mycosmosdb",
            "collectionName": "MyTable",
            "createIfNotExists": true,
            "connection": "mycostmosdb_DOCUMENTDB",
            "direction": "out"
        }
    ],
    "disabled": false
}
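Note also that in C# script functions (run.csx) the parameter name has to match the binding's "name" property. A rough sketch of how that might look, assuming the binding keeps the name outputDocument from the function.json above and the collection is MyTable:
public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log, out object outputDocument)
{
    log.Info("C# HTTP trigger function processed a request.");

    // Whatever is assigned here is written as a document to the collection
    // configured in function.json ("MyTable" in database "mycosmosdb").
    outputDocument = new { field1 = "value1", field2 = "value2" };

    return req.CreateResponse(HttpStatusCode.OK);
}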

Make sure this is the value of your existing collection (you mention MyTable, but the binding says tableOutput):
"collectionName": "tableOutput",
Make sure the names are identical. Collection names are case sensitive: "mytable" and "myTable" are different collections.

Related

Convert name strings with diacritics into e-mail compatible format (SQL, C# or PowerAutomate)

I have a bit of experience with SQL and Power Platform/Logic Apps and am at the very beginning of learning C#.
I have a large SQL table with names (first names and last names) that are supposed to be used to create email addresses in Azure.
I want the display names of the emails to stay original, but I have to remove the German special characters äöüß and replace them with ae, oe, ue and ss, and some characters with diacritics (é, à, etc.) need to be replaced with e, a, etc.
Now I'm wondering whether I should create a view in SQL with two new columns that convert the characters, create a Logic App that does all the replacements (and wrap that in a custom API), or whether this would be a good project for my first Azure Function in C# (to be wrapped in a custom API).
I'm curious where in the system to handle the issue, and whether there are any standard procedures available for this. I cannot be the first one to have exactly this problem.
A Google search gave me 20 different ways to handle it, all of which seemed a bit awkward or didn't work for me.
Thank you guys.
One workaround is to do this directly with SQL commands. I tried creating another column called "FullName" and using CONCAT() to concatenate FirstName and LastName:
ALTER TABLE <TABLENAME>
ADD [FullName] varchar(50);
Update <TABLENAME>
SET FullName = CONCAT(FirstName,' ',LastName);
Then I used the REPLACE() function to replace all the required German special characters.
UPDATE <TableName>
SET FullName = Replace(Replace(Replace(Replace(Replace(Replace(FullName,'ä','ae'),'ö','oe'),'ü','ue'),'ß','ss'),'é','e'),'à','a')
WHERE FullName <> '';
Now you can use FullName as the display name while keeping the original values, FirstName and LastName, in the table.
Alternatively, you can do the same from a Logic App using the Execute a SQL query (V2) action. Below is the code view of my Logic App:
{
    "definition": {
        "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
        "actions": {
            "Execute_a_SQL_query_(V2)": {
                "inputs": {
                    "body": {
                        "query": "select * from Table1"
                    },
                    "host": {
                        "connection": {
                            "name": "@parameters('$connections')['sql']['connectionId']"
                        }
                    },
                    "method": "post",
                    "path": "/v2/datasets/@{encodeURIComponent(encodeURIComponent('default'))},@{encodeURIComponent(encodeURIComponent('default'))}/query/sql"
                },
                "runAfter": {
                    "Execute_a_SQL_query_(V2)_3": [
                        "Succeeded"
                    ]
                },
                "type": "ApiConnection"
            },
            "Execute_a_SQL_query_(V2)_2": {
                "inputs": {
                    "body": {
                        "query": "Update Table1\nSET FullName = CONCAT(FirstName,' ',LastName);"
                    },
                    "host": {
                        "connection": {
                            "name": "@parameters('$connections')['sql']['connectionId']"
                        }
                    },
                    "method": "post",
                    "path": "/v2/datasets/@{encodeURIComponent(encodeURIComponent('default'))},@{encodeURIComponent(encodeURIComponent('default'))}/query/sql"
                },
                "runAfter": {},
                "type": "ApiConnection"
            },
            "Execute_a_SQL_query_(V2)_3": {
                "inputs": {
                    "body": {
                        "query": "UPDATE Table1\nSET FullName = Replace(Replace(Replace(Replace(Replace(Replace(FullName,'ä','ae'),'ö','oe'),'ü','ue'),'ß','ss'),'é','e'),'à','a')\nWHERE FullName <> '';"
                    },
                    "host": {
                        "connection": {
                            "name": "@parameters('$connections')['sql']['connectionId']"
                        }
                    },
                    "method": "post",
                    "path": "/v2/datasets/@{encodeURIComponent(encodeURIComponent('default'))},@{encodeURIComponent(encodeURIComponent('default'))}/query/sql"
                },
                "runAfter": {
                    "Execute_a_SQL_query_(V2)_2": [
                        "Succeeded"
                    ]
                },
                "type": "ApiConnection"
            }
        },
        "contentVersion": "1.0.0.0",
        "outputs": {},
        "parameters": {
            "$connections": {
                "defaultValue": {},
                "type": "Object"
            }
        },
        "triggers": {
            "manual": {
                "inputs": {
                    "schema": {}
                },
                "kind": "Http",
                "type": "Request"
            }
        }
    },
    "parameters": {
        "$connections": {
            "value": {
                "sql": {
                    "connectionId": "/subscriptions/<SubId>/resourceGroups/<RG>/providers/Microsoft.Web/connections/sql-2",
                    "connectionName": "sql-2",
                    "id": "/subscriptions/<SubId>/providers/Microsoft.Web/locations/centralus/managedApis/sql"
                }
            }
        }
    }
}
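If you end up doing it as your first C# Azure Function instead, the transliteration itself is only a few lines of .NET. A rough sketch (the EmailNameHelper class and ToEmailCompatible method are just placeholder names, not an existing API):
using System.Globalization;
using System.Linq;
using System.Text;

public static class EmailNameHelper
{
    // German characters that must become two-letter sequences before
    // the generic diacritic stripping below.
    private static readonly (string From, string To)[] GermanMap =
    {
        ("ä", "ae"), ("ö", "oe"), ("ü", "ue"),
        ("Ä", "Ae"), ("Ö", "Oe"), ("Ü", "Ue"),
        ("ß", "ss")
    };

    public static string ToEmailCompatible(string name)
    {
        foreach (var (source, target) in GermanMap)
            name = name.Replace(source, target);

        // Decompose accented characters (é -> e + combining accent)
        // and drop the combining marks.
        var decomposed = name.Normalize(NormalizationForm.FormD);
        var kept = decomposed.Where(c =>
            CharUnicodeInfo.GetUnicodeCategory(c) != UnicodeCategory.NonSpacingMark);

        return new string(kept.ToArray()).Normalize(NormalizationForm.FormC);
    }
}
For example, EmailNameHelper.ToEmailCompatible("Jörg Müßig") returns "Joerg Muessig". The same method could be called from an HTTP-triggered function or wrapped in a custom API, so the SQL data itself stays untouched.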

Avro schema GenericRecord missing key

I am using an Avro schema to dynamically produce messages to a Kafka cluster from a C# application, using the Confluent Kafka client. The data types are not known at compile time, so I am using the GenericRecord class from the Avro.Generic namespace, as described here: https://www.confluent.io/blog/decoupling-systems-with-apache-kafka-schema-registry-and-avro/.
However, I have an issue: if the schema has a field that can contain a null value, I am still required to add the field to the GenericRecord using the Add method, passing null as the value. My application is not aware of which fields can be null, and I don't think it should be, because that would defeat the purpose of nullable fields in the schema.
Avro schema:
{
    "namespace": "Test",
    "type": "record",
    "doc": "Test bool type",
    "name": "BoolType",
    "version": "1",
    "fields": [
        {
            "name": "Data",
            "type": [ "null", "boolean" ],
            "default": null
        },
        {
            "name": "Source",
            "type": "string"
        }
    ]
}
C# code:
var valueRecord = new GenericRecord( valueAvroSchema );
valueRecord.Add( "Data", null );
valueRecord.Add( "Source", "Test app .NET" );

var messageToSend = new Message<GenericRecord, GenericRecord>
{
    Key = keyRecord,
    Value = valueRecord
};

await _producer.ProduceAsync( _topicName, messageToSend );
If the line:
valueRecord.Add( "Data", null );
is not present, the ProduceAsync method throws a Confluent.Kafka.ProduceException.
Is there any way I can automatically populate the fields that can be null in the GenericRecord? The same would apply if I had to populate fields with their default values.
Is there a standard way of doing this, or do I need to write my own code that reads the schema and, for any nullable fields my application has not already set, adds them just before publishing?
Thank you!
Avro defaults are only relevant to consumers. The producer must always set each field.
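Since you need the schema at runtime to build the GenericRecord anyway, one option is a small helper that walks the record schema and adds an explicit null for any union-with-null field the application did not set. A rough sketch against the Apache Avro C# types (Avro.RecordSchema, Avro.UnionSchema, Avro.Generic.GenericRecord); treat it as a starting point rather than tested code:
using System.Linq;
using Avro;
using Avro.Generic;

public static class GenericRecordDefaults
{
    // Adds an explicit null for every nullable field that has not been set,
    // so the serializer no longer complains about missing keys.
    public static void FillUnsetNullableFields(GenericRecord record)
    {
        foreach (Field field in record.Schema.Fields)
        {
            // Skip fields the application already populated.
            if (record.TryGetValue(field.Name, out _))
                continue;

            // Only auto-fill fields whose type is a union that allows null.
            if (field.Schema is UnionSchema union &&
                union.Schemas.Any(s => s.Tag == Schema.Type.Null))
            {
                record.Add(field.Name, null);
            }
        }
    }
}
Calling GenericRecordDefaults.FillUnsetNullableFields(valueRecord) just before building the Message means the "Data" field no longer has to be added by hand. Populating non-null defaults would work the same way, except you would have to convert field.DefaultValue (a JSON token) into the corresponding .NET value yourself.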

How to represent an array of longs as a query parameter in Swagger 2.0?

I am having a hard time figuring out the correct Swagger 2.0 spec for the case where I expect a query parameter to be a list of longs (C#). This is what I tried, based on examples where the query parameter is a simple datatype like int or boolean, but it does not seem to work; it does not look like it is getting parsed correctly.
My URI is something like this:
https://.../testinstance/FeatureTexts?api-version=2016-09-13&featureIds=1629988727&featureIds=1924980024
And in my API-level test it does not get resolved to anything like that after the api-version=2016-09-13&featureIds= part:
"get": {
"tags": [
"FeatureText"
],
"operationId": "RenderFeatureTexts",
"description": "The operation to get feature texts for specified features",
"parameters": [
{
"name": "featureIds",
"in": "query",
"required": true,
"schema": {
"type": "array",
"collectionFormat": "multi",
"items": {
"type": "integer",
"format": "int64"
}
},
.......
C# code generated by Swagger Codegen:
public static async System.Threading.Tasks.Task<object> ListFeatureTextsAsync(this IAgentClient operations, object featureIds, System.Threading.CancellationToken cancellationToken = default(System.Threading.CancellationToken))
{
    using (var _result = await operations.ListFeatureTextsWithHttpMessagesAsync(featureIds, null, cancellationToken).ConfigureAwait(false))
    {
        return _result.Body;
    }
}
Change the parameter definition as shown below, that is, move type, items and collectionFormat out of schema. In OpenAPI 2.0, schema is only used for body parameters, and other parameter types use type etc. directly.
"parameters": [
{
"name": "featureIds",
"in": "query",
"required": true,
"type": "array",
"collectionFormat": "multi",
"items": {
"type": "integer",
"format": "int64"
}
You can easily catch syntax errors like this by pasting your spec into Swagger Editor.
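With the corrected definition, regenerating the client should also give featureIds a real type instead of object. Roughly what the regenerated wrapper tends to look like (the exact shape depends on your codegen version and settings, so take this as a sketch):
public static async System.Threading.Tasks.Task<object> ListFeatureTextsAsync(this IAgentClient operations, System.Collections.Generic.IList<long?> featureIds, System.Threading.CancellationToken cancellationToken = default(System.Threading.CancellationToken))
{
    using (var _result = await operations.ListFeatureTextsWithHttpMessagesAsync(featureIds, null, cancellationToken).ConfigureAwait(false))
    {
        return _result.Body;
    }
}
The client then serializes the list as repeated featureIds=... query parameters, which is exactly what collectionFormat: multi describes.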

Service Bus Output Bindings Throwing Error in both Azure and Local Testing

So I have the following function header:
[FunctionName("listenServiceBus")]
public static void Run([ServiceBusTrigger("metadataingest", AccessRights.Manage, Connection = "ServiceBus")]string mySbMsg,
ExecutionContext context, [ServiceBus("successqueue", Connection = "DEVservicebus", EntityType = Microsoft.Azure.WebJobs.ServiceBus.EntityType.Queue)]out string outputSuccess,
[ServiceBus("failqueue", Connection = "DEVservicebus", EntityType = Microsoft.Azure.WebJobs.ServiceBus.EntityType.Queue)]out string outputFailure, ILogger Log)
Here I bind to a Service Bus trigger for input and two separate Service Bus queues for the outputs of the function. My function.json file looks like the following:
{
    "bindings": [
        {
            "type": "serviceBusTrigger",
            "connection": "ServiceBus",
            "queueName": "metadataingest",
            "accessRights": "manage",
            "name": "mySbMsg"
        },
        {
            "name": "outputSuccess ",
            "type": "serviceBus",
            "queueName": "successqueue",
            "connection": "DEVservicebus",
            "direction": "out"
        },
        {
            "name": "outputFailure",
            "type": "serviceBus",
            "queueName": "failqueue",
            "connection": "DEVservicebus",
            "direction": "out"
        }
    ],...
I am getting the following errors:
No job functions found. Try making your job classes and methods public. If you're using binding extensions (e.g. ServiceBus, Timers, etc.) make sure you've called the registration method for the extension(s) in your startup code (e.g. config.UseServiceBus(), config.UseTimers(), etc.).
And:
listenServiceBus: The binding name outputSuccess is invalid. Please assign a valid name to the binding.
I am currently running .Net.Sdk.Function 1.0.12. Does anyone know of any workarounds?
See "name": "outputSuccess ", in your bindings, looks like there is an extra space after outputSuccess. You should delete it and try again.

Azure Functions SDK

After upgrading to 1.0.1 CLI tools without any code changes, I suddenly started to get the following error:
ResizeImage: Microsoft.Azure.WebJobs.Host: Error indexing method 'Functions.ResizeImage'. Microsoft.Azure.WebJobs.Extensions.DocumentDB: 'Id' is required when binding to a DocumentClient property.
The following code:
[FunctionName(nameof(ResizeImage))]
public static async Task RunAsync(
    [BlobTrigger("profile-pictures/{name}")] CloudBlockBlob myBlob,
    string name,
    [DocumentDB(databaseName: "x", collectionName: "UserProfile", CreateIfNotExists = true)] DocumentClient client,
    [Blob("profile-pictures/resized-{name}", FileAccess.ReadWrite)] CloudBlockBlob resizedBlob,
    TraceWriter log)
I thought Id was optional? At least that's what the docs say.
According to the docs:
The properties id and sqlQuery cannot both be specified. If neither id
nor sqlQuery is set, the entire collection is retrieved.
The generated json:
{
"bindings": [
{
"type": "blobTrigger",
"path": "profile-pictures/{name}",
"direction": "in",
"name": "myBlob"
},
{
"type": "documentDB",
"databaseName": "x",
"collectionName": "UserProfile",
"createIfNotExists": true,
"direction": "out",
"name": "client"
},
{
"type": "blob",
"path": "profile-pictures/resized-{name}",
"direction": "inout",
"name": "resizedBlob"
}
],
"disabled": false,
"scriptFile": "..\\X.Functions.dll",
"entryPoint": "X.Functions.ResizeImage.RunAsync"
}
I'm using the 1.0.0 SDK.
I thought Id was optional? At least that's what the docs say.
Yes, id is optional. But according to the documentation for Azure Functions Cosmos DB bindings, we need to use IEnumerable<dynamic> as the binding type to read documents from the collection. Please change your code as follows:
[DocumentDB(...)] IEnumerable<dynamic> documents
You will then get all the documents from the collection. I tested it and it worked fine on my side.
In addition, the direction should be changed to in if you want to get data from DocumentDB.
"direction": "in"
