In my project, I have to do some machine learning in C#. Unfortunately, I find ML.NET much less intuitive than the equivalents in other languages, and I am failing to execute a RegressionExperiment.
First, here are my data classes:
public class DataPoint
{
    [ColumnName("Label")]
    public float y { get; set; }

    [ColumnName("catFeature")]
    public string str { get; set; }

    [ColumnName("smth")]
    public float smth { get; set; }
}

public class MLOutput
{
    [ColumnName("Score")]
    public float score { get; set; }
}
I think my problem lies in the encoding of the categorical variable. For a single model, the code below works fine.
//Create an ML Context
var ctx = new MLContext();
IDataView trainingData = ctx.Data.LoadFromEnumerable(data: data as IEnumerable<DataPoint>);
// Build your data processing and training pipeline
var pipeline = ctx.Transforms.Categorical.OneHotEncoding(outputColumnName: "catFeatureEnc", inputColumnName: "catFeature")
.Append(ctx.Transforms.Concatenate("Features", new[] {"catFeatureEnc","smth"}))
.Append(ctx.Regression.Trainers.FastForest());
// Train your model ????
var trainedModel = pipeline.Fit(trainingData); // shouldn't we transform before fit?????
IDataView transformedData = trainedModel.Transform(trainingData);
Now, when I remove the FastForest model from the pipeline and add the AutoML code, Microsoft.ML cannot handle the encoding:
// Build your data processing and training pipeline
var pipeline = ctx.Transforms.Categorical.OneHotEncoding(outputColumnName: "catFeatureEnc", inputColumnName: "catFeature")
.Append(ctx.Transforms.Concatenate("Features", new[] {"catFeatureEnc","smth"}))
.Append(ctx.Transforms.Conversion.ConvertType("Features", "Features", DataKind.Single));
// do smth ???
var trainedModel = pipeline.Fit(trainingData); // nothing there to be fitted???
IDataView transformedData = trainedModel.Transform(trainingData); // shouldn't we transform before fit?????
var experimentSettings = new RegressionExperimentSettings();
experimentSettings.MaxExperimentTimeInSeconds = 60;
// Cancel experiment after the user presses any key
var cts = new CancellationTokenSource();
experimentSettings.CancellationToken = cts.Token;
RegressionExperiment experiment = ctx.Auto().CreateRegressionExperiment(experimentSettings);
ExperimentResult<RegressionMetrics> experimentResult = experiment.Execute(transformedData, "Label");
Now, I get the following exception:
Only supported feature column types are Boolean, Single, and String. Please change the feature column catFeatureEnc of type Key<UInt32, 0-2> to one of the supported types.
If I remove catFeatureEnc from the Concatenate call, the code works fine. Alternatively, I tried to create a new pipeline for the training on the already-transformed data. Unfortunately, that approach doesn't work either, because the new pipeline then expects arbitrary data types for many features.
Another alternative approach:
ExperimentResult<RegressionMetrics> experimentResult = experiment.Execute(trainingData, "Label");
throws the exception:
Training failed with the exception: System.InvalidOperationException: Concatenated columns should have the same type. Column 'smth' has type of Single, but the expected column type is Byte
I don't know why a Byte is expected here.
How can I use the encoded feature with Microsoft Auto.ML?
It looks like you're using the old version of the API. I would recommend trying the latest version.
To use it, you'll have to add the ML.NET daily feed.
https://pkgs.dev.azure.com/dnceng/public/_packaging/MachineLearning/nuget/v3/index.json
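For example, with the .NET CLI (the source name here is arbitrary):
dotnet nuget add source https://pkgs.dev.azure.com/dnceng/public/_packaging/MachineLearning/nuget/v3/index.json --name MachineLearningDailies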
Here are a few samples using the new API:
AutoML with column inference and Auto featurizer
You can also take a look at this other sample, which includes OneHotEncoding: AutoML with data processing pipeline
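For orientation, here is a minimal sketch of the newer experiment API, based on those preview samples (the metric, split fraction, and time budget are illustrative). The point is that the Featurizer infers the transforms, including one-hot encoding of string columns, so you don't wire up the encoding yourself and the Key-typed column problem goes away:

using Microsoft.ML;
using Microsoft.ML.AutoML;
...
var ctx = new MLContext();
IDataView trainingData = ctx.Data.LoadFromEnumerable(data);
var split = ctx.Data.TrainTestSplit(trainingData, testFraction: 0.2);

// tell the featurizer which column is the label so everything else becomes a feature
var columnInfo = new ColumnInformation { LabelColumnName = "Label" };
var pipeline = ctx.Auto().Featurizer(trainingData, columnInformation: columnInfo)
    .Append(ctx.Auto().Regression(labelColumnName: "Label"));

var experiment = ctx.Auto().CreateExperiment()
    .SetPipeline(pipeline)
    .SetRegressionMetric(RegressionMetric.RSquared, labelColumn: "Label")
    .SetTrainingTimeInSeconds(60)
    .SetDataset(split);

var result = await experiment.RunAsync();
Console.WriteLine($"Best R^2: {result.Metric}");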
I am using ksqlDB with a table of the following form:
KSQL-DB Query
create table currency (id integer,name varchar) with (kafka_topic='currency',partitions=1,value_format='avro');
C# model
public class Currency
{
    public int Id { get; set; }
    public string Name { get; set; }
}
Now I want to know how I should write/read data from this topic in C# using the Confluent library:
Writing
IProducer<int, Currency> producer = ...
Currency cur = new Currency();
Message<int, Currency> message = new Message<int, Currency>
{
    Key = cur.Id,
    Timestamp = new Timestamp(DateTime.UtcNow, TimestampType.CreateTime),
    Value = cur
};
DeliveryResult<int, Currency> delivery = await producer.ProduceAsync(topic, message);
Reading
IConsumer<int, Currency> consumer = new ConsumerBuilder<int, Currency>(config)
    .SetKeyDeserializer(Deserializers.Int32) // I assume I need to use the id from my DTO
    .SetValueDeserializer(...) // what deserializer?
    .Build();
ConsumeResult<int, Currency> result = consumer.Consume();
Currency message = // what deserializer? JsonSerializer.Deserialize<Currency>(result.Message.Value);
I am not sure how to go about this, so I tried looking for a serializer. I found this library, AvroSerializer, but I do not get where the author fetches the schema.
Any help on how to read/write to a specific topic so that it matches my ksqlDB models?
Update
After some research and some answers here, I have started using the schema registry:
var config = new ConsumerConfig
{
    GroupId = kafkaConfig.ConsumerGroup,
    BootstrapServers = kafkaConfig.ServerUrl,
    AutoOffsetReset = AutoOffsetReset.Earliest
};
var schemaRegistryConfig = new SchemaRegistryConfig
{
    Url = kafkaConfig.SchemaRegistryUrl
};
var schemaRegistry = new CachedSchemaRegistryClient(schemaRegistryConfig);
IConsumer<int, Currency> consumer = new ConsumerBuilder<int, Currency>(config)
    .SetKeyDeserializer(new AvroDeserializer<int>(schemaRegistry).AsSyncOverAsync())
    .SetValueDeserializer(new AvroDeserializer<Currency>(schemaRegistry).AsSyncOverAsync())
    .Build();
ConsumeResult<int, Currency> result = consumer.Consume();
Now I am getting another error:
Expecting data framing of length 5 bytes or more but total data size is 4 bytes
As someone kindly pointed out, it seems I am retrieving only the id from the schema registry. (The Confluent wire format frames every Avro payload with 1 magic byte plus a 4-byte schema id, so a 4-byte key suggests it was written as a plain integer, without schema-registry framing.)
How can I just do insert into currency (id,name) values (1,3) and retrieve it in C# as a POCO (listed above)?
Update 2
After I found this source program, it seems I am not able to publish messages to tables for some reason.
There is no error when sending the message, but it is not published to Kafka.
I found this library AvroSerializer, but I do not get where the author fetches the schema.
Unclear why you need to use a library other than the Confluent one, but they get it from the Schema Registry. You can use CachedSchemaRegistryClient to get the schema string easily; however, you shouldn't need that in your code, as the deserializer will download the schema from the registry on its own.
If you refer to the examples/ directory in the confluent-kafka-dotnet repo for specific Avro consumption, you can see they generate the User class from a User.avsc file, which seems to be exactly what you want to do here for Currency, rather than writing it yourself.
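For illustration only, a Currency.avsc for this table might look roughly like the following (the namespace, field names, and nullability are assumptions; ksqlDB typically registers upper-cased, nullable value columns, so check the schema it actually registered in your registry). You can then generate the C# class with the Apache.Avro.Tools CLI: dotnet avrogen -s Currency.avsc .

{
  "type": "record",
  "name": "Currency",
  "namespace": "ksql",
  "fields": [
    { "name": "ID", "type": ["null", "int"], "default": null },
    { "name": "NAME", "type": ["null", "string"], "default": null }
  ]
}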
I have solved the problem by defining my own custom serializers, implementing the ISerializer<T> and IDeserializer<T> interfaces, which under the hood are just wrappers over System.Text.Json.JsonSerializer (or Newtonsoft.Json).
Serializer / deserializer
public class MySerializer<T> : ISerializer<T>, IDeserializer<T>
{
    public byte[] Serialize(T data, SerializationContext context)
    {
        var str = System.Text.Json.JsonSerializer.Serialize(data); // you can also use Newtonsoft here
        var bytes = Encoding.UTF8.GetBytes(str);
        return bytes;
    }

    public T Deserialize(ReadOnlySpan<byte> data, bool isNull, SerializationContext context)
    {
        // the mirror image: the bytes are UTF-8 JSON, so parse them straight back
        return System.Text.Json.JsonSerializer.Deserialize<T>(data);
    }
}
Usage
var config = new ConsumerConfig
{
    GroupId = kafkaConfig.ConsumerGroup,
    BootstrapServers = kafkaConfig.ServerUrl,
    AutoOffsetReset = AutoOffsetReset.Earliest
};
IConsumer<int, Currency> consumer = new ConsumerBuilder<int, Currency>(config)
    .SetValueDeserializer(new MySerializer<Currency>())
    .Build();
ConsumeResult<int, Currency> result = consumer.Consume();
P.S.
I am not even using the schema registry here after I implemented the interfaces.
As part of an ML automation process I want to dynamically create a new AutoML model. I'm using C# (.NET Framework) and Google.Cloud.AutoML.V1.
After trying to run the CreateDataset code:
var autoMlClient = AutoMlClient.Create();
var parent = LocationName.FromProjectLocation(_projectId, _locationId);
var dataset = new Google.Cloud.AutoML.V1.Dataset();
dataset.DisplayName = "NewDataSet";
var response = autoMlClient.CreateDataset(parent, dataset);
I get the following error:
Field: dataset.dataset_metadata; Message: Required field not set
According to this user manual I should set the dataset metadata type, but the list contains only specific types of classification (Translation/ImageClassification etc.); I can't find a plain classification type.
How do I create a simple classification dataset with the API? In the AutoML UI it's just a simple button click ("NEW DATASET"), where I only have to provide a name & region, no classification type.
I also tried to set:
dataset.TextClassificationDatasetMetadata =
new TextClassificationDatasetMetadata() { ClassificationType = ClassificationType.Multiclass };
But I was unable to import data into it (I got too many errors about invalid inputs from the input CSV file); I guess that's because the input format is not suitable for text classification.
UPDATE
I've just noticed that the NuGet works with AutoML v1, but v1beta1 does contain the TablesDatasetMetadata dataset metadata type for normal classifications. I'm speechless.
I also experienced this scenario today while creating a dataset using the NodeJS client. Since the Google AutoML Tables service is at the beta level, you need to use the beta version of the AutoML client; in the Google Cloud documentation they use the beta client to create a dataset.
In NodeJS, importing the beta version, require('@google-cloud/automl').v1beta1.AutoMlClient, instead of the normal version (v1), require('@google-cloud/automl').v1, worked for me to successfully execute the create-dataset functionality.
In C# you can achieve the same through a POST request. Hope this helps :)
After @RajithaWarusavitarana's comment, and my last question update, below is the code that did the trick. The token is generated by the GoogleClientAPI NuGet, and AutoML is handled by REST.
string GcpGlobalEndPointUrl = "https://automl.googleapis.com";
string GcpGlobalLocation = "us-central1"; // api "parent" parameter

public string GetToken(string jsonFilePath)
{
    var serviceAccountCredentialFileContents = System.IO.File.ReadAllText(jsonFilePath);
    var credentialParameters = NewtonsoftJsonSerializer.Instance.Deserialize<JsonCredentialParameters>(serviceAccountCredentialFileContents);
    var initializer = new ServiceAccountCredential.Initializer(credentialParameters.ClientEmail)
    {
        Scopes = new List<string> { "https://www.googleapis.com/auth/cloud-platform" }
    };
    var cred = new ServiceAccountCredential(initializer.FromPrivateKey(credentialParameters.PrivateKey));
    string accessToken = cred.GetAccessTokenForRequestAsync("https://oauth2.googleapis.com/token").Result;
    return accessToken;
}

public void GetDataSetList(string projectId, string token)
{
    var restClient = new RestClient(GcpGlobalEndPointUrl);
    var createDataSetReqUrl = $"v1beta1/projects/{projectId}/locations/{GcpGlobalLocation}/datasets";
    var createDataSetReq = new RestRequest(createDataSetReqUrl, Method.GET);
    createDataSetReq.AddHeader("Authorization", $"Bearer {token}");
    var createDatasetResponse = restClient.Execute(createDataSetReq);
    createDatasetResponse.Dump();
}
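A create-dataset call against the same v1beta1 surface should then just be a POST. A hypothetical sketch mirroring the GET above (per the v1beta1 REST reference, the empty tablesDatasetMetadata object is what selects the Tables dataset type):

public void CreateTablesDataSet(string projectId, string token)
{
    var restClient = new RestClient(GcpGlobalEndPointUrl);
    var createDataSetReqUrl = $"v1beta1/projects/{projectId}/locations/{GcpGlobalLocation}/datasets";
    var createDataSetReq = new RestRequest(createDataSetReqUrl, Method.POST);
    createDataSetReq.AddHeader("Authorization", $"Bearer {token}");
    // body: {"displayName":"NewDataSet","tablesDatasetMetadata":{}}
    createDataSetReq.AddJsonBody(new
    {
        displayName = "NewDataSet",
        tablesDatasetMetadata = new { }
    });
    var createDatasetResponse = restClient.Execute(createDataSetReq);
    createDatasetResponse.Dump();
}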
I took the token generation code from google-api-dotnet-client Test File
I have a model that looks like this:
public class Facility
{
    [Key]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public int Id { get; set; }

    public NetTopologySuite.Geometries.Point Location { get; set; }
}
Test code for adding a Point:
var testFacility = new Facility();
testFacility.Location = new NetTopologySuite.Geometries.Point(13.003725d, 55.604870d) { SRID = 3857 };
//Other values tested with the same error
//testFacility.Location = new NetTopologySuite.Geometries.Point(13.003725d, 55.604870d);
//testFacility.Location = new NetTopologySuite.Geometries.Point(55.604870d, 13.003725d);
//var geometryFactory = NtsGeometryServices.Instance.CreateGeometryFactory(srid: 3857);
//var currentLocation = geometryFactory.CreatePoint(new Coordinate(13.003725d, 55.604870d));
//testFacility.Location = currentLocation;
db.Facilities.Add(testFacility);
//Exception on Save
db.SaveChanges();
I'm using the following NuGets, version 3.1.0
Microsoft.AspNetCore.Identity.EntityFrameworkCore
Microsoft.EntityFrameworkCore.SqlServer
Microsoft.EntityFrameworkCore.Tools
Microsoft.EntityFrameworkCore.SqlServer.NetTopologySuite
The exception I get on save is the following:
SqlException: The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Parameter 7 ("@p6"): The supplied value is not a valid instance of data type geography. Check the source data for invalid values. An example of an invalid value is data of numeric type with scale greater than precision.
According to all documentation it should be X for longitude and Y for latitude, so I don't think that is the problem. I tried to reverse the coordinates just in case, but I got the same error, as you can see in the examples I tried.
https://learn.microsoft.com/en-us/ef/core/modeling/spatial
Lat = Y Long = X
https://gis.stackexchange.com/a/68856/71364
I can't find anything obvious that seems wrong. The options builder is set up, and the table is created with the data type geography, which worked really well with DbGeography in Entity Framework 6.
var optionsBuilder = new DbContextOptionsBuilder<ApplicationDbContext>();
optionsBuilder.UseSqlServer("Server=(localdb)\\mssqllocaldb;Database=TestDb;Trusted_Connection=True;MultipleActiveResultSets=true",
x => x.UseNetTopologySuite());
var db = new ApplicationDbContext(optionsBuilder.Options);
There are no specific cases to handle for a single Point either, as far as I can see in the documentation for SQL Server.
https://learn.microsoft.com/en-us/ef/core/modeling/spatial#sql-server
The coordinates I'm saving are from Google Maps, and therefore EPSG 3857 is used.
https://gis.stackexchange.com/questions/48949/epsg-3857-or-4326-for-googlemaps-openstreetmap-and-leaflet
What am I missing?
TLDR
The SRID is not present in SQL Server's sys.spatial_reference_systems.
Change to one that exists, like 4326, and it will work:
select *
from sys.spatial_reference_systems
where spatial_reference_id = '4326'
Long answer:
The Google Maps API uses EPSG 3857, but the Google Maps web application uses EPSG 4326:
https://developers.google.com/maps/documentation/javascript/markers
https://www.google.com/maps/#55.604933,13.003662,14z
Therefore a point from Google Maps Web Application should be created and saved like this:
var testFacility = new Facility();
testFacility.Location = new NetTopologySuite.Geometries.Point(13.003725d, 55.604870d) { SRID = 4326 };
db.Facilities.Add(testFacility);
db.SaveChanges();
It was, however, a bit tricky to project EPSG 4326 coordinates to the EPSG 3857 coordinate system. Microsoft recommends ProjNet4GeoAPI, so I decided to use that.
https://learn.microsoft.com/en-us/ef/core/modeling/spatial#srid-ignored-during-client-operations
I have verified that it works here:
http://epsg.io/transform#s_srs=4326&t_srs=3857&x=13.003725&y=55.604870
Example conversion:
var x = 13.003725d;
var y = 55.604870d;
var epsg3857ProjectedCoordinateSystem = ProjNet.CoordinateSystems.ProjectedCoordinateSystem.WebMercator;
var epsg4326GeographicCoordinateSystem = ProjNet.CoordinateSystems.GeographicCoordinateSystem.WGS84;
var coordinateTransformationFactory = new ProjNet.CoordinateSystems.Transformations.CoordinateTransformationFactory();
var coordinateTransformation = coordinateTransformationFactory.CreateFromCoordinateSystems(epsg4326GeographicCoordinateSystem, epsg3857ProjectedCoordinateSystem);
var epsg4326Coordinate = new GeoAPI.Geometries.Coordinate(x, y);
var epsg3857Coordinate = coordinateTransformation.MathTransform.Transform(epsg4326Coordinate);
Complete example program:
To get it running:
Install NuGets
The following NuGets are at version 3.1:
Microsoft.EntityFrameworkCore
Microsoft.EntityFrameworkCore.SqlServer
Microsoft.EntityFrameworkCore.Tools
Microsoft.EntityFrameworkCore.SqlServer.NetTopologySuite
ProjNET4GeoAPI
Add-Migration InitialCreate
Update-Database
Code:
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Design;
using NetTopologySuite;
using NetTopologySuite.Geometries;
using ProjNet.CoordinateSystems;
using ProjNet.CoordinateSystems.Transformations;
using System;
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

namespace TestConsoleAppEFGeo
{
    public class ApplicationDbContextFactory : IDesignTimeDbContextFactory<ApplicationDbContext>
    {
        public ApplicationDbContext CreateDbContext(string[] args)
        {
            var optionsBuilder = new DbContextOptionsBuilder<ApplicationDbContext>();
            optionsBuilder.UseSqlServer("Server=(localdb)\\mssqllocaldb;Database=TestApp;Trusted_Connection=True;MultipleActiveResultSets=true",
                x => x.UseNetTopologySuite());
            return new ApplicationDbContext(optionsBuilder.Options);
        }
    }

    public class ApplicationDbContext : DbContext
    {
        public ApplicationDbContext(DbContextOptions<ApplicationDbContext> options)
            : base(options)
        {
        }

        public virtual DbSet<Facility> Facilities { get; set; }

        protected override void OnModelCreating(ModelBuilder modelBuilder)
        {
            base.OnModelCreating(modelBuilder);
        }
    }

    public class Facility
    {
        [Key]
        [DatabaseGenerated(DatabaseGeneratedOption.None)]
        public int Id { get; set; }

        public NetTopologySuite.Geometries.Point Location { get; set; }
    }

    class Program
    {
        static void Main(string[] args)
        {
            var applicationDbContextFactory = new ApplicationDbContextFactory();
            var db = applicationDbContextFactory.CreateDbContext(null);

            var x = 13.003725d;
            var y = 55.604870d;
            var srid = 4326;

            if (!db.Facilities.AnyAsync(f => f.Id == 1).Result)
            {
                var testFacility = new Facility();
                var geometryFactory = NtsGeometryServices.Instance.CreateGeometryFactory(srid);
                var currentLocation = geometryFactory.CreatePoint(new NetTopologySuite.Geometries.Coordinate(x, y));
                testFacility.Id = 1;
                testFacility.Location = currentLocation;

                var testFacility2 = new Facility();
                testFacility2.Id = 2;
                testFacility2.Location = new Point(x, y) { SRID = srid };

                db.Facilities.Add(testFacility);
                db.Facilities.Add(testFacility2);

                //Will throw an exception
                //var testFacility3 = new Facility();
                //testFacility3.Id = 3;
                //testFacility3.Location = new Point(1447568.0454157612d, 7480155.2276327936d) { SRID = 3857 };
                //db.Facilities.Add(testFacility3);

                db.SaveChanges();
            }

            var facility1 = db.Facilities.FirstAsync(f => f.Id == 1).Result;
            var facility2 = db.Facilities.FirstAsync(f => f.Id == 2).Result;

            if (facility1.Location == facility2.Location)
            {
                Console.WriteLine("facility1.Location is equal to facility2.Location");
            }
            else
            {
                Console.WriteLine("facility1.Location is NOT equal to facility2.Location");
            }

            //Test conversion
            //Show coordinate: http://epsg.io/map#srs=4326&x=13.003725&y=55.604870&z=14&layer=streets
            //Conversion: http://epsg.io/transform#s_srs=4326&t_srs=3857&x=13.0037250&y=55.6048700
            //Google Maps - https://www.google.se/maps shows EPSG:4326 when viewing a location
            //https://epsg.io/3857 - Google Maps API is EPSG:3857 however
            //Example: https://developers.google.com/maps/documentation/javascript/markers
            var epsg3857ProjectedCoordinateSystem = ProjectedCoordinateSystem.WebMercator;
            var epsg4326GeographicCoordinateSystem = GeographicCoordinateSystem.WGS84;
            var coordinateTransformationFactory = new CoordinateTransformationFactory();
            var coordinateTransformation = coordinateTransformationFactory.CreateFromCoordinateSystems(epsg4326GeographicCoordinateSystem, epsg3857ProjectedCoordinateSystem);
            var epsg4326Coordinate = new GeoAPI.Geometries.Coordinate(facility1.Location.Coordinate.X, facility1.Location.Coordinate.Y);
            var epsg3857Coordinate = coordinateTransformation.MathTransform.Transform(epsg4326Coordinate);
        }
    }
}
More info here:
https://github.com/dotnet/efcore/issues/19416
It's probably already in 4326: happy days, easy to store, and SQL should let you store it. (An API might use 3857 but supply a location's lat/lon in degrees, not metres, in which case you've actually already been given the lat/lon in 4326.)
Assuming you are getting lat/lons in SRID=3857 and want to try to store them that way:
Check that you have a version of an SRID equivalent to 3857 that'll work in your DB:
SELECT * FROM sys.spatial_reference_systems
WHERE authorized_spatial_reference_id
IN('3857', '900913', '3587', '54004', '41001', '102113', '102100', '3785')
For example, if you happen to have 900913, try using it on a lat/lon insert with no conversion. I'm basing this assumption on comparing the properties of the hyperlinked "alternative codes" to EPSG:3857.
I have no idea if it'll work, and this is totally not my field of knowledge.
Assuming you get no SQL rows back, you'll have to convert 3857 to 4326 to store it in your DB...
How to convert 3857 to 4326 so you can store it:
Install ProjNet4GeoAPI via NuGet and use the following code:
using GeoAPI.Geometries;
using ProjNet.CoordinateSystems;
using ProjNet.CoordinateSystems.Transformations;
...
// setup
var epsg3857 = ProjectedCoordinateSystem.WebMercator;
var epsg4326 = GeographicCoordinateSystem.WGS84;
var convertTo4326 = new CoordinateTransformationFactory()
.CreateFromCoordinateSystems(epsg3857, epsg4326);
// input 6415816.17/171048.38 (Brussels lat/lon in meters SRID 3857)
// N.B. method called needs the values as lon/lat (x/y), not lat/lon
var coordIn3857 = new GeoAPI.Geometries.Coordinate(171048.38, 6415816.17);
var coordIn4326 = convertTo4326.MathTransform.Transform(coordIn3857);
// output 49.82379612579344/1.5365537407788388 (Brussels lat/lon in degrees SRID 4326)
now go save that into your DB
testFacility.Location = new NetTopologySuite.Geometries.Point(1.536553, 49.823796)
{ SRID = 4326 };
To convert in the other direction and use 3857 from stored 4326 values, it's pretty easy to figure out, or see Ogglas's answer.
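For reference, a quick sketch of that reverse setup with the same ProjNet types (the numbers mirror the example above):

// swap source and target compared to convertTo4326 above
var convertTo3857 = new CoordinateTransformationFactory()
    .CreateFromCoordinateSystems(GeographicCoordinateSystem.WGS84,
        ProjectedCoordinateSystem.WebMercator);
// input is lon/lat (x/y) in degrees
var coordIn4326 = new GeoAPI.Geometries.Coordinate(1.536553, 49.823796);
var coordIn3857 = convertTo3857.MathTransform.Transform(coordIn4326);
// coordIn3857.X / coordIn3857.Y are back in metres, roughly 171048.38 / 6415816.17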
Saving Disk information from Azure:
var credentials = SdkContext.AzureCredentialsFactory.FromServicePrincipal("myclientId", "mytenant", "mysecretId", AzureEnvironment.AzureGlobalCloud);
var azure = Azure.Authenticate(credentials).WithSubscription("mySubscription");
var groupName = "myResourceGroup";
var vmName = "myVM";
var location = Region.USWest;
var vm = azure.Disks.List();
Console.WriteLine("Getting information about the virtual machine...");
MongoClient client = new MongoClient("mylocalhost");
IMongoDatabase database = client.GetDatabase("VM");
var collection = database.GetCollection<IDisk>("Disk");
collection.InsertManyAsync(vm);
When I save them to Mongodb, I get an error:
maximum serialization depth exceeded (does the object being serialized have a circular reference?).
What am I doing wrong here?
It sounds like the IDisk that you're getting back from that API resolves to a circular object graph, which MongoDB isn't very happy with. The simplest fix, then, is: don't serialize the IDisk - after all, it isn't your type, and you can't control it. Instead, create a type of your own that has exactly what you want on it, and serialize that. For example:
sealed class MyDisk // TODO: rename me
{
    public string Key { get; set; }
    public string Name { get; set; }
    public string RegionName { get; set; }
    public int SizeGB { get; set; }
    // etc, but *only* the information you actually want, and *only* using
    // primitive types or types you control
}
...
var disks = (from disk in azure.Disks.List()
             select new MyDisk
             {
                 Key = disk.Key,
                 Name = disk.Name,
                 RegionName = disk.RegionName,
                 SizeGB = disk.SizeInGB,
                 // etc
             }).ToList();
And store disks, which we know doesn't have a circular reference, because we control it.
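Inserting them is then straightforward; something like this (the collection name here is illustrative, and note the insert is awaited this time):

var diskCollection = database.GetCollection<MyDisk>("Disks");
await diskCollection.InsertManyAsync(disks);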
I am new to Google APIs. I want to know how to call the Google Dialogflow API in C# to get the intent from input text, but I can't find any example of calling Dialogflow using C#.
Please provide some example of calling Dialogflow from C#.
If I understand your question correctly, you want to call the DialogFlow API from within a C# application (rather than writing fulfillment endpoint(s) that are called from DialogFlow). If that's the case, here's a sample for making that call:
using Google.Cloud.Dialogflow.V2;
...
var query = new QueryInput
{
    Text = new TextInput
    {
        Text = "Something you want to ask a DF agent",
        LanguageCode = "en-us"
    }
};

var sessionId = "SomeUniqueId";
var agent = "MyAgentName"; // the GCP project id of the agent, which SessionName expects
var creds = GoogleCredential.FromJson("{ json google credentials file)");
var channel = new Grpc.Core.Channel(SessionsClient.DefaultEndpoint.Host,
    creds.ToChannelCredentials());
var client = SessionsClient.Create(channel);

var dialogFlow = client.DetectIntent(
    new SessionName(agent, sessionId),
    query);

channel.ShutdownAsync();
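From there you can read the matched intent and reply off the response, e.g.:

var matchedIntent = dialogFlow.QueryResult.Intent.DisplayName;
var reply = dialogFlow.QueryResult.FulfillmentText;
Console.WriteLine($"Intent '{matchedIntent}' answered: {reply}");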
In an earlier version of the DialogFlow API I was running into file-locking issues when trying to re-deploy a Web API project, which the channel.ShutdownAsync() seemed to solve. I think this has been fixed in a recent release.
This is the simplest version of a DF request I've used. There is a more complicated version that passes in an input context in this post:
Making DialogFlow v2 DetectIntent Calls w/ C# (including input context)
(Nitpicking: I assume you know that DialogFlow will call your code as specified/registered in the action at DialogFlow? So your code can only respond to DialogFlow, not call it.)
Short answer/redirect:
Don't use Google.Apis.Dialogflow.v2 (with GoogleCloudDialogflowV2WebhookRequest and GoogleCloudDialogflowV2WebhookResponse) but use Google.Cloud.Dialogflow.v2 (with WebhookRequest and WebhookResponse) - see this eTag error. I will also mention some other alternatives below.
Google.Cloud.Dialogflow.v2
Using Google.Cloud.Dialogflow.v2 NuGet (Edit: FWIW: this code was written for the beta-preview):
[HttpPost]
public dynamic PostWithCloudResponse([FromBody] WebhookRequest dialogflowRequest)
{
    var intentName = dialogflowRequest.QueryResult.Intent.DisplayName;
    var actualQuestion = dialogflowRequest.QueryResult.QueryText;
    var testAnswer = $"Dialogflow Request for intent '{intentName}' and question '{actualQuestion}'";

    var dialogflowResponse = new WebhookResponse
    {
        FulfillmentText = testAnswer,
        FulfillmentMessages =
        {
            new Intent.Types.Message
            {
                SimpleResponses = new Intent.Types.Message.Types.SimpleResponses
                {
                    SimpleResponses_ =
                    {
                        new Intent.Types.Message.Types.SimpleResponse
                        {
                            DisplayText = testAnswer,
                            TextToSpeech = testAnswer,
                            //Ssml = $"<speak>{testAnswer}</speak>"
                        }
                    }
                }
            }
        }
    };

    var jsonResponse = dialogflowResponse.ToString();
    return new ContentResult { Content = jsonResponse, ContentType = "application/json" };
}
Edit: It turns out that the model binding may not bind all properties from the 'ProtoBuf-json' correctly (e.g. WebhookRequest.outputContexts[N].parameters), so one should probably use the Google.Protobuf.JsonParser instead (e.g. see this documentation).
That parser may trip over unknown fields, so one probably also wants to ignore those. So now I use this code (one day I may make the generic method more generic, and thus more useful, by making HttpContext.Request.InputStream a parameter):
public ActionResult PostWithCloudResponse()
{
    var dialogflowRequest = ParseProtobufRequest<WebhookRequest>();
    ...
    var jsonResponse = dialogflowResponse.ToString();
    return new ContentResult { Content = jsonResponse, ContentType = "application/json" };
}

private T ParseProtobufRequest<T>() where T : Google.Protobuf.IMessage, new()
{
    // parse ProtoBuf-json (not 'normal' json) while allowing unknown fields,
    // else it may not bind the ProtoBuf message correctly
    // https://github.com/googleapis/google-cloud-dotnet/issues/2425 "ask the Protobuf code to parse the result"
    string requestBody;
    using (var reader = new StreamReader(HttpContext.Request.InputStream))
    {
        requestBody = reader.ReadToEnd();
    }
    var parser = new Google.Protobuf.JsonParser(JsonParser.Settings.Default.WithIgnoreUnknownFields(true));
    var typedRequest = parser.Parse<T>(requestBody);
    return typedRequest;
}
BTW: This 'ProtoBuf-json' is also the reason to use WebhookResponse.ToString() which in turn uses Google.Protobuf.JsonFormatter.ToDiagnosticString.
Microsoft's BotBuilder
Microsoft's BotBuilder packages and Visual Studio template.
I haven't used them yet, but I expect approximately the same kind of code.
Hand-written proprietary code
A simple example of incoming-request code (called an NLU response by Google) is provided by Madoka Chiyoda (Chomado) on GitHub. The incoming call is simply parsed into her DialogFlowResponseModel:
public static async Task<HttpResponseMessage> Run([...]HttpRequestMessage req, [...]CloudBlockBlob mp3Out, TraceWriter log)
...
var data = await req.Content.ReadAsAsync<Models.DialogFlowResponseModel>();
Gactions
If you plan to work without DialogFlow later on, please note that the interface for Gactions differs significantly from the interface with DialogFlow.
The json parameters and return values have some overlap, but nothing that gains you any programming time (you'll probably lose some time by starting 'over').
However, starting with DialogFlow may gain you some quick dialog experience (e.g. question & answer design/prototyping).
And the DialogFlow API does have a NuGet package, whereas the Gactions interface does not have a NuGet package just yet.