I am writing sample applications in .NET Core to interact with Kafka.
I have downloaded the official Kafka and ZooKeeper Docker images to my machine.
I am using the Confluent.Kafka NuGet package for both the producer and the consumer. I am able to produce messages to Kafka, but my consumer part is not working.
Below are my producer and consumer code snippets. I am not sure what mistake I'm making here.
Do we need to explicitly create a consumer group?
Consumer code (this is not working; the thread waits at consumer.Consume(cToken)):
var config = new ConsumerConfig
{
    BootstrapServers = "localhost:9092",
    GroupId = "myroupidd",
    AutoOffsetReset = AutoOffsetReset.Earliest,
};
var ctokenSource = new CancellationTokenSource();
var cToken = ctokenSource.Token;
var consumerBuilder = new ConsumerBuilder<Null, string>(config);
consumerBuilder.SetPartitionsAssignedHandler((consumer, partitionlist) =>
{
    consumer.Assign(new TopicPartition("myactual-toppics", 0));
    Console.WriteLine("inside SetPartitionsAssignedHandler action");
});
using var consumer = consumerBuilder.Build();
consumer.Subscribe("myactual-toppics");
while (!cToken.IsCancellationRequested)
{
    var consumeResult = consumer.Consume(cToken);
    if (consumeResult.Message != null)
        Console.WriteLine(consumeResult.Message.Value);
}
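For reference, the plain subscribe/consume pattern from the Confluent.Kafka examples, with no manual Assign call inside the rebalance handler, would look roughly like this (a sketch; the group id and topic name are taken from the snippet above, and Ignore is used for the key since the producer sends Null keys):

using System;
using System.Threading;
using Confluent.Kafka;

var config = new ConsumerConfig
{
    BootstrapServers = "localhost:9092",
    GroupId = "myroupidd",              // groups are created on first use; no explicit creation needed
    AutoOffsetReset = AutoOffsetReset.Earliest,
};

using var cts = new CancellationTokenSource();
using var consumer = new ConsumerBuilder<Ignore, string>(config).Build();
consumer.Subscribe("myactual-toppics");
try
{
    while (!cts.Token.IsCancellationRequested)
    {
        var result = consumer.Consume(cts.Token);   // blocks until a message arrives or the token is cancelled
        Console.WriteLine(result.Message.Value);
    }
}
catch (OperationCanceledException) { }
finally
{
    consumer.Close();                               // leave the group cleanly
}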
Producer code (this is working fine; I can see the messages using the Conduktor tool):
var config = new ProducerConfig
{
    BootstrapServers = "localhost:9092",
    ClientId = Dns.GetHostName(),
};
using (var producer = new ProducerBuilder<Null, string>(config).Build())
{
    while (!stoppingToken.IsCancellationRequested)
    {
        var top = new TopicPartition("myactual-toppics", 0);
        var result = await producer.ProduceAsync(top, new Message<Null, string> { Value = "My First Message" });
        Console.WriteLine("Publishedss1234");
        await Task.Delay(5000, stoppingToken);
    }
}
I'm trying to create a topic (an Event Hub) programmatically from the Kafka interface using AdminClient.CreateTopicsAsync. This works when connecting to Kafka, but not to Event Hub. I'm running into the following error:
Default partition count (KIP-464) not supported by broker, requires
broker version <= 2.4.0
using Confluent.Kafka;
using Confluent.Kafka.Admin;
var adminClient =
new AdminClientBuilder(
new[] {
("sasl.mechanism","PLAIN"),
("security.protocol","SASL_SSL"),
("bootstrap.servers", Address),
("sasl.username", "$ConnectionString"),
("sasl.password", ConnectionString),
}.Select((kvp) => new KeyValuePair<string, string>(kvp.Item1, kvp.Item2))
)
.Build();
await adminClient.CreateTopicsAsync(new[] {
new TopicSpecification {
Name = "test-topic"
}
});
It complains that using a default number of partitions is not supported, but as far as I can tell I can't provide one, as the underlying librdkafka does not support it.
The only information I could find by googling is that someone in 2021 did get it to work.
This code works for me on both Kafka and EventHub.
using (var kafkaProducer = new ProducerBuilder<string, string>(producerConfig).Build())
{
    using (var adminClient = new DependentAdminClientBuilder(kafkaProducer.Handle).Build())
    {
        var metaData = adminClient.GetMetadata(TimeSpan.FromSeconds(5));
        var topicInfo = metaData.Topics
            .Where(tp => string.Equals(fullTopicName, tp.Topic, StringComparison.OrdinalIgnoreCase))
            .FirstOrDefault();
        if (topicInfo == null)
        {
            var t = new Confluent.Kafka.Admin.TopicSpecification
            {
                Name = fullTopicName,
                // at least 2 partitions
                NumPartitions = kafkaTestConfig.CreateTopicOptions.NumPartitions,
                // at least 1 replication factor
                ReplicationFactor = kafkaTestConfig.CreateTopicOptions.ReplicationFactor,
                Configs = kafkaTestConfig.CreateTopicOptions.DynamicConfigs,
            };
            var o = new CreateTopicsOptions { OperationTimeout = TimeSpan.FromMilliseconds(_timeout) };
            AssertES.True(
                adminClient.CreateTopicsAsync(new List<Confluent.Kafka.Admin.TopicSpecification> { t }, o).Wait(_timeout),
                "Failed to create topic in time: " + fullTopicName);
        }
    }
}
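The relevant difference from the failing snippet is that NumPartitions and ReplicationFactor are set explicitly, so librdkafka never sends the KIP-464 "use broker default" value that Event Hub rejects. If you'd rather keep the plain AdminClientBuilder approach, the same idea in isolation would be (a sketch; the counts are illustrative and untested against Event Hub):

await adminClient.CreateTopicsAsync(new[]
{
    new TopicSpecification
    {
        Name = "test-topic",
        NumPartitions = 1,      // explicit count instead of the KIP-464 default (-1)
        ReplicationFactor = 1   // explicit instead of the broker default
    }
});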
I am trying to add multiple schemas to the same subject in the Schema Registry, so I have set ValueSubjectNameStrategy to SubjectNameStrategy.TopicRecord and also set automatic registration to AutomaticRegistrationBehavior.Always. But when auto-registering the schema it still uses the SubjectNameStrategy.Topic strategy.
var schemaRegistryConfig = new SchemaRegistryConfig { Url = "http://localhost:8081", ValueSubjectNameStrategy = SubjectNameStrategy.TopicRecord };
var registry = new CachedSchemaRegistryClient(schemaRegistryConfig);
var builder = new ProducerBuilder<string, SplitLineKGN>(KafkaConfig.Producer.GetConfig(_config.GetSection("KafkaProducer")))
.SetAvroValueSerializer(registry, registerAutomatically: AutomaticRegistrationBehavior.Always)
.SetErrorHandler((_, error) => Console.Error.WriteLine(error.ToString()));
_producerMsg = builder.Build();
await _producerMsg.ProduceAsync("MyTopic", new Message<string, SampleMessage> { Key = key, Value = line });
How do I auto-register multiple schemas for one topic?
Ensure that you actually changed the subject naming strategy for the topic. SchemaRegistryConfig.ValueSubjectNameStrategy is deprecated; it should now be configured through the serializer's configuration.
For producing multiple event types with a single producer you have to use AvroSerializer<ISpecificRecord>, as described below:
var schemaRegistryConfig = new SchemaRegistryConfig { Url = "http://localhost:8081" };
using var schemaRegistryClient = new CachedSchemaRegistryClient(schemaRegistryConfig);
var avroSerializerConfig = new AvroSerializerConfig
{
SubjectNameStrategy = SubjectNameStrategy.TopicRecord,
AutoRegisterSchemas = true // (the default)
};
// Assuming this is your own custom code because the Confluent
// producer doesn't have anything like this.
var producerConfig = KafkaConfig.Producer.GetConfig(_config.GetSection("KafkaProducer"));
using var producer = new ProducerBuilder<string, ISpecificRecord>(producerConfig)
.SetValueSerializer(new AvroSerializer<ISpecificRecord>(schemaRegistryClient, avroSerializerConfig))
.SetErrorHandler((_, error) => Console.Error.WriteLine(error))
.Build();
var deliveryResult = await producer.ProduceAsync("MyTopic", new Message<string, ISpecificRecord>
{
Key = key,
Value = line
});
Console.WriteLine($"Delivered to: {deliveryResult.TopicPartitionOffset}");
I struggle with understanding how the LSP client side works. I think I understand the theory of the communication (JSON-RPC / LSP protocol basics), but I struggle with the existing libraries that are used for this in VS Code, and trying to rewrite them seems pointless, especially client-side, where I do not feel proficient at all.
All the examples I see provide a path to the server executable, so the LSP client can start it.
That makes sense, but I'd rather avoid it during development; I'd want to have the server open in debugging mode and just start VS Code.
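One common workaround for exactly this (not specific to any LSP library) is to let VS Code launch the server normally and have the server block at startup until a debugger is attached:

// At the very top of the server's Main: wait for a debugger, so the
// client can launch the process and you can still attach from the IDE.
#if DEBUG
while (!System.Diagnostics.Debugger.IsAttached)
{
    System.Threading.Thread.Sleep(100);
}
#endif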
I tried to start with a basic-of-basics server implementation (C#):
public class Server
{
private JsonRpc RPC { get; set; }
public async Task Start()
{
Log.Logger = new LoggerConfiguration()
.MinimumLevel.Debug()
.WriteTo.Console()
.CreateLogger();
var pipeName = "LSP_Pipe";
var writerPipe = new NamedPipeClientStream(pipeName);
var readerPipe = new NamedPipeClientStream(pipeName);
await writerPipe.ConnectAsync(10_000);
await readerPipe.ConnectAsync(10_000);
Log.Information("RPC Listening");
RPC = new JsonRpc(writerPipe, readerPipe, this);
RPC.StartListening();
this.RPC.Disconnected += RPC_Disconnected;
await Task.Delay(-1);
}
private void RPC_Disconnected(object sender, JsonRpcDisconnectedEventArgs e)
{
Log.Information("Disconnected");
}
[JsonRpcMethod(RPCMethods.InitializeName)]
public object Initialize(JToken arg)
{
Log.Information("Initialization");
var serializer = new JsonSerializer()
{
ContractResolver = new ResourceOperationKindContractResolver()
};
var param = arg.ToObject<InitializeParams>();
var clientCapabilities = param?.Capabilities;
var capabilities = new ServerCapabilities
{
TextDocumentSync = new TextDocumentSyncOptions(),
CompletionProvider = new CompletionOptions(),
SignatureHelpProvider = new SignatureHelpOptions(),
ExecuteCommandProvider = new ExecuteCommandOptions(),
DocumentRangeFormattingProvider = false,
};
capabilities.TextDocumentSync.Change = TextDocumentSyncKind.Incremental;
capabilities.TextDocumentSync.OpenClose = true;
capabilities.TextDocumentSync.Save = new SaveOptions { IncludeText = true };
capabilities.CodeActionProvider = clientCapabilities?.Workspace?.ApplyEdit ?? true;
capabilities.DefinitionProvider = true;
capabilities.ReferencesProvider = true;
capabilities.DocumentSymbolProvider = true;
capabilities.WorkspaceSymbolProvider = false;
capabilities.RenameProvider = true;
capabilities.HoverProvider = true;
capabilities.DocumentHighlightProvider = true;
return new InitializeResult { Capabilities = capabilities };
}
}
but I'm unable to set up the client with the vscode-languageclient/node libraries, even enough to reach the Log.Information("Initialization"); part.
How can I tell them how to communicate, e.g. the name of a named pipe? Or just HTTP posts?
I'm not proficient in JS/Node development at all, so sorry for every foolish question.
I've seen mature/production-grade C# language server implementations, but I'm overwhelmed just by their builders; there is so much stuff happening, and that's why I'd want to write the server from scratch but use existing libs for the client:
var server = await LanguageServer.From(
options =>
options
.WithInput(Console.OpenStandardInput())
.WithOutput(Console.OpenStandardOutput())
.ConfigureLogging(
x => x
.AddSerilog(Log.Logger)
.AddLanguageProtocolLogging()
.SetMinimumLevel(LogLevel.Debug)
)
.WithHandler<TextDocumentHandler>()
.WithHandler<DidChangeWatchedFilesHandler>()
.WithHandler<FoldingRangeHandler>()
.WithHandler<MyWorkspaceSymbolsHandler>()
.WithHandler<MyDocumentSymbolHandler>()
.WithHandler<SemanticTokensHandler>()
.WithServices(x => x.AddLogging(b => b.SetMinimumLevel(LogLevel.Trace)))
.WithServices(
services => {
services.AddSingleton(
provider => {
var loggerFactory = provider.GetService<ILoggerFactory>();
var logger = loggerFactory.CreateLogger<Foo>();
logger.LogInformation("Configuring");
return new Foo(logger);
}
);
services.AddSingleton(
new ConfigurationItem {
Section = "typescript",
}
).AddSingleton(
new ConfigurationItem {
Section = "terminal",
}
);
}
)
.OnInitialize(
async (server, request, token) => {
var manager = server.WorkDoneManager.For(
request, new WorkDoneProgressBegin {
Title = "Server is starting...",
Percentage = 10,
}
);
workDone = manager;
await Task.Delay(2000);
manager.OnNext(
new WorkDoneProgressReport {
Percentage = 20,
Message = "loading in progress"
}
);
}
)
.OnInitialized(
async (server, request, response, token) => {
workDone.OnNext(
new WorkDoneProgressReport {
Percentage = 40,
Message = "loading almost done",
}
);
await Task.Delay(2000);
workDone.OnNext(
new WorkDoneProgressReport {
Message = "loading done",
Percentage = 100,
}
);
workDone.OnCompleted();
}
)
.OnStarted(
async (languageServer, token) => {
using var manager = await languageServer.WorkDoneManager.Create(new WorkDoneProgressBegin { Title = "Doing some work..." });
manager.OnNext(new WorkDoneProgressReport { Message = "doing things..." });
await Task.Delay(10000);
manager.OnNext(new WorkDoneProgressReport { Message = "doing things... 1234" });
await Task.Delay(10000);
manager.OnNext(new WorkDoneProgressReport { Message = "doing things... 56789" });
var logger = languageServer.Services.GetService<ILogger<Foo>>();
var configuration = await languageServer.Configuration.GetConfiguration(
new ConfigurationItem {
Section = "typescript",
}, new ConfigurationItem {
Section = "terminal",
}
);
var baseConfig = new JObject();
foreach (var config in languageServer.Configuration.AsEnumerable())
{
baseConfig.Add(config.Key, config.Value);
}
logger.LogInformation("Base Config: {Config}", baseConfig);
var scopedConfig = new JObject();
foreach (var config in configuration.AsEnumerable())
{
scopedConfig.Add(config.Key, config.Value);
}
logger.LogInformation("Scoped Config: {Config}", scopedConfig);
}
)
);
Thanks in advance
I'm trying to use DialogFlow API v2 with Unity.
Since there's no official SDK for Unity yet, I used the gRPC beta Unity SDK and the C# code I generated with Protobuf and protoc from the gRPC tools.
The gRPC beta Unity SDK is hidden at this link: https://packages.grpc.io/. Just click a build ID and you will find a built Unity package.
I imported Google.Apis.Auth.OAuth2 and Grpc.Auth, which weren't included in the official gRPC Unity beta SDK.
Then I wrote this code, which seems to work fine except that await responseStream.MoveNext() gets stuck.
I believe the main reason is that I'm not sure where to set the path to the endpoint, which is '/v2/projects/project-id/agent/intents'.
GoogleCredential credential = GoogleCredential.FromJson(privateKey);
Grpc.Core.Channel channel = new Grpc.Core.Channel("dialogflow.googleapis.com", credential.ToChannelCredentials());
var client = new SessionsClient(channel);
CallOptions options = new CallOptions();
var duplexStream = client.StreamingDetectIntent();
var responseHandlerTask = System.Threading.Tasks.Task.Run(async () =>
{
IAsyncEnumerator<StreamingDetectIntentResponse> responseStream = duplexStream.ResponseStream;
while (await responseStream.MoveNext())//stuck here
{
StreamingDetectIntentResponse response = responseStream.Current;
}
// The response stream has completed
});
// Send requests to the server
bool done = false;
while (!done)
{
// Initialize a request
var queryInput = new QueryInput();
queryInput.AudioConfig = new InputAudioConfig();
queryInput.AudioConfig.LanguageCode = "ja";
queryInput.AudioConfig.SampleRateHertz = 141000;
queryInput.AudioConfig.AudioEncoding = AudioEncoding.Linear16;
StreamingDetectIntentRequest request = new StreamingDetectIntentRequest
{
Session = "",
QueryInput = queryInput,
};
var bytes = File.ReadAllBytes("test.wav");
request.InputAudio = Google.Protobuf.ByteString.CopyFrom(bytes);
try
{
await duplexStream.RequestStream.WriteAsync(request);
}
catch (System.Exception e)
{
context.Post(state =>
{
Debug.LogErrorFormat("{0}\n{1}\n{2}\n{3}", e.Message, e.HelpLink, e.Source, e.StackTrace);
}, null);
}
done = true;
}
await duplexStream.RequestStream.CompleteAsync();
await responseHandlerTask;
Thanks in advance.
I didn't add the correct session to the request. The following fixed it:
StreamingDetectIntentRequest request = new StreamingDetectIntentRequest
{
Session = "projects/project-id/agent/sessions/sessionid",
QueryInput = queryInput,
};
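For the record, the v2 session path always has the same shape; only the project and session ids vary. A sketch with placeholder values:

// "projects/{projectId}/agent/sessions/{sessionId}"; the session id is
// any caller-chosen string that groups the turns of one conversation.
var sessionPath = $"projects/{projectId}/agent/sessions/{sessionId}";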
I'm using Kafka 0.8.1.1 on a Red Hat VM with the kafka-net client. How can I configure my consumer to stop receiving earlier messages from Kafka?
My consumer code:
var options = new KafkaOptions(new Uri("tcp://199.53.249.150:9092"), new Uri("tcp://199.53.249.151:9092"));
Stopwatch sp = new Stopwatch();
var router = new BrokerRouter(options);
var consumer = new Consumer(new ConsumerOptions("Test", router));
ThreadStart start2 = () =>
{
    while (true)
    {
        sp.Start();
        foreach (var message in consumer.Consume())
        {
            // Decode once and reuse the result instead of decoding twice.
            var decoded = MessageDecoderReceiver.MessageBase(message.Value);
            if (decoded != null)
                PrintMessage(decoded.ToString());
            else
                Console.WriteLine(message.Value);
        }
        sp.Stop();
    }
};
var thread2 = new Thread(start2);
thread2.Start();
The Consumer in kafka-net does not currently auto-track the offsets being consumed, so you will have to implement the offset tracking manually.
To store the offset in Kafka version 0.8.1:
var commit = new OffsetCommitRequest
{
ConsumerGroup = consumerGroup,
OffsetCommits = new List<OffsetCommit>
{
new OffsetCommit
{
PartitionId = partitionId,
Topic = IntegrationConfig.IntegrationTopic,
Offset = offset,
Metadata = metadata
}
}
};
var commitResponse = conn.Connection.SendAsync(commit).Result.FirstOrDefault();
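If I remember the kafka-net API correctly, the committed offset can be read back with the matching fetch request, so a consumer can resume from it later (a sketch, reusing conn and consumerGroup from above):

var fetch = new OffsetFetchRequest
{
    ConsumerGroup = consumerGroup,
    Topics = new List<OffsetFetch>
    {
        new OffsetFetch
        {
            Topic = IntegrationConfig.IntegrationTopic,
            PartitionId = partitionId
        }
    }
};
// The response carries the last committed offset (and metadata) per partition.
var fetchResponse = conn.Connection.SendAsync(fetch).Result.FirstOrDefault();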
To set the consumer to start consuming from a specific offset position:
var offsets = consumer.GetTopicOffsetAsync(IntegrationConfig.IntegrationTopic).Result
.Select(x => new OffsetPosition(x.PartitionId, x.Offsets.Max())).ToArray();
var consumer = new Consumer(new ConsumerOptions(IntegrationConfig.IntegrationTopic, router), offsets);
Note the above code will set the consumer to start consuming at the very end of the log, effectively only receiving new messages.
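Conversely, to resume from previously stored positions instead of the log end, the OffsetPosition values can be built from saved offsets (a sketch; the partition and offset numbers are illustrative):

// Resume partition 0 at offset 1234 and partition 1 at offset 5678,
// e.g. values read back via OffsetFetchRequest or from your own store.
var resumePositions = new[]
{
    new OffsetPosition(0, 1234L),
    new OffsetPosition(1, 5678L),
};
var resumingConsumer = new Consumer(
    new ConsumerOptions(IntegrationConfig.IntegrationTopic, router), resumePositions);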