App terminates when using Thread in Xamarin/C#

I'm trying to run some code in a new thread because I'm noticing slowness on my device. It compiles fine, but when the app starts it freezes and terminates with the message "MyApp is presenting errors constantly".
What am I doing wrong?
using System.Threading.Tasks;

[assembly: XamlCompilation(XamlCompilationOptions.Compile)]
namespace MyApp
{
    public partial class App : Application
    {
        public App()
        {
            InitializeComponent();
        }

        protected override void OnStart()
        {
            new System.Threading.Thread(new System.Threading.ThreadStart(() =>
            {
                TestaLogin();
            })).Start();
        }

        private void TestaLogin()
        {
            try
            {
                Context mContext = Android.App.Application.Context;
                ISharedPreferences pref = PreferenceManager.GetDefaultSharedPreferences(mContext);
                string uid = pref.GetString("uid", "0");
                string token = pref.GetString("token", "0");
                if (uid == "0" || token == "0")
                {
                    Device.BeginInvokeOnMainThread(() => { MainPage = new Login(); });
                }
                else
                {
                    if (Logar(uid, token))
                        Device.BeginInvokeOnMainThread(() => { MainPage = new MainPage(uid); });
                    else
                        Device.BeginInvokeOnMainThread(() => { MainPage = new Login(); });
                }
            }
            catch (Exception e)
            {
                throw e;
            }
        }

        private bool Logar(string user, string pass)
        {
            try
            {
                using (WebClient client = new WebClient())
                {
                    return client.DownloadString("https://www.example.com/mobile/login.php?u=" + user + "&t=" + pass) == "0" ? false : true;
                }
            }
            catch
            {
                return false;
            }
        }
    }
}
I also wanted to know whether lines like the one below have to run on the main thread; I assumed so because they change the page:
Device.BeginInvokeOnMainThread(() => { MainPage = new MainPage(uid); });
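Two things are worth checking here (hedged observations, not a confirmed diagnosis): Xamarin.Forms expects MainPage to be assigned by the time the first page is shown, and OnStart above only starts a background thread, so there is a window in which no MainPage exists; also, an unhandled exception on a raw Thread (the throw e; in the catch block) terminates the whole process. A minimal sketch of one way to restructure OnStart, keeping TestaLogin and Logar from the question unchanged:
protected override void OnStart()
{
    // Assign a cheap placeholder synchronously so a MainPage always exists.
    MainPage = new ContentPage { Content = new ActivityIndicator { IsRunning = true } };

    // Task.Run instead of a raw Thread: an exception thrown inside the lambda
    // faults the Task instead of killing the process outright.
    Task.Run(() =>
    {
        try
        {
            TestaLogin();   // still swaps pages via Device.BeginInvokeOnMainThread, as in the question
        }
        catch (Exception ex)
        {
            System.Diagnostics.Debug.WriteLine(ex);   // log instead of "throw e;"
        }
    });
}
And to answer the last question directly: yes, any assignment to MainPage (or any other UI work) has to happen on the UI thread, so wrapping those assignments in Device.BeginInvokeOnMainThread is correct.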

Related

MAUI-Android: How to keep the Google Speech Recognizer from timing out

I am trying out Microsoft .NET MAUI, which is currently in the preview stage.
I am trying to make a small Android app that uses the Google voice recognizer service as a way to let the user navigate the app, just a small demo to see what I can do with it. This is also my first time actually writing a Xamarin/MAUI project, so I am not really sure what I can do with the platform.
The problem is that I would like this Google service to be always on (without a timeout), or to auto-close and then reopen when it times out. In short, I want the user to never actually have to deal with this screen:
My intention is that there will be a background thread that keeps asking the user to say the command, stopping only when the user does, so that the service is always ready to receive speech.
However, I am unable to keep the above service always on, or to auto-close and reopen it when it times out.
I have searched around and it seems that I cannot change the timeout of the service, so the only way is to auto-close and reopen it, but I don't know how.
Below is my code; could you give me some direction with it?
1. The login page: it only has username and password fields. The user will be asked to say the username; if it exists, they are then asked to say the password.
<ContentPage xmlns="http://schemas.microsoft.com/dotnet/2021/maui"
xmlns:x="http://schemas.microsoft.com/winfx/2009/xaml"
x:Class="MauiDemo.LoginPage"
BackgroundColor="White">
<ContentPage.Content>
<StackLayout Margin="30" VerticalOptions="StartAndExpand">
<Label
x:Name="lblTitle"
HorizontalTextAlignment="Center"
FontSize="Large"
FontAttributes="Bold"
/>
<Label/>
<Button
x:Name="btnSpeak"
Text="Start"
Clicked="btnSpeak_Clicked"
FontAttributes="Bold"
BackgroundColor="DarkGreen"
/>
<Label/>
<Label
x:Name="lblUsername"
Text="Username"
FontAttributes="Bold"
/>
<Entry
x:Name="txtUsername"
TextColor="Black"
FontSize="18"
VerticalOptions="StartAndExpand"
HorizontalOptions="Fill"
IsReadOnly="True"
/>
<Label/>
<Label
x:Name="lblPassword"
Text="Password"
FontAttributes="Bold"
/>
<Entry
x:Name="txtPassword"
IsPassword="True"
TextColor="Black"
FontSize="18"
VerticalOptions="StartAndExpand"
HorizontalOptions="Fill"
IsReadOnly="True"
/>
<Label/>
<Label
x:Name="lblDisplayname"
Text="Name"
FontAttributes="Bold"
/>
<Label
x:Name="txtDisplayname"
/>
<Label/>
<Label
x:Name="lblMessage"
Text=""/>
</StackLayout>
</ContentPage.Content>
</ContentPage>
using MauiDemo.Common;
using MauiDemo.Speech;
using Microsoft.Maui.Controls;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
namespace MauiDemo
{
public partial class LoginPage : ContentPage
{
private string _field = string.Empty;
private int _waitTime = 2000;
public List<Language> Languages { get; }
private SpeechToTextImplementation _speechRecongnitionInstance;
private struct VoiceMode
{
int Username = 1;
int Password = 2;
}
public LoginPage()
{
InitializeComponent();
this.lblTitle.Text = "Login" + App.Status;
CheckMicrophone();
CommonData.CurrentField = string.Empty;
try
{
_speechRecongnitionInstance = new SpeechToTextImplementation();
_speechRecongnitionInstance.Language = DefaultData.SettingLanguage;
}
catch (Exception ex)
{
DisplayAlert("Error", ex.Message, "OK");
}
MessagingCenter.Subscribe<ISpeechToText, string>(this, "STT", (sender, args) =>
{
ReceivedUsernameAsync(args);
});
MessagingCenter.Subscribe<ISpeechToText>(this, "Final", (sender) =>
{
btnSpeak.IsEnabled = true;
});
MessagingCenter.Subscribe<IMessageSender, string>(this, "STT", (sender, args) =>
{
SpeechToTextRecievedAsync(args);
});
isReceiveUsername = false;
isReceivePassword = false;
RequestUsername();
}
protected override void OnDisappearing()
{
CommonData.CurrentField = string.Empty;
base.OnDisappearing();
}
private async void btnSpeak_Clicked(Object sender, EventArgs e)
{
isReceiveUsername = false;
isReceivePassword = false;
await RequestUsername();
}
private async void SpeechToTextRecievedAsync(string args)
{
switch (_field)
{
case "Username":
await this.ReceivedUsernameAsync(args);
break;
case "Password":
await this.ReceivedPasswordAsync(args);
break;
}
}
bool isReceiveUsername = false;
bool isReceivePassword = false;
private async Task ReceivedUsernameAsync(string args)
{
txtUsername.Text = args.Replace(" ", string.Empty);
lblMessage.Text = string.Empty;
if (string.IsNullOrWhiteSpace(txtUsername.Text))
{
isReceiveUsername = false;
}
else
{
isReceiveUsername = true;
var checkUser = DefaultData.Users.Where(x => x.Username.ToLower().Equals(txtUsername.Text.ToLower()));
if (checkUser.Any())
{
await RequestPassword();
}
else
{
string message = CommonData.GetMessage(MessageCode.WrongUsername);
lblMessage.Text = message;
isReceiveUsername = false;
await RequestUsername(message);
}
}
}
private async Task ReceivedPasswordAsync(string args)
{
txtPassword.Text = args.Replace(" ", string.Empty);
lblMessage.Text = string.Empty;
if (string.IsNullOrWhiteSpace(txtPassword.Text))
{
isReceivePassword = false;
}
else
{
isReceivePassword = true;
var checkUser = DefaultData.Users.Where(x => x.Username.ToLower().Equals(txtUsername.Text.ToLower()) && x.Password.Equals(txtPassword.Text));
if (checkUser.Any())
{
_field = "";
lblDisplayname.Text = checkUser.FirstOrDefault().Displayname;
string msg = CommonData.GetMessage(MessageCode.LoginSuccess);
await Plugin.TextToSpeech.CrossTextToSpeech.Current.Speak(
msg
, crossLocale: CommonData.GetCrossLocale(DefaultData.SettingLanguage)
, speakRate: DefaultData.SettingSpeed
, pitch: DefaultData.SettingPitch
);
await Navigation.PushAsync(new MainPage());
}
else
{
string message = CommonData.GetMessage(MessageCode.WrongPassword);
lblMessage.Text = message;
isReceivePassword = false;
await RequestPassword(message);
}
}
}
private async Task RepeatVoiceUsername(string message)
{
do
{
//_speechRecongnitionInstance.StopSpeechToText();
//_speechRecongnitionInstance.StartSpeechToText();
await Plugin.TextToSpeech.CrossTextToSpeech.Current.Speak(
message
, crossLocale: CommonData.GetCrossLocale(DefaultData.SettingLanguage)
, speakRate: DefaultData.SettingSpeed
, pitch: DefaultData.SettingPitch
);
Thread.Sleep(_waitTime);
}
while (!isReceiveUsername);
}
private async Task RepeatVoicePassword(string message)
{
do
{
//_speechRecongnitionInstance.StopSpeechToText();
//_speechRecongnitionInstance.StartSpeechToText();
await Plugin.TextToSpeech.CrossTextToSpeech.Current.Speak(
message
, crossLocale: CommonData.GetCrossLocale(DefaultData.SettingLanguage)
, speakRate: DefaultData.SettingSpeed
, pitch: DefaultData.SettingPitch
);
Thread.Sleep(_waitTime);
}
while (!isReceivePassword);
}
private bool CheckMicrophone()
{
string rec = Android.Content.PM.PackageManager.FeatureMicrophone;
if (rec != "android.hardware.microphone")
{
// no microphone, no recording. Disable the button and output an alert
DisplayAlert("Error", CommonData.GetMessage(MessageCode.SettingSaveSuccess), "OK");
btnSpeak.IsEnabled = false;
return false;
}
return true;
}
private async Task RequestUsername(string message = "")
{
_field = "Username";
isReceiveUsername = false;
txtUsername.Text = string.Empty;
lblDisplayname.Text = string.Empty;
txtUsername.Focus();
message = (message.IsNullOrWhiteSpace() ? CommonData.GetMessage(MessageCode.InputUsername) : message);
Task.Run(() => RepeatVoiceUsername(message));
_speechRecongnitionInstance.StartSpeechToText(_field);
}
private async Task RequestPassword(string message = "")
{
_field = "Password";
isReceivePassword = false;
txtPassword.Text = string.Empty;
lblDisplayname.Text = string.Empty;
txtPassword.Focus();
message = (message.IsNullOrWhiteSpace() ? CommonData.GetMessage(MessageCode.InputPassword) : message);
Task.Run(() => RepeatVoicePassword(message));
_speechRecongnitionInstance.StartSpeechToText(_field);
}
}
}
2. The Speech Recognizer class:
using Android.App;
using Android.Content;
using Android.Speech;
using Java.Util;
using Plugin.CurrentActivity;
using System;
using System.Threading;
using System.Threading.Tasks;
namespace MauiDemo.Speech
{
public class SpeechToTextImplementation
{
public static AutoResetEvent autoEvent = new AutoResetEvent(false);
private readonly int VOICE = 10;
private Activity _activity;
private float _timeOut = 3;
private string _text;
public SpeechToTextImplementation()
{
_activity = CrossCurrentActivity.Current.Activity;
}
public SpeechToTextImplementation(string text)
{
_text = text;
_activity = CrossCurrentActivity.Current.Activity;
}
public string Language;
public void StartSpeechToText()
{
StartRecordingAndRecognizing();
}
public void StartSpeechToText(string text)
{
_text = text;
StartRecordingAndRecognizing();
}
private async void StartRecordingAndRecognizing()
{
string rec = global::Android.Content.PM.PackageManager.FeatureMicrophone;
if (rec == "android.hardware.microphone")
{
try
{
var locale = Locale.Default;
if (!string.IsNullOrWhiteSpace(Language))
{
locale = new Locale(Language);
}
Intent voiceIntent = new Intent(RecognizerIntent.ActionRecognizeSpeech);
voiceIntent.PutExtra(RecognizerIntent.ExtraLanguageModel, RecognizerIntent.LanguageModelFreeForm);
voiceIntent.PutExtra(RecognizerIntent.ExtraPrompt, _text);
voiceIntent.PutExtra(RecognizerIntent.ExtraSpeechInputCompleteSilenceLengthMillis, _timeOut * 1000);
voiceIntent.PutExtra(RecognizerIntent.ExtraSpeechInputPossiblyCompleteSilenceLengthMillis, _timeOut * 1000);
voiceIntent.PutExtra(RecognizerIntent.ExtraSpeechInputMinimumLengthMillis, _timeOut * 1000);
voiceIntent.PutExtra(RecognizerIntent.ExtraMaxResults, 1);
voiceIntent.PutExtra(RecognizerIntent.ExtraLanguage, locale.ToString());
_activity.StartActivityForResult(voiceIntent, VOICE);
await Task.Run(() => { autoEvent.WaitOne(new TimeSpan(0, 2, 0)); });
}
catch (ActivityNotFoundException ex)
{
String appPackageName = "com.google.android.googlequicksearchbox";
try
{
Intent intent = new Intent(Intent.ActionView, global::Android.Net.Uri.Parse("market://details?id=" + appPackageName));
_activity.StartActivityForResult(intent, VOICE);
}
catch (ActivityNotFoundException e)
{
Intent intent = new Intent(Intent.ActionView, global::Android.Net.Uri.Parse("https://play.google.com/store/apps/details?id=" + appPackageName));
_activity.StartActivityForResult(intent, VOICE);
}
}
}
else
{
throw new Exception("No mic found");
}
}
public void StopSpeechToText()
{
// Do something here to close the service
}
}
}
3. MainActivity:
using Android.App;
using Android.Content;
using Android.Content.PM;
using Android.Speech;
using MauiDemo.Common;
using Microsoft.Maui;
using Microsoft.Maui.Controls;
namespace MauiDemo
{
[Activity(Label = "Maui Demo", Theme = "#style/Maui.SplashTheme", MainLauncher = true, ConfigurationChanges = ConfigChanges.ScreenSize | ConfigChanges.Orientation | ConfigChanges.UiMode | ConfigChanges.ScreenLayout | ConfigChanges.SmallestScreenSize)]
public class MainActivity : MauiAppCompatActivity, IMessageSender
{
private readonly int VOICE = 10;
protected override void OnActivityResult(int requestCode, Result resultCode, Intent data)
{
if (requestCode == VOICE)
{
if (resultCode == Result.Ok)
{
var matches = data.GetStringArrayListExtra(RecognizerIntent.ExtraResults);
if (matches.Count != 0)
{
string textInput = matches[0];
MessagingCenter.Send<IMessageSender, string>(this, "STT", textInput);
}
else
{
MessagingCenter.Send<IMessageSender, string>(this, "STT", "");
}
}
}
base.OnActivityResult(requestCode, resultCode, data);
}
}
}
4. MainApplication
using Android.App;
using Android.OS;
using Android.Runtime;
using Microsoft.Maui;
using Microsoft.Maui.Hosting;
using Plugin.CurrentActivity;
using System;
namespace MauiDemo
{
[Application]
public class MainApplication : MauiApplication
{
public MainApplication(IntPtr handle, JniHandleOwnership ownership)
: base(handle, ownership)
{
}
protected override MauiApp CreateMauiApp() => MauiProgram.CreateMauiApp();
public override void OnCreate()
{
base.OnCreate();
CrossCurrentActivity.Current.Init(this);
}
public override void OnTerminate()
{
base.OnTerminate();
}
public void OnActivityCreated(Activity activity, Bundle savedInstanceState)
{
CrossCurrentActivity.Current.Activity = activity;
}
public void OnActivityDestroyed(Activity activity)
{
}
public void OnActivityPaused(Activity activity)
{
}
public void OnActivityResumed(Activity activity)
{
CrossCurrentActivity.Current.Activity = activity;
}
public void OnActivitySaveInstanceState(Activity activity, Bundle outState)
{
}
public void OnActivityStarted(Activity activity)
{
CrossCurrentActivity.Current.Activity = activity;
}
public void OnActivityStopped(Activity activity)
{
}
}
}
After struggling for a few days without any success, I found a new way to do this by using the SpeechRecognizer class instead of the Google service. With this, I am able to have better control over the process.
To use SpeechRecognizer, I copied the code in "Create platform microphone services" for the permission handling from this Microsoft page: https://learn.microsoft.com/en-us/xamarin/xamarin-forms/data-cloud/azure-cognitive-services/speech-recognition
I have updated my code as below:
Login page (currently named Prototype2):
using MauiDemo.Common;
using MauiDemo.Speech;
using Microsoft.Maui.Controls;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
namespace MauiDemo.View
{
public partial class Prototype2 : ContentPage
{
private string _field = string.Empty;
private int _waitTime = 2000;
public List<Language> Languages { get; }
private SpeechToTextImplementation2 _speechRecognizer;
//private BackgroundWorker worker = new BackgroundWorker();
private struct VoiceMode
{
int Username = 1;
int Password = 2;
}
public Prototype2()
{
InitializeComponent();
this.lblTitle.Text = "Prototype2" + App.Status;
CheckMicrophone();
CommonData.CurrentField = string.Empty;
try
{
_speechRecognizer = new SpeechToTextImplementation2();
_speechRecognizer.Language = DefaultData.SettingLanguage;
}
catch (Exception ex)
{
DisplayAlert("Error", ex.Message, "OK");
}
MessagingCenter.Subscribe<ISpeechToText, string>(this, "STT", (sender, args) =>
{
ReceivedUsernameAsync(args);
});
MessagingCenter.Subscribe<ISpeechToText>(this, "Final", (sender) =>
{
btnSpeak.IsEnabled = true;
});
MessagingCenter.Subscribe<IMessageSender, string>(this, "STT", (sender, args) =>
{
SpeechToTextRecievedAsync(args);
});
isReceiveUsername = false;
isReceivePassword = false;
RequestUsername(true);
}
protected override void OnDisappearing()
{
CommonData.CurrentField = string.Empty;
base.OnDisappearing();
}
private async void btnSpeak_Clicked(Object sender, EventArgs e)
{
isReceiveUsername = false;
isReceivePassword = false;
await RequestUsername(true);
}
private async void SpeechToTextRecievedAsync(string args)
{
switch (_field)
{
case "Username":
await this.ReceivedUsernameAsync(args);
break;
case "Password":
await this.ReceivedPasswordAsync(args);
break;
}
}
bool isReceiveUsername = false;
bool isReceivePassword = false;
private async Task ReceivedUsernameAsync(string args)
{
txtUsername.Text = args.Replace(" ", string.Empty);
lblMessage.Text = string.Empty;
if (string.IsNullOrWhiteSpace(txtUsername.Text))
{
isReceiveUsername = false;
}
else
{
isReceiveUsername = true;
var checkUser = DefaultData.Users.Where(x => x.Username.ToLower().Equals(txtUsername.Text.ToLower()));
if (checkUser.Any())
{
await RequestPassword(true);
}
else
{
string message = CommonData.GetMessage(MessageCode.WrongUsername);
lblMessage.Text = message;
isReceiveUsername = false;
await RequestUsername(false, message);
}
}
}
private async Task ReceivedPasswordAsync(string args)
{
txtPassword.Text = args.Replace(" ", string.Empty);
lblMessage.Text = string.Empty;
if (string.IsNullOrWhiteSpace(txtPassword.Text))
{
isReceivePassword = false;
}
else
{
isReceivePassword = true;
var checkUser = DefaultData.Users.Where(x => x.Username.ToLower().Equals(txtUsername.Text.ToLower()) && x.Password.Equals(txtPassword.Text));
if (checkUser.Any())
{
_field = "";
lblDisplayname.Text = checkUser.FirstOrDefault().Displayname;
string msg = CommonData.GetMessage(MessageCode.LoginSuccess);
await Plugin.TextToSpeech.CrossTextToSpeech.Current.Speak(
msg
, crossLocale: CommonData.GetCrossLocale(DefaultData.SettingLanguage)
, speakRate: DefaultData.SettingSpeed
, pitch: DefaultData.SettingPitch
);
await Navigation.PushAsync(new MainPage());
}
else
{
string message = CommonData.GetMessage(MessageCode.WrongPassword);
lblMessage.Text = message;
isReceivePassword = false;
await RequestPassword(false, message);
}
}
}
private async Task RepeatVoiceUsername(string message)
{
do
{
await Plugin.TextToSpeech.CrossTextToSpeech.Current.Speak(
message
, crossLocale: CommonData.GetCrossLocale(DefaultData.SettingLanguage)
, speakRate: DefaultData.SettingSpeed
, pitch: DefaultData.SettingPitch
);
Thread.Sleep(_waitTime);
}
while (!isReceiveUsername);
}
private async Task RepeatVoicePassword(string message)
{
do
{
await Plugin.TextToSpeech.CrossTextToSpeech.Current.Speak(
message
, crossLocale: CommonData.GetCrossLocale(DefaultData.SettingLanguage)
, speakRate: DefaultData.SettingSpeed
, pitch: DefaultData.SettingPitch
);
Thread.Sleep(_waitTime);
}
while (!isReceivePassword);
}
private bool CheckMicrophone()
{
string rec = Android.Content.PM.PackageManager.FeatureMicrophone;
if (rec != "android.hardware.microphone")
{
// no microphone, no recording. Disable the button and output an alert
DisplayAlert("Error", CommonData.GetMessage(MessageCode.SettingSaveSuccess), "OK");
btnSpeak.IsEnabled = false;
return false;
}
return true;
}
private async Task RequestUsername(bool isRepeat, string message = "")
{
_field = "Username";
isReceiveUsername = false;
//txtUsername.Text = string.Empty;
//lblDisplayname.Text = string.Empty;
txtUsername.Focus();
message = (message.IsNullOrWhiteSpace() ? CommonData.GetMessage(MessageCode.InputUsername) : message);
if (isRepeat)
{
Task.Run(() => RepeatVoiceUsername(message));
}
else
{
await Plugin.TextToSpeech.CrossTextToSpeech.Current.Speak(
message
, crossLocale: CommonData.GetCrossLocale(DefaultData.SettingLanguage)
, speakRate: DefaultData.SettingSpeed
, pitch: DefaultData.SettingPitch
);
}
_speechRecognizer.StartListening();
}
private async Task RequestPassword(bool isRepeat, string message = "")
{
_field = "Password";
isReceivePassword = false;
//txtPassword.Text = string.Empty;
//lblDisplayname.Text = string.Empty;
txtPassword.Focus();
message = (message.IsNullOrWhiteSpace() ? CommonData.GetMessage(MessageCode.InputPassword) : message);
if (isRepeat)
{
Task.Run(() => RepeatVoicePassword(message));
}
else
{
await Plugin.TextToSpeech.CrossTextToSpeech.Current.Speak(
message
, crossLocale: CommonData.GetCrossLocale(DefaultData.SettingLanguage)
, speakRate: DefaultData.SettingSpeed
, pitch: DefaultData.SettingPitch
);
}
_speechRecognizer.StartListening();
}
}
}
New microphone service to handle the permission:
using Android.App;
using Android.Content.PM;
using Android.OS;
using AndroidX.Core.App;
using Google.Android.Material.Snackbar;
using System.Threading.Tasks;
namespace MauiDemo.Speech
{
public class MicrophoneService
{
public const int RecordAudioPermissionCode = 1;
private TaskCompletionSource<bool> tcsPermissions;
string[] permissions = new string[] { Manifest.Permission.RecordAudio };
public MicrophoneService()
{
tcsPermissions = new TaskCompletionSource<bool>();
}
public Task<bool> GetPermissionAsync()
{
if ((int)Build.VERSION.SdkInt < 23)
{
tcsPermissions.TrySetResult(true);
}
else
{
var currentActivity = MainActivity.Instance;
if (ActivityCompat.CheckSelfPermission(currentActivity, Manifest.Permission.RecordAudio) != (int)Permission.Granted)
{
RequestMicPermissions();
}
else
{
tcsPermissions.TrySetResult(true);
}
}
return tcsPermissions.Task;
}
public void OnRequestPermissionResult(bool isGranted)
{
tcsPermissions.TrySetResult(isGranted);
}
void RequestMicPermissions()
{
if (ActivityCompat.ShouldShowRequestPermissionRationale(MainActivity.Instance, Manifest.Permission.RecordAudio))
{
Snackbar.Make(MainActivity.Instance.FindViewById(Android.Resource.Id.Content),
"Microphone permissions are required for speech transcription!",
Snackbar.LengthIndefinite)
.SetAction("Ok", v =>
{
((Activity)MainActivity.Instance).RequestPermissions(permissions, RecordAudioPermissionCode);
})
.Show();
}
else
{
ActivityCompat.RequestPermissions((Activity)MainActivity.Instance, permissions, RecordAudioPermissionCode);
}
}
}
}
New speech-to-text class that uses SpeechRecognizer, mostly taken from How to increase the voice listen time in Google Recognizer Intent (Speech Recognition) Android:
using Android;
using Android.App;
using Android.Content;
using Android.OS;
using Android.Runtime;
using Android.Speech;
using AndroidX.Core.App;
using Java.Util;
using MauiDemo.Common;
using Microsoft.Maui.Controls;
using Plugin.CurrentActivity;
using System.Threading;
namespace MauiDemo.Speech
{
public class SpeechToTextImplementation2 : Java.Lang.Object, IRecognitionListener, IMessageSender
{
public static AutoResetEvent autoEvent = new AutoResetEvent(false);
private readonly int VOICE = 10;
private Activity _activity;
private float _timeOut = 3;
private SpeechRecognizer _speech;
private Intent _speechIntent;
public string Words;
public string Language;
private MicrophoneService micService;
public SpeechToTextImplementation2()
{
micService = new MicrophoneService();
_activity = CrossCurrentActivity.Current.Activity;
var locale = Locale.Default;
if (!string.IsNullOrWhiteSpace(Language))
{
locale = new Locale(Language);
}
_speech = SpeechRecognizer.CreateSpeechRecognizer(this._activity);
_speech.SetRecognitionListener(this);
_speechIntent = new Intent(RecognizerIntent.ActionRecognizeSpeech);
_speechIntent.PutExtra(RecognizerIntent.ExtraLanguageModel, RecognizerIntent.LanguageModelFreeForm);
_speechIntent.PutExtra(RecognizerIntent.ExtraSpeechInputCompleteSilenceLengthMillis, _timeOut * 1000);
_speechIntent.PutExtra(RecognizerIntent.ExtraSpeechInputPossiblyCompleteSilenceLengthMillis, _timeOut * 1000);
_speechIntent.PutExtra(RecognizerIntent.ExtraSpeechInputMinimumLengthMillis, _timeOut * 1000);
_speechIntent.PutExtra(RecognizerIntent.ExtraMaxResults, 1);
_speechIntent.PutExtra(RecognizerIntent.ExtraLanguage, locale.ToString());
}
void RestartListening()
{
var locale = Locale.Default;
if (!string.IsNullOrWhiteSpace(Language))
{
locale = new Locale(Language);
}
_speech.Destroy();
_speech = SpeechRecognizer.CreateSpeechRecognizer(this._activity);
_speech.SetRecognitionListener(this);
_speechIntent = new Intent(RecognizerIntent.ActionRecognizeSpeech);
_speechIntent.PutExtra(RecognizerIntent.ExtraLanguageModel, RecognizerIntent.LanguageModelFreeForm);
_speechIntent.PutExtra(RecognizerIntent.ExtraSpeechInputCompleteSilenceLengthMillis, _timeOut * 1000);
_speechIntent.PutExtra(RecognizerIntent.ExtraSpeechInputPossiblyCompleteSilenceLengthMillis, _timeOut * 1000);
_speechIntent.PutExtra(RecognizerIntent.ExtraSpeechInputMinimumLengthMillis, _timeOut * 1000);
_speechIntent.PutExtra(RecognizerIntent.ExtraMaxResults, 1);
_speechIntent.PutExtra(RecognizerIntent.ExtraLanguage, locale.ToString());
StartListening();
}
public async void StartListening()
{
bool isMicEnabled = await micService.GetPermissionAsync();
if (!isMicEnabled)
{
Words = "Please grant access to the microphone!";
return;
}
_speech.StartListening(_speechIntent);
}
public void StopListening()
{
_speech.StopListening();
}
public void OnBeginningOfSpeech()
{
}
public void OnBufferReceived(byte[] buffer)
{
}
public void OnEndOfSpeech()
{
}
public void OnError([GeneratedEnum] SpeechRecognizerError error)
{
Words = error.ToString();
MessagingCenter.Send<IMessageSender, string>(this, "Error", Words);
RestartListening();
}
public void OnEvent(int eventType, Bundle @params)
{
}
public void OnPartialResults(Bundle partialResults)
{
}
public void OnReadyForSpeech(Bundle @params)
{
}
public void OnResults(Bundle results)
{
var matches = results.GetStringArrayList(SpeechRecognizer.ResultsRecognition);
if (matches == null)
Words = "Null";
else
if (matches.Count != 0)
Words = matches[0];
else
Words = "";
MessagingCenter.Send<IMessageSender, string>(this, "STT", Words);
RestartListening();
}
public void OnRmsChanged(float rmsdB)
{
}
}
}
Updated MainActivity for the permission handling:
using Android.App;
using Android.Content;
using Android.Content.PM;
using Android.OS;
using Android.Runtime;
using Android.Speech;
using MauiDemo.Common;
using MauiDemo.Speech;
using Microsoft.Maui;
using Microsoft.Maui.Controls;
namespace MauiDemo
{
[Activity(Label = "Maui Demo", Theme = "#style/Maui.SplashTheme", MainLauncher = true, ConfigurationChanges = ConfigChanges.ScreenSize | ConfigChanges.Orientation | ConfigChanges.UiMode | ConfigChanges.ScreenLayout | ConfigChanges.SmallestScreenSize)]
public class MainActivity : MauiAppCompatActivity, IMessageSender
{
protected override void OnCreate(Bundle savedInstanceState)
{
base.OnCreate(savedInstanceState);
Instance = this;
micService = new MicrophoneService();
}
private readonly int VOICE = 10;
protected override void OnActivityResult(int requestCode, Result resultCode, Intent data)
{
if (requestCode == VOICE)
{
if (resultCode == Result.Ok)
{
var matches = data.GetStringArrayListExtra(RecognizerIntent.ExtraResults);
if (matches.Count != 0)
{
string textInput = matches[0];
MessagingCenter.Send<IMessageSender, string>(this, "STT", textInput);
}
else
{
MessagingCenter.Send<IMessageSender, string>(this, "STT", "");
}
//SpeechToTextImplementation.autoEvent.Set();
}
}
base.OnActivityResult(requestCode, resultCode, data);
}
MicrophoneService micService;
internal static MainActivity Instance { get; private set; }
public override void OnRequestPermissionsResult(int requestCode, string[] permissions, [GeneratedEnum] Android.Content.PM.Permission[] grantResults)
{
// ...
switch (requestCode)
{
case MicrophoneService.RecordAudioPermissionCode:
if (grantResults[0] == Permission.Granted)
{
micService.OnRequestPermissionResult(true);
}
else
{
micService.OnRequestPermissionResult(false);
}
break;
}
}
}
}
Feel free to check out the code, but I will not use this for anything serious because it does not run properly yet.
Any suggestions to improve the code would be really appreciated, as I really want to get good with the MAUI platform.
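One small improvement along those lines, since the post asks for opinions: RepeatVoiceUsername and RepeatVoicePassword are async methods but pause with Thread.Sleep, which blocks a thread-pool thread for the whole prompt loop. Awaiting Task.Delay keeps the same pacing without blocking; a sketch against the code above, where only the sleep line changes:
private async Task RepeatVoiceUsername(string message)
{
    do
    {
        await Plugin.TextToSpeech.CrossTextToSpeech.Current.Speak(
            message
            , crossLocale: CommonData.GetCrossLocale(DefaultData.SettingLanguage)
            , speakRate: DefaultData.SettingSpeed
            , pitch: DefaultData.SettingPitch
        );
        await Task.Delay(_waitTime);   // non-blocking pause instead of Thread.Sleep(_waitTime)
    }
    while (!isReceiveUsername);
}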

Xamarin Android: what is ForceDarkHelper?

Periodically, the application starts refreshing itself. There is a constant stream of calls like this in the logs:
[ForceDarkHelper] updateByCheckExcludeList: pkg: com.companyname.manimobile activity: crc64d14753dcc52b83b4.MainActivity#a894c70
[ForceDarkHelper] updateByCheckExcludeList: pkg: com.companyname.manimobile activity: crc64d14753dcc52b83b4.MainActivity#a894c70
[ForceDarkHelper] updateByCheckExcludeList: pkg: com.companyname.manimobile activity: crc64d14753dcc52b83b4.MainActivity#a894c70
[ForceDarkHelper] updateByCheckExcludeList: pkg: com.companyname.manimobile activity: crc64d14753dcc52b83b4.MainActivity#a894c70
When this happens, if you open the menu, for example, it closes by itself; if something is filled in, it is cleared; and the page refreshes. There are no timers in the code. I'm testing the app on a Xiaomi Redmi. Again, sometimes it happens and sometimes it doesn't. What is it?
I do not know what the problem is, but occasionally the application throws me back to the fingerprint page. It is intermittent; sometimes everything works fine. That is, I pass the fingerprint check, the next page opens, everything is normal, and then about 5 seconds later I am thrown back to the page where you need to enter the fingerprint.
Code for the authorization page:
public authentification()
{
try
{
InitializeComponent();
bool auth = CrossSettings.Current.GetValueOrDefault("authorized", false);
if (auth == false) { CheckAuth(); }
else
{
Application.Current.MainPage = new MasterLk();
}
}
catch { }
}
async void CheckAuth()
{
try
{
var avail = await CrossFingerprint.Current.IsAvailableAsync();
if (!avail)
{
CrossSettings.Current.GetValueOrDefault("authorized", true);
Application.Current.MainPage = new MasterLk();
}
else
{
var request = new AuthenticationRequestConfiguration("NeedAuth", "-");
var result = await CrossFingerprint.Current.AuthenticateAsync(request);
if (result.Authenticated)
{
CrossSettings.Current.GetValueOrDefault("authorized", true);
Application.Current.MainPage = new MasterLk();
}
else
{
CheckAuth();
}
}
}
catch { }
}
The page that it throws me back from has a ListView with a binding:
public class OrdersViewModel : BaseViewModel
{
private Table oldLoan;
private bool isRefreshing;
private readonly string clientId;
public bool IsRefreshing
{
get
{
return isRefreshing;
}
set
{
isRefreshing = value;
OnPropertyChanged("IsRefreshing");
}
}
public ICommand RefreshCommand { get; set; }
public ObservableCollection<Table> Loans { get; set; }
public void ShowOrHideLoan(Table loan)
{
if (oldLoan == loan)
{
loan.IsExpanded = !loan.IsExpanded;
Reload(loan);
}
else
{
if (oldLoan != null)
{
oldLoan.IsExpanded = false;
Reload(oldLoan);
}
loan.IsExpanded = true;
Reload(loan);
}
oldLoan = loan;
}
private void Reload(Table loan)
{
var index = Loans.IndexOf(loan);
Loans.Remove(loan);
Loans.Insert(index, loan);
}
public async Task LoadDataAsync()
{
IsRefreshing = true;
Loans.Clear();
try
{
var loans = await ConnectAPI.GetOrdersAsync(clientId);
await Task.Delay(1000);
foreach (var item in loans)
{
Loans.Add(item);
}
}
catch (Exception exc)
{
Console.WriteLine(exc.Message);
}
finally
{
oldLoan = null;
IsRefreshing = false;
}
}
public OrdersViewModel(string clientId)
{
IsRefreshing = false;
this.clientId = clientId;
Loans = new ObservableCollection<Table>();
RefreshCommand = new Command(async () =>
{
await LoadDataAsync();
});
Task.Run(async () => await LoadDataAsync());
}
}
That is, whenever the [ForceDarkHelper] updateByCheckExcludeList: pkg: com.companyname.manimobile activity: crc64d14753dcc52b83b4.MainActivity#a894c70 line appears, I am thrown back to the fingerprint page, and if I stay on that page, it refreshes after a while.
MIUI 12 introduced an "intelligent" dark theme: the system itself repaints applications that do not support a dark theme. Apparently ForceDarkHelper is the service that does this, and the ExcludeList is the list in the settings of applications that must not be repainted.
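Separately, one thing worth double-checking in the authorization code above (a side observation, not related to ForceDarkHelper itself): CrossSettings.Current.GetValueOrDefault only reads a setting, so "authorized" is never actually written, and whenever the activity is recreated the constructor reads false again and shows the fingerprint page. Assuming the Settings plugin behind CrossSettings, persisting the flag would look roughly like this sketch:
if (result.Authenticated)
{
    // Write the flag instead of re-reading it with GetValueOrDefault.
    CrossSettings.Current.AddOrUpdateValue("authorized", true);
    Application.Current.MainPage = new MasterLk();
}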

Listview not populating when using back arrow after typing in search bar

I have a ListView with a search bar. I can search for an item in the ListView, click on the item, and navigate to that item's details, but when I click the back arrow I get a System.NullReferenceException on my HttpResponseMessage.
Could someone please advise me as to what I could be doing wrong?
If the search bar is empty it works fine.
ViewModel
private async Task GetProjects(string email)
{
IsBusy = true;
ProjectList = new ObservableCollection<ProjectModel>();
using (HttpClient client = new HttpClient())
{
try
{
using (HttpResponseMessage response = await client.GetAsync("http://example/api/GetProject/email=" + email + "/"))
{
if (response.IsSuccessStatusCode)
{
using (HttpContent content = response.Content)
{
var textresponse = await content.ReadAsStringAsync();
var json = JsonConvert.DeserializeObject<List<ProjectModel>>(textresponse);
foreach (var t in json)
{
if (t.pjtIsActive == 1)
{
ProjectList.Add(new ProjectModel
{
..............
});
}
}
IsBusy = false;
}
}
else
{
}
}
}
catch (Exception)
{
IsBusy = false;
}
}
}
private ICommand _searchCommand;
public ICommand SearchCommand
{
get
{
return _searchCommand ?? (_searchCommand = new Command<string>
(async (text) =>
{
if (text.Length >= 1)
{
ProjectList.Clear();
await GetProjects(EmailAddress);
var projectSearch = ProjectList.Where(c => c.pjtName.ToLower().StartsWith(text.ToLower()) || c.ClientName.ToLower().StartsWith(text.ToLower()) || c.ContractorName.ToLower().StartsWith(text.ToLower()) || c.pjtNumber.ToLower().StartsWith(text.ToLower())).ToList();
ProjectList.Clear();
foreach (var item in projectSearch)
ProjectList.Add(item);
}
else
{
GetProjects(EmailAddress);
}
}));
}
}
private ICommand _projectDetailsCommand;
public ICommand ProjectDetailsCommand=> _projectDetailsCommand?? (_projectDetailsCommand= new Command(async (object obj) => {
var item = (obj as ProjectModel);
ProjectModel project = new ProjectModel();
...........
Navigation.PushAsync(new Project_Details(project));
}));
Content Page
protected override void OnAppearing()
{
BindingContext = new Project_View_ViewModel(Navigation);
base.OnAppearing();
}
You should set the binding context in the constructor and refresh the required data in OnAppearing:
private Project_View_ViewModel bindingv;
public Project_View()
{
try
{
InitializeComponent();
bindingv = new Project_View_ViewModel(Navigation);
BindingContext = bindingv;
}
catch (Exception ex)
{
Logger.Log(ex);
}
}
protected async override void OnAppearing()
{
base.OnAppearing();
try
{
if (bindingv != null)
{
await bindingv.GetProjects();
}
}
catch (Exception ex)
{
Logger.Log(ex);
}
}
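One small mismatch worth noting between this answer and the question's ViewModel: GetProjects there is private and takes an email address, so OnAppearing cannot call bindingv.GetProjects() directly. A hedged sketch of one way to bridge that (the method name RefreshAsync is my own; EmailAddress is the property the question's SearchCommand already uses):
// In Project_View_ViewModel: a public entry point the page can await.
public Task RefreshAsync() => GetProjects(EmailAddress);

// In the page's OnAppearing from the answer above:
if (bindingv != null)
{
    await bindingv.RefreshAsync();
}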

UWP AppServiceConnection - SendResponseAsync returns AppServiceResponseStatus.Failure

I'm trying to create a UWP service app on the Raspberry Pi 3 that provides access to the on-board UART. I'm facing an issue with the AppServiceConnection request/response.
This is the service method that handles the incoming requests from client apps:
internal class Inbound
{
public static async void OnRequestReceived(AppServiceConnection sender, AppServiceRequestReceivedEventArgs args)
{
var messageDeferral = args.GetDeferral();
var response = new ValueSet();
bool success = false;
var msg = args.Request.Message.Keys;
if (args.Request.Message.TryGetValue(ServiceApiRequests.Keys.Command, out object command))
{
try
{
switch (command)
{
case ServiceApiRequests.CommandValues.UartWrite:
if (args.Request.Message.TryGetValue(ServiceApiRequests.Keys.UartTxBuffer, out object txBuffer))
{
string rxBuff = "";
success = await Pi3.Peripherals.Uart.GerInstance(57600).Write((string)txBuffer);
if (success)
{
Debug.WriteLine("Tx: " + (string)txBuffer);
if (args.Request.Message.TryGetValue(ServiceApiRequests.Keys.ReadUartResponse, out object getResponse))
{
if ((string)getResponse == ServiceApiRequests.ReadUartResponse.Yes)
{
rxBuff = await Pi3.Peripherals.Uart.GerInstance(57600).Read();
Debug.WriteLine("Rx: " + rxBuff);
}
}
}
response.Add(ServiceApiRequests.Keys.UartRxBuffer, rxBuff);
}
break;
}
}
catch (Exception ex)
{
success = false;
}
}
response.Add(new KeyValuePair<string, object>(ServiceApiRequests.Keys.Result, success ? ServiceApiRequests.ResultValues.Ok : ServiceApiRequests.ResultValues.Ko));
var result = await args.Request.SendResponseAsync(response);
if (result == AppServiceResponseStatus.Failure)
{
Debug.WriteLine("Failed to send the response");
}
messageDeferral.Complete();
}
}
As you can see, the Uart class is obtained as a singleton via the method Pi3.Peripherals.Uart.GerInstance(57600).
The following is the code I use to send the request from the client app:
public static class Uart
{
public static IAsyncOperation<string> SendCommand(this AppServiceConnection DriverControllerConnection, string txBuffer, string awaitResponse = ServiceApiRequests.ReadUartResponse.Yes)
{
return _SendCommand(DriverControllerConnection, txBuffer, awaitResponse).AsAsyncOperation();
}
private static async Task<string> _SendCommand(AppServiceConnection DriverControllerConnection, string txBuffer, string awaitResponse)
{
AppServiceResponse response = null;
string response_str = "";
try
{
if (DriverControllerConnection != null)
{
response = await DriverControllerConnection.SendMessageAsync(new ServiceApiRequests.UartWrite().GetCommand(txBuffer, awaitResponse));
if (response.Status == AppServiceResponseStatus.Success)
{
if (response.Message.TryGetValue(ServiceApiRequests.Keys.Result, out object result))
{
if ((string)result == ServiceApiRequests.ResultValues.Ok && awaitResponse == ServiceApiRequests.ReadUartResponse.Yes)
{
response_str = response.Message[ServiceApiRequests.Keys.UartRxBuffer] as string;
}
}
}
}
}
catch (Exception ex)
{
// TODO: log
}
return response_str;
}
}
The system works well only for a while: at first I get response.Status == AppServiceResponseStatus.Success, then the result of the request changes and becomes AppServiceResponseStatus.Failure. From that point on, execution never enters the if (response.Status == AppServiceResponseStatus.Success) branch.
Any idea about the cause?
Thank you so much for the help.
EDIT
Following the suggestions, I added a handler for the ServiceClosed event. This is the main class:
public sealed class DriverListener : IBackgroundTask
{
private BackgroundTaskDeferral backgroundTaskDeferral;
private AppServiceConnection appServiceConnection;
public void Run(IBackgroundTaskInstance taskInstance)
{
backgroundTaskDeferral = taskInstance.GetDeferral();
// taskInstance.Canceled += OnTaskCanceled;
var triggerDetails = taskInstance.TriggerDetails as AppServiceTriggerDetails;
appServiceConnection = triggerDetails.AppServiceConnection;
appServiceConnection.RequestReceived += Inbound.OnRequestReceived;
appServiceConnection.ServiceClosed += OnTaskCanceled;
}
private void OnTaskCanceled(AppServiceConnection sender, AppServiceClosedEventArgs reason)
{
if (this.backgroundTaskDeferral != null)
{
Debug.WriteLine("ServiceClosed");
// Complete the service deferral.
this.backgroundTaskDeferral.Complete();
}
}
}
Placing a breakpoint in this function, I see that it is never triggered.
The app connection is opened using the singleton pattern and placed in a DLL that I use in the client app:
public static AppServiceConnection GetDriverConnectionInstance()
{
if (_DriverConnectionInstance == null)
{
try
{
_DriverConnectionInstance = OpenDriverConnection().AsTask().GetAwaiter().GetResult();
}
catch
{
}
}
return _DriverConnectionInstance;
}
I also added a request to the service that toggles an LED, and I noticed that the LED status changes, but the response from the app service is still "Failure" and the message is null.
The AppService has a default lifetime of 25 seconds, unless it is being requested by the foreground experience. When the service shuts down the connection, your client process receives the ServiceClosed event, so you know you need to reopen the connection the next time you want to send a request.
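A minimal sketch of what that can look like on the client side, reusing the singleton helper from the question (OpenDriverConnection and _DriverConnectionInstance are the question's own members; the ServiceClosed wiring is the added part):
public static AppServiceConnection GetDriverConnectionInstance()
{
    if (_DriverConnectionInstance == null)
    {
        try
        {
            _DriverConnectionInstance = OpenDriverConnection().AsTask().GetAwaiter().GetResult();
            // When the service's lifetime expires, drop the cached connection so the
            // next call to GetDriverConnectionInstance() opens a fresh one.
            _DriverConnectionInstance.ServiceClosed += (s, e) => _DriverConnectionInstance = null;
        }
        catch
        {
        }
    }
    return _DriverConnectionInstance;
}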

C# Discord Bot Coding: Creating a command that spams messages then stops with another command

I just need a working piece of code that spams a message every five seconds and then stops after another command is entered. For example, if a user enters "~raid", the bot will spam "RAID RAID" every five seconds and stop when the user enters "~stop". If anyone could help, that'd be awesome.
Here is what I got so far:
class MyBot
{
DiscordClient discord;
CommandService commands;
public MyBot()
{
discord = new DiscordClient(x =>
{
x.LogLevel = LogSeverity.Info;
x.LogHandler = Log;
});
discord.UsingCommands(x =>
{
x.PrefixChar = '~';
x.AllowMentionPrefix = true;
});
commands = discord.GetService<CommandService>();
commands.CreateCommand("checked")
.Do(async (e) =>
{
commands.CreateCommand("weewoo")
.Do(async (e) =>
{
await e.Channel.SendMessage("**WEE WOO**");
});
discord.ExecuteAndWait(async () =>
{
await discord.Connect("discordkeyhere", TokenType.Bot);
});
}
public void Log(object sender, LogMessageEventArgs e)
{
Console.WriteLine(e.Message);
}
}
}
Here is a small example of how you could do it.
In this case, you can start and stop the spam loop from outside using the Start and Stop methods.
class MyBot
{
public MyBot()
{
}
CancellationTokenSource cts;
public void Start()
{
cts = new CancellationTokenSource();
Task t = Task.Run(() =>
{
while (!cts.IsCancellationRequested)
{
Console.WriteLine("RAID RAID");
Task.Delay(5000).Wait();
}
}, cts.Token);
}
public void Stop()
{
cts?.Cancel();
}
}
Here is the code to test the MyBot class
static void Main(string[] args)
{
try
{
var b = new MyBot();
while (true)
{
var input = Console.ReadLine();
if (input.Equals("~raid", StringComparison.OrdinalIgnoreCase))
b.Start();
else if (input.Equals("~stop", StringComparison.OrdinalIgnoreCase))
b.Stop();
else if (input.Equals("exit", StringComparison.OrdinalIgnoreCase))
break;
Task.Delay(1000);
}
}
catch (Exception)
{
throw;
}
}
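If you want to drive the same pattern from the Discord commands instead of the console, a sketch that reuses the question's Discord.Net command syntax (commands.CreateCommand(...).Do(...) and e.Channel.SendMessage(...)) together with the CancellationTokenSource idea from the answer could look like this, registered in the MyBot constructor after commands is obtained (requires using System.Threading; and using System.Threading.Tasks;):
CancellationTokenSource cts = null;

commands.CreateCommand("raid")
    .Do(async (e) =>
    {
        if (cts != null) return;                  // already spamming
        cts = new CancellationTokenSource();
        var token = cts.Token;
        var channel = e.Channel;                  // spam in the channel the command came from
        await channel.SendMessage("Raid started.");
        _ = Task.Run(async () =>
        {
            while (!token.IsCancellationRequested)
            {
                await channel.SendMessage("RAID RAID");
                try { await Task.Delay(5000, token); }
                catch (TaskCanceledException) { }  // cancelled while waiting: loop exits
            }
        });
    });

commands.CreateCommand("stop")
    .Do(async (e) =>
    {
        cts?.Cancel();
        cts = null;
        await e.Channel.SendMessage("Stopped.");
    });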
