I want to turn on or off airplane mode using AccessibilityService.
Any idea how we can do it?
You can't change it from an app that targets Android 4.2 or higher. But you can open the settings page instead if you want:
if (Android.OS.Build.VERSION.SdkInt < BuildVersionCodes.JellyBeanMr1)
{
    try
    {
        Intent intentAirplaneMode = new Intent(Android.Provider.Settings.ActionAirplaneModeSettings);
        intentAirplaneMode.SetFlags(ActivityFlags.NewTask);
        Context.StartActivity(intentAirplaneMode);
    }
    catch (ActivityNotFoundException e)
    {
        Log.Error("exception", e.ToString());
    }
}
else
{
    Intent intent1 = new Intent("android.settings.WIRELESS_SETTINGS");
    intent1.SetFlags(ActivityFlags.NewTask);
    Context.StartActivity(intent1);
}
And the AccessibilityService can be used through a DependencyService.
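For example, a minimal DependencyService sketch (the interface and class names here are mine, not from the post) that opens the airplane mode settings page from shared code:

~~~
// Shared project: hypothetical interface resolved via DependencyService.
public interface IAirplaneModeSettings
{
    void OpenSettingsPage();
}

// Android project: opens the airplane mode settings screen.
[assembly: Xamarin.Forms.Dependency(typeof(AirplaneModeSettings))]
public class AirplaneModeSettings : IAirplaneModeSettings
{
    public void OpenSettingsPage()
    {
        var intent = new Android.Content.Intent(Android.Provider.Settings.ActionAirplaneModeSettings);
        intent.SetFlags(Android.Content.ActivityFlags.NewTask);
        Android.App.Application.Context.StartActivity(intent);
    }
}

// Shared code usage:
// Xamarin.Forms.DependencyService.Get<IAirplaneModeSettings>().OpenSettingsPage();
~~~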
Kamal, you're not going to be able to do it.
It doesn't seem like you're targeting iOS, but iOS has a lot of limitations for privacy and security purposes that won't allow you to do this. You can see more details here: stackoverflow.com/q/20469425/11104068
Android also blocked this from Android 4.2 onwards. Only system apps can make changes to airplane mode, as you can see here: stackoverflow.com/a/5533943/11104068
Since it doesn't seem you're creating a system app (one that gets installed with the operating system, not through the Play Store), you won't be able to get the permissions. It will give you an error/exception even if you implement everything.
We have a desktop Windows app (written in WPF/C#) that we distribute as a single .exe file with no installer (it bundles all its dependencies via the Fody/Costura plugin).
We would like to integrate a local Action Center toast functionality where the app can display a toast and respond to it when it's clicked.
Displaying the toast is straightforward and can be done using the Microsoft.Toolkit.Uwp.Notifications NuGet package. However, in order to actually receive notifications when the toast is clicked in the Action Center (as opposed to the balloon tip), we need to register with the notification platform.
The guide on how to do this seems to be focused on apps with an installer (e.g. WiX): https://learn.microsoft.com/en-us/windows/uwp/design/shell/tiles-and-notifications/send-local-toast-desktop#step-4-register-with-notification-platform
The specific task we're trying to achieve is, from the documentation:
If you're using classic Win32 (or if you support both), you have to declare your Application User Model ID (AUMID) and toast activator CLSID (the GUID from step #3) on your app's shortcut in Start.
How can we do it without writing an installer? We would like our app to do this registration on first run.
Note: the app already has provisions for elevating itself through UAC if needed by restarting itself in Administrator context.
Additional references: WPF native windows 10 toasts
[Update]
I managed to follow the instructions in https://learn.microsoft.com/en-us/windows/uwp/design/shell/tiles-and-notifications/send-local-toast-desktop
and https://learn.microsoft.com/en-us/windows/win32/shell/enable-desktop-toast-with-appusermodelid to put together what should have been a working solution, but in the end, clicking on toasts in the Action Center does not trigger OnActivated() in my NotificationActivator.
Salient points:
Sending notification
var toast = new ToastNotification(toastXml);
DesktopNotificationManagerCompat.CreateToastNotifier().Show(toast);
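For completeness, the toastXml payload used above isn't shown in the post; a minimal ToastGeneric payload (placeholder text) might be built like this:

~~~
using Windows.Data.Xml.Dom;

// Minimal ToastGeneric payload for the Show() call above.
var toastXml = new XmlDocument();
toastXml.LoadXml(
    "<toast launch=\"action=viewDetails\">" +
      "<visual><binding template=\"ToastGeneric\">" +
        "<text>Hello from Toasty</text>" +
      "</binding></visual>" +
    "</toast>");
~~~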
Registration:
string shortcutPath = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.Programs),
    "Toasty.lnk");

DesktopNotificationManagerCompat.RegisterAumidAndComServer<MyNotificationActivator>(AppName);
DesktopNotificationManagerCompat.RegisterActivator<MyNotificationActivator>();

if (!File.Exists(shortcutPath))
{
    ShortcutManager.RegisterAppForNotifications(
        shortcutPath,
        Assembly.GetExecutingAssembly().Location,
        null,
        AppName,
        ActivationId);
}
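The MyNotificationActivator referenced above isn't shown in the post; per the linked guide it would look roughly like this (the GUID is a placeholder for the activator CLSID from step 3 of the guide):

~~~
// COM activator expected by the registration calls above; replace the GUID
// with your own toast activator CLSID.
[ClassInterface(ClassInterfaceType.None)]
[ComSourceInterfaces(typeof(INotificationActivationCallback))]
[Guid("00000000-0000-0000-0000-000000000000"), ComVisible(true)]
public class MyNotificationActivator : NotificationActivator
{
    public override void OnActivated(string invokedArgs, NotificationUserInput userInput, string appUserModelId)
    {
        // Runs when the user clicks the toast, including from the Action Center.
    }
}
~~~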
Creating a shortcut
public static void RegisterAppForNotifications(
    string shortcutPath,
    string appExecutablePath,
    string arguments,
    string appName,
    string activatorId)
{
    var shellLinkClass = new ShellLinkCoClass();
    IShellLinkW shellLink = (IShellLinkW)shellLinkClass;
    shellLink.SetPath(appExecutablePath);

    IPropertyStore propertyStore = (IPropertyStore)shellLinkClass;
    IPersistFile persistFile = (IPersistFile)shellLinkClass;

    if (arguments != null)
    {
        shellLink.SetArguments(arguments);
    }

    // https://learn.microsoft.com/en-us/windows/win32/properties/props-system-appusermodel-id
    propertyStore.SetValue(
        new PropertyKey("9F4C2855-9F79-4B39-A8D0-E1D42DE1D5F3", 5),
        new PROPVARIANT(appName));

    // https://learn.microsoft.com/en-us/windows/win32/properties/props-system-appusermodel-toastactivatorclsid
    propertyStore.SetValue(
        new PropertyKey("9F4C2855-9F79-4B39-A8D0-E1D42DE1D5F3", 26),
        new PROPVARIANT(new Guid(activatorId)));

    propertyStore.Commit();
    persistFile.Save(shortcutPath, true);
}
[Update]
Finally got it to work - not sure what was wrong before, but the final version seems to be okay. Full code: https://gist.github.com/davidair/c4ea207bf6eece4ef08b97ab29a3036f
I have the same problem with my project now.
Managed to find this repository - https://github.com/felixrieseberg/electron-windows-interactive-notifications
Here's a C++ implementation for installing the shortcut (the InteractiveNotifications file, InstallShortcut method). I guess the problem is how we set the value in the PropertyStore; a string GUID is not suitable for some reason. Still, I wasn't able to solve the problem for now.
UPDATED: Finally, I was able to install the shortcut from code! Check my example on GitHub: https://github.com/romayavorskyi/WpfNotificationTest (still a lot of hardcoded values, but it should give you the general idea). And you were right, the shortcut path matters. It seems the shortcut should be in the ProgramData folder to work correctly.
I am trying to access the GPIO on my custom SBC using Windows 10 IoT Core. I have discovered that I must use the Lightning providers to accomplish this, so I tried to follow this guide to use the Lightning providers properly.
I used very simple code:
if (LightningProvider.IsLightningEnabled)
{
    LowLevelDevicesController.DefaultProvider = LightningProvider.GetAggregateProvider();
}

GpioStatus = "Initializing...";
var gpio = GpioController.GetDefault();
if (gpio == null)
{
    GpioStatus = "There is no GPIO controller on this device.";
}
else
{
    // Open the pin once and reuse it; opening the same pin twice throws.
    var pin = gpio.OpenPin(1);
    pin.Write(GpioPinValue.High);
    GpioStatus = pin.Read().ToString();
}
Where GpioStatus is output text on a UI.
I discovered that if I run the LowLevelDevicesController.DefaultProvider = LightningProvider.GetAggregateProvider(); line outside of the IsLightningEnabled check, it picks up the GPIO controller and lets me detect how many pins I have and read them (all low). However, I can't change the DriveMode or write to the pins without an error. The error just says to make sure the Lightning providers are enabled.
This brings me back to the guide I linked at the start. It suggests enabling the DMAP drivers using the Device Portal for Windows 10 IoT Core or DMAPUtil.exe. I have tried both. In the Device Portal, the area where the setting should be is just blank, and on the command line, trying to use DMAPUtil.exe only returns that dmaputil.exe is not available on this system.
Therefore I am asking if there is any other way to enable the Lightning providers, or if there is a way to tell whether they are incompatible with my board?
Thanks!
UPDATE
Also tried using the devcon.exe commands in the W10IoT Command line.
I am able to locate the Direct memory access controller, but when I run devcon.exe enable *PNP0200 it says it is enabled, yet it remains disabled when checked with devcon.exe status *PNP0200.
Please confirm whether you have added the IOT_DMAP_DRIVER feature in your OEMInput.xml; this feature adds the DMAP driver to the image. If IOT_DMAP_DRIVER is removed from the OEMInput.xml, the Default Driver Controller will be blank in the Device Portal and dmaputil will not be available on Windows IoT Core. Please see the IoT Core feature list.
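For reference, a sketch of the relevant OEMInput.xml fragment (the surrounding structure of your image configuration may differ):

~~~
<!-- Include the DMAP driver feature when building the IoT Core image -->
<Features>
  <Microsoft>
    <Feature>IOT_DMAP_DRIVER</Feature>
  </Microsoft>
</Features>
~~~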
Update:
You can download the source of the Lightning provider, and then deploy and debug it in your custom image.
TL;DR:
Would appreciate any extra information on Android's abstract class Vibrator vs. performHapticFeedback.
I would prefer to avoid the Vibrator class and prioritise performHapticFeedback, to circumvent having to ask the user for permissions and to rely only on their system's preferences.
Scenario:
I'm working with Xamarin trying to implement Haptic Feedback for Android and iOS.
Now, the iOS documentation has a short explanation, which I've implemented as follows:
void Platform.Vibrate(HapticsIntensity HapticsIntensity)
{
    UIKit.UIImpactFeedbackGenerator ImpactFeedbackGenerator;
    switch (HapticsIntensity)
    {
        case HapticsIntensity.Light:
            ImpactFeedbackGenerator = new UIKit.UIImpactFeedbackGenerator(UIKit.UIImpactFeedbackStyle.Light);
            break;
        case HapticsIntensity.Medium:
            ImpactFeedbackGenerator = new UIKit.UIImpactFeedbackGenerator(UIKit.UIImpactFeedbackStyle.Medium);
            break;
        case HapticsIntensity.Heavy:
            ImpactFeedbackGenerator = new UIKit.UIImpactFeedbackGenerator(UIKit.UIImpactFeedbackStyle.Heavy);
            break;
        default:
            ImpactFeedbackGenerator = null;
            break;
    }

    if (ImpactFeedbackGenerator != null)
    {
        ImpactFeedbackGenerator.Prepare();
        ImpactFeedbackGenerator.ImpactOccurred();
    }
}
The Android documentation for Haptic Feedback states that the method performHapticFeedback expects a HapticFeedbackConstant as a parameter.
public boolean performHapticFeedback (int feedbackConstant)
The available feedbackConstants are listed here, but there seems to be no difference between them.
Calling:
LongPress
Engine.AndroidActivity.Window.DecorView.PerformHapticFeedback(Android.Views.FeedbackConstants.LongPress);
has the same effect as
VirtualKey
Engine.AndroidActivity.Window.DecorView.PerformHapticFeedback(Android.Views.FeedbackConstants.VirtualKey);
or
KeyboardTap
Engine.AndroidActivity.Window.DecorView.PerformHapticFeedback(Android.Views.FeedbackConstants.KeyboardTap);
Moreover, some of the FeedbackConstants don't even result in haptic feedback.
Does anyone know where I could find any more documentation around this matter?
The reason why I ask is that I am implementing an abstraction layer over Xamarin, where my intention is to have my method calls look like:
Vibrate(HapticsIntensity.Light);
Vibrate(HapticsIntensity.Medium);
Vibrate(HapticsIntensity.Heavy);
This works today, but whereas on iOS I get distinct tactile feedback for Light, Medium and Heavy vibration, on Android I can't differentiate between them.
Now, I know Android has a Vibrator class (see here), which allows for granular control; however, to use it I need to add the android.permission.VIBRATE permission to my manifest or ask for it specifically, and that is not optimal.
Also, if I add the android.permission.VIBRATE permission to my manifest, it seems that (if the device has haptic feedback enabled in its settings) I don't even need to add the Vibrate() method call to my buttons' OnClick; they will already provide the tactile feedback (BZZZTT!!1!).
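For reference, the manifest entry in question is just:

~~~
<uses-permission android:name="android.permission.VIBRATE" />
~~~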
It totally depends on whether the device OEM has altered the AOSP code and the vibration timing arrays in the com.android.internal.R.array resource to enable a special haptic feedback "engine" on their device.
By default, hardware OEMs are only required to support (in hardware) a standard on/off vibration (linear actuator, weighted rotary, etc.), not "true" haptic feedback, which is normally based on waveforms.
In comparison, newer iOS devices (7|8+?) use the "Taptic Engine" (fancy speak for an "advanced linear actuator") for haptic feedback, and Android devices are only recently catching up on the hardware side: newer OnePlus, Pixel 3s, etc. are starting to include more advanced haptic/vibration hardware (whether or not the OEM has done anything special with that new hardware, you decide...).
So if you look at AOSP's PhoneWindowManager.java you will find that most of the HapticFeedbackConstants get lumped into a few VibrationEffect constants such as:
~~~
VibrationEffect.EFFECT_TICK
VibrationEffect.EFFECT_CLICK
VibrationEffect.EFFECT_HEAVY_CLICK
~~~
Look at the source if you want to see what the AOSP default VibrationEffect would be for a specific HapticFeedbackConstant:
PhoneWindowManager.java
If you have to provide manual haptics in your app for some reason, you can use the Vibrator API and provide the timing array for your on/off pattern, and then special-case it for phone devices that offer more hardware features.
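For example, a minimal Xamarin.Android sketch of that manual approach (assumes the android.permission.VIBRATE permission; the timing pattern is arbitrary):

~~~
using Android.Content;
using Android.OS;

// On/off timing pattern in milliseconds: wait 0, vibrate 50, pause 100, vibrate 100.
long[] pattern = { 0, 50, 100, 100 };
var vibrator = (Vibrator)Android.App.Application.Context.GetSystemService(Context.VibratorService);

if (Build.VERSION.SdkInt >= BuildVersionCodes.O)
{
    // API 26+: waveform effect; -1 means "do not repeat".
    vibrator.Vibrate(VibrationEffect.CreateWaveform(pattern, -1));
}
else
{
    // Pre-API 26 fallback (this overload is deprecated on newer APIs).
    vibrator.Vibrate(pattern, -1);
}
~~~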
I know this question has been beaten to death, but I don't want anything super complicated here.
We have a companion app for our site that is only compatible with 7 and 10-inch tablets, and we need to alert only users on those devices about our app. The problem is, I can't go by resolution: my Galaxy S3 has a 1280 x 720 screen but is obviously not a tablet. I also can't for the life of me find a way to get the physical size of the screen. The only solution I have come up with is detecting whether the device can make calls with MobileCapabilities.CanInitiateVoiceCall. Unfortunately, my boss isn't happy with that solution.
So... How can I distinguish between a phone and a tablet in my web app (Server or client side)?
UPDATE: So far it seems that the best approach for Android is something from a blog post by the Android team: all Android phones use "Mobile" in the user agent string, so checking for "Mobile" and "Android" will tell you if it's a phone, while just "Android" should mean a tablet. iOS devices should be just as simple: checking for "iPhone" vs. "iPad" seems to have worked so far.
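A quick server-side sketch of that heuristic (the method name is mine; user agent sniffing is inherently best-effort):

~~~
// Classify a device from its user agent string per the rules above:
// Android + "Mobile" => phone, Android alone => tablet, iPhone vs. iPad.
static bool IsPhone(string userAgent)
{
    if (userAgent.Contains("Android"))
        return userAgent.Contains("Mobile"); // phones say "Mobile"; tablets don't
    if (userAgent.Contains("iPhone"))
        return true;
    return false; // iPad and everything else: not a phone
}
~~~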
I know this is a little late, but I was looking for the same thing.
WURFL has what you want. You can implement it easily, and it even has an API you can query.
For an ASP.NET application, you must first perform the one-off initialization.
public class Global : HttpApplication
{
    public const String WurflDataFilePath = "~/App_Data/wurfl.zip";
    // Cache key for the manager (not shown in the original post; any unique string works).
    private const String WurflManagerCacheKey = "__wurfl_manager";

    private void Application_Start(Object sender, EventArgs e)
    {
        var wurflDataFile = HttpContext.Current.Server.MapPath(WurflDataFilePath);
        var configurer = new InMemoryConfigurer().MainFile(wurflDataFile);
        var manager = WURFLManagerBuilder.Build(configurer);
        HttpContext.Current.Cache[WurflManagerCacheKey] = manager;
    }
}
And then use it like this.
var device = WURFLManagerBuilder.Instance.GetDeviceForRequest(userAgent);
var isTablet = device.GetCapability("is_tablet");
var isSmartphone = device.GetCapability("is_smartphone");
For more info check ASP.NET implementation
Hope this helps anyone else looking for this.
You can try user agent detection and search for keywords. For example, all non-tablet Android devices have the "Mobile Safari" keywords in their user agent.
I've got a C# control wrapped around the DirectShow libraries. Though I'm not certain it's relevant, I'm running on Windows CE 6.0R3. When trying to play a WMA audio file using the control, the following code throws an exception of "No such interface supported":
m_graph = new DShowGraph(mediaFile);
m_graphBuilder = m_graph.Open();
m_videoWindow = (IVideoWindow)m_graph.GetVideoWindow();

if (m_videoWindow == null)
{
    // this is not hit
}

try
{
    m_videoWindow.put_WindowStyle((int)(WS.CHILD | WS.VISIBLE | WS.CLIPSIBLINGS));
}
catch (Exception ex)
{
    // I end up here
}
The Open call looks like this (error handling, etc. trimmed):
private IGraphBuilder _graphBuilder;

internal IGraphBuilder Open()
{
    object filterGraph = ClassId.CoCreateInstance(ClassId.FilterGraph);
    _graphBuilder = (IGraphBuilder)filterGraph;
    _graphBuilder.RenderFile(_input, null);
    return _graphBuilder;
}
The GetVideoWindow call simply looks like this:
public IVideoWindow GetVideoWindow()
{
    if (_graphBuilder == null)
        return null;

    return (IVideoWindow)_graphBuilder;
}
Strangely, this all works just fine with the same control DLL, same application and same media file when run under Windows CE 5.0.
My suspicion is that it might have something to do with the fact we're playing an audio-only file (I'm checking to see if the same problem occurs with a video file now), but I'm not overly versed in DirectShow, so I'd like to understand exactly what's going on here.
One of the large challenges in debugging this is that I don't have the failing hardware in my office - it's at a customer's site, so I have to make changes, send them and wait for a reply. While that doesn't affect the question, it does affect my ability to quickly follow up with suggestions or follow on questions anyone might have.
EDIT1
Playing a WMV file works fine, so it is related to the file being audio-only. We can't test MP3 to see if it's a WMA codec issue because the device OEM does not include the MP3 codec in the OS due to licensing concerns.
The graph's IVideoWindow does nothing but forward to the underlying IVideoWindow of the video rendering filter. With an audio-only pipeline there is no video renderer (obviously), so IVideoWindow does not make much sense. The interface is still available, but once you try to call its methods there is nothing to forward to, hence the error.
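Based on that, a defensive sketch (the helper name is mine) that probes IVideoWindow once and skips all video window setup for audio-only graphs:

~~~
// On an audio-only graph the manager still hands out IVideoWindow, but every
// forwarded call fails, so treat the first failure as "no video present".
private bool TryGetUsableVideoWindow(out IVideoWindow videoWindow)
{
    videoWindow = m_graph.GetVideoWindow();
    try
    {
        videoWindow.put_WindowStyle((int)(WS.CHILD | WS.VISIBLE | WS.CLIPSIBLINGS));
        return true;
    }
    catch (Exception)
    {
        videoWindow = null; // audio-only pipeline: skip IVideoWindow entirely
        return false;
    }
}
~~~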