Holding event is not fired on touch screen device - c#

I'm trying to create a simple UWP app where I need to display a context menu: when the user right-clicks on a non-touch device, and when the user holds (long-presses) on a touch-screen device.
I'm using the RightTapped event, which works perfectly on my desktop. On the other hand, I use the Holding event, which should be triggered when the screen is long-pressed, but it doesn't fire (on a touch-screen mobile device).
private void Webview_Holding(object sender, HoldingRoutedEventArgs e)
{
    Webview.ContextFlyout.ShowAt(sender as FrameworkElement);
}
Do I need to use some other event to show the context menu? If yes, which one?

You should use the RightTapped event for both touch and non-touch devices, as I answered here. However, WebView does not support all of the touch and keyboard events.
From the Remarks section of the MSDN documentation for the WebView control:
As indicated in the Events table, WebView doesn’t support most of the user input events inherited from UIElement, such as KeyDown, KeyUp, and PointerPressed. A common workaround is to use InvokeScriptAsync with the JavaScript eval function to use the HTML event handlers, and to use window.external.notify from the HTML event handler to notify the application using WebView.ScriptNotify.
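As a concrete illustration of that workaround, here is a minimal sketch; the script body, the "contextmenu" token, and the event wiring are assumptions for illustration, and it assumes the loaded page is allowed to call window.external.notify (e.g. local ms-appx-web content):

// Hook the HTML-side contextmenu event (fired by long-press on touch) once navigation completes.
private async void Webview_NavigationCompleted(WebView sender,
    WebViewNavigationCompletedEventArgs args)
{
    string script =
        "document.oncontextmenu = function () { window.external.notify('contextmenu'); };";
    await Webview.InvokeScriptAsync("eval", new[] { script });
}

// The HTML handler calls back into the app through ScriptNotify.
private void Webview_ScriptNotify(object sender, NotifyEventArgs e)
{
    if (e.Value == "contextmenu")
    {
        Webview.ContextFlyout.ShowAt(Webview);
    }
}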

Related

App not getting keyboard focus when entering from other app

In WPF I have a Window_KeyDown event which changes the mouse cursor shape when the user presses Shift. It works as expected except if I have clicked on another app. So I added a Window_MouseEnter event like this to grab keyboard focus when the mouse re-enters my app:
private void Window_MouseEnter(object sender, MouseEventArgs e)
{
    IInputElement b = Keyboard.Focus(this);
    this.Focus();
    Debug.WriteLine(b + DateTime.Now.ToLongTimeString());
}
I can see the MouseEnter event firing (with the debug line) when the mouse enters the app but my app still doesn't get keyboard events until I click in the app.
It's irritating because the mouse cursor changes properly when the mouse enters controls within my app, so one would expect things to work, but the Shift-mouse functions don't work until after a click.
What am I missing?
I found that the secret is the Activate() method. I put it in the MouseEnter function; it has the side effect of forcing the entire app to the foreground if part of it was hidden behind other apps.
With Activate(), the Focus() method is not needed.
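A minimal sketch of the resulting handler (same names as above):

private void Window_MouseEnter(object sender, MouseEventArgs e)
{
    // Activate() gives the window keyboard focus and brings it to the foreground
    // if it was partially hidden behind other apps; Focus() is no longer needed.
    this.Activate();
}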

UWP Template10 SystemNavigationManager Back button adding GotFocus event

I am trying to add an OnGotFocus event to the Template 10 back button, as follows:
In PageViewModels.cs:
public override async Task OnNavigatedToAsync(object parameter, NavigationMode mode, IDictionary<string, object> suspensionState)
{
    SystemNavigationManager.GetForCurrentView().BackRequested += OnGotFocus;
}

private async void OnGotFocus(object sender, BackRequestedEventArgs e)
{
    ....
}
But that does not work. Can anyone give me any pointers?
What you did in your code snippet is wire up a handler for the BackRequested event. This event is fired when the system registers a request to go back in the app. It can be triggered by the user tapping the back button in the task bar in Tablet mode on desktop, by clicking the back button in the title bar of your app in windowed mode, or by pressing the back button on a mobile device.
Either way, the event is fired by the system, and the only thing it does is call your method. The name of your method does not matter at all.
I think you should review some basics about event handling in C# to clear up any confusion.
To be able to use the GotFocus event, you will have to create your own back button in XAML and attach the handler to that button, because only then do you have full control over it. If you just use the system-provided BackRequested event, the system is in control, and apart from this event you can't customize anything.
<Button GotFocus="OnGotFocus" Content="My back button" />
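A hypothetical code-behind sketch for that button (the handler name matches the XAML above; the back navigation itself would go in a separate Click handler, e.g. one that calls your frame's GoBack):

private void OnGotFocus(object sender, RoutedEventArgs e)
{
    // Fires when the in-app back button receives focus -- this is the event
    // the system-provided back button never exposes.
}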

Touch start Event Handle on Windows Phone c#

I've done some research and I found this on MSDN:
Tap
There are two behaviors associated with a tap gesture:
Finger down provides touch indication
Finger up executes the action
I want to handle only the first behavior (on finger down), but I haven't found anything.
How can I handle this?
You can use the ManipulationStarted event to handle the touch start. Basically, this event occurs when the user begins a manipulation on the UIElement, no matter what the gesture is:
private void MyControl_ManipulationStarted(object sender,
    System.Windows.Input.ManipulationStartedEventArgs e)
{
    // Fires as soon as the user touches the element, before any gesture is recognized.
}
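For completeness, a sketch of wiring the handler up (assuming a control named MyControl, as in the snippet above):

MyControl.ManipulationStarted += MyControl_ManipulationStarted;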
Why don't you try using the MouseLeave and MouseEnter events on your control?
So I guess in your case you could use the MouseEnter event.
MSDN reference

How to fire UIElement.TouchDown event on Windows Forms C# on Touchscreen, windows 8

This needs to be Windows Forms (WinForms), not WPF. The problem is that the MouseDown event doesn't fire properly on a touchscreen on Windows 8: it fires either after you touch and move your finger, or after a click, i.e. only once you lift your finger off the screen. I want it to fire like a normal MouseDown should: as soon as I touch the screen. The solution to that would be the TouchDown event. But I just can't handle that event. What I do is create a Multitouch class like this:
class Moultitouch : UIElement
{
    public Moultitouch()
    {
        this.TouchDown += new EventHandler<System.Windows.Input.TouchEventArgs>(Moultitouch_TouchDown);
    }

    void Moultitouch_TouchDown(object sender, System.Windows.Input.TouchEventArgs e)
    {
        // it never goes in
    }

    protected override void OnTouchDown(System.Windows.Input.TouchEventArgs e)
    {
        base.OnTouchDown(e);
    }
}
After that, I declare that class in the component that should handle the event on touch. I did a lot of research and have been trying to get this to work, with nothing so far. I think that "Hosting a WPF Composite Control in Windows Forms" (http://msdn.microsoft.com/en-us/library/ms742215.aspx) could solve my problem, but that solution would be really long and tough. Any help or ideas are really appreciated.
Your issue is that touchscreen-fired events and mouse-fired events are not the same thing. For example, a WinForms button can toggle its background colour by handling OnMouseEnter and OnMouseLeave. Code that up on a form and run the cursor in and out of the button, and everything works fine. Use a touch screen and run your finger over it in the same way, and nothing happens until you release contact with the screen, at which point a click event is raised.
Not the answer you are looking for, but I am having the same issue with an on-screen keyboard I'm developing for a disabled friend of mine. My research led me here: https://blogs.windows.com/buildingapps/2017/05/25/uwp-evolution-touch-development/#02octDx0X3uBHXqm.97 and specifically to the links that provide detail on the ManipulationStarted and ManipulationEnded events. Hope that helps; it has for what I'm trying to achieve.
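If you need this at the WinForms/Win32 level rather than in UWP, one approach (not from the answer above; sketched here as an assumption) is to register the window for raw touch messages and handle WM_TOUCH in WndProc, which arrives as soon as a finger makes contact:

using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

public class TouchForm : Form
{
    private const int WM_TOUCH = 0x0240;

    [DllImport("user32.dll")]
    private static extern bool RegisterTouchWindow(IntPtr hWnd, uint ulFlags);

    protected override void OnHandleCreated(EventArgs e)
    {
        base.OnHandleCreated(e);
        // Opt in to WM_TOUCH instead of the synthesized (and delayed) mouse events.
        RegisterTouchWindow(this.Handle, 0);
    }

    protected override void WndProc(ref Message m)
    {
        if (m.Msg == WM_TOUCH)
        {
            // WM_TOUCH arrives for touch down, move, and up; call GetTouchInputInfo
            // and check the TOUCHEVENTF_DOWN flag to isolate the initial contact.
        }
        base.WndProc(ref m);
    }
}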

Button event on release

I'm working on an app to control some hardware over Bluetooth.
Currently I have a form with some buttons. For each button, I want to trigger one event when the button is pressed down and fire a different event when the button is released.
Is this possible?
Not sure what the "Bluetooth" info should tell me.
But in C# WinForms, yes, that is possible. See the MouseDown and MouseUp events: http://msdn.microsoft.com/en-us/library/system.windows.forms.button_events.aspx
(You may need to combine them with the keyboard events to capture those as well.)
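A minimal WinForms sketch (the button and the action methods are assumptions for illustration):

// myButton is a Button placed on the form, e.g. via the designer.
myButton.MouseDown += (s, e) =>
{
    // Fires when the button is pressed down -- start the Bluetooth action here.
};
myButton.MouseUp += (s, e) =>
{
    // Fires when the button is released -- stop the action here.
};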
