I've done some research and found the following on MSDN:
Tap
There are two behaviors associated with a tap gesture:
Finger down provides touch indication
Finger up executes the action
I want to handle only the first behavior (on finger down), but I couldn't find anything about it.
How can I handle this?
You can use the ManipulationStarted event to handle the start of a touch. Basically, this event occurs when the user begins a manipulation on the UIElement, no matter what the gesture is:
private void MyControl_ManipulationStarted(object sender,
    System.Windows.Input.ManipulationStartedEventArgs e)
{
    // Fires as soon as the finger makes contact with the element.
}
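For completeness, here's a minimal sketch of the full wiring (MainPage and MyControl are assumed names, and I'm assuming MyControl is an element with a Background property):

public MainPage()
{
    InitializeComponent();

    // Hook the manipulation event in code-behind (it can also be wired up in XAML).
    MyControl.ManipulationStarted += MyControl_ManipulationStarted;
}

private void MyControl_ManipulationStarted(object sender,
    System.Windows.Input.ManipulationStartedEventArgs e)
{
    // Finger is down: give your touch indication here, e.g. change the background.
    // e.ManipulationOrigin holds the contact point relative to e.ManipulationContainer.
    MyControl.Background = new SolidColorBrush(Colors.Gray);
}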
Why don't you try using the MouseLeave and MouseEnter events of your control?
So I guess in your case you could use the MouseEnter event.
MSDN reference
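A rough sketch of that idea (MyControl is an assumed name for the element that should react; on the phone, touch input is promoted to the mouse events, so MouseEnter should fire when the finger first goes down over the control):

MyControl.MouseEnter += (sender, e) =>
{
    // Finger/pointer is now over the control: show the touch indication.
};

MyControl.MouseLeave += (sender, e) =>
{
    // Pointer has left the control: remove the indication.
};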
I'm using Mapsui as a mapping control in a C# application.
By default, panning is initiated by dragging using the left mouse button.
I want to change this to the middle mouse button.
Does anyone know how to do this?
Mapsui has a type called PanMode. You can create an instance as follows; however, I believe it is just an enum for centering the map when panning:
Mapsui.UI.PanMode panMode = new PanMode();
EDIT:
Based on 'pauldendulk's' answer (thank you for your support), I think I need to do something like this:
First, catch the middle-button click and relay it to Mapsui's left-button method. Unfortunately, MapControlMouseLeftButtonDown() is a private method, so this will not work:
MyMapControl.MouseDown += MapControlOnMouseButtonDown;

private void MapControlOnMouseButtonDown(object sender, MouseButtonEventArgs e)
{
    if (e.ChangedButton == MouseButton.Middle)
    {
        // Does not compile: MapControlMouseLeftButtonDown is private (and not static).
        Mapsui.UI.Wpf.MapControl.MapControlMouseLeftButtonDown(sender, e);
    }
}
Secondly, I need to stop the original left-button click from firing:
MyMapControl.MouseLeftButtonDown += null;
Again, this is not correct, as it throws an exception (the handler cannot be null).
Does anyone know how to solve these issues?
Mapsui was not designed with this in mind. Perhaps it is possible if you assign an event handler to WPF's MouseDown event and set the viewport from there. You also need to suppress the regular mouse event; perhaps this is possible by setting the MouseLeftButtonDown event handler to null.
PanMode is not relevant. It is meant to limit the area where users can pan/zoom.
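For anyone attempting this, here is a rough WPF-level sketch of that idea. It is untested against Mapsui, the actual pan call is left as a placeholder because the viewport/navigator API differs between Mapsui versions, and suppressing the built-in pan via the tunneling Preview event is an assumption rather than a documented Mapsui feature:

// Somewhere in the window's code-behind. MyMapControl is the Mapsui.UI.Wpf.MapControl
// from XAML; _lastMiddleDragPosition is a field used to track the drag.
private Point? _lastMiddleDragPosition;

private void HookMiddleButtonPan()
{
    // Suppress the built-in left-button pan by marking the tunneling event handled
    // before the MapControl sees the bubbling event (assumption: the control does
    // not subscribe with handledEventsToo).
    MyMapControl.PreviewMouseLeftButtonDown += (s, e) => e.Handled = true;

    MyMapControl.PreviewMouseDown += (s, e) =>
    {
        if (e.ChangedButton == MouseButton.Middle)
        {
            _lastMiddleDragPosition = e.GetPosition(MyMapControl);
            MyMapControl.CaptureMouse();
        }
    };

    MyMapControl.PreviewMouseMove += (s, e) =>
    {
        if (_lastMiddleDragPosition.HasValue && e.MiddleButton == MouseButtonState.Pressed)
        {
            Point current = e.GetPosition(MyMapControl);
            double dx = current.X - _lastMiddleDragPosition.Value.X;
            double dy = current.Y - _lastMiddleDragPosition.Value.Y;
            _lastMiddleDragPosition = current;
            // TODO: pan the map by (dx, dy) screen units here using the
            // viewport/navigator API of your Mapsui version, then refresh the control.
        }
    };

    MyMapControl.PreviewMouseUp += (s, e) =>
    {
        if (e.ChangedButton == MouseButton.Middle)
        {
            _lastMiddleDragPosition = null;
            MyMapControl.ReleaseMouseCapture();
        }
    };
}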
I'm trying to create a simple UWP app where I need to display a context menu when the user right-clicks on a non-touch device, and when the user presses and holds on a touch-screen device.
I'm using the RightTapped event, which works perfectly on my desktop. On the other side, I use the Holding event, which should be triggered when the screen is long-pressed, but it doesn't work (on a touch-screen mobile).
private void Webview_Holding(object sender, HoldingRoutedEventArgs e)
{
    Webview.ContextFlyout.ShowAt(sender as FrameworkElement);
}
Do I need to use some other event to show the context menu? If yes, which one?
You should use the RightTapped event for both touch and non-touch devices, as I answered here. However, WebView does not support all of the touch or keyboard events.
From the Remarks section of MSDN for the WebView control:
As indicated in the Events table, WebView doesn't support most of the user input events inherited from UIElement, such as KeyDown, KeyUp, and PointerPressed. A common workaround is to use InvokeScriptAsync with the JavaScript eval function to use the HTML event handlers, and to use window.external.notify from the HTML event handler to notify the application using WebView.ScriptNotify.
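If you do need the gesture inside the WebView content itself, a minimal sketch of that workaround might look like the following. The handler names are illustrative, and it assumes ScriptNotify is permitted for the page you are displaying (for example, content loaded with NavigateToString or a URI listed in the ApplicationContentUriRules):

private async void Webview_NavigationCompleted(WebView sender,
    WebViewNavigationCompletedEventArgs args)
{
    // Hook the HTML-side contextmenu event (raised for right-click and,
    // on touch, usually for press-and-hold) and relay it to the app.
    string script = @"
        document.oncontextmenu = function (e) {
            window.external.notify('contextmenu');
            return false;
        };";
    await sender.InvokeScriptAsync("eval", new[] { script });
}

private void Webview_ScriptNotify(object sender, NotifyEventArgs e)
{
    if (e.Value == "contextmenu")
    {
        Webview.ContextFlyout.ShowAt(Webview);
    }
}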
This needs to be Windows Forms (WinForms), not WPF. The problem is that the single MouseDown event doesn't fire properly on a touchscreen on Windows 8: it fires either after you touch and move your finger, or after a click, i.e. only once you lift your finger off the screen. I want it to fire the way a normal MouseDown should: as soon as I touch the screen. The solution would be the TouchDown event, but I just can't get that event to fire. What I do is create a Multitouch class like this:
class Multitouch : UIElement
{
    public Multitouch()
    {
        this.TouchDown += new EventHandler<System.Windows.Input.TouchEventArgs>(Multitouch_TouchDown);
    }

    void Multitouch_TouchDown(object sender, System.Windows.Input.TouchEventArgs e)
    {
        // It never gets here.
    }

    protected override void OnTouchDown(System.Windows.Input.TouchEventArgs e)
    {
        base.OnTouchDown(e);
    }
}
After that I declare that class in the component that should handle the event on touch. I did a lot of research and have been trying to get this to work, with nothing so far. I think that "Hosting a WPF Composite Control in Windows Forms" http://msdn.microsoft.com/en-us/library/ms742215.aspx could solve my problem, but that solution could be really long and tough. Any help or ideas are really appreciated.
Your issue is that touchscreen-fired events and mouse-fired events are not the same thing. For example, a WinForms button can toggle its background colour by handling OnMouseEnter and OnMouseLeave. Code that up on a form, run the cursor in and out of the button, and everything works fine. Use a touch screen and run your finger over it in the same way, and nothing happens until you release contact with the screen, at which point a Click event is raised.

Not the answer you are looking for, but I am having the same issue with an on-screen keyboard I'm developing for a disabled friend of mine. My research led me here https://blogs.windows.com/buildingapps/2017/05/25/uwp-evolution-touch-development/#02octDx0X3uBHXqm.97 and specifically to the links that provide detail on the ManipulationStarted and ManipulationCompleted events. Hope that helps; it has for what I'm trying to achieve.
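For reference, a minimal WPF-flavoured sketch of that manipulation-events approach (this is an illustration, not a WinForms solution; the element must be in a WPF visual tree, hit-testable, and have IsManipulationEnabled set, otherwise the events never fire):

// A hit-testable WPF element that reacts the moment a finger makes contact.
var touchArea = new System.Windows.Controls.Border
{
    Background = System.Windows.Media.Brushes.Transparent, // needed for hit testing
    IsManipulationEnabled = true                           // required for manipulation events
};

touchArea.ManipulationStarted += (sender, e) =>
{
    // Finger is down: this is the "MouseDown" moment you want on touch.
};

touchArea.ManipulationCompleted += (sender, e) =>
{
    // All fingers have been lifted.
};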
I'm currently developing a custom control and realized that my code is being run twice. It is not really a huge issue (it is only a Focus method call), but I would like to understand it.
From reading the MSDN description of the click | onclick event, it states that:
Fires when the user clicks the left mouse button on the object.
So I added an OnClick override and a MouseClick event handler to deal with both left and right clicking. But after debugging the code, I found that OnClick handles both left and right clicks.
Why is OnClick handling both, and do I need to keep both in my code for some reason I'm overlooking?
protected override void OnClick(EventArgs e)
{
    this.Focus();
    base.OnClick(e);
}

private void CustomControl_MouseClick(object sender, MouseEventArgs e)
{
    if (e.Button == MouseButtons.Right)
    {
        rightClickMenu(e);
    }
}
According to MSDN, the Click event is raised not only when the mouse is clicked but also when the Enter key is pressed. If you only need to handle mouse clicks, I'd move all of your code into the MouseClick event. You can't do it the other way around, because the Click event doesn't tell you which mouse button (if any) was clicked.
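A minimal sketch of that suggestion, with everything consolidated into the MouseClick handler (rightClickMenu is the helper from the question):

private void CustomControl_MouseClick(object sender, MouseEventArgs e)
{
    this.Focus(); // moved here from the OnClick override

    if (e.Button == MouseButtons.Right)
    {
        rightClickMenu(e);
    }
}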
First of all, your link is incorrect, it links to HTML and DHTML Reference, not WinForms :)
Correct link is Control.MouseClick event
You need to override only one method. If you want to handle only mouse clicks, override OnMouseClick() and don't handle the MouseClick event; otherwise, override OnClick() and don't override OnMouseClick().
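For illustration, a sketch of the override-only version: the same logic as the question's handler, but with no event subscription at all (rightClickMenu is the helper from the question):

protected override void OnMouseClick(MouseEventArgs e)
{
    this.Focus();

    if (e.Button == MouseButtons.Right)
    {
        rightClickMenu(e);
    }

    base.OnMouseClick(e);
}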
You shouldn't need to have both events... just keep the OnClick.
Also, I haven't done Windows Forms in quite a while, but I think there's a better way to accept focus than manually setting it in the click event. I can't tell you specifically what it is; I think there's a property for it or something.
In WinForms, the Click event is raised when either mouse button is clicked.
If my memory serves me right, Click covers both a mouse click and the 'Enter' key, or even setting focus on the control with the 'Tab' key and then using 'Space' or 'Enter' to "click" it.
If such behaviour is acceptable/desired, you may do the following.
I had this workaround for a DoubleClick event...
void ControlClick(object sender, EventArgs e)
{
    MouseEventArgs mEvt = e as MouseEventArgs; // or (MouseEventArgs)e;
    // mEvt now has the same properties as 'e' in a MouseClick event.
    // Note: it will be null if the "click" came from the keyboard (Enter/Space),
    // because then 'e' is a plain EventArgs, so check for null before using it.
}
Hope this helps.
-Nurchi
The OnClick override and the CustomControl_MouseClick handler are responding to the same click.
You can attach as many methods as you want to an event (this.Click += ...).
I'm making a custom control with a panel. I want to be able to drag and drop it, so I've implemented that in the MouseDown event of my control. But I want the control to react when you start the drag, to give a little feedback to the user. So in the MouseDown event I change the color; then I want to change it back in the MouseUp event.
My control is not installed into VS2008; it's just a class I've written that I instantiate at run time (I don't know in advance how many I'll need, and so on). Now, my control exposes a MouseDown event so that it can be dragged. When I subscribe to this event from the parent application to actually perform the drag and drop, my control is not repainted in its MouseUp event. In fact, MouseUp is never invoked. If, on the other hand, I don't subscribe to the event in the parent app, it works as intended.
What's going on? Is the parent interrupting the flow so that the MouseUp event never fires in my control? How do I get around this?
I'm not sure if you are using Windows Forms or WPF, but in Windows Forms here is what I mean:
public class DerivedPanel : Panel
{
    protected override void OnMouseDown(MouseEventArgs e)
    {
        base.OnMouseDown(e);
        Capture = true;   // keep receiving mouse events until the button is released
    }

    protected override void OnMouseUp(MouseEventArgs e)
    {
        base.OnMouseUp(e);
        Capture = false;
        // Change your color back (or whatever) here.
    }
}
In WPF there are two methods, CaptureMouse() and ReleaseMouseCapture(), that do the same thing. When a control captures the mouse, it will receive mouse events even if the cursor isn't over the control. This could be causing your problem. See the MSDN article.
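For reference, a rough sketch of the WPF equivalent (inside a class derived from a WPF element), in case that's the framework you're using:

protected override void OnMouseDown(MouseButtonEventArgs e)
{
    base.OnMouseDown(e);
    CaptureMouse();          // keep receiving mouse events during the drag
    // Change the color to give the drag feedback here.
}

protected override void OnMouseUp(MouseButtonEventArgs e)
{
    base.OnMouseUp(e);
    ReleaseMouseCapture();   // restore normal routing
    // Change the color back here.
}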
Do you capture the mouse in the custom control in the MouseDown event? Try capturing in MouseDown and releasing the capture in MouseUp.