In my C# Windows Forms code I'd like to detect when a button is pressed (and perform an action) and when it is released (and perform another action).
I know about the MouseDown and MouseUp events, and up to Windows XP everything was fine.
The problem arises with Windows 7 and a capacitive touchscreen, where Microsoft introduced gestures and the "PressAndHold" feature: the MouseDown event is received several seconds after the user touches the screen. (N.B. using a mouse, everything works fine.)
How can I avoid this annoying delay before receiving the MouseDown event?
I already tried GlobalAddAtom("MicrosoftTabletPenServiceProperty"), and I saw a small change: I no longer receive RightButton, I receive LeftButton instead, but still after the same delay.
I also tried the MouseHover event with if (MouseButtons == MouseButtons.Left), but without success (it works with the mouse only, not with touch).
N.B. I need to keep the gesture and PressAndHold features active for the other controls in the form.
I think I found a way on my Windows 7 touch device:
I have to disable PressAndHold and also register the button as a touch window; then I get the MouseDown event almost immediately for that button, while preserving gestures for all other controls.
TogglePressAndHold(btnMoveUp.Handle, false);
RegisterTouchWindow(btnMoveUp.Handle, 0);
For the first function, refer to this MSDN article: https://msdn.microsoft.com/en-us/library/ms812373.aspx
RegisterTouchWindow is a User32.dll function.
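For reference, here is a minimal P/Invoke sketch of both calls. The TouchNative class name is mine, the TABLET_DISABLE_PRESSANDHOLD value comes from the MSDN article, and this passes the property name string directly to SetProp instead of going through GlobalAddAtom, so treat it as an untested sketch:

using System;
using System.Runtime.InteropServices;

static class TouchNative
{
    // SetProp also accepts an atom created with GlobalAddAtom;
    // passing the string directly lets Windows manage the atom for us.
    [DllImport("user32.dll", CharSet = CharSet.Unicode)]
    static extern bool SetProp(IntPtr hWnd, string lpString, IntPtr hData);

    [DllImport("user32.dll", CharSet = CharSet.Unicode)]
    static extern IntPtr RemoveProp(IntPtr hWnd, string lpString);

    [DllImport("user32.dll")]
    public static extern bool RegisterTouchWindow(IntPtr hWnd, uint ulFlags);

    const string TabletProperty = "MicrosoftTabletPenServiceProperty";
    const int TABLET_DISABLE_PRESSANDHOLD = 0x00000001; // value from the MSDN article

    public static void TogglePressAndHold(IntPtr hWnd, bool enable)
    {
        if (enable)
            RemoveProp(hWnd, TabletProperty);   // restore default press-and-hold
        else
            // Disable press-and-hold for this window only.
            SetProp(hWnd, TabletProperty, new IntPtr(TABLET_DISABLE_PRESSANDHOLD));
    }
}

Usage, e.g. in the form's constructor after InitializeComponent():

TouchNative.TogglePressAndHold(btnMoveUp.Handle, false);
TouchNative.RegisterTouchWindow(btnMoveUp.Handle, 0);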
Hope this is of some help to others. Let me know if it also works on other platforms.
Have you tried using the Control.Click event?
I am currently developing a UI for my Unity game. As part of that, I have started creating a key-binder menu where one can reassign keys. This worked fine until I tried to implement it for the mouse buttons.
I tried using Event.current.button so I can handle mouse input, but first of all it returns 0 all the time, even when I am not pressing anything other than mouse button 0, and secondly it does not react to my extra mouse buttons.
Then I tried Input.GetKeyDown(KeyCode.MouseX) (where X is the mouse button I want to handle). This works fine with mouse buttons 0, 1 and 2, but does not work with my extra buttons either. I have a Logitech mouse with 2 extra buttons and they work fine in all games (like LoL, Rainbow, Minecraft...), so I don't know why Unity cannot handle them.
Thanks for any answers.
This is currently a bug; I already submitted a bug report about it, and you can vote for it here:
https://issuetracker.unity3d.com/issues/event-dot-button-only-supports-right-left-and-middle-mouse-buttons
However, Input.GetMouseButton(x) will still read your extra mouse buttons correctly, where x is an integer.
E.g. Input.GetMouseButton(12) will read your 12th mouse button on a gaming mouse.
You can also handle press and release events:
Returns true during the frame the user pressed the given mouse button.
Input.GetMouseButtonDown(int button);
Returns whether the given mouse button is held down or not.
Input.GetMouseButton(int button);
Returns true during the frame the user releases the given mouse button.
Input.GetMouseButtonUp(int button);
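For a key-binder, you could probe all of these indices each frame to discover which button was pressed; a minimal sketch (the 7-button bound is my assumption, mirroring Unity's KeyCode.Mouse0 through KeyCode.Mouse6 range):

using UnityEngine;

public class MouseButtonBinder : MonoBehaviour
{
    // Unity defines KeyCode.Mouse0 .. KeyCode.Mouse6, so probe 7 buttons.
    const int MaxMouseButtons = 7;

    void Update()
    {
        for (int button = 0; button < MaxMouseButtons; button++)
        {
            if (Input.GetMouseButtonDown(button))
                Debug.Log("Mouse button " + button + " pressed this frame");
            if (Input.GetMouseButtonUp(button))
                Debug.Log("Mouse button " + button + " released this frame");
        }
    }
}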
Update from Unity:
Thanks again for reporting this issue. It has been reviewed by our developers and unfortunately, it has been decided that it will not be fixed. As of now, IMGUI doesn't support more than 3 buttons, and since IMGUI is being replaced with UIElements this ability is not going to be added.
I have a UWP app in which I have several buttons. Once the app starts running, I set the focus in my code to the first button using code like this:
firstButton.Focus(FocusState.Programmatic);
After this point, what I am interested in is that once the user uses the mouse wheel, the UWP app automatically moves the focus to the second, third, fourth, ... button (exactly like when we use the Tab key on the keyboard to move between buttons).
However, when I use the mouse wheel, nothing happens in the app.
I should also say that in the first button's XAML, I use a PointerWheelChanged event listener to change the focus to the second button. However, this event handler does not fire with the mouse wheel until I move the mouse cursor inside the area of the first button. What I want is for this mouse-wheel focus movement to work automatically, exactly like the Tab key on the keyboard.
Any suggestions?
Place the event handler on the container control (like the Grid). If it's a control that already processes the event, use the AddHandler method with handledEventsToo set to true.
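A minimal sketch of that approach, assuming the page's top-level Grid is named RootGrid (the handler name and the Next/Previous mapping are my choices):

using Windows.UI.Xaml;
using Windows.UI.Xaml.Input;

// In the page's constructor, after InitializeComponent():
RootGrid.AddHandler(
    UIElement.PointerWheelChangedEvent,
    new PointerEventHandler(RootGrid_PointerWheelChanged),
    handledEventsToo: true);

private void RootGrid_PointerWheelChanged(object sender, PointerRoutedEventArgs e)
{
    int delta = e.GetCurrentPoint(RootGrid).Properties.MouseWheelDelta;

    // Wheel down moves focus forward, wheel up moves it back,
    // mimicking Tab / Shift+Tab navigation between the buttons.
    FocusManager.TryMoveFocus(delta < 0
        ? FocusNavigationDirection.Next
        : FocusNavigationDirection.Previous);
}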
I have decided to use a ListView with a SwipeControl in each item.
First of all, I opened the AppUIBasics example provided by Microsoft, and I found that I cannot slide any element which supports "sliding". I wonder whether a mouse press-and-move is recognized as a swipe, because the "SlidableListItem_RightCommandRequested" event never fires.
Can this "swipe" movement be simulated with a mouse, or does it work only with touch displays? What am I doing wrong, or did I miss something?
It is not supported; here is an answer:
https://github.com/FrayxRulez/SwipeListView/issues/1
There is an alternative for desktop:
https://github.com/brookshi/LLMListView
I have a "borderless" window in WPF. It can be dragged from any part of the window that does not handle the Click event, using this code:
// Drag the window:
private void GlassWindow_MouseDown(object sender, MouseButtonEventArgs e)
{
if (e.ChangedButton != MouseButton.Left) return;
if (AllowDrag) DragMove();
}
(Note: AllowDrag is always set to true in this case)
This code works fine, except that when I click on the window and hold down the left mouse button without moving the cursor (= not dragging the window), the window freezes for about 2-3 seconds (i.e. all animations pause, the progress bar stops moving). This behaviour is consistent, and does not happen when I click on a button or when I drag the window, only when I hold the left click.
Is there any solution for this, or is this intended Windows behaviour?
EDIT: Things that don't solve the problem:
https://stackoverflow.com/a/3275712/2719183
https://stackoverflow.com/a/5494769/2719183
http://www.codeproject.com/Articles/11114
if (AllowDrag) DragMove();
DragMove() is the trouble-maker; it uses a pretty hacky way to implement the move, and that causes the problem you describe. The WPF team is well aware of the issue but chose not to fix it. You can read about it in this Connect article. Vote if you are not pleased.
So you need to avoid DragMove(). The best way is to do it the way it is normally done; you minimize the risk of reproducing the exact same trouble that way. That requires knowing a little about how the winapi works. Whenever a window is clicked, Windows sends the WM_NCHITTEST message to ask your app what part of the window was clicked. When you return HTCAPTION, even if you don't have a caption, Windows takes your word for it and implements what normally happens when you click and drag a window by its caption.
That has been done before; you don't have to be an expert in the winapi to get it going. Google "wpf wm_nchittest" to find code. The top hit is an existing SO question, and Tergiver's code looks good.
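For illustration, a minimal sketch of such a hook, reusing the question's GlassWindow name; a real implementation would first hit-test and return HTCAPTION only over the draggable background, so that buttons keep receiving clicks:

using System;
using System.Windows;
using System.Windows.Interop;

public partial class GlassWindow : Window
{
    const int WM_NCHITTEST = 0x0084;
    const int HTCAPTION = 2;

    protected override void OnSourceInitialized(EventArgs e)
    {
        base.OnSourceInitialized(e);
        var source = (HwndSource)PresentationSource.FromVisual(this);
        source.AddHook(WndProc);
    }

    private IntPtr WndProc(IntPtr hwnd, int msg, IntPtr wParam, IntPtr lParam, ref bool handled)
    {
        if (msg == WM_NCHITTEST)
        {
            // Tell Windows the click landed on the caption, so it
            // performs the native window drag itself, without DragMove().
            handled = true;
            return new IntPtr(HTCAPTION);
        }
        return IntPtr.Zero;
    }
}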
I have a problem with a WPF application/presentation for a Tablet PC with a (multi-)touch screen. One "slide" of the presentation consists of a Canvas in the background and a small UserControl. This UserControl is invisible at the start, but whenever the user touches the screen, it becomes visible, and if the user moves his finger, the control moves accordingly ("following" the finger, like a cursor). Then, when the user stops touching the screen, the control becomes invisible again.
This is not very hard to do using the TouchDown, TouchUp and TouchMove event handlers, and it works fine if the user touches the screen with just one finger. However, when the user holds one finger on position X (e.g. canvas coordinates [100, 100]) and another finger on position Y (e.g. [500, 100]), the UserControl starts jumping between positions X and Y, which doesn't look very good...
Now I'd like the screen to react to only one touch at a time, which I can do in the operating system (Windows 7) via Control Panel -> Pen and Touch -> Touch by unchecking "Enable multi-touch gestures and inking".
This works fine, exactly as I want it to; unfortunately it's not very convenient, because sometimes I need to use multi-touch, and I can't change the setting every time I decide to use the application...
That's why I'd like to ask if there is any way to disable multi-touch programmatically, in the application (or just in the WPF UserControl) where I need it. Thanks a lot in advance for any help.
Take a look at the TouchDevice.Id property. You can get this from the event arguments passed to the touch events and it will allow you to uniquely identify the touch events so that you don't confuse the first touch with subsequent parallel touches.
You can convert the C++ code from the following question to C#: Programatically enable / disable multitouch finger input?
The CodeProject article, "Single App Instance in C#: Yet Another Way" has a C# implementation of the Win32 GlobalAddAtom() function and PInvoke.net has a reference page for SetProp().
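A sketch of the TouchDevice.Id approach for the cursor-following control described in the question (canvas and cursorControl are assumed element names):

private int? activeTouchId;  // Id of the finger we are currently following

private void canvas_TouchDown(object sender, TouchEventArgs e)
{
    if (activeTouchId.HasValue) return;   // ignore any additional fingers
    activeTouchId = e.TouchDevice.Id;
    cursorControl.Visibility = Visibility.Visible;
    MoveCursorTo(e.GetTouchPoint(canvas).Position);
}

private void canvas_TouchMove(object sender, TouchEventArgs e)
{
    if (e.TouchDevice.Id != activeTouchId) return;  // not the tracked finger
    MoveCursorTo(e.GetTouchPoint(canvas).Position);
}

private void canvas_TouchUp(object sender, TouchEventArgs e)
{
    if (e.TouchDevice.Id != activeTouchId) return;
    activeTouchId = null;
    cursorControl.Visibility = Visibility.Hidden;
}

private void MoveCursorTo(System.Windows.Point p)
{
    Canvas.SetLeft(cursorControl, p.X);
    Canvas.SetTop(cursorControl, p.Y);
}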
You can use a boolean, e.g. IsCurrentTouched: set it to true when the user touches and the control is shown, and set it back to false when the touch is released.
While IsCurrentTouched is true, don't react to other touches. That way you can use multi-touch only where needed ;-)
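A short sketch of that flag, assuming the same canvas handlers as in the previous answer; note that pairing it with a TouchDevice.Id check on TouchUp is my addition, so that a second finger lifting doesn't clear the flag while the first finger is still down:

private bool isCurrentTouched;
private int trackedTouchId;

private void canvas_TouchDown(object sender, TouchEventArgs e)
{
    if (isCurrentTouched) return;        // already following a finger
    isCurrentTouched = true;
    trackedTouchId = e.TouchDevice.Id;   // remember which finger we follow
    // show and position the UserControl here
}

private void canvas_TouchUp(object sender, TouchEventArgs e)
{
    if (e.TouchDevice.Id != trackedTouchId) return;  // only the tracked finger releases
    isCurrentTouched = false;
    // hide the UserControl here
}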