Events do not react to extra mouse buttons - C#

I am currently developing a UI for my Unity game. As part of that I have started creating a key-binding menu where one can reassign the keys. This works fine, but I ran into trouble when I tried to implement it for the mouse buttons.
I tried using Event.current.button so I can handle mouse input, but first of all it returns 0 the whole time unless I press something other than mouse button 0, and secondly it does not react to my extra mouse buttons at all.
Then I tried Input.GetKeyDown(KeyCode.MouseX) (where X is the mouse button I want to handle). This works fine with mouse buttons 0, 1 and 2, but does not work with my extra buttons either. I have a Logitech mouse with 2 extra buttons and they work fine in other games (LoL, Rainbow Six, Minecraft, ...), so I don't know why Unity cannot handle them.
Thanks for any answers I may get.

This is currently a bug; I already submitted a bug report about it. You can vote for it here:
https://issuetracker.unity3d.com/issues/event-dot-button-only-supports-right-left-and-middle-mouse-buttons
However, Input.GetMouseButton(x) will still read your extra mouse buttons correctly, where x is a zero-based integer index (0 = left, 1 = right, 2 = middle).
E.g. Input.GetMouseButton(12) will read mouse button index 12 on a gaming mouse.
You can also handle press and release events:
Input.GetMouseButtonDown(int button); returns true during the frame the user pressed the given mouse button.
Input.GetMouseButton(int button); returns true while the given mouse button is held down.
Input.GetMouseButtonUp(int button); returns true during the frame the user releases the given mouse button.
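A minimal sketch of the three polling calls above in a Unity script. The index 4 for the side button is an assumption for illustration; extra buttons commonly report as index 3 and up, but the mapping depends on the mouse:

```csharp
using UnityEngine;

// Sketch: polls one extra mouse button each frame using the three
// Input calls described above. SideButton = 4 is an assumption.
public class ExtraButtonDemo : MonoBehaviour
{
    const int SideButton = 4;

    void Update()
    {
        if (Input.GetMouseButtonDown(SideButton))
            Debug.Log("Side button pressed this frame");
        if (Input.GetMouseButton(SideButton))
            Debug.Log("Side button is being held");
        if (Input.GetMouseButtonUp(SideButton))
            Debug.Log("Side button released this frame");
    }
}
```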
Update from Unity:
Thanks again for reporting this issue. It has been reviewed by our developers and unfortunately, it has been decided that it will not be fixed. As of now, IMGUI doesn't support more than 3 buttons, and since IMGUI is being replaced with UIElements this ability is not going to be added.

Related

Change pointer cursor when hovering over map elements

My UWP application contains a map with several POI. I am trying to change the mouse cursor from an arrow to a hand when hovering over a specific POI to indicate that it is clickable.
As a simple test (even though this changes the cursor as soon as it enters the map), I added a PointerEntered event handler for the MapControl, in which I have the following to change the cursor:
Window.Current.CoreWindow.PointerCursor = new Windows.UI.Core.CoreCursor(Windows.UI.Core.CoreCursorType.Hand, 0);
It appears the cursor does change, but it immediately gets overridden back to the arrow cursor.
Edit: Just realised that when a POI is clicked (i.e. selected), the cursor changes to a hand even when not over the map control, until the POI is unselected. That's no good, as I would like the cursor to change dynamically when hovering over a POI and revert when moved away.
I'm afraid you can't change the default cursor for map elements, because pointer input is handled internally by the control: even if you listen for the PointerEntered event, it is consumed by the control and not passed up the control chain. If you really want this feature, the best way is to request it through the Windows Feedback Hub app.
I don't know if it works the same as in WinForms, but I had to do something similar to make labels clickable (I couldn't use LinkLabels). What I used was the MouseMove event of the label, basically:
if (Cursor.Current == Cursors.Default)
{
Cursor.Current = Cursors.Hand;
}
and similar changes depending on the various conditions. This did give me one small issue: the statement changes the mouse graphic every time you move over the control. I personally use the pointer-trail option in Windows settings (leaving a trail of pointers as the mouse moves), and this code effectively disables, or rather conceals, that feature, since it recreates the cursor graphic on every move over the control, discarding the trail and redrawing the cursor as a hand (in my case). If that doesn't concern you, it works just fine.
I repeat: I use this in WinForms, but since it's C# I suppose it will just work(?)
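For what it's worth, a simpler WinForms route that sidesteps the pointer-trail issue is to set the control's Cursor property once instead of reassigning Cursor.Current on every MouseMove; WinForms then swaps the cursor in and out for you. A minimal sketch (the form and label names are made up for illustration):

```csharp
using System;
using System.Windows.Forms;

// Sketch: assign Cursors.Hand to the control's Cursor property once,
// so WinForms shows the hand while hovering and restores the default
// cursor on leave, with no per-MouseMove reassignment.
class HoverCursorForm : Form
{
    public HoverCursorForm()
    {
        var label = new Label { Text = "Click me", AutoSize = true };
        label.Cursor = Cursors.Hand;   // managed by WinForms on hover
        label.Click += (s, e) => MessageBox.Show("Label clicked");
        Controls.Add(label);
    }

    [STAThread]
    static void Main() => Application.Run(new HoverCursorForm());
}
```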

Will this code read mouse button X on many button mouses?

I currently don't have a multi-button mouse at home, and I found nothing about this online.
Will Input.GetMouseButton(x) read my mouse's xth button, for example?
Will this return true if I hold down my mouse's 4th button?
bool holding = Input.GetMouseButton(4);
The Unity docs only mention mouse buttons 0, 1 and 2 :/
Actually, yes: Input.GetMouseButton(4) will return true while that extra mouse button is held down :) (note that the indices are zero-based, so index 4 is the fifth physical button on a mouse that has one).
(I tested this about half a year ago. The Unity doc team should indeed have mentioned that it can read any mouse button, so we wouldn't all need to test it :P)
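A quick way to confirm this yourself: drop a probe script like the sketch below on any GameObject, press your extra buttons in Play mode, and the console logs whatever index Unity reports. The 0-6 range here is an assumption mirroring the KeyCode.Mouse0 through KeyCode.Mouse6 values Unity defines:

```csharp
using UnityEngine;

// Sketch: logs the index of every mouse button press, so you can
// discover which index your mouse's extra buttons map to.
public class MouseButtonProbe : MonoBehaviour
{
    void Update()
    {
        for (int i = 0; i <= 6; i++)   // mirrors KeyCode.Mouse0..Mouse6
        {
            if (Input.GetMouseButtonDown(i))
                Debug.Log($"Pressed mouse button index {i}");
        }
    }
}
```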

Fast detect button pressed and released on Touch screen

In my C# Windows Forms code I'd like to detect when a button is pressed (and perform an action) and when it is released (and perform another action).
I know about the MouseDown and MouseUp events, and up to Windows XP everything was fine.
The problem comes now with Windows 7 and a capacitive touchscreen, since Microsoft introduced gestures and the "press and hold" function: the MouseDown event is received several seconds after the user touches the screen. (N.B. with a mouse everything works fine.)
How can I avoid this annoying delay before receiving the MouseDown event?
I already tried GlobalAddAtom("MicrosoftTabletPenServiceProperty") and saw a small change: I no longer receive RightButton, I receive LeftButton instead, but always after the same delay.
I also tried the MouseHover event with if (MouseButtons == MouseButtons.Left), but without success (it works with the mouse only, not with touch).
N.B. I need to keep the gesture and press-and-hold features active for the other controls on the form.
I think I found a way on my Windows 7 touch device:
I have to disable press-and-hold and also register the button as a touch window; then I get the MouseDown event almost immediately for that button, while preserving gestures for all the other controls.
TogglePressAndHold(btnMoveUp.Handle, false);
RegisterTouchWindow(btnMoveUp.Handle, 0);
For the first function refer to this MSDN article: https://msdn.microsoft.com/en-us/library/ms812373.aspx
RegisterTouchWindow is a User32.dll function.
Hope this is of some help to others. Let me know if it also works on other platforms.
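To make this self-contained, here is a sketch of the interop behind those two calls. TogglePressAndHold is a reconstruction of the helper from the linked MSDN article (the atom/SetProp sequence and the TABLET_DISABLE_PRESSANDHOLD flag value 0x1 come from that article and should be verified against it):

```csharp
using System;
using System.Runtime.InteropServices;

// Sketch of the Win32 interop used by the answer. TogglePressAndHold is
// reconstructed from the MSDN article's C code; verify the flag value
// against the article before relying on it.
static class TouchInterop
{
    const string TabletProp = "MicrosoftTabletPenServiceProperty";
    const int TABLET_DISABLE_PRESSANDHOLD = 0x00000001;

    [DllImport("kernel32.dll")] static extern ushort GlobalAddAtom(string name);
    [DllImport("kernel32.dll")] static extern ushort GlobalDeleteAtom(ushort atom);
    [DllImport("user32.dll")] static extern bool SetProp(IntPtr hWnd, string name, IntPtr data);
    [DllImport("user32.dll")] static extern IntPtr RemoveProp(IntPtr hWnd, string name);
    [DllImport("user32.dll")] public static extern bool RegisterTouchWindow(IntPtr hWnd, uint flags);

    public static void TogglePressAndHold(IntPtr hWnd, bool enable)
    {
        if (enable)
        {
            RemoveProp(hWnd, TabletProp);   // restore default gestures
        }
        else
        {
            // Register the property name as an atom (mirrors the
            // article), then attach the "disable press-and-hold" flag
            // as a window property that the tablet service reads.
            ushort atom = GlobalAddAtom(TabletProp);
            SetProp(hWnd, TabletProp, new IntPtr(TABLET_DISABLE_PRESSANDHOLD));
            GlobalDeleteAtom(atom);
        }
    }
}
```

With this in place, the answer's two lines become TouchInterop.TogglePressAndHold(btnMoveUp.Handle, false); and TouchInterop.RegisterTouchWindow(btnMoveUp.Handle, 0);.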
Have you tried using the Control.Click event?

Window.DragMove() causes the window to freeze temporarily if holding down the mouse left button without moving the cursor

I have a "borderless" window in WPF. It can be dragged from any part of the window that does not handle the Click event, using this code:
// Drag the window:
private void GlassWindow_MouseDown(object sender, MouseButtonEventArgs e)
{
if (e.ChangedButton != MouseButton.Left) return;
if (AllowDrag) DragMove();
}
(Note: AllowDrag is always set to true in this case)
This code works fine, except that when I click on the window and hold down the left mouse button without moving the cursor (i.e. not dragging the window), the window freezes for about 2-3 seconds (all animations pause, the progress bar stops moving). This behaviour is consistent, and it does not happen when I click on a button or when I drag the window, only when I hold left-click.
Is there any solution for this or is this intended windows behavior?
EDIT: Things that don't solve the problem:
https://stackoverflow.com/a/3275712/2719183
https://stackoverflow.com/a/5494769/2719183
http://www.codeproject.com/Articles/11114
if (AllowDrag) DragMove();
DragMove() is the trouble-maker: it uses a pretty hacky way to implement the move, and that causes the problem you describe. The WPF team is well aware of the issue but chose not to fix it. You can read about it in this Connect article. Vote if you are not pleased.
So you need to avoid DragMove(). The best approach is to do it the way it is normally done; that minimizes the risk of reproducing the exact same trouble. It requires knowing a little about how the winapi works. Whenever a window is clicked, Windows sends the WM_NCHITTEST message to ask your app what part of the window was clicked. When you return HTCAPTION, even if you don't have a caption, Windows takes your word for it and implements what normally happens when you click and drag a window by its caption.
That has been done before; you don't have to be a winapi expert to get it going. Google "wpf wm_nchittest" to find code. The top hit is an existing SO question; Tergiver's code looks good.
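For reference, a minimal sketch of that approach. This simple version reports the entire client area as the caption, so every click-drag moves the window natively with no DragMove(); a real implementation would run the default hit test first and only promote plain client-area hits, so that buttons keep working (the window name matches the question's GlassWindow, otherwise an assumption):

```csharp
using System;
using System.Windows;
using System.Windows.Interop;

// Sketch: answer WM_NCHITTEST with HTCAPTION so Windows itself performs
// the drag, avoiding DragMove() and its input-loop freeze.
public partial class GlassWindow : Window
{
    const int WM_NCHITTEST = 0x0084;
    const int HTCAPTION = 2;

    protected override void OnSourceInitialized(EventArgs e)
    {
        base.OnSourceInitialized(e);
        var source = (HwndSource)PresentationSource.FromVisual(this);
        source.AddHook(WndProcHook);
    }

    static IntPtr WndProcHook(IntPtr hwnd, int msg, IntPtr wParam,
                              IntPtr lParam, ref bool handled)
    {
        if (msg == WM_NCHITTEST)
        {
            handled = true;
            return new IntPtr(HTCAPTION);   // "you clicked the caption"
        }
        return IntPtr.Zero;
    }
}
```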

WPF - Is it possible to (programmatically) disable multi-touch?

I have a problem with a WPF application/presentation for a Tablet PC with a (multi-)touch screen. One "slide" of the presentation consists of a Canvas in the background and a small UserControl. The UserControl is invisible at start, but whenever the user touches the screen it becomes visible, and when the user moves a finger the control moves accordingly ("following" the finger, like a cursor). When the user stops touching the screen, the control becomes invisible again.
This is not very hard to do using the TouchDown, TouchUp and TouchMove event handlers, and it works fine if the user touches the screen with just one finger. However, when the user holds one finger at position X (e.g. canvas coordinates [100, 100]) and another finger at position Y (e.g. [500, 100]), the UserControl starts jumping between positions X and Y, which doesn't look very good...
Now I'd like the screen to react to only one touch at a time, which I can do in the operating system (Windows 7) under Control Panel -> Pen and Touch -> Touch by unchecking "Enable multi-touch gestures and inking".
This works fine, exactly as I want, but unfortunately it's not very convenient, because sometimes I do need multi-touch and I can't change the setting every time I decide to use the application...
That's why I'd like to ask if there is any way to disable multi-touch programmatically, in the application (or just in the WPF UserControl) where I need it. Thanks a lot in advance for any help.
Take a look at the TouchDevice.Id property. You can get this from the event arguments passed to the touch events and it will allow you to uniquely identify the touch events so that you don't confuse the first touch with subsequent parallel touches.
You can convert the C++ code from the following question to C#: Programatically enable / disable multitouch finger input?
The CodeProject article, "Single App Instance in C#: Yet Another Way" has a C# implementation of the Win32 GlobalAddAtom() function and PInvoke.net has a reference page for SetProp().
You can use a boolean, e.g. IsCurrentTouched: set it to true when the user touches and the control is shown, and set it back to false when the touch is released.
While IsCurrentTouched is true, don't react to other touches. That way you use multi-touch only when needed ;-)
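The two suggestions combine naturally: instead of a plain boolean, remember the TouchDevice.Id of the first touch and ignore events from any other device until that finger lifts. A sketch under that assumption (the class and the "move the cursor control" step are illustrative placeholders):

```csharp
using System.Windows.Controls;
using System.Windows.Input;

// Sketch: track only the first TouchDevice.Id and ignore parallel
// touches, so the follower control never jumps between fingers.
public class SingleTouchCanvas : Canvas
{
    int? activeTouchId;   // null means no finger is being tracked

    protected override void OnTouchDown(TouchEventArgs e)
    {
        if (activeTouchId == null)
            activeTouchId = e.TouchDevice.Id;   // claim the first touch
    }

    protected override void OnTouchMove(TouchEventArgs e)
    {
        if (e.TouchDevice.Id != activeTouchId)
            return;                             // ignore other fingers
        TouchPoint p = e.GetTouchPoint(this);
        // ...move the follower UserControl to p.Position here...
    }

    protected override void OnTouchUp(TouchEventArgs e)
    {
        if (e.TouchDevice.Id == activeTouchId)
            activeTouchId = null;               // release the claim
    }
}
```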
