I have decided to use a ListView with a SwipeControl on each item. First of all I opened the AppUIBasics sample provided by Microsoft, and I found that I cannot slide any element that supports "sliding". I wonder whether a mouse press-and-move is recognized as a swipe, because the SlidableListItem_RightCommandRequested event never fires.
Can this "swipe" gesture be simulated with a mouse, or does it only work on touch displays? What am I doing wrong, or did I miss something?
It is not supported; here is an answer:
https://github.com/FrayxRulez/SwipeListView/issues/1
There is an alternative for desktop:
https://github.com/brookshi/LLMListView
I am currently developing a UI for my Unity game. As part of that I started creating a key-binder menu where one can reassign the keys. This works fine, until I tried to implement it for the mouse buttons.
I tried using Event.current.button so I can handle mouse input, but first of all it returns 0 all the time when I am not pressing anything other than mouse button 0, and secondly it does not react to my extra mouse buttons.
Then I tried Input.GetKeyDown(KeyCode.MouseX) (X being the mouse button I want to handle). This works fine with mouse buttons 0, 1 and 2, but does not work with my extra buttons either. I have a Logitech mouse with 2 extra buttons and they work fine in all games (like LoL, Rainbow, Minecraft, ...), so I don't know why Unity cannot handle them.
Thanks for any answers I may get.
This is currently a bug. I already submitted a bug report about it; you can vote for it here:
https://issuetracker.unity3d.com/issues/event-dot-button-only-supports-right-left-and-middle-mouse-buttons
However, Input.GetMouseButton(x); will still read your extra mouse buttons correctly, where x is an integer.
E.g. Input.GetMouseButton(12); will read your 12th mouse button on a gaming mouse.
You can also handle press and release events:
Input.GetMouseButtonDown(int button); - returns true during the frame the user pressed the given mouse button.
Input.GetMouseButton(int button); - returns whether the given mouse button is held down or not.
Input.GetMouseButtonUp(int button); - returns true during the frame the user releases the given mouse button.
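As a minimal sketch of polling all of these in a script (MouseButtonProbe is just an illustrative name, and the loop assumes the 0-6 range that KeyCode.Mouse0 through KeyCode.Mouse6 cover):

using UnityEngine;

public class MouseButtonProbe : MonoBehaviour
{
    void Update()
    {
        // Poll each mouse button index and log the transitions.
        for (int button = 0; button <= 6; button++)
        {
            if (Input.GetMouseButtonDown(button))
                Debug.Log("Pressed mouse button " + button);
            if (Input.GetMouseButtonUp(button))
                Debug.Log("Released mouse button " + button);
        }
    }
}

Attach the script to any GameObject and watch the console while clicking the extra buttons to see which indices your mouse actually reports.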
Update from Unity:
Thanks again for reporting this issue. It has been reviewed by our developers and unfortunately, it has been decided that it will not be fixed. As of now, IMGUI doesn't support more than 3 buttons, and since IMGUI is being replaced with UIElements this ability is not going to be added.
In my C# Windows Forms code I'd like to detect when a button is pressed (and perform an action) and when a button is released (and perform another action).
I know of the existence of the MouseDown and MouseUp events, and up to Windows XP everything was fine.
The problem comes now with Windows 7 and a capacitive touchscreen, since Microsoft introduced gestures and the "PressAndHold" feature: the MouseDown event is received several seconds after the user touches the screen. (N.B. using a mouse, everything works fine.)
How can I avoid this annoying delay before receiving the MouseDown event?
I already tried with GlobalAddAtom("MicrosoftTabletPenServiceProperty") and I saw a small change: I no longer receive RightButton, I receive LeftButton instead, but always after the same amount of time.
I also tried the MouseHover event with if (MouseButtons == MouseButtons.Left), but without success (it works with the mouse only, not with touch).
N.B. I need to keep the gesture and press-and-hold features active for the other controls on the form.
I think I found a way on my Windows 7 touch machine:
I have to disable PressAndHold and also register the button as a touch window; then I get the MouseDown event almost immediately for that button, while preserving gestures for all other controls.
TogglePressAndHold(btnMoveUp.Handle, false);
RegisterTouchWindow(btnMoveUp.Handle, 0);
For the first function, refer to this MSDN article: https://msdn.microsoft.com/en-us/library/ms812373.aspx
RegisterTouchWindow is a User32.dll function.
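In case it helps, here is a rough P/Invoke sketch of the two calls. The TogglePressAndHold body is my reconstruction of the MSDN article's approach (setting the MicrosoftTabletPenServiceProperty window property with the press-and-hold disable flag), not the article's exact code:

using System;
using System.Runtime.InteropServices;

static class TouchNative
{
    // Flag from the MSDN article: disables the press-and-hold (right-click) gesture.
    const int TABLET_DISABLE_PRESSANDHOLD = 0x00000001;

    [DllImport("kernel32.dll", CharSet = CharSet.Unicode)]
    static extern ushort GlobalAddAtom(string lpString);

    [DllImport("user32.dll", CharSet = CharSet.Unicode)]
    static extern bool SetProp(IntPtr hWnd, string lpString, IntPtr hData);

    [DllImport("user32.dll", CharSet = CharSet.Unicode)]
    static extern IntPtr RemoveProp(IntPtr hWnd, string lpString);

    [DllImport("user32.dll")]
    public static extern bool RegisterTouchWindow(IntPtr hWnd, uint ulFlags);

    public static void TogglePressAndHold(IntPtr hWnd, bool enable)
    {
        const string propName = "MicrosoftTabletPenServiceProperty";
        GlobalAddAtom(propName); // make sure the atom exists
        if (enable)
            RemoveProp(hWnd, propName);
        else
            SetProp(hWnd, propName, new IntPtr(TABLET_DISABLE_PRESSANDHOLD));
    }
}

With that in place, the two lines above become TouchNative.TogglePressAndHold(btnMoveUp.Handle, false); and TouchNative.RegisterTouchWindow(btnMoveUp.Handle, 0);.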
I hope this is of some help to others. Let me know if it also works on other platforms.
Have you tried using the Control.Click event?
I'm wondering if there is any solution that could make the app automatically adjust the controls' positions when the keyboard is activated. For example, in the image below, I want those four buttons on the screen to move together with the keyboard: when the keyboard is activated, the buttons move to the center, and they move back when the keyboard is gone.
There might be some similar questions here, but I couldn't see them in the search results; maybe they use different words in the title, so if this question is a duplicate I'd appreciate it if you could paste the link in a comment or wherever.
The idea is that you can listen to the Showing and Hiding events of the InputPane. In the event handlers, you adjust your UI layout with respect to the keyboard.
For example, you can realign the buttons relative to the height of the keyboard.
For more information on InputPane, refer to https://msdn.microsoft.com/EN-US/library/windows/apps/windows.ui.viewmanagement.inputpane.aspx
For an example of dynamically aligning the UI layout, refer to https://code.msdn.microsoft.com/windowsapps/Keyboard-Events-Sample-866ba41c
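A minimal sketch of the idea (ButtonPanel is a placeholder for whatever element holds your buttons):

using Windows.UI.ViewManagement;
using Windows.UI.Xaml;

// e.g. in the page constructor
var inputPane = InputPane.GetForCurrentView();
inputPane.Showing += (s, e) =>
{
    // e.OccludedRect is the area the keyboard covers;
    // push the panel up by that height.
    ButtonPanel.Margin = new Thickness(0, 0, 0, e.OccludedRect.Height);
};
inputPane.Hiding += (s, e) =>
{
    ButtonPanel.Margin = new Thickness(0);
};

You could animate the margin change instead of setting it directly if you want the buttons to follow the keyboard smoothly.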
I have declared in my XAML the following element:
<ListView Name='mPlaylist' AllowDrop='True' DragEnter='HandlePlaylist_DragEnter' Drop='HandlePlaylist_Drop' />
When I drag a file from Windows Explorer or the desktop onto my ListView using the mouse, the DragEnter handler is executed and I can set the AcceptedOperation member of the DragEventArgs to Link (and Handled to true). So far, so good.
When I release the left mouse button, however, the Drop event handler never fires. I can't figure out why not.
When I drag a file over the ListView, the cursor changes to a stop/invalid cursor, and the file thumbnail is overlaid with a red cross (despite my setting the AcceptedOperation to match one of the RequestedOperations - Link).
Although likely unrelated, the thumbnail also jumps up and to the left, a fair distance from the cursor. While moving the cursor over the ListView, the thumbnail does not maintain a constant distance from the cursor; I haven't been able to figure that out either.
I'm working on the PC, and using version 10.0.10069.0 of the Universal app platform in VS2015 RC.
Any ideas on what I can do to cause the Drop event to fire on my ListView (and perhaps even fix the visual glitch of the thumbnail 'separating' itself from the cursor when over the ListView) would be very much appreciated.
The solution is to use the DragOver event, rather than DragEnter, to set the AcceptedOperation member of the DragEventArgs.
Thanks to Igor Ralic for posting the solution on his blog.
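A minimal sketch of the fix, reusing the handler naming from the question (the Drop body is just an illustration of reading the dropped files):

<ListView Name='mPlaylist' AllowDrop='True' DragOver='HandlePlaylist_DragOver' Drop='HandlePlaylist_Drop' />

using Windows.ApplicationModel.DataTransfer;
using Windows.UI.Xaml;

private void HandlePlaylist_DragOver(object sender, DragEventArgs e)
{
    // Setting AcceptedOperation here, in DragOver, is what allows Drop to fire.
    e.AcceptedOperation = DataPackageOperation.Link;
}

private async void HandlePlaylist_Drop(object sender, DragEventArgs e)
{
    if (e.DataView.Contains(StandardDataFormats.StorageItems))
    {
        var items = await e.DataView.GetStorageItemsAsync();
        // ... add the dropped files to the playlist
    }
}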
I have a problem with a WPF application/presentation for a Tablet PC with a (multi-)touch screen. One "slide" of the presentation consists of a Canvas in the background and a small UserControl. This UserControl is invisible at the start, but whenever the user touches the screen it becomes visible, and as the user moves his finger the control moves accordingly ("following" the finger, like a cursor). Then, when the user stops touching the screen, the control becomes invisible again.
This is not very hard to do using the TouchDown, TouchUp and TouchMove event handlers, and it works fine if the user touches the screen with just one finger. However, when the user holds one finger on position X (e.g. canvas coordinates [100, 100]) and another finger on position Y (e.g. [500, 100]), the UserControl starts jumping between positions X and Y, which doesn't look very good...
Now I'd like the screen to react to only one touch at a time, which I can do in the operating system (Windows 7) via Control Panel -> Pen and Touch -> Touch by unchecking "Enable multi-touch gestures and inking".
This works fine, exactly as I want it to; unfortunately it's not very convenient, because sometimes I need multi-touch, and I can't change the setting every time I decide to use the application...
That's why I'd like to ask if there is any way to disable multi-touch programmatically, in the application (or just in the WPF UserControl) where I need it. Thanks a lot in advance for any help.
Take a look at the TouchDevice.Id property. You can get it from the event arguments passed to the touch events, and it lets you uniquely identify each touch so that you don't confuse the first touch with subsequent parallel touches.
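A minimal sketch of that idea, assuming a Canvas named canvas and the follower control named cursorControl (both placeholder names): remember the Id of the first touch and ignore every other device until it is released.

private int? _activeTouchId;

private void Canvas_TouchDown(object sender, TouchEventArgs e)
{
    if (_activeTouchId != null) return;   // a finger is already down; ignore the rest
    _activeTouchId = e.TouchDevice.Id;    // remember the first touch
    cursorControl.Visibility = Visibility.Visible;
    MoveTo(e.GetTouchPoint(canvas).Position);
}

private void Canvas_TouchMove(object sender, TouchEventArgs e)
{
    if (e.TouchDevice.Id != _activeTouchId) return;
    MoveTo(e.GetTouchPoint(canvas).Position);
}

private void Canvas_TouchUp(object sender, TouchEventArgs e)
{
    if (e.TouchDevice.Id != _activeTouchId) return;
    _activeTouchId = null;
    cursorControl.Visibility = Visibility.Hidden;
}

private void MoveTo(System.Windows.Point p)
{
    Canvas.SetLeft(cursorControl, p.X);
    Canvas.SetTop(cursorControl, p.Y);
}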
You can convert the C++ code from the following question to C#: Programatically enable / disable multitouch finger input?
The CodeProject article, "Single App Instance in C#: Yet Another Way" has a C# implementation of the Win32 GlobalAddAtom() function and PInvoke.net has a reference page for SetProp().
You can use a boolean, e.g. IsCurrentTouched: set it to true when the user touches and the control is shown, and set it back to false when the touch is released.
While IsCurrentTouched is true, don't react to other touches. That way you can use multi-touch only when needed ;-)