I want to detect the mouse hovering over visual elements in .NET MAUI (like buttons). I cannot find any "onmouseover"/"ismouseover" events for this. How can I implement this?
All requests for "desktop"-style control of keyboard and mouse have been consolidated under the Maui issue Desktop platforms: Mouse and Keyboard.
jfversluis' comment (at that issue or a related issue) indicates that no work in this area has yet been planned.
That issue is the place to discuss the topic (express interest in it, or add specific requirements or use cases) until there is a plan.
Until then, the work-around is generally to write Windows-only code in the Windows folder of your Maui project.
Unfortunately for mouse input, that only gives window- or desktop-relative mouse information. AFAIK, there is currently no easy way to interact with individual UI elements as the mouse moves. (Because Maui started as Xamarin, which started as mobile cross-platform code, and touch devices have no concept of a cursor being moved around the screen.)
A crude hack is to use AbsoluteLayout: position UI elements at exact coordinates, so you know where they are, then write (Windows-only) code that does your own "detection".
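A minimal sketch of that hack, assuming a button placed by an AbsoluteLayout at a known rectangle (the coordinates, the App type, and the debug messages are placeholders, and window.Content may need a null check depending on when it is set):

    #if WINDOWS
    using Microsoft.Maui.Hosting;
    using Microsoft.Maui.LifecycleEvents;

    public static class MauiProgram
    {
        public static MauiApp CreateMauiApp()
        {
            var builder = MauiApp.CreateBuilder();
            builder.UseMauiApp<App>();

            builder.ConfigureLifecycleEvents(events =>
                events.AddWindows(windows => windows.OnWindowCreated(window =>
                {
                    // Hardcoded bounds matching the AbsoluteLayout placement (assumption).
                    var buttonRect = new Windows.Foundation.Rect(100, 50, 120, 40);
                    bool wasOver = false;

                    window.Content.PointerMoved += (s, e) =>
                    {
                        var pos = e.GetCurrentPoint(window.Content).Position;
                        bool isOver = buttonRect.Contains(pos);
                        if (isOver == wasOver)
                            return;
                        wasOver = isOver;
                        System.Diagnostics.Debug.WriteLine(
                            isOver ? "Pointer entered button area" : "Pointer left button area");
                    };
                })));

            return builder.Build();
        }
    }
    #endif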
The next level of sophistication would be to write code that traverses a window's "visual tree" of UI elements to detect which control the mouse is over. Maui must have such logic internally somewhere to handle touch events, unless it relies on each platform's UI code to decide that. TBD.
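If you go that route, a rough traversal might look like the following: walk GetVisualChildren(), accumulate each element's Frame offset (Frame is parent-relative), and keep the deepest element that contains the point. This sketch ignores scrolling and transforms, which is part of why the problem is non-trivial:

    using Microsoft.Maui;
    using Microsoft.Maui.Controls;
    using Microsoft.Maui.Graphics;

    static class HitTest
    {
        // Returns the deepest VisualElement whose accumulated bounds contain
        // the window-relative point, or null if none does.
        public static VisualElement? FindElementAt(IVisualTreeElement root, Point p)
            => Find(root, p, 0, 0);

        static VisualElement? Find(IVisualTreeElement node, Point p,
                                   double offsetX, double offsetY)
        {
            VisualElement? hit = null;
            if (node is VisualElement ve)
            {
                var bounds = new Rect(offsetX + ve.Frame.X, offsetY + ve.Frame.Y,
                                      ve.Frame.Width, ve.Frame.Height);
                if (!bounds.Contains(p))
                    return null;
                hit = ve;
                offsetX = bounds.X;
                offsetY = bounds.Y;
            }

            foreach (var child in node.GetVisualChildren())
            {
                var childHit = Find(child, p, offsetX, offsetY);
                if (childHit != null)
                    hit = childHit;   // prefer the deepest match
            }
            return hit;
        }
    }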
For now, Maui makes sense for apps that can live with a "touch" paradigm, rather than a "desktop" paradigm.
This isn't an answer per se, but as a suggestion you could try to determine when the mouse is over the VisualElement and then use StateTriggers to actively apply a VisualState; a rough sketch follows the link below.
MAUI Triggers
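To make that concrete, here is a sketch of a custom trigger you could flip from whatever hover detection you end up with (PointerOverTrigger and IsPointerOver are made-up names):

    using Microsoft.Maui.Controls;

    // Hypothetical trigger: bind or set IsPointerOver from your own
    // (platform-specific) hover detection; SetActive applies the VisualState.
    public class PointerOverTrigger : StateTriggerBase
    {
        public static readonly BindableProperty IsPointerOverProperty =
            BindableProperty.Create(nameof(IsPointerOver), typeof(bool),
                typeof(PointerOverTrigger), false,
                propertyChanged: (bindable, _, newValue) =>
                    ((PointerOverTrigger)bindable).SetActive((bool)newValue));

        public bool IsPointerOver
        {
            get => (bool)GetValue(IsPointerOverProperty);
            set => SetValue(IsPointerOverProperty, value);
        }
    }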
I've been tasked by my workplace to create a program which, when run, moves the cursor over a Windows Toolbar within the Taskbar and left-clicks it.
The rationale for this is that everyone's using the same mapped Toolbars on their desktop, and considering how tiny the button to activate the dropdown is, and how often we're clicking on it, it becomes tedious.
I've made code to move the mouse cursor to a hardcoded location on the taskbar and send the left-click message, and it actually works fantastically.
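Roughly, the move-and-click part looks like this (trimmed to the essentials; standard user32 P/Invokes):

    using System;
    using System.Runtime.InteropServices;

    static class MouseMover
    {
        [DllImport("user32.dll")]
        static extern bool SetCursorPos(int x, int y);

        [DllImport("user32.dll")]
        static extern void mouse_event(uint dwFlags, uint dx, uint dy,
                                       uint dwData, UIntPtr dwExtraInfo);

        const uint MOUSEEVENTF_LEFTDOWN = 0x0002;
        const uint MOUSEEVENTF_LEFTUP   = 0x0004;

        public static void ClickAt(int x, int y)
        {
            SetCursorPos(x, y);                                       // screen coordinates
            mouse_event(MOUSEEVENTF_LEFTDOWN, 0, 0, 0, UIntPtr.Zero); // press
            mouse_event(MOUSEEVENTF_LEFTUP,   0, 0, 0, UIntPtr.Zero); // release
        }
    }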
The reason the above approach won't work, however, is that everyone's Toolbars are constantly changing locations (a variable number of mapped apps on the taskbar, docking/undocking laptops from large monitors).
The only thing that’s constant is the name of the mapped Toolbars. What I really need is to determine the screen coordinates for Toolbar “x”, and then move the mouse cursor to that.
I have honestly researched this for some weeks now, looking into UI Automation (people say the C# flavor is deprecated, but not the C++ one) and several other related posts, but with no success. I've found a C++ solution, but it didn't actually include a code example (it just says "do this, then do that"); plus I'm not as proficient in C++.
Through Spy++ I’m able to see the coordinates, window handle, etc. for these Toolbars, which leads me to believe this must be possible. FindWindow won’t work because the window handle isn’t constant.
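For reference, the kind of managed UI Automation code I've been experimenting with looks roughly like this ("My Toolbar" is a placeholder for the actual Toolbar name):

    // References: UIAutomationClient, UIAutomationTypes.
    using System.Windows.Automation;

    var toolbar = AutomationElement.RootElement.FindFirst(
        TreeScope.Descendants,   // slow from the root; a narrower scope would help
        new PropertyCondition(AutomationElement.NameProperty, "My Toolbar"));

    if (toolbar != null)
    {
        System.Windows.Rect rect = toolbar.Current.BoundingRectangle; // screen coords
        int x = (int)(rect.Left + rect.Width / 2);
        int y = (int)(rect.Top + rect.Height / 2);
        // ...then reuse the existing move-and-click code on (x, y)...
    }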
Any help you can provide on this is greatly appreciated.
I'm currently developing a console app and I've designed my menus so that the active element of the menu is highlighted with a specific background color. To re-render the menu after the active element changes, I move the cursor to the very top of the window with Console.SetCursorPosition(0, 0) and then rewrite all the lines. I also use the Console.Clear() method to clear the console before moving to another menu (which can be of a different size).
However, this approach does not seem to work on macOS: the Console.Clear() method does nothing, and Console.SetCursorPosition(0, 0) does not work the way it should, so the menus overlap each other instead of being cleared.
Is there any way to make this work on macOS, or is it impossible due to some feature of the macOS terminal?
My suggestion would be to use the CursesSharp library to do this.
CursesSharp provides C# bindings to the ncurses.h library for Unix-like systems. In theory, you ought to be able to do anything you want to do that the ncurses library is capable of, including clearing the screen and setting the cursor position.
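If you want to see what is involved under the hood (or avoid the dependency), the handful of ncurses calls needed here are small enough to P/Invoke directly. A minimal sketch, assuming the native library resolves as "ncurses" on the target system:

    using System;
    using System.Runtime.InteropServices;

    static class NCurses
    {
        [DllImport("ncurses")] public static extern IntPtr initscr();
        [DllImport("ncurses")] public static extern int endwin();
        [DllImport("ncurses")] public static extern int clear();
        [DllImport("ncurses")] public static extern int refresh();
        [DllImport("ncurses")] public static extern int mvaddstr(int y, int x, string s);
        [DllImport("ncurses")] public static extern int getch();
    }

    class Program
    {
        static void Main()
        {
            NCurses.initscr();                        // take over the terminal
            NCurses.clear();                          // reliable clear, unlike Console.Clear() here
            NCurses.mvaddstr(0, 0, "> Active item");  // draw at (row, column)
            NCurses.mvaddstr(1, 0, "  Other item");
            NCurses.refresh();                        // flush the drawing to the screen
            NCurses.getch();                          // wait for a key press
            NCurses.endwin();                         // restore the terminal
        }
    }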
I'm writing a hybrid XNA + Silverlight app for WP7 and I'm trying to work out the touch splitting between them. I've got this halfway worked out: I can suppress XNA TouchPanel touches when the user clicks a Silverlight button. However, I have not figured out how to throw out game-only touches for Silverlight. So if you're holding a touch point in the game space (say, for moving the player around), a second touch on a button won't work. I think it's picking it up as a multi-touch gesture and only allowing the first touch point to click buttons.
My question is: how can you suppress this touch point in Silverlight processing?
The simplest way would be to design your app so that you don't use both types of control in an interactive way on the same page.
Or, if you must, disable touch on the Silverlight controls when you detect the first XNA touch.
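A rough sketch of that second idea, polled from the game's update loop (gameButton stands in for whatever Silverlight controls are on the page, and deciding which touches count as game touches is left to your existing logic):

    using Microsoft.Xna.Framework.Input.Touch;

    private void OnUpdate()
    {
        // While any (assumed game-space) touch is active, stop the Silverlight
        // controls from grabbing touch points; restore them when touches end.
        TouchCollection touches = TouchPanel.GetState();
        bool gameTouchActive = touches.Count > 0;

        gameButton.IsHitTestVisible = !gameTouchActive;
    }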
I'm building a Windows 7 touch app and want to be able to handle touch gestures as simple events, specifically flicks. I can see Windows responding (it pops up the icons as I flick), but I can't seem to find an easy way to handle them. Something like grid.OnLeftFlick...
Am I missing something stupidly simple? Is there a toolkit or something I can use? Or, do I have to write my own listener?
Thanks.
While WPF4 does support multi-touch, it does not have any built-in gesture support, at least not of the kind you are thinking about. Features such as flick, pinch-zoom, etc. all have to be programmed on top of WPF4.
I don't know of any higher-level gesture support toolkits, but writing your own is not as hard as it sounds. This WPF4 touch sample shows, for example, touch-based move, pinch-zoom, and pinch-rotate:
Walkthrough: Creating Your First Touch Application
If you compile and run the sample you'll see the handlers are only ten to twenty lines of code and make use of powerful infrastructure in the touch API itself and core WPF features such as transformation matrices. The touch API supports inertia, etc., so a flick handler that meets your needs should be well within reach.
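For instance, a hand-rolled left/right flick detector needs little more than TouchDown/TouchUp and a couple of thresholds (the numbers are arbitrary, and "grid" is assumed to be a named element in the window's XAML):

    using System;
    using System.Windows;
    using System.Windows.Input;

    public partial class MainWindow : Window
    {
        private Point _touchStart;
        private DateTime _touchStartTime;

        public MainWindow()
        {
            InitializeComponent();
            grid.TouchDown += OnTouchDown;
            grid.TouchUp += OnTouchUp;
        }

        private void OnTouchDown(object sender, TouchEventArgs e)
        {
            _touchStart = e.GetTouchPoint(grid).Position;
            _touchStartTime = DateTime.Now;
        }

        private void OnTouchUp(object sender, TouchEventArgs e)
        {
            Vector delta = e.GetTouchPoint(grid).Position - _touchStart;
            TimeSpan elapsed = DateTime.Now - _touchStartTime;

            // A fast, mostly horizontal swipe counts as a flick.
            if (elapsed.TotalMilliseconds < 300 &&
                Math.Abs(delta.X) > 100 &&
                Math.Abs(delta.X) > Math.Abs(delta.Y))
            {
                if (delta.X < 0) OnLeftFlick();
                else OnRightFlick();
            }
        }

        private void OnLeftFlick()  { /* handle left flick */ }
        private void OnRightFlick() { /* handle right flick */ }
    }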
Also, of course, this sample only works with pen and touch input devices, not an ordinary mouse, but you specified that you are developing a touch application.
I just noticed something weird in WM 6.5 emulators. Unlike 6.1, where finger panning kind of worked, the only way to scroll a TextBox appears to be through scrollbars.
This behaviour is in contrast to what they have done for comboboxes: those are now gesture-friendly without the programmer's intervention, i.e. the user can select a choice from a standard drop-down menu by panning and scrolling, whereas previously you had to use the embedded scrollbar. The combobox case implies that MS took measures to provide standard support for classic finger gestures, yet I cannot see anything similar for textboxes. This makes me ask the following:
Is there anything that can be done to make textboxes finger-scrollable easily?
Note that I refer to managed .NET CF development. It is my understanding that in native development I could use the new Gestures API to achieve the scrolling effect. Yet I am not sure whether there is an easier and more straightforward method that I have missed.
Create a textarea and limit it to two rows: one for text, one for a scrollbar.
You could try using the Gesture API; there are managed wrappers for it:
http://blog.markarteaga.com/CategoryView,category,Samples.aspx