What is the difference between Microsoft Surface 2.0 and Windows touch events in .NET 4.0?
Are the two platforms parallel, or is one built on top of the other?
Are Windows touch events and Surface touch events the same events or not?
Help, I'm confused.
Microsoft Surface is a product: http://www.microsoft.com/surface/en/us/whatissurface.aspx
Windows Touch is the ability of the Windows operating system to listen to touch events: http://windows.microsoft.com/en-US/windows7/products/features/touch
WPF has the ability to handle touch events: http://msdn.microsoft.com/en-us/library/ms754010.aspx
So if you have hardware that supports touch, running a version of Windows that supports touch, you can run a WPF app that handles touch events. (A bit of a simplification, but that is the basic idea.)
As I understand it, the Surface platform is built on top of touch events. For example, you can code standard WPF components for touch interaction yourself, since a subset of the usual .NET components supports touch events. Or you can use the Surface controls instead, which generally have done a lot of the work for you in handling interactions like dragging, pinching, and so on.
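For the raw-touch route, here is a minimal sketch (not taken from any of the linked pages); it assumes a Canvas named canvas in the XAML with TouchDown="canvas_TouchDown" wired up:
private void canvas_TouchDown(object sender, TouchEventArgs e)
{
    // Position of the finger, relative to the canvas.
    TouchPoint touch = e.GetTouchPoint(canvas);
    // React to touch.Position here, e.g. start tracking a drag.
    e.Handled = true;
}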
Touch is a tricky mechanism to get right from a user POV, so I'd say start with the already available Surface library if you want to get started in this area.
I want to detect the mouse hovering over visual elements (like buttons) in .NET MAUI. I cannot find any "onmouseover"/"ismouseover" events for this. How can I implement this?
All requests for "desktop"-style control of keyboard and mouse have been consolidated under the Maui issue "Desktop platforms: Mouse and Keyboard".
jfversluis' comment (at that issue or a related issue) indicates that no work in this area has yet been planned.
That issue is the place to discuss the topic (express interest in it, or add any specific requirements or use cases) until there is a plan.
Until then, the work-around is generally to write Windows-only code in the Windows folder of your Maui project.
Unfortunately, for the mouse that only gives window- or desktop-relative mouse information. AFAIK, there is currently no easy way to interact with individual UI elements as the mouse moves. (Because Maui started as Xamarin, which started as mobile cross-platform code, and touch devices have no concept of a cursor being moved around the screen.)
A crude hack is to use AbsoluteLayout: position UI elements at exact positions, so you know where they are, then write (Windows-only) code that does your own "detection".
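A minimal sketch of that hack, assuming the element was placed with absolute (not proportional) bounds via AbsoluteLayout.SetLayoutBounds, and that your Windows-only code already gives you a pointer position in the same coordinate space as the AbsoluteLayout:
// Requires Microsoft.Maui.Controls and Microsoft.Maui.Graphics.
bool IsPointerOver(View view, Point pointerPosition)
{
    // The bounds you assigned yourself via AbsoluteLayout.SetLayoutBounds(view, ...).
    Rect bounds = AbsoluteLayout.GetLayoutBounds(view);
    return bounds.Contains(pointerPosition);
}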
The next level of sophistication would be to write code that traverses a window's "visual tree" of UI elements, to detect what control the mouse is over. Maui must have such logic internally somewhere, to handle touch events. Unless it relies on each platform's UI code to decide that. TBD.
For now, Maui makes sense for apps that can live with a "touch" paradigm, rather than a "desktop" paradigm.
This isn't an answer per se, but as a suggestion you could try to determine when the mouse would be over the VisualElement and then use StateTriggers to actively apply a VisualState.
MAUI Triggers
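A minimal C# sketch of that idea, assuming you have already determined (via your own Windows-only detection) whether the pointer is over the element, and that the element defines "PointerOver" and "Normal" states in its VisualStateManager.VisualStateGroups:
void UpdateHoverState(VisualElement element, bool isPointerOver)
{
    // Applies whichever visual state you defined on the element.
    VisualStateManager.GoToState(element, isPointerOver ? "PointerOver" : "Normal");
}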
Microsoft's release notes for .NET Framework 4.6.2 include the following:
"Soft Keyboard support enables automatic invocation and dismissal of the touch keyboard in WPF applications without disabling WPF stylus/touch support on Windows 10. Prior to 4.6.2, WPF applications do not implicitly support the invocation or dismissal of the touch keyboard without disabling WPF stylus/touch support. This is due to a change in the way the touch keyboard tracks focus in applications starting in Windows 8."
The List of Changes likewise states:
"Enable automatic invocation and dismissal of the touch keyboard in WPF applications without disabling WPF stylus/touch support on Windows 10 [178044]"
But I cannot find any indication of HOW to do this, and I cannot find anything in the official API diff that seems to be this.
Can anyone help me find documentation of how to do this thing that I can now allegedly do?
My context is that I have an application that explicitly launches "OSK.exe" when needed. On touch devices with a built-in Windows on-screen keyboard, this results in TWO on-screen keyboards being shown. I want to disable the standard one and only launch "OSK.exe" explicitly.
Thanks!
I think this (especially the comments) should answer your question. The short story is: there is nothing specific to do, but it only works on the Windows 10 Anniversary Update.
To disable the soft keyboard, you can do what's indicated in the comments:
WPF on Windows 7 with touch: hide soft keyboard (and the popup icon that enables it)
or
https://blogs.msdn.microsoft.com/winuiautomation/2015/04/26/how-and-why-i-added-an-on-screen-keyboard-to-my-windows-store-app/
In WPF on the Windows 10 Anniversary Update, you can override the automation peer in a TextBox subclass:
protected override AutomationPeer OnCreateAutomationPeer()
{
    return new FrameworkElementAutomationPeer(this);
}
On Windows 7 it is enough to set
InputMethod.IsInputMethodEnabled="False"
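The same thing can be done from code-behind via the attached property's setter; a one-line sketch, assuming a TextBox field named myTextBox (a hypothetical name):
// Equivalent to InputMethod.IsInputMethodEnabled="False" in XAML.
InputMethod.SetIsInputMethodEnabled(myTextBox, false);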
Hope it helps.
I recently received a brand-new Acer T232HL multi-touch screen at work to be able to debug a multi-touch application. The multi-touch function works great in Paint or with sample applications.
However, when I try it with my own applications programmed in C#, I have issues and no touch events are raised. I tried two samples (from the MSDN website):
A simple scatter view application without touch event handling
A scatter view application with touch event handling
I used a tool called Snoop to see which events my application can detect. Neither of my applications can detect TouchDown, TouchMove, or TouchUp events. It seems my applications cannot detect any touch events at all, apart from the virtual keyboard key events.
I looked at the control panel for "Pen and touch settings" but I cannot see anything wrong with it.
Has anyone had this problem before? How can I get my C# programs to receive touch input from this screen?
I wrote a simple C# Windows Forms application (.NET, in Visual Studio 2012) for a mouse-controlled keyboard for use in a desktop application. Ultimately, I want to have a keyboard form that can be used on a touchscreen in a kiosk-like setting.
My question: can I expect the desktop app to work "as is" on a touchscreen? My specific concern is whether I can reasonably expect the mouse-click events to be raised for touch input on the touchscreen, or whether I should, a priori, consider importing certain libraries and/or binding events other than "Click". I would simply test it myself, but I don't currently have access to a touchscreen device on which I can run the app.
Can I expect the desktop app to work "as is" on a touchscreen?
Yes. Windows translates a "touch" (a tap) into an ordinary mouse click for applications that don't handle touch input themselves. You can write more advanced apps that target touch-screen devices specifically, e.g. swiping, pinching, etc. However, if yours is just a basic app with buttons, it should all work the same on a touch screen.
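As an illustration (hypothetical keyButton and targetTextBox controls, not from your code), a plain Click handler like this fires for a finger tap just as it does for a mouse click:
private void keyButton_Click(object sender, EventArgs e)
{
    // A tap is delivered here as an ordinary click, because Windows synthesizes
    // mouse input from touch for apps that don't opt in to touch messages.
    targetTextBox.AppendText("A");
}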
I'm building a Windows 7 touch app and want to be able to handle touch gestures as simple events, specifically flicks. I can see Windows responding (it pops up the icons as I flick), but I can't seem to find an easy way to handle them. Something like grid.OnLeftFlick ...
Am I missing something stupidly simple? Is there a toolkit or something I can use? Or, do I have to write my own listener?
Thanks.
While WPF4 does support multi-touch, it does not have any built-in gesture support, at least not of the kind you are thinking about. Features such as flick, pinch-zoom, etc. all have to be programmed on top of WPF4.
I don't know of any higher-level gesture-support toolkits, but writing your own is not as hard as it sounds. This WPF4 touch sample shows, for example, touch-based move, pinch-zoom, and pinch-rotate:
Walkthrough: Creating Your First Touch Application
If you compile and run the sample, you'll see the handlers are only ten to twenty lines of code and make use of powerful infrastructure in the touch API itself and core WPF features such as transformation matrices. The touch API supports inertia and so on, so try your hand at implementing a flick that meets your needs.
Also, of course, this sample only works with pen and touch input devices, not an ordinary mouse, but you specified that you are developing a touch application.
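For a flavour of what those handlers look like, here is a minimal sketch in the spirit of that walkthrough (not the sample's exact code). It assumes a Rectangle in the window's XAML with IsManipulationEnabled="True", a MatrixTransform assigned to its RenderTransform, and the three handlers below hooked up on the window (the manipulation events are routed, so they bubble up from the Rectangle):
void Window_ManipulationStarting(object sender, ManipulationStartingEventArgs e)
{
    // Report all manipulation deltas relative to this window.
    e.ManipulationContainer = this;
    e.Handled = true;
}

void Window_ManipulationDelta(object sender, ManipulationDeltaEventArgs e)
{
    var element = (UIElement)e.Source;
    var transform = (MatrixTransform)element.RenderTransform;
    Matrix matrix = transform.Matrix;

    // Apply this frame's pinch-rotate, pinch-zoom and drag deltas around the touch origin.
    Point origin = e.ManipulationOrigin;
    matrix.RotateAt(e.DeltaManipulation.Rotation, origin.X, origin.Y);
    matrix.ScaleAt(e.DeltaManipulation.Scale.X, e.DeltaManipulation.Scale.Y, origin.X, origin.Y);
    matrix.Translate(e.DeltaManipulation.Translation.X, e.DeltaManipulation.Translation.Y);

    transform.Matrix = matrix;
    e.Handled = true;
}

void Window_ManipulationInertiaStarting(object sender, ManipulationInertiaStartingEventArgs e)
{
    // Inertia is what makes a flick keep moving after the finger lifts:
    // decelerate at 10 inches per second squared, expressed in device-independent
    // pixels per millisecond squared.
    e.TranslationBehavior.DesiredDeceleration = 10.0 * 96.0 / (1000.0 * 1000.0);
    e.Handled = true;
}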