Issue receiving Touch Events with multi-touch screen - C#

I recently received a brand new Acer T232HL multi-touch screen at work so that I can debug a multi-touch application. The multi-touch function works great in Paint and with sample applications.
However, when I try it with my own applications written in C#, no touch events are raised. I tried two samples (from the MSDN website):
A simple scatter view application without touch event handling
A scatter view application with touch event handling
I used a tool called Snoop to see which events my application can detect. Neither of my applications detects TouchDown, TouchMove or TouchUp events. It seems my applications cannot detect any touch events at all, except the virtual keyboard key events.
I looked at the "Pen and Touch" settings in Control Panel but cannot see anything wrong there.
Has anyone had this problem before? How can I get my C# programs to receive touch events from this screen?
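One quick diagnostic (a sketch, not a guaranteed fix) is to check whether WPF enumerates the touch digitizer at all, by listing Tablet.TabletDevices; a working touch screen should appear with Type == TabletDeviceType.Touch.

    using System.Diagnostics;
    using System.Windows.Input;

    static class TouchDiagnostics
    {
        // Logs every digitizer WPF can see. If the collection is empty, or no entry
        // has Type == TabletDeviceType.Touch, WPF itself never sees the screen and
        // no TouchDown/TouchMove/TouchUp events will be raised.
        public static void DumpTabletDevices()
        {
            foreach (TabletDevice device in Tablet.TabletDevices)
            {
                Debug.WriteLine(string.Format("{0}: {1}", device.Name, device.Type));
            }
        }
    }

Calling TouchDiagnostics.DumpTabletDevices() from the application's startup code shows whether the problem is inside the app or below it, at the driver/OS level.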

Related

Mouse hover detection in .NET MAUI

I want to detect the mouse hovering over visual elements in .NET MAUI (like buttons). I cannot find any "onmouseover"/"ismouseover" events for this. How can I implement this?
All requests for "desktop" style control of keyboard and mouse have been consolidated under Maui issue: Desktop platforms: Mouse and Keyboard.
jfversluis' comment (at that issue or a related issue) indicates that no work in this area has yet been planned.
That issue is the place to discuss the topic (express interest in, or add any specific requirements or use cases), until there is a plan.
Until then, the work-around is generally to write Windows-only code in the Windows folder of your Maui project.
Unfortunately for the mouse, that only gives window- or desktop-relative mouse information. AFAIK, there is currently no easy way to interact with individual UI elements while the mouse moves. (Because MAUI started as Xamarin, which started as mobile cross-platform code, and touch devices have no concept of a cursor being moved around the screen.)
A crude hack is to use AbsoluteLayout: position UI elements at exact positions, so you know where they are, then write (Windows-only) code that does your own "detection".
The next level of sophistication would be to write code that traverses a window's "visual tree" of UI elements to detect which control the mouse is over. Maui must have such logic internally somewhere to handle touch events, unless it relies on each platform's UI code to decide that. TBD.
For now, Maui makes sense for apps that can live with a "touch" paradigm, rather than a "desktop" paradigm.
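As a rough illustration of that Windows-only work-around, the sketch below (the helper name AttachHover and its wiring are illustrative, not part of any MAUI API) subscribes to the WinUI PointerEntered/PointerExited events on the native view behind a MAUI VisualElement:

    #if WINDOWS
    using System;
    using Microsoft.UI.Xaml;

    public static class HoverHelper
    {
        // Illustrative helper: once the platform handler exists, hook the WinUI
        // pointer events of the native control that backs the MAUI element.
        public static void AttachHover(Microsoft.Maui.Controls.VisualElement element,
                                       Action onEnter, Action onExit)
        {
            element.HandlerChanged += (_, __) =>
            {
                if (element.Handler?.PlatformView is UIElement native)
                {
                    native.PointerEntered += (s, e) => onEnter();   // mouse moved over the control
                    native.PointerExited  += (s, e) => onExit();    // mouse left the control
                }
            };
        }
    }
    #endif

Usage would be something like HoverHelper.AttachHover(myButton, () => Highlight(), () => Unhighlight()), where the two callbacks are whatever hover behaviour your app needs.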
This isn't an answer per se, but as a suggestion you could try to determine when the mouse is over the VisualElement and then use StateTriggers to actively apply a VisualState.
MAUI Triggers

How Can I Globally Detect Touch-screen Events in the Windows 8 Metro UI?

I am developing a WinForms C# application that needs to know when the computer is not "idle", that is, when a user is actively using it. I can detect keyboard and mouse activity in both the standard desktop and Metro (I think that's what they call it). I can also detect touchscreen activity on the standard desktop; these events are detected as mouse events.
However, when using the touchscreen in Metro, none of these events are raised. I have tried this project: http://globalmousekeyhook.codeplex.com/ as well as writing code to hook into the Windows SetWindowsHookEx API, with no luck.
I spent some time investigating the WinEvents API too, but can't find anything there that will help.
Anybody know if this can be done? And how?
There happens to be a nice function for that but unfortunately it is C++, so you'll have to P/Invoke it.
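The answer above doesn't name the function. Purely to illustrate the P/Invoke pattern, here is a sketch using GetLastInputInfo, a Win32 call often used for this kind of idle detection because it reports the time of the last input event session-wide (it may or may not be the function the answer had in mind):

    using System;
    using System.Runtime.InteropServices;

    static class IdleTime
    {
        [StructLayout(LayoutKind.Sequential)]
        struct LASTINPUTINFO
        {
            public uint cbSize;
            public uint dwTime;   // tick count of the most recent input event
        }

        [DllImport("user32.dll")]
        static extern bool GetLastInputInfo(ref LASTINPUTINFO plii);

        // Milliseconds since the last user input anywhere in the session.
        public static uint MillisecondsSinceLastInput()
        {
            var info = new LASTINPUTINFO { cbSize = (uint)Marshal.SizeOf(typeof(LASTINPUTINFO)) };
            GetLastInputInfo(ref info);
            return unchecked((uint)Environment.TickCount - info.dwTime);
        }
    }

Polling MillisecondsSinceLastInput() on a timer is enough to tell "idle" from "in use" without hooking individual events.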

Microsoft Surface vs Windows Touch?

What is the difference between Microsoft Surface 2.0 and Windows Touch events in .NET 4.0?
Are the two platforms parallel, or is one built on top of the other?
Are Windows touch events and Surface touch events not the same events?
Help, I'm confused.
Microsoft Surface is a product: http://www.microsoft.com/surface/en/us/whatissurface.aspx
Windows Touch is the ability of the Windows operating system to listen to touch events: http://windows.microsoft.com/en-US/windows7/products/features/touch
WPF has the ability to handle touch events: http://msdn.microsoft.com/en-us/library/ms754010.aspx
So if you have hardware that supports touch running a version of Windows that supports touch, you can run a WPF app that handles touch events. (A bit of a simplification, but that is the basic idea.)
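For illustration, a minimal sketch of "a WPF app that handles touch events" (an ordinary WPF Window, nothing Surface-specific, assuming .NET 4.0 or later):

    using System.Windows;
    using System.Windows.Input;

    public partial class MainWindow : Window
    {
        public MainWindow()
        {
            InitializeComponent();

            // Raw touch events are available on any UIElement in WPF 4.0+.
            TouchDown += OnTouchDown;
            TouchMove += OnTouchMove;
            TouchUp += OnTouchUp;
        }

        void OnTouchDown(object sender, TouchEventArgs e)
        {
            // Each finger has its own TouchDevice, so multi-touch contacts can be told apart.
            TouchPoint point = e.GetTouchPoint(this);
            Title = string.Format("Touch {0} down at {1}", e.TouchDevice.Id, point.Position);
        }

        void OnTouchMove(object sender, TouchEventArgs e) { /* track drags per TouchDevice.Id */ }

        void OnTouchUp(object sender, TouchEventArgs e) { /* contact lifted */ }
    }

The Surface (2.0) controls sit a level above this: they consume the same touch input but package it into ready-made interactions.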
As I understand it, the 'Surface' platform is built on top of touch events. For example, you can code standard WPF components for touch interaction yourself, since a subset of the usual .NET components supports touch events. Or you can use the Surface controls instead, which have generally done a lot of the work for you in handling interactions like dragging, pinching, and so on.
Touch is a tricky mechanism to get right from a user's point of view, so I'd say start with the already available Surface library if you want to get started in this area.

Regarding a desktop sharing concept

I have developed two small apps: one sends a desktop picture together with the mouse position to the other app, and the other app receives the picture and shows it in a picture box.
1) I want to know how I can send all of my mouse button and keyboard activity, in a well-structured way, to the app that is sending the picture.
2) When the other app receives my mouse & keyboard activity, it should fire (replay) it properly.
So please tell me how I can send mouse & keyboard activity in a structured way, and also how I can programmatically fire that mouse & keyboard activity on the other machine. I am developing this app in C#, so please explain the concept so that I can code it in C#. Thanks.
Sending a picture (or any data) over a network is relatively easy.
Intercepting and re-creating mouse and keyboard actions is entirely different. It involves a lot of low-level hooking into Windows, with no ready-to-use libraries.
You will have to be proficient in Interop and Marshaling, and maybe C# (.NET) just isn't the best tool for this job.
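For the re-creating half, one interop-heavy route is the Win32 SendInput API; a rough sketch (trimmed to mouse input only; keyboard replay would need the KEYBDINPUT variant of the INPUT union):

    using System;
    using System.Runtime.InteropServices;

    static class InputReplayer
    {
        [StructLayout(LayoutKind.Sequential)]
        struct MOUSEINPUT
        {
            public int dx;
            public int dy;
            public uint mouseData;
            public uint dwFlags;
            public uint time;
            public IntPtr dwExtraInfo;
        }

        [StructLayout(LayoutKind.Sequential)]
        struct INPUT
        {
            public uint type;      // 0 = INPUT_MOUSE
            public MOUSEINPUT mi;  // MOUSEINPUT is the largest member of the real union, so the size still matches
        }

        const uint INPUT_MOUSE = 0;
        const uint MOUSEEVENTF_MOVE = 0x0001;
        const uint MOUSEEVENTF_LEFTDOWN = 0x0002;
        const uint MOUSEEVENTF_LEFTUP = 0x0004;
        const uint MOUSEEVENTF_ABSOLUTE = 0x8000;

        [DllImport("user32.dll", SetLastError = true)]
        static extern uint SendInput(uint nInputs, INPUT[] pInputs, int cbSize);

        // Replays a left click at absolute screen coordinates normalized to the 0..65535 range.
        public static void LeftClickAt(int normalizedX, int normalizedY)
        {
            var inputs = new[]
            {
                new INPUT { type = INPUT_MOUSE, mi = new MOUSEINPUT { dx = normalizedX, dy = normalizedY, dwFlags = MOUSEEVENTF_MOVE | MOUSEEVENTF_ABSOLUTE } },
                new INPUT { type = INPUT_MOUSE, mi = new MOUSEINPUT { dwFlags = MOUSEEVENTF_LEFTDOWN } },
                new INPUT { type = INPUT_MOUSE, mi = new MOUSEINPUT { dwFlags = MOUSEEVENTF_LEFTUP } },
            };
            SendInput((uint)inputs.Length, inputs, Marshal.SizeOf(typeof(INPUT)));
        }
    }

The viewing app would serialize the events it captures (for example as "type, x, y, button" records) and the app on the other machine would map each record to a SendInput call like the one above.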

WPF/WinForm not receiving touch events

I'm trying to create a fullscreen WPF application and implement an Android-like navigation drawer.
To accomplish this I have to detect a swipe coming from off-screen, just like the Windows 8 charm bar.
But Windows does not send any touch/mouse events to the application if the swipe doesn't start on the touchscreen (a global hook isn't working either).
How am I able to detect those swipes starting off-screen, just like Windows does for the charm bar?
In case it matters, I'm using a Surface Pro 3 with Windows 8.1.
You need to add manipulation events to your screen/window to detect whether any swipe takes place on your screen. There is no way to directly get that event off-screen.
Referring to this link may help you.
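A minimal sketch of the manipulation-event approach suggested above (the 150-unit threshold is an arbitrary example; the events only fire once the touch actually lands on the window, which is the limitation described in the answer):

    using System.Windows;
    using System.Windows.Input;

    public partial class MainWindow : Window
    {
        public MainWindow()
        {
            InitializeComponent();

            IsManipulationEnabled = true;   // required for the Manipulation* events to fire
            ManipulationStarting += (s, e) => e.ManipulationContainer = this;
            ManipulationDelta += OnManipulationDelta;
        }

        void OnManipulationDelta(object sender, ManipulationDeltaEventArgs e)
        {
            // Treat a large cumulative rightward translation as an "open drawer" swipe.
            if (e.CumulativeManipulation.Translation.X > 150)
            {
                // open the navigation drawer here
            }
        }
    }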
