C# WinForms Multitouch

Hi, I have been lurking for a while and could not find an answer to my question. I am wondering if it is possible to use multitouch with C# and WinForms.
What I am trying to accomplish is being able to use one finger to move around and detect coordinates, and, when my other finger touches the screen, to change a label to say "You Clicked".
Currently I am using a global hook to detect where the mouse is, and I have a button in the form that I am trying to press with the second touch.
This is a picture to better explain:
http://postimg.org/image/o2n49xtit/
I am hoping there is a way to do this with WinForms because I would like to create an on-screen game pad that can have a joystick and buttons to press.
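
For anyone landing here with the same question: WinForms has no built-in multitouch events, but a form can opt in to the raw WM_TOUCH messages that Windows 7 and later send to touch-capable windows. Below is a minimal, untested sketch of that approach; the form name, the label, and the "a finger going down while another is held means click" rule are illustrative assumptions, not anything from the question.

    using System;
    using System.Drawing;
    using System.Runtime.InteropServices;
    using System.Windows.Forms;

    // Sketch of a form that receives raw touch input instead of emulated mouse events.
    public class MultiTouchForm : Form
    {
        const int WM_TOUCH = 0x0240;
        const int TOUCHEVENTF_DOWN = 0x0002;

        [StructLayout(LayoutKind.Sequential)]
        struct TOUCHINPUT
        {
            public int x;                 // screen coordinates, in hundredths of a pixel
            public int y;
            public IntPtr hSource;
            public int dwID;              // stable id per finger
            public int dwFlags;
            public int dwMask;
            public int dwTime;
            public IntPtr dwExtraInfo;
            public int cxContact;
            public int cyContact;
        }

        [DllImport("user32")] static extern bool RegisterTouchWindow(IntPtr hWnd, uint flags);
        [DllImport("user32")] static extern bool GetTouchInputInfo(IntPtr hTouchInput, int cInputs,
            [In, Out] TOUCHINPUT[] inputs, int cbSize);
        [DllImport("user32")] static extern void CloseTouchInputHandle(IntPtr hTouchInput);

        readonly Label status = new Label { AutoSize = true, Text = "Waiting..." };

        public MultiTouchForm()
        {
            Controls.Add(status);
        }

        protected override void OnHandleCreated(EventArgs e)
        {
            base.OnHandleCreated(e);
            RegisterTouchWindow(Handle, 0);   // opt in to WM_TOUCH for this window
        }

        protected override void WndProc(ref Message m)
        {
            if (m.Msg == WM_TOUCH)
            {
                int count = (int)(m.WParam.ToInt64() & 0xFFFF);    // LOWORD = number of contacts
                var inputs = new TOUCHINPUT[count];
                if (GetTouchInputInfo(m.LParam, count, inputs, Marshal.SizeOf(typeof(TOUCHINPUT))))
                {
                    foreach (var ti in inputs)
                    {
                        Point client = PointToClient(new Point(ti.x / 100, ti.y / 100));
                        // First finger: use `client` to drive the joystick / coordinate label.
                        // A finger going down while another is already present: treat as the button press.
                        if ((ti.dwFlags & TOUCHEVENTF_DOWN) != 0 && count > 1)
                            status.Text = "You Clicked";
                    }
                    CloseTouchInputHandle(m.LParam);
                }
            }
            base.WndProc(ref m);
        }

        [STAThread]
        static void Main() { Application.Run(new MultiTouchForm()); }
    }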

Related

How do you code unity ui button touch controls to receive an input?

I have a right, left and up button on the screen (it's a 2D game). I am trying to figure out how to write a script so that when the user clicks the buttons, they receive the input and either move the player right, left or up. I have watched some tutorials on YouTube but they are from a couple of years ago, are out of date, and don't work for me. Could anyone send me a script or tell me how to code it (C#) so that it can recognise the touch input?
The functions OnMouseDown, OnMouseUp, OnMouseDrag, OnMouseUpAsButton etc. work excellently not only on PC but on Android, too.
But don't forget that you must add a BoxCollider2D to the object to make these functions work :)
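
To make that concrete, here is a rough sketch of an on-screen button driven by those OnMouse* callbacks; the names MoveButton and player are placeholders rather than anything from the question, and the GameObject this sits on needs the BoxCollider2D mentioned above or the callbacks never fire.

    using UnityEngine;

    // Attach to each on-screen button sprite; the GameObject also needs a BoxCollider2D.
    public class MoveButton : MonoBehaviour
    {
        public Vector2 direction = Vector2.right;   // set per button: right, left, or up
        public Rigidbody2D player;                  // drag the player's Rigidbody2D in here
        public float speed = 5f;

        bool held;

        void OnMouseDown() { held = true; }         // finger (or mouse) pressed on this collider
        void OnMouseUp()   { held = false; }        // released

        void FixedUpdate()
        {
            if (held)
                player.MovePosition(player.position + direction * speed * Time.fixedDeltaTime);
        }
    }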

Unity C# TouchScript - Overlapped collider detection issue

I'm working on a 2D non-game application. I used TouchScript to get all the multitouch gestures, but I have an issue.
In the application, I have the ability to open a lot of popups that are draggable, pinch-resizable, and rotatable.
These popups are made with UIPanel, and I add a Collider2D to them.
The issue is that when 2 popups are overlapping and I want to move the one on top, I will randomly hit either the one in the back or the top one.
It is like the touch goes through the first collider to hit the one behind...
First, answering the comment on your question suggesting the use of the UI event system:
If you just use Unity's UI events, you won't get advanced gestures such as swipe, pinch, etc., and will have to code them yourself.
If you need these gestures, TouchScript works fine and is a good choice.
Now to your question: I had the same problem and solved it by putting the script "UILayer" on the camera instead of "CameraLayer2D".

How do I do a "mouse click" with kinect?

I'm in the midst of developing a Kinect application.
Basically, what I have now is a simple WPF form with 3 buttons; a rightHand image and a leftHand image are tracked and working.
I have 2 problems though.
1st Problem: I am unable to move the "hand" pointer to the extreme right and left.
I know this has something to do with the bounding box, and I'm still looking for a solution online.
2nd Problem: How do I initiate a "click" without making use of the Kinect Region and the Kinect Tile Buttons? I'm looking for something similar to the video shown below:
https://www.youtube.com/watch?v=d_UqFf4KYJA
Is this what you are looking for?
Xbox common gestures - Open the system menu
I think you have to use a timer to check whether the current position is where you want it to be, and start a counter for a few seconds.
Maybe this can help:
Hovering button
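
A rough sketch of that dwell-to-click idea is below. This is not Kinect SDK code: GetHandScreenPoint and TargetBounds are hypothetical stand-ins for wherever you map the tracked hand joint to screen coordinates and for the button's rectangle.

    using System;
    using System.Windows;
    using System.Windows.Threading;

    // Fires Clicked when the hand pointer stays inside the target rectangle long enough.
    public class DwellClicker
    {
        readonly DispatcherTimer timer = new DispatcherTimer { Interval = TimeSpan.FromMilliseconds(100) };
        readonly TimeSpan dwellTime = TimeSpan.FromSeconds(2);
        DateTime hoverStart;
        bool hovering;

        public Func<Point> GetHandScreenPoint;   // supplied by your skeleton-tracking code
        public Rect TargetBounds;                // the button's on-screen rectangle
        public event Action Clicked;

        public DwellClicker()
        {
            timer.Tick += (s, e) =>
            {
                if (GetHandScreenPoint == null) return;

                bool inside = TargetBounds.Contains(GetHandScreenPoint());
                if (inside && !hovering) { hovering = true; hoverStart = DateTime.Now; }
                else if (!inside) { hovering = false; }
                else if (DateTime.Now - hoverStart >= dwellTime)
                {
                    hovering = false;            // reset so it only fires once per dwell
                    if (Clicked != null) Clicked();
                }
            };
            timer.Start();
        }
    }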

WPF - touch gestures - suppressed mouse click if scrolling

I have multiple videos in a ScrollViewer, and when I click on one of these videos it plays and I get the movie fullscreen; when I move the scroll bar, the videos move with it. This is okay. I have implemented this by adding a mouse click handler to the media element, and now I am testing it on a touch screen and it's working.
But I want to differentiate between a click on a video and moving in that area: the user doesn't want to "select" a video, but is just panning the area, not on the scroll bar but in the area with the videos. For better understanding I added an image:
I want WPF to distinguish between a click and a move (to have both, but with only one firing at a time). What is the best way to do this? Thanks
It is very hard to handle touch events on your own, especially when you have to handle nested controls.
I think the easiest way to solve this is for you to use the Microsoft Surface Touch Pack.
Replace your ScrollViewer with a SurfaceScrollViewer, place a transparent SurfaceButton above your MediaElements, and react to the Click event (or handle it via a command).
This way, Microsoft does the tricky panning (scrolling) versus tapping (clicking) distinction for you. And as a nice side effect, you get things like bouncing and inertia of the ScrollViewer for free.
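
In case it helps, here is a rough code-only sketch of that structure; normally you would write the same thing in XAML, and the helper name, sizes, and the MediaState.Manual choice are just assumptions.

    using System;
    using System.Windows;
    using System.Windows.Controls;
    using Microsoft.Surface.Presentation.Controls;

    public static class VideoListBuilder
    {
        // Builds a scrolling list of videos, each covered by an invisible SurfaceButton,
        // so a tap raises Click while a pan is routed to the surrounding scroll viewer.
        public static SurfaceScrollViewer Build(string[] videoUris, Action<MediaElement> onVideoClicked)
        {
            var panel = new StackPanel();
            foreach (var uri in videoUris)
            {
                var video = new MediaElement
                {
                    Source = new Uri(uri),
                    Height = 120,
                    LoadedBehavior = MediaState.Manual   // don't autoplay; play on tap instead
                };

                // Opacity 0 keeps the button invisible but, unlike Visibility.Hidden,
                // it still takes part in hit testing.
                var overlay = new SurfaceButton { Opacity = 0.0 };
                overlay.Click += (s, e) => onVideoClicked(video);

                var cell = new Grid();
                cell.Children.Add(video);
                cell.Children.Add(overlay);              // drawn above the video
                panel.Children.Add(cell);
            }
            return new SurfaceScrollViewer { Content = panel };
        }
    }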

Suppressing touch points in Silverlight WP7

I'm writing a hybrid XNA + Silverlight app for WP7 and I'm trying to work out the touch splitting between them. I've got this halfway worked out: I can suppress XNA TouchPanel touches when the user clicks a Silverlight button. However, I have not figured out how to throw out game-only touches for Silverlight. So if you're holding a touch point in the game space (say, for moving the player around), a second touch on a button won't work. I think it's picking it up as a multi-touch gesture and only allowing the first touch point to click buttons.
My question is: how can you suppress this touch point in Silverlight processing?
The simplest way would be to design your app so that you don't use both types of control in an interactive way on the same page.
Or, if you must, disable touch on the Silverlight controls when you detect the first XNA touch.
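
A rough sketch of that second suggestion, under the assumption that you can call a per-frame Update from the XNA loop; the class, GameRegion, and Buttons are hypothetical names, not part of the question or any SDK.

    using System.Collections.Generic;
    using System.Windows;
    using Microsoft.Xna.Framework.Input.Touch;

    // Tracks which touch ids began inside the game area and, while any of them is
    // still down, turns off hit testing on the listed Silverlight controls.
    public class TouchRouter
    {
        public Rect GameRegion;                                  // screen area reserved for game input
        public List<UIElement> Buttons = new List<UIElement>();  // Silverlight controls to disable

        readonly HashSet<int> gameTouchIds = new HashSet<int>();

        public void Update()
        {
            foreach (TouchLocation t in TouchPanel.GetState())
            {
                var p = new Point(t.Position.X, t.Position.Y);

                if (t.State == TouchLocationState.Pressed && GameRegion.Contains(p))
                    gameTouchIds.Add(t.Id);                      // this finger belongs to the game
                else if (t.State == TouchLocationState.Released || t.State == TouchLocationState.Invalid)
                    gameTouchIds.Remove(t.Id);
            }

            bool gameTouchActive = gameTouchIds.Count > 0;
            foreach (var button in Buttons)
                button.IsHitTestVisible = !gameTouchActive;
        }
    }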
