C# XNA touch panel - Surface Pro touchscreen not recognized

I followed some online tutorials on using touch input in XNA for Windows computers with a touch screen. However, I'm using a Surface Pro, and XNA doesn't even recognize it as a touch screen. As a test I printed the status of TouchPanel.GetCapabilities().IsConnected, and it was false; the maximum touch count came back as 0 (I believe the Surface supports 10 simultaneous touches).
Is anyone familiar with this? I'm trying to do simple tasks such as receiving multiple finger taps on the screen.

XNA 4.0 doesn't support touch on any device other than Windows Phone 7. See this blog post for full details.
That linked blog post provides some alternative methods for receiving touch input (basically: use the Windows APIs).
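For the Windows-API route, here is a rough sketch of the usual workaround: register the game's window for WM_TOUCH and subclass its window procedure to read the raw contacts. It assumes Windows 7 or later and a 64-bit process (SetWindowLongPtr); the Win32Touch helper name is illustrative, not part of XNA.

using System;
using System.Runtime.InteropServices;

// Hypothetical helper: call Win32Touch.Attach(Window.Handle) from Game.Initialize.
public static class Win32Touch
{
    const int WM_TOUCH = 0x0240;
    const int GWLP_WNDPROC = -4;

    delegate IntPtr WndProcDelegate(IntPtr hWnd, int msg, IntPtr wParam, IntPtr lParam);

    [StructLayout(LayoutKind.Sequential)]
    struct TOUCHINPUT
    {
        public int X;                 // in hundredths of a screen pixel
        public int Y;
        public IntPtr Source;
        public int Id;                // stable per finger for the contact's lifetime
        public int Flags;
        public int Mask;
        public int Time;
        public IntPtr ExtraInfo;
        public int ContactWidth;
        public int ContactHeight;
    }

    [DllImport("user32.dll")]
    static extern bool RegisterTouchWindow(IntPtr hWnd, uint flags);

    [DllImport("user32.dll")]
    static extern bool GetTouchInputInfo(IntPtr touchHandle, int count,
        [Out] TOUCHINPUT[] inputs, int structSize);

    [DllImport("user32.dll")]
    static extern void CloseTouchInputHandle(IntPtr touchHandle);

    [DllImport("user32.dll", EntryPoint = "SetWindowLongPtr")]
    static extern IntPtr SetWindowLongPtr(IntPtr hWnd, int index, WndProcDelegate newProc);

    [DllImport("user32.dll")]
    static extern IntPtr CallWindowProc(IntPtr prevProc, IntPtr hWnd, int msg,
        IntPtr wParam, IntPtr lParam);

    static IntPtr _prevProc;
    static WndProcDelegate _hook;     // held in a field so the GC keeps the delegate alive

    public static void Attach(IntPtr gameWindowHandle)
    {
        RegisterTouchWindow(gameWindowHandle, 0);
        _hook = Hook;
        _prevProc = SetWindowLongPtr(gameWindowHandle, GWLP_WNDPROC, _hook);
    }

    static IntPtr Hook(IntPtr hWnd, int msg, IntPtr wParam, IntPtr lParam)
    {
        if (msg == WM_TOUCH)
        {
            int count = (int)(wParam.ToInt64() & 0xFFFF);   // low word = contact count
            var contacts = new TOUCHINPUT[count];
            if (GetTouchInputInfo(lParam, count, contacts,
                    Marshal.SizeOf(typeof(TOUCHINPUT))))
            {
                foreach (var c in contacts)
                {
                    // Convert to pixels; these are screen, not client, coordinates.
                    float x = c.X / 100f, y = c.Y / 100f;
                    // TODO: map to client space and feed your game's input state.
                }
                CloseTouchInputHandle(lParam);
            }
        }
        return CallWindowProc(_prevProc, hWnd, msg, wParam, lParam);
    }
}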

Related

How to implement pinch zoom

I am working on a Windows Phone 8 XAML C# application and need to implement pinch zoom on my reader page. Any help?
You could follow this MSDN tutorial:
http://www.windowsphone.com/en-in/how-to/wp7/start/gestures-flick-pan-and-stretch
Also note that the Windows Phone emulator only supports multitouch gestures when the host machine itself has a multitouch screen or touchpad. There have also been reports of people using two mice, and there is a project on CodePlex for that: http://multitouchvista.codeplex.com
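Beyond the tutorial, here is a minimal pinch-zoom sketch for a Windows Phone 8 Silverlight page, built on the standard ManipulationDelta event and a CompositeTransform; the PinchZoom helper name and the 1x-4x clamp are illustrative choices, not from the tutorial above.

using System;
using System.Windows;
using System.Windows.Media;

// Hypothetical helper: call PinchZoom.Attach(readerImage) once the page has loaded.
public static class PinchZoom
{
    public static void Attach(FrameworkElement target)
    {
        var zoom = new CompositeTransform();
        target.RenderTransform = zoom;
        target.RenderTransformOrigin = new Point(0.5, 0.5);  // zoom around the center

        target.ManipulationDelta += (s, e) =>
        {
            // DeltaManipulation.Scale is (0,0) for one-finger pans,
            // so only react to genuine pinches.
            Point scale = e.DeltaManipulation.Scale;
            if (scale.X > 0 && scale.Y > 0)
            {
                // Clamp the zoom between 1x and 4x (arbitrary limits).
                zoom.ScaleX = Math.Min(4.0, Math.Max(1.0, zoom.ScaleX * scale.X));
                zoom.ScaleY = Math.Min(4.0, Math.Max(1.0, zoom.ScaleY * scale.Y));
            }
        };
    }
}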

Microsoft Surface vs Windows Touch?

What is the difference between Microsoft Surface 2.0 and Windows Touch events in .NET 4.0?
Are the two platforms parallel, or is one built on top of the other?
Are Windows touch events and Surface touch events not the same events?
Help, I'm confused.
Microsoft Surface is a product: http://www.microsoft.com/surface/en/us/whatissurface.aspx
Windows Touch is the ability of the Windows operating system to listen to touch events: http://windows.microsoft.com/en-US/windows7/products/features/touch
WPF has the ability to handle touch events http://msdn.microsoft.com/en-us/library/ms754010.aspx
So if you have hardware that supports touch running a version of windows that supports touch, you can run a WPF app that handles touch events. (A bit of a simplification but that is the basic idea)
As I understand it, the Surface platform is built on top of touch events. For example, you can code standard WPF components for touch interaction yourself, since a subset of the usual .NET components supports touch events. Or you can use the Surface controls instead, which have generally done a lot of the work for you in handling interactions like dragging, pinching, and so on.
Touch is a tricky mechanism to get right from a user's point of view, so I'd say start with the already available Surface library if you want to get started in this area.
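To make the WPF side concrete, here is a small sketch of handling the .NET 4.0 raw touch events on a Canvas; the TouchCanvas name and the debug output are illustrative.

using System.Windows.Controls;
using System.Windows.Media;

// Hypothetical control: drop a TouchCanvas into any WPF window on touch hardware.
public class TouchCanvas : Canvas
{
    public TouchCanvas()
    {
        Background = Brushes.Black;   // needs a brush to be hit-testable

        TouchDown += (s, e) =>
        {
            // Capture so this finger keeps reporting to us even if it
            // slides off the element.
            CaptureTouch(e.TouchDevice);
            e.Handled = true;
        };

        TouchMove += (s, e) =>
        {
            var p = e.GetTouchPoint(this);
            // TouchDevice.Id distinguishes fingers; Position is element-relative.
            System.Diagnostics.Debug.WriteLine(
                "finger {0} at {1}", p.TouchDevice.Id, p.Position);
        };

        TouchUp += (s, e) => ReleaseTouchCapture(e.TouchDevice);
    }
}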

C# .NET Trackball Support - But Not As a Mouse!

I want to use a trackball in my C# .NET application, but I do not want the trackball to be used by Windows as a mouse. When I connect both a trackball and a mouse, they are both given control of the cursor.
So there are essentially two questions:
How can the trackball be removed/disabled/coaxed into giving up control of the cursor?
After doing this, how can I access the trackball movement information from my C# .NET application?
Thank you!!
Ash
PS: I specifically want trackball support, but I am hoping that joystick support will be similar, as I may expand to that functionality in the future. Since a joystick does not control the cursor by default, I can see how this may be different... perhaps easier?
I would think you need to install a special driver that would identify the device not as a mouse but as an input device that streams movement data.
The Windows Driver Kit might help; it includes a HID device example.
I highly recommend Edward Tse's SDG Toolkit (Single Desktop Groupware Toolkit). I used it several years ago for an application that required multiple mice and monitors.
You can also get the C# source code: http://grouplab.cpsc.ucalgary.ca/cookbook/index.php/Toolkits/SDGSourceCode
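For completeness: the lower-level mechanism such toolkits typically build on is the Win32 Raw Input API, which tags each movement event with the handle of the device that produced it. A rough sketch follows; note it only separates the streams and does not by itself stop the trackball from also driving the cursor, which still needs the driver-level approach above.

using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

// Hypothetical form: each WM_INPUT message carries the handle of the device
// that produced it, so the trackball can be told apart from the mouse.
public class RawInputForm : Form
{
    const int WM_INPUT = 0x00FF;
    const uint RIM_TYPEMOUSE = 0;
    const uint RID_INPUT = 0x10000003;

    [StructLayout(LayoutKind.Sequential)]
    struct RAWINPUTDEVICE
    {
        public ushort UsagePage;
        public ushort Usage;
        public uint Flags;
        public IntPtr Target;
    }

    [StructLayout(LayoutKind.Sequential)]
    struct RAWINPUTHEADER
    {
        public uint Type;
        public uint Size;
        public IntPtr Device;     // identifies which mouse/trackball sent the input
        public IntPtr WParam;
    }

    [StructLayout(LayoutKind.Explicit)]
    struct RAWMOUSE
    {
        [FieldOffset(0)] public ushort Flags;
        [FieldOffset(4)] public ushort ButtonFlags;
        [FieldOffset(6)] public ushort ButtonData;
        [FieldOffset(8)] public uint RawButtons;
        [FieldOffset(12)] public int LastX;   // relative motion since last event
        [FieldOffset(16)] public int LastY;
        [FieldOffset(20)] public uint ExtraInformation;
    }

    [StructLayout(LayoutKind.Sequential)]
    struct RAWINPUT
    {
        public RAWINPUTHEADER Header;
        public RAWMOUSE Mouse;
    }

    [DllImport("user32.dll", SetLastError = true)]
    static extern bool RegisterRawInputDevices(RAWINPUTDEVICE[] devices,
        uint count, uint structSize);

    [DllImport("user32.dll")]
    static extern int GetRawInputData(IntPtr rawInput, uint command,
        out RAWINPUT data, ref uint size, uint headerSize);

    public RawInputForm()
    {
        // Usage page 1 (generic desktop), usage 2 (mouse): receive WM_INPUT
        // from every attached pointing device.
        var devices = new[] { new RAWINPUTDEVICE {
            UsagePage = 1, Usage = 2, Flags = 0, Target = Handle } };
        RegisterRawInputDevices(devices, 1,
            (uint)Marshal.SizeOf(typeof(RAWINPUTDEVICE)));
    }

    protected override void WndProc(ref Message m)
    {
        if (m.Msg == WM_INPUT)
        {
            uint size = (uint)Marshal.SizeOf(typeof(RAWINPUT));
            RAWINPUT input;
            if (GetRawInputData(m.LParam, RID_INPUT, out input, ref size,
                    (uint)Marshal.SizeOf(typeof(RAWINPUTHEADER))) > 0
                && input.Header.Type == RIM_TYPEMOUSE)
            {
                // Filter on Header.Device to react only to the trackball.
                Console.WriteLine("device {0}: dx={1} dy={2}",
                    input.Header.Device, input.Mouse.LastX, input.Mouse.LastY);
            }
        }
        base.WndProc(ref m);
    }
}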

Working with multiple screens in XNA 4.0

I'm struggling to find information on how to detect the available screens in XNA 4.0.
What I'm trying to do is get a list of available screens, select one, and start my XNA app in full screen on that screen.
Specifically, I want to open the XNA window in full-screen mode on a Full HD TV connected via an HDMI cable.
Thanks,
Stefan
This article explains how to get XNA working on multiple monitors.
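In outline, the approach comes down to this: each attached monitor appears as a GraphicsAdapter, and you can substitute the one you want in GraphicsDeviceManager.PreparingDeviceSettings before the device is created. Here is a sketch; picking the first non-default adapter is an illustrative heuristic for the HDMI-connected TV.

using System.Linq;
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

public class MultiScreenGame : Game
{
    readonly GraphicsDeviceManager _graphics;

    public MultiScreenGame()
    {
        _graphics = new GraphicsDeviceManager(this) { IsFullScreen = true };

        _graphics.PreparingDeviceSettings += (s, e) =>
        {
            // One GraphicsAdapter per attached monitor; take the first
            // non-default one (e.g. the HDMI-connected TV) if present.
            GraphicsAdapter target =
                GraphicsAdapter.Adapters.FirstOrDefault(a => !a.IsDefaultAdapter)
                ?? GraphicsAdapter.DefaultAdapter;
            e.GraphicsDeviceInformation.Adapter = target;

            // Match the back buffer to that monitor's current resolution.
            DisplayMode mode = target.CurrentDisplayMode;
            var pp = e.GraphicsDeviceInformation.PresentationParameters;
            pp.BackBufferWidth = mode.Width;
            pp.BackBufferHeight = mode.Height;
        };
    }
}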

Microsoft Based Touch/Stylus Input

Disclaimer: it's my first time developing an app that will run on a tablet-style PC.
The app needs to allow a user to write notes with either a tablet pen or by touch. It must be a .NET app and may run on Windows XP and/or Windows 7. I am mainly tasked with capturing written notes from the user interface, but I am sure I will be working on other aspects of the app.
I have looked into Digital Ink, and it seems the way to go, but I am unsure how much support currently exists for Windows 7 and how much support there will be for this technology in the future.
My questions:
Am I going in the right direction, and if not, is there something that would better accommodate this type of user input?
Does anyone have other tips or good reference sites with information on Microsoft-based touch and/or tablet stylus input?
If you plan on doing custom programming, you can check out the Windows Touch API for Windows 7.
Touch and Digital Ink are both usable from C# and the .NET Framework, so I would imagine there is a considerable amount of support for Digital Ink in Windows 7.
Also, the .NET Framework has a Stylus class (System.Windows.Input.Stylus) that tracks stylus coordinates in a text box, even if the stylus leaves the box and comes back in.
Hope this helps!
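To make the ink route concrete, here is a minimal WPF sketch built on InkCanvas, which accepts both stylus and touch input on capable hardware and can persist strokes in Ink Serialized Format; the window class and file name are illustrative.

using System.IO;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Ink;

// Hypothetical note-taking window; "notes.isf" is an illustrative path.
public class NotesWindow : Window
{
    readonly InkCanvas _ink = new InkCanvas();

    public NotesWindow()
    {
        Title = "Notes";
        Content = _ink;

        // A fine black pen; pressure sensitivity is on by default.
        _ink.DefaultDrawingAttributes = new DrawingAttributes
        {
            Width = 2,
            Height = 2,
            FitToCurve = true    // smooth the strokes with Bezier fitting
        };

        // Persist the strokes in Ink Serialized Format when the window closes.
        Closed += (s, e) =>
        {
            using (var file = File.Create("notes.isf"))
                _ink.Strokes.Save(file);
        };
    }
}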
Microsoft also provides the RealTimeStylus API, see this tutorial.
