WPF/WinForm not receiving touch events - c#

I'm trying to create a fullscreen WPF application and to implement an Android-like navigation drawer.
To accomplish this I have to detect a swipe coming from off-screen, just like the Windows 8 charm bar.
But Windows does not send any touch/mouse events to the application if the swipe starts off-screen (a global hook isn't working either).
How can I detect those swipes starting off-screen, just as Windows does for the charm bar?
In case it matters, I'm using a Surface Pro 3 with Windows 8.1.

You need to add manipulation events on your screen/window to detect whether a swipe takes place on your screen. There is no way to receive that event directly while the gesture is still off-screen.
Referring to this link may help you.
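A minimal sketch of the manipulation-event approach (the edge threshold, swipe distance, and window wiring are assumptions, not from the original answer):

```csharp
using System.Windows;
using System.Windows.Input;

// Sketch: detect a horizontal swipe that begins near the left edge of the window.
public partial class MainWindow : Window
{
    private const double EdgeThreshold = 20;   // px from the left edge (assumed)
    private const double SwipeDistance = 80;   // px of horizontal travel (assumed)

    public MainWindow()
    {
        InitializeComponent();
        IsManipulationEnabled = true;
        ManipulationStarting += (s, e) => e.ManipulationContainer = this;
        ManipulationDelta += OnManipulationDelta;
    }

    private void OnManipulationDelta(object sender, ManipulationDeltaEventArgs e)
    {
        Point origin = e.ManipulationOrigin;
        if (origin.X <= EdgeThreshold &&
            e.CumulativeManipulation.Translation.X >= SwipeDistance)
        {
            // Open the navigation drawer here.
            e.Complete();
        }
    }
}
```

Note this only fires once the gesture has entered the window's client area; the portion of the swipe that happens off-screen is never delivered to the app.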

Related

Handle Screen Off/On Event

I am searching for an event that tells me whether the screen is turned on or off. In Android there is this nice BroadcastReceiver. Is there anything like that in UWP apps?
Kind regards
No, it's system-level behavior and there is no bridge like BroadcastReceiver to deliver such a message into the app. There used to be one for Windows Phone Silverlight, but it is not available in Windows universal apps.
If you simply want to keep the screen on, try this: RequestActive | requestActive method
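A minimal sketch of the RequestActive approach, using the UWP DisplayRequest API (the wrapper class and its wiring are assumptions):

```csharp
using Windows.System.Display;

// Sketch: keep the display on while writing is in progress; release when done.
public sealed class ScreenKeeper
{
    private readonly DisplayRequest _request = new DisplayRequest();
    private bool _active;

    public void KeepScreenOn()
    {
        if (_active) return;
        _request.RequestActive();   // ask the system not to turn the screen off
        _active = true;
    }

    public void AllowScreenOff()
    {
        if (!_active) return;
        _request.RequestRelease();  // balance every RequestActive with a release
        _active = false;
    }
}
```

Every RequestActive call should be balanced with a RequestRelease, or the display will stay on until the app exits.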

Kinect for Windows v2 hand cursor outside WPF windows

I would like to write an application where you can use your hand instead of the mouse. I wrote a small WPF app that uses a hand cursor, and it works well.
But I want to use this control outside the WPF window, just like a normal mouse cursor. Is it possible to write this with Kinect for Windows v2?
What you need is a service, not a WPF app, so that it can keep processing Kinect input in the background, not only while your app is active and in the foreground.
You need to emit mouse move/click events in order to use it globally across the whole Windows OS. Here is one example on MSDN: https://social.msdn.microsoft.com/Forums/vstudio/en-US/1ea09f18-94f6-4f4f-bcba-d02da27beaa4/control-mouse-position-and-generate-click-from-program-c-winforms-aim-control-pc-from-serial?forum=csharpgeneral
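A minimal sketch of emitting global mouse events via P/Invoke (SetCursorPos and mouse_event are real Win32 APIs in user32.dll; mapping Kinect hand-joint coordinates to screen coordinates is left out and would be your own code):

```csharp
using System;
using System.Runtime.InteropServices;

// Sketch: drive the system-wide mouse cursor from tracked hand positions.
public static class GlobalMouse
{
    [DllImport("user32.dll")]
    private static extern bool SetCursorPos(int x, int y);

    [DllImport("user32.dll")]
    private static extern void mouse_event(uint dwFlags, uint dx, uint dy,
                                           uint dwData, UIntPtr dwExtraInfo);

    private const uint MOUSEEVENTF_LEFTDOWN = 0x0002;
    private const uint MOUSEEVENTF_LEFTUP   = 0x0004;

    // Moves the real cursor; works everywhere, not just inside your window.
    public static void MoveTo(int screenX, int screenY) =>
        SetCursorPos(screenX, screenY);

    // Simulates a left click (e.g. on a "hand closed" gesture) at the
    // current cursor position.
    public static void LeftClick()
    {
        mouse_event(MOUSEEVENTF_LEFTDOWN, 0, 0, 0, UIntPtr.Zero);
        mouse_event(MOUSEEVENTF_LEFTUP, 0, 0, 0, UIntPtr.Zero);
    }
}
```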

Writing on InkCanvas with stylus but not with finger

I have a WPF application running on a Windows 8 tablet that captures user writing. I use the InkCanvas control to capture the data, but I want the user to be able to write only with the stylus, not a finger.
Moreover, this application can also run on a desktop PC, and in that case I would like to capture the writing from the mouse.
Is there any way to filter the "input mode" of the InkCanvas? I have been searching but haven't found a way to do it.
Many thanks in advance.
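One possible approach is a sketch under the assumption that WPF reports finger input as stylus events whose tablet device type is Touch, while mouse input never raises stylus events at all (so mouse writing still reaches the InkCanvas). The `myInkCanvas` name is hypothetical:

```csharp
using System.Windows;
using System.Windows.Input;

// Sketch: block finger input on an InkCanvas while keeping stylus and mouse.
public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();
        myInkCanvas.PreviewStylusDown += BlockTouch;
        myInkCanvas.PreviewStylusMove += BlockTouch;
    }

    private void BlockTouch(object sender, StylusEventArgs e)
    {
        // Finger input arrives as stylus events with TabletDeviceType.Touch;
        // swallow those before the InkCanvas collects any ink.
        if (e.StylusDevice.TabletDevice.Type == TabletDeviceType.Touch)
            e.Handled = true;
    }
}
```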

Tap and Drag gesture in Windows Phone without XNA

I want to know if there's some kind of gesture for Windows Phone that allows me to tap on a Button and then drag to another one and tap it, of course without the XNA framework, because this is not a game and I don't want to use it.
For example, I have two buttons "A" and "B": if I tap on A and then drag my finger (still on the screen) onto B, then B should be selected along with A.
I hope everybody understands my problem. Of course I'm doing all this with C# and XAML for Windows Phone 8.
Thanks for the answers. Cheers.
You can use the DragStarted and DragCompleted events from Microsoft.Phone.Controls.GestureListener available in Phone Toolkit. Sample code: http://www.scottlogic.co.uk/blog/colin/2012/06/a-gesture-driven-windows-phone-todo-application-part-two-drag-re-ordering/
I also prefer using the already-implemented gestures that Microsoft provides with the Windows Phone Toolkit. It's probably the best implementation of the standard gestures you can get.
And as @Arun already showed, there are existing drag-and-drop implementations.
But if this is not sufficient for you, you can always create your own gestures based on the raw events that every UIElement has. See:
http://msdn.microsoft.com/en-us/library/ms604577%28v=vs.95%29.aspx
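A minimal sketch with GestureListener from the Windows Phone Toolkit (`ButtonA`, `ButtonB`, and `LayoutRoot` are hypothetical names taken from the question's example):

```csharp
using System.Windows;
using Microsoft.Phone.Controls;

// Sketch: a drag starting on ButtonA selects A, and also selects B
// if the finger lifts while over ButtonB.
var listener = GestureService.GetGestureListener(ButtonA);

listener.DragStarted += (s, e) =>
{
    ButtonA.Tag = "selected";       // mark A as soon as the drag begins
};

listener.DragCompleted += (s, e) =>
{
    // Position of the lift point relative to ButtonB.
    Point p = e.GetPosition(ButtonB);
    bool overB = p.X >= 0 && p.Y >= 0 &&
                 p.X <= ButtonB.ActualWidth && p.Y <= ButtonB.ActualHeight;
    if (overB)
        ButtonB.Tag = "selected";   // finger ended over B, so select B too
};
```

Here `Tag` just stands in for whatever selection state your app actually keeps.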

How can I summon the touch keyboard from a Windows 7 tablet, through winforms?

Basically I just need to be able to summon the keyboard whenever a user touches a text box in my application, instead of them having to go to the taskbar and request the keyboard themselves.
Does anybody have any idea how to do this? Is there a specific touch API I can tap into? My application is currently written in WinForms and C#.
Many thanks,
Christian
Process.Start(@"C:\windows\system32\osk.exe");
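Wired into a WinForms text box, that looks like the sketch below. On tablets you may prefer TabTip.exe, the touch keyboard, over the classic osk.exe; the TabTip path shown in the comment assumes a default installation:

```csharp
using System.Diagnostics;
using System.Windows.Forms;

// Sketch: launch the on-screen keyboard whenever the text box gains focus.
public class KeyboardForm : Form
{
    public KeyboardForm()
    {
        var box = new TextBox { Top = 20, Left = 20, Width = 200 };
        box.Enter += (s, e) =>
        {
            // osk.exe is the classic on-screen keyboard; the tablet touch
            // keyboard lives at (path assumed):
            // @"C:\Program Files\Common Files\microsoft shared\ink\TabTip.exe"
            Process.Start(@"C:\windows\system32\osk.exe");
        };
        Controls.Add(box);
    }
}
```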
