WPF with MVVM Light handling mouse and touch - C#

Currently I have to develop a very simple WPF user control that allows the user to select several points on a canvas. The difficulty I encounter is that users with a touch screen should be able to do so by triggering a TouchDown event, whereas users without a touch screen should use the mouse and thus trigger a MouseLeftButtonDown event. Is there a simple way to handle these two cases without duplicating code? Also, I need to use MVVM Light, so code-behind solutions like How to get Touchscreen to use Mouse Events instead of Touch Events in C# app won't do the trick.

Your linked question provides an answer for you, whether you are using MVVM or not. Using MVVM does not mean that you cannot handle UI control events. It just means that you should write an Attached Property to handle them for you. So, your answer is yes, you can handle the two events together and in almost the same way as your linked page suggests.
Your only difference is that the handler must be attached to the events in an Attached Property. Rather than go over the whole story once again here, I'll just briefly explain the procedure and request that you view my answer from the What's the best way to pass event to ViewModel? question for a code example.
First, declare your Attached Property with its getter and setter and make sure it has a PropertyChangedCallback registered. That callback is where you attach your single handler to both events (the linked code example attaches just one event). This means the handler is only attached once your Attached Property is set. Finally, add the single handler that deals with both events.
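To make the shape of that concrete, here is a minimal sketch (not MVVM Light-specific; the PointerInput class name and Command property are placeholders I've made up) of an Attached Property that wires both MouseLeftButtonDown and TouchDown to one shared handler and forwards the position to a bound ICommand:

```csharp
using System.Windows;
using System.Windows.Input;

// Illustrative only: the class and property names are placeholders.
public static class PointerInput
{
    public static readonly DependencyProperty CommandProperty =
        DependencyProperty.RegisterAttached(
            "Command", typeof(ICommand), typeof(PointerInput),
            new PropertyMetadata(null, OnCommandChanged));

    public static ICommand GetCommand(DependencyObject obj)
    {
        return (ICommand)obj.GetValue(CommandProperty);
    }

    public static void SetCommand(DependencyObject obj, ICommand value)
    {
        obj.SetValue(CommandProperty, value);
    }

    // Attach or detach the single shared handler when the property value changes.
    private static void OnCommandChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
    {
        var element = d as UIElement;
        if (element == null)
            return;

        if (e.OldValue != null)
        {
            element.MouseLeftButtonDown -= OnPointerDown;
            element.TouchDown -= OnPointerDown;
        }
        if (e.NewValue != null)
        {
            element.MouseLeftButtonDown += OnPointerDown;
            element.TouchDown += OnPointerDown;
        }
    }

    // One handler for both input types: resolve the position and pass it to the command.
    private static void OnPointerDown(object sender, InputEventArgs e)
    {
        var element = (UIElement)sender;

        Point position;
        var touch = e as TouchEventArgs;
        if (touch != null)
        {
            position = touch.GetTouchPoint(element).Position;
            // Mark the touch event handled so WPF does not also promote it to a
            // mouse event and run this handler a second time.
            e.Handled = true;
        }
        else
        {
            position = ((MouseButtonEventArgs)e).GetPosition(element);
        }

        var command = GetCommand(element);
        if (command != null && command.CanExecute(position))
            command.Execute(position);
    }
}
```

In the view you would then set something like local:PointerInput.Command="{Binding SelectPointCommand}" on the canvas, where SelectPointCommand could be an MVVM Light RelayCommand&lt;Point&gt; on your view model (again, hypothetical names).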

Related

How to hook the event or the changing of the windows in a WPF application?

Hi everyone, hope you are well.
Here is the requirement: I have a C# WPF application, and I would like to track the events in this app.
I found a mouse hook at 'https://github.com/justcoding121/Windows-User-Action-Hook', but it hooks global events rather than the events inside the app; it cannot hook the events within the WPF application itself.
I'm not a native speaker, so please forgive any mistakes.
Thank you.
Two things you need to be aware of before we can get to the answer itself:
1) In WPF, everything except the window itself is artificial -- all UI elements are drawn via DirectX and not the operating system (which is why, for example, in WPF you can put a button into another button but in Windows Forms you can't). So there isn't a way for the operating system to somehow know about mouse events within the app.
2) Almost all events in WPF are routed -- that means that when a control raises an event, it can actually be handled in multiple places, typically on that control first and then up the tree to and including the window. Controls up (or down) the tree can handle the event even if it is not declared on them (for example, Click is not declared on Window, but a window can still indirectly respond to clicks within itself).
That out of the way, you can use events either normally (for example, this.MouseMove += MyHandler; in the window constructor or <Window MouseDown="DownHandler" ...> in XAML) or through the routed events syntax (this.AddHandler(Button.ClickEvent, MyHandler); in C# or <Window Button.Click="ClickHandler" ...> in XAML). The latter is usually needed only when you handle an event that is not declared on the element (like Click on a Window).
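A minimal sketch of both hookup styles (the window and handler names are placeholders), including the overload that also sees events other controls have already handled, which is useful for app-wide tracking:

```csharp
using System.Windows;
using System.Windows.Controls;
using System.Windows.Input;

public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();

        // Normal hookup: MouseMove is declared on Window (via UIElement).
        this.MouseMove += OnMouseMove;

        // Routed-event hookup: Click is declared on ButtonBase, not on Window,
        // but because it bubbles, the window can still listen for it.
        this.AddHandler(Button.ClickEvent, new RoutedEventHandler(OnAnyButtonClick));

        // To track events even after a control has marked them handled,
        // use the overload with handledEventsToo.
        this.AddHandler(UIElement.MouseDownEvent,
                        new MouseButtonEventHandler(OnAnyMouseDown),
                        handledEventsToo: true);
    }

    private void OnMouseMove(object sender, MouseEventArgs e)
    {
        // e.Source tells you which element the event originated from.
    }

    private void OnAnyButtonClick(object sender, RoutedEventArgs e)
    {
        // Fires for any Button clicked anywhere inside this window.
    }

    private void OnAnyMouseDown(object sender, MouseButtonEventArgs e)
    {
        // Fires for every mouse-down in the window, handled or not.
    }
}
```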

Tab Control not firing system events

This is for WinForms.
I am having a strange problem that is driving me absolutely batty. I have a tab control for which standard system events, and only standard system events, are not firing. The specific event I was trying to get to fire was the TabIndexChanged event. It doesn't matter whether I add this programmatically or with the designer.
Note: Mouse events will fire. Keypress events will fire. All other events that I have tried will fire.
System events on OTHER controls will fire.
It isn't a single tab control that is having a problem either. If I drag a new tab control onto the form it will also have this problem.
I do not really have any code to show here because it would just be the event as generated by the designer and a Console.WriteLine message to see if it is firing (this line outputs for other events). What I am hoping for is some insight as to what could cause this problem.
The entire program is quite large, so I cannot really clip the whole thing into this forum so that individuals can hunt for a specific problem. My hope is that somebody might be able to point me to what might cause this behavior. I am thinking that maybe something got screwed up when editing in design mode, but I just do not know what to look for. I am relatively new to C# and programming is a hobby for me.
Thank you for your time,
FC
You need the SelectedIndexChanged event rather than TabIndexChanged. To find it in the designer, with the Properties window open, click once outside the tab container, then click one of the tabs once (don't click inside the tab page area). Switch to the Events view of the Properties window, find SelectedIndexChanged, type in a handler name, and then double-click it.
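If you would rather skip the designer entirely, a minimal sketch of wiring it in code looks like this (tabControl1 is just the default designer name, assumed here):

```csharp
using System;
using System.Windows.Forms;

public partial class Form1 : Form
{
    public Form1()
    {
        InitializeComponent();

        // Fires whenever the user switches to a different tab.
        tabControl1.SelectedIndexChanged += TabControl1_SelectedIndexChanged;
    }

    private void TabControl1_SelectedIndexChanged(object sender, EventArgs e)
    {
        Console.WriteLine("Selected tab index: " + tabControl1.SelectedIndex);
    }
}
```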

Tracking mouse events not related to controls in C#

I need to write event handlers for tracking mouse movements, but the mouse-down and mouse-up events are not related to any Control object. What is the generic way to do this? While researching this question on Stack Overflow, I found a lot of information about mouse events related to specific controls, such as the PictureBox, but I couldn't find anything about generic mouse listeners in C#. I did find a much older question, but it wasn't related to C#.

Detecting UserControl User Custom Cursor, and Getting Callbacks

I am creating a fullscreen demo application (demo = not production, so hacky code is okay though not preferred) for the Kinect SDK. The application hides the Windows cursor and shows a custom hand cursor which is defined as an object.
What I would like to do is create a custom UserControl (let's call it "HoverControl") that can detect when the cursor object is over it and then send back timer ticks, allowing the cursor object to update in some way (showing the user that something is about to happen).
The behavior is pretty much a copy of the Xbox 360 Kinect behavior. How things look will just be a little different.
How can I detect when the cursor object is over a "HoverControl" and receive a callback from the HoverControl?
Thank you for any help or suggestions!
CLARIFICATION:
I am not currently moving the Windows cursor, so MouseEnter doesn't fire.
You can use your own cursor by making one using Online Cursor Maker. See this website on how to set it. Then you can use MouseEnter and/or MouseLeave.
I ended up coding my own cursor in XAML and creating a UserControl out of it. Inside that control I set up a timer that performs a hit test against certain buttons (again, each its own unique UserControl type) around my interface.
I ran into one issue with the hit test, which I was ultimately able to solve and detail in the following post at MSDN:
http://social.msdn.microsoft.com/Forums/en-US/wpf/thread/a8cdb265-21cc-4fd0-b40d-e6778b659852
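For anyone wanting the shape of that approach, here is a rough sketch of the timer-driven hit test; the control names and the NotifyCursorOver callback are made up for illustration, and the cursor visual likely needs IsHitTestVisible="False" so the hit test doesn't simply return the cursor itself:

```csharp
using System;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Media;
using System.Windows.Threading;

// Made-up hover target: exposes a method the cursor control calls when it is over it.
public class HoverControl : UserControl
{
    public event EventHandler CursorOver;

    public void NotifyCursorOver()
    {
        var handler = CursorOver;
        if (handler != null)
            handler(this, EventArgs.Empty);
    }
}

// Made-up hand-cursor control that polls on a timer and hit tests the visual tree.
public class HandCursorControl : UserControl
{
    private readonly DispatcherTimer _hitTestTimer =
        new DispatcherTimer { Interval = TimeSpan.FromMilliseconds(100) };

    public void StartHitTesting(UIElement rootVisual)
    {
        _hitTestTimer.Tick += delegate
        {
            // Where is this cursor control, expressed in the root visual's coordinates?
            Point cursorPosition = TranslatePoint(new Point(0, 0), rootVisual);

            // Set IsHitTestVisible="False" on this cursor control so the hit test
            // below doesn't just keep returning the cursor itself.
            HitTestResult result = VisualTreeHelper.HitTest(rootVisual, cursorPosition);
            if (result == null)
                return;

            // Walk up the visual tree to see whether the hit landed inside a HoverControl.
            DependencyObject hit = result.VisualHit;
            while (hit != null && !(hit is HoverControl))
                hit = VisualTreeHelper.GetParent(hit);

            var hover = hit as HoverControl;
            if (hover != null)
                hover.NotifyCursorOver();
        };
        _hitTestTimer.Start();
    }
}
```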

XNA - Passing Information Between GameScreens

So, here's what I'm trying to do:
I'm making a game for practice using the GameStateManagement example from creators.xna.com.
The whole of the system is too much to explain here, so experience with the sample is kind of necessary.
The gist of it is that there are multiple game screens that overlay one another. One contains a board, overlayed above it is a few card images. The end goal is to be able to drag-and-drop the card image onto the table (or click it, the method isn't really the problem).
The actual question, then, is how can I pass data from one screen (the overlayed cards) to the other (the table).
I considered that it might be possible to create an event that gets fired when I drag-and-drop a card or click it or what have you, and then register an event handler on the other screen to handle it.
I, however, have pretty much no experience with custom events in C#. I'm not sure if all fired events go into a big "event pool" somewhere and handlers can pick them up regardless of where they are or how that works. I ask because the table screen and the overlayed screen are separate classes that both deal with drawing and updating, not with sending information around to one another.
Hopefully this is enough information to outline the problem decently. Any tips/advice are welcomed.
Thanks!
After some more experimenting, it looks like events are the way to go.
Without knowing the details of how events work, it seems that a fired event can be picked up from just about anywhere, so long as you register to receive it.
So, the solution to my problem was to create an "OnDrag" event in my "Hand Screen" and then register a handler for it in the "Table Screen."
This resource helped me get going with the basic custom event handling: Writing C# Custom Events
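For anyone landing here later, this is roughly what that looks like; HandScreen, TableScreen and CardDragged are placeholder names, and the GameScreen base class would come from the GameStateManagement sample:

```csharp
using System;

// Carries whatever the table needs to know about the dragged card.
public class CardDraggedEventArgs : EventArgs
{
    public int CardId { get; private set; }

    public CardDraggedEventArgs(int cardId)
    {
        CardId = cardId;
    }
}

// The screen that owns the cards raises the event...
public class HandScreen // : GameScreen (from the GameStateManagement sample)
{
    public event EventHandler<CardDraggedEventArgs> CardDragged;

    protected void OnCardDragged(int cardId)
    {
        // There is no global event pool: only objects that subscribed to this
        // specific HandScreen instance will be called back.
        var handler = CardDragged;
        if (handler != null)
            handler(this, new CardDraggedEventArgs(cardId));
    }
}

// ...and the table screen subscribes when both screens are created.
public class TableScreen // : GameScreen
{
    public TableScreen(HandScreen handScreen)
    {
        handScreen.CardDragged += HandleCardDragged;
    }

    private void HandleCardDragged(object sender, CardDraggedEventArgs e)
    {
        // Place the card identified by e.CardId on the table.
    }
}
```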
