Mouse gestures can be bound to commands using the MouseBinding InputBinding,
for example:
<Grid.InputBindings>
    <MouseBinding Command="{Binding MyCommand}" Gesture="LeftClick"/>
</Grid.InputBindings>
In that example, the LeftClick gesture is used. What is the full list of gesture strings? I'm looking for a left mouse button down gesture, if it exists.
That is a MouseAction value. The possible values are None, LeftClick, RightClick, MiddleClick, WheelClick, LeftDoubleClick, RightDoubleClick and MiddleDoubleClick. A plain mouse-button-down is not a built-in gesture; only the various clicks and double clicks are in the enumeration.
It is possible to make your own input bindings by creating classes that extend InputBinding and InputGesture. You can reference the implementation of MouseBinding for an example. Alternatively, you can find a different way to accomplish whatever it is you are trying to do.
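If you really do want a gesture that matches the button-down itself, the custom-gesture route could look roughly like the sketch below. This is only an illustration of the extension point (a MouseGesture subclass with its own Matches logic), not a drop-in solution; whether the input system routes the MouseDown event through your element's InputBindings in your particular scenario is something you should verify.
using System.Windows.Input;

// Sketch of a custom gesture that matches a left-button-down event.
public class LeftButtonDownGesture : MouseGesture
{
    public override bool Matches(object targetElement, InputEventArgs inputEventArgs)
    {
        // Match only the press (not the release) of the left mouse button.
        return inputEventArgs is MouseButtonEventArgs args
               && args.ChangedButton == MouseButton.Left
               && args.ButtonState == MouseButtonState.Pressed;
    }
}
Because a custom gesture has no string form, you attach it with property-element syntax (here local maps to the namespace containing the gesture class):
<Grid.InputBindings>
    <MouseBinding Command="{Binding MyCommand}">
        <MouseBinding.Gesture>
            <local:LeftButtonDownGesture/>
        </MouseBinding.Gesture>
    </MouseBinding>
</Grid.InputBindings>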
I'm looking for a left mouse button down gesture, if it exists.
That would be the LeftClick mouse action that you are currently using.
If you want to invoke a command when the MouseLeftButtonDown event occurs, you could do this using an interaction trigger:
<i:Interaction.Triggers>
    <i:EventTrigger EventName="MouseLeftButtonDown">
        <i:InvokeCommandAction Command="{Binding MyCommand}"/>
    </i:EventTrigger>
</i:Interaction.Triggers>
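For completeness, the MyCommand being bound is just an ICommand exposed by the view model. A minimal sketch using a hand-rolled RelayCommand (not part of WPF itself; most MVVM libraries ship an equivalent type, and the names here are illustrative) might look like this:
using System;
using System.Windows.Input;

// Minimal ICommand helper that delegates to the supplied callbacks.
public class RelayCommand : ICommand
{
    private readonly Action _execute;
    private readonly Func<bool> _canExecute;

    public RelayCommand(Action execute, Func<bool> canExecute = null)
    {
        _execute = execute ?? throw new ArgumentNullException(nameof(execute));
        _canExecute = canExecute;
    }

    // Let WPF re-query CanExecute whenever it re-evaluates commands.
    public event EventHandler CanExecuteChanged
    {
        add => CommandManager.RequerySuggested += value;
        remove => CommandManager.RequerySuggested -= value;
    }

    public bool CanExecute(object parameter) => _canExecute?.Invoke() ?? true;

    public void Execute(object parameter) => _execute();
}

public class MyViewModel
{
    public ICommand MyCommand { get; }

    public MyViewModel()
    {
        MyCommand = new RelayCommand(() => { /* handle the left button down */ });
    }
}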
Please refer to the following blog post for more information about this.
Handling events in an MVVM WPF application: https://blog.magnusmontin.net/2013/06/30/handling-events-in-an-mvvm-wpf-application/
The EventTrigger class is included in the Expression Blend SDK, which you can download from here: http://www.microsoft.com/en-us/download/details.aspx?id=10801.
I have a Prism application and I'm attempting to bind the PreviewMouseDown and PreviewMouseUp events of a button in my view to commands in my view model, but when I run the code an exception is thrown.
As a workaround, I'm currently binding to methods in the view and using a reference to the view's data context (the view model) to execute the command. This works, but it doesn't seem correct because the view now has knowledge of the view model.
What is the proper way to handle something like this?
You cannot bind events directly to commands the way you bind, for example, a button's Command property.
Luckily, for a button you don't need to, because it has the Command property. It even disables the button if the command returns false from CanExecute.
If you have something other than a button, or you need an event other than Click (such as MouseDown), you can use InvokeCommandAction (from Prism or from Interactivity)...
xmlns:i="http://schemas.microsoft.com/xaml/behaviors"
xmlns:prism="http://prismlibrary.com/"
<i:Interaction.Triggers>
    <i:EventTrigger EventName="MouseDown">
        <prism:InvokeCommandAction Command="{Binding MyCommand}"/>
        <!-- or -->
        <i:InvokeCommandAction Command="{Binding MyCommand}"/>
    </i:EventTrigger>
</i:Interaction.Triggers>
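On the view-model side, MyCommand could be a Prism DelegateCommand. A sketch (class and method names are mine), assuming the Prism variant of InvokeCommandAction, which passes the EventArgs as the parameter when no CommandParameter is set:
using System.Windows.Input;
using Prism.Commands;
using Prism.Mvvm;

public class MyViewModel : BindableBase
{
    // Typed on the event args so the handler sees the MouseButtonEventArgs
    // forwarded by prism:InvokeCommandAction.
    public DelegateCommand<MouseButtonEventArgs> MyCommand { get; }

    public MyViewModel()
    {
        MyCommand = new DelegateCommand<MouseButtonEventArgs>(OnMouseDown);
    }

    private void OnMouseDown(MouseButtonEventArgs args)
    {
        // React to the mouse down here.
    }
}
If you would rather not pass a UI type like MouseButtonEventArgs into the view model, Prism's InvokeCommandAction also has a TriggerParameterPath property that lets you pass a single property of the event args instead.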
I'm currently trying to implement an ImageViewer that can be panned and zoomed with touch input.
I already implemented these features in a recent project in the code-behind, but I'm struggling to do the same from a view model in MVVM.
The problem is that, for my code to work, I have to know how many touch inputs are recognized at the same time.
In my Code-Behind I used:
canvas.TouchesCaptured.Count()
The view model shouldn't know about any controls of the view, so passing the Canvas as a command parameter is not the way to go.
Besides the canvas, I also need the TouchEventArgs of the triggered touch event to determine the position of the touch on the canvas.
Using Prism I was able to get the TouchEventArgs into the ViewModel.
<i:Interaction.Triggers>
    <i:EventTrigger EventName="TouchDown">
        <prism:InvokeCommandAction Command="{Binding TouchDownCommand}"/>
    </i:EventTrigger>
</i:Interaction.Triggers>
For clarification: prism:InvokeCommandAction automatically passes the EventArgs as the command parameter when no CommandParameter is set.
To determine the position of the TouchEvent on the Canvas I need the canvas and the TouchEvent.
In my code-behind it looked like this:
startingPoint = e.GetTouchPoint(canvas);
Does anyone have an idea how I can solve this problem without violating the MVVM pattern?
You could try writing a Blend behavior that encapsulates the Canvas event handling and exposes commands for the events you care about (ManipulationDelta in particular). You could even add a property to the behavior that exposes the TouchesCaptured count (updated during the ManipulationDelta event).
e.g.
<Canvas>
    <i:Interaction.Behaviors>
        <bhv:CanvasBehavior ManipulationDeltaCommand="{Binding MyViewModelCommand}"
                            TouchPointCount="{Binding MyViewModelTouchPointCount}" />
    </i:Interaction.Behaviors>
</Canvas>
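A rough sketch of what such a CanvasBehavior might look like, assuming the Microsoft.Xaml.Behaviors package (or the older System.Windows.Interactivity, depending on what the i: namespace maps to in your project); the property names simply mirror the XAML above:
using System.Linq;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Input;
using Microsoft.Xaml.Behaviors;

public class CanvasBehavior : Behavior<Canvas>
{
    public static readonly DependencyProperty ManipulationDeltaCommandProperty =
        DependencyProperty.Register(nameof(ManipulationDeltaCommand), typeof(ICommand), typeof(CanvasBehavior));

    // Registered to bind two-way by default so the bound view-model property
    // receives the value whenever the behavior updates it.
    public static readonly DependencyProperty TouchPointCountProperty =
        DependencyProperty.Register(nameof(TouchPointCount), typeof(int), typeof(CanvasBehavior),
            new FrameworkPropertyMetadata(0, FrameworkPropertyMetadataOptions.BindsTwoWayByDefault));

    public ICommand ManipulationDeltaCommand
    {
        get => (ICommand)GetValue(ManipulationDeltaCommandProperty);
        set => SetValue(ManipulationDeltaCommandProperty, value);
    }

    public int TouchPointCount
    {
        get => (int)GetValue(TouchPointCountProperty);
        set => SetValue(TouchPointCountProperty, value);
    }

    protected override void OnAttached()
    {
        base.OnAttached();
        // Manipulation events only fire when manipulation is enabled on the element.
        AssociatedObject.IsManipulationEnabled = true;
        AssociatedObject.ManipulationDelta += OnManipulationDelta;
    }

    protected override void OnDetaching()
    {
        AssociatedObject.ManipulationDelta -= OnManipulationDelta;
        base.OnDetaching();
    }

    private void OnManipulationDelta(object sender, ManipulationDeltaEventArgs e)
    {
        // Expose how many touch devices the canvas currently holds
        // (e.Manipulators.Count() is an alternative source for this number).
        TouchPointCount = AssociatedObject.TouchesCaptured.Count();

        // Forward view-model-friendly data rather than the Canvas itself;
        // e.ManipulationOrigin is already relative to the manipulation container.
        if (ManipulationDeltaCommand?.CanExecute(e.DeltaManipulation) == true)
        {
            ManipulationDeltaCommand.Execute(e.DeltaManipulation);
        }
    }
}
This way the view model only ever sees the touch count and the manipulation data, never the Canvas.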
I am trying to write a XAML control for a piano keyboard in WPF which responds to NoteOn and NoteOff MIDI events from an external MIDI keyboard. I am using Tom Lokovic's midi-dot-net to raise NoteOn and NoteOff events triggered by the hardware, but I need a way to get these events to raise the NoteOn and NoteOff events of my XAML Key class (derived from the WPF Button). The colour of a key should change while it is on, and the events should be subscribable so that a user of the PianoKeyboard control can play a sound when a key is pressed.
I could do this by passing every Midi.InputDevice to every single key on the keyboard, so that each key can subscribe to the NoteOn and NoteOff events of every InputDevice and then, in turn, raise its own NoteOn and NoteOff events. The problem with this is that the PianoKeyboard control (an ItemsControl which holds Keys) and its nested Key controls all become tightly coupled to the implementation of midi-dot-net. If I have to do this I will, but it seems like there should be a better way in WPF of moving the dependency on midi-dot-net higher up in the call stack.
I have too much code to paste it in its entirety here and still be readable, so here's a sample of one of the DataTemplates I'm using as my PianoKeyboard's ItemTemplate.
<DataTemplate x:Key="naturalKeyTemplate">
    <!-- NOTE: Background and Foreground Color assignment and changing is accounted for in the style. -->
    <local:Key Grid.Column="{Binding Converter={StaticResource keyToColumnNumberConverter}}"
               Grid.ColumnSpan="{Binding Converter={StaticResource keyToColumnSpanConverter}}"
               Grid.RowSpan="2"
               Style="{StaticResource naturalKeyStyle}"
               Note="{Binding Note}"
               IsHighlighted="{Binding IsHighlighted}">
        <!-- TODO: Find a way of raising the Key's NoteOn and NoteOff events from here. -->
    </local:Key>
</DataTemplate>
Essentially what I'm asking is: given an input device that a WPF button does not support out of the box (the way it supports, say, a mouse click), how does one get that device to trigger the button without coupling a class derived from the button to that device's library?
You could raise the Click event programmatically:
button1.RaiseEvent(new RoutedEventArgs(ButtonBase.ClickEvent));
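The same idea works for custom routed events: if Key exposes NoteOn and NoteOff as routed events, whatever layer owns the midi-dot-net InputDevice can raise them without the keys themselves referencing the MIDI library. A sketch of how that could look (purely illustrative; your actual Key class will differ):
using System.Windows;
using System.Windows.Controls;

public class Key : Button
{
    // Routed events that anything (including a MIDI listener higher up
    // in the application) can raise on a key.
    public static readonly RoutedEvent NoteOnEvent =
        EventManager.RegisterRoutedEvent("NoteOn", RoutingStrategy.Bubble,
            typeof(RoutedEventHandler), typeof(Key));

    public static readonly RoutedEvent NoteOffEvent =
        EventManager.RegisterRoutedEvent("NoteOff", RoutingStrategy.Bubble,
            typeof(RoutedEventHandler), typeof(Key));

    public event RoutedEventHandler NoteOn
    {
        add => AddHandler(NoteOnEvent, value);
        remove => RemoveHandler(NoteOnEvent, value);
    }

    public event RoutedEventHandler NoteOff
    {
        add => AddHandler(NoteOffEvent, value);
        remove => RemoveHandler(NoteOffEvent, value);
    }

    // The MIDI layer calls these; the key itself stays MIDI-agnostic.
    public void RaiseNoteOn() => RaiseEvent(new RoutedEventArgs(NoteOnEvent, this));

    public void RaiseNoteOff() => RaiseEvent(new RoutedEventArgs(NoteOffEvent, this));
}
The component that subscribes to midi-dot-net (the only place that references it) would then look up the Key for the incoming note and call RaiseNoteOn/RaiseNoteOff, marshalling to the dispatcher thread if the MIDI callbacks arrive on a background thread.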
I think this can help you call a specific method based on the event that is raised.
<local:Key>
    <i:Interaction.Triggers>
        <i:EventTrigger EventName="PreviewMouseDown">
            <ei:CallMethodAction MethodName="NoteOn"
                                 TargetObject="{Binding}" />
        </i:EventTrigger>
        <i:EventTrigger EventName="PreviewMouseUp">
            <ei:CallMethodAction MethodName="NoteOff"
                                 TargetObject="{Binding}" />
        </i:EventTrigger>
    </i:Interaction.Triggers>
</local:Key>
You will need to add these xmlns declarations:
xmlns:i="http://schemas.microsoft.com/expression/2010/interactivity"
xmlns:ei="http://schemas.microsoft.com/expression/2010/interactions"
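For this to work, the methods named in MethodName must exist as public methods on the Key's DataContext. A minimal sketch of such a class (the name is hypothetical):
// CallMethodAction resolves the method by name on the TargetObject,
// so NoteOn/NoteOff must be public.
public class KeyViewModel
{
    public void NoteOn()
    {
        // Highlight the key / start the note.
    }

    public void NoteOff()
    {
        // Remove the highlight / stop the note.
    }
}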
I would have made a MIDI listener service singleton that listens to all MIDI events and updates a keyboard state repository, which in turn exposes a view model bound to your keyboard control (see the rough skeleton below)... but I can't really see your program structure, so it's not easy to say.
Binding the MIDI events in your XAML directly to the midi-dot-net wrapper does not seem like the right approach for a line-of-business application, because it couples the GUI to midi-dot-net...
By the way, are On/Off events enough, or do you also need a "pressed" state while a key is held down?
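To sketch the layering I have in mind (every name below is invented, and only the listener service would reference midi-dot-net):
using System;
using System.Collections.Generic;

// Holds the current on/off state of every key and notifies listeners of changes.
public class KeyboardStateRepository
{
    private readonly Dictionary<int, bool> _keyStates = new Dictionary<int, bool>();

    public event Action<int, bool> KeyStateChanged;

    public void SetKeyState(int noteNumber, bool isOn)
    {
        _keyStates[noteNumber] = isOn;
        KeyStateChanged?.Invoke(noteNumber, isOn);
    }
}

// The only class that knows about midi-dot-net: it translates hardware
// NoteOn/NoteOff events into repository updates.
public class MidiListenerService
{
    private readonly KeyboardStateRepository _repository;

    public MidiListenerService(KeyboardStateRepository repository)
    {
        _repository = repository;
        // Subscribe to the midi-dot-net InputDevice events here and call
        // _repository.SetKeyState(noteNumber, true/false) for each message.
    }
}
The keyboard view model would subscribe to KeyStateChanged and update the IsHighlighted flags that the Key controls bind to, so the GUI never touches midi-dot-net directly.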
Hooking into the storyboard's Completed event using code-behind is trivial, but I can't work out a good way of handling it using MVVM.
I'll answer my own question.
You can use Interaction.Triggers.
<i:Interaction.Triggers>
    <i:EventTrigger EventName="Completed">
        <i:InvokeCommandAction Command="{Binding CompletedCommand}" />
    </i:EventTrigger>
</i:Interaction.Triggers>
One approach that comes to mind: in addition to the EventTrigger that runs the storyboard animation, you could also add a DataTrigger. The DataTrigger would watch the same property being animated, with its trigger value set to the value that property has when the animation completes. If the control is your own custom control (or a UserControl), you can create a new dependency property, such as HasCompletedAnimation, on it and bind that to the view model. If the control is neither a custom control nor a UserControl, you can try using the control's Tag property instead.
Is it possible to create a command behavior using Prism's CommandBehaviorBase class for Silverlight's Grid? I know that it is only intended for actual controls, so I was wondering if anyone might know of a workaround. I would like to create an attachable mouse-over behavior for a grid that executes a specific command, and would ideally like to use Prism for this; I just can't seem to use CommandBehaviorBase with a Grid.
Thanks.
The arguably easier way to achieve this is to use triggers. It doesn't require you to write any code; all you have to do is this:
<Grid>
    <i:Interaction.Triggers>
        <i:EventTrigger EventName="MouseEnter">
            <si:InvokeDataCommand Command="{Binding DoSomethingCommand}"/>
        </i:EventTrigger>
    </i:Interaction.Triggers>
    ...
</Grid>
Here the DoSomethingCommand (defined in a view model) will be executed when the MouseEnter event is fired on the Grid.