I have a C# WinForms application that makes use of the Windows 8 keyboard.
I open the keyboard by launching tabtip.exe.
I am able to close the keyboard using a PostMessage command like this:
[DllImport("user32.dll", SetLastError = true)]
static extern IntPtr FindWindow(string lpClassName, string lpWindowName);
[DllImport("user32.dll")]
static extern bool PostMessage(IntPtr hWnd, uint Msg, IntPtr wParam, IntPtr lParam);
public static void HideOnScreenKeyboard()
{
    const uint WM_SYSCOMMAND = 0x0112; // 274
    const uint SC_CLOSE = 0xF060;      // 61536
    IntPtr keyboardWnd = FindWindow("IPTip_Main_Window", null);
    PostMessage(keyboardWnd, WM_SYSCOMMAND, (IntPtr)SC_CLOSE, IntPtr.Zero);
}
I think that using PostMessage it should be possible to simulate almost anything programmatically if you just pass the correct values.
The values used for closing the keyboard (274 and 61536) I just found on the internet.
It looks like it is possible to grab these values using Spy++ or some other tool, but I can't figure out how to do this.
Can anybody tell me the values needed to simulate a press on the &123 key, so the keyboard switches to the numeric keyboard?
Or, does anybody know how to get these values?
I have tried Spy++, but so many messages are passing constantly that I don't know where to look.
Look at the image of the OnScreenKeyboard to see what key I mean
You could try to use SendInput to simulate a mouse click event on the &123 button of the virtual keyboard window.
Below is an example of how to use SendInput to send a mouse click (left down + left up) to the button, but I haven't included the code to programmatically find the window and get its size.
[StructLayout(LayoutKind.Sequential)]
public struct INPUT
{
    internal uint type;
    internal int dx;
    internal int dy;
    internal uint mouseData;
    internal uint dwFlags;
    internal uint time;
    internal IntPtr dwExtraInfo;
    internal static int Size
    {
        get { return Marshal.SizeOf(typeof(INPUT)); }
    }
}
[DllImport("user32.dll", SetLastError = true)]
static extern uint SendInput(uint nInputs, INPUT[] pInputs, int cbSize);
const uint INPUT_MOUSE = 0;
const uint MOUSEEVENTF_MOVE = 0x0001;
const uint MOUSEEVENTF_LEFTDOWN = 0x0002;
const uint MOUSEEVENTF_LEFTUP = 0x0004;
const uint MOUSEEVENTF_ABSOLUTE = 0x8000;
// Note: this flattened layout matches the native INPUT struct in a 32-bit
// process; a 64-bit build needs the mouse fields at the union offset.
// Programmatically determine the position and size of the TabTip window,
// then compute the location of the center of the &123 key.
// With MOUSEEVENTF_ABSOLUTE, dx/dy are normalized coordinates in the range
// 0..65535 across the primary screen, so scale the pixel position,
// e.g. pixelX * 65535 / Screen.PrimaryScreen.Bounds.Width.
int coordinateX = ...
int coordinateY = ...
var pInputs = new[] {
    new INPUT() {
        type = INPUT_MOUSE,
        dx = coordinateX,
        dy = coordinateY,
        dwFlags = MOUSEEVENTF_MOVE | MOUSEEVENTF_ABSOLUTE | MOUSEEVENTF_LEFTDOWN,
        time = 0,
        dwExtraInfo = IntPtr.Zero
    },
    new INPUT() {
        type = INPUT_MOUSE,
        dx = coordinateX,
        dy = coordinateY,
        dwFlags = MOUSEEVENTF_MOVE | MOUSEEVENTF_ABSOLUTE | MOUSEEVENTF_LEFTUP,
        time = 0,
        dwExtraInfo = IntPtr.Zero
    }
};
SendInput((uint)pInputs.Length, pInputs, INPUT.Size);
Why don't you set the input scope of the edit control to numeric? New built-in edit controls with the correct properties automatically trigger the numeric mode when touched.
Rather than hacking the Touch Input Panel, which appears differently in different locales, set the InputScope of the text box to Number and let Windows do the magic.
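In WPF, for instance, that is a single property on the control (the element name here is just an example):

```xml
<!-- InputScope="Number" hints the touch keyboard into its numeric layout -->
<TextBox x:Name="QuantityBox" InputScope="Number" />
```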
Related
Let me first explain my current situation and why I think I need this. It may very well be the case that I am handling this completely the wrong way and am thus open to suggestions.
We have a C# program which uses WinForms and only has an embedded System.Windows.Forms.WebBrowser which shows a website with an embedded Java applet.
Our users have several of these programs open at a time. The Java applet sometimes needs some time to calculate data, and the user meanwhile just uses one of his other windows. When the Java applet finishes, it sets focus to itself and the whole window pops on top, which interrupts the other task the user was doing. We don't have access to the Java applet's source and can't modify it in any way. Calling a JavaScript function which was hooked up through COM to the WebBrowser has the same effect, by the way.
To counter this we created the event "Deactivate" on the whole form. When called it sets the embedded WebBrowser.Enabled = false. In the corresponding event "Activated" the WebBrowser gets enabled again.
This works really nice: windows don't pop on top anymore just because the java applet wants to set the focus to itself while in the background.
The problem we now have is that when a user clicks on a deactivated window the window gets activated but the mouse click doesn't get forwarded to the WebBrowser. So for example a user has to click twice to press a button.
So I think my question is how to forward the mouse click which activated the window to the WebBrowser.
Thanks in advance
Markus
I think I found a solution. It works but I don't like it:
[DllImport( "user32.dll", CharSet = CharSet.Auto, CallingConvention = CallingConvention.StdCall )]
public static extern void mouse_event( uint dwFlags, uint dx, uint dy, uint cButtons, uint dwExtraInfo );
private const int MOUSEEVENTF_LEFTDOWN = 0x02;
private const int MOUSEEVENTF_RIGHTDOWN = 0x08;
protected override void WndProc( ref Message m )
{
// If the panel which contains the WebBrowser was disabled, the messages WM_LBUTTONDOWN and WM_RBUTTONDOWN will not reach the panel
// but will reach the window instead. We then resend them.
if( m.Msg == 0x0201 ) // WM_LBUTTONDOWN
{
short x, y;
MouseCoordsFromMessage( m, out x, out y );
mouse_event( MOUSEEVENTF_LEFTDOWN, (uint)x, (uint)y, 0, 0 );
}
else if( m.Msg == 0x0204 ) // WM_RBUTTONDOWN
{
short x, y;
MouseCoordsFromMessage( m, out x, out y );
mouse_event( MOUSEEVENTF_RIGHTDOWN, (uint)x, (uint)y, 0, 0 );
}
else
{
base.WndProc( ref m );
}
}
private static void MouseCoordsFromMessage( Message m, out short x, out short y )
{
x = unchecked( (short)(long)m.LParam );
y = unchecked( (short)( (long)m.LParam >> 16 ) );
}
I am not sure how fragile this is and would like to hear other opinions.
I have a multithreaded application that needs to be able to perform multiple mouse clicks at the same time.
I have an IntPtr to a window of a process to which I need to send a mouse click.
I have tried to find this information on the web and there are some examples, which I have tried, but I have not got any of them to work.
As I understand it, the correct way to solve my issue is to use the function
SendMessage(IntPtr hWnd, int Msg, IntPtr wParam, IntPtr lParam);
hWnd is the IntPtr to the window.
Msg is the wanted action; I want a left click: int WM_LBUTTONDBLCLK = 0x0203;
wParam is of no interest to this problem (as I understand it).
And the coordinates of the click go in lParam.
I construct lParam like this:
Int32 word = MakeLParam(x, y);
private int MakeLParam(int LoWord, int HiWord)
{
return ((HiWord << 16) | (LoWord & 0xffff));
}
But as you might understand, I can't get this to work.
My first question is: are the coordinates relative to the window of this process, or are they absolute screen coordinates?
And my second question: what am I doing wrong?
I was trying to simulate mouse clicks in C# just recently, I wrote this little helper class to do the trick:
using System.Drawing;
using System.Runtime.InteropServices;
using System.Windows.Forms;
public static class SimInput
{
    [DllImport("user32.dll")]
    static extern void mouse_event(uint dwFlags, uint dx, uint dy, uint dwData, UIntPtr dwExtraInfo);
[Flags]
public enum MouseEventFlags : uint
{
Move = 0x0001,
LeftDown = 0x0002,
LeftUp = 0x0004,
RightDown = 0x0008,
RightUp = 0x0010,
MiddleDown = 0x0020,
MiddleUp = 0x0040,
Absolute = 0x8000
}
public static void MouseEvent(MouseEventFlags e, uint x, uint y)
{
mouse_event((uint)e, x, y, 0, UIntPtr.Zero);
}
public static void LeftClick(Point p)
{
LeftClick((double)p.X, (double)p.Y);
}
public static void LeftClick(double x, double y)
{
var scr = Screen.PrimaryScreen.Bounds;
MouseEvent(MouseEventFlags.LeftDown | MouseEventFlags.LeftUp | MouseEventFlags.Move | MouseEventFlags.Absolute,
(uint)Math.Round(x / scr.Width * 65535),
(uint)Math.Round(y / scr.Height * 65535));
}
public static void LeftClick(int x, int y)
{
LeftClick((double)x, (double)y);
}
}
The coordinates are expressed as a fraction of 65535, which is a bit odd, but this class handles that conversion for you.
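Usage is then a single call; for example (the coordinates here are arbitrary):

```csharp
// Moves to screen pixel (100, 200) and performs a left click there.
// SimInput scales the pixel position into the 0..65535 absolute range.
SimInput.LeftClick(100, 200);
```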
I'm not 100% sure I understand what you're trying to accomplish. But if you want to simulate mouse input then I'd recommend using the SendInput API.
You can provide an array of inputs to be inserted into the input stream.
See also: PInvoke reference
I don't understand why anyone would want to send multiple mouse clicks simultaneously. If it's to test your GUI, it's the wrong test. No one can physically click something multiple times in the same time space.
But going back to your question, using SendMessage won't help you, because it is basically a blocking call. Even if you tried to use PostMessage, you won't be able to accomplish simultaneous clicks, because the message queue is getting pumped from the UI thread and has messages popped off and handled sequentially.
I used this code to click the left button at a point within a given window handle:
public static void MouseLeftClick(Point p, int handle = 0)
{
    // Pack the coordinates into lParam: x in the low word, y in the high word
    int coordinates = p.X | (p.Y << 16);
    // WM_LBUTTONDOWN (0x201) with wParam = MK_LBUTTON (0x1)
    SendMessage(handle, 0x201, 0x1, coordinates);
    // WM_LBUTTONUP (0x202)
    SendMessage(handle, 0x202, 0x1, coordinates);
}
If you pass no handle, the click is sent to the Desktop, so the coordinates should be relative to the whole screen; if you set a handle, the message is sent to that handle's window and the coordinates should be relative to that window.
How about just using VirtualMouse? I use it in C# and it works great.
public partial class Form1 : Form
{
private VirtualMouse vm = new VirtualMouse();
public Form1()
{
InitializeComponent();
}
private void MouseClickHere(Point myPoint)
{
vm.ClickIt(myPoint, 150);
}
private void Clicker()
{
MouseClickHere(new Point(250,350));
}
}
I want to make a little application that changes the default playback device in Windows 7. The only solution I found was to interact with the Sound applet. I succeeded in getting the handle to the SysListView32 window that holds the device names, but I can't get the text from the ListView.
This is the code used:
IntPtr sListView = (window handle received from another function)
LVITEM lvi = new LVITEM();
lvi.mask = LVIF_TEXT;
lvi.cchTextMax = 1024;
lvi.iItem = 0; // I tried with a loop through all the items
lvi.iSubItem = 0;
lvi.pszText = Marshal.AllocHGlobal(1024);
IntPtr ptrLvi = Marshal.AllocHGlobal(Marshal.SizeOf(lvi));
Marshal.StructureToPtr(lvi, ptrLvi, false);
SendMessage(sListView, (int)WinMesages.LVM_GETITEMW, IntPtr.Zero, ptrLvi);
string strLvi = Marshal.PtrToStringAuto(lvi.pszText);
The result (strLvi) is some Chinese-looking letters. What is wrong in the script?
UPDATE: LVITEM struct is this:
private struct LVITEM
{
public uint mask;
public int iItem;
public int iSubItem;
public uint state;
public uint stateMask;
public IntPtr pszText;
public int cchTextMax;
public int iImage;
public IntPtr lParam;
}
The sListView handle is correct... I checked in Spy++.
What test do I need to perform to check where the problem is? I could give you all the script if that would help.
Have you tried using LVM_GETITEMTEXTW instead?
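For what it's worth, garbage characters like that are the classic symptom of a cross-process pointer problem: the SysListView32 control lives in another process, so the pszText buffer you allocate with Marshal.AllocHGlobal is meaningless inside that process. The usual workaround is to allocate the buffers in the target process. A rough sketch of the idea (P/Invoke declarations and error handling omitted; constants are the standard Win32 values):

```csharp
// 1. Find the process that owns the ListView and open it for memory access.
uint pid;
GetWindowThreadProcessId(sListView, out pid);
IntPtr hProc = OpenProcess(0x0008 | 0x0010 | 0x0020, false, pid); // VM_OPERATION | VM_READ | VM_WRITE

// 2. Allocate the text buffer and the LVITEM inside the *target* process.
IntPtr remoteText = VirtualAllocEx(hProc, IntPtr.Zero, 1024, 0x1000, 0x04); // MEM_COMMIT, PAGE_READWRITE
IntPtr remoteLvi = VirtualAllocEx(hProc, IntPtr.Zero, (uint)Marshal.SizeOf(typeof(LVITEM)), 0x1000, 0x04);

// 3. Point pszText at the remote buffer and copy the LVITEM over with WriteProcessMemory.
LVITEM lvi = new LVITEM { mask = LVIF_TEXT, iItem = 0, iSubItem = 0, cchTextMax = 512, pszText = remoteText };
// ... WriteProcessMemory(hProc, remoteLvi, ref lvi, ...) ...

// 4. Ask for the text (for LVM_GETITEMTEXT, wParam is the item index),
//    then read the string back out of the remote buffer with ReadProcessMemory.
SendMessage(sListView, (int)WinMesages.LVM_GETITEMTEXTW, IntPtr.Zero, remoteLvi);
// ... ReadProcessMemory(hProc, remoteText, localBuffer, ...) ...
```

Free the remote memory with VirtualFreeEx and close the process handle when you are done.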
I'm working on a project where I need to use a joystick (DirectInput) to control the mouse pointer inside a wpf application. I need to be able to press/release a mouse button as well as possibly drag across the screen. Preferably this should actually control the mouse, allowing the joystick to be used to control other applications as well. I've got everything figured out on the DirectInput side, but I'm having trouble with the mouse-drag interaction.
This is how I'm doing left-button down:
[DllImport("user32.dll", SetLastError = true)]
static extern uint SendInput(uint nInputs, ref Input pInputs, int cbSize);
...
var aInput = new Input {
type = 0x0,
mouse = new MouseInput {
dwFlags = 0x6,
dwExtraInfo = 0,
mouseData = 0,
time = 0
}
};
SendInput(1, ref aInput, 28);
where Input and MouseInput are as follows:
[StructLayout(LayoutKind.Explicit)]
public struct Input {
[FieldOffset(0)]
public int type; // 4
[FieldOffset(4)]
public MouseInput mouse; // 24
}
[StructLayout(LayoutKind.Explicit)]
public struct MouseInput {
[FieldOffset(0)]
public int dx; // 4
[FieldOffset(4)]
public int dy; // 4
[FieldOffset(8)]
public int mouseData; // 4
[FieldOffset(12)]
public int dwFlags; // 4
[FieldOffset(16)]
public int time; // 4
[FieldOffset(20)]
public int dwExtraInfo; // 4
};
This method works for left/right mouse button down, and System.Windows.Forms.Cursor.Position works well for mouse movement, but I'm not sure how to get a mouse drag rigged up. Any pointers?
See related articles here on SO:
Injecting Mouse Input in WPF Applications
Simulate Mouse/Keyboard Input In WPF
Move the mouse in wpf
EDIT: concerning the specific "drag" need, here is another link (from the NUnitForms project here: http://nunitforms.sourceforge.net/) about a MouseController utility code that contains mouse simulation methods:
http://nunitforms.svn.sourceforge.net/viewvc/nunitforms/trunk/nunitforms/source/NUnitForms/MouseController.cs?view=markup
It has a Drag method. You could test whether this works. I know it's not especially designed for WPF, but it's worth a try. Plus, if it's only about mouse movement, I don't see a problem with referencing the WinForms assembly.
Benjamin, I used your code for generating a mouse click. I played with the contents and found that setting dwFlags = 0x3 (move + left down) performs a left-button click-and-hold :)
I have been working on a touch screen application. I need to know if there is a ready-made touch screen keyboard that I can use as a controller for my application.
I tried using the built-in Windows keyboard, but it is too small for a touch screen.
Process.Start(Environment.GetFolderPath(Environment.SpecialFolder.System) + Path.DirectorySeparatorChar + "osk.exe");
Any ideas on how to start building one in minimal time will be greatly appreciated...
New Question
I have copied this code from someone and made some modifications to it:
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;
using System.Windows;
using System.Diagnostics;
using System.IO;
using System.Runtime.InteropServices;
class Win32
{
[DllImport("user32.dll", EntryPoint = "SetWindowPos")]
public static extern bool SetWindowPos(
int hWnd, // window handle
int hWndInsertAfter, // placement-order handle
int X, // horizontal position
int Y, // vertical position
int cx, // width
int cy, // height
uint uFlags); // window positioning flags
public const int HWND_BOTTOM = 0x0001;
public const int HWND_TOP = 0x0000;
public const int SWP_NOSIZE = 0x0001;
public const int SWP_NOMOVE = 0x0002;
public const int SWP_NOZORDER = 0x0004;
public const int SWP_NOREDRAW = 0x0008;
public const int SWP_NOACTIVATE = 0x0010;
public const int SWP_FRAMECHANGED = 0x0020;
public const int SWP_SHOWWINDOW = 0x0040;
public const int SWP_HIDEWINDOW = 0x0080;
public const int SWP_NOCOPYBITS = 0x0100;
public const int SWP_NOOWNERZORDER = 0x0200;
public const int SWP_NOSENDCHANGING = 0x0400;
}
namespace Keyboard
{
public partial class Form1 : Form
{
public Form1()
{
InitializeComponent();
this.KeyText.MouseDoubleClick += new MouseEventHandler(KeyText_MouseDoubleClick);
}
private Process process;
private string getKeyboardText()
{
KeyScreen k = new KeyScreen();
k.ShowDialog();
if (k.DialogResult.ToString().Equals("OK"))
return k.Text1;
else
return null;
}
void KeyText_MouseDoubleClick(object sender, MouseEventArgs e)
{
this.KeyText.Text = getKeyboardText();
}
private void KeyBtn_Click(object sender, EventArgs e)
{
this.showKeypad();
//Process.Start(Environment.GetFolderPath(Environment.SpecialFolder.System) + Path.DirectorySeparatorChar + "osk.exe");
}
private void showKeypad()
{
Process process = new Process();
process.StartInfo.UseShellExecute = false;
process.StartInfo.RedirectStandardOutput = true;
process.StartInfo.RedirectStandardError = true;
process.StartInfo.CreateNoWindow = true;
process.StartInfo.FileName = "c:\\Windows\\system32\\osk.exe";
process.StartInfo.Arguments = "";
process.StartInfo.WorkingDirectory = "c:\\";
process.Start(); // Start Onscreen Keyboard
process.WaitForInputIdle();
Win32.SetWindowPos((int)process.MainWindowHandle,
Win32.HWND_BOTTOM,
300, 300, 1200, 600,
Win32.SWP_SHOWWINDOW | Win32.SWP_NOZORDER);
}
}
}
This works just fine; I have a nice keypad (a good size for a touch screen). However, I am not familiar with C# and VB at all, and I don't know how to change the text in a textbox according to the touch screen keyboard I show.
Thanks,
To create your own on-screen keyboard you basically need to do the following.
1- Create a keyboard window that you can interact with but that does not steal the input focus from the current control in whichever application you were working.
2- The keyboard application should send key presses to the currently active control in response to clicks on the buttons of the keyboard application.
The following answer I provided previously provides a simple demonstration of how to do this using WinForms.
C# - Sending keyboard events to (last) selected window
Basically, the window is created with the WS_EX_NOACTIVATE style, which prevents the window from becoming active and stealing input focus. Then, in response to button clicks, it uses SendKeys to send the appropriate key press message to the control that has input focus.
The following answer shows how to apply the WS_EX_NOACTIVATE style to a WPF application.
Trying to create a WPF Touch Screen Keyboard Appliaction, Can't get WPF App to Send Keys to another window? Any suggestions?
There are other solutions besides WS_EX_NOACTIVATE, but this one is simple and only suffers from a minor presentational glitch when dragging the window around, which can be overcome with some clever message processing.
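A minimal WinForms sketch of that idea (the button text and key sent here are just examples):

```csharp
using System.Windows.Forms;

public class KeyboardForm : Form
{
    const int WS_EX_NOACTIVATE = 0x08000000;

    // Adding WS_EX_NOACTIVATE keeps this form from taking focus when clicked,
    // so the control in the target application stays active.
    protected override CreateParams CreateParams
    {
        get
        {
            CreateParams cp = base.CreateParams;
            cp.ExStyle |= WS_EX_NOACTIVATE;
            return cp;
        }
    }

    public KeyboardForm()
    {
        var keyA = new Button { Text = "A" };
        // SendKeys delivers the keystroke to whichever window has input focus.
        keyA.Click += (s, e) => SendKeys.Send("A");
        Controls.Add(keyA);
    }
}
```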
I know this is really, really old, but just in case someone else's first thought was also that the built-in Windows on-screen keyboard is too small (as was my first thought): you can in fact make it bigger simply by making the window it is in bigger (which you could also do programmatically).
+1 for the question and for posting the code, though, because I'm making one myself simply because I need one with more direct control over it from my application.