I want to simulate a left mouse button click on another window and hold the button down for about 2 seconds. I have tried the following code:
const int WM_LBUTTONDOWN = 0x0201;
const int WM_LBUTTONUP = 0x0202;

SendMessage(hd, WM_LBUTTONDOWN, new IntPtr(1), lParam); // wParam 1 = MK_LBUTTON
Thread.Sleep(2000);
SendMessage(hd, WM_LBUTTONUP, new IntPtr(1), lParam);
The parameter "hd" is the handle of the other window and "lParam" contains the coordinate information. But it didn't work as I expected. I set a breakpoint to debug the code: as soon as the WM_LBUTTONDOWN message was sent to the other window, the push button in that window was clicked immediately, instead of being held down until the WM_LBUTTONUP message arrived.
When I used the real mouse to click and hold the button, Spy++ showed that there were no other messages except WM_MOUSEMOVE between WM_LBUTTONDOWN and WM_LBUTTONUP.
(Spy++ screenshot)
So, how can I simulate a mouse button being pressed down and held in C#? Any advice would be helpful, thank you!
You can't reliably simulate mouse (or keyboard) input by sending window messages. You need to use SendInput() instead (C# declaration).
See: Send keys through SendInput in user32.dll.
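For reference, here is a minimal sketch of pressing and holding the left button with SendInput. The struct layout below is one common C# declaration, and the MouseSimulator/HoldLeftButton names are just for illustration. Note that SendInput injects into the system input queue, so the input goes to whichever window currently has focus rather than to a specific window handle.

using System;
using System.Runtime.InteropServices;
using System.Threading;

static class MouseSimulator
{
    const uint INPUT_MOUSE = 0;
    const uint MOUSEEVENTF_LEFTDOWN = 0x0002;
    const uint MOUSEEVENTF_LEFTUP = 0x0004;

    [StructLayout(LayoutKind.Sequential)]
    struct MOUSEINPUT
    {
        public int dx;
        public int dy;
        public uint mouseData;
        public uint dwFlags;
        public uint time;
        public IntPtr dwExtraInfo;
    }

    [StructLayout(LayoutKind.Sequential)]
    struct INPUT
    {
        public uint type;     // INPUT_MOUSE
        public MOUSEINPUT mi; // MOUSEINPUT is the largest member of the native union
    }

    [DllImport("user32.dll", SetLastError = true)]
    static extern uint SendInput(uint nInputs, INPUT[] pInputs, int cbSize);

    // Press the left button, hold it for the given time, then release it.
    public static void HoldLeftButton(int milliseconds)
    {
        var down = new INPUT { type = INPUT_MOUSE, mi = new MOUSEINPUT { dwFlags = MOUSEEVENTF_LEFTDOWN } };
        var up   = new INPUT { type = INPUT_MOUSE, mi = new MOUSEINPUT { dwFlags = MOUSEEVENTF_LEFTUP } };

        SendInput(1, new[] { down }, Marshal.SizeOf(typeof(INPUT)));
        Thread.Sleep(milliseconds);
        SendInput(1, new[] { up }, Marshal.SizeOf(typeof(INPUT)));
    }
}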
"mouse_event" API function can solve the problem.But the side effect is the mouse poiter will move actually, when the program is running you cannot move your mouse or unexpected wrong position would be clicked.
I need your help with some code I found on the web a long time ago. Sadly, I don't remember where it came from :( To move the borderless forms in my project I use this code snippet:
protected override void OnMouseDown(MouseEventArgs e)
{
    base.OnMouseDown(e);
    if (e.Button == System.Windows.Forms.MouseButtons.Left)
    {
        this.Capture = false;
        Message msg = Message.Create(this.Handle, 0xA1, new IntPtr(2), IntPtr.Zero);
        this.WndProc(ref msg);
    }
}
My problem is that I don't completely understand how the code works. As far as I understand, the event gets raised when a mouse button is pressed on the form. Then comes the check whether the click was a left-click. From there on, I don't know what the rest of the code does :(
The this.Capture = false tells the OS to stop capturing mouse events for the form. Message.Create creates a new message to be sent to the message loop of the current application. 0xA1 is WM_NCLBUTTONDOWN, which is a non-client left-button-down message; in other words, it simulates clicking the left mouse button on the (missing) border.
Windows then picks up the rest of the process.
At a basic level, you are sending a message to your window and having it handle it.
You are giving it 0xA1 (WM_NCLBUTTONDOWN), and by passing 0x02 (HTCAPTION) as the wParam you fool the window into thinking you clicked its caption bar. Drags on a caption bar move the window around, hence you can drag the window with this code.
Samples of doing this at:
C#: How to drag a form by the form and its controls?
http://www.catch22.net/tuts/win32-tips-tricks
You're basically posting a message to the window. A little MSDN research uncovers that the message you're posting is WM_NCLBUTTONDOWN. Basically, you're telling the underlying window that the left mouse button is being held down and it needs to respond to that. That response typically happens to be dragging the window about.
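For reference, here is a minimal sketch of the same technique written with named constants, using ReleaseCapture and SendMessage instead of calling WndProc directly. It assumes a WinForms form; DraggableForm is just an illustrative name.

using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

public class DraggableForm : Form
{
    private const int WM_NCLBUTTONDOWN = 0xA1;
    private const int HTCAPTION = 0x2;

    [DllImport("user32.dll")]
    private static extern IntPtr SendMessage(IntPtr hWnd, int msg, IntPtr wParam, IntPtr lParam);

    [DllImport("user32.dll")]
    private static extern bool ReleaseCapture();

    protected override void OnMouseDown(MouseEventArgs e)
    {
        base.OnMouseDown(e);
        if (e.Button == MouseButtons.Left)
        {
            // Stop capturing the mouse, then tell Windows the caption bar was
            // clicked; Windows takes over and moves the window with the drag.
            ReleaseCapture();
            SendMessage(this.Handle, WM_NCLBUTTONDOWN, new IntPtr(HTCAPTION), IntPtr.Zero);
        }
    }
}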
A standard Windows dialog will flash if its owner window is clicked. The effect is similar to activating and deactivating the window.
When implementing a custom window border on my dialog, however, I can't figure out when I should flash the window. Windows does not flash the dialog for me.
Here's what I tried:
I watched all of the messages going to both the owner and dialog, but was unable to find any messages which exist solely to tell the window to flash.
I hooked Spy++ onto a default Windows dialog, but was also unable to find a "flash" message.
Looking in WinUser.h I couldn't find a "flash" message, so I am assuming it is some sort of combination of one or more messages with lParam and wParam specified.
Does anyone have any experience with this, or perhaps can point me to some pages which explain this? Any ideas are appreciated, since I have been working on this problem for several months now.
EDIT
In response to comments, here is the code for what I am currently using:
private IntPtr WndProc(IntPtr hwnd, int msg, IntPtr wParam, IntPtr lParam, ref bool handled)
{
    if (msg == 0x0020) // WM_SETCURSOR
    {
        // The low-order word of lParam is the hit-test code; -2 is HTERROR.
        if ((short)((long)lParam & 0xffff) == (-2))
        {
            // The high-order word is the mouse message that triggered WM_SETCURSOR.
            short hiword = (short)((((long)lParam) >> 16) & 0xffff);
            if (hiword == 0x0201 || hiword == 0x0204) // WM_LBUTTONDOWN or WM_RBUTTONDOWN
                Flash(); // My function which simulates a window flash
        }
    }
    return IntPtr.Zero;
}
There isn't a message that tells you that Windows wants your window to flash. What you can do, however, is watch for the same trigger that Windows uses to start flashing your window in the first place.
Your window will flash when it has an owner window that's disabled (the WS_DISABLED style bit is set) and the user clicks a mouse button on any part of the disabled window.
Internally, this is handled by DefWindowProc in response to the WM_SETCURSOR message:
If the low-order word of lParam is HTERROR, and
the high-order word of lParam is one of the mouse button down messages (WM_LBUTTONDOWN, etc), and
the window has an enabled owned popup window, then
DefWindowProc will call FlashWindowEx on the popup window
So to identify the trigger for when you should flash your dialog yourself, all you have to do is the same thing as Windows does. In the owner window's window procedure, handle the WM_SETCURSOR message, perform the above three tests, and if all three are true then you can trigger your own custom flashing for your dialog. And of course, in that situation you wouldn't pass the message back to DefWindowProc to handle.
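For reference, here is a minimal sketch of calling FlashWindowEx yourself once that trigger fires; this is the same API DefWindowProc uses for a standard dialog. The FLASHW_* values are the standard Win32 flags, and the WindowFlasher/Flash names are just illustrative. With a fully custom-drawn border the standard caption flash may not be visible, in which case you would run your own flash animation at this point instead.

using System;
using System.Runtime.InteropServices;

static class WindowFlasher
{
    [StructLayout(LayoutKind.Sequential)]
    struct FLASHWINFO
    {
        public uint cbSize;
        public IntPtr hwnd;
        public uint dwFlags;
        public uint uCount;
        public uint dwTimeout;
    }

    const uint FLASHW_CAPTION = 0x00000001; // flash the window caption

    [DllImport("user32.dll")]
    static extern bool FlashWindowEx(ref FLASHWINFO pwfi);

    // Flash the given window a few times, the way DefWindowProc does for a
    // standard dialog whose disabled owner is clicked.
    public static void Flash(IntPtr hwnd, uint count = 5)
    {
        var fwi = new FLASHWINFO
        {
            cbSize = (uint)Marshal.SizeOf(typeof(FLASHWINFO)),
            hwnd = hwnd,
            dwFlags = FLASHW_CAPTION,
            uCount = count,
            dwTimeout = 0 // 0 = use the default cursor blink rate
        };
        FlashWindowEx(ref fwi);
    }
}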
I would like to create a winform like this:
I already accomplished the visual effect (as seen in the picture) by following another question. But I can't disallow resizing the form, since to have the border it must be set to "Sizable". Someone suggested setting the MinimumSize and MaximumSize values equal to the current form size. This solves part of the issue, but when the mouse hovers over the border it still shows the double-ended resize arrow, suggesting the form is resizable. Is there any way to disable this cursor change? My goal is to mimic the original systray popups in Windows 7, like the network, sound, etc.
Thank you!
Example code:
private const int WM_NCHITTEST = 0x84;
private const int HTCLIENT = 0x1;

protected override void WndProc(ref Message m)
{
    switch (m.Msg)
    {
        case WM_NCHITTEST:
            m.Result = (IntPtr)HTCLIENT;
            return;
    }
    base.WndProc(ref m);
}
This way, when the cursor hovers over the borders, the pointer doesn't change, because the border is treated as if it were inside the form, achieving the desired effect.
Add a message handler to your form and handle WM_NCHITTEST. When the default handler returns one of the sizing codes (HTLEFT, HTBOTTOMRIGHT, etc.), return HTNOWHERE or HTCAPTION instead.
Something like this question should get you started.
To explain:
When Windows wants to know which cursor to use for your window, it first sends you a WM_NCHITTEST message (non-client hit test). This message is handled by the WndProc method. Your window is supposed to return one of the HT* codes to tell Windows which part of the window the mouse is over. For example, return HTCAPTION for the caption area, HTCLIENT for the client area, or HTBOTTOMLEFT for the bottom-left sizing corner. The default message handler (calling base.WndProc) deals with this for standard windows.
We don't have a standard window.
What we're trying to do here is ask the default handler what the mouse is over. If it returns any of the sizing codes (HTLEFT through HTBOTTOMRIGHT), we want to replace that return value with HTNOWHERE (for no action), HTCLIENT (if you want the cursor to be treated as inside the window -- probably not what you want here), or HTCAPTION (if you want to be able to drag the window by its edges -- which might be useful).
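A minimal sketch of that approach, assuming a WinForms form (the numeric values are the standard Win32 hit-test codes; substitute HTCAPTION for HTNOWHERE if you want edge-dragging to move the window):

private const int WM_NCHITTEST = 0x84;
private const int HTNOWHERE = 0;
private const int HTCAPTION = 2;
private const int HTLEFT = 10;        // HTLEFT..HTBOTTOMRIGHT are the sizing borders and corners
private const int HTBOTTOMRIGHT = 17;

protected override void WndProc(ref Message m)
{
    base.WndProc(ref m); // let the default handler do the hit test first

    if (m.Msg == WM_NCHITTEST)
    {
        int hit = m.Result.ToInt32();
        if (hit >= HTLEFT && hit <= HTBOTTOMRIGHT)
        {
            // Pretend nothing is there, so no resize cursor is shown.
            m.Result = (IntPtr)HTNOWHERE;
        }
    }
}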
I'm building a WPF application with some hackery involved to embed Form elements on the same render layer. Essentially what I do is I render the form off-screen and have a mock-up of the form in a WPF image. I also have a low level mouse hook which steals events from the WPF application if they are destined for the Form mock-up and instead use PostMessage(...) to send the events off to the hidden form element.
If I return a non-zero value from my hook procedure indicating to eat the event (even if I still call all the mouse hooks in the queue), the cursor gets stuck in one position. I'm assuming this is because the cursor position gets handled in some sort of WPF application layer that the event isn't reaching.
I figured that it was fine to prevent the WPF application from knowing about the event at all because I could just set the cursor position myself-- there are coordinates attached to a mouse event after all. Unfortunately, it seems that these mouse coordinates are horribly incorrect. In fact, no matter where my cursor is located, I always receive the same coordinates.
Here is my code:
    if (nCode >= 0)
    {
        MOUSEHOOKSTRUCT_LL mousehookstruct_ll1 =
            (MOUSEHOOKSTRUCT_LL)Marshal.PtrToStructure((IntPtr)lParam, typeof(MOUSEHOOKSTRUCT_LL));
        if (mousehookstruct_ll1 != null)
        {
            if ((user != null) && user.SystemMouseHookProc(nCode, wParam, lParam,
                new Point(mousehookstruct_ll1.pt_x, mousehookstruct_ll1.pt_y),
                mousehookstruct_ll1.dwExtraInfo))
            {
                return new IntPtr(1); // CallNextHookEx(this.MessageHookHandle, 1, wParam, lParam); // It doesn't matter that I don't call CallNextHookEx here.
            }
        }
    }
    GC.KeepAlive(this);
    return CallNextHookEx(this.MessageHookHandle, nCode, wParam, lParam);
}
Then in user.SystemMouseHookProc(...) I print out the correct cursor position followed by the coordinates pulled by the mouse hook, and the output is always something like the following:
Cursor: 523,578
{X=1985777551,Y=1985777602} //This coordinate never changes
That output is clearly wrong. What can I do to get the correct mouse coordinates from a mouse hook?
Thank you.
P.S. This solution is derived from a popular one online. Unfortunately, that solution didn't meet my needs so I've had to alter it to this form.
Any reason the static property System.Windows.Forms.Control.MousePosition won't work?
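For example, a minimal sketch inside the hook callback (Control.MousePosition returns the cursor position in screen coordinates, so it can simply replace the values read from the marshalled struct):

// Screen coordinates of the cursor at the moment the hook fires.
System.Drawing.Point cursor = System.Windows.Forms.Control.MousePosition;
// ...pass 'cursor' to SystemMouseHookProc in place of the struct's pt_x/pt_y...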
I have the following situation: I have a special 3D program which I need to make react to multi-touch events without changing the program itself. Therefore I need a mapper program which receives the Windows 7 multi-touch events, converts them into corresponding mouse and keyboard events, and sends these emulated events to the 3D program so that it can process them.
I have already read and tried a lot, and my current approach is to have an almost transparent overlay window over the 3D program to catch the multi-touch events. But this is also the problem: I can't manage to forward the generated mouse events to the underlying 3D program in a usable way. Right now I use P/Invoke functions like mouse_event, SendMessage and so on, but none of them worked for me, since I always had to bring the 3D program to the front, send the event, and afterwards bring my mapper program back to the front. This works quite poorly.
So my question is, more or less: is there a working approach to do what I described above? Or at least a good way to send mouse and keyboard events to processes in the background?
I hope someone can give me a hint or suggestion.
Here is how I simulate mouse clicks right now:
private void OnMouseDown(object sender, MouseEventArgs e)
{
    Point position = this.PointToScreen(new Point(e.Location.X, e.Location.Y));
    // Simulate the mouse event at the given screen position
    this.Visible = false;
    Cursor.Position = position;
    mouse_event(Convert.ToUInt16(MouseEventFlags.LEFTDOWN), 0, 0, 0, 0);
}

private void OnMouseUp(object sender, MouseEventArgs e)
{
    Point position = this.PointToScreen(new Point(e.Location.X, e.Location.Y));
    // Simulate the mouse event at the given screen position
    Cursor.Position = position; // screen coordinates, not the client-relative e.Location
    mouse_event(Convert.ToUInt16(MouseEventFlags.LEFTUP), 0, 0, 0, 0);
    this.Visible = true;
}

// P/Invoke declaration for the legacy mouse_event API
[DllImport("user32.dll")]
private static extern void mouse_event(uint dwFlags, uint dx, uint dy, uint dwData, int dwExtraInfo);
Maybe have a look at InfoStrat.VE - Bing Maps 3D for WPF and Microsoft Surface.
They made the Bing Maps control usable for touch.
Maybe that helps.
There isn't a way to direct mouse/keyboard events to a particular process in Win32. That being said, however, you might be able to get this approach to work:
Register your mapper window as a touch window to get touch events.
Add a handler for WM_NCHITTEST in your message loop, call DefWindowProc() to get the default handler, and then convert any HTCLIENT returns to HTTRANSPARENT. This will cause Windows to pass any mouse events through your window, onto the underlying window.
This probably won't work if your mapper window lives in a separate process from your client program; if that's the case, you'll have to do some hacking with AttachThreadInput. I don't recommend this, incidentally, as merged thread queues are very prone to bugs.
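For step 2, here is a minimal sketch assuming the overlay is a WinForms window (the same check works from an HwndSource hook in WPF):

private const int WM_NCHITTEST = 0x84;
private const int HTCLIENT = 1;
private const int HTTRANSPARENT = -1;

protected override void WndProc(ref Message m)
{
    base.WndProc(ref m); // let the default handler compute the hit-test result

    if (m.Msg == WM_NCHITTEST && m.Result == (IntPtr)HTCLIENT)
    {
        // Report the overlay as transparent so mouse input falls through
        // to the window underneath.
        m.Result = (IntPtr)HTTRANSPARENT;
    }
}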