I need your help with some code I found on the web a long time ago. Sadly I don't remember where it's from :( To move the borderless forms in my project I use this code snippet:
protected override void OnMouseDown(MouseEventArgs e)
{
    base.OnMouseDown(e);
    if (e.Button == System.Windows.Forms.MouseButtons.Left)
    {
        this.Capture = false;
        Message msg = Message.Create(this.Handle, 0xA1, new IntPtr(2), IntPtr.Zero);
        this.WndProc(ref msg);
    }
}
My problem is that I don't completely understand how the code works. As far as I understand, the event is raised when a mouse button is pressed on the form. Then comes the check whether it was a left click. From there on I don't know what the following code does :(
Setting this.Capture = false makes the form release its mouse capture, so the move operation started next can take over. Message.Create builds a new window message, which is then handed straight to the form's WndProc. 0xA1 is WM_NCLBUTTONDOWN, the non-client left-button-down message; in effect it simulates pressing the left mouse button on the (missing) border/title bar.
Windows then picks up the rest of the process.
At a basic level, you are sending a message to your window and having it handle it.
You are giving it 0xA1 (WM_NCLBUTTONDOWN), and by passing 0x2 as the wParam (HTCAPTION) you fool the window into thinking the press happened on its caption bar. Drags on a caption bar move the window around, hence you can drag the window by using your code.
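For readability, here is the same trick restated with named constants (the constant names and values follow WinUser.h; this is just a rewording of the question's code, not a different technique):

private const int WM_NCLBUTTONDOWN = 0xA1;   // non-client left-button-down
private const int HTCAPTION = 0x2;           // hit-test code for the title bar

protected override void OnMouseDown(MouseEventArgs e)
{
    base.OnMouseDown(e);
    if (e.Button == MouseButtons.Left)
    {
        Capture = false;                     // release capture so the move loop can start
        Message msg = Message.Create(Handle, WM_NCLBUTTONDOWN, (IntPtr)HTCAPTION, IntPtr.Zero);
        WndProc(ref msg);                    // let the default handler start dragging the window
    }
}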
Samples of doing this at:
C#: How to drag a form by the form and its controls?
http://www.catch22.net/tuts/win32-tips-tricks
You're basically posting a message to the window. A little MSDN research uncovers that the message you're posting is WM_NCLBUTTONDOWN. In effect, you're telling the underlying window that the left mouse button is being held down on its non-client area and that it needs to respond to that. That response typically happens to be dragging the window about.
Related
I hope to simulate a left mouse button click on another window, and hold the button for about 2 seconds. I have tried the following code:
int WM_LBUTTONDOWN = 0x0201;
int WM_LBUTTONUP = 0x0202;
SendMessage(hd, WM_LBUTTONDOWN, new IntPtr(1), lParam);
Thread.Sleep(2000);
SendMessage(hd, WM_LBUTTONUP, new IntPtr(1), lParam);
The parameter "hd" is the handle of another window and "lParam" contains coordinate infomation. But it didn't work as my expectation.I used breakpoint to debug the code. When "WM_LBUTTONDOWN" message was sent to another window, the push button in another window was clicked immediately, rather than be held and wait for the message "WM_LBUTTONUP".
When I used real mouse to click and hold the button, spy++ showed that there are not any other messages except "WM_MOUSEMOVE" between "WM_LBUTTONDOWN" and "WM_LBUTTONUP".
(Spy++ screenshot of the captured messages)
So, how do I simulate a mouse button being pressed and held in C#? Any advice would be helpful, thank you!
You can't reliably simulate mouse (or keyboard) input by sending window messages. You need to use SendInput() instead (C# declaration).
See: Send keys through SendInput in user32.dll.
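For completeness, here is a minimal sketch of the SendInput route (the struct declarations and the ClickAndHold helper name are my own illustration, not code from the question). Note that SendInput injects input globally at the current cursor position, so the target button has to be under the cursor and the target window in the foreground:

using System;
using System.Runtime.InteropServices;
using System.Threading;

static class MouseInputSimulator
{
    const uint INPUT_MOUSE = 0;
    const uint MOUSEEVENTF_LEFTDOWN = 0x0002;
    const uint MOUSEEVENTF_LEFTUP   = 0x0004;

    [StructLayout(LayoutKind.Sequential)]
    struct MOUSEINPUT
    {
        public int dx, dy;
        public uint mouseData, dwFlags, time;
        public IntPtr dwExtraInfo;
    }

    [StructLayout(LayoutKind.Sequential)]
    struct INPUT
    {
        public uint type;      // INPUT_MOUSE; MOUSEINPUT is the largest member of the native union
        public MOUSEINPUT mi;
    }

    [DllImport("user32.dll", SetLastError = true)]
    static extern uint SendInput(uint nInputs, INPUT[] pInputs, int cbSize);

    // Press the left button, keep it held for the given time, then release it.
    public static void ClickAndHold(int milliseconds)
    {
        var down = new INPUT { type = INPUT_MOUSE, mi = new MOUSEINPUT { dwFlags = MOUSEEVENTF_LEFTDOWN } };
        var up   = new INPUT { type = INPUT_MOUSE, mi = new MOUSEINPUT { dwFlags = MOUSEEVENTF_LEFTUP } };

        SendInput(1, new[] { down }, Marshal.SizeOf(typeof(INPUT)));
        Thread.Sleep(milliseconds);   // the button stays "held" until the up event is sent
        SendInput(1, new[] { up }, Marshal.SizeOf(typeof(INPUT)));
    }
}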
"mouse_event" API function can solve the problem.But the side effect is the mouse poiter will move actually, when the program is running you cannot move your mouse or unexpected wrong position would be clicked.
I'm a beginner in C# and need some help. After loading a Form I want to display on the Form the coordinates of the mouse when it's clicked. The click can be made outside of the Form, for example in a browser. Can someone help me with this?
Maybe the simplest way is setting the Capture property of the form to true, then handling the click event and converting the position (which is relative to the top-left point of the form) to a screen position using the form's PointToScreen method.
For example you can put a button on the form and do:
private void button1_Click(object sender, EventArgs e)
{
    // Key point for handling mouse events outside the form
    this.Capture = true;
}

private void MouseCaptureForm_MouseDown(object sender, MouseEventArgs e)
{
    this.Activate();
    MessageBox.Show(this.PointToScreen(new Point(e.X, e.Y)).ToString());
    // Cursor.Position works too, as RexGrammer stated in his answer (it is already in screen coordinates):
    // MessageBox.Show(Cursor.Position.ToString());
    // If you want the form to keep getting the capture, set this.Capture = true again here:
    // this.Capture = true;
    // But note that all clicks are then handled by the form, and even to close the
    // application you have to right-click its task-bar icon and choose Close.
}
But the more correct (and slightly more difficult) way is using global hooks.
If you really need to do it, you can take a look at these links (a minimal hook sketch follows them):
Processing Global Mouse and Keyboard Hooks in C#
Low-Level Mouse Hook in C#
Application and Global Mouse and Keyboard Hooks .Net Libary in C#
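As a rough sketch of what those articles describe, here is a minimal low-level mouse hook (WH_MOUSE_LL). The class and event names are my own; the hook must be installed from a thread that pumps messages, which is true for a normal WinForms UI thread. You could subscribe from your form with something like GlobalMouseHook.LeftClick += p => this.Text = p.ToString(); and call GlobalMouseHook.Start():

using System;
using System.Diagnostics;
using System.Drawing;
using System.Runtime.InteropServices;

static class GlobalMouseHook
{
    const int WH_MOUSE_LL = 14;
    const int WM_LBUTTONDOWN = 0x0201;

    delegate IntPtr HookProc(int nCode, IntPtr wParam, IntPtr lParam);

    [StructLayout(LayoutKind.Sequential)]
    struct MSLLHOOKSTRUCT { public Point pt; public uint mouseData, flags, time; public IntPtr dwExtraInfo; }

    [DllImport("user32.dll", SetLastError = true)]
    static extern IntPtr SetWindowsHookEx(int idHook, HookProc lpfn, IntPtr hMod, uint dwThreadId);
    [DllImport("user32.dll")]
    static extern IntPtr CallNextHookEx(IntPtr hhk, int nCode, IntPtr wParam, IntPtr lParam);
    [DllImport("user32.dll")]
    static extern bool UnhookWindowsHookEx(IntPtr hhk);
    [DllImport("kernel32.dll")]
    static extern IntPtr GetModuleHandle(string lpModuleName);

    static IntPtr _hook = IntPtr.Zero;
    static readonly HookProc _proc = Callback;   // keep a reference so the delegate isn't garbage-collected

    public static event Action<Point> LeftClick; // screen coordinates of every left click, anywhere

    public static void Start()
    {
        using (var process = Process.GetCurrentProcess())
        using (var module = process.MainModule)
            _hook = SetWindowsHookEx(WH_MOUSE_LL, _proc, GetModuleHandle(module.ModuleName), 0);
    }

    public static void Stop() { UnhookWindowsHookEx(_hook); }

    static IntPtr Callback(int nCode, IntPtr wParam, IntPtr lParam)
    {
        if (nCode >= 0 && wParam == (IntPtr)WM_LBUTTONDOWN)
        {
            var info = (MSLLHOOKSTRUCT)Marshal.PtrToStructure(lParam, typeof(MSLLHOOKSTRUCT));
            var handler = LeftClick;
            if (handler != null) handler(info.pt);
        }
        return CallNextHookEx(_hook, nCode, wParam, lParam);
    }
}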
I think you can't handle mouse clicks outside your Form, at least not easily.
Inside the form it can be handled simply using MouseEventArgs:
private void Form1_MouseClick(object sender, MouseEventArgs e)
{
    // e.Location.X & e.Location.Y
}
Learn more about this topic at Mouse Events in Windows Forms.
I hope it helps.
Cursor.Position and Control.MousePosition both return the position of the mouse cursor in screen coordinates.
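As a small illustration (the timer interval and use of the form's title bar are my own choices), you can poll Cursor.Position from a WinForms timer inside your form; it reports screen coordinates even while the pointer is outside the form, although detecting the click itself still needs capture or a hook as described above:

// Illustrative only: show the screen-space cursor position in the form's title bar.
var timer = new System.Windows.Forms.Timer { Interval = 100 };
timer.Tick += (s, e) => this.Text = Cursor.Position.ToString();   // e.g. "{X=812,Y=344}"
timer.Start();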
The following articles deal with capturing Global mouse click events:
Processing Global Mouse and Keyboard Hooks in C#
Global Windows Hooks
You need a global mouse hook.
See this question
According to this MSDN article: "The MouseLeave event is raised in response to a touch event occurring outside the object's bounding area." But in my implementation I am finding it is triggered when releasing a tap on the object's bounding area (precisely like OnMouseLeave).
So how do I get it to fire while tapping outside the UI Object's boundary?
I am new to C# and Stack Overflow, but I thought I might contribute what I found. I did a quick Google search and found this article about detecting clicks outside of a GUI.
Here is a quick method/description (found in the link) which detects clicks outside of a rectangle drawn over the GUI: "When you use a GUI window or group / area, you can use its Rect to check if the mouse is inside."
Rect windowPos = new Rect(10, 10, 200, 150);

void OnGUI()
{
    Event e = Event.current;
    windowPos = GUI.Window(0, windowPos, drawWindow, "MyWindow");
    if (e.type == EventType.MouseDown && !windowPos.Contains(e.mousePosition))
    {
        // Click was outside of the GUI window
    }
}

void drawWindow(int aID)
{
    // draw the window content
}
Just go read that page; it seemed to answer a similar question before.
Good luck!
I would like to create a winform like this:
I already accomplished the visual effect (as seen in the picture) by following another question. But I can't disallow resizing the form, since to have the border its FormBorderStyle must be "Sizable". Someone suggested setting the MinimumSize and MaximumSize values equal to the current form size. This solves part of the issue, but when the mouse hovers over the border it still shows the double-ended arrow, suggesting the form is resizable. Is there any way to disable this cursor change? My goal is to mimic the original systray popups in Windows 7, like the network, sound, etc. ones.
Thank you!
Example code:
private const int WM_NCHITTEST = 0x84;
private const int HTCLIENT = 0x1;

protected override void WndProc(ref Message m)
{
    switch (m.Msg)
    {
        case WM_NCHITTEST:
            m.Result = (IntPtr)HTCLIENT;
            return;
    }
    base.WndProc(ref m);
}
This way, when the cursor hovers over the borders, the pointer doesn't change, because the hit is treated as if it were inside the form, achieving the desired effect.
Add a message handler to your form and handle WM_NCHITTEST. When the original handler returns one of the sizing codes (HTLEFT, HTBOTTOMRIGHT, etc.), return HTNOWHERE or HTCAPTION instead.
Something like this question should get you started.
To explain:
When Windows wants to know which cursor to use for your window, it first sends you a WM_NCHITTEST message (non-client hit test). This message is handled by the WndProc method. Your window is supposed to return one of the HT* codes to tell Windows which part of the window the mouse is over. For example, return HTCAPTION for the caption area, HTCLIENT for the client area, or HTBOTTOMLEFT for the bottom-left sizing corner. The default message handler (calling base.WndProc) deals with this for standard windows.
We don't have a standard window.
What we're trying to do here is ask the original handler what the mouse is over. If it returns any of the sizing values (HTLEFT through HTBOTTOMRIGHT), we want to replace that return value with HTNOWHERE (for no action), HTCLIENT (if you want the cursor to be treated as inside the window -- probably not this one), or HTCAPTION (if you want to be able to drag the window by the edges -- might be useful).
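A minimal WinForms sketch of that idea, assuming your borderless form; the constants follow WinUser.h, and the range HTLEFT..HTBOTTOMRIGHT (10..17) covers all the sizing borders and corners:

private const int WM_NCHITTEST  = 0x84;
private const int HTCAPTION     = 2;
private const int HTLEFT        = 10;   // first sizing-border hit code
private const int HTBOTTOMRIGHT = 17;   // last sizing-border hit code

protected override void WndProc(ref Message m)
{
    base.WndProc(ref m);                             // let the default handler classify the hit first

    if (m.Msg == WM_NCHITTEST)
    {
        int hit = m.Result.ToInt32();
        if (hit >= HTLEFT && hit <= HTBOTTOMRIGHT)   // any sizing border or corner
            m.Result = (IntPtr)HTCAPTION;            // treat it as the caption: drag instead of resize
    }
}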
I have the following situation: I have a special 3D program which I need to make react to multi-touch events without changing the program itself. Therefore I need a mapper program which receives the Windows 7 multi-touch events, converts them into corresponding mouse and keyboard events, and sends these emulated events to the 3D program so that it can process them.
I have already read and tried a lot, and my current approach is to have an almost transparent overlay window over the 3D program to catch the multi-touch events. But this is also the problem: I am unable to forward the generated mouse events to the underlying 3D program in a usable way. So far I have used P/Invoke functions like mouse_event, SendMessage and so on, but none of them worked for me, since I always had to bring the 3D program to the front, send the event, and afterwards bring my mapper program to the front again. This works quite poorly.
So my question is, more or less: is there a nicely working approach to do the things I mentioned above? Or at least a nice way to send mouse and keyboard events to processes in the background?
I hope someone can give me a hint or suggestion....
Here is the way I simulate mouse clicks right now:
private void OnMouseDown(object sender, MouseEventArgs e)
{
    Point position = this.PointToScreen(new Point(e.Location.X, e.Location.Y));
    // Simulate the mouse event at the given screen position
    this.Visible = false;
    Cursor.Position = position;
    mouse_event(Convert.ToUInt16(MouseEventFlags.LEFTDOWN), 0, 0, 0, 0);
}

private void OnMouseUp(object sender, MouseEventArgs e)
{
    Point position = this.PointToScreen(new Point(e.Location.X, e.Location.Y));
    // Simulate the mouse event at the given screen position
    Cursor.Position = position;   // screen coordinates, not the client-relative e.Location
    mouse_event(Convert.ToUInt16(MouseEventFlags.LEFTUP), 0, 0, 0, 0);
    this.Visible = true;
}

// P/Invoke declaration for the legacy mouse_event API
[DllImport("user32.dll")]
private static extern void mouse_event(uint dwFlags, uint dx, uint dy, uint dwData, int dwExtraInfo);
Maybe have a look at InfoStrat.VE - Bing Maps 3D for WPF and Microsoft Surface.
They made the Bing Maps control usable for touch.
Maybe that helps.
There isn't a way to direct mouse/keyboard events to a particular process in Win32. That being said, however, you might be able to get this approach to work:
Register your mapper window as a touch window to get touch events.
Add a handler for WM_NCHITTEST in your message loop, call DefWindowProc() to get the default result, and then convert any HTCLIENT return into HTTRANSPARENT (see the sketch below). This will cause Windows to pass mouse events through your window, on to the underlying window.
This probably won't work if your mapper window lives in a separate process from your client program; if that's the case, you'll have to do some hacking with AttachThreadInput. I don't recommend this, incidentally, as merged thread queues are very prone to bugs.
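For step 2, a hedged sketch assuming the overlay is a WinForms Form (base.WndProc plays the role of DefWindowProc here). Keep the same-thread limitation in mind: HTTRANSPARENT only forwards the hit to windows of the same thread, which is why the AttachThreadInput caveat above matters for a separate 3D process:

private const int WM_NCHITTEST  = 0x84;
private const int HTCLIENT      = 1;
private const int HTTRANSPARENT = -1;

protected override void WndProc(ref Message m)
{
    base.WndProc(ref m);                        // default hit-testing first

    if (m.Msg == WM_NCHITTEST && m.Result == (IntPtr)HTCLIENT)
        m.Result = (IntPtr)HTTRANSPARENT;       // let the mouse event fall through to the window below
}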