How to receive click location from another process with C#?

My C# application needs to receive a click position from another process and then display it in my app, but I don't know how I would implement it.
Could someone help me figure out how to do this?
Thanks so much

What you need is called a "hook". Windows allows you to hook both keyboard and mouse events. Basically, Windows works by injecting the appropriate mouse movements and clicks (and keys typed) into the application that has focus.
With a hook, however, you receive all of the events, not just those relevant to your app. Once you have the hook established, you can do whatever you want with the information.
Note that you are going down to the Windows OS level, and if you do the wrong thing here you can leak handles and even put Windows into a bad state.
There is a great tutorial here from MS TechNet that describes how to do this in C#.
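As a rough illustration of that approach (a hedged sketch under standard Win32 constants, not the TechNet sample itself), a minimal WH_MOUSE_LL global hook that reports the screen position of left clicks made in any process could look like this:

// Hedged sketch: global low-level mouse hook (WH_MOUSE_LL) that reports the
// screen coordinates of left-button clicks from any process.
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

static class MouseClickHook
{
    const int WH_MOUSE_LL = 14;
    const int WM_LBUTTONDOWN = 0x0201;

    delegate IntPtr HookProc(int nCode, IntPtr wParam, IntPtr lParam);

    [StructLayout(LayoutKind.Sequential)]
    struct POINT { public int X; public int Y; }

    [StructLayout(LayoutKind.Sequential)]
    struct MSLLHOOKSTRUCT
    {
        public POINT pt;
        public uint mouseData, flags, time;
        public IntPtr dwExtraInfo;
    }

    [DllImport("user32.dll", SetLastError = true)]
    static extern IntPtr SetWindowsHookEx(int idHook, HookProc lpfn, IntPtr hMod, uint dwThreadId);

    [DllImport("user32.dll", SetLastError = true)]
    static extern bool UnhookWindowsHookEx(IntPtr hhk);

    [DllImport("user32.dll")]
    static extern IntPtr CallNextHookEx(IntPtr hhk, int nCode, IntPtr wParam, IntPtr lParam);

    [DllImport("kernel32.dll")]
    static extern IntPtr GetModuleHandle(string lpModuleName);

    static IntPtr _hookId = IntPtr.Zero;
    static readonly HookProc _proc = HookCallback;   // keep a reference so the GC doesn't collect the delegate

    public static void Start()
    {
        using (var module = Process.GetCurrentProcess().MainModule)
            _hookId = SetWindowsHookEx(WH_MOUSE_LL, _proc, GetModuleHandle(module.ModuleName), 0);
    }

    public static void Stop() => UnhookWindowsHookEx(_hookId);   // unhook so the handle is not leaked

    static IntPtr HookCallback(int nCode, IntPtr wParam, IntPtr lParam)
    {
        if (nCode >= 0 && wParam == (IntPtr)WM_LBUTTONDOWN)
        {
            var info = Marshal.PtrToStructure<MSLLHOOKSTRUCT>(lParam);
            Console.WriteLine($"Click at {info.pt.X}, {info.pt.Y}");   // show this in your own app instead
        }
        return CallNextHookEx(_hookId, nCode, wParam, lParam);         // always pass the event on
    }
}

The thread that installs a low-level hook needs a running message loop (a WinForms or WPF application already has one), and the hook should be removed on exit, per the warning above about leaking handles.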

Related

How can I capture G-Key input using Logitech G-Key Macro SDK without focus?

I have implemented a check for G-Keys being pressed in a Windows Forms Application (C#) using the Logitech G-Key Macro SDK. Specifically using a wrapper class and the supplied LogitechGkeyEnginesWrapper.dll exactly as it suggests in the documentation (included in the SDK).
This works perfectly when my application has focus, and when any of its child windows have focus. I can check via a callback or an update call (as suggested in the docs), but neither works when my application does not have focus.
I am hoping to capture this input (the G-Keys only) outside of application focus, i.e. globally. I do not believe these keys send a scan code, so the G-Key Macro SDK is required to get the state of these keyboard/mouse keys.
Any help would be greatly appreciated. I have sent an email to Logitech Dev Support regarding this, but have not received anything back at this time.
Link to Logitech Developers page:
http://gaming.logitech.com/en-au/developers
I contacted Logitech G Developer Support, and they responded with the following:
You are correct in that the G-keys do not function when the application loses focus. There is currently no way around this, as our design rationale is that a third-party application should not be able to read the keystrokes of the current app in question.
So the scope of G-keys data is, by design, tied to the application in focus.

Determining (programmatically) who controls the mouse on a PC using C# or C++

Is there a way to determine programmatically who is controlling the mouse (and which mouse) on a PC? I recently installed LogMeIn (logmein.com) and wanted to know if it's possible to (1) tell within a program whether the mouse is being clicked/moved by the direct user or by a remote user, and (2) write a stand-alone program that simply shows mouse events (in any application) and whether each event was generated by a local or remote user. I am somewhat familiar with Win32 hooks, but I don't think they can give this sort of information. Regarding (1), it would seem like a common request, e.g. "Only allow the user to complete button presses related to a password change if he is local..." or something like that.
Of secondary importance (just academic interest actually) is the question of telling whether the local user is using the mouse or the trackpad.
You could hook the Windows device APIs you want information from. When the user physically moves the mouse, those APIs are probably not called, but when LogMeIn moves it, it probably calls some SetCursorPos-like Win32 API that you could intercept.
Since posting, I came across this post which basically answers the question.
C# Get Mouse handle (GetRawInputDeviceInfo)
I'd add that for those of you who want to use WPF instead of WinForms (as the example above uses), check out ComponentDispatcher.ThreadFilterMessage (for WPF) or IMessageFilter (Windows Forms).
I wrote a couple of programs based on that posting (one is basically the posting above with some minor additions, and the other is a WPF version as opposed to WinForms). If I can figure out GitHub, I'll post all the code and add a comment here. But the posting above definitely gives you all you need.
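For anyone following the same route, here is a rough WinForms sketch of the raw-input approach from that posting, using the standard Win32 constants and struct layouts: register the mouse for WM_INPUT with RIDEV_INPUTSINK and read the device handle from each message to tell one mouse (or an injected source) from another. Treat it as a starting point rather than production code.

// Hedged sketch: identify which device produced each mouse message via Raw Input.
using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

public class RawMouseForm : Form
{
    const int WM_INPUT = 0x00FF;
    const uint RID_INPUT = 0x10000003;
    const ushort HID_USAGE_PAGE_GENERIC = 0x01;
    const ushort HID_USAGE_GENERIC_MOUSE = 0x02;
    const uint RIDEV_INPUTSINK = 0x00000100;   // receive input even without focus

    [StructLayout(LayoutKind.Sequential)]
    struct RAWINPUTDEVICE
    {
        public ushort UsagePage;
        public ushort Usage;
        public uint Flags;
        public IntPtr Target;
    }

    [StructLayout(LayoutKind.Sequential)]
    struct RAWINPUTHEADER
    {
        public uint Type;
        public uint Size;
        public IntPtr Device;   // handle identifying the physical device
        public IntPtr WParam;
    }

    [DllImport("user32.dll", SetLastError = true)]
    static extern bool RegisterRawInputDevices(RAWINPUTDEVICE[] devices, uint count, uint size);

    [DllImport("user32.dll")]
    static extern uint GetRawInputData(IntPtr hRawInput, uint command,
        IntPtr data, ref uint size, uint headerSize);

    public RawMouseForm()
    {
        var rid = new RAWINPUTDEVICE
        {
            UsagePage = HID_USAGE_PAGE_GENERIC,
            Usage = HID_USAGE_GENERIC_MOUSE,
            Flags = RIDEV_INPUTSINK,
            Target = Handle   // accessing Handle forces window creation
        };
        if (!RegisterRawInputDevices(new[] { rid }, 1, (uint)Marshal.SizeOf<RAWINPUTDEVICE>()))
            throw new InvalidOperationException("RegisterRawInputDevices failed");
    }

    protected override void WndProc(ref Message m)
    {
        if (m.Msg == WM_INPUT)
        {
            uint size = 0;
            uint headerSize = (uint)Marshal.SizeOf<RAWINPUTHEADER>();
            GetRawInputData(m.LParam, RID_INPUT, IntPtr.Zero, ref size, headerSize);
            IntPtr buffer = Marshal.AllocHGlobal((int)size);
            try
            {
                if (GetRawInputData(m.LParam, RID_INPUT, buffer, ref size, headerSize) == size)
                {
                    var header = Marshal.PtrToStructure<RAWINPUTHEADER>(buffer);
                    // header.Device differs per source; compare it against handles
                    // enumerated with GetRawInputDeviceList to tell devices apart.
                    Console.WriteLine($"Mouse input from device 0x{header.Device.ToInt64():X}");
                }
            }
            finally { Marshal.FreeHGlobal(buffer); }
        }
        base.WndProc(ref m);
    }
}

For WPF, the same registration applies, but the WM_INPUT messages arrive through ComponentDispatcher.ThreadFilterMessage instead of WndProc, as noted above.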

Capture event for fingerprint reader in no active application

I have an application that validates users through a fingerprint reader. The validation is done in a method that I subscribed to handle the event; it looks like this:
FingerprintVerificationControl.OnComplete += new DPFP.Gui.Verification.VerificationControl._OnComplete(FingerprintVerificationControl_OnComplete);
Everything goes well while I'm working with the application, i.e. while it has focus. However, I have put it in the system tray using a NotifyIcon control, associated with a ContextMenu control to restore and close the app, and when it sits in the system tray (it is not the active application) I get no response from the fingerprint reader: the event for reading the user's finger does not fire.
My question is: what is the best way to handle that? Is it even possible? I found that I could do it with a Windows service; other sites say to use the Win32 API; others have examples, but only for keyboard events such as key presses. Any ideas? Any help would be appreciated.
I have found that making a Windows service is the best way to do stuff like that. However, I don't know much about the Win32 API, because I've been able to do everything I need without it.
As for the keyboard events, I have tried those and found that they only work when the application has focus, so they are unusable here. Services, on the other hand, are useful in general. You can have your service talk to your application via the local network so you don't have to rework the entire application.
I hope this is helpful.
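As a hedged sketch of the service-to-application idea (a named pipe is often simpler than going over the local network), something like the following could carry results from the service to the tray app. The pipe name and message format are made up for illustration and are not part of the DigitalPersona (DPFP) SDK.

// Hedged sketch: pushing fingerprint results from a Windows service to the
// tray application over a named pipe. "FingerprintResults" is a hypothetical name.
using System.IO;
using System.IO.Pipes;

static class FingerprintIpc
{
    const string PipeName = "FingerprintResults";

    // Service side: call this from the fingerprint OnComplete handler.
    public static void SendResult(string result)
    {
        using (var pipe = new NamedPipeClientStream(".", PipeName, PipeDirection.Out))
        {
            pipe.Connect(1000);   // give up if the UI app is not running
            using (var writer = new StreamWriter(pipe))
            {
                writer.WriteLine(result);
                writer.Flush();
            }
        }
    }

    // Application side: run on a background thread; marshal the result back to
    // the UI thread (e.g. Control.Invoke) before showing it.
    public static string WaitForResult()
    {
        using (var pipe = new NamedPipeServerStream(PipeName, PipeDirection.In))
        {
            pipe.WaitForConnection();
            using (var reader = new StreamReader(pipe))
            {
                return reader.ReadLine();
            }
        }
    }
}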

How to use a console/Windows app without focus on it?

I'm going to create a keylogging application that writes all data typed (keys pressed) to a text file/database. How can I do this without focus being on the Windows app/console app?
Just so you all know, this is for my own PC and I'm not trying to hack an account.
I just want to know what others are doing on my computer.
Find an example written in .NET here:
Processing Global Mouse and Keyboard Hooks in C#
This class allows you to tap keyboard and mouse and/or to detect their activity even when an application runs in the background or does not have any user interface at all.
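For reference, here is a minimal sketch of the same idea (this is not the code from the linked article): a global WH_KEYBOARD_LL hook that logs virtual-key codes to a file regardless of which window has focus. The file name is arbitrary.

// Hedged sketch: global low-level keyboard hook (WH_KEYBOARD_LL) that logs key-down events.
using System;
using System.Diagnostics;
using System.IO;
using System.Runtime.InteropServices;

static class KeyLoggerHook
{
    const int WH_KEYBOARD_LL = 13;
    const int WM_KEYDOWN = 0x0100;

    delegate IntPtr HookProc(int nCode, IntPtr wParam, IntPtr lParam);

    [DllImport("user32.dll", SetLastError = true)]
    static extern IntPtr SetWindowsHookEx(int idHook, HookProc lpfn, IntPtr hMod, uint dwThreadId);

    [DllImport("user32.dll", SetLastError = true)]
    static extern bool UnhookWindowsHookEx(IntPtr hhk);

    [DllImport("user32.dll")]
    static extern IntPtr CallNextHookEx(IntPtr hhk, int nCode, IntPtr wParam, IntPtr lParam);

    [DllImport("kernel32.dll")]
    static extern IntPtr GetModuleHandle(string lpModuleName);

    static IntPtr _hookId = IntPtr.Zero;
    static readonly HookProc _proc = HookCallback;   // keep the delegate alive for the hook's lifetime

    public static void Start()
    {
        using (var module = Process.GetCurrentProcess().MainModule)
            _hookId = SetWindowsHookEx(WH_KEYBOARD_LL, _proc, GetModuleHandle(module.ModuleName), 0);
    }

    public static void Stop() => UnhookWindowsHookEx(_hookId);

    static IntPtr HookCallback(int nCode, IntPtr wParam, IntPtr lParam)
    {
        if (nCode >= 0 && wParam == (IntPtr)WM_KEYDOWN)
        {
            int vkCode = Marshal.ReadInt32(lParam);   // first field of KBDLLHOOKSTRUCT
            // cast to System.Windows.Forms.Keys if you prefer readable names
            File.AppendAllText("keys.txt", vkCode + Environment.NewLine);
        }
        return CallNextHookEx(_hookId, nCode, wParam, lParam);   // never swallow the key
    }
}

As with any low-level hook, the installing thread needs a running message loop, and the callback should return quickly.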

Send keys to WPF Browser control

Can I programmatically send [UserID]{TAB}[Password]{CARRIAGE RETURN} to a WebBrowser control which has a user ID field, a password field and a sign-in button? I want to use my own virtual keyboard in my application. Any tips?
Sorry for the late answer, but I've just finished a similar project, and as part of the work I am in the process of open-sourcing two projects to CodePlex.
The first is the Windows Input Simulator, which is a simple .NET wrapper around the Win32 SendInput API, written in C#.
The second is a very customisable on-screen keyboard or touch-screen keyboard control and toolkit called WpfKB, which will be available as an initial release tomorrow. Hope these are of help to you or anyone else who comes across the projects.
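For example (a rough usage sketch with the WindowsInput package rather than verified sample code), the [UserID]{TAB}[Password]{CARRIAGE RETURN} sequence from the question could be driven like this, assuming the user-ID field already has keyboard focus; the credential values are placeholders:

// Rough usage sketch with the Windows Input Simulator library (WindowsInput namespace).
using WindowsInput;
using WindowsInput.Native;

var sim = new InputSimulator();
sim.Keyboard
   .TextEntry("myUserId")               // [UserID]   (placeholder)
   .KeyPress(VirtualKeyCode.TAB)        // {TAB}
   .TextEntry("myPassword")             // [Password] (placeholder)
   .KeyPress(VirtualKeyCode.RETURN);    // {CARRIAGE RETURN}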
I recently had to implement automatic authentication through a WPF browser control, and I looked into simulating keystrokes. I didn't need a full virtual keyboard so interacting with the DOM of the login page through IHTMLDocument2 ended up being the best approach, but I looked into keystroke automation before making that decision and found a few options.
You can raise the appropriate routed events on the control as described in Simulating basic keyboard events and Simulating text input. I don't know of any specific problems with this approach but I opted against it simply because I wasn't comfortable simulating input without looking at how the CLR handles the actual input, and without at least raising the complete lifetime (PreviewKeyDown, KeyDown, PreviewKeyUp, KeyUp) I was wary of unintended consequences.
Take a look at WOSK on CodePlex. It's a good example of how to invoke the Win32 keybd_event and SendInput functions to generate the low-level input messages via the Managed Windows API to simulate input. There's some unnecessary fluff (e.g. transparency) and some odd WPF usage, such as using a CommandParameter with a Click event instead of a Command on the buttons, but the general approach is sane and it's reasonably complete.
You can also invoke the Windows on-screen keyboard, as alluded to by Jeroen. I didn't try this because I didn't need a virtual keyboard, but if you're going to call into Win32 anyway, you might as well follow the WOSK model and build the UI the way you want it.
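If you do end up calling into Win32 yourself, the keybd_event function mentioned above is the simplest option (SendInput has since superseded it). A minimal hedged sketch of pressing and releasing a single key:

// Minimal keybd_event sketch: press and release the TAB key.
using System;
using System.Runtime.InteropServices;

static class KeySender
{
    const byte VK_TAB = 0x09;
    const uint KEYEVENTF_KEYUP = 0x0002;

    [DllImport("user32.dll")]
    static extern void keybd_event(byte bVk, byte bScan, uint dwFlags, UIntPtr dwExtraInfo);

    public static void SendTab()
    {
        keybd_event(VK_TAB, 0, 0, UIntPtr.Zero);                // key down
        keybd_event(VK_TAB, 0, KEYEVENTF_KEYUP, UIntPtr.Zero);  // key up
    }
}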
