The Microsoft WinAPI documentation appears to suggest that user32.dll contains a function called GetNextWindow() which supposedly allows one to enumerate open windows in their Z order by calling this function repeatedly.
Pinvoke.net usually gives me the necessary DllImport declaration for using WinAPI functions from C#. For GetNextWindow(), however, it doesn't have an entry, so I tried to construct my own:
[DllImport("user32.dll")]
static extern IntPtr GetNextWindow(IntPtr hWnd, uint wCmd);
Unfortunately, when trying to call this, I get an EntryPointNotFoundException saying:
Unable to find an entry point named 'GetNextWindow' in DLL 'user32.dll'.
This seems to apply only to GetNextWindow(); other functions that are listed on Pinvoke.net work fine. I can call GetTopWindow() and GetWindowText() without throwing an exception.
Of course, if you can suggest a completely different way to enumerate windows in their current Z order, I'm happy to hear that too.
GetNextWindow() is actually a macro for GetWindow(), rather than an actual API method. It's for backward compatibility with the Win16 API.
[DllImport("user32.dll", SetLastError = true)]
static extern IntPtr GetWindow(IntPtr hWnd, uint uCmd);
enum GetWindow_Cmd : uint {
GW_HWNDFIRST = 0,
GW_HWNDLAST = 1,
GW_HWNDNEXT = 2,
GW_HWNDPREV = 3,
GW_OWNER = 4,
GW_CHILD = 5,
GW_ENABLEDPOPUP = 6
}
(From Pinvoke.net)
GetNextWindow is a C/C++ macro that expands to a call to GetWindow, so you cannot call it from .NET. Call GetWindow instead.
From MSDN:
Using this function is the same as calling the GetWindow function with the GW_HWNDNEXT or GW_HWNDPREV flag set
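To enumerate the windows in their current Z order (the last part of the question), something along these lines should work. This is a minimal, self-contained sketch: GetTopWindow starts the walk at the top of the Z order, and IsWindowVisible / GetWindowText are extra user32 imports only there to make the output readable:
using System;
using System.Runtime.InteropServices;
using System.Text;

static class ZOrderWalker
{
    [DllImport("user32.dll", SetLastError = true)]
    static extern IntPtr GetWindow(IntPtr hWnd, uint uCmd);

    [DllImport("user32.dll")]
    static extern IntPtr GetTopWindow(IntPtr hWnd);

    [DllImport("user32.dll")]
    static extern bool IsWindowVisible(IntPtr hWnd);

    [DllImport("user32.dll", CharSet = CharSet.Unicode)]
    static extern int GetWindowText(IntPtr hWnd, StringBuilder lpString, int nMaxCount);

    const uint GW_HWNDNEXT = 2;

    static void Main()
    {
        var title = new StringBuilder(256);
        // IntPtr.Zero makes GetTopWindow start with the desktop's children, i.e. the top-level windows.
        for (IntPtr hWnd = GetTopWindow(IntPtr.Zero); hWnd != IntPtr.Zero; hWnd = GetWindow(hWnd, GW_HWNDNEXT))
        {
            title.Clear();
            if (IsWindowVisible(hWnd) && GetWindowText(hWnd, title, title.Capacity) > 0)
                Console.WriteLine("{0}: {1}", hWnd, title);
        }
    }
}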
Background
I'm working with the console and P/Invoke in C#, just to see what magic you can do when you're not limiting yourself to the .NET Framework. I stumbled upon a function in Kernel32 called GetConsoleDisplayMode that basically lets you see if the console is running in windowed, full-screen or hardware full-screen mode. The documentation for this function can be found here.
GetConsoleDisplayMode takes only one parameter: an LPDWORD in which the display mode is stored. The function returns a BOOL indicating the success of the call. According to that documentation, an LPDWORD is a pointer to a DWORD, which in turn is simply a 32-bit unsigned integer (uint in C#).
So, from what I understand from the docs, this should work:
[DllImport("kernel32", EntryPoint = "GetConsoleDisplayMode", ExactSpelling = false, CharSet = CharSet.Unicode, SetLastError = true)]
public static extern bool GetConsoleDisplayMode(ref uint lpModeFlags);
static void Main(string[] args) {
uint mode = 0;
var success = GetConsoleDisplayMode(ref mode);
Console.WriteLine(success);
Console.WriteLine(mode);
Console.ReadKey(true);
}
However, the function always returns false!
What I have tried
Of course I have tried to modify the code, in hopes of getting it to work properly.
A call to GetLastError() returns the most curious error code 3221684230, which neither Google nor the Microsoft documentation knows anything about (nor is it uint.MaxValue or int.MaxValue; I checked). This didn't get me a step further, so I dropped this option.
Another thing I've tried (in the hopes that I had read or understood the documentation incorrectly) is changing ref to out, which passes an uninitialized variable for the function to initialize for me:
[DllImport("kernel32", EntryPoint = "GetConsoleDisplayMode", ExactSpelling = false, CharSet = CharSet.Unicode, SetLastError = true)]
public static extern bool GetConsoleDisplayMode(out uint lpModeFlags);
static void Main(string[] args) {
var success = GetConsoleDisplayMode(out var mode);
if (!success) {
Console.WriteLine(Kernel32.GetLastError());
}
Console.WriteLine(success);
Console.WriteLine(mode);
Console.ReadKey(true);
}
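(Aside: since the declaration sets SetLastError = true, the documented way to read the error code from managed code is Marshal.GetLastWin32Error(); a P/Invoked GetLastError can already have been overwritten by calls the runtime itself makes. A minimal sketch of that pattern, reusing the declaration above:)
if (!GetConsoleDisplayMode(out var mode))
{
    // SetLastError = true makes the CLR capture the error code immediately after the call;
    // Marshal.GetLastWin32Error() returns that captured value.
    int error = Marshal.GetLastWin32Error();
    Console.WriteLine($"GetConsoleDisplayMode failed with error {error} (0x{error:X8})");
}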
Alas, this too was unsuccessful and, interestingly, produces the same error code.
My question(-s)
Why does my call to GetConsoleDisplayMode always return false, why does GetLastError return such an obscure error code and most importantly: how do I find the error that's bugging me and/or get this to work?
More information
I am running Windows 10 x64, version 1709 (Fall Creators Update).
Maybe this excerpt from John Paul Mueller's book on the Windows API and the .NET framework can be of help (that's about the best documentation of GetConsoleDisplayMode that I can find).
On a system with a non-US keyboard / culture, I am receiving a barcode from a scanner as keyboard input. The scanner can be set to different cultures; in my case it is set to en-US, so the system language and the barcode scanner's encoding are different.
I have declared this function to decode:
[DllImport("user32.dll", CharSet = CharSet.Unicode)]
private static extern int ToUnicodeEx(uint virtualKeyCode, uint scanCode, byte[] keyboardState, [Out, MarshalAs(UnmanagedType.LPWStr, SizeParamIndex = 4)] StringBuilder receivingBuffer, int bufferSize, uint flags, IntPtr dwhkl);
I use the method below to load a keyboard layout:
[DllImport("user32.dll")]
private static extern IntPtr LoadKeyboardLayout(string pwszKLID, uint Flags);
I use the method like below:
// loads the interpretation of the key into buf.
ToUnicodeEx(key, scankey, keyboardState, buf, 256, 0, LoadKeyboardLayout("00000409", 1));
My usage of the method works - the interpretation is correct - BUT there is a side effect: after the call, my system's input language setting has changed (the language indicator switches from my system layout to the loaded one).
How can I fix my code so that my system's language is not affected?
I have tried changing the Flags parameter of LoadKeyboardLayout to 0, but then ToUnicodeEx uses the system language, not the loaded en-US layout.
I'm sure that almost four years later you've either solved this or no longer care. I'll put the answer here in case someone else comes across this, though.
The last parameter of LoadKeyboardLayout is dwFlags. You gave it 1, which is equivalent to KLF_ACTIVATE, or "load this bad boy up and make it my keyboard!"
You want to call it with 0, which tells it to just load the layout, give you a handle (IntPtr) to it, and do nothing else.
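In code, a minimal sketch reusing the declarations and variables (key, scankey, keyboardState, buf) from the question:
// Flags = 1 (KLF_ACTIVATE) loads AND activates the layout - that's the side effect.
// Flags = 0 just loads the layout and returns an HKL handle without switching the active layout.
IntPtr enUsLayout = LoadKeyboardLayout("00000409", 0);
ToUnicodeEx(key, scankey, keyboardState, buf, 256, 0, enUsLayout);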
I'm sorry I couldn't help you back when you posted this. :-)
I'm writing a super-simple, ultra-lightweight .NET wrapper for the LibVLC media library, since the only things I need are the ability to play, pause and stop media files. I've posted a couple of questions on this and gotten some answers, but unfortunately I'm just left with more questions.
We'll start from the top and work down.
The documentation first states I have to initialize VLC with a call to the function with this specification:
libvlc_instance_t* libvlc_new (int argc, const char *const *argv)
for which I have defined the following method:
[DllImport("libvlc", EntryPoint = "libvlc_new",
CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr NewCore(int argc, IntPtr argv);
And I'm calling the function like this:
private IntPtr Instance;
this.Instance = DGLibVLC.NewCore(0, IntPtr.Zero);
I have tried it several different ways. Initially I did not know about the CallingConvention, which was leading to an unbalanced stack and is what brought me here in the first place. That issue was resolved, and the method has gone through several iterations, none of which have proved successful - by which I mean the returned IntPtr is always 0 after the call. I've tried it like it is above, with the second argument declared as a plain string[], and as [MarshalAs(UnmanagedType.LPArray, ArraySubType = UnmanagedType.LPStr)] string[].
I've also tried having it return a long (which actually resulted in the long having a value in it), but nothing so far has worked correctly.
Does anyone know the correct way to call this function from the LibVLC DLL Library?
EDIT: On a suggestion I tried calling the error message function of the library:
Specification:
const char* libvlc_errmsg (void)
Implementation:
[DllImport("libvlc", EntryPoint = "libvlc_errmsg",
CallingConvention = CallingConvention.Cdecl)]
public static extern string GetLastError();
Call:
Console.WriteLine(DGLibVLC.GetLastError());
Result:
Null
The documentation states it will return NULL if there is no error. This must indicate that the initial call to NewCore was working correctly, but something is still going wrong somehow.
To cover all bases, I checked that the DLLs match the documentation; they do (2.0.6.0). The documentation I am referencing is here.
EDIT: I can confirm there is no error. When I use a long variable initialized to zero to store the result of NewCore, I can see it returning something. What I am doing wrong is how I store the pointer returned by the unmanaged function. How do I store the pointer to the opaque structure reference being passed back?
It doesn't have anything to do with the way you call the function. You cannot get anywhere when you get IntPtr.Zero back from libvlc_new(). It means "there was an error". You'll need to focus on error reporting first, call libvlc_errmsg() to try to get a description for the problem.
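One thing to watch out for in the libvlc_errmsg declaration from the question: declaring the return type as string makes the marshaller assume it owns the returned buffer and free it, which it must not do for a const char* that libvlc still owns. The usual pattern is to return an IntPtr and convert it yourself - a minimal sketch (the managed names are mine):
[DllImport("libvlc", EntryPoint = "libvlc_errmsg", CallingConvention = CallingConvention.Cdecl)]
private static extern IntPtr libvlc_errmsg_native();

// Returns null when libvlc has no pending error message.
public static string GetLastLibVlcError()
{
    IntPtr msg = libvlc_errmsg_native();
    return msg == IntPtr.Zero ? null : Marshal.PtrToStringAnsi(msg);
}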
So after much looking around and asking questions I've come full circle.
I dug into LibVLC.Net, found how it imports the DLL functions, adapted that approach in my own wrapper, and it worked.
To summarize:
There are some Win32 API functions declared in the code at the start:
[DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
[return: MarshalAs(UnmanagedType.Bool)]
private static extern bool SetDllDirectory(string lpPathName);
[DllImport("kernel32", SetLastError = true)]
private static extern IntPtr LoadLibrary(string lpFileName);
[DllImport("kernel32", CharSet = CharSet.Ansi, ExactSpelling = true, SetLastError = true)]
private static extern IntPtr GetProcAddress(IntPtr hModule, string procName);
[DllImport("kernel32.dll", SetLastError = true)]
private static extern bool FreeLibrary(IntPtr hModule);
that handle setting the DLL search path, loading a DLL and getting a handle to it, and resolving its exported functions.
I don't know exactly what it all means but when you initialize the LibVLC.Net library (the primary object) it loads pretty much EVERY function like so:
m_libvlc_media_player_new = (libvlc_media_player_new_signature)LoadDelegate<libvlc_media_player_new_signature>("libvlc_media_player_new");
That delegate is defined here like so:
[UnmanagedFunctionPointer(CallingConvention.Cdecl)]
private delegate IntPtr libvlc_media_player_new_signature(IntPtr p_instance);
//==========================================================================
private readonly libvlc_media_player_new_signature m_libvlc_media_player_new;
//==========================================================================
public IntPtr libvlc_media_player_new(IntPtr p_instance)
{
VerifyAccess();
return m_libvlc_media_player_new(p_instance);
}
And it has a public function that calls the delegate once defined.
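LoadDelegate itself isn't shown above, but judging by the Win32 declarations it is presumably a thin helper around GetProcAddress plus Marshal.GetDelegateForFunctionPointer - something along these lines (my guess at its shape, not LibVLC.Net's actual code; m_libvlc_handle would be the IntPtr returned earlier by LoadLibrary):
private T LoadDelegate<T>(string functionName) where T : class
{
    // Look up the exported function in the already-loaded libvlc module...
    IntPtr proc = GetProcAddress(m_libvlc_handle, functionName);
    if (proc == IntPtr.Zero)
        throw new MissingMethodException("libvlc does not export " + functionName);
    // ...and wrap the raw function pointer in a strongly typed delegate.
    return (T)(object)Marshal.GetDelegateForFunctionPointer(proc, typeof(T));
}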
I simply stripped down the function that defines the library instance and imported only the functionality I needed.
Thanks very much to everyone who was so patient in helping me along. I likely wouldn't have been able to come to a solution without your help.
EDIT: Okay so it wasn't that. It was the location of the LibVLC Plugin Directory. So it was something stupid -.-;
I want to use the \\?\ prefix as described in this MSDN BCL Team Blog post, Long Paths in .NET, Part 2 of 3: Long Path Workarounds [Kim Hamilton].
Even after going through it again and again, I couldn't figure out how to actually use this feature. Can anyone show me the simplest way to use it?
Note: I want to use it for creating a directory
You have to use Win32 functions and P/Invoke to achieve this. Use the Unicode version of the API.
Here's what you're looking for:
[DllImport("kernel32.dll", CharSet = CharSet.Unicode)]
[return: MarshalAs(UnmanagedType.Bool)]
static extern bool CreateDirectory(string lpPathName, IntPtr lpSecurityAttributes);
public static void CreateDir(string dirPath)
{
if (!CreateDirectory(@"\\?\" + dirPath, IntPtr.Zero))
{
throw new IOException("Could not create dir");
}
}
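Usage is then simply the following (a sketch under two assumptions I'm making: C:\temp already exists, and each component stays under NTFS's 255-character per-component limit; the \\?\ prefix only lifts the overall MAX_PATH limit and requires a fully qualified backslash path):
// Each CreateDirectory call creates only the last component, so build the tree level by level.
CreateDir(@"C:\temp\" + new string('a', 200));
CreateDir(@"C:\temp\" + new string('a', 200) + @"\" + new string('b', 200)); // total path well over 260 chars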
CreateDirectory method
More information about how naming works in Windows
I've noticed something very strange. I was trying to call the CRT function "putchar", and was unable to get it to work. So I double-checked that I wasn't missing something, and I copied the code directly from the P/Invoke tutorial on MSDN to see if it worked.
http://msdn.microsoft.com/en-us/library/aa288468%28VS.71%29.aspx
You'll notice that they import "puts".
So I tested the exact code copied from MSDN. It didn't work! So now I got frustrated. I've never had this problem before.
Then I just so happened to run WITHOUT debugging (hit ctrl+f5), and it worked! I tested other functions which output to the console too, and none of them work when debugging but all work when not debugging.
I then wrote a simple C dll which exports a function called "PrintChar(char c)". When I call that function from C#, it works even if I'm debugging or not, without any problems.
What is the deal with this?
The Visual Studio hosting process is capable of redirecting console output to the Output window. How exactly it manages to do this is not documented at all, but it gets in the way here. It intercepts the WriteFile() call that generates the output of puts().
Project + Properties, Debug tab, untick "Enable the Visual Studio hosting process". On that same page, enabling unmanaged debugging also fixes the problem.
It's a bad example, using the C-Runtime Library DLL to call puts. Keep reading the tutorial as there is good info there, but try making Win32 API calls instead.
Here is a better introduction to p/invoke: http://msdn.microsoft.com/en-us/magazine/cc164123.aspx
It's old, but the information is still good.
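For example, here is a first P/Invoke that goes straight at the Win32 API rather than the C runtime (MessageBox is just a convenient function to demonstrate with; this is a sketch, not part of the linked tutorial):
using System;
using System.Runtime.InteropServices;

class Win32Hello
{
    [DllImport("user32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    static extern int MessageBox(IntPtr hWnd, string text, string caption, uint type);

    static void Main()
    {
        // 0 = MB_OK; the return value identifies the button the user clicked (IDOK = 1).
        MessageBox(IntPtr.Zero, "Hello from P/Invoke", "Win32", 0);
    }
}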
Edited
My explanation was wrong.
I went looking for a correct explanation and discovered that the C-Runtime puts method and the .NET Framework Console.Write method differ in how they write to the console (Console.Write works where the p/invoke to puts does not). I thought maybe the answer was in there, so I whipped up this demonstration:
using System;
using System.Diagnostics;
using System.IO;
using System.Runtime.InteropServices;
using System.Text;
class Program
{
public static void Main()
{
int written;
string outputString = "Hello, World!\r\n";
byte[] outputBytes = Encoding.Default.GetBytes(outputString);
//
// This is the way the C-Runtime Library method puts does it
IntPtr conOutHandle = CreateFile("CONOUT$", 0x40000000, FileShare.ReadWrite, IntPtr.Zero, FileMode.Open, 0, IntPtr.Zero);
WriteConsole(conOutHandle, outputBytes, outputString.Length, out written, IntPtr.Zero);
//
// This is the way Console.Write does it
IntPtr stdOutputHandle = GetStdHandle(STD_OUTPUT_HANDLE);
WriteFile(stdOutputHandle, outputBytes, outputBytes.Length, out written, IntPtr.Zero);
// Pause if running under debugger
if (Debugger.IsAttached)
{
Console.Write("Press any key to continue . . . ");
Console.ReadKey();
}
}
const int STD_OUTPUT_HANDLE = -11;
[DllImport("kernel32.dll", SetLastError = true)]
static extern IntPtr GetStdHandle(int nStdHandle);
[DllImport("kernel32.dll", SetLastError = true)]
static extern int WriteFile(IntPtr handle, [In] byte[] bytes, int numBytesToWrite, out int numBytesWritten, IntPtr mustBeZero);
[DllImport("kernel32.dll", CharSet = CharSet.Auto, SetLastError = true)]
static extern IntPtr CreateFile(string lpFileName, int dwDesiredAccess, FileShare dwShareMode, IntPtr securityAttrs, FileMode dwCreationDisposition, int dwFlagsAndAttributes, IntPtr hTemplateFile);
[DllImport("kernel32.dll", CharSet = CharSet.Ansi, SetLastError = true)]
static extern bool WriteConsole(IntPtr hConsoleOutput, [In] byte[] lpBuffer, int nNumberOfCharsToWrite, out int lpNumberOfCharsWritten, IntPtr mustBeZero);
}
Both of those successfully output under the debugger, even with the hosting process enabled. So that is a dead end.
I wanted to share it in case it leads someone else to figuring out why it happens -- Hans?