.NET TCP/IP: send character with Ctrl key modifier - C#

I am interfacing with a piece of hardware via a TCP/IP socket; however, the documentation is limited.
At present I can get the response I am after by sending Ctrl+G in PuTTY, which prints as '^G'.
How can I replicate this in code? I can send 'G', but I'm not sure how to modify it with 'Ctrl'.
Dim buffer = Text.ASCIIEncoding.ASCII.GetBytes(cmd & vbNewLine)
ns.Write(buffer, 0, buffer.Length) 'ns = Network Stream
ns.Flush()
I am working in VB.Net; however, examples in C# are fine.

The Ctrl key isn't actually sent; Ctrl + a character acts as a shortcut for one of the ASCII control characters. In this case Ctrl+G sends the BEL control character (ASCII value 7, printed as ^G).
https://en.wikipedia.org/wiki/Control_character#In_ASCII
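So rather than trying to modify 'G', send the raw byte with value 7. A minimal C# sketch (assuming ns is an open NetworkStream and the device still expects the CR LF that vbNewLine appended in the original code):
byte[] buffer = { 0x07, (byte)'\r', (byte)'\n' }; // 0x07 is BEL, i.e. what Ctrl+G produces
ns.Write(buffer, 0, buffer.Length);
ns.Flush();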

Related

Chrome Remote Desktop: Console.ReadKey yields System.ConsoleKey.Packet instead of keystroke

This is more of a curiosity to be completely honest.
Inside a console project I am developing, I request input from the user in a loop of this form:
ConsoleKey response;
Console.Write("Write messages to log? [y/n] ");
response = Console.ReadKey(false).Key;
Assume 4 cases:
Case 1: Pressing the 'y' key on a keyboard directly attached to the PC running the software.
Case 2: Connecting from another computer through Remote Desktop Connection and pressing the 'y' key on that machine.
Case 3: Using the on-screen keyboard and pressing the 'y' key by clicking on it (remotely or locally made no difference).
Case 4: Connecting from another machine (specifically a phone) through Chrome Remote Desktop and pressing the 'y' key.
In cases 1, 2 and 3, 'response' will contain 'Y'.
In case 4, 'response' will instead contain System.ConsoleKey.Packet, i.e. enum value 231, the PACKET key (used to pass Unicode characters with keystrokes).
I have tried the same thing through a second PC and noticed that this behavior does not appear to occur there.
The most interesting thing is that the console shows me this:
Write messages to log? [y/n] y
From this I gather that the keystroke is indeed received but handled incorrectly by my code.
I am at a loss as to how to proceed.
Console.ReadLine yields the correct characters, but I would prefer to use Console.ReadKey if possible.
Is this behavior specific to phone keyboards? How would I obtain the actual key?
The MSDN docs for ConsoleKey.Packet don't say anything useful, so I found references to ConsoleKey in the source, which led here. That code casts ir.keyEvent.virtualKeyCode to a ConsoleKey, where ir is an InputRecord.
A quick Google search finds that the WinAPI equivalent is INPUT_RECORD, and chasing the docs through KEY_EVENT_RECORD leads to the documentation of Virtual-Key codes, which contains some more detail for VK_PACKET:
Used to pass Unicode characters as if they were keystrokes. The VK_PACKET key is the low word of a 32-bit Virtual Key value used for non-keyboard input methods. For more information, see Remark in KEYBDINPUT, SendInput, WM_KEYDOWN, and WM_KEYUP
The Remarks for KEYBDINPUT say:
INPUT_KEYBOARD supports nonkeyboard-input methods—such as handwriting recognition or voice recognition—as if it were text input by using the KEYEVENTF_UNICODE flag. If KEYEVENTF_UNICODE is specified, SendInput sends a WM_KEYDOWN or WM_KEYUP message to the foreground thread's message queue with wParam equal to VK_PACKET. Once GetMessage or PeekMessage obtains this message, passing the message to TranslateMessage posts a WM_CHAR message with the Unicode character originally specified by wScan. This Unicode character will automatically be converted to the appropriate ANSI value if it is posted to an ANSI window.
From my searching it doesn't look like .NET implements this mechanism for you, so you might have to do it yourself!
I'm afraid I've no idea why it's happening in your case however...
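If you only need the character rather than the key, one workaround (an assumption on my part, untested over Chrome Remote Desktop) is to ignore Key and read KeyChar, which should still carry the translated character:
ConsoleKeyInfo info = Console.ReadKey(true);
// When the key arrives as VK_PACKET, info.Key is ConsoleKey.Packet,
// but info.KeyChar should still hold the character that was typed.
char ch = char.ToLower(info.KeyChar);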
I just ran into this issue too! In my case, it also only happened when connecting to my development PC from a phone (I was using the RDP app).
Here's what I came up with today that works!
public static bool Confirm(string prompt)
{
    Console.Write($"{prompt} [y/n] ");
    char letter;
    do
    {
        letter = char.ToLower(Console.ReadKey(true).KeyChar);
    } while (letter != 'y' && letter != 'n');
    Console.WriteLine(letter);
    return letter == 'y';
}
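Usage then looks like this (a sketch):
if (Confirm("Write messages to log?"))
{
    // ... enable logging here
}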
Thanks for confirming that I'm not the only one with the issue!

C# SerialPort: how to send "0 byte" transfers (AKA ZLP: zero-length packet)?

I've done some playing around with SerialPort (frustratingly so) and have finally hit a point where I absolutely have no idea why this isn't working. There's a USB CDC device that I'm trying to send hex commands to; I'm doing this over the COM port interface it exposes. I can handshake with the device (when I say HI, it replies with HI back), but the next command I send must be followed by a zero-byte packet, or else the device stops responding altogether. Keep in mind, this zero-byte packet has ABSOLUTELY nothing in it, meaning it doesn't contain a \0 or a 0x00 or 0 or even a null (SerialPort throws an exception on null).
Now, one way I was able to circumvent this was to use LibUsbDotNet. I accessed the CDC device directly instead of the COM interface, set the endpoints correctly, and sent hex commands that way. I'm able to successfully send "0 byte" packets using this method with the following C# code:
string zlpstring = "";
byte[] zlpbyte = Encoding.Default.GetBytes(zlpstring);
....snip
ecWrite = writer.SubmitAsyncTransfer(zlpbyte, 0, zlpbyte.Length, 100, out usbWriteTransfer);
zlpbyte is the buffer, 0 is the offset, zlpbyte.Length is the packet length in bytes, 100 is the timeout, and out usbWriteTransfer is the transfer context.
When I use this same method on the COM port:
string zlpstring = "";
byte[] zlpbyte = Encoding.Default.GetBytes(zlpstring);
_serialPort.Write(zlpbyte, 0, zlpbyte.Length);
the USB logger reports that absolutely nothing was sent. It's as if the COM port is ignoring the zero-byte transfer. Before anyone says "you cannot do this": there are various programs out there that can send a zero-byte packet to this exact device's COM port without doing ANY driver manipulation. That is what I'm going for, which is why I'm trying to ditch LibUsbDotNet and go straight to the COM port.
EDIT:
After some more toying around with a different USB logger, I don't see zero bytes being sent, but rather this:
IRP_MJ_DEVICE_CONTROL (IOCTL_SERIAL_WAIT_ON_MASK)
I think this may be the issue. If a 0 byte were being sent, then I assume it would show up as:
IRP_MJ_WRITE > UP > STATUS_SUCCESS > (blank) > (blank)
My program gets back a response of 01 00 00 00; however, while logging another, successful program, I see it SETTING the wait mask:
IRP_MJ_DEVICE_CONTROL (IOCTL_SERIAL_SET_WAIT_MASK) DOWN STATUS_SUCCESS 01 00 00 00
If my assumptions are right, this question might've just turned into: how do I set a serial/COM port's wait mask? There's absolutely nothing about this in the C# SerialPort class... which is why I can now see why so many articles called it "lacking". I also took a look around C++: https://msdn.microsoft.com/en-us/library/aa363214(v=vs.85).aspx - this also does not seem to cover the wait mask. Using the USB filter libusb is starting to look a lot more appealing by the minute... (although I'll forever wonder why sending a zero byte works there but not over SerialPort).
SECOND EDIT:
I'm a moron. It was definitely a setting that the manufacturer probably didn't figure anyone would ever touch or know how to set:
#define EV_RXFLAG 0x0001
SetCommMask(hSerial, EV_RXFLAG);
I then saw this over the USB logs:
IRP_MJ_DEVICE_CONTROL (IOCTL_SERIAL_SET_WAIT_MASK) DOWN STATUS_SUCCESS 01 00 00 00
Bingo. The RXFLAG mask was originally set to 0x0002. I haven't found a way to change this in C# yet, so I had to make do with some C++ code for now. It totally works and sends the "zero byte" like it's supposed to, without me actually sending it from the code. I assume this setting was the "handshake" method between my device and whatever it interacts with in flash mode. Hope this helps someone else out there whose COM/serial device is rejecting/discarding zero-byte packets yet requiring ZLP at the same time... how goofy?!
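For anyone who wants to stay in C#: SetCommMask is a plain Win32 call, so it can be reached via P/Invoke. Here is a hedged sketch (an illustration, not from the original post) that opens the port directly with CreateFile, since SerialPort does not expose the wait mask. Note the mask applies to this handle, so reads/writes would also have to go through it (ReadFile/WriteFile) rather than through a separate SerialPort instance; the port name is a placeholder.
using System;
using System.ComponentModel;
using System.Runtime.InteropServices;

class WaitMaskSketch
{
    const uint GENERIC_READ = 0x80000000;
    const uint GENERIC_WRITE = 0x40000000;
    const uint OPEN_EXISTING = 3;
    const uint EV_RXFLAG = 0x0001; // same flag as in the C++ fix above

    [DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Auto)]
    static extern IntPtr CreateFile(string fileName, uint access, uint share,
        IntPtr securityAttributes, uint creationDisposition, uint flags, IntPtr template);

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetCommMask(IntPtr handle, uint eventMask);

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool CloseHandle(IntPtr handle);

    static void Main()
    {
        // "COM3" is a placeholder; use your device's port name.
        IntPtr handle = CreateFile(@"\\.\COM3", GENERIC_READ | GENERIC_WRITE,
            0, IntPtr.Zero, OPEN_EXISTING, 0, IntPtr.Zero);
        if (handle == new IntPtr(-1))
            throw new Win32Exception(); // surfaces the last Win32 error

        if (!SetCommMask(handle, EV_RXFLAG))
            throw new Win32Exception();

        CloseHandle(handle);
    }
}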
Have you tried concatenating an extra new line or carriage return (or both) at the end of the data?
I would say add a 0xA (new line), or a 0xD (carriage return), or both 0xA and 0xD to the end of your byte array and see if you get something.
byte[] zlpbyte = new byte[1] {0};
_serialPort.Write(zlpbyte, 0, 1);
[Update]
Based on our discussion, it appears that you are trying to take control over the serial port's control signals. I have not tried it before, but I can see that it is possible (if I understand the source properly) to put the control signals into certain states.
Try setting the Handshake property:
public enum Handshake
{
    None,
    XOnXOff,
    RequestToSend,
    RequestToSendXOnXOff,
}
I am not sure exactly how it maps to the IOCTL settings, but I believe it should affect them somehow.
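For example (a sketch; which value, if any, helps will depend on the device):
_serialPort.Handshake = Handshake.RequestToSend; // hardware (RTS/CTS) flow control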

Redirect input from a keyboard-like device to a background process in a Windows environment

I use a simple RFID reader device which is connected to the PC with a USB cable and identified as a keyboard-like device.
I want to read the input from this device in my application, but the application runs in the background, i.e. another application is in focus.
How can I redirect stdin to my application in a Windows 7 environment?
I can't change focus to my application.
I can't make changes to the front application (e.g. catch stdin and send it to my application via a pipe, etc.).
My application is written in C#, but I can rewrite it in Java / C.
The PC runs the Windows 7 OS.
Thanks
Here is my solution:
I have connected a normal Keyboard and a Card Reader via USB to my computer.
Both devices write to the Windows keyboard buffer.
I want to redirect the input of the Card Reader to another application or file and remove it from the keyboard buffer (so that this input will not show up in any editor).
What I know / Prerequisites:
The input of the Card Reader only contains hexadecimal letters (0-9, A-F) and is finished by a newline.
The Card Reader input is received as one block, meaning there are only a few milliseconds between two received letters.
It is not possible for a human to enter two or more characters within less than 70 ms (I have tried it).
What I do:
I listen to the keyboard buffer and take out every single input letter/key. Any input which is not 0-9 or A-F is put back into the keyboard buffer immediately.
If an input 0-9 or A-F comes up, I store it in a string buffer (it might be from the Card Reader).
If there is no further input for more than 70 ms and there are at least 4 bytes containing 0-9 or A-F in the buffer, then I assume/know it was from the Card Reader and use it in my own way. These bytes have already been taken out of the keyboard buffer.
If there are only one/two/three letters 0-9/A-F in my buffer, then after 70 ms they are put back into the Windows keyboard buffer. While typing, a human cannot tell that "some" letters show up slightly delayed.
Here is my program (written in the scripting language AutoHotkey):
KeybRedir.ahk
; KeybRedir
; Written by Michael Hutter - May 2018
#NoEnv ;Avoids checking empty variables to see if they are environment variables
#SingleInstance force
Process, Priority, , Normal
#Warn All
SaveKeyList=
0::
1::
2::
3::
4::
5::
6::
7::
8::
9::
+A::
+B::
+C::
+D::
+E::
+F::
Return::
{ ; If one of these characters was typed => take it out of the Windows key buffer...
    if A_ThisHotkey = Return
        Hotkey := Chr(13)
    else
        Hotkey := A_ThisHotkey
    SaveKeyList = %SaveKeyList%%Hotkey%
    SetTimer , DoKeyPlay, 70 ; Wait 70 ms for another key press from the char list (re-triggers the timer if it already runs)
}
return

DoKeyPlay:
    SetTimer , , Off
    SaveText := RegExReplace(SaveKeyList, "\+", "")
    StringReplace, SaveText, SaveText, `r, , All
    if StrLen(SaveText) < 4 ; Accumulated text size < 4 letters => normal key input
    {
        SendPlay %SaveKeyList% ; put the captured text back into the Windows key buffer
    }
    else ; Now we have the input of the card reader
    {
        SplashTextOn, , , %SaveText% ; Do something with the input of the card reader ...
    }
    SaveKeyList=
    SaveText=
    SetTimer , SplashOff, 2000
return

SplashOff:
    SplashTextOff
    SetTimer , , Off
return
You can compile this script to an exe file (327 KB) using the AutoHotkey compiler.
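For a C# flavor of the same timing idea, here is a rough, untested sketch using a low-level keyboard hook (WH_KEYBOARD_LL). It is simplified relative to the AutoHotkey script: it only swallows hex characters that arrive within 70 ms of the previous key, so the first character of a burst still reaches the foreground application, and it needs a message loop (WinForms here) to run:
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;
using System.Windows.Forms; // Application.Run provides the message loop the hook requires

class ReaderFilterSketch
{
    const int WH_KEYBOARD_LL = 13;
    const int WM_KEYDOWN = 0x0100;

    delegate IntPtr HookProc(int code, IntPtr wParam, IntPtr lParam);

    [DllImport("user32.dll", SetLastError = true)]
    static extern IntPtr SetWindowsHookEx(int idHook, HookProc lpfn, IntPtr hMod, uint threadId);

    [DllImport("user32.dll")]
    static extern IntPtr CallNextHookEx(IntPtr hhk, int code, IntPtr wParam, IntPtr lParam);

    [DllImport("kernel32.dll")]
    static extern IntPtr GetModuleHandle(string moduleName);

    static readonly Stopwatch sinceLastKey = Stopwatch.StartNew();
    static readonly HookProc proc = Hook; // kept in a field so the GC doesn't collect the delegate

    static IntPtr Hook(int code, IntPtr wParam, IntPtr lParam)
    {
        if (code >= 0 && wParam == (IntPtr)WM_KEYDOWN)
        {
            int vk = Marshal.ReadInt32(lParam); // vkCode is the first field of KBDLLHOOKSTRUCT
            char c = (char)vk; // VK codes for 0-9 and A-F match their ASCII characters
            bool isHex = (c >= '0' && c <= '9') || (c >= 'A' && c <= 'F');
            bool burst = sinceLastKey.ElapsedMilliseconds < 70;
            sinceLastKey.Restart();
            if (isHex && burst)
            {
                Console.Write(c); // collect the card-reader character here instead
                return (IntPtr)1; // a nonzero return swallows the keystroke
            }
        }
        return CallNextHookEx(IntPtr.Zero, code, wParam, lParam);
    }

    static void Main()
    {
        SetWindowsHookEx(WH_KEYBOARD_LL, proc, GetModuleHandle(null), 0);
        Application.Run();
    }
}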

How to send Keys to an application

I have a very basic problem. For a game (Dota 2) I want to write a little macro which opens the console, writes "setinfo name ".........................."" into it, and then closes it; the hotkey for the console is set to '#'. For this I wrote an application which listens for the key 'f' to be pressed and then sends:
1) '#' (open the console)
2) "messag ebla bla b...."
3) '#' (close the console)
Everything is working except that it will not open the console (but if the console is already open, it will write #messagej.j....# into it when I press 'f', just as wanted).
My code for sending the keys:
SendKeys.Send("#");
SendKeys.Send("my message for consol");
SendKeys.Send("#");
Does anybody know why the hotkeys don't work when sent this way? I thought it was a simulation of the user pressing F or Q.
The first thing you should try is adding a "software" delay between keypresses:
string msg = "#yourmessage#";
foreach (char c in msg)
{
    SendKeys.Send(c.ToString()); // SendKeys.Send takes a string, not a char
    System.Threading.Thread.Sleep(20); // short pause between keystrokes
}
The other thing to consider: SendKeys does not emulate the keypress at the hardware layer; it works at the Windows API level, and DirectX DirectInput, which most games use for user input processing, goes a level deeper. So it is possible you will not be able to do it this easily.
You should also check out keybd_event. I've had better luck using that; as I understand it, it goes a level deeper than SendKeys:
http://msdn.microsoft.com/en-us/library/windows/desktop/ms646304(v=vs.85).aspx
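A minimal P/Invoke sketch of that approach (my addition, not from the original answer; note that VkKeyScan's high byte carries shift state, which this ignores, and that some games only accept scan-code input):
using System;
using System.Runtime.InteropServices;
using System.Threading;

class KeyInjectorSketch
{
    const uint KEYEVENTF_KEYUP = 0x0002;

    [DllImport("user32.dll")]
    static extern void keybd_event(byte bVk, byte bScan, uint dwFlags, UIntPtr dwExtraInfo);

    [DllImport("user32.dll", CharSet = CharSet.Auto)]
    static extern short VkKeyScan(char ch);

    // Press and release the virtual key for a character (layout-dependent).
    public static void PressChar(char ch)
    {
        byte vk = (byte)(VkKeyScan(ch) & 0xFF); // low byte is the virtual-key code
        keybd_event(vk, 0, 0, UIntPtr.Zero);               // key down
        Thread.Sleep(20);                                   // the "software" delay from above
        keybd_event(vk, 0, KEYEVENTF_KEYUP, UIntPtr.Zero);  // key up
    }
}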
Anyway, I hope you are not going to use this for spamming in a great game! :)

Does .NET 3.5 SP1 SerialPort class append extra 0's on transmission?

Update
Turns out the error was in the crypto processor code; that is fixed. But now we're running into what seems like a handshaking issue.
On the first transmission, we get a single byte back from the device with value 0xFF (don't know why; the engineer I'm working with isn't too experienced with RS-232 either). Then things run as normal (just sending the device one byte at a time and waiting for a matching echo). However, neither the device nor the .NET app can send more than a couple of bytes at a time before one of them locks up and refuses to send or receive.
At work I'm writing an app that interfaces over RS-232 with a crypto processor inside a device, in order to reprogram flash modules inside that device.
Just to take things slow and make sure all our headers are right, we're writing one byte at a time with SerialPort.Write(). However, when we run the code against the crypto processor, it reads an extra NULL in between each byte. When I test the .NET code on my local machine with two serial ports and a crossover cable, I capture the output in HyperTerminal or PuTTY, and there are no extra NULLs when I view the log in Notepad++.
However, to further complicate things, when we manually type messages byte-by-byte via HyperTerminal to the crypto processor, it reads the input as a single byte only, with no extra NULLs (unlike with the .NET app). Does anybody have any experience with .NET doing mysterious things when it writes to a SerialPort?
We're initializing a test chunk with this:
byte[] testBytes = new byte[] { (byte)'A', (byte)'B', (byte)'C', (byte)'D', (byte)'E', (byte)'F', (byte)'G', (byte)'H' };
byte[] byteArray = new byte[256];
for (int i = 0; i < 32; i++)
{
    testBytes.CopyTo(byteArray, i * 8);
}
And sending it with this:
public void StutterSend(byte[] data, long delayMs)
{
    bool interactive = false;
    if (delayMs < 0)
        interactive = true;
    for (int i = 0; i < data.Length; i++)
    {
        serialPort.Write(data, i, 1);
        if (interactive)
        {
            WriteLine("Sent byte " + (i + 1) + " of " + data.Length + ". Press any key to send moar.");
            Console.ReadKey();
        }
        else
        {
            double timer = DateTime.Now.TimeOfDay.TotalMilliseconds;
            do { } while ((DateTime.Now.TimeOfDay.TotalMilliseconds - timer) < delayMs);
        }
    }
    WriteLine("Done sending bytes.");
}
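As an aside: the empty do/while above is a busy-wait that pins a CPU core for the whole delay; a plain sleep (a sketch, not from the original post) does the same job:
System.Threading.Thread.Sleep((int)delayMs); // idles instead of spinning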
Our SerialPort is configured with all the matching parameters (stop bits, data bits, parity, baud rate, port name), and our handshake is set to None (it's just how our UART driver works).
Regarding your update, it sounds like your crypto processor has some more problems. Getting a 0xff back can be the result of an unexpected glitch of <= 1 bit time on the Tx line of the RS232 port. This is interpreted as a start bit by the PC. After the glitch, the Tx line returns to the mark state and now that the UART on the PC has a start bit, it interprets the "data" bits as all ones (the value for the mark state). The mark state is also the correct value for the stop bit(s) so your PC's UART has received a valid byte with a value of 0xff. Note that the glitch can be very fast relative to the RS232 data rate and still be interpreted as a start bit so have your engineer look at this line with an oscilloscope in normal mode/single sequence trigger to confirm this.
What is the Encoding property of the serialPort set to? The docs for SerialPort.Write(byte[], int, int) say that it runs its data through an Encoder object (which doesn't really make sense to me for a byte[]). It's supposed to default to ASCIIEncoding, but it seems like it might be set to something else. Try explicitly setting it to ASCIIEncoding and see if that helps. I can't recall if this was an issue for me back when I did some serial port stuff in .NET to talk to an embedded board...
Note that even with ASCIIEncoding in use, you'll get some (probably unwanted) transformation of data - if you try to send something above value 127, the encoder will convert it to '?' since it's not a valid ASCII character. I can't recall off the top of my head how I got the serial port to simply leave my data alone - I'll have to dig around in some source code...
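One approach that often makes the port leave byte values alone (my addition, not from the original answer) is an 8-bit encoding that round-trips all 256 values:
_serialPort.Encoding = System.Text.Encoding.GetEncoding(28591); // ISO-8859-1 (Latin-1) maps bytes 0-255 one-to-one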
SerialPort sets the Parity property to Parity.None if you don't specify one. This means that if your receiver expects a parity bit, it will never get one unless you explicitly tell SerialPort to send a parity bit along with the transmitted data.
And the fact that it went well in HyperTerminal could be because HyperTerminal uses a parity bit by default (I don't know HyperTerminal well).
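For example, a sketch with the parity made explicit (the concrete values are assumptions and must match the device):
using System.IO.Ports;
var port = new SerialPort("COM1", 9600, Parity.Even, 8, StopBits.One); // explicit parity so both ends agree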
