One of my applications needs to change the system proxies on Windows. Changing the HTTP proxy for LAN connections behind a router is easy, but I cannot find any information on how to change proxies for dialup or direct DSL (i.e. PPPoE) connections.
This is bad because a significant fraction of my clients are in China. In China, many people do not have more than one computer, and thus find a router wasteful. They simply connect their ADSL modem to their ethernet port and use PPPoE. Yes, this sucks for security and everything (one reason why botnets roam so freely in China) but it is reality and my software needs to work.
I also need code that gives me the list of all network connections. Code like that in my related question, which requires knowing in advance which connection to edit, would not work.
I would also prefer something that works via the reg command. Simple C++ or C# code using the Windows API is also useful, but note that I'm using Racket, a language with a rather cumbersome FFI, which means it would be best to minimize use of the Windows C API.
Assuming you are unable to use native Windows API calls, I'll do my bit by providing a solution that only requires calling the Windows reg command plus some array/string manipulation, which Racket certainly supports.
This is not the cleanest way, but given the requirements it should be a feasible solution for you.
Well, as you perhaps have noticed, the proxy configuration for the different connections is stored under the key: HKCU\Software\Microsoft\Windows\CurrentVersion\Internet Settings\Connections
Under that key there's a value named DefaultConnectionSettings and another named SavedLegacySettings (both of type REG_BINARY). In addition to those two, there's one value per system connection (also of type REG_BINARY) that stores the connection configuration, including the proxy settings. The names of these values match the connection names.
Fortunately, someone has reverse-engineered the structure of the binary data stored in those values:
1. Byte 0 always has a 3C or 46 - I couldn't find more information about this byte. The next three bytes (1 to 3) are zeros.
2. Byte 4 is a counter used by the 'Internet Options' property sheet (Internet Explorer -> Tools -> Internet Options...). As you manually change the Internet settings (such as the LAN settings in the Connections tab), this counter increments. It's not a very useful byte, but it MUST have a value; I always keep it zero. The next three bytes (5 to 7) are zeros.
3. Byte 8 can take different values depending on your settings:
   09 when only 'Automatically detect settings' is enabled
   03 when only 'Use a proxy server for your LAN' is enabled
   0B when both are enabled
   05 when only 'Use automatic configuration script' is enabled
   0D when 'Automatically detect settings' and 'Use automatic configuration script' are enabled
   07 when 'Use a proxy server for your LAN' and 'Use automatic configuration script' are enabled
   0F when all three are enabled
   01 when none of them are enabled
   The next three bytes (9 to B) are zeros.
4. Byte C (12 in decimal) contains the length of the proxy server address. For example, the proxy server '127.0.0.1:80' has length 12 (the length includes the dots and the colon). The next three bytes (D to F) are zeros.
5. Byte 10 (16 in decimal) starts the proxy server address - e.g. '127.0.0.1:80' (where 80 is obviously the port number).
6. The byte immediately after the address contains the length of the additional information (the proxy bypass list); the next three bytes are zeros, and then comes the string itself. For example, if 'Bypass proxy server for local addresses' is ticked, this byte is 07, the next three bytes are zeros, and then comes the string '<local>' ('<local>' indicates that you are bypassing the proxy server; since '<local>' has 7 characters, the length is 07). You will have to experiment on your own to find out more about this. If you don't have any additional info, the length is 0 and no string is added.
7. The byte immediately after the additional info is the length of the automatic configuration script address; the next three bytes are zeros, and then comes the address. (If you don't have a script address, you don't need to add anything; skip this step and go to step 8.)
8. Finally, 32 zeros are appended. (I don't know why! Presumably to pad the binary blob; perhaps something expects it to be a certain length. Don't you wish Windows had some source?)
Complete info can be found here.
With this info I think you can manage to read and modify the values. Just run reg query "HKCU\Software\Microsoft\Windows\CurrentVersion\Internet Settings\Connections", parse the output properly, and use reg add to write the modified binary data back.
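To make that layout concrete, here is a minimal C# parsing sketch (names are illustrative; it assumes the hex string is the REG_BINARY data exactly as reg query prints it, and that the blob follows the steps above):
using System;
using System.Globalization;
using System.Linq;
using System.Text;

static class ConnectionBlob
{
    // hex is the REG_BINARY data as printed by "reg query", e.g. "4600000005000000030000000C000000..."
    public static void Dump(string hex)
    {
        byte[] data = Enumerable.Range(0, hex.Length / 2)
            .Select(i => byte.Parse(hex.Substring(i * 2, 2), NumberStyles.HexNumber))
            .ToArray();

        byte flags = data[8];                 // 01/03/05/07/09/0B/0D/0F as listed in step 3
        int proxyLen = data[0xC];             // step 4: length of "host:port"
        string proxy = Encoding.ASCII.GetString(data, 0x10, proxyLen);           // step 5

        int bypassLen = data[0x10 + proxyLen];                                   // step 6
        string bypass = Encoding.ASCII.GetString(data, 0x10 + proxyLen + 4, bypassLen);

        Console.WriteLine("flags=0x{0:X2} proxy='{1}' bypass='{2}'", flags, proxy, bypass);
    }
}
Writing back is just the reverse: rebuild the blob, hex-encode it, and pass it to reg add "HKCU\Software\Microsoft\Windows\CurrentVersion\Internet Settings\Connections" /v <value name> /t REG_BINARY /d <hex> /f.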
I hope this helps.
You can use this C# code to change the proxy server for VPN connections:
// host looks like "127.0.0.1:8080"
public static void EnableVPNProxy(string host)
{
    // Open the per-user Connections key; every REG_BINARY value under it is one connection.
    RegistryKey RegKey = Registry.CurrentUser.OpenSubKey(
        "Software\\Microsoft\\Windows\\CurrentVersion\\Internet Settings\\Connections", true);
    foreach (var name in RegKey.GetValueNames())
    {
        try
        {
            byte[] server = Encoding.ASCII.GetBytes(host);
            byte[] current = (byte[])RegKey.GetValue(name);
            byte[] data = new byte[100];                  // new arrays are already zero-filled
            data[0] = current[0];                         // keep the original first byte (3C or 46)
            data[8] = 3;                                  // 03 = only 'Use a proxy server' enabled
            data[12] = Convert.ToByte(server.Length);     // length of "host:port"
            int i = 16;
            foreach (var b in server)                     // the proxy address itself
            {
                data[i] = b;
                i++;
            }
            for (var x = 0; x < 40; x++)                  // no bypass list, no script, plus padding
            {
                data[i] = 0;
                i++;
            }
            RegKey.SetValue(name, data);
        }
        catch (Exception) { }                             // skip values that are not connection blobs
    }
}
And enable the proxy:
EnableVPNProxy("127.0.0.1:8080");
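If applications don't pick the new value up immediately, WinINet usually needs to be told that the settings changed. This is a companion sketch, not something the code above requires; whether your setup needs it at all is an assumption:
using System;
using System.Runtime.InteropServices;

static class WinInetRefresh
{
    private const int INTERNET_OPTION_SETTINGS_CHANGED = 39;
    private const int INTERNET_OPTION_REFRESH = 37;

    [DllImport("wininet.dll", SetLastError = true)]
    private static extern bool InternetSetOption(IntPtr hInternet, int dwOption,
                                                 IntPtr lpBuffer, int dwBufferLength);

    public static void NotifySettingsChanged()
    {
        // Tell WinINet the proxy settings changed, then ask it to reload them.
        InternetSetOption(IntPtr.Zero, INTERNET_OPTION_SETTINGS_CHANGED, IntPtr.Zero, 0);
        InternetSetOption(IntPtr.Zero, INTERNET_OPTION_REFRESH, IntPtr.Zero, 0);
    }
}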
Related
I am writing an Android app in C# (using Avalonia) to communicate with my embedded USB device. The device has an ATxmega microcontroller and uses the Atmel-provided CDC USB driver. I have a desktop version of the app using System.IO.Ports.SerialPort. I use SerialPort.Write() to write data and SerialPort.ReadByte() in a loop to read data - using SerialPort.Read() to read multiple bytes very often fails, while the looped byte-by-byte reading basically never does. It works on both Windows and Mac PCs.
The communication is very straightforward - the host sends commands and expects a known length of data to come in from the device.
On the Android app I am using the android.hardware.usb classes and have adapted code from several of the serial port wrappers for CDC devices. Here is my connect function:
public static bool ConnectToDevice()
{
    if (Connected) return true;
    UsbDevice myDevice;
    if ((myDevice = GetConnectedDevice()) == null) return false;
    if (!HasPermission(myDevice)) return false;

    var transferInterface = myDevice.GetInterface(1);   // CDC data interface
    var controlInterface = myDevice.GetInterface(0);    // CDC control interface
    writeEndpoint = transferInterface.GetEndpoint(1);
    readEndpoint = transferInterface.GetEndpoint(0);

    deviceConnection = manager.OpenDevice(myDevice);
    if (deviceConnection != null)
    {
        deviceConnection.ClaimInterface(transferInterface, true);
        deviceConnection.ClaimInterface(controlInterface, true);
        SendAcmControlMessage(0x20, 0, parameterMessage);   // SET_LINE_CODING
        SendAcmControlMessage(0x22, 0x03, null);            // SET_CONTROL_LINE_STATE: DTR and RTS true
        Connected = true;
        OnConnect?.Invoke();
        return true;
    }
    return false;
}
The two control transfers are what I borrowed from the serial port wrappers, and I use them to (a sketch of the helper follows this list):
1. Set the default parameters - 115200 baud rate, 1 stop bit, no parity, 8 data bits - which is the same as in my embedded USB config. This doesn't seem necessary, but I do it anyway. The control transfer returns 7, so I assume it works properly.
2. Set DTR and RTS to true. This doesn't seem to be necessary either - the desktop code works with both set to false as well as to true. The control transfer returns 0, so I assume it also works properly.
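For context, a sketch of what such a SendAcmControlMessage helper usually looks like in those wrappers (illustrative only - my actual helper isn't shown here; 0x21 is the class-specific, host-to-interface bmRequestType, and the 7-byte payload is the CDC line coding for 115200 8N1):
// Illustrative sketch of the borrowed helper (names and values assumed, not verbatim).
private static int SendAcmControlMessage(int request, int value, byte[] buffer)
{
    // 0x20 = SET_LINE_CODING, 0x22 = SET_CONTROL_LINE_STATE; index 0 = control interface.
    return deviceConnection.ControlTransfer(
        (UsbAddressing)0x21, request, value, 0, buffer, buffer?.Length ?? 0, 1000);
}

// 115200 baud (little-endian 0x0001C200), 1 stop bit, no parity, 8 data bits.
private static readonly byte[] parameterMessage = { 0x00, 0xC2, 0x01, 0x00, 0x00, 0x00, 0x08 };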
I checked all endpoints and parameters after connection and they all seem to be correct - both read and write Endpoints are bulk endpoints with correct data direction.
Writing to my device is not a problem. I can use either the BulkTransfer method or UsbRequest, and both work perfectly.
Unfortunately everything falls apart when trying to read data. The initial exchange with the device is as follows:
Host sends a command and expects 4 bytes of data.
Host sends another command and expects 160 bytes of data.
The bulk transfer method looks like this:
public static byte[] ReadArrayInBulk(int length)
{
    byte[] result = new byte[length];
    int transferred = deviceConnection.BulkTransfer(readEndpoint, result, length, 500);
    if (transferred != length)
    {
        // retry once if the first transfer came back short (or empty)
        transferred = deviceConnection.BulkTransfer(readEndpoint, result, length, 500);
    }
    return result;
}
The first transfer almost always (99.9%) returns 0, and then the second returns the proper data, up to a point. For the first exchange it does receive the 4 bytes correctly. However, for the second exchange, when it expects 160 bytes, the second transfer returns 80, 77, 22 or other random lengths of data. I tried making a loop and performing bulk transfers until the number of bytes transferred equals the expected amount (roughly the sketch below), but unfortunately, after receiving a random length of data for the first time, the bulk transfer starts always returning 0.
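The accumulation loop I tried was along these lines (reconstructed sketch; the 500 ms timeout mirrors the method above):
public static byte[] ReadArrayAccumulating(int length)
{
    byte[] result = new byte[length];
    byte[] chunk = new byte[length];
    int total = 0;
    while (total < length)
    {
        // BulkTransfer returns the number of bytes that arrived in this transfer (0, or -1 on timeout).
        int transferred = deviceConnection.BulkTransfer(readEndpoint, chunk, length - total, 500);
        if (transferred <= 0) break;                       // give up on 0 bytes or timeout
        Array.Copy(chunk, 0, result, total, transferred);  // append the partial chunk
        total += transferred;
    }
    return result;
}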
So I tried the UsbRequest way to get the data:
public static byte[] GetArrayRequest(int length)
{
    byte[] result = new byte[length];
    UsbRequest req = new UsbRequest();
    req.Initialize(deviceConnection, readEndpoint);
    ByteBuffer buff = ByteBuffer.Wrap(result);
    req.Queue(buff, length);
    var resulting = deviceConnection.RequestWait();
    if (resulting != null) buff.Get(result);
    req.Close();
    return result;
}
The wrapped buffer does get filled to the expected amount of data (its position equals the length), but unfortunately the data is all zeros. It does work for a couple of transfers once in a while, but 99% of the time it just returns zeroes.
So, given that the desktop version also has trouble reading data in bulk with SerialPort.Read(), but works perfectly when reading bytes one by one in a loop (SerialPort.ReadByte(), which underneath uses SerialPort.Read() for a single byte), I decided to try doing the same in my app.
Unfortunately, when looping with the UsbRequest method, not only does it take ages, it also always returns zeroes. When looping with the BulkTransfer method for a single-byte array, it returns 0 on the first transfer and -1 on the second, indicating a timeout.
I have spent so much time on this that I am at my wits' end. I checked multiple implementations of USB serial for Android (from mik3y, kai-morich, felHR85), but all they do differently from me are the two control transfers, which I have now added - to no avail.
Can anyone tell me what could cause the BulkTransfer to always return 0 on the first transfer (which is supposed to indicate success, but what does it actually do when no data is read?) and only return some data, or time out (in the case of a single-byte transfer), on the second one? If it did this reliably I would just get used to it, but unfortunately it only seems to work correctly for smaller transfers and only up to a point: the initial couple/couple dozen bytes are correct, but then it stops receiving any more at random intervals. And why does the UsbRequest method fill the buffers, but with only zeroes 99% of the time?
I used to filter packets in Wireshark with the simple dtls argument as a filter (Datagram Transport Layer Security, which is TLS over UDP).
Now I wanted to do the same using C# and the PcapDotNet wrapper, which uses WinPcap filters.
Sadly, I can't find the equivalent anywhere, and dtls is not recognised in the C# app, so it doesn't grab any packets anymore. (It simply crashes the interpreter, since the string is not recognised.)
using (PacketCommunicator communicator = selectedDevice.Open(65536, PacketDeviceOpenAttributes.None, 1000))
{
    using (BerkeleyPacketFilter filter = communicator.CreateFilter("dtls"))
    {
        communicator.SetFilter(filter);
        communicator.ReceivePackets(1, packetHandler);
    }
}
Is there any equivalent, please?
EDIT: It looks like dtls is only a DISPLAY filter, not a CAPTURE one. I could only capture-filter using udp port xx (xx being the port), but since the ports used are always random, I can't. So I would be glad to find another filtering workaround if you have one! I prefer capturing only the desired packets rather than capturing everything and then filtering the data...
Wireshark : DTLS
There are only two packets I would like to capture: the one containing the Server Hello Done message, or the one containing the handshake message (the one with the Record Layer).
EDIT 2: OK, I am close to finding what I need, but I need your help.
This answer from here must be the solution. tcp[((tcp[12] & 0xf0) >> 2)] = 0x16 looks for handshake type 22, but DTLS is UDP, not TCP, so the 12 offset might be different. Can anyone help me figure out the correct formula to adapt it for DTLS instead of TCP TLS?
I tried to use this in Wireshark, but the filter is invalid and I don't really know why. If you could at least make it work in Wireshark, I could experiment with different values myself and come back with a final answer. udp[((udp[12] & 0xf0) >> 2)] = 0x16 is not a valid filter in Wireshark.
So I gave up on finding the position of the data dynamically.
But this is what i ended with :
using (PacketCommunicator communicator = selectedDevice.Open(65536, PacketDeviceOpenAttributes.None, 1000))
{
    using (BerkeleyPacketFilter filter = communicator.CreateFilter("udp && ((ether[62:4] = 0x16fefd00) || (ether[42:4] = 0x16fefd00))"))
    {
        communicator.SetFilter(filter);
        communicator.ReceivePackets(1, packetHandler);
    }
}
ether[62:4] is the position of 16fefd00 for IPv6 packets (42 for IPv4).
The 16 is the content type for the handshake protocol (22), and the following fefd is for DTLS version 1.2. The trailing 00 is only there because byte slices can be 1, 2 or 4 bytes long, not 3, so I had to take the extra byte into consideration.
This is absolutely not perfect, I know, but for now it works, since I couldn't find any other workaround yet.
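A possibly simpler IPv4-only alternative that I have not verified: unlike TCP, the UDP header is always 8 bytes, so no data-offset arithmetic is needed and the first payload byte is simply udp[8] (22 = handshake, fe fd = DTLS 1.2). Note that libpcap's udp[] indexing only applies to IPv4 packets, which is why the ether[] offsets above are still needed for IPv6. A sketch with the same PcapDotNet pattern:
// Unverified IPv4-only alternative: match DTLS handshake records by their first payload bytes.
using (BerkeleyPacketFilter filter = communicator.CreateFilter("udp[8] = 22 && udp[9] = 0xfe"))
{
    communicator.SetFilter(filter);
    communicator.ReceivePackets(1, packetHandler);
}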
I've done some playing around with SerialPort (frustratingly so) and have finally hit a point where I absolutely have no idea why this isn't working. There's a USB CDC device that I'm trying to send hex commands to, and the way I'm doing this is over the COM port interface it exposes. I can handshake with the device - when I say HI it replies with HI back - but then I send another command to it which must be followed by a zero-byte packet, or else the device stops responding altogether. Keep in mind, this zero-byte packet has ABSOLUTELY nothing in it, meaning it doesn't have a \0 or a 0x00 or 0 or even a null (SerialPort throws an exception on null).
Now, one way I was able to circumvent this was to use LibUsbDotNet. I accessed the CDC device directly instead of going through the COM interface, set the endpoints correctly and sent hex commands that way. I'm able to successfully send "0 byte" packets using this method with the following C# code:
string zlpstring = "";
byte[] zlpbyte = Encoding.Default.GetBytes(zlpstring);
....snip
ecWrite = writer.SubmitAsyncTransfer(zlpbyte, 0, zlpbyte.Length, 100, out usbWriteTransfer);
zlpbyte is the buffer, 0 is the offset, zlpbyte.Length is the packet length in bytes, 100 is the timeout, and out usbWriteTransfer is the transfer context.
When I use this same method on the COM port:
string zlpstring = "";
byte[] zlpbyte = Encoding.Default.GetBytes(zlpstring);
_serialPort.Write(zlpbyte, 0, zlpbyte.Length);
the USB logger reports that absolutely nothing was sent. It's as if the COM port is ignoring the zero-byte transfer. Before anyone says "you cannot do this": there are various programs out there that can send a zero-byte packet to this exact device's COM port without doing ANY driver manipulation. This is what I'm going for, which is why I'm trying to ditch LibUsbDotNet and go straight to the COM port.
EDIT:
After some more toying around with a different USB logger, I don't see zero bytes being sent but rather this:
IRP_MJ_DEVICE_CONTROL (IOCTL_SERIAL_WAIT_ON_MASK)
I think this may be the issue. If a 0 byte was being sent then I assume it would show up as:
IRP_MJ_WRITE > UP > STATUS_SUCCESS > (blank) > (blank)
My program is sending back a response of 01 00 00 00; however, while logging another (successful) program, I see it SETTING the wait mask:
IRP_MJ_DEVICE_CONTROL (IOCTL_SERIAL_SET_WAIT_MASK) DOWN STATUS_SUCCESS 01 00 00 00
If my assumptions are right, this question might've just turned into: how do I set a serial/COM port's wait mask? There's absolutely nothing about this in the C# SerialPort class... which is why I can now see why so many articles called it "lacking". I also took a look at the C++ side: https://msdn.microsoft.com/en-us/library/aa363214(v=vs.85).aspx - this also does not seem to cover the wait mask. Using the libusb USB filter is starting to look a lot more pleasing by the minute... (although I'm going to question myself forever why sending a zero byte works there but not over SerialPort).
SECOND EDIT:
I'm a moron. It was definitely a setting that the manufacturer probably didn't figure anyone would ever touch or know how to set:
#define EV_RXFLAG 0x0001
SetCommMask(hSerial, EV_RXFLAG);
I then saw this over the USB logs:
IRP_MJ_DEVICE_CONTROL (IOCTL_SERIAL_SET_WAIT_MASK) DOWN STATUS_SUCCESS 01 00 00 00
Bingo. The RXFLAG was originally set to 0x0002. I couldn't find a way to change this in C# yet, so I had to make do with some C++ code for now. It totally works and sends the "zero byte" like it's supposed to, without me actually sending it from the code. I assume this setting was the "handshake" method between my device and whatever else it interacts with in flash mode. Hope this helps someone else out there whose COM/serial device is rejecting/discarding zero-byte packets yet requiring ZLPs at the same time... how goofy?!
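For anyone who wants to stay in C#, the equivalent call can in principle be made by P/Invoking SetCommMask against the handle that the managed SerialPort already holds. This is only a sketch: the "_handle" field is an undocumented implementation detail of the .NET Framework SerialStream and may not exist on other runtimes.
using System;
using System.IO.Ports;
using System.Reflection;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;

static class SerialCommMask
{
    [DllImport("kernel32.dll", SetLastError = true)]
    private static extern bool SetCommMask(SafeFileHandle hFile, uint dwEvtMask);

    // Call after the port is open; mask is whatever value works for your device (0x0001 above).
    public static bool TrySetMask(SerialPort port, uint mask)
    {
        object stream = port.BaseStream;   // SerialStream internally wraps the Win32 handle
        FieldInfo handleField = stream.GetType()
            .GetField("_handle", BindingFlags.NonPublic | BindingFlags.Instance);
        if (handleField == null) return false;             // field layout differs on this runtime
        var handle = (SafeFileHandle)handleField.GetValue(stream);
        return SetCommMask(handle, mask);
    }
}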
Have you tried concatenating an extra newline or carriage return (or both) at the end of the data?
I would say add a 0x0A (newline), or a 0x0D (carriage return), or both 0x0A and 0x0D, to the end of your byte array and see if you get something.
byte[] zlpbyte = new byte[1] {0};
_serialPort.Write(zlpbyte, 0, 1);
[Update]
Based on our discussions, it appears that you are trying to take control of the serial port's control signals. I have not tried it before, but from what I can see in the source (if I understand it properly), it is possible to set the control signals into certain states.
Try to set the Handshake property
public enum Handshake
{
    None,
    XOnXOff,
    RequestToSend,
    RequestToSendXOnXOff,
}
I am not sure exactly how this maps onto the IOCTL settings, but I believe it should affect them somehow.
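For example, something along these lines (plain SerialPort properties; which combination is right depends on what the device expects):
// Illustrative settings only - pick what matches the device.
_serialPort.Handshake = Handshake.None;  // or RequestToSend / XOnXOff / RequestToSendXOnXOff
_serialPort.DtrEnable = true;            // drive DTR manually
_serialPort.RtsEnable = true;            // RTS can only be set manually while Handshake
                                         // is not using RTS flow control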
I am working on the connection between C# and a Siemens PLC (S7-1200). I've created a data block (a.k.a. DB) which I need to read from and eventually write to. The connection to the PLC works, but I can't read anything from its data block. It always gives me the following error:
Error 33028 context is not supported. Step7 says: Function not implemented or error in telegram.
The number of the data block is 311. I am starting at the first byte and I've only given 1 as the length.
I've also disabled 'Optimized block access'; enabling it does not solve the problem either.
if (0 == lndConnection.connectPLC())
{
    Byte[] bytes = new Byte[1];
    res = lndConnection.readBytes(libnodave.daveDB, 311, 0, 1, bytes);
    if (res == 0)
        MessageBox.Show(lndConnection.getS32().ToString());
    else
    {
        mInt = 0;
        MessageBox.Show("error " + res + " " + libnodave.daveStrerror(res));
    }
}
I hope somebody can help me out with the problem.
The answer from Roatin Mart:
"S7-1200 has indirect access on by default. Check if Optimised Block access is disabled."
The S7-1500 implements an extended communication protocol which is not supported by libnodave.
It is possible, though, to connect and read/write global DB blocks using the "old" protocol.
I have communicated successfully with both the 1200 and the 1500, but some additional settings are needed on the PLC side.
S7-1200
Only global DBs can be accessed.
The optimized block access must be turned off.
S7-1500
Only global DBs can be accessed.
The optimized block access must be turned off.
The access level must be “full” for the plc.
“Connection mechanism” must allow GET/PUT for external partners.
Details with screen-shots can be found at:
http://snap7.sourceforge.net/snap7_client.html
Hope it helps!
Cheers,
peter
Update
Turns out the error was in the crypto processor code; that is now fixed. But now we're running into what seems like it might be a handshaking issue.
On the first transmission, we get a single byte back from the device with value 0xFF (I don't know why; the engineer I'm working with isn't too experienced with RS-232 either). Then things run as normal (just sending the device one byte at a time and waiting for a matching echo). However, neither the device nor the .NET app can send more than a couple of bytes at a time before one of them locks up and refuses to send or receive.
At work I'm writing an app that interfaces over RS232 with a crypto processor inside a device to reprogram flash modules inside the device.
To just take things slow and make sure all our headers are right, we're writing one byte at a time with SerialPort.Write(). However, when we run the code against the crypto processor, it reads an extra NULL in between each byte. When I test the .NET code on my local machine with two serial ports and a crossover cable, and capture the output in HyperTerminal or PuTTY, there are no extra NULLs when I view the log in Notepad++.
However, to complicate things further, when we manually type messages byte-by-byte via HyperTerminal to the crypto processor, it reads the input as a single byte only, with no extra NULLs (unlike with the .NET app). Does anybody have any experience with .NET doing mysterious things when it writes to a SerialPort?
We're initializing a test chunk with this:
byte[] testBytes = new byte[] { (byte)'A', (byte)'B', (byte)'C', (byte)'D', (byte)'E', (byte)'F', (byte)'G', (byte)'H' };
byte[] byteArray = new byte[256];

for (int i = 0; i < 32; i++)
{
    testBytes.CopyTo(byteArray, i * 8);
}
And sending it with this:
public void StutterSend(byte[] data, long delayMs)
{
    bool interactive = false;
    if (delayMs < 0)
        interactive = true;

    for (int i = 0; i < data.Length; i++)
    {
        serialPort.Write(data, i, 1);
        if (interactive)
        {
            WriteLine("Sent byte " + (i + 1) + " of " + data.Length + ". Press any key to send moar.");
            Console.ReadKey();
        }
        else
        {
            // busy-wait for delayMs milliseconds between bytes
            double timer = DateTime.Now.TimeOfDay.TotalMilliseconds;
            do { } while ((DateTime.Now.TimeOfDay.TotalMilliseconds - timer) < delayMs);
        }
    }
    WriteLine("Done sending bytes.");
}
Our SerialPort is configured with all the matching parameters (stop bits, data bits, parity, baud rate, port name), and our handshake is set to None (it's just how our UART driver works).
Regarding your update, it sounds like your crypto processor has some more problems. Getting a 0xFF back can be the result of an unexpected glitch of <= 1 bit time on the Tx line of the RS-232 port. This is interpreted as a start bit by the PC. After the glitch, the Tx line returns to the mark state, and now that the UART on the PC has a start bit, it interprets the "data" bits as all ones (the value for the mark state). The mark state is also the correct value for the stop bit(s), so your PC's UART has received a valid byte with a value of 0xFF. Note that the glitch can be very fast relative to the RS-232 data rate and still be interpreted as a start bit, so have your engineer look at this line with an oscilloscope in normal mode/single-sequence trigger to confirm this.
What is the Encoding property of the serialPort set to? The docs for SerialPort.Write(byte[], int, int) say that it runs its data through an Encoder object (which doesn't really make sense to me for a byte[]). It's supposed to default to ASCIIEncoding, but it seems like it might be set to something else. Try explicitly setting it to ASCIIEncoding and see if that helps. I can't recall if this was an issue for me back when I did some serial port stuff in .NET to talk to an embedded board...
Note that even with ASCIIEncoding in use, you'll get some (probably unwanted) transformation of data - if you try to send something above value 127, the encoder will convert it to '?' since it's not a valid ASCII character. I can't recall off the top of my head how I got the serial port to simply leave my data alone - I'll have to dig around in some source code...
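If I remember correctly, the usual trick to make the encoder leave bytes alone (it only matters for the string-based Write overloads) is to pick an 8-bit encoding that maps 0-255 one-to-one - something like this, though treat it as a suggestion rather than a tested recipe:
// ISO-8859-1 (Latin-1) maps every byte value straight through, so nothing becomes '?'.
serialPort.Encoding = Encoding.GetEncoding(28591);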
SerialPort sets the Parity property to Parity.None if you don't specify one. This means that if your receiver expects a parity bit, it will never get one as long as you don't explicitly tell SerialPort to send a parity bit with the transmitted data.
The fact that it went well in HyperTerminal could be because HyperTerminal uses a parity bit by default (I don't know HyperTerminal well).
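If the receiver does expect parity, setting it explicitly is a one-liner (which parity mode to use depends on the crypto processor's UART configuration):
// Match whatever the receiving UART is configured for.
serialPort.Parity = Parity.Even;   // or Parity.Odd / Parity.Mark / Parity.Space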