Windows C# implementation of the Linux dd command

I'm writing a C#.Net app to run on Windows that needs to take an image of a removable disk and chuck it onto a Linux Live USB. The Live USB is then inserted into the target machine and boots; on startup it runs a script which uses the dd command like so to flash the image onto another drive:
dd if=/path/to/file/from/csharp/program of=/dev/sdX
The problem I am having is creating the image on the Windows side. I have tried my Live Linux out with files I have created on a Linux system using dd, and that works fine, but I need to be able to create these files from within a C#.Net application on Windows. I'd rather not have to rely on Cygwin or some other dependency, so I tried to use the Win32 CreateFile function to open the physical device.
CreateFile is called with the first arg set to "\\.\F:" (if F: is the drive I want to image), like so:
SafeFileHandle TheDevice = CreateFile(_DevicePath, (uint)FileAccess.Read,
    (uint)(FileShare.Write | FileShare.Read | FileShare.Delete), IntPtr.Zero,
    (uint)FileMode.Open, (uint)FILE_ATTRIBUTE_SYSTEM | FILE_FLAG_SEQUENTIAL_SCAN, IntPtr.Zero);

if (TheDevice.IsInvalid)
{
    throw new IOException("Unable to access drive. Win32 Error Code " + Marshal.GetLastWin32Error());
}

FileStream Dest = System.IO.File.Open(_SaveFile, FileMode.Create);
FileStream Src = new FileStream(TheDevice, FileAccess.Read);
Src.CopyTo(Dest);
Dest.Flush();
Src.Close();
Dest.Close();
But when the output file is dd'd back onto a disk using the Live Linux USB, the result is not as expected (the disk isn't bootable, etc., although from examining the output file in a hex editor it looks like there is an MBR at the beginning).
Is this a problem with endianness, or should I be using something other than a FileStream to copy the data into the file?
Alternatively, is there an example of dd for Windows source code (C# or C++; I've looked at the Delphi source for http://www.chrysocome.net/dd but don't totally understand it, and don't have a decent Delphi IDE to pick the code apart) so I can see how that works?
UPDATE/EDIT:
Here is a hex string of the first 512 bytes that the dd output contains:
33 C0 FA 8E D8 8E D0 BC 00 7C 89 E6 06 57 8E C0 FB FC BF 00 06 B9 00 01 F3 A5 EA 1F 06
00 00 52 52 B4 41 BB AA 55 31 C9 30 F6 F9 CD 13 72 13 81 FB 55 AA 75 0D D1 E9 73 09 66
C7 06 8D 06 B4 42 EB 15 5A B4 08 CD 13 83 E1 3F 51 0F B6 C6 40 F7 E1 52 50 66 31 C0 66
99 E8 66 00 E8 21 01 4D 69 73 73 69 6E 67 20 6F 70 65 72 61 74 69 6E 67 20 73 79 73 74
65 6D 2E 0D 0A 66 60 66 31 D2 BB 00 7C 66 52 66 50 06 53 6A 01 6A 10 89 E6 66 F7 36 F4
7B C0 E4 06 88 E1 88 C5 92 F6 36 F8 7B 88 C6 08 E1 41 B8 01 02 8A 16 FA 7B CD 13 8D 64
10 66 61 C3 E8 C4 FF BE BE 7D BF BE 07 B9 20 00 F3 A5 C3 66 60 89 E5 BB BE 07 B9 04 00
31 C0 53 51 F6 07 80 74 03 40 89 DE 83 C3 10 E2 F3 48 74 5B 79 39 59 5B 8A 47 04 3C 0F
74 06 24 7F 3C 05 75 22 66 8B 47 08 66 8B 56 14 66 01 D0 66 21 D2 75 03 66 89 C2 E8 AC
FF 72 03 E8 B6 FF 66 8B 46 1C E8 A0 FF 83 C3 10 E2 CC 66 61 C3 E8 62 00 4D 75 6C 74 69
70 6C 65 20 61 63 74 69 76 65 20 70 61 72 74 69 74 69 6F 6E 73 2E 0D 0A 66 8B 44 08 66
03 46 1C 66 89 44 08 E8 30 FF 72 13 81 3E FE 7D 55 AA 0F 85 06 FF BC FA 7B 5A 5F 07 FA
FF E4 E8 1E 00 4F 70 65 72 61 74 69 6E 67 20 73 79 73 74 65 6D 20 6C 6F 61 64 20 65 72
72 6F 72 2E 0D 0A 5E AC B4 0E 8A 3E 62 04 B3 07 CD 10 3C 0A 75 F1 CD 18 F4 EB FD 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 19 16 9F 29 00 00 80 01 01 00 06 FE 3F 0E 3F 00 00 00 61 C8 03 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 55 AA
and here is what my code produces:
EB 76 90 4D 53 44 4F 53 35 2E 30 00 02 04 04 00 02 00 02 00 00 F8 F2 00 3F 00 FF 00 3F
00 00 00 61 C8 03 00 80 00 29 7A E8 21 04 4E 4F 20 4E 41 4D 45 20 20 20 20 46 41 54 31
36 20 20 20 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 E9 05 01 B4 0E 53 33 DB CD 10 5B C3 8A 07 3C 00 74 06 E8 EE FF 43 EB F4 C3
0D 4E 6F 20 42 50 42 3A 20 43 61 6E 27 74 20 62 6F 6F 74 20 75 73 69 6E 67 20 43 48 53
20 66 75 6E 63 74 69 6F 6E 73 00 50 B0 2E E8 BC FF 58 33 DB 8E 06 E4 01 F6 06 DC 01 02
75 42 F6 06 DC 01 04 75 07 80 3E E8 01 80 72 34 53 53 52 50 06 53 55 6A 10 8B F4 52 50
8A 16 E8 01 B8 00 42 F9 CD 13 8A EC 58 5A 8D 64 10 72 14 80 FD 00 75 0F 03 C5 83 D2 00
C3 BB 91 00 E8 78 FF F4 EB FD 83 3E 18 00 00 74 F0 52 50 8B CD F7 36 18 00 8B F2 03 D1
3B 16 18 00 76 06 8B 0E 18 00 2B CE 33 D2 F7 36 1A 00 88 16 E9 01 8B F8 8B D7 51 8A C1
8D 4C 01 C0 E6 06 0A CE 8A EA 8B 16 E8 01 B4 02 CD 13 59 73 15 80 FC 09 75 0A 49 EB DE
8A C4 04 30 E8 18 FF B4 00 CD 13 EB D1 58 5A 03 C1 83 D2 00 2B E9 74 07 C1 E1 09 03 D9
EB 94 C3 00 00 00 00 FA FC E8 00 00 5E 81 EE 85 01 2E 8B 84 E4 01 8E D8 8E C0 8E D0 2E
C7 84 7C 01 AF 01 2E 89 84 7E 01 B9 00 01 BF 00 00 F3 2E A5 2E FF AC 7C FF BC 00 0A FB
80 3E E8 01 FF 75 04 88 16 E8 01 83 06 E4 01 20 A1 E0 01 8B 16 E2 01 BD 02 00 E8 E9 FE
50 52 EB 74 90 00 00 00 00 00 00 00 00 00 00 00 D3 20 00 00 00 30 80 00 FF 00 68 41 00
40 09 FF 40 5A AC 04 00 00 AC 04 00 00 00 00 12 00 55 AA
This was taken from exactly the same CF card without any editing/writing happening in between, so I'm confused as to why they are so different, though both end with the correct 55 AA bytes. Does Windows mangle the MBRs on cards when they're accessed this way, or is some other weird under-the-hood stuff happening that I'm not aware of?

I think what you have should work - I've tried this myself using a bootable floppy disk image (mounted as a virtual drive using ImDisk) and the resulting file is binary identical to the original image.
For completeness here is the code I used (in its entirety):
using System;
using System.IO;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;

namespace ConsoleApplication1
{
    public class Program
    {
        const int FILE_ATTRIBUTE_SYSTEM = 0x4;
        const int FILE_FLAG_SEQUENTIAL_SCAN = 0x08000000; // the actual Win32 value

        [DllImport("Kernel32.dll", SetLastError = true, CharSet = CharSet.Auto)]
        public static extern SafeFileHandle CreateFile(
            string fileName,
            [MarshalAs(UnmanagedType.U4)] FileAccess fileAccess,
            [MarshalAs(UnmanagedType.U4)] FileShare fileShare,
            IntPtr securityAttributes,
            [MarshalAs(UnmanagedType.U4)] FileMode creationDisposition,
            int flags,
            IntPtr template);

        [STAThread]
        static void Main()
        {
            using (SafeFileHandle device = CreateFile(@"\\.\E:", FileAccess.Read, FileShare.Write | FileShare.Read | FileShare.Delete, IntPtr.Zero, FileMode.Open, FILE_ATTRIBUTE_SYSTEM | FILE_FLAG_SEQUENTIAL_SCAN, IntPtr.Zero))
            {
                if (device.IsInvalid)
                {
                    throw new IOException("Unable to access drive. Win32 Error Code " + Marshal.GetLastWin32Error());
                }
                using (FileStream dest = File.Open("TempFile.bin", FileMode.Create))
                using (FileStream src = new FileStream(device, FileAccess.Read))
                {
                    src.CopyTo(dest);
                }
            }
        }
    }
}
If this doesn't work then it seems to indicate one of the following:
1. There is a problem with the original image.
2. The problem is with whatever is using the disk image that you've just written.
3. There is some subtle difference in dealing with the specific device you are accessing (although I can't think what).
The most likely culprit is step 2. What exactly is it that you are doing with the resulting disk image?
Update: This is written in the comments, but for completeness I thought I'd add it to my answer - it looks like what's happening is that the contents of the first partition of the disk are being read, when instead what is wanted is the contents of the entire disk.
When you take a look at the second hex string (the one produced by the sample code) in something like HxD we see this:
ëv.MSDOS5.0..........øò.?.ÿ.?...aÈ..€.)zè!.NO NAME FAT16 ..
........................................................é..´.S3Û
Í.[Ê.<.t.èîÿCëôÃ.No BPB: Can't boot using CHS functions.P°.è¼ÿX
3ÛŽ.ä.ö.Ü..uBö.Ü..u.€>è.€r4SSRP.SUj.‹ôRPŠ.è.¸.BùÍ.ŠìXZ.d.r.€ý.u.
.ŃÒ.û‘.èxÿôëýƒ>...tðRP‹Í÷6..‹ò.Ñ;...v.‹...+Î3Ò÷6..ˆ.é.‹ø‹×QŠÁ.
L.Àæ..Ίê‹.è.´.Í.Ys.€ü.u.IëÞŠÄ.0è.ÿ´.Í.ëÑXZ.ÁƒÒ.+ét.Áá..Ùë”Ã....
úüè..^.î…..‹„ä.ŽØŽÀŽÐ.Ç„|.¯..‰„~.¹..¿..ó.¥.ÿ¬|ÿ¼..û€>è.ÿu.ˆ.è.ƒ.
ä. ¡à.‹.â.½..èéþPRët............Ó ...0€.ÿ.hA.#.ÿ#Z¬...¬.......Uª
This looks to me like the boot sector of a FAT16 partition - the presence of the strings "MSDOS5.0", "NO NAME" and "FAT16" near the start is a dead giveaway.
Compare this to the output of the first hex string (the one produced by dd):
3ÀúŽØŽÐ¼.|‰æ.WŽÀûü¿..¹..ó¥ê....RR´A»ªU1É0öùÍ.r..ûUªu.Ñés.fÇ...´B
ë.Z´.Í.ƒá?Q.¶Æ#÷áRPf1Àf™èf.è!.Missing operating system...f`f1Ò».
|fRfP.Sj.j.‰æf÷6ô{Àä.ˆáˆÅ’ö6ø{ˆÆ.áA¸..Š.ú{Í..d.faÃèÄÿ¾¾}¿¾.¹ .ó¥
Ãf`‰å»¾.¹..1ÀSQö.€t.#‰ÞƒÃ.âóHt[y9Y[ŠG.<.t.$.<.u"f‹G.f‹V.f.Ðf!Òu.
f‰Âè¬ÿr.è¶ÿf‹F.è ÿƒÃ.âÌfaÃèb.Multiple active partitions...f‹D.f.
F.f‰D.è0ÿr..>þ}Uª.….ÿ¼ú{Z_.úÿäè..Operating system load error...^
¬´.Š>b.³.Í.<.uñÍ.ôëý......................................Ÿ)..€.
...þ?.?...aÈ..................................................Uª
And we see something that looks to me a lot like a master boot record. Why? Because in the MBR all of the first 440 bytes are boot code, unlike a FAT boot sector which contains the distinctive BIOS parameter block (it looks like garbage above, but if you put it through a disassembler you get something that looks like valid 16-bit code).
Also, both of those look like valid and completely different boot sectors (complete with error messages). There is no way that a programming error could have "mangled" one to look like the other - it must just be that the wrong thing is being read.
In order to get CreateFile to return the disk instead of the partition it looks like you just need to pass it a different string, for example @"\\.\PhysicalDrive0" opens the first physical disk.
See:
Low Level Disk Access
INFO: Direct Drive Access Under Win32

This is what I've written to get the \\.\PhysicalDriveX path for a given drive letter. Pass the drive letter into this, take the return value, and pass it into CreateFile as the first param; I should now get something similar to dd under Linux.
using System.Management; // Add a reference to System.Management in the project settings as well

public static string GetPhysicalDevicePath(char DriveLetter)
{
    ManagementClass devs = new ManagementClass("Win32_DiskDrive");
    ManagementObjectCollection moc = devs.GetInstances();
    foreach (ManagementObject mo in moc)
    {
        foreach (ManagementObject b in mo.GetRelated("Win32_DiskPartition"))
        {
            foreach (ManagementBaseObject c in b.GetRelated("Win32_LogicalDisk"))
            {
                string DevName = string.Format("{0}", c["Name"]); // e.g. "F:"
                if (DevName[0] == DriveLetter)
                    return string.Format("{0}", mo["DeviceID"]); // e.g. "\\.\PHYSICALDRIVE1"
            }
        }
    }
    return "";
}
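For reference, tying the two pieces together looks something like this - a sketch only, reusing the CreateFile P/Invoke and constants from the answer above; the drive letter and output path are placeholders:

// Sketch: image the whole physical disk behind a drive letter to a file.
static void ImageDisk(char driveLetter, string outputPath)
{
    string devicePath = GetPhysicalDevicePath(driveLetter); // e.g. "\\.\PHYSICALDRIVE1"
    using (SafeFileHandle device = CreateFile(devicePath, FileAccess.Read,
        FileShare.Write | FileShare.Read | FileShare.Delete, IntPtr.Zero,
        FileMode.Open, FILE_ATTRIBUTE_SYSTEM | FILE_FLAG_SEQUENTIAL_SCAN, IntPtr.Zero))
    {
        if (device.IsInvalid)
            throw new IOException("Unable to access drive. Win32 Error Code " + Marshal.GetLastWin32Error());

        using (FileStream dest = File.Open(outputPath, FileMode.Create))
        using (FileStream src = new FileStream(device, FileAccess.Read))
        {
            src.CopyTo(dest);
        }
    }
}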

Related

TLS/SSL RabbitMQ (Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.)

I set up TLS/SSL with the following steps.
To create the root CA certificate:
genrsa -des3 -out CA-key.pem 2048
req -new -key CA-key.pem -x509 -days 1000 -out CA-cert.pem -subj "/C=US/ST=Oregon/L=Portland/O=user/OU=Org/CN=right.xcl.one"
To create and sign a server certificate:
genrsa -des3 -out server-key.pem 2048
req -new -config openssl.cnf -key server-key.pem -out signingReq.csr
x509 -req -days 365 -in signingReq.csr -CA CA-cert.pem -CAkey CA-key.pem -CAcreateserial -out server-cert.pem
To create the client certificate:
pkcs12 -export -out client-cert.p12 -inkey server-key.pem -in server-cert.pem
Server RabbitMQ config:
[
  {rabbit, [
    {auth_mechanisms, ['EXTERNAL']},
    {loopback_users, []},
    {ssl_listeners, [5671]},
    {ssl_options, [{cacertfile, "D:/RabbitMQ/certs/CA-cert.pem"},
                   {certfile, "D:/RabbitMQ/certs/server-cert.pem"},
                   {keyfile, "D:/RabbitMQ/certs/server-key.pem"},
                   {verify, verify_peer},
                   {password, "test"},
                   {fail_if_no_peer_cert, false}]}
  ]}
].
Calling from C# locally:
var hostName = "right.xcl.one";
var cf = new ConnectionFactory
{
    HostName = hostName,
    UserName = "user",
    Password = "user",
    VirtualHost = "/",
    AuthMechanisms = new IAuthMechanismFactory[] { new ExternalMechanismFactory() },
    Ssl = new SslOption
    {
        Enabled = true,
        ServerName = "right.xcl.one",
        AcceptablePolicyErrors = SslPolicyErrors.RemoteCertificateNameMismatch |
                                 SslPolicyErrors.RemoteCertificateChainErrors,
        CertPath = @"D:\client-cert.p12",
        CertPassphrase = "test",
    }
};
using (IConnection conn = cf.CreateConnection())
OpenSSL> s_client -connect malta1597.startdedicated.com:5671 -cert client-cert.pem -key client-key.pem -CAfile CA-cert.pem -verify 8 -verify_hostname malta1597.startdedicated.com\ -state -debug
verify depth is 8
Enter pass phrase for client-key.pem:
CONNECTED(00000144)
SSL_connect:before SSL initialization
write to 0x1924d107020 [0x1924d1267b0] (330 bytes => 330 (0x14A))
0000 - 16 03 01 01 45 01 00 01-41 03 03 f7 84 a2 00 f6 ....E...A.......
0010 - 82 f2 f0 ef 26 79 3d fb-56 dd f9 37 79 fd 19 58 ....&y=.V..7y..X
0020 - 81 c8 a0 bc b3 5f f3 b5-29 a3 73 20 f8 06 9d 28 ....._..).s ...(
0030 - ec eb 1b c8 e6 f8 4f fe-97 1c 74 23 93 8f db ef ......O...t#....
0040 - 8a ad 18 af 71 96 c2 40-b1 99 9d 92 00 3e 13 02 ....q..#.....>..
0050 - 13 03 13 01 c0 2c c0 30-00 9f cc a9 cc a8 cc aa .....,.0........
0060 - c0 2b c0 2f 00 9e c0 24-c0 28 00 6b c0 23 c0 27 .+./...$.(.k.#.'
0070 - 00 67 c0 0a c0 14 00 39-c0 09 c0 13 00 33 00 9d .g.....9.....3..
0080 - 00 9c 00 3d 00 3c 00 35-00 2f 00 ff 01 00 00 ba ...=.<.5./......
0090 - 00 00 00 21 00 1f 00 00-1c 6d 61 6c 74 61 31 35 ...!.....malta15
00a0 - 39 37 2e 73 74 61 72 74-64 65 64 69 63 61 74 65 97.startdedicate
00b0 - 64 2e 63 6f 6d 00 0b 00-04 03 00 01 02 00 0a 00 d.com...........
00c0 - 0c 00 0a 00 1d 00 17 00-1e 00 19 00 18 00 23 00 ..............#.
00d0 - 00 00 16 00 00 00 17 00-00 00 0d 00 30 00 2e 04 ............0...
00e0 - 03 05 03 06 03 08 07 08-08 08 09 08 0a 08 0b 08 ................
00f0 - 04 08 05 08 06 04 01 05-01 06 01 03 03 02 03 03 ................
0100 - 01 02 01 03 02 02 02 04-02 05 02 06 02 00 2b 00 ..............+.
0110 - 09 08 03 04 03 03 03 02-03 01 00 2d 00 02 01 01 ...........-....
0120 - 00 33 00 26 00 24 00 1d-00 20 d3 71 9e 9a b6 7d .3.&.$... .q...}
0130 - 1f 40 6c f3 35 dc b0 86-bc 52 c9 7e ba b8 64 0b .#l.5....R.~..d.
0140 - d7 09 df b9 a5 34 15 f4-0f 1e .....4....
SSL_connect:SSLv3/TLS write client hello
read from 0x1924d107020 [0x1924d11d593] (5 bytes => -1 (0xFFFFFFFF))
SSL_connect:error in SSLv3/TLS write client hello
write:errno=10054
no peer certificate available
No client certificate CA names sent
SSL handshake has read 0 bytes and written 330 bytes
Verification: OK
New, (NONE), Cipher is (NONE)
Secure Renegotiation IS NOT supported
Compression: NONE
Expansion: NONE
No ALPN negotiated
Early data was not sent
Verify return code: 0 (ok)
read from 0x1924d107020 [0x1924d0fb0a0] (8192 bytes => -1 (0xFFFFFFFF))
error in s_client
OpenSSL>
Authentication succeeds, but the error is now:
One or more errors occurred. (Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host..)
Please help.

Why does this C# TcpClient code not always see a response?

I am using the following code to read a pseudo-HTTP response. It works sometimes but not always, and I do not understand why.
Background: I have a device that takes HTTP GET requests and sends a chunked HTTP response. In one case the response is not a proper chunked HTTP response: it leaves out the zero-length chunk that indicates the end of data. I have fixed that problem in the device, but I am trying to figure out how to read the non-conforming HTTP response. I found code from Create http request using TcpClient that sometimes works and sometimes doesn't, and I do not understand why.
If I use the code unaltered, it works fine. If I replace "www.bing.com" with my device's IP, "192.168.1.89", in both places the string appears, and change the GET command line to "GET /index.htm HTTP/1.1", it also works fine. This version of the command returns a web page that is constructed by the device and sent as several TCP buffers (about 1400 bytes in my device) of chunked data.
However, if I change to another command that my device understands, "GET /request.htm?T HTTP/1.1", which returns less than 500 bytes of chunked data, then I never see the response. In fact it never gets past the call to "CopyToAsync(memory)", and I do not understand why. The device sees the request, parses it, and sends a proper HTTP response. (I know it is a proper response because I have code that uses HttpClient to read the response and it sees the response fine. And the response data going out from the device side is exactly the same in both cases; I can see the device data because I am writing the device's firmware and can change it to printf() the data being sent to the TCP routines.)
Anyone have an explanation for why the code below isn't always seeing a response?
private static async Task<string> HttpRequestAsync()
{
    string result = string.Empty;
    using (var tcp = new TcpClient("www.bing.com", 80))
    using (var stream = tcp.GetStream())
    {
        tcp.SendTimeout = 500;
        tcp.ReceiveTimeout = 1000;
        // Send request headers
        var builder = new StringBuilder();
        builder.AppendLine("GET /?scope=images&nr=1 HTTP/1.1");
        builder.AppendLine("Host: www.bing.com");
        //builder.AppendLine("Content-Length: " + data.Length); // only for POST request
        builder.AppendLine("Connection: close");
        builder.AppendLine();
        var header = Encoding.ASCII.GetBytes(builder.ToString());
        await stream.WriteAsync(header, 0, header.Length);
        // Send payload data if you are POST request
        //await stream.WriteAsync(data, 0, data.Length);
        // receive data
        using (var memory = new MemoryStream())
        {
            await stream.CopyToAsync(memory);
            memory.Position = 0;
            var data = memory.ToArray();
            var index = BinaryMatch(data, Encoding.ASCII.GetBytes("\r\n\r\n")) + 4;
            var headers = Encoding.ASCII.GetString(data, 0, index);
            memory.Position = index;
            if (headers.IndexOf("Content-Encoding: gzip") > 0)
            {
                using (GZipStream decompressionStream = new GZipStream(memory, CompressionMode.Decompress))
                using (var decompressedMemory = new MemoryStream())
                {
                    decompressionStream.CopyTo(decompressedMemory);
                    decompressedMemory.Position = 0;
                    result = Encoding.UTF8.GetString(decompressedMemory.ToArray());
                }
            }
            else
            {
                result = Encoding.UTF8.GetString(data, index, data.Length - index);
                //result = Encoding.GetEncoding("gbk").GetString(data, index, data.Length - index);
            }
        }
        //Debug.WriteLine(result);
        return result;
    }
}
private static int BinaryMatch(byte[] input, byte[] pattern)
{
    int sLen = input.Length - pattern.Length + 1;
    for (int i = 0; i < sLen; ++i)
    {
        bool match = true;
        for (int j = 0; j < pattern.Length; ++j)
        {
            if (input[i + j] != pattern[j])
            {
                match = false;
                break;
            }
        }
        if (match)
        {
            return i;
        }
    }
    return -1;
}
=====================
Let me edit the function above to show what it is now and maybe clarify things.
static async Task<byte[]> getTcpClientHttpDataRequestAsync(string ipAddress, string request)
{
    List<byte> arrayList = new List<byte>();
    using (var tcp = new TcpClient("192.168.1.89", 80))
    using (var stream = tcp.GetStream())
    using (var memory = new MemoryStream())
    {
        tcp.SendTimeout = 500;
        tcp.ReceiveTimeout = 10000;
        tcp.NoDelay = true;
        // Send request headers
        var builder = new StringBuilder();
        builder.AppendLine("GET /request.htm?x01011920000000000001 HTTP/1.1");
        builder.AppendLine("Host: 192.168.1.89");
        builder.AppendLine("Connection: Close");
        builder.AppendLine();
        var header = Encoding.ASCII.GetBytes(builder.ToString());
        Console.WriteLine("======");
        Console.WriteLine(builder.ToString());
        Console.WriteLine("======");
        await stream.WriteAsync(header, 0, header.Length);
        // Busy-wait until the first response byte arrives
        do { } while (!stream.DataAvailable);
        Console.WriteLine("Data available");
        bool done = false;
        do
        {
            int next = stream.ReadByte();
            if (next < 0)
            {
                done = true;
            }
            else
            {
                arrayList.Add(Convert.ToByte(next));
            }
        } while (stream.DataAvailable && !done);
        byte[] data = arrayList.ToArray();
        return data;
    }
}
The GET command is what my device is responding to. If the command starts with 'x' as shown, the device responds with a proper HTTP response and the function above reads the data. If it starts with 'd', the response is missing the zero-length chunk at the end, and the function above never sees any data from the device.
With Wireshark, I am seeing the following responses for the 'x' and 'd' commands.
The 'x' command returns 2 TCP frames with the following data:
0000 1c 6f 65 d3 f0 e2 4c 60 de 41 3f 67 08 00 45 00 .oe...L`.A?g..E.
0010 00 9c 00 47 00 00 64 06 d2 49 c0 a8 01 59 c0 a8 ...G..d..I...Y..
0020 01 22 00 50 05 5d fc f5 9e 72 ad 75 e3 2c 50 18 .".P.]...r.u.,P.
0030 00 01 a9 cd 00 00 48 54 54 50 2f 31 2e 31 20 32 ......HTTP/1.1 2
0040 30 30 20 4f 4b 0d 0a 43 6f 6e 6e 65 63 74 69 6f 00 OK..Connectio
0050 6e 3a 20 63 6c 6f 73 65 0d 0a 43 6f 6e 74 65 6e n: close..Conten
0060 74 2d 54 79 70 65 3a 20 74 65 78 74 2f 68 74 6d t-Type: text/htm
0070 6c 0d 0a 43 61 63 68 65 2d 43 6f 6e 74 72 6f 6c l..Cache-Control
0080 3a 20 6e 6f 2d 63 61 63 68 65 0d 0a 54 72 61 6e : no-cache..Tran
0090 73 66 65 72 2d 45 6e 63 6f 64 69 6e 67 3a 20 63 sfer-Encoding: c
00a0 68 75 6e 6b 65 64 0d 0a 0d 0a hunked....
0000 1c 6f 65 d3 f0 e2 4c 60 de 41 3f 67 08 00 45 00 .oe...L`.A?g..E.
0010 00 45 00 48 00 00 64 06 d2 9f c0 a8 01 59 c0 a8 .E.H..d......Y..
0020 01 22 00 50 05 5d fc f5 9e e6 ad 75 e3 2c 50 18 .".P.].....u.,P.
0030 00 01 fc 20 00 00 30 30 31 0d 0a 2b 0d 0a 30 30 ... ..001..+..00
0040 37 0d 0a 01 85 86 00 00 0d 0a 0d 0a 30 30 30 0d 7...........000.
0050 0a 0d 0a ...
By comparison the 'd' command returns data in 2 TCP frames as:
0000 1c 6f 65 d3 f0 e2 4c 60 de 41 3f 67 08 00 45 00 .oe...L`.A?g..E.
0010 00 9c 00 4e 00 00 64 06 d2 42 c0 a8 01 59 c0 a8 ...N..d..B...Y..
0020 01 22 00 50 05 5e d3 c3 f9 f5 69 cc 6d a3 50 18 .".P.^....i.m.P.
0030 00 01 30 ae 00 00 48 54 54 50 2f 31 2e 31 20 32 ..0...HTTP/1.1 2
0040 30 30 20 4f 4b 0d 0a 43 6f 6e 6e 65 63 74 69 6f 00 OK..Connectio
0050 6e 3a 20 63 6c 6f 73 65 0d 0a 43 6f 6e 74 65 6e n: close..Conten
0060 74 2d 54 79 70 65 3a 20 74 65 78 74 2f 68 74 6d t-Type: text/htm
0070 6c 0d 0a 43 61 63 68 65 2d 43 6f 6e 74 72 6f 6c l..Cache-Control
0080 3a 20 6e 6f 2d 63 61 63 68 65 0d 0a 54 72 61 6e : no-cache..Tran
0090 73 66 65 72 2d 45 6e 63 6f 64 69 6e 67 3a 20 63 sfer-Encoding: c
00a0 68 75 6e 6b 65 64 0d 0a 0d 0a hunked....
0000 1c 6f 65 d3 f0 e2 4c 60 de 41 3f 67 08 00 45 00 .oe...L`.A?g..E.
0010 00 36 00 4f 00 00 64 06 d2 a7 c0 a8 01 59 c0 a8 .6.O..d......Y..
0020 01 22 00 50 05 5e d3 c3 fa 69 69 cc 6d a3 50 18 .".P.^...ii.m.P.
0030 00 01 64 c2 00 00 30 30 37 0d 0a 01 90 91 00 00 ..d...007.......
0040 0d 0a 0d 0a ....
The only discernible differences that I see are that in the second frame of the 'd' command it is missing a 1-byte chunk that is part of our protocol (which shouldn't have any effect on the TCP/HTTP function) and the last 7 bytes of data that the 'x' command provides, which is the zero-length chunk expected for HTTP.
Going back to the code in HttpRequestAsync(), if the 'd' command is sent then the code never sees stream.DataAvailable become true, even though the data has been sent. Why?
await stream.CopyToAsync() will not complete until the stream reports end-of-data, that is, until the connection is closed - not merely when stream.DataAvailable == false.
You have indicated to the server, in the headers, that you will close the TCP connection when done, but you have not done so. The server will eventually close the connection when it thinks you're gone. The server is not obligated to obey your "Connection: close" request, and its own response headers should indicate whether it will.
Before you call stream.CopyToAsync() you should check the headers to determine what Content-Length has been supplied, pass a buffer length to stream.CopyToAsync(), and then call TcpClient.Close().
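A sketch of that idea follows - illustrative only; it handles the Content-Length case as described above, and a chunked response like the one in the question would still need its own end-of-body detection. It assumes the usual usings (System.IO, System.Text, System.Text.RegularExpressions, System.Collections.Generic, System.Net.Sockets, System.Threading.Tasks):

// Sketch: read headers up to CRLFCRLF, then read exactly Content-Length
// body bytes instead of waiting for the server to close the connection.
static async Task<byte[]> ReadBodyByContentLengthAsync(NetworkStream stream)
{
    var headerBytes = new List<byte>();
    while (!EndsWithBlankLine(headerBytes))
    {
        int b = stream.ReadByte();
        if (b < 0) throw new IOException("Connection closed before headers completed.");
        headerBytes.Add((byte)b);
    }

    string headers = Encoding.ASCII.GetString(headerBytes.ToArray());
    var match = Regex.Match(headers, @"Content-Length:\s*(\d+)", RegexOptions.IgnoreCase);
    int length = match.Success ? int.Parse(match.Groups[1].Value) : 0;

    byte[] body = new byte[length];
    int read = 0;
    while (read < length)
    {
        int n = await stream.ReadAsync(body, read, length - read);
        if (n == 0) break; // connection closed early
        read += n;
    }
    return body;
}

static bool EndsWithBlankLine(List<byte> bytes) =>
    bytes.Count >= 4 &&
    bytes[bytes.Count - 4] == (byte)'\r' && bytes[bytes.Count - 3] == (byte)'\n' &&
    bytes[bytes.Count - 2] == (byte)'\r' && bytes[bytes.Count - 1] == (byte)'\n';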

C# BinaryFormatter byte order

I am using BinaryFormatter in order to serialize my object.
I would like to know what the order of the properties in the serialized byte array is (according to the property order in the object class? random?).
And whether I can control the order of the bytes according to the properties.
For example,
If I serialize the following object:
public class Human
{
    int Age { get; set; }
    int Weight { get; set; }
}
If I serialize it, what does the order of the bytes mean? (Will the first 4 bytes represent the age and the next 4 the weight, and so on, or does the binary formatter set it randomly?)
Why don't you just try it? Let's take your class
[Serializable]
public class Human
{
    public int Age { get; set; }
    public int Weight { get; set; }
}
And serialize it, then inspect the result by examining the HexDump
var bf = new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter();
using (var ms = new MemoryStream())
{
    bf.Serialize(ms, new Human { Age = 42, Weight = -1 });
    HexDump(ms.ToArray());
}
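HexDump here is a little helper, not a framework method; a minimal sketch that produces dumps shaped like the output below (give or take the placeholder character used for non-printable bytes) could be, assuming System and System.Linq:

// Sketch of a HexDump helper: 16 bytes per line, offset, hex column, ASCII column.
static void HexDump(byte[] bytes)
{
    for (int offset = 0; offset < bytes.Length; offset += 16)
    {
        var line = bytes.Skip(offset).Take(16).ToArray();
        string hex = string.Join(" ", line.Select(b => b.ToString("X2"))).PadRight(47);
        string ascii = new string(line.Select(b => b >= 0x20 && b < 0x7F ? (char)b : '.').ToArray());
        Console.WriteLine("{0:00000} : {1} {2}", offset, hex, ascii);
    }
}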
This will give:
00000 : 00 01 00 00 00 FF FF FF FF 01 00 00 00 00 00 00 .....????.......
00016 : 00 0C 02 00 00 00 43 71 75 65 72 79 5F 6C 68 68 ......Cquery_lhh
00032 : 75 78 68 2C 20 56 65 72 73 69 6F 6E 3D 30 2E 30 uxh, Version=0.0
00048 : 2E 30 2E 30 2C 20 43 75 6C 74 75 72 65 3D 6E 65 .0.0, Culture=ne
00064 : 75 74 72 61 6C 2C 20 50 75 62 6C 69 63 4B 65 79 utral, PublicKey
00080 : 54 6F 6B 65 6E 3D 6E 75 6C 6C 05 01 00 00 00 0F Token=null......
00096 : 55 73 65 72 51 75 65 72 79 2B 48 75 6D 61 6E 02 UserQuery+Human.
00112 : 00 00 00 14 3C 41 67 65 3E 6B 5F 5F 42 61 63 6B ....<Age>k__Back
00128 : 69 6E 67 46 69 65 6C 64 17 3C 57 65 69 67 68 74 ingField.<Weight
00144 : 3E 6B 5F 5F 42 61 63 6B 69 6E 67 46 69 65 6C 64 >k__BackingField
00160 : 00 00 08 08 02 00 00 00 2A 00 00 00 FF FF FF FF ........*...????
00176 : 0B .
That is the convoluted format Hans is talking about. If you squint a bit you recognize an assembly name, the class name, the field names (kind of), and if you apply the magic offered by jdweng you notice the 4 bytes 2A 00 00 00, which would make 42 (Age), and the next 4 bytes represent -1 (Weight).
Let's add a public field Name as the first field:
[Serializable]
public class Human
{
    public string Name;
    public int Age { get; set; }
    public int Weight { get; set; }
}
and let's look at the changed bytes:
00096 : 55 73 65 72 51 75 65 72 79 2B 48 75 6D 61 6E 03 UserQuery+Human.
00112 : 00 00 00 04 4E 61 6D 65 14 3C 41 67 65 3E 6B 5F ....Name.<Age>k_
00128 : 5F 42 61 63 6B 69 6E 67 46 69 65 6C 64 17 3C 57 _BackingField.<W
00144 : 65 69 67 68 74 3E 6B 5F 5F 42 61 63 6B 69 6E 67 eight>k__Backing
00160 : 46 69 65 6C 64 01 00 00 08 08 02 00 00 00 06 03 Field...........
00176 : 00 00 00 04 54 65 73 74 2A 00 00 00 FE FF FF FF ....Test*...????
00192 : 0B .
That seems to make sense. Let's put that field at the end:
[Serializable]
public class Human
{
    public int Age { get; set; }
    public int Weight { get; set; }
    public string Name;
}
and the result is:
00096 : 55 73 65 72 51 75 65 72 79 2B 48 75 6D 61 6E 03 UserQuery+Human.
00112 : 00 00 00 04 4E 61 6D 65 14 3C 41 67 65 3E 6B 5F ....Name.<Age>k_
00128 : 5F 42 61 63 6B 69 6E 67 46 69 65 6C 64 17 3C 57 _BackingField.<W
00144 : 65 69 67 68 74 3E 6B 5F 5F 42 61 63 6B 69 6E 67 eight>k__Backing
00160 : 46 69 65 6C 64 01 00 00 08 08 02 00 00 00 06 03 Field...........
00176 : 00 00 00 04 54 65 73 74 2A 00 00 00 FE FF FF FF ....Test*...????
00192 : 0B .
No change at all.
One final example to convince you that the output of the BinaryFormatter is an implementation detail, and that serializing and deserializing should be left to that class and not be attempted by other means.
[Serializable]
public class Human
{
    public string[] Address;
    private string _name;
    public int Weight { get; set; } // switched
    public int Age { get; set; }
    public string Name { get { return _name; } set { _name = value; } }
}
And if we initialize that class as follows:
new Human { Name = "Test", Age = 42, Weight = -1, Address = new[] { "foo", "bar" } }
the hexdump will show this:
00096 : 55 73 65 72 51 75 65 72 79 2B 48 75 6D 61 6E 04 UserQuery+Human.
00112 : 00 00 00 07 41 64 64 72 65 73 73 05 5F 6E 61 6D ....Address._nam
00128 : 65 17 3C 57 65 69 67 68 74 3E 6B 5F 5F 42 61 63 e.<Weight>k__Bac
00144 : 6B 69 6E 67 46 69 65 6C 64 14 3C 41 67 65 3E 6B kingField.<Age>k
00160 : 5F 5F 42 61 63 6B 69 6E 67 46 69 65 6C 64 06 01 __BackingField..
00176 : 00 00 08 08 02 00 00 00 09 03 00 00 00 06 04 00 ................
00192 : 00 00 04 54 65 73 74 FF FF FF FF 2A 00 00 00 11 ...Test????*....
00208 : 03 00 00 00 02 00 00 00 06 05 00 00 00 03 66 6F ..............fo
00224 : 6F 06 06 00 00 00 03 62 61 72 0B o......bar.
Notice the order of Address and _name, although the actual values of the string[] array are put at the end.
So to answer your question:
I would like to know what is the order of the properties in the serialized byte array (according to properties order in the object class? randomly?)
It is an implementation detail that depends on the type of the field and its order in the class. Its metadata and actual value might be in a different order as well. It is not random, and it is not simply the order in the class.
And if I can control the order of the bytes according to the props.
It might seem you can control it to some extent, but this is so much of an implementation detail that it is not practical to try to influence it, predict it or rely on it.
Keep in mind that you can only serialize and deserialize the specific version of the class. There is no backward compatibility.
If you need strict control over the serialization format, use an open standard like XML, JSON or protobuf. Or roll your own serializer, leveraging BinaryWriter as suggested by Peter.
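To illustrate that last suggestion, a minimal sketch using the original two-property Human; the format below is made up for this example (4 bytes of Age followed by 4 bytes of Weight, both little-endian), but the point is that the byte order is now exactly what the code writes:

// Sketch: hand-rolled serialization with BinaryWriter/BinaryReader.
static byte[] SerializeHuman(Human h)
{
    using (var ms = new MemoryStream())
    using (var writer = new BinaryWriter(ms))
    {
        writer.Write(h.Age);    // bytes 0..3: Age, little-endian
        writer.Write(h.Weight); // bytes 4..7: Weight, little-endian
        return ms.ToArray();
    }
}

static Human DeserializeHuman(byte[] bytes)
{
    using (var ms = new MemoryStream(bytes))
    using (var reader = new BinaryReader(ms))
    {
        return new Human { Age = reader.ReadInt32(), Weight = reader.ReadInt32() };
    }
}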

C# Converting to ?? from Wireshark Response

I am struggling a bit here. I am sending a TCP request to an endpoint.
client = new TcpClient(serverIP, serverPort);
tcpStream = client.GetStream();
var stringtosend = @"43 50 43 52 00 01 19 00 00 00 1a 00 00 00
01 00 00 00 00 00 00 00 32 00 00 01";
stringtosend = stringtosend.Replace(" ", "");
var requestbytes = ConvertToByteArray(stringtosend);
tcpStream.Write(requestbytes, 0, requestbytes.Length);

List<int> ResponseBytes = new List<int>();
var sb = new StringBuilder();
do
{
    var byteRead = tcpStream.ReadByte();
    if (byteRead != -1)
    {
        sb.Append(byteRead);
    }
    else
    {
        break;
    }
} while (tcpStream.DataAvailable);
This is the response as read from wireshark:
43 50 43 52 00 01 19 00 00 00 34 01 00 00 1B 01 00 00 0B 01 0B 01 3C 00 00 32 00 01 FF FF 0C 00 48 56 41 43 20 43 54 52 4C 00 02 02 40 01 46 0B 01 00 00 00 00 00 00 45 4D 2D 4D 38 38 49 23 31 00 11 00 00 00 00 81 01 00 00 00 00 00 00 41 43 31 4D 38 38 49 23 32 00 11 00 00 00 00 81 02 00 00 00 00 00 00 45 4D 2D 4D 38 38 4F 23 31 00 12 00 00 00 00 82 01 00 00 00 00 00 00 41 43 31 4D 38 38 4F 23 32 00 12 00 00 00 00 82 02 00 00 00 00 00 00 41 43 32 4D 52 54 55 23 31 00 0F 03 0B 01 46 90 01 00 00 00 00 00 00 41 43 33 4D 52 54 55 23 32 00 0F 03 0B 01 46 90 02 00 00 00 00 00 00 41 43 34 4D 52 54 55 23 33 00 0F 03 0B 01 46 90 03 00 00 00 00 00 00 41 43 35 4D 52 54 55 23 34 00 0F 03 0B 01 46 90 04 00 00 00 00 00 00 41 43 36 4D 52 54 55 23 35 00 0F 03 0B 01 46 90 05 00 00 00 00 00 00 52 41 43 4B 20 31 26 32 00 00 03 02 40 01 46 01 01 00 00 00 00 00 00 52 41 43 4B 20 33 26 34 00 00 03 02 40 01 46 02 01 00 00 00 00 00 00
CPCR......4...........<..2..ÿÿ..HVAC CTRL...#.F........EM-M88I#1.............AC1M88I#2.............EM-M88O#1.............AC1M88O#2.............AC2MRTU#1.....F.......AC3MRTU#2.....F.......AC4MRTU#3.....F.......AC5MRTU#4.....F.......AC6MRTU#5.....F.......RACK 1&2....#.F........RACK 3&4....#.F........
I am interested in how to convert what I believe to be a hex response into the readable text shown above.
I'm new at this so please be gentle.
At its simplest you can do the following, taken directly from
https://learn.microsoft.com/en-us/dotnet/csharp/programming-guide/types/how-to-convert-between-hexadecimal-strings-and-numeric-types
// Convert the number expressed in base-16 to an integer.
int value = Convert.ToInt32(hex, 16);
// Get the character corresponding to the integral value.
string stringValue = Char.ConvertFromUtf32(value);
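For example, the first response byte 43 decodes like this:

int value = Convert.ToInt32("43", 16);              // 67
string stringValue = Char.ConvertFromUtf32(value);  // "C" - the first character of "CPCR"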
Also I have used websites like the following
http://www.rapidtables.com/convert/number/hex-to-ascii.htm
I've just knocked this up as a console application, so you should be able to copy and paste it as your Program.cs:
using System;
using System.Linq;
using System.Text;
using System.Text.RegularExpressions;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            const string inputString = "43 50 43 52 00 01 19 00 00 00 34 01 00 00 1B 01 00 00 0B 01 0B 01 3C 00 00 32 00 01 FF FF 0C 00 48 56 41 43 20 43 54 52 4C 00 02 02 40 01 46 0B 01 00 00 00 00 00 00 45 4D 2D 4D 38 38 49 23 31 00 11 00 00 00 00 81 01 00 00 00 00 00 00 41 43 31 4D 38 38 49 23 32 00 11 00 00 00 00 81 02 00 00 00 00 00 00 45 4D 2D 4D 38 38 4F 23 31 00 12 00 00 00 00 82 01 00 00 00 00 00 00 41 43 31 4D 38 38 4F 23 32 00 12 00 00 00 00 82 02 00 00 00 00 00 00 41 43 32 4D 52 54 55 23 31 00 0F 03 0B 01 46 90 01 00 00 00 00 00 00 41 43 33 4D 52 54 55 23 32 00 0F 03 0B 01 46 90 02 00 00 00 00 00 00 41 43 34 4D 52 54 55 23 33 00 0F 03 0B 01 46 90 03 00 00 00 00 00 00 41 43 35 4D 52 54 55 23 34 00 0F 03 0B 01 46 90 04 00 00 00 00 00 00 41 43 36 4D 52 54 55 23 35 00 0F 03 0B 01 46 90 05 00 00 00 00 00 00 52 41 43 4B 20 31 26 32 00 00 03 02 40 01 46 01 01 00 00 00 00 00 00 52 41 43 4B 20 33 26 34 00 00 03 02 40 01 46 02 01 00 00 00 00 00 00";
            var bytes = HexStringToBytes(inputString);
            var readableString = Encoding.UTF7.GetString(bytes).ToCharArray();
            var paddedReadableString = readableString.Select(ch => char.IsControl(ch) ? '.' : ch).ToArray();
            var outputString = new String(paddedReadableString);
            Console.WriteLine(outputString);
            Console.ReadLine();
        }

        private static byte[] HexStringToBytes(string hex)
        {
            var strippedHex = Regex.Replace(hex, @"\s+", "");
            byte[] raw = new byte[strippedHex.Length / 2];
            for (int i = 0; i < raw.Length; i++)
            {
                raw[i] = Convert.ToByte(strippedHex.Substring(i * 2, 2), 16);
            }
            return raw;
        }
    }
}
For info, your desired format is UTF7.
You are reading your bytes directly into a StringBuilder.
The key line here is:
var byteRead = tcpStream.ReadByte();
You have an opportunity to convert the encoding here using this static method which will look something like:
// Create two different encodings.
Encoding utf8 = Encoding.UTF8;
Encoding unicode = Encoding.Unicode;

// Perform the conversion from one encoding to the other
// (ReadByte() returns an int, so cast it back down to a byte).
byte[] unicodeBytes = Encoding.Convert(utf8, unicode, new byte[] { (byte)byteRead });
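Note that converting one byte at a time can split multi-byte sequences; a sketch of the usual alternative (collect the raw bytes first, decode the whole buffer once - ASCII is assumed here since the payload above is plain device names):

// Collect raw response bytes, then decode the complete buffer in one go.
var responseBytes = new List<byte>();
int byteRead;
while ((byteRead = tcpStream.ReadByte()) != -1)
{
    responseBytes.Add((byte)byteRead);
    if (!tcpStream.DataAvailable) break; // same stop condition as the original loop
}
string text = Encoding.ASCII.GetString(responseBytes.ToArray());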

AcceptSecurityContext fails when application is running as a service

I have a simple HTTP server that authenticates clients with the Negotiate protocol. It uses SSPI calls to acquire server credentials and establish a security context. The server is in a domain and runs on behalf of a domain user. Everything works fine and I'm getting an HTTP 200 response if I start the server in console mode. However, when I'm running it as a service I'm getting the SEC_E_INVALID_HANDLE error. Here is what happens when I start it in console mode:
1. Client sends HTTP GET request to http://localhost:8082
2. Server responds with a WWW-Authenticate: Negotiate header.
3. Client sends an Authorization header that includes the following data:
60 73 06 06 2B 06 01 05 05 02 A0 69 30 67 A0 30 `s..+..... i0g 0
30 2E 06 0A 2B 06 01 04 01 82 37 02 02 0A 06 09 0...+....7.....
2A 86 48 82 F7 12 01 02 02 06 09 2A 86 48 86 F7 *H÷......*H÷
12 01 02 02 06 0A 2B 06 01 04 01 82 37 02 02 1E ......+....7...
A2 33 04 31 4E 54 4C 4D 53 53 50 00 01 00 00 00 ¢3.1NTLMSSP.....
97 B2 08 E2 04 00 04 00 2D 00 00 00 05 00 05 00 ².â....-.......
28 00 00 00 06 01 B1 1D 00 00 00 0F 50 41 43 45 (.....±.....PACE
4D 42 4C 41 48 MBLAH
4. Server responds with HTTP 401 and a negotiate header prompting to continue:
A1 81 CE 30 81 CB A0 03 0A 01 01 A1 0C 06 0A 2B ¡Î0Ë ....¡...+
06 01 04 01 82 37 02 02 0A A2 81 B5 04 81 B2 4E ....7...¢µ.²N
54 4C 4D 53 53 50 00 02 00 00 00 08 00 08 00 38 TLMSSP.........8
00 00 00 15 C2 89 E2 B0 3B BE 20 45 33 FD 92 80 ....Ââ°;¾ E3ý
04 E7 01 00 00 00 00 72 00 72 00 40 00 00 00 06 .ç.....r.r.#....
01 B1 1D 00 00 00 0F 42 00 4C 00 41 00 48 00 02 .±.....B.L.A.H..
00 08 00 42 00 4C 00 41 00 48 00 01 00 0A 00 50 ...B.L.A.H.....P
00 41 00 43 00 45 00 4D 00 04 00 10 00 62 00 6C .A.C.E.M.....b.l
00 61 00 68 00 2E 00 63 00 6F 00 6D 00 03 00 1C .a.h...c.o.m....
00 50 00 61 00 63 00 65 00 6D 00 2E 00 62 00 6C .P.a.c.e.m...b.l
00 61 00 68 00 2E 00 63 00 6F 00 6D 00 05 00 10 .a.h...c.o.m....
00 62 00 6C 00 61 00 68 00 2E 00 63 00 6F 00 6D .b.l.a.h...c.o.m
00 07 00 08 00 5D B3 C5 9A 0F 17 D1 01 00 00 00 .....]³Å..Ñ....
00 .
5. Client sends an Authorization header:
A1 77 30 75 A0 03 0A 01 01 A2 5A 04 58 4E 54 4C ¡w0u ....¢Z.XNTL
4D 53 53 50 00 03 00 00 00 00 00 00 00 58 00 00 MSSP.........X..
00 00 00 00 00 58 00 00 00 00 00 00 00 58 00 00 .....X.......X..
00 00 00 00 00 58 00 00 00 00 00 00 00 58 00 00 .....X.......X..
00 00 00 00 00 58 00 00 00 15 C2 88 E2 06 01 B1 .....X....Ââ..±
1D 00 00 00 0F C0 BD 0C 5B F5 F9 35 FE 78 6D 08 .....À½.[õù5þxm.
BF 7B D9 CC E3 A3 12 04 10 01 00 00 00 F5 17 A7 ¿{ÙÌã£.......õ.§
50 2D 22 9A 84 00 00 00 00 P-"....
6. Server responds with HTTP 200 and a negotiate header:
A1 1B 30 19 A0 03 0A 01 00 A3 12 04 10 01 00 00 ¡.0. ....£......
00 43 87 E0 88 C1 36 E3 A9 00 00 00 00 .CàÁ6ã©....
Now if I run the application as a service I get almost identical responses, but AcceptSecurityContext will fail on step #6 and return the SEC_E_INVALID_HANDLE error. I wonder why it would fail if I run the same application and specify the same user as the service logon identity? Could it be somehow related to session 0 isolation? Also, is there a way to troubleshoot this better? I don't see any error messages in the event viewer, and the invalid handle error doesn't say much about what is missing.
Here is the server code to authenticate:
public static WinAuthResult Authenticate(string clientId, byte[] clientTokenBytes, string securityPackage, ILogger logger)
{
    if (clientTokenBytes == null || clientTokenBytes.Length == 0)
    {
        ClearContext(clientId);
        throw new Win32Exception(Secur32.SEC_E_INVALID_TOKEN);
    }

    var serverCredExpiry = new Secur32.SECURITY_INTEGER();
    var serverCredHandle = new Secur32.SecHandle();
    var acquireResult = Secur32.AcquireCredentialsHandle(null, securityPackage, Secur32.SECPKG_CRED_INBOUND, IntPtr.Zero, IntPtr.Zero, 0, IntPtr.Zero, out serverCredHandle, out serverCredExpiry);
    if (acquireResult != Secur32.SEC_E_OK)
        throw new Win32Exception(acquireResult);

    var oldContextExists = contexts.ContainsKey(clientId);
    var oldContextHandle = GetContextHandle(clientId);
    var newContextHandle = new Secur32.SecHandle();
    var clientToken = new Secur32.SecBufferDesc(clientTokenBytes);
    var outputToken = new Secur32.SecBufferDesc(61440);
    var contextAttributes = (uint)0;
    var outputCresExpiry = new Secur32.SECURITY_INTEGER();

    int acceptResult;
    if (!oldContextExists)
    {
        acceptResult = Secur32.AcceptSecurityContext(
            ref serverCredHandle,
            IntPtr.Zero,
            ref clientToken,
            0,
            Secur32.SECURITY_NATIVE_DREP,
            ref newContextHandle,
            ref outputToken,
            out contextAttributes,
            out outputCresExpiry);
    }
    else
    {
        acceptResult = Secur32.AcceptSecurityContext(
            ref serverCredHandle,
            ref oldContextHandle,
            ref clientToken,
            0,
            Secur32.SECURITY_NATIVE_DREP,
            ref newContextHandle,
            ref outputToken,
            out contextAttributes,
            out outputCresExpiry);
    }

    if (acceptResult == Secur32.SEC_E_OK)
    {
        ClearContext(clientId);
        return new WinAuthResult(false, outputToken.GetSecBufferByteArray());
    }
    else if (acceptResult == Secur32.SEC_I_CONTINUE_NEEDED)
    {
        contexts[clientId] = newContextHandle;
        return new WinAuthResult(true, outputToken.GetSecBufferByteArray());
    }
    else
    {
        ClearContext(clientId);
        throw new Win32Exception(acceptResult);
    }
}
In both cases I'm trying to access the web page from the same machine where the server is running, and with the same domain user. Also, I'm using the same domain user to run the console app and the Windows service. The issue is not reproducible on Windows Server 2003, which makes me think it is related to newer security features.
It's been a while, but I think I saw similar issues when Load User Profile was set to false in the Advanced Settings of the Application Pool. That setting is new to IIS 7.0. The documentation does say that the false value corresponds to the Windows Server 2003 behavior, but I recall that not loading the profile interferes somehow with the SSPI subsystem. And you're right, there is precious little error reporting from there; I had to jump through some hoops to tease the answer out of it.
UPDATE
Those functions ultimately rely on Kerberos client implementation of which a large part resides in the lsass.exe process. Here's a good link regarding troubleshooting the whole subsystem: http://blogs.msdn.com/b/canberrapfe/archive/2012/01/02/kerberos-troubleshooting.aspx
Also, I remember that a client once had issues around authentication that we ultimately traced down to a protocol mismatch between a client running on Server 2008 (or something like that; the important fact is the version was higher than 2003) connecting to a secondary domain controller running on Server 2003. We didn't trace it further; the client just upgraded the DC.
FINAL UPDATE
OK, I was able to reproduce the problem and I was actually able to make it work. Your Authenticate(string clientId, byte[] clientTokenBytes, string securityPackage, ILogger logger) method gets called at least twice due to the first call to AcceptSecurityContext returning SEC_I_CONTINUE_NEEDED. Every time, Authenticate gets a new credentials handle by calling the AcquireCredentialsHandle function. That worked for me in the console and inside a service running as LocalSystem, but not if the service was running under a domain account, just like you said.
So, I pulled the AcquireCredentialsHandle call out of Authenticate so that I could obtain the handle once and then reuse it for subsequent incoming calls. That fixed the service for me.
On a related note, you should free the credentials handle using the FreeCredentialsHandle call, otherwise you could get a memory leak in lsass.exe which would require you to restart the server. See the Remarks section in the MSDN description of AcquireCredentialsHandle.
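For illustration, the shape of that fix, reusing the Secur32 wrappers from the question - the lazy-init pattern and names here are just a sketch, and the FreeCredentialsHandle wrapper is assumed to exist alongside the others:

// Sketch: acquire the server credentials handle once, reuse it for every
// Authenticate call, and free it when the server shuts down.
private static Secur32.SecHandle serverCredHandle;
private static bool credHandleAcquired;

private static Secur32.SecHandle GetServerCredHandle(string securityPackage)
{
    if (!credHandleAcquired)
    {
        var expiry = new Secur32.SECURITY_INTEGER();
        int result = Secur32.AcquireCredentialsHandle(null, securityPackage,
            Secur32.SECPKG_CRED_INBOUND, IntPtr.Zero, IntPtr.Zero, 0, IntPtr.Zero,
            out serverCredHandle, out expiry);
        if (result != Secur32.SEC_E_OK)
            throw new Win32Exception(result);
        credHandleAcquired = true;
    }
    return serverCredHandle;
}

// On server shutdown (assumed wrapper), to avoid leaking the handle in lsass.exe:
// Secur32.FreeCredentialsHandle(ref serverCredHandle);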
