Converting VB.NET code to C#

I have the following VB.NET code I am trying to convert to C#.
Dim decryptedBytes(CInt(encryptedStream.Length - 1)) As Byte
I tried this:
int tempData = Convert.ToInt32(encryptedStream.Length - 1);
Byte decryptedBytes;
decryptedBytes = decryptedBytes[tempData];
But got this error message:
Cannot apply indexing with [] to an expression of type byte.
Please note that the VB.NET code works.

Using the SharpDevelop code converter, the output for your VB code is:
byte[] decryptedBytes = new byte[Convert.ToInt32(encryptedStream.Length - 1) + 1];
Note that VB specifies the upper bound of the array whereas C# specifies the length, so the converter added the "+ 1".
I would simplify that to:
byte[] decryptedBytes = new byte[(int)encryptedStream.Length];
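To see why the converter added the "+ 1": a VB array declaration takes the highest valid index, while a C# array creation takes the element count. A quick illustration (the comments show the VB side for comparison):
// VB:  Dim bytes(9) As Byte   declares indices 0..9, i.e. 10 elements.
// C#:  the number in brackets is the element count, so the equivalent is:
byte[] bytes = new byte[10];
Console.WriteLine(bytes.Length); // prints 10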

byte[] decryptedBytes = new byte[(Int32)encryptedStream.Length];
By the way, if you have further problems, try this converter:
http://www.developerfusion.com/tools/convert/vb-to-csharp/


Encode unicode string as byte array C++ and C#

I have C++ code which I want to rewrite in C#. This part:
case ID_TYPE_UNICODE_STRING:
    if (items[i].GetUString().length() > 0xFFFF)
        throw dppError("error");
    // GetUString returns a std::wstring object
    DataSize = (WORD)(sizeof(WCHAR) * items[i].GetUString().length());
    blob.AppendData((const BYTE *)&DataSize, sizeof(WORD)); // blob is a byte array
    blob.AppendData((const BYTE *)items[i].GetUString().c_str(), DataSize);
    break;
It basically serializes the length in bytes of a Unicode string, followed by the string itself, to a byte array.
Here comes my problem (this code then sends the data to a server): I don't know which encoding is used in the lines above (UTF-16, UTF-8, etc.), so I don't know the best way to reimplement them in C#.
How can I find out what encoding is used in this C++ project?
And if I can't find out, then given that the endianness is the same as stated in the accepted answer of this question, do you think the two methods (GetBytes and GetString) from that answer will work for me (for serializing the Unicode string as in the C++ project and retrieving it back)? e.g.
these two:
static byte[] GetBytes(string str)
{
    byte[] bytes = new byte[str.Length * sizeof(char)];
    System.Buffer.BlockCopy(str.ToCharArray(), 0, bytes, 0, bytes.Length);
    return bytes;
}

static string GetString(byte[] bytes)
{
    char[] chars = new char[bytes.Length / sizeof(char)];
    System.Buffer.BlockCopy(bytes, 0, chars, 0, bytes.Length);
    return new string(chars);
}
Or am I better off learning which encoding the C++ project uses?
I will then need to reconstruct the string from the byte array in the same way, too. And if I am better off learning which encoding was used in the C++, how do I get the length of the string in bytes in C#, using System.Text.Encoding.WhateverEncodingWasUsedInCpp.GetByteCount(string); ??
PS. Do you think the C++ code works in an encoding-agnostic way? If so, how can I repeat that in C#?
UPDATE: I am guessing the encoding used is UTF-16, because I saw it mentioned in several variable names, so I will assume UTF-16 and look for alternatives if something doesn't work out during testing. In that case, what is the best way to get the number of bytes of a UTF-16 string? Is the following method OK: System.Text.Encoding.Unicode.GetByteCount(string); ??
Feedback and comments welcome. Am I wrong somewhere in my reasoning? Thanks.
Change the method like this to get the byte[] equivalent of the input string:
static byte[] GetBytes(string str)
{
    UnicodeEncoding uEncoding = new UnicodeEncoding();
    byte[] stringContentBytes = uEncoding.GetBytes(str);
    return stringContentBytes;
}
For the reverse:
static string GetString(byte[] bytes)
{
    UnicodeEncoding uEncoding = new UnicodeEncoding();
    string stringContent = uEncoding.GetString(bytes);
    return stringContent;
}
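UnicodeEncoding is .NET's UTF-16LE encoding (also available as Encoding.Unicode), and its GetByteCount answers the "length in bytes" part of the question directly. If the C++ side really is UTF-16, as the variable names suggest (std::wstring holds UTF-16 on Windows), a hedged sketch of the same wire format, a 16-bit length prefix followed by the string bytes, could look like this; the blob layout mirrors the posted C++ sample but has not been verified against that project:
using System;
using System.IO;
using System.Text;

static byte[] SerializeUtf16(string str)
{
    // Length in bytes of the UTF-16LE representation.
    int byteCount = Encoding.Unicode.GetByteCount(str);
    if (byteCount > 0xFFFF)
        throw new ArgumentException("Too long for a 16-bit length prefix.");

    using (var ms = new MemoryStream())
    using (var writer = new BinaryWriter(ms))
    {
        writer.Write((ushort)byteCount);              // the WORD DataSize in the C++ code
        writer.Write(Encoding.Unicode.GetBytes(str)); // the string bytes
        return ms.ToArray();
    }
}

static string DeserializeUtf16(byte[] blob)
{
    // Read the 16-bit length prefix, then decode that many bytes.
    ushort byteCount = BitConverter.ToUInt16(blob, 0);
    return Encoding.Unicode.GetString(blob, 2, byteCount);
}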

C# byte array conversion to VB.NET

As per my last question I'm borrowing some code from the Opus project to integrate into VB.NET software.
Consider
byte[] buff = _encoder.Encode(segment, segment.Length, out len);
which I've translated to:
Dim buff(wavEnc.Encode(segment, segment.Length, len)) As Byte
It is throwing a:
Value of type '1-dimensional array of Byte' cannot be converted to 'Integer' error...
How can I fix this problem?
Try this:
Dim buff = wavEnc.Encode(segment, segment.Length, len)
Of course, you can do a direct translation of the C#:
Dim buff As Byte() = wavEnc.Encode(segment, segment.Length, len)
No need for a type at all - let the compiler figure it out.
_encoder.Encode() is the right-hand side of an assignment. The left-hand side is a byte array.
The way you are using it in your VB sample is as an array dimensioner: an Integer.
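For comparison, type inference does the same job on the C# side (a trivial illustration using the names from the question):
// Let the C# compiler infer byte[] from Encode's return type,
// just as Dim without a type does in VB.
var buff = _encoder.Encode(segment, segment.Length, out len);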

What is the .NET Equivalent of Java's SecretKeySpec class?

I was provided the following code sample in Java and I'm having trouble converting it to C#. How would I go about converting this so it'll work in .NET 4.5?
public static String constructOTP(final Long counter, final String key)
    throws NoSuchAlgorithmException, DecoderException, InvalidKeyException
{
    // setup the HMAC algorithm, setting the key to use
    final Mac mac = Mac.getInstance("HmacSHA512");

    // convert the key from a hex string to a byte array
    final byte[] binaryKey = Hex.decodeHex(key.toCharArray());

    // initialize the HMAC with a key spec created from the key
    mac.init(new SecretKeySpec(binaryKey, "HmacSHA512"));

    // compute the OTP using the bytes of the counter
    byte[] computedOtp = mac.doFinal(
        ByteBuffer.allocate(8).putLong(counter).array());

    //
    // increment the counter and store the new value
    //

    // return the value as a hex encoded string
    return new String(Hex.encodeHex(computedOtp));
}
Here is the C# code that I've come up with, thanks to Duncan pointing out the HMACSHA512 class, but I'm unable to verify that the results match without installing Java, which I can't do on this machine. Does this code match the Java above?
public string ConstructOTP(long counter, string key)
{
    var mac = new HMACSHA512(ConvertHexStringToByteArray(key));
    var buffer = BitConverter.GetBytes(counter);

    Array.Resize(ref buffer, 8);

    var computedOtp = mac.ComputeHash(buffer);

    var hex = new StringBuilder(computedOtp.Length * 2);
    foreach (var b in computedOtp)
        hex.AppendFormat("{0:x2}", b);

    return hex.ToString();
}
A SecretKeySpec is used to convert binary input into something that is recognised by Java security providers as a key. It does little more than decorate the bytes with a little post-it note saying "Pssst, it's an HmacSHA512 key...".
You can basically ignore it as a Java-ism. For your .NET code, you just need to find a way of declaring what the HMAC key is. Looking at the HMACSHA512 class, this seems quite straightforward: there is a constructor that takes a byte array containing your key value.
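For what it's worth, here is a hedged sketch of the whole method in .NET, with the hex decoding folded in so it is self-contained. One detail worth checking when comparing against the Java output: ByteBuffer.putLong writes the counter big-endian, while BitConverter.GetBytes is little-endian on x86, so the buffer needs reversing for the two hashes to match. This is a sketch, not verified against the Java code:
using System;
using System.Security.Cryptography;

static string ConstructOtp(long counter, string hexKey)
{
    // Decode the hex key, the equivalent of Hex.decodeHex in the Java sample.
    var key = new byte[hexKey.Length / 2];
    for (int i = 0; i < key.Length; i++)
        key[i] = Convert.ToByte(hexKey.Substring(i * 2, 2), 16);

    // Java's ByteBuffer writes the long big-endian; BitConverter is
    // little-endian on x86, so reverse the bytes to match.
    byte[] buffer = BitConverter.GetBytes(counter);
    if (BitConverter.IsLittleEndian)
        Array.Reverse(buffer);

    using (var mac = new HMACSHA512(key))
    {
        byte[] computedOtp = mac.ComputeHash(buffer);
        // Hex-encode lowercase, matching Java's Hex.encodeHex.
        return BitConverter.ToString(computedOtp).Replace("-", "").ToLowerInvariant();
    }
}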

dbml changes the type of varbinary(max) on SQL Server to System.Data.Linq.Binary but not byte[]

I have some tables in SQL Server with VarBinary(MAX) columns to which I want to upload files. I need the DBML to give the field the type byte[], but instead I get System.Data.Linq.Binary.
Why is that, and how do I make the default byte[]? Thanks.
I read the file in my MVC 3 controller action like this (resourceFile is an HttpPostedFileBase and newFile holds the byte[]):
newFile.FileContent = new byte[resourceFile.ContentLength];
resourceFile.InputStream.Read(newFile.FileContent, 0, resourceFile.ContentLength);
The System.Data.Linq.Binary type is a wrapper for a byte array that makes the wrapped byte array immutable. See the MSDN page.
Sample code to illustrate immutability:
byte[] goesIn = new byte[] { 0xff };
byte[] comesOut1 = null;
byte[] comesOut2 = null;
byte[] comesOut3 = null;
System.Data.Linq.Binary theBinary = goesIn;
comesOut1 = theBinary.ToArray();
comesOut1[0] = 0xfe;
comesOut2 = theBinary.ToArray();
theBinary = comesOut1;
comesOut3 = theBinary.ToArray();
The immutability can be seen after changing the value of the first byte in comesOut1. The byte[] wrapped in theBinary is not changed. Only after you assign the whole byte[] to theBinary does it change.
Anyway, for your purpose you can use the Binary field. To assign a new value to it, do as Dennis wrote in his answer. To get the byte array out of it, use the .ToArray() method.
You can use Binary to store byte[] as well, because Binary has implicit conversion from byte[]:
myEntity.MyBinaryProperty = File.ReadAllBytes("foo.txt");
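Reading the value back out is the mirror image; remember from the immutability demo above that ToArray() returns a copy:
// MyBinaryProperty is assumed to be a System.Data.Linq.Binary
// property generated by the DBML designer.
byte[] copy = myEntity.MyBinaryProperty.ToArray();
copy[0] = 0xff; // modifies the copy only, not the stored value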

C# Convert to Byte Array

How can I convert this VB.NET code into C#? (ByteToImage is a user-defined function used to convert a byte array into a Bitmap.)
Dim Bytes() As Byte = CType(SQLreader("ImageList"), Byte())
picStudent.Image = jwImage.ByteToImage(Bytes)
I tried
byte[] Bytes = Convert.ToByte(SQLreader("ImageList")); // Error Here
picStudent.Image = jwImage.ByteToImage(Bytes);
but it generates an error saying: Cannot implicitly convert type 'byte' to 'byte[]'
What I am doing is basically converting an image from the database to a byte array and displaying it in the PictureBox.
byte[] Bytes = (byte[]) SQLreader("ImageList");
picStudent.Image = jwImage.ByteToImage(Bytes);
Try this
byte[] Bytes = (byte[])SQLreader("ImageList");
Hope this helps
The problem is you have an array of bytes (byte[] in C# and Byte() in VB.Net) but the Convert.ToByte call just returns a simple byte. To make this work you need to cast the return of SQLreader to byte[].
There is no perfect analogous construct for CType in C# but a simple cast here should do the trick
byte[] Bytes = (byte[])SQLreader("ImageList");
CType is the equivalent of a type cast, not an actual conversion. Besides, Convert.ToByte tries to convert its input to a single byte, not an array. The equivalent code is
byte[] bytes = (byte[])SQLreader("ImageList");
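If the column can contain NULL, a straight cast will throw an InvalidCastException on DBNull.Value, so a guarded variant may be safer (SQLreader is the asker's data-access helper, assumed here to return object):
object value = SQLreader("ImageList");
byte[] bytes = value == DBNull.Value ? null : (byte[])value;
if (bytes != null)
    picStudent.Image = jwImage.ByteToImage(bytes);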
