Why is this encrypted message damaged? - C#

I asked this question over on the Security site, and people there suggested I post it here.
Some background: we have proprietary devices which run C on a proprietary OS, and other devices which run a C# DLL on Windows.
Both contact our server via a TCP connection; to our server, both types of request look the same.
The TCP server forwards part of the request to a self-hosted WCF service, over HTTP binding.
The requests are encrypted as shown in the link (i.e. the way the C# DLL encrypts them).
I am in the process of trying to cut out the TCP server and send requests straight to the WCF service.
My problem is that the WCF service seems to receive the request string wrong and can't decrypt it.
It looks like there are additional \t and \n characters in the string the server receives; other than that it looks the same.
This is the decryption code on the server side:
byte[] byteChiperText = Encoding.Default.GetBytes(input);
if (k.Length != 16)
{
    throw new Exception("Wrong key size exception");
}
TripleDESCryptoServiceProvider des = new TripleDESCryptoServiceProvider();
des.Mode = CipherMode.ECB;
des.Padding = PaddingMode.Zeros;
des.Key = k;
ICryptoTransform ic = des.CreateDecryptor();
MemoryStream ms = new MemoryStream(byteChiperText);
CryptoStream cStream = new CryptoStream(ms, ic, CryptoStreamMode.Read);
StreamReader sReader = new StreamReader(cStream);
byte[] data = new byte[byteChiperText.Length];
int len = sReader.BaseStream.Read(data, 0, data.Length);
output = Encoding.Default.GetString(data, 0, len);
cStream.Close();

Well this looks broken to start with:
byte[] byteChiperText = Encoding.Default.GetBytes(input);
You're treating encrypted data as if it's text encoded with the platform default encoding. That's a great way to lose data. Encrypted data isn't text. It's arbitrary binary data, and should be treated as such.
Instead, you should use base64 to encode the encrypted data as text (Convert.ToBase64String) and then reverse that (Convert.FromBase64String) later on to get back to the original cypher-text. That's assuming you need it in text form to start with, of course. If you can pass it as a byte[] in the first place, that would be even better.
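As a quick illustration of the round trip (a minimal sketch; the variable names here are mine):
// Base64 survives any text channel, unlike raw cipher bytes pushed
// through Encoding.Default. cipherBytes is whatever the encryptor produced.
string wireText = Convert.ToBase64String(cipherBytes);      // sending side
byte[] byteCipherText = Convert.FromBase64String(wireText); // receiving side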
Also note that your approach to getting the text out is somewhat odd - you're creating a StreamReader, then only using the base stream. It would be better to use:
// You should be using "using" statements for all your streams, by the way...
using (TextReader reader = new StreamReader(cStream))
{
output = reader.ReadToEnd();
}
Note that this will use UTF-8 rather than the platform default encoding - but that's a good thing, so long as you make the corresponding change in the encryption code. Using the platform default encoding is almost always a mistake - it may well not support all of Unicode, and it varies from machine to machine.

The problem could be in Encoding.Default, since:
Different computers can use different encodings as the default.
You should use a specific standard encoding (UTF-8, UTF-16, ...) instead.

Related

Do I understand correctly how to encrypt TCP traffic via RSA + AES?

I am working on a project where I need to implement encryption for sending quite large amounts of data over a TCP connection, and I wish to write it in C#. The project is also part of my learning; by implementing such encryption I'll gain experience.
I've done some research, tested a few things out, and written up below how I think I should go about implementing it.
Basically, I'd like to ask if someone could please check what I've written below and warn me about any mistakes, security holes or incorrect statements I made. Pointing those out and explaining what I should do to rectify the problems would be greatly appreciated.
We have host A and host B connected via TCP.
Host A creates an instance of System.Security.Cryptography.RSA via RSA.Create(2048). In doing so, the RSA instance creates random public and private keys. The private key can decrypt data that was encrypted with the public key. The public key can be derived from the private key, but the reverse is impossible.
A calls rsa.ExportParameters(includePrivateParameters: false); to export the public key, and uses a serializer like XmlSerializer to serialize the RSAParameters object into bytes. A can then send the serialized object over TCP, unencrypted, to B.
B receives and deserializes the RSAParameters object and uses it to create its own RSA instance: RSA.Create(deserializedRSApar);. B's RSA object now has A's public key and can therefore encrypt data so that no one but A can decrypt it, because only A has the private key.
B creates an instance of System.Security.Cryptography.Aes in CBC mode with a 256-bit key and a 16-byte IV.
B encrypts the AES key using the RSA object with the imported public key and sends it back to A.
A receives the encrypted data, decrypts it using its private key, creates its own instance of System.Security.Cryptography.Aes and sets the key it decrypted. The RSA objects can now be disposed, because both hosts have the same AES key.
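A minimal single-process sketch of the key exchange I have in mind (the OAEP padding choice here is arbitrary, the TCP transport is omitted, and RSA.Create(RSAParameters) assumes a recent .NET):
using System;
using System.Security.Cryptography;

class KeyExchangeSketch
{
    static void Main()
    {
        // Host A: generate an RSA key pair and export only the public half.
        using (RSA rsaA = RSA.Create(2048))
        {
            RSAParameters publicKey = rsaA.ExportParameters(includePrivateParameters: false);

            // Host B: import A's public key, generate an AES key, wrap it.
            byte[] wrappedKey, aesKeyB;
            using (RSA rsaB = RSA.Create(publicKey))
            using (Aes aesB = Aes.Create())
            {
                aesB.KeySize = 256;
                aesKeyB = aesB.Key;
                wrappedKey = rsaB.Encrypt(aesKeyB, RSAEncryptionPadding.OaepSHA256);
            }

            // Host A: unwrap with the private key; both sides now share the AES key.
            byte[] aesKeyA = rsaA.Decrypt(wrappedKey, RSAEncryptionPadding.OaepSHA256);
            Console.WriteLine(Convert.ToBase64String(aesKeyA) == Convert.ToBase64String(aesKeyB)); // True
        }
    }
}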
Further communication will be encrypted with AES in separate messages; the hosts need to use a different, unpredictable IV for each message, and the IV will be prepended in plain bytes before the encrypted data. I will also put the length of the given message after the IV and before the encrypted data. I've written it to use the length of the data before encryption, because I've heard that AES does not change the data length when encrypting.
A part of the method used for sending should look something like this:
aes.GenerateIV();
using (var writer = new BinaryWriter(networkStream))
using (var encryptor = aes.CreateEncryptor())
using (var cryptoWriter = new CryptoStream(networkStream, encryptor, CryptoStreamMode.Write))
{
    writer.Write(aes.IV);
    writer.Write((int)bytesToSend.Length);
    cryptoWriter.Write(bytesToSend, 0, bytesToSend.Length);
}
And a part of the method for receiving should look something like this:
using (var reader = new BinaryReader(networkStream))
using (var decryptor = aes.CreateDecryptor(aes.Key, reader.ReadBytes(aes.IV.Length)))
using (var cryptoReader = new CryptoStream(networkStream, decryptor, CryptoStreamMode.Read))
{
    int toRead = reader.ReadInt32();
    if (toRead > bufferArray.Length)
        throw new Exception("Throw an exception or handle in another way, unimportant for now");
    cryptoReader.Read(bufferArray, 0, toRead);
}
I know I should be creating those BinaryReaders and BinaryWriters outside the scope of the method; I wrote the usings here (as well as that needless (int) cast) to show my intent better than "there is a BinaryReader in the class this snippet of a method is in".
Another question I'd like to ask: does it matter how long the messages are? Is there a limit to how big the messages should be? I don't intend to send messages bigger than 32 kB.

Why is my DeflateStream not receiving data correctly over TCP?

I have a TcpClient set up as client and server on my local machine, and I have been using the NetworkStream to communicate back and forth between the two successfully.
Moving forward, I am trying to implement compression in the communications. I've tried GZipStream and DeflateStream, and have decided to focus on DeflateStream. However, the connection now hangs without reading data.
I have tried four different implementations that all failed because the server side did not read the incoming data and the connection timed out. I will focus on the two implementations I have tried most recently, which to my knowledge should work.
The client boils down to this request. There are two separate implementations, one with a StreamWriter and one without:
textToSend = ENQUIRY + START_OF_TEXT + textToSend + END_OF_TEXT;
// Send XML Request
byte[] request = Encoding.UTF8.GetBytes(textToSend);
using (DeflateStream streamOut = new DeflateStream(netStream, CompressionMode.Compress, true))
{
    //using (StreamWriter sw = new StreamWriter(streamOut))
    //{
    //    sw.Write(textToSend);
    //    sw.Flush();
    streamOut.Write(request, 0, request.Length);
    streamOut.Flush();
    //}
}
The server receives the request and I do:
1) a quick read of the first character, then, if it matches what I expect,
2) I continue reading the rest.
The first read works correctly, and if I want to read the whole stream it is all there. However, I only want to read the first character, evaluate it, and then continue in my LongReadStream method.
When I try to continue reading the stream there is no data to be read. I am guessing that the data is being lost during the first read, but I'm not sure how to determine that. All of this code works correctly when I use the plain NetworkStream.
Here is the server-side code.
private void ProcessRequests()
{
    // This method reads the first byte of data correctly and if I want to
    // I can read the entire request here. However, I want to leave
    // all that data until I want it below in my LongReadStream method.
    if (QuickReadStream(_netStream, receiveBuffer, 1) != ENQUIRY)
    {
        // Invalid Request, close connection
        clientIsFinished = true;
        _client.Client.Disconnect(true);
        _client.Close();
        return;
    }

    while (!clientIsFinished) // Keep reading text until client sends END_TRANSMISSION
    {
        // Inside this method there is no data and the connection times out waiting for data
        receiveText = LongReadStream(_netStream, _client);

        // Continue talking with Client...
    }

    _client.Client.Shutdown(SocketShutdown.Both);
    _client.Client.Disconnect(true);
    _client.Close();
}
private string LongReadStream(NetworkStream stream, TcpClient c)
{
    bool foundEOT = false;
    StringBuilder sbFullText = new StringBuilder();
    int readLength, totalBytesRead = 0;
    string currentReadText;
    c.ReceiveBufferSize = DEFAULT_BUFFERSIZE * 100;
    byte[] bigReadBuffer = new byte[c.ReceiveBufferSize];
    while (!foundEOT)
    {
        using (var decompressStream = new DeflateStream(stream, CompressionMode.Decompress, true))
        {
            //using (StreamReader sr = new StreamReader(decompressStream))
            //{
            //    currentReadText = sr.ReadToEnd();
            //}
            readLength = decompressStream.Read(bigReadBuffer, 0, c.ReceiveBufferSize);
            currentReadText = Encoding.UTF8.GetString(bigReadBuffer, 0, readLength);
            totalBytesRead += readLength;
        }
        sbFullText.Append(currentReadText);
        if (currentReadText.EndsWith(END_OF_TEXT))
        {
            foundEOT = true;
            sbFullText.Length = sbFullText.Length - 1;
        }
        // Validate data code removed for simplicity
    }
    c.ReceiveBufferSize = DEFAULT_BUFFERSIZE;
    c.ReceiveTimeout = timeOutMilliseconds;
    return sbFullText.ToString();
}
private string QuickReadStream(NetworkStream stream, byte[] receiveBuffer, int receiveBufferSize)
{
    using (DeflateStream zippy = new DeflateStream(stream, CompressionMode.Decompress, true))
    {
        int bytesIn = zippy.Read(receiveBuffer, 0, receiveBufferSize);
        var returnValue = Encoding.UTF8.GetString(receiveBuffer, 0, bytesIn);
        return returnValue;
    }
}
EDIT
NetworkStream has an underlying Socket property, which has an Available property. MSDN says this about the Available property:
Gets the amount of data that has been received from the network and is available to be read.
Before the call below, Available is 77. After reading 1 byte, the value is 0.
//receiveBufferSize = 1
int bytesIn = zippy.Read(receiveBuffer, 0, receiveBufferSize);
There doesn't seem to be any documentation about DeflateStream consuming the whole underlying stream, and I don't know why it would do such a thing when there are explicit calls to read specific numbers of bytes.
Does anyone know why this happens, or whether there is a way to preserve the underlying data for a future read? Based on this 'feature', and a previous article I read stating that a DeflateStream must be closed to finish sending (Flush won't work), it seems DeflateStreams may be of limited use for networking, especially if one wishes to counter DoS attacks by inspecting incoming data before accepting a full stream.
The basic flaw I can see in your code is a likely misunderstanding of how the network stream and compression interact.
I think your code might work if you kept working with one DeflateStream. However, you use one in your quick read and then create another one.
Let me explain my reasoning with an example. Assume you have 8 bytes of original data to be sent over the network in compressed form, and assume, for the sake of argument, that every byte (8 bits) of original data compresses down to 6 bits. Now let's see what your code does with this.
From the network stream, you can't read less than 1 byte at a time. You can take 1 byte, 2 bytes, or any number of bytes, but not individual bits.
So if you want to receive just the first byte of the original data, you have to read a whole byte of compressed data. But only 6 bits of that compressed byte represent the first byte of uncompressed data; the last 2 bits belong to the second byte of the original data.
If you cut the stream there, what is left is 5 bytes in the network stream that make no sense on their own and can't be decompressed.
The deflate algorithm is more complex than this, of course, so it makes perfect sense that it does not allow you to stop reading from the NetworkStream at one point and continue with a new DeflateStream from the middle. There is decompression context that must be present in order to restore the data to its original form. Once you dispose of the first DeflateStream in your quick read, that context is gone and you can't continue.
So, to resolve your issue, create only one DeflateStream, pass it to your functions, and dispose of it at the end.
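Something along these lines (a rough sketch; ENQUIRY_BYTE and Process stand in for your own constant and handling):
// One DeflateStream for the whole conversation; both the quick check and
// the long read pull from the same instance, so its internal state survives.
using (var inflate = new DeflateStream(_netStream, CompressionMode.Decompress, leaveOpen: true))
{
    var first = new byte[1];
    if (inflate.Read(first, 0, 1) != 1 || first[0] != ENQUIRY_BYTE)
        return; // invalid request, close the connection as before

    var buffer = new byte[4096];
    int read;
    while ((read = inflate.Read(buffer, 0, buffer.Length)) > 0)
        Process(buffer, read); // accumulate / scan for END_OF_TEXT here
}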
This is broken in many ways.
You are assuming that a Read call will return exactly the number of bytes you asked for. It might deliver everything in one-byte chunks instead.
DeflateStream has an internal buffer. It can't be any other way: input bytes do not correspond 1:1 to output bytes, so there must be some internal buffering. You must use a single such stream.
Same issue with UTF-8: UTF-8 encoded strings cannot be split at arbitrary byte boundaries, so sometimes your Unicode data will be garbled.
Don't touch ReceiveBufferSize; it does not help in any way.
You cannot reliably flush a deflate stream, I think, because the output might end at a partial byte position. You should probably devise a message-framing format in which you prepend the compressed length as an uncompressed integer, then send the compressed deflate data after the length. That can be decoded reliably.
Fixing these issues is not easy.
Since you seem to control both client and server, you should discard all of this and not devise your own network protocol. Use a higher-level mechanism such as web services, HTTP or protobuf. Anything is better than what you have there.
Basically there are a few things wrong with the code I posted above. First, when I read data I'm not doing anything to make sure ALL of the data is read in. Per the Microsoft documentation:
The Read operation reads as much data as is available, up to the number of bytes specified by the size parameter.
In my case I was not making sure my reads got all the data I expected.
This can be accomplished simply with this code:
byte[] data = new byte[packageSize];
bytesRead = _netStream.Read(data, 0, packageSize);
// A single Read may return fewer bytes than requested, so loop until the
// whole package has arrived.
while (bytesRead < packageSize)
    bytesRead += _netStream.Read(data, bytesRead, packageSize - bytesRead);
On top of this problem, I had a fundamental issue with DeflateStream: namely, I should not use a DeflateStream to write directly to the underlying NetworkStream. The correct approach is to first use the DeflateStream to compress the data into a byte array, then send that byte array over the NetworkStream directly.
Using this approach correctly compressed the data over the network and properly read the data on the other end.
You may point out that I must know the size of the data, and that is true. Every call has an 8-byte 'header' that includes the size of the compressed data and the size of the data when uncompressed, although I think the second was ultimately not needed.
The code for this is here. Note that the variable packageSize serves two purposes: first as the count of header bytes read, then as the decoded package size.
int packageSize = streamIn.Read(sizeOfDataInBytes, 0, 4);
while (packageSize != 4)
{
    packageSize += streamIn.Read(sizeOfDataInBytes, packageSize, 4 - packageSize);
}
packageSize = BitConverter.ToInt32(sizeOfDataInBytes, 0);
With this information I can correctly use the code I showed you first to get the contents fully.
Once I have the full compressed byte array I can get the incoming data like so:
var output = new MemoryStream();
using (var stream = new MemoryStream(bufferIn))
{
    using (var decompress = new DeflateStream(stream, CompressionMode.Decompress))
    {
        decompress.CopyTo(output);
    }
}
output.Position = 0;
var unCompressedArray = output.ToArray();
output.Close();
output.Dispose();
return Encoding.UTF8.GetString(unCompressedArray);

Sending files over TCP/ .NET SSLStream is slow/not working

I'm writing a server/client application which works over SSL (via SslStream) and has to do many things (not only file receiving/sending). Currently it works like this: there is only one connection. I always send data between client and server using SslStream.WriteLine() and receive it using SslStream.ReadLine(), because that way I can send all information over one connection and can send from all threads without corrupting the data.
Now I wanted to implement file sending and receiving. Like other things in my client/server apps, every message has a prefix (like cl_files or similar) and a base64-encoded content part (prefix and content are separated by |). I implemented the file sharing like this: the uploader sends the receiver a message with the total file size, and after that the uploader sends the base64-encoded parts of the file under the prefix r.
My problem is that the file sharing is really slow; I get around 20 KB/s from localhost to localhost. I also have another problem: if I increase the size of the base64-encoded parts of the file (which makes file sharing faster), the prefix r no longer reaches the receiver (so the data can't be identified).
How can I make it faster?
Any help will be greatly appreciated.
My (probably bad) code for the client:
// it's running inside a Thread
FileInfo x = new FileInfo(ThreadInfos.Path);
long size = x.Length; // gets total size
long cursize = 0;
FileStream fs = new FileStream(ThreadInfos.Path, FileMode.Open);
int readblocks = 0;
while (cursize < size)
{
    byte[] buffer = new byte[4096];
    readblocks = fs.Read(buffer, 0, 4096);
    ServerConnector.send("r", getBase64FromBytes(buffer)); // sends the encoded data with the prefix r over SSLStream.WriteLine
    cursize = cursize + Convert.ToInt64(readblocks);
    ThreadInfos.wait.setvalue((csize / size) * 100); // outputs value to the gui
}
fs.Close();
For the Server:
case "r"://switch case for prefixes
if (isreceiving)
{
byte[] buffer = getBytesFromBase64(splited[1]);//splited ist the received Line over ReadLine splitted by the seperator "|"
rsize = rsize + buffer.LongLength;
writer.Write(buffer, 0, buffer.Length);//it writes the decoded data into the file
if (rsize == rtotalsize)//checks if file is completed
{
writer.Close();
}
}
break;
Your problem stems from the fact that you are performing what is essentially a binary operation through a text protocol, and you are exacerbating that problem by doing it over an encrypted channel. I'm not going to re-invent this for you, but here are some options:
Consider converting to an HTTPS client/server model instead of reinventing the wheel. This will give you a well-defined model for PUT/GET operations on files.
If you cannot (or will not) convert to HTTPS, consider other client/server libraries that provide a secure transport and a well-defined protocol for binary data. For example, I often use protobuf-csharp-port and protobuf-csharp-rpc to provide a secure protocol and transport within our datacenter or local network.
If you are stuck with your transport being a raw SslStream, try using a well-defined and proven binary serialization framework like protobuf-csharp-port or protobuf-net to define your protocol.
Lastly, if you must continue with the framework you have, try some HTTP-like tricks: write a name/value pair as text that defines the raw binary content that follows.
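Roughly like this (a sketch only; the header field names are mine):
using System.Net.Security;
using System.Text;

static class RawFraming
{
    // Announce the binary payload with a small text header, then send the
    // raw bytes. No base64, so the payload costs nothing extra on the wire.
    public static void SendFile(SslStream ssl, string name, byte[] content)
    {
        byte[] header = Encoding.ASCII.GetBytes(
            $"name: {name}\r\ncontent-length: {content.Length}\r\n\r\n");
        ssl.Write(header, 0, header.Length);
        ssl.Write(content, 0, content.Length);
        ssl.Flush();
    }
}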
First of all, base64 over SSL will be slow anyway; SSL itself is slower than raw transport. File transfers are not done over base64 nowadays; the HTTP protocol is much more stable than anything else, and most libraries on all platforms are very stable. Base64 also takes more space than the actual data (4 bytes on the wire for every 3 bytes of input), plus the time to encode.
Also, the following line may be a problem:
ThreadInfos.wait.setvalue((csize / size) * 100); // outputs value to the gui
If this line is blocking, it will slow you down on every 4 KB. Updating for every 4 KB is also not right; unless the progress value differs from the previous value by a significant amount, there is no need to update the UI.
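For instance (a sketch reusing the names from your loop; note that (csize / size) * 100 with long integers is integer division, which stays 0 until the transfer completes):
int lastPercent = -1;
// ... inside the read loop, after updating cursize:
int percent = (int)(cursize * 100 / size);
if (percent != lastPercent) // only touch the UI when the percentage changes
{
    lastPercent = percent;
    ThreadInfos.wait.setvalue(percent);
}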
I'd give gzip compression before/after the network a try. From my experience, it helps. Some code like this could help:
using (GZipStream stream = new GZipStream(sslStream, CompressionMode.Compress))
{
    stream.Write(...);
    stream.Flush();
    stream.Close();
}
Warning: it may interfere with SSL if the Flush is not done, and it will need some tests... and I didn't try to compile the code.
I think Akash Kava is right.
while (cursize < size)
{
    DateTime start = DateTime.Now;
    byte[] buffer = new byte[4096];
    readblocks = fs.Read(buffer, 0, 4096);
    ServerConnector.send("r", getBase64FromBytes(buffer));
    DateTime end = DateTime.Now;
    Console.WriteLine((end - start).TotalSeconds);
    cursize = cursize + Convert.ToInt64(readblocks);
    ThreadInfos.wait.setvalue((csize / size) * 100);
    end = DateTime.Now;
    Console.WriteLine((end - start).TotalSeconds);
}
By doing this you can find out where the bottleneck is.
Also, the way you're sending data packets to the server is not robust.
Is it possible to paste your implementation of
ThreadInfos.wait.setvalue((csize / size) * 100);

Encoding problem between C# TCP server and Java TCP Client

I'm facing an encoding issue for which I'm not able to find the correct solution.
I have a C# TCP server, running as a Windows service, which receives and responds with XML. The problem comes up when the output contains special characters, such as Spanish characters with accents (like á, é, í and others).
The server response is encoded as UTF-8, and the Java client reads using UTF-8, but when I print the output the character is totally different.
This problem only happens in the Java client (a C# TCP client works as expected).
Following is a snippet of the server code that shows the encoding issue:
C# Server
byte[] destBytes = System.Text.Encoding.UTF8.GetBytes("á");
try
{
    clientStream.Write(destBytes, 0, destBytes.Length);
    clientStream.Flush();
}
catch (Exception ex)
{
    LogErrorMessage("Error en SendResponseToClient: Detalle::", ex);
}
Java Client:
socket.connect(new InetSocketAddress(param.getServerIp(), param.getPort()), 20000);
InputStream sockInp = socket.getInputStream();
InputStreamReader streamReader = new InputStreamReader(sockInp, Charset.forName("UTF-8"));
sockReader = new BufferedReader(streamReader);
String tmp = null;
while ((tmp = sockReader.readLine()) != null) {
    System.out.println(tmp);
}
For this simple test, the output shown is:
ß
I did some testing, printing out the byte[] in each language. In C#, á comes out as:
195, 161
In Java, the byte[] read prints as:
-61, -95
Could this have to do with Java's byte type being signed and C#'s unsigned?
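(I suspect those are actually the same bit patterns, just printed signed vs. unsigned; a quick C# check:)
// Java's byte is signed, C#'s is unsigned; -61/-95 and 195/161 are the
// same 8-bit values viewed through the two types.
Console.WriteLine(unchecked((byte)(-61)));  // 195
Console.WriteLine(unchecked((sbyte)195));   // -61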
Any feedback is greatly appreciated.
To me this seems like an endianness problem... you can check that by reversing the bytes in Java before printing the string...
This would usually be solved by including a BOM... see http://de.wikipedia.org/wiki/Byte_Order_Mark
Are you sure that's not a Unicode character you are attempting to encode to bytes as UTF-8 data?
I found the below has a useful way of testing whether the data in that string is correct UTF-8 before you send it:
How to test an application for correct encoding (e.g. UTF-8)

Error in C# encrypt code when decrypting!

A bit more background info, as suggested:
I'm finishing off an intranet CMS web app where I have to use the product's API (ASP.NET based). Because of time constraints and issues with Windows authentication, I need another way to ensure staff do not have to log in again every time they visit the site to view personalised content. The way it works is that once a user logs in (username/password), a session ID storing a new security-context value is generated and used to display the personalised content. The API login method takes the username and password as parameters. The only way I can think of to automatically log in the next time a staff member visits the site is to store the password in an encrypted cookie, check for its existence when the site is visited, and then call the API login method using the username and the decrypted password cookie values.
Any other ideas as an alternative are welcomed.
Mo
Hi,
I'm using some code found on the web to encrypt and decrypt a password string. It encrypts fine, but when it calls the code below to decrypt the string it throws the error "Length of the data to decrypt is invalid". How can I resolve this?
Thanks in advance.
Mo
System.Text.Encoding enc = System.Text.Encoding.ASCII;
byte[] myByteArray = enc.GetBytes(_pword);
SymmetricAlgorithm sa = DES.Create();
MemoryStream msDecrypt = new MemoryStream(myByteArray);
CryptoStream csDecrypt = new CryptoStream(msDecrypt, sa.CreateDecryptor(), CryptoStreamMode.Read);
byte[] decryptedTextBytes = new Byte[myByteArray.Length];
csDecrypt.Read(decryptedTextBytes, 0, myByteArray.Length);
csDecrypt.Close();
msDecrypt.Close();
string decryptedTextString = (new UnicodeEncoding()).GetString(decryptedTextBytes);
A couple of things here...
You usually shouldn't encrypt passwords; you should hash them.
If you decide to continue down the road of encryption:
You are using the DES algorithm, which is considered insecure and flawed. I'd recommend looking at the AES algorithm instead.
Depending on how much data you are working with, the CryptoStream might be overkill.
Using the ASCII encoding can lose data that isn't ASCII, like Cyrillic letters. The recommended fix is to use something else, like UTF-8.
Here is an example:
string text = "Hello";
using (var aes = new AesManaged())
{
var bytes = System.Text.Encoding.UTF8.GetBytes(text);
byte[] encryptedBytes;
using (var encrypt = aes.CreateEncryptor())
{
encryptedBytes = encrypt.TransformFinalBlock(bytes, 0, bytes.Length);
}
byte[] decryptedBytes;
using (var decrypt = aes.CreateDecryptor())
{
decryptedBytes = decrypt.TransformFinalBlock(encryptedBytes, 0, encryptedBytes.Length);
}
var decryptedText = System.Text.Encoding.UTF8.GetString(decryptedBytes);
Console.Out.WriteLine("decryptedText = {0}", decryptedText);
}
This will use a random key every time. It is likely that you will need to encrypt some data and then decrypt it at a later time. When you create the AesManaged object, you can store the Key and IV properties. You can reuse the same Key if you'd like, but different data should always be encrypted with a different IV (initialization vector). Where you store that key is up to you; that's why hashing might be a better alternative: there is no key, and no need to worry about storing it safely.
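For instance, a minimal sketch of reusing one key with a fresh IV per message (variable names are mine):
byte[] plain = System.Text.Encoding.UTF8.GetBytes("secret");
using (var aes = new AesManaged())
{
    byte[] key = aes.Key; // persist this somewhere safe

    aes.GenerateIV();     // fresh IV for every message
    byte[] iv = aes.IV;   // not secret; store or send it next to the ciphertext

    byte[] cipher;
    using (var encrypt = aes.CreateEncryptor(key, iv))
    {
        cipher = encrypt.TransformFinalBlock(plain, 0, plain.Length);
    }

    // Later: decrypt with the stored key plus that message's IV.
    using (var decrypt = aes.CreateDecryptor(key, iv))
    {
        byte[] roundTripped = decrypt.TransformFinalBlock(cipher, 0, cipher.Length);
        // roundTripped now equals plain
    }
}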
If you want to go down the hashing route, here is a small example:
var textToHash = "hello";
using (SHA1 sha = new SHA1Managed())
{
    var bytesToHash = System.Text.Encoding.UTF8.GetBytes(textToHash);
    var hash = sha.ComputeHash(bytesToHash);
    string base64hash = Convert.ToBase64String(hash);
}
This uses the SHA1 algorithm, which should work fine for passwords, though you may want to consider SHA256.
The concept is simple: a hash produces a (mostly) unique output for an input, but the output cannot be converted back to the input; it's destructive. Whenever you want to check whether a user should be authenticated, hash the password they gave you and check it against the hash of the correct password. That way you aren't storing anything sensitive.
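The check itself is then just a comparison of hashes; something like this sketch (names are mine):
static bool IsAuthenticated(string suppliedPassword, string storedBase64Hash)
{
    using (SHA1 sha = new SHA1Managed())
    {
        var hash = sha.ComputeHash(System.Text.Encoding.UTF8.GetBytes(suppliedPassword));
        return Convert.ToBase64String(hash) == storedBase64Hash;
    }
}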
I've actually had this error before and it took me three days to figure out the solution. The issue will be the fact that the machine key you need for decryption needs to be registered on the machine itself.
Read up fully on DES encryption: it works with an application key and a machine-level key. The error you're getting is likely because the machine key is missing.
Compare the bytes used to create the _pword string (in the encryption method) to the bytes retrieved with GetBytes. You will probably notice a change in the data there.
To store the encrypted bytes, I think you should use Convert.ToBase64String and Convert.FromBase64String to turn the encrypted password to/from a string.
I also don't see the code where you set the Key and IV, so I guess you are using a different key to encrypt and to decrypt the password:
If the current Key property is null, the GenerateKey method is called to create a new random Key. If the current IV property is null, the GenerateIV method is called to create a new random IV.
DES is a block-based cipher: only certain buffer lengths are valid. If I remember correctly, the block size for DES is 64 bits, so you need to ensure that your byte array is a multiple of 8 bytes long.
(That should fix your immediate problem, but I'd echo other people's advice here: you really ought not to be using DES for any new code, and for passwords it's usually more appropriate to hash than to encrypt.)
