I have taken some code from MSDN to read emails using an IMAP client. I have changed the code a little so that I only read unseen emails.
I am writing all responses into a RichTextBox.
The problem is that the body text of the email is unreadable, while all the other text is fine.
void ReadEmail()
{
    try
    {
        // There should be no gap between the IMAP command and the \r\n.
        // Looping on ssl.ReadByte() until EOF does not work, because the server never sends EOF.
        // We also cannot simply check for \r\n, because larger responses (e.g. reading an email
        // message) contain many lines, so \r\n appears at the end of each line.
        // Setting the stream's timeout sets the underlying TCP connection's timeout; if a read
        // or write exceeds it, the underlying connection is closed.
        tcpc = new System.Net.Sockets.TcpClient("imap.gmail.com", 993);
        ssl = new System.Net.Security.SslStream(tcpc.GetStream());
        ssl.AuthenticateAsClient("imap.gmail.com");
        receiveResponse("");
        username = "charlie@gmail.com";
        password = "********";
        receiveResponse("$ LOGIN " + username + " " + password + " \r\n");
        receiveResponse("$ LIST " + "\"\"" + " \"*\"" + "\r\n");
        receiveResponse("$ SELECT INBOX\r\n");
        receiveResponse("$ UID SEARCH UNSEEN\r\n");
        MatchCollection collection = Regex.Matches(Result, @" (\d{1,4})");
        foreach (Match m in collection)
        {
            UNREAD_UID.Add(int.Parse(m.Groups[1].Value));
        }
        foreach (int x in UNREAD_UID)
        {
            receiveResponse("$ FETCH " + x + " body[header]\r\n");
            richTextBox1.Text += Environment.NewLine + "-----------------------------------------------------------------------------------------------------------------------" + Environment.NewLine;
            receiveResponse("$ FETCH " + x + " body[text]\r\n");
            richTextBox1.Text += Environment.NewLine + "###########################################################2" + Environment.NewLine;
            richTextBox1.Update();
        }
        //receiveResponse("$ STATUS INBOX (MESSAGES)\r\n");
        //int number = 1;
        receiveResponse("$ LOGOUT\r\n");
    }
    catch (Exception ex)
    {
        Console.WriteLine("error: " + ex.Message);
    }
    finally
    {
        if (sw != null)
        {
            sw.Close();
            sw.Dispose();
        }
        if (ssl != null)
        {
            ssl.Close();
            ssl.Dispose();
        }
        if (tcpc != null)
        {
            tcpc.Close();
        }
    }
}
void receiveResponse(string command)
{
    try
    {
        if (command != "")
        {
            if (tcpc.Connected)
            {
                dummy = Encoding.ASCII.GetBytes(command);
                ssl.Write(dummy, 0, dummy.Length);
            }
            else
            {
                throw new ApplicationException("TCP CONNECTION DISCONNECTED");
            }
        }
        ssl.Flush();
        buffer = new byte[5120];
        bytes = ssl.Read(buffer, 0, 5120);
        sb.Append(Encoding.ASCII.GetString(buffer));
        Result = sb.ToString();
        richTextBox1.Text += Environment.NewLine + Environment.NewLine + Environment.NewLine + Environment.NewLine + sb.ToString();
        sb = new StringBuilder();
    }
    catch (Exception ex)
    {
        throw new ApplicationException(ex.Message);
    }
}
Here is a sample of what I am getting:
108 FETCH (BODY[TEXT] {25656}
DQoNCg0KDQo8IURPQ1RZUEUgaHRtbD4NCjxodG1sPg0KPGhlYWQ+DQo8dGl0bGU+TWlj
cm9zb2Z0IHR1cm5zIChhbG1vc3QpIGFueSBjYW1lcmEgaW50byBhIEtpbmVjdDwvdGl0
bGU+DQo8bWV0YSBodHRwLWVxdWl2PSJDb250ZW50LVR5cGUiIGNvbnRlbnQ9InRleHQv
aHRtbDsgY2hhcnNldD1VVEYtOCI+DQo8bWV0YSBuYW1lPSJ2aWV3cG9ydCIgY29udGVu
dD0id2lkdGg9ZGV2aWNlLXdpZHRoIj4NCjwhLS0gRWZmaW5nIFdpbmRvd3MgOCBNYWls
IGNsaWVudC4gQWRkIC13ZWJraXQtbWluLWRldmljZS1waXhlbC1yYXRpbzogMSB0byBz
Y2FyZSBpdCBhd2F5IGZyb20gdGhpcyBiaXQuIC0tPg0KPHN0eWxlIHR5cGU9InRleHQv
Y3NzIj4NCkBtZWRpYSBzY3JlZW4gYW5kIChtYXgtZGV2aWNlLXdpZHRoOiA1MDBweCkg
YW5kICgtd2Via2l0LW1pbi1kZXZpY2UtcGl4ZWwtcmF0aW86IDEpIHsNCgkuYm9keS1j
b250YWluZXIgeyB3aWR0aDoxMDAlICFpbXBvcnRhbnQ7IH0NCgkuYmFubmVyICAgICAg
ICAgeyBkaXNwbGF5OiBub25lICFpbXBvcnRhbnQ7IH0NCgkubW9iaWxlLWJhbm5lciAg
eyBkaXNwbGF5OiBibG9jayAhaW1wb3J0YW50OyB3aWR0aDozNDBweCAhaW1wb3J0YW50
OyBiYWNrZ3JvdW5kOiB1cmwoaHR0cDovL3d3dy5jb2RlcHJvamVjdC5jb20vc2NyaXB0
L21haWxvdXRzL3RlbXBsYXRlcy9uZXdzbGV0dGVyLWluc2lkZXIucG5nKSBuby1yZXBl
YXQgdG9wIGxlZ
Kindly help me.
You need to examine the Content-Transfer-Encoding header to undo the transfer encoding (in this case, that's Base64; other alternatives are 7bit and Quoted-Printable). Or, better, download the entire message (BODY[]) and apply a MIME parser/decoder to it to get an object representation of the headers, body, and attachments.
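For illustration, the first option boils down to something like this minimal sketch, assuming the fetched body really is Base64-encoded as in the sample above; DecodeBase64Body is a hypothetical helper, not part of the original code, and extracting the encoded text from the FETCH response is up to you:
static string DecodeBase64Body(string encodedBody)
{
    // Strip the CRLF line breaks the server inserts into the Base64 stream, then decode.
    byte[] raw = Convert.FromBase64String(encodedBody.Replace("\r\n", "").Replace("\n", ""));
    // Use the charset declared in the part's Content-Type (UTF-8 is a safe guess for this sample).
    return Encoding.UTF8.GetString(raw);
}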
Max's answer above is correct, but I'm going to illustrate how to implement his suggestion using my MailKit library:
using (var client = new ImapClient ()) {
    client.Connect ("imap.gmail.com", 993, true);

    // since we're not using an OAuth2 token, remove it from the set
    // of possible authentication mechanisms to try:
    client.AuthenticationMechanisms.Remove ("XOAUTH2");
    client.Authenticate ("charlie@gmail.com", "*****");

    // SELECT the INBOX folder
    client.Inbox.Open (FolderAccess.ReadWrite);

    foreach (var uid in client.Inbox.Search (SearchQuery.NotSeen)) {
        var message = client.Inbox.GetMessage (uid);

        // at this point, 'message' is a MIME DOM that you can walk
        // over to get the particular MIME-part that you want. For
        // example, we could get a body part with a filename of
        // "test.txt" using LINQ like this:
        var attachment = message.BodyParts.OfType<MimePart> ()
            .FirstOrDefault (x => x.FileName == "test.txt");

        // decode the content to a MemoryStream:
        using (var memory = new MemoryStream ()) {
            attachment.ContentObject.DecodeTo (memory);
        }

        // since the attachment is probably a TextPart
        // (based on the file extension above), we can actually
        // use a simpler approach:
        var textPart = attachment as TextPart;

        if (textPart != null) {
            // decode the content and convert into a 'string'
            var text = textPart.Text;
        }
    }

    client.Disconnect (true);
}
Related
I've looked at Parallel.ForEach - System Out of Memory Exception regarding this issue, but not much of a solution was given there. I'm very new to using Parallel.ForEach, so I'm trying to figure out what's going on.
The Diagnostic Tools memory graph caps out at 1023 and stays there (I understand this is an x86 vs. x64 architecture restriction, but I wanted to offer the program in both formats). I also don't feel like any program should ever hit that threshold. When I compile the program for x64, I sit around 1.1-1.4 GB with MaxDegree. For the sake of testing, I am running GC.Collect() at the end of each Parallel.ForEach iteration (I understand this isn't good practice; I'm just trying to troubleshoot at this point).
Here's what I'm seeing:
Now if I try to use a Partitioner method, such as:
var checkforfinished = Parallel.ForEach<ListViewItem>(Partitioner.Create(0,lstBackupUsers.Items.Count), lstBackupUsers.Items.Cast<ListViewItem>(), opts, name =>
Then I get an error of:
"No overload for method 'ForEach' takes 4 arguments"
That's fine, I modify my Parallel.ForEach statement so it looks like this:
var checkforfinished = Parallel.ForEach(Partitioner.Create(0,lstBackupUsers.Items.Count), lstBackupUsers.Items, opts, name => (I removed my casts)
and then the ForEach call won't accept the statement, because it wants me to explicitly tell it the type of the ListView items it's addressing.
I am so confused about what to do.
Do I create a Partitioner, and if I do, how do I make my Parallel.ForEach call understand how to address the ListView's items?
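To show what I mean about the overloads, here is a stripped-down sketch with a plain List<string> standing in for my ListView items (just an illustration, not my real code; Partitioner comes from System.Collections.Concurrent and Parallel from System.Threading.Tasks):
var opts = new ParallelOptions { MaxDegreeOfParallelism = 2 };
var items = new List<string> { "a", "b", "c" };
// Overload A: iterate the source collection directly.
Parallel.ForEach(items, opts, item => Console.WriteLine(item));
// Overload B: iterate a range partitioner; the body receives an index range,
// not an item, so you index back into the collection yourself.
Parallel.ForEach(Partitioner.Create(0, items.Count), opts, range =>
{
    for (int i = range.Item1; i < range.Item2; i++)
        Console.WriteLine(items[i]);
});
Passing both the Partitioner and the collection in one call is what triggers the "No overload for method 'ForEach' takes 4 arguments" error.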
update 1
I want to try to give as many details as possible, because this is just rough. I'm sure it's easy; I'm just overcomplicating it to the nth degree.
I have my Parallel.ForEach(...) running in a BackgroundWorker function. My DoWork process is over 300 lines (I'm not an expert in C#; I'm just putting things together for a program at work.)
Here are the bullet points of its basic structure:
The user clicks a "Start backup" button, as seen in the screenshot.
The button begins a separate function that checks which method the user selected to grab the list of usernames from (LDAP or a flat text file).
That function then sends off a bgW_DoWork() request.
Inside the DoWork request, in summary:
Check preliminary state (bgW.CancellationPending, for example).
Move on to grabbing some settings from ConfigurationManager.AppSettings.
Begin the "complex" Parallel.ForEach command, which reads the ListView rows and, for each row, performs a very long list of commands to complete an operation for one user.
The entire program, especially bgW_DoWork, heavily uses Google's Drive v3 API to log in as a user and grab the files recorded by other functions that prepare the user's directory for backup (separate functions that log in as a user and record their files, file IDs, and directories/subdirectories). bgW_DoWork then performs the bulk of the actual download work and calls off to the other functions to finish moving the files after they have been downloaded.
The actual "code" I use is (and I promise it's not pretty...)
private void bgW_DoWork(object sender, DoWorkEventArgs e)
{
{
try
{
txtFile.ReadOnly = true;
btnStart.Text = "Cancel Backup";
var appSettings = ConfigurationManager.AppSettings;
string checkreplace = ConfigurationManager.AppSettings["checkreplace"];
string userfile = txtFile.Text;
int counter = 0;
int arraycount = 0;
if (bgW.CancellationPending)
{
e.Cancel = true;
stripLabel.Text = "Operation was canceled!";
}
else
{
for (int z = 0; z >= counter; z++)
{
if (bgW.CancellationPending)
{
e.Cancel = true;
stripLabel.Text = "Operation was canceled!";
break;
}
else
{
double totalresource = int.Parse(ConfigurationManager.AppSettings["multithread"]);
totalresource = (totalresource / 100);
//var opts = new ParallelOptions { MaxDegreeOfParallelism = Convert.ToInt32(Math.Ceiling((Environment.ProcessorCount * totalresource) * 1.0)) };
var opts = new ParallelOptions { MaxDegreeOfParallelism = 2 };
var part = Partitioner.Create(1, 100);
//foreach (ListViewItem name in lstBackupUsers.Items)
var checkforfinished = Parallel.ForEach(lstBackupUsers.Items.Cast<ListViewItem>(), name =>
{
try
{
string names = name.SubItems[0].Text;
lstBackupUsers.Items[arraycount].Selected = true;
lstBackupUsers.Items[arraycount].BackColor = Color.CornflowerBlue;
arraycount++;
stripLabel.Text = "";
Console.WriteLine("Selecting user: " + names.ToString());
txtLog.Text += "Selecting user: " + names.ToString() + Environment.NewLine;
txtCurrentUser.Text = names.ToString();
// Define parameters of request.
string user = names.ToString();
// Check if directory exists, create if not.
string savelocation = ConfigurationManager.AppSettings["savelocation"] + user + "\\";
if (File.Exists(savelocation + ".deltalog.tok"))
File.Delete(savelocation + ".deltalog.tok");
FileInfo testdir = new FileInfo(savelocation);
testdir.Directory.Create();
string savedStartPageToken = "";
var start = CreateService.BuildService(user).Changes.GetStartPageToken().Execute();
// This token is set by Google, it defines changes made and
// increments the token value automatically.
// The following reads the current token file (if it exists)
if (File.Exists(savelocation + ".currenttoken.tok"))
{
StreamReader curtokenfile = new StreamReader(savelocation + ".currenttoken.tok");
savedStartPageToken = curtokenfile.ReadLine().ToString();
curtokenfile.Dispose();
}
else
{
// Token record didn't exist. Create a generic file, start at "1st" token
// In reality, I have no idea what token to start at, but 1 seems to be safe.
Console.Write("Creating new token file.\n");
//txtLog.Text += ("Creating new token file.\n" + Environment.NewLine);
StreamWriter sw = new StreamWriter(savelocation + ".currenttoken.tok");
sw.Write(1);
sw.Dispose();
savedStartPageToken = "1";
}
string pageToken = savedStartPageToken;
int gtoken = int.Parse(start.StartPageTokenValue);
int mytoken = int.Parse(savedStartPageToken);
txtPrevToken.Text = pageToken.ToString();
txtCurrentToken.Text = gtoken.ToString();
if (gtoken <= 10)
{
Console.WriteLine("Nothing to save!\n");
//txtLog.Text += ("User has nothing to save!" + Environment.NewLine);
}
else
{
if (pageToken == start.StartPageTokenValue)
{
Console.WriteLine("No file changes found for " + user + "\n");
//txtLog.Text += ("No file changes found! Please wait while I tidy up." + Environment.NewLine);
}
else
{
// .deltalog.tok is where we will place our records for changed files
Console.WriteLine("Changes detected. Making notes while we go through these.");
lblProgresslbl.Text = "Scanning Drive directory.";
// Damnit Google, why did you change how the change fields work?
if (savedStartPageToken == "1")
{
statusStripLabel1.Text = "Recording folder list ...";
txtLog.Text = "Recording folder list ..." + Environment.NewLine;
exfunctions.RecordFolderList(savedStartPageToken, pageToken, user, savelocation);
statusStripLabel1.Text = "Recording new/changed files ... This may take a bit!";
txtLog.Text += Environment.NewLine + "Recording new/changed list for: " + user;
exfunctions.ChangesFileList(savedStartPageToken, pageToken, user, savelocation);
}
else
{
//proUserclass = proUser;
statusStripLabel1.Text = "Recording new/changed files ... This may take a bit!";
txtLog.Text += Environment.NewLine + "Recording new/changed list for: " + user + Environment.NewLine;
exfunctions.ChangesFileList(savedStartPageToken, pageToken, user, savelocation);
}
// Get all our files for the user. Max page size is 1k
// after that, we have to use Google's next page token
// to let us get more files.
StreamWriter logFile = new StreamWriter(savelocation + ".recent.log");
string[] deltafiles = File.ReadAllLines(savelocation + ".deltalog.tok");
int totalfiles = deltafiles.Count();
int cnttototal = 0;
Console.WriteLine("\nFiles to backup:\n");
if (deltafiles == null)
{
return;
}
else
{
double damn = ((gtoken - double.Parse(txtPrevToken.Text)));
damn = Math.Round((damn / totalfiles));
if (damn <= 0)
damn = 1;
foreach (var file in deltafiles)
{
try
{
if (bgW.CancellationPending)
{
stripLabel.Text = "Backup canceled!";
e.Cancel = true;
break;
}
DateTime dt = DateTime.Now;
string[] foldervalues = File.ReadAllLines(savelocation + "folderlog.txt");
cnttototal++;
bgW.ReportProgress(cnttototal);
proUser.Maximum = int.Parse(txtCurrentToken.Text);
stripLabel.Text = "File " + cnttototal + " of " + totalfiles;
double? mathisfun;
mathisfun = ((100 * cnttototal) / totalfiles);
if (mathisfun <= 0)
mathisfun = 1;
double mathToken = double.Parse(txtPrevToken.Text);
mathToken = Math.Round((damn + mathToken));
// Bring our token up to date for next run
txtPrevToken.Text = mathToken.ToString();
File.WriteAllText(savelocation + ".currenttoken.tok", mathToken.ToString());
int proval = int.Parse(txtPrevToken.Text);
int nowval = int.Parse(txtCurrentToken.Text);
if (proval >= nowval)
proval = nowval;
proUser.Value = (proval);
lblProgresslbl.Text = ("Current progress: " + mathisfun.ToString() + "% completed.");
// Our file is a CSV. Column 1 = file ID, Column 2 = File name
var values = file.Split(',');
string fileId = values[0];
string fileName = values[1];
string mimetype = values[2];
mimetype = mimetype.Replace(",", "_");
string folder = values[3];
string ext = null;
int folderfilelen = foldervalues.Count();
fileName = GetSafeFilename(fileName);
Console.WriteLine("Filename: " + values[1]);
logFile.WriteLine("ID: " + values[0] + " - Filename: " + values[1]);
logFile.Flush();
// Things get sloppy here. The reason we're checking MimeTypes
// is because we have to export the files from Google's format
// to a format that is readable by a desktop computer program
// So for example, the google-apps.spreadsheet will become an MS Excel file.
switch (mimetype)
{
(switch statement here removed due to body length issues for this post.)
}
if (ext.Contains(".doc") || ext.Contains(".xls"))
{
string whatami = null;
if (ext.Contains(".xls"))
{
whatami = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet";
}
else if (ext.Contains(".doc"))
{
whatami = "application/vnd.openxmlformats-officedocument.wordprocessingml.document";
}
else if (ext.Contains(".ppt"))
{
whatami = "application/vnd.openxmlformats-officedocument.presentationml.presentation";
}
if (fileName.Contains(".mov") || ext == ".ggl" || fileName.Contains(".mp4"))
{
txtLog.Text += Environment.NewLine + "Skipping file.";
return;
}
var requestfileid = CreateService.BuildService(user).Files.Export(fileId, whatami);
statusStripLabel1.Text = (savelocation + fileName + ext);
txtCurrentUser.Text = user;
string dest1 = Path.Combine(savelocation, fileName + ext);
var stream1 = new System.IO.FileStream(dest1, FileMode.OpenOrCreate, FileAccess.ReadWrite);
scrolltobtm();
requestfileid.MediaDownloader.ProgressChanged +=
(IDownloadProgress progress) =>
{
switch (progress.Status)
{
case DownloadStatus.Downloading:
{
Console.WriteLine(progress.BytesDownloaded);
logFile.WriteLine("Downloading: " + progress.BytesDownloaded);
txtLog.Text += ("Downloading ... " + progress.BytesDownloaded + Environment.NewLine);
scrolltobtm();
logFile.Flush();
break;
}
case DownloadStatus.Completed:
{
Console.WriteLine("Download complete.");
logFile.WriteLine("[" + user + "] Download complete for: " + requestfileid.ToString());
txtLog.Text += ("[" + user + "] Download complete for: " + fileName + Environment.NewLine);
logFile.Flush();
break;
}
case DownloadStatus.Failed:
{
Console.WriteLine("Download failed.");
logFile.WriteLine("Download failed.");
logFile.Flush();
break;
}
}
};
scrolltobtm();
GC.Collect();
GC.WaitForPendingFinalizers();
requestfileid.Download(stream1);
stream1.Close();
stream1.Dispose();
}
else
{
scrolltobtm();
var requestfileid = CreateService.BuildService(user).Files.Get(fileId);
//Generate the name of the file, and create it as such on the local filesystem.
statusStripLabel1.Text = (savelocation + fileName + ext);
string dest1 = Path.Combine(savelocation, fileName + ext);
var stream1 = new System.IO.FileStream(dest1, FileMode.OpenOrCreate, FileAccess.ReadWrite);
requestfileid.MediaDownloader.ProgressChanged +=
(IDownloadProgress progress) =>
{
switch (progress.Status)
{
case DownloadStatus.Downloading:
{
Console.WriteLine(progress.BytesDownloaded);
logFile.WriteLine("Downloading: " + progress.BytesDownloaded);
txtLog.Text += ("Downloading ... " + progress.BytesDownloaded + Environment.NewLine);
scrolltobtm();
logFile.Flush();
break;
}
case DownloadStatus.Completed:
{
Console.WriteLine("Download complete.");
logFile.WriteLine("Download complete for: " + requestfileid.ToString());
txtLog.Text += (Environment.NewLine + "[" + user + "] Download complete for: " + fileName + Environment.NewLine);
logFile.Flush();
break;
}
case DownloadStatus.Failed:
{
Console.WriteLine("Download failed.");
logFile.WriteLine("Download failed.");
logFile.Flush();
break;
}
}
};
scrolltobtm();
GC.Collect();
GC.WaitForPendingFinalizers();
requestfileid.Download(stream1);
stream1.Close();
stream1.Dispose();
}
}
catch (Google.GoogleApiException ex)
{
Console.Write("\nInfo: ---> " + ex.Message.ToString() + "\n");
}
}
}
exfunctions.MoveFiles(savelocation);
Console.WriteLine("\n\n\tBackup completed for selected user!");
txtLog.Text += ("\n\nBackup completed for selected user.\n\n");
statusStripLabel1.Text = "";
//logFile.Close();
//logFile.Dispose();
}
}
}
catch (Google.GoogleApiException ex)
{
Console.WriteLine("Info: " + ex.Message.ToString());
}
}
);
if (checkforfinished.IsCompleted == true)
{
MessageBox.Show("Parallel.ForEach() Finished!");
Console.WriteLine("Parallel.ForEach() Finished!");
}
else
{
MessageBox.Show("Parallel.ForEach() not completed!");
Console.WriteLine("Parallel.ForEach() not completed!");
}
}
}
}
}
catch (Google.GoogleApiException ex)
{
Console.WriteLine("Info: " + ex.Message.ToString());
}
}
}
You can see where I initiate the Parallel.ForEach(...) and then see what it is in charge of doing. It's a lot, and I understand it's not pretty, so I appreciate constructive criticism.
I am working on an application that crawls my email and sniffs out any emails with attachments. All attachments are being returned in the order they were received. Now I want to go a step further and save any attachments to a local directory. I have been looking for documentation or examples, but I have come up empty. I will show you a snippet of my code.
This function will get the email attachments:
public static List<IMessage> GetEmailAttachments()
{
    OutlookServicesClient star_Mail_Box = Start_OutLook_Services();
    try
    {
        var Email_Box = star_Mail_Box.Users["*****@dell.com"].Folders["Inbox"].Messages.Where(m => m.HasAttachments == true).Expand(m => m.Attachments).ExecuteAsync();
        var messages = Email_Box.Result.CurrentPage;
        foreach (var message in messages.OrderByDescending(m => m.DateTimeReceived))
        {
            var attachments = message.Attachments.CurrentPage;
            foreach (var attachment in attachments)
            {
                // This is where I will need to put my logic.
            }
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine("Not Able To Get This Mail Box" + ex.Message + "\n\nDetails : \n\n " + ex.InnerException);
        Console.ReadLine();
    }
    return null; // returning null right now for testing
}
OK, so after looking at the attachment definition, I figured I'd go through a byte array to achieve what I want. Here is the code for my attachment loop:
foreach (FileAttachment attachment in attachments)
{
    byte[] bytefiles = attachment.ContentBytes;
    string path = @"C:\Top-Level\" + attachment.Name;
    if (!string.IsNullOrEmpty(message.Subject))
    {
        path = @"C:\Top-Level\" + message.Subject + "." + attachment.ContentType;
    }
    File.WriteAllBytes(path, bytefiles);
}
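One thing to watch out for, which the snippet above doesn't handle: message.Subject can contain characters that are not valid in a Windows file name. A minimal sketch of stripping them first, using the same variables as above:
// Remove characters Windows does not allow in file names before building the path.
string safeSubject = string.Concat(message.Subject.Split(Path.GetInvalidFileNameChars()));
string safePath = Path.Combine(@"C:\Top-Level", safeSubject + "." + attachment.ContentType);
File.WriteAllBytes(safePath, attachment.ContentBytes);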
OK, so this time I created a service with a FileSystemWatcher to process files once they are created.
It seems that my service crashes when the number of files being processed reaches 1000 (I'm receiving loads of messages).
Here is my logic: a file comes in, the file watcher reads the text, sends it by email, inserts it into the DB, and moves the original message to a folder.
On service start, I process pending messages first before starting to watch (I'm talking about over 1000 pending text files), and my service needs about a second to work on each file.
All goes OK, but when the total number of incoming files reaches 1000, it simply crashes.
Sometimes the service stops processing pending files and only starts looking for new files.
I have "InternalBufferSize = 64000", the max recommended.
Please help me with my code (I know it should be multi-threaded for better handling, but I'm not that much of an expert):
protected override void OnStart(string[] args)
{
using(TREEEntities TEX = new TREEEntities())
{
var mp= TEX.TREE_settings.FirstOrDefault(x=>x.SET_key =="MSGDump");
MsgsPath = mp.SET_value;
var dc = TEX.TREE_settings.FirstOrDefault(x => x.SET_key == "DupCash");
DupCash = Convert.ToInt16(dc.SET_value);
}
if (Directory.Exists(MsgsPath))
{
if (!Directory.Exists(MsgsPath+"\\Archive"))
{
Directory.CreateDirectory(MsgsPath+"\\Archive");
}
if (!Directory.Exists(MsgsPath + "\\Duplicates"))
{
Directory.CreateDirectory(MsgsPath + "\\Duplicates");
}
if (!Directory.Exists(MsgsPath + "\\Unsent"))
{
Directory.CreateDirectory(MsgsPath + "\\Unsent");
}
}
else
{
Directory.CreateDirectory(MsgsPath);
Directory.CreateDirectory(MsgsPath + "\\Archive");
Directory.CreateDirectory(MsgsPath + "\\Duplicates");
Directory.CreateDirectory(MsgsPath + "\\Unsent");
}
processPending();//<--- process pending files after last service stop
fileSystemWatcher1.Path = MsgsPath;//<--- path to be watched
fileSystemWatcher1.EnableRaisingEvents = true;
fileSystemWatcher1.InternalBufferSize = 64000;
addToLog(DateTime.Now, "Service Started", 0, "Service", "Info");
addToLog(DateTime.Now, "File Watcher Started", 0, "Service", "Info");
//dupList.Clear();//<--- clear duplicates validation list
}
protected override void OnStop()
{
fileSystemWatcher1.EnableRaisingEvents = false;
addToLog(DateTime.Now, "File Watcher Stopped", 0, "Service", "Alert");
addToLog(DateTime.Now, "Service Stopped", 0, "Service", "Alert");
}
private void fileSystemWatcher1_Created(object sender, FileSystemEventArgs e)
{
try
{
//---------read from file------------
Thread.Sleep(200);//<---give the file some time to get released
string block;
using (StreamReader sr = File.OpenText(MsgsPath + "\\" + e.Name))
{
block = sr.ReadToEnd();
}
PRT = block.Substring(block.Length - 6, 6);//<--- get the printer name
seq = Convert.ToInt16(block.Substring(block.Length - 20, 20).Substring(0, 4));//<--- get the sequence number
switch (PRT)//<----track sequence number from the 3 printers
{
case "64261B"://<---prt1
int seqPlus1=0;
if(seqPrt1 == 9999)//<---ignore sequence change from 9999 to 1
{ seqPlus1 = 1; }
else { seqPlus1 = seqPrt1 + 1; }
if (seq != seqPlus1 && seqPrt1 != 0)//<---"0" to avoid first service start
{
int x = seq - seqPrt1 - 1;
for (int i = 1; i <= x; i++)
{
addToMissing(PRT, seqPlus1);
addToLog(DateTime.Now, "Missing Sequence Number On Printer: " + PRT + " - " + seqPlus1, seqPlus1, "Service", "Missing");
seqPlus1++;
}
seqPrt1 = seq;
}
else { seqPrt1 = seq; }
break;
case "24E9AA"://<---prt2
int seqPlus2=0;
if(seqPrt2 == 9999)
{ seqPlus2 = 1; }
if (seq != seqPlus2 && seqPrt2 != 0)
{
int x = seq - seqPrt2 - 1;
for (int i = 1; i <= x; i++)
{
addToMissing(PRT, seqPlus2);
addToLog(DateTime.Now, "Missing Sequence Number On Printer: " + PRT + " - " + seqPlus2, seqPlus2, "Service", "Missing");
seqPlus2++;
}
seqPrt2 = seq;
}
else { seqPrt2 = seq; }
break;
case "642602"://<---prt3
int seqPlus3=0;
if(seqPrt3 == 9999)
{ seqPlus3 = 1; }
if (seq != seqPlus3 && seqPrt3 != 0)
{
int x = seq - seqPrt3 - 1;
for (int i = 1; i <= x; i++)
{
addToMissing(PRT, seqPlus3);
addToLog(DateTime.Now, "Missing Sequence Number On Printer: " + PRT + " - " + seqPlus3, seqPlus3, "Service", "Missing");
seqPlus3++;
}
seqPrt3 = seq;
}
else { seqPrt3 = seq; }
break;
}
block = block.Remove(block.Length - 52);//<--- trim the sequence number and unwanted info
string[] Alladd;
List<string> sent = new List<string>();
if (!dupList.Contains(block)) //<--- if msg not found in duplicates validation list
{
//--------extract values--------------
if (block.Substring(0, 3) == "\r\nQ") //<--- if the msg. contains a priority code
{
Alladd = block.Substring(0, block.IndexOf(".")).Replace("\r\n", " ").Substring(4).Split(' ').Distinct().Where(x => !string.IsNullOrEmpty(x)).ToArray(); ;
}
else//<--- if no priority code
{
Alladd = block.Substring(0, block.IndexOf(".")).Replace("\r\n", " ").Substring(1).Split(' ').Distinct().Where(x => !string.IsNullOrEmpty(x)).ToArray(); ;
}
string From = block.Substring(block.IndexOf('.') + 1).Substring(0, 7);
string Msg = block.Substring(block.IndexOf('.') + 1);
Msg = Msg.Substring(Msg.IndexOf('\n') + 1);
//--------add msg content to the DB group table--------
using (TREEEntities TE1 = new TREEEntities())
{
TREE_group tg = new TREE_group()
{
GROUP_original = block,
GROUP_sent = Msg,
GROUP_dateTime = DateTime.Now,
GROUP_from = From,
GROUP_seq = seq,
GROUP_prt = PRT,
};
TE1.AddToTREE_group(tg);
TE1.SaveChanges();
GID = tg.GROUP_ID;
}
//--------validate addresses---------------
foreach (string TB in Alladd)
{
string email = "";
string typeB = "";
TREEEntities TE = new TREEEntities();
var q1 = from x in TE.TREE_users where x.USR_TypeB == TB && x.USR_flag == "act" select new { x.USR_email, x.USR_TypeB };
foreach (var itm in q1)
{
email = itm.USR_email;
typeB = itm.USR_TypeB;
}
//-------send mail if the user exist----
if (TB == typeB)
{
if (typeB == "BAHMVGF")
{
addToFtl(block);
}
try
{
sendMail SM = new sendMail();
SM.SendMail(Msg, "Message from: " + From, email);
//---save record in DB----
addToMsg(typeB, email,"sent","act",1,GID,seq);
sent.Add(typeB);
}
catch (Exception x)
{
addToMsg(typeB, email, "Failed", "act", 1, GID, seq);
addToLog(DateTime.Now, "Send message failed: " + x.Message, GID, "Service", "Warning");
}
}
//-------if no user exist----
else
{
if (TB == "BAHMVGF")
{
addToFtl(block);
}
addToMsg(TB, "No email", "Failed", "act", 1, GID, seq);
addToLog(DateTime.Now, "Send message failed, unknown Type-B address: " + TB, GID, "Service", "Warning");
}
}
if (sent.Count < Alladd.Count())//<--- if there is unsent addresses
{
StringBuilder b = new StringBuilder(block);
foreach (string add in sent)
{
b.Replace(add, "");//<--- remove address that has been sent from the original message and write new msg. to unsent folder
}
if (!Directory.Exists(MsgsPath + "\\Unsent"))
{
Directory.CreateDirectory(MsgsPath + "\\Unsent");
}
using (StreamWriter w = File.AppendText(MsgsPath + "\\Unsent\\" + e.Name))
{
w.WriteLine(b);
}
}
sent.Clear();
//---add to dupList to validate the next messages-------------
if (dupList.Count > DupCash)
{
dupList.RemoveAt(0);
}
dupList.Add(block);
//---move msg to archive folder-----------------
if (!Directory.Exists(MsgsPath + "\\Archive"))
{
Directory.CreateDirectory(MsgsPath + "\\Archive");
}
File.Move(MsgsPath + "\\" + e.Name, MsgsPath + "\\Archive\\" + e.Name);
}
else //<--- if message is a duplicate
{
addToLog(DateTime.Now, "Duplicated message, message not sent", seq, "Service", "Info");
//---move msg to duplicates folder-----------------
if (!Directory.Exists(MsgsPath + "\\Duplicates"))
{
Directory.CreateDirectory(MsgsPath + "\\Duplicates");
}
File.Move(MsgsPath + "\\" + e.Name, MsgsPath + "\\Duplicates\\" + e.Name);
}
}
catch (Exception x)
{
addToLog(DateTime.Now, "Error: " + x.Message, seq, "Service", "Alert");
if (!Directory.Exists(MsgsPath + "\\Unsent"))
{
Directory.CreateDirectory(MsgsPath + "\\Unsent");
}
//---move msg to Unsent folder-----------------
File.Move(MsgsPath + "\\" + e.Name, MsgsPath + "\\Unsent\\" + e.Name);
}
}
I wanted to add this as a comment, but it exceeded the allowed number of characters, so here goes.
The first thing I noticed in your code is that you are doing all the file handling inside the Created event handler. This is not good practice; you should always let your FileSystemWatcher event handlers do minimal work, as anything more can cause an overflow, which is probably what you are facing.
Instead, it is better to delegate the work to a separate thread. You can add the events to a queue and let a background thread handle them, as in the sketch below. You can also filter the files in the event handler so that your queue does not get filled with garbage.
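Here is a minimal sketch of that pattern, using a BlockingCollection<string> (from System.Collections.Concurrent) as the queue; ProcessFile is a stand-in for your existing read/email/DB/move logic:
private readonly BlockingCollection<string> _pending = new BlockingCollection<string>();

private void fileSystemWatcher1_Created(object sender, FileSystemEventArgs e)
{
    // Do as little as possible here: just enqueue the path.
    _pending.Add(e.FullPath);
}

private void StartWorker()
{
    // Long-running consumer; GetConsumingEnumerable blocks until an item is available.
    Task.Factory.StartNew(() =>
    {
        foreach (string path in _pending.GetConsumingEnumerable())
        {
            ProcessFile(path); // your existing per-file work goes here
        }
    }, TaskCreationOptions.LongRunning);
}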
Using Sleep inside the event handler is also considered bad practice, as you'll be blocking the FileSystemWatcher events.
The maximum buffer size allowed is 64 KB, and it is not a recommended size unless you are dealing with long paths. Increasing the buffer size is expensive, because it comes from non-paged memory that cannot be swapped out to disk, so keep the buffer as small as possible. To avoid a buffer overflow, use the NotifyFilter and IncludeSubdirectories properties to filter out unwanted change notifications.
Finally, I would suggest reading the FileSystemWatcher MSDN article and looking at several examples online before attempting to write the code, as the Windows watcher is somewhat delicate and prone to errors.
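For example, something along these lines when configuring the watcher (assuming your incoming messages are plain .txt files; adjust the filter to your naming):
fileSystemWatcher1.Path = MsgsPath;
fileSystemWatcher1.Filter = "*.txt";                      // only the message files
fileSystemWatcher1.NotifyFilter = NotifyFilters.FileName; // react to file-name changes (creates, renames) only
fileSystemWatcher1.IncludeSubdirectories = false;         // ignore the Archive/Duplicates/Unsent subfolders
fileSystemWatcher1.EnableRaisingEvents = true;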
When I connect to pop3.live.com, the connection is fine, and it also shows me the number of messages I have and their sizes, but when I try to use "RETR" to get the messages and show them in a console application, nothing is presented.
Here is what I have so far:
string str = string.Empty;
string strTemp = string.Empty;
using (TcpClient tc = new TcpClient())
{
    tc.Connect("pop3.live.com", 995);
    using (SslStream sl = new SslStream(tc.GetStream()))
    {
        sl.AuthenticateAsClient("pop3.live.com");
        using (StreamReader sr = new StreamReader(sl))
        {
            using (StreamWriter sw = new StreamWriter(sl))
            {
                sw.WriteLine("USER " + _username);
                sw.Flush();
                sw.WriteLine("PASS " + _password);
                sw.Flush();
                sw.WriteLine("LIST");
                sw.Flush();
                sw.WriteLine("RETR");
                sw.Flush();
                sw.WriteLine("QUIT ");
                sw.Flush();
                while ((strTemp = sr.ReadLine()) != null)
                {
                    if (strTemp == "." || strTemp.IndexOf("-ERR") != -1)
                    {
                        break;
                    }
                    str += strTemp;
                }
            }
        }
    }
}
Console.WriteLine(str);
Console.ReadLine();
First you have to use the LIST command, which lists the message numbers. Then issue one or more RETR commands with a single message number from the previous list. Message numbers do not necessarily start from 1! See also my comment on your question about debugging this issue.
For example:
LIST
+OK 2 messages (4095)
1 710
2 3385
.
RETR 1
+OK 710 octets
Return-Path: <john@example.com>
...
With RETR, you need to specify which message to retrieve. RETR without a number is not supported per the POP3 specification.
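Applied to the code in the question, the interactive part would look roughly like this sketch (it reuses the sr/sw variables from the question, assumes the greeting and the USER/PASS responses have already been read, and leaves out error handling and dot-unstuffing):
var messageNumbers = new List<int>();
sw.WriteLine("LIST");
sw.Flush();
sr.ReadLine();                                   // "+OK ..." status line
string line;
while ((line = sr.ReadLine()) != null && line != ".")
{
    // Each line is "<message-number> <size-in-octets>".
    messageNumbers.Add(int.Parse(line.Split(' ')[0]));
}
foreach (int n in messageNumbers)
{
    sw.WriteLine("RETR " + n);
    sw.Flush();
    sr.ReadLine();                               // "+OK ..." status line
    var message = new StringBuilder();
    while ((line = sr.ReadLine()) != null && line != ".")
    {
        message.AppendLine(line);
    }
    Console.WriteLine(message.ToString());
}
sw.WriteLine("QUIT");
sw.Flush();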
I am trying to capture packets using the SharpPcap library.
I'm able to get the packet details, but I'm having a problem getting the message content inside the packet.
The packet uses .Data to return the message, and when I use it, it returns (System.Byte[]).
Here is the library website:
http://www.codeproject.com/KB/IP/sharppcap.aspx
Here is my code:
string packetData;
private void packetCapturingThreadMethod()
{
Packet packet = null;
int countOfPacketCaptures = 0;
while ((packet = device.GetNextPacket()) != null)
{
packet = device.GetNextPacket();
if (packet is TCPPacket)
{
TCPPacket tcp = (TCPPacket)packet;
myPacket tempPacket = new myPacket();
tempPacket.packetType = "TCP";
tempPacket.sourceAddress = Convert.ToString(tcp.SourceAddress);
tempPacket.destinationAddress = Convert.ToString(tcp.DestinationAddress);
tempPacket.sourcePort = Convert.ToString(tcp.SourcePort);
tempPacket.destinationPort = Convert.ToString(tcp.DestinationPort);
tempPacket.packetMessage = Convert.ToString(tcp.Data);
packetsList.Add(tempPacket);
packetData =
"Type= TCP" +
" Source Address = "+ Convert.ToString(tcp.SourceAddress)+
" Destination Address =" +Convert.ToString(tcp.DestinationAddress)+
" SourcePort =" + Convert.ToString(tcp.SourcePort)+
" SourcePort =" +Convert.ToString(tcp.DestinationPort)+
" Messeage =" + Convert.ToString(tcp.Data);
txtpackets.Invoke(new UpdatetxtpacketsCallback(this.Updatetxtpackets),
new object[] { packetData });
string[] row = { packetsList[countOfPacketCaptures].packetType, packetsList[countOfPacketCaptures].sourceAddress, packetsList[countOfPacketCaptures].destinationAddress, packetsList[countOfPacketCaptures].sourcePort, packetsList[countOfPacketCaptures].destinationPort, packetsList[countOfPacketCaptures].packetMessage };
try { //dgwPacketInfo.Rows.Add(row); countOfPacketCaptures++;
//lblCapturesLabels.Text = Convert.ToString(countOfPacketCaptures);
}
catch (Exception e) { }
}
else if (packet is UDPPacket)
{
UDPPacket udp = (UDPPacket)packet;
myPacket tempPacket = new myPacket();
tempPacket.packetType = "UDP";
tempPacket.sourceAddress = Convert.ToString(udp.SourceAddress);
tempPacket.destinationAddress = Convert.ToString(udp.DestinationAddress);
tempPacket.sourcePort = Convert.ToString(udp.SourcePort);
tempPacket.destinationPort = Convert.ToString(udp.DestinationPort);
tempPacket.packetMessage = udp.Data.ToArray() + "\n";
packetsList.Add(tempPacket);
packetData =
"Type= UDP" +
" Source Address = "+ Convert.ToString(udp.SourceAddress)+
" Destination Address =" +Convert.ToString(udp.DestinationAddress)+
" SourcePort =" + Convert.ToString(udp.SourcePort)+
" SourcePort =" +Convert.ToString(udp.DestinationPort)+
" Messeage =" + udp.Data.ToArray() + "\n";
string[] row = { packetsList[countOfPacketCaptures].packetType, packetsList[countOfPacketCaptures].sourceAddress, packetsList[countOfPacketCaptures].destinationAddress, packetsList[countOfPacketCaptures].sourcePort, packetsList[countOfPacketCaptures].destinationPort, packetsList[countOfPacketCaptures].packetMessage };
try {
//dgwPacketInfo.Rows.Add(row);
//countOfPacketCaptures++;
//lblCapturesLabels.Text = Convert.ToString(countOfPacketCaptures);
txtpackets.Invoke(new UpdatetxtpacketsCallback(this.Updatetxtpackets),
new object[] { packetData });
}
catch (Exception e) { }
}
}
}
I found the answer...
Data is a byte array, so I need to use BitConverter. Instead of using:
Convert.ToString(tcp.Data);
I should use:
BitConverter.ToString(tcp.Data)
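In other words (a quick illustration with made-up payload bytes):
byte[] data = { 0x48, 0x54, 0x54, 0x50 };           // example payload
Console.WriteLine(Convert.ToString(data));          // "System.Byte[]" -- just the type name
Console.WriteLine(BitConverter.ToString(data));     // "48-54-54-50" -- a hex dump of the bytes
Console.WriteLine(Encoding.ASCII.GetString(data));  // "HTTP" -- handy when the payload is actually text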
The parser isn't that complex...
I looked at the Packet.Net code (which is the parser for SharpPcap), and all of the fields are stored in commonly used formats.
The IP addresses are stored as System.Net.IPAddress, so you can just call .ToString() on them to get a text string that properly includes the dots.
The port numbers are stored as ushort, which can be printed the same as any other integer.
The only part that needs to be interpreted in its binary form is the Data field, because that changes based on what protocol is being used at the next layer up. SharpPcap/Packet.Net does most of the work for you already, and fields are stored in forms that are either the most convenient or identical to those found in the protocol specification. Just use IntelliSense to check a field's type, and if it's not one you're familiar with (such as System.Net.IPAddress or System.Net.NetworkInformation.PhysicalAddress, for MAC addresses), just Google it.