SpamAssassin check score C# code

Is there a way to check the spam score in an ASP.NET application? A class or something similar for .NET? What about other spam filters out there?
Edit: I am looking for a way to check the spam score of email messages in C#.

Here is my super-simplified "just check the score" code for connecting to a running SpamAssassin instance from C#, which I wrote for http://elasticemail.com. Just set up SpamAssassin to run on a server and set the access permissions.
Then you can use this code to call it:
public class SimpleSpamAssassin
{
public class RuleResult
{
public double Score = 0;
public string Rule = "";
public string Description = "";
public RuleResult() { }
public RuleResult(string line)
{
Score = double.Parse(line.Substring(0, line.IndexOf(" ")).Trim());
line = line.Substring(line.IndexOf(" ") + 1);
Rule = line.Substring(0, 23).Trim();
Description = line.Substring(23).Trim();
}
}
public static List<RuleResult> GetReport(string serverIP, string message)
{
string command = "REPORT";
StringBuilder sb = new StringBuilder();
sb.AppendFormat("{0} SPAMC/1.2\r\n", command);
sb.AppendFormat("Content-Length: {0}\r\n\r\n", message.Length);
sb.AppendFormat(message);
byte[] messageBuffer = Encoding.ASCII.GetBytes(sb.ToString());
using (Socket spamAssassinSocket = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp))
{
spamAssassinSocket.Connect(serverIP, 783);
spamAssassinSocket.Send(messageBuffer);
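// Half-close the sending side so spamd sees end-of-input and starts writing its report back.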
spamAssassinSocket.Shutdown(SocketShutdown.Send);
int received;
string receivedMessage = string.Empty;
do
{
byte[] receiveBuffer = new byte[1024];
received = spamAssassinSocket.Receive(receiveBuffer);
receivedMessage += Encoding.ASCII.GetString(receiveBuffer, 0, received);
}
while (received > 0);
spamAssassinSocket.Shutdown(SocketShutdown.Both);
return ParseResponse(receivedMessage);
}
}
private static List<RuleResult> ParseResponse(string receivedMessage)
{
//merge line endings
receivedMessage = receivedMessage.Replace("\r\n", "\n");
receivedMessage = receivedMessage.Replace("\r", "\n");
string[] lines = receivedMessage.Split('\n');
List<RuleResult> results = new List<RuleResult>();
bool inReport = false;
foreach (string line in lines)
{
if (inReport)
{
try
{
results.Add(new RuleResult(line.Trim()));
}
catch
{
//past the end of the report
}
}
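// The rule table in a REPORT response starts after a divider line of dashes, which is what the next check looks for.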
if (line.StartsWith("---"))
inReport = true;
}
return results;
}
}
Usage is quite easy:
List<RuleResult> spamCheckResult = SimpleSpamAssassin.GetReport(IP OF SA Server, FULL Email including headers);
It will return the list of spam check rules you hit and the resulting score impact.
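For example, to get the overall score you can sum the per-rule impacts. A minimal sketch, assuming a placeholder spamd address "10.0.0.5", a rawEmail string holding the full raw message, and a using System.Linq; directive for Sum():
// Hypothetical usage: "10.0.0.5" and rawEmail are placeholders for your spamd host and message.
List<SimpleSpamAssassin.RuleResult> report = SimpleSpamAssassin.GetReport("10.0.0.5", rawEmail);
double totalScore = report.Sum(r => r.Score);
foreach (SimpleSpamAssassin.RuleResult r in report)
    Console.WriteLine("{0,6:0.0}  {1,-22} {2}", r.Score, r.Rule, r.Description);
Console.WriteLine("Total score: {0:0.0}", totalScore);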

I am not exactly sure if that's what you are looking for, but there is a C# wrapper on CodeProject that simplifies the communication with a SpamAssassin server:
A C# Wrapper for the SpamAssassin Protocol
Hope that helps!

Related

Parsing performance of row data from files to SQL Server database

I have the PAF raw data in several files (list of all addresses in the UK).
My goal is to create a PostCode lookup in our software.
I have created a new database but there is no need to understand it for the moment.
Let's take one file; its extension is ".c01" and it can be opened with a text editor. The data in this file is in the following format:
0000000123A
With (according to the developer guide) 8 characters for the KEY and 50 characters for the NAME.
This file contains 2,449,652 rows (it's a small one!).
I created a parsing class for this:
private class SerializedBuilding
{
public int Key
{
get; set;
}
public string Name
{
get; set;
}
public bool isValid = false;
public Building ToBuilding()
{
Building b = new Building();
b.BuildingKey = Key;
b.BuildingName = Name;
return b;
}
private readonly int KEYLENGTH = 8;
private readonly int NAMELENGTH = 50;
public SerializedBuilding(String line)
{
string KeyStr = null;
string Name = null;
try
{
KeyStr = line.Substring(0, KEYLENGTH);
}
catch (Exception e)
{
Console.WriteLine("erreur parsing key line " + line);
return;
}
try
{
Name = line.Substring(KEYLENGTH - 1, NAMELENGTH);
}
catch (Exception e)
{
Console.WriteLine("erreur parsing name line " + line);
return;
}
int value;
if (!Int32.TryParse(KeyStr, out value))
return;
if (value == 0 || value == 99999999)
return;
this.Name = Name;
this.Key = value;
this.isValid = true;
}
}
I use this method to read the file
public void start()
{
AddressDataContext d = new AddressDataContext();
Count = 0;
string line;
// Read the file and display it line by line.
System.IO.StreamReader file =
new System.IO.StreamReader(filename);
SerializedBuilding sb = null;
Console.WriteLine("Number of line detected : " + File.ReadLines(filename).Count());
while ((line = file.ReadLine()) != null)
{
sb = new SerializedBuilding(line);
if (sb.isValid)
{
d.Buildings.InsertOnSubmit(sb.ToBuilding());
if (Count % 100 == 0)
d.SubmitChanges();
}
Count++;
}
d.SubmitChanges();
file.Close();
Console.WriteLine("building added");
}
I use LINQ to SQL classes to insert the data into my database. The connection string is the default one.
This seems to work; I have added 67,200 lines. It just crashed, but my questions are not about that.
My estimates:
33,647,015 rows to parse
Time needed for execution: 13 hours
It's a one-time job (it just needs to be done on my SQL Server and on the client server later), so I don't really care much about performance, but I think it is interesting to know how it can be improved.
My questions are:
Are ReadLine() and Substring() the most efficient way to read these huge files?
Can the performance be improved by modifying the connection string?
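One commonly suggested alternative to row-by-row LINQ to SQL inserts for a load of this size is SqlBulkCopy. The following is only a sketch, not the poster's code: it assumes a dbo.Buildings table with BuildingKey (int) and BuildingName (nvarchar(50)) columns, a valid connectionString, the System.Data, System.Data.SqlClient and System.IO namespaces, and that the SerializedBuilding parser above is accessible.
private static void BulkLoad(string filename, string connectionString)
{
    // Buffer parsed rows in a DataTable and flush them in large batches.
    DataTable table = new DataTable();
    table.Columns.Add("BuildingKey", typeof(int));
    table.Columns.Add("BuildingName", typeof(string));

    using (SqlBulkCopy bulk = new SqlBulkCopy(connectionString))
    {
        bulk.DestinationTableName = "dbo.Buildings"; // assumed table name
        bulk.ColumnMappings.Add("BuildingKey", "BuildingKey");
        bulk.ColumnMappings.Add("BuildingName", "BuildingName");

        foreach (string line in File.ReadLines(filename)) // streams the file line by line
        {
            SerializedBuilding sb = new SerializedBuilding(line);
            if (!sb.isValid)
                continue;
            table.Rows.Add(sb.Key, sb.Name);

            if (table.Rows.Count == 50000) // flush every 50,000 rows instead of one insert per row
            {
                bulk.WriteToServer(table);
                table.Clear();
            }
        }
        if (table.Rows.Count > 0)
            bulk.WriteToServer(table); // flush the remainder
    }
}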

EntityTooSmall in CompleteMultipartUploadResponse

Using .NET SDK v1.5.21.0, I'm trying to upload a large file (63 MB) and I'm following the example at:
http://docs.aws.amazon.com/AmazonS3/latest/dev/LLuploadFileDotNet.html
But I'm using a helper instead of the whole code, and I'm using jQuery File Upload:
https://github.com/blueimp/jQuery-File-Upload/blob/master/basic-plus.html
What I have is:
string bucket = "mybucket";
long totalSize = long.Parse(context.Request.Headers["X-File-Size"]),
maxChunkSize = long.Parse(context.Request.Headers["X-File-MaxChunkSize"]),
uploadedBytes = long.Parse(context.Request.Headers["X-File-UloadedBytes"]),
partNumber = uploadedBytes / maxChunkSize + 1,
fileSize = partNumber * inputStream.Length;
bool lastPart = inputStream.Length < maxChunkSize;
// http://docs.aws.amazon.com/AmazonS3/latest/dev/LLuploadFileDotNet.html
if (partNumber == 1) // initialize upload
{
iView.Utilities.Amazon_S3.S3MultipartUpload.InitializePartToCloud(fileName, bucket);
}
try
{
// upload part
iView.Utilities.Amazon_S3.S3MultipartUpload.UploadPartToCloud(fs, fileName, bucket, (int)partNumber, uploadedBytes, maxChunkSize);
if (lastPart)
// wrap it up and go home
iView.Utilities.Amazon_S3.S3MultipartUpload.CompletePartToCloud(fileName, bucket);
}
catch (System.Exception ex)
{
// Huston, we have a problem!
//Console.WriteLine("Exception occurred: {0}", exception.Message);
iView.Utilities.Amazon_S3.S3MultipartUpload.AbortPartToCloud(fileName, bucket);
}
and
public static class S3MultipartUpload
{
private static string accessKey = System.Configuration.ConfigurationManager.AppSettings["AWSAccessKey"];
private static string secretAccessKey = System.Configuration.ConfigurationManager.AppSettings["AWSSecretKey"];
private static AmazonS3 client = Amazon.AWSClientFactory.CreateAmazonS3Client(accessKey, secretAccessKey);
public static InitiateMultipartUploadResponse initResponse;
public static List<UploadPartResponse> uploadResponses;
public static void InitializePartToCloud(string destinationFilename, string destinationBucket)
{
// 1. Initialize.
uploadResponses = new List<UploadPartResponse>();
InitiateMultipartUploadRequest initRequest =
new InitiateMultipartUploadRequest()
.WithBucketName(destinationBucket)
.WithKey(destinationFilename.TrimStart('/'));
initResponse = client.InitiateMultipartUpload(initRequest);
}
public static void UploadPartToCloud(Stream fileStream, string destinationFilename, string destinationBucket, int partNumber, long uploadedBytes, long maxChunkedBytes)
{
// 2. Upload Parts.
UploadPartRequest request = new UploadPartRequest()
.WithBucketName(destinationBucket)
.WithKey(destinationFilename.TrimStart('/'))
.WithUploadId(initResponse.UploadId)
.WithPartNumber(partNumber)
.WithPartSize(maxChunkedBytes)
.WithFilePosition(uploadedBytes)
.WithInputStream(fileStream) as UploadPartRequest;
uploadResponses.Add(client.UploadPart(request));
}
public static void CompletePartToCloud(string destinationFilename, string destinationBucket)
{
// Step 3: complete.
CompleteMultipartUploadRequest compRequest =
new CompleteMultipartUploadRequest()
.WithBucketName(destinationBucket)
.WithKey(destinationFilename.TrimStart('/'))
.WithUploadId(initResponse.UploadId)
.WithPartETags(uploadResponses);
CompleteMultipartUploadResponse completeUploadResponse =
client.CompleteMultipartUpload(compRequest);
}
public static void AbortPartToCloud(string destinationFilename, string destinationBucket)
{
// abort.
client.AbortMultipartUpload(new AbortMultipartUploadRequest()
.WithBucketName(destinationBucket)
.WithKey(destinationFilename.TrimStart('/'))
.WithUploadId(initResponse.UploadId));
}
}
My maxChunkSize is 6 MB (6 * 1024 * 1024), as I have read that the minimum is 5 MB...
Why am I getting a "Your proposed upload is smaller than the minimum allowed size" exception? What am I doing wrong?
The error is:
<Error>
<Code>EntityTooSmall</Code>
<Message>Your proposed upload is smaller than the minimum allowed size</Message>
<ETag>d41d8cd98f00b204e9800998ecf8427e</ETag>
<MinSizeAllowed>5242880</MinSizeAllowed>
<ProposedSize>0</ProposedSize>
<RequestId>C70E7A23C87CE5FC</RequestId>
<HostId>pmhuMXdRBSaCDxsQTHzucV5eUNcDORvKY0L4ZLMRBz7Ch1DeMh7BtQ6mmfBCLPM2</HostId>
<PartNumber>1</PartNumber>
</Error>
How can I get ProposedSize if I'm passing the stream and stream length?
Here is a working solution for the latest Amazon SDK (as of today: v1.5.37.0).
Amazon S3 multipart upload works like this:
Initialize the request using client.InitiateMultipartUpload(initRequest)
Send chunks of the file (loop until the end) using client.UploadPart(request)
Complete the request using client.CompleteMultipartUpload(compRequest)
If anything goes wrong, remember to dispose of the client and request, as well as fire the abort command using client.AbortMultipartUpload(abortMultipartUploadRequest).
I keep the client in Session, as we need it for each chunk upload, and I also keep hold of the ETags that are later used to complete the process.
You can see an example and a simple way of doing this in the Amazon docs themselves. I ended up writing a class to do everything, and I have integrated it with the lovely jQuery File Upload plugin (handler code below as well).
The S3MultipartUpload class is as follows:
public class S3MultipartUpload : IDisposable
{
string accessKey = System.Configuration.ConfigurationManager.AppSettings.Get("AWSAccessKey");
string secretAccessKey = System.Configuration.ConfigurationManager.AppSettings.Get("AWSSecretKey");
AmazonS3 client;
public string OriginalFilename { get; set; }
public string DestinationFilename { get; set; }
public string DestinationBucket { get; set; }
public InitiateMultipartUploadResponse initResponse;
public List<PartETag> uploadPartETags;
public string UploadId { get; private set; }
public S3MultipartUpload(string destinationFilename, string destinationBucket)
{
if (client == null)
{
System.Net.WebRequest.DefaultWebProxy = null; // disable proxy to make upload quicker
client = Amazon.AWSClientFactory.CreateAmazonS3Client(accessKey, secretAccessKey, new AmazonS3Config()
{
RegionEndpoint = Amazon.RegionEndpoint.EUWest1,
CommunicationProtocol = Protocol.HTTP
});
this.OriginalFilename = destinationFilename.TrimStart('/');
this.DestinationFilename = string.Format("{0:yyyy}{0:MM}{0:dd}{0:HH}{0:mm}{0:ss}{0:fffff}_{1}", DateTime.UtcNow, this.OriginalFilename);
this.DestinationBucket = destinationBucket;
this.InitializePartToCloud();
}
}
private void InitializePartToCloud()
{
// 1. Initialize.
uploadPartETags = new List<PartETag>();
InitiateMultipartUploadRequest initRequest = new InitiateMultipartUploadRequest();
initRequest.BucketName = this.DestinationBucket;
initRequest.Key = this.DestinationFilename;
// make it public
initRequest.AddHeader("x-amz-acl", "public-read");
initResponse = client.InitiateMultipartUpload(initRequest);
}
public void UploadPartToCloud(Stream fileStream, long uploadedBytes, long maxChunkedBytes)
{
int partNumber = uploadPartETags.Count() + 1; // current part
// 2. Upload Parts.
UploadPartRequest request = new UploadPartRequest();
request.BucketName = this.DestinationBucket;
request.Key = this.DestinationFilename;
request.UploadId = initResponse.UploadId;
request.PartNumber = partNumber;
request.PartSize = fileStream.Length;
//request.FilePosition = uploadedBytes // remove this line?
request.InputStream = fileStream; // as UploadPartRequest;
var up = client.UploadPart(request);
uploadPartETags.Add(new PartETag() { ETag = up.ETag, PartNumber = partNumber });
}
public string CompletePartToCloud()
{
// Step 3: complete.
CompleteMultipartUploadRequest compRequest = new CompleteMultipartUploadRequest();
compRequest.BucketName = this.DestinationBucket;
compRequest.Key = this.DestinationFilename;
compRequest.UploadId = initResponse.UploadId;
compRequest.PartETags = uploadPartETags;
string r = "Something went badly wrong";
using (CompleteMultipartUploadResponse completeUploadResponse = client.CompleteMultipartUpload(compRequest))
r = completeUploadResponse.ResponseXml;
return r;
}
public void AbortPartToCloud()
{
// abort.
client.AbortMultipartUpload(new AbortMultipartUploadRequest()
{
BucketName = this.DestinationBucket,
Key = this.DestinationFilename,
UploadId = initResponse.UploadId
});
}
public void Dispose()
{
if (client != null) client.Dispose();
if (initResponse != null) initResponse.Dispose();
}
}
I use DestinationFilename as the destination file name so I can avoid name collisions, but I keep the OriginalFilename as I need it later.
Using jQuery File Upload Plugin, all works inside a Generic Handler, and the process is something like this:
// Upload partial file
private void UploadPartialFile(string fileName, HttpContext context, List<FilesStatus> statuses)
{
if (context.Request.Files.Count != 1)
throw new HttpRequestValidationException("Attempt to upload chunked file containing more than one fragment per request");
var inputStream = context.Request.Files[0].InputStream;
string contentRange = context.Request.Headers["Content-Range"]; // "bytes 0-6291455/14130271"
int fileSize = int.Parse(contentRange.Split('/')[1]),
maxChunkSize = int.Parse(context.Request.Headers["X-Max-Chunk-Size"]),
uploadedBytes = int.Parse(contentRange.Replace("bytes ", "").Split('-')[0]);
iView.Utilities.AWS.S3MultipartUpload s3Upload = null;
try
{
// ######################################################################################
// 1. Initialize Amazon S3 Client
if (uploadedBytes == 0)
{
HttpContext.Current.Session["s3-upload"] = new iView.Utilities.AWS.S3MultipartUpload(fileName, awsBucket);
s3Upload = (iView.Utilities.AWS.S3MultipartUpload)HttpContext.Current.Session["s3-upload"];
string msg = System.String.Format("Upload started: {0} ({1:N0} MB)", s3Upload.DestinationFilename, (fileSize / 1024 / 1024));
this.Log(msg);
}
// cast current session object
if (s3Upload == null)
s3Upload = (iView.Utilities.AWS.S3MultipartUpload)HttpContext.Current.Session["s3-upload"];
// ######################################################################################
// 2. Send Chunks
s3Upload.UploadPartToCloud(inputStream, uploadedBytes, maxChunkSize);
// ######################################################################################
// 3. Complete Upload
if (uploadedBytes + maxChunkSize > fileSize)
{
string completeRequest = s3Upload.CompletePartToCloud();
this.Log(completeRequest); // log S3 response
s3Upload.Dispose(); // dispose all objects
HttpContext.Current.Session["s3-upload"] = null; // we don't need this anymore
}
}
catch (System.Exception ex)
{
if (ex.InnerException != null)
while (ex.InnerException != null)
ex = ex.InnerException;
this.Log(string.Format("{0}\n\n{1}", ex.Message, ex.StackTrace)); // log error
s3Upload.AbortPartToCloud(); // abort current upload
s3Upload.Dispose(); // dispose all objects
statuses.Add(new FilesStatus(ex.Message));
return;
}
statuses.Add(new FilesStatus(s3Upload.DestinationFilename, fileSize, ""));
}
Keep in mind that to have a Session object inside a Generic Handler, you need to implement IRequiresSessionState so your handler will look like:
public class UploadHandlerSimple : IHttpHandler, IRequiresSessionState
Inside fileupload.js (under _initXHRData) I have added an extra header called X-Max-Chunk-Size so I can pass this to Amazon and calculate if it's the last part of the uploaded file.
Feel free to comment and make smart edits for everyone to use.
I guess you didn't set the content-length of the part inside the UploadPartToCloud() function.

c# If statement always returns false

I'm currently developing a small server/client application for a personal project. The server is meant to be run in Linux under Mono while the client is being run from a Windows PC.
I have an issue where I am passing data to the server (in this case the string values "on" and "off"); however, an if statement is always returning a false value.
The code from the server is as follows:
public void startServer() {
TcpListener listen = new TcpListener(serverIP, 10000);
listen.Start();
Console.Clear();
Console.WriteLine("IP Address = {0}", serverIP.ToString());
Console.WriteLine("Server is listening for connection");
Socket a = listen.AcceptSocket();
Console.WriteLine("Connection from {0}", a.RemoteEndPoint);
ASCIIEncoding respond = new ASCIIEncoding();
a.Send(respond.GetBytes("You are connected to LED Control Server"));
bool keepListening = true;
while (keepListening) {
byte[] b = new byte[1000];
a.Receive(b);
Console.WriteLine("Message Received from {0}, string: {1}", a.RemoteEndPoint, recData(b));
string serverMsg = procIns(recData(b));
a.Send(respond.GetBytes(serverMsg));
}
}
private string recData(byte[] b) // receiving data from the client
{
int count = b.Length;
string message = "";
for (int a = 0; a < count; a++) {
message += Convert.ToChar(b[a]);
}
return message;
}
private string procIns(string instruction) {
string returnMsg;
Console.WriteLine(instruction.ToLower());
if (instruction.ToLower() == "on") {
FileGPIO gpio = new FileGPIO();
gpio.OutputPin(FileGPIO.enumPIN.gpio17, true);
returnMsg = "GPIO 17 1";
} else if (instruction.ToLower() == "off") {
FileGPIO gpio = new FileGPIO();
gpio.OutputPin(FileGPIO.enumPIN.gpio17, false);
returnMsg = "GPIO 17 0";
} else {
returnMsg = "Invalid Command";
}
return returnMsg;
}
The cause of the if statement in the procIns method returning false is escaping me; if anyone could offer any advice, I would appreciate it!
I would guess it would have to be padding in the buffer. Try this instead...
if (instruction.Trim().ToLower() == "on")
while (keepListening) {
byte[] b = new byte[1000];
int bytesRcvd = a.Receive(b);
Console.WriteLine("Message Received from {0}, string: {1}", a.RemoteEndPoint, recData(b));
string serverMsg = procIns(recData(b, bytesRcvd ));
a.Send(respond.GetBytes(serverMsg));
}
The a.Receive(b) method returns the number of bytes received. You can store the value in a variable and pass it to the recData method.
private string recData(byte[] b, int bytesRcvd) {
string message = "";
for (int a = 0; a < bytesRcvd; a++) {
message += Convert.ToChar(b[a]);
}
return message;
}
The number of bytes received helps in truncating the extra (unused) values in the byte array.
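As a minor variation on the same idea, Encoding.ASCII can decode just the received bytes directly; a sketch, assuming the same bytesRcvd value from a.Receive(b) and a using System.Text; directive:
private string recData(byte[] b, int bytesRcvd)
{
    // Decode only the bytes that actually arrived, so the trailing zero bytes
    // of the 1000-byte buffer never end up in the string.
    return Encoding.ASCII.GetString(b, 0, bytesRcvd);
}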

Detecting if internet connection is busy

We are developing an application that will be installed on a PC and will perform some background uploads and downloads to/from our server. One of the requirements is to detect whether the internet connection is currently busy (say above 50% utilization); if it is, it needs to back off and try another time. The main reason is to ensure the app does not interfere with the user experience if they are in the middle of gaming, watching an online movie or aggressively downloading files.
After much thinking and research on Google and of course SO, I still haven't found a good way to implement this, so I decided to throw it out here. The application is implemented in C# on .NET 4.0, and I am looking for all forms of responses - an implementation in C# or other languages, pseudo-logic, or an approach for measuring internet traffic utilization on the local PC with good enough accuracy.
To avoid duplication of effort, here is what I have tried so far (and why it isn't suitable):
Use WMI to get network statistics. Most SO posts and solutions out there seem to refer to this as the approach, but it doesn't meet our requirement: measuring bytes sent/received against the network interface capacity (e.g. a 1 Gbps Ethernet card) yields a good utilisation measure for LAN traffic, but not for internet traffic (where the actual internet bandwidth might only be, say, 8 Mbps).
Use the .NET network information statistics or performance counters - these yield similar readings to the above and hence have the same shortcomings (a short sketch of this approach follows the list).
Use ICMP (ping) and measure RTT. It was suggested that a 400 ms RTT is considered slow and a good indication of a busy network; however, I was told that users on a modem (yes, we have to support that), behind a reverse proxy, or on a microwave link often get pings above that, so it is not a good measure.
Start downloading a known file and measure the speed - this itself generates traffic, which we are trying to avoid; also, if this check is done often enough, our application will end up creating a lot of internet traffic, which again is not ideal.
MOD: Using BITS - this service can be disabled on the user's PC, requires group policy changes, and assumes the server is IIS (with custom configuration); in our case our server is not IIS.
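For reference, the interface-counter approach from the second item can be sketched with System.Net.NetworkInformation. The one-second sampling interval and the choice of the first operational non-loopback adapter are assumptions of mine, and, as noted above, this measures utilisation of the local link rather than of the actual internet connection.
using System;
using System.Linq;
using System.Net.NetworkInformation;
using System.Threading;

class NicUtilisationSample
{
    static void Main()
    {
        // Pick the first adapter that is up and is not the loopback interface.
        NetworkInterface nic = NetworkInterface.GetAllNetworkInterfaces()
            .First(n => n.OperationalStatus == OperationalStatus.Up &&
                        n.NetworkInterfaceType != NetworkInterfaceType.Loopback);

        // Sample the byte counters one second apart.
        IPv4InterfaceStatistics before = nic.GetIPv4Statistics();
        Thread.Sleep(1000);
        IPv4InterfaceStatistics after = nic.GetIPv4Statistics();

        long bitsPerSecond = ((after.BytesReceived - before.BytesReceived) +
                              (after.BytesSent - before.BytesSent)) * 8;

        // nic.Speed is the link speed in bits per second, so this is link utilisation,
        // not internet utilisation - the shortcoming described in the list above.
        double utilisation = (double)bitsPerSecond / nic.Speed;
        Console.WriteLine("Link utilisation over the last second: {0:P1}", utilisation);
    }
}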
So here it is: I'm all confused and looking for some advice. I highlighted the question text so that you guys don't get lost reading this and wondering what the question is. Thanks.
You could use UPnP to query the router and retrieve the number of bytes sent and received over the network. You could keep checking this value on the router to determine what the activity is. Unfortunately this functionality doesn't seem to be well documented, but it is possible to implement UPnP communication within a C# application. You will need to use UDP to query for the router (UPnP discover), and once you have found the device, query its functionality, and then query the number of packets sent and received for the Internet Gateway Device using a WebClient (TCP).
Code for a UPnP library:
using System;
using System.Collections.Generic;
using System.Text;
using System.Net.Sockets;
using System.Net;
using System.Xml;
using System.IO;
namespace UPNPLib
{
public class RouterElement
{
public RouterElement()
{
}
public override string ToString()
{
return Name;
}
public List<RouterElement> children = new List<RouterElement>();
public RouterElement parent;
public string Name;
public string Value;
public RouterElement this[string name] {
get
{
foreach (RouterElement et in children)
{
if (et.Name.ToLower().Contains(name.ToLower()))
{
return et;
}
}
foreach (RouterElement et in children)
{
Console.WriteLine(et.Name);
}
throw new KeyNotFoundException("Unable to find the specified entry");
}
}
public RouterElement(XmlNode node, RouterElement _parent)
{
Name = node.Name;
if (node.ChildNodes.Count
/// Gets the root URL of the device
///
///
public static string GetRootUrl()
{
StringBuilder mbuilder = new StringBuilder();
mbuilder.Append("M-SEARCH * HTTP/1.1\r\n");
mbuilder.Append("HOST: 239.255.255.250:1900\r\n");
mbuilder.Append("ST:upnp:rootdevice\r\n");
mbuilder.Append("MAN:\"ssdp:discover\"\r\n");
mbuilder.Append("MX:3\r\n\r\n");
UdpClient mclient = new UdpClient();
byte[] dgram = Encoding.ASCII.GetBytes(mbuilder.ToString());
mclient.Send(dgram,dgram.Length,new IPEndPoint(IPAddress.Broadcast,1900));
IPEndPoint mpoint = new IPEndPoint(IPAddress.Any, 0);
rootsearch:
dgram = mclient.Receive(ref mpoint);
string mret = Encoding.ASCII.GetString(dgram);
string orig = mret;
mret = mret.ToLower();
string url = orig.Substring(mret.IndexOf("location:") + "location:".Length, mret.IndexOf("\r", mret.IndexOf("location:")) - (mret.IndexOf("location:") + "location:".Length));
WebClient wclient = new WebClient();
try
{
Console.WriteLine("POLL:" + url);
string reply = wclient.DownloadString(url);
if (!reply.ToLower().Contains("router"))
{
goto rootsearch;
}
}
catch (Exception)
{
goto rootsearch;
}
return url;
}
public static RouterElement enumRouterFunctions(string url)
{
XmlReader mreader = XmlReader.Create(url);
XmlDocument md = new XmlDocument();
md.Load(mreader);
XmlNodeList rootnodes = md.GetElementsByTagName("serviceList");
RouterElement elem = new RouterElement();
foreach (XmlNode et in rootnodes)
{
RouterElement el = new RouterElement(et, null);
elem.children.Add(el);
}
return elem;
}
public static RouterElement getRouterInformation(string url)
{
XmlReader mreader = XmlReader.Create(url);
XmlDocument md = new XmlDocument();
md.Load(mreader);
XmlNodeList rootnodes = md.GetElementsByTagName("device");
return new RouterElement(rootnodes[0], null);
}
}
public class RouterMethod
{
string url;
public string MethodName;
string parentname;
string MakeRequest(string URL, byte[] data, string[] headers)
{
Uri mri = new Uri(URL);
TcpClient mclient = new TcpClient();
mclient.Connect(mri.Host, mri.Port);
Stream mstream = mclient.GetStream();
StreamWriter textwriter = new StreamWriter(mstream);
textwriter.Write("POST "+mri.PathAndQuery+" HTTP/1.1\r\n");
textwriter.Write("Connection: Close\r\n");
textwriter.Write("Content-Type: text/xml; charset=\"utf-8\"\r\n");
foreach (string et in headers)
{
textwriter.Write(et + "\r\n");
}
textwriter.Write("Content-Length: " + (data.Length).ToString()+"\r\n");
textwriter.Write("Host: " + mri.Host+":"+mri.Port+"\r\n");
textwriter.Write("\r\n");
textwriter.Flush();
Stream reqstream = mstream;
reqstream.Write(data, 0, data.Length);
reqstream.Flush();
StreamReader reader = new StreamReader(mstream);
while (reader.ReadLine().Length > 2)
{
}
return reader.ReadToEnd();
}
public RouterElement Invoke(string[] args)
{
MemoryStream mstream = new MemoryStream();
StreamWriter mwriter = new StreamWriter(mstream);
//TODO: Implement argument list
string arglist = "";
mwriter.Write("" + "" + "");
mwriter.Write("");//" + arglist + "");
mwriter.Write("");
mwriter.Flush();
List<string> headers = new List<string>();
headers.Add("SOAPAction: \"" + parentschema + "#" + MethodName + "\"");
mstream.Position = 0;
byte[] dgram = new byte[mstream.Length];
mstream.Read(dgram, 0, dgram.Length);
XmlDocument mdoc = new XmlDocument();
string txt = MakeRequest(url, dgram, headers.ToArray());
mdoc.LoadXml(txt);
try
{
RouterElement elem = new RouterElement(mdoc.ChildNodes[0], null);
return elem["Body"].children[0];
}
catch (Exception er)
{
RouterElement elem = new RouterElement(mdoc.ChildNodes[1], null);
return elem["Body"].children[0];
}
}
public List<string> parameters = new List<string>();
string baseurl;
string parentschema;
public RouterMethod(string svcurl, RouterElement element,string pname, string baseURL, string svcpdsc)
{
parentschema = svcpdsc;
baseurl = baseURL;
parentname = pname;
url = svcurl;
MethodName = element["name"].Value;
try
{
foreach (RouterElement et in element["argumentList"].children)
{
parameters.Add(et.children[0].Value);
}
}
catch (KeyNotFoundException)
{
}
}
}
public class RouterService
{
string url;
public string ServiceName;
public List<RouterMethod> methods = new List<RouterMethod>();
public RouterMethod GetMethodByNonCaseSensitiveName(string name)
{
foreach (RouterMethod et in methods)
{
if (et.MethodName.ToLower() == name.ToLower())
{
return et;
}
}
throw new KeyNotFoundException();
}
public RouterService(RouterElement element, string baseurl)
{
ServiceName = element["serviceId"].Value;
url = element["controlURL"].Value;
WebClient mclient = new WebClient();
string turtle = element["SCPDURL"].Value;
if (!turtle.ToLower().Contains("http"))
{
turtle = baseurl + turtle;
}
Console.WriteLine("service URL " + turtle);
string axml = mclient.DownloadString(turtle);
XmlDocument mdoc = new XmlDocument();
if (!url.ToLower().Contains("http"))
{
url = baseurl + url;
}
mdoc.LoadXml(axml);
XmlNode mainnode = mdoc.GetElementsByTagName("actionList")[0];
RouterElement actions = new RouterElement(mainnode, null);
foreach (RouterElement et in actions.children)
{
RouterMethod method = new RouterMethod(url, et,ServiceName,baseurl,element["serviceType"].Value);
methods.Add(method);
}
}
}
}
Code for a bandwidth meter:
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Text;
using System.Windows.Forms;
using UPNPLib;
using System.IO;
namespace bandwidthmeter
{
public partial class Form1 : Form
{
public Form1()
{
InitializeComponent();
BinaryReader mreader = new BinaryReader(File.Open("bandwidthlog.txt", FileMode.OpenOrCreate));
if (mreader.BaseStream.Length > 0)
{
prevsent = mreader.ReadInt64();
prevrecv = mreader.ReadInt64();
}
mreader.Close();
List<RouterService> services = new List<RouterService>();
string fullurl = UPNP.GetRootUrl();
RouterElement router = UPNP.enumRouterFunctions(fullurl);
Console.WriteLine("Router feature enumeration complete");
foreach (RouterElement et in router.children)
{
services.Add(new RouterService(et.children[0], fullurl.Substring(0, fullurl.IndexOf("/", "http://".Length+1))));
}
getReceiveDelegate = services[1].GetMethodByNonCaseSensitiveName("GetTotalBytesReceived");
getSentDelegate = services[1].GetMethodByNonCaseSensitiveName("GetTotalBytesSent");
Console.WriteLine("Invoking " + getReceiveDelegate.MethodName);
//Console.WriteLine(services[1].GetMethodByNonCaseSensitiveName("GetTotalPacketsSent").Invoke(null));
Timer mymer = new Timer();
mymer.Tick += new EventHandler(mymer_Tick);
mymer.Interval = 1000;
mymer.Start();
FormClosed += new FormClosedEventHandler(Form1_FormClosed);
}
long prevsent = 0;
long prevrecv = 0;
void Form1_FormClosed(object sender, FormClosedEventArgs e)
{
BinaryWriter mwriter = new BinaryWriter(File.Open("bandwidthlog.txt", FileMode.OpenOrCreate));
mwriter.Write(getsent());
mwriter.Write(getreceived());
mwriter.Flush();
mwriter.Close();
}
long getsent()
{
long retval = Convert.ToInt64(getSentDelegate.Invoke(null).children[0].Value);
if (prevsent > retval)
{
retval = prevsent + retval;
}
return retval;
}
long getreceived()
{
long retval = Convert.ToInt64(getReceiveDelegate.Invoke(null).children[0].Value);
if (prevrecv > retval)
{
retval = prevrecv + retval;
}
return retval;
}
void mymer_Tick(object sender, EventArgs e)
{
label1.Text = "Sent: "+(getsent()/1024/1024).ToString()+"MB\nReceived: "+(getreceived()/1024/1024).ToString()+"MB";
}
RouterMethod getSentDelegate;
RouterMethod getReceiveDelegate;
}
}
Have you considered using the Background Intelligent Transfer Service (BITS)? It's designed to do this job already:
Background Intelligent Transfer Service (BITS) transfers files (downloads or uploads) between a client and server and provides progress information related to the transfers. You can also download files from a peer.
and,
Preserve the responsiveness of other network applications.
I'm not sure if there's a managed interface to it (I can see references to PowerShell cmdlets), so you might have to use COM interop to use it.
Making the assumption that you are targeting Windows PCs (as you said you were developing in C#), have you looked at BITS, the Background Intelligent Transfer Service?
There's examples of how to hook into it using C# on MSDN and elsewhere, e.g. http://msdn.microsoft.com/en-us/magazine/cc188766.aspx

Azure ServiceBus returns null on Client.Receive()

I have a problem receiving messages from a queue I have set up in Azure.
I have done this successfully using the same code before, but now I just get null when I try to fetch messages.
When I view the queue in the Azure management console, I clearly see that the queue contains 5 messages.
Here is the code:
ServiceBus SB = new ServiceBus();
Microsoft.ServiceBus.Messaging.BrokeredMessage message;
while (true)
{
message = SB.ReceiveMessage("orders");
if (message == null)
{
break;
}
Procurement.Order order = message.GetBody<Procurement.Order>();
order.id = Guid.NewGuid().ToString();
order.remindercount = 0;
using (DbManager db = new DbManager())
{
if (db.SetSpCommand("CreateOrderHead",
db.Parameter("#companyId", order.companyId),
db.Parameter("#orderId", order.orderId),
db.Parameter("#suppliercode", order.suppliercode),
db.Parameter("#supplierorderId", order.supplierorderId),
db.Parameter("#orderdate", order.orderdate),
db.Parameter("#desireddate", order.desireddate),
db.Parameter("#ordertext", order.ordertext),
db.Parameter("#name", order.name),
db.Parameter("#street", order.street),
db.Parameter("#zip", order.zip),
db.Parameter("#city", order.city),
db.Parameter("#country", order.country),
db.Parameter("#countrycode", order.countrycode),
db.Parameter("#deliveryterms", order.deliveryterms),
db.Parameter("#reference", order.reference),
db.Parameter("#deliveryinstruction", order.deliveryinstruction),
db.Parameter("#id", order.id),
db.Parameter("#partycode", order.partyCode)
).ExecuteNonQuery() == 1)
{
message.Complete();
message = null;
}
db.SetSpCommand("DeleteOrderRows",
db.Parameter("#orderid", order.orderId),
db.Parameter("#companyId", order.companyId)
).ExecuteNonQuery();
foreach (Procurement.Orderrow r in order.Orderrows)
{
db.SetSpCommand("CreateOrderRow",
db.Parameter("#companyId", r.companyId),
db.Parameter("#orderId", r.orderId),
db.Parameter("#orderrowId", r.orderrowId),
db.Parameter("#itemId", r.itemId),
db.Parameter("#itemdesc", r.itemdesc),
db.Parameter("#orderqty", r.orderqty),
db.Parameter("#desireddate", r.desireddate),
db.Parameter("#rowtext", r.rowtext),
db.Parameter("#supplieritemId", r.supplieritemId),
db.Parameter("#unit", r.unit),
db.Parameter("#id", order.id),
db.Parameter("#unitprice", r.unitprice),
db.Parameter("#rowprice", r.rowprice)
).ExecuteNonQuery();
}
}
}
Thread.Sleep(new TimeSpan(0, 1, 0));
And this is the ServiceBus class:
public class ServiceBus
{
TokenProvider TokenProvider;
MessagingFactory Factory;
public ServiceBus()
{
TokenProvider = TokenProvider.CreateSharedSecretTokenProvider(GetIssuerName(), GetSecret());
Factory = MessagingFactory.Create(
GetURINameSpace(),
TokenProvider
);
}
public void SendMessage(string queue, BrokeredMessage message)
{
var client = Factory.CreateQueueClient(queue);
client.Send(message);
}
public BrokeredMessage ReceiveMessage(string queue)
{
var client = Factory.CreateQueueClient(queue, ReceiveMode.ReceiveAndDelete);
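// Note: ReceiveAndDelete removes the message from the queue as soon as it is received,
// so there is no lock to settle and the message.Complete() call in the processing loop above is not needed in this mode.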
BrokeredMessage message = client.Receive();
return message;
}
private static Uri GetURINameSpace()
{
return ServiceBusEnvironment.CreateServiceUri("sb", GetNamespace(), string.Empty);
}
private static string GetNamespace()
{
return "Namespace i have verified its the right one";
}
private static string GetIssuerName()
{
return "Issuer i have verified its the right one";
}
private static string GetSecret()
{
return "Key i have verified its the right one";
}
}
I think this should be pretty straightforward, but I can't figure out what I'm doing wrong.
It's probably something small that I'm missing...
Anyway, thanks in advance!
The BrokeredMessages you see in your SubscriptionDescription.MessageCount are not just regular messages; the count also includes the messages in the $DeadLetterQueue sub-queue!
Use this code snippet to retrieve all messages from that sub-queue and print out their details. Replace [topic] and [subscription] with your actual ones:
MessagingFactory msgFactory = MessagingFactory.Create(_uri, _tokenProvider);
MessageReceiver msgReceiver = msgFactory.CreateMessageReceiver("[topic]/subscriptions/[subscription]/$DeadLetterQueue", ReceiveMode.PeekLock);
while (true)
{
BrokeredMessage msg = msgReceiver.Receive();
if (msg != null)
{
Console.WriteLine("Deadlettered message.");
Console.WriteLine("MessageId: {0}", msg.MessageId);
Console.WriteLine("DeliveryCount: {0}", msg.DeliveryCount);
Console.WriteLine("EnqueuedTimeUtc: {0}", msg.EnqueuedTimeUtc);
Console.WriteLine("Size: {0} bytes", msg.Size);
Console.WriteLine("DeadLetterReason: {0}",
msg.Properties["DeadLetterReason"]);
Console.WriteLine("DeadLetterErrorDescription: {0}",
msg.Properties["DeadLetterErrorDescription"]);
Console.WriteLine();
msg.Complete();
}
}
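Since the question reads from a plain queue named "orders" rather than a topic subscription, the same dead-letter check can be pointed at the queue's sub-queue. A small sketch, reusing msgFactory from the snippet above; the five-second receive timeout is an arbitrary choice:
// The dead-letter sub-queue of a queue lives at "<queue>/$DeadLetterQueue".
string deadLetterPath = QueueClient.FormatDeadLetterPath("orders");
MessageReceiver dlqReceiver = msgFactory.CreateMessageReceiver(deadLetterPath, ReceiveMode.PeekLock);
BrokeredMessage dlqMsg = dlqReceiver.Receive(TimeSpan.FromSeconds(5)); // null if the sub-queue is empty
if (dlqMsg != null)
{
    Console.WriteLine("DeadLetterReason: {0}", dlqMsg.Properties["DeadLetterReason"]);
    dlqMsg.Complete();
}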
The solution to this problem: it was either a bug in the Azure management portal making it show the wrong number of messages in the queue, or the messages somehow got flagged so that they would not be read.
In other words, it worked all along; I just had to add some new messages to the queue.
