I have almost 200 text files on my laptop. I wrote C# code that reads these text files line by line and creates a directory on an FTP server for each line.
This is my code:
static void Main()
{
    for (int i = 0; i <= 200; i++)
    {
        var lines = File.ReadAllLines(@"D:\file_" + i.ToString().PadLeft(5, '0') + ".txt");
        foreach (var line in lines)
        {
            try
            {
                WebRequest request = WebRequest.Create("ftp://myftp/dir/" + line);
                request.Method = WebRequestMethods.Ftp.MakeDirectory;
                request.Credentials = new NetworkCredential("user", "pass");
                request.GetResponse();
            }
            catch (Exception ex)
            { }
        }
    }
}
But this is very slow at creating directories. Are there faster ways to do this? For example, reading the whole text file into an array and then creating all of its directories at once.
Reading of the text file is really not the problem. The slow part is the FTP.
Use more threads to parallelize the processing:
List<Task> tasks = new List<Task>();
for (int i = 0; i <= 200; i++)
{
    int index = i; // copy the loop variable; capturing i directly is a bug, as it changes under the lambda
    tasks.Add(Task.Run(() =>   // Task.Run starts the task; a bare "new Task" is never started, so WaitAll would hang
    {
        var lines =
            File.ReadAllLines(@"D:\file_" + index.ToString().PadLeft(5, '0') + ".txt");
        foreach (var line in lines)
        {
            WebRequest request = WebRequest.Create("ftp://myftp/dir/" + line);
            request.Method = WebRequestMethods.Ftp.MakeDirectory;
            request.Credentials = new NetworkCredential("user", "pass");
            request.GetResponse();
        }
    }));
}
Task.WaitAll(tasks.ToArray());
Though note that you should also take care of disposing the WebResponse instances.
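A minimal sketch of that pattern (same request setup as above; the using block ensures the FTP connection is released even though MakeDirectory returns no body to read):

WebRequest request = WebRequest.Create("ftp://myftp/dir/" + line);
request.Method = WebRequestMethods.Ftp.MakeDirectory;
request.Credentials = new NetworkCredential("user", "pass");
using (var response = request.GetResponse())
{
    // nothing to consume for MakeDirectory; Dispose closes the response
}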
The slow part of your program is the requests you send one by one.
You can use a few tricks to speed them up:
// allow more connections at a time
ServicePointManager.DefaultConnectionLimit = 30;
// don't wait the ~100 ms that the Expect: 100-continue handshake adds to every request
ServicePointManager.Expect100Continue = false;
Further, you can send the requests from multiple threads, so you don't have to wait for each request to finish before starting the next. But be aware that too many concurrent requests can bring down a server; 200 shouldn't be a problem.
Here is some code you could test:
static void Main()
{
    // allow more connections at a time
    ServicePointManager.DefaultConnectionLimit = 30;
    // don't wait the ~100 ms that the Expect: 100-continue handshake adds to every request
    ServicePointManager.Expect100Continue = false;

    List<Task> tasks = new List<Task>();
    for (int i = 0; i <= 200; i++)
    {
        var lines = File.ReadAllLines(@"D:\file_" + i.ToString().PadLeft(5, '0') + ".txt");
        foreach (var line in lines)
        {
            tasks.Add(Task.Run(() =>
            {
                try
                {
                    WebRequest request = WebRequest.Create("ftp://myftp/dir/" + line);
                    request.Method = WebRequestMethods.Ftp.MakeDirectory;
                    request.Credentials = new NetworkCredential("user", "pass");
                    using (request.GetResponse()) { } // dispose the response to free the connection
                }
                catch (Exception ex)
                { }
            }));
        }
    }
    Task.WaitAll(tasks.ToArray());
}
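If the files contain many lines, the code above creates one task per directory, which may be more concurrency than the server tolerates. A hedged sketch of one way to cap the number of in-flight requests with SemaphoreSlim (the limit of 10 is an assumption; tune it to your server):

var throttle = new SemaphoreSlim(10); // assumed cap on concurrent requests
var tasks = new List<Task>();
foreach (var line in lines)
{
    tasks.Add(Task.Run(async () =>
    {
        await throttle.WaitAsync(); // wait for a free slot before sending
        try
        {
            WebRequest request = WebRequest.Create("ftp://myftp/dir/" + line);
            request.Method = WebRequestMethods.Ftp.MakeDirectory;
            request.Credentials = new NetworkCredential("user", "pass");
            using (await request.GetResponseAsync()) { } // disposing frees the connection
        }
        finally
        {
            throttle.Release(); // free the slot even if the request failed
        }
    }));
}
Task.WaitAll(tasks.ToArray());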
I have a program that sends/receives POST requests/responses from an online API, similar to a market bot that buys and sells a commodity. It works great; however, when I run it, it locks up the current thread and I'm unable to use anything else in the program. In the future I would also like to make buying and selling asynchronous, so they could happen at the same time. Here is the code that executes; as you can see, it loops continuously until conditions are met:
private void RunBot()
{
    numToBuy = (int)nudNumToBuy.Value;
    for (int i = 0; i < numToBuy; i++)
    {
        while (true)
        {
            if (AttachAndBuy())
            {
                break;
            }
        }
    }
}
private bool AttachAndBuy()
{
    string data = "<DATA HERE>";
    string URL = "<URL HERE>";
    Cookies.SetCookies(cookie, "PHPSESSID=" + SessionIDTextBox.Text.Replace("PHPSESSID=", ""));
    HttpWebRequest Request = (HttpWebRequest)WebRequest.Create(URL);
    Request.ContentType = "application/json";
    Request.Accept = "*/*";
    Request.CookieContainer = Cookies;
    Request.Host = "<HOST HERE>";
    Request.Method = "POST";
    Request.UserAgent = "<USER AGENT HERE>";
    Request.Headers.Add("<HEADER>", "<VALUE>");
    Request.KeepAlive = true;
    byte[] CompressedRequest = SimpleZlib.CompressToBytes(Encoding.UTF8.GetBytes(data), 9);
    Stream RequestStream = Request.GetRequestStream();
    RequestStream.Write(CompressedRequest, 0, CompressedRequest.Length);
    RequestStream.Flush();
    Stream CompressedResponseStream = Request.GetResponse().GetResponseStream();
    byte[] CompressedResponseData = ReadToEnd(CompressedResponseStream);
    string DecompressedResponseData = SimpleZlib.Decompress(CompressedResponseData, null);
    OffersResponse Return = Json.Deserialize<OffersResponse>(DecompressedResponseData);
    int LowestCost = 1000000000;
    Offer BestOffer = new Offer();
    foreach (Offer CurrentOffer in Return.data.offers)
    {
        bool moneyOffer = false;
        int Costs = CurrentOffer.requirementsCost;
        string id = CurrentOffer._id;
        foreach (Requirement CurrentRequirement in CurrentOffer.requirements)
        {
            if (CurrentRequirement._tpl == "<TEMPLATE ID HERE>")
            {
                moneyOffer = true;
            }
        }
        if (moneyOffer == false)
        {
            continue;
        }
        if (Costs < LowestCost)
        {
            LowestCost = Costs;
            BestOffer = CurrentOffer;
        }
    }
    BestOfferID = BestOffer._id;
    BestOfferCost = LowestCost;
    string MoneyID = getStack(BestOfferCost);
    while (true)
    {
        BuyRequestAttemptCounter++;
        if (LowestCost > 140000)
        {
            AddLog("No Suitable Item! Skipping! Lowest Item Cost: " + LowestCost.ToString());
            return false;
        }
        else
            AddLog("Best Item Cost: " + LowestCost.ToString() + " | ID: " + BestOfferID);
        int Result = buyOrder(MoneyID);
        if (Result == 0)
        {
            //log info for averaging
            numberPurchased++;
            TotalCost += BestOfferCost;
            averageCost = TotalCost / numberPurchased;
            lblNumPurchased.Text = numberPurchased.ToString();
            lblAverageCost.Text = averageCost.ToString();
            lstPricesPurchased.Items.Add(LowestCost.ToString());
            AddLog("====================================");
            AddLog("Number Purchased: " + numberPurchased);
            AddLog("Average Cost: " + averageCost);
            AddLog("====================================");
            System.Media.SystemSounds.Exclamation.Play();
            return true;
        }
        else if (Result == 1)
            return false;
        else if (Result == 2)
            continue;
        else
            return false;
    }
}
private int buyOrder(string MoneyID)
{
    string data = "<DATA HERE>";
    string URL = "<URL HERE>";
    Cookies.SetCookies(cookie, "PHPSESSID=" + SessionIDTextBox.Text.Replace("PHPSESSID=", ""));
    HttpWebRequest Request = (HttpWebRequest)WebRequest.Create(URL);
    Request.ContentType = "application/json";
    Request.Accept = "*/*";
    Request.CookieContainer = Cookies;
    Request.Host = "<HOST HERE>";
    Request.Method = "POST";
    Request.UserAgent = "<USER AGENT HERE>";
    Request.Headers.Add("<HEADER>", "<VALUE>");
    Request.KeepAlive = true;
    byte[] CompressedRequest = SimpleZlib.CompressToBytes(Encoding.UTF8.GetBytes(data), 9);
    Stream RequestStream = Request.GetRequestStream();
    RequestStream.Write(CompressedRequest, 0, CompressedRequest.Length);
    RequestStream.Flush();
    Stream CompressedResponseStream = Request.GetResponse().GetResponseStream();
    byte[] CompressedResponseData = ReadToEnd(CompressedResponseStream);
    string DecompressedResponseData = SimpleZlib.Decompress(CompressedResponseData, null);
    ResponseRoot Return = Json.Deserialize<ResponseRoot>(DecompressedResponseData);
    string returnErrorCode = DecompressedResponseData.ToString();
    //AddLog(DecompressedResponseData);
    if (Return.err == 0 && returnErrorCode.Contains("id"))
    {
        System.Windows.Forms.Clipboard.SetText(DecompressedResponseData);
        //AddLog("Successful Purchase!");
        return 0;
    }
    else if (returnErrorCode.Contains("1503"))
    {
        //AddLog("Failed with 1503!");
        return 1;
    }
    else if (returnErrorCode.Contains("1512"))
    {
        // AddLog("Failed with 1512!");
        return 2;
    }
    return 3;
}
As stated above, ideally I would like to run both the AttachAndBuy and buyOrder functions at the same time, and eventually I will add a sell function, also running concurrently. Is this possible? Thanks.
You can fix this by creating a list of tasks and starting them, so the calls run in the background instead of one after another on the UI thread, e.g. -
var tasks = new List<Task>();
for (int i = 0; i < numToBuy; i++)
{
    var task = new Task(() =>
    {
        AttachAndBuy(); // the original was missing this semicolon
    });
    tasks.Add(task);
    task.Start();
}
// Note: Task.WaitAll still blocks the calling thread; in a UI event
// handler you would "await Task.WhenAll(tasks)" instead.
Task.WaitAll(tasks.ToArray());
(Note: I've not actually tested this code, it's just a rough example)
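If the goal is to keep the form usable while the bot runs, a hedged alternative is to make the click handler async and push the existing loop onto the thread pool. A rough sketch against the method names above (btnRun_Click is a placeholder handler name, and like the example above it is untested):

private async void btnRun_Click(object sender, EventArgs e)
{
    numToBuy = (int)nudNumToBuy.Value;
    // Run the blocking bot loop off the UI thread and await it,
    // so the UI stays responsive until it finishes.
    await Task.Run(() =>
    {
        for (int i = 0; i < numToBuy; i++)
        {
            while (!AttachAndBuy()) { }
        }
    });
}

One caveat: AttachAndBuy touches UI controls (AddLog, lblNumPurchased, ...), which is only legal on the UI thread; those updates would need Invoke/BeginInvoke or progress reporting.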
(The suggestion below was based on a misunderstanding of the question -)
You'll want to use some parallel programming for this - https://learn.microsoft.com/en-us/dotnet/standard/parallel-programming/task-based-asynchronous-programming?redirectedfrom=MSDN within your for loop e.g. -
private void RunBot()
{
    numToBuy = (int)nudNumToBuy.Value;
    for (int i = 0; i < numToBuy; i++)
    {
        while (true)
        {
            Parallel.Invoke(() => AttachAndBuy(), () => BuyOrder());
        }
    }
}
To put things into context, imagine a program that retrieves the size of a file from an FTP server using FtpWebRequest.BeginGetResponse(). If it fails to get the file size due to some connection error, it tries again after some time. It also allows the user to cancel the whole operation.
The problem occurs after retrying a specific amount of times: in case FtpWebRequest.BeginGetResponse() fails to connect more than ServicePointManager.DefaultConnectionLimit times on the same endpoint, subsequent callbacks won't be called, and wait handles will block indefinitely.
Using TCPView, one may see that no new connection attempt is made after the first ServicePointManager.DefaultConnectionLimit (failed) connection attempts.
Let me demonstrate the problem with some contrived examples.
Assuming,
.NET 4.6
ServicePointManager.DefaultConnectionLimit == 2
192.168.1.2 is not the IP address of the machine running the examples.
Either 192.168.1.2 is not reachable or it has no service running at port 21.
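To make the examples deterministic regardless of the runtime's defaults, the limit can also be pinned explicitly at the top of Main (an addition for clarity, not part of the original repro):

ServicePointManager.DefaultConnectionLimit = 2;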
The synchronous version,
static void Main(string[] args){
    var uri = new Uri("ftp://192.168.1.2/test.txt");
    for (var z = 0; z < ServicePointManager.DefaultConnectionLimit + 1; z += 1){
        try {
            var request = (FtpWebRequest) WebRequest.Create(uri);
            request.Method = WebRequestMethods.Ftp.GetFileSize;
            using (var response = request.GetResponse())
                Console.WriteLine(response.ContentLength);
        } catch (WebException ex){
            Console.WriteLine(ex.Status);
        }
    }
}
terminates with console output,
ConnectFailure
ConnectFailure
ConnectFailure
while,
static void Main(string[] args){
    var uri = new Uri("ftp://192.168.1.2/test.txt");
    for (var z = 0; z < ServicePointManager.DefaultConnectionLimit + 1; z += 1){
        var request = (FtpWebRequest) WebRequest.Create(uri);
        request.Method = WebRequestMethods.Ftp.GetFileSize;
        request.BeginGetResponse(PrintFileSizeCallback, request);
    }
    Console.ReadKey(false);
}

static void PrintFileSizeCallback(IAsyncResult ar){
    try {
        using (var response = ((FtpWebRequest) ar.AsyncState).EndGetResponse(ar))
            Console.WriteLine(response.ContentLength);
    } catch (WebException ex){
        Console.WriteLine(ex.Status);
    }
}
will print,
ConnectFailure
ConnectFailure
and the third callback will never take place.
Even waiting on the async method to complete makes no difference:
static void Main(string[] args){
    var uri = new Uri("ftp://192.168.1.2/test.txt");
    for (var z = 0; z < ServicePointManager.DefaultConnectionLimit + 1; z += 1){
        try {
            var request = (FtpWebRequest) WebRequest.Create(uri);
            request.Method = WebRequestMethods.Ftp.GetFileSize;
            var responseAsyncResult = request.BeginGetResponse(null, null);
            using (var response = request.EndGetResponse(responseAsyncResult))
                Console.WriteLine(response.ContentLength);
        } catch (WebException ex){
            Console.WriteLine(ex.Status);
        }
    }
}
will output,
ConnectFailure
ConnectFailure
and hang.
Identical behaviour is observed if awaiting FtpWebRequest.GetResponseAsync().
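For completeness, the awaited variant is sketched below (assuming .NET 4.5+, where WebRequest exposes GetResponseAsync; Main would invoke it via MainAsync().GetAwaiter().GetResult()). It hangs in exactly the same way on the third iteration:

static async Task MainAsync(){
    var uri = new Uri("ftp://192.168.1.2/test.txt");
    for (var z = 0; z < ServicePointManager.DefaultConnectionLimit + 1; z += 1){
        try {
            var request = (FtpWebRequest) WebRequest.Create(uri);
            request.Method = WebRequestMethods.Ftp.GetFileSize;
            using (var response = await request.GetResponseAsync())
                Console.WriteLine(response.ContentLength);
        } catch (WebException ex){
            Console.WriteLine(ex.Status);
        }
    }
}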
Now for the funny part: If 192.168.1.2 is the IP address of the machine that runs any of the previous examples, or 192.168.1.2 is replaced by a loopback address, the problem vanishes.
Ok, let's make some minor modifications to the previous example:
static void Main(string[] args){
    var uri = new Uri("ftp://192.168.1.2/test.txt");
    for (var z = 0; z < ServicePointManager.DefaultConnectionLimit + 1; z += 1){
        try {
            var request = WebRequest.Create(uri);
            var responseAsyncResult = request.BeginGetResponse(null, null);
            using (var response = request.EndGetResponse(responseAsyncResult))
                Console.WriteLine(response.ContentLength);
        } catch (WebException ex){
            Console.WriteLine(ex.Status);
        }
    }
}
Again, it'll fail like the other ones did, unless the address is local/loopback.
But if the uri scheme is changed to http, all is well...
So, is this a bug or am I doing something wrong?
I have a Windows Forms application that reads a bunch of URLs (~800) from a text file into a List. The application then shows the status code of each URL.
The problem is that if I run a normal for loop from 0 to the list count, it takes a long time. I need to speed the process up as much as possible without blocking the UI. Below is my code:
private async void button1_Click(object sender, EventArgs e)
{
    System.IO.StreamReader file = new System.IO.StreamReader("urls.txt");
    while ((line = file.ReadLine()) != null)
    {
        pages.Add(line);
    }
    file.Close();
    for (int i = 0; i < pages.Count; i++)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(pages[i]);
        int code = 0;
        try
        {
            WebResponse response = await request.GetResponseAsync();
            HttpWebResponse r = (HttpWebResponse)response;
            code = (int)r.StatusCode;
        }
        catch (WebException we)
        {
            var r = ((HttpWebResponse)we.Response).StatusCode;
            code = (int)r;
        }
    }
    //add the url and status code to a datagridview
}
One approach is to use tasks, so that you are not waiting for the previous request to complete before starting the next.
var tasks = new List<Task<int>>(); // keep every task; a single variable would let you await only the last one
for (int i = 0; i < pages.Count; i++)
{
    var page = pages[i]; // copy; capturing the loop variable itself would be a bug
    tasks.Add(Task.Run(() =>
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(page);
        int code = 0;
        try
        {
            WebResponse response = request.GetResponse();
            HttpWebResponse r = (HttpWebResponse)response;
            code = (int)r.StatusCode;
        }
        catch (WebException we)
        {
            // we.Response can be null (e.g. DNS failure), so guard the cast
            var r = we.Response as HttpWebResponse;
            if (r != null)
                code = (int)r.StatusCode;
        }
        return code;
    }));
}
int[] codes = await Task.WhenAll(tasks);
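One caveat: this starts all ~800 requests at once. If that overwhelms the client or the target servers, the requests can be throttled, for example with the SemaphoreSlim pattern sketched in the FTP answer earlier on this page.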
I have a program that accesses a database and downloads images. I was using a BlockingCollection for that purpose. However, to access some UI elements I decided to use a combination of BackgroundWorker and BlockingCollection. This reduced processing speed considerably compared to when only the BlockingCollection was used. What can be the reason? Is it because I am now accessing UI elements?
Here is the code I am working on:
private void button_Start_Click(object sender, System.EventArgs e)
{
    BackgroundWorker bgWorker = new BackgroundWorker();
    bgWorker.DoWork += bw_DoWork;
    bgWorker.RunWorkerCompleted += bw_RunWorkerCompleted;
    bgWorker.ProgressChanged += bw_ProgressChanged;
    bgWorker.WorkerSupportsCancellation = true;
    bgWorker.WorkerReportsProgress = true;
    Button btnSender = (Button)sender;
    btnSender.Enabled = false;
    bgWorker.RunWorkerAsync();
}
and the bw_DoWork() handler is as follows:
{
    HttpWebRequest request = null;
    using (BlockingCollection<ImageFileName> bc = new BlockingCollection<ImageFileName>(30))
    {
        using (Task task1 = Task.Factory.StartNew(() =>
        {
            foreach (var fileName in fileNames)
            {
                string baseUrl = "http://some url";
                string url = string.Format(baseUrl, fileName);
                request = (HttpWebRequest)WebRequest.Create(url);
                request.Method = "GET";
                request.ContentType = "application/x-www-form-urlencoded";
                var response = (HttpWebResponse)request.GetResponse();
                Stream stream = response.GetResponseStream();
                img = Image.FromStream(stream);
                FileNameImage = new ImageFileName(fileName.ToString(), img);
                bc.Add(FileNameImage);
                Thread.Sleep(100);
                Console.WriteLine("Size of BlockingCollection: {0}", bc.Count);
            }
        }))
        {
            using (Task task2 = Task.Factory.StartNew(() =>
            {
                foreach (ImageFileName imgfilename2 in bc.GetConsumingEnumerable())
                {
                    if (bw.CancellationPending == true)
                    {
                        e.Cancel = true;
                        break;
                    }
                    else
                    {
                        int numIterations = 4;
                        Image img2 = imgfilename2.Image;
                        for (int i = 0; i < numIterations; i++)
                        {
                            img2.Save("C:\\path" + imgfilename2.ImageName);
                            ZoomThumbnail = img2;
                            ZoomSmall = img2;
                            ZoomLarge = img2;
                            ZoomThumbnail = GenerateThumbnail(ZoomThumbnail, 86, false);
                            ZoomThumbnail.Save("C:\\path" + imgfilename2.ImageName + "_Thumb.jpg");
                            ZoomThumbnail.Dispose();
                            ZoomSmall = GenerateThumbnail(ZoomSmall, 400, false);
                            ZoomSmall.Save("C:\\path" + imgfilename2.ImageName + "_Small.jpg");
                            ZoomSmall.Dispose();
                            ZoomLarge = GenerateThumbnail(ZoomLarge, 1200, false);
                            ZoomLarge.Save("C:\\path" + imgfilename2.ImageName + "_Large.jpg");
                            ZoomLarge.Dispose();
                            // progressBar1.BeginInvoke(ProgressBarChange);
                            int percentComplete = (int)(((i + 1.0) / (double)numIterations) * 100.0);
                            //if (progressBar1.InvokeRequired)
                            //{
                            //    BeginInvoke(new MethodInvoker(delegate { bw.ReportProgress(percentComplete); }));
                            //}
                        }
                        Console.WriteLine("This is Take part and size is: {0}", bc.Count);
                    }
                }
            }))
                Task.WaitAll(task1, task2);
        }
    }
}
A better option might be to make retrieving the data and writing it to disk run synchronously, and instead use Parallel.ForEach() to allow multiple requests to be in-flight at the same time. That should reduce the amount of waiting in a couple spots:
No need to wait for one HTTP request to complete before issuing subsequent requests.
No need to block on that BlockingCollection
No need to wait for one disk write to complete before firing off the next one.
So perhaps something more like this:
Parallel.ForEach(fileNames,
    (name) =>
    {
        string baseUrl = "http://some url";
        string url = string.Format(baseUrl, name); // was "fileName", which doesn't exist in this scope
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "GET";
        request.ContentType = "application/x-www-form-urlencoded";
        using (var response = (HttpWebResponse)request.GetResponse())
        using (Stream stream = response.GetResponseStream())
        using (var img = Image.FromStream(stream))
        {
            // Cutting out a lot of steps from the 2nd Task to simplify the example
            img.Save(Path.Combine("C:\\path", name.ToString()));
        }
    });
One possible problem you could run into with this approach is that it will start generating too many requests at once. That might cause resource contention issues, or perhaps the webserver will interpret it as malicious behavior and stop responding to you. You can limit the number of requests that happen at the same time by setting the MaxDegreeOfParallelism. The following example shows how you could limit the operation to process no more than 4 files at the same time.
var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };
// ParallelOptions is the second argument in this overload, not the last
Parallel.ForEach(fileNames, options, (name) => { /* do stuff */ });
I want to stress-test my website with multiple simultaneous accesses. To do that I created a Windows application that calls the website 1000 times.
Unfortunately it only works for 2 calls. This is the code:
static void myMethod(int i)
{
    int j = 0;
    try
    {
        string url = "";
        WebRequest wr = null;
        HttpWebResponse response = null;
        url = String.Format("http://www.google.com");
        wr = WebRequest.Create(url);
        //wr.Timeout = 1000;
        response = (HttpWebResponse)wr.GetResponse();
        MessageBox.Show("end");
    }
    catch (Exception ex)
    {
        MessageBox.Show(j.ToString() + " " + ex.Message);
    }
}
private void button1_Click(object sender, EventArgs e)
{
    for (int i = 0; i < 1000; i++)
    {
        ThreadStart starter = delegate { myMethod(i); };
        Thread thread = new Thread(starter);
        thread.Start();
    }
}
Rather, use the free WCAT tool to load-test your ASP.NET page.
Also view this video [How Do I:] Load Test a Web Application
If you have Visual Studio 2010 Ultimate, see this link
I hope this helps.
By default, HttpWebRequest allows only two simultaneous connections to the same host.
You can change this by setting the ServicePointManager.DefaultConnectionLimit property.
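For example (a one-line sketch; 100 is an arbitrary value, pick one that suits your test):

// raise the per-host connection cap from the default of 2 (value is illustrative)
ServicePointManager.DefaultConnectionLimit = 100;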
Try disposing the IDisposable instances (i.e. the response) before continuing.
static void myMethod(int i)
{
    int j = 0;
    try
    {
        string url = String.Format("http://www.google.com");
        WebRequest wr = WebRequest.Create(url);
        using (HttpWebResponse response = (HttpWebResponse)wr.GetResponse())
        using (Stream responseStream = response.GetResponseStream()) // was wr.GetResponseStream(), which doesn't compile
        {
            //handle response / response stream
        }
        MessageBox.Show("end"); //this won't scale!!!
    }
    catch (Exception ex)
    {
        MessageBox.Show(j.ToString() + " " + ex.Message);
    }
}