How to retry a try statement in try/catch inside a foreach - C#

Please see this thread:
That thread answers the same question for do...while, but what about foreach? That is, how can we retry the try statement inside this foreach with another proxy (by advancing the proxy_line_num integer to the next proxy)?
foreach (string link_1 in links_with_kid)
{
    try
    {
        getData = "";
        req = (HttpWebRequest)WebRequest.Create(link_1);
        req.Method = "GET";
        req.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
        req.UserAgent = "Mozilla/5.0 (Windows NT 6.1; rv:19.0) Gecko/20100101 Firefox/19.0";
        req.ContentType = "text/html; charset=utf-8";
        req.Referer = "http://www.example.com/";
        req.KeepAlive = true;
        req.Timeout = 25000;
        if (useproxy)
        {
            string[] Ok_ip_port_ar = list_lines_GPL[proxy_line_num].Split(':');
            proxy_line_num++;
            if (proxy_line_num == list_lines_GPL.Count)
            {
                proxy_line_num = 0;
            }
            proxy = new WebProxy(Ok_ip_port_ar[0], int.Parse(Ok_ip_port_ar[1]));
        }
        req.Proxy = proxy;
        req.CookieContainer = cookieJar1;
        res = (HttpWebResponse)req.GetResponse();
        Stream = res.GetResponseStream();
        reader = new StreamReader(Stream);
        reader_str = reader.ReadToEnd();
        htmlDoc = new HtmlAgilityPack.HtmlDocument();
        htmlDoc.LoadHtml(reader_str);
        var images = from image in htmlDoc.DocumentNode.Descendants("img")
                     select image;
        string address = string.Empty;
        foreach (var image in images)
        {
            if (image.Attributes["src"] != null)
            {
                string[] address_ar = image.Attributes["src"].Value.Split('/');
                string address_ar_last = address_ar[address_ar.Length - 1];
                char[] address_ar_last_char = address_ar_last.ToCharArray();
                if (address_ar_last_char.Length == 8
                    && Char.IsUpper(address_ar_last_char[0])
                    && Char.IsUpper(address_ar_last_char[1])
                    && Char.IsUpper(address_ar_last_char[2])
                    && Char.IsUpper(address_ar_last_char[3]))
                {
                    address = image.Attributes["src"].Value;
                    string localFilename = @"c:\images-from-istgah\" + address_ar_last;
                    using (WebClient client = new WebClient())
                    {
                        client.DownloadFile(address, localFilename);
                    }
                }
            }
        }
    }
    catch (Exception ex)
    {
    }
    reader.Close();
    Stream.Close();
    res.Close();
}

I don't think that you can with a foreach. It is designed to give you the next item in the iterator.
If I were you I would use an ordinary for-loop with an iterator. That way you can control when to go to the next item.
Edit:
When writing it, a while actually made more sense in C#.
IEnumerator<string> iter = list.GetEnumerator();
bool bHasMore = iter.MoveNext();
while (bHasMore)
{
    try
    {
        // ... process iter.Current here ...
        bHasMore = iter.MoveNext(); // only advance once the item succeeded
    }
    catch (WebException)
    {
        // swap in the next proxy here; the same item is retried on the next pass
    }
}
I don't have the entire reference in my head, so you might need to look something up to get it to compile, but I hope not.
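Combining that while-loop shape with the question's proxy rotation gives a minimal sketch like the following. The variables list_lines_GPL, proxy_line_num and links_with_kid come from the question; maxRetries and the DownloadImages helper (standing in for the body of the original foreach) are assumptions for illustration:

```csharp
// Sketch: retry the current link with the next proxy, up to maxRetries times.
// DownloadImages is a hypothetical method wrapping the original loop body.
const int maxRetries = 5;

using (IEnumerator<string> iter = links_with_kid.GetEnumerator())
{
    bool hasMore = iter.MoveNext();
    int attempts = 0;
    while (hasMore)
    {
        try
        {
            DownloadImages(iter.Current);   // the body of the original foreach
            attempts = 0;
            hasMore = iter.MoveNext();      // advance only after success
        }
        catch (WebException)
        {
            if (++attempts >= maxRetries)
            {
                attempts = 0;
                hasMore = iter.MoveNext();  // give up on this link
            }
            else
            {
                // rotate to the next proxy and retry the same link
                proxy_line_num = (proxy_line_num + 1) % list_lines_GPL.Count;
            }
        }
    }
}
```

The key point is that MoveNext() is only called on success (or after the retry budget is exhausted), so a failed request does not skip the item the way a plain foreach would.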

Related

C# asynchronous threading API crawler

I have a program that sends/receives POST requests/responses from an online API, similar to a market bot that buys and sells a commodity. It works great; however, when I run it, it locks up the current thread and I'm unable to use anything else in the program. In the future I would also like to make buying and selling asynchronous, so they could happen at the same time. Here is the code that executes; as you can see, it loops continuously until conditions are met:
private void RunBot()
{
    numToBuy = (int)nudNumToBuy.Value;
    for (int i = 0; i < numToBuy; i++)
    {
        while (true)
        {
            if (AttachAndBuy())
            {
                break;
            }
        }
    }
}
private bool AttachAndBuy()
{
    string data = "<DATA HERE>";
    string URL = "<URL HERE>";
    Cookies.SetCookies(cookie, "PHPSESSID=" + SessionIDTextBox.Text.Replace("PHPSESSID=", ""));
    HttpWebRequest Request = (HttpWebRequest)WebRequest.Create(URL);
    Request.ContentType = "application/json";
    Request.Accept = "*/*";
    Request.CookieContainer = Cookies;
    Request.Host = "<HOST HERE>";
    Request.Method = "POST";
    Request.UserAgent = "<USER AGENT HERE>";
    Request.Headers.Add("<HEADER>", "<VALUE>");
    Request.KeepAlive = true;
    byte[] CompressedRequest = SimpleZlib.CompressToBytes(Encoding.UTF8.GetBytes(data), 9);
    Stream RequestStream = Request.GetRequestStream();
    RequestStream.Write(CompressedRequest, 0, CompressedRequest.Length);
    RequestStream.Flush();
    Stream CompressedResponseStream = Request.GetResponse().GetResponseStream();
    byte[] CompressedResponseData = ReadToEnd(CompressedResponseStream);
    string DecompressedResponseData = SimpleZlib.Decompress(CompressedResponseData, null);
    OffersResponse Return = Json.Deserialize<OffersResponse>(DecompressedResponseData);
    int LowestCost = 1000000000;
    Offer BestOffer = new Offer();
    foreach (Offer CurrentOffer in Return.data.offers)
    {
        bool moneyOffer = false;
        int Costs = CurrentOffer.requirementsCost;
        string id = CurrentOffer._id;
        foreach (Requirement CurrentRequirement in CurrentOffer.requirements)
        {
            if (CurrentRequirement._tpl == "<TEMPLATE ID HERE>")
            {
                moneyOffer = true;
            }
        }
        if (moneyOffer == false)
        {
            continue;
        }
        if (Costs < LowestCost)
        {
            LowestCost = Costs;
            BestOffer = CurrentOffer;
        }
    }
    BestOfferID = BestOffer._id;
    BestOfferCost = LowestCost;
    string MoneyID = getStack(BestOfferCost);
    while (true)
    {
        BuyRequestAttemptCounter++;
        if (LowestCost > 140000)
        {
            AddLog("No Suitable Item! Skipping! Lowest Item Cost: " + LowestCost.ToString());
            return false;
        }
        else
            AddLog("Best Item Cost: " + LowestCost.ToString() + " | ID: " + BestOfferID);
        int Result = buyOrder(MoneyID);
        if (Result == 0)
        {
            //log info for averaging
            numberPurchased++;
            TotalCost += BestOfferCost;
            averageCost = TotalCost / numberPurchased;
            lblNumPurchased.Text = numberPurchased.ToString();
            lblAverageCost.Text = averageCost.ToString();
            lstPricesPurchased.Items.Add(LowestCost.ToString());
            AddLog("====================================");
            AddLog("Number Purchased: " + numberPurchased);
            AddLog("Average Cost: " + averageCost);
            AddLog("====================================");
            System.Media.SystemSounds.Exclamation.Play();
            return true;
        }
        else if (Result == 1)
            return false;
        else if (Result == 2)
            continue;
        else
            return false;
    }
}
private int buyOrder(string MoneyID)
{
    string data = "<DATA HERE>";
    string URL = "<URL HERE>";
    Cookies.SetCookies(cookie, "PHPSESSID=" + SessionIDTextBox.Text.Replace("PHPSESSID=", ""));
    HttpWebRequest Request = (HttpWebRequest)WebRequest.Create(URL);
    Request.ContentType = "application/json";
    Request.Accept = "*/*";
    Request.CookieContainer = Cookies;
    Request.Host = "<HOST HERE>";
    Request.Method = "POST";
    Request.UserAgent = "<USER AGENT HERE>";
    Request.Headers.Add("<HEADER>", "<VALUE>");
    Request.KeepAlive = true;
    byte[] CompressedRequest = SimpleZlib.CompressToBytes(Encoding.UTF8.GetBytes(data), 9);
    Stream RequestStream = Request.GetRequestStream();
    RequestStream.Write(CompressedRequest, 0, CompressedRequest.Length);
    RequestStream.Flush();
    Stream CompressedResponseStream = Request.GetResponse().GetResponseStream();
    byte[] CompressedResponseData = ReadToEnd(CompressedResponseStream);
    string DecompressedResponseData = SimpleZlib.Decompress(CompressedResponseData, null);
    ResponseRoot Return = Json.Deserialize<ResponseRoot>(DecompressedResponseData);
    string returnErrorCode = DecompressedResponseData.ToString();
    //AddLog(DecompressedResponseData);
    if (Return.err == 0 && returnErrorCode.Contains("id"))
    {
        System.Windows.Forms.Clipboard.SetText(DecompressedResponseData);
        //AddLog("Successful Purchase!");
        return 0;
    }
    else if (returnErrorCode.Contains("1503"))
    {
        //AddLog("Failed with 1503!");
        return 1;
    }
    else if (returnErrorCode.Contains("1512"))
    {
        // AddLog("Failed with 1512!");
        return 2;
    }
    return 3;
}
As stated above, ideally I would like to run both the AttachAndBuy and buyOrder functions at the same time, and eventually I will add a sell function, also running concurrently. Is this possible? Thanks.
You can fix this by creating a list of tasks and executing them; this stops the main thread from being blocked while a call hasn't returned yet, e.g.:
var tasks = new List<Task>();
for (int i = 0; i < numToBuy; i++)
{
    var task = new Task(() =>
    {
        AttachAndBuy();
    });
    tasks.Add(task);
    task.Start();
}
Task.WaitAll(tasks.ToArray());
(Note: I've not actually tested this code, it's just a rough example)
Misunderstanding below -
You'll want to use some parallel programming for this (https://learn.microsoft.com/en-us/dotnet/standard/parallel-programming/task-based-asynchronous-programming?redirectedfrom=MSDN) within your for loop, e.g.:
private void RunBot()
{
    numToBuy = (int)nudNumToBuy.Value;
    for (int i = 0; i < numToBuy; i++)
    {
        while (true)
        {
            Parallel.Invoke(() => AttachAndBuy(), () => BuyOrder());
        }
    }
}
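As a side note, since this is a WinForms app, the same idea is often expressed with async/await plus Task.Run, which keeps the UI thread free without manually constructing Task objects. A minimal sketch, assuming a button click handler on the question's form (btnRun_Click is a hypothetical name; nudNumToBuy and AttachAndBuy are the question's members):

```csharp
// Sketch: run the blocking AttachAndBuy calls on thread-pool threads and
// await them, so the UI thread stays responsive while purchases run.
private async void btnRun_Click(object sender, EventArgs e)
{
    int numToBuy = (int)nudNumToBuy.Value;
    var tasks = new List<Task>();
    for (int i = 0; i < numToBuy; i++)
    {
        tasks.Add(Task.Run(() => AttachAndBuy()));
    }
    await Task.WhenAll(tasks); // resumes on the UI thread when all finish
}
```

A later sell function could be started the same way with another Task.Run and awaited alongside the buys.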

TCP connections stack up toward 65k when using an HttpWebRequest fetching function - internet connectivity is lost

I don't understand what may cause this error.
I am using the function below with about 1000 concurrent connections.
Each connection uses a different web proxy.
After about 15 minutes of working, the established TCP connection count starts to stack up and internet connectivity is lost.
When I do not use any web proxy, I do not encounter any error.
I am using the line below to retrieve the active TCP connection count:
var properties = IPGlobalProperties.GetIPGlobalProperties();
I don't see any leak in my function, so I need your help to solve this annoying problem.
C# .NET 4.6.2
Here are the statuses of the active TCP connections when this problem occurs:
public static cs_HttpFetchResults func_fetch_Page(
    string srUrl, int irTimeOut = 60,
    string srRequestUserAgent = "Mozilla/5.0 (Windows NT 6.3; WOW64; rv:31.0) Gecko/20100101 Firefox/31.0",
    string srProxy = null, int irCustomEncoding = 0, bool blAutoDecode = true, bool blKeepAlive = true)
{
    cs_HttpFetchResults mycs_HttpFetchResults = new cs_HttpFetchResults();
    mycs_HttpFetchResults.srFetchingFinalURL = srUrl;
    HttpWebRequest request = null;
    WebResponse response = null;
    try
    {
        request = (HttpWebRequest)WebRequest.Create(srUrl);
        request.CookieContainer = new System.Net.CookieContainer();
        if (srProxy != null)
        {
            string srProxyHost = srProxy.Split(':')[0];
            int irProxyPort = Int32.Parse(srProxy.Split(':')[1]);
            WebProxy my_awesomeproxy = new WebProxy(srProxyHost, irProxyPort);
            my_awesomeproxy.Credentials = new NetworkCredential();
            request.Proxy = my_awesomeproxy;
        }
        else
        {
            request.Proxy = null;
        }
        request.ContinueTimeout = irTimeOut * 1000;
        request.ReadWriteTimeout = irTimeOut * 1000;
        request.Timeout = irTimeOut * 1000;
        request.UserAgent = srRequestUserAgent;
        request.KeepAlive = blKeepAlive;
        request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
        WebHeaderCollection myWebHeaderCollection = request.Headers;
        myWebHeaderCollection.Add("Accept-Language", "en-gb,en;q=0.5");
        myWebHeaderCollection.Add("Accept-Encoding", "gzip, deflate");
        request.AutomaticDecompression = DecompressionMethods.Deflate | DecompressionMethods.GZip;
        using (response = request.GetResponse())
        {
            using (Stream strumien = response.GetResponseStream())
            {
                Encoding myEncoding = Encoding.UTF8;
                string srContentType = "";
                if (response.ContentType != null)
                {
                    srContentType = response.ContentType;
                    if (srContentType.Contains(";"))
                    {
                        srContentType = srContentType.Split(';')[1];
                    }
                    srContentType = srContentType.Replace("charset=", "");
                    srContentType = func_Process_Html_Input(srContentType);
                }
                try
                {
                    myEncoding = Encoding.GetEncoding(srContentType);
                }
                catch
                {
                    myEncoding = irCustomEncoding == 0 ? Encoding.UTF8 : Encoding.GetEncoding(irCustomEncoding);
                }
                using (StreamReader sr = new StreamReader(strumien, myEncoding))
                {
                    mycs_HttpFetchResults.srFetchBody = sr.ReadToEnd();
                    if (blAutoDecode == true)
                    {
                        mycs_HttpFetchResults.srFetchBody = HttpUtility.HtmlDecode(mycs_HttpFetchResults.srFetchBody);
                    }
                    mycs_HttpFetchResults.srFetchingFinalURL = Return_Absolute_Url(response.ResponseUri.AbsoluteUri.ToString(), response.ResponseUri.AbsoluteUri.ToString());
                    mycs_HttpFetchResults.blResultSuccess = true;
                }
            }
        }
        if (request != null)
            request.Abort();
        request = null;
    }
    catch (Exception E)
    {
        if (E.Message.ToString().Contains("(404)"))
            mycs_HttpFetchResults.bl404 = true;
        csLogger.logCrawlingErrors("crawling failed url: " + srUrl, E);
    }
    finally
    {
        if (request != null)
            request.Abort();
        request = null;
        if (response != null)
            response.Close();
        response = null;
    }
    return mycs_HttpFetchResults;
}
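No answer is recorded for this question here, but one common culprit on .NET Framework is that HttpWebRequest caches a separate ServicePoint per proxy endpoint, and keep-alive sockets to each of ~1000 proxies can linger long after the requests finish. A hedged sketch of the knobs commonly tuned for this situation (the numeric values are illustrative assumptions, not values from the question):

```csharp
// Sketch: bound and recycle pooled connections so per-proxy ServicePoints
// don't accumulate established sockets. Set once at application startup.
ServicePointManager.DefaultConnectionLimit = 100;    // cap concurrent sockets per endpoint
ServicePointManager.MaxServicePointIdleTime = 10000; // close idle keep-alive sockets after 10 s
ServicePointManager.MaxServicePoints = 200;          // bound the number of cached endpoints
```

If that is not enough, calling func_fetch_Page with blKeepAlive = false when a proxy is in use would make each connection close after its response, trading some throughput for a stable socket count.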

Performance problems with a multithreaded check of whether a remote file exists

void test() //apply multithreading
{
    ThreadPool.SetMaxThreads(int.Parse(TxtThread.Text), int.Parse(TxtThread.Text) + 10);
    foreach (string url in list_url)
    {
        ThreadPool.QueueUserWorkItem(new WaitCallback(CheckFile), url);
    }
}
void CheckFile(object url) //incoming url to check whether files exist
{
    HttpWebResponse response = null;
    foreach (string str in filenameArr)
    {
        try
        {
            string strUrlFile2 = UriFile(url.ToString(), str);
            response = Com.WebResponse(strUrlFile2);
            if (response.StatusCode == HttpStatusCode.OK && response.ContentType.ToLower() != "text/html")
            {
            }
        }
        catch (Exception ex)
        {
        }
        finally
        {
            if (response != null)
            {
                response.Close();
            }
        }
    }
}
public static HttpWebResponse WebResponse(string strUrlFile) //check method
{
    HttpWebRequest req = null;
    try
    {
        //System.GC.Collect();
        req = (HttpWebRequest)WebRequest.Create(strUrlFile);
        req.Method = "HEAD";
        req.Timeout = 100;
        req.ProtocolVersion = HttpVersion.Version11;
        req.AllowAutoRedirect = false;
        req.Accept = "*/*";
        req.KeepAlive = false;
        HttpWebResponse res = (HttpWebResponse)req.GetResponse();
        return res;
    }
    catch (Exception)
    {
        return null;
    }
}
list_url: URL ArrayList
filenameArr: filename dictionary array
Questions:
1. After a while, when making HTTP requests to check the remote file several times, the task didn't finish and all threads stopped.
2. When the thread pool grows to a certain number of threads, the program interface becomes slow, even though none of the worker threads run on the main UI thread.
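No answer is recorded here, but both symptoms are typical of flooding the ThreadPool, which the UI also depends on. A common alternative is to bound concurrency with SemaphoreSlim instead of resizing the pool. A minimal sketch, assuming the question's CheckFile, list_url, and TxtThread members (RunChecks is a hypothetical name):

```csharp
// Sketch: throttle the file checks with a semaphore so at most maxParallel
// run at once, leaving the ThreadPool free to service the UI.
private async void RunChecks()
{
    int maxParallel = int.Parse(TxtThread.Text);
    var gate = new SemaphoreSlim(maxParallel);
    var tasks = new List<Task>();
    foreach (string url in list_url)
    {
        await gate.WaitAsync(); // wait until a slot is free
        tasks.Add(Task.Run(() =>
        {
            try { CheckFile(url); }
            finally { gate.Release(); } // free the slot even on failure
        }));
    }
    await Task.WhenAll(tasks);
}
```

Raising the HEAD request's Timeout above 100 ms would also be worth trying, since real servers rarely answer that fast and every timeout ties up a worker until it expires.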

HttpWebRequest starting to time out after 30ish threads

I am running the following code to start my threads:
stpStartInfo.MaxWorkerThreads = Convert.ToInt32(parserThreadCount.Value);
stpStartInfo.MinWorkerThreads = 1;
_smartThreadPool2 = new Smart.SmartThreadPool(stpStartInfo);
string f = sourceFileName.Text;
int totCount = countLinesInFile(f);
using (StreamReader r = new StreamReader(f))
{
    int iia = 0;
    string line;
    while ((line = r.ReadLine()) != null)
    {
        if (!string.IsNullOrEmpty(line))
        {
            _smartThreadPool2.QueueWorkItem(
                new Amib.Threading.Func<string, int, int, int>(checkSource),
                line, iia, totCount);
            iia++;
        }
    }
}
My threads are doing this HttpWebRequest:
try
{
    HttpWebRequest _wReq;
    HttpWebResponse _wResp;
    System.IO.StreamReader _sr;
    System.Text.ASCIIEncoding _enc = new System.Text.ASCIIEncoding();
    _wReq = (HttpWebRequest)WebRequest.Create(PAGE_URL);
    _wReq.CookieContainer = cookieCont;
    _wReq.UserAgent = "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 2.0.50727)";
    _wReq.ReadWriteTimeout = 10000;
    _wReq.Timeout = 10000;
    _wReq.ProtocolVersion = HttpVersion.Version10;
    _wReq.KeepAlive = false;
    _wResp = (HttpWebResponse)_wReq.GetResponse();
    _sr = new System.IO.StreamReader(_wResp.GetResponseStream());
    html = _sr.ReadToEnd();
    _sr.Close();
    _cookies = _wReq.CookieContainer;
    _wResp.Close();
}
catch (WebException ex)
{
    if (ex.Status == WebExceptionStatus.ProtocolError && ex.Response != null)
    {
        var resp = (HttpWebResponse)ex.Response;
        if (resp.StatusCode == HttpStatusCode.ServiceUnavailable)
        {
        }
    }
}
It works perfectly, but after 30-ish threads the requests start to time out.
Any idea how to get around this? :)
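No accepted answer appears in this dump, but on .NET Framework the classic cause of this exact symptom is the default limit of two concurrent HTTP connections per host: once ~30 threads queue behind it, later requests sit waiting until their 10-second Timeout expires. A hedged sketch of the usual fix (the limit value of 100 is an illustrative assumption):

```csharp
// Sketch: raise the per-host connection limit before starting the thread
// pool, so queued requests get sockets instead of timing out.
ServicePointManager.DefaultConnectionLimit = 100; // default is 2 for client apps
```

The same setting can be made in App.config under system.net/connectionManagement with an `<add address="*" maxconnection="100" />` entry instead of code.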

Why does HttpWebRequest through a .NET proxy fail?

I have the following code:
int repeat = 1;
int proxyIndex = 1;
if (listBox1.Items.Count == proxyIndex) //If we're at the end of the proxy list
{
    proxyIndex = 0; //Make the selected item the first item in the list
}
try
{
    int i = 0;
    while (i < listBox1.Items.Count)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(textBox1.Text);
        string proxy = listBox1.Items[i].ToString();
        string[] proxyArray = proxy.Split(':');
        WebProxy proxyz = new WebProxy(proxyArray[0], int.Parse(proxyArray[1]));
        HttpWebResponse response = (HttpWebResponse)request.GetResponse();
        StreamReader reader = new StreamReader(response.GetResponseStream());
        string str = reader.ReadToEnd();
        Thread.Sleep(100);
        {
            repeat++;
            continue;
        }
    }
}
catch (Exception ex) //In case some exception happens
{
    listBox2.Items.Add("Error:" + ex.Message);
}
I don't understand what I'm doing wrong.
You're not setting Proxy on your HttpWebRequest. (You're creating a WebProxy object, but not using it.) You need to add:
request.Proxy = proxyz;
before calling request.GetResponse().
You also need to fix your use of objects which implement IDisposable. Since they're created in a loop, you cannot delay this - it could be causing any amount of random damage:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
string[] proxyArray = proxyHostAndPort.Split(':');
WebProxy proxyz = new WebProxy(proxyArray[0], int.Parse(proxyArray[1]));
request.Proxy = proxyz;
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
    {
        string str = reader.ReadToEnd();
    }
}
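Putting both fixes back into the question's loop gives a sketch like the following. Note that the original while loop also never increments i, so it would spin on the first proxy forever; a for loop avoids that. listBox1 and textBox1 are the question's controls, and the rest follows the answer's advice:

```csharp
// Sketch: try each proxy in the list, attach it to its request, and
// dispose the response and reader on every pass through the loop.
for (int i = 0; i < listBox1.Items.Count; i++)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(textBox1.Text);
    string[] proxyArray = listBox1.Items[i].ToString().Split(':');
    request.Proxy = new WebProxy(proxyArray[0], int.Parse(proxyArray[1]));

    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
    {
        string str = reader.ReadToEnd(); // use the page body here
    }
    Thread.Sleep(100);
}
```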
