Check if folder exists with C# WebRequest

I'm using this to create a folder in an existing SharePoint location. Is there a way to check whether a folder exists before creation, instead of using try/catch to figure out if this method fails and then assuming the folder already exists?
I've checked the WebRequest methods, but there is no such thing as a check.
try
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://site.sharepoint.com/files/" + foldername);
    request.Credentials = CredentialCache.DefaultCredentials;
    request.UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)";
    request.Method = "MKCOL";
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    response.Close();
}
catch (Exception ex)
{
    // if this piece fails, the folder exists already
}

public void CheckWebFoldersExist()
{
    try
    {
        WebClient client = new WebClient();
        client.Credentials = CredentialCache.DefaultCredentials;
        // Create a request for the URL.
        WebRequest request = WebRequest.Create("myAddress");
        // If required by the server, set the credentials.
        request.Credentials = CredentialCache.DefaultCredentials;
        // Get the response.
        HttpWebResponse response = (HttpWebResponse)request.GetResponse();
        // Check the response status.
        if (string.Compare(response.StatusDescription, "OK", true) == 0)
        {
            // URL exists, so that means the folder exists.
        }
        else
        {
            // URL does not exist, so that means the folder does not exist.
        }
    }
    catch (Exception error)
    {
        // error catching
    }
}

You can use the SPWeb.GetFolder method:
private bool CheckFolderExists(SPWeb parentWeb, string folderName)
{
    SPFolder folder = parentWeb.GetFolder(folderName);
    return folder.Exists;
}
Ref: http://mundeep.wordpress.com/2009/02/24/checking-if-a-spfolder-exists/
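If you only have HTTP/WebDAV access and can't use the server-side object model, there is still no dedicated "exists" call on HttpWebRequest, but you can probe the folder before issuing MKCOL and confine the catch to the one status you expect. A minimal sketch, assuming the same URL scheme and credentials as the question (the helper name is hypothetical):
private static bool RemoteFolderExists(string folderUrl)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(folderUrl);
    request.Credentials = CredentialCache.DefaultCredentials;
    request.Method = "PROPFIND";        // WebDAV: ask about the resource itself
    request.Headers.Add("Depth", "0");  // the folder only, not its children

    try
    {
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            return true; // 207 Multi-Status: the folder is there
        }
    }
    catch (WebException ex)
    {
        HttpWebResponse response = ex.Response as HttpWebResponse;
        if (response != null && response.StatusCode == HttpStatusCode.NotFound)
            return false; // 404: safe to create it with MKCOL
        throw; // 401, 500, ... are real errors, not "missing folder"
    }
}
A WebException is still caught, but only the 404 case is treated as "does not exist"; everything else propagates instead of being silently swallowed.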

Related

How can I reuse cookies from HttpWebResponse to stay logged in on a web page?

I have a simple application that browses to a XenForo forum site, fetches cookies, and then sends POST data to log in and fetch some information that is only available to logged-in users.
I am able to fetch the cookies and verify that I am successfully logged in the first time, but I can't seem to stay logged in when I try to "browse" further while reusing the same cookies.
Here is what I've got so far:
public MainWindow()
{
    InitializeComponent();
    if (IsLoggedIn())
    {
        GetPage("http://thesiteiloginto.org/someotherpage");
    }
}

// Store cookies
CookieCollection Cookies;

void GetCookies(string cookieUrl)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(cookieUrl);
    request.CookieContainer = new CookieContainer();
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    using (Stream responseStream = response.GetResponseStream())
    {
        // Store cookies
        Cookies = response.Cookies;
    }
}

bool IsLoggedIn()
{
    GetCookies(_cookieUrl);
    CookieCollection cookies = Cookies;
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(loginUrl);
    request.UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)";
    request.CookieContainer = new CookieContainer();
    request.Method = "POST";
    request.ContentType = "application/x-www-form-urlencoded";
    string postData = "login=" + username +
        "&password=" + password +
        "&_xfToken=" + xfToken +
        "&cookie_check=" + cookie_check +
        "&redirect=" + redirect;
    byte[] bytes = Encoding.UTF8.GetBytes(postData);
    request.ContentLength = bytes.Length;
    if (cookies != null)
    {
        Console.WriteLine("Cookies are present");
        Console.WriteLine(Cookies.Count);
        Console.WriteLine(Cookies[0].Value);
        request.CookieContainer.Add(cookies);
    }
    using (Stream requestStream = request.GetRequestStream())
    {
        requestStream.Write(bytes, 0, bytes.Length);
        WebResponse response = request.GetResponse();
        using (Stream stream = response.GetResponseStream())
        {
            using (StreamReader reader = new StreamReader(stream))
            {
                HtmlAgilityPack.HtmlDocument doc = new HtmlAgilityPack.HtmlDocument();
                doc.LoadHtml(reader.ReadToEnd());
                bool _loggedIn = false;
                try
                {
                    var uniqueNodeTest = doc.DocumentNode.SelectSingleNode("//*[@id=\"navigation\"]/div/nav/div/ul[2]/li[1]/a/strong[1]");
                    if (uniqueNodeTest.InnerText.Trim().ToLower() == uniqueNodeName)
                    {
                        Console.WriteLine("Logged in");
                        _loggedIn = true;
                    }
                }
                catch (Exception ex)
                {
                    Console.WriteLine("Ops! [Login] # SelectSingleNode\n" + ex.Message);
                    _loggedIn = false;
                }
                return _loggedIn;
            }
        }
    }
}

void GetPage(string url)
{
    if (Cookies != null)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.PreAuthenticate = true;
        request.UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)";
        request.CookieContainer = new CookieContainer();
        request.CookieContainer.Add(Cookies);
        HttpWebResponse response = (HttpWebResponse)request.GetResponse();
        using (Stream responseStream = response.GetResponseStream())
        {
            using (StreamReader reader = new StreamReader(responseStream))
            {
                var pageSource = reader.ReadToEnd();
                // Returns false, meaning I am not logged in.
                // Where do I go from here?
                Console.WriteLine(pageSource.Contains("Log Out"));
            }
        }
    }
}
Console:
Cookies are present
1
22a6c5a4c5557a7f7db36f50a1d746f1
Logged in
False
As you can see, I am logged in after the first test, but I can't seem to stay logged in when trying to browse further while reusing the cookies.
What am I not taking into consideration? How can I stay logged in to the site?
Use a CookieContainer stored as a class member to keep the cookies locally.
When retrieving a response, put all its cookies into the CookieContainer. When preparing the next request, just set request.CookieContainer to that same object.
This is the code I used for saving my cookies in a project of mine. Since it's part of one fully decked out HttpSession class specifically meant for the kind of requests you're doing, you'll notice both the response and the cookie container are class variables.
/// <summary>
/// Fetches the new cookies and saves them in the cookie jar.
/// </summary>
private void SaveNewCookies()
{
    try
    {
        foreach (Cookie c in this.m_HttpWebResponse.Cookies)
        {
            if (c.Domain.Length > 0 && c.Domain[0] == '.')
                c.Domain = c.Domain.Remove(0, 1);
            this.m_CookieJar.Add(new Uri(this.m_HttpWebResponse.ResponseUri.Scheme + "://" + c.Domain), c);
        }
        if (this.m_HttpWebResponse.Cookies.Count > 0)
            this.BugFixCookieDomain(this.m_CookieJar);
    }
    catch
    {
        // no new cookies
    }
}
As you can see, this contains some smaller bug fixes as well. The mentioned BugFixCookieDomain function is a fix specifically for the 3.5 framework, which you can find here; if you've moved past that to 4.0 and beyond, it won't be particularly useful to you.
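Stripped of the class plumbing, a minimal sketch of the pattern (the HttpSession name and both methods are hypothetical, not the questioner's code): one CookieContainer field that every request is handed, so cookies set by the login response automatically ride along on later calls:
class HttpSession
{
    private readonly CookieContainer _jar = new CookieContainer();

    public string Get(string url)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.CookieContainer = _jar; // reuse the same jar, don't new one up per call
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            return reader.ReadToEnd();
        }
    }

    public string Post(string url, string formData)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.CookieContainer = _jar; // same jar: login cookies persist across calls
        request.Method = "POST";
        request.ContentType = "application/x-www-form-urlencoded";
        byte[] bytes = Encoding.UTF8.GetBytes(formData);
        request.ContentLength = bytes.Length;
        using (Stream body = request.GetRequestStream())
        {
            body.Write(bytes, 0, bytes.Length);
        }
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            return reader.ReadToEnd();
        }
    }
}
The important difference from the question's code is that nothing ever assigns a fresh CookieContainer per request, so nothing ever starts from an empty jar.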

C# - Connection: keep-alive Header is Not Being Sent During HttpWebRequest redirect

I have a big problem when I use HttpWebRequest to get content from my blog.
First let's see the code
The request is made by a C# console application:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://localhost/MyBlog2014/Security/Login");
request.ProtocolVersion = HttpVersion.Version10;
var data = "Email=myemail&Password=1234567&keepMeOn=false";
byte[] send = Encoding.Default.GetBytes(data);
request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8";
request.Headers.Add("Accept-Language: ro-RO,ro;q=0.8,en-US;q=0.6,en;q=0.4");
request.Headers.Add("Cache-Control: max-age=0");
request.KeepAlive = true;
request.UnsafeAuthenticatedConnectionSharing = true;
request.ServicePoint.ConnectionLimit = 1;
request.ServicePoint.UseNagleAlgorithm = false;
request.ServicePoint.Expect100Continue = false;
request.Method = "POST";
request.UserAgent =
"Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/36.0.1985.143 Safari/537.36";
request.ContentType = "application/x-www-form-urlencoded";
request.ContentLength = send.Length;
request.AllowAutoRedirect = true;
Stream sout = request.GetRequestStream();
sout.Write(send, 0, send.Length);
sout.Flush();
sout.Close();
Console.WriteLine("\nThe HTTP request Headers for the first request are: \n{0}", request.Headers);
IAsyncResult result = (IAsyncResult)request.BeginGetResponse(new AsyncCallback(RespCallback), request);
Callback method
static void RespCallback(IAsyncResult asynchronousResult)
{
    try
    {
        HttpWebRequest request = (HttpWebRequest)asynchronousResult.AsyncState;
        WebResponse res = request.EndGetResponse(asynchronousResult);
        var stream = new StreamReader(res.GetResponseStream());
        Console.WriteLine(stream.ReadToEnd());
        stream.Close();
        res.Close();
        Console.WriteLine();
    }
    catch (WebException e)
    {
        // do something
    }
    catch (Exception e)
    {
        // do something
    }
}
Now, this code doesn't work the way I'd like. It goes into my login method, saves the email in the session object, and then redirects to the Index action; and here is my big problem: in the Index method the Session object is null.
Let's see the code
Login method
[HttpPost]
public ActionResult Login(LoginDto login)
{
    // if email and password are OK, save the email in the session,
    // then redirect to Index;
    // else show error message(s): return View(login);
}
Index method
[AuthorizeUser(false)]
public ActionResult Index(int pag = 1)
{
    // I built the model
    return View(new HomePageModel
    {
        CUrrentPage = pag,
        Articles = model,
        MaxPage = totalpagesArticles
    });
}
So I will not receive the content from the Index method, because I am not authorized.
The code for saving the email in the session object is this:
public void SaveDetaliiUser(SessionUserDetails userDetails)
{
    HttpContext.Current.Session[SessionDetaliiUtilizator] = userDetails;
}
Before writing here I searched the net for a solution to my problem, and the following links didn't help me:
How to send KeepAlive header correctly in c#?
C# - Connection: keep-alive Header is Not Being Sent During HttpWebRequest
Keep a http connection alive in C#?
http://social.msdn.microsoft.com/Forums/en-US/87bc7029-ce23-438a-a767-f7c32dcc63a7/how-to-keep-connection-live-while-using-httpwebrequest?forum=netfxnetcom
Thank you in advance,
Marian
Finally I solved the problem. It seems that to keep the session during the redirect, I should set request.CookieContainer = new CookieContainer();
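To make the fix concrete, here is a minimal sketch (the URL and form fields are taken from the question above): without a CookieContainer, HttpWebRequest has nowhere to store the session cookie (for ASP.NET, ASP.NET_SessionId) that the login response sets, so the automatically followed redirect arrives at Index without a session.
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://localhost/MyBlog2014/Security/Login");
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.AllowAutoRedirect = true;

// The crucial line: give the request a place to keep Set-Cookie values, so the
// session cookie issued by /Login is replayed on the auto-followed redirect to /Index.
request.CookieContainer = new CookieContainer();

byte[] send = Encoding.Default.GetBytes("Email=myemail&Password=1234567&keepMeOn=false");
request.ContentLength = send.Length;
using (Stream body = request.GetRequestStream())
{
    body.Write(send, 0, send.Length);
}
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    Console.WriteLine(reader.ReadToEnd()); // rendered by Index, with the session intact
}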

httpwebrequest fails to load rss feed

I am attempting to load a page I've received from an RSS feed and I receive the following WebException:
Cannot handle redirect from HTTP/HTTPS protocols to other dissimilar ones.
with an inner exception:
Invalid URI: The hostname could not be parsed.
Here's the code I'm using:
System.Net.HttpWebRequest req = (System.Net.HttpWebRequest)System.Net.HttpWebRequest.Create(url);
string source = String.Empty;
Uri responseURI;
try
{
    req.UserAgent = @"Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:31.0) Gecko/20100101 Firefox/31.0";
    req.Headers.Add("Accept-Language", "en-us,en;q=0.5");
    req.AllowAutoRedirect = true;
    using (System.Net.WebResponse webResponse = req.GetResponse())
    {
        using (HttpWebResponse httpWebResponse = webResponse as HttpWebResponse)
        {
            responseURI = httpWebResponse.ResponseUri;
            StreamReader reader;
            if (httpWebResponse.ContentEncoding.ToLower().Contains("gzip"))
            {
                reader = new StreamReader(new GZipStream(httpWebResponse.GetResponseStream(), CompressionMode.Decompress));
            }
            else if (httpWebResponse.ContentEncoding.ToLower().Contains("deflate"))
            {
                reader = new StreamReader(new DeflateStream(httpWebResponse.GetResponseStream(), CompressionMode.Decompress));
            }
            else
            {
                reader = new StreamReader(httpWebResponse.GetResponseStream());
            }
            source = reader.ReadToEnd();
            reader.Close();
        }
    }
}
catch (WebException we)
{
    Console.WriteLine(url + "\n--\n" + we.Message);
    return null;
}
I'm not sure if I'm doing something wrong or if there's something extra I need to be doing. Any help would be greatly appreciated! Let me know if there's more information you need.
UPDATE
So after following Jim Mischel's suggestions, I've narrowed it down to a UriFormatException that claims Invalid URI: The hostname could not be parsed.
Here's the URL in the last "Location" header: http:////www-nc.nytimes.com/
I can see why it fails, but I'm not sure why it gives me trouble here when the original URL processes just fine in my browser. Is there something I'm missing or not doing that I should be, in order to handle this strange URL?
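One way to cope, sketched under the assumption that only the Location header is malformed (the helper name is mine, and it needs System.Text.RegularExpressions): turn off automatic redirects, read each Location header yourself, and collapse the extra slashes before following it.
static string ResolveFinalUrl(string url, int maxHops = 10)
{
    for (int hop = 0; hop < maxHops; hop++)
    {
        HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
        req.AllowAutoRedirect = false; // inspect the Location header ourselves
        using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
        {
            int code = (int)resp.StatusCode;
            if (code < 300 || code >= 400)
                return url; // not a redirect: this is the real page
            string location = resp.Headers["Location"];
            // Collapse "http:////host"-style slash runs after the scheme.
            location = Regex.Replace(location, @"^(https?:)/+", "$1//");
            // Location may also be relative; resolve it against the current URL.
            url = new Uri(new Uri(url), location).AbsoluteUri;
        }
    }
    throw new WebException("Too many redirects.");
}
The resolved URL can then be fed into the question's existing download code.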

Unable to use external proxy in c# HttpWebRequest

I am hitting this URL:
http://www.google.co.uk/search?q=online stores uk&hl=en&cr=countryUK%7CcountryGB&as_qdr=all&tbs=ctr:countryUK
Basically I get the ppcUrls; it works perfectly without any proxy.
But when I try to use a proxy from those available on the internet:
http://proxy-list.org/en/index.php?pp=3128&pt=any&pc=any&ps=any&submit=Filter+Proxy
the above link won't open at all. I did check the IPs with Internet Explorer and they opened, but here in HttpWebRequest I sometimes get 503 Server Unavailable or Too Many Redirections.
The link won't open with any IP.
Any suggestions? Below is my HTML-fetching function:
public string getHtml(string url, string proxytmp)
{
    string responseData = "";
    try
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.Accept = "*/*";
        request.AllowAutoRedirect = true;
        request.UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)";
        request.Timeout = 60000;
        request.Method = "GET";
        if (proxies.Count > 0)
        {
            try
            {
                int port = 0;
                string ip = string.Empty;
                string[] splitter = proxytmp.Split(':');
                if (splitter.Length > 0)
                {
                    ip = splitter[0].ToString();
                    port = Convert.ToInt32(splitter[1].ToString());
                }
                WebProxy proxy = new WebProxy(ip, port);
                request.Proxy = proxy;
            }
            catch (Exception exp)
            {
            }
        }
        HttpWebResponse response = (HttpWebResponse)request.GetResponse();
        if (response.StatusCode == HttpStatusCode.OK)
        {
            Stream responseStream = response.GetResponseStream();
            StreamReader myStreamReader = new StreamReader(responseStream);
            responseData = myStreamReader.ReadToEnd();
        }
        response.Close();
    }
    catch (System.Exception e)
    {
        responseData = e.ToString();
    }
    return responseData;
}
UPDATE
The URL opens when I use the same proxy with Internet Explorer, so there must be a way, but I cannot figure it out.
Thank you
My guess is that the proxy blocks incoming connections of a certain nature, and that is why you are running into these various issues. The checks it performs might be complex, or as simple as requiring the User-Agent of a valid browser. I am not sure what else a proxy can check; I would suggest you look at the request your browser creates (things like Referer, port, etc.) when it goes through the proxy successfully, and make the corresponding changes in your C# code.
Good luck, let me know how it works out for you.
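As a hedged starting point (the proxy address below is a placeholder, and the header values are just one plausible browser profile): route the request through the proxy but present the full set of headers a browser would, since bare default headers are a common reason public proxies answer with 503 or redirect loops.
// Sketch only: "1.2.3.4" and 3128 stand in for whatever proxy you picked from the list.
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.Proxy = new WebProxy("1.2.3.4", 3128);

// Mimic a real browser as closely as possible; bare default headers are a
// common reason a proxy or Google answers with 503 or a redirect loop.
request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/36.0.1985.143 Safari/537.36";
request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
request.Headers.Add("Accept-Language", "en-GB,en;q=0.8");
request.Referer = "http://www.google.co.uk/";

// Cap the redirect chain so a bad proxy fails fast instead of bouncing
// until "Too many automatic redirections were attempted" is thrown.
request.AllowAutoRedirect = true;
request.MaximumAutomaticRedirections = 5;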

HttpWebResponse returns "The remote server returned an error: (403) Forbidden"

I want to get the HTML output of
http://www.belmondo.si/turisticna-ponudba/pocitnice/kratkirezultati?cid=ID&cityid=DPS&izhid=&trajanjeid=&oskrbaid=&kategorijaid=&ooseb=2&otrok=0&lasten=1&prvic=1&rid=0-1&subtemplate=eksotika
but I always get
The remote server returned an error: (403) Forbidden.
I am using HttpWebResponse:
protected string GetHtmlStringA(string url)
{
    string sHtml = "";
    HttpWebRequest request;
    HttpWebResponse response = null;
    Stream stream = null;
    request = (HttpWebRequest)WebRequest.Create(url);
    response = (HttpWebResponse)request.GetResponse();
    stream = response.GetResponseStream();
    StreamReader sr = new StreamReader(stream, System.Text.Encoding.Default);
    sHtml = sr.ReadToEnd();
    if (stream != null) stream.Close();
    if (response != null) response.Close();
    return sHtml;
}
I also tried setting a UserAgent, but it is the same:
request.UserAgent =
    "Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.0.13) Gecko/2009073022 Firefox/3.0.13";
I can't find any solution on forums or the internet.
It seems you also need to send an Accept header. Sending a request with the following headers will work:
request.UserAgent = "Foo";
request.Accept = "*/*";
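Folded back into the question's method, a minimal sketch of the fixed version might look like this (only the two header lines are new; the rest is the original logic tidied into using blocks):
protected string GetHtmlStringA(string url)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.UserAgent = "Foo"; // any non-empty agent; the server rejects requests without one
    request.Accept = "*/*";    // the missing Accept header is what triggered the 403
    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    using (Stream stream = response.GetResponseStream())
    using (StreamReader sr = new StreamReader(stream, System.Text.Encoding.Default))
    {
        return sr.ReadToEnd(); // 200 OK instead of (403) Forbidden
    }
}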
You need to pass authentication credentials with the web request:
request.Credentials = new NetworkCredential("username", "password");
Make sure you have your credentials set correctly.
request.Credentials = CredentialCache.DefaultCredentials;
// if we have a proxy, set its credentials as well
if (request.Proxy != null)
{
    request.Proxy.Credentials = CredentialCache.DefaultCredentials;
}
If you need specific credentials, you can create them this way:
request.Credentials = new NetworkCredential("username", "password");
