At the moment I am trying to make HTTP requests in my SharePoint warm-up script, and for that I have to use this code:
string line;
using (System.IO.StreamReader file = new System.IO.StreamReader(filePath))
{
    while ((line = file.ReadLine()) != null)
    {
        try
        {
            WebRequest request = WebRequest.Create(line);
            request.Proxy = null;
            if (string.IsNullOrEmpty(userName))
            {
                request.Credentials = CredentialCache.DefaultCredentials;
            }
            else
            {
                CredentialCache myCache = new CredentialCache();
                myCache.Add(new Uri(line), "NTLM", new NetworkCredential(userName, password, DomainName));
                request.Credentials = myCache;
            }
            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            using (Stream dataStream = response.GetResponseStream())
            using (StreamReader reader = new StreamReader(dataStream))
            {
                string responseFromServer = reader.ReadToEnd();
            }
            Thread.Sleep(2000);
        }
        catch (WebException ex)
        {
            // Log the failure and continue with the next URL.
            Console.WriteLine("Warm-up failed for {0}: {1}", line, ex.Message);
        }
    }
}
I am not supposed to use any user account to get the response. I can use default credentials, but those only work while I am logged in at the server; otherwise I suspect they won't work.
Is there any other way of doing this? I can't use PowerShell, as this is MOSS (2007), not SharePoint 2010.
cheers
You CAN use PowerShell for the warm-up script. You can call the stsadm command, make HTTP requests, and even use the server object model once you do the proper imports.
This post uses PowerShell to retrieve the available sites using stsadm. Once the script retrieves the list, it instantiates a WebClient object and hits each site URL.
If you only want to hit the sites in a list, you can simply use a WebClient object to do this.
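A minimal sketch of that approach follows. The URL list and server names are hypothetical, and the actual fetch is gated behind a `--live` argument since it needs a reachable server; whether default credentials suffice depends on the account the job runs under.

```csharp
using System;
using System.Collections.Generic;
using System.Net;

class WarmUp
{
    // Split a URL list (one URL per line) into entries, skipping blank lines.
    public static List<string> LoadUrls(string text)
    {
        var urls = new List<string>();
        foreach (var rawLine in text.Split(new[] { '\r', '\n' }, StringSplitOptions.RemoveEmptyEntries))
        {
            var url = rawLine.Trim();
            if (url.Length > 0)
                urls.Add(url);
        }
        return urls;
    }

    static void Main(string[] args)
    {
        // Hypothetical URL list; in practice read it from a file or from stsadm output.
        var urls = LoadUrls("http://moss-server/\nhttp://moss-server/sites/hr/");

        if (args.Length == 0 || args[0] != "--live")
            return; // construction only; pass --live to actually hit the sites

        using (var client = new WebClient())
        {
            client.UseDefaultCredentials = true; // run the job as an account with read access
            foreach (var url in urls)
            {
                try
                {
                    client.DownloadString(url); // forces the page to be compiled and cached
                    Console.WriteLine("Warmed up: " + url);
                }
                catch (WebException ex)
                {
                    Console.WriteLine("Failed: " + url + " (" + ex.Message + ")");
                }
            }
        }
    }
}
```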
Alternatively, if you have a working Search Service within your SSP, you can configure a scheduled crawl. Not only will you benefit from up-to-date crawl results, but your sites will be warmed up natively, without the need for an additional job doing it "just for warming up".
Relatively new to C#, but making good progress.
I'm currently trying to test a System.Net.WebRequest method. Using the useful HTTP testing kit at https://httpbin.org/, I am trying to pass network credentials to https://httpbin.org/basic-auth/user/passwd and retrieve a successful connection. (I am currently getting 401s.)
The username is user, and the password is passwrd.
I have a simple form and a button which starts the request. However, as stated, it's not working and I'm getting a 401 error.
Here is what I have so far:
private void button1_Click(object sender, EventArgs e)
{
    NetworkCredential myCred = new NetworkCredential(
        "user", "passwrd", "https://httpbin.org");

    // Create a request for the URL.
    WebRequest request = WebRequest.Create(
        "https://httpbin.org/basic-auth/user/passwd");
    // If required by the server, set the credentials.
    request.Credentials = myCred;
    // Get the response.
    WebResponse response = request.GetResponse();
    // Display the status.
    Console.WriteLine(((HttpWebResponse)response).StatusDescription);
    // Get the stream containing content returned by the server.
    Stream dataStream = response.GetResponseStream();
    // Open the stream using a StreamReader for easy access.
    StreamReader reader = new StreamReader(dataStream);
    // Read the content.
    string responseFromServer = reader.ReadToEnd();
    // Display the content.
    Console.WriteLine(responseFromServer);
    // Clean up the streams and the response.
    reader.Close();
    response.Close();
}
The problem is with your credentials: you are passing the site URL as the domain parameter, but that parameter is only meaningful for something like an Active Directory domain. Just remove that parameter (and fix the password, which is passwd, not passwrd) and it will work:
NetworkCredential myCred = new NetworkCredential("user", "passwd");
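For completeness, here is a sketch of the corrected flow, with the credential construction pulled into a helper; the live request is gated behind a `--live` argument since it needs network access.

```csharp
using System;
using System.IO;
using System.Net;

class BasicAuthDemo
{
    public static NetworkCredential MakeCredential()
    {
        // Two-argument form: no domain, since httpbin uses plain Basic auth,
        // not Active Directory. The password matches the URL segment: passwd.
        return new NetworkCredential("user", "passwd");
    }

    static void Main(string[] args)
    {
        NetworkCredential cred = MakeCredential();
        Console.WriteLine("Credential for: " + cred.UserName);

        if (args.Length == 0 || args[0] != "--live")
            return; // pass --live to perform the actual request

        WebRequest request = WebRequest.Create("https://httpbin.org/basic-auth/user/passwd");
        request.Credentials = cred;
        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(response.StatusDescription);
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}
```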
I'm trying to create a simple C# application to consume the GitHub API, but when I try to execute the following code:
HttpWebRequest request = WebRequest.Create("https://api.github.com/users/AndreStoicov/repos") as HttpWebRequest;
request.Method = "GET";
request.Proxy = null;
using (HttpWebResponse response = request.GetResponse() as HttpWebResponse)
{
    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
    {
        return reader.ReadToEnd();
    }
}
It fails with "The server committed a protocol violation. Section=ResponseStatusLine", which I assumed was due to me not being an authorized user.
In other words, I want to use the GitHub API without any user authorization; is there any way to perform that?
GitHub's rules state that you must specify a User-Agent header in the request:
request.UserAgent = "appname";
It does not require registered credentials; just the application name is enough.
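For illustration, a minimal sketch of the request with the header set; "my-app" is a placeholder application name, and the actual call is gated behind a `--live` argument since it needs network access.

```csharp
using System;
using System.IO;
using System.Net;

class GitHubDemo
{
    public static HttpWebRequest BuildRequest(string url, string appName)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "GET";
        request.UserAgent = appName; // GitHub rejects requests without a User-Agent
        return request;
    }

    static void Main(string[] args)
    {
        var request = BuildRequest("https://api.github.com/users/AndreStoicov/repos", "my-app");

        if (args.Length == 0 || args[0] != "--live")
            return; // pass --live to actually call the API

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}
```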
I am trying to read the Out of Office status and message of other users in the organization from my C# program. We are running Exchange 2013 on premises.
The app runs as an Active Directory account (with its own Exchange mailbox), and I cannot use impersonation.
I have spent some time trying out the solutions to similar questions such as:
Retrieve out of office Status using EWS 2.0 (this one uses impersonation)
Read out of office from exchange server for other users (I'm not using the Outlook Object Model)
How to get Out of Office for another mailbox (again, no impersonation for me)
http://gsexdev.blogspot.com/2011/11/using-mailtips-in-ews-to-get-oof-out-of.html: I don't have a reference to ExchangeServiceBinding, even though I'm using Microsoft.Exchange.WebServices.Data.
http://blogs.msdn.com/b/devmsg/archive/2014/06/03/ews-how-to-retrieve-the-oof-out-of-facility-settings-message-using-ews-for-an-exchange-user.aspx: this one takes urlname as a parameter, and I'm not sure where that URL comes from. This one seems the most promising, though; any ideas where that comes from?
I'm trying to get something like:
public void checkOOF(string userEmail)
{
    bool isOOF = checkstuff(userEmail);
    string message;
    if (isOOF)
        message = getOOFMessage(userEmail);
}
Please help me understand, thank you.
This is what I ended up using and it works.
public static string getOOM(string emailToCheck) // needs to be the full email address of the user
{
    string EWSurl = String.Format("https://{0}/EWS/Exchange.asmx", ExchangePath);
    HttpWebRequest httpwebRequest = (HttpWebRequest)WebRequest.Create(EWSurl);
    httpwebRequest.Method = "POST";
    httpwebRequest.ContentType = "text/xml; charset=utf-8";
    httpwebRequest.ProtocolVersion = HttpVersion.Version11;
    httpwebRequest.Credentials = new NetworkCredential("user", "password", "domain"); // service account
    httpwebRequest.Timeout = 60000;

    StringBuilder getMailTipsSoapRequest = new StringBuilder("<soap:Envelope xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"");
    getMailTipsSoapRequest.Append(" xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\" ");
    getMailTipsSoapRequest.Append("xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\" ");
    getMailTipsSoapRequest.Append("xmlns:t=\"http://schemas.microsoft.com/exchange/services/2006/types\"><soap:Header>");
    getMailTipsSoapRequest.Append(" <t:RequestServerVersion Version=\"Exchange2010\"/></soap:Header><soap:Body>");
    getMailTipsSoapRequest.Append("<GetMailTips xmlns=\"http://schemas.microsoft.com/exchange/services/2006/messages\">");
    getMailTipsSoapRequest.Append("<SendingAs>");
    getMailTipsSoapRequest.Append("<t:EmailAddress>accessingemail@domain.com</t:EmailAddress>");
    getMailTipsSoapRequest.Append("<t:RoutingType>SMTP</t:RoutingType></SendingAs>");
    getMailTipsSoapRequest.Append("<Recipients><t:Mailbox>");
    getMailTipsSoapRequest.Append("<t:EmailAddress>" + emailToCheck + "</t:EmailAddress>");
    getMailTipsSoapRequest.Append("<t:RoutingType>SMTP</t:RoutingType></t:Mailbox></Recipients>");
    getMailTipsSoapRequest.Append(" <MailTipsRequested>OutOfOfficeMessage</MailTipsRequested></GetMailTips>");
    getMailTipsSoapRequest.Append("</soap:Body></soap:Envelope>");

    using (Stream requestStream = httpwebRequest.GetRequestStream())
    using (StreamWriter streamWriter = new StreamWriter(requestStream, Encoding.ASCII))
    {
        streamWriter.Write(getMailTipsSoapRequest.ToString());
    }

    using (HttpWebResponse webResponse = (HttpWebResponse)httpwebRequest.GetResponse())
    using (StreamReader streamreader = new StreamReader(webResponse.GetResponseStream()))
    {
        string response = streamreader.ReadToEnd();
        if (response.Contains("<t:Message/>")) // empty element: no OOF message set
            return null;
        // Skip past the opening tag so only the message text is returned.
        int messageIndex = response.IndexOf("<t:Message>") + "<t:Message>".Length;
        return response.Substring(messageIndex, response.IndexOf("</t:Message>") - messageIndex);
    }
}
http://blogs.msdn.com/b/devmsg/archive/2014/06/03/ews-how-to-retrieve-the-oof-out-of-facility-settings-message-using-ews-for-an-exchange-user.aspx: this one takes urlname as a parameter, and I'm not sure where that URL comes from. This one seems the most promising, though; any ideas where that comes from?
urlname is just the EWS URL; if you are using the Managed API, just use the value from service.Url.
http://gsexdev.blogspot.com/2011/11/using-mailtips-in-ews-to-get-oof-out-of.html: I don't have a reference to ExchangeServiceBinding, even though I'm using Microsoft.Exchange.WebServices.Data.
ExchangeServiceBinding is proxy code generated from the Exchange Web Services WSDL file (see https://msdn.microsoft.com/en-us/library/office/dd877040(v=exchg.140).aspx); Microsoft.Exchange.WebServices.Data is the Managed API, which is a different thing.
The solution supplied by Aaron worked great for me. However, returning the substring gave me issues, since ours was an HTML string rather than plain text and it didn't parse correctly. So I replaced the StreamReader portion with the following. This still returns a string, but it correctly parses the HTML it was returned as.
XDocument doc;
using (Stream responseStream = webResponse.GetResponseStream())
{
    doc = XDocument.Load(responseStream);
}
return doc.Root.Descendants().Where(d => d.Name.LocalName == "Message").Select(d => d.Value).FirstOrDefault();
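To illustrate the LocalName trick in isolation, here is a self-contained sketch run against a hypothetical, heavily trimmed GetMailTips-style response (the element names mirror the EWS types namespace, but the document itself is made up for the example):

```csharp
using System;
using System.Linq;
using System.Xml.Linq;

class OofParser
{
    // Extract the first <t:Message> value, matching by local name so the
    // namespace prefix used in the response doesn't matter.
    public static string ExtractMessage(string soapXml)
    {
        XDocument doc = XDocument.Parse(soapXml);
        return doc.Root.Descendants()
                  .Where(d => d.Name.LocalName == "Message")
                  .Select(d => d.Value)
                  .FirstOrDefault();
    }

    static void Main()
    {
        // Hypothetical, trimmed-down response body.
        string xml =
            "<Envelope xmlns:t=\"http://schemas.microsoft.com/exchange/services/2006/types\">" +
            "<Body><t:OutOfOffice><t:ReplyBody>" +
            "<t:Message>I am out until Monday.</t:Message>" +
            "</t:ReplyBody></t:OutOfOffice></Body></Envelope>";
        Console.WriteLine(ExtractMessage(xml)); // I am out until Monday.
    }
}
```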
Ok, I've been racking my brain on this one solo for too long. I've been unable to crack it even with hours spent on this and many other sites.
Essentially, I'm trying to strip some data from a webpage behind a LogIn page using WebRequest/Response. (I have gotten this to work using a WebBrowser control with some layered events which navigate to the different web pages but it's causing some problems when trying to refactor - not to mention it's been stated that using a hidden form to do the work is 'bad practice'.)
This is what I have:
string formParams = string.Format("j_username={0}&j_password={1}", userName, userPass);
string cookieHeader;

WebRequest request = WebRequest.Create(_signInPage);
request.ContentType = "text/plain";
request.Method = "POST";
byte[] bytes = Encoding.ASCII.GetBytes(formParams);
request.ContentLength = bytes.Length;
using (Stream os = request.GetRequestStream())
{
    os.Write(bytes, 0, bytes.Length);
}
WebResponse response = request.GetResponse();
cookieHeader = response.Headers["Set-Cookie"];

WebRequest getRequest = WebRequest.Create(sessionHistoryPage);
getRequest.Method = "GET";
getRequest.Headers.Add("Cookie", cookieHeader);
WebResponse getResponse = getRequest.GetResponse();
try
{
    using (StreamReader sr = new StreamReader(getResponse.GetResponseStream()))
    {
        textBox1.AppendText(sr.ReadToEnd());
    }
}
catch (Exception ex)
{
    MessageBox.Show(ex.Message);
    throw;
}
So far, I'm able to get to the proper page from the first link but when I go to the second, it sends me back to the login page as if I didn't log in.
The problem may lie in the cookies not getting captured correctly, but I'm a novice, so maybe I'm just doing it wrong. It captures the cookies sent back from the POST (JSESSIONID and S2V); however, when we go to the GET, the Firefox Web Console shows that the browser sends JSESSIONID, S2V, and a SPRING_SECURITY_REMEMBER_ME_COOKIE, which I believe is the cookie set when I click the "Remember Me" box on the login form.
I've tried many different ways of doing this using the resources of SO but I have yet to get to the page I need. So, for the sake of the hair I have left, I've decided to ask for help on good ole SO. (This is one of those things I don't want to let up on - stubborn like that sometimes)
If someone wants the actual address of the site I'm trying to log into, I'd be more than happy to send it to a couple people in a private message.
Code that I changed to reflect a suggested answer by Wal:
var request = (HttpWebRequest)WebRequest.Create(sessionHistoryPage);
request.Credentials = new NetworkCredential(userName, userPass);
request.CookieContainer = new CookieContainer();
request.PreAuthenticate = true;
WebResponse getResponse = request.GetResponse();
try
{
    using (StreamReader sr = new StreamReader(getResponse.GetResponseStream()))
    {
        textBox1.AppendText(sr.ReadToEnd());
    }
}
catch (Exception ex)
{
    MessageBox.Show(ex.Message);
    throw;
}
This suggestion, at least the way I implemented it, didn't work.
As Krizz suggested, I changed the code to use CookieContainer, transferring the cookies from one request to the other; however, the response just gives me back the original login page, as if I didn't log in.
Are there certain sites that just WILL NOT allow this type of behavior?
Final Solution
The final solution was proposed by Adrian Iftode, who stated that the website I'm trying to log into might not allow authentication without a valid session, so adding a GET at the beginning of the process allowed me to get that session cookie.
Thanks for all your help guys!
I was doing a similar sort of cookie transfer for a website written in PHP.
Clearly you are passing the cookie, but maybe it is like the situation I ran into:
var phpsessid = response.Headers["Set-Cookie"].Replace("; path=/", String.Empty);
The Set-Cookie header contains other related info about the cookie and possibly instructions for other cookies. I had one cookie with its info (Path) plus the session id, which I needed to send back to the server so the server would know that I am the same client that did the GET request.
The new request had to include this cookie
request.Headers["Cookie"] = phpsessid;
You already do this, but make sure you send every cookie you receive back to the server.
Considering session and authentication, there are two cookies: one for the session and one for authentication, and some servers/applications might not allow authentication without a valid session. What I mean is that you might need to pass the session cookie too. So the steps would be:
Do first a GET request to obtain the session cookie.
Next do the POST request to authenticate and get the auth cookie.
Use both cookies to navigate to the protected pages.
Also check this question; it doesn't show the entire class, but the idea is to keep the CookieContainer in the same class, add the new cookies from POST/GET requests, and assign the container to each new request, as Krizz answered.
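A sketch of those three steps with one shared CookieContainer; the URLs and Spring Security form field names are placeholders, and the flow only runs with a `--live` argument since it needs a real site:

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;

class LoginFlow
{
    // One container shared by every request keeps the session cookie (from the
    // initial GET) and the auth cookie (from the POST) together.
    static readonly CookieContainer Cookies = new CookieContainer();

    public static HttpWebRequest NewRequest(string url)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.CookieContainer = Cookies;
        return request;
    }

    static void Main(string[] args)
    {
        if (args.Length == 0 || args[0] != "--live")
            return; // needs a real site to run against

        // 1. GET the login page so the server issues a session cookie.
        using (NewRequest("https://example.com/login").GetResponse()) { }

        // 2. POST credentials; the auth cookie lands in the same container.
        var login = NewRequest("https://example.com/j_spring_security_check");
        login.Method = "POST";
        login.ContentType = "application/x-www-form-urlencoded";
        byte[] body = Encoding.ASCII.GetBytes("j_username=me&j_password=secret");
        login.ContentLength = body.Length;
        using (Stream s = login.GetRequestStream())
        {
            s.Write(body, 0, body.Length);
        }
        using (login.GetResponse()) { }

        // 3. Navigate to the protected page with both cookies attached.
        using (var response = NewRequest("https://example.com/history").GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}
```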
Try using CookieContainer, a class which keeps cookie context across several requests. You simply create an instance of it and assign it to each WebRequest.
Therefore, modifying your code:
string formParams = string.Format("j_username={0}&j_password={1}", userName, userPass);
string cookieHeader;

var cookies = new CookieContainer(); // added this line
var request = WebRequest.Create(_signInPage) as HttpWebRequest; // modified line
request.CookieContainer = cookies; // added this line
request.ContentType = "text/plain";
request.Method = "POST";
byte[] bytes = Encoding.ASCII.GetBytes(formParams);
request.ContentLength = bytes.Length;
using (Stream os = request.GetRequestStream())
{
    os.Write(bytes, 0, bytes.Length);
}
request.GetResponse(); // removed some code here, no need to read the response manually

var getRequest = WebRequest.Create(sessionHistoryPage) as HttpWebRequest; // modified line
getRequest.CookieContainer = cookies; // added this line
getRequest.Method = "GET";
WebResponse getResponse = getRequest.GetResponse();
try
{
    using (StreamReader sr = new StreamReader(getResponse.GetResponseStream()))
    {
        textBox1.AppendText(sr.ReadToEnd());
    }
}
catch (Exception ex)
{
    MessageBox.Show(ex.Message);
    throw;
}
I created a RESTful web service (WCF) where I check credentials on each request. One of my clients is an Android app, and everything seems great on the server side: I get the request and, if it has the proper header, I process it, etc.
Now I created client app that uses this service. This is how I do GET:
// Create the web request
var request = WebRequest.Create(Context.ServiceURL + uri) as HttpWebRequest;
if (request != null)
{
    request.ContentType = "application/json";
    // Add authentication to the request
    request.Credentials = new NetworkCredential(Context.UserName, Context.Password);
    // Get the response
    using (var response = request.GetResponse() as HttpWebResponse)
    {
        // Get the response stream
        if (response != null)
        {
            var reader = new StreamReader(response.GetResponseStream());
            var s = reader.ReadToEnd();
            var serializer = new JavaScriptSerializer();
            var returnValue = (T)serializer.Deserialize(s, typeof(T));
            return returnValue;
        }
    }
}
So, this code gets my resource and deserializes it. As you see, I'm passing credentials in my call.
Then, while debugging on the server side, I noticed that I get two requests every time: one without the authentication header, to which the server sends back a response, and then a second request that comes back with credentials. I think this is bad for my server; I'd rather not make extra round trips. How should I change the client so this doesn't happen? See the screenshot of Fiddler.
EDIT:
This is the Java code I use from Android; it doesn't do the double call:
MyHttpResponse response = new MyHttpResponse();
HttpClient client = mMyApplication.getHttpClient();
try
{
    HttpGet request = new HttpGet(serviceURL + url);
    request.setHeader(new BasicHeader(HTTP.CONTENT_TYPE, "application/json"));
    request.addHeader("Authorization", "Basic " + Preferences.getAuthorizationTicket(mContext));
    ResponseHandler<String> handler = new BasicResponseHandler();
    response.Body = client.execute(request, handler);
    response.Code = HttpURLConnection.HTTP_OK;
    response.Message = "OK";
}
catch (HttpResponseException e)
{
    response.Code = e.getStatusCode();
    response.Message = e.getMessage();
    LogData.InsertError(mContext, e);
}
The initial request never specifies the Basic header for authentication. Additionally, since a realm is specified, you have to get that from the server. So the client first asks "hey, I need this stuff", and the server answers "who are you? The realm for answering is 'secure area'" (because the realm means something here). Just because you added it here:
request.Credentials = new NetworkCredential(Context.UserName, Context.Password);
doesn't mean it is guaranteed to be attached to every request.
Then you respond with the username/password (in this case Basic, so it's base64-encoded as name:password), and the server decodes it and says "OK, you're all clear, here's your data".
This is going to happen on a regular basis, and there's not a lot you can do about it. I would also suggest that you turn on HTTPS, since the authentication otherwise travels in plain text. (What you show seems to be over an intranet, but if you do go over the internet, make it HTTPS.)
Here's a link to Wikipedia that might help you further: http://en.wikipedia.org/wiki/Basic_access_authentication
OK, I got it. I manually set the HTTP header instead of using request.Credentials:
request.Headers.Add(HttpRequestHeader.Authorization, "Basic " + Convert.ToBase64String(Encoding.UTF8.GetBytes(Context.UserName + ":" + Context.Password)));
Now I see only single requests, as expected.
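Since the header value is just the base64 encoding of name:password, it can be built and sanity-checked in isolation:

```csharp
using System;
using System.Text;

class BasicAuth
{
    // Build the value for the Authorization header used in Basic auth:
    // "Basic " + base64("name:password").
    public static string HeaderValue(string userName, string password)
    {
        string raw = userName + ":" + password;
        return "Basic " + Convert.ToBase64String(Encoding.UTF8.GetBytes(raw));
    }

    static void Main()
    {
        Console.WriteLine(HeaderValue("user", "password")); // Basic dXNlcjpwYXNzd29yZA==
    }
}
```

Setting this header on the very first request is exactly what request.Credentials does not do, which is why the preemptive header avoids the 401 round trip.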
As an option, you can use the PreAuthenticate property of HttpClientHandler. This requires only a couple more lines:
var client = new HttpClient(new HttpClientHandler
{
    Credentials = yourCredentials,
    PreAuthenticate = true
});
With this approach, only the first request is sent without credentials; all subsequent requests include them.