Application to download files from URL - C#

So I currently work for an engineering company and we receive files via Aconex (Aconex is a web based document management system for all consulting teams on a given project). Currently we have a system where we download the files (there is a link in the email that leads to the Aconex website) from the email and file them in a dated folder under the specific project. I've attached an image of the Aconex email link.
Now for the issue. Sometimes it can be quite overwhelming when you receive 20+ project-related emails in a day (on top of everything else), and some of these may slip through the cracks.
Basically I would like to automate this process somehow. I want the user to be able to add the email link to the application, hit 'Process' and the files are then downloaded and filed under the specific project.
I've got some basic programming experience (mainly in c#) and would like to use this as my first 'real world' programming project.
Any help that can be offered is really appreciated.
Thanks people!

So you essentially want to make an HTTP request to download the files from a publicly available web server?
You would need these two namespaces at the top of your file:
using System.IO;
using System.Net;
The IO namespace will help you work with local file system paths, directories and so on. Check: https://msdn.microsoft.com/en-us/library/54a0at6s(v=vs.110).aspx
The Net namespace contains many options for creating requests or downloading files.
Then in a method you create a WebRequest instance, e.g. via the static Create method, which takes a URI object (or URL string) as input:
var httpRequest = WebRequest.Create(urlObject);
// here you set up things like authorization or the request method:
httpRequest.Method = "GET";
httpRequest.Timeout = 30000; // timeout in milliseconds
// and finally start the request (this is an async approach);
// pass the request itself as state so the callback can access it
httpRequest.BeginGetResponse(GetResponseCallback, httpRequest);
Here you get the response:
private void GetResponseCallback(IAsyncResult result)
{
    var request = (WebRequest)result.AsyncState;
    WebResponse response = request.EndGetResponse(result);
    // here you deal with the response
}
Or you may use a simpler way:
var webClient = new WebClient();
Uri urlObject = new Uri("http://yourUrl");
string localPath = "local\\filesystem\\path.file";
webClient.DownloadFileAsync(urlObject, localPath, this);
Check: https://msdn.microsoft.com/en-us/library/w8bysebz(v=vs.110).aspx
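If you go the WebClient route and want to know when the asynchronous download finishes (or report progress to the user), you can hook the WebClient events before starting the download. A minimal sketch, reusing the placeholder URL and path from above:

```csharp
using System;
using System.Net;

var webClient = new WebClient();
// fires repeatedly while the download runs
webClient.DownloadProgressChanged += (s, e) =>
    Console.WriteLine(e.ProgressPercentage + "%");
// fires once when the download completes, fails, or is cancelled
webClient.DownloadFileCompleted += (s, e) =>
    Console.WriteLine(e.Error == null ? "Download finished" : e.Error.Message);
webClient.DownloadFileAsync(new Uri("http://yourUrl"), "local\\filesystem\\path.file");
```

This would let your application show per-file progress while it processes the batch of email links.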
Have I got your issue right?


Dropbox Url Masking

Note: I posted this on the Dropbox forum at: https://www.dropboxforum.com/t5/API-support/Masking-Dropbox-URL/m-p/217458#M11358
We have a Dropbox business account, and we want to move several files from our web site's server into Dropbox and use Dropbox as a storage solution.
When a user wants to download one of these files, we would like to keep the URL pointing to our domain and download the file directly from Dropbox.
Our site is built on ASP.NET (C#).
I found several solutions for how to do this using the old Public folder, but I haven't been able to find an updated solution.
I wouldn't mind having these files publicly shared, and even keeping a table in the database with each shared link.
But I'd rather have the URL:
https://www.OurDomain.com/File?id=1
instead of:
https://www.dropbox.com/s/fxwygu566u3u2l6/doc.pdf?dl=0
EDIT:
Here's an article explaining exactly what I want to do, but it's based on the OLD Public folder of Dropbox, when you could predict the URL in Dropbox.
You can use the DownloadAsync method, which is part of the Dropbox API. Based on an example from the Dropbox.NET tutorial, you can do this:
async Task<byte[]> Download(DropboxClient dbx, string folder, string file)
{
    using (var response = await dbx.Files.DownloadAsync(folder + "/" + file))
    {
        return await response.GetContentAsByteArrayAsync();
    }
}
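To get the masked URL the question asks for (https://www.OurDomain.com/File?id=1), you could expose an endpoint on your own site that looks up the stored Dropbox path for the given id and streams the bytes back. A rough sketch as an ASP.NET generic handler; the LookupDropboxPath helper, the handler name, and the access token are placeholders you would implement yourself:

```csharp
// File.ashx.cs -- hypothetical sketch, not production code
using System.Web;
using Dropbox.Api;

public class FileHandler : IHttpHandler
{
    public bool IsReusable { get { return false; } }

    public void ProcessRequest(HttpContext context)
    {
        string id = context.Request.QueryString["id"];
        // LookupDropboxPath is a placeholder: map the id to the
        // "/folder/file.pdf" path stored in your shared-link table.
        string dropboxPath = LookupDropboxPath(id);

        using (var dbx = new DropboxClient("YOUR_ACCESS_TOKEN"))
        using (var response = dbx.Files.DownloadAsync(dropboxPath).Result)
        {
            byte[] bytes = response.GetContentAsByteArrayAsync().Result;
            context.Response.ContentType = "application/pdf";
            context.Response.BinaryWrite(bytes);
        }
    }

    private string LookupDropboxPath(string id)
    {
        // e.g. query your database table of shared links here
        throw new System.NotImplementedException();
    }
}
```

The user only ever sees https://www.OurDomain.com/File?id=1; the Dropbox URL never appears in the browser.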

Creating a TwiML file while working with Twilio and C#.NET

I am working with the twilio.com API in C#.NET. The following C# code calls the phone number I have given:
string accountSid = "AC5xxxxxxxxxfdb0cf485d52ce";
string authToken = "57fxxxxxxxxx1xxx71a";
var client = new TwilioRestClient(accountSid, authToken);
var request = new CallListRequest();
var callList = client.ListCalls(request);
var options = new CallOptions();
options.Url = "http://demo.twilio.com/docs/voice.xml";//
options.To = "+919876123456";
options.From = "+15163425887";
var call = client.InitiateOutboundCall(options);
MessageBox.Show(call.Sid);
When I call my phone through the code above, picking up the call connects me to the XML file (a TwiML file) mentioned in options.Url, and I hear the message and the .mp3 file referenced in voice.xml. Now I want to replace this XML file with a custom XML file placed on a server of my own. For testing purposes I created an exact copy of voice.xml and placed it on my server, then changed the Url property to:
options.Url = "http://productionserver.com/voice.xml";
After making this change, when I pick up the phone, it says "An application error has occurred" and the call ends.
Has anyone worked with Twilio and experienced such a problem? Is there any step I am missing other than just creating an XML file?
Twilio evangelist here.
First I'd suggest checking to see if anything got logged in your App Monitor. I suspect you will see Twilio has logged a 500 error there.
If you are trying to serve the raw XML file from IIS, then you're probably running into the issue that IIS won't serve an XML file in response to a POST request, which is the default HTTP method Twilio uses to request the Url.
There are a couple of work-arounds for this:
You can set the Method property on the CallOptions to GET, which tells Twilio to make its request using the HTTP GET method, OR
Since this is static XML you are serving, you could use http://twimlbin.com to host the TwiML.
Hope that helps.
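For the first workaround, the change on the caller's side is one line. A sketch, assuming the CallOptions class exposes the Method property described above, and reusing the numbers from the question:

```csharp
var options = new CallOptions();
options.Url = "http://productionserver.com/voice.xml";
// ask Twilio to fetch the TwiML with GET instead of the default POST,
// so IIS will serve the static XML file
options.Method = "GET";
options.To = "+919876123456";
options.From = "+15163425887";
var call = client.InitiateOutboundCall(options);
```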

Facebook Links without browser

I've read a lot of posts on here and on other sites, but still haven't found an answer to my question. So here goes.
I have a Facebook link that requires you to be logged in. Is there a way, using .NET (C#), that I can use the Facebook API or something to "click" this link without a browser control?
Essentially, I have an application that I wrote that detects certain Farmville links. Right now I'm using a browser control to process the links. However, it's messy and crashes a lot.
Is there a way I can send the url along with maybe a token and api key to process the link?
Does anyone even understand what I'm asking? lol.
Disclaimer: I don't know what Facebook's API looks like, but I'm assuming it involves sending HTTP requests to their servers. I am also not 100% sure on your question.
You can do that with the classes in the System.Net namespace, specifically WebRequest and WebResponse:
using System.Net;
using System.IO;
...
HttpWebRequest req = (HttpWebRequest)WebRequest.Create("http://apiurl.com?args=go.here");
req.UserAgent = "The name of my program";
WebResponse resp = req.GetResponse();
string responseData = null;
using(TextReader reader = new StreamReader(resp.GetResponseStream())) {
responseData = reader.ReadToEnd();
}
//do whatever with responseData
You could put that in a method for easy access.
Sounds like you are hacking something... but here goes.
You might want to try a solution like Selenium, which is normally used for testing websites.
It is a little trouble to setup, but you can programmatically launch the facebook website in a browser of your choosing for login, programmatically enter username and password, programmatically click the login button, then navigate to your link.
It is kind of clunky, but it gets the job done no matter what, since it appears to Facebook that you are accessing their site from a browser.
I've tried similar sneaky tricks to enter my name over and over for Publisher's Clearing House, but they eventually wised up and banned my IP.

Best way to transfer files through a web service

My C# program communicates with a server using a web service, I need the client to download big files from the server and have the option to pause and continue their download, the downloader must also be authorized to download the file.
I had two thoughts on how to do that.
One is to use some third-party tool like wget to download the files. The problem with that is that I'd need to learn its commands, and I'm not certain I could show download progress in my program. Another issue is that I would have to use bare URLs to get the files from the server, which seems ugly and could let people download them straight off the server (I want downloads to be authorized, although this isn't a real issue since this is just a school project).
My other thought was to create a method on the web service that takes a position in the file and a number of bytes, returns them, and lets the client piece them together. It seems more complicated but more compelling, since the user must be authorized to download the file and I can use it to show the tester some more advanced programming skills ;). The concern is that it looks like it might be taxing on performance.
What's your opinion? what's the best way to download big files off a server?
Absent the need for authorization and partial downloads, WebClient.DownloadData or WebClient.DownloadDataAsync would be the preferred method of downloading a file from a server.
You could still use WebClient for the authorization by setting the Credentials in your WebClient object instance. If the user isn't authorized to download the file, based on those credentials, the server can return a 404 (Not found) or 403 (Forbidden).
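The Credentials assignment mentioned above is a one-liner. A minimal sketch, with placeholder URL and credentials; an unauthorized user would get the 403/404 back as a WebException:

```csharp
using System.Net;

var client = new WebClient();
// credentials sent with the request; the server decides whether
// this user is allowed to download the file
client.Credentials = new NetworkCredential("username", "password");
byte[] data = client.DownloadData("http://yourserver/files/bigfile.bin");
```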
If your server supports HTTP 1.1, the client can start in the middle of the file. To do so, you'll have to create a class that inherits from WebClient and override the GetWebRequest method. That method would then set the headers to do a positional GET.
class MyWebClient : WebClient
{
    public int StartDownloadAt { get; set; }

    protected override WebRequest GetWebRequest(Uri address)
    {
        HttpWebRequest req = (HttpWebRequest)base.GetWebRequest(address);
        // positional GET: ask the server to start at this byte offset
        req.AddRange(StartDownloadAt);
        return req;
    }
}
And in the code that uses it:
MyWebClient client = new MyWebClient();
client.StartDownloadAt = 1024 * 1024; // start download 1 megabyte into the file.
client.DownloadData(...);
The above is just an example. You'd probably want to make it more robust by resetting the StartDownloadAt property to 0 when a download is done (or aborted), and by skipping the AddRange call when StartDownloadAt is 0. To fully support ranges, you'd probably want properties for the start and end of the range, etc.
And, of course, the client will have to handle stitching the disparate downloaded pieces together after download is complete.
The point is that it should be possible, with a little work, by using the WebClient class.

How to integrate sharepoint document into web application and adding credentials for external users

Being new to integrating sharepoint documents into web applications I am struggling with the following task, so any help or guidance would be very much appreciated.
I have a page in my web application that needs to display a document from sharepoint.
I use the appropriate SharePoint web service to get a list of all the documents in the SharePoint list, then pick out the one I would like to retrieve and get the path to the document, so I end up with something like:
Company%20Hire/Hire%20Site%20Price%20Lists/0.pdf
Within our network I can do following:
iframePdf.Attributes["src"] = ConfigurationManager.AppSettings["SharepointUrl"] + _filePath;
This displays the document in an iframe... The problem I have is that from outside the network the SharePoint site can't be accessed (due to firewalls etc.).
We do have a VPN back to our network on the external server our web application sits on. If I use the document URL in a browser on that server, I am prompted for credentials and can then view the document.
Is there a way for the web application to use the VPN and credentials I give (e.g. through impersonation) to access the document and display it to external clients?
You could consider implementing a handler on your web application that is responsible for downloading documents from SharePoint and streaming them across the network.
One of the more-convenient classes for downloading content is System.Net.WebRequest. This tutorial can help you get rolling with a custom HTTP handler. On your web pages, you would provide to your user hyperlinks to the HTTP handler with, say, query string parameters designating the file to download. Your handler would download the file, set the appropriate content types and content disposition on the response, and copy the web request response bits to the response stream.
The Credentials property of the WebRequest is important for authenticating your client to SharePoint. It's just a little tricky if you are using NTLM Integrated Windows authentication, due to the double-hop problem. You can work around the double hop a number of ways, including explicitly prompting for user credentials (via a form), implementing Kerberos, or handling authorization yourself while impersonating a fixed privileged user.
Good luck!
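The handler described above could look roughly like the following. This is a sketch under assumptions: the handler name, the "file" query string parameter, and the fixed privileged credentials are all placeholders, and it impersonates one fixed user rather than solving the double-hop problem:

```csharp
// DocumentHandler.ashx.cs -- hypothetical sketch of the streaming handler
using System.Configuration;
using System.Net;
using System.Web;

public class DocumentHandler : IHttpHandler
{
    public bool IsReusable { get { return false; } }

    public void ProcessRequest(HttpContext context)
    {
        // the file to fetch is designated by a query string parameter
        string filePath = context.Request.QueryString["file"];

        var request = WebRequest.Create(
            ConfigurationManager.AppSettings["SharepointUrl"] + filePath);
        // authenticate to SharePoint as a fixed privileged user
        request.Credentials = new NetworkCredential("user", "password", "domain");

        using (var response = request.GetResponse())
        using (var stream = response.GetResponseStream())
        {
            context.Response.ContentType = "application/pdf";
            // copy the SharePoint response bits to our own response stream
            stream.CopyTo(context.Response.OutputStream);
        }
    }
}
```

Your web pages would then link to DocumentHandler.ashx?file=... instead of the SharePoint URL directly.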
I resolved it by doing the following:
I used the WebClient class, which allows you to add credentials, and just called its DownloadFile method. Then another page reads the PDF from the saved location (the location is kept in a session variable) and writes the file to the Response stream.
The following code did it:
using (var _webclient = new WebClient())
{
    _webclient.Credentials = new NetworkCredential(
        ConfigurationManager.AppSettings["DomainUsername"],
        ConfigurationManager.AppSettings["DomainPassword"],
        "domain");
    _webclient.DownloadFile(ConfigurationManager.AppSettings["SharePointPortal"] + _filePath, _path);
}
Portal.Common.Objects.CommonSessionHelper.Instance.IframeUrl = _path;
iframePdf.Attributes["src"] = "PdfEmbedder.aspx";
PdfEmbedder.aspx.cs is as follows:
protected void Page_Load(object sender, EventArgs e)
{
    try
    {
        if (!String.IsNullOrEmpty(Portal.Common.Objects.CommonSessionHelper.Instance.IframeUrl))
        {
            Response.Clear();
            Response.ContentType = "application/pdf";
            Response.WriteFile(Portal.Common.Objects.CommonSessionHelper.Instance.IframeUrl);
            Response.Flush();
            Response.Close();
            Portal.Common.Objects.CommonSessionHelper.Instance.IframeUrl = null;
        }
    }
    catch (Exception error)
    {
        Response.Write(error.Message);
    }
}
