Note: I posted this in dropboxforum at: https://www.dropboxforum.com/t5/API-support/Masking-Dropbox-URL/m-p/217458#M11358
We have a Dropbox business account.
We want to move several files from our web site server into Dropbox and use Dropbox as a storage solution.
When a user wants to download one of these files, we would like to keep the URL pointing to our domain and download the file directly from Dropbox.
Our site is based on ASP.NET (C#).
I found several solutions for doing this using the old Public folder, but I haven't been able to find an updated solution.
I wouldn't mind having these files publicly shared, and even keeping a table in the database with each shared link.
But I'd rather have the URL:
https://www.OurDomain.com/File?id=1
instead of:
https://www.dropbox.com/s/fxwygu566u3u2l6/doc.pdf?dl=0
EDIT:
Here's an article explaining exactly what I want to do, but it's based on the OLD Public folder of Dropbox, when you could predict the URL in Dropbox.
You can use the DownloadAsync method, which is part of the Dropbox API. Based on an example from the Dropbox.NET tutorial, you can do this:
async Task&lt;byte[]&gt; Download(DropboxClient dbx, string folder, string file)
{
    using (var response = await dbx.Files.DownloadAsync(folder + "/" + file))
    {
        // Await the content; the method must return Task<byte[]> for this to compile.
        return await response.GetContentAsByteArrayAsync();
    }
}
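To keep the visible URL on your own domain (e.g. https://www.OurDomain.com/File?id=1), you could combine that download call with an MVC controller action that looks up the Dropbox path by id and streams the bytes back. A minimal sketch, assuming ASP.NET MVC; the `GetDropboxPathById` lookup, the access token placeholder, and the content type are all assumptions, not from the original post:

```csharp
using System.Threading.Tasks;
using System.Web.Mvc;
using Dropbox.Api;

public class FileController : Controller
{
    // Handles /File?id=1 and proxies the file from Dropbox,
    // so the Dropbox URL is never exposed to the user.
    public async Task<ActionResult> Index(int id)
    {
        // Hypothetical helper: maps the database id to a Dropbox path, e.g. "/docs/doc.pdf".
        string dropboxPath = GetDropboxPathById(id);

        using (var dbx = new DropboxClient("<access token>"))
        using (var response = await dbx.Files.DownloadAsync(dropboxPath))
        {
            byte[] content = await response.GetContentAsByteArrayAsync();
            // Content type is assumed; pick it per file or store it in the table.
            return File(content, "application/pdf", response.Response.Name);
        }
    }

    private string GetDropboxPathById(int id)
    {
        // Placeholder: look the path up in your shared-links table.
        throw new System.NotImplementedException();
    }
}
```

This matches the "table in the database with each shared link" idea from the question: the table maps your ids to Dropbox paths, and the browser only ever sees your domain.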
Email Link
So I currently work for an engineering company and we receive files via Aconex (Aconex is a web based document management system for all consulting teams on a given project). Currently we have a system where we download the files (there is a link in the email that leads to the Aconex website) from the email and file them in a dated folder under the specific project. I've attached an image of the Aconex email link.
Now for the issue. Sometimes it can be quite overwhelming when you receive 20+ project-related emails in a day (on top of everything else), and some of these may slip through the cracks.
Basically I would like to automate this process somehow. I want the user to be able to add the email link to the application, hit 'Process' and the files are then downloaded and filed under the specific project.
I've got some basic programming experience (mainly in c#) and would like to use this as my first 'real world' programming project.
Any help that can be offered is really appreciated.
Thanks people!
So, actually you want to make an HTTP request to download it from a publicly available web server?
You would need these two using directives at the top of the file:
using System.IO;
using System.Net;
The IO namespace will help you work with local file system paths, directories, etc. Check: https://msdn.microsoft.com/en-us/library/54a0at6s(v=vs.110).aspx
The Net namespace contains many options for creating requests or downloading files.
Then, in a method, you create a WebRequest instance, e.g. with the static Create method, which takes a URI object as input:
var httpRequest = WebRequest.Create(urlObject);
// here you setup your stuff like authorization or request method etc:
httpRequest.Method = "GET";
httpRequest.Timeout = settings.timeout;
// and finally call the request (this is an async approach)
httpRequest.BeginGetResponse(getResponse, httpRequest); // pass the request as async state
Here you get the response:
public void getResponse(IAsyncResult __result)
{
    // The request object was passed through as the async state above.
    var httpRequest = (WebRequest)__result.AsyncState;
    WebResponse response = httpRequest.EndGetResponse(__result);
    // here you deal with the response
}
Or you may use a simpler way:
var webClient = new WebClient();
Uri urlObject = new Uri("http://yourUrl");
String localPath = "local\\filesystem\\path.file";
webClient.DownloadFileAsync(urlObject, localPath, this);
Check: https://msdn.microsoft.com/en-us/library/w8bysebz(v=vs.110).aspx
Have I got your issue right?
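Putting the pieces together for the dated-project-folder workflow from the question, a rough end-to-end sketch might look like this. The URL, project root, and file name are placeholders (the real Aconex link may require authentication), and the date-based folder naming is an assumption:

```csharp
using System;
using System.IO;
using System.Net;

class AconexDownloader
{
    static void Main()
    {
        // Placeholder URL: in the real tool this would be parsed out of the email link.
        var urlObject = new Uri("http://example.com/document.pdf");

        // Assumed filing convention: a dated folder under the project directory.
        string projectRoot = @"C:\Projects\SomeProject";
        string datedFolder = Path.Combine(projectRoot, DateTime.Now.ToString("yyyy-MM-dd"));
        Directory.CreateDirectory(datedFolder); // no-op if the folder already exists

        string localPath = Path.Combine(datedFolder, "document.pdf");
        using (var webClient = new WebClient())
        {
            // Synchronous variant for simplicity; DownloadFileAsync works the same way.
            webClient.DownloadFile(urlObject, localPath);
        }
    }
}
```

From there, the "Process" button would just loop over the links the user pasted in and call this for each one.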
I have placed an XML file on OneDrive and shared it with everyone who uses my program. Now I want to read this into my C# application. I have tried a lot of code; a list below:
I have tried with an HttpClient and an HttpResponseMessage, but it gives me the HTML code of the OneDrive web page. I understand why, if you look at the page.
I have also looked at using the LiveSDK in my application with the following code:
try
{
    LiveConnectClient liveClient = new LiveConnectClient(this.session);
    LiveOperationResult operationResult = await liveClient.GetAsync(@"https://onedrive.live.com/?****");
    dynamic result = operationResult.Result;
}
catch (LiveConnectException ex)
{
    Debug.WriteLine("LiveConnectException caught => " + ex.Message);
}
But I can't create an instance of LiveConnectClient, because you need to log in and I don't want that. Found at these links: "Working with Microsoft OneDrive folders and files" on MSDN and "Store LiveConnectSession in WP7" on Stack Overflow.
Can anyone help me or explain it? I need only the XML code (not the HTML code) and will make a graph of the data. I use C# to load the file. Sorry for my bad English or any mistakes; I'm new to Microsoft Live accounts and accessing them from an application.
Thanks
If you have a single file that you'd just like your clients to be able to download, you'll just want to assemble a variant on the sharing link that you received from the share UI, which should look like this:
https://onedrive.live.com/redir
?resid=8bf6ae9dbc6caa4c!116505
&authkey=!AD0q0bcg_i3dmvg
&ithint=file%2ctxt
You will want to modify the path from redir into download, and you can remove the ithint parameter, so the request looks like the following:
https://onedrive.live.com/download.aspx
?resid=8bf6ae9dbc6caa4c%216505
&authkey=%21AD0q0bcg_i3dmvg
This URL should allow you to make an anonymous request and download that shared file for your application's usage.
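With the rewritten link, the anonymous fetch is a one-liner with HttpClient. A small sketch using the placeholder resid/authkey values from the example link above:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class OneDriveFetch
{
    static async Task Main()
    {
        // The download form of the sharing link (values are the placeholders from above).
        string url = "https://onedrive.live.com/download.aspx"
                   + "?resid=8bf6ae9dbc6caa4c%216505"
                   + "&authkey=%21AD0q0bcg_i3dmvg";

        using (var client = new HttpClient())
        {
            // The endpoint redirects to the raw file content; HttpClient
            // follows redirects by default, so this returns the XML itself,
            // not the OneDrive HTML page.
            string xml = await client.GetStringAsync(url);
            Console.WriteLine(xml);
        }
    }
}
```

No LiveSDK or login is needed for this, because the authkey in the sharing link carries the anonymous access grant.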
For more robust sharing scenarios, I'd recommend looking at https://dev.onedrive.com for a list of all supported scenarios, with examples of how to use them.
My ASP.NET MVC application will be deployed to a series of load-balanced web servers. One problem I'm still working out is how to handle dynamically-uploaded file content, such as user-uploaded images -- obviously, saving them on the server where they were uploaded won't allow them to be accessed from the other servers in the load balanced group.
Currently I'm planning to save these to a shared storage location, specifically a UNC path referring to a directory on our NAS; but I'm not sure how best to retrieve these files to display them to the client. I'm thinking I'll need to write a custom route handler of some kind to retrieve them from the non-web-accessible storage location on the server side and then stream them back to the client. This seems relatively straightforward to do, yet I'm struggling with how to begin to approach this in ASP.NET.
Another solution I've considered is creating a Virtual Directory in each application directory which points to the network directory.
I've even considered uploading the files to Amazon S3 (via the file upload handling code) and using CloudFront to deliver them, but I'd rather avoid the external service dependency.
Which approach do you recommend, and are there established best practices or even existing components/libraries available for accomplishing this sort of thing?
In ASP.NET MVC you can handle this with a controller action, like so:
public class SharedImageController : Controller {
    public ActionResult GetImage(String imageId) {
        String uncPath = GetImageUncLocationFromId( imageId );
        // FileResult is abstract; the File() helper returns a concrete result
        // and sets the Content-Type header for you.
        return File( uncPath, "image/jpeg" ); // change the content type as appropriate
    }
}
and in your HTML:
&lt;img src="&lt;%= Url.Action("GetImage", "SharedImage", new { imageId = "SomeImage.jpg" }) %&gt;" alt="Some descriptive text" /&gt;
You could make a custom HtmlHelper extension method to make this less error-prone if you'll be using this a lot.
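A possible shape for that extension method; the names here are illustrative, not from an existing library:

```csharp
using System.Web.Mvc;

public static class SharedImageHelpers
{
    // Renders <img src="/SharedImage/GetImage?imageId=..." alt="..." />
    // so views don't repeat (and possibly mistype) the Url.Action call.
    public static MvcHtmlString SharedImage(this HtmlHelper html, string imageId, string alt)
    {
        var urlHelper = new UrlHelper(html.ViewContext.RequestContext);
        var img = new TagBuilder("img");
        img.MergeAttribute("src", urlHelper.Action("GetImage", "SharedImage", new { imageId }));
        img.MergeAttribute("alt", alt);
        return MvcHtmlString.Create(img.ToString(TagRenderMode.SelfClosing));
    }
}
```

Usage in a view would then be `<%= Html.SharedImage("SomeImage.jpg", "Some descriptive text") %>`.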
I think there are two ways to fix it:
1. Use a tool to synchronize the files between machines.
This duplicates the files, so every machine has the same copy.
2. Upload the files to a network address like \\192.168.1.1\upload, host a site on IIS under a domain like img.domain.com, and use that domain in the image URLs.
The files are not duplicated, so you must make sure the browser can reach them; the image domain is not load-balanced.
Or upload the files to a cloud service.
I have a client that wants to sell tutorial videos online. I already got previews of his tutorials streaming from CF (This is public). Now I want to use the c# sdk to generate private, time limited URLs to allow customers who purchased the tutorials to download them for a limited time period.
Once the payment has been confirmed, I want to generate a URL and send it to the client via email.
Does CF/.NET SDK support this?
Can someone point me at a sample? I have searched Google and got a little information overload: different examples from different versions of the SDK/management console. Please help me make sense of it all :)
Take a look at the class Amazon.CloudFront.AmazonCloudFrontUrlSigner, which has helper methods for creating presigned URLs to private distributions. For example, this code snippet creates a URL that is valid for one day.
var url = AmazonCloudFrontUrlSigner.GetCannedSignedURL(AmazonCloudFrontUrlSigner.Protocol.http, domainName, cloudFrontPrivateKey, file, cloudFrontKeyPairID, DateTime.Now.AddDays(1));
There are other utility methods in that class for adding more specific access rules.
Note this class was added in version 1.5.2.0 of the SDK, which came out in late August.
Yes, Amazon S3 and CloudFront both support presigned URL access. If you want faster content delivery, you should use CloudFront. Mr. Norm Johanson is correct: to generate a signed URL you will need a public/private key pair. You can use your own key pair and associate it with your Amazon S3 account, or you can generate one in your Amazon account and download it to generate presigned URLs.
You can use the GUI or code in S3SignURL to sign your URL
https://github.com/DigitalBodyGuard/S3SignURL
You can't do this with CloudFront (CF), but you can do this directly with S3. You simply call the GetPreSignedURL function to generate a time-limited URL to a specific (private) S3 item. This approach is covered in a tutorial here.
The simplest code sample is this:
// Assumes SDK v1-style client creation; accessKey/secretKey are your own credentials.
AmazonS3 client = AWSClientFactory.CreateAmazonS3Client(accessKey, secretKey);
GetPreSignedUrlRequest request = new GetPreSignedUrlRequest();
request.WithBucketName(bucketName);
request.WithKey(objectKey);
request.Verb = HttpVerb.GET; // Default.
request.WithExpires(DateTime.Now.AddMinutes(5));
string url = client.GetPreSignedURL(request);
I want to write a simple utility to upload images to various free image hosting websites like TinyPic or Imageshack via a right-click context menu for the file.
How can I do this using .NET? I've seen some Linux scripts that use cURL to post images to these websites, but I'm not sure how I could create the POST request, complete with an image, in C#.
Can someone point me in the right direction?
EDIT:
I've found a pretty good resource. Cropper, a free screenshot tool written in .NET, has a lot of open-source plugins. One of them is a SendToTinyPic plugin, complete with source. Link here:
http://www.codeplex.com/cropperplugins
The FlickrNet API makes this extremely easy for working with Flickr from .NET. You have to have a Flickr account as well as an API key and shared secret. Once you have what you need, working with the API is very simple:
// http://www.flickr.com/services/api/misc.api_keys.html
string flickrApiKey = "<api key>";
string flickrApiSharedSecret = "<shared secret>";
string flickrAuthenticationToken = "<authentication token>";
Flickr flickr = new Flickr( flickrApiKey, flickrApiSharedSecret );
flickr.AuthToken = flickrAuthenticationToken;
foreach ( FileInfo image in new FileInfo[] {
    new FileInfo( @"C:\image1.jpg" ),
    new FileInfo( @"C:\image2.jpg" ) } )
{
    string photoId = flickr.UploadPicture(
        image.FullName, image.Name, image.Name, "tag1, tag2" );
}
Use HttpWebRequest.
Using this class, you can POST data to a remote HTTP address; set the content type to multipart/form-data and post the binary data from the image with the request.
http://msdn.microsoft.com/en-us/library/system.net.httpwebrequest(VS.71).aspx
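For the common case of a single file field, WebClient.UploadFile wraps the multipart/form-data encoding for you, which is simpler than building the request body by hand with HttpWebRequest. A sketch against a hypothetical endpoint (the URL and field expectations are assumptions; each image host has its own form layout):

```csharp
using System;
using System.Net;
using System.Text;

class ImagePoster
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            // UploadFile issues a POST with Content-Type: multipart/form-data
            // and streams the file bytes as the form's file part.
            byte[] response = client.UploadFile(
                "http://example.com/upload", // hypothetical upload endpoint
                "POST",
                @"C:\image1.jpg");

            // The host's response (often HTML or JSON containing the image URL).
            Console.WriteLine(Encoding.UTF8.GetString(response));
        }
    }
}
```

If the host requires extra form fields alongside the file, you'd need to build the multipart body manually with HttpWebRequest, writing each part and boundary yourself.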
For ImageShack, take a look at this application.
TinyPic.com doesn't have an API as far as I know, but the Cropper SendToTinyPic plugin tries to upload using "screen scraping". The official version of the plugin doesn't work right now, but I put together a patch using the same approach and submitted it to the cropperplugins project. It's just one source module that changed. Anyone can download the plugins project, then drop in my patch, and it should work.
With the patch, PrtScrn or Alt-PrtScrn will save the image, upload it to TinyPic, and put the URL of the raw image on your clipboard. All in 2 seconds. Easy.
If you don't want the actual tool, you can still look at the source code of my patch to see how to POST a page with form data and a file upload. No direct link; see http://cropperplugins.codeplex.com/SourceControl/PatchList.aspx and look for #3239.
This example image was produced and then auto-uploaded to tinypic.com with the Alt-PrtScrn key-combo.
To embed it here, I just had to ctrl-V because the URL is stored on the clipboard.