I have a client who wants to sell tutorial videos online. I already have previews of his tutorials streaming from CF (this part is public). Now I want to use the C# SDK to generate private, time-limited URLs that allow customers who purchased the tutorials to download them for a limited time period.
Once the payment has been confirmed, I want to generate a URL and send it to the customer via email.
Does CF/the .NET SDK support this?
Can someone point me to a sample? I have searched Google and got a little information overload: different examples from different versions of the SDK/management console. Please help me make sense of it all :)
Take a look at the class Amazon.CloudFront.AmazonCloudFrontUrlSigner, which has helper methods for creating presigned URLs to private distributions. For example, this code snippet creates a URL that is valid for one day.
var url = AmazonCloudFrontUrlSigner.GetCannedSignedURL(
    AmazonCloudFrontUrlSigner.Protocol.http,
    domainName,               // the distribution's domain name
    cloudFrontPrivateKey,     // FileInfo for the CloudFront private key file
    file,                     // path of the file within the distribution
    cloudFrontKeyPairID,      // the CloudFront key pair ID
    DateTime.Now.AddDays(1)); // the URL expires in one day
There are other utility methods in that class for adding more specific access rules.
Note this class was added in version 1.5.2.0 of the SDK, which came out in late August.
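For example, if you also want to pin a download to the purchaser's IP address, there is a GetCustomSignedURL variant that accepts an IP range. A rough sketch, assuming the overload that takes an ipRange string (check your SDK version for the exact signature):
var url = AmazonCloudFrontUrlSigner.GetCustomSignedURL(
    AmazonCloudFrontUrlSigner.Protocol.http,
    domainName,
    cloudFrontPrivateKey,
    file,
    cloudFrontKeyPairID,
    DateTime.Now.AddDays(1), // the URL expires in one day
    "203.0.113.15/32");      // example: restrict to a single purchaser IP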
Yes, Amazon S3 and CloudFront both support presigned URL access. If you want faster content delivery, you should use CloudFront. Norm Johanson's answer is correct. To generate a signed URL you will need a public/private key pair. You can use your own key pair and associate it with your Amazon account, or you can generate one in your Amazon account and download it to generate presigned URLs.
You can use the GUI or code in S3SignURL to sign your URL
https://github.com/DigitalBodyGuard/S3SignURL
You can't do this with CloudFront (CF), but you can do this directly with S3. You simply call the GetPreSignedURL function to generate a time-limited URL to a specific (private) S3 item. This approach is covered in a tutorial here.
The simplest code sample is this:
// Create the S3 client (AWS SDK for .NET v1 style).
AmazonS3 client = AWSClientFactory.CreateAmazonS3Client(accessKey, secretKey);

GetPreSignedUrlRequest request = new GetPreSignedUrlRequest();
request.WithBucketName(bucketName);
request.WithKey(objectKey);
request.Verb = HttpVerb.GET; // GET is the default.
request.WithExpires(DateTime.Now.AddMinutes(5)); // link valid for five minutes

string url = client.GetPreSignedURL(request);
This C# code, with a proper access token (for scope drive.readonly) in the Authorization header, works fine and returns the file metadata in JSON format:
_httpClient.GetAsync($"https://content.googleapis.com/drive/v3/files/{someDriveFileId}")
However, this code (still with the same access token) returns a 403:
_httpClient.GetAsync($"https://content.googleapis.com/drive/v3/files/{someDriveFileId}?alt=media")
with the following HTML response body (exactly as returned):
<html><title>Error 403 (Forbidden)!!1</title><a
href=//www.google.com/><span id=logo
aria-label=Google></span></a><p><b>403 Forbidden</b><p>Your client
does not have permission.\n
I've been using this code in production for years and it worked fine, so I suppose it's related to the recent changes at Google regarding the OAuth screens?
I'm not sure what I should change here, or what I'm (now) doing wrong. Also, the message seems a little sketchy for something made at Google, which makes me think there may be an issue on their side?
UPDATE:
Thanks to @Iamblichus for fixing the layout of my original answer. I'm new to posting on Stack Overflow.
Even though the change described in the original answer appears to be at the root of the problem, I found it difficult to turn the troubleshooting steps into a working solution. I was also already passing the Authorization Bearer token, and that was not fixing my problem. After some trial and error, the change I had to make was:
Broken GET URL:
https://content.googleapis.com/drive/v2/files/MY_FILE_ID?key=MY_KEY&alt=media&source=downloadUrl
Working GET URL:
https://www.googleapis.com/drive/v2/files/MY_FILE_ID?alt=media&source=downloadUrl
NOTE:
I am using v2 of the API, so you would need to update the URL to v3 if you are using that.
In the file object I get back from the Google file picker v2 API, there is no single URL that supports the change made in authentication. I had to concatenate onto the file.selfLink string to make the new URL work:
var url = file.selfLink + "?alt=media&source=downloadUrl";
ORIGINAL ANSWER:
Is it possible that https://cloud.google.com/blog/products/application-development/upcoming-changes-to-the-google-drive-api-and-google-picker-api is your problem?:
download calls to files.get, revisions.get and files.export endpoints which authenticate using the access token in the query parameter will no longer be supported.
Only requests that download media content (alt=media) are affected by this change.
The access token should be provided in the HTTP header, like Authorization: Bearer oauth2-token (see the sketch after the list below) or, if that's not possible, follow the workarounds provided in the referenced documentation:
For file downloads, redirect to the webContentLink which will instruct the browser to download the content. If the application wants to display the file to the user, they can simply redirect to the alternateLink in v2 or webViewLink in v3.
For file exports, redirect to the export link in exportLinks with the desired mime type which will instruct the browser to download the content.
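A minimal sketch of the header-based approach with HttpClient, assuming you already hold a valid OAuth2 access token in accessToken and a file ID in someDriveFileId:
using System.Net.Http;
using System.Net.Http.Headers;

var httpClient = new HttpClient();
// Pass the token in the Authorization header, not as a query parameter.
httpClient.DefaultRequestHeaders.Authorization =
    new AuthenticationHeaderValue("Bearer", accessToken);

// alt=media requests the file content rather than its metadata.
var response = await httpClient.GetAsync(
    $"https://www.googleapis.com/drive/v3/files/{someDriveFileId}?alt=media");
response.EnsureSuccessStatusCode();
byte[] fileBytes = await response.Content.ReadAsByteArrayAsync();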
Reference:
Changes in authorization to Google Drive API
Authorization via HTTP header
v2 files get documentation
v3 files get documentation
I am implementing embedded signing in an MVC C# project. When I post the document for signing, it redirects to the DocuSign page and then redirects back to the return URL, using the code below:
private const string returnUrl = "http://localhost:5050/DSReturn";
...
return Redirect(viewUrl.Url);
Here I want to get the signed document in the response instead of via email. How is this possible? Or is there any other way to get the signed document after the signature process finishes?
You would make the API call to the "document" resource (.../documents/{documentId or constant}).
The post-signing redirect URL is for the purpose of continuing your web workflow. The "event" parameter allows your web application to generate the correct page or results. For example, the "Loan Co" sample at the Dev Center generates a post-signing page that has links for the document, which in turn result in the API call to retrieve the document. In a real-world integration, the redirect URL is not a reliable indicator that the envelope is "completed": the signer could close the browser before the redirect was executed, or the envelope may have subsequent signers. The Connect service provides a much more reliable trigger for downloading the documents.
Expanding on what @WTP mentioned, you have a couple of approaches. The first is a raw API call to the /v2/accounts/{accountId}/envelopes/{envelopeId}/documents/{documentId} endpoint, retrieving the file from the response. More information can be found here.
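A rough sketch of that raw call with HttpClient, assuming the demo/sandbox base URL https://demo.docusign.net/restapi and an access token obtained separately:
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;

var http = new HttpClient();
http.DefaultRequestHeaders.Authorization =
    new AuthenticationHeaderValue("Bearer", accessToken);

// For this endpoint the response body is the document (PDF) itself.
var url = $"https://demo.docusign.net/restapi/v2/accounts/{accountId}/envelopes/{envelopeId}/documents/{documentId}";
byte[] pdf = await http.GetByteArrayAsync(url);
File.WriteAllBytes("signed.pdf", pdf);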
Another option you may or may not be aware of is the DocuSign Client NuGet package. Your code would then look something like this pseudocode:
Stream documentStream = EnvelopesApi.GetDocument(accountId, envelopeId, documentId);
If you are not using the NuGet package yet, keep in mind there is setup work you will have to do to set up the EnvelopesApi. That information can be found here.
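A minimal sketch of that setup with the DocuSign.eSign client, assuming you already have an OAuth access token (the exact configuration calls vary between SDK versions):
using System.IO;
using DocuSign.eSign.Api;
using DocuSign.eSign.Client;

// Point the client at the sandbox REST endpoint and authenticate.
var apiClient = new ApiClient("https://demo.docusign.net/restapi");
apiClient.Configuration.AddDefaultHeader("Authorization", "Bearer " + accessToken);

var envelopesApi = new EnvelopesApi(apiClient);
Stream documentStream = envelopesApi.GetDocument(accountId, envelopeId, documentId);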
I'm not terribly familiar with CF, so please point me in the right direction. In order to utilize AWS CF, do my URLs need to look the same? In other words, when serving a static image, the URL string is the same: just some domain and a path to an image. However, in my customer-based case, my URLs contain an auth token or some other random information that is irrelevant to the content itself. Is it still possible to utilize Amazon's CDN?
You can configure CF to ignore query strings, in which case it will cache based only on the base part of the URL.
Query strings can also be useful as a means of invalidating the cache should the file change: for example, bumping /logo.png?v=1 to /logo.png?v=2 forces CF to fetch and cache the new version.
I have recently been trying to write code to add and delete content from an Amazon S3 bucket. I am completely new to Amazon S3 and the AWS .NET SDK.
The bucket region endpoint is http://sqs.eu-west-1.amazonaws.com so I constructed my client like this:
_s3Client = AWSClientFactory.CreateAmazonS3Client(accessKey, awsSecretKey, new AmazonS3Config().WithServiceURL("http://sqs.eu-west-1.amazonaws.com"));
If I leave out the AmazonS3Config bit I get this error:
A redirect was returned without a new location. This can be caused by
attempting to access buckets with periods in the name in a different
region then the client is configured for.
When I put in the AmazonS3Config bit I no longer get that error but I appear to have no access to this bucket at all or any other bucket that I would usually have access to. Any request I send returns null.
I have tested my code with other buckets that are configured to the standard US region and it all works well. The single difference is in the CreateAmazonS3Client method where I set the config with the EU endpoint.
Could anybody give me some guidance on how I should set up my client to work with a bucket in the EU(Ireland) region. I have been searching for a few hours and every tutorial or document I have followed has not worked so far.
Just use the standard endpoint - s3.amazonaws.com
// Use the default endpoint over plain HTTP (SDK v1 style).
AmazonS3Config S3Config = new AmazonS3Config {
    ServiceURL = "s3.amazonaws.com",
    CommunicationProtocol = Amazon.S3.Model.Protocol.HTTP
};
AmazonS3 client = Amazon.AWSClientFactory.CreateAmazonS3Client(AWS_Key, AWS_SecretKey, S3Config);

// Upload a local file to the given bucket and key.
PutObjectRequest UploadToS3Request = new PutObjectRequest();
UploadToS3Request.WithFilePath(localPath)
    .WithBucketName(bucket)
    .WithKey(key);
client.PutObject(UploadToS3Request);
To whom it may still concern...
With the old AWS SDK (version 1), you can simply create the S3 client without a region or an AmazonS3Config. There is no need to specify a service URL; it uses the default mentioned above. The only time you really need the region when working with S3 is when you create a bucket, which is rarely a requirement for an application.
This works for me, and all communication I perform with S3 is over HTTPS.
With the new AWS SDK for .NET (version 2 and above), the region parameter seems to be required; in fact the AmazonS3Client will throw an exception if not given one. I've tried working around this limitation by specifying a generic https://s3.amazonaws.com URL and failed, because the new SDK does not follow the 301 redirect from the default (US-EAST-1, I think) endpoint.
So in summary, it is best to specify the region, even with the old API, to avoid breaking in the future. And if your application makes cross-region calls, which are (possibly) slower and more expensive, it's probably best that your code testifies to that.
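For example, with version 2+ of the SDK, creating a client pinned to the EU (Ireland) region looks something like this sketch:
using Amazon;
using Amazon.S3;

// The SDK signs and routes requests for eu-west-1 when given the region.
var client = new AmazonS3Client(accessKey, secretKey, RegionEndpoint.EUWest1);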
OK, I've never seen this before when coding against and sending 3rd-party SOAP API calls, but it looks like PayPal requires their bigger clients to use an X509 certificate to send API calls, rather than just sending over a standard API signature like most APIs require you to do.
Am I the only one who thinks this is kind of strange or not standard?
http://en.wikipedia.org/wiki/X.509
I don't get how this relates to an API call. I've seen example code they gave me in C# implementing the ICertificatePolicy interface in .NET, but it's just foreign to me, as is how it relates to the fact that they still give you an API signature in the PayPal sandbox regardless. So why would I need to read a physical certificate file AND use an API signature? I guess I don't see the link between the certificate and the PayPal SOAP API.
This is a common thing among larger names when dealing with connections that demand a more secure "handshake", and that's all it is used for.
This file is made from a root certificate and is usually a .pem, .p12, or .pfx. Here is an example using Python and cURL; it is very simple to do, and if you have any trouble with the X.509 file, I would get in contact with whoever you buy your root certificate from, or just search Google for how to export the file you need (I personally always end up with a .p12 file).
Here is the Python code:
import pycurl
import urllib
import StringIO

c = pycurl.Curl()
c.setopt(pycurl.URL, FirstDataAPI_URL)
c.setopt(pycurl.HTTPHEADER, ["Accept:"])
c.setopt(pycurl.POST, 1)
c.setopt(pycurl.POSTFIELDS, urllib.urlencode(FirstDataAPI_PostData))
b = StringIO.StringIO()
c.setopt(pycurl.WRITEFUNCTION, b.write)
c.setopt(pycurl.FOLLOWLOCATION, 1)
c.setopt(pycurl.MAXREDIRS, 5)
# Point cURL at the client certificate used for the TLS handshake.
#c.setopt(pycurl.SSLCERT, '/home/***/***/***/ssl/digitalID.p12')
c.setopt(pycurl.SSLCERT, '/home/***/***/***/ssl/productionDigitalId.p12')
c.setopt(pycurl.SSLCERTTYPE, 'p12')
c.setopt(pycurl.SSLCERTPASSWD, '******')
c.perform()
For use with SOAP, I would look for a setting that allows you to set a certificate file, and you will be set.
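In .NET that usually means attaching the certificate to the generated SOAP proxy. A minimal sketch, assuming a classic ASMX-style proxy (the PayPalAPISoapBinding name and the certificate path are hypothetical placeholders):
using System.Security.Cryptography.X509Certificates;

// Load the client certificate (path and password are placeholders).
var cert = new X509Certificate2(@"C:\certs\paypal_cert.p12", "certPassword");

// ASMX-style proxies expose a ClientCertificates collection.
var service = new PayPalAPISoapBinding(); // hypothetical generated proxy
service.ClientCertificates.Add(cert);

// A WCF client would instead use:
// client.ClientCredentials.ClientCertificate.Certificate = cert;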
Just as a side note, this goes to show that PayPal has not updated their API in quite a few years. Most APIs I work on that require an X509 cert are extremely outdated, and I haven't seen this used in an API written in the last 2 years.
You're dealing with people's money, and while I'm unaware of the specifics of how the certificates work, basically they ensure that payments sent from your application are more secure.
A simple API key would be easier to spoof and would allow fraud more easily, I assume.