I am using C# to create shared access signatures for new resources (the user should have create privileges for new resources on my storage account).
The MS documentation is out of date, and I can't get it to work by following the various blog posts I've gone through.
Right now my code looks like this:
public static string GetBlobSharedAccessSignatureUrl(CloudBlobContainer container, string nameOfBlobToCreateSaSfor)
{
    var blob = container.GetBlockBlobReference(nameOfBlobToCreateSaSfor);
    var policy = new SharedAccessBlobPolicy
    {
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
        Permissions = SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Read
    };
    string sas = blob.GetSharedAccessSignature(policy);
    return blob.Uri.AbsoluteUri + sas;
}
The returned URL (for my local machine) looks like this, which seems correct:
http://127.0.0.1:10000/devstoreaccount1/photos/photos_4.jpg?sv=2012-02-12&se=2013-01-20T10%3A13%3A17Z&sr=b&sp=rw&sig=xxx
I started the Azure storage emulator and tried to POST to this URL through Fiddler (I also tried PUT).
I am getting errors (404 or 400, depending on which variant of this function I tried).
Do I need to do something else? (In older examples I saw the resource being created at that location beforehand, which I've tried as well, but it didn't work either...)
The Azure SDK version is 2.0, so MS blog posts (and other tutorials) from before October 2012 are broken (also according to the MS dev blog: http://blogs.msdn.com/b/windowsazurestorage/archive/2012/10/29/windows-azure-storage-client-library-2-0-breaking-changes-amp-migration-guide.aspx).
Any help would be appreciated.
If you're posting through Fiddler or through your code, please make sure you add the "x-ms-blob-type" request header and set its value to "BlockBlob". Take a look at this sample code that uploads a file:
FileInfo fInfo = new FileInfo(fileName); // fileName is the full path of the file.
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(blobSaSUrl);
NameValueCollection requestHeaders = new NameValueCollection();
requestHeaders.Add("x-ms-blob-type", "BlockBlob");
req.Method = "PUT";
req.Headers.Add(requestHeaders);
req.ContentLength = fInfo.Length;
byte[] fileContents = new byte[fInfo.Length];
using (FileStream fs = fInfo.OpenRead())
{
    fs.Read(fileContents, 0, fileContents.Length);
    using (Stream s = req.GetRequestStream())
    {
        s.Write(fileContents, 0, fileContents.Length);
    }
    using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
    {
    }
}
With the newer Azure.Storage.Blobs (v12) SDK, the equivalent looks like this (key here is a user delegation key):

// Create a SAS token that's valid for one hour.
BlobSasBuilder sasBuilder = new BlobSasBuilder()
{
    BlobContainerName = containerName,
    BlobName = blobName,
    Resource = "b",
    StartsOn = DateTimeOffset.UtcNow,
    ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)
};

// Specify read permissions for the SAS.
sasBuilder.SetPermissions(BlobSasPermissions.Read);

// Use the key to get the SAS token.
string sasToken = sasBuilder.ToSasQueryParameters(key, accountName).ToString();

// Construct the full URI, including the SAS token.
UriBuilder fullUri = new UriBuilder()
{
    Scheme = "https",
    Host = string.Format("{0}.blob.core.windows.net", accountName),
    Path = string.Format("{0}/{1}", containerName, blobName),
    Query = sasToken
};
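For completeness, here is a sketch of how the resulting URI could be consumed with the v12 `BlobClient`. This assumes the `fullUri` built above and a SAS with read permission; treat it as an illustration, not part of the original answer:

```csharp
using System;
using Azure.Storage.Blobs;

// The SAS token in the query string authorizes the request,
// so no separate credential is needed.
BlobClient blobClient = new BlobClient(fullUri.Uri);

// Read permission was granted above, so a download should succeed.
var download = blobClient.DownloadContent();
Console.WriteLine(download.Value.Content.ToString());
```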
I use the Google Cloud Storage APIs. I create a signed URL like this:
var credential = new ServiceAccountCredential(new ServiceAccountCredential.Initializer("username@iam.gserviceaccount.com").FromPrivateKey(PrivateKey));
var urlSigner = UrlSigner.FromServiceAccountCredential(credential);
string url = urlSigner.Sign(bucketName, "Sample.txt", TimeSpan.FromDays(7));
var httpClient = new System.Net.Http.HttpClient();
System.Net.Http.HttpResponseMessage response = await httpClient.GetAsync(url);
var content = await response.Content.ReadAsByteArrayAsync();
This works. But I can't use a generation number to download a specific object version through the signed URL; I'm looking for the equivalent of IfGenerationMatch. This is a normal download, without a signed URL, where I can pass it:
await client.DownloadObjectAsync(
bucket: bucketName,
objectName: sourcePath,
destination: destinationPath,
progress: progress,
options: new DownloadObjectOptions()
{ ChunkSize = 1048576, Range = rangeHeaderValue, IfGenerationMatch = data.GenerationNo }
);
Keep in mind that as mentioned here, the signed URLs can only access resources in Cloud Storage through XML API endpoints.
The IfGenerationMatch precondition is only present in the JSON API, as such, you will not find it using the signed URLs. You would need to use the x-goog-if-generation-match precondition since that's the equivalent in the XML API.
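As a sketch of that suggestion (the `url` and `generation` values are placeholders, and with V4 signing you may need to declare the header when creating the signed URL, e.g. via UrlSigner's request template support):

```csharp
using System.Net.Http;
using System.Threading.Tasks;

// 'url' is the signed URL generated earlier; 'generation' is the
// object generation you expect the current object to match.
var httpClient = new HttpClient();
var request = new HttpRequestMessage(HttpMethod.Get, url);

// XML API equivalent of the JSON API's IfGenerationMatch precondition.
request.Headers.Add("x-goog-if-generation-match", generation.ToString());

HttpResponseMessage response = await httpClient.SendAsync(request);
// A 412 Precondition Failed response means the generation did not match.
```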
I am attempting to sign my own HTTP request, since there is no C# SDK for the AWS Kinesis Video Streams PutMedia API, but I am getting the following error:
StatusCode: 403, ReasonPhrase: 'Forbidden'
x-amzn-ErrorType: InvalidSignatureException:http://internal.amazon.com/coral/com.amazon.coral.service/
Here is a gist of what my code looks like:
var streamName = "audio-stream-test";
var service = "kinesisvideo";
var endpoint = GetPutMediaEndpoint(streamName);
var host = GetHostFromEndpoint(endpoint);
var region = GetRegionFromEndpoint(endpoint);
var t = DateTime.UtcNow;
var canonical_uri = $"{endpoint}/putMedia";
var httpRequestMessage = new HttpRequestMessage(HttpMethod.Post, new Uri(canonical_uri));
httpRequestMessage.Headers.Add("connection", "keep-alive");
httpRequestMessage.Headers.Add("host", host);
httpRequestMessage.Headers.Add("Transfer-Encoding", "chunked");
httpRequestMessage.Headers.Add("user-agent", "AWS-SDK-KVS/2.0.2");
httpRequestMessage.Headers.Add("x-amzn-fragment-acknowledgment-required", "1");
httpRequestMessage.Headers.Add("x-amzn-fragment-timecode-type", "ABSOLUTE");
httpRequestMessage.Headers.Add("x-amzn-producer-start-timestamp", (t - DateTime.MinValue).TotalMilliseconds.ToString());
httpRequestMessage.Headers.Add("x-amzn-stream-name", streamName);
httpRequestMessage.Headers.Add("x-amz-security-token", sessionToken);
var byteArray = File.ReadAllBytes(filePath);
var content = new ByteArrayContent(byteArray);
httpRequestMessage.Content = content;
var httpClient = new HttpClient();
var aws4RequestSigner = new AWS4RequestSigner(accessKey, secretAccessKey);
var signedHttpRequestMessage = aws4RequestSigner.Sign(httpRequestMessage, service, region).Result;
var httpResponseMessage = httpClient.SendAsync(signedHttpRequestMessage);
I am using the Aws4RequestSigner NuGet package to sign the request. Any ideas what I am doing wrong here? Has anyone tried to use the AWS Kinesis Video Stream with C#/.NET successfully?
Two potential issues with the pseudo-code:
1. If you're using a session token, then the request signing should include the session token as well, not only the access key/secret access key combination.
2. The body of PutMedia is "endless", since it streams out as a realtime stream. As such, the data shouldn't be included in the signature calculation.
This answers your follow-up question: "the actual "content" is not being added to the stream. I see the PUT connection from KVS but no data added".
After you get a 200 by setting the HTTP headers properly for signing with the code below, you need to set your content on signedHttpRequestMessage:
var httpResponseMessage = httpClient.SendAsync(signedHttpRequestMessage);
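Put differently, a sketch using the variable names from the question (and assuming the signer returns the same message instance with the auth headers applied):

```csharp
// Sign first; per the point above, the streaming body is not part of
// the signature calculation, so attach the payload afterwards.
var signedHttpRequestMessage = await aws4RequestSigner.Sign(httpRequestMessage, service, region);
signedHttpRequestMessage.Content = new ByteArrayContent(File.ReadAllBytes(filePath));
var httpResponseMessage = await httpClient.SendAsync(signedHttpRequestMessage);
```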
PowerShell has a great function in the Az.Accounts module - Connect-AzAccount - that lets me authenticate against my Azure subscription and publish APIMs easily. I need to port the code to C# and am unable to find an easy equivalent.
Has anyone found a straightforward way to do this? The Azure libraries and the many security aspects seem to update frequently, so it's difficult to find a recent, relevant example.
My ultimate goal is to authenticate against Azure, then publish APIs to the API gateway.
The PowerShell code:
$user = "account@domain.com"
$password = ConvertTo-SecureString -String "password goes here" -AsPlainText -Force
$azAccount = @{
    subscription = "subscription-guid-goes-here"
    credential = New-Object -TypeName "System.Management.Automation.PSCredential" -ArgumentList $user,$password
}
Connect-AzAccount @azAccount
Currently, the Azure Management Library for .NET can only manage a subset of Azure resources, and Azure API Management is not included.
So I suggest you use the Azure REST API to manage your resources.
The Azure REST API is protected by Azure AD, so the first step is to acquire an access token for authentication.
Here is a sample using the Microsoft.IdentityModel.Clients.ActiveDirectory package:
static string GetToken()
{
string tenantId = "your tenant id or name, for example: hanxia.onmicrosoft.com";
string clientId = "1950a258-227b-4e31-a9cf-717495945fc2"; // it is a public client for every tenant.
string resource = "https://management.core.windows.net/";
string username = "user name, jack@hanxia.onmicrosoft.com";
string password = "password, D******";
var upc = new UserPasswordCredential(username, password);
var context = new AuthenticationContext("https://login.microsoftonline.com/" + tenantId);
AuthenticationResult result = context.AcquireTokenAsync(resource, clientId, upc).Result;
return result.AccessToken;
}
After that, you can call Azure REST API along with adding authorization header. Here is a POST request sample:
public static string PostRequest(string url, string access_token, string data)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.Method = "POST";
    request.ContentType = "application/json";
    request.Headers.Add("Authorization", "Bearer " + access_token);
    //request.Headers.Add("other header", "its value");
    byte[] buffer = Encoding.UTF8.GetBytes(data ?? "");
    request.ContentLength = buffer.Length;
    // Close the request stream before asking for the response.
    using (Stream requestStream = request.GetRequestStream())
    {
        requestStream.Write(buffer, 0, buffer.Length);
    }
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    using (StreamReader reader = new StreamReader(response.GetResponseStream(), Encoding.UTF8))
    {
        return response.StatusCode + " " + reader.ReadToEnd();
    }
}
The existing answer is a bit odd; it links an old, deprecated library. The official library does support managing API Management. That would be the preferred way of managing Azure resources with C#, and you don't have to recreate anything.
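A minimal sketch using the current Azure.ResourceManager.ApiManagement package (the resource identifiers are placeholders, and this API surface changes between versions, so check the package docs for exact types):

```csharp
using System;
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.ApiManagement;

// DefaultAzureCredential covers environment, managed identity, and
// interactive auth - the closest C# analogue to Connect-AzAccount.
var armClient = new ArmClient(new DefaultAzureCredential());

// Placeholder identifiers - substitute your own values.
var id = ApiManagementServiceResource.CreateResourceIdentifier(
    "subscription-guid-goes-here", "my-resource-group", "my-apim-service");
ApiManagementServiceResource apim = armClient.GetApiManagementServiceResource(id);

// From here you can manage APIs, e.g. enumerate the existing ones.
foreach (var api in apim.GetApis())
{
    Console.WriteLine(api.Data.DisplayName);
}
```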
I am trying to use Tweetsharp's SendTweetWithMedia with an image which I don't have stored locally; I only have a URL. All the examples of SendTweetWithMedia I have found use a file on the local system.
var thumb = "http://somesite.net/imageurl";
var service = new TwitterService(key, secret);
service.AuthenticateWith(token, tokenSecret);
var req = WebRequest.Create(thumb);
using (var stream = req.GetResponse().GetResponseStream())
{
response = service.SendTweetWithMedia(new SendTweetWithMediaOptions
{
Status = tweet.Trim(),
Images = new Dictionary<string, Stream> { { fullname, stream } }
});
}
I get the following error from SendTweetWithMedia:
'System.NotSupportedException': This stream does not support seek operations.
I could download the file from the url and save locally, but I'd rather use the url. Is this possible?
In the end, I just created a temporary file:
byte[] data;
using (var client = new WebClient())
{
data = client.DownloadData(thumb);
}
File.WriteAllBytes($"{Path.GetTempPath()}\\xyz.jpg", data);
Best answer I could come up with. Still a few more lines than I'd like, though.
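For reference, one way to avoid the temp file entirely is to buffer the download into a seekable MemoryStream and hand that to SendTweetWithMedia; a sketch using the variable names from the question:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Net;

byte[] data;
using (var client = new WebClient())
{
    data = client.DownloadData(thumb);
}

// MemoryStream supports Seek, which SendTweetWithMedia requires.
using (var stream = new MemoryStream(data))
{
    var response = service.SendTweetWithMedia(new SendTweetWithMediaOptions
    {
        Status = tweet.Trim(),
        Images = new Dictionary<string, Stream> { { fullname, stream } }
    });
}
```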
Does anyone have experience with cache-control on Windows Azure? I just can't get it to work. I have a JSON file which I want some caching rules on. I have the following code:
var matchContainer = _blobClient.GetContainerReference(match.ClassContainer);
matchContainer.CreateIfNotExists();
matchContainer.SetPermissions(new BlobContainerPermissions
{
PublicAccess = BlobContainerPublicAccessType.Blob
});
// save the blob into the blob service
string uniqueBlobName = string.Format("matchdata/" + match.KeyPath + ".json").ToLower();
CloudBlockBlob blob = matchContainer.GetBlockBlobReference(uniqueBlobName);
var matchString = match.ToString();
MemoryStream stream = new MemoryStream();
StreamWriter writer = new StreamWriter(stream);
writer.AutoFlush = true;
writer.Write(matchString);
stream.Seek(0, SeekOrigin.Begin);
blob.UploadFromStream(stream);
blob.Properties.CacheControl = "max-age=3600, must-revalidate";
blob.SetProperties();
This is exactly the same as this sample: Sample of setting Cache-Control.
The header just doesn't get set. Hope you can help.
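For what it's worth, one variant worth trying (a sketch, not verified against this SDK version) is to set the property before the upload, so it goes out with the Put Blob request itself rather than in a follow-up SetProperties call:

```csharp
// Set the cache-control property before uploading so the storage
// client sends it as part of the Put Blob request.
blob.Properties.CacheControl = "max-age=3600, must-revalidate";
stream.Seek(0, SeekOrigin.Begin);
blob.UploadFromStream(stream);
```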
If you are using Fiddler on Windows 8.1 Preview, I have noticed that it doesn't always hit the cache. I had to use Chrome to validate the cache hit for my blog post.
Alex