Invalid paths using CloudFront create invalidation in C#

I am trying to invalidate CloudFront objects in C#/.NET and getting the following exception:
Your request contains one or more invalid invalidation paths.
My Function:
public bool InvalidateFiles(string[] arrayofpaths)
{
    for (int i = 0; i < arrayofpaths.Length; i++)
    {
        arrayofpaths[i] = Uri.EscapeUriString(arrayofpaths[i]);
    }
    try
    {
        Amazon.CloudFront.AmazonCloudFrontClient oClient = new Amazon.CloudFront.AmazonCloudFrontClient(MY_AWS_ACCESS_KEY_ID, MY_AWS_SECRET_KEY, Amazon.RegionEndpoint.USEast1);
        CreateInvalidationRequest oRequest = new CreateInvalidationRequest();
        oRequest.DistributionId = ConfigurationManager.AppSettings["CloudFrontDistributionId"];
        oRequest.InvalidationBatch = new InvalidationBatch
        {
            CallerReference = DateTime.Now.Ticks.ToString(),
            Paths = new Paths
            {
                Items = arrayofpaths.ToList<string>(),
                Quantity = arrayofpaths.Length
            }
        };
        CreateInvalidationResponse oResponse = oClient.CreateInvalidation(oRequest);
        oClient.Dispose();
    }
    catch
    {
        return false;
    }
    return true;
}
The array passed to the function contains a single URL like so:
images/temp_image.jpg
The image exists in the S3 bucket and loads in the browser via the CloudFront URL.
What am I doing wrong?

Your invalidation file paths need a / at the front of the string.
If you are in doubt, you can log into the AWS Management Console, go to CloudFront, select the distribution you are trying to invalidate files from, open Distribution Settings, and go to the Invalidations tab.
You can then create invalidations manually, which allows you to check that your paths are correct.
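For example, prefixing each path before building the request satisfies this rule; a minimal sketch against the function above (the StartsWith guard is an addition, to leave already-correct paths alone):
for (int i = 0; i < arrayofpaths.Length; i++)
{
    // CloudFront invalidation paths must begin with a forward slash,
    // e.g. "/images/temp_image.jpg" rather than "images/temp_image.jpg".
    if (!arrayofpaths[i].StartsWith("/"))
    {
        arrayofpaths[i] = "/" + arrayofpaths[i];
    }
}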

When you send an invalidation request for an object in CloudFront, you may still see your picture in the browser at the CloudFront URL even after the invalidation completes, because invalidation does not delete the object from the S3 bucket; on the next request for the image from your browser, CloudFront caches the URL images/temp_image.jpg at its edge locations again.
The effect of the invalidation becomes visible when you update the image with the same name.
Your invalidation function is correct.

Have you tried adding the forward slash at the beginning of the path? (/images/temp_image.jpg)

Access Internet Explorer Downloads List

I want to access the Internet Explorer download list.
I tried to access the history with UrlHistoryWrapperClass, but that does not give me the path of the downloaded file.
I need the filename, file path, and URL as well.
My problem is accessing the View Downloads list of Internet Explorer. If anyone has achieved that before, I appreciate the help.
Thanks in advance.
I do not believe you are able to get this information from the history (unless there is an exploit I'm not aware of). The best you can do is go through the history and, if needed, trigger a WebClient to re-download the file. Attached is a generic example for Internet Explorer for at least traversing the history:
using System.Collections.Generic;

public class InternetExplorer
{
    // List of URL objects
    public List<URL> URLs { get; set; } = new List<URL>();

    public IEnumerable<URL> GetHistory()
    {
        // Initiate the main history object
        UrlHistoryWrapperClass urlhistory = new UrlHistoryWrapperClass();
        // Enumerate URLs in the history
        UrlHistoryWrapperClass.STATURLEnumerator enumerator =
            urlhistory.GetEnumerator();
        // Iterate through the enumeration
        while (enumerator.MoveNext())
        {
            // Obtain the URL; replace single quotes to avoid confusion
            string url = enumerator.Current.URL.Replace('\'', ' ');
            // Use the title when present, eliminating single quotes;
            // fall back to an empty string otherwise
            string title = !string.IsNullOrEmpty(enumerator.Current.Title)
                ? enumerator.Current.Title.Replace('\'', ' ') : "";
            // Create a new entry
            URL U = new URL(url, title, "Internet Explorer");
            // Add the entry to the list
            URLs.Add(U);
        }
        // Optional
        enumerator.Reset();
        // Clear the URL history
        urlhistory.ClearHistory();
        return URLs;
    }
}
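The URL type used above is not defined in the answer; a minimal sketch of an assumed shape (the property names are illustrative, matching the (url, title, browser) constructor order used above):
public class URL
{
    public string Address { get; set; }
    public string Title { get; set; }
    public string Browser { get; set; }

    public URL(string address, string title, string browser)
    {
        Address = address;
        Title = title;
        Browser = browser;
    }
}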

Moving files with Google Drive API v3

I'm trying to move a file from one folder to another using the Google Drive API v3. I found documentation on how to do this here. I used the .NET sample code from the documentation page and created a method that looks like this:
public ActionResult MoveFile(string fileToMove, string destination)
{
    DriveService service = new DriveService(new BaseClientService.Initializer
    {
        HttpClientInitializer = <USER CREDENTIAL>,
        ApplicationName = "APPNAME"
    });
    var searchFiles = service.Files.List();
    searchFiles.Corpus = FilesResource.ListRequest.CorpusEnum.User;
    searchFiles.Q = "name = '" + fileToMove + "'";
    searchFiles.Fields = "files(*)";
    string fileToMoveId = searchFiles.Execute().Files[0].Id;
    searchFiles.Q = "name = '" + destination + "'";
    string destinationId = searchFiles.Execute().Files[0].Id;
    // Code used from documentation
    // Retrieve the existing parents to remove
    var getRequest = service.Files.Get(fileToMoveId);
    getRequest.Fields = "parents";
    var file = getRequest.Execute();
    var previousParents = String.Join(",", file.Parents);
    // Move the file to the new folder
    var updateRequest = service.Files.Update(file, fileToMoveId);
    updateRequest.Fields = "id, parents";
    updateRequest.AddParents = destinationId;
    updateRequest.RemoveParents = previousParents;
    file = updateRequest.Execute();
    return RedirectToAction("Files", new { folderId = destinationId });
}
When I execute this code I get the following error:
The parents field is not directly writable in update requests. Use the
addParents and removeParents parameters instead.
The error doesn't really make sense to me, because this code sample came from the documentation page itself. I can't figure out what other parameters they mean. What addParents and removeParents parameters do they mean? Are updateRequest.AddParents and updateRequest.RemoveParents not the right parameters?
OK, here is the problem.
var updateRequest = service.Files.Update(file, fileToMoveId);
The method requires that you send the body of a file to be updated. This normally makes sense, as any changes you want to make can go in the body.
The problem is that you got your file from a files.get call, which is totally normal; this is how you should be doing it. However, there are some fields in that file that you can't update, so by sending the full file the API rejects your update. If you check Files: update under Request body, you will see which fields are updatable.
Issue:
This is either a problem with the client library or with the API; I am going to have to track down a few people at Google to see which is the case.
Fix:
I did some testing, and sending an empty file object as the body works just fine. The file is moved.
// Empty body: the change is carried entirely by the AddParents and
// RemoveParents parameters. (fileToMove and directoryToMove here are the
// File objects from my test, not the strings in the question's method.)
var updateRequest = service.Files.Update(new Google.Apis.Drive.v3.Data.File(), fileToMove.Id);
updateRequest.AddParents = directoryToMove.Id;
updateRequest.RemoveParents = fileToMove.Parents[0];
var movedFile = updateRequest.Execute();
This method works well in your own drive, but not in a Team Drive, where a file (or folder) can strictly have only one parent. I do not have a solution for Team Drives.
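Mapped back onto the question's variables, the corrected part of MoveFile might read as follows; a sketch assuming fileToMoveId, destinationId, and previousParents as computed earlier in the question's method:
// Send an empty File as the request body so no read-only fields (such as
// parents) are written directly; the move itself is expressed through the
// AddParents and RemoveParents parameters.
var updateRequest = service.Files.Update(new Google.Apis.Drive.v3.Data.File(), fileToMoveId);
updateRequest.Fields = "id, parents";
updateRequest.AddParents = destinationId;
updateRequest.RemoveParents = previousParents;
var movedFile = updateRequest.Execute();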

How to extract the X-XSRF-TOKEN in a web performance test

I had written a web performance test which was earlier working fine. Developers have now added CSRF token validation (to prevent CSRF attacks on the website), and since then the test has started to fail (Error, Bad Request). I dug into it and found that the server generates an XSRF-TOKEN on the login request, which has to be passed in every request thereafter.
Now, to extract the token, we need to parse the response to the login request. How can we do it?
My coded test looks like this:
WebTestRequest request4 = new WebTestRequest("https://servertest:8080/WebConsole/Account/Login");
request4.Method = "POST";
request4.Headers.Add(new WebTestRequestHeader("Accept", "application/json, text/plain, */*"));
request4.Headers.Add(new WebTestRequestHeader("Referer", "https://servertest:8080/WebConsole/index.html#/"));
StringHttpBody request4Body = new StringHttpBody();
request4Body.ContentType = "application/json;charset=utf-8";
request4Body.InsertByteOrderMark = false;
request4Body.BodyString = "{\"UserName\":\"pkdomain\\\\administrator\",\"Password\":\"sqa#123\"}";
request4.Body = request4Body;
yield return request4;
request4 = null;
WebTestRequest request5 = new WebTestRequest("https://servertest:8080/WebConsole/scripts/home/Pages/home-view.html");
request5.ThinkTime = 4;
request5.Headers.Add(new WebTestRequestHeader("Accept", "text/html"));
request5.Headers.Add(new WebTestRequestHeader("Referer", "https://servertest:8080/WebConsole/index.html#/"));
yield return request5;
request5 = null;
I believe the XSRF-TOKEN is returned in a cookie. Assuming that this is true in your case, the Set-Cookie header field contains the value, and the required cookie must be extracted from it and saved to a context parameter. Subsequently that context parameter can be used wherever needed.
I suggest you create a sandbox .webtest file, do the steps below, then convert it to a coded test and copy the useful lines into the real test.
In more detail, the steps are:
Add an Extract HTTP Header extraction rule for the Set-Cookie header field to the request that returns the XSRF-TOKEN value. Save the extracted value to a context parameter of your choice; give its name in one of the properties of the extraction rule (see the image below). A sketch of the coded form of this rule appears after the plugin code.
Add a call of the plugin below to the first request after the one with the above extraction rule. It extracts the required field from the cookie header field. The image below shows setting the properties of the call. (You might change the plugin to be a PostRequest plugin and add it to the same request as the one with the extraction rule.)
public class ExtractCookieField : WebTestRequestPlugin
{
    public string AllCookiesCP { get; set; }
    public string FieldWantedCP { get; set; }
    public string SavedFieldCP { get; set; }

    // Expected to be called with AllCookiesCP containing text similar to:
    // SomeHeader=639025785406236250; path=/; XSRF-TOKEN=somestring; secure; HttpOnly
    public override void PreRequestDataBinding(object sender, PreRequestDataBindingEventArgs e)
    {
        string AllCookiesText = e.WebTest.Context[AllCookiesCP].ToString();
        foreach (string nameValuePair in AllCookiesText.Split(';'))
        {
            string[] nameAndValue = nameValuePair.Split(new char[] { '=' }, 2);
            if (nameAndValue[0].Trim() == FieldWantedCP)
            {
                string sessionTokenId = nameAndValue[1].Trim();
                e.WebTest.Context[SavedFieldCP] = sessionTokenId;
                e.WebTest.AddCommentToResult(string.Format("Setting {{{0}}} to '{1}'", SavedFieldCP, sessionTokenId));
                return;
            }
        }
        // Dropping out of the loop means that the field was not found.
        throw new WebTestException(string.Format("Cannot extract cookie field '{0}' from '{1}'", FieldWantedCP, AllCookiesText));
    }
}
The value of the XSRF-TOKEN should now be in the context parameter specified in the SavedFieldCP property of the plugin call.
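For reference, when the sandbox .webtest is converted to a coded test, the extraction rule from the first step comes out as wiring like the following; a sketch assuming the login request variable request4 and the context parameter name CookieValues:
// Extract the Set-Cookie header of the login response into {{CookieValues}}.
ExtractHttpHeader extractionRule1 = new ExtractHttpHeader();
extractionRule1.Header = "Set-Cookie";
extractionRule1.ContextParameterName = "CookieValues";
extractionRule1.Required = true;
request4.ExtractValues += new EventHandler<ExtractionEventArgs>(extractionRule1.Extract);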
This image shows the add-extraction-rule dialogue and setting the context parameter where the extracted header field is saved, i.e. into CookieValues. It also shows adding the plugin and setting its three properties. After the plugin runs, assuming it is successful, the token value should be saved into the context parameter XsrfToken. The parameter values can be modified in the .webtest file via the properties panels of the extraction rule and the plugin. The values should also be clearly visible as simple variables and strings in a coded web test.
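Once the token is in a context parameter, it can be attached to each subsequent request. A sketch with a hypothetical follow-up request; the header name X-XSRF-TOKEN is an assumption based on the question's title, so check what the server actually expects:
WebTestRequest request6 = new WebTestRequest("https://servertest:8080/WebConsole/api/SomeAction"); // hypothetical URL
request6.Method = "POST";
// {{XsrfToken}} was populated by the ExtractCookieField plugin above.
request6.Headers.Add(new WebTestRequestHeader("X-XSRF-TOKEN",
    this.Context["XsrfToken"].ToString()));
yield return request6;
request6 = null;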

Uploading image to Azure blob storage

I know this question can be interpreted as a duplicate, but I simply cannot get the blob service working. I have followed the standard example on MSDN and implemented it in my code. I can get my mobile service, with the supplied script from the example, to insert a blob with open properties. I then use this code to upload an image to the blob storage:
BitmapImage bi = new BitmapImage();
MemoryStream stream = new MemoryStream();
if (bi != null)
{
    WriteableBitmap bmp = new WriteableBitmap((BitmapSource)bi);
    bmp.SaveJpeg(stream, bmp.PixelWidth, bmp.PixelHeight, 0, 100);
}
if (!string.IsNullOrEmpty(uploadImage.SasQueryString))
{
    // Get the URI generated that contains the SAS
    // and extract the storage credentials.
    StorageCredentials cred = new StorageCredentials(uploadImage.SasQueryString);
    var imageUri = new Uri(uploadImage.ImageUri);
    // Instantiate a blob store container based on the info in the returned item.
    CloudBlobContainer container = new CloudBlobContainer(
        new Uri(string.Format("https://{0}/{1}",
            imageUri.Host, uploadImage.ContainerName)), cred);
    // Upload the new image as a BLOB from the stream.
    CloudBlockBlob blobFromSASCredential = container.GetBlockBlobReference(uploadImage.ResourceName);
    await blobFromSASCredential.UploadFromStreamAsync(stream); // error!
    // When you request an SAS at the container level instead of the blob level,
    // you are able to upload multiple streams using the same container credentials.
    stream = null;
}
I am getting an error in this code at the line marked // error!, with the following exception:
+ ex {Microsoft.WindowsAzure.Storage.StorageException: The remote server returned an error: NotFound. ---> System.Net.WebException: The remote server returned an error: NotFound. ---> System.Net.WebException: The remote server returned an error: NotFound.
This I do not understand, since the code that returns the string from the script is:
// Generate the upload URL with SAS for the new image.
var sasQueryUrl = blobService.generateSharedAccessSignature(item.containerName,
item.resourceName, sharedAccessPolicy);
// Set the query string.
item.sasQueryString = qs.stringify(sasQueryUrl.queryString);
// Set the full path on the new item,
// which is used for data binding on the client.
item.imageUri = sasQueryUrl.baseUrl + sasQueryUrl.path;
Of course this also shows that I do not completely grasp the construction of the blob storage, and therefore any help would be appreciated.
Comment elaborations
From the server code, it should grant write access for at least 5 minutes, so that should not be the issue. My server script is the same as in the link, but replicated here:
var azure = require('azure');
var qs = require('querystring');
var appSettings = require('mobileservice-config').appSettings;

function insert(item, user, request) {
    // Get storage account settings from app settings.
    var accountName = appSettings.STORAGE_ACCOUNT_NAME;
    var accountKey = appSettings.STORAGE_ACCOUNT_ACCESS_KEY;
    var host = accountName + '.blob.core.windows.net';
    if ((typeof item.containerName !== "undefined") && (item.containerName !== null)) {
        // Set the blob store container name on the item, which must be lowercase.
        item.containerName = item.containerName.toLowerCase();
        // If it does not already exist, create the container
        // with public read access for blobs.
        var blobService = azure.createBlobService(accountName, accountKey, host);
        blobService.createContainerIfNotExists(item.containerName, {
            publicAccessLevel: 'blob'
        }, function(error) {
            if (!error) {
                // Provide write access to the container for the next 5 mins.
                var sharedAccessPolicy = {
                    AccessPolicy: {
                        Permissions: azure.Constants.BlobConstants.SharedAccessPermissions.WRITE,
                        Expiry: new Date(new Date().getTime() + 5 * 60 * 1000)
                    }
                };
                // Generate the upload URL with SAS for the new image.
                var sasQueryUrl =
                    blobService.generateSharedAccessSignature(item.containerName,
                        item.resourceName, sharedAccessPolicy);
                // Set the query string.
                item.sasQueryString = qs.stringify(sasQueryUrl.queryString);
                // Set the full path on the new item,
                // which is used for data binding on the client.
                item.imageUri = sasQueryUrl.baseUrl + sasQueryUrl.path;
            } else {
                console.error(error);
            }
            request.execute();
        });
    } else {
        request.execute();
    }
}
The idea with the pictures is that other users of the app should be able to access them. As far as I understand, I have made the container publicly readable, but publicly writable only for 5 minutes. The URL for the blob I save in a mobile service table, where the user needs to be authenticated; I would like the same safety on the storage, but I do not know if this is accomplished. I am sorry for all the stupid questions, but I have not been able to solve this on my own, so I have to "seem" stupid :)
If someone ends up in here needing help: the problem for me was the URI. It should have been http and not https. Then there were no errors uploading.
But displaying the image, even in a test image control from the toolbox, did not succeed. The problem was that I had to set the stream back to the beginning:
stream.Seek(0, SeekOrigin.Begin);
Then the upload worked and I was able to retrieve the data.
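In context, the seek belongs between SaveJpeg and the upload call; a minimal sketch reusing the stream and blob variables from the question's code:
bmp.SaveJpeg(stream, bmp.PixelWidth, bmp.PixelHeight, 0, 100);
// SaveJpeg leaves the position at the end of the stream; rewind it,
// otherwise the upload starts at the end and sends zero bytes.
stream.Seek(0, SeekOrigin.Begin);
await blobFromSASCredential.UploadFromStreamAsync(stream);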

Embedding CGI video in site

Hello all, I have various web cameras I would like to embed in my site:
http://81.137.212.183:4483/GetData.cgi
The problem is that at times the cameras go down, so I need to check in C# that they are active before attempting to render:
<img height="240" width="320" src="http://81.137.212.183:4483/GetData.cgi" />
Please can someone advise how I can check that the CGI is active in C#, or suggest any other recommendation. If I simply load the CGI and it is down, it causes the browser to crash.
One recommendation was to use the code below:
The problem with this approach is that the site loads forever and a favicon is never shown, as can be seen at http://www.camsecure.co.uk/
var newImage = new Image();

function LoadNewImage() {
    var unique = new Date();
    document.images.webcam.src = newImage.src;
    newImage.src = "http://collectart.dyndns.org:4484/Jpeg/CamImg.jpg?time=" + unique.getTime();
}

function InitialImage() {
    var unique = new Date();
    // Assign the handler itself; calling LoadNewImage() here would run it
    // immediately instead of when the image finishes loading.
    newImage.onload = LoadNewImage;
    newImage.src = "http://collectart.dyndns.org:4484/Jpeg/CamImg.jpg?time=" + unique.getTime();
    document.images.webcam.src = "http://collectart.dyndns.org:4484/Jpeg/CamImg.jpg?time=" + unique.getTime();
    document.images.webcam.onload = "";
}
First off, you need to put some security on that first link. It appears the camera settings are public and available to anyone.
If the only problem is the long loading time slowing the rest of the site down, you could load the images in an iframe rather than directly in an image tag; then the hang is confined to the iframe:
<iframe src="http://81.137.212.183:4483/Simple/home.htm?IMG"></iframe>
To check that the IP camera is up, you could simply try to get its host page:
using System.Net.Http;
...
var uri = new Uri("http://81.137.212.183:4483/Simple/index.htm");
var task = new HttpClient().GetAsync(uri);
if (task.Wait(TimeSpan.FromSeconds(1)) && task.Result.IsSuccessStatusCode)
{
    // SUCCESS!
}
else
{
    // FAILURE... try next camera
}
However, it looks like the image .cgi location can still fail even when the camera host is reachable. In that case it would be best to load in an iframe even if the check succeeds.
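As a variant on the same check, the timeout can live on the HttpClient itself; a sketch under the same assumptions (CameraChecker and IsCameraUp are illustrative names, not an existing API):
using System;
using System.Net.Http;

public static class CameraChecker
{
    // Returns true when the camera's host page answers within the timeout.
    public static bool IsCameraUp(Uri uri, TimeSpan timeout)
    {
        using (var client = new HttpClient { Timeout = timeout })
        {
            try
            {
                return client.GetAsync(uri).Result.IsSuccessStatusCode;
            }
            catch (AggregateException)
            {
                // Timeout or connection failure: treat the camera as down.
                return false;
            }
        }
    }
}
Usage would be something like IsCameraUp(new Uri("http://81.137.212.183:4483/Simple/index.htm"), TimeSpan.FromSeconds(1)) before rendering the img or iframe tag for each camera.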
