Get metadata of a file using C#

I need to find a file's metadata using C#. The file is hosted on a third-party site.
I can download the file from that server, but I can't get the original metadata of the file that I downloaded.
How can I achieve this using C#? Below is my code.
string FilePath = AppDomain.CurrentDomain.BaseDirectory + @"Downloads\";
string Url = txtUrl.Text.Trim();
Uri _Url = new Uri(Url);
System.Net.HttpWebRequest request = (System.Net.HttpWebRequest)System.Net.WebRequest.Create(_Url);
request.Timeout = Timeout.Infinite;
System.Net.HttpWebResponse response = (System.Net.HttpWebResponse)request.GetResponse();
if (response.ContentType != "text/html; charset=UTF-8")
{
string FileSize = response.Headers.Get("Content-Length");
int lastindex = Url.LastIndexOf("/");
string TempUrlName = Url.Substring(lastindex + 1, Url.Length - (lastindex + 1));
WebClient oWebClient = new WebClient();
oWebClient.DownloadFile(txtUrl.Text.Trim(), FilePath + @"\" + TempUrlName);
if (File.Exists(FilePath + @"\" + TempUrlName))
{
FileInfo oInfo = new FileInfo(FilePath + @"\" + TempUrlName);
DateTime time = oInfo.CreationTime;
time = oInfo.LastAccessTime;
time = oInfo.LastWriteTime;
}
}
response.Close();
I can only get the file size, creation time, last accessed time and last write time after saving the file locally. But I need the file's metadata while the file is still located on the server, using C#.
Thanks

Since those are properties stored in the file system, and they change once you save the file locally, you won't be able to access them via HTTP.
Do you have any influence on the third party? Maybe have them send those properties along in the headers?
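If the server exposes anything at all, it will be in the HTTP response headers. Below is a minimal sketch (not from the original answer) that issues a HEAD request and reads the headers without downloading the file; it assumes the third-party server answers HEAD requests and sends a Last-Modified header, which not every server does:
// Hedged sketch: read whatever metadata the server exposes via HTTP headers.
// Assumes txtUrl.Text holds the file URL and the server accepts HEAD requests.
Uri url = new Uri(txtUrl.Text.Trim());
HttpWebRequest headRequest = (HttpWebRequest)WebRequest.Create(url);
headRequest.Method = "HEAD"; // headers only, no body is downloaded

using (HttpWebResponse headResponse = (HttpWebResponse)headRequest.GetResponse())
{
    long size = headResponse.ContentLength;            // Content-Length header
    DateTime lastModified = headResponse.LastModified; // Last-Modified header, if the server sends one
    string contentType = headResponse.ContentType;     // Content-Type header

    Console.WriteLine("Size: {0}, Last-Modified: {1}, Type: {2}", size, lastModified, contentType);
}
Creation time and last-access time are never transmitted over HTTP, so the best you can get this way is whatever the server chooses to send, typically Content-Length, Content-Type and sometimes Last-Modified.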

Related

I cannot delete a file via FTP

I am trying to delete all the files in a folder via FTP. Below is the code I am trying.
Files is an array of strings, each one the name of a file in the folder with its extension.
When I run it I get a reply of 206, but when I look in the folder all the files remain. I have tried variations of the code below, including adding a delay, but still cannot delete the files. What have I missed?
foreach (var FileName2 in Files)
{
if (File.Exists(txtbx_save_backup_to.Text + "/" + FileName2))
{
FtpWebRequest Delrequest = (FtpWebRequest)WebRequest.Create(ftp_address + "/Temp/Backup/" + FileName2);
Delrequest.Credentials = new NetworkCredential(username, password);
Delrequest.Method = WebRequestMethods.Ftp.DeleteFile;
Task.Delay(1000);
using (FtpWebResponse response2 = (FtpWebResponse)request.GetResponse())
{
rchtxtbx_backup_comms.AppendText("Deleted File, status " + response2.StatusDescription + "\r");
rchtxtbx_backup_comms.ScrollToCaret();
}
}
}
The comments above gave me the clue I needed in a roundabout way, so thanks to you for answering.
I had missed out the part that actually "actions" the delete request. So I added the following and now it works:
WebResponse GetResponse = Delrequest.GetResponse();
Stream GResponseStream = GetResponse.GetResponseStream();
I removed the wait and the complete code is now:
foreach (var FileName2 in Files)
{
if (File.Exists(txtbx_save_backup_to.Text + "/" + FileName2))
{
FtpWebRequest Delrequest = (FtpWebRequest)WebRequest.Create(ftp_address + "/Temp/Backup/" + FileName2);
Delrequest.Credentials = new NetworkCredential(username, password);
Delrequest.Method = WebRequestMethods.Ftp.DeleteFile;
// "Action" the delete request by getting its response
WebResponse GetResponse = Delrequest.GetResponse();
Stream GResponseStream = GetResponse.GetResponseStream();
using (FtpWebResponse response2 = (FtpWebResponse)GetResponse)
{
rchtxtbx_backup_comms.AppendText("Deleted File, status " + response2.StatusDescription + "\r");
rchtxtbx_backup_comms.ScrollToCaret();
}
GResponseStream.Close();
}
}
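If you want to confirm on the client side that the delete actually succeeded, you can also inspect the numeric status code on the response rather than just the description text. This is a small sketch, not part of the original answer, reusing Delrequest, FileName2 and the textbox from the question; FtpStatusCode.FileActionOK corresponds to the 250 reply an FTP server sends for a successful delete:
// Sketch: check the FTP status code returned for the delete request.
using (FtpWebResponse deleteResponse = (FtpWebResponse)Delrequest.GetResponse())
{
    if (deleteResponse.StatusCode == FtpStatusCode.FileActionOK) // 250 = file deleted
    {
        rchtxtbx_backup_comms.AppendText("Deleted " + FileName2 + "\r");
    }
    else
    {
        rchtxtbx_backup_comms.AppendText("Unexpected reply: " + deleteResponse.StatusDescription + "\r");
    }
}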

get downloaded file from URL and Illegal characters in path

string uri = "https://sometest.com/l/admin/ical.html?t=TD61C7NibbV0m5bnDqYC_q";
string filePath = "D:\\Data\\Name";
WebClient webClient = new WebClient();
webClient.DownloadFile(uri, (filePath + "/" + uri.Substring(uri.LastIndexOf('/'))));
/// filePath + "/" + uri.Substring(uri.LastIndexOf('/')) = "D:\\Data\\Name//ical.html?t=TD61C7NibbV0m5bnDqYC_q"
Accessing the entire (string) uri, an .ical file is downloaded automatically... The file name is room113558101.ics (not that this will help).
How can I get the file correctly?
You are building your file path the wrong way, which results in an invalid file name (ical.html?t=TD61C7NibbV0m5bnDqYC_q). Instead, use the Uri.Segments property and take the last path segment (which will be ical.html in this case). Also, don't combine file paths by hand; use Path.Combine:
var uri = new Uri("https://sometest.com/l/admin/ical.html?t=TD61C7NibbV0m5bnDqYC_q");
var lastSegment = uri.Segments[uri.Segments.Length - 1];
string directory = "D:\\Data\\Name";
string filePath = Path.Combine(directory, lastSegment);
WebClient webClient = new WebClient();
webClient.DownloadFile(uri, filePath);
To answer your edited question about getting the correct filename: in this case you don't know the correct filename until you make a request to the server and get a response. The filename will be contained in the response Content-Disposition header. So you should do it like this:
var uri = new Uri("https://sometest.com/l/admin/ical.html?t=TD61C7NibbV0m5bnDqYC_q");
string directory = "D:\\Data\\Name";
WebClient webClient = new WebClient();
// make a request to the server with OpenRead; this fetches the response headers but does not read the whole response into memory
using (var stream = webClient.OpenRead(uri)) {
// get and parse Content-Disposition header if any
var cdRaw = webClient.ResponseHeaders["Content-Disposition"];
string filePath;
if (!String.IsNullOrWhiteSpace(cdRaw)) {
filePath = Path.Combine(directory, new System.Net.Mime.ContentDisposition(cdRaw).FileName);
}
else {
// if no such header - fallback to previous way
filePath = Path.Combine(directory, uri.Segments[uri.Segments.Length - 1]);
}
// copy response stream to target file
using (var fs = File.Create(filePath)) {
stream.CopyTo(fs);
}
}
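For newer code, roughly the same approach works with HttpClient. The sketch below is not from the original answer and assumes the same URL and target directory:
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

// Sketch: read the Content-Disposition header from the response and fall back
// to the last URL segment when the header is missing.
static async Task DownloadWithServerFileNameAsync(Uri uri, string directory)
{
    using (var client = new HttpClient())
    using (var response = await client.GetAsync(uri, HttpCompletionOption.ResponseHeadersRead))
    {
        response.EnsureSuccessStatusCode();

        string fileName = response.Content.Headers.ContentDisposition?.FileName?.Trim('"')
                          ?? uri.Segments[uri.Segments.Length - 1];

        string filePath = Path.Combine(directory, fileName);
        using (var source = await response.Content.ReadAsStreamAsync())
        using (var target = File.Create(filePath))
        {
            await source.CopyToAsync(target);
        }
    }
}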

403 forbidden error - Accessing Amazon S3 bucket using Http GET Request in c#

I am writing C# code that accesses an Amazon S3 bucket through REST calls.
The code makes a GET request for an XML file stored in the S3 bucket.
I am using the secret key and access ID to create a signature that is used in the Authorization header.
The signature I create is based on Amazon's documentation:
http://docs.aws.amazon.com/AmazonS3/latest/dev/RESTAuthentication.html
I have granted authenticated requests permission to access the XML file in the S3 bucket.
The code I am using:
string AccessId = "xyz";
string SecretKey = "xyz";
string bucketName = "bucket";
string filename = "filename.xml";
string httpDate = DateTime.UtcNow.ToString("ddd, dd MMM yyyy HH:mm:ss +0000\n");
string StringtoSign = "GET\n"
+ "\n"
+ "\n"
+ httpDate + "\n"
+ "/bucketName/filename.xml";
//Creating Signature
Encoding e_UTF = new UTF8Encoding();
Encoding e_ASCI = new ASCIIEncoding();
byte[] key_new= e_ASCI.GetBytes(SecretKey);
byte[] message_new = e_UTF.GetBytes(StringtoSign);
HMACSHA1 myhmacsha1 = new HMACSHA1(key_new);
byte[] final=myhmacsha1.ComputeHash(message_new);
string AWSSignature = Convert.ToBase64String(final);
// Sending request
HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create("http://" + bucketName + ".s3-us-west-2.amazonaws.com/" + filename);
request.Method = "GET";
request.Headers.Add("Authorization", "AWS"+ " " + AccessId + ":" + AWSSignature);
try
{
// Getting response
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Stream stream = response.GetResponseStream();
StreamReader sr = new StreamReader(stream);
String ResStr = sr.ReadToEnd();
Console.WriteLine(ResStr);
}
catch (WebException ex)
{
var Resp = (HttpWebResponse)ex.Response;
Stream new_str=Resp.GetResponseStream();
StreamReader stred = new StreamReader(new_str);
MessageBox.Show(stred.ReadToEnd().ToString());
}
The same code works fine if I set the permissions on the XML file to public, so it has got to be something to do with the signature.
I am not sure what I am doing wrong. It would be great if someone could have a look at it.
You need to tell S3 what date/time you used to calculate the signature by sending either a Date or X-Amz-Date header. Your code's signing the request as if you were sending a Date header, so you should ask the HttpWebRequest to send a matching Date header (and be sure to format it the same way the HttpWebRequest will):
DateTime now = DateTime.UtcNow;
string httpDate = now.ToString("r");
...
request.Date = now;
Alternatively, you may want to consider using the AWS SDK for .NET, which will take care of generating a correct signature for you.
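If you take the SDK route, a minimal sketch looks roughly like this; it assumes the AWSSDK.S3 package on .NET Framework, and the bucket name, key and credentials are the placeholders from the question:
using System;
using System.IO;
using Amazon;
using Amazon.S3;
using Amazon.S3.Model;

// Sketch: the SDK computes the Authorization signature (and Date header) for you.
var client = new AmazonS3Client("xyz", "xyz", RegionEndpoint.USWest2); // access ID, secret key, region
var request = new GetObjectRequest { BucketName = "bucket", Key = "filename.xml" };

using (GetObjectResponse response = client.GetObject(request))
using (var reader = new StreamReader(response.ResponseStream))
{
    Console.WriteLine(reader.ReadToEnd());
}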

Uploading to S3 but nothing appears in the Web Console

I am trying to upload CSV files to AWS S3. My code has no syntax errors and doesn't throw any exceptions, but the CSV files do not appear in the web console. I also want to organise the CSV files by date. Here's the code:
string[] files = Directory.GetFiles(folder, "*.csv*", SearchOption.TopDirectoryOnly);
foreach (string file in files)
{
PutObjectRequest request = new PutObjectRequest();
request.BucketName = "WorkFolder";
request.Key = "CSV/" + date + "/";
request.FilePath = file;
PutObjectResponse response = s3client.PutObject(request);
}
I found the answer. I was missing the file name in the Key property. The correct code is below:
PutObjectRequest request = new PutObjectRequest();
request.BucketName = "WorkFolder";
request.Key = "CSV/" + date + "/" + Path.GetFileName(file); // append the file name to the key
request.FilePath = file;
s3client.PutObject(request);
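Putting that together with the date-based organisation mentioned in the question, the full loop could look like the sketch below; folder and s3client come from the question, while the date format and the console check are assumptions:
using System;
using System.IO;
using Amazon.S3.Model;

// Sketch: upload each CSV under a date-based prefix such as CSV/2016-05-04/report.csv.
string datePrefix = DateTime.UtcNow.ToString("yyyy-MM-dd"); // assumed date format
string[] files = Directory.GetFiles(folder, "*.csv", SearchOption.TopDirectoryOnly);

foreach (string file in files)
{
    var request = new PutObjectRequest
    {
        BucketName = "WorkFolder",
        Key = "CSV/" + datePrefix + "/" + Path.GetFileName(file),
        FilePath = file
    };

    PutObjectResponse response = s3client.PutObject(request);
    Console.WriteLine("Uploaded {0}: HTTP {1}", request.Key, response.HttpStatusCode);
}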

Possible to temporarily store a file locally before sending to ftp server?

I've got an ASP.NET control for file upload. When the user posts a file, it is first stored locally on the server where the website runs, and then I copy it to a remote FTP server.
However, is it possible to remove it from the local server once it's been copied to the FTP server? I'm thinking of storing it in something like a ~temp folder, but I can't get that to work. As of now I need to create a folder within my project called "temp". Any ideas? Here is the method:
String id = Request.QueryString["ID"];
String path = Server.MapPath("~/temp/");
String filename = Path.GetFileName(fuPicture.PostedFile.FileName);
if (fuPicture.HasFile)
{
try
{
if (
fuPicture.PostedFile.ContentType == "image/jpeg" ||
fuPicture.PostedFile.ContentType == "image/png" ||
fuPicture.PostedFile.ContentType == "image/gif"
)
{
fuPicture.PostedFile.SaveAs(path + fuPicture.FileName);
}
else
{
lblFeedback.Text = "Not allowed file extension";
}
}
catch (Exception ex)
{
lblFeedback.Text = "Error when uploading";
}
path += fuPicture.FileName;
String ftpServer = "ftp://xxxx:xxxx";
String userName = "xx";
String password = "xx";
FtpWebRequest request =
(FtpWebRequest)WebRequest.Create(new Uri("ftp://xxxx:xxxx/" + id));
request.Method = WebRequestMethods.Ftp.MakeDirectory;
request.Credentials = new NetworkCredential(userName, password);
using (var resp = (FtpWebResponse)request.GetResponse())
{
WebClient client = new WebClient();
client.Credentials = new NetworkCredential(userName, password);
client.UploadFile(ftpServer + "/" + id + "/" +
new FileInfo(path).Name, "STOR", path);
}
You can call client.UploadData() to upload a byte array from memory, without involving your local disk at all.
Why don't you do a File.Delete after the using statement?
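A minimal sketch of the UploadData idea, reusing the fuPicture control, ftpServer, id and credentials from the question (the in-memory copy and the file-name handling are assumptions):
// Sketch: upload the posted file straight from memory so nothing touches the local disk.
byte[] data;
using (var ms = new MemoryStream())
{
    fuPicture.PostedFile.InputStream.CopyTo(ms); // read the uploaded file into memory
    data = ms.ToArray();
}

using (var client = new WebClient())
{
    client.Credentials = new NetworkCredential(userName, password);
    client.UploadData(ftpServer + "/" + id + "/" + Path.GetFileName(fuPicture.FileName), "STOR", data);
}

// Or, if you keep the temp-file approach, delete the local copy after the upload:
// File.Delete(path);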
