I'm trying to upload a file to Contabo object storage from a memory stream using AWSSDK.S3.
This is my client configuration:
string accessKey = "xxxxxxxxxxxxxxxxxxxxxxxxxx";
string secretKey = "xxxxxxxxxxxxxxxxxxxxxxxxxx";
Amazon.S3.AmazonS3Config config = new Amazon.S3.AmazonS3Config();
Amazon.S3.AmazonS3Client s3Client;

public BookingsController()
{
    config.ServiceURL = "https://eu2.contabostorage.com";
    config.DisableHostPrefixInjection = true;
    s3Client = new Amazon.S3.AmazonS3Client(
        accessKey,
        secretKey,
        config
    );
}
This is the method I'm using:
[HttpPost("/api/Bookings/AddFile")]
public async Task<ActionResult> AddBookingFile([FromForm] IFormFile file)
{
    using (var newMemoryStream = new MemoryStream())
    {
        ListBucketsResponse response = await s3Client.ListBucketsAsync();
        file.CopyTo(newMemoryStream);
        Amazon.S3.Model.PutObjectRequest request = new Amazon.S3.Model.PutObjectRequest();
        request.BucketName = "test-bucket";
        request.Key = "recording.wav";
        request.ContentType = "audio/wav";
        request.InputStream = newMemoryStream;
        await s3Client.PutObjectAsync(request);
    }
    return Ok();
}
The ListBuckets method works properly. The PutObject method throws an exception that the host cannot be found: "The specified host is unknown. (test-bucket.eu2.contabostorage.com:443)".
According to the Contabo docs that is expected, because Contabo doesn't support virtual-hosted-style buckets (DNS bucket prefix). (Reference: Contabo docs.)
I thought the following configuration would fix this, but it wasn't the solution:
config.DisableHostPrefixInjection = true;
Does anyone have any advice on how to prevent the bucket name being prefixed to the URL?
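A minimal sketch of one setting worth trying, assuming the goal is path-style addressing (ForcePathStyle is a standard AmazonS3Config property; whether Contabo's endpoint accepts it is an assumption, not something confirmed in the question). As far as I know, DisableHostPrefixInjection only affects operations whose service model injects an extra host prefix, not how the bucket name is addressed:

// Sketch: force path-style URLs (https://eu2.contabostorage.com/test-bucket/key)
// instead of virtual-hosted-style (https://test-bucket.eu2.contabostorage.com/key).
// Assumption: Contabo's S3-compatible endpoint accepts path-style requests.
var config = new Amazon.S3.AmazonS3Config
{
    ServiceURL = "https://eu2.contabostorage.com",
    ForcePathStyle = true   // keeps the bucket name out of the host name
};
var s3Client = new Amazon.S3.AmazonS3Client(accessKey, secretKey, config);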
I have to upload files and streams to S3 periodically, and the file size is usually under 50 MB. What I'm seeing is that the TransferUtility.UploadAsync method returns success, but the file is never created. This occurs sporadically. Below is the code we use. Any suggestions?
public TransferUtilityUploadRequest CreateRequest(string bucket, string path)
{
    var request = new TransferUtilityUploadRequest
    {
        BucketName = bucket,
        StorageClass = S3StorageClass.Standard,
        Key = path,
        AutoCloseStream = true,
        CannedACL = S3CannedACL.AuthenticatedRead,
        ServerSideEncryptionMethod = ServerSideEncryptionMethod.AES256
    };
    return request;
}
public async Task CreateS3ObjectFromStreamAsync(MemoryStream memoryStream, string bucket, string filePath)
{
    var ftu = new TransferUtility(client);
    var request = CreateRequest(bucket, filePath);
    request.InputStream = memoryStream;
    await ftu.UploadAsync(request);
}

public async Task CreateS3ObjectFromFileAsync(string sourceFilePath, string bucketName, string destpath)
{
    var request = CreateRequest(bucketName, destpath);
    request.FilePath = sourceFilePath;
    var ftu = new TransferUtility(client);
    await ftu.UploadAsync(request);
}
Please use Upload (the synchronous method), or wait on the task returned by UploadAsync (for example UploadAsync(...).Wait()). It worked for me!
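For illustration, a minimal sketch of that suggestion combined with rewinding the stream first (the rewind is an assumption about the sporadic failures, not something confirmed in the thread; client and CreateRequest are the ones from the question):

// Sketch of what the answer suggests, plus rewinding the MemoryStream so the
// whole buffer is sent (an assumption about the cause, not confirmed here).
public async Task CreateS3ObjectFromStreamAsync(MemoryStream memoryStream, string bucket, string filePath)
{
    memoryStream.Position = 0;                      // rewind so the whole buffer is uploaded
    var ftu = new TransferUtility(client);
    var request = CreateRequest(bucket, filePath);
    request.InputStream = memoryStream;
    await ftu.UploadAsync(request);                 // completes only when the upload has finished
}

// The caller must observe the returned Task (await it, or block with .Wait())
// before disposing the stream or letting the process exit:
await CreateS3ObjectFromStreamAsync(memoryStream, bucket, filePath);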
I'm trying to download an object from an S3 bucket and am facing the issue below:
The security token included in the request is invalid.
Please help me check where the mistake is.
Below is my code.
1. Get temporary credentials:
static void Main()
{
    string path = "http://XXX.XXX.XXX./latest/meta-data/iam/security-credentials/EC2_WLMA_Permissions";
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(path);
    request.Method = "GET";
    request.ContentType = "application/json";
    HttpWebResponse response = request.GetResponse() as HttpWebResponse;
    string result = string.Empty;
    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
    {
        result = reader.ReadToEnd();
        dynamic metaData = JsonConvert.DeserializeObject(result);
        _awsAccessKeyId = metaData.AccessKeyId;
        _awsSecretAccessKey = metaData.SecretAccessKey;
    }
}
2. Create a SessionAWSCredentials instance:
SessionAWSCredentials tempCredentials =
    GetTemporaryCredentials(_awsAccessKeyId, _awsSecretAccessKey);

// GetTemporaryCredentials method:
private static SessionAWSCredentials GetTemporaryCredentials(
    string accessKeyId, string secretAccessKeyId)
{
    AmazonSecurityTokenServiceClient stsClient =
        new AmazonSecurityTokenServiceClient(accessKeyId, secretAccessKeyId);
    Console.WriteLine(stsClient.ToString());

    GetSessionTokenRequest getSessionTokenRequest = new GetSessionTokenRequest();
    getSessionTokenRequest.DurationSeconds = 7200; // seconds

    GetSessionTokenResponse sessionTokenResponse =
        stsClient.GetSessionToken(getSessionTokenRequest);
    Console.WriteLine(sessionTokenResponse.ToString());

    Credentials credentials = sessionTokenResponse.Credentials;
    Console.WriteLine(credentials.ToString());

    SessionAWSCredentials sessionCredentials =
        new SessionAWSCredentials(credentials.AccessKeyId,
            credentials.SecretAccessKey,
            credentials.SessionToken);
    return sessionCredentials;
}
3. Get files from S3 using AmazonS3Client:
using (IAmazonS3 client = new AmazonS3Client(tempCredentials, RegionEndpoint.USEast1))
{
    GetObjectRequest request = new GetObjectRequest();
    request.BucketName = "bucketName" + @"/" + "foldername";
    request.Key = "Terms.docx";
    GetObjectResponse response = client.GetObject(request);
    response.WriteResponseStreamToFile("C:\\MyFile.docx");
}
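As an aside (an assumption about the root cause, not something stated in the thread): the instance-metadata response also includes a Token field, and temporary instance-profile credentials are only valid together with that session token, so passing it straight into SessionAWSCredentials may be enough. A minimal sketch, reusing the names from the question:

// Sketch: use the session token that comes with the instance-metadata
// credentials directly, instead of calling STS GetSessionToken with them.
// Assumption: metaData.Token is present in the metadata JSON.
dynamic metaData = JsonConvert.DeserializeObject(result);
var sessionCredentials = new SessionAWSCredentials(
    (string)metaData.AccessKeyId,
    (string)metaData.SecretAccessKey,
    (string)metaData.Token);          // the token the original code drops

using (IAmazonS3 client = new AmazonS3Client(sessionCredentials, RegionEndpoint.USEast1))
{
    // ... GetObject as in the question
}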
We do something a little simpler for interfacing with S3 (downloads and uploads).
It looks like you went with the more complex approach. You should try just using the TransferUtility instead:
TransferUtility fileTransferUtility =
    new TransferUtility(
        new AmazonS3Client("ACCESS-KEY-ID", "SECRET-ACCESS-KEY", Amazon.RegionEndpoint.CACentral1));

// Note the 'fileName' is the 'key' of the object in S3 (which is usually just the file name)
fileTransferUtility.Download(filePath, "my-bucket-name", fileName);
NOTE: TransferUtility.Download() returns void because it downloads the file to the path specified in the filePath argument. This may be a little different than what you were expecting but you can still open a FileStream to that path afterwards and manipulate the file all you want. For example:
using (FileStream fileDownloaded = new FileStream(filePath, FileMode.Open, FileAccess.Read))
{
    // Do stuff with our newly downloaded file
}
I took the bucket name, access key and secret key from web.config; you could also type them in manually.
public void DownloadObject(string imagename)
{
    RegionEndpoint bucketRegion = RegionEndpoint.USEast1;
    IAmazonS3 client = new AmazonS3Client(bucketRegion);
    string accessKey = System.Configuration.ConfigurationManager.AppSettings["AWSAccessKey"];
    string secretKey = System.Configuration.ConfigurationManager.AppSettings["AWSSecretKey"];
    AmazonS3Client s3Client = new AmazonS3Client(new BasicAWSCredentials(accessKey, secretKey), Amazon.RegionEndpoint.USEast1);

    string objectKey = "EMR" + "/" + imagename;
    // EMR is the folder name of the image inside the bucket

    GetObjectRequest request = new GetObjectRequest();
    request.BucketName = System.Configuration.ConfigurationManager.AppSettings["bucketname"];
    request.Key = objectKey;
    GetObjectResponse response = s3Client.GetObject(request);
    response.WriteResponseStreamToFile("D:\\Test\\" + imagename);
}
// D:\Test\ is the local file path.
The following is how I download it. I need to download only .zip files in this case. Restricting the download to the required file types (.zip in my case) helped me avoid errors related to (416) Requested Range Not Satisfiable.
public static class MyAWS_S3_Helper
{
    static string _S3Key = System.Configuration.ConfigurationManager.ConnectionStrings["S3BucketKey"].ConnectionString;
    static string _S3SecretKey = System.Configuration.ConfigurationManager.ConnectionStrings["S3BucketSecretKey"].ConnectionString;

    public static void S3Download(string bucketName, string _ObjectKey, string downloadPath)
    {
        IAmazonS3 _client = new AmazonS3Client(_S3Key, _S3SecretKey, Amazon.RegionEndpoint.USEast1);
        TransferUtility fileTransferUtility = new TransferUtility(_client);
        fileTransferUtility.Download(downloadPath + "\\" + _ObjectKey, bucketName, _ObjectKey);
        _client.Dispose();
    }

    public static async Task AsyncDownload(string bucketName, string downloadPath, string requiredSubFolder)
    {
        var bucketRegion = RegionEndpoint.USEast1; // change as needed
        var credentials = new BasicAWSCredentials(_S3Key, _S3SecretKey);
        var client = new AmazonS3Client(credentials, bucketRegion);
        var request = new ListObjectsV2Request
        {
            BucketName = bucketName,
            MaxKeys = 1000
        };
        var response = await client.ListObjectsV2Async(request);
        var utility = new TransferUtility(client);
        foreach (var obj in response.S3Objects)
        {
            string currentKey = obj.Key;
            double sizeCheck = Convert.ToDouble(obj.Size);
            int fileNameLength = currentKey.Length;
            Console.WriteLine(currentKey + "---" + fileNameLength.ToString());
            if (currentKey.Contains(requiredSubFolder))
            {
                if (currentKey.Contains(".zip")) // this helps to avoid errors related to (416) Requested Range Not Satisfiable
                {
                    try
                    {
                        S3Download(bucketName, currentKey, downloadPath);
                    }
                    catch (Exception exTest)
                    {
                        string messageTest = currentKey + "-" + exTest;
                    }
                }
            }
        }
    }
}
Here is how it is called
static void Main(string[] args)
{
    string downloadPath = @"C:\SourceFiles\TestDownload";
    Task awsTask = MyAWS_S3_Helper.AsyncDownload("my-files", downloadPath, "mysubfolder");
    awsTask.Wait();
}
Here is what I have done to download files from an S3 bucket:
var AwsImportFilePathParcel = "TEST/TEMP";
IAmazonS3 client = new AmazonS3Client(AwsAccessKey, AwsSecretKey);
S3DirectoryInfo info = new S3DirectoryInfo(client, S3BucketName, AwsImportFilePathParcel);
S3FileInfo[] s3Files = info.GetFiles(pattenForParcel);
Now s3Files contains all the files at the provided location; using a foreach you can save them all to your system:
foreach (var fileInfo in s3Files)
{
    var localPath = Path.Combine(@"C:\TEST", fileInfo.Name);
    var file = fileInfo.CopyToLocal(localPath);
}
I am trying to read a bucket at storage.googleapis.com, using the Amazon Web Services .Net SDK in C#.
Can anyone provide a working example of an S3 endpoint config setup for Google, using just the auth key/secret pair and a bucket name? Or any other method to get this working?
According to this tutorial this should be a simple matter, but I get all sorts of exceptions when trying to follow the instructions given. Here is an extract of my current attempt - which throws a TrustFailure exception:
The remote certificate is invalid.
AmazonS3Config conf = new AmazonS3Config();
// Set regionEndpoint to null, or else the serviceURL will be ignored
conf.RegionEndpoint = null;
conf.ServiceURL = "https://s3.storage.googleapis.com";
conf.UseHttp = false;
conf.AuthenticationRegion = null;
conf.UseAccelerateEndpoint = false;
conf.UseDualstackEndpoint = false;
AWSCredentials cred = new BasicAWSCredentials("GOOG3LFXXXXXXXXXXXXX", "BQ6VeMXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX");
IAmazonS3 client = new AmazonS3Client(cred, conf);
GetBucketVersioningRequest request = new GetBucketVersioningRequest { BucketName = "hisbucket" };
GetBucketVersioningResponse response = client.GetBucketVersioning(request);
I finally got the .NET SDK to upload to Google Cloud Storage with:
AWSConfigsS3.UseSignatureVersion4 = false;
AmazonS3Config config = new AmazonS3Config();
config.ServiceURL = "https://storage.googleapis.com";
config.SignatureVersion = "2";
AmazonS3Client client = new AmazonS3Client(accessKey, secretKey, config);
var transferUtilityConfig = new TransferUtilityConfig
{
    ConcurrentServiceRequests = 1,
    MinSizeBeforePartUpload = 6291456000,
};
var fileTransferUtilityRequest = new TransferUtilityUploadRequest
{
    BucketName = bucketName,
    FilePath = filePath,
    PartSize = 6291456000,
    Key = keyName,
};
TransferUtility fileTransferUtility = new TransferUtility(client, transferUtilityConfig);
fileTransferUtility.Upload(fileTransferUtilityRequest);
fileTransferUtility.Dispose();
You need an Amazon S3 service URL, an access key ID, a secret access key and the bucket name.
var s3Config = new AmazonS3Config
{
    ServiceURL = Constants.AmazonS3ServiceUrl,
    RegionEndpoint = Amazon.RegionEndpoint.EUWest1
};
string accessKeyId = Constants.AmazonAccessKeyId;
string secretAccessKey = Constants.AmazonSecretAccessKey;
var config = new AwsS3Config() { AmazonS3BucketName = Constants.AmazonS3BucketName };
var client = new AmazonS3Client(accessKeyId, secretAccessKey, s3Config);
Then you should be able to make calls to the Amazon client:
var request = new GetObjectRequest
{
    BucketName = _bucketName,
    Key = entity.Path
};
var response = _client.GetObjectAsync(request).Result;
The code above works against an S3 account; I haven't tried it specifically against storage.googleapis.com, which is your case. Anyway, I hope this helps and answers your question.
I use:
- a MinIO server to store files
- nginx as a reverse proxy, to be able to use HTTPS with the MinIO server
- .NET AWSSDK.S3 to communicate with the MinIO server through nginx
Since server-side encryption does not work with the MinIO server, I tried to use the AWS client-side encryption support. But decrypting the file does not work with the MinIO server.
When I use the same code against an AWS account, encryption and decryption work well.
It seems that the file loses its metadata when it is created on the MinIO server.
When I try to get the file, I get an exception:
An unhandled exception of type 'Amazon.Runtime.AmazonServiceException' occurred in AWSSDK.Core.dll
Additional information: Unable to decrypt data for object file-stream-27e52c5f-05d1-4296-
Here is my code
static void Main()
{
    string filePath = @"c:/tempPrivateKey.txt";
    string privateKey = File.ReadAllText(filePath);
    RSA rsaAlgorithm = RSA.Create();
    rsaAlgorithm.FromXmlString(privateKey);
    EncryptionMaterials encryptionMaterials = new EncryptionMaterials(rsaAlgorithm);
    var credentials = new BasicAWSCredentials(AccessKey, SecretKey);

    AmazonS3CryptoConfiguration cryptoConfig = new AmazonS3CryptoConfiguration
    {
        RegionEndpoint = RegionEndpoint.EUWest1,
        StorageMode = CryptoStorageMode.ObjectMetadata,
        ServiceURL = EndPointNginx,
        UseHttp = false,
        ForcePathStyle = true
    };
    _amazonS3Client = new AmazonS3EncryptionClient(credentials, cryptoConfig, encryptionMaterials);

    string bucketName = "bucket-" + Guid.NewGuid();
    string fileStreamKey = "file-stream-" + Guid.NewGuid();
    Stream fileStream = CreateRandomFileOnStream();

    CreateBucket(bucketName);
    AddFileToBucket(fileStreamKey, fileStream, bucketName);

    Stream fileStreamToRead = GetFile(fileStreamKey, bucketName);
    using (var reader = new StreamReader(fileStreamToRead))
    {
        Console.Out.WriteLine(reader.ReadToEnd());
    }

    DeleteFile(fileStreamKey, bucketName);
    DeleteBucket(bucketName);
    Console.ReadKey();
}
private static void AddFileToBucket(string fileKey, Stream fileStream, string bucketName)
{
    Console.Out.WriteLine();
    Console.Out.WriteLine($"adding file {fileKey} to bucket {bucketName}.");
    var objectToPut = new PutObjectRequest
    {
        BucketName = bucketName,
        Key = fileKey,
        InputStream = fileStream
    };
    _amazonS3Client.PutObject(objectToPut);
    if (fileStream.CanRead)
        fileStream.Dispose();
    Console.Out.WriteLine("file added");
}
private static Stream GetFile(string fileKey, string bucketName)
{
    // This line throws an exception.
    GetObjectResponse response = _amazonS3Client.GetObject(new GetObjectRequest { BucketName = bucketName, Key = fileKey });
    return response.ResponseStream;
}
At worst I will encrypt and decrypt the file manually, but I would like to know whether there is a solution to this problem.
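A minimal sketch of one direction worth trying, on the assumption that the real problem is MinIO (or the nginx proxy) dropping the object metadata that holds the encryption envelope: AmazonS3CryptoConfiguration can store that envelope in a separate instruction file instead of in object metadata.

// Sketch: keep the encryption envelope in a separate "instruction file" object
// rather than in x-amz-meta-* metadata, which the MinIO/nginx setup may drop.
// Assumption: lost metadata is what causes "Unable to decrypt data"; not confirmed here.
AmazonS3CryptoConfiguration cryptoConfig = new AmazonS3CryptoConfiguration
{
    RegionEndpoint = RegionEndpoint.EUWest1,
    StorageMode = CryptoStorageMode.InstructionFile, // instead of ObjectMetadata
    ServiceURL = EndPointNginx,
    UseHttp = false,
    ForcePathStyle = true
};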
I am struggling to create a file from the byte array returned by the Web API. The following is my code for making the call to the Web API:
using (var http = new WebClient())
{
    string url = string.Format("{0}api/FileUpload/FileServe?FileID=" + fileID, webApiUrl);
    http.Headers[HttpRequestHeader.ContentType] = "application/octet-stream";
    http.Headers[HttpRequestHeader.Authorization] = "Bearer " + authCookie.Value;
    http.DownloadDataCompleted += Http_DownloadDataCompleted;
    byte[] json = await http.DownloadDataTaskAsync(url);
}
The API code is:
[HttpGet]
[Route("FileServe")]
[Authorize(Roles = "Admin,SuperAdmin,Contractor")]
public async Task<HttpResponseMessage> GetFile(int FileID)
{
    using (var repo = new MBHDocRepository())
    {
        var file = await repo.GetSpecificFile(FileID);
        if (file == null)
        {
            throw new HttpResponseException(HttpStatusCode.BadRequest);
        }
        var stream = File.Open(file.PathLocator, FileMode.Open);
        HttpResponseMessage response = new HttpResponseMessage(HttpStatusCode.OK);
        response.Content = new StreamContent(stream);
        response.Content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue(file.FileType);
        return response;
    }
}
I receive a byte array as the response, but I am unable to create the corresponding file from that byte array. I have no idea how to convert the byte array into the relevant file type (such as .jpg or .pdf, based on the file type in the Web API). Any help will be appreciated.
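For context, the byte array is already the raw file content, so the simplest thing that could work is writing it to disk with an appropriate extension. A minimal sketch (the path and extension here are illustrative assumptions):

// Sketch: write the downloaded bytes to a file; choosing the right extension
// (e.g. from the response Content-Type, or from what you already know about FileID)
// is what makes it open as a .jpg, .pdf, etc.
byte[] data = await http.DownloadDataTaskAsync(url);
File.WriteAllBytes(@"C:\Temp\downloaded-file.pdf", data); // illustrative path and extension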
Alright, there are a few ways of solving your problem. Firstly, on the server side of things, you can either simply send the content type and leave it at that, or you can also send the complete filename, which helps you even further.
I have replaced the code that is specific to your setup with basic test code; please just ignore that part and adapt it to your own code.
Some design notes here:
[HttpGet]
[Route("FileServe")]
[Authorize(Roles = "Admin,SuperAdmin,Contractor")]
public async Task<HttpResponseMessage> GetFileAsync(int FileID) // <-- if your method returns Task, have it be named with Async in it
{
    using (var repo = new MBHDocRepository())
    {
        var file = await repo.GetSpecificFile(FileID);
        if (file == null)
        {
            throw new HttpResponseException(HttpStatusCode.BadRequest);
        }
        var stream = File.Open(file.PathLocator, FileMode.Open);
        HttpResponseMessage response = new HttpResponseMessage(HttpStatusCode.OK);
        response.Content = new StreamContent(stream);
        response.Content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue(file.FileType);
        response.Content.Headers.ContentDisposition = new System.Net.Http.Headers.ContentDispositionHeaderValue("attachment") { FileName = Path.GetFileName(file.PathLocator) };
        return response;
    }
}
Your client side code has two options here:
static void Main(string[] args)
{
    using (var http = new WebClient())
    {
        string url = string.Format("{0}api/FileUpload/FileServe?FileID={1}", webApiUrl, fileId);
        http.Headers[HttpRequestHeader.ContentType] = "application/octet-stream";
        http.Headers[HttpRequestHeader.Authorization] = "Bearer " + authCookie.Value;
        var response = http.OpenRead(url);
        var fs = new FileStream(string.Format(@"C:\Users\Bailey Miller\Downloads\{0}", GetName(http.ResponseHeaders)), FileMode.Create);
        response.CopyTo(fs); // <-- copies the stream to the actual file; this is not perfect and there are a lot of better examples
        fs.Flush();
        fs.Close();
    }
}
private static object GetName(WebHeaderCollection responseHeaders)
{
    var c_type = responseHeaders.GetValues("Content-Type"); // <-- do a switch on this and return a generated file name with the correct extension for the MIME type
    var cd = responseHeaders.GetValues("Content-Disposition")[0].Replace("\"", ""); // <-- gets the attachment type and filename param, and removes the illegal character " from the filename if present
    return cd.Substring(cd.IndexOf("=") + 1); // <-- extracts the file name
}