Upload image to S3 in C# ASP.NET MVC 5

I'm trying to upload an image to my bucket, but I can't because I get this error:
An exception of type 'Amazon.Runtime.AmazonServiceException' occurred in mscorlib.dll but was not handled in user code
Detail
{"Encountered a non retryable WebException : RequestCanceled"}
More Detail
{"The request was aborted: The request was canceled."}
Inner Exception
The stream was already consumed. It cannot be read again.
I'm using the AWS SDK for VS2013.
Code
private const string ExistingBucketName = "******development"; // Name of the bucket
private const string KeyName = "Images";

public static void UploadToS3(string filePath)
{
    // filePath -> C:\example.jpg
    try
    {
        var fileTransferUtility = new TransferUtility(new AmazonS3Client(Amazon.RegionEndpoint.USEast1));

        // 1. Upload a file; the file name is used as the object key name.
        fileTransferUtility.Upload(filePath, ExistingBucketName);
        Trace.WriteLine("Upload 1 completed");

        // 2. Specify the object key name explicitly.
        fileTransferUtility.Upload(filePath, ExistingBucketName, KeyName);
        Trace.WriteLine("Upload 2 completed");

        // 3. Upload data from a System.IO.Stream.
        using (var fileToUpload = new FileStream(filePath, FileMode.Open, FileAccess.Read))
        {
            fileTransferUtility.Upload(fileToUpload, ExistingBucketName, KeyName);
        }
        Trace.WriteLine("Upload 3 completed");

        // 4. Specify advanced settings/options.
        var fileTransferUtilityRequest = new TransferUtilityUploadRequest
        {
            BucketName = ExistingBucketName,
            FilePath = filePath,
            StorageClass = S3StorageClass.ReducedRedundancy,
            PartSize = 6291456, // 6 MB.
            Key = KeyName,
            CannedACL = S3CannedACL.PublicRead
        };
        fileTransferUtilityRequest.Metadata.Add("param1", "Value1");
        fileTransferUtilityRequest.Metadata.Add("param2", "Value2");
        fileTransferUtility.Upload(fileTransferUtilityRequest);
        Trace.WriteLine("Upload 4 completed");
    }
    catch (AmazonS3Exception s3Exception)
    {
        Trace.WriteLine(s3Exception.Message);
        Trace.WriteLine(s3Exception.InnerException);
    }
}

My error was in:
var fileTransferUtility = new TransferUtility(new AmazonS3Client(Amazon.RegionEndpoint.USEast1));
I was using an incorrect region. I changed it to the Europe region and it works:
var fileTransferUtility = new TransferUtility(new AmazonS3Client(Amazon.RegionEndpoint.EUWest1));
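If you are not sure which region a bucket was created in, you can look it up instead of hard-coding it. A minimal sketch, assuming default credentials are configured and using a hypothetical bucket name:

```csharp
using Amazon;
using Amazon.S3;
using Amazon.S3.Model;
using Amazon.S3.Transfer;

// Any region works for the lookup call itself.
var probe = new AmazonS3Client(RegionEndpoint.USEast1);
var location = probe.GetBucketLocation(new GetBucketLocationRequest
{
    BucketName = "my-bucket" // hypothetical bucket name
});
// An empty Location value means us-east-1; otherwise it is the region's
// system name, e.g. "eu-west-1".
var regionName = string.IsNullOrEmpty(location.Location.Value)
    ? "us-east-1"
    : location.Location.Value;
var fileTransferUtility = new TransferUtility(
    new AmazonS3Client(RegionEndpoint.GetBySystemName(regionName)));
```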

By doing this:
using (var fileToUpload =
new FileStream(filePath, FileMode.Open, FileAccess.Read))
{
fileTransferUtility.Upload(fileToUpload,
ExistingBucketName, KeyName);
}
You are disposing your FileStream before it gets uploaded. Try removing the using clause, or wrap the entire call inside it.

Use it as follows:
TransferUtility fileTransferUtility = new TransferUtility(new AmazonS3Client("Access Key Id", "Secret Access Key", Amazon.RegionEndpoint.USEast1));

Related

amazon s3 PutObjectAsync Cannot access a closed stream

I am trying to upload a file to Amazon S3 but get the error "Cannot access a closed stream" at await client.PutObjectAsync(request);
using (var stream = new MemoryStream())
{
    using (var sWriter = new StreamWriter(stream, Encoding.UTF8))
    {
        // Note: sWriter is not flushed here, so buffered text may not yet be in the stream.
        await sWriter.WriteAsync(commandWithMetadata.SerializeToString());
        stream.Seek(0, SeekOrigin.Begin);

        var fileName = GetFileName(command);
        var request = new PutObjectRequest
        {
            BucketName = BucketName,
            Key = fileName,
            InputStream = stream
        };
        await client.PutObjectAsync(request);
    }
}
There is an AutoCloseStream property on the request, which defaults to true, so the Amazon library closes the stream automatically.
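So if you need the stream after the call, you can opt out of that behavior. A minimal sketch based on the snippet above, keeping the surrounding using blocks unchanged:

```csharp
var request = new PutObjectRequest
{
    BucketName = BucketName,
    Key = fileName,
    InputStream = stream,
    AutoCloseStream = false // keep the stream open after PutObjectAsync returns
};
await client.PutObjectAsync(request);
// stream is still readable here; the enclosing using block disposes it instead.
```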

Amazon S3 does not release the file after the upload

I have a WCF service that uploads files to the Amazon S3 server. After a successful upload, I need to delete the file from my local path, but when I try to delete it I get an error saying "The process cannot access the file because it is being used by another process". Sharing my code snippets below.
var putRequest = new PutObjectRequest
{
    BucketName = System.Configuration.ConfigurationManager.AppSettings["S3Bucket"].ToString(),
    Key = keyName,
    FilePath = path,
    ContentType = "application/pdf"
};
client = new AmazonS3Client(bucketRegion);
PutObjectResponse response = await client.PutObjectAsync(putRequest);
putRequest = null;
client.Dispose();
File.Delete(path);
If anyone knows about this issue, please advise.
There might be a timing issue here, so you might want to close the stream explicitly.
Do note, I am not sure; if I am mistaken I'll remove this, but it was too long for a comment.
using (var fileStream = File.OpenRead(path)) // File.OpenRead is a static method; no "new"
{
    var putRequest = new PutObjectRequest
    {
        BucketName = System.Configuration.ConfigurationManager.AppSettings["S3Bucket"].ToString(),
        Key = keyName,
        InputStream = fileStream,
        ContentType = "application/pdf",
        AutoCloseStream = false,
    };
    using (var c = new AmazonS3Client(bucketRegion))
    {
        PutObjectResponse response = await c.PutObjectAsync(putRequest);
    }
} // fileStream should be closed here; if not, call fileStream.Close()
File.Delete(path);
More info on the properties: https://docs.aws.amazon.com/sdkfornet1/latest/apidocs/html/T_Amazon_S3_Model_PutObjectRequest.htm

Amazon S3 .NET: Upload base64 image data?

I am trying to convert images to base64 and upload them to AWS S3 using C#. I keep getting a remote-server-not-found exception, but I am able to log in programmatically and list the buckets I have.
Can you please identify what's wrong?
static void Main(string[] args)
{
    string configaccess = ConfigurationManager.AppSettings["AWSAccesskey"];
    string configsecret = ConfigurationManager.AppSettings["AWSSecretkey"];
    var s3Client = new AmazonS3Client(
        configaccess,
        configsecret,
        RegionEndpoint.USEast1
    );
    Byte[] bArray = File.ReadAllBytes("path/foo.jpg");
    String base64String = Convert.ToBase64String(bArray);
    try
    {
        byte[] bytes = Convert.FromBase64String(base64String);
        using (s3Client)
        {
            var request = new PutObjectRequest
            {
                BucketName = "bucketName",
                CannedACL = S3CannedACL.PublicRead,
                Key = string.Format("bucketName/{0}", "foo.jpg")
            };
            using (var ms = new MemoryStream(bytes))
            {
                request.InputStream = ms;
                s3Client.PutObject(request);
            }
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine("AWS Fail");
    }
}
I have tested the same code and it works fine for me. You do not need to specify the bucket name in the Key; you can specify a folder name if you want to store the file in a folder inside the bucket:
Key = string.Format("FolderName/{0}", "foo.jpg")
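In other words, the bucket is addressed only by BucketName, and the Key is the object's path inside that bucket. A minimal sketch with hypothetical names:

```csharp
var request = new PutObjectRequest
{
    BucketName = "bucketName",          // the bucket itself
    CannedACL = S3CannedACL.PublicRead,
    // No bucket name in the key; an optional "folder" prefix is enough.
    Key = string.Format("FolderName/{0}", "foo.jpg")
};
```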

How to upload an image to Amazon S3 using a .NET Web API in C#?

I have created an API for image upload. Currently the code saves the uploaded image into a local folder, but I now need to change it so the image goes to Amazon S3 instead. I found a link while searching, but in that link a static image is uploaded; I need the image to come from the file-upload control and then be sent to the Amazon server. I have no idea how to do that, so please, if anyone knows, help me. My code is listed below, along with what I have already tried.
This is my API method for the image upload:
[HttpPost]
[Route("FileUpload")]
public HttpResponseMessage FileUpload(string FileUploadType)
{
    try
    {
        var httpRequest = HttpContext.Current.Request;
        if (httpRequest.Files.Count > 0)
        {
            foreach (string file in httpRequest.Files)
            {
                var postedFile = httpRequest.Files[file];
                string fname = System.IO.Path.GetFileNameWithoutExtension(postedFile.FileName.ToString());
                string extension = Path.GetExtension(postedFile.FileName);
                Image img = null;
                string newFileName = DateTime.Now.ToString("yyyyMMddhhmmssfff") + ".jpeg";
                string path = ConfigurationManager.AppSettings["ImageUploadPath"].ToString();
                string filePath = Path.Combine(path, newFileName);
                SaveJpg(img, filePath);
                return Request.CreateResponse(HttpStatusCode.OK, "Ok");
            }
        }
    }
    catch (Exception ex)
    {
        // An HttpResponseMessage method cannot return the exception directly.
        return Request.CreateErrorResponse(HttpStatusCode.InternalServerError, ex);
    }
    return Request.CreateResponse(HttpStatusCode.OK, "Ok");
}
This is my save-image method:
public static void SaveJpg(Image image, string file_name, long compression = 60)
{
    try
    {
        EncoderParameters encoder_params = new EncoderParameters(1);
        encoder_params.Param[0] = new EncoderParameter(
            System.Drawing.Imaging.Encoder.Quality, compression);
        ImageCodecInfo image_codec_info = GetEncoderInfo("image/jpeg");
        image.Save(file_name, image_codec_info, encoder_params);
    }
    catch (Exception ex)
    {
    }
}
I have tried this code with a static image upload to the server:
private string bucketName = "Xyz";
private string keyName = "abc.jpeg";
private string filePath = "C:\\Users\\I BALL\\Desktop\\image\\abc.jpeg"; // this image is stored on the server
public void UploadFile()
{
    var client = new AmazonS3Client(Amazon.RegionEndpoint.USEast1);
    try
    {
        PutObjectRequest putRequest = new PutObjectRequest
        {
            BucketName = bucketName,
            Key = keyName,
            FilePath = filePath,
            ContentType = "text/plain" // note: "image/jpeg" would be the correct type for this file
        };
        PutObjectResponse response = client.PutObject(putRequest);
    }
    catch (AmazonS3Exception amazonS3Exception)
    {
        if (amazonS3Exception.ErrorCode != null &&
            (amazonS3Exception.ErrorCode.Equals("InvalidAccessKeyId") ||
             amazonS3Exception.ErrorCode.Equals("InvalidSecurity")))
        {
            throw new Exception("Check the provided AWS Credentials.");
        }
        else
        {
            throw new Exception("Error occurred: " + amazonS3Exception.Message);
        }
    }
}
I have shown my code here, but I need to merge the two approaches; does anyone know how to do that?
This might be too late, but here is how I did it:
Short answer: the Amazon S3 SDK for .NET has a class called "TransferUtility" which accepts a Stream object, so as long as you can convert your file to any class derived from the abstract Stream class, you can upload the file.
Long answer:
The HTTP request's posted files have an InputStream property, so inside your foreach loop:
var postedFile = httpRequest.Files[file];
If you inspect this object, it is of type "HttpPostedFile", so you have access to the stream through the InputStream property.
Here are some snippets from a working sample:
// get values from the headers
HttpPostedFile postedFile = httpRequest.Files["File"];
// convert the posted file stream to a memory stream
System.IO.MemoryStream target = new System.IO.MemoryStream();
postedFile.InputStream.CopyTo(target);
// the following static function accepts the Amazon file key and the object that
// will be uploaded to S3 -- in this case, a MemoryStream object
s3.WritingAnObject(fileKey, target);
The s3 here is an instance of a class called "S3Uploader"; below are some snippets that can get you going, starting with the needed namespaces:
using Amazon;
using Amazon.Runtime;
using Amazon.S3;
using Amazon.S3.Model;
using Amazon.S3.Transfer;
class fields and constructor:
static IAmazonS3 client;
static TransferUtility fileTransferUtility;
static string _bucketName; // referenced by the upload method below

public S3Uploader(string accessKeyId, string secretAccessKey, string bucketName)
{
    _bucketName = bucketName;
    var credentials = new BasicAWSCredentials(accessKeyId, secretAccessKey);
    client = new AmazonS3Client(credentials, RegionEndpoint.USEast1);
    fileTransferUtility = new TransferUtility(client);
}
Notice that we create the credentials using the BasicAWSCredentials class instead of passing them to the AmazonS3Client directly, and that we use the TransferUtility class for better control over what is sent to S3. Here is how the upload works from a MemoryStream:
public void WritingAnObject(string keyName, MemoryStream fileToUpload)
{
    try
    {
        // Build the request with the stream and key, then pass it to Upload
        // (the original snippet built the request but never used it).
        TransferUtilityUploadRequest fileTransferUtilityRequest = new TransferUtilityUploadRequest
        {
            BucketName = _bucketName,
            Key = keyName,
            InputStream = fileToUpload,
            StorageClass = S3StorageClass.ReducedRedundancy,
            CannedACL = S3CannedACL.Private
        };
        fileTransferUtility.Upload(fileTransferUtilityRequest);
    }
    catch (AmazonS3Exception amazonS3Exception)
    {
        // your error handling here
    }
}
Hope this helps someone with similar issues.
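For completeness, a hedged usage sketch of the pieces above (the credentials, bucket name, and key prefix are placeholders); note that after CopyTo the MemoryStream's position is at the end, so it should be rewound before uploading:

```csharp
var s3 = new S3Uploader("accessKeyId", "secretAccessKey", "my-bucket"); // hypothetical credentials
var target = new System.IO.MemoryStream();
postedFile.InputStream.CopyTo(target);
target.Position = 0; // rewind so the upload starts from the first byte
s3.WritingAnObject("uploads/" + postedFile.FileName, target);
```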

System.Net.ProtocolViolationException: Bytes to be written to the stream exceed the Content-Length bytes size specified

Using AWSSDK.dll, I am trying to loop through the images in a zip file and send them to Amazon S3 using the code below. The problem is that I keep getting the error
System.Net.ProtocolViolationException
when I call the fileTransferUtility.Upload() method. The zip is posted as an HttpPostedFile to the method, which opens the zip file, then loops through each entry and uploads it.
ZipFile zipFile = new ZipFile(postedFile.InputStream);
foreach (ZipEntry zipEntry in zipFile)
{
    if (zipEntry.Name != String.Empty)
    {
        string saveLocation = unzipBaseDir + "/" + zipEntry.Name;
        string dbLocation = "./" + Path.Combine(unzipBaseDir, zipEntry.Name).Replace(@"\", "/");
        // save the file
        TransferUtilityConfig config = new TransferUtilityConfig();
        config.MinSizeBeforePartUpload = 80740;
        TransferUtility fileTransferUtility = new TransferUtility(
            new AmazonS3Client(accessKeyID, secretAccessKeyID, RegionEndpoint.EUWest1), config);
        using (Stream fileToUpload = zipFile.GetInputStream(zipEntry))
        {
            TransferUtilityUploadRequest fileTransferUtilityRequest = new TransferUtilityUploadRequest
            {
                BucketName = existingBucketName,
                InputStream = fileToUpload,
                StorageClass = S3StorageClass.Standard,
                PartSize = fileToUpload.Length,
                Key = saveLocation,
                CannedACL = S3CannedACL.PublicRead
            };
            fileTransferUtility.Upload(fileTransferUtilityRequest);
        }
    }
}
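No accepted answer was recorded for this one, but this exception generally means that more bytes were written to the request than the declared Content-Length. One plausible culprit is that the length reported for the zip-entry stream does not match the bytes actually read out when the entry is decompressed. A hedged workaround sketch, under that assumption, is to buffer each entry into a MemoryStream first so the length is exact, and to let the SDK choose the part size:

```csharp
using (Stream entryStream = zipFile.GetInputStream(zipEntry))
using (var buffer = new MemoryStream())
{
    entryStream.CopyTo(buffer); // materialize the entry so its length is exact
    buffer.Position = 0;
    var request = new TransferUtilityUploadRequest
    {
        BucketName = existingBucketName,
        InputStream = buffer,
        StorageClass = S3StorageClass.Standard,
        Key = saveLocation,
        CannedACL = S3CannedACL.PublicRead
    };
    fileTransferUtility.Upload(request);
}
```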
