I am tired of all these "upload to S3" examples and tutorials that don't work. Can someone just show me an example that simply works and is super easy?
Well, here are the instructions you have to follow to get a fully working demo program...
1- Download and install the AWS SDK for .NET, which you can find at http://aws.amazon.com/sdk-for-net/. Because I have Visual Studio 2010, I chose to install the .NET 3.5 SDK.
2- Open Visual Studio and make a new project. I have Visual Studio 2010 and I am using a console application project.
3- Add a reference to AWSSDK.dll. It is installed with the AWS SDK mentioned above; on my system the DLL is located at "C:\Program Files (x86)\AWS SDK for .NET\bin\Net35\AWSSDK.dll".
4- Make a new class file and call it "AmazonUploader". Here is the complete code of the class:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Amazon;
using Amazon.S3;
using Amazon.S3.Transfer;

namespace UploadToS3Demo
{
    public class AmazonUploader
    {
        public bool sendMyFileToS3(string localFilePath, string bucketName, string subDirectoryInBucket, string fileNameInS3)
        {
            // input explained:
            // localFilePath = the full local file path, e.g. "c:\mydir\mysubdir\myfilename.zip"
            // bucketName = the name of the bucket in S3; the bucket should already be created
            // subDirectoryInBucket = if this string is not empty the file will be uploaded to
            //     a subdirectory with this name
            // fileNameInS3 = the file name in S3

            // create an instance of the IAmazonS3 class; in my case I chose RegionEndpoint.EUWest1.
            // you can change that to APNortheast1, APSoutheast1, APSoutheast2, CNNorth1,
            // SAEast1, USEast1, USGovCloudWest1, USWest1, USWest2. this choice will not
            // store your file in a different cloud storage but (I think) it differs in performance
            // depending on your location
            IAmazonS3 client = Amazon.AWSClientFactory.CreateAmazonS3Client(RegionEndpoint.EUWest1);

            // create a TransferUtility instance passing it the IAmazonS3 created in the first step
            TransferUtility utility = new TransferUtility(client);
            // making a TransferUtilityUploadRequest instance
            TransferUtilityUploadRequest request = new TransferUtilityUploadRequest();

            if (string.IsNullOrEmpty(subDirectoryInBucket))
            {
                request.BucketName = bucketName; // no subdirectory, just the bucket name
            }
            else
            {
                // subdirectory and bucket name
                request.BucketName = bucketName + @"/" + subDirectoryInBucket;
            }
            request.Key = fileNameInS3;       // file name up in S3
            request.FilePath = localFilePath; // local file path
            utility.Upload(request);          // commencing the transfer

            return true; // indicate that the file was sent
        }
    }
}
5- Add a configuration file: right-click your project in the Solution Explorer and choose "Add" -> "New Item", then from the list choose "Application Configuration File" and click the "Add" button. A file called "App.config" is added to the solution.
6- Edit the App.config file: double-click "App.config" in the Solution Explorer and the editor will appear. Replace all the text with the following:
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <appSettings>
    <add key="AWSProfileName" value="profile1"/>
    <add key="AWSAccessKey" value="your Access Key goes here"/>
    <add key="AWSSecretKey" value="your Secret Key goes here"/>
  </appSettings>
</configuration>
You have to modify the text above to reflect your Amazon Access Key ID and Secret Access Key.
7- Now in the Program.cs file (remember, this is a console application) write the following code:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace UploadToS3Demo
{
    class Program
    {
        static void Main(string[] args)
        {
            // preparing our file and directory names
            string fileToBackup = @"d:\mybackupFile.zip"; // test file
            string myBucketName = "mys3bucketname"; // your S3 bucket name goes here
            string s3DirectoryName = "justdemodirectory";
            string s3FileName = @"mybackupFile uploaded in 12-9-2014.zip";

            AmazonUploader myUploader = new AmazonUploader();
            myUploader.sendMyFileToS3(fileToBackup, myBucketName, s3DirectoryName, s3FileName);
        }
    }
}
8- Replace the strings in the code above with your own data.
9- Add error handling, as sketched below.
And your program is ready.
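For step 9, a minimal sketch of what the error handling might look like, wrapping the call from step 7 (this try/catch is my illustration, not part of the original walkthrough):

try
{
    AmazonUploader myUploader = new AmazonUploader();
    myUploader.sendMyFileToS3(fileToBackup, myBucketName, s3DirectoryName, s3FileName);
    Console.WriteLine("Upload succeeded.");
}
catch (Amazon.S3.AmazonS3Exception ex)
{
    // S3-specific failures: bad credentials, missing bucket, access denied, ...
    Console.WriteLine("S3 error: " + ex.Message);
}
catch (Exception ex)
{
    // anything else: network problems, a bad local file path, ...
    Console.WriteLine("Error: " + ex.Message);
}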
The solution of @docesam is for an old version of AWSSDK. Here is an example with the latest documentation of AmazonS3:
First open Visual Studio (I'm using VS2015) and create a New Project -> ASP.NET Web Application -> MVC.
In Manage NuGet Packages, browse for the package AWSSDK.S3 and install it.
Now create a class named AmazonS3Uploader, then copy and paste this code:
using System;
using Amazon.S3;
using Amazon.S3.Model;

namespace AmazonS3Demo
{
    public class AmazonS3Uploader
    {
        private string bucketName = "your-amazon-s3-bucket";
        private string keyName = "the-name-of-your-file";
        private string filePath = "C:\\Users\\yourUserName\\Desktop\\myImageToUpload.jpg";

        public async void UploadFile()
        {
            var client = new AmazonS3Client(Amazon.RegionEndpoint.USEast1);
            try
            {
                PutObjectRequest putRequest = new PutObjectRequest
                {
                    BucketName = bucketName,
                    Key = keyName,
                    FilePath = filePath,
                    ContentType = "text/plain"
                };

                PutObjectResponse response = await client.PutObjectAsync(putRequest);
            }
            catch (AmazonS3Exception amazonS3Exception)
            {
                if (amazonS3Exception.ErrorCode != null &&
                    (amazonS3Exception.ErrorCode.Equals("InvalidAccessKeyId")
                     ||
                     amazonS3Exception.ErrorCode.Equals("InvalidSecurity")))
                {
                    throw new Exception("Check the provided AWS Credentials.");
                }
                else
                {
                    throw new Exception("Error occurred: " + amazonS3Exception.Message);
                }
            }
        }
    }
}
Edit your Web.config file, adding the following lines inside of <appSettings></appSettings>:
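Presumably these are the same credential keys as in the App.config from the first answer:

<add key="AWSAccessKey" value="your Access Key goes here"/>
<add key="AWSSecretKey" value="your Secret Key goes here"/>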
Now call your method UploadFile from HomeController.cs to test it:
public class HomeController : Controller
{
    public ActionResult Index()
    {
        AmazonS3Uploader amazonS3 = new AmazonS3Uploader();
        amazonS3.UploadFile();
        return View();
    }
    ....
Find your file in your Amazon S3 bucket and that's all.
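One caveat: UploadFile above is async void, so Index returns before the upload finishes and any exception is silently lost. A sketch of a safer variant (my adaptation, not part of the original answer):

// in AmazonS3Uploader: return Task instead of void so the caller can await it
public async Task UploadFile()
{
    // ... same body as above ...
}

// in HomeController:
public async Task<ActionResult> Index()
{
    AmazonS3Uploader amazonS3 = new AmazonS3Uploader();
    await amazonS3.UploadFile();
    return View();
}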
Download my Demo Project
I have written a tutorial about this.
Uploading a file to an S3 bucket using the low-level API:
IAmazonS3 client = new AmazonS3Client("AKI...access-key...", "+8Bo...secrey-key...", RegionEndpoint.APSoutheast2);
FileInfo file = new FileInfo(#"c:\test.txt");
string destPath = "folder/sub-folder/test.txt"; // <-- low-level s3 path uses /
PutObjectRequest request = new PutObjectRequest()
{
InputStream = file.OpenRead(),
BucketName = "my-bucket-name",
Key = destPath // <-- in S3 key represents a path
};
PutObjectResponse response = client.PutObject(request);
Uploading a file to an S3 bucket using the high-level API:
IAmazonS3 client = new AmazonS3Client("AKI...access-key...", "+8Bo...secrey-key...", RegionEndpoint.APSoutheast2);
FileInfo localFile = new FileInfo(#"c:\test.txt");
string destPath = #"folder\sub-folder\test.txt"; // <-- high-level s3 path uses \
S3FileInfo s3File = new S3FileInfo(client, "my-bucket-name", destPath);
if (!s3File.Exists)
{
using (var s3Stream = s3File.Create()) // <-- create file in S3
{
localFile.OpenRead().CopyTo(s3Stream); // <-- copy the content to S3
}
}
@mejiamanuel57's solution works fine for small files under 15 MB. For larger files, I was getting System.Net.Sockets.SocketException: The I/O operation has been aborted because of either a thread exit or an application request. The following improved solution works for larger files (tested with a 50 MB file):
...
public void UploadFile()
{
    var client = new AmazonS3Client(Amazon.RegionEndpoint.USEast1);
    var transferUtility = new TransferUtility(client);
    try
    {
        TransferUtilityUploadRequest transferUtilityUploadRequest = new TransferUtilityUploadRequest
        {
            BucketName = bucketName,
            Key = keyName,
            FilePath = filePath,
            ContentType = "text/plain"
        };

        transferUtility.Upload(transferUtilityUploadRequest); // use UploadAsync if possible
    }
...
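As the comment above suggests, the upload can also be awaited; a sketch of the async variant, assuming the caller can be made async:

public async Task UploadFileAsync()
{
    var client = new AmazonS3Client(Amazon.RegionEndpoint.USEast1);
    var transferUtility = new TransferUtility(client);
    var request = new TransferUtilityUploadRequest
    {
        BucketName = bucketName,
        Key = keyName,
        FilePath = filePath,
        ContentType = "text/plain"
    };
    await transferUtility.UploadAsync(request); // does not block the calling thread
}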
More info here.
The example on the AWS site worked for me:
https://docs.aws.amazon.com/AmazonS3/latest/dev/HLuploadFileDotNet.html
Although it was set to a different region, which returned an error:
//private static readonly RegionEndpoint bucketRegion = RegionEndpoint.USWest2;
private static readonly RegionEndpoint bucketRegion = RegionEndpoint.USWest1;
I set up my bucket with Northern California, which is USWest1.
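If you are not sure which region a bucket lives in, you can ask S3 itself; a quick sketch (the bucket name is a placeholder, and this assumes an async context with the AWSSDK.S3 usings in place):

var client = new AmazonS3Client(RegionEndpoint.USEast1);
var location = await client.GetBucketLocationAsync(new GetBucketLocationRequest
{
    BucketName = "my-bucket-name" // placeholder
});
Console.WriteLine(location.Location); // e.g. "us-west-1"; an empty value means us-east-1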
@mejiamanuel57 and @HoomanBahreini cover it very well, but I would still like to show the official sample code from the AWS SDK Code Examples:
https://docs.aws.amazon.com/sdk-for-net/v3/developer-guide/welcome.html
https://github.com/awsdocs/aws-doc-sdk-examples/tree/main/dotnetv3/S3
https://github.com/awsdocs/aws-doc-sdk-examples/blob/main/dotnetv3/S3/S3_Basics/S3Bucket.cs#L45
/// <summary>
/// Shows how to upload a file from the local computer to an Amazon S3
/// bucket.
/// </summary>
/// <param name="client">An initialized Amazon S3 client object.</param>
/// <param name="bucketName">The Amazon S3 bucket to which the object
/// will be uploaded.</param>
/// <param name="objectName">The object to upload.</param>
/// <param name="filePath">The path, including file name, of the object
/// on the local computer to upload.</param>
/// <returns>A boolean value indicating the success or failure of the
/// upload procedure.</returns>
public static async Task<bool> UploadFileAsync(
    IAmazonS3 client,
    string bucketName,
    string objectName,
    string filePath)
{
    var request = new PutObjectRequest
    {
        BucketName = bucketName,
        Key = objectName,
        FilePath = filePath,
    };

    var response = await client.PutObjectAsync(request);
    if (response.HttpStatusCode == System.Net.HttpStatusCode.OK)
    {
        Console.WriteLine($"Successfully uploaded {objectName} to {bucketName}.");
        return true;
    }
    else
    {
        Console.WriteLine($"Could not upload {objectName} to {bucketName}.");
        return false;
    }
}
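A possible way to call it from an async context (bucket, key, and path are placeholders; the method lives in the sample's S3Bucket class):

var client = new AmazonS3Client(RegionEndpoint.USWest2);
bool ok = await S3Bucket.UploadFileAsync(client, "doc-example-bucket", "myfile.txt", @"C:\temp\myfile.txt");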
This is a modified service I wrote to upload JSON strings as JSON files, with both sync and async methods:
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Amazon.Runtime;
using Amazon.S3;
using Amazon.S3.Model;

public class AwsS3Service
{
    private readonly IAmazonS3 _client;
    private readonly string _bucketName;
    private readonly string _keyPrefix;

    /// <param name="bucketName">The Amazon S3 bucket to which the object
    /// will be uploaded.</param>
    public AwsS3Service(string accessKey, string secretKey, string bucketName, string keyPrefix)
    {
        BasicAWSCredentials credentials = new BasicAWSCredentials(accessKey, secretKey);
        _client = new AmazonS3Client(credentials);
        _bucketName = bucketName;
        _keyPrefix = keyPrefix;
    }

    public bool UploadJson(string objectName, string json, int migrationId, long orgId)
    {
        return Task.Run(() => UploadJsonAsync(objectName, json, migrationId, orgId)).GetAwaiter().GetResult();
    }

    public async Task<bool> UploadJsonAsync(string objectName, string json, int migrationId, long orgId)
    {
        var request = new PutObjectRequest
        {
            BucketName = _bucketName,
            Key = $"{_keyPrefix}{migrationId}/{orgId}/{objectName}",
            InputStream = new MemoryStream(Encoding.UTF8.GetBytes(json)),
        };

        var response = await _client.PutObjectAsync(request);
        return response.HttpStatusCode == System.Net.HttpStatusCode.OK;
    }
}
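A usage sketch (all values are placeholders):

var service = new AwsS3Service("AKIA...", "secret...", "my-bucket", "exports/");
bool ok = service.UploadJson("contacts.json", "{\"name\":\"test\"}", migrationId: 1, orgId: 42);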
I am trying to implement an ML.NET model in a Blazor Server app and it throws an error.
using System;
using System.Reflection;
using Microsoft.ML;

public static class AIModel
{
    public enum Label
    {
        Undefined,
        Bolnav,
        Sanatos
    }

    private static string MLNetModelPath = "BlazorTWProject.AiModelMers.zip";

    public static readonly Lazy<PredictionEngine<ImageInput, ImageOutput>> PredictEngine =
        new Lazy<PredictionEngine<ImageInput, ImageOutput>>(AIModel.CreatePredictEngine, true);

    public static ImageOutput Predict(ImageInput input)
    {
        var predEngine = PredictEngine.Value;
        return predEngine.Predict(input);
    }

    private static PredictionEngine<ImageInput, ImageOutput> CreatePredictEngine()
    {
        MLContext? mlContext = new MLContext();
        var thisAssembly = Assembly.GetExecutingAssembly();
        var files = thisAssembly.GetManifestResourceNames();
        using var stream = thisAssembly.GetManifestResourceStream(MLNetModelPath);
        ITransformer mlModel = mlContext.Model.Load(stream, out DataViewSchema _);
        return mlContext.Model.CreatePredictionEngine<ImageInput, ImageOutput>(mlModel);
    }
}
I tried adding the zip file to the embedded resources, but nothing worked.
I want to point out that even though it says it can't find the file, if I change the stream parameter to a non-existent file path, the code throws a System.IO exception, not a System.Reflection.TargetInvocationException.
The problem was that I hadn't installed all the NuGet packages required for the model.
After I installed ML.Vision, ML.ImageAnalytics, ... the problem was solved.
I'm writing a Discord bot (Discord.Net) and I need to access some data on a Google Sheet using their APIs. Since I thought it best to separate those two concerns into two different class files, I tried calling the Main method of the Google APIs quickstart (after renaming it "Sheets") from my Program.cs like this:
using Discord;
using Discord.WebSocket;
using System;
using System.IO;
using System.Threading.Tasks;

namespace WoM_Balance_Bot
{
    public class Program
    {
        public static void Main(string[] args)
        {
            GoogleAPI GSheet = new GoogleAPI();
            GSheet.Sheets();

            new Program().MainAsync().GetAwaiter().GetResult();
        }

        private DiscordSocketClient _client;

        public async Task MainAsync()
        {
            _client = new DiscordSocketClient();
            _client.MessageReceived += CommandHandler;
            _client.Log += Log;

            var token = File.ReadAllText("bot-token.txt");

            await _client.LoginAsync(TokenType.Bot, token);
            await _client.StartAsync();

            // Block this task until the program is closed.
            await Task.Delay(-1);
        }
        // ... etc.
I tried writing the parameters to pass inside those parentheses, like "string" and "args", but either I get the syntax wrong or I have a very wrong idea about what exactly to pass.
This is the actual content of GoogleAPI.cs, which is the other class file I created that holds the Google Sheets API code:
using Google.Apis.Auth.OAuth2;
using Google.Apis.Services;
using Google.Apis.Sheets.v4;
using Google.Apis.Sheets.v4.Data;
using Google.Apis.Util.Store;
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;

namespace WoM_Balance_Bot
{
    public class GoogleAPI
    {
        // If modifying these scopes, delete your previously saved credentials
        // at ~/.credentials/sheets.googleapis.com-dotnet-quickstart.json
        private static readonly string[] Scopes = { SheetsService.Scope.SpreadsheetsReadonly };
        private static readonly string ApplicationName = "wombankrolls";

        public static void Sheets(string[] args)
        {
            UserCredential credential;
            Console.WriteLine("if you read this then it's good");

            using (var stream =
                new FileStream("credentials.json", FileMode.Open, FileAccess.Read))
            {
                // The file token.json stores the user's access and refresh tokens, and is created
                // automatically when the authorization flow completes for the first time.
                string credPath = "token.json";
                credential = GoogleWebAuthorizationBroker.AuthorizeAsync(
                    GoogleClientSecrets.Load(stream).Secrets,
                    Scopes,
                    "user",
                    CancellationToken.None,
                    new FileDataStore(credPath, true)).Result;
                Console.WriteLine("Credential file saved to: " + credPath);
            }

            // Create Google Sheets API service.
            var service = new SheetsService(new BaseClientService.Initializer()
            {
                HttpClientInitializer = credential,
                ApplicationName = ApplicationName,
            });

            // Define request parameters.
            String spreadsheetId = "16W56LWqt6wDaYAU5xNdTWCdaY_gkuQyl4CE1lPpUui4";
            String range = "Class Data!G163:I";
            SpreadsheetsResource.ValuesResource.GetRequest request =
                service.Spreadsheets.Values.Get(spreadsheetId, range);

            // Prints the names and majors of students in a sample spreadsheet:
            // https://docs.google.com/spreadsheets/d/1BxiMVs0XRA5nFMdKvBdBZjgmUUqptlbs74OgvE2upms/edit
            ValueRange response = request.Execute();
            IList<IList<Object>> values = response.Values;
            /*
            if (values != null && values.Count > 0)
            {
                Console.WriteLine("Name, Major");
                foreach (var row in values)
                {
                    // Print columns A and E, which correspond to indices 0 and 4.
                    Console.WriteLine("{0}, {1}", row[0], row[4]);
                }
            }
            else
            {
                Console.WriteLine("No data found.");
            }
            Console.Read();
            */
        }
    }
}
I modified it from the quickstart provided by Google in a way that I thought made sense, but I still get the same error in the end:
There is no argument given that corresponds to the required formal parameter 'args' of 'GoogleAPI.Sheets(string[])'
As the user "David L" wrote in the comments:
As a general rule of thumb, if you do not use an argument, remove it. C# helps enforce this paradigm by throwing a compiler error if your method expects an argument and you do not provide one, which is exactly what is happening here.
It was my mistake, as I was under the completely opposite impression from an earlier API implementation. I always aim for clean code, and keeping stuff I will never use was my mistake. Thank you, David!
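For completeness, a sketch of the fix: since args is never used inside Sheets, the parameter can simply be dropped.

// in GoogleAPI.cs: remove the unused parameter
public static void Sheets()
{
    // ... body unchanged ...
}

// in Program.cs: Sheets is static, so call it on the type, not on an instance
GoogleAPI.Sheets();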
I am stuck with AWS S3 in my .NET Core MVC application. I simply need to take an S3 bucket name as input and return the list of all directory names in that bucket, but I haven't found this simple task anywhere on the internet. I have already tried a few solutions provided on the AWS forum, but the problem is that they don't work at all. Below I have provided my controller code and the forum link. The issue, as they explained there, is that the Amazon.S3.IO namespace and S3DirectoryInfo were removed from .NET Core, so I could not follow their advice. Can anyone fix my code below so that it returns a list of the bucket's directories in a .NET Core application?
I am using two NuGet packages:
AWSSDK.Core and AWSSDK.S3
Forum Link - Amazon.S3.IO not supported in .Net Core anymore?
Controller:
using Amazon;
using Amazon.S3;
using Amazon.S3.Model;
public IActionResult Media()
{
    string bucketName = "domain33.com";
    AmazonS3Client s3Client = new AmazonS3Client("Access_Key_ID", "Secret_Access_Key", RegionEndpoint.USEast1);

    var getResponse = s3Client.ListBucketsAsync(new GetObjectRequest
    {
        BucketName = bucketName
    });
    var x = getResponse;
    return View();
}
You could try using the ListObjectsV2Async method on IAmazonS3 to retrieve a list of all of the existing objects in the bucket, based on AWS's example. Their code is below in case the link dies:
// Copyright 2018 Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: MIT-0 (For details, see https://github.com/awsdocs/amazon-s3-developer-guide/blob/master/LICENSE-SAMPLECODE.)

using Amazon.S3;
using Amazon.S3.Model;
using System;
using System.Threading.Tasks;

namespace Amazon.DocSamples.S3
{
    class ListObjectsTest
    {
        private const string bucketName = "*** bucket name ***";
        // Specify your bucket region (an example region is shown).
        private static readonly RegionEndpoint bucketRegion = RegionEndpoint.USWest2;
        private static IAmazonS3 client;

        public static void Main()
        {
            client = new AmazonS3Client(bucketRegion);
            ListingObjectsAsync().Wait();
        }

        static async Task ListingObjectsAsync()
        {
            try
            {
                ListObjectsV2Request request = new ListObjectsV2Request
                {
                    BucketName = bucketName,
                    MaxKeys = 10
                };
                ListObjectsV2Response response;
                do
                {
                    response = await client.ListObjectsV2Async(request);

                    // Process the response.
                    foreach (S3Object entry in response.S3Objects)
                    {
                        Console.WriteLine("key = {0} size = {1}",
                            entry.Key, entry.Size);
                    }
                    Console.WriteLine("Next Continuation Token: {0}", response.NextContinuationToken);
                    request.ContinuationToken = response.NextContinuationToken;
                } while (response.IsTruncated);
            }
            catch (AmazonS3Exception amazonS3Exception)
            {
                Console.WriteLine("S3 error occurred. Exception: " + amazonS3Exception.ToString());
                Console.ReadKey();
            }
            catch (Exception e)
            {
                Console.WriteLine("Exception: " + e.ToString());
                Console.ReadKey();
            }
        }
    }
}
Based on that sample, you could do further processing or add the keys to a list of strings for subsequent processing, instead of just writing them to the console as the example code does. For instance, you could add each key to a list and then process that list to calculate the distinct "directories".
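Alternatively, S3 can do that grouping for you: setting a delimiter makes the service return the top-level "directories" as common prefixes. A sketch using the same SDK (the bucket name is a placeholder; pagination via ContinuationToken is omitted for brevity):

var request = new ListObjectsV2Request
{
    BucketName = "my-bucket-name", // placeholder
    Delimiter = "/"                // group keys at the first "/" into prefixes
};
var response = await client.ListObjectsV2Async(request);
foreach (string prefix in response.CommonPrefixes)
{
    Console.WriteLine(prefix); // e.g. "images/", "videos2015/"
}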
I have been struggling to find a way of persisting an SQLite database on a Pi under Win IoT which can be accessed by different background applications (not concurrently).
I thought I had the answer when I discovered Libraries (Music, Pictures, Videos - but perversely not Documents, without more work). I can create a text file in one app and write it to the Pictures library's default folder. I can then read the text file with another app. File.Exists returns true. Bingo (I thought)!
However, SQLite will not create a database in the folder or open an existing database that I copy to the folder. SQLite.Net.SQLiteConnection returns an SQLite exception: "Could not open database file: C:\Data\Users\DefaultAccount\Pictures\MyDb.db (CannotOpen)" - no further clues.
The folder appears to grant full permissions. Does anyone have any ideas, please?
Creating and Writing a text file:
using System;
using Windows.ApplicationModel.Background;
using System.IO;
using System.Diagnostics;

//*** NOTE: Pictures Library checked in Package.appxmanifest 'Capabilities'

namespace LibraryTest
{
    public sealed class StartupTask : IBackgroundTask
    {
        private BackgroundTaskDeferral Deferral;

        public async void Run (IBackgroundTaskInstance taskInstance)
        {
            Deferral = taskInstance.GetDeferral ();

            var myPictures = await Windows.Storage.StorageLibrary.GetLibraryAsync
                (Windows.Storage.KnownLibraryId.Pictures);
            string path = myPictures.SaveFolder.Path;
            Debug.WriteLine ($"'Pictures' Folder: {path}");

            string newFilePath = Path.Combine (path, "TestTextFile.txt");
            Debug.WriteLine ($"New File Path: {newFilePath}");

            try {
                using ( var stream = File.OpenWrite (newFilePath) ) {
                    using ( var writer = new StreamWriter (stream) ) {
                        writer.Write ("This is some test text.");
                    }
                }
                Debug.WriteLine ($"File created OK");
            }
            catch (Exception ex) { Debug.WriteLine ($"Exception: {ex.Message}"); }
        }
    }
}
Produced:
'Pictures' Folder: C:\Data\Users\DefaultAccount\Pictures
New File Path: C:\Data\Users\DefaultAccount\Pictures\TestTextFile.txt
File created OK
Reading:
using System;
using Windows.ApplicationModel.Background;
using System.IO;
using System.Diagnostics;

//*** NOTE: Pictures Library checked in Package.appxmanifest 'Capabilities'

namespace ReadLibraryTest
{
    public sealed class StartupTask : IBackgroundTask
    {
        private BackgroundTaskDeferral Deferral;

        public async void Run (IBackgroundTaskInstance taskInstance)
        {
            Deferral = taskInstance.GetDeferral ();

            var myPictures = await Windows.Storage.StorageLibrary.GetLibraryAsync
                (Windows.Storage.KnownLibraryId.Pictures);
            string path = myPictures.SaveFolder.Path;
            Debug.WriteLine ($"'Pictures' Folder: {path}");

            string newFilePath = Path.Combine (path, "TestTextFile.txt");
            Debug.WriteLine ($"New File Path: {newFilePath}");

            try {
                using ( var stream = File.OpenRead (newFilePath) ) {
                    using ( var reader = new StreamReader (stream) ) {
                        string fileContents = reader.ReadLine ();
                        Debug.WriteLine ($"First line of file: '{fileContents}'");
                    }
                }
                Debug.WriteLine ($"File read OK");
            }
            catch ( Exception ex ) { Debug.WriteLine ($"Exception: {ex.Message}"); }
        }
    }
}
Produced:
'Pictures' Folder: C:\Data\Users\DefaultAccount\Pictures
New File Path: C:\Data\Users\DefaultAccount\Pictures\TestTextFile.txt
First line of file: 'This is some test text.'
File read OK
However, SQLite will not create a database in the folder or open an existing database that I copy to the folder. SQLite.Net.SQLiteConnection returns an SQLite exception: "Could not open database file: C:\Data\Users\DefaultAccount\Pictures\MyDb.db (CannotOpen)" - no further clues.
Yes, I reproduced this issue. It seems this folder does not work with SQLite file operations, but I don't know where the problem is.
As a workaround, you can use the PublisherCacheFolder. I created the .db file and wrote data in one background app, then read the data from another background app. It works.
Contact class:
public sealed class Contact
{
    public int Id { get; set; }
    public string Name { get; set; }
}
Create and write file:
StorageFolder sharedFonts = Windows.Storage.ApplicationData.Current.GetPublisherCacheFolder("test");
var sqlpath = System.IO.Path.Combine(sharedFonts.Path, "MyDb.db");

using (SQLite.Net.SQLiteConnection conn = new SQLite.Net.SQLiteConnection(new SQLite.Net.Platform.WinRT.SQLitePlatformWinRT(), sqlpath))
{
    conn.CreateTable<Contact>();
    for (var i = 0; i < 100; i++)
    {
        Contact contact = new Contact()
        {
            Id = i,
            Name = "A"
        };
        conn.Insert(contact);
    }
}
Read file:
StorageFolder sharedFonts = Windows.Storage.ApplicationData.Current.GetPublisherCacheFolder("test");
var sqlpath = System.IO.Path.Combine(sharedFonts.Path, "MyDb.db");

using (SQLite.Net.SQLiteConnection conn = new SQLite.Net.SQLiteConnection(new SQLite.Net.Platform.WinRT.SQLitePlatformWinRT(), sqlpath))
{
    var query = conn.Table<Contact>().Where(v => v.Name.Equals("A"));
    foreach (var stock in query)
        Debug.WriteLine("contact: " + stock.Id);
}
To use this publisher folder you need to add the following lines to Package.appxmanifest:
<Extensions>
  <Extension Category="windows.publisherCacheFolders">
    <PublisherCacheFolders>
      <Folder Name="test" />
    </PublisherCacheFolders>
  </Extension>
</Extensions>
Thanks, Rita. Worked very well. For the benefit of anyone reading, I am using the async version of SQLite and create the connection as follows:
const string DbFileName = "MyFile.db";
string DbDir;
string DbPath;
Constructor:
DbDir = Windows.Storage.ApplicationData.Current.GetPublisherCacheFolder("test").Path;
DbPath = Path.Combine (DbDir, DbFileName);
public SQLite.Net.Async.SQLiteAsyncConnection GetConnectionAsync ()
{
    var connectionFactory = new Func<SQLite.Net.SQLiteConnectionWithLock> (() =>
        new SQLite.Net.SQLiteConnectionWithLock (new SQLitePlatformWinRT (),
            new SQLite.Net.SQLiteConnectionString (DbPath, storeDateTimeAsTicks: false)));
    var asyncConnection = new SQLiteAsyncConnection (connectionFactory);
    return asyncConnection;
}
Then, for instance, read a table of type Parms:
public async Task<Parms> ReadParmsAsync ()
{
    var db = GetConnectionAsync ();
    var query = db.Table<Parms> ().Where (p => p.Id == 1);
    return await query.FirstOrDefaultAsync ();
}
My concern about the SQLite async connection is that it is not IDisposable. Therefore, will the 'factory' eventually run out of steam (memory, handles)? But I guess that is a subject for another thread.
We need to build a .NET web application that uploads files to an Amazon S3 bucket through the app's admin panel; clients will then be given downloadable files through a client.aspx page.
We looked at a few examples and got confused by some of the sample code for downloading non-public files from S3 storage. One such example is below:
AmazonS3Config config = new AmazonS3Config()
{
    RegionEndpoint = RegionEndpoint.USEast1
};
IAmazonS3 client = new AmazonS3Client(accessKey, secretKey, config);
string dest = System.IO.Path.GetTempPath() + "event.mp4";

using (client)
{
    GetObjectRequest request = new GetObjectRequest() { BucketName = "bucketname" + @"/" + "videos2015", Key = "event.mp4" };
    using (GetObjectResponse response = client.GetObject(request))
    {
        response.WriteResponseStreamToFile(dest);
    }
}

Response.Clear();
Response.AppendHeader("content-disposition", "attachment; filename=" + "dynamic_filename.png");
Response.ContentType = "application/octet-stream";
Response.TransmitFile(dest);
Response.Flush();
Response.End();
When the user clicks the link, the code above executes on the web server: it downloads the file to the web server and then serves that same file to the client, if I am not wrong. Is there not a way to serve the file for download directly from AWS S3 storage?
In the above case it wastes server resources and also increases the download time.
Our files on AWS are not public, so their URLs are not directly accessible from client browsers as they are for public content.
The pre-signed URLs are indeed what you are looking for. Since you are using C#, here is a link to some useful code examples:
http://docs.aws.amazon.com/AmazonS3/latest/dev/ShareObjectPreSignedURLDotNetSDK.html
There is no need to upload files to S3 through your web server; they can be sent directly. Same thing on the download: download directly from S3 rather than copying files to EC2 first, which would waste bandwidth and processing resources.
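A minimal sketch of generating a pre-signed download URL with the .NET SDK (bucket, key, and credentials are placeholders):

var client = new AmazonS3Client(accessKey, secretKey, RegionEndpoint.USEast1);
var request = new GetPreSignedUrlRequest
{
    BucketName = "bucketname",
    Key = "videos2015/event.mp4",
    Expires = DateTime.UtcNow.AddMinutes(15) // the link stops working after this
};
string url = client.GetPreSignedURL(request);
// hand this URL to the browser; the download then comes straight from S3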
You can use the Minio-dotnet client library. It is open source and supports the S3-compatible API.
Here is an example for PresignedPostPolicy:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Minio;

namespace Minio.Examples
{
    class PresignedPostPolicy
    {
        static int Main()
        {
            /// Note: YOUR-ACCESSKEYID, YOUR-SECRETACCESSKEY, my-bucketname and
            /// my-objectname are dummy values, please replace them with original values.
            var client = new MinioClient("s3.amazonaws.com", "YOUR-ACCESSKEYID", "YOUR-SECRETACCESSKEY");

            PostPolicy form = new PostPolicy();
            DateTime expiration = DateTime.UtcNow;
            form.SetExpires(expiration.AddDays(10));
            form.SetKey("my-objectname");
            form.SetBucket("my-bucketname");

            Dictionary<string, string> formData = client.PresignedPostPolicy(form);
            string curlCommand = "curl ";
            foreach (KeyValuePair<string, string> pair in formData)
            {
                curlCommand = curlCommand + " -F " + pair.Key + "=" + pair.Value;
            }
            curlCommand = curlCommand + " -F file=@/etc/bashrc https://s3.amazonaws.com/my-bucketname";

            Console.Out.WriteLine(curlCommand);
            return 0;
        }
    }
}
And here is one for PresignedPutObject:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Minio;

namespace Minio.Examples
{
    class PresignedPutObject
    {
        static int Main()
        {
            /// Note: YOUR-ACCESSKEYID, YOUR-SECRETACCESSKEY, my-bucketname and
            /// my-objectname are dummy values, please replace them with original values.
            var client = new MinioClient("s3.amazonaws.com", "YOUR-ACCESSKEYID", "YOUR-SECRETACCESSKEY");
            Console.Out.WriteLine(client.PresignedPutObject("my-bucketname", "my-objectname", 1000));
            return 0;
        }
    }
}
Hope it helps.
PS: I work for Minio