I need to upload a file while sending extra parameters.
I have found the following post on Stack Overflow: Webapi ajax formdata upload with extra parameters
It describes how to do this using MultipartFormDataStreamProvider and saving the data to a file server. I do not need to save the file to the server, but to the DB instead.
I already have working code using MultipartMemoryStreamProvider, but it doesn't use the extra parameter.
Can you give me clues on how to process extra parameters in Web API?
For example, if I add the file and also a test parameter:
data.append("myParameter", "test");
Here is my Web API code that processes the file upload without the extra parameter:
if (Request.Content.IsMimeMultipartContent())
{
var streamProvider = new MultipartMemoryStreamProvider();
var task = Request.Content.ReadAsMultipartAsync(streamProvider).ContinueWith<IEnumerable<FileModel>>(t =>
{
if (t.IsFaulted || t.IsCanceled)
{
throw new HttpResponseException(HttpStatusCode.InternalServerError);
}
_fleDataService = new FileDataBLL();
FileData fle;
var fleInfo = streamProvider.Contents.Select(i => {
fle = new FileData();
fle.FileName = i.Headers.ContentDisposition.FileName;
var contentTest = i.ReadAsByteArrayAsync();
contentTest.Wait();
if (contentTest.Result != null)
{
fle.FileContent = contentTest.Result;
}
// get extra parameters here ??????
_fleDataService.Save(fle);
return new FileModel(i.Headers.ContentDisposition.FileName, 1024); //todo
});
return fleInfo;
});
return task;
}
Expanding on gooid's answer, I encapsulated the FormData extraction in the provider because I was having issues with the values being quoted. In my opinion this just makes for a cleaner implementation.
public class MultipartFormDataMemoryStreamProvider : MultipartMemoryStreamProvider
{
private readonly Collection<bool> _isFormData = new Collection<bool>();
private readonly NameValueCollection _formData = new NameValueCollection(StringComparer.OrdinalIgnoreCase);
private readonly Dictionary<string, Stream> _fileStreams = new Dictionary<string, Stream>();
public NameValueCollection FormData
{
get { return _formData; }
}
public Dictionary<string, Stream> FileStreams
{
get { return _fileStreams; }
}
public override Stream GetStream(HttpContent parent, HttpContentHeaders headers)
{
if (parent == null)
{
throw new ArgumentNullException("parent");
}
if (headers == null)
{
throw new ArgumentNullException("headers");
}
var contentDisposition = headers.ContentDisposition;
if (contentDisposition == null)
{
throw new InvalidOperationException("Did not find required 'Content-Disposition' header field in MIME multipart body part.");
}
_isFormData.Add(String.IsNullOrEmpty(contentDisposition.FileName));
return base.GetStream(parent, headers);
}
public override async Task ExecutePostProcessingAsync()
{
for (var index = 0; index < Contents.Count; index++)
{
HttpContent formContent = Contents[index];
if (_isFormData[index])
{
// Field
string formFieldName = UnquoteToken(formContent.Headers.ContentDisposition.Name) ?? string.Empty;
string formFieldValue = await formContent.ReadAsStringAsync();
FormData.Add(formFieldName, formFieldValue);
}
else
{
// File
string fileName = UnquoteToken(formContent.Headers.ContentDisposition.FileName);
Stream stream = await formContent.ReadAsStreamAsync();
FileStreams.Add(fileName, stream);
}
}
}
private static string UnquoteToken(string token)
{
if (string.IsNullOrWhiteSpace(token))
{
return token;
}
if (token.StartsWith("\"", StringComparison.Ordinal) && token.EndsWith("\"", StringComparison.Ordinal) && token.Length > 1)
{
return token.Substring(1, token.Length - 2);
}
return token;
}
}
And here's how I'm using it. Note that I used await since we're on .NET 4.5.
[HttpPost]
public async Task<HttpResponseMessage> Upload()
{
if (!Request.Content.IsMimeMultipartContent())
{
return Request.CreateResponse(HttpStatusCode.UnsupportedMediaType, "Unsupported media type.");
}
// Read the file and form data.
MultipartFormDataMemoryStreamProvider provider = new MultipartFormDataMemoryStreamProvider();
await Request.Content.ReadAsMultipartAsync(provider);
// Extract the fields from the form data.
string description = provider.FormData["description"];
int uploadType;
if (!Int32.TryParse(provider.FormData["uploadType"], out uploadType))
{
return Request.CreateResponse(HttpStatusCode.BadRequest, "Upload Type is invalid.");
}
// Check if files are on the request.
if (!provider.FileStreams.Any())
{
return Request.CreateResponse(HttpStatusCode.BadRequest, "No file uploaded.");
}
IList<string> uploadedFiles = new List<string>();
foreach (KeyValuePair<string, Stream> file in provider.FileStreams)
{
string fileName = file.Key;
Stream stream = file.Value;
// Do something with the uploaded file
UploadManager.Upload(stream, fileName, uploadType, description);
// Keep track of the filename for the response
uploadedFiles.Add(fileName);
}
return Request.CreateResponse(HttpStatusCode.OK, "Successfully Uploaded: " + string.Join(", ", uploadedFiles));
}
You can achieve this in a not-so-very-clean manner by implementing a custom stream provider that duplicates the logic MultipartFormDataStreamProvider uses to parse FormData out of multipart content.
I'm not quite sure why the decision was made to subclass MultipartFormDataStreamProvider from MultipartFileStreamProvider without at least extracting the code that identifies and exposes the FormData collection, since it is useful for many tasks involving multipart data beyond simply saving a file to disk.
Anyway, the following provider should help solve your issue. You will still need to ensure that when you iterate the provider contents (specifically the streamProvider.Contents.Select() statement) you ignore anything that does not have a filename, or else you risk trying to upload the form data to the DB. Hence the code that asks the provider whether an HttpContent is a stream via IsStream(); this is a bit of a hack, but it was the simplest way I could think of to do it.
Note that it is basically a cut-and-paste hatchet job from the source of MultipartFormDataStreamProvider - it has not been rigorously tested (inspired by this answer).
public class MultipartFormDataMemoryStreamProvider : MultipartMemoryStreamProvider
{
private readonly Collection<bool> _isFormData = new Collection<bool>();
private readonly NameValueCollection _formData = new NameValueCollection(StringComparer.OrdinalIgnoreCase);
public NameValueCollection FormData
{
get { return _formData; }
}
public override Stream GetStream(HttpContent parent, HttpContentHeaders headers)
{
if (parent == null) throw new ArgumentNullException("parent");
if (headers == null) throw new ArgumentNullException("headers");
var contentDisposition = headers.ContentDisposition;
if (contentDisposition != null)
{
_isFormData.Add(String.IsNullOrEmpty(contentDisposition.FileName));
return base.GetStream(parent, headers);
}
throw new InvalidOperationException("Did not find required 'Content-Disposition' header field in MIME multipart body part.");
}
public override async Task ExecutePostProcessingAsync()
{
for (var index = 0; index < Contents.Count; index++)
{
if (IsStream(index))
continue;
var formContent = Contents[index];
var contentDisposition = formContent.Headers.ContentDisposition;
var formFieldName = UnquoteToken(contentDisposition.Name) ?? string.Empty;
var formFieldValue = await formContent.ReadAsStringAsync();
FormData.Add(formFieldName, formFieldValue);
}
}
private static string UnquoteToken(string token)
{
if (string.IsNullOrWhiteSpace(token))
return token;
if (token.StartsWith("\"", StringComparison.Ordinal) && token.EndsWith("\"", StringComparison.Ordinal) && token.Length > 1)
return token.Substring(1, token.Length - 2);
return token;
}
public bool IsStream(int idx)
{
return !_isFormData[idx];
}
}
It can be used as follows (using TPL syntax to match your question):
[HttpPost]
public Task<string> Post()
{
if (!Request.Content.IsMimeMultipartContent())
throw new HttpResponseException(Request.CreateResponse(HttpStatusCode.NotAcceptable, "Invalid Request!"));
var provider = new MultipartFormDataMemoryStreamProvider();
return Request.Content.ReadAsMultipartAsync(provider).ContinueWith(p =>
{
var result = p.Result;
var myParameter = result.FormData.GetValues("myParameter").FirstOrDefault();
foreach (var stream in result.Contents.Where((content, idx) => result.IsStream(idx)))
{
var file = new FileData(stream.Headers.ContentDisposition.FileName);
var contentTest = stream.ReadAsByteArrayAsync();
// ... and so on, as per your original code.
}
return myParameter;
});
}
I tested it with the following HTML form:
<form action="/api/values" method="post" enctype="multipart/form-data">
<input name="myParameter" type="hidden" value="i dont do anything interesting"/>
<input type="file" name="file1" />
<input type="file" name="file2" />
<input type="submit" value="OK" />
</form>
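For completeness (this is my addition, not part of the original answer), the same request can also be issued from C# using HttpClient and MultipartFormDataContent; the URL, field name and file path below are placeholders that mirror the form above:
static async Task UploadAsync()
{
    // Requires System.IO, System.Net.Http, System.Net.Http.Headers and System.Threading.Tasks.
    using (var client = new HttpClient())
    using (var form = new MultipartFormDataContent())
    {
        // Extra form field, matching data.append("myParameter", "test") from the question.
        form.Add(new StringContent("test"), "myParameter");

        // File part; the part name and filename mirror the "file1" input of the HTML form above.
        var fileContent = new ByteArrayContent(File.ReadAllBytes(@"C:\temp\example.txt"));
        fileContent.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
        form.Add(fileContent, "file1", "example.txt");

        var response = await client.PostAsync("http://localhost/api/values", form);
        Console.WriteLine(response.StatusCode);
    }
}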
I really needed the media type and length of the uploaded files, so I modified Mark Seefeldt's answer slightly to the following:
public class MultipartFormFile
{
public string Name { get; set; }
public long? Length { get; set; }
public string MediaType { get; set; }
public Stream Stream { get; set; }
}
public class MultipartFormDataMemoryStreamProvider : MultipartMemoryStreamProvider
{
private readonly Collection<bool> _isFormData = new Collection<bool>();
private readonly NameValueCollection _formData = new NameValueCollection(StringComparer.OrdinalIgnoreCase);
private readonly List<MultipartFormFile> _fileStreams = new List<MultipartFormFile>();
public NameValueCollection FormData
{
get { return _formData; }
}
public List<MultipartFormFile> FileStreams
{
get { return _fileStreams; }
}
public override Stream GetStream(HttpContent parent, HttpContentHeaders headers)
{
if (parent == null)
{
throw new ArgumentNullException("parent");
}
if (headers == null)
{
throw new ArgumentNullException("headers");
}
var contentDisposition = headers.ContentDisposition;
if (contentDisposition == null)
{
throw new InvalidOperationException("Did not find required 'Content-Disposition' header field in MIME multipart body part.");
}
_isFormData.Add(String.IsNullOrEmpty(contentDisposition.FileName));
return base.GetStream(parent, headers);
}
public override async Task ExecutePostProcessingAsync()
{
for (var index = 0; index < Contents.Count; index++)
{
HttpContent formContent = Contents[index];
if (_isFormData[index])
{
// Field
string formFieldName = UnquoteToken(formContent.Headers.ContentDisposition.Name) ?? string.Empty;
string formFieldValue = await formContent.ReadAsStringAsync();
FormData.Add(formFieldName, formFieldValue);
}
else
{
// File
var file = new MultipartFormFile
{
Name = UnquoteToken(formContent.Headers.ContentDisposition.FileName),
Length = formContent.Headers.ContentLength,
MediaType = formContent.Headers.ContentType.MediaType,
Stream = await formContent.ReadAsStreamAsync()
};
FileStreams.Add(file);
}
}
}
private static string UnquoteToken(string token)
{
if (string.IsNullOrWhiteSpace(token))
{
return token;
}
if (token.StartsWith("\"", StringComparison.Ordinal) && token.EndsWith("\"", StringComparison.Ordinal) && token.Length > 1)
{
return token.Substring(1, token.Length - 2);
}
return token;
}
}
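A usage sketch for this variant (my addition, assuming the same controller shape as the Upload action shown earlier):
MultipartFormDataMemoryStreamProvider provider = new MultipartFormDataMemoryStreamProvider();
await Request.Content.ReadAsMultipartAsync(provider);

string description = provider.FormData["description"];

foreach (MultipartFormFile file in provider.FileStreams)
{
    // Each entry now carries the metadata the original provider discarded.
    Console.WriteLine("{0} ({1}, {2} bytes)", file.Name, file.MediaType, file.Length);
    // ... hand file.Stream to your own save/upload logic here.
}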
Ultimately, the following was what worked for me:
string root = HttpContext.Current.Server.MapPath("~/App_Data");
var provider = new MultipartFormDataStreamProvider(root);
var filesReadToProvider = await Request.Content.ReadAsMultipartAsync(provider);
foreach (var file in provider.FileData)
{
var fileName = file.Headers.ContentDisposition.FileName.Replace("\"", string.Empty);
byte[] documentData;
documentData = File.ReadAllBytes(file.LocalFileName);
DAL.Document newRecord = new DAL.Document
{
PathologyRequestId = PathologyRequestId,
FileName = fileName,
DocumentData = documentData,
CreatedById = ApplicationSecurityDirector.CurrentUserGuid,
CreatedDate = DateTime.Now,
UpdatedById = ApplicationSecurityDirector.CurrentUserGuid,
UpdatedDate = DateTime.Now
};
context.Documents.Add(newRecord);
context.SaveChanges();
}
I am uploading files to Google Drive following these instructions. In the File object I just set the name, like this:
{
"name": "myObjectName"
}
The files are uploading without problems. Now I need to generate a shared link for each uploaded file. Do you know which request I have to make?
Thanks,
With Jacques-Guzel Heron's help, I completed the file upload process. I created a wrapper for the Google Drive API v3. Here is my code in case anyone needs to do something similar (I am using C#):
public interface IGDriveApiV3Wrapper
{
string UploadFile(string filePath, string gDriveUploadDestinationFolderId = null);
bool SetFilePermissions(string fileId, GDriveFileRole gDriveRole, GDriveFileType gDriveType);
GDriveFile GetFileInfo(string fileId);
}
public class GDriveApiV3NativeWrapper : IGDriveApiV3Wrapper
{
private const string GDriveFilesApiResumablePath = "https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable";
private const string GDriveTokenApiPath = "https://oauth2.googleapis.com/token";
private static readonly HttpClient GDriveClient = new HttpClient { Timeout = Timeout.InfiniteTimeSpan };
private readonly List<KeyValuePair<string, string>> _getTokenRequestContent;
private static GDriveTokenInfo _gDriveTokenInfo;
private static readonly object UpdateGDriveTokenInfoLocker = new object();
public GDriveApiV3NativeWrapper(string gDriveApiClientId, string gDriveApiClientSecret, string gDriveApiRefreshToken)
{
_getTokenRequestContent = new List<KeyValuePair<string, string>>
{
new KeyValuePair<string, string>("client_id", gDriveApiClientId),
new KeyValuePair<string, string>("client_secret", gDriveApiClientSecret),
new KeyValuePair<string, string>("refresh_token", gDriveApiRefreshToken),
new KeyValuePair<string, string>("grant_type", "refresh_token")
};
}
public string UploadFile(string filePath, string gDriveUploadDestinationFolderId = null)
{
if (string.IsNullOrEmpty(filePath))
throw new ArgumentException("Value cannot be null or empty.", nameof(filePath));
FileInfo fileInfo;
try
{
fileInfo = new FileInfo(filePath);
}
catch (Exception ex)
{
throw new ArgumentException("File not valid.", nameof(filePath), ex);
}
if (!fileInfo.Exists)
throw new ArgumentException("File not exists.", nameof(filePath));
using (var initiateResumableUploadSessionRequest = new HttpRequestMessage(HttpMethod.Post, GDriveFilesApiResumablePath))
{
UpdateGDriveTokenInfo();
initiateResumableUploadSessionRequest.Headers.Authorization = new AuthenticationHeaderValue(_gDriveTokenInfo.TokenType, _gDriveTokenInfo.AccessToken);
var jsonContent = new JObject(
new JProperty("name", fileInfo.Name));
if (!string.IsNullOrEmpty(gDriveUploadDestinationFolderId))
{
jsonContent.Add(new JProperty("parents", new JArray { gDriveUploadDestinationFolderId }));
}
initiateResumableUploadSessionRequest.Content = new StringContent(jsonContent.ToString(), Encoding.UTF8, "application/json");
var initiateResumableUploadSessionResponse = GDriveClient.SendAsync(initiateResumableUploadSessionRequest).Result;
if (initiateResumableUploadSessionResponse.StatusCode != HttpStatusCode.OK)
throw new ExternalException(initiateResumableUploadSessionResponse.ToString());
using (var uploadFileRequest = new HttpRequestMessage(HttpMethod.Put, initiateResumableUploadSessionResponse.Headers.Location))
{
uploadFileRequest.Content = new ByteArrayContent(File.ReadAllBytes(filePath));
HttpResponseMessage uploadFileResponse;
uploadFileResponse = GDriveClient.SendAsync(uploadFileRequest).Result;
if (uploadFileResponse.StatusCode != HttpStatusCode.OK && uploadFileResponse.StatusCode != HttpStatusCode.Created)
throw new ExternalException(uploadFileResponse.ReasonPhrase);
var uploadFileResponseBody = uploadFileResponse.Content.ReadAsStringAsync().Result;
JObject uploadFileResponseJson = JObject.Parse(uploadFileResponseBody);
return uploadFileResponseJson["id"].ToString();
}
}
}
public bool SetFilePermissions(string fileId, GDriveFileRole gDriveFileRole, GDriveFileType gDriveFileType)
{
if (string.IsNullOrEmpty(fileId))
throw new ArgumentException("Value cannot be null or empty.", nameof(fileId));
using (var setFilePermissionsRequest = new HttpRequestMessage(HttpMethod.Post, $"https://www.googleapis.com/drive/v3/files/{fileId}/permissions"))
{
UpdateGDriveTokenInfo();
setFilePermissionsRequest.Headers.Authorization = new AuthenticationHeaderValue(_gDriveTokenInfo.TokenType, _gDriveTokenInfo.AccessToken);
var jsonContent2 = new JObject(
new JProperty("role", gDriveFileRole.ToString().ToLower()),
new JProperty("type", gDriveFileType.ToString().ToLower()));
setFilePermissionsRequest.Content = new StringContent(jsonContent2.ToString(), Encoding.UTF8, "application/json");
HttpResponseMessage setFilePermissionsResponse = GDriveClient.SendAsync(setFilePermissionsRequest).Result;
if (setFilePermissionsResponse.StatusCode != HttpStatusCode.OK)
throw new ExternalException(setFilePermissionsResponse.ToString());
}
return true;
}
public GDriveFile GetFileInfo(string fileId)
{
using (var getFileInfoRequest = new HttpRequestMessage(HttpMethod.Get, $"https://www.googleapis.com/drive/v3/files/{fileId}?fields=name,webViewLink"))
{
UpdateGDriveTokenInfo();
getFileInfoRequest.Headers.Authorization = new AuthenticationHeaderValue(_gDriveTokenInfo.TokenType, _gDriveTokenInfo.AccessToken);
HttpResponseMessage getFileInfoResponse = GDriveClient.SendAsync(getFileInfoRequest).Result;
if (getFileInfoResponse.StatusCode != HttpStatusCode.OK)
throw new ExternalException(getFileInfoResponse.ToString());
var getFileInfoResponseBody = getFileInfoResponse.Content.ReadAsStringAsync().Result;
JObject getFileInfoResponseJson = JObject.Parse(getFileInfoResponseBody);
return new GDriveFile
{
Id = fileId,
Name = getFileInfoResponseJson["name"].ToString(),
WebViewLink = getFileInfoResponseJson["webViewLink"].ToString()
};
}
}
private void UpdateGDriveTokenInfo()
{
lock (UpdateGDriveTokenInfoLocker)
{
if (_gDriveTokenInfo != null && !_gDriveTokenInfo.IsExpired())
{
return;
}
using (var refreshTokenRequest = new HttpRequestMessage(HttpMethod.Post, GDriveTokenApiPath))
{
refreshTokenRequest.Content = new FormUrlEncodedContent(_getTokenRequestContent);
var getTokenRequestResponse = GDriveClient.SendAsync(refreshTokenRequest).Result;
var jsonResponse = JObject.Parse(getTokenRequestResponse.Content.ReadAsStringAsync().Result);
_gDriveTokenInfo = new GDriveTokenInfo((string)jsonResponse["access_token"], (int)jsonResponse["expires_in"], (string)jsonResponse["token_type"]);
}
}
}
public class GDriveFile
{
public string Id { get; set; }
public string Name { get; set; }
public string WebViewLink { get; set; }
}
public enum GDriveFileRole
{
Owner,
Organizer,
FileOrganizer,
Writer,
Commenter,
Reader
}
public enum GDriveFileType
{
User,
Group,
Domain,
Anyone
}
public class Program
{
private static IGDriveApiV3Wrapper _gDriveApiV3Wrapper;
private readonly string _gDriveUploadDestinationFolderId;
public Program(IGDriveApiV3Wrapper gDriveApiV3Wrapper, string gDriveUploadDestinationFolderId = null)
{
_gDriveApiV3Wrapper = gDriveApiV3Wrapper;
_gDriveUploadDestinationFolderId = gDriveUploadDestinationFolderId;
}
public string Upload(string filePath)
{
string fileId = _gDriveApiV3Wrapper.UploadFile(filePath, _gDriveUploadDestinationFolderId);
_gDriveApiV3Wrapper.SetFilePermissions(fileId, GDriveFileRole.Reader, GDriveFileType.Anyone);
GDriveFile gDriveFile = _gDriveApiV3Wrapper.GetFileInfo(fileId);
return gDriveFile.WebViewLink;
}
}
I'll assume that you are using Drive API v3. To set up permissions after uploading the file you have to use the permissions create method. You can check the sharing documentation to learn more about the different ways of creating permissions and their different levels of access. After sharing the file, you can retrieve the link with the webViewLink property of the file. If you still have any doubts, please ask me for further clarification.
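As a rough illustration (my addition, not part of the original answer), a minimal sketch of those two calls with HttpClient could look like this, assuming fileId and accessToken come from the upload step and Newtonsoft.Json is available:
// Requires System.Net.Http, System.Net.Http.Headers, System.Text and Newtonsoft.Json.Linq.
using (var client = new HttpClient())
{
    client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

    // 1. Share the file: permissions.create with role=reader, type=anyone.
    var permissionBody = new StringContent("{\"role\":\"reader\",\"type\":\"anyone\"}", Encoding.UTF8, "application/json");
    var permissionResponse = client.PostAsync($"https://www.googleapis.com/drive/v3/files/{fileId}/permissions", permissionBody).Result;
    permissionResponse.EnsureSuccessStatusCode();

    // 2. Read the shareable link back: files.get with fields=webViewLink.
    var fileJson = client.GetStringAsync($"https://www.googleapis.com/drive/v3/files/{fileId}?fields=webViewLink").Result;
    string webViewLink = (string)JObject.Parse(fileJson)["webViewLink"];
}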
I don't see the code for the GDriveTokenInfo class.
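That class isn't included in the answer above. Based on how it is used (a constructor taking the access token, expires_in and token type, plus AccessToken, TokenType and IsExpired()), a minimal sketch might look like this; the 30-second refresh margin is my own assumption:
public class GDriveTokenInfo
{
    // Refresh slightly before the token actually expires (assumed safety margin).
    private static readonly TimeSpan ExpirationMargin = TimeSpan.FromSeconds(30);
    private readonly DateTime _expiresAtUtc;

    public string AccessToken { get; private set; }
    public string TokenType { get; private set; }

    public GDriveTokenInfo(string accessToken, int expiresInSeconds, string tokenType)
    {
        AccessToken = accessToken;
        TokenType = tokenType;
        _expiresAtUtc = DateTime.UtcNow.AddSeconds(expiresInSeconds);
    }

    public bool IsExpired()
    {
        return DateTime.UtcNow >= _expiresAtUtc - ExpirationMargin;
    }
}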
I have a Xamarin application and have managed to download my data from my server to my device. I have also set it up so that it can take a SQLCipher encryption key to encrypt the data.
My question is: where is the correct location to store the key that I use to encrypt this data? Is it in the KeyStore / KeyChain? Which Mono classes should I be looking to use?
Due to the popularity of this question I am going to post my implementation of this:
PCL interface
public interface IAuth
{
void CreateStore();
IEnumerable<string> FindAccountsForService(string serviceId);
void Save(string pin,string serviceId);
void Delete(string serviceId);
}
Android
public class IAuthImplementation : IAuth
{
Context context;
KeyStore ks;
KeyStore.PasswordProtection prot;
static readonly object fileLock = new object();
const string FileName = "MyProg.Accounts";
static readonly char[] Password = null;
public void CreateStore()
{
this.context = Android.App.Application.Context;
ks = KeyStore.GetInstance(KeyStore.DefaultType);
prot = new KeyStore.PasswordProtection(Password);
try
{
lock (fileLock)
{
using (var s = context.OpenFileInput(FileName))
{
ks.Load(s, Password);
}
}
}
catch (Java.IO.FileNotFoundException)
{
//ks.Load (null, Password);
LoadEmptyKeyStore(Password);
}
}
public IEnumerable<string> FindAccountsForService(string serviceId)
{
var r = new List<string>();
var postfix = "-" + serviceId;
var aliases = ks.Aliases();
while (aliases.HasMoreElements)
{
var alias = aliases.NextElement().ToString();
if (alias.EndsWith(postfix))
{
var e = ks.GetEntry(alias, prot) as KeyStore.SecretKeyEntry;
if (e != null)
{
var bytes = e.SecretKey.GetEncoded();
var password = System.Text.Encoding.UTF8.GetString(bytes);
r.Add(password);
}
}
}
return r;
}
public void Delete(string serviceId)
{
var alias = MakeAlias(serviceId);
ks.DeleteEntry(alias);
Save();
}
public void Save(string pin, string serviceId)
{
var alias = MakeAlias(serviceId);
var secretKey = new SecretAccount(pin);
var entry = new KeyStore.SecretKeyEntry(secretKey);
ks.SetEntry(alias, entry, prot);
Save();
}
void Save()
{
lock (fileLock)
{
using (var s = context.OpenFileOutput(FileName, FileCreationMode.Private))
{
ks.Store(s, Password);
}
}
}
static string MakeAlias(string serviceId)
{
return "-" + serviceId;
}
class SecretAccount : Java.Lang.Object, ISecretKey
{
byte[] bytes;
public SecretAccount(string password)
{
bytes = System.Text.Encoding.UTF8.GetBytes(password);
}
public byte[] GetEncoded()
{
return bytes;
}
public string Algorithm
{
get
{
return "RAW";
}
}
public string Format
{
get
{
return "RAW";
}
}
}
static IntPtr id_load_Ljava_io_InputStream_arrayC;
void LoadEmptyKeyStore(char[] password)
{
if (id_load_Ljava_io_InputStream_arrayC == IntPtr.Zero)
{
id_load_Ljava_io_InputStream_arrayC = JNIEnv.GetMethodID(ks.Class.Handle, "load", "(Ljava/io/InputStream;[C)V");
}
IntPtr intPtr = IntPtr.Zero;
IntPtr intPtr2 = JNIEnv.NewArray(password);
JNIEnv.CallVoidMethod(ks.Handle, id_load_Ljava_io_InputStream_arrayC, new JValue[]
{
new JValue (intPtr),
new JValue (intPtr2)
});
JNIEnv.DeleteLocalRef(intPtr);
if (password != null)
{
JNIEnv.CopyArray(intPtr2, password);
JNIEnv.DeleteLocalRef(intPtr2);
}
}
Call CreateStore in the main activity of the Android app first. This could possibly be improved (and CreateStore() removed from the interface) by checking whether ks == null in Save and Delete and calling the method if so, as sketched below.
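For illustration only (not from the original answer), that guard could look roughly like this inside the Android implementation:
// Hypothetical helper: lazily initialises the key store so callers no longer
// have to invoke CreateStore() themselves.
void EnsureStore()
{
    if (ks == null)
    {
        CreateStore();
    }
}

public void Save(string pin, string serviceId)
{
    EnsureStore();                                   // guard added here
    var alias = MakeAlias(serviceId);
    var entry = new KeyStore.SecretKeyEntry(new SecretAccount(pin));
    ks.SetEntry(alias, entry, prot);
    Save();
}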
iOS
public class IAuthImplementation : IAuth
{
public IEnumerable<string> FindAccountsForService(string serviceId)
{
var query = new SecRecord(SecKind.GenericPassword);
query.Service = serviceId;
SecStatusCode result;
var records = SecKeyChain.QueryAsRecord(query, 1000, out result);
return records != null ?
records.Select(GetAccountFromRecord).ToList() :
new List<string>();
}
public void Save(string pin, string serviceId)
{
var statusCode = SecStatusCode.Success;
var serializedAccount = pin;
var data = NSData.FromString(serializedAccount, NSStringEncoding.UTF8);
//
// Remove any existing record
//
var existing = FindAccount(serviceId);
if (existing != null)
{
var query = new SecRecord(SecKind.GenericPassword);
query.Service = serviceId;
statusCode = SecKeyChain.Remove(query);
if (statusCode != SecStatusCode.Success)
{
throw new Exception("Could not save account to KeyChain: " + statusCode);
}
}
//
// Add this record
//
var record = new SecRecord(SecKind.GenericPassword);
record.Service = serviceId;
record.Generic = data;
record.Accessible = SecAccessible.WhenUnlocked;
statusCode = SecKeyChain.Add(record);
if (statusCode != SecStatusCode.Success)
{
throw new Exception("Could not save account to KeyChain: " + statusCode);
}
}
public void Delete(string serviceId)
{
var query = new SecRecord(SecKind.GenericPassword);
query.Service = serviceId;
var statusCode = SecKeyChain.Remove(query);
if (statusCode != SecStatusCode.Success)
{
throw new Exception("Could not delete account from KeyChain: " + statusCode);
}
}
string GetAccountFromRecord(SecRecord r)
{
return NSString.FromData(r.Generic, NSStringEncoding.UTF8);
}
string FindAccount(string serviceId)
{
var query = new SecRecord(SecKind.GenericPassword);
query.Service = serviceId;
SecStatusCode result;
var record = SecKeyChain.QueryAsRecord(query, out result);
return record != null ? GetAccountFromRecord(record) : null;
}
public void CreateStore()
{
throw new NotImplementedException();
}
}
WP
public class IAuthImplementation : IAuth
{
public IEnumerable<string> FindAccountsForService(string serviceId)
{
using (var store = IsolatedStorageFile.GetUserStoreForApplication())
{
string[] auths = store.GetFileNames("MyProg");
foreach (string path in auths)
{
using (var stream = new BinaryReader(new IsolatedStorageFileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read, store)))
{
int length = stream.ReadInt32();
byte[] data = stream.ReadBytes(length);
byte[] unprot = ProtectedData.Unprotect(data, null);
yield return Encoding.UTF8.GetString(unprot, 0, unprot.Length);
}
}
}
}
public void Save(string pin, string serviceId)
{
byte[] data = Encoding.UTF8.GetBytes(pin);
byte[] prot = ProtectedData.Protect(data, null);
var path = GetAccountPath(serviceId);
using (var store = IsolatedStorageFile.GetUserStoreForApplication())
using (var stream = new IsolatedStorageFileStream(path, FileMode.Create, FileAccess.Write, FileShare.None, store))
{
stream.WriteAsync(BitConverter.GetBytes(prot.Length), 0, sizeof(int)).Wait();
stream.WriteAsync(prot, 0, prot.Length).Wait();
}
}
public void Delete(string serviceId)
{
var path = GetAccountPath(serviceId);
using (var store = IsolatedStorageFile.GetUserStoreForApplication())
{
store.DeleteFile(path);
}
}
private string GetAccountPath(string serviceId)
{
return String.Format("{0}", serviceId);
}
public void CreateStore()
{
throw new NotImplementedException();
}
}
This is an adaptation of the Xamarin.Auth library (found here), but it removes the dependency on the Xamarin.Auth library to provide cross-platform use through the interface in the PCL. For this reason I have simplified it to only save one string. This is probably not the best implementation, but it works in my case. Feel free to expand upon it.
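As a rough usage sketch from shared code (my addition; ServiceId and GenerateKey() are hypothetical, and 'auth' is an IAuth resolved for the current platform, e.g. via your IoC container or Xamarin.Forms DependencyService):
const string ServiceId = "MyAppDbKey";                                   // hypothetical identifier

// On Android, CreateStore() must have been called first (e.g. in the main activity);
// the iOS/WP implementations above do not need it.
string key = auth.FindAccountsForService(ServiceId).FirstOrDefault();   // needs System.Linq
if (key == null)
{
    key = GenerateKey();                                                 // hypothetical SQLCipher key generator
    auth.Save(key, ServiceId);
}
// 'key' can now be passed to SQLCipher when opening the encrypted database.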
There is a NuGet package called KeyChain.NET that encapsulates this logic for iOS, Android and Windows Phone.
It's open source and you can find samples in its GitHub repository.
More info in this blog post.
I have this service class:
public delegate string AsyncMethodCaller(string id, HttpPostedFileBase file);
public class ObjectService : IDisposable
{
private readonly IObjectRepository repository;
private readonly IAmazonS3 client;
private readonly string bucketName;
private static object syncRoot = new object();
private static IDictionary<string, int> processStatus { get; set; }
public ObjectService(string accessKey, string secretKey, string bucketName)
{
var credentials = new BasicAWSCredentials(accessKey, secretKey);
this.bucketName = bucketName;
this.client = new AmazonS3Client(credentials, RegionEndpoint.EUWest1);
this.repository = new ObjectRepository(this.client, this.bucketName);
if (processStatus == null)
processStatus = new Dictionary<string, int>();
}
public IList<S3Object> GetAll()
{
return this.repository.GetAll();
}
public S3Object Get(string key)
{
return this.GetAll().Where(model => model.Key.Equals(key, StringComparison.OrdinalIgnoreCase)).SingleOrDefault();
}
/// <summary>
/// Note: You can upload objects of up to 5 GB in size in a single operation. For objects greater than 5 GB you must use the multipart upload API.
/// Using the multipart upload API you can upload objects up to 5 TB each. For more information, see http://docs.aws.amazon.com/AmazonS3/latest/dev/uploadobjusingmpu.html.
/// </summary>
/// <param name="id">Unique id for tracking the upload progress</param>
/// <param name="bucketName">The name of the bucket that the object is being uploaded to</param>
/// <param name="file">The file that will be uploaded</param>
/// <returns>The unique id</returns>
public string Upload(string id, HttpPostedFileBase file)
{
var reader = new BinaryReader(file.InputStream);
var data = reader.ReadBytes((int)file.InputStream.Length);
var stream = new MemoryStream(data);
var utility = new TransferUtility(client);
var request = new TransferUtilityUploadRequest()
{
BucketName = this.bucketName,
Key = file.FileName,
InputStream = stream
};
request.UploadProgressEvent += (sender, e) => request_UploadProgressEvent(sender, e, id);
utility.Upload(request);
return id;
}
private void request_UploadProgressEvent(object sender, UploadProgressArgs e, string id)
{
lock (syncRoot)
{
processStatus[id] = e.PercentDone;
}
}
public void Add(string id)
{
lock (syncRoot)
{
processStatus.Add(id, 0);
}
}
public void Remove(string id)
{
lock (syncRoot)
{
processStatus.Remove(id);
}
}
public int GetStatus(string id)
{
lock (syncRoot)
{
if (processStatus.Keys.Count(x => x == id) == 1)
{
return processStatus[id];
}
else
{
return 100;
}
}
}
public void Dispose()
{
this.repository.Dispose();
this.client.Dispose();
}
}
and my controller looks like this:
public class _UploadController : Controller
{
public void StartUpload(string id, HttpPostedFileBase file)
{
var bucketName = CompanyProvider.CurrentCompanyId();
using (var service = new ObjectService(ConfigurationManager.AppSettings["AWSAccessKey"], ConfigurationManager.AppSettings["AWSSecretKey"], bucketName))
{
service.Add(id);
var caller = new AsyncMethodCaller(service.Upload);
var result = caller.BeginInvoke(id, file, new AsyncCallback(CompleteUpload), caller);
}
}
public void CompleteUpload(IAsyncResult result)
{
var caller = (AsyncMethodCaller)result.AsyncState;
var id = caller.EndInvoke(result);
}
//
// GET: /_Upload/GetCurrentProgress
public JsonResult GetCurrentProgress(string id)
{
try
{
var bucketName = CompanyProvider.CurrentCompanyId();
this.ControllerContext.HttpContext.Response.AddHeader("cache-control", "no-cache");
using (var service = new ObjectService(ConfigurationManager.AppSettings["AWSAccessKey"], ConfigurationManager.AppSettings["AWSSecretKey"], bucketName))
{
return new JsonResult { Data = new { success = true, progress = service.GetStatus(id) } };
}
}
catch (Exception ex)
{
return new JsonResult { Data = new { success = false, error = ex.Message } };
}
}
}
Now, I have found that sometimes I get an ObjectDisposedException when trying to upload a file on this line: var data = reader.ReadBytes((int)file.InputStream.Length);. I read that I should not be using the using keyword because of the asynchronous calls, but the stream still seems to be getting disposed.
Can anyone tell me why?
Update 1
I have changed my controller to this:
private ObjectService service = new ObjectService(ConfigurationManager.AppSettings["AWSAccessKey"], ConfigurationManager.AppSettings["AWSSecretKey"], CompanyProvider.CurrentCompanyId());
public void StartUpload(string id, HttpPostedFileBase file)
{
service.Add(id);
var caller = new AsyncMethodCaller(service.Upload);
var result = caller.BeginInvoke(id, file, new AsyncCallback(CompleteUpload), caller);
}
public void CompleteUpload(IAsyncResult result)
{
var caller = (AsyncMethodCaller)result.AsyncState;
var id = caller.EndInvoke(result);
this.service.Dispose();
}
but I am still getting the error on the file.InputStream line.
Update 2
The problem seems to be with the BinaryReader.
I changed the code to look like this:
var inputStream = file.InputStream;
var i = inputStream.Length;
var n = (int)i;
using (var reader = new BinaryReader(inputStream))
{
var data = reader.ReadBytes(n);
var stream = new MemoryStream(data);
var request = new TransferUtilityUploadRequest()
{
BucketName = this.bucketName,
Key = file.FileName,
InputStream = stream
};
try
{
request.UploadProgressEvent += (sender, e) => request_UploadProgressEvent(sender, e, id);
utility.Upload(request);
}
catch
{
file.InputStream.Dispose(); // Close our stream
}
}
If the upload fails and I try to re-upload the item, that is when the error is thrown. It is like the item is locked or something.
You are disposing the service with the using statement while the call you started with BeginInvoke is still running.
using (var service = new ObjectService(ConfigurationManager.AppSettings["AWSAccessKey"], ConfigurationManager.AppSettings["AWSSecretKey"], bucketName))
{
service.Add(id);
var caller = new AsyncMethodCaller(service.Upload);
var result = caller.BeginInvoke(id, file, new AsyncCallback(CompleteUpload), caller);
}
You have to dispose your service when the job is done:
private ObjectService service = new ObjectService(ConfigurationManager.AppSettings["AWSAccessKey"], ConfigurationManager.AppSettings["AWSSecretKey"], CompanyProvider.CurrentCompanyId());
public void StartUpload(string id, HttpPostedFileBase file)
{
var bucketName = CompanyProvider.CurrentCompanyId();
service.Add(id);
var caller = new AsyncMethodCaller(service.Upload);
var result = caller.BeginInvoke(id, file, new AsyncCallback(CompleteUpload), caller);
}
public void CompleteUpload(IAsyncResult result)
{
var caller = (AsyncMethodCaller)result.AsyncState;
var id = caller.EndInvoke(result);
service.Dispose();
}
Also, your file might be corrupted; try this code:
byte[] buffer = new byte[file.InputStream.Length];
file.InputStream.Seek(0, SeekOrigin.Begin);
file.InputStream.Read(buffer, 0, (int)file.InputStream.Length);
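As an aside (my addition, not from the original answer), here is a sketch that sidesteps the manual buffer arithmetic by copying the posted stream into an independent MemoryStream before handing it to TransferUtility, reusing the names from the question's Upload method:
// Copy the posted stream so the S3 upload no longer depends on
// HttpPostedFileBase.InputStream staying open and undisposed.
var memoryStream = new MemoryStream();
file.InputStream.Seek(0, SeekOrigin.Begin);
file.InputStream.CopyTo(memoryStream);
memoryStream.Position = 0;

var request = new TransferUtilityUploadRequest
{
    BucketName = this.bucketName,        // field from the ObjectService in the question
    Key = file.FileName,
    InputStream = memoryStream           // TransferUtility reads from our copy
};
utility.Upload(request);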
How can I upload a large file with ASP.NET MVC4 Web API
and also get progress?
I saw this post and I understand how to handle the uploaded file, but how can I get the progress data?
How To Accept a File POST
Please don't send me links to upload products.
I want to understand how to handle this the MVC4 Web API way...
Here is example code for handling a file upload in MVC4 Web API:
public async Task<HttpResponseMessage> Post()
{
if (Request.Content.IsMimeMultipartContent())
{
var path = HttpContext.Current.Server.MapPath("~/App_Data");
var provider = new MultipartFormDataStreamProvider(path);
await Request.Content.ReadAsMultipartAsync(provider).ContinueWith(t =>
{
if (t.IsFaulted || t.IsCanceled)
throw new HttpResponseException(HttpStatusCode.InternalServerError);
});
return Request.CreateResponse(HttpStatusCode.OK);
}
else
{
throw new HttpResponseException(Request.CreateResponse(HttpStatusCode.NotAcceptable, "This request is not properly formatted"));
}
}
Now, when I call
await Request.Content.ReadAsMultipartAsync(provider)
how can I find out how many bytes have been loaded?
There is a default limit on the size of uploaded files in two places: one at the request level, and a second, if you are hosting on IIS, at the web server level. I added a couple of configs as mentioned in this blog, and I was able to upload a 36 MB file without any issues. I have posted the snippets below. Note that maxRequestLength is specified in kilobytes while maxAllowedContentLength is specified in bytes.
Basically
1.
<system.web>
<httpRuntime maxRequestLength="2097152"/>
</system.web>
2.
<system.webServer>
<security>
<requestFiltering>
<requestLimits maxAllowedContentLength="2147483648" />
</requestFiltering>
</security>
</system.webServer>
It's easy to find the size of the file loaded onto the server if you wish. In your code,
while reading through the file data in the stream, for each item in your file data you can read the local file name as shown below.
string savedFile = fileData.LocalFileName;
// use the file info class to derive properties of the uploaded file
FileInfo file = new FileInfo(savedFile);
//this will give the size of the uploaded file
long size = file.Length / 1024; // size in KB
Hope this helps. I wonder why this was downvoted?
I use this solution:
public class UploadController : ApiController
{
private static ConcurrentDictionary<string, State> _state = new ConcurrentDictionary<string, State>();
public State Get(string id)
{
State state;
if (_state.TryGetValue(id, out state))
{
return state;
}
return null;
}
public async Task<HttpResponseMessage> Post([FromUri] string id)
{
if (Request.Content.IsMimeMultipartContent())
{
var state = new State(Request.Content.Headers.ContentLength);
if (!_state.TryAdd(id, state))
throw new HttpResponseException(Request.CreateResponse(HttpStatusCode.Conflict));
var path = System.Web.Hosting.HostingEnvironment.MapPath("~/App_Data");
var provider = new FileMultipartStreamProvider(path, state.Start, state.AddBytes);
await Request.Content.ReadAsMultipartAsync(provider).ContinueWith(t =>
{
_state.TryRemove(id, out state);
if (t.IsFaulted || t.IsCanceled)
throw new HttpResponseException(HttpStatusCode.InternalServerError);
});
return Request.CreateResponse(HttpStatusCode.OK);
}
else
{
throw new HttpResponseException(Request.CreateResponse(HttpStatusCode.NotAcceptable, "This request is not properly formatted"));
}
}
}
public class State
{
public long? Total { get; set; }
public long Received { get; set; }
public string Name { get; set; }
public State(long? total = null)
{
Total = total;
}
public void Start(string name)
{
Received = 0;
Name = name;
}
public void AddBytes(long size)
{
Received = size;
}
}
public class FileMultipartStreamProvider : MultipartStreamProvider
{
private string _rootPath;
private Action<string> _startUpload;
private Action<long> _uploadProgress;
public FileMultipartStreamProvider(string root_path, Action<string> start_upload, Action<long> upload_progress)
: base()
{
_rootPath = root_path;
_startUpload = start_upload;
_uploadProgress = upload_progress;
}
public override System.IO.Stream GetStream(HttpContent parent, System.Net.Http.Headers.HttpContentHeaders headers)
{
var name = (headers.ContentDisposition.Name ?? "undefined").Replace("\"", "").Replace("\\", "_").Replace("/", "_").Replace("..", "_");
_startUpload(name);
return new WriteFileStreamProxy(Path.Combine(_rootPath, name), _uploadProgress);
}
}
public class WriteFileStreamProxy : FileStream
{
private Action<long> _writeBytes;
public WriteFileStreamProxy(string file_path, Action<long> write_bytes)
: base(file_path, FileMode.Create, FileAccess.Write)
{
_writeBytes = write_bytes;
}
public override void EndWrite(IAsyncResult asyncResult)
{
base.EndWrite(asyncResult);
#if DEBUG
System.Threading.Thread.Sleep(100);
#endif
if (_writeBytes != null)
_writeBytes(base.Position);
}
public override void Write(byte[] array, int offset, int count)
{
base.Write(array, offset, count);
#if DEBUG
System.Threading.Thread.Sleep(100);
#endif
if (_writeBytes != null)
_writeBytes(base.Position);
}
}
and a small configuration for a non-buffered input stream:
config.Services.Replace(typeof(IHostBufferPolicySelector), new CustomPolicy());
implemented like this:
public class CustomPolicy : System.Web.Http.WebHost.WebHostBufferPolicySelector
{
public override bool UseBufferedInputStream(object hostContext)
{
return false;
}
}
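That Replace call typically sits alongside the rest of the Web API configuration; a minimal sketch, assuming the standard WebApiConfig class from the project template:
public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Disable input buffering so progress can be observed while the request streams in.
        config.Services.Replace(typeof(IHostBufferPolicySelector), new CustomPolicy());
        // ... route and formatter registration as usual.
    }
}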
I ended up using an HttpModule, but even the HttpModule won't show the progress bar.
I found out something very interesting: it seems that when I upload the file over a secure protocol (https://) the progress works, but over a non-secure protocol (http://) the progress does not work and the file is fully buffered. I don't know why that is; I believe it's a bug somewhere between IIS and the ASP.NET framework in how the request gets processed.
Now, because I managed to make it work over HTTPS with an HttpModule (registered in web.config under system.webServer/modules), I believe it is possible to make it work with MVC Web API as well, but I currently don't have the time to check that.
For parsing multipart form data I used the Nancy HttpMultipart parser, here:
https://github.com/NancyFx/Nancy/tree/master/src/Nancy
I just grabbed the classes:
HttpMultipart.cs
HttpMultipartBoundary.cs
HttpMultipartBuffer.cs
HttpMultipartSubStream.cs
Here is the HttpModule source:
public class HttpUploadModule : IHttpModule
{
public static DateTime lastClean = DateTime.UtcNow;
public static TimeSpan cleanInterval = new TimeSpan(0,10,0);
public static readonly object cleanLocker = new object();
public static readonly Dictionary<Guid,UploadData> Uploads = new Dictionary<Guid,UploadData>();
public const int KB = 1024;
public const int MB = KB * 1024;
public static void CleanUnusedResources( HttpContext context)
{
if( lastClean.Add( cleanInterval ) < DateTime.UtcNow ) {
lock( cleanLocker )
{
if( lastClean.Add( cleanInterval ) < DateTime.UtcNow )
{
int maxAge = int.Parse(ConfigurationManager.AppSettings["HttpUploadModule.MaxAge"]);
Uploads.Where(u=> DateTime.UtcNow.AddSeconds(maxAge) > u.Value.createdDate ).ToList().ForEach(u=>{
Uploads.Remove(u.Key);
});
Directory.GetFiles(context.Server.MapPath(ConfigurationManager.AppSettings["HttpUploadModule.Folder"].TrimEnd('/'))).ToList().ForEach(f=>{
if( DateTime.UtcNow.AddSeconds(maxAge) > File.GetCreationTimeUtc(f)) File.Delete(f);
});
lastClean = DateTime.UtcNow;
}
}
}
}
public void Dispose()
{
}
public void Init(HttpApplication app)
{
app.BeginRequest += app_BeginRequest;
}
void app_BeginRequest(object sender, EventArgs e)
{
HttpContext context = ((HttpApplication)sender).Context;
Guid uploadId = Guid.Empty;
if (context.Request.HttpMethod == "POST" && context.Request.ContentType.ToLower().StartsWith("multipart/form-data"))
{
IServiceProvider provider = (IServiceProvider)context;
HttpWorkerRequest wr = (HttpWorkerRequest)provider.GetService(typeof(HttpWorkerRequest));
FileStream fs = null;
MemoryStream ms = null;
CleanUnusedResources(context);
string contentType = wr.GetKnownRequestHeader(HttpWorkerRequest.HeaderContentType);
NameValueCollection queryString = HttpUtility.ParseQueryString( wr.GetQueryString() );
UploadData upload = new UploadData { id = uploadId ,status = 0, createdDate = DateTime.UtcNow };
if(
!contentType.Contains("boundary=") ||
/*AT LAST 1KB */ context.Request.ContentLength < KB ||
/*MAX 5MB */ context.Request.ContentLength > MB*5 ||
/*IS UPLOADID */ !Guid.TryParse(queryString["upload_id"], out uploadId) || Uploads.ContainsKey( uploadId )) {
upload.id = uploadId;
upload.status = 2;
Uploads.Add(upload.id, upload);
context.Response.StatusCode = 400;
context.Response.StatusDescription = "Bad Request";
context.Response.End();
}
string boundary = Nancy.HttpMultipart.ExtractBoundary( contentType );
upload.id = uploadId;
upload.status = 0;
Uploads.Add(upload.id, upload);
try {
if (wr.HasEntityBody())
{
upload.bytesRemaining =
upload.bytesTotal = wr.GetTotalEntityBodyLength();
upload.bytesLoaded =
upload.BytesReceived = wr.GetPreloadedEntityBodyLength();
if (!wr.IsEntireEntityBodyIsPreloaded())
{
byte[] buffer = new byte[KB * 8];
int readSize = buffer.Length;
ms = new MemoryStream();
//fs = new FileStream(context.Server.MapPath(ConfigurationManager.AppSettings["HttpUploadModule.Folder"].TrimEnd('/')+'/' + uploadId.ToString()), FileMode.CreateNew);
while (upload.bytesRemaining > 0)
{
upload.BytesReceived = wr.ReadEntityBody(buffer, 0, readSize);
if(upload.bytesRemaining == upload.bytesTotal) {
}
ms.Write(buffer, 0, upload.BytesReceived);
upload.bytesLoaded += upload.BytesReceived;
upload.bytesRemaining -= upload.BytesReceived;
if (readSize > upload.bytesRemaining)
{
readSize = upload.bytesRemaining;
}
}
//fs.Flush();
//fs.Close();
ms.Position = 0;
//the file is in our hands
Nancy.HttpMultipart multipart = new Nancy.HttpMultipart(ms, boundary);
foreach( Nancy.HttpMultipartBoundary b in multipart.GetBoundaries()) {
if(b.Name == "data") {
upload.filename = uploadId.ToString()+Path.GetExtension( b.Filename ).ToLower();
fs = new FileStream(context.Server.MapPath(ConfigurationManager.AppSettings["HttpUploadModule.Folder"].TrimEnd('/')+'/' + upload.filename ), FileMode.CreateNew);
b.Value.CopyTo(fs);
fs.Flush();
fs.Close();
upload.status = 1;
context.Response.StatusCode = 200;
context.Response.StatusDescription = "OK";
context.Response.Write( context.Request.ApplicationPath.TrimEnd('/') + "/images/temp/" + upload.filename );
}
}
}
}
}
catch(Exception ex) {
upload.ex = ex;
}
if(upload.status != 1)
{
upload.status = 2;
context.Response.StatusCode = 400;
context.Response.StatusDescription = "Bad Request";
}
context.Response.End();
}
}
}
public class UploadData {
public Guid id { get;set; }
public string filename {get;set;}
public int bytesLoaded { get; set; }
public int bytesTotal { get; set; }
public int BytesReceived {get; set;}
public int bytesRemaining { get;set; }
public int status { get;set; }
public Exception ex { get;set; }
public DateTime createdDate { get;set; }
}
Would anyone have a working example of an Amazon ItemLookup?
I have the following code but it does not seem to work:
string ISBN = "0393326381";
string ASIN = "";
if (!(string.IsNullOrEmpty(ISBN) && string.IsNullOrEmpty(ASIN)))
{
AWSECommerceServicePortTypeChannel service = new AWSECommerceServicePortTypeChannel();
ItemLookup lookup = new ItemLookup();
ItemLookupRequest request = new ItemLookupRequest();
lookup.AssociateTag = secretKey;
lookup.AWSAccessKeyId = accessKeyId;
if (string.IsNullOrEmpty(ASIN))
{
request.IdType = ItemLookupRequestIdType.ISBN;
request.ItemId = new string[] { ISBN.Replace("-", "") };
}
else
{
request.IdType = ItemLookupRequestIdType.ASIN;
request.ItemId = new string[] { ASIN };
}
request.ResponseGroup = new string[] { "OfferSummary" };
lookup.Request = new ItemLookupRequest[] { request };
response = service.ItemLookup(lookup);
if (response.Items.Length > 0 && response.Items[0].Item.Length > 0)
{
Item item = response.Items[0].Item[0];
if (item.MediumImage == null)
{
//bookImageHyperlink.Visible = false;
}
else
{
//bookImageHyperlink.ImageUrl = item.MediumImage.URL;
}
//bookImageHyperlink.NavigateUrl = item.DetailPageURL;
//bookTitleHyperlink.Text = item.ItemAttributes.Title;
//bookTitleHyperlink.NavigateUrl = item.DetailPageURL;
if (item.OfferSummary.LowestNewPrice == null)
{
if (item.OfferSummary.LowestUsedPrice == null)
{
//priceHyperlink.Visible = false;
}
else
{
//priceHyperlink.Text = string.Format("Buy used {0}", item.OfferSummary.LowestUsedPrice.FormattedPrice);
//priceHyperlink.NavigateUrl = item.DetailPageURL;
}
}
else
{
//priceHyperlink.Text = string.Format("Buy new {0}", item.OfferSummary.LowestNewPrice.FormattedPrice);
//priceHyperlink.NavigateUrl = item.DetailPageURL;
}
if (item.ItemAttributes.Author != null)
{
//authorLabel.Text = string.Format("By {0}", string.Join(", ", item.ItemAttributes.Author));
}
else
{
//authorLabel.Text = string.Format("By {0}", string.Join(", ", item.ItemAttributes.Creator.Select(c => c.Value).ToArray()));
}
/*
ItemLink link = item.ItemLinks.Where(i => i.Description.Contains("Wishlist")).FirstOrDefault();
if (link == null)
{
//wishListHyperlink.Visible = false;
}
else
{
//wishListHyperlink.NavigateUrl = link.URL;
}
* */
}
}
}
The problem is with this line; it should be defined differently, but I do not know how:
AWSECommerceServicePortTypeChannel service = new AWSECommerceServicePortTypeChannel();
Say, that code looks awfully familiar. You're missing the endpoint signing piece from when they switched over to requiring message signing. You need to add a behavior to your client. Here's the change to your code above:
if (!(string.IsNullOrEmpty(ISBN) && string.IsNullOrEmpty(ASIN)))
{
AWSECommerceServicePortTypeClient client = new AWSECommerceServicePortTypeClient();
client.ChannelFactory.Endpoint.Behaviors.Add(
new Amazon.AmazonSigningEndpointBehavior(
accessKeyId,
secretKey));
ItemLookup lookup = new ItemLookup();
ItemLookupRequest request = new ItemLookupRequest();
lookup.AssociateTag = associateTag; // hypothetical placeholder: your Amazon Associates tag, not one of the keys
lookup.AWSAccessKeyId = accessKeyId;
//... etc.
And here's the Endpoint (I can't take credit for this, I wish I could remember who should):
namespace Amazon
{
public class AmazonSigningEndpointBehavior : IEndpointBehavior {
private string accessKeyId = "";
private string secretKey = "";
public AmazonSigningEndpointBehavior(string accessKeyId, string secretKey) {
this.accessKeyId = accessKeyId;
this.secretKey = secretKey;
}
public void ApplyClientBehavior(ServiceEndpoint serviceEndpoint, ClientRuntime clientRuntime) {
clientRuntime.MessageInspectors.Add(new AmazonSigningMessageInspector(accessKeyId, secretKey));
}
public void ApplyDispatchBehavior(ServiceEndpoint serviceEndpoint, EndpointDispatcher endpointDispatcher) { return; }
public void Validate(ServiceEndpoint serviceEndpoint) { return; }
public void AddBindingParameters(ServiceEndpoint serviceEndpoint, BindingParameterCollection bindingParameters) { return; }
}
}
Oh. And you'll need the MessageInspector for that to work.
namespace Amazon
{
public class AmazonSigningMessageInspector : IClientMessageInspector {
private string accessKeyId = "";
private string secretKey = "";
public AmazonSigningMessageInspector(string accessKeyId, string secretKey) {
this.accessKeyId = accessKeyId;
this.secretKey = secretKey;
}
public object BeforeSendRequest(ref Message request, IClientChannel channel) {
// prepare the data to sign
string operation = Regex.Match(request.Headers.Action, "[^/]+$").ToString();
DateTime now = DateTime.UtcNow;
string timestamp = now.ToString("yyyy-MM-ddTHH:mm:ssZ");
string signMe = operation + timestamp;
byte[] bytesToSign = Encoding.UTF8.GetBytes(signMe);
// sign the data
byte[] secretKeyBytes = Encoding.UTF8.GetBytes(secretKey);
HMAC hmacSha256 = new HMACSHA256(secretKeyBytes);
byte[] hashBytes = hmacSha256.ComputeHash(bytesToSign);
string signature = Convert.ToBase64String(hashBytes);
// add the signature information to the request headers
request.Headers.Add(new AmazonHeader("AWSAccessKeyId", accessKeyId));
request.Headers.Add(new AmazonHeader("Timestamp", timestamp));
request.Headers.Add(new AmazonHeader("Signature", signature));
return null;
}
public void AfterReceiveReply(ref Message reply, object correlationState) { }
}
}
And finally, the Header:
namespace Amazon
{
public class AmazonHeader : MessageHeader
{
private string name;
private string value;
public AmazonHeader(string name, string value)
{
this.name = name;
this.value = value;
}
public override string Name { get { return name; } }
public override string Namespace { get { return "http://security.amazonaws.com/doc/2007-01-01/"; } }
protected override void OnWriteHeaderContents(XmlDictionaryWriter xmlDictionaryWriter, MessageVersion messageVersion)
{
xmlDictionaryWriter.WriteString(value);
}
}
}
Yes, they made it complicated when they started requiring message signing...
A simple and easy library is available on NuGet.
PM> Install-Package Nager.AmazonProductAdvertising
Example
var authentication = new AmazonAuthentication("accesskey", "secretkey");
var client = new AmazonProductAdvertisingClient(authentication, AmazonEndpoint.US);
var result = await client.GetItemsAsync("B00BYPW00I");
To perform a lookup for anything other than an ASIN, you need to specify the "SearchIndex" property. You can simply set it to "All".
var request = new ItemLookupRequest();
request.ItemId = new[] {upcCode};
request.IdType = ItemLookupRequestIdType.UPC;
request.IdTypeSpecified = true;
request.SearchIndex = "All";
Here is a link to the documentation: http://docs.amazonwebservices.com/AWSECommerceService/2011-08-01/DG/index.html?ItemLookup.html. Note the description of the SearchIndex parameter:
Constraint: If ItemId is an ASIN, a search index cannot be specified in the request. Required for non-ASIN ItemIds.
I actually built a little wrapper around it so it hands you back a handy object graph. I have the source up on BitBucket and a little more about it on the C# Amazon ItemLookup page.
C# Amazon ItemLookup
You can make calls like:
var item = client.LookupByAsin("B0037X9N5U");
double? price = item.GetLowestPrice();