Calling an async task in a Windows Store app? - C#

I had a working "save reader" in a WinForms project that would strip unwanted data out of a save file and return a list of strings referred to as "items".
My problem is that when converting this for use in a Windows Store app, I cannot call an async task, and I need to use one to access the file.
I'm not 100% sure how to do this, so any help is appreciated; this is what I have so far.
The constructor
public SaveGame(StorageFile file)
{
    File = promptUser();
}
throws an error. I have tried making the constructor async too, but that throws more errors.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.IO;
using Windows.Storage;
using Windows.Storage.Pickers;
namespace save_game_reader
{
class SaveGame
{
private StorageFile File { get; set; }
public SaveGame(StorageFile file)
{
File = promptUser();
Task<StorageFile> returnedTaskTResult = promptUser();
StorageFile intResult = await promptUser();
}
public async Task<List<string>> GetAllItems()
{
string WholeSaveString = await Windows.Storage.FileIO.ReadTextAsync(File);
List<string> ToReturn = new List<string>();
List<string> SplitList = WholeSaveString.Split(new string[] { "data" }, StringSplitOptions.None).ToList();
foreach (string line in SplitList)
{
var start = line.IndexOf("value") + 6;
var end = line.IndexOf("type", start);
if (end != -1)
{
string substring = line.Substring(start, (end - start) - 8);
if (substring.Length >= 4)
{
ToReturn.Add(substring);
}
}
}
return ToReturn;
}
private async Task<StorageFile> promptUser()
{
var picker = new FileOpenPicker();
picker.ViewMode = PickerViewMode.Thumbnail;
picker.SuggestedStartLocation = PickerLocationId.DocumentsLibrary;
StorageFile sf = await picker.PickSingleFileAsync();
return sf;
}
}
}

It's pretty rare to need to do so, but the typical way to work around the lack of async constructors is to use a static factory method:
class SaveGame
{
    SaveGame(StorageFile file)
    {
        File = file;
    }

    // promptUser would also need to be made static, since Create has no instance to call it on.
    public static async Task<SaveGame> Create()
    {
        return new SaveGame(await promptUser());
    }
}
SaveGame g = await SaveGame.Create();
It may be worth it in your case to consider a cleaner separation between UI code and backend code. If something seems weird or difficult to do, it's often a sign of a design flaw.
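To illustrate that separation, here is a minimal sketch (not from the original post; the button handler name is hypothetical): the page code-behind owns the FileOpenPicker, and SaveGame only ever receives an already-picked StorageFile, as in the constructor above, so it never needs to prompt the user itself.
// UI layer (page code-behind): pick the file, then hand it to the backend class.
private async void OpenSaveButton_Click(object sender, RoutedEventArgs e)
{
    var picker = new FileOpenPicker();
    picker.ViewMode = PickerViewMode.Thumbnail;
    picker.SuggestedStartLocation = PickerLocationId.DocumentsLibrary;
    picker.FileTypeFilter.Add("*"); // the picker throws if no file type filter is set

    StorageFile file = await picker.PickSingleFileAsync();
    if (file == null) return; // user cancelled

    // Backend layer: SaveGame just stores the file and parses it on demand.
    var save = new SaveGame(file);
    List<string> items = await save.GetAllItems();
}
This keeps the async file picking in the UI event handler, where async void is acceptable, and keeps the backend class free of any UI dependencies.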
Also, one does not "call an async task". You call an async method and await a Task.

Constructors cannot be async; change your code as follows:
public SaveGame()
{
}

public async Task Init(StorageFile file)
{
    // Do the asynchronous work here instead of in the constructor:
    // use the file the caller picked, or prompt the user for one.
    File = file ?? await promptUser();
}
Somewhere else, in an async method, you call it like this:
var save = new SaveGame();
await save.Init(file);

Related

How to correctly download files simultaneously?

I am trying to download multiple files simultaneously, but all the files are downloading one by one, sequentially. First http://download.geofabrik.de/europe/cyprus-latest.osm.pbf is downloaded, then http://download.geofabrik.de/europe/finland-latest.osm.pbf starts to download, then http://download.geofabrik.de/europe/great-britain-latest.osm.pbf, and so on.
But I would like to download them simultaneously.
So I have the following code, based on the code from this answer:
static void Main(string[] args)
{
    Task.Run(async () =>
    {
        await DownloadFiles();
    }).GetAwaiter().GetResult();
}

public static async Task DownloadFiles()
{
    IList<string> urls = new List<string>
    {
        @"http://download.geofabrik.de/europe/cyprus-latest.osm.pbf",
        @"http://download.geofabrik.de/europe/finland-latest.osm.pbf",
        @"http://download.geofabrik.de/europe/great-britain-latest.osm.pbf",
        @"http://download.geofabrik.de/europe/belgium-latest.osm.pbf",
        @"http://download.geofabrik.de/europe/belgium-latest.osm.pbf"
    };

    foreach (var url in urls)
    {
        string fileName = url.Substring(url.LastIndexOf('/'));
        await DownloadFile(url, fileName);
    }
}

public static async Task DownloadFile(string url, string fileName)
{
    string address = @"D:\Downloads";
    using (var client = new WebClient())
    {
        await client.DownloadFileTaskAsync(url, $"{address}{fileName}");
    }
}
However, when I look at my file system, I see that the files are downloading one by one, sequentially, not simultaneously.
In addition, I've tried the following approach, but there are still no simultaneous downloads:
static void Main(string[] args)
{
    IList<string> urls = new List<string>
    {
        @"http://download.geofabrik.de/europe/cyprus-latest.osm.pbf",
        @"http://download.geofabrik.de/europe/finland-latest.osm.pbf",
        @"http://download.geofabrik.de/europe/great-britain-latest.osm.pbf",
        @"http://download.geofabrik.de/europe/belgium-latest.osm.pbf",
        @"http://download.geofabrik.de/europe/belgium-latest.osm.pbf"
    };

    Parallel.ForEach(urls,
        new ParallelOptions { MaxDegreeOfParallelism = 10 },
        DownloadFile);
}

public static void DownloadFile(string url)
{
    string address = @"D:\Downloads";
    using (var sr = new StreamReader(WebRequest.Create(url)
        .GetResponse().GetResponseStream()))
    using (var sw = new StreamWriter(address + url.Substring(url.LastIndexOf('/'))))
    {
        sw.Write(sr.ReadToEnd());
    }
}
Could you tell me how it is possible to download simultaneously?
Any help would be greatly appreciated.
foreach (var url in urls)
{
    string fileName = url.Substring(url.LastIndexOf('/'));
    await DownloadFile(url, fileName); // you wait for this download to finish before moving on to the next one
}
Instead, you should create all the tasks and then wait for all of them to complete.
public static Task DownloadFiles()
{
    IList<string> urls = new List<string>
    {
        @"http://download.geofabrik.de/europe/cyprus-latest.osm.pbf",
        @"http://download.geofabrik.de/europe/finland-latest.osm.pbf",
        @"http://download.geofabrik.de/europe/great-britain-latest.osm.pbf",
        @"http://download.geofabrik.de/europe/belgium-latest.osm.pbf",
        @"http://download.geofabrik.de/europe/belgium-latest.osm.pbf"
    };

    var tasks = urls.Select(url => {
        var fileName = url.Substring(url.LastIndexOf('/'));
        return DownloadFile(url, fileName);
    }).ToArray();

    return Task.WhenAll(tasks);
}
The rest of your code can remain the same.
Eldar's solution works with some minor edits. This is the full working DownloadFiles method after those edits:
public static async Task DownloadFiles()
{
    IList<string> urls = new List<string>
    {
        @"http://download.geofabrik.de/europe/cyprus-latest.osm.pbf",
        @"http://download.geofabrik.de/europe/finland-latest.osm.pbf",
        @"http://download.geofabrik.de/europe/great-britain-latest.osm.pbf",
        @"http://download.geofabrik.de/europe/belgium-latest.osm.pbf",
        @"http://download.geofabrik.de/europe/belgium-latest.osm.pbf"
    };

    var tasks = urls.Select(t => {
        var fileName = t.Substring(t.LastIndexOf('/'));
        return DownloadFile(t, fileName);
    }).ToArray();

    await Task.WhenAll(tasks);
}
This will download them asynchronously, one after the other:
await DownloadFile(url, fileName);
await DownloadFile(url2, fileName2);
This will do what you actually want to achieve:
var task1 = DownloadFile(url, fileName);
var task2 = DownloadFile(url2, fileName2);
await Task.WhenAll(task1, task2);
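One caveat worth adding (not from the answers above): if the URL list grows large, starting every download at once can overwhelm the network or the remote server. A minimal sketch of throttling with SemaphoreSlim, reusing the DownloadFile(url, fileName) helper from the question (requires using System.Threading;):
public static async Task DownloadFilesThrottled(IEnumerable<string> urls, int maxConcurrency)
{
    using (var throttle = new SemaphoreSlim(maxConcurrency))
    {
        var tasks = urls.Select(async url =>
        {
            await throttle.WaitAsync(); // blocks only when maxConcurrency downloads are already running
            try
            {
                string fileName = url.Substring(url.LastIndexOf('/'));
                await DownloadFile(url, fileName); // same helper as in the question
            }
            finally
            {
                throttle.Release();
            }
        }).ToArray();

        await Task.WhenAll(tasks);
    }
}
Calling await DownloadFilesThrottled(urls, 3); would keep at most three downloads in flight at any time.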

How do you delete the dead letters in an Azure Service Bus queue?

How do you delete the dead letters in an Azure Service Bus queue?
I can process the messages in the queue ...
var queueClient = QueueClient.CreateFromConnectionString(sbConnectionString, queueName);
while (queueClient.Peek() != null)
{
var brokeredMessage = queueClient.Receive();
brokeredMessage.Complete();
}
but I can't see any way to handle the dead-letter messages.
The trick is to get the dead-letter path for the queue, which you can get by using QueueClient.FormatDeadLetterPath(queueName).
Please try the following:
var queueClient = QueueClient.CreateFromConnectionString(sbConnectionString, QueueClient.FormatDeadLetterPath(queueName));
while (queueClient.Peek() != null)
{
var brokeredMessage = queueClient.Receive();
brokeredMessage.Complete();
}
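If the goal is only to purge the dead letters rather than inspect them, a slightly simpler variant of the same idea is to open the dead-letter path in ReceiveAndDelete mode, so no Complete call is needed. This is a sketch that assumes the older Microsoft.ServiceBus.Messaging client and its CreateFromConnectionString(connectionString, path, receiveMode) overload:
var deadLetterPath = QueueClient.FormatDeadLetterPath(queueName);
var deadLetterClient = QueueClient.CreateFromConnectionString(
    sbConnectionString, deadLetterPath, ReceiveMode.ReceiveAndDelete);

BrokeredMessage message;
while ((message = deadLetterClient.Receive(TimeSpan.FromSeconds(5))) != null)
{
    // In ReceiveAndDelete mode the message is removed as soon as it is received,
    // so there is nothing else to do with it here.
}

deadLetterClient.Close();
Either way, looping until Receive returns null drains the dead-letter sub-queue.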
using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Microsoft.Azure.ServiceBus;
using Microsoft.Azure.ServiceBus.Core;
using System.Text;
using System.Collections.Generic;
using System.Threading.Tasks;

namespace ClearDeadLetterQ
{
    [TestClass]
    public class UnitTest1
    {
        const string ServiceBusConnectionString = "Endpoint=sb://my-domain.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=youraccesskeyhereyouraccesskeyhere";

        [TestMethod]
        public async Task TestMethod1()
        {
            await this.ClearDeadLetters("my.topic.name", "MySubscriptionName/$DeadLetterQueue");
        }

        public async Task ClearDeadLetters(string topicName, string subscriptionName)
        {
            var messageReceiver = new MessageReceiver(ServiceBusConnectionString, EntityNameHelper.FormatSubscriptionPath(topicName, subscriptionName), ReceiveMode.PeekLock);

            // Receive and complete messages until the dead-letter sub-queue is empty.
            Message message;
            while ((message = await messageReceiver.ReceiveAsync()) != null)
            {
                await messageReceiver.CompleteAsync(message.SystemProperties.LockToken);
            }

            await messageReceiver.CloseAsync();
        }
    }
}
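The question is about a queue rather than a topic subscription; with the same Microsoft.Azure.ServiceBus client, the dead-letter path of a queue can be built with EntityNameHelper.FormatDeadLetterPath. A sketch, assuming that helper is available in the package version used above:
public async Task ClearQueueDeadLetters(string queueName)
{
    var deadLetterPath = EntityNameHelper.FormatDeadLetterPath(queueName);
    var messageReceiver = new MessageReceiver(ServiceBusConnectionString, deadLetterPath, ReceiveMode.PeekLock);

    Message message;
    while ((message = await messageReceiver.ReceiveAsync()) != null)
    {
        await messageReceiver.CompleteAsync(message.SystemProperties.LockToken);
    }

    await messageReceiver.CloseAsync();
}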
There are some great samples available in our GitHub sample repo (https://github.com/Azure-Samples/azure-servicebus-messaging-samples). The DeadletterQueue project should show you an example of how to do this in your code:
var deadLetterReceiver = await receiverFactory.CreateMessageReceiverAsync(
    QueueClient.FormatDeadLetterPath(queueName), ReceiveMode.PeekLock);

while (true)
{
    var msg = await deadLetterReceiver.ReceiveAsync(TimeSpan.Zero);
    if (msg != null)
    {
        foreach (var prop in msg.Properties)
        {
            Console.WriteLine("{0}={1}", prop.Key, prop.Value);
        }
        await msg.CompleteAsync();
    }
    else
    {
        break;
    }
}
Hope that helps!
Open the Azure portal and go to your target Service Bus namespace.
(Use this value in place of <BUS-QUEUE-NAME> in the code.)
Go into the queue whose dead letters you want to delete.
(Use this value in place of <QUEUE-NAME> in the code.)
Create a Shared Access Policy named RemoveDeadLetterQueue with the Manage checkbox selected.
Copy its primary key into <QUEUE-POLICY-PRIMARYKEY> in the code.
The code is then ready to run.
using Microsoft.Azure.ServiceBus.Core;
using Microsoft.Azure.ServiceBus;
string serviceBusQueue = "<BUS-QUEUE-NAME>";
string serviceBusQueueName = "<QUEUE-NAME>";
string policyName = "RemoveDeadLetterQueue";
string policyPrimaryKey = "<QUEUE-POLICY-PRIMARYKEY>";
var receiver = new MessageReceiver(
connectionString: $"Endpoint=sb://{serviceBusQueue}.servicebus.windows.net/;SharedAccessKeyName={policyName};SharedAccessKey={policyPrimaryKey}",
entityPath: $"{serviceBusQueueName}/$DeadLetterQueue",
receiveMode: ReceiveMode.ReceiveAndDelete
);
var messages = await receiver.ReceiveAsync(maxMessageCount: 1000);
while(messages != null)
{
foreach (var message in messages)
{
Console.WriteLine($"[Delete Message] ID: {message.MessageId}");
}
messages = await receiver.ReceiveAsync(maxMessageCount: 1000);
}
await receiver.CloseAsync();

Deserializing an array of values from JSON in 2048 game causes NullReferenceException

I'm making a 2048 clone for my university classes. I've got a custom class:
public class Field
{
public int value;
public bool stop;
}
And an array of those fields:
public Field[,] gameArray = new Field[4, 4];
Those are in a custom class.
Here are the methods I use to save and load this array in the item page cs file:
private async void saveAsync()
{
string jsonContents = JsonConvert.SerializeObject(currentGame.gameArray);
StorageFolder localFolder = ApplicationData.Current.LocalFolder;
StorageFile textFile = await localFolder.CreateFileAsync(saveFileName,
CreationCollisionOption.ReplaceExisting);
using (IRandomAccessStream textStream = await textFile.OpenAsync(FileAccessMode.ReadWrite))
{
using (DataWriter textWriter = new DataWriter(textStream))
{
textWriter.WriteString(jsonContents);
await textWriter.StoreAsync();
}
}
}
private async void loadAsync()
{
StorageFolder localFolder = ApplicationData.Current.LocalFolder;
StorageFile textFile = await localFolder.GetFileAsync(saveFileName);
using (IRandomAccessStream textStream = await textFile.OpenReadAsync())
{
using (DataReader textReader = new DataReader(textStream))
{
if ((uint)textStream.Size != 0)
{
uint textLength = (uint)textStream.Size;
await textReader.LoadAsync(textLength);
string jsonContents = textReader.ReadString(textLength);
currentGame.gameArray = JsonConvert.DeserializeObject<Field[,]>(jsonContents);
}
else currentGame.Reset();
}
update();
}
}
currentGame is an instance of the custom class. The Reset() method resets all the fields, setting their value to 0. The update() method checks whether the player has lost (not yet implemented), runs the saveAsync() method, and fills the board with lines like this (the color method just changes the background color):
txt10.Text = currentGame.gameArray[0, 0].value.ToString();
color(b10, currentGame.gameArray[0, 0].value);
The first line is what gives me a NullReferenceException. I probably messed something up with the serialization and deserialization of the array, but I have literally no idea what it is or how to fix it; this is my first take on serialization and I basically copied it from the internet. Do I somehow need to pass a reference to the loadAsync() method?

Web API - Get progress when uploading to Azure storage

The task I want to accomplish is to create a Web API service in order to upload a file to Azure storage. At the same time, I would like to have a progress indicator that reflects the actual upload progress. After some research and studying, I found out two important things:
First, I have to split the file manually into chunks and upload them asynchronously using the PutBlockAsync method from Microsoft.WindowsAzure.Storage.dll.
Second, I have to receive the file in my Web API service in Streamed mode and not in Buffered mode.
So far, I have the following implementation:
UploadController.cs
using System.Configuration;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Http;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using WebApiFileUploadToAzureStorage.Infrastructure;
using WebApiFileUploadToAzureStorage.Models;
namespace WebApiFileUploadToAzureStorage.Controllers
{
public class UploadController : ApiController
{
[HttpPost]
public async Task<HttpResponseMessage> UploadFile()
{
if (!Request.Content.IsMimeMultipartContent("form-data"))
{
return Request.CreateResponse(HttpStatusCode.UnsupportedMediaType,
new UploadStatus(null, false, "No form data found on request.", string.Empty, string.Empty));
}
var streamProvider = new MultipartAzureBlobStorageProvider(GetAzureStorageContainer());
var result = await Request.Content.ReadAsMultipartAsync(streamProvider);
if (result.FileData.Count < 1)
{
return Request.CreateResponse(HttpStatusCode.BadRequest,
new UploadStatus(null, false, "No files were uploaded.", string.Empty, string.Empty));
}
return Request.CreateResponse(HttpStatusCode.OK);
}
private static CloudBlobContainer GetAzureStorageContainer()
{
var storageConnectionString = ConfigurationManager.AppSettings["AzureBlobStorageConnectionString"];
var storageAccount = CloudStorageAccount.Parse(storageConnectionString);
var blobClient = storageAccount.CreateCloudBlobClient();
blobClient.DefaultRequestOptions.SingleBlobUploadThresholdInBytes = 1024 * 1024;
var container = blobClient.GetContainerReference("photos");
if (container.Exists())
{
return container;
}
container.Create();
container.SetPermissions(new BlobContainerPermissions
{
PublicAccess = BlobContainerPublicAccessType.Container
});
return container;
}
}
}
MultipartAzureBlobStorageProvider.cs
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;
namespace WebApiFileUploadToAzureStorage.Infrastructure
{
public class MultipartAzureBlobStorageProvider : MultipartFormDataStreamProvider
{
private readonly CloudBlobContainer _blobContainer;
public MultipartAzureBlobStorageProvider(CloudBlobContainer blobContainer) : base(Path.GetTempPath())
{
_blobContainer = blobContainer;
}
public override Task ExecutePostProcessingAsync()
{
const int blockSize = 256 * 1024;
var fileData = FileData.First();
var fileName = Path.GetFileName(fileData.Headers.ContentDisposition.FileName.Trim('"'));
var blob = _blobContainer.GetBlockBlobReference(fileName);
var bytesToUpload = (new FileInfo(fileData.LocalFileName)).Length;
var fileSize = bytesToUpload;
blob.Properties.ContentType = fileData.Headers.ContentType.MediaType;
blob.StreamWriteSizeInBytes = blockSize;
if (bytesToUpload < blockSize)
{
var cancellationToken = new CancellationToken();
using (var fileStream = new FileStream(fileData.LocalFileName, FileMode.Open, FileAccess.ReadWrite))
{
var upload = blob.UploadFromStreamAsync(fileStream, cancellationToken);
Debug.WriteLine($"Status {upload.Status}.");
upload.ContinueWith(task =>
{
Debug.WriteLine($"Status {task.Status}.");
Debug.WriteLine("Upload is over successfully.");
}, TaskContinuationOptions.OnlyOnRanToCompletion);
upload.ContinueWith(task =>
{
Debug.WriteLine($"Status {task.Status}.");
if (task.Exception != null)
{
Debug.WriteLine("Task could not be completed." + task.Exception.InnerException);
}
}, TaskContinuationOptions.OnlyOnFaulted);
upload.Wait(cancellationToken);
}
}
else
{
var blockIds = new List<string>();
var index = 1;
long startPosition = 0;
long bytesUploaded = 0;
do
{
var bytesToRead = Math.Min(blockSize, bytesToUpload);
var blobContents = new byte[bytesToRead];
using (var fileStream = new FileStream(fileData.LocalFileName, FileMode.Open))
{
fileStream.Position = startPosition;
fileStream.Read(blobContents, 0, (int)bytesToRead);
}
var manualResetEvent = new ManualResetEvent(false);
var blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(index.ToString("d6")));
Debug.WriteLine($"Now uploading block # {index.ToString("d6")}");
blockIds.Add(blockId);
var upload = blob.PutBlockAsync(blockId, new MemoryStream(blobContents), null);
upload.ContinueWith(task =>
{
bytesUploaded += bytesToRead;
bytesToUpload -= bytesToRead;
startPosition += bytesToRead;
index++;
var percentComplete = (double)bytesUploaded / fileSize;
Debug.WriteLine($"Percent complete: {percentComplete.ToString("P")}");
manualResetEvent.Set();
});
manualResetEvent.WaitOne();
} while (bytesToUpload > 0);
Debug.WriteLine("Now committing block list.");
var putBlockList = blob.PutBlockListAsync(blockIds);
putBlockList.ContinueWith(task =>
{
Debug.WriteLine("Blob uploaded completely.");
});
putBlockList.Wait();
}
File.Delete(fileData.LocalFileName);
return base.ExecutePostProcessingAsync();
}
}
}
I also enabled Streamed mode as this blog post suggests. This approach works great, in the sense that the file is uploaded successfully to Azure storage. Then, when I make a call to this service using XMLHttpRequest (and subscribing to the progress event), I see the indicator moving to 100% very quickly. If a 5MB file needs around 1 minute to upload, my indicator moves to the end in just 1 second. So the problem probably resides in the way the server informs the client about the upload progress. Any thoughts about this? Thank you.
================================ Update 1 ===================================
This is the JavaScript code I use to call the service:
function uploadFile(file, index, uploadCompleted) {
var authData = localStorageService.get("authorizationData");
var xhr = new XMLHttpRequest();
xhr.upload.addEventListener("progress", function (event) {
fileUploadPercent = Math.floor((event.loaded / event.total) * 100);
console.log(fileUploadPercent + " %");
});
xhr.onreadystatechange = function (event) {
if (event.target.readyState === event.target.DONE) {
if (event.target.status !== 200) {
} else {
var parsedResponse = JSON.parse(event.target.response);
uploadCompleted(parsedResponse);
}
}
};
xhr.open("post", uploadFileServiceUrl, true);
xhr.setRequestHeader("Authorization", "Bearer " + authData.token);
var data = new FormData();
data.append("file-" + index, file);
xhr.send(data);
}
Your progress indicator might be moving so quickly because of
public async Task<HttpResponseMessage> UploadFile()
I have encountered this before when creating an async API. If the work inside is not awaited, it effectively becomes fire and forget: the API immediately gives you a response while the actual work finishes in the background on the server, which is why your progress indicator finishes instantly. Please try making it just
public HttpResponseMessage UploadFile()
Also try these:
var result = Request.Content.ReadAsMultipartAsync(streamProvider).Result;
blob.UploadFromStreamAsync(fileStream, cancellationToken).Wait();
OR
await blob.UploadFromStreamAsync(fileStream, cancellationToken);
Hope it helps.
Another way to accomplish what you want (I don't understand how the XMLHttpRequest progress event works) is to use the ProgressMessageHandler to get the upload progress of the request. Then, in order to notify the client, you could use some cache to store the progress and have the client request the current state from another endpoint, or use SignalR to push the progress from the server to the client.
Something like:
//WebApiConfig Register
var progress = new ProgressMessageHandler();
progress.HttpSendProgress += HttpSendProgress;
config.MessageHandlers.Add(progress);
//End WebApiConfig Register

private static void HttpSendProgress(object sender, HttpProgressEventArgs e)
{
    var request = sender as HttpRequestMessage;
    //todo: check if request is not null
    //Get an id from the client to identify the request; here it is assumed to be an "id" query-string parameter
    var id = request.RequestUri.ParseQueryString()["id"];
    var perc = e.ProgressPercentage;
    var b = e.TotalBytes;
    var bt = e.BytesTransferred;
    Cache.InsertOrUpdate(id, perc);
}
You can find more documentation in this MSDN blog post (scroll down to the "Progress Notifications" section).
Also, you could calculate the progress based on the data chunks, store the progress in a cache, and notify in the same way as above. Something like this solution
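A rough sketch of that last idea (the class names and the "uploadId" parameter are hypothetical, not from the original answer): record the percentage already computed in the block-upload loop in a static ConcurrentDictionary keyed by an upload id the client sends, and expose a small endpoint the client can poll (requires using System.Collections.Concurrent; and using System.Web.Http;):
// Shared progress store.
public static class UploadProgressCache
{
    private static readonly ConcurrentDictionary<string, double> Progress =
        new ConcurrentDictionary<string, double>();

    public static void Report(string uploadId, double percentComplete)
    {
        Progress[uploadId] = percentComplete;
    }

    public static double Get(string uploadId)
    {
        double value;
        return Progress.TryGetValue(uploadId, out value) ? value : 0d;
    }
}

// In the block-upload loop of MultipartAzureBlobStorageProvider, after each block completes:
// UploadProgressCache.Report(uploadId, (double)bytesUploaded / fileSize);

// Polling endpoint; the JavaScript client can call GET .../api/uploadprogress?uploadId=... with setInterval.
public class UploadProgressController : ApiController
{
    public double Get(string uploadId)
    {
        return UploadProgressCache.Get(uploadId);
    }
}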

Getting `Cannot access a closed file` when asynchronously copying files

I want to copy multiple files asynchronously, but I am getting this error:
System.ObjectDisposedException: Cannot access a closed file.
Here is my method:
public Task CopyAllAsync(IList<ProductsImage> productsImage)
{
var tasks = new List<Task>();
foreach (var productImage in productsImage)
{
var task = _fileService.CopyAsync(productImage.ExistingFileName, productImage.NewFileName);
tasks.Add(task);
}
return Task.WhenAll(tasks);
}
Here is the FileService.CopyAsync method:
public Task CopyAsync(string sourcePath, string destinationPath)
{
using (var source = File.Open(sourcePath, FileMode.Open))
{
using (var destination = File.Create(destinationPath))
{
return source.CopyToAsync(destination);
}
}
}
Then I am awaiting this:
await _imageService.CopyAllAsync(productsImage);
If I debug, I do not get this error. Why?
You need to await the copy operation instead of simply returning the task. That makes sure you're not ending the using scopes too soon, which would call Dispose on your FileStreams before the copy has completed:
public async Task CopyAsync(string sourcePath, string destinationPath)
{
using (var source = File.Open(sourcePath, FileMode.Open))
{
using (var destination = File.Create(destinationPath))
{
await source.CopyToAsync(destination);
}
}
}
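Not required for the fix above, but worth knowing: File.Open and File.Create open the handles for synchronous I/O, so CopyToAsync cannot use truly asynchronous (overlapped) file I/O. If that matters, the streams can be opened with useAsync: true; a minimal variant of the same method:
public async Task CopyAsync(string sourcePath, string destinationPath)
{
    // useAsync: true requests overlapped (true asynchronous) file I/O from the OS.
    using (var source = new FileStream(sourcePath, FileMode.Open, FileAccess.Read,
        FileShare.Read, 4096, useAsync: true))
    using (var destination = new FileStream(destinationPath, FileMode.Create, FileAccess.Write,
        FileShare.None, 4096, useAsync: true))
    {
        await source.CopyToAsync(destination);
    }
}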
