Encode existing blob instead of uploading from a local folder - C#

I have a C# program that uploads, encodes, and publishes a video from my local folder. I want to change the "filepath" to the URI of a blob in an Azure storage account, i.e. instead of uploading a video and then encoding and publishing it, process an already-uploaded video and encode and publish it. Can anyone kindly suggest a method to implement this?
namespace OnDemandEncodingWithMES
{
class Program
{
// Read values from the App.config file.
static void Main(string[] args)
{
try
{
// Create and cache the Media Services credentials in a static class variable.
_cachedCredentials = new MediaServicesCredentials(_mediaServicesAccountName,_mediaServicesAccountKey);
Console.WriteLine("Upload a file.\n");
IAsset inputAsset =
UploadFile(Path.Combine(_mediaFiles, @"video.mp4"), AssetCreationOptions.None);
Console.WriteLine("Generate thumbnails and get URLs.\n");
IAsset thumbnailAsset = GenerateThumbnail(inputAsset, AssetCreationOptions.None);
PublishAssetGetURLs(thumbnailAsset, false, ".bmp");
Console.WriteLine("Encode to audio and get an on demand URL.\n");
IAsset audioOnly = EncodeToAudioOnly(inputAsset, AssetCreationOptions.None);
PublishAssetGetURLs(audioOnly);
Console.WriteLine("Encode to adaptive bitraite MP4s and get on demand URLs.\n");
IAsset encodedAsset =
EncodeToAdaptiveBitrateMP4s(inputAsset, AssetCreationOptions.StorageEncrypted);
PublishAssetGetURLs(encodedAsset);
}
catch (Exception exception)
{
Console.Error.WriteLine(exception.Message);
}
}
}
}

Check https://azure.microsoft.com/en-us/documentation/articles/media-services-copying-existing-blob/. The example there shows how to create an Azure Media Services asset from a blob that already exists in an Azure storage account.
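A condensed, hedged sketch of that article's pattern (Media Services v2 .NET SDK plus WindowsAzure.Storage assumed; _context, _storageConnectionString, and the method name are placeholders for your CloudMediaContext, the storage connection string, and whatever you call the helper):

static public async Task<IAsset> CreateAssetFromExistingBlob(Uri sourceBlobSasUri, string fileName)
{
    // Create an empty asset; Media Services provisions a blob container for it.
    IAsset asset = _context.Assets.Create("assetFrom-" + fileName, AssetCreationOptions.None);
    IAssetFile assetFile = asset.AssetFiles.Create(fileName);

    // Reference the asset's destination container (its name is the first
    // path segment of the asset URI).
    var account = CloudStorageAccount.Parse(_storageConnectionString);
    var destContainer = account.CreateCloudBlobClient()
        .GetContainerReference(asset.Uri.Segments[1].TrimEnd('/'));
    var destBlob = destContainer.GetBlockBlobReference(fileName);

    // Server-side copy from the existing blob, readable through its SAS URI.
    await destBlob.StartCopyAsync(new CloudBlockBlob(sourceBlobSasUri));
    do
    {
        await Task.Delay(TimeSpan.FromSeconds(2));
        await destBlob.FetchAttributesAsync();
    } while (destBlob.CopyState.Status == CopyStatus.Pending);

    // Register the file with Media Services so the asset is usable in jobs.
    assetFile.ContentFileSize = destBlob.Properties.Length;
    assetFile.Update();
    return asset;
}

The returned asset can then go straight into the existing encode/publish methods in place of the uploaded inputAsset.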


How to connect with google cloud vision API in c# with JSON file

I have enabled the Cloud Vision API and downloaded the JSON key file from Google Cloud.
How do I connect the JSON file to my C# Windows application and call the Google Cloud Vision API to read text from an image?
I believe you should use something similar to the documentation:
Quickstart: Using client libraries
using Google.Cloud.Vision.V1;
using System;
namespace GoogleCloudSamples
{
public class QuickStart
{
public static void Main(string[] args)
{
// jsonPath is the path to the downloaded key.json file; pointing the
// GOOGLE_APPLICATION_CREDENTIALS environment variable at it lets the
// client library pick up the credentials automatically.
var jsonPath = "key.json";
Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", jsonPath);
// Instantiates a client
var client = ImageAnnotatorClient.Create();
// Load the image file into memory
var image = Image.FromFile("wakeupcat.jpg");
// Performs label detection on the image file
var response = client.DetectLabels(image);
foreach (var annotation in response)
{
if (annotation.Description != null)
Console.WriteLine(annotation.Description);
}
}
}
}
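Since the goal here is reading text rather than labels, note that the same client also exposes text detection. A minimal variant of the loop above (same client and key-file setup assumed; the file name is a placeholder):

// DetectText runs OCR-style text detection; DetectDocumentText is the
// variant for dense document text.
var textImage = Image.FromFile("receipt.jpg");
var textResponse = client.DetectText(textImage);
foreach (var annotation in textResponse)
{
    if (annotation.Description != null)
        Console.WriteLine(annotation.Description);
}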

Google Firebase and Unity (C#): Unable to download png from bucket

Specs
Unity editor version: 2018.2.8f1
Firebase Unity SDK version: 5.5.0
Additional SDKs: SimpleFirebaseUnity
Developing on: Mac
Export Platform: Android
Issue
I'm having trouble setting up a system to download pictures from storage. I'm not an expert in databases, but I wanted to give it a try, just to learn how it's done.
I found Firebase very useful for storing metadata in the Realtime Database, and easy to approach even for an entry-level programmer like me.
The problem is that I'm trying to download a .png file from a folder in storage, but I can't tell whether the file is actually downloaded or just lost in the process. I don't get any errors in the console, but when I open the folder in which the files should be, it's empty.
Code
private SimpleFirebaseUnity.Firebase firebaseDatabase;
private FirebaseQueue firebaseQueue;
private FirebaseStorage firebaseStorage;
private StorageReference m_storage_ref;
// Setup reference to database and storage
void SetupReferences()
{
// Get a reference to the database service, using SimpleFirebase plugin
firebaseDatabase = SimpleFirebaseUnity.Firebase.CreateNew(FIREBASE_LINK, FIREBASE_SECRET);
// Get a reference to the storage service, using the default Firebase App
firebaseStorage = FirebaseStorage.DefaultInstance;
// Create a storage reference from our storage service
m_storage_ref = firebaseStorage.GetReferenceFromUrl(STORAGE_LINK);
// Create a queue, using SimpleFirebase
firebaseQueue = new FirebaseQueue(true, 3, 1f);
}
// ...
IEnumerator DownloadImage(string address, string fileName)
{
var local_path = Application.persistentDataPath + THUMBNAILS_PATH;
var content_ref = m_storage_ref.Child(THUMBNAILS_PATH + fileName + ".png");
content_ref.GetFileAsync(local_path).ContinueWith(task => {
if (!task.IsFaulted && !task.IsCanceled)
{
Debug.Log("File downloaded.");
}
});
yield return null;
}
There can be many reasons why this is not working for you, including:
security rules are not set up properly
paths to the files are not correct
you are testing it on the wrong platform (Firebase does not work well in the editor)
your device is blocking the connection
etc.
In order to get error messages you need to log them:
IEnumerator DownloadImage(string address, string fileName)
{
// GetFileAsync expects a full local file path, including the file name
var local_path = Application.persistentDataPath + THUMBNAILS_PATH + fileName + ".png";
var content_ref = m_storage_ref.Child(THUMBNAILS_PATH + fileName + ".png");
content_ref.GetFileAsync(local_path).ContinueWith(task => {
if (!task.IsFaulted && !task.IsCanceled)
{
Debug.Log("File downloaded.");
}
else
{
Debug.Log(task.Exception.ToString());
}
});
yield return null;
}
Keep in mind that testing it in the editor may not work.
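One more hedged suggestion: Firebase tasks complete on a background thread, so if the callback ever touches Unity objects, consider the SDK's ContinueWithOnMainThread extension (in the Firebase.Extensions namespace, if your SDK version includes it):

using Firebase.Extensions; // ships with the Firebase Unity SDK

// Variant of the call above: the callback is marshalled back to Unity's
// main thread, so Debug.Log and any UI updates are safe inside it.
content_ref.GetFileAsync(local_path).ContinueWithOnMainThread(task =>
{
    if (task.IsFaulted || task.IsCanceled)
        Debug.LogError("Download failed: " + task.Exception);
    else
        Debug.Log("File downloaded to: " + local_path);
});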

Upload to Azure Media Services / blob storage with SasLocator not showing uploaded file

So I want to upload videos from a client desktop application to Azure Media Services (which of course uses Azure Storage).
I am trying to do a combination of:
this old documentation: 3 - Uploading Video into Microsoft Azure Media Services
and this relative new documentation: Upload multiple files with Media Services .NET SDK.
The first one shows a perfect example of my scenario, and the second one illustrates how to use BlobTransferClient to upload multiple files and show a "progress" indicator.
The problem: it does seem to upload, and I don't get any errors after uploading, yet nothing shows up in the Azure portal / storage account.
It seems to upload because the task takes a long time, Task Manager shows Wi-Fi upload activity, and Azure Storage shows that (successful) requests are being made.
So, serverside, I create a SasLocator for a temporary time:
public async Task<VideoUploadModel> GetSasLocator(string filename)
{
var assetName = filename + DateTime.UtcNow;
IAsset asset = await _context.Assets.CreateAsync(assetName, AssetCreationOptions.None, CancellationToken.None);
IAccessPolicy accessPolicy = _context.AccessPolicies.Create(assetName, TimeSpan.FromMinutes(10),
AccessPermissions.Write);
var locator = _context.Locators.CreateLocator(LocatorType.Sas, asset, accessPolicy);
var blobUri = new UriBuilder(locator.Path);
blobUri.Path += "/" + filename;
var model = new VideoUploadModel()
{
Filename = filename,
AssetName = assetName,
SasLocator = blobUri.Uri.AbsoluteUri,
AssetId = asset.Id
};
return model;
}
And client-side, I try to upload:
public async Task UploadVideoFileToBlobStorage(string[] files, string sasLocator, CancellationToken cancellationToken)
{
var blobUri = new Uri(sasLocator);
var sasCredentials = new StorageCredentials(blobUri.Query);
//var blob = new CloudBlockBlob(new Uri(blobUri.GetComponents(UriComponents.SchemeAndServer | UriComponents.Path, UriFormat.UriEscaped)), sasCredentials);
var blobClient = new CloudBlobClient(new Uri(blobUri.GetComponents(UriComponents.SchemeAndServer | UriComponents.Path, UriFormat.UriEscaped)), sasCredentials);
var blobTransferClient = new BlobTransferClient(TimeSpan.FromMinutes(1))
{
NumberOfConcurrentTransfers = 2,
ParallelTransferThreadCount = 2
};
//register events
blobTransferClient.TransferProgressChanged += BlobTransferClient_TransferProgressChanged;
//files
var uploadTasks = new List<Task>();
foreach (var filePath in files)
{
await blobTransferClient.UploadBlob(blobUri, filePath, new FileEncryption(), cancellationToken, blobClient, new NoRetry());
}
//StorageFile storageFile = null;
//if (string.IsNullOrEmpty(file.FutureAccessToken))
//{
// storageFile = await StorageFile.GetFileFromPathAsync(file.Path).AsTask(cancellationToken);
//}
//else
//{
// storageFile = await StorageApplicationPermissions.FutureAccessList.GetFileAsync(file.FutureAccessToken).AsTask(cancellationToken);
//}
//cancellationToken.ThrowIfCancellationRequested();
//await blob.UploadFromFileAsync(storageFile);
}
I know I'm probably not doing it correctly with the naming of assets, and I should use the progress indicator instead of plain await, but of course I first want this to work before finishing it.
I configured Azure Media Services to "Connect to Media Services API with service principal", where I created a new Azure AD app and generated keys for it, as in this documentation page. I'm not really sure how this exactly works; I'm a little inexperienced with Azure AD and Azure AD apps (guidance?).
(Screenshots omitted: uploading in progress; the asset is created but contains no files; the storage account shows no files either; storage metrics do show successful requests.)
The reason I can't exactly follow the Upload multiple files with Media Services .NET SDK documentation is that it uses the _context (a Microsoft.WindowsAzure.MediaServices.Client.CloudMediaContext). I can use that _context server-side but not client-side, because it requires the tenant domain, REST API endpoint, client ID, and client secret.
I guess uploading via a SAS locator is the correct way(?).
UPDATE 1
When uploading using CloudBlockBlob, it does upload, and it is shown in my storage account within an asset, yet when I go to Media Services in Azure and click on the particular asset, it doesn't show any files.
So the code for that:
var blob = new CloudBlockBlob(new Uri(blobUri.GetComponents(UriComponents.SchemeAndServer | UriComponents.Path, UriFormat.UriEscaped)), sasCredentials);
//files
var uploadTasks = new List<Task>();
foreach (var filePath in files)
{
await blob.UploadFromFileAsync(filePath, CancellationToken.None);
}
I've also tried to upload an asset manually within Azure, i.e. clicking "Upload" in the asset menu and then encoding it. This all works fine.
UPDATE 2:
Digging deeper, I came up with the following (not yet production-proof) way to make it work for now:
1. Get a shared access signature directly from storage and upload to that:
public static async Task<string> GetMediaSasLocator(string filename)
{
CloudBlobContainer cont = await GetMediaContainerAsync();
SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy()
{
SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(60),
Permissions = SharedAccessBlobPermissions.Write,
SharedAccessStartTime = DateTimeOffset.UtcNow.AddMinutes(-5)
};
await cont.FetchAttributesAsync();
return cont.Uri.AbsoluteUri + "/" + filename + cont.GetSharedAccessSignature(policy);
}
With this SAS I can upload just like I showed in UPDATE 1; nothing changed there.
2. Create an Azure Function (which was already planned) that handles the asset creation, uploading the file to the asset, encoding, and publishing.
This was done by following this tutorial: Azure Functions Tools for Visual Studio, and then implementing the code illustrated in Upload multiple files with Media Services .NET SDK.
So this "works", but it is not perfect yet: I still don't have my progress indicator within my client WPF application, and the Azure Function takes quite a long time to complete, because we basically "upload" the file again to an asset after it is already in Azure Storage. I'd rather use a method that copies from one container to an asset's container.
I came to this point because an Azure Function blob trigger needs a fixed container name; since assets create their own containers within a storage account, you can't trigger an Azure Function on those. So to work with Azure Functions it seems I really have to upload to a fixed container name and do the rest from there, as sketched below.
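For reference, the kind of fixed-container trigger binding I mean looks like this (a minimal sketch; the container name "uploads" and the function/parameter names are placeholders):

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ProcessUploadedVideo
{
    // The container name in a BlobTrigger must be a fixed string; only
    // {name} is dynamic. That is why per-asset containers, whose names are
    // generated at runtime, cannot be used as a trigger source.
    [FunctionName("ProcessUploadedVideo")]
    public static void Run(
        [BlobTrigger("uploads/{name}")] Stream videoBlob,
        string name,
        ILogger log)
    {
        log.LogInformation($"New upload: {name} ({videoBlob.Length} bytes)");
        // Create the Media Services asset and submit the encode job here.
    }
}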
The question still remains: why does uploading a video file to Azure Storage via the BlobTransferClient not work? And if it does work, how do I trigger an Azure Function based on multiple containers? A 'path' like asset-{name}/{name}.avi would be preferred.
Eventually it turned out that I needed to specify the base URL in the UploadBlob method, i.e. without the filename itself (which is in the SasLocator URL), only the container name.
Once I fixed that, I also noticed it didn't upload to the filename I had provided in the SasLocator I generated server-side (which includes a customerID prefix). I had to use one of the other method overloads to get the correct filename.
public async Task UploadVideoFilesToBlobStorage(List<VideoUploadModel> videos, CancellationToken cancellationToken)
{
var blobTransferClient = new BlobTransferClient();
//register events
blobTransferClient.TransferProgressChanged += BlobTransferClient_TransferProgressChanged;
//files
_videoCount = _videoCountLeft = videos.Count;
foreach (var video in videos)
{
var blobUri = new Uri(video.SasLocator);
//create the sasCredentials
var sasCredentials = new StorageCredentials(blobUri.Query);
//get the URL without sasCredentials, so only path and filename.
var blobUriBaseFile = new Uri(blobUri.GetComponents(UriComponents.SchemeAndServer | UriComponents.Path,
UriFormat.UriEscaped));
//get the URL without the filename (needed for BlobTransferClient; seems like an issue to me)
var blobUriBase = new Uri(blobUriBaseFile.AbsoluteUri.Replace("/"+video.Filename, ""));
var blobClient = new CloudBlobClient(blobUriBaseFile, sasCredentials);
//upload using a stream; the other overload of UploadBlob forces the online filename to equal the local filename
using (FileStream fs = new FileStream(video.FilePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
await blobTransferClient.UploadBlob(blobUriBase, video.Filename, fs, null, cancellationToken, blobClient,
new NoRetry(), "video/x-msvideo");
}
_videoCountLeft -= 1;
}
blobTransferClient.TransferProgressChanged -= BlobTransferClient_TransferProgressChanged;
}
private void BlobTransferClient_TransferProgressChanged(object sender, BlobTransferProgressChangedEventArgs e)
{
Console.WriteLine("progress, seconds remaining:" + e.TimeRemaining.Seconds);
double bytesTransfered = e.BytesTransferred;
double bytesTotal = e.TotalBytesToTransfer;
double thisProcent = bytesTransfered / bytesTotal;
double procent = thisProcent;
//divide by video amount
int videosUploaded = _videoCount - _videoCountLeft;
if (_videoCountLeft > 0)
{
procent = (thisProcent + videosUploaded) / _videoCount;
}
procent = procent * 100;//to real %
UploadProgressChangedEvent?.Invoke((int)procent, videosUploaded, _videoCount);
}
Actually, Microsoft.WindowsAzure.MediaServices.Client.BlobTransferClient should be able to do concurrent uploads: there is no method for uploading multiple files, yet it has NumberOfConcurrentTransfers and ParallelTransferThreadCount properties. I'm not sure how to use them; a guess is sketched below.
I didn't check whether this now works with assets as well, because I now upload every file to one single container and later use an Azure Function to process it into an asset, mainly because I can't trigger an Azure Function on a dynamic container name (every asset creates its own container).
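An untested guess at using those properties: start one UploadBlob task per file and await them together, letting the client throttle actual parallelism. The sketch reuses the names and the UploadBlob overload from the method above:

// Untested: fire one UploadBlob task per video and await them all;
// NumberOfConcurrentTransfers / ParallelTransferThreadCount should then
// govern how many transfers actually run at once.
var transferClient = new BlobTransferClient
{
    NumberOfConcurrentTransfers = 4,
    ParallelTransferThreadCount = 4
};
var uploadTasks = new List<Task>();
var streams = new List<FileStream>();
foreach (var video in videos)
{
    var blobUri = new Uri(video.SasLocator);
    var sasCredentials = new StorageCredentials(blobUri.Query);
    var blobUriBaseFile = new Uri(blobUri.GetComponents(
        UriComponents.SchemeAndServer | UriComponents.Path, UriFormat.UriEscaped));
    var blobUriBase = new Uri(blobUriBaseFile.AbsoluteUri.Replace("/" + video.Filename, ""));
    var blobClient = new CloudBlobClient(blobUriBaseFile, sasCredentials);
    var fs = new FileStream(video.FilePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
    streams.Add(fs);
    uploadTasks.Add(transferClient.UploadBlob(blobUriBase, video.Filename, fs, null,
        CancellationToken.None, blobClient, new NoRetry(), "video/x-msvideo"));
}
await Task.WhenAll(uploadTasks);
streams.ForEach(s => s.Dispose()); // keep streams open until all uploads finish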

Publish video on Media Services with ProgressiveDownload streaming format

I need to implement an application that uploads an .mp4 video to Azure Media Services. The video should be published in the ProgressiveDownload streaming format and should be encrypted at rest.
Studying the Media Services documentation, I tried to implement a console application.
static void Main(string[] args)
{
try
{
var tokenCredentials = new AzureAdTokenCredentials(_AADTenantDomain, AzureEnvironments.AzureCloudEnvironment);
var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
_context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
// Add calls to methods defined in this section.
// Make sure to update the file name and path to where you have your media file.
IAsset inputAsset =
UploadFile(_videoPath, AssetCreationOptions.StorageEncrypted);
IAsset encodedAsset =
EncodeToAdaptiveBitrateMP4s(inputAsset, AssetCreationOptions.StorageEncrypted);
PublishAssetGetURLs(encodedAsset);
}
catch (Exception exception)
{
// Parse the XML error message in the Media Services response and create a new
// exception with its content.
exception = MediaServicesExceptionParser.Parse(exception);
Console.Error.WriteLine(exception.Message);
}
finally
{
Console.ReadLine();
}
}
static public IAsset EncodeToAdaptiveBitrateMP4s(IAsset asset, AssetCreationOptions options)
{
// Prepare a job with a single task to transcode the specified asset
// into a multi-bitrate asset.
IJob job = _context.Jobs.CreateWithSingleTask(
"Media Encoder Standard",
"Adaptive Streaming",
asset,
"Adaptive Bitrate MP4",
options);
Console.WriteLine("Submitting transcoding job...");
// Submit the job and wait until it is completed.
job.Submit();
job = job.StartExecutionProgressTask(
j =>
{
Console.WriteLine("Job state: {0}", j.State);
Console.WriteLine("Job progress: {0:0.##}%", j.GetOverallProgress());
},
CancellationToken.None).Result;
Console.WriteLine("Transcoding job finished.");
IAsset outputAsset = job.OutputMediaAssets[0];
return outputAsset;
}
static public void PublishAssetGetURLs(IAsset asset)
{
// Publish the output asset by creating an Origin locator for adaptive streaming,
// and a SAS locator for progressive download.
IAssetDeliveryPolicy policy =
_context.AssetDeliveryPolicies.Create("Clear Policy",
AssetDeliveryPolicyType.NoDynamicEncryption,
AssetDeliveryProtocol.ProgressiveDownload | AssetDeliveryProtocol.HLS | AssetDeliveryProtocol.SmoothStreaming | AssetDeliveryProtocol.Dash,
null);
asset.DeliveryPolicies.Add(policy);
_context.Locators.Create(
LocatorType.OnDemandOrigin,
asset,
AccessPermissions.Read,
TimeSpan.FromDays(30));
_context.Locators.Create(
LocatorType.Sas,
asset,
AccessPermissions.Read,
TimeSpan.FromDays(30));
IEnumerable<IAssetFile> mp4AssetFiles = asset
.AssetFiles
.ToList()
.Where(af => af.Name.EndsWith(".mp4", StringComparison.OrdinalIgnoreCase));
// Get the Smooth Streaming, HLS and MPEG-DASH URLs for adaptive streaming,
// and the Progressive Download URL.
Uri smoothStreamingUri = asset.GetSmoothStreamingUri();
Uri hlsUri = asset.GetHlsUri();
Uri mpegDashUri = asset.GetMpegDashUri();
// Get the URls for progressive download for each MP4 file that was generated as a result
// of encoding.
List<Uri> mp4ProgressiveDownloadUris = mp4AssetFiles.Select(af => af.GetSasUri()).ToList();
}
This code stopped working when I added the part that manages encryption at rest, more precisely when I:
replaced UploadFile(_videoPath, AssetCreationOptions.None); with UploadFile(_videoPath, AssetCreationOptions.StorageEncrypted);
replaced EncodeToAdaptiveBitrateMP4s(inputAsset, AssetCreationOptions.None); with EncodeToAdaptiveBitrateMP4s(inputAsset, AssetCreationOptions.StorageEncrypted);
added the following code to the PublishAssetGetURLs method:
IAssetDeliveryPolicy policy =
_context.AssetDeliveryPolicies.Create("Clear Policy",
AssetDeliveryPolicyType.NoDynamicEncryption,
AssetDeliveryProtocol.ProgressiveDownload | AssetDeliveryProtocol.HLS | AssetDeliveryProtocol.SmoothStreaming | AssetDeliveryProtocol.Dash,
null);
asset.DeliveryPolicies.Add(policy);
The problem is that the video is correctly uploaded, but when I try to play the video right inside Azure Portal I get a generic 0x0 error.
I'd avoid progressive download if you need the content to be protected, unless you are building an offline protected download solution. If that is the case, we just added some new articles to our documentation that show how to do PlayReady, Widevine, and FairPlay offline DRM.
Check out the content protection section of our docs for those articles.

Azure file sync in Xamarin

I'm new to Xamarin development. I am using an Azure storage container ("bucket") for file uploading in an Android mobile app.
For test purposes I built a console application that uploads files to Azure blob storage successfully, without any interruption.
But when I try to create an access token from the mobile app, it gets stuck and gives me a timeout exception. If I create the token from the console application and use that token for file uploading, it gives me another exception, like "placeholder not found".
As I expected, I need a mobile service for token generation. If you have any ideas, please share your opinions with me; that would be very helpful.
I'm also uploading the code that I'm using for the Android mobile app.
[Activity(Label = "WedAndroidApp", MainLauncher = true, Icon = "#drawable/icon")]
public class MainActivity : Activity
{
int count = 1;
//string sas = "https://supplypark.blob.core.windows.net/transaction-images?sv=2015-04-05&sr=c&sig=iJ8CZOi%2BktarlmrbZVHK7rYLdMOnKCeBjuPqjrrkGnM%3D&se=2016-06-09T14%3A21%3A49Z&sp=rwdl";
string sas_token = "https://supplypark.blob.core.windows.net/transaction-images?sv=2015-04-05&sr=c&sig=AeWe8rghAlKz77Xh%2BUM6S46AuUQzAaD2djqhaW9wdN8%3D&se=2016-06-09T14%3A21%3A49Z&sp=rwdl";
protected override void OnCreate(Bundle bundle)
{
base.OnCreate(bundle);
// Set our view from the "main" layout resource
SetContentView(Resource.Layout.Main);
// Get our button from the layout resource,
// and attach an event to it
Button button = FindViewById<Button>(Resource.Id.MyButton);
//button.Click += delegate { button.Text = string.Format("{0} clicks!", count++); };
button.Click += async delegate {
button.Text = string.Format("{0} clicks!", count++);
await UseContainerSAS(sas_token);
};
}
static async Task UseContainerSAS(string sas)
{
//Try performing container operations with the SAS provided.
//Return a reference to the container using the SAS URI.
CloudBlobContainer container = new CloudBlobContainer(new Uri(sas));
string date = DateTime.Now.ToString();
try
{
//Write operation: write a new blob to the container.
CloudBlockBlob blob = container.GetBlockBlobReference("tdi" + date + ".txt");
string blobContent = "This blob was created with a shared access signature granting write permissions to the container. ";
MemoryStream msWrite = new
MemoryStream(Encoding.UTF8.GetBytes(blobContent));
msWrite.Position = 0;
using (msWrite)
{
await blob.UploadFromStreamAsync(msWrite);
}
Console.WriteLine("Write operation succeeded for SAS " + sas);
Console.WriteLine();
}
catch (Exception e)
{
Console.WriteLine("Write operation failed for SAS " + sas);
Console.WriteLine("Additional error information: " + e.Message);
Console.WriteLine();
}
}
}
If you're already using Azure Mobile Apps, I would recommend that you use the built-in feature that accesses Azure Storage: Connect to Azure Storage in your Xamarin.Forms app
Otherwise, you should write code that creates a SAS token on your server, not your client. The storage master key should never be distributed with your client app, as it would be a big security risk. You should have the key in your server only, and it should send a SAS token to clients to scope access to a particular blob or container.
If you're using a .NET backend Mobile App/Mobile Service, then you should just move your SAS code to a custom API on the server. If you're using Node.js, you can follow this tutorial: Work with shared access signatures in Node.js.
If you don't have a Mobile App backend, then you can use Azure Functions to generate SAS tokens. Here is a sample: https://github.com/lindydonna/GetSasToken-function.
Note that Azure Mobile Services is deprecated, so you should be using Azure Mobile Apps instead.
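For illustration, the server-side piece is small. A hedged sketch of the idea (the connection string, container name, and class/method names are placeholders): scope a short-lived, write-only SAS to a single blob and hand only that URI to the client.

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class SasTokenApi
{
    // Returns a blob URI with an appended SAS that allows writing this one
    // blob for 15 minutes. The storage master key never leaves the server.
    public static string GetUploadSasUri(string connectionString, string blobName)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var container = account.CreateCloudBlobClient().GetContainerReference("uploads");
        var blob = container.GetBlockBlobReference(blobName);
        var policy = new SharedAccessBlobPolicy
        {
            SharedAccessStartTime = DateTimeOffset.UtcNow.AddMinutes(-5),
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(15),
            Permissions = SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Create
        };
        return blob.Uri + blob.GetSharedAccessSignature(policy);
    }
}

The Xamarin client then constructs a CloudBlockBlob from that URI (as in UseContainerSAS above) and uploads, with access limited to that single blob and time window.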
