I am new to working with Azure via C# and I am looking to upload an existing file to an existing Azure Storage container. I am currently able to create a new local file and then upload it to a container that I create, as seen in the CreateContainerAndUploadFile method. However, when I try to upload an existing file to an existing container, it does not work. When using the CreateContainerAndUploadFile method I see the new container and .txt file appear, but the UploadFile method runs all the way through with no errors and I do not see the file appear in the container.
If anyone knows why the method runs through but does not upload, I would greatly appreciate the help.
public class Launcher
{
public static void Main(string[] args)
{
BlobManager manager = new BlobManager();
manager.UploadFile("DemoText.txt", "democontainer");
manager.CreateContainerAndUploadFile("demo");
}
}
public class BlobManager
{
private BlobServiceClient blobServiceClient;
public BlobManager()
{
try
{
// Get the Azure Storage connection string.
string connectionString = ConfigurationManager.AppSettings.Get("StorageConnectionString");
blobServiceClient = new BlobServiceClient(connectionString);
}
catch (Exception ExceptionObj)
{
throw ExceptionObj;
}
}
public void UploadFile(string fileName, string container)
{
Console.WriteLine("Entering Upload to existing blob");
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(container);
containerClient.CreateIfNotExistsAsync();
Console.WriteLine("Existing blob obtained");
string localPath = "./";
string localFilePath = Path.Combine(localPath, fileName);
Console.WriteLine("Local file path set");
CloudBlockBlob blockBlob;
//// Create a block blob
BlobClient blobClient = containerClient.GetBlobClient(fileName);
Console.WriteLine($"BlobClient established with filename {fileName}");
//// Open the file and upload its data
using FileStream uploadFileStream = File.OpenRead(localFilePath);
blobClient.UploadAsync(uploadFileStream, true);
uploadFileStream.Close();
Console.WriteLine("Uploaded File to existing container");
}
public void CreateContainerAndUploadFile(string containerName)
{
//Create a BlobServiceClient object which will be used to create a container client
BlobServiceClient blobServiceClient =
new BlobServiceClient(ConfigurationManager.AppSettings.Get("StorageConnectionString"));
Console.WriteLine("Pre-container");
// Create the container and return a container client object
BlobContainerClient containerClient = blobServiceClient.CreateBlobContainer(containerName + Guid.NewGuid());
Console.WriteLine("Post-container");
// Create a local file in the ./data/ directory for uploading and downloading
string localPath = "./";
string fileName = "demo" + Guid.NewGuid().ToString() + ".txt";
string localFilePath = Path.Combine(localPath, fileName);
// Write text to the file
File.WriteAllTextAsync(localFilePath, "Hello, World!");
// Get a reference to the blob
BlobClient blobClient = containerClient.GetBlobClient(fileName);
Console.WriteLine($"Uploading to Blob storage as blob: {fileName}");
//// Open the file and upload its data
using FileStream uploadFileStream = File.OpenRead(localFilePath);
blobClient.UploadAsync(uploadFileStream, true);
uploadFileStream.Close();
}
}
The reason you're not able to upload the blob is because you're calling an async function but not waiting for it to complete.
Considering your calling method is sync and not async, I would recommend changing the following line of code:
blobClient.UploadAsync(uploadFileStream, true);
to
blobClient.Upload(uploadFileStream, true);
The other alternative would be to convert the wrapper methods to async methods and await the completion of all async calls.
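For reference, here is a minimal sketch of what the fully awaited version of the UploadFile method from the question could look like (not tested; the Main method would then become static async Task Main and call await manager.UploadFileAsync(...)):

public async Task UploadFileAsync(string fileName, string container)
{
    Console.WriteLine("Entering upload to existing container");
    BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(container);
    // Await the container creation instead of firing and forgetting it
    await containerClient.CreateIfNotExistsAsync();

    string localFilePath = Path.Combine("./", fileName);
    BlobClient blobClient = containerClient.GetBlobClient(fileName);
    Console.WriteLine($"BlobClient established with filename {fileName}");

    // Await the upload so the method does not return before the request completes
    using FileStream uploadFileStream = File.OpenRead(localFilePath);
    await blobClient.UploadAsync(uploadFileStream, overwrite: true);
    Console.WriteLine("Uploaded file to existing container");
}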
This is my code and I get an error that the filename/directory is incorrect. The file does exist in that directory. I want to upload this file to my Azure Blob Storage.
private static void UploadFileToBlobStorage()
{
var localFilePath = "C:\\Users\\LK\\source\repos\dsd-ica-perf\\src\\et.ure.ica.Perf\\PerfTest.cs";
BlobServiceClient blobServiceClient = new BlobServiceClient(storageConnStr);
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient("perfTest");
Console.WriteLine("Uploading to Blob storage as blob:\n\t {0}\n", blobServiceClient.Uri);
BlobClient blobClient = containerClient.GetBlobClient("PerfTest");
using FileStream uploadingFileStream = File.OpenRead(localFilePath);
blobClient.Upload(uploadingFileStream);
uploadingFileStream.Close();
}
When I tried with the code below, I was able to upload a file successfully to Azure Blob Storage:
using Azure.Storage.Blobs;
using System;
using System.IO;
namespace blobstorage
{
class Program
{
private static void Main()
{
var storageConnStr = "<connection string of the storage account>";
var localFilePath = @"C:\root"; // the path should go all the way down to the file you want to upload
BlobServiceClient blobServiceClient = new BlobServiceClient(storageConnStr);
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient("container1"); // container names must be all lowercase
Console.WriteLine("Uploading to Blob storage as blob:\n\t {0}\n", blobServiceClient.Uri);
BlobClient blobClient = containerClient.GetBlobClient("folder1");
using FileStream uploadingFileStream = File.OpenRead(localFilePath);
blobClient.Upload(uploadingFileStream);
uploadingFileStream.Close();
}
}
}
Screenshots (the Azure portal storage account before the upload, and the program output) confirmed that the file was uploaded successfully to Azure Blob Storage.
As @gauravmantri said in the comments, the blob container name must be all lowercase, and the error occurs because the folder path is wrong. Please check the code above; the path should be written that way.
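For illustration, a minimal sketch of those two fixes applied to the code from the question (a verbatim string for the path and an all-lowercase container name; not tested):

var localFilePath = @"C:\Users\LK\source\repos\dsd-ica-perf\src\et.ure.ica.Perf\PerfTest.cs"; // verbatim string avoids escaping problems
BlobServiceClient blobServiceClient = new BlobServiceClient(storageConnStr);
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient("perftest"); // all lowercase
BlobClient blobClient = containerClient.GetBlobClient("PerfTest");
using FileStream uploadingFileStream = File.OpenRead(localFilePath);
blobClient.Upload(uploadingFileStream);
uploadingFileStream.Close();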
Reference:
C#: System.IO.DirectoryNotFoundException: 'Could not find a part of the path' - Stack Overflow (answer by ingvar)
I was given an address like the following to upload a file to Azure File Share using a Shared Access Signature (SAS):
https://myaccount.file.core.windows.net/xxxxx?sv=2020-08-04&ss=bfqt&srt=so&sp=rwdlacupitfx&se=2022-12-30T18:11:32Z&st=2021-12-12T10:11:32Z&spr=https&sig=signature
This is my test program
using Azure.Storage.Files.Shares;
public async Task TestAsync()
{
var sas = @"https://myaccount.file.core.windows.net/xxxxx?sv=2020-08-04&ss=bfqt&srt=so&sp=rwdlacupitfx&se=2022-12-30T18:11:32Z&st=2021-12-12T10:11:32Z&spr=https&sig=signature";
var localfile = @"C:\Test\local.txt";
var client = new ShareFileClient(new Uri(sas));
using (var stream = new FileStream(localfile, FileMode.Open, FileAccess.Read))
{
var response = await client.UploadAsync(stream);
}
}
The program throws a RequestFailedException with the following error:
Status: 400 (The requested URI does not represent any resource on the server.)
ErrorCode: InvalidUri
Additional Information:
UriPath: /xxxxx
My question is what this error mean, is it anything wrong in my test code?
According to this document, uploading with SAS is not possible this way, as the operation requires authentication. Even when we have the code correct, it still throws the exception "Authentication information is not given in the correct format. Check the value of the Authorization header." An alternative is to use a connection string, which works fine when we try reproducing this on our end.
Here is the code
static async Task Main(string[] args) {
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("<YOUR CONNECTION STRING>");
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
CloudFileShare share = fileClient.GetShareReference("<YOUR FILE SHARE>");
if (await share.ExistsAsync()) {
CloudFileDirectory rootDir = share.GetRootDirectoryReference();
CloudFile file = rootDir.GetFileReference("sample.txt");
byte[] data = File.ReadAllBytes(@"sample.txt");
Stream fileStream = new MemoryStream(data);
await file.UploadFromStreamAsync(fileStream);
}
}
Here is another workaround, using the storage account name and access key, that you can try:
using System.IO;
using Azure.Storage;
using Azure.Storage.Files.Shares;
public class SampleClass {
public void Upload() {
/// Get connection information from your Azure storage account
string accountName = "{Get the account name from the Azure portal site.}";
string accessKey = "{Get the access key from the Azure portal site.}";
Uri serverurl = new Uri(@"{Get the URL from the Azure portal.}");
/// Upload destination (Azure side)
string azureDirectoryPath = @"{Specify the destination directory on the Azure side}";
string azureFileName = "{Specify the file name to save as}";
/// Upload source (local side)
string localDirectoryPath = @"{Specify the local directory of the file to upload}";
string localFileName = "{Specify the name of the local file to upload}";
// Allow TLS 1.1/1.2; without this, SSL (https) communication fails with an error.
System.Net.ServicePointManager.SecurityProtocol = System.Net.SecurityProtocolType.Tls11 | System.Net.SecurityProtocolType.Tls12;
try {
//Preparing to connect to Azure: Setting connection information
StorageSharedKeyCredential credential = new StorageSharedKeyCredential(accountName, accessKey);
//Connect to Azure
ShareClient share = new ShareClient(serverurl, credential);
ShareDirectoryClient directory = share.GetDirectoryClient(azureDirectoryPath);
//Create the destination folder on the Azure side if it does not exist.
directory.CreateIfNotExists();
//Create a file instance at the upload destination (Azure side).
ShareFileClient file = directory.GetFileClient(azureFileName);
//Delete any existing file with the same name.
file.DeleteIfExists();
//Open the local file to be uploaded. Opening it as a FileStream makes it easy to get the binary data.
FileStream stream = File.OpenRead(Path.Combine(localDirectoryPath, localFileName));
//Write the binary data into the destination file on the Azure side.
file.Create(stream.Length);
file.UploadRange(new Azure.HttpRange(0, stream.Length), stream);
//Release the local file.
stream.Dispose();
} catch (Exception ex) {
System.Console.WriteLine(ex.Message);
return;
}
}
}
REFERENCE:
How to programmatically upload files to Azure Storage (File Share)
I am trying to upload a new append blob file to a container every time a message comes in from a service bus. I do not want to append to the blob that is already there. I want to create a whole new append blob and add it at the end.
Is this possible?
I was looking at this article but couldn't quite understand what they meant when they got to the content part: https://azuresdkdocs.blob.core.windows.net/$web/javascript/azure-storage-blob/12.1.1/classes/appendblobclient.html#appendblock
Here is the code that I have so far:
public static async void StoreToBlob(Services service)
{
//Serealize Object
var sender = JsonConvert.SerializeObject(service);
// Create a BlobServiceClient object which will be used to create a container client
BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
// Create the container and return a container client object
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
// Create the container if it doesn't already exist.
await containerClient.CreateIfNotExistsAsync();
//Reference to blob
AppendBlobClient appendBlobClient = containerClient.GetAppendBlobClient("services" + Guid.NewGuid().ToString() + ".json");
// Create the blob.
appendBlobClient.Create();
await appendBlobClient.AppendBlock(sender, sender.Length); //here is where I am having an issue
}
Can you try something like the following (not tested code):
byte[] blockContent = Encoding.UTF8.GetBytes(sender);
using (var ms = new MemoryStream(blockContent))
{
appendBlobClient.AppendBlock(ms, blockContent.Length);
}
Essentially we're converting the string to a byte array, creating a stream out of it, and then uploading that stream.
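Putting it together, a sketch of how the adjusted method from the question could look (switched to async Task and AppendBlockAsync so every call is awaited; needs System.IO and System.Text usings; not tested):

public static async Task StoreToBlobAsync(Services service)
{
    // Serialize the object to JSON
    var sender = JsonConvert.SerializeObject(service);

    BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
    BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
    await containerClient.CreateIfNotExistsAsync();

    // A brand new append blob per message, named with a GUID
    AppendBlobClient appendBlobClient = containerClient.GetAppendBlobClient("services" + Guid.NewGuid().ToString() + ".json");
    await appendBlobClient.CreateAsync();

    // AppendBlock expects a stream, so wrap the UTF-8 bytes of the JSON string in a MemoryStream
    byte[] blockContent = Encoding.UTF8.GetBytes(sender);
    using (var ms = new MemoryStream(blockContent))
    {
        await appendBlobClient.AppendBlockAsync(ms);
    }
}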
I downloaded a file from an FTP server using an Azure Function and saved it to the target path that I get from this code:
var target = Path.Combine(context.FunctionAppDirectory, "File.CSV");
This will be somewhere in "File Shares", which we can see in "Microsoft Azure Storage Explorer".
Now my question is: how do I copy this file from the File Share to a Blob container, or directly save it to Blob Storage that Azure SQL has access to?
private static void AzureStorageAccountBlob()
{
string filename = "mytestfile.txt";
string fileContents = "some content";
StorageCredentials creds = new StorageCredentials("mystorageaccount2020", "XXXXX");
CloudStorageAccount acct = new CloudStorageAccount(creds, true);
CloudBlobClient client = acct.CreateCloudBlobClient();
CloudBlobContainer container = client.GetContainerReference("myfirstcontainer");
container.CreateIfNotExists();
ICloudBlob blob = container.GetBlockBlobReference(filename);
using (MemoryStream stream = new MemoryStream(Encoding.UTF8.GetBytes(fileContents)))
{
blob.UploadFromStream(stream);
}
}
In my example I have assumed that the content has already been read from the file. Also, one important thing: the storage account must already exist.
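If the content should instead come from the file that the Azure Function downloaded, a sketch along these lines could replace the hard-coded string (reusing the target path from the question and the same older SDK as above; not tested):

var target = Path.Combine(context.FunctionAppDirectory, "File.CSV");

StorageCredentials creds = new StorageCredentials("mystorageaccount2020", "XXXXX");
CloudStorageAccount acct = new CloudStorageAccount(creds, true);
CloudBlobClient client = acct.CreateCloudBlobClient();
CloudBlobContainer container = client.GetContainerReference("myfirstcontainer");
container.CreateIfNotExists();

// Upload the downloaded CSV directly from disk
CloudBlockBlob blob = container.GetBlockBlobReference("File.CSV");
using (FileStream stream = File.OpenRead(target))
{
    blob.UploadFromStream(stream);
}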
Use the extension method below to upload to Azure:
/// <summary>
/// </summary>
/// <param name="file"></param>
/// <param name="fileName"></param>
/// <param name="connectionString"></param>
/// <param name="containerName"></param>
/// <param name="blobContentType"></param>
/// <returns></returns>
public static async Task<string> AzureUpload(this Stream file, string fileName, string connectionString, string containerName, string blobContentType = null)
{
CloudBlobClient blobClient = CloudStorageAccount.Parse(connectionString).CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference(containerName);
if (await container.CreateIfNotExistsAsync())
{
// Comment this code below if you don't want your files
// to be publicly available. By default, a container is private.
// You can see more on how
// to set different container permissions at:
// https://learn.microsoft.com/en-us/azure/storage/blobs/storage-manage-access-to-resources
await container.SetPermissionsAsync(new BlobContainerPermissions() { PublicAccess = BlobContainerPublicAccessType.Blob });
}
CloudBlockBlob blockBlob = container.GetBlockBlobReference(fileName);
await blockBlob.UploadFromStreamAsync(file);
blobContentType = blobContentType.HasValue() ? blobContentType : getBlobContentType(fileName); // HasValue() and getBlobContentType() are the author's own helper methods, not shown here
if (blobContentType.HasValue())
{
blockBlob.Properties.ContentType = blobContentType;
await blockBlob.SetPropertiesAsync();
}
return blockBlob.Uri.AbsoluteUri;
}
Do something like this:
var target = Path.Combine(context.FunctionAppDirectory, "File.CSV");
FileStream fileStream = new FileStream(target, FileMode.Open, FileAccess.Read);
string azureUriForUploadedCSV = await fileStream.AzureUpload(
"File.CSV",
"StorageConnectionString",
"csv-folder",
"application/csv");
Then save azureUriForUploadedCSV into your database...
We can use CloudBlockBlob.StartCopy(CloudFile). You may refer to the code below:
using System;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;
using Microsoft.Azure.Storage.File;
namespace ConsoleApp3
{
class Program
{
static void Main(string[] args)
{
// Parse the connection string for the storage account.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("DefaultEndpointsProtocol=*************");
// Create a CloudFileClient object for credentialed access to File storage.
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
// Get a reference to the file share you created previously.
CloudFileShare share = fileClient.GetShareReference("hurytest");
// Get a reference to the file("test.csv") which I have uploaded to the file share("hurytest")
CloudFile sourceFile = share.GetRootDirectoryReference().GetFileReference("test.csv");
// Get a reference to the blob to which the file will be copied.(I have created a container with name of "targetcontainer")
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("targetcontainer");
//container.CreateIfNotExists();
CloudBlockBlob destBlob = container.GetBlockBlobReference("test.csv");
// Create a SAS for the file that's valid for 24 hours.
// Note that when you are copying a file to a blob, or a blob to a file, you must use a SAS
// to authenticate access to the source object, even if you are copying within the same
// storage account.
string fileSas = sourceFile.GetSharedAccessSignature(new SharedAccessFilePolicy()
{
// Only read permissions are required for the source file.
Permissions = SharedAccessFilePermissions.Read,
SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24)
});
// Construct the URI to the source file, including the SAS token.
Uri fileSasUri = new Uri(sourceFile.StorageUri.PrimaryUri.ToString() + fileSas);
// Copy the file to the blob.
destBlob.StartCopy(fileSasUri);
}
}
}
Hope this is helpful for your problem.
I have been following this example from GitHub to transfer files to Azure Blob Storage. The program creates a file in the local MyDocuments folder to upload to a blob container. After the file is created it uploads it to the container. Is it possible to create JSON objects in memory and send them to Azure Blob Storage without writing that file to the hard drive first?
namespace storage_blobs_dotnet_quickstart
{
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;
using System;
using System.IO;
using System.Threading.Tasks;
public static class Program
{
public static void Main()
{
Console.WriteLine("Azure Blob Storage - .NET quickstart sample");
Console.WriteLine();
ProcessAsync().GetAwaiter().GetResult();
Console.WriteLine("Press any key to exit the sample application.");
Console.ReadLine();
}
private static async Task ProcessAsync()
{
CloudStorageAccount storageAccount = null;
CloudBlobContainer cloudBlobContainer = null;
string sourceFile = null;
string destinationFile = null;
// Retrieve the connection string for use with the application. The storage connection string is stored
// in an environment variable on the machine running the application called storageconnectionstring.
// If the environment variable is created after the application is launched in a console or with Visual
// Studio, the shell needs to be closed and reloaded to take the environment variable into account.
string storageConnectionString = Environment.GetEnvironmentVariable("storageconnectionstring");
// Check whether the connection string can be parsed.
if (CloudStorageAccount.TryParse(storageConnectionString, out storageAccount))
{
try
{
// Create the CloudBlobClient that represents the Blob storage endpoint for the storage account.
CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
// Create a container called 'quickstartblobs' and append a GUID value to it to make the name unique.
cloudBlobContainer = cloudBlobClient.GetContainerReference("quickstartblobs" + Guid.NewGuid().ToString());
await cloudBlobContainer.CreateAsync();
Console.WriteLine("Created container '{0}'", cloudBlobContainer.Name);
Console.WriteLine();
// Set the permissions so the blobs are public.
BlobContainerPermissions permissions = new BlobContainerPermissions
{
PublicAccess = BlobContainerPublicAccessType.Blob
};
await cloudBlobContainer.SetPermissionsAsync(permissions);
// Create a file in your local MyDocuments folder to upload to a blob.
string localPath = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
string localFileName = "QuickStart_" + Guid.NewGuid().ToString() + ".txt";
sourceFile = Path.Combine(localPath, localFileName);
// Write text to the file.
File.WriteAllText(sourceFile, "Hello, World!");
Console.WriteLine("Temp file = {0}", sourceFile);
Console.WriteLine("Uploading to Blob storage as blob '{0}'", localFileName);
Console.WriteLine();
// Get a reference to the blob address, then upload the file to the blob.
// Use the value of localFileName for the blob name.
CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(localFileName);
await cloudBlockBlob.UploadFromFileAsync(sourceFile);
// List the blobs in the container.
Console.WriteLine("Listing blobs in container.");
BlobContinuationToken blobContinuationToken = null;
do
{
var resultSegment = await cloudBlobContainer.ListBlobsSegmentedAsync(null, blobContinuationToken);
// Get the value of the continuation token returned by the listing call.
blobContinuationToken = resultSegment.ContinuationToken;
foreach (IListBlobItem item in resultSegment.Results)
{
Console.WriteLine(item.Uri);
}
} while (blobContinuationToken != null); // Loop while the continuation token is not null.
Console.WriteLine();
// Download the blob to a local file, using the reference created earlier.
// Append the string "_DOWNLOADED" before the .txt extension so that you can see both files in MyDocuments.
destinationFile = sourceFile.Replace(".txt", "_DOWNLOADED.txt");
Console.WriteLine("Downloading blob to {0}", destinationFile);
Console.WriteLine();
await cloudBlockBlob.DownloadToFileAsync(destinationFile, FileMode.Create);
}
catch (StorageException ex)
{
Console.WriteLine("Error returned from the service: {0}", ex.Message);
}
finally
{
Console.WriteLine("Press any key to delete the sample files and example container.");
Console.ReadLine();
// Clean up resources. This includes the container and the two temp files.
Console.WriteLine("Deleting the container and any blobs it contains");
if (cloudBlobContainer != null)
{
await cloudBlobContainer.DeleteIfExistsAsync();
}
Console.WriteLine("Deleting the local source file and local downloaded files");
Console.WriteLine();
File.Delete(sourceFile);
File.Delete(destinationFile);
}
}
else
{
Console.WriteLine(
"A connection string has not been defined in the system environment variables. " +
"Add a environment variable named 'storageconnectionstring' with your storage " +
"connection string as a value.");
}
}
}
}
There are some built-in methods for uploading to blob storage without storing the data on the local drive first.
For your case, you can consider the following built-in methods:
1. For uploading a stream (for samples, see here):
UploadFromStream / UploadFromStreamAsync
2. For uploading a string / text (for samples, see here):
UploadText / UploadTextAsync
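For the JSON case in the question, a minimal sketch with UploadTextAsync could look like this (assuming Newtonsoft.Json for serialization and the cloudBlobContainer reference from the quickstart code; myObject stands in for whatever you want to store; not tested):

// Serialize the object in memory and upload the resulting JSON string directly, no temp file needed
string json = JsonConvert.SerializeObject(myObject);
CloudBlockBlob jsonBlob = cloudBlobContainer.GetBlockBlobReference("data_" + Guid.NewGuid().ToString() + ".json");
await jsonBlob.UploadTextAsync(json);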
Yes, you can upload from streams, byte arrays, text, and files.
UploadFromFileAsync
UploadFromByteArrayAsync
UploadFromStreamAsync
UploadFromTextAsync
CloudBlockBlob Class
I would recommend you create a blob repository interface and implement your Azure Blob Storage repository class against that interface.
This way, if you need to change your backing blob repository, you will be able to do so with minimal impact on the rest of your code.
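As a rough illustration (the interface and class names here are made up, not taken from any particular library), such a repository might look like:

using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public interface IBlobRepository
{
    Task<string> UploadAsync(Stream content, string blobName);
    Task<Stream> DownloadAsync(string blobName);
    Task DeleteAsync(string blobName);
}

// An Azure-backed implementation wraps BlobContainerClient; swapping to another
// store only requires a new implementation of IBlobRepository.
public class AzureBlobRepository : IBlobRepository
{
    private readonly BlobContainerClient _container;

    public AzureBlobRepository(BlobContainerClient container) => _container = container;

    public async Task<string> UploadAsync(Stream content, string blobName)
    {
        BlobClient blob = _container.GetBlobClient(blobName);
        await blob.UploadAsync(content, overwrite: true);
        return blob.Uri.AbsoluteUri;
    }

    public async Task<Stream> DownloadAsync(string blobName) =>
        await _container.GetBlobClient(blobName).OpenReadAsync();

    public Task DeleteAsync(string blobName) =>
        _container.GetBlobClient(blobName).DeleteIfExistsAsync();
}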