First of all, I would like to ask: am I trying to write to the correct path or not? I would like to have a log file on the device, and currently it is created and maintained this way:
string documentsPath = Path.Combine(Android.OS.Environment.ExternalStorageDirectory.AbsolutePath, Android.OS.Environment.DirectoryDocuments);
string filePath = Path.Combine(documentsPath, "Log.txt");
I mean, is this the right place, or is it better to store it somewhere else? For example, in internal storage?
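For comparison, this is the kind of app-specific path I could use instead; just a sketch, and the fallback to FilesDir is my own assumption:

// App-specific storage: no runtime storage permission required,
// but the file is removed together with the app on uninstall.
string appDocs = Android.App.Application.Context.GetExternalFilesDir(Android.OS.Environment.DirectoryDocuments)?.AbsolutePath
                 ?? Android.App.Application.Context.FilesDir.AbsolutePath; // fall back to internal storage
string appLogPath = Path.Combine(appDocs, "Log.txt");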
Second question: if the above is fine and can stay as it is, why does the following do nothing on app startup? I mean, it does not request permissions for external storage management:
if (ContextCompat.CheckSelfPermission(this, Manifest.Permission.ManageExternalStorage) != (int)Permission.Granted ||
    ContextCompat.CheckSelfPermission(this, Manifest.Permission.AccessFineLocation) != (int)Permission.Granted)
{
    this.RequestPermissions(new string[] { Manifest.Permission.ManageExternalStorage, Manifest.Permission.AccessFineLocation }, 0);
}
and whether the following should be added in order to request external storage management permissions:
if (!Android.OS.Environment.IsExternalStorageManager)
{
    Intent intent = new Intent();
    intent.SetAction(Android.Provider.Settings.ActionManageAppAllFilesAccessPermission);
    Android.Net.Uri uri = Android.Net.Uri.FromParts("package", this.PackageName, null);
    intent.SetData(uri);
    this.StartActivity(intent);
}
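And for completeness, this is how I would detect that the user has come back from that settings screen; a sketch, where StartActivityForResult and the request code are my own additions:

const int ManageStorageRequestCode = 1001; // arbitrary request code

// In OnCreate (or wherever the check runs): open the settings page and
// wait for the user to come back.
if (!Android.OS.Environment.IsExternalStorageManager)
{
    Intent intent = new Intent(Android.Provider.Settings.ActionManageAppAllFilesAccessPermission);
    intent.SetData(Android.Net.Uri.FromParts("package", this.PackageName, null));
    this.StartActivityForResult(intent, ManageStorageRequestCode);
}

protected override void OnActivityResult(int requestCode, Result resultCode, Intent data)
{
    base.OnActivityResult(requestCode, resultCode, data);
    if (requestCode == ManageStorageRequestCode)
    {
        // The settings page reports no meaningful result; re-check the flag instead.
        bool granted = Android.OS.Environment.IsExternalStorageManager;
    }
}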
I have a problem with creating a folder with the NuGet package PCLStorage; I cannot create the folder.
Nothing appears inside my files folder. I'm using my device, not an emulator; it runs Android version 8.0.
public async Task WriteDataAsync(string filename, string data)
{
    string folderName = "SignatureSotrage";
    IFolder folder = FileSystem.Current.LocalStorage;
    folder = await folder.CreateFolderAsync(folderName, CreationCollisionOption.ReplaceExisting);
    // Write the data into a file inside the new folder.
    IFile file = await folder.CreateFileAsync(filename, CreationCollisionOption.ReplaceExisting);
    await file.WriteAllTextAsync(data);
}
Here is the code where I call this function:
public ICommand AddCustomerCommand => new Command(async () =>
{
    Signature = await SignatureFromStream();
    // Signature should be != null
    var customer = new Customer()
    {
        FullName = this.FullName,
        IsAccepted = this.IsAccepted,
        Birthday = this.Birthday
    };

    if (Signature != null)
    {
        customer.Image = this.Signature.ToString();
    }
    else
    {
        // "Error", "Not all fields were filled in correctly"
        await Application.Current.MainPage.DisplayAlert("Błąd", "Nie wszystkie pola zostały poprawnie wypełnione", "OK");
        return;
    }

    await DependencyService.Get<IFileHelper>().WriteDataAsync("signature.txt", "this is file");
    //_context.Customers.Add(customer);
    //_context.SaveChanges();
});
Did you debug your code and check whether the file/folder is actually getting created, or whether it enters the catch block and continues with the normal flow?
Check for user permissions every time, for both read and write permissions, before doing any operations on storage. You can add the NuGet package Plugin.Permissions; it handles everything for you and adds both permissions to the manifest.
For checking user permissions, always call CheckForStoragePermissions() before performing any operations on storage. (*DialogService is a custom dialog box.)
if (!await CheckForStoragePermissions())
{
    DialogService.Alert("Invalid Permission", "User declined permission for this action");
    return;
}
private async Task<bool> CheckForStoragePermissions()
{
    PermissionStatus storagePermissionStatus = await CrossPermissions.Current.CheckPermissionStatusAsync(Permission.Storage);
    if (storagePermissionStatus != PermissionStatus.Granted)
    {
        Dictionary<Permission, PermissionStatus> storagePermissionResult = await CrossPermissions.Current.RequestPermissionsAsync(Permission.Storage);
        if (storagePermissionResult.ContainsKey(Permission.Storage))
        {
            storagePermissionStatus = storagePermissionResult[Permission.Storage];
        }
    }
    return storagePermissionStatus == PermissionStatus.Granted;
}
I tested the sample code on GitHub: https://github.com/dsplaisted/PCLStorage
Based on my test, the folder path looks like this:
/data/user/0/PCLStorage.Test.Android/files/
It is internal storage. You cannot see the files there without root permission. https://learn.microsoft.com/en-us/xamarin/android/platform/files/#working-with-internal-storage
If you want to see the files in internal storage, you can use the adb tool; please refer to the approach in this link: How to write the username in a local txt file when login success and check on file for next login?
I have a FileSystemWatcher for a local directory and it's working fine. I want to implement the same for FTP. Is there any way I can achieve it? I have checked many solutions, but it's not clear.
Logic: I want to get files from FTP that are newer than some timestamp.
Problem faced: getting all files from FTP and then filtering the result hurts performance (I used FtpWebRequest).
Is there any right way to do this? (WinSCP is on hold; I can't use it now.)
FileSystemWatcher oFsWatcher = new FileSystemWatcher();
OFSWatchers.Add(oFsWatcher);
oFsWatcher.Path = sFilePath;
oFsWatcher.Filter = string.IsNullOrWhiteSpace(sFileFilter) ? "*.*" : sFileFilter;
oFsWatcher.NotifyFilter = NotifyFilters.FileName;
oFsWatcher.IncludeSubdirectories = bIncludeSubdirectories;
oFsWatcher.Created += new FileSystemEventHandler(OFsWatcher_Created);
// Enable events only after the handler is attached.
oFsWatcher.EnableRaisingEvents = true;
You cannot use the FileSystemWatcher or any other way, because the FTP protocol does not have any API to notify a client about changes in the remote directory.
All you can do is to periodically iterate the remote tree and find changes.
It's actually rather easy to implement if you use an FTP client library that supports recursive listing of a remote tree. Unfortunately, the built-in .NET FTP client, FtpWebRequest, does not. But with the WinSCP .NET assembly, for example, you can use the Session.EnumerateRemoteFiles method.
See the article Watching for changes in SFTP/FTP server:
// Setup session options
SessionOptions sessionOptions = new SessionOptions
{
    Protocol = Protocol.Ftp,
    HostName = "example.com",
    UserName = "user",
    Password = "password",
};

using (Session session = new Session())
{
    // Connect
    session.Open(sessionOptions);

    List<string> prevFiles = null;

    while (true)
    {
        // Collect file list
        List<string> files =
            session.EnumerateRemoteFiles(
                "/remote/path", "*.*", EnumerationOptions.AllDirectories)
            .Select(fileInfo => fileInfo.FullName)
            .ToList();

        if (prevFiles == null)
        {
            // In the first round, just print number of files found
            Console.WriteLine("Found {0} files", files.Count);
        }
        else
        {
            // Then look for differences against the previous list
            IEnumerable<string> added = files.Except(prevFiles);
            if (added.Any())
            {
                Console.WriteLine("Added files:");
                foreach (string path in added)
                {
                    Console.WriteLine(path);
                }
            }

            IEnumerable<string> removed = prevFiles.Except(files);
            if (removed.Any())
            {
                Console.WriteLine("Removed files:");
                foreach (string path in removed)
                {
                    Console.WriteLine(path);
                }
            }
        }

        prevFiles = files;

        Console.WriteLine("Sleeping 10s...");
        Thread.Sleep(10000);
    }
}
(I'm the author of WinSCP)
Though, if you actually just want to download the changes, it's way easier: simply use the Session.SynchronizeDirectories in the loop.
while (true)
{
    SynchronizationResult result =
        session.SynchronizeDirectories(
            SynchronizationMode.Local, "/remote/path", @"C:\local\path", true);
    result.Check();

    // You can inspect result.Downloads for a list of updated files

    Console.WriteLine("Sleeping 10s...");
    Thread.Sleep(10000);
}
This will update even modified files, not only new files.
Though, using the WinSCP .NET assembly from a web application might be problematic. If you do not want to use a third-party library, you have to live with the limitations of FtpWebRequest. For an example of how to recursively list a remote directory tree with FtpWebRequest, see my answer to List names of files in FTP directory and its subdirectories.
You have edited your question to say that you have performance problems with the solutions I've suggested, though you have already asked a new question that covers this:
Get FTP file details based on datetime in C#
Unless you have access to the OS that hosts the service, it will be a bit harder.
FileSystemWatcher places a hook on the filesystem, which notifies your application as soon as something happens.
The FTP command specification has no such hook. Besides that, the communication is always initiated by the client.
Therefore, to implement such logic, you should periodically perform an NLST to list the FTP directory contents and track the changes yourself (or track modification timestamps, perhaps via MDTM).
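A minimal poll-and-diff sketch with the built-in FtpWebRequest could look like the following (host, path, and credentials are placeholders; note that NLST here lists a single directory, so recursion would have to be layered on top):

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net;
using System.Threading;

class FtpPoller
{
    // List the names in one remote directory via NLST.
    static List<string> ListNames(string url)
    {
        var request = (FtpWebRequest)WebRequest.Create(url);
        request.Method = WebRequestMethods.Ftp.ListDirectory; // NLST
        request.Credentials = new NetworkCredential("user", "password"); // placeholder

        using (var response = (FtpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            var names = new List<string>();
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                names.Add(line);
            }
            return names;
        }
    }

    static void Main()
    {
        List<string> previous = null;
        while (true)
        {
            var current = ListNames("ftp://example.com/remote/path/"); // placeholder URL

            if (previous != null)
            {
                // Diff against the previous round, like the WinSCP example above
                foreach (var name in current.Except(previous))
                    Console.WriteLine("Added: " + name);
                foreach (var name in previous.Except(current))
                    Console.WriteLine("Removed: " + name);
            }

            previous = current;
            Thread.Sleep(10000); // poll every 10 seconds
        }
    }
}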
More info:
FTP return codes
FTP
I have found an alternative solution for my requirement.
Explanation:
I am downloading the files from FTP (read permission required) with the same folder structure.
So every time the job/service runs, I can check whether the same file (full path) already exists in the physical path. If it does not exist, it can be considered a new file, and I can take some action on it and download it as well.
It's just an alternative solution.
Code Changes:
private static void GetFiles()
{
    using (FtpClient conn = new FtpClient())
    {
        string ftpPath = "ftp://myftp/";
        string downloadFileName = @"C:\temp\FTPTest\";

        conn.Host = ftpPath;
        //conn.Credentials = new NetworkCredential("ftptest", "ftptest");
        conn.Connect();

        // Get all directories
        foreach (FtpListItem item in conn.GetListing(conn.GetWorkingDirectory(),
            FtpListOption.Modify | FtpListOption.Recursive))
        {
            // if this is a file
            if (item.Type == FtpFileSystemObjectType.File)
            {
                string localFilePath = downloadFileName + item.FullName;

                // Only newly created files will be downloaded.
                if (!File.Exists(localFilePath))
                {
                    conn.DownloadFile(localFilePath, item.FullName);
                    // Do any action here.
                    Console.WriteLine(item.FullName);
                }
            }
        }
    }
}
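Since the listing above already requests modification timestamps via FtpListOption.Modify, the original goal (only files newer than some timestamp) could be handled with a filter as well; a small sketch, where the cutoff value is only an example:

// Sketch: only consider files modified after a given cutoff,
// using the Modified timestamp that FtpListOption.Modify requests.
DateTime cutoff = DateTime.UtcNow.AddHours(-1); // example cutoff
if (item.Type == FtpFileSystemObjectType.File && item.Modified > cutoff)
{
    // download / process only recent files here
}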
Specs
Unity editor version: 2018.2.8f1
Firebase Unity SDK version: 5.5.0
Additional SDKs: SimpleFirebaseUnity
Developing on: Mac
Export Platform: Android
Issue
I'm having trouble setting up a system to download pictures from storage. I'm not an expert in databases, but I wanted to give it a try, just to learn how it is done.
I found Firebase very useful for storing metadata in the real-time database, and easy to approach even for an entry-level programmer like me.
The problem is that I'm trying to download a .png file from a folder in storage, but I can't find out whether the file is actually downloaded or whether it's just lost in the process. I don't get any errors in the console, but when I open the folder in which the files should be, it's empty.
Code
private SimpleFirebaseUnity.Firebase firebaseDatabase;
private FirebaseQueue firebaseQueue;
private FirebaseStorage firebaseStorage;
private StorageReference m_storage_ref;

// Set up references to database and storage
void SetupReferences()
{
    // Get a reference to the database service, using the SimpleFirebase plugin
    firebaseDatabase = SimpleFirebaseUnity.Firebase.CreateNew(FIREBASE_LINK, FIREBASE_SECRET);

    // Get a reference to the storage service, using the default Firebase App
    firebaseStorage = FirebaseStorage.DefaultInstance;

    // Create a storage reference from our storage service
    m_storage_ref = firebaseStorage.GetReferenceFromUrl(STORAGE_LINK);

    // Create a queue, using SimpleFirebase
    firebaseQueue = new FirebaseQueue(true, 3, 1f);
}
// ...

IEnumerator DownloadImage(string address, string fileName)
{
    var local_path = Application.persistentDataPath + THUMBNAILS_PATH;
    var content_ref = m_storage_ref.Child(THUMBNAILS_PATH + fileName + ".png");

    content_ref.GetFileAsync(local_path).ContinueWith(task => {
        if (!task.IsFaulted && !task.IsCanceled)
        {
            Debug.Log("File downloaded.");
        }
    });

    yield return null;
}
There can be many reasons why this is not working for you, including:
security rules are not set up properly
paths to files are not correct
you are testing it on the wrong platform (Firebase does not work well in the editor)
your device is blocking the connection
etc...
In order to get error messages you need to log them:
IEnumerator DownloadImage(string address, string fileName)
{
    var local_path = Application.persistentDataPath + THUMBNAILS_PATH;
    var content_ref = m_storage_ref.Child(THUMBNAILS_PATH + fileName + ".png");

    content_ref.GetFileAsync(local_path).ContinueWith(task => {
        if (!task.IsFaulted && !task.IsCanceled)
        {
            Debug.Log("File downloaded.");
        }
        else
        {
            Debug.Log(task.Exception.ToString());
        }
    });

    yield return null;
}
Keep in mind that testing it in the editor may not work.
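One concrete instance of "paths to files are not correct" worth spelling out: GetFileAsync expects a full destination file path, while the snippet passes only a directory. A sketch of what I would try (the Directory.CreateDirectory call is my addition):

// Build a full local *file* path for GetFileAsync and make sure
// the destination folder exists first.
var local_dir = Application.persistentDataPath + THUMBNAILS_PATH;
System.IO.Directory.CreateDirectory(local_dir);
var local_file = System.IO.Path.Combine(local_dir, fileName + ".png");

content_ref.GetFileAsync(local_file).ContinueWith(task => {
    // same logging as above
});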
So I want to upload videos from a client desktop application to Azure Media Services (which of course uses Azure Storage).
I am trying to do a combination of:
this old documentation: 3 - Uploading Video into Microsoft Azure Media Services
and this relatively new documentation: Upload multiple files with Media Services .NET SDK.
The first one shows a perfect example of my scenario, while the second one illustrates how to use BlobTransferClient to upload multiple files and have a "progress" indicator.
The problem: it does seem to upload, and I don't get any error after uploading, yet nothing shows up in the Azure portal / storage account.
It seems to upload because the task takes long, Task Manager shows WiFi upload progress, and Azure Storage shows that (successful) requests are being made.
So, server-side, I create a SasLocator for a temporary time:
public async Task<VideoUploadModel> GetSasLocator(string filename)
{
    var assetName = filename + DateTime.UtcNow;

    IAsset asset = await _context.Assets.CreateAsync(assetName, AssetCreationOptions.None, CancellationToken.None);

    IAccessPolicy accessPolicy = _context.AccessPolicies.Create(assetName, TimeSpan.FromMinutes(10),
        AccessPermissions.Write);

    var locator = _context.Locators.CreateLocator(LocatorType.Sas, asset, accessPolicy);

    var blobUri = new UriBuilder(locator.Path);
    blobUri.Path += "/" + filename;

    var model = new VideoUploadModel()
    {
        Filename = filename,
        AssetName = assetName,
        SasLocator = blobUri.Uri.AbsoluteUri,
        AssetId = asset.Id
    };

    return model;
}
And client-side, I try to upload:
public async Task UploadVideoFileToBlobStorage(string[] files, string sasLocator, CancellationToken cancellationToken)
{
    var blobUri = new Uri(sasLocator);
    var sasCredentials = new StorageCredentials(blobUri.Query);

    //var blob = new CloudBlockBlob(new Uri(blobUri.GetComponents(UriComponents.SchemeAndServer | UriComponents.Path, UriFormat.UriEscaped)), sasCredentials);
    var blobClient = new CloudBlobClient(new Uri(blobUri.GetComponents(UriComponents.SchemeAndServer | UriComponents.Path, UriFormat.UriEscaped)), sasCredentials);

    var blobTransferClient = new BlobTransferClient(TimeSpan.FromMinutes(1))
    {
        NumberOfConcurrentTransfers = 2,
        ParallelTransferThreadCount = 2
    };

    //register events
    blobTransferClient.TransferProgressChanged += BlobTransferClient_TransferProgressChanged;

    //files
    var uploadTasks = new List<Task>();
    foreach (var filePath in files)
    {
        await blobTransferClient.UploadBlob(blobUri, filePath, new FileEncryption(), cancellationToken, blobClient, new NoRetry());
    }

    //StorageFile storageFile = null;
    //if (string.IsNullOrEmpty(file.FutureAccessToken))
    //{
    //    storageFile = await StorageFile.GetFileFromPathAsync(file.Path).AsTask(cancellationToken);
    //}
    //else
    //{
    //    storageFile = await StorageApplicationPermissions.FutureAccessList.GetFileAsync(file.FutureAccessToken).AsTask(cancellationToken);
    //}
    //cancellationToken.ThrowIfCancellationRequested();
    //await blob.UploadFromFileAsync(storageFile);
}
I know I am probably not doing it correctly with the naming of assets and with using the progress indicator instead of await, but of course I first want this to work before finishing it.
I configured Azure Media Services to "Connect to Media Services API with service principal", where I created a new Azure AD app and generated keys for it, as in this documentation page. I am not really sure how exactly this works; I am a little inexperienced with Azure AD and Azure AD apps (guidance?).
(Screenshots: uploading in progress; the asset is created but contains no files; storage doesn't show any files either; storage does show successful upload requests.)
The reason I can't exactly follow the Upload multiple files with Media Services .NET SDK documentation is that it uses the _context (which is Microsoft.WindowsAzure.MediaServices.Client.CloudMediaContext); that _context I can use server-side but not client-side, because it requires the tenant domain, the REST API endpoint, the client ID, and the client secret.
I guess uploading via the SasLocator is the correct way (?).
UPDATE 1
When uploading using CloudBlockBlob, it does upload, and it is shown in my storage account within an asset, yet when I go to Media Services within Azure and click on the particular asset, it doesn't show any files.
So the code for that:
var blob = new CloudBlockBlob(new Uri(blobUri.GetComponents(UriComponents.SchemeAndServer | UriComponents.Path, UriFormat.UriEscaped)), sasCredentials);

//files
var uploadTasks = new List<Task>();
foreach (var filePath in files)
{
    await blob.UploadFromFileAsync(filePath, CancellationToken.None);
}
I've also tried to upload an asset manually within Azure, so clicking on "Upload" in the asset menu and then encoding it. This all works fine.
UPDATE 2:
Digging deeper, I came up with the following way to make it work for now (not yet production-proof):
1. Get a shared access signature directly from storage and upload to there:
public static async Task<string> GetMediaSasLocator(string filename)
{
    CloudBlobContainer cont = await GetMediaContainerAsync();

    SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy()
    {
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(60),
        Permissions = SharedAccessBlobPermissions.Write,
        SharedAccessStartTime = DateTimeOffset.UtcNow.AddMinutes(-5)
    };

    await cont.FetchAttributesAsync();

    return cont.Uri.AbsoluteUri + "/" + filename + cont.GetSharedAccessSignature(policy);
}
With this SaS I can upload just like I showed in UPDATE 1; nothing changed there.
2. Create an Azure Function (which was already planned) that handles the asset creation, the file upload into the asset, the encoding, and the publishing.
This has been done by following this tutorial: Azure Functions Tools for Visual Studio, and then implementing the code that is illustrated in Upload multiple files with Media Services .NET SDK.
So this "works", but it is not perfect yet: I still don't have my progress indicator within my client WPF application, and the Azure Function takes quite a long time to complete because we basically "upload" the file again to an asset after it is already in Azure Storage. I would rather use a method that copies from the upload container to an asset container.
I came to this point because Azure Functions need a fixed, given container name; since assets create their own containers within a storage account, you can't trigger an Azure Function on those. So to work with Azure Functions, it seems I really have to upload to a fixed container name and do the rest from there, as sketched below.
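For reference, a fixed-container blob trigger looks roughly like this sketch; the "uploads" container name and the function name are mine, not from my real setup:

// Hypothetical sketch of an Azure Function triggered by a fixed container.
[FunctionName("ProcessUploadedVideo")]
public static void Run(
    [BlobTrigger("uploads/{name}")] Stream videoBlob,
    string name,
    ILogger log)
{
    log.LogInformation($"New upload detected: {name} ({videoBlob.Length} bytes)");
    // ...create the Media Services asset, copy the blob into it, encode, publish...
}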
The question still remains: why does uploading a video file to Azure Storage via the BlobTransferClient not work? And if it does work, how do I trigger an Azure Function based on multiple containers? A 'path' like asset-{name}/{name}.avi would be preferred.
Eventually it turned out that I needed to specify the base URL in the UploadBlob method, so without the filename itself (which is within the SasLocator URL), only the container name.
Once I fixed that, I also noticed it didn't upload to the filename I had provided in the SasLocator I generated server-side (it includes a customer ID prefix). I had to use one of the other method overloads to get the correct filename.
public async Task UploadVideoFilesToBlobStorage(List<VideoUploadModel> videos, CancellationToken cancellationToken)
{
    var blobTransferClient = new BlobTransferClient();

    //register events
    blobTransferClient.TransferProgressChanged += BlobTransferClient_TransferProgressChanged;

    //files
    _videoCount = _videoCountLeft = videos.Count;
    foreach (var video in videos)
    {
        var blobUri = new Uri(video.SasLocator);

        //create the sasCredentials
        var sasCredentials = new StorageCredentials(blobUri.Query);

        //get the URL without sasCredentials, so only path and filename
        var blobUriBaseFile = new Uri(blobUri.GetComponents(UriComponents.SchemeAndServer | UriComponents.Path,
            UriFormat.UriEscaped));

        //get the URL without the filename (needed for BlobTransferClient; seems to me like an issue)
        var blobUriBase = new Uri(blobUriBaseFile.AbsoluteUri.Replace("/" + video.Filename, ""));

        var blobClient = new CloudBlobClient(blobUriBaseFile, sasCredentials);

        //upload using a stream; the other overload of UploadBlob forces the online filename to match the local filename
        using (FileStream fs = new FileStream(video.FilePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
            await blobTransferClient.UploadBlob(blobUriBase, video.Filename, fs, null, cancellationToken, blobClient,
                new NoRetry(), "video/x-msvideo");
        }

        _videoCountLeft -= 1;
    }

    blobTransferClient.TransferProgressChanged -= BlobTransferClient_TransferProgressChanged;
}
private void BlobTransferClient_TransferProgressChanged(object sender, BlobTransferProgressChangedEventArgs e)
{
    Console.WriteLine("progress, seconds remaining:" + e.TimeRemaining.Seconds);

    double bytesTransfered = e.BytesTransferred;
    double bytesTotal = e.TotalBytesToTransfer;
    double thisProcent = bytesTransfered / bytesTotal;
    double procent = thisProcent;

    //divide by the number of videos
    int videosUploaded = _videoCount - _videoCountLeft;
    if (_videoCountLeft > 0)
    {
        procent = (thisProcent + videosUploaded) / _videoCount;
    }

    procent = procent * 100; //to a real percentage
    UploadProgressChangedEvent?.Invoke((int)procent, videosUploaded, _videoCount);
}
Actually, Microsoft.WindowsAzure.MediaServices.Client.BlobTransferClient should be able to do concurrent uploads, but there is no method for uploading multiple files, even though it has properties for NumberOfConcurrentTransfers and ParallelTransferThreadCount; I am not sure how to use those.
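If UploadBlob is awaited one file at a time, those properties never get a chance to overlap transfers; presumably one could start all uploads first and await them together. A sketch (untested, reusing the overload from the method above; the stream bookkeeping is my own):

// Sketch: start every upload, then await them together, letting
// NumberOfConcurrentTransfers / ParallelTransferThreadCount throttle.
var streams = new List<FileStream>();
var uploadTasks = new List<Task>();
try
{
    foreach (var video in videos)
    {
        var blobUri = new Uri(video.SasLocator);
        var sasCredentials = new StorageCredentials(blobUri.Query);
        var blobUriBaseFile = new Uri(blobUri.GetComponents(
            UriComponents.SchemeAndServer | UriComponents.Path, UriFormat.UriEscaped));
        var blobUriBase = new Uri(blobUriBaseFile.AbsoluteUri.Replace("/" + video.Filename, ""));
        var blobClient = new CloudBlobClient(blobUriBaseFile, sasCredentials);

        var fs = new FileStream(video.FilePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
        streams.Add(fs); // keep the streams open until every upload has finished

        uploadTasks.Add(blobTransferClient.UploadBlob(blobUriBase, video.Filename, fs, null,
            cancellationToken, blobClient, new NoRetry(), "video/x-msvideo"));
    }

    await Task.WhenAll(uploadTasks);
}
finally
{
    foreach (var fs in streams)
    {
        fs.Dispose();
    }
}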
I didn't check whether this now works with assets as well, because I now upload every file to one single container and later use an Azure Function to process it into an asset, mainly because I can't trigger an Azure Function on a dynamic container name (every asset creates its own container).
I am using the Google Drive API v3 with C# .NET. When trying to change the ownership of files other than Docs, Sheets, and Slides (like .zip or even .pdf) from my service account to a 'regular' Drive account (within the same domain), I get an error saying:
Error: Bad Request. User message: "You can't yet change the owner of this item. (We're working on it.)"
I guess this has something to do with the fact that Docs, Sheets, and Slides do not count against the storage quota.
(1) Is there a workaround for this? (Changing the file extension to .doc before uploading causes automatic file conversion, and the file is useless after that.)
(2) Does this also happen on a paid account?
(3) Is the Google team really 'working on it', as the error message states?
UPDATE:
This is the code I am using:
public string UploadFileToDrive(string FilePath, string ParentID)
{
    try
    {
        Google.Apis.Drive.v3.Data.File body = new Google.Apis.Drive.v3.Data.File();
        string fileNameNoPath = System.IO.Path.GetFileName(FilePath);
        body.Name = "NewFile.ASC"; // some file names such as zip are not acceptable by the google drive api
        //body.MimeType = GoogleDriveMimeTypes.GetGenericMimeTypeString();

        if (ParentID != null)
        {
            body.Parents = new List<string>();
            body.Parents.Add(ParentID);
        }

        byte[] byteArray = System.IO.File.ReadAllBytes(FilePath);
        System.IO.MemoryStream Ustream = new System.IO.MemoryStream(byteArray);

        var requestU = _CurrentDriveService.Files.Create(body, Ustream, "");
        requestU.Upload();
        var uploadedFileID = requestU.ResponseBody.Id;

        body.Name = fileNameNoPath;
        //body.MimeType = GoogleDriveMimeTypes.GetGenericMimeTypeString();
        FilesResource.CopyRequest cr = new FilesResource.CopyRequest(_CurrentDriveService, body, uploadedFileID);
        var newFile = cr.Execute();
        var NewFileNameID = newFile.Id;
        DeleteFileFromDrive(uploadedFileID);

        {
            Permission p = new Permission();
            p.Role = "reader";
            p.Type = "anyone";
            PermissionsResource.CreateRequest cc = new PermissionsResource.CreateRequest(_CurrentDriveService, p, NewFileNameID);
            cc.Execute();
        }

        // you can comment out the next block if using Auth client
        {
            // make the main account the owner so the file counts against the main account's quota, not the google service's
            Permission p = new Permission();
            p.Role = "owner";
            p.Type = "user";
            p.EmailAddress = "vizfilesender@gmail.com";
            PermissionsResource.CreateRequest cc = new PermissionsResource.CreateRequest(_CurrentDriveService, p, NewFileNameID);
            cc.TransferOwnership = true; // acknowledge transfer of ownership - must be set to "true" in order for the role to change to "owner"
            cc.Execute();
        }

        return NewFileNameID;
    }
    catch (Exception e)
    {
        System.Diagnostics.Debug.WriteLine(e.Message);
        return "";
    }
}
With this code I can upload all files and change permissions for sharing, but I can't change ownership back to the Google Drive account.
I finally found the answer: I need to impersonate another user.
var initializer = new ServiceAccountCredential.Initializer("blablablabla@blabla.iam.gserviceaccount.com")
{
    Scopes = scope,
    User = "emailToImpersonate@domain"
};

var credential = new ServiceAccountCredential(initializer.FromPrivateKey("-----BEGIN PRIVATE KEY-----\n-----END PRIVATE KEY-----\n"));

var driveService = new DriveService(new BaseClientService.Initializer()
{
    HttpClientInitializer = credential,
    ApplicationName = ApplicationName
});
Also, make sure you give the service account domain-wide delegation, as shown here:
https://developers.google.com/drive/v2/web/delegation
and allow up to 10 minutes for the change to take effect.
This is the workaround I have been searching for.