We have a small program that uploads files to SoftLayer through the Object Storage REST API.
Every day we upload about 38 files, totaling 1.32 GB. The file sizes vary roughly between 650 KB and 600 MB.
One of those files is ~582 MB, and every day our program tries three times to upload it, but it has never succeeded. Each attempt takes about 30-45 minutes.
The message returned by the API is:
The request was aborted: The request was canceled.
Here is my code to upload the files:
// Lists the Backup files of the folder
DirectoryInfo dirInfoBkps = new DirectoryInfo(backupFolder);
FileInfo[] arrFiles = dirInfoBkps.GetFiles(backupExtension);
// Performs the authentication in Softlayer, and obtains the Token and URL of Upload
RestHelper softlayerRestAPI = new RestHelper();
softlayerRestAPI.RestHeaders.Add("X-Auth-User", apiSoftlayerUser);
softlayerRestAPI.RestHeaders.Add("X-Auth-Key", apiSoftlayerToken);
softlayerRestAPI.RestHeaders.Add("X-Account-Meta-Temp-Url-Key", apiSoftlayerMetaTempUrlKey);
Dictionary<string, string> dicRespondeHeaders;
SoftlayerModel softlayerModel =
softlayerRestAPI.CallGetRestMethod<SoftlayerModel>(apiSoftlayerUrl, out dicRespondeHeaders);
// Prepares to Upload Files
apiSoftlayerUrl = softlayerModel.storage.@public; // "public" is a C# keyword, so the property needs the @ escape
apiSoftlayerUrl = apiSoftlayerUrl.Replace("https", "http");
apiSoftlayerXAuthToken = dicRespondeHeaders["X-Storage-Token"];
// Upload each file in the folder
foreach (FileInfo fileInfo in arrFiles)
{
// Creates the Upload URL
string uploadUrl = string.Format("{0}{1}{2}",
apiSoftlayerUrl,
"/Backups_SVN/",
fileInfo.Name);
// Try to make the upload 3-times
int numberOfTries = 0;
Exception lastException = null;
string lastFilename = null;
string mensagem = string.Empty;
while (numberOfTries < 3)
{
try
{
numberOfTries++;
softlayerRestAPI.RestHeaders.Clear();
softlayerRestAPI.RestHeaders.Add("X-Auth-Token", apiSoftlayerXAuthToken);
byte[] arr =
softlayerRestAPI.CallUploadRestMethod(uploadUrl, fileInfo.FullName);
// Upload Successful
break;
}
catch (Exception ex)
{
// Upload failed
lastException = ex;
lastFilename = fileInfo.Name;
Console.WriteLine(ex.Message);
}
}
if (numberOfTries == 3) // All attempts failed
{
// Writes the error log for future reference
}
}
Update
I forgot to include the code for the RestHelper class: https://pastebin.com/hBYjXXJh
@FrankerZ is right: the problem posted here is a duplicate of the other post.
I followed the steps from the other post and increased the Timeout of my WebRequest object, and the problem was solved.
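For anyone hitting the same thing: the RestHelper class is only in the pastebin link above, so here is just a rough sketch of the kind of change involved, assuming a plain HttpWebRequest (System.Net) underneath; the names and the one-hour value are illustrative, not the actual class:
// Hypothetical fragment of a RestHelper-style upload method; only the timeout lines matter
var request = (HttpWebRequest)WebRequest.Create(uploadUrl);
request.Method = "PUT";
// Timeout defaults to 100 seconds; a ~582 MB upload here needs 30-45 minutes
request.Timeout = (int)TimeSpan.FromHours(1).TotalMilliseconds;
// ReadWriteTimeout limits how long writing the request body may take (default is 5 minutes)
request.ReadWriteTimeout = (int)TimeSpan.FromHours(1).TotalMilliseconds;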
Thanks for the help.
Related
I am trying to upload an XML file to an SFTP server using SSH.NET (.NET Core 3.1 / VS 2019 / C#).
The remote folder path for the upload is "/testin/", and it has been confirmed that we have sufficient write permission on this folder.
I am able to connect and authenticate to the SFTP server.
However, when I try to upload a file, SSH.NET returns only "Failure" in the exception message, with no other details.
I am able to download files with no issue.
Checking the summary of the UploadFile method from SSH.NET (below), it specifies that it throws descriptive exceptions for permission problems and so on.
I did some searching on the internet without much success.
// Summary:
// Uploads stream into remote file.
//
// Parameters:
// input:
// Data input stream.
//
// path:
// Remote file path.
//
// canOverride:
// if set to true then existing file will be overwritten.
//
// uploadCallback:
// The upload callback.
//
// Exceptions:
// T:System.ArgumentNullException:
// input is null.
//
// T:System.ArgumentException:
// path is null or contains only whitespace characters.
//
// T:Renci.SshNet.Common.SshConnectionException:
// Client is not connected.
//
// T:Renci.SshNet.Common.SftpPermissionDeniedException:
// Permission to upload the file was denied by the remote host.
// -or-
// A SSH command was denied by the server.
//
// T:Renci.SshNet.Common.SshException:
// A SSH error where System.Exception.Message is the message from the remote host.
//
// T:System.ObjectDisposedException:
// The method was called after the client was disposed.
//
// Remarks:
// Method calls made by this method to input, may under certain conditions result
// in exceptions thrown by the stream.
public void UploadFile(Stream input, string path, bool canOverride, Action<ulong> uploadCallback = null);
Here is my code. Any ideas?
using (var sftp = new SftpClient(Host, TcpPort, Username, Password))
{
try
{
sftp.Connect();
var localFile = Path.Combine(LocalUploadPath + LocalFilename);
// var path = $"{localFile.Replace(#"\", "/")}";
using (var fs = File.OpenRead(localFile))
{
sftp.UploadFile(fs, RemoteUploadRootPath, true);
}
}
catch (Exception ex)
{
sftp.Disconnect();
throw new Exception($"{ex.Message} {ex.InnerException}"
}
finally
{
sftp.Disconnect();
}
}
I found the issue, and it was simple enough: the file name was not included in the remote upload path. :|
This link may be worth sharing for SSH.NET, as it has some good examples:
https://csharp.hotexamples.com/examples/Renci.SshNet.Sftp/SftpUploadAsyncResult/Update/php-sftpuploadasyncresult-update-method-examples.html
Here is the updated code that works fine:
using (var sftp = new SftpClient(Host, TcpPort, Username, Password))
{
try
{
sftp.Connect();
var localFile = Path.Combine(LocalUploadPath + LocalFilename);
var remoteFilePath = Path.Combine("/testin/" + LocalFilename);
using (var fs = File.OpenRead(localFile))
{
sftp.UploadFile(fs, remoteFilePath, true);//, UploadCallBack);
}
}
catch (Exception ex)
{
sftp.Disconnect();
throw new Exception($"{ex.Message} {ex.InnerException}");
}
finally
{
sftp.Disconnect();
}
}
In my case, I had sent the path to the ListDirectory method with a leading forward slash.
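A minimal sketch of that difference, assuming an already-connected SftpClient named sftp and a remote folder called testin:
// The leading-slash form is what I had been sending and what was failing here...
// var entries = sftp.ListDirectory("/testin");
// ...dropping the slash (a path relative to the user's home directory) is the alternative
var entries = sftp.ListDirectory("testin");
foreach (var entry in entries)
    Console.WriteLine(entry.FullName);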
I have the following code that I want to change to use batching. In this code, I first create a copy of a file, and then, using the ID of the new file, I add a permission.
File readOnlyFile = new File();
readOnlyFile.Name = newFileName.Replace(' ', '-') + "_assess";
readOnlyFile.Parents = new List<string> { targetFolderId };
FilesResource.CopyRequest fileCreateRequest = _driveService.Files.Copy(readOnlyFile, fileId);
string readOnlyFileId = fileCreateRequest.Execute().Id;
if (readOnlyFileId != null)
{
newPermission.ExpirationTime = expirationDate;
newPermission.Type = "anyone";
newPermission.Role = "reader";
PermissionsResource.CreateRequest req = _driveService.Permissions.Create(newPermission, readOnlyFileId);
req.SendNotificationEmail = false;
req.Execute();
}
However, I am puzzled when trying to use a batch for this task, since I need the ID of the newly copied file to add the permission. Below is my initial attempt, where I do not know how to proceed after batch.Queue(fileCreateRequest, callback). I can add a new action to the batch to add the permission, but I do not know how to get the ID of the file. Any suggestions? I need to do this for three different files.
var batch = new BatchRequest(_driveService);
BatchRequest.OnResponse<Permission> callback = delegate (
Permission permission,
RequestError error,
int index,
System.Net.Http.HttpResponseMessage message)
{
if (error != null)
{
// Handle error
Console.WriteLine(error.Message);
}
else
{
Console.WriteLine("Permission ID: " + permission.Id);
}
};
Permission newPermission = new Permission();
File readOnlyFile = new File();
readOnlyFile.Name = newFileName.Replace(' ', '-') + "_assess";
readOnlyFile.Parents = new List<string> { targetFolderId };
FilesResource.CopyRequest fileCreateRequest = _driveService.Files.Copy(readOnlyFile, fileId);
batch.Queue(fileCreateRequest, callback);
Copying a file and configuring its permissions are two different operations that cannot be batched together. Your approach is correct; you just have to set up the permissions in a second call.
After copying the files and retrieving the IDs as you already do, you have to make a second call to create the permissions.
There is no way to do it in a single request, because setting up the permissions requires the ID of the file, and that ID only exists after the copy finishes. If you need any more clarification, please don't hesitate to ask.
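To make that concrete, here is a rough sketch of the two-step flow with the Drive .NET client, reusing the names from your snippet; fileIdsToCopy is a stand-in for your three source file IDs, and the code is assumed to run inside an async method. Note that the callback for a copy request receives a File, not a Permission, which is where the new IDs can be collected:
// First batch: queue all the copies and collect the new file IDs in the callbacks
var copyBatch = new BatchRequest(_driveService);
var copiedIds = new List<string>();
foreach (var sourceFileId in fileIdsToCopy)   // fileIdsToCopy: hypothetical list of the three source file IDs
{
    var readOnlyFile = new File
    {
        Name = newFileName.Replace(' ', '-') + "_assess",
        Parents = new List<string> { targetFolderId }
    };
    copyBatch.Queue<File>(_driveService.Files.Copy(readOnlyFile, sourceFileId),
        (copiedFile, error, index, message) =>
        {
            if (error != null) Console.WriteLine(error.Message);   // copy failed
            else copiedIds.Add(copiedFile.Id);                     // remember the new file's ID
        });
}
await copyBatch.ExecuteAsync();   // first round trip: all copies

// Second batch: add the permission to every file copied above
var permissionBatch = new BatchRequest(_driveService);
foreach (var copiedId in copiedIds)
{
    var newPermission = new Permission { ExpirationTime = expirationDate, Type = "anyone", Role = "reader" };
    var permissionRequest = _driveService.Permissions.Create(newPermission, copiedId);
    permissionRequest.SendNotificationEmail = false;
    permissionBatch.Queue<Permission>(permissionRequest,
        (permission, error, index, message) =>
        {
            if (error != null) Console.WriteLine(error.Message);   // permission creation failed
        });
}
await permissionBatch.ExecuteAsync();   // second round trip: all permissions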
I'm currently building a Windows service that will be used to create backups of logs. Currently, the logs are stored at E:\Logs, and the intent is to copy the contents, timestamp the new folder, and compress it. After this, you should have E:\Logs and E:\Logs_[Timestamp].zip. The zip will then be moved to C:\Backups\ for later processing. Currently, I am using the following to try to zip the log folder:
var logDirectory = "E://Logs";
var timeStamp = DateTime.Now.ToString("yyyyMMddHHmm");
var zippedFolder = logDirectory + "_" + timeStamp + ".zip";
System.IO.Compression.ZipFile.CreateFromDirectory(logDirectory, zippedFolder);
While this appears to create a zip file, I get the error "Windows cannot open the folder. The Compressed (zipped) Folder E:\Logs_201805161035.zip is invalid."
To address any troubleshooting questions: the service runs under an AD account that has sufficient permissions to perform administrative tasks. Another thing to consider is that the service kicks off when its FileSystemWatcher detects a new zip file in the path C:\Aggregate. Since many zip files are added to C:\Aggregate at once, the FileSystemWatcher creates a new Task for each zip found. You can see how this works in the following:
private void FileFoundInDrops(object sender, FileSystemEventArgs e)
{
var aggregatePath = new DirectoryInfo("C://Aggregate");
if (e.FullPath.Contains(".zip"))
{
Task task = Task.Factory.StartNew(() =>
{
try
{
var logDirectory = "E://Logs";
var timeStamp = DateTime.Now.ToString("yyyyMMddHHmm");
var zippedFolder = logDirectory + "_" + timeStamp + ".zip";
ZipFile.CreateFromDirectory(logDirectory, zippedFolder);
}
catch (Exception ex)
{
Log.WriteLine(System.DateTime.Now.ToString() + " - ERROR: " + ex);
}
});
task.Dispose();
}
}
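For completeness, the watcher wiring itself isn't shown above; a minimal sketch of how it could be set up in a ServiceBase-derived service (field and method names are assumed, not the actual service code) would be:
private FileSystemWatcher _watcher;   // hypothetical field holding the watcher

protected override void OnStart(string[] args)
{
    // Watch C:\Aggregate for new zip files and hand each one to the handler above
    _watcher = new FileSystemWatcher(@"C:\Aggregate", "*.zip");
    _watcher.Created += FileFoundInDrops;
    _watcher.EnableRaisingEvents = true;
}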
How can I get around the error I am receiving? Any help would be appreciated!
I am trying to create a program that will download image files from my Google Drive. I was able to do so; however, when I try to search for a specific file, I always get an error when using the 'name' field, which is based on this page: https://developers.google.com/drive/v3/web/search-parameters. I don't really know what the problem is. This is my code:
GoogleHelper gh = new GoogleHelper();//calling
DriveService service = GoogleHelper.AuthenticateServiceAccount(email, securityPath);
List<String> file = GoogleHelper.GetFiles(service,
"mimeType='image/jpeg' and name contains 'aa'");
String newFile = newPath+id;
gh.DownloadFile(service, file[0],newPath);
//get File Method:
public static List<String> GetFiles(DriveService service, string search)
{
List<String> Files = new List<String>();
try
{
//List all of the files and directories for the current user.
FilesResource.ListRequest list = service.Files.List();
list.MaxResults = 1000;
if (search != null)
{
list.Q = search;
}
FileList filesFeed = list.Execute();
// MessageBox.Show(filesFeed.Items.Count);
//// Loop through until we arrive at an empty page
while (filesFeed.Items != null)
{
// Adding each item to the list.
foreach (File item in filesFeed.Items)
{
Files.Add(item.Id);
}
// We will know we are on the last page when the next page token is
// null.
// If this is the case, break.
if (filesFeed.NextPageToken == null)
{
break;
}
// Prepare the next page of results
list.PageToken = filesFeed.NextPageToken;
// Execute and process the next page request
filesFeed = list.Execute();
}
}
catch (Exception ex)
{
// In the event there is an error with the request.
Console.WriteLine(ex.Message);
MessageBox.Show(ex.Message);
}
return Files;
}
If we check the Search for Files documentation, the table of query terms lists:
name | string | contains, =, != | Name of the file.
They also show it being used:
name contains 'hello' and name contains 'goodbye'
Now, the files.list method returns a list of File resources. If you check the File resource, name is not a field there; title is.
So if you use
mimeType='image/jpeg' and (title contains 'a')
your request will work.
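Plugged into the call from your question, that would be:
// Drive v2 query: the File resource field is "title", not "name"
List<String> file = GoogleHelper.GetFiles(service,
    "mimeType='image/jpeg' and title contains 'aa'");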
The reason the documentation appears wrong is that you are using the Google Drive v2 API, while the page you linked has been updated for Google Drive v3, which (you guessed it) uses name instead of title for a file.
IMO there should be two documentation sets, because they really are different APIs.
We've created a small website using ServiceStack, but we are having a problem with user uploads: we find that when a user uploads a file using a POST, their session is closed.
The size of the file doesn't seem to matter, nor does the delay in responding to the upload POST.
I've confirmed that the browser is still sending the same Session ID (ss-id) cookie before and after the upload.
Here's how we're handling AAA:
public override void Configure(Container container)
{
//Config examples
//this.Plugins.Add(new PostmanFeature());
//this.Plugins.Add(new CorsFeature());
ServiceStack.Text.JsConfig.DateHandler = ServiceStack.Text.DateHandler.ISO8601;
Plugins.Add(new AuthFeature(() => new AuthUserSession(),
new IAuthProvider[] {
new FeedPartnerAuthProvider(), //Custom MSSQL based auth
}
));
//Global response filter to extend session automatically
this.GlobalResponseFilters.Add((req, res, requestDto) =>
{
var userSession = req.GetSession();
req.SaveSession(userSession, slidingExpiryTimeSpan);
});
this.Plugins.Add(new RazorFormat());
}
Here's what the upload code looks like:
//Upload Test Models
[Route("/upload", "POST")]
public class PostUpload : IReturn<PostUploadResponse>
{
public String Title { get; set; }
}
public class PostUploadResponse
{
public String Title { get; set; }
public Boolean Success { get; set; }
}
//Upload Test Service
[Authenticate]
public object Post(PostUpload request)
{
string uploadPath = "~/uploads";
Directory.CreateDirectory(uploadPath.MapAbsolutePath());
var customerFile = Request.Files.SingleOrDefault(uploadedFile =>
uploadedFile.ContentLength > 0 &&
uploadedFile.ContentLength <= 500 * 1000 * 1024);
if (customerFile == null)
throw new ArgumentException("Error: Uploaded file must be less than 500MB");
//determine the extension of the file
String inputFileExtension = "";
var regexResult = Regex.Match(customerFile.FileName, "^.*\\.(.{3})$");
if (regexResult.Success)
inputFileExtension = regexResult.Groups[1].Value;
if (inputFileExtension.Length == 0)
throw new Exception("Error determining extension of input filename.");
//build a temporary location on the disk for this file
String outputFilename = "{0}/{1}.{2}".FormatWith(uploadPath, Guid.NewGuid(), inputFileExtension).MapAbsolutePath();
if (File.Exists(outputFilename))
throw new Exception("Unable to create temporary file during upload.");
//Get some information from the session
String ownerId = "Partner_" + this.GetSession().UserAuthId;
//Write the temp file to the disk and begin creating the profile.
try
{
//Move the file to a working directory.
using (var outFile = File.OpenWrite(outputFilename))
{
customerFile.WriteTo(outFile);
}
}
catch (Exception ex)
{
throw new Exception("Error creating profile with uploaded file. ", ex);
}
finally
{
//Clean up temp file no matter what
try { File.Delete(outputFilename); }
catch (Exception delException) { }
}
return new PostUploadResponse
{
Title = request.Title,
Success = true
};
}
The file uploads successfully, and the response is correctly passed back to the browser, but subsequent calls to any service receive a 302 redirect to /login, even though the correct ss-id and ss-pid cookies are transmitted as part of the request.
What is causing my user session to end whenever I upload a file?
Many thanks!
-Z
Well, I solved this one.
What was happening was that the user-uploaded files were ending up in /bin/uploads instead of /uploads. Whenever the contents of /bin change, the App Domain restarts, which invalidates the session.
The bug in this instance was my use of .MapAbsolutePath() instead of .MapServerPath().
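In other words, the fix was essentially swapping the extension method at the two spots in the handler that resolved ~/uploads (a sketch of the change based on the snippet above):
// Resolve ~/uploads outside of /bin so writing uploads no longer restarts the App Domain
Directory.CreateDirectory(uploadPath.MapServerPath());                 // was .MapAbsolutePath()
String outputFilename = "{0}/{1}.{2}".FormatWith(uploadPath, Guid.NewGuid(), inputFileExtension)
    .MapServerPath();                                                  // was .MapAbsolutePath()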