ServiceStack Session Lost After File Upload - C#

We've created a small website using ServiceStack, but we're having a problem with user uploads: when a user uploads a file via POST, their session is closed.
The size of the file doesn't seem to matter, nor does the delay in responding to the upload POST.
I've confirmed that the browser is still sending the same Session ID (ss-id) cookie before and after the upload.
Here's how we're handling authentication:
public override void Configure(Container container)
{
    //Config examples
    //this.Plugins.Add(new PostmanFeature());
    //this.Plugins.Add(new CorsFeature());
    ServiceStack.Text.JsConfig.DateHandler = ServiceStack.Text.DateHandler.ISO8601;

    Plugins.Add(new AuthFeature(() => new AuthUserSession(),
        new IAuthProvider[] {
            new FeedPartnerAuthProvider(), //Custom MSSQL based auth
        }
    ));

    //Global response filter to extend session automatically
    //(slidingExpiryTimeSpan is a TimeSpan defined elsewhere in the AppHost)
    this.GlobalResponseFilters.Add((req, res, requestDto) =>
    {
        var userSession = req.GetSession();
        req.SaveSession(userSession, slidingExpiryTimeSpan);
    });

    this.Plugins.Add(new RazorFormat());
}
Here's what the upload code looks like:
//Upload Test Models
[Route("/upload", "POST")]
public class PostUpload : IReturn<PostUploadResponse>
{
    public String Title { get; set; }
}

public class PostUploadResponse
{
    public String Title { get; set; }
    public Boolean Success { get; set; }
}
//Upload Test Service
[Authenticate]
public object Post(PostUpload request)
{
    string uploadPath = "~/uploads";
    Directory.CreateDirectory(uploadPath.MapAbsolutePath());

    var customerFile = Request.Files.SingleOrDefault(uploadedFile =>
        uploadedFile.ContentLength > 0 &&
        uploadedFile.ContentLength <= 500 * 1000 * 1024);
    if (customerFile == null)
        throw new ArgumentException("Error: Uploaded file must be less than 500MB");

    //determine the extension of the file
    String inputFileExtension = "";
    var regexResult = Regex.Match(customerFile.FileName, "^.*\\.(.{3})$");
    if (regexResult.Success)
        inputFileExtension = regexResult.Groups[1].Value;
    if (inputFileExtension.Length == 0)
        throw new Exception("Error determining extension of input filename.");

    //build a temporary location on the disk for this file
    String outputFilename = "{0}/{1}.{2}".FormatWith(uploadPath, Guid.NewGuid(), inputFileExtension).MapAbsolutePath();
    if (File.Exists(outputFilename))
        throw new Exception("Unable to create temporary file during upload.");

    //Get some information from the session
    String ownerId = "Partner_" + this.GetSession().UserAuthId;

    //Write the temp file to the disk and begin creating the profile.
    try
    {
        //Move the file to a working directory.
        using (var outFile = File.OpenWrite(outputFilename))
        {
            customerFile.WriteTo(outFile);
        }
    }
    catch (Exception ex)
    {
        throw new Exception("Error creating profile with uploaded file. ", ex);
    }
    finally
    {
        //Clean up temp file no matter what
        try { File.Delete(outputFilename); }
        catch { }
    }

    return new PostUploadResponse
    {
        Title = request.Title,
        Success = true
    };
}
The file uploads successfully, and the response is correctly passed back to the browser, but subsequent calls to any service receive a 302 Redirect to /login, even though the correct ss-id and ss-pid cookies are transmitted as part of the request.
What is causing my user session to end whenever I upload a file?
Many thanks!
-Z

Well, I solved this one:
What was happening here was that the user-uploaded files were ending up in /bin/uploads instead of /uploads. Whenever the contents of /bin change, the App Domain restarts, which invalidates the session.
The bug in this instance was my use of .MapAbsolutePath() instead of .MapServerPath().
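For reference, the corrected mapping looks like this (a minimal sketch of the fix; MapServerPath resolves "~/" against the website root, so the upload folder lands outside /bin):
//Corrected path mapping: resolve "~/uploads" against the website root
//rather than the bin folder, so writing uploads no longer touches /bin.
string uploadPath = "~/uploads";
Directory.CreateDirectory(uploadPath.MapServerPath());
//...and later, when building the temp filename:
String outputFilename = "{0}/{1}.{2}".FormatWith(uploadPath, Guid.NewGuid(), inputFileExtension).MapServerPath();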

Related

Why can't I save a user with a picture on Azure, while it works locally?

I have a create method which works fine on localhost. It looks like this:
public PatientViewModel Create(PatientViewModel model)
{
    try
    {
        model.patient!.photoPath = (model.file != null) ? UploadFile(model.file) : "placeholder.png";
        model.patient.appUser = appuserAwhole(model);
        model.patient.appUser.user = (!model.mode) ? model.user : model.Users!.Where(r => r.userId == model.userid).First();
        _idb.patients.Add(model.patient);
        idbSaveChanges();

        Patientfile pf = new Patientfile();
        pf.patientMail = model.patient.emailAddress;
        _fdb.patientfiles.Add(pf);
        fdbSaveChanges();
    }
    catch (Exception e)
    {
        //Note: this swallows every exception, so failures never surface to the caller.
        System.Diagnostics.Debug.WriteLine(e.Message);
    }
    return model;
}
This is my UploadFile method:
private string UploadFile(IFormFile photo)
{
    string wwwPath = this._env.WebRootPath;
    string contentPath = this._env.ContentRootPath;
    string path = Path.Combine(wwwPath, "img\\PatientPhotos");
    if (!Directory.Exists(path))
    {
        Directory.CreateDirectory(path);
    }
    if (photo == null)
    {
        return "default.png";
    }
    string fileName = Path.GetFileName(photo.FileName);
    using (FileStream stream = new FileStream(Path.Combine(path, fileName), FileMode.Create))
    {
        photo.CopyTo(stream);
    }
    return fileName;
}
I first get the wwwroot path and use this to save my image. If the image already exists, it uses the image that's already saved (I know this isn't ideal, but it works for me for now).
When I upload a file on localhost and submit, the patient gets saved and so does the image (in wwwroot). On Azure, however, neither the patient nor the image gets saved.
When I don't upload a file, it automatically uses placeholder.png. Then the patient does get saved on Azure, but with no image, since none was selected.
I'm not getting any errors (probably because it's a production environment); it just returns to the Index page after submitting, without saving anything.
In my controller I return to the Index page after creating the patient:
_db.Create(vm);
return RedirectToAction("Index");
Is there any way to fix this problem?

Copying a file and adding a permission in the same batch request in Google Drive Api v3

I have the following code that I want to change to use a batch request. In this code, I first create a copy of a file, and then, using the id of the new file, I add a permission.
File readOnlyFile = new File();
readOnlyFile.Name = newFileName.Replace(' ', '-') + "_assess";
readOnlyFile.Parents = new List<string> { targetFolderId };
FilesResource.CopyRequest fileCreateRequest = _driveService.Files.Copy(readOnlyFile, fileId);
string readOnlyFileId = fileCreateRequest.Execute().Id;
if (readOnlyFileId != null)
{
    newPermission.ExpirationTime = expirationDate;
    newPermission.Type = "anyone";
    newPermission.Role = "reader";
    PermissionsResource.CreateRequest req = _driveService.Permissions.Create(newPermission, readOnlyFileId);
    req.SendNotificationEmail = false;
    req.Execute();
}
However, I am puzzled when trying to use a batch for this task, since I need the id of the newly copied file to add the permission. Below is my initial attempt, where I do not know how to proceed after batch.Queue(fileCreateRequest, callback). I can add a new action to the batch to add the permission, but I do not know how to get the id of the file. Any suggestions? I need to do this for three different files.
var batch = new BatchRequest(_driveService);
BatchRequest.OnResponse<Permission> callback = delegate (
    Permission permission,
    RequestError error,
    int index,
    System.Net.Http.HttpResponseMessage message)
{
    if (error != null)
    {
        // Handle error
        Console.WriteLine(error.Message);
    }
    else
    {
        Console.WriteLine("Permission ID: " + permission.Id);
    }
};
Permission newPermission = new Permission();
File readOnlyFile = new File();
readOnlyFile.Name = newFileName.Replace(' ', '-') + "_assess";
readOnlyFile.Parents = new List<string> { targetFolderId };
FilesResource.CopyRequest fileCreateRequest = _driveService.Files.Copy(readOnlyFile, fileId);
batch.Queue(fileCreateRequest, callback);
Copying a file and configuring its permissions are two different operations that cannot be batched together. Your approach is correct; you just have to set up the permissions in a second call.
After copying the files and retrieving the ids as you do, you have to make a second call to create the permissions.
There is no way to do it in a single request, because setting the permissions requires the id of the file, and that id only exists once the copy has finished. If you need any more clarification, please don't hesitate to ask.
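For illustration, the second call can itself be a batch. A sketch, assuming copiedFileIds holds the three ids you collected from the copy callbacks (the name is illustrative):
//Sketch: queue the three permission requests in a second batch,
//after the copy requests have completed and returned their file ids.
var permissionBatch = new BatchRequest(_driveService);
foreach (string copiedFileId in copiedFileIds) //hypothetical list of ids from the copies
{
    var perm = new Permission { Type = "anyone", Role = "reader", ExpirationTime = expirationDate };
    PermissionsResource.CreateRequest permRequest = _driveService.Permissions.Create(perm, copiedFileId);
    permRequest.SendNotificationEmail = false;
    permissionBatch.Queue<Permission>(permRequest,
        (permission, error, index, message) =>
        {
            if (error != null)
                Console.WriteLine(error.Message); //handle per-request errors here
        });
}
await permissionBatch.ExecuteAsync(); //from an async method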

Android: How do I construct *full* path to pass to Intent.CreateChooser

Trying to make the Android chooser display available actions for the user to launch a PDF file stored in my app's local folder.
When I pass a file name like /data/user/0/myappappname/files/output.pdf (which exists, of course), I get a nice chooser with all the apps that can accept a PDF file. But when I pick any of them, I get an error (from the external app): "The document path is not valid". No exception is thrown.
Then I tried (for testing purposes) setting fname to something like /storage/emulated/0/Download/TLCL.pdf (that file also exists), and everything works fine.
At first I thought this had something to do with file permissions (since the first path is private to my app), but then I found the flag ActivityFlags.GrantReadUriPermission, built exactly for the purpose of temporarily granting file access to other apps. Still the same results.
Since this is a Xamarin.Forms project, I am limited in my choice of file creation locations (I use PCLStorage, which always writes to the app-private local folder), so I don't have the option of generating files in /Documents, /Downloads, etc.
I am obviously doing something wrong. Any ideas appreciated.
Is there an option to get the full path from the system, including the /storage/emulated/0 part (or whatever that would be on other devices)? Maybe that would help?
Piece of code:
(mimeType is defined as "application/pdf" earlier)
public async Task<bool> LaunchFile(string fname, string mimeType)
{
    var uri = Android.Net.Uri.Parse("file://" + fname);
    var intent = new Intent(Intent.ActionView);
    intent.SetDataAndType(uri, mimeType);
    intent.SetFlags(ActivityFlags.ClearWhenTaskReset | ActivityFlags.NewTask | ActivityFlags.GrantReadUriPermission);
    try
    {
        Forms.Context.StartActivity(Intent.CreateChooser(intent, "ChooseApp"));
        return true;
    }
    catch (Exception ex)
    {
        Debug.WriteLine("LaunchFile: " + ex.Message);
        return false;
    }
}
My solution to this, which may not be exactly what you want, is to generate a file (in my case a zip file), export it to a public folder, and use that file for the chooser.
Using these:
private readonly string PublicDocsPath = Android.OS.Environment.ExternalStorageDirectory.AbsolutePath + "/AppName";
private readonly string PrivateDocsPath = System.Environment.GetFolderPath(System.Environment.SpecialFolder.ApplicationData);
and some basic functions:
public Stream GetOutputStream(string destFilePath)
{
    string destFolderPath = Path.GetDirectoryName(destFilePath);
    if (!Directory.Exists(destFolderPath))
        Directory.CreateDirectory(destFolderPath);
    return new FileStream(destFilePath, FileMode.Create, FileAccess.Write, FileShare.None);
}

public Stream GetInputStream(string sourceFilePath)
{
    if (!File.Exists(sourceFilePath)) throw new FileNotFoundException();
    return new FileStream(sourceFilePath, FileMode.Open, FileAccess.Read, FileShare.Read);
}
You can copy your file to your public folder (or subfolders, you just have to assemble the path) and use that file for your chooser:
public void SendEmail(string subject, string body, string recipient, string mimeType, string attachmentFilePath, string activityTitle)
{
    var emailIntent = new Intent(Intent.ActionSendMultiple);
    if (string.IsNullOrEmpty(subject)) throw new ArgumentException();
    emailIntent.PutExtra(Intent.ExtraSubject, subject);
    if (!string.IsNullOrEmpty(recipient))
        emailIntent.PutExtra(Intent.ExtraEmail, new[] { recipient });
    if (!string.IsNullOrEmpty(body))
        emailIntent.PutExtra(Intent.ExtraText, body);
    if (!string.IsNullOrEmpty(attachmentFilePath))
    {
        var file = new Java.IO.File(attachmentFilePath);
        file.SetReadable(true, true);
        var uri = Android.Net.Uri.FromFile(file);
        emailIntent.PutParcelableArrayListExtra(Intent.ExtraStream, new List<IParcelable>() { uri });
    }
    emailIntent.SetType(mimeType);
    _activity.StartActivity(Intent.CreateChooser(emailIntent, activityTitle));
}
This chooser specifically lets the user send their file via email or Google Drive, but you can assemble it however you want. The attachmentFilePath of this function is the same as the string passed into the GetOutputStream function above.
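The copy step itself isn't shown above; with the two stream helpers it could look like this (a sketch; the file name is hypothetical):
//Copy a private file to the public folder so external apps can read it.
string privateFilePath = Path.Combine(PrivateDocsPath, "output.pdf"); //hypothetical file
string publicFilePath = Path.Combine(PublicDocsPath, "output.pdf");
using (Stream input = GetInputStream(privateFilePath))
using (Stream output = GetOutputStream(publicFilePath))
{
    input.CopyTo(output);
}
//publicFilePath can now be passed as attachmentFilePath to SendEmail above.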
We're using Acr.IO rather than PCLStorage, and I recall it has a property that'll return the full path for you.
The code we're using is below, but I wonder if you're simply missing "file://" off the start of your path; I noticed that's in our code, as well as in this previous Stack Overflow answer to a similar question: open a PDF in Xamarin.Forms (Android)
We're using a FileService dependency on Android, with this code to open PDFs:
public void OpenNatively(string filePath)
{
    Android.Net.Uri uri;
    if (filePath.StartsWithHTTP())
    {
        uri = Android.Net.Uri.Parse(filePath);
    }
    else
    {
        uri = Android.Net.Uri.Parse("file:///" + filePath);
    }
    Intent intent = new Intent(Intent.ActionView);
    var extension = filePath.Substring(filePath.LastIndexOf(".") + 1);
    if (extension == "ppt" || extension == "pptx")
    {
        extension = "vnd.ms-powerpoint";
    }
    var docType = "application/" + extension;
    intent.SetDataAndType(uri, docType);
    intent.SetFlags(ActivityFlags.ClearWhenTaskReset | ActivityFlags.NewTask);
    try
    {
        Xamarin.Forms.Forms.Context.StartActivity(intent);
    }
    catch (Exception)
    {
        Toast.MakeText(Xamarin.Forms.Forms.Context, "No Application found to view " + extension.ToUpperInvariant() + " files.", ToastLength.Short).Show();
    }
}

StreamWriter data getting lost on a reboot

I am using StreamWriter to write a file on a session change, such as logon and logoff.
But if a reboot happens right around the time we write/close the file, we get an empty file, and sometimes the data already in it gets deleted.
This is my code:
public void ToJson(T objectToWrite)
{
    InformFileLogger.Instance.Debug(">>> start >>>");
    try
    {
        string json = null;
        string directory = Path.GetDirectoryName(this.serializeFilePath);
        if (!Directory.Exists(directory))
        {
            Directory.CreateDirectory(directory);
        }
        if (File.Exists(this.serializeFilePath))
        {
            if (new FileInfo(this.serializeFilePath).Length == 0)
            {
                File.Delete(this.serializeFilePath);
            }
        }
        json = JsonConvert.SerializeObject(objectToWrite);
        if (json != null)
        {
            using (StreamWriter streamWriter = new StreamWriter(this.serializeFilePath))
            {
                streamWriter.WriteLine(json);
                streamWriter.Close();
            }
        }
    }
    catch (Exception ex)
    {
        InformEventLogger.Error(ex.ToString());
        InformFileLogger.Instance.Error(InformHelper.ExceptionMessageFormat(ex));
    }
    InformFileLogger.Instance.Debug("<<< end <<<");
}
How can I avoid writing empty/null entries to the files and losing the data already there?
This really depends on how the reboot is triggered.
Unless the reboot is triggered by your code, you can't guarantee that your code gets to do anything before a reboot, including completing a StreamWriter operation.
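What you can do is shrink the window in which the file is in a broken state: write to a temporary file, flush it to disk, then atomically swap it in. A minimal sketch of that pattern (it reduces, but cannot eliminate, the risk; the temp-file name is illustrative):
//Serialize to a temp file, force it to disk, then atomically replace the
//real file, so a reboot mid-write at worst loses the temp file.
string tempPath = this.serializeFilePath + ".tmp";
using (var fileStream = new FileStream(tempPath, FileMode.Create, FileAccess.Write))
using (var streamWriter = new StreamWriter(fileStream))
{
    streamWriter.WriteLine(json);
    streamWriter.Flush();
    fileStream.Flush(true); //flush OS buffers through to the physical disk
}
if (File.Exists(this.serializeFilePath))
    File.Replace(tempPath, this.serializeFilePath, null); //atomic on NTFS
else
    File.Move(tempPath, this.serializeFilePath);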

SSH.NET SFTP Get a list of directories and files recursively

I am using the Renci.SshNet library to get a list of files and directories recursively over SFTP. I can connect to the SFTP site, but I am not sure how to get a list of directories and files recursively in C#. I haven't found any useful examples.
Has anybody tried this? If so, can you post some sample code showing how to get these files and folders recursively?
Thanks,
Prav
This library has some quirks that make recursive listing tricky, because the interaction between ChangeDirectory and ListDirectory does not work as you might expect.
The following does not list the files in the /home directory; instead it lists the files in the / (root) directory:
sftp.ChangeDirectory("home");
sftp.ListDirectory("").Select (s => s.FullName);
The following does not work and throws an SftpPathNotFoundException:
sftp.ChangeDirectory("home");
sftp.ListDirectory("home").Select (s => s.FullName);
The following is the correct way to list the files in the /home directory:
sftp.ChangeDirectory("/");
sftp.ListDirectory("home").Select (s => s.FullName);
This is pretty crazy if you ask me. Setting the default directory with the ChangeDirectory method has no effect on the ListDirectory method unless you specify a folder in the parameter of that method. Seems like a bug should be filed for this.
So when you write your recursive function you'll have to set the default directory once and then change the directory in the ListDirectory call as you iterate over the folders. The listing returns an enumerable of SftpFiles, which can then be checked individually for IsDirectory == true. Just be aware that the listing also returns the . and .. entries (which are directories); you'll want to skip these to avoid an infinite loop. :-)
EDIT 2/23/2018:
I was reviewing some of my old answers and would like to apologize for the answer above and supply the following working code. Note that this example does not require ChangeDirectory, since it uses the FullName in the ListDirectory call:
void Main()
{
    using (var client = new Renci.SshNet.SftpClient("sftp.host.com", "user", "password"))
    {
        var files = new List<String>();
        client.Connect();
        ListDirectory(client, ".", ref files);
        client.Disconnect();
        files.Dump(); //LINQPad; replace with your own output
    }
}

void ListDirectory(SftpClient client, String dirName, ref List<String> files)
{
    foreach (var entry in client.ListDirectory(dirName))
    {
        //Skip the "." and ".." entries or the recursion never terminates.
        if (entry.Name == "." || entry.Name == "..")
            continue;
        if (entry.IsDirectory)
        {
            ListDirectory(client, entry.FullName, ref files);
        }
        else
        {
            files.Add(entry.FullName);
        }
    }
}
Try this:
var filePaths = client.ListDirectory(client.WorkingDirectory);
Here is a full class. It's a .NET Core 2.1 HTTP-trigger function app (v2).
I wanted to get rid of any directories that start with '.', because my SFTP server has .cache and .ssh folders with keys. This also spares me from dealing with the '.' and '..' folder names.
What I will end up doing is projecting the SftpFile into a type that I work with and returning that to the caller (in this case a logic app). I'll then pass that object into a stored procedure and use OPENJSON to build up my monitoring table. This is basically the first step in creating my SFTP processing queue that will move files off my SFTP folder and into my Data Lake (blob for now, until I come up with something better, I guess).
The reason I used .WorkingDirectory is that I created a user with '/home' as the home directory. This lets me traverse all of my user folders. My app doesn't need a specific folder as a starting point, just the user 'root', so to speak.
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Renci.SshNet;
using Renci.SshNet.Sftp;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

namespace SFTPFileMonitor
{
    public class GetListOfFiles
    {
        [FunctionName("GetListOfFiles")]
        public async Task<IActionResult> RunAsync([HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequest req, ILogger log)
        {
            log.LogInformation("C# HTTP trigger function processed a request.");

            List<SftpFile> zFiles;
            int fileCount;
            decimal totalSizeGB;
            long totalSizeBytes;

            using (SftpClient sftpClient = new SftpClient("hostname", "username", "password"))
            {
                sftpClient.Connect();
                zFiles = await GetFiles(sftpClient, sftpClient.WorkingDirectory, new List<SftpFile>());
                fileCount = zFiles.Count;
                totalSizeBytes = zFiles.Sum(l => l.Length);
                totalSizeGB = BytesToGB(totalSizeBytes);
            }

            return new OkObjectResult(new { fileCount, totalSizeBytes, totalSizeGB, zFiles });
        }

        private async Task<List<SftpFile>> GetFiles(SftpClient sftpClient, string directory, List<SftpFile> files)
        {
            foreach (SftpFile sftpFile in sftpClient.ListDirectory(directory))
            {
                if (sftpFile.Name.StartsWith('.')) { continue; } //skips ".", ".." and dot-folders
                if (sftpFile.IsDirectory)
                {
                    await GetFiles(sftpClient, sftpFile.FullName, files);
                }
                else
                {
                    files.Add(sftpFile);
                }
            }
            return files;
        }

        private decimal BytesToGB(long bytes)
        {
            return Convert.ToDecimal(bytes) / 1024 / 1024 / 1024;
        }
    }
}
I have achieved this using recursion. I created a class TransportResponse like this:
public class TransportResponse
{
    public string directoryName { get; set; }
    public string fileName { get; set; }
    public DateTime fileTimeStamp { get; set; }
    public MemoryStream fileStream { get; set; }
    public List<TransportResponse> lstTransportResponse { get; set; }
}
I create a list of the TransportResponse class. If directoryName is not null, the entry will contain a list of the same class holding the files inside that directory as MemoryStreams (this can be changed per your use case):
List<TransportResponse> lstResponse = new List<TransportResponse>();
using (var client = new SftpClient(connectionInfo))
{
    try
    {
        Console.WriteLine("Connecting to " + connectionInfo.Host + " ...");
        client.Connect();
        Console.WriteLine("Connected to " + connectionInfo.Host + " ...");
    }
    catch (Exception ex)
    {
        Console.WriteLine("Could not connect to " + connectionInfo.Host + " server. Exception Details: " + ex.Message);
    }

    if (client.IsConnected)
    {
        var files = client.ListDirectory(transport.SourceFolder);
        lstResponse = downloadFilesInDirectory(files, client);
        client.Disconnect();
    }
    else
    {
        Console.WriteLine("Could not download files from " + transport.TransportIdentifier + " because client was not connected.");
    }
}
private static List<TransportResponse> downloadFilesInDirectory(IEnumerable<SftpFile> files, SftpClient client)
{
    List<TransportResponse> lstResponse = new List<TransportResponse>();
    foreach (var file in files)
    {
        if (!file.IsDirectory)
        {
            if (file.Name != "." && file.Name != "..")
            {
                if (!TransportDAL.checkFileExists(file.Name, file.LastWriteTime))
                {
                    using (MemoryStream fs = new MemoryStream())
                    {
                        try
                        {
                            Console.WriteLine("Reading " + file.Name + "...");
                            client.DownloadFile(file.FullName, fs);
                            fs.Seek(0, SeekOrigin.Begin);
                            lstResponse.Add(new TransportResponse { fileName = file.Name, fileTimeStamp = file.LastWriteTime, fileStream = new MemoryStream(fs.GetBuffer()) });
                        }
                        catch (Exception ex)
                        {
                            Console.WriteLine("Error reading File. Exception Details: " + ex.Message);
                        }
                    }
                }
                else
                {
                    Console.WriteLine("File was downloaded previously");
                }
            }
        }
        else
        {
            if (file.Name != "." && file.Name != "..")
            {
                //Use FullName here so nested directories resolve correctly.
                lstResponse.Add(new TransportResponse { directoryName = file.Name, lstTransportResponse = downloadFilesInDirectory(client.ListDirectory(file.FullName), client) });
            }
        }
    }
    return lstResponse;
}
Hope this helps. Thanks
@Carlos Bos:
"This library has some quirks that make recursive listing tricky, because the interaction between ChangeDirectory and ListDirectory does not work as you might expect."
Correct. It works well when the ChangeDirectory() parameter is ".", but if you do
SftpClient sftp ...;
sftp.ChangeDirectory("some_folder");
//get file list
List<SftpFile> fileList = sftp.ListDirectory("some_folder").ToList();
then an exception is thrown, because the ListDirectory() call ends up expecting "some_folder/some_folder".
The workaround I use is to save and restore the current directory around a remote upload/rename in "some_folder" (you need to list that folder before the operation, e.g. to see whether the file already exists):
string working_directory = sftp.WorkingDirectory;
sftp.ChangeDirectory("some_folder");
sftp.RenameFile("name", "new_name");
sftp.ChangeDirectory(working_directory);
To check whether a file exists, this call is sufficient:
sftp.Exists(path)
or, if you want to add other criteria, like case-insensitive matching:
public FileExistence checkFileExists(string folder, string fileName)
{
    //get file list
    List<SftpFile> fileList = sftp.ListDirectory(folder).ToList();
    if (fileList == null)
    {
        return FileExistence.UNCONFIRMED;
    }
    foreach (SftpFile f in fileList)
    {
        Console.WriteLine(f.ToString());
        //a case-insensitive comparison is made
        if (f.IsRegularFile && f.Name.ToLower() == fileName.ToLower())
        {
            return FileExistence.EXISTS;
        }
    }
    //if not found in the traversal, it does not exist
    return FileExistence.DOES_NOT_EXIST;
}
where FileExistence is:
public enum FileExistence
{
    EXISTS,
    DOES_NOT_EXIST,
    UNCONFIRMED
};
Extension Method:
public static IEnumerable<SftpFile> ListDirectoryRecursive(this SftpClient client, String dirName)
{
    foreach (var entry in client.ListDirectory(dirName))
    {
        //Skip the "." and ".." entries to avoid infinite recursion.
        if (Regex.IsMatch(entry.Name, "^\\.+$")) continue;
        yield return entry;
        if (entry.IsDirectory)
            foreach (var innerEntry in ListDirectoryRecursive(client, entry.FullName))
                yield return innerEntry;
    }
}
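Usage might look like this (host and credentials are placeholders):
using (var client = new SftpClient("sftp.host.com", "user", "password"))
{
    client.Connect();
    //Walk the whole tree under the working directory and print each entry.
    foreach (var entry in client.ListDirectoryRecursive("."))
        Console.WriteLine((entry.IsDirectory ? "[DIR]  " : "[FILE] ") + entry.FullName);
    client.Disconnect();
}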
