I have an entity that has both general properties stored in a database table and a reference to a local file on the system disk. I want the create/replace/delete methods for this file to be encapsulated in the data access layer, so that other parts of the application don't care about how and where it is stored; they just send a byte stream or perform a "clear" operation. At the same time, I'd like the file directory to be defined in web.config, the same way database access parameters are.
In my web app I use EF 5 Code First and have defined the entity like the example below:
// EF entity
public class UserImage
{
    public string Description { get; set; }
    [NotMapped]
    public LocalFile File { get; set; }
}

// not an EF entity
public class LocalFile
{
    public string Name { get; set; }
    public string LocalPath { /* should return full path as dir path + file name */ }
    public string Url { /* should return LocalPath mapped to a URL */ }
    public void Set(FileStream fs) { /* saves file to disk */ }
    public void Clear() { /* deletes file */ }
}
In this approach my DbContext is not only a database context but a context for both database and filesystem storage, and I can provide it with both a DB connection string and a local directory path at creation time. I think this should be good practice; let me know if I'm wrong.
Now the problem: how can the LocalFile or UserImage objects know the local directory path so they can implement the LocalPath and Url properties? In other words, how can some other part of the application know the actual path/URL of a LocalFile or UserImage instance? Or is there a way to provide these objects with the local dir as they're being created inside the DbContext? Finally, what is an alternative way to encapsulate local storage operations within UserImage so that outside code never cares how and where the file is stored?
You should create an interface for your file operations with two methods, Stream GetFile(string fileName) and void PutFile(Stream fileStream, string fileName), and implement it with a concrete class whose constructor takes a localPath parameter:
public interface IFileRepository
{
    Stream GetFile(string fileName);
    void PutFile(Stream fileStream, string fileName);
}

public class FileRepository : IFileRepository
{
    private readonly string _localPath;

    public FileRepository(string localPath)
    {
        _localPath = localPath;
    }

    public Stream GetFile(string fileName)
    {
        // get a file stream from the hard drive using fileName and _localPath
        return File.OpenRead(Path.Combine(_localPath, fileName));
    }

    public void PutFile(Stream fileStream, string fileName)
    {
        // save the input stream to a disk file using fileName and _localPath
        using (var target = File.Create(Path.Combine(_localPath, fileName)))
        {
            fileStream.CopyTo(target);
        }
    }
}
In your DbContext class you should create a private field of type IFileRepository and initialize it from a constructor parameter:
public class SomeDbContext : DbContext
{
    private readonly IFileRepository _fileRepository;

    public SomeDbContext(IFileRepository fileRepository)
    {
        _fileRepository = fileRepository;
    }
    ...
}
Then use this _fileRepository to put and get files in your DbContext methods. Concrete classes for interface-typed parameters should be supplied by an Inversion of Control container (or another implementation of the Inversion of Control principle).
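As a rough sketch of the wiring at the composition root (the "FileStoragePath" appSettings key name is an assumption; an IoC container would normally do this construction for you):

```csharp
// e.g. in Global.asax Application_Start, or wherever the app is composed
var localPath = ConfigurationManager.AppSettings["FileStoragePath"];
IFileRepository fileRepository = new FileRepository(localPath);

// manual wiring shown for clarity; register these with your container instead
var context = new SomeDbContext(fileRepository);
```

This keeps the directory in web.config next to the connection strings, and nothing outside the repository knows where files physically live.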
Update:
public class UserImage
{
    public string Description { get; set; }
    [NotMapped]
    public LocalFile File { get; set; }
}

// not an EF entity
public class LocalFile
{
    private readonly string _filePath;

    public LocalFile(string filePath)
    {
        _filePath = filePath;
    }

    public string Name { get; set; }
    public string LocalPath { /* aggregate Name and _filePath */ }
    public string Url { /* should return LocalPath mapped to a URL */ }
}
If I were you, I would write a helper for the Url mapping.
I think my mistake is that I'm trying to access context properties (i.e. the directory path) from inside the entity. The EF database context architecture doesn't support this, and the entities have no idea how they are stored. I wouldn't like to violate this canon.
The directory for storing files can be considered either a context property (like the connection string) or an entity property (like the path). To implement the first case I can give myDbContext a Directory property and then resolve all paths via the context instance by calling myContext.GetUrl(userImage.FileName). But myDbContext is not always directly accessible from the presentation level, and I'll be unable to extract a userImage's Url to set it on a web page unless I propagate myDbContext to all upper layers.
If I consider the directory a LocalFile property, then I need to inject its value somehow, either in the constructor:
public class LocalFile
{
    private string _dir;

    // Constructor
    public LocalFile(string dir) { /* set current dir */ }

    public string GetUrl() { /* return _dir + filename */ }
}
// cons: the parameterless constructor that DbContext calls when materializing
// data from the DB won't set `_dir`, so GetUrl won't return a correct result
or using a static directory that is set up earlier (say, in global.asax):
public class LocalFile
{
    // Constructor
    public LocalFile() { /* empty */ }

    public static string Dir;

    public string GetUrl() { /* return Dir + filename */ }
}
or even directly accessing web.config to get paths:
public class LocalFile
{
    // Constructor
    public LocalFile() { /* empty */ }

    public string GetUrl() { /* read dir from web.config + filename */ }
}
// not a good idea to access web-specific assemblies from EF data classes
or making extension methods at upper layers where web.config is accessible:
public static class MyExtensions
{
    public static string GetUrl(this LocalFile localFile)
    {
        // dir from web.config + localFile.Name
    }
}
So there are many possible solutions, and each has its own disadvantages. My case is a little more complicated, as my dir path also depends on the LocalFile's parent user's ID, so I have a dir template users/{0}/image.jpg in web.config instead of a simple path.
What I've done to achieve my goals:
put a URL template of the form users/{0}/{1} (0 - parent UserId, 1 - fileName) into web.config
created a static class Shared next to my EF entities
public static class Shared
{
    public static Func<string, int, string> LocalFileUrlResolver;
}
populated its values on application start
protected void Application_Start()
{
    Shared.LocalFileUrlResolver =
        (fileName, userId) =>
            String.Format(ConfigurationManager.AppSettings["LocalFileUrl"], userId, fileName);
}
made my User provide its own images with a Url resolver at creation time
public User()
{
    // ...
    Image = new LocalFile(
        "userpic.css",
        fileName => Shared.LocalFileUrlResolver(fileName, userId)
    );
}
made my LocalFile constructor accept a Func<string, string> param that resolves the full Url of a given file name
public class LocalFile
{
    public LocalFile(
        string fileName,
        Func<string, string> fnUrlResolver)
    {
        FileName = fileName;
        _fnUrlResolver = fnUrlResolver;
    }

    private readonly Func<string, string> _fnUrlResolver;

    public string FileName { get; private set; }
    public string Url { get { return _fnUrlResolver(FileName); } }
}
Yes, so many lines. I take the dir template from web.config, inject it into a static member of the data access layer, and make it more specific at the User creation point for the user's local images.
I am absolutely not sure whether it's worth the cost. Maybe in the future I'll choose direct access to web.config :)
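To see the resolver chain end to end, here is an illustrative wiring (the template string is inlined rather than read from web.config, and the user ID 42 is made up):

```csharp
// stand-in for what Application_Start does from web.config
Shared.LocalFileUrlResolver =
    (fileName, userId) => String.Format("users/{0}/{1}", userId, fileName);

// what the User constructor does, for a user with ID 42
var image = new LocalFile(
    "userpic.css",
    fileName => Shared.LocalFileUrlResolver(fileName, 42));

Console.WriteLine(image.Url); // "users/42/userpic.css"
```

The entity itself never touches web.config; it only holds a closure handed to it at creation time.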
I'm new to NoSql and MongoDB. I'm using the MongoDB C# driver inside Visual Studio.
I've read in different places that it's preferable to have a single instance of your database class that maintains the connection(s) to keep everything thread safe and that it's generally a bad idea to use static classes for database CRUD operations.
At the start of my program I instantiate my database class which opens a connection. Within that class and also in derived classes I can perform CRUD operations. But now I'm in a different part of my solution (same namespace, different class) and I need to do read operations to check if a user exists. I also need to compose a new document that I then want to insert.
Now I'm in a situation where that's only possible by creating a new instance of the database class to access its CRUD methods. I want to avoid static CRUD methods (that could be accessed from other classes) because then the base class of my database connection also needs to be static. I cannot figure out how to approach this and what would be the recommended way.
From the MongoDB website:
The MongoClient instance actually represents a pool of connections to the database; you will only need one instance of class MongoClient even with multiple threads.
http://mongodb.github.io/mongo-csharp-driver/2.2/getting_started/quick_tour/
Does this mean I should create a new MongoClient every time I need to access the database in other parts of my program?
UPDATE
It seems I was a bit mistaken about static properties and how they can be used. I now have it set up like this:
class Database
{
    const string MongoConnection = "mongodb+srv://user:password@cluster.mongodb.net";

    public static MongoClient Client { get; set; }
    public static IMongoDatabase Directory { get; set; }
    public static IMongoCollection<User> Collection { get; set; }

    public Database()
    {
        Client = new MongoClient(MongoConnection);
        Directory = Client.GetDatabase("studentDB");
        Collection = Directory.GetCollection<User>("users");
    }

    public static void InsertNewUser(User user)
    {
        Collection.InsertOne(user);
    }

    public static bool EmailHasAccount(string email)
    {
        return Collection.Find(x => x.Email == email).FirstOrDefault() != null;
    }

    public static User RetrieveUserAccount(string email)
    {
        return Collection.Find(x => x.Email == email).FirstOrDefault();
    }
}
public class User
{
    public Guid Id { get; private set; }
    public string Name { get; set; }
    public string Email { get; set; }

    public User(string name, string email)
    {
        Id = Guid.NewGuid();
        Name = name;
        Email = email;
    }
}
And in my main program I can use it like this:
var db = new Database();
var user = new User("myName", "email@address");
Database.InsertNewUser(user);
Console.WriteLine(Database.EmailHasAccount("email@address")); // returns true
Console.WriteLine(Database.RetrieveUserAccount("email@address").Name); // returns "myName"
That's exactly what I was looking for. What would be the best way to handle multiple collections? Would it be safe to change the Collection property, or is it better to create separate properties? Is a generic even possible?
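On the generics question: one possible shape is a single shared client plus a generic collection accessor, so each document type gets its own collection without adding a new property per type. This is only a sketch; the Init method and the name-by-convention rule are assumptions, not part of the original code:

```csharp
class Database
{
    private static IMongoDatabase _db;

    public static void Init(string connectionString, string databaseName)
    {
        // one MongoClient per process; it manages its own connection pool
        _db = new MongoClient(connectionString).GetDatabase(databaseName);
    }

    // one collection per document type, named by convention
    // after the type (an assumed convention, adjust to taste)
    public static IMongoCollection<T> GetCollection<T>()
    {
        return _db.GetCollection<T>(typeof(T).Name.ToLowerInvariant() + "s");
    }
}

// usage:
// Database.Init("mongodb+srv://...", "studentDB");
// var users = Database.GetCollection<User>();   // collection "users"
```

Because MongoClient represents a connection pool, sharing one instance this way stays thread safe, and swapping the Collection property back and forth is no longer needed.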
I've got a simple document management system I'm putting together. I'm trying to follow solid DDD principles, and things have been coming together. One area I've been questioning is what the cleanest solution for managing files would be.
For background, a lot of this document management is going to revolve around uploading documents and assigning them to specific "work orders". This is in the manufacturing industry, we need to keep track of certain documents and send them to the customer when we're all done making their stuff.
So my bounded context is mostly composed of a couple main entities, lets say DocumentPackage, Document, Requirements, etc. A DocumentPackage is a grouping of documents for a single "work order". The same document may be used in multiple DocumentPackages. Each DocumentPackage has a certain number of Requirements, which is a distinct type of document that is needed as part of the package.
So when it comes to uploading and downloading the files, manipulating them, and updating the database to reflect these changes, where should most of that be handled?
Here's an example of a UploadDocumentCommand and Handler I have. Note that I decided to save the uploaded file to the local file system in the API controller, and to pass it in my command as the FileInfo.
public class UploadDocumentCommand : IRequest<AppResponse>
{
    public UploadDocumentCommand(Guid documentId, string workOrderNumber, FileInfo file, Guid? requirementId = null)
    {
        DocumentId = documentId;
        WorkOrderNumber = new WorkOrderNumber(workOrderNumber);
        FileInfo = file;
        RequirementId = requirementId;
    }

    public Guid DocumentId { get; }
    public WorkOrderNumber WorkOrderNumber { get; }
    public Guid? RequirementId { get; }
    public FileInfo FileInfo { get; }
}

public class UploadDocumentCommandHandler : IRequestHandler<UploadDocumentCommand, AppResponse>
{
    private readonly IDocumentRepository documentRepository;
    private readonly IDocumentPackageRepository packageRepo;

    public UploadDocumentCommandHandler(IDocumentRepository documentRepository, IDocumentPackageRepository packageRepo)
    {
        this.documentRepository = documentRepository;
        this.packageRepo = packageRepo;
    }

    public async Task<AppResponse> Handle(UploadDocumentCommand request, CancellationToken cancellationToken)
    {
        try
        {
            // get the package; throws a not-found exception if it does not exist
            var package = await packageRepo.Get(request.WorkOrderNumber);
            var document = DocumentFactory.CreateFromFile(request.DocumentId, request.FileInfo);
            if (request.RequirementId != null)
            {
                package.AssignDocumentToRequirement(document, request.RequirementId.GetValueOrDefault());
            }
            else
            {
                package.AddUnassignedDocument(document);
            }
            await documentRepository.Add(document, request.FileInfo);
            await packageRepo.Save(package);
            return AppResponse.Ok();
        }
        catch (AppException ex)
        {
            // the file may have been uploaded to Docuware but failed afterwards; this should be addressed.
            // It could be handled better by using an event to add the document to the package
            // only after a successful upload.
            return AppResponse.Exception(ex);
        }
    }
}
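The event-based fix mentioned in the catch comment could be sketched roughly as below. The DocumentUploaded type and its fields are hypothetical names, not part of the existing design:

```csharp
// hypothetical domain event, raised only after the file store confirms the upload
public class DocumentUploaded
{
    public Guid DocumentId { get; }
    public WorkOrderNumber WorkOrderNumber { get; }
    public Guid? RequirementId { get; }

    public DocumentUploaded(Guid documentId, WorkOrderNumber workOrderNumber, Guid? requirementId)
    {
        DocumentId = documentId;
        WorkOrderNumber = workOrderNumber;
        RequirementId = requirementId;
    }
}

// A handler subscribed to this event would load the package and assign the
// document, so a failed upload never leaves a dangling package entry.
```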
public class Document : Entity<Guid>
{
    private Document() { }

    public Document(Guid id, string fileName, DateTime addedOn)
    {
        Id = id;
        FileName = new FileName(fileName);
        AddedOn = addedOn;
    }

    public FileName FileName { get; private set; }
    public DateTime AddedOn { get; private set; }

    public override string ToString() => $"Document {Id} {FileName}";
}
My DocumentRepository has mixed responsibilities: it both saves the file to the file store and updates the database. I'm using a specific document storage application right now, but I wanted to keep this abstracted so that I am not stuck with it. It is also possible that different files, like images, might have different stores. But part of me wonders if it is actually better to have this logic in my application layer, where my handler takes care of storing the file and updating the database. I don't feel the DocumentRepository follows SRP, and the act of loading my Document entity shouldn't have a dependency on my DocuwareRepository.
public class DocumentRepository : IDocumentRepository
{
    private readonly DbContext context;
    private readonly IDocuwareRepository docuwareRepository;

    public DocumentRepository(DbContext context, IDocuwareRepository docuwareRepository)
    {
        this.context = context;
        this.docuwareRepository = docuwareRepository;
    }

    public async Task<Document> Get(Guid id)
    {
        return await context
            .Document
            .FirstOrDefaultAsync(x => x.Id.Equals(id));
    }

    public async Task Add(Document document, FileInfo fileInfo)
    {
        var results = await docuwareRepository.UploadToDocuware(document.Id, fileInfo);
        var storageInfo = new DocumentStorageInfo
        {
            DocuwareId = results.DocuwareDocumentId,
            DocuwareFileCabinetId = results.CabinetId,
            Document = document
        };
        context.DocumentStorageInfo.Add(storageInfo);
        context.Document.Add(document);
        await context.SaveChangesAsync();
    }

    public Task<FileStream> Download(Guid id)
    {
        throw new NotImplementedException();
    }

    public void Dispose()
    {
        context?.Dispose();
    }
}
I've got another use case I'm working on where the DocumentPackage has to be downloaded. I want my application to take all the valid documents from the package and compile them into a zip file, with the documents structured in a folder hierarchy based on the Requirements. That zip archive is going to be saved long term for traceability reasons, and the client can download it. So I have another entity I'm calling the DocumentPackageArchive; it has a repository, and I'm thinking the zip file gets compiled there. This would mean the repository downloads all the files from their respective stores, compresses them into a zip, saves the zip locally on the web server, sends the zip file back to be kept read-only, and updates the database with some data about the archive. Yes, I am intentionally creating a copy of the document package as a snapshot.
So where this leaves me: I feel like the file management is happening all over the place. I'm saving some files in the web API because I feel I need to put them in temp storage right when I get them from an IFormFile. I'm handling the file info in the application layer as part of the commands. And most of the heavy lifting is happening in the infrastructure layer.
How would you recommend I approach this? I feel like those two repositories dealing with the documents need to be re-designed.
As an additional note, I am also considering coordinating some of this work through domain events. I don't think I'm ready for that yet, and it seems like a bit more complication than I need to be adding right now. Still, advice in that area would also be appreciated.
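One way to make the repository single-purpose is to pull file storage behind its own abstraction and let the application layer coordinate the two steps. This is only a sketch; the IFileStore name and its members are assumptions, not part of the original design:

```csharp
// storage abstraction: implementations could target Docuware, disk, blob storage, ...
public interface IFileStore
{
    // persists the bytes and returns an opaque storage reference
    Task<string> Save(Guid documentId, FileInfo file);
    Task<Stream> Open(string storageReference);
}

// The repository then only persists entities, and the command handler
// (or an application service) coordinates "store the file, then record it":
//
//   var storageRef = await fileStore.Save(document.Id, request.FileInfo);
//   await documentRepository.Add(document, storageRef);
```

With this split, loading a Document no longer drags in a Docuware dependency, and the zip-archive use case can compose the same IFileStore without touching the repositories.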
I'm designing a file upload API that needs to work with large files. I want to stay away from passing around byte arrays. The endpoint storage for the file will be a third party such as Azure or Rackspace file storage.
I have the following project structure, which is following DDD:
Web API (Netcore - accepts the uploaded file)
Business Service (calls to Azure to save the file and saves a record to the database)
Domain (Domain models for EF)
Persistence (Repositories EFCore - saves the database changes)
I would like to have methods in each layer that can start passing through the uploaded file stream as soon as the upload starts. I'm unsure whether this is possible.
Previously we've used byte[] to pass the files through the layers, but for large files this requires a lot of memory and has caused us issues.
Is it possible to optimize the upload of files through an n-tier application so you don't have to copy around large byte arrays, and if so, how can it be done?
In order to clarify, the code structure would be something like the following. Repository stuff has been excluded:
namespace Project.Controllers
{
    [Produces("application/json")]
    [Route("api/{versionMNumber}/")]
    public class DocumentController : Controller
    {
        private readonly IAddDocumentCommand addDocumentCommand;

        public DocumentController(IAddDocumentCommand addDocumentCommand)
        {
            this.addDocumentCommand = addDocumentCommand;
        }

        [Microsoft.AspNetCore.Mvc.HttpPost("application/{applicationId}/documents", Name = "PostDocument")]
        public IActionResult UploadDocument([FromRoute] string applicationId)
        {
            var addDocumentRequest = new AddDocumentRequest();
            addDocumentRequest.ApplicationId = applicationId;
            addDocumentRequest.FileStream = this.Request.Body;

            var result = new UploadDocumentResponse { DocumentId = this.addDocumentCommand.Execute(addDocumentRequest).DocumentId };
            return this.Ok(result);
        }
    }
}
namespace Project.BusinessProcess
{
    public interface IAddDocumentCommand
    {
        AddDocumentResponse Execute(AddDocumentRequest request);
    }

    public class AddDocumentRequest
    {
        public string ApplicationId { get; set; }
        public Stream FileStream { get; set; }
    }

    public class AddDocumentResponse
    {
        public Guid DocumentId { get; set; }
    }

    public class AddDocumentCommand : IAddDocumentCommand
    {
        private readonly IDocumentRepository documentRepository;
        private readonly IMessageBus bus;

        public AddDocumentCommand(IDocumentRepository documentRepository, IMessageBus bus)
        {
            this.documentRepository = documentRepository;
            this.bus = bus;
        }

        public AddDocumentResponse Execute(AddDocumentRequest request)
        {
            /// We need the file to be streamed off somewhere else: fileshare, Azure, Rackspace etc.
            /// We need to save a record to the db that the file has been saved successfully.
            /// We need to trigger the background workers to process the uploaded file.
            var fileUri = AzureStorageProvider.Save(request.FileStream);
            var documentId = documentRepository.Add(new Document { FileUri = fileUri });
            bus.AddMessage(new DocumentProcessingRequest { documentId = documentId, fileUri = fileUri });
            return new AddDocumentResponse { DocumentId = documentId };
        }
    }
}
Some notes:
Passing around a byte array or stream doesn't copy the data; the issue is having the data on your server at all. If your web server needs to process the data in its complete form, you aren't going to be able to avoid the memory usage of doing so.
If you don't need to process the data on your web server at all, but just need to put it in blob storage, you should return a URI for the upload which points to blob storage (this might be helpful: Using CORS with Azure).
If you need to process the data, but you're okay doing it a bit at a time, something like this answer is what you need: Using streams in ASP.NET Web API.
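To illustrate the bit-at-a-time option: each layer passes the Stream itself downward, and only the copy at the bottom touches the bytes, a fixed-size buffer at a time. A minimal sketch (the method and its call site are illustrative, not from the question's code):

```csharp
// Copy the request body to the storage target in 80 KB chunks,
// so memory use is bounded regardless of file size.
public static async Task StreamToStorageAsync(Stream requestBody, Stream storageTarget)
{
    const int bufferSize = 81920; // CopyToAsync's default buffer size
    await requestBody.CopyToAsync(storageTarget, bufferSize);
}
```

Upper layers (controller, command, repository) only hand the Stream reference along, as AddDocumentRequest.FileStream already does, so nothing materializes the whole file as a byte[].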
I want to implement a certain functionality, but I do not know where to start. I will describe what I have.
Backend
public enum SourceType { Database, Folder }

public class DatabaseSource
{
    public string ServerName { get; set; }
    public string DatabaseName { get; set; }
}

public class FolderSource
{
    public string FolderName { get; set; }
}

public class TestController : ApiController
{
    [HttpPost]
    [Route("source")]
    public void Post([FromBody]DatabaseSource source) // method one
    {
    }

    [HttpPost]
    [Route("source")]
    public void Post([FromBody]FolderSource source) // method two
    {
    }
}
Frontend
export enum SourceType {
    Database,
    Folder
}

export class DatabaseSource {
    public ServerName: string;
    public DatabaseName: string;
}

export class FolderSource {
    public FolderName: string;
}

var source = new DatabaseSource();
source.ServerName = "serverName";
source.DatabaseName = "dbName";

var obj = {
    sourceType: SourceType.Database,
    source: source
};
Now imagine that I will send obj to the server. I want that specific controller method to be called depending on the enum. How can I do this?
P.S. The example is greatly simplified.
Your implementation is inconsistent with what you've specified in code.
On the front-end you describe an object with a sourceType field and a source object property, while on the back-end you overload the ApiController method, mapping different REST object resources to a single HTTP method and endpoint (which I believe will not work).
There is no magic way for the ApiController to use your enum property to differentiate between the object types automatically.
A simpler (and better) implementation would be to have separate ApiController classes for your Database and Folder source object POST calls. This follows the principle of REST API design where you are essentially mapping basic CRUD operations to the HTTP methods with object types.
If your intention is to perform an operation based on these parameter objects, then clarify the intention via the API routing for the endpoint as below:
public class TestController : ApiController
{
    [HttpPost]
    [Route("ETLLoad/Database/source")]
    public void Post([FromBody]DatabaseSource source) // method one
    {
    }

    [HttpPost]
    [Route("ETLLoad/Folder/source")]
    public void Post([FromBody]FolderSource source) // method two
    {
    }
}
I am trying to upload a file and send it to the service layer to save; however, I keep finding examples where the controller gets the HttpPostedFileBase and saves it directly in the controller. My service layer has no dependencies on the web DLL, so do I need to read my object into a memory stream or byte array? Any pointers on how I should go about this are greatly appreciated...
Note: files can be PDF or Word documents, so I may also need to check the content type (maybe within the domain/service layer)...
Code:
public ActionResult UploadFile(string filename, HttpPostedFileBase thefile)
{
    // what do I do here...?
}
EDIT:
public interface ISomethingService
{
    void AddFileToDisk(string loggedonuserid, int fileid, UploadedFile newupload);
}

public class UploadedFile
{
    public string Filename { get; set; }
    public Stream TheFile { get; set; }
    public string ContentType { get; set; }
}

public class SomethingService : ISomethingService
{
    public void AddFileToDisk(string loggedonuserid, int fileid, UploadedFile newupload)
    {
        var path = @"c:\somewhere";
        // if image
        Image _image = Image.FromStream(newupload.TheFile);
        _image.Save(Path.Combine(path, newupload.Filename));
        // not sure how to save other files; this is something I am trying to find out...
    }
}
You could use the InputStream property of the posted file to read the contents as a byte array and send it to your service layer along with other information such as ContentType and FileName that your service layer might need:
public ActionResult UploadFile(string filename, HttpPostedFileBase thefile)
{
    if (thefile != null && thefile.ContentLength > 0)
    {
        byte[] buffer = new byte[thefile.ContentLength];
        thefile.InputStream.Read(buffer, 0, buffer.Length);
        _service.SomeMethod(buffer, thefile.ContentType, thefile.FileName);
    }
    ...
}
Can't you create a method on the service layer accepting a Stream as a parameter and pass theFile.InputStream to it? Stream does not require any web-related dependency, and you avoid duplicating memory by copying the data into some other data structure just to consume it.
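A minimal sketch of that idea, reusing the UploadedFile type from the question (the loggedonuserid and fileid values are placeholders; in a real action they would come from the session or route):

```csharp
public ActionResult UploadFile(string filename, HttpPostedFileBase thefile)
{
    // wrap the posted file; InputStream is handed over as-is, no byte[] copy
    var upload = new UploadedFile
    {
        Filename = thefile.FileName,
        ContentType = thefile.ContentType,
        TheFile = thefile.InputStream
    };
    _service.AddFileToDisk(loggedonuserid, fileid, upload);
    return RedirectToAction("Index");
}
```

The service sees only System.IO.Stream, so it stays free of System.Web, and the file's bytes are read exactly once, inside the service.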