I've got a simple document management system I'm putting together. I'm trying to follow solid DDD principles, and things have been coming together. One area I've been questioning is what the cleanest solution for managing files would be.
For background, a lot of this document management is going to revolve around uploading documents and assigning them to specific "work orders". This is in the manufacturing industry; we need to keep track of certain documents and send them to the customer when we're all done making their stuff.
So my bounded context is mostly composed of a couple of main entities, let's say DocumentPackage, Document, Requirements, etc. A DocumentPackage is a grouping of documents for a single "work order". The same document may be used in multiple DocumentPackages. Each DocumentPackage has a certain number of Requirements, each of which represents a distinct type of document that is needed as part of the package.
So when it comes to the actions of uploading, downloading, and manipulating the files, and updating the database to reflect these changes, where should most of that be handled?
Here's an example of an UploadDocumentCommand and handler I have. Note that I decided to save the uploaded file to the local file system in the API controller and pass it into my command as a FileInfo.
public class UploadDocumentCommand : IRequest<AppResponse>
{
public UploadDocumentCommand(Guid documentId, string workOrderNumber, FileInfo file, Guid? requirementId = null)
{
DocumentId = documentId;
WorkOrderNumber = new WorkOrderNumber(workOrderNumber);
FileInfo = file;
RequirementId = requirementId;
}
public Guid DocumentId { get; }
public WorkOrderNumber WorkOrderNumber { get; }
public Guid? RequirementId { get; }
public FileInfo FileInfo { get; }
}
public class UploadDocumentCommandHandler : IRequestHandler<UploadDocumentCommand, AppResponse>
{
private readonly IDocumentRepository documentRepository;
private readonly IDocumentPackageRepository packageRepo;
public UploadDocumentCommandHandler(IDocumentRepository documentRepository, IDocumentPackageRepository packageRepo)
{
this.documentRepository = documentRepository;
this.packageRepo = packageRepo;
}
public async Task<AppResponse> Handle(UploadDocumentCommand request, CancellationToken cancellationToken)
{
try
{
// get the package, throws not found exception if does not exist
var package = await packageRepo.Get(request.WorkOrderNumber);
var document = DocumentFactory.CreateFromFile(request.DocumentId, request.FileInfo);
if (request.RequirementId != null)
{
package.AssignDocumentToRequirement(document, request.RequirementId.GetValueOrDefault());
}
else
{
package.AddUnassignedDocument(document);
}
await documentRepository.Add(document, request.FileInfo);
await packageRepo.Save(package);
return AppResponse.Ok();
}
catch (AppException ex)
{
// the file may have been uploaded to docuware but failed afterwards, this should be addressed
// this can be better handled by using an event to add the document to the package only after successful upload
return AppResponse.Exception(ex);
}
}
}
public class Document : Entity<Guid>
{
private Document() { }
public Document(Guid id, string fileName, DateTime addedOn)
{
Id = id;
FileName = new FileName(fileName);
AddedOn = addedOn;
}
public FileName FileName { get; private set; }
public DateTime AddedOn { get; private set; }
public override string ToString() => $"Document {Id} {FileName}";
}
My DocumentRepository has mixed responsibilities: I'm having it both save the file to the file store and update the database. I'm using a specific document storage application right now, but I wanted to keep this abstracted so that I'm not stuck with it. It's also possible that different files, like images, might have different stores. Part of me is wondering whether it's actually better to have this logic in my application layer, where my handler takes care of storing the file and updating the database. The DocumentRepository doesn't feel very SRP, and the act of loading my Document entity shouldn't have a dependency on my DocuwareRepository. (I sketch a possible split right after the current repository code below.)
public class DocumentRepository : IDocumentRepository
{
private readonly DbContext context;
private readonly IDocuwareRepository docuwareRepository;
public DocumentRepository(DbContext context, IDocuwareRepository docuwareRepository)
{
this.context = context;
this.docuwareRepository = docuwareRepository;
}
public async Task<Document> Get(Guid id)
{
return await
context
.Document
.FirstOrDefaultAsync(x => x.Id.Equals(id));
}
public async Task Add(Document document, FileInfo fileInfo)
{
var results = await docuwareRepository.UploadToDocuware(document.Id, fileInfo);
var storageInfo = new DocumentStorageInfo
{
DocuwareId = results.DocuwareDocumentId,
DocuwareFileCabinetId = results.CabinetId,
Document = document
};
context.DocumentStorageInfo.Add(storageInfo);
context.Document.Add(document);
await context.SaveChangesAsync();
}
public Task<FileStream> Download(Guid id)
{
throw new NotImplementedException();
}
public void Dispose()
{
context?.Dispose();
}
}
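Something like this is the direction I'm considering: a separate IFileStore port (the name and shape are hypothetical, just a sketch of what I have in mind), a Docuware-backed implementation, and a repository that only touches the database, with the handler orchestrating the two.
public interface IFileStore
{
    Task<DocumentStorageInfo> Save(Document document, FileInfo file);
    Task<Stream> Open(DocumentStorageInfo storageInfo);
}
public class DocuwareFileStore : IFileStore
{
    private readonly IDocuwareRepository docuware;
    public DocuwareFileStore(IDocuwareRepository docuware)
    {
        this.docuware = docuware;
    }
    public async Task<DocumentStorageInfo> Save(Document document, FileInfo file)
    {
        var result = await docuware.UploadToDocuware(document.Id, file);
        return new DocumentStorageInfo
        {
            DocuwareId = result.DocuwareDocumentId,
            DocuwareFileCabinetId = result.CabinetId,
            Document = document
        };
    }
    public Task<Stream> Open(DocumentStorageInfo storageInfo)
    {
        // would download from Docuware using the stored ids
        throw new NotImplementedException();
    }
}
// The repository becomes persistence-only; the handler calls IFileStore first,
// then hands the entity and its storage info to the repository.
public class DocumentRepository : IDocumentRepository
{
    private readonly DbContext context;
    public DocumentRepository(DbContext context)
    {
        this.context = context;
    }
    public Task<Document> Get(Guid id) =>
        context.Set<Document>().FirstOrDefaultAsync(x => x.Id.Equals(id));
    public async Task Add(Document document, DocumentStorageInfo storageInfo)
    {
        context.Set<DocumentStorageInfo>().Add(storageInfo);
        context.Set<Document>().Add(document);
        await context.SaveChangesAsync();
    }
}
That would keep Get free of any Docuware dependency, and the "store the file, then persist the entities" ordering becomes an explicit decision in the handler rather than a side effect of the repository.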
I've got another use case I'm working on where the DocumentPackage has to be downloaded. I want my application to take all the valid documents from the package and compile them into a zip file, where the documents are structured in a folder hierarchy based on the Requirements. That archive zip gets saved long term for traceability reasons, and the client can download it. So I have another entity I'm calling DocumentPackageArchive; it has a repository, and I'm thinking the zip file gets compiled there. This would mean the repository downloads all the files from their respective stores, compresses them into a zip, saves the zip locally on the web server, sends the zip file back to be kept read-only, and updates the database with some data about the archive. Yes, I am intentionally creating a copy of the document package as a snapshot.
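For the compilation step itself, I'm imagining something roughly like the following (System.IO.Compression), probably living in the DocumentPackageArchive repository or a service it uses. The Requirements/Documents/StorageInfo accessors and the IFileStore from my sketch above are assumptions, not code I have yet.
using System.IO;
using System.IO.Compression;
// Rough sketch of building the snapshot zip; the folder hierarchy mirrors the Requirements.
public async Task<string> BuildArchiveZip(DocumentPackage package, IFileStore fileStore, string workingDir)
{
    var zipPath = Path.Combine(workingDir, $"{package.WorkOrderNumber}.zip");
    using (var zipStream = File.Create(zipPath))
    using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Create))
    {
        foreach (var requirement in package.Requirements)
        {
            foreach (var document in requirement.Documents)
            {
                var entry = archive.CreateEntry($"{requirement.Name}/{document.FileName}");
                using (var entryStream = entry.Open())
                using (var source = await fileStore.Open(document.StorageInfo))
                {
                    await source.CopyToAsync(entryStream);
                }
            }
        }
    }
    return zipPath;
}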
So where this leaves me: I feel like the file management is happening all over the place. I'm saving some files in the web API because I feel like I need to save them to temp storage right when I get them from an IFormFile. I'm handling the file info in the application layer as part of the commands. And most of the heavy lifting is happening in the Infrastructure layer.
How would you recommend I approach this? I feel like those two repositories dealing with the documents need to be re-designed.
As an additional note, I am also considering coordinating some of this work through domain events. I don't think I'm ready for that yet, and it seems like a bit more complication than I need to be adding right now. Still, advice in that area would also be appreciated.
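If I did eventually go that way, I picture it being as small as publishing a notification once the file is safely stored and moving the package assignment into its handler; a sketch only, reusing the MediatR notifications that are already available to me:
// Published by the upload handler only after the file is safely in the store.
public class DocumentUploaded : INotification
{
    public Guid DocumentId { get; set; }
    public string WorkOrderNumber { get; set; }
    public Guid? RequirementId { get; set; }
}
public class AssignUploadedDocumentHandler : INotificationHandler<DocumentUploaded>
{
    private readonly IDocumentPackageRepository packageRepo;
    private readonly IDocumentRepository documentRepository;
    public AssignUploadedDocumentHandler(IDocumentPackageRepository packageRepo, IDocumentRepository documentRepository)
    {
        this.packageRepo = packageRepo;
        this.documentRepository = documentRepository;
    }
    public async Task Handle(DocumentUploaded notification, CancellationToken cancellationToken)
    {
        var package = await packageRepo.Get(new WorkOrderNumber(notification.WorkOrderNumber));
        var document = await documentRepository.Get(notification.DocumentId);
        if (notification.RequirementId != null)
            package.AssignDocumentToRequirement(document, notification.RequirementId.Value);
        else
            package.AddUnassignedDocument(document);
        await packageRepo.Save(package);
    }
}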
Related
I have to save multiple Versions of the same File.
I would like to know how to represent this in the database and if I have to configure something in EF Core for that.
Basically, the User uploads a File, and afterwards it's possible to upload a new Version of this File. Only the newest File shall be shown to the User in the standard view, but it should be possible to see the other Versions. Versions can be deleted. A File has Actions, which say what can be done with it.
How should I write the Model for this?
public class File
{
    public string FileName { get; set; }
    public int Version { get; set; }
    public bool Active { get; set; }
    // + Actions, which are the same for every Version
}
1.) I thought that I could just add a List to this class/table, but my problem is what happens if the first Version is deleted? I would always have to remember to switch the Files.
Also, with this a Version could have Versions, but that's not correct, as they all belong together.
2.) I could just add a public File ParentFile and have every Version link to the first File. But then I will again have problems with deletion.
3.) Introduce something like a SameFileId. Then I would add a list of these IDs to the Project table, search all the Files with this ID, and take the one that is Active or has the highest Version. But then I have a problem with my Actions, since I would always have to update all Versions instead of just one.
Any ideas?
Why not just keep the File Definition and the File Contents separate?
public class File
{
public string FileName { get; set; }
public List<FileContents> Versions { get; set; } = new List<FileContents>();
[NotMapped]
public FileContents ActiveContents =>
Versions.OrderByDescending(v => v.Version).FirstOrDefault();
//Actions, that is same for every Version
}
public class FileContents
{
public File FileDefinition { get; set; }
public int Version { get; set; }
// Actual Contents
}
This way Actions are only saved once per file. The ActiveContents Property will always take the latest version if it exists. This is set as [NotMapped] because the active version is implicit in the Version Id and doesn't need to be saved in the database.
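If EF Core doesn't pick the relationship up by convention, a little Fluent API in OnModelCreating should be all the configuration you need. This is only a sketch: it assumes you add an Id key to File and a FileId foreign key to FileContents, and it leaves the [NotMapped] property out for brevity.
public class File
{
    public int Id { get; set; }
    public string FileName { get; set; }
    public List<FileContents> Versions { get; set; } = new List<FileContents>();
}
public class FileContents
{
    public int FileId { get; set; }
    public File FileDefinition { get; set; }
    public int Version { get; set; }
    public byte[] Contents { get; set; }
}
public class AppDbContext : DbContext
{
    public DbSet<File> Files { get; set; }
    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // A version number is unique per file.
        modelBuilder.Entity<FileContents>()
            .HasKey(v => new { v.FileId, v.Version });
        // Deleting a single version leaves the others untouched; deleting the File
        // itself cascades and removes every version with it.
        modelBuilder.Entity<File>()
            .HasMany(f => f.Versions)
            .WithOne(v => v.FileDefinition)
            .HasForeignKey(v => v.FileId)
            .OnDelete(DeleteBehavior.Cascade);
    }
}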
I'm designing a file upload API that needs to work with large files. I want to stay away from passing around byte arrays. The endpoint storage for the file will be a third party such as Azure or Rackspace file storage.
I have the following project structure, which is following DDD:
Web API (Netcore - accepts the uploaded file)
Business Service (calls to Azure to save the file and saves a record to the database)
Domain (Domain models for EF)
Persistence (Repositories EFCore - saves the database changes)
I would like to have methods in each layer that can start passing through the uploaded file stream as soon as the upload starts, but I'm unsure if this is possible.
Previously we've used byte[] to pass the files through the layers, but for large files this requires a lot of memory and has caused us issues.
Is it possible to optimize the upload of files through an n-tier application so you don't have to copy around large byte arrays, and if so, how can it be done?
In order to clarify, the code structure would be something like the following. Repository stuff has been excluded:
namespace Project.Controllers
{
[Produces("application/json")]
[Route("api/{versionMNumber}/")]
public class DocumentController : Controller
{
private readonly IAddDocumentCommand addDocumentCommand;
public DocumentController(IAddDocumentCommand addDocumentCommand)
{
this.addDocumentCommand = addDocumentCommand;
}
[Microsoft.AspNetCore.Mvc.HttpPost("application/{applicationId}/documents", Name = "PostDocument")]
public IActionResult UploadDocument([FromRoute] string applicationId)
{
var addDocumentRequest = new AddDocumentRequest();
addDocumentRequest.ApplicationId = applicationId;
addDocumentRequest.FileStream = this.Request.Body;
var result = new UploadDocumentResponse { DocumentId = this.addDocumentCommand.Execute(addDocumentRequest).DocumentId };
return this.Ok(result);
}
}
}
namespace Project.BusinessProcess
{
public interface IAddDocumentCommand
{
AddDocumentResponse Execute(AddDocumentRequest request);
}
public class AddDocumentRequest
{
public string ApplicationId { get; set; }
public Stream FileStream { get; set; }
}
public class AddDocumentResponse
{
public Guid DocumentId { get; set; }
}
public class AddDocumentCommand : IAddDocumentCommand
{
private readonly IDocumentRepository documentRepository;
private readonly IMessageBus bus;
public AddDocumentCommand(IDocumentRepository documentRepository, IMessageBus bus)
{
this.documentRepository = documentRepository;
this.bus = bus;
}
public AddDocumentResponse Execute(AddDocumentRequest request)
{
/// We need the file to be streamed off somewhere else, fileshare, Azure, Rackspace etc
/// We need to save a record to the db that the file has been saved successfully
/// We need to trigger the background workers to process the uploaded file
var fileUri = AzureStorageProvider.Save(request.FileStream);
var documentId = documentRepository.Add(new Document { FileUri = fileUri });
bus.AddMessage(new DocumentProcessingRequest { documentId = documentId, fileUri = fileUri });
return new AddDocumentResponse { DocumentId = documentId };
}
}
}
Some notes:
Passing around a byte array or stream doesn't copy the data - the issue is having the data on your server at all. If your web server needs to process the data in its complete form, you aren't going to be able to avoid the memory usage of doing so.
If you don't need to process the data on your web server at all, but just need to put it in blob storage, you should return a URI for the upload, which points to blob storage (this might be helpful: Using CORS with Azure)
If you need to process the data, but you're okay doing that a bit at a time, something like this answer is what you need: Using streams in ASP.NET Web API
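To make the streaming point concrete, here is a minimal sketch using the Azure.Storage.Blobs client, where the request body is handed straight to blob storage so no byte[] is ever materialized on the web server. The container registration, route, and names are placeholders, not your actual API:
using Azure.Storage.Blobs;
using Microsoft.AspNetCore.Mvc;
[ApiController]
[Route("api/applications/{applicationId}/documents")]
public class DocumentStreamingController : ControllerBase
{
    private readonly BlobContainerClient container;
    // Registered in DI, e.g. new BlobContainerClient(connectionString, "documents")
    public DocumentStreamingController(BlobContainerClient container)
    {
        this.container = container;
    }
    [HttpPost]
    public async Task<IActionResult> Upload(string applicationId)
    {
        var documentId = Guid.NewGuid();
        var blob = container.GetBlobClient($"{applicationId}/{documentId}");
        // Request.Body is copied to storage in chunks; nothing is buffered into a byte array.
        await blob.UploadAsync(Request.Body);
        return Ok(new { DocumentId = documentId, Uri = blob.Uri });
    }
}
From there the database record and the background-worker message can be created with just the blob URI, much like in the AddDocumentCommand above.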
In my scenario, I have a Winforms client that connects to WebApi2. The data is stored in a SQL Server database.
To speed up performance, I am researching if storing data in local cache is a viable solution. Preferably, the local cache should be stored in files instead of kept in-memory as RAM might be an issue. The data is all POCO classes, some being much more complex than others, and most classes being related to each other.
I have made a shortlist of which frameworks might be viable:
MemoryCache
MemCached
CacheManager
StackExchange.Redis
Local Database
Using MemoryCache, I would need to implement my own solution, but it will fit my initial requirements.
However, one common problem that I am seeing is the updating of related classes. For example, I have a relationship between CustomerAddress and PostCode. If I change some properties in a postcode object, I can easily update its local cache. But how is it possible to update/invalidate any other classes that use this postcode, in this case CustomerAddress?
Does any of the frameworks above have methods that help in this kind of situation, or is it totally dependent on the developer to handle such cache invalidation?
The CachingFramework.Redis library provides a mechanism to relate tags to keys and hashes so you can then invalidate them in a single operation.
I'm assuming that you will:
Store the Customer Addresses in Redis with keys like "Address:{AddressId}".
Store the Post Codes in Redis with keys like "PostCode:{PostCodeId}".
And that your model is something like this:
public class CustomerAddress
{
public int CustomerAddressId { get; set; }
public int CustomerId { get; set; }
public int PostCodeId { get; set; }
}
public class PostCode
{
public int PostCodeId { get; set; }
public string Code { get; set; }
}
My suggestion is to:
Mark the Customer Addresses objects on Redis with tags like "Tag-PostCode:{PostCodeId}".
Use a cache-aside pattern to retrieve the Customer Addresses and Post Codes from cache/database.
Invalidate the cache objects by tag when a Post Code is changed.
Something like this should probably work:
public class DataAccess
{
private Context _cacheContext = new CachingFramework.Redis.Context("localhost:6379");
private string FormatPostCodeKey(int postCodeId)
{
return string.Format("PostCode:{0}", postCodeId);
}
private string FormatPostCodeTag(int postCodeId)
{
return string.Format("Tag-PostCode:{0}", postCodeId);
}
private string FormatAddressKey(int customerAddressId)
{
return string.Format("Address:{0}", customerAddressId);
}
public void InsertPostCode(PostCode postCode)
{
Sql.InsertPostCode(postCode);
}
public void UpdatePostCode(PostCode postCode)
{
Sql.UpdatePostCode(postCode);
//Invalidate cache: remove CustomerAddresses and PostCode related
_cacheContext.Cache.InvalidateKeysByTag(FormatPostCodeTag(postCode.PostCodeId));
}
public void DeletePostCode(int postCodeId)
{
Sql.DeletePostCode(postCodeId);
_cacheContext.Cache.InvalidateKeysByTag(FormatPostCodeTag(postCodeId));
}
public PostCode GetPostCode(int postCodeId)
{
// Get/Insert the postcode from/into Cache with key = PostCode{PostCodeId}.
// Mark the object with tag = Tag-PostCode:{PostCodeId}
return _cacheContext.Cache.FetchObject(
FormatPostCodeKey(postCodeId), // Redis Key to use
() => Sql.GetPostCode(postCodeId), // Delegate to get the value from database
new[] { FormatPostCodeTag(postCodeId) }); // Tags related
}
public void InsertCustomerAddress(CustomerAddress customerAddress)
{
Sql.InsertCustomerAddress(customerAddress);
}
public void UpdateCustomerAddress(CustomerAddress customerAddress)
{
var updated = Sql.UpdateCustomerAddress(customerAddress);
if (updated.PostCodeId != customerAddress.PostCodeId)
{
var addressKey = FormatAddressKey(customerAddress.CustomerAddressId);
_cacheContext.Cache.RenameTagForKey(addressKey, FormatPostCodeTag(customerAddress.PostCodeId), FormatPostCodeTag(updated.PostCodeId));
}
}
public void DeleteCustomerAddress(CustomerAddress customerAddress)
{
Sql.DeleteCustomerAddress(customerAddress.CustomerAddressId);
//Clean-up, remove the postcode tag from the CustomerAddress:
_cacheContext.Cache.RemoveTagsFromKey(FormatAddressKey(customerAddress.CustomerAddressId), new [] { FormatPostCodeTag(customerAddress.PostCodeId) });
}
public CustomerAddress GetCustomerAddress(int customerAddressId)
{
// Get/Insert the address from/into Cache with key = Address:{CustomerAddressId}.
// Mark the object with tag = Tag-PostCode:{PostCodeId}
return _cacheContext.Cache.FetchObject(
FormatAddressKey(customerAddressId),
() => Sql.GetCustomerAddress(customerAddressId),
a => new[] { FormatPostCodeTag(a.PostCodeId) });
}
}
To speed up performance, I am researching if storing data in local cache is a viable solution. Preferably, the local cache should be stored in files instead of kept in-memory as RAM might be an issue
The whole point is to avoid storing it in files, so you avoid slow disk operations; that is why Redis keeps its data in RAM.
Does any of the frameworks above have methods that help in this kind of situation, or is it totally dependent on the developer to handle such cache invalidation?
You can save the entire object as JSON instead of applying logic to disassemble the objects, which would also be slow and error-prone when applying changes.
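For example, with StackExchange.Redis (already on your shortlist) and System.Text.Json, a whole object graph can be cached and invalidated as a single key rather than patched property by property. This is only a sketch; the key names and lifetime are illustrative:
using System;
using System.Text.Json;
using StackExchange.Redis;
public class CustomerAddressCache
{
    private readonly IDatabase db;
    public CustomerAddressCache(IConnectionMultiplexer redis)
    {
        db = redis.GetDatabase();
    }
    public void Set(CustomerAddress address)
    {
        // Store the whole object as one JSON string with a 30-minute lifetime.
        db.StringSet($"Address:{address.CustomerAddressId}",
                     JsonSerializer.Serialize(address),
                     TimeSpan.FromMinutes(30));
    }
    public CustomerAddress Get(int customerAddressId)
    {
        var json = db.StringGet($"Address:{customerAddressId}");
        return json.HasValue ? JsonSerializer.Deserialize<CustomerAddress>(json.ToString()) : null;
    }
    public void Invalidate(int customerAddressId)
    {
        db.KeyDelete($"Address:{customerAddressId}");
    }
}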
I'm currently trying to find a better design for my multi-module solution using DI/IoC, but now I'm somewhat lost. I have a solution where different kinds of entities can be distributed to recipients via different channels.
This is a simplified version of my classes:
#region FTP Module
public interface IFtpService
{
void Upload(FtpAccount account, byte[] data);
}
public class FtpService : IFtpService
{
public void Upload(FtpAccount account, byte[] data)
{
}
}
#endregion
#region Email Module
public interface IEmailService :IDistributionService
{
void Send(IEnumerable<string> recipients, byte[] data);
}
public class EmailService : IEmailService
{
public void Send(IEnumerable<string> recipients, byte[] data)
{
}
}
#endregion
public interface IDistributionService { }
#region GenericDistributionModule
public interface IDistributionChannel
{
void Distribute();
}
public interface IDistribution
{
byte[] Data { get; }
IDistributionChannel DistributionChannel { get; }
void Distribute();
}
#endregion
#region EmailDistributionModule
public class EmailDistributionChannel : IDistributionChannel
{
public void Distribute()
{
// Set some properties
// Call EmailService???
}
public List<string> Recipients { get; set; }
}
#endregion
#region FtpDistributionModule
public class FtpDistributionChannel : IDistributionChannel
{
public void Distribute()
{
// Set some properties
// Call FtpService???
}
public FtpAccount ftpAccount { get; set; }
}
#endregion
#region Program
public class Report
{
public List<ReportDistribution> DistributionList { get; private set; }
public byte[] reportData{get; set; }
}
public class ReportDistribution : IDistribution
{
public Report Report { get; set; }
public byte[] Data { get { return Report.reportData; } }
public IDistributionChannel DistributionChannel { get; private set; }
public void Distribute()
{
DistributionChannel.Distribute();
}
}
class Program
{
static void Main(string[] args)
{
EmailService emailService = new EmailService();
FtpService ftpService = new FtpService();
FtpAccount aAccount;
Report report;
ReportDistribution[] distributions =
{
new ReportDistribution(new EmailDistributionChannel(new List<string> { "test#abc.xyz", "foo#bar.xyz" })),
new ReportDistribution(new FtpDistributionChannel(aAccount))
};
report.DistributionList.AddRange(distributions);
foreach (var distribution in distributions)
{
// Old code:
// if (distribution.DistributionChannel is EmailDistributionChannel)
// {
// emailService.Send(...);
// }else if (distribution.DistributionChannel is FtpDistributionChannel)
// {
// ftpService.Upload(...);
// }else{ throw new NotImplementedException();}
// New code:
distribution.Distribute();
}
}
}
#endregion
In my current solution it is possible to create and store persistent IDistribution POCOs (I'm using a ReportDistribution here) and attach them to the distributable entity (a Report in this example).
E.g. someone wants to distribute an existing Report via email to a set of recipients. Therefore he creates a new ReportDistribution with an EmailDistributionChannel. Later he decides to distribute the same Report via FTP to a specified FTP server. Therefore he creates another ReportDistribution with an FtpDistributionChannel.
It is possible to distribute the same Report multiple times on the same or different channels.
An Azure WebJob picks up stored IDistribution instances and distributes them. The current, ugly implementation uses if-else to distribute Distributions with an FtpDistributionChannel via a (low-level) FtpService and those with an EmailDistributionChannel via an EmailService.
I'm now trying to implement the interface method Distribute() on FtpDistributionChannel and EmailDistributionChannel. But for this to work the entities need a reference to the services. Injecting the Services into the entities via ConstructorInjection seems to be considered bad style.
Mike Hadlow comes up with three other solutions:
Creating Domain Services. I could e.g. create an FtpDistributionService, inject an FtpService, and write a Distribute(FtpDistributionChannel distribution) method (and likewise an EmailDistributionService). Apart from the drawback mentioned by Mike, how can I select a matching DistributionService based on the IDistribution instance? Replacing my old if-else with another one does not feel right.
Inject IFtpService/IEmailService into the Distribute() method. But how should I define the Distribute() method in the IDistribution interface? EmailDistributionChannel needs an IEmailService, while FtpDistributionChannel needs an IFtpService.
Domain events pattern. I'm not sure how this can solve my problem.
Let me try to explain why I came up with this quite complicated solution:
It started with a simple list of Reports. Soon someone asked me to send reports to some recipients (and store the list of recipients). Easy!
Later, someone else added the requirement to send a report to an FtpAccount. Different FtpAccounts are managed in the application, so the selected account should also be stored.
This was the point where I added the IDistributionChannel abstraction. Everything was still fine.
Then someone needed the possibility to also send some kind of persistent Logfiles via Email. This led to my solution with IDistribution/IDistributionChannel.
If now someone needs to distribute some other kind of data, I can just implement another IDistribution for this data. If another DistributionChannel (e.g. Fax) is required, I implement it and it is available for all distributable entities.
I would really appreciate any help/ideas.
First of all, why do you create interfaces for the FtpAccount? The class is isolated and provides no behavior that needs to be abstracted away.
Let's start with your original problem and build from there. The problem, as I interpret it, is that you want to send something to a client using a set of different mediums.
By expressing it in code it can be done like this instead:
public void SendFileToUser(string userName, byte[] file)
{
var distributions = new IDistributor[] { new EmailDistributor(), new FtpDistributor() };
foreach (var distribution in distributions)
{
distribution.Distribute(userName, file);
}
}
See what I did? I added a bit of context, because your original use case was way too generic. It's not often that you want to distribute some arbitrary data to an arbitrary distribution service.
The change that I made introduces a domain and a real problem.
With that change we can also model the rest of the classes a bit different.
public class FtpDistributor : IDistributor
{
private FtpAccountRepository _repository = new FtpAccountRepository();
private FtpClient _client = new FtpClient();
public void Distribute(string userName, byte[] file)
{
var ftpAccount = _repository.GetAccount(userName);
_client.Connect(ftpAccount.Host);
_client.Authenticate(ftpAccount.UserName, ftpAccount.Password);
_client.Send(file);
}
}
See what I did? I moved the responsibility of keeping track of the FTP account into the actual service. In reality you probably have an administration web UI or similar where the account can be mapped to a specific user.
By doing so I also isolated all handling regarding FTP to within the service and therefore reduced the complexity in the calling code.
The email distributor would work in the same way.
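Roughly like this, assuming a hypothetical EmailRecipientRepository for the per-user recipient lookup and leaving the SMTP configuration out:
using System.IO;
using System.Net.Mail;
public class EmailDistributor : IDistributor
{
    private readonly EmailRecipientRepository _repository = new EmailRecipientRepository(); // hypothetical lookup
    private readonly SmtpClient _client = new SmtpClient("smtp.example.local");
    public void Distribute(string userName, byte[] file)
    {
        var recipients = _repository.GetRecipients(userName);
        using (var message = new MailMessage())
        {
            message.From = new MailAddress("reports@example.local");
            foreach (var recipient in recipients)
            {
                message.To.Add(recipient);
            }
            message.Subject = "Your report";
            message.Attachments.Add(new Attachment(new MemoryStream(file), "report.pdf"));
            _client.Send(message);
        }
    }
}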
When you start to code problems like this, try to go from top->down. It's otherwise easy to create an architecture that seems to be SOLID while it doesn't really solve the actual business problem.
Update
I've read your update, and I don't see why you must use the same classes for the new requirements.
Then someone needed the possibility to also send some kind of persistent Logfiles via Email
That's an entirely different use case and should be separated from the original one. Create new code for it. The SmtpClient in .NET is quite easy to use and does not need to be abstracted away.
If now someone needs to distribute some other kind of data, I can just implement another IDistribution for this data.
Why? What complexity are you trying to hide?
If another DistributionChannel (e.g. Fax) is required, I implement it and it is available for all distributable entities
No. Distributing thing A is not the same as distributing thing B. You can't, for instance, transport parts of a large bridge on an airplane; a freight ship or a truck is required.
What I'm trying to say is that creating overly generic abstractions/contracts to promote code reuse seems like a good idea, but it usually just makes your application more complex or less readable.
Create abstractions when there are real complexity issues, not beforehand.
I have an entity that has both general properties that are stored in a database table and a reference to a local file on the system disk. I want the create/replace/delete methods for this file to be encapsulated in the data access layer, so that other parts of the application don't have to care how and where it is stored; they just send a byte stream or perform a "clear" operation. At the same time, I'd like the file directory to be defined in web.config, like the database access parameters are.
In my web app I use EF 5 Code First and have defined the entity like the example below:
// EF entity
public class UserImage{
public string Description { get; set; }
[NotMapped]
public LocalFile File { get; set; }
}
// not EF entity
public class LocalFile{
public string Name { get; set; }
public string LocalPath { // should return full path as dir path + file name }
public string Url { // should return LocalPath mapped to Url }
public void Set(FileStream fs) { // saves file to disk }
public void Clear() { // deletes file }
}
In my approach, I consider my DbContext not just a database context, but a context for both database and filesystem storage, and I can provide it with both a DB connection string and a local directory path at creation time. I think this should be good practice; let me know if I'm wrong.
Now the problem: how can I know the local directory path from inside the LocalFile or UserImage objects so they can implement the LocalPath and Url properties? In other words, how can some other part of the application know the actual path/URL of a LocalFile or UserImage instance? Or is there a way to provide these objects with the local directory as they're being created inside the DbContext? Finally, what is an alternative way to encapsulate local storage operations within UserImage so that outside code never has to care how and where the file is stored?
You should create an interface for your file operations with two methods, Stream GetFile(string fileName) and void PutFile(Stream fileStream, string fileName), and implement it with a concrete class whose constructor takes a localPath parameter:
public interface IFileRepository
{
Stream GetFile(string fileName);
void PutFile(Stream fileStream, string fileName);
}
public class FileRepository : IFileRepository
{
private readonly string _localPath;
public FileRepository(string localPath)
{
_localPath = localPath;
}
public Stream GetFile(string fileName)
{
var file = // get a FileStream from the hard drive using fileName and _localPath
return file;
}
...
public void PutFile(Stream fileStream, string fileName)
{
// save the input stream to a disk file using fileName and _localPath
}
}
In your DbContext class you should create a private field of type IFileRepository and initialize it from a constructor parameter:
public class SomeDbContext:DbContext
{
private readonly IFileRepository _fileRepository;
public SomeDbContext(IFileRepository fileRepository)
{
_fileRepository = fileRepository;
}
...
}
And use this _fileRepository to put and get files in DbContext methods.
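For instance, a method on the same SomeDbContext could coordinate both stores; the method and the UserImage members here are just an illustration:
public void SaveUserImage(UserImage image, Stream fileStream)
{
    // Persist the metadata row and the physical file together.
    Set<UserImage>().Add(image);
    SaveChanges();
    _fileRepository.PutFile(fileStream, image.File.Name);
}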
Concrete classes for interface-typed parameters should be supplied by an Inversion of Control container (or another implementation of the Inversion of Control principle).
Update:
public class UserImage
{
public string Description { get; set; }
[NotMapped]
public LocalFile File { get; set; }
}
// not EF entity
public class LocalFile
{
private readonly string _filePath;
public LocalFile(string filePath)
{
_filePath=filePath;
}
public string Name { get; set; }
public string LocalPath { // aggregate Name and filePath }
public string Url { // should return LocalPath mapped to a URL; if I were you, I would write a helper for this }
}
I think my mistake is that I'm trying to access context properties (i.e. the directory path) from inside the entity. The EF database context architecture doesn't implement this, and the entities have no idea how they are stored. I wouldn't like to violate this canon.
The directory for storing files can be considered either a context property (like the connection string) or an entity property (like the path). To implement the first case, I can give myDbContext a Directory property and then resolve all paths via the context instance by calling myContext.GetUrl(userImage.FileName). But myDbContext is not always directly accessible from the presentation level, and I'll be unable to extract a userImage's Url to set it on a web page unless I propagate myDbContext to all upper layers.
If I consider Directory a property of LocalFile, then I need to somehow inject its value, either in the constructor:
public class LocalFile{
// Constructor
public LocalFile(string dir){ // set current dir }
private string _dir;
public GetUrl(){ // return _dir + filename }
}
// cons: parameterless constructor that will be called by DbContext on getting
// data from DB won't set `_dir` and GetUrl won't return correct result
or using static directory that is set up earlier (say in global.asax):
public class LocalFile{
// Constructor
public LocalFile(){ // empty }
public static string Dir;
public GetUrl(){ // return Dir + filename }
}
or even directly accessing web.config to get paths:
public class LocalFile{
// Constructor
public LocalFile(){ // empty }
public GetUrl(){ // read dir from web.config + filename }
}
// not good idea to access web-specific assemblies from EF data classes
or making extension methods at upper layers where web.config is accessible:
public static class MyExtensions
{
public static string GetUrl(LocalFile localFile)
{
// dir from web.config + localFile.Name
}
}
So, there are many possible solutions, and each has its own disadvantages. My case is a little bit more complicated, as my directory path also depends on the LocalFile's parent user's ID, so I have the directory template users/{0}/image.jpg in web.config instead of a simple path.
What I've done to achieve my goals:
put a URL template of the form users/{0}/{1} (0 = parent UserId, 1 = fileName) into web.config
created a static Shared class next to my EF entities
public static class Shared
{
public static Func<string, int, string> LocalFileUrlResolver;
}
populated its values on application start
protected void Application_Start()
{
Shared.LocalFileUrlResolver =
(fileName, userId) =>
String.Format(ConfigurationManager.AppSettings["LocalFileUrl"], userId, fileName);
}
made my User provide its own images with a URL resolver at creation time
public User()
{
// ...
Image = new LocalFile(
"userpic.css",
fileName => Shared.LocalFileUrlResolver(fileName, userId)
);
}
made my LocalFile constructor accept a Func<string, string> parameter that resolves the full URL of a given file name
public class LocalFile
{
public LocalFile(
string fileName,
Func<string, string> fnUrlResolver
)
{
FileName = fileName;
_fnUrlResolver = fnUrlResolver;
}
private readonly Func<string, string> _fnUrlResolver;
public string FileName { get; private set; }
public string Url { get { return _fnUrlResolver(FileName); } }
}
Yes, so many lines. I take the directory template from web.config, inject it into a static member of the data access layer, and make it more specific at the User creation point for the user's local images.
I am absolutely not sure whether it is worth the cost. Maybe in the future I'll choose direct access to web.config :)