DB - How to save multiple versions of the same file? - C#

I have to save multiple versions of the same file.
I would like to know how to represent this in the database, and whether I have to configure something in EF Core for it.
Basically, the user uploads a file, and afterwards it's possible to upload a new version of this file. Only the newest version should be shown to the user in the standard view, but it should be possible to see the other versions. Versions can be deleted. A File has Actions, which describe what can be done with it.
How should I write the Model for this?
public class File
{
    public string FileName { get; set; }
    public int Version { get; set; }
    public bool Active { get; set; }
    // + Actions, which are the same for every Version
}
1.) I thought that I could just add a List<File> to this class/table, but my problem is what happens if the first version is deleted? I would always have to remember to switch the files around.
Also, with this a Version could itself have Versions, but that's not correct, as they all belong together.
2.) I could add a public File ParentFile so that every version links to the first file. But then I will again have problems with deletion.
3.) Introduce something like a SameFileId. Then I would add a list of these IDs to the Project table, search all the files with this ID, and take the one that is Active or has the highest Version... But then I have a problem with my Actions, since I would have to update all versions instead of just one.
Any ideas?

Why not just keep the File Definition and the File Contents separate?
public class File
{
    public string FileName { get; set; }
    public List<FileContents> Versions { get; set; } = new List<FileContents>();

    [NotMapped]
    public FileContents ActiveContents =>
        Versions.OrderByDescending(v => v.Version).FirstOrDefault();

    // Actions, which are the same for every Version
}

public class FileContents
{
    public File FileDefinition { get; set; }
    public int Version { get; set; }
    // Actual Contents
}
This way Actions are only saved once per file. The ActiveContents property always returns the latest version if one exists. It is marked [NotMapped] because the active version is implied by the highest Version number and doesn't need to be stored in the database.
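
To address the EF Core part of the question: a one-to-many relationship like this is largely picked up by convention, but it can be made explicit. A minimal sketch follows; the context name is made up, both classes still need primary keys (which the snippets above omit), and cascade delete is just one reasonable choice:

public class AppDbContext : DbContext
{
    public DbSet<File> Files { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // One File, many FileContents rows; deleting a File
        // removes all of its versions with it.
        modelBuilder.Entity<File>()
            .HasMany(f => f.Versions)
            .WithOne(c => c.FileDefinition)
            .OnDelete(DeleteBehavior.Cascade);
    }
}

Deleting a single version is then just removing one FileContents row, and ActiveContents automatically falls back to the next-highest Version.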

How to split a large request object that contains many properties of which one of them is a huge list of values

Excuse me right off the bat. I am sort of new.
I have an object that contains a few properties, one of which is itself a List. We do not know how big the list of values in the input payload will be (it could be 1,000, it could be 100,000). We log this request payload before we process it.
We use _logger.Verbose("Some String...", {object});
When we log, the log file (we use Serilog) is saved as a text file with huge values, in JSON format.
When the input is too big, the logger tries to log but fails and retries many times because of the big payload.
I am looking for a way to split the list, or do some looping and process it in pieces, but I don't know how to do this in C#. I googled and researched a lot, but to no avail. I found the LINQ Skip and Take methods but am unsure how to use them.
Code below. In this case, imagine Models has 1,000 or 100,000 entries; it could be anything. I am just looking for loop logic in C# to divide it into decently sized pieces and process them.
public class Make
{
    public int ID { get; set; }
    public string Name { get; set; }
    public string Category { get; set; }
    public List<Model> Models { get; set; }
}

public class Model
{
    public string Name { get; set; }
    public string County { get; set; }
    public string Submodel { get; set; }
}

public void ProcessCars(Make make)
{
    _logger.Verbose("Some String... {@Make}", make);
    // Processing
    // .....
}
I understand that your purpose is to view or debug the values in your list.
If I were you, I would ask myself a few questions:
Do I need to write all the values? Why can't I filter first before logging?
What's the purpose of writing to a text file when you can log to a database? Serilog supports DB logging.
Is it a best practice to log large values to a text file?
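
If the payload really must be logged in full, one way to avoid a single giant write is to log the list in fixed-size batches. Here is a minimal sketch using the LINQ Skip and Take methods mentioned above (the batch size and message template are illustrative):

// assumes: using System.Linq; plus the Make/Model classes above
const int batchSize = 500;
for (int offset = 0; offset < make.Models.Count; offset += batchSize)
{
    var batch = make.Models.Skip(offset).Take(batchSize).ToList();
    // Serilog's @ operator destructures the batch as structured data
    _logger.Verbose("Models batch starting at {Offset}: {@Batch}", offset, batch);
}

Note that Skip/Take re-enumerates from the start on plain IEnumerable<T> sources; buffering batches in a single pass avoids that if the source isn't an in-memory list.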

DDD and File Management

I've got a simple document management system I'm putting together, and I'm trying to follow solid DDD principles; things have been coming together. One area I've been questioning is what the cleanest solution for managing files would be.
For background, a lot of this document management is going to revolve around uploading documents and assigning them to specific "work orders". This is in the manufacturing industry; we need to keep track of certain documents and send them to the customer when we're all done making their stuff.
So my bounded context is mostly composed of a couple of main entities, let's say DocumentPackage, Document, Requirements, etc. A DocumentPackage is a grouping of documents for a single "work order". The same document may be used in multiple DocumentPackages. Each DocumentPackage has a certain number of Requirements, each of which is a distinct type of document that is needed as part of the package.
So when it comes to uploading, downloading, and manipulating the files, and updating the database to reflect these changes, where do I want to handle most of that?
Here's an example of an UploadDocumentCommand and handler I have. Note that I decided to save the uploaded file to the local file system in the API controller, and to pass it into my command as the FileInfo.
public class UploadDocumentCommand : IRequest<AppResponse>
{
    public UploadDocumentCommand(Guid documentId, string workOrderNumber, FileInfo file, Guid? requirementId = null)
    {
        DocumentId = documentId;
        WorkOrderNumber = new WorkOrderNumber(workOrderNumber);
        FileInfo = file;
        RequirementId = requirementId;
    }

    public Guid DocumentId { get; }
    public WorkOrderNumber WorkOrderNumber { get; }
    public Guid? RequirementId { get; }
    public FileInfo FileInfo { get; }
}

public class UploadDocumentCommandHandler : IRequestHandler<UploadDocumentCommand, AppResponse>
{
    private readonly IDocumentRepository documentRepository;
    private readonly IDocumentPackageRepository packageRepo;

    public UploadDocumentCommandHandler(IDocumentRepository documentRepository, IDocumentPackageRepository packageRepo)
    {
        this.documentRepository = documentRepository;
        this.packageRepo = packageRepo;
    }

    public async Task<AppResponse> Handle(UploadDocumentCommand request, CancellationToken cancellationToken)
    {
        try
        {
            // get the package, throws not found exception if it does not exist
            var package = await packageRepo.Get(request.WorkOrderNumber);
            var document = DocumentFactory.CreateFromFile(request.DocumentId, request.FileInfo);
            if (request.RequirementId != null)
            {
                package.AssignDocumentToRequirement(document, request.RequirementId.GetValueOrDefault());
            }
            else
            {
                package.AddUnassignedDocument(document);
            }
            await documentRepository.Add(document, request.FileInfo);
            await packageRepo.Save(package);
            return AppResponse.Ok();
        }
        catch (AppException ex)
        {
            // the file may have been uploaded to docuware but failed afterwards; this should be addressed
            // this can be better handled by using an event to add the document to the package only after successful upload
            return AppResponse.Exception(ex);
        }
    }
}

public class Document : Entity<Guid>
{
    private Document() { }

    public Document(Guid id, string fileName, DateTime addedOn)
    {
        Id = id;
        FileName = new FileName(fileName);
        AddedOn = addedOn;
    }

    public FileName FileName { get; private set; }
    public DateTime AddedOn { get; private set; }

    public override string ToString() => $"Document {Id} {FileName}";
}
My DocumentRepository has mixed responsibilities: I'm having it both save the file to the file store and update the database. I'm using a specific document storage application right now, but I wanted to keep this abstracted so that I am not stuck with it. It is also possible that different files, like images, might have different stores. But part of me is wondering if it is actually better to have this logic in my application layer, where my handler takes care of storing the file and updating the database. I don't feel like the DocumentRepository follows SRP, and the act of loading my Document entity shouldn't have a dependency on my DocuwareRepository.
public class DocumentRepository : IDocumentRepository
{
    private readonly DbContext context;
    private readonly IDocuwareRepository docuwareRepository;

    public DocumentRepository(DbContext context, IDocuwareRepository docuwareRepository)
    {
        this.context = context;
        this.docuwareRepository = docuwareRepository;
    }

    public async Task<Document> Get(Guid id)
    {
        return await context
            .Document
            .FirstOrDefaultAsync(x => x.Id.Equals(id));
    }

    public async Task Add(Document document, FileInfo fileInfo)
    {
        var results = await docuwareRepository.UploadToDocuware(document.Id, fileInfo);
        var storageInfo = new DocumentStorageInfo
        {
            DocuwareId = results.DocuwareDocumentId,
            DocuwareFileCabinetId = results.CabinetId,
            Document = document
        };
        context.DocumentStorageInfo.Add(storageInfo);
        context.Document.Add(document);
        await context.SaveChangesAsync();
    }

    public Task<FileStream> Download(Guid id)
    {
        throw new NotImplementedException();
    }

    public void Dispose()
    {
        context?.Dispose();
    }
}
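
For what it's worth, the separation being weighed here is often expressed as a storage abstraction that only moves bytes, leaving the repository purely database-facing. A rough sketch of that shape (all names are illustrative, not a prescribed design):

// Hypothetical abstraction: knows how to move file bytes, nothing else.
public interface IFileStore
{
    Task<DocumentStorageInfo> Upload(Guid documentId, FileInfo file);
    Task<Stream> Download(DocumentStorageInfo storageInfo);
}

// The handler would then coordinate storage and persistence explicitly:
//   var storageInfo = await fileStore.Upload(document.Id, request.FileInfo);
//   await documentRepository.Add(document, storageInfo);

With that split, loading a Document no longer drags in IDocuwareRepository, and images or other file types can get their own IFileStore implementation.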
I've got another use case I'm working on where the DocumentPackage has to be downloaded. I want my application to take all the valid documents from the package and compile them into a zip file in which the documents are structured in a folder hierarchy based on the Requirements. That archive zip is going to get saved long term for traceability reasons, and the client can download it. So I have another entity I'm calling the DocumentPackageArchive; it's got a repository, and I'm thinking the zip file gets compiled there. This would call for the repository downloading all the files from their respective stores, compressing them into a zip, saving the zip locally on the web server, sending the zip file back to be saved for read-only keeping, and updating the database with some data about the archive. Yes, I am intentionally creating a copy of the document package as a snapshot.
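
As a rough illustration of that archive step, System.IO.Compression can build the folder hierarchy directly from the entry names; the properties used here (LocalPath, RequirementName, FileName) and the zipPath/validDocuments variables are placeholders for wherever that data actually lives:

using System.IO;
using System.IO.Compression;

using (var zipStream = new FileStream(zipPath, FileMode.Create))
using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Create))
{
    foreach (var doc in validDocuments)
    {
        // The '/' in the entry name becomes a folder inside the zip,
        // e.g. "RequirementA/drawing.pdf".
        archive.CreateEntryFromFile(doc.LocalPath, $"{doc.RequirementName}/{doc.FileName}");
    }
}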
So where this leaves me: I feel like the file management is happening all over the place. I'm saving some files in the web API because I feel like I need to save them to temp storage right when I get them from an IFormFile. I'm handling the file info in the application layer as part of the commands. And most of the heavy lifting is happening in the infrastructure layer.
How would you recommend I approach this? I feel like those two repositories dealing with the documents need to be redesigned.
As an additional note, I am also considering coordinating some of this work through domain events. I don't think I'm ready for that yet, and it seems like a bit more complexity than I need to be adding right now. Still, advice in that area would also be appreciated.

Class property says "File not Found" ... before even being initialized?

I am having the weirdest problem right now...
I am working on a program that tries to instantiate types (DbContext derivatives) from another project. Because the DbContexts don't simply accept standard parameters, but only interfaces from that other project, I had to reference some of the other project's assemblies so I could create dummy types that implement the interfaces.
The main one is this:
class ContextConfigurationDummy : IContextConfiguration
{
    #region privates
    private ILogger GetLoggerDummy() => new LoggerDummy();
    private ITenantSchemaResolver GetSchemaResolverDummy() => new SchemaResolverDummy();
    private CultureInfo GetUserCulture() => new CultureInfo("en-US");
    private DatabaseMetadata GetDbMetadata() => new DatabaseMetadata();
    private IEntityValidatorFactory GetValidatorFactory() => new ValidatorFactoryDummy();
    #endregion

    public DbConnection GetDbConnection() => new SqlConnection(@"[insert connection string]");
    public RuminantDummy GetRuminant() => new RuminantDummy();

    public Func<CultureInfo> UserCulture { get; set; }
    public Func<DbConnection> Connection { get; set; }
    public Func<IDatabaseMappingRuminant> Ruminant { get; set; }
    public Func<DatabaseMetadata> DatabaseMetadata { get; set; }
    public IEntityValidatorFactory ValidatorFactory { get; set; }
    public Func<ITenantSchemaResolver> TenantSchemaResolver { get; set; }
    public Func<ILogger> Logger { get; set; }

    public ContextConfigurationDummy()
    {
        UserCulture = GetUserCulture;
        Connection = GetDbConnection;
        Ruminant = GetRuminant;
        DatabaseMetadata = GetDbMetadata;
        ValidatorFactory = GetValidatorFactory();
        TenantSchemaResolver = GetSchemaResolverDummy;
        Logger = GetLoggerDummy;
    }
}
The original interface is this:
public interface IContextConfiguration
{
    Func<CultureInfo> UserCulture { get; set; }
    Func<DbConnection> Connection { get; set; }
    Func<IDatabaseMappingRuminant> Ruminant { get; set; }
    Func<DatabaseMetadata> DatabaseMetadata { get; set; }
    IEntityValidatorFactory ValidatorFactory { get; set; }
    Func<ITenantSchemaResolver> TenantSchemaResolver { get; set; }
    Func<ILogger> Logger { get; set; }
}
Everything was working just fine. I didn't look at that part of the code for a while and made quite a few changes to the rest (including to the project itself, to adapt it for continuous integration). When I was done I started testing again and, when calling the constructor of ContextConfigurationDummy, the debugger showed "File not Found" on the properties.
Note how the debugger has not even tried to evaluate them yet! A "normal" error would be that the value is null, and after it tries to evaluate it there would be some kind of exception. Not here. There isn't even any file being loaded in this class. Same error for all the following fields.
I figured I had to have broken something while working on the project structure, so I remade the entire project, took only the code, re-referenced the assemblies, left everything at default... and had the exact same error again. So it has to be the code, right?
Well, I reverted to an old commit through git, ran the program, and everything worked as it should. Great. Checked the code: the relevant parts are exactly the same, except for the namespaces, which were slightly adapted. So, not the code either?
I went back to my current commit, and after trying different ways of referencing the external assemblies, I tried something else: just for lulz I went into my Main, the start of the program, and tried to instantiate the class with exactly the same line I originally used: var dummy = new ContextConfigurationDummy();
It works. Everything as expected; my class just instantiates normally.
But as if this wasn't weird enough, I'll do you one better:
After calling the constructor in Main I kept the program running and went back to the spot where I originally called new ContextConfigurationDummy(). Back into the constructor and... it works there as well. Take the line out of Main - it doesn't work. Put it back in - it works again.
I'm beyond stumped, please help. (And don't tell me to clean my solution; I've done so a hundred times and literally remade the entire thing.)
I'm effectively combining answers from my own experience and some google-fu here, but:
.NET Version
Make sure you're using the same .NET version. I've had it where I've created a new project with an old .NET version and it's screwed me completely with the same issue.
Assembly Names
Make sure none of your DLLs share the same name as the assembly you're bringing in. If you have two MyAssembly.dlls, it could be looking at the wrong one / wrong version. Additionally, if you've updated the other project at some point, check that all your references are using the same version of a brought-in DLL.
NuGet Packages
Have you updated any NuGet packages in your project that are also required by the other assembly? (I've had this many times with my unit tests.) If so, you'll either need to update the other project's NuGet packages to the same version, or possibly add an App.config file to tell it to use the newer version.
I've never had the assembly-names issue, but I've had the NuGet package issue and the .NET version issue quite a few times (it's something you just don't think about).
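
For the App.config suggestion above, a binding-redirect sketch (the assembly name, public key token, and version numbers are placeholders for whatever your conflict actually is):

<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="Some.Shared.Assembly" publicKeyToken="0123456789abcdef" culture="neutral" />
        <!-- send every older version to the one actually deployed -->
        <bindingRedirect oldVersion="0.0.0.0-2.0.0.0" newVersion="2.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>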

db4o does not seem to store items added to a List<T>

Let's say I have a model that looks like this (fake, but as an example)
public class PackageItem
{
    public string Name { get; set; }
    public bool Available { get; set; }
}

public class Package
{
    public string Recipient { get; set; }
    public List<PackageItem> Items { get; set; }
}
First, a new instance of Package is created with just the Recipient set to some value.
Some time later (the database has been closed in between), I retrieve it from the database and add items to it:
var packages = database.Query<Package>();
foreach (var package in packages)
{
    package.Recipient = "Bla bla bla";
    package.Items.Add(new PackageItem() { Name = "SomeName", Available = true });
    database.Store(package);
}
database.Commit();
At this point the changes to the Recipient property are saved in the database, but the additions to the Items property are not.
I found some information about db4o not knowing anything has changed because the reference to the List did not change, and db4o does not track changes within the List itself... is this correct?
And how should one deal with a List<T> when using db4o?
Update: I thought executing
database.Store(package.Items);
would help, but it does not.
This is all using db4o v8 on Debian with Mono 2.11
You've stumbled upon db4o's update depth.
Take a look at the documentation. You either need to explicitly store the changes to the collection:
database.Store(package.Items);
Or increase the update depth. Or switch to transparent persistence.
Besides the options mentioned by Gamlor, you can also explicitly define the update depth in the Store method itself (the overloaded Store method is hidden in Ext()), like:
database.Ext().Store(package, int.MaxValue);
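
Alternatively, the cascade behavior can be configured once, up front, so that a plain Store(package) picks up collection changes. A minimal sketch against the db4o 8 embedded API (the database file name is a placeholder):

using Db4objects.Db4o;

var configuration = Db4oEmbedded.NewConfiguration();
// cascade updates into Package's members, including the Items list
configuration.Common.ObjectClass(typeof(Package)).CascadeOnUpdate(true);

using (var database = Db4oEmbedded.OpenFile(configuration, "packages.db4o"))
{
    // query, mutate, and Store(package) as in the question
}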

How do I design a specific super class

I am trying to refactor a solution to bring on board another project.
I have a Core project where common classes across projects reside.
I've tried to simplify my question by using two imaginary projects: Holidays and Weather...
I have a file load process setup for the Holidays project which has the following 2 classes:
public class Job
{
    public virtual string CreatedBy { get; set; }
    public virtual DateTime? CreatedDate { get; set; }
    public virtual Security Security { get; set; }

    protected IList<File> _files = new List<File>();
    public virtual IEnumerable<File> Files
    {
        get { return _files; }
    }
}

public class File
{
    public virtual string FileName { get; set; }
    public virtual FileType FileType { get; set; }
    public virtual FileStatusType FileStatusType { get; set; }
    public virtual Job Job { get; set; }
}
The file load process for the Weather project has exactly the same structure as Holidays, except that the Job class does not have a Security property.
My question is, is it possible to somehow move both classes into the Core project to allow both projects to use them?
Obviously Weather does not need the Security property, so I was thinking I would have a Core.Job class without Security, and then extend the Core.Job in Holidays.Job.
But once I do that, in the Core.File class, what Job is it referring to? As it sits in the Core project it must be the Core.Job.
So would I then need to have Job and File sit in Holidays, and Weather (and any other future projects) use the Core.Job and Core.File?
I don't want the Core project to have any references to sub projects.
I am using NHibernate, and so have mapping files - adding to the complexity.
Hope this is clear enough
Thanks
You can certainly do this, but I am not sure whether it brings you true benefit:
Does the Core itself work with the base Job in any way? If it does not, implementing Job separately in each project may help you keep coupling loose, even though it'd be a little redundant. In code I wrote, I have sometimes introduced unnecessary dependencies by extracting interfaces without adding true benefit. This is why I am a bit cautious.
If Core does actual work with it, the part to refactor into the common base Job is perhaps just the interface Core works with.
You may also think of an interface instead of a base class. Security may semantically belong to another interface. Moreover, with a base class you hand over a lot of control over your classes to the Core.
Do you ever hand a job from one project to another (or are they mapped to the same DB table via NHibernate)? If you don't, an internal redundant class may be fine too.
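
A sketch of that interface idea (names are illustrative, not a prescribed design):

// Core depends only on this; each project keeps its own Job class.
public interface IJob
{
    string CreatedBy { get; set; }
    DateTime? CreatedDate { get; set; }
    IEnumerable<File> Files { get; }
}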
It's not very clear to me why you're unsure about the solution you already proposed (assuming I understood you correctly):
//Core DLL
public class Job
{
    public virtual string CreatedBy { get; set; }
    public virtual DateTime? CreatedDate { get; set; }

    protected IList<File> _files = new List<File>();
    public virtual IEnumerable<File> Files
    {
        get { return _files; }
    }
}
In the Holidays project you have:
public class HollidayJob : Job
{
    public virtual Security Security { get; set; }
}
In Weather, simply use the type Job if it is self-sufficient.
In this case you reference the Core DLL from both the Holidays and Weather projects. When you persist via NHibernate, a HollidayJob saves one extra field; when Weather reads the same table, it skips that field, as it doesn't know or care about it.
Hope this helps.
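
To make the polymorphism concrete, a small usage sketch (Security is whatever type the Holidays project already defines; the values are made up):

// Holidays project: File.Job is typed as Core's Job,
// but it can hold a HollidayJob just fine.
var job = new HollidayJob
{
    CreatedBy = "importer",
    CreatedDate = DateTime.UtcNow,
    Security = new Security()
};
var file = new File { FileName = "bookings.csv", Job = job };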
