I'm having some problems saving an object (FeatureType) that has a 1-M relationship with Section.
public class FeatureType
{
    public int Id { get; set; }
    public string Name { get; set; }

    [ForeignKey("SectionId")]
    public Section Section { get; set; }

    public virtual List<ItemType> ItemTypes { set; get; }
}

public class Section
{
    public int Id { get; set; }
    public string Name { get; set; }
    public int Order { get; set; }

    public virtual List<FeatureType> Features { get; set; }
}
If the ItemTypes are new I have no problem and the insert is done correctly.
But if I want to add some existing ItemTypes, I get this error:
An entity object cannot be referenced by multiple instances of
IEntityChangeTracker.
I have been reading about this problem but I haven't found a way to solve it; it may come down to how my application is designed.
When I'm mapping from my view model to my model, I take the section ID and fetch the Section object from my SectionRepository like this:
private Section GetSection()
{
    var section = _sectionRepository.GetSection(SectionId);
    return section;
}
And this is what is giving me the problem, as the Section is now being tracked by the SectionRepository, which has its own context.
How can I solve this? I have tried just creating a new Section with the existing ID, but that just creates an empty object.
private Section GetSection()
{
    var section = new Section { Id = SectionId };
    return section;
}
UPDATE
To save my entity I just use:
_repository.Create(featureType.ToModel());
public FeatureType ToModel()
{
    var ft = new FeatureType
    {
        Name = Name,
        ControlType = (ControlType)ControlType,
        Order = Order,
        Required = Required,
        RequiredText = RequiredText,
        ItemTypes = GetItemTypes().ToList(),
        Section = GetSection(),
    };
    return ft;
}
UPDATE 2: This is how I have my repositories. I wouldn't like to manage any EF in my controllers, but rather go through some kind of repository or service.
public class EFBaseRepository
{
    protected MyContext Db = new MyContext();

    public void Dispose(bool disposing)
    {
        Db.Dispose();
    }
}
public class EFFeatureTypeRepository : EFBaseRepository, IFeatureTypeRepository
{
    public IQueryable<FeatureType> GetFeatureTypes
    {
        get { return Db.FeatureTypes.Include("Section").Include("ItemTypes"); }
    }

    public Message Create(FeatureType feature)
    {
        try
        {
            Db.FeatureTypes.Add(feature);
            Db.SaveChanges();
            return new Message();
        }
        catch (Exception e)
        {
            throw;
            // return new Message(e, string.Format("Error Creating {0}", feature.GetType()));
        }
    }

    //..Other Methods
}
You say that the SectionRepository has its own context. That is going to cause you problems. The repositories should share a context. The context is a combination of the unit of work and repository patterns. You need to separate the two patterns:
How to migrate towards unit-of-work and repository pattern
EDIT
You can avoid having the DbContext in the Controller by implementing your own Unit Of Work pattern.
public interface IUnitOfWork : IDisposable
{
    ISectionRepository SectionRepository { get; }
    //etc
    int Save();
}
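A minimal EF-backed implementation of that interface could look roughly like this (just a sketch; EFUnitOfWork, the FeatureTypeRepository property and the repository constructors that take the shared context are assumptions based on the names already in your code):
public class EFUnitOfWork : IUnitOfWork
{
    // One context shared by every repository created by this unit of work.
    private readonly MyContext _db = new MyContext();

    private ISectionRepository _sectionRepository;
    private IFeatureTypeRepository _featureTypeRepository;

    public ISectionRepository SectionRepository
    {
        get { return _sectionRepository ?? (_sectionRepository = new EFSectionRepository(_db)); }
    }

    public IFeatureTypeRepository FeatureTypeRepository
    {
        get { return _featureTypeRepository ?? (_featureTypeRepository = new EFFeatureTypeRepository(_db)); }
    }

    public int Save()
    {
        return _db.SaveChanges();
    }

    public void Dispose()
    {
        _db.Dispose();
    }
}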
then in your controller:
public ActionResult Create(FeatureTypeCreate featureType)
{
    _Uow.FeatureTypeRepository.Create(featureType.ToModel());
    _Uow.Save(); // Saving is the responsibility of the Unit Of Work,
                 // not the Repository
}
More references:
Implementing the Repository and Unit of Work
Repository and Unit of Work in Entity Framework
John Papa's original source code
Simply put, the error you're getting means that the entities were returned from a different instance of your DbContext than the one they are now being saved with. Make sure you're not doing something like wrapping your repositories in two different using blocks, and that each repository instance always works with the same DbContext.
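One common way to guarantee that is to hand a single context to every repository instead of letting each repository create its own. A sketch (the injected constructor and the Db.Sections set name are assumptions, not code from the question):
public class EFBaseRepository
{
    // The context is supplied from outside, so all repositories taking part
    // in one operation share the same change tracker.
    protected readonly MyContext Db;

    protected EFBaseRepository(MyContext db)
    {
        Db = db;
    }
}

public class EFSectionRepository : EFBaseRepository, ISectionRepository
{
    public EFSectionRepository(MyContext db) : base(db) { }

    public Section GetSection(int id)
    {
        return Db.Sections.Find(id);
    }
}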
So I have been stuck with this issue for quite some time. We are implementing a DDD architecture and I don't want our models or entities to be anemic.
We are also using EF6 and Autofac. I don't want to implement a repository pattern, as EF already acts as one.
So say for instance we have a context called TestContext
public class TestContext : DbContext
{
    public TestContext() : base("TestContext")
    {
    }

    public DbSet<AEntity> AEntities { get; set; }
}
The one DbSet it has is AEntity:
public class AEntity
{
    public ITest testService;

    public Guid Id { get; set; }
    public string Name { get; private set; }

    public AEntity()
    {
    }

    public AEntity(string name)
    {
        this.Name = name;
    }

    public virtual void Test()
    {
        // Logic.
        // Global testLogic for names
        testService.Test(this.Name);
    }
}
So I have Autofac configured to autowire property injection:
builder.RegisterType<AEntity>().PropertiesAutowired();
and this works like a charm if Autofac is responsible for instantiating the instance, as the following method shows:
public ValuesController(AEntity aEntity)
{
    aEntity.Test();
}
Great, it works and everything, but here comes the catch when I do something like this:
public ValuesController(TestContext context)
{
    var a = context.AEntities.FirstOrDefault();
    a.Test();
}
The ITest is not getting autowired, and I know it's because Autofac is not the instantiator or resolver in this case, but this is something that I want to accomplish.
Any pointers? And let me know if my question does not make sense.
I am very new to C# and ServiceStack and I am working on a small project that consists of calling a third-party API and loading the data I get back from the API into a relational database via ServiceStack's OrmLite.
The idea is to have each endpoint of the API have a reusable model that determines how it should be received in the API's response, and how it should be inserted into the database.
So I have something like the following:
[Route("/api/{ApiEndpoint}", "POST")]
public class ApiRequest : IReturn<ApiResponse>
{
public Int32 OrderId { get; set; }
public DateTime PurchaseDate { get; set; }
public String ApiEndpoint { get; set; }
}
public class ApiResponse
{
public Endpoint1[] Data { get; set; }
public String ErrorCode { get; set; }
public Int32 ErrorNumber { get; set; }
public String ErrorDesc { get; set; }
}
public class Endpoint1
{
[AutoIncrement]
public Int32 Id { get; set; }
[CustomField("DATETIME2(7)")]
public String PurchaseDate { get; set; }
[CustomField("NVARCHAR(50)")]
public String Customer { get; set; }
[CustomField("NVARCHAR(20)")]
public String PhoneNumber { get; set; }
public Int32 Amount { get; set; }
}
My first class represents the API's request with its route, the second class represents the API's response. The API's response is the same for all endpoints, but the only thing that varies is the structure of the Data field that comes back from that endpoint. I've defined the structure of one of my endpoints in my Endpoint1 class, and I am using it in my API's response class. As you can see, I am also defining a few attributes on my Endpoint1 class to help the ORM make better decisions later when inserting the data.
Ok, so the issue is that I have about 15 endpoints and I don't want to create 15 ApiResponse classes when I know the only thing that changes is that first Data field in the class.
So I made something like this:
public class DataModels
{
    public Type getModel(String endpoint)
    {
        Dictionary<String, Type> models = new Dictionary<String, Type>();
        models.Add("Endpoint1", typeof(Endpoint1));
        // models.Add("Endpoint2", typeof(Endpoint2));
        // models.Add("Endpoint3", typeof(Endpoint3));
        // and so forth...
        return models[endpoint];
    }
}
I would like for getModel() to be called when the request is made so that I can pass in the ApiEndpoint field in the ApiRequest class and store the type that I want my Data field to have so that I can dynamically change it in my ApiResponse class.
In addition, there is the ORM part where I iterate over every endpoint and create a different table using the model/type of each endpoint. Something like this:
endpoints.ForEach(
    (endpoint) =>
    {
        db.CreateTableIfNotExists<Endpoint1>();
        // inserting data, doing other work etc
    }
);
But again, I'd like to be able to call getModel() in here and with that define the model of the specific endpoint I am iterating on.
I've attempted calling getModel() in both places but I always get errors back like "cannot use variable as a type" and others... so I am definitely doing something wrong.
Feel free to suggest a different approach to getModel(). This is just what I came up with but I might be ignoring a much simpler approach.
If I understood you correctly, you have different API calls which all return the same object; the only difference is that the field Data can have different types.
In that case you can simply change the type of Data to object:
public object Data { get; set; }
And later simply cast this to the required object:
var data1 = (Endpoint1[])response.Data;
You're going to have a very tough time trying to create .NET types dynamically, which requires advanced usage of Reflection.Emit. It's self-defeating trying to dynamically create Request DTOs with ServiceStack, since the client and metadata services need the concrete Types to be able to call the Service with a Typed API.
I can't really follow your example, but my initial approach would be to see whether you can use a single Service (i.e. instead of trying to dynamically create multiple of them). Likewise with OrmLite, if the schema of the POCOs is the same, it sounds like you would be able to flatten your DataModel and use a single database table.
AutoQuery is an example of a feature which dynamically creates Service Implementations from just a concrete Request DTO, which is effectively the minimum Type you need.
So whilst it's highly recommended to have explicit DTOs for each Service, you can use inheritance to reuse the common properties, e.g.:
[Route("/api/{ApiEndpoint}/1", "POST")]
public ApiRequest1 : ApiRequestBase<Endpoint1> {}
[Route("/api/{ApiEndpoint}/2", "POST")]
public ApiRequest2 : ApiRequestBase<Endpoint1> {}
public abstract class ApiRequestBase<T> : IReturn<ApiResponse<T>>
{
public int OrderId { get; set; }
public DateTime PurchaseDate { get; set; }
public string ApiEndpoint { get; set; }
}
And your Services can return the same generic Response DTO:
public class ApiResponse<T>
{
    public T[] Data { get; set; }
    public String ErrorCode { get; set; }
    public Int32 ErrorNumber { get; set; }
    public String ErrorDesc { get; set; }
}
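A Service consuming these could then look roughly like the following (only a sketch; PurchasesService and CallThirdPartyApi are made-up names, not part of the question):
public class PurchasesService : Service
{
    public object Post(ApiRequest1 request)
    {
        // Hypothetical helper that calls the third-party API for this endpoint.
        Endpoint1[] rows = CallThirdPartyApi(request);

        // OrmLite bulk insert into the table mapped to Endpoint1.
        Db.InsertAll(rows);

        return new ApiResponse<Endpoint1> { Data = rows };
    }
}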
I can't really understand the purpose of what you're trying to do, so the API design is going to need modifications to suit your use-case.
You're going to have similar issues with OrmLite, which is a typed, code-first POCO ORM: you'll run into friction trying to use dynamic types that don't exist at runtime. You'll likely have an easier time executing dynamic SQL, since it's far easier to generate a string than a .NET Type.
With that said, GenericTableExpressions.cs shows an example of changing the table name that OrmLite saves a POCO to at runtime:
const string tableName = "Entity1";
using (var db = OpenDbConnection())
{
    db.DropAndCreateTable<GenericEntity>(tableName);

    db.Insert(tableName, new GenericEntity { Id = 1, ColumnA = "A" });

    var rows = db.Select(tableName, db.From<GenericEntity>()
        .Where(x => x.ColumnA == "A"));

    Assert.That(rows.Count, Is.EqualTo(1));

    db.Update(tableName, new GenericEntity { ColumnA = "B" },
        where: q => q.ColumnA == "A");

    rows = db.Select(tableName, db.From<GenericEntity>()
        .Where(x => x.ColumnA == "B"));

    Assert.That(rows.Count, Is.EqualTo(1));
}
Which uses these extension methods:
public static class GenericTableExtensions
{
    static object ExecWithAlias<T>(string table, Func<object> fn)
    {
        var modelDef = typeof(T).GetModelMetadata();
        lock (modelDef)
        {
            var hold = modelDef.Alias;
            try
            {
                modelDef.Alias = table;
                return fn();
            }
            finally
            {
                modelDef.Alias = hold;
            }
        }
    }

    public static void DropAndCreateTable<T>(this IDbConnection db, string table)
    {
        ExecWithAlias<T>(table, () => {
            db.DropAndCreateTable<T>();
            return null;
        });
    }

    public static long Insert<T>(this IDbConnection db, string table, T obj, bool selectIdentity = false)
    {
        return (long)ExecWithAlias<T>(table, () => db.Insert(obj, selectIdentity));
    }

    public static List<T> Select<T>(this IDbConnection db, string table, SqlExpression<T> expression)
    {
        return (List<T>)ExecWithAlias<T>(table, () => db.Select(expression));
    }

    public static int Update<T>(this IDbConnection db, string table, T item, Expression<Func<T, bool>> where)
    {
        return (int)ExecWithAlias<T>(table, () => db.Update(item, where));
    }
}
But it's not an approach I'd take personally. If I absolutely needed to save the same schema in multiple tables (and I'm struggling to think of a valid use-case outside of table-based multitenancy or sharding), I'd just use inheritance again, e.g.:
public class Table1 : TableBase {}
public class Table2 : TableBase {}
public class Table3 : TableBase {}
Say you have a multi-tenant app. A Tenant has various properties:
public class Tenant{
    public string TenantName {get; set;}
    public string TenantUrl {get; set;}
}
This way when my service layer sends emails, for example, I can do the following:
SendEmail(Tenant.FromEmailAddress, recipientEmailAddress)
This works well for properties. In many places throughout my business logic, I'm encountering cases where tenant-specific behaviors must be accounted for. One example is retrieving photos for the homepage:
public List<string> GetPhotoUrls(){
    if(currentTenant == TenantA){
        // logic to go off to retrieve from one third party
    } else if (currentTenant == TenantB){
        // totally different logic
    } else... // one for each tenant

    // do some stuff
    // return stuff
}
GetPhotoUrls is a simple example - but there are cases like this in many places in my business logic. I'm looking for a simple pattern where I can define and implement tenant-specific logic. The overall goal is to get all tenant-specific logic in one place so tenant creation and definition is easy.
I would like the developer experience to read along the lines of:
public List<string> GetPhotoUrls(){
    currentTenant.GetPhotoUrls(); // define this logic on the tenant object somehow
    // do some stuff
    // return stuff
}
What patterns/constructs are available to achieve this?
Use the strategy pattern in your case. The pattern is best applied when you see switch statements or multiple if statements; it simplifies the client by delegating the custom implementation to dependent interfaces. You may also use it in combination with the factory pattern. To illustrate this:
public interface ITenant{
    List<string> GetPhotoUrls();
}

public class TenantA : ITenant{
    public string TenantName {get; set;}
    public string TenantUrl {get; set;}

    public List<string> GetPhotoUrls(){
        //A implementation
    }
}

public class TenantB : ITenant{
    public string TenantName {get; set;}
    public string TenantUrl {get; set;}

    public List<string> GetPhotoUrls(){
        //B implementation
    }
}
public class SomeTenantApp{
    private readonly ITenant _tenant;

    public SomeTenantApp(ITenant tenant){
        _tenant = tenant;
    }

    public void DoSomething(){
        var urls = _tenant.GetPhotoUrls();
        //do something
    }
}
public static class TenantFactory{
    public static ITenant Create(string id)
    {
        //logic to get concrete tenant
        return concreteTenant;
    }
}
class Program
{
    static void Main(string[] args)
    {
        var tenant = TenantFactory.Create("A");
        var app = new SomeTenantApp(tenant);
        app.DoSomething();
    }
}
The client (SomeTenantApp) won't have to change. You have delegated the implementation to the concrete class, which owns the logic.
If you want to build SaaS, I'd strongly recommend using ASP.NET Core and dependency injection to overcome the multi-tenancy issue.
You can define your tenant class:
public class AppTenant
{
    public string Name { get; set; }
    public string[] Hostnames { get; set; }
}
Next, you can resolve a tenant from the current request:
public class AppTenantResolver : ITenantResolver<AppTenant>
{
    IEnumerable<AppTenant> tenants = new List<AppTenant>(new[]
    {
        new AppTenant {
            Name = "Tenant 1",
            Hostnames = new[] { "localhost:6000", "localhost:6001" }
        },
        new AppTenant {
            Name = "Tenant 2",
            Hostnames = new[] { "localhost:6002" }
        }
    });

    public async Task<TenantContext<AppTenant>> ResolveAsync(HttpContext context)
    {
        TenantContext<AppTenant> tenantContext = null;

        // it's just a sample...
        var tenant = tenants.FirstOrDefault(t =>
            t.Hostnames.Any(h => h.Equals(context.Request.Host.Value.ToLower())));

        if (tenant != null)
        {
            tenantContext = new TenantContext<AppTenant>(tenant);
        }

        return tenantContext;
    }
}
Wiring it up :
public void ConfigureServices(IServiceCollection services)
{
    services.AddMultitenancy<AppTenant, AppTenantResolver>();
}
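If I remember SaasKit correctly, you also register its middleware in Configure so the tenant actually gets resolved for each request (a sketch; treat the exact method name as an assumption and check the SaasKit docs):
public void Configure(IApplicationBuilder app)
{
    // SaasKit middleware: runs the tenant resolver once per request.
    app.UseMultitenancy<AppTenant>();

    app.UseMvc();
}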
Getting the current tenant (whenever you need it) :
public class HomeController : Controller
{
    private AppTenant tenant;

    public HomeController(AppTenant tenant)
    {
        this.tenant = tenant;
    }

    // ...
}
For more info take a look at SaasKit
Building multi-tenant applications with ASP.NET Core (ASP.NET 5)
I'm trying to use ASP.NET Boilerplate in my project and I have one serious problem.
I have two models, Item and Comment:
public class Comment : Entity<int>
{
    [DataType(DataType.MultilineText)]
    public string Text { get; set; }

    public string Author { get; set; }

    public int ItemID { get; set; }
    public virtual Item Item { get; set; }
}

public class Item : Entity<int>
{
    public string Title { get; set; }
    public string Description { get; set; }
    public ItemSourceType SourceType { get; set; }
    public byte[] PhotoBytes { get; set; }
    public string Url { get; set; }

    public virtual ICollection<Comment> Comments { get; set; }
}
Additionally, I have created the default out-of-the-box repository based on RepositoryBase<Item>, and the same for Comment.
The problem occurs when I try to get an Item like this:
public ActionResult Details(int? id)
{
    if (id == null)
    {
        return new HttpStatusCodeResult(HttpStatusCode.BadRequest);
    }

    Item item = _repoItems.Get(id.Value);
    if (item == null)
    {
        return HttpNotFound();
    }

    return View(item);
}
When I'm debugging this code I can see that item has this exception in its Comments property.
Am I missing something from ASP.NET Boilerplate, or what?
Thanks for helping!
//Edit:
Full exception message:
{"The ObjectContext instance has been disposed and can no longer be used for operations that require a connection."}
I just stumbled across the same problem.
It seems that the UnitOfWork implementation creates and disposes a new DbContext for each unit of work.
So to fix this particular problem, try injecting IUnitOfWorkManager and calling:
public ActionResult Details(int? id)
{
    if (id == null)
    {
        return new HttpStatusCodeResult(HttpStatusCode.BadRequest);
    }

    using (var uow = _unitOfWorkManager.Begin())
    {
        try
        {
            Item item = _repoItems.Get(id.Value);
            if (item == null)
            {
                return HttpNotFound();
            }
            return View(item);
        }
        finally
        {
            uow.Complete();
        }
    }
}
If that works, consider calling Begin() in the constructor of your ApiController and Complete() in its Dispose() override.
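That idea might look something like this rough sketch (the controller name is made up, and IUnitOfWorkCompleteHandle is, as far as I recall, the handle type that Begin() returns):
public class ItemsApiController : ApiController
{
    private readonly IUnitOfWorkCompleteHandle _uow;

    public ItemsApiController(IUnitOfWorkManager unitOfWorkManager)
    {
        // Begin a unit of work that lives as long as this controller instance.
        _uow = unitOfWorkManager.Begin();
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            // Complete and dispose the unit of work when the controller is disposed.
            _uow.Complete();
            _uow.Dispose();
        }
        base.Dispose(disposing);
    }
}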
Hope that helps!
No. The problem is most likely in your repository, the code of which you unfortunately have not included. If I had to guess, I'd say you're using something like the following in that repository method:
using (var context = new ApplicationContext())
{
    // fetch something
}
Your Comments property is virtual, which means that, by default, Entity Framework will lazy load it, only actually issuing the query to the database once you try to access the property. However, by that point, your context has been disposed, because the repository method has already finished its work and your context was only available inside the using block.
There are a number of ways to fix this. You could eagerly load the comments inside the using block:
return context.Items.Include("Comments").SingleOrDefault(i => i.Id == id);
However, that really just glosses over the problem. The best thing you can do is simply not use using. Ideally, your context should be instantiated once and only once for each request. The easiest way to do that is to use a dependency injection container and add a constructor to your repository that accepts the context:
public class MyAwesomeRepository
{
    private readonly ApplicationContext context;

    public MyAwesomeRepository(ApplicationContext context)
    {
        this.context = context;
    }
}
The configuration for your DI container will vary depending on which you choose to go with, but generally, you want to make sure that you bind your context class to request scope.
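For example, with Autofac in a classic ASP.NET MVC app the registration could look roughly like this (just a sketch; swap in whichever container and context class you actually use, and note that MvcApplication is simply the default global.asax class name):
var builder = new ContainerBuilder();

// One ApplicationContext per HTTP request, shared by every
// repository resolved during that request.
builder.RegisterType<ApplicationContext>()
       .AsSelf()
       .InstancePerRequest();

builder.RegisterType<MyAwesomeRepository>()
       .AsSelf()
       .InstancePerRequest();

builder.RegisterControllers(typeof(MvcApplication).Assembly);

var container = builder.Build();
DependencyResolver.SetResolver(new AutofacDependencyResolver(container));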
I had a similar issue with ASP.NET Boilerplate and figured out that this framework only does all of this DI magic properly if you name your interfaces and classes in accordance with its naming conventions. You can somehow do it manually, but then you have to drill down into the ABP architecture much deeper than you probably want to.
A link posted by @ChrisPratt (see the comments on his answer) says:
Naming conventions are very important here. For example you can change name of PersonAppService to MyPersonAppService or another name which contains 'PersonAppService' postfix since the IPersonAppService has this postfix. But you can not name your service as PeopleService. If you do it, it's not registered for IPersonAppService automatically (It's registered to DI framework but with self-registration, not with interface), so, you should manually register it if you want.
In my case, I had a service called ProductService implementing the interface IProductAppService. It failed with an ObjectDisposedException until I renamed my service to ProductAppService.
I don't think the OP still has this issue, but hopefully this will save a few hours for the folks struggling with ABP like me. :-)
I have a prototype using WPF + MVVM + PRISM + Entity Framework.
The problem is that I'm very confused about whether to use the Entity Framework entities as the Model of the MVVM pattern. I have a Business Logic Layer, and I had problems using mappers on this layer, because I'm very unhappy with the conversion (mapping problem).
What can I do to simplify the code and use a real Model instead of the entity object (for me, using the entity as the model on the frontend is incorrect), with the MVVM pattern in mind, and stay flexible for future changes? It will have 200+ entities in the final version.
Those are my layers... (Please forget about the mapping, since I took it out by putting the EF entities on the ViewModel, but the image represents the correct layers.)
I'm not using a repository either, since I can add it at the end with changes only in the BLL.
VIEW MODEL:
My current prototype does a GetAll and puts the result in a grid; on the grid's SelectionChanged I put the selected item into the textboxes, and the Save button updates the changes to the database.
public class CadastroClienteViewModel : BindableBase, ICadastroClienteViewModel
{
    private readonly IClienteBLL _clienteService;

    #region Model

    //public Cliente ObCliente { get; private set; }

    public int ClienteID
    {
        get { return ((Cliente)cliItems.CurrentItem).ClienteID; }
        set
        {
            ((Cliente)cliItems.CurrentItem).ClienteID = value;
            OnPropertyChanged("ClienteID");
        }
    }

    public string Nome
    {
        get { return ((Cliente)cliItems.CurrentItem).Nome; }
        set
        {
            ((Cliente)cliItems.CurrentItem).Nome = value;
            OnPropertyChanged("Nome");
        }
    }

    #endregion

    public CadastroClienteViewModel(IClienteBLL ServiceCliente)
    {
        //ObCliente = new Cliente();
        _clienteService = ServiceCliente;

        this.SaveCommand = new DelegateCommand(ExecuteMethodSave);
        this.RefreshCommand = new DelegateCommand(ExecuteMethodRefresh, CanExecuteMethodRefresh);

        RefreshCommand.Execute(null);
    }

    private void ExecuteMethodSave()
    {
        _clienteService.ClienteBLL_Update(((Cliente)cliItems.CurrentItem));
        RefreshCommand.Execute(null);
    }

    private bool CanExecuteMethodRefresh()
    {
        return true;
    }

    private void ExecuteMethodRefresh()
    {
        var personViewModels = _clienteService.ClienteBLL_GetAll();

        //cliente = new ObservableCollection<Cliente>(personViewModels);
        cliItems = new ListCollectionView(personViewModels.ToList());
        cliItems.CurrentChanged += CliItemsOnCurrentChanged;

        //OnPropertyChanged("cliente");
        OnPropertyChanged("cliItems");
    }

    private void CliItemsOnCurrentChanged(object sender, EventArgs eventArgs)
    {
        //OnPropertyChanged("ObCliente");
    }

    public ICommand SaveCommand { get; private set; }
    public ICommand RefreshCommand { get; private set; }

    //public ObservableCollection<Cliente> cliente { get; private set; }
    public ICollectionView cliItems { get; private set; }
}
MODEL (I'm not using it... but I would like to):
public class MCliente
{
    public int ClienteID { get; set; }
    public string Nome { get; set; }
}
EF entity:
namespace Sistema.DataEntities.Models
{
    public class Cliente
    {
        public Cliente()
        {
        }

        public int ClienteID { get; set; }
        public string Nome { get; set; }
    }
}
BLL:
public class ClienteBLL : IClienteBLL
{
    readonly ISistemaContext _context;

    public ClienteBLL(ISistemaContext context)
    {
        _context = context;
    }

    public IEnumerable<Cliente> ClienteBLL_GetAll()
    {
        return _context.Cliente.AsEnumerable();
    }

    public Cliente ClienteBLL_GetByID(int id)
    {
        return _context.Cliente.Find(id);
    }

    public bool ClienteBLL_Adicionar(Cliente Obcliente)
    {
        _context.Cliente.Add(Obcliente);
        return _context.SaveChanges() > 0;
    }

    public bool ClienteBLL_Update(Cliente Obcliente)
    {
        _context.Cliente.Attach(Obcliente);
        _context.Entry(Obcliente).State = EntityState.Modified;
        return _context.SaveChanges() > 0;
    }

    public bool ClienteBLL_Delete(int id)
    {
        var clubMember = _context.Cliente.Find(id);
        _context.Cliente.Remove(clubMember);
        return _context.SaveChanges() > 0;
    }
}
I'm adding this as an answer (not a comment) even though it's not a final answer to your question (because it's opinion-based); it just doesn't fit in a comment. This is what I would do for a WPF application that requires a database.
I would entirely drop the idea of directly connecting your WPF application to your database. I would build a 3-tier architecture, i.e. I would create a stateless webservice that does all the work on the server side.
So you would have:
the database
the webservice (using WCF), which is connected to the database and does all the data work for you (I would even make it responsible for the business logic too); see the sketch after this list
the WPF application, which is connected to the webservice:
the View layer is your XAML + your code-behind
the ViewModel layer is, well, your ViewModels (out of scope of your question, but feel free to ask if you have any questions about that layer). The ViewModels asynchronously call the webservice
the Model is the client WCF proxy
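For illustration, such a service contract could look roughly like this (only a sketch assuming a flat ClienteDto is exchanged; the names are made up, not taken from your code):
[ServiceContract]
public interface IClienteService
{
    [OperationContract]
    List<ClienteDto> GetAllClientes();

    [OperationContract]
    bool UpdateCliente(ClienteDto cliente);
}

[DataContract]
public class ClienteDto
{
    [DataMember]
    public int ClienteID { get; set; }

    [DataMember]
    public string Nome { get; set; }
}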
Some benefits of this approach:
depending on the hardware/network architecture, it can be a huge performance benefit to make only ONE call to the server instead of N calls (assuming the latency between the DB and the webservice, both on the "server side", is lower than the latency between the WPF application and the database)
more scalable
all the benefits of the stateless approach: one Entity Framework context instantiation per webservice request, so it is much easier to deal with concurrency issues (in case you have N WPF instances running concurrently)
easier to maintain (loose coupling between tiers)
easier to test (assuming you actually build tests)
better security (no need to expose a direct access to the database over the network)