I want to do exactly what this question asks:
Cascade Saves with Fluent NHibernate AutoMapping
Using Fluent NHibernate mappings to turn on "cascade" globally, once, for all classes and relation types, with one call rather than setting it on each mapping individually.
The answer to the earlier question looks great, but I'm afraid the Fluent NHibernate API altered its .WithConvention syntax last year and broke the answer... either that or I'm missing something.
I keep getting a bunch of namespace-not-found errors relating to IOneToOnePart, IManyToOnePart and all their variations:
"The type or namespace name 'IOneToOnePart' could not be found (are you missing a using directive or an assembly reference?)"
I've tried the official example DLLs, the RTM DLLs, and the latest build, and none of them seem to make VS 2008 see the required namespace.
The second problem is that I want to use the class with my AutoPersistenceModel
but I'm not sure where to put this line:
.ConventionDiscovery.AddFromAssemblyOf()
in my factory creation method.
private static ISessionFactory CreateSessionFactory()
{
return Fluently.Configure()
.Database(SQLiteConfiguration.Standard.UsingFile(DbFile))
.Mappings(m => m.AutoMappings
.Add(AutoMap.AssemblyOf<Shelf>(type => type.Namespace.EndsWith("Entities"))
.Override<Shelf>(map =>
{
map.HasManyToMany(x => x.Products).Cascade.All();
})
)
) // end mappings
.ExposeConfiguration(BuildSchema)
.BuildSessionFactory();//finalizes the whole thing to send back.
}
Below are the class and the using statements I'm trying:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.IO;
using FluentNHibernate.Conventions;
using FluentNHibernate.Cfg;
using FluentNHibernate.Cfg.Db;
using NHibernate;
using NHibernate.Cfg;
using NHibernate.Tool.hbm2ddl;
using FluentNHibernate.Mapping;
namespace TestCode
{
public class CascadeAll : IHasOneConvention, IHasManyConvention, IReferenceConvention
{
public bool Accept(IOneToOnePart target)
{
return true;
}
public void Apply(IOneToOnePart target)
{
target.Cascade.All();
}
public bool Accept(IOneToManyPart target)
{
return true;
}
public void Apply(IOneToManyPart target)
{
target.Cascade.All();
}
public bool Accept(IManyToOnePart target)
{
return true;
}
public void Apply(IManyToOnePart target)
{
target.Cascade.All();
}
}
}
The easiest way I've found to do this for a whole project is to use DefaultCascade:
.Conventions.Add( DefaultCascade.All() );
Go to "The Simplest Conventions" section on the wiki, for this, and a list of others.
Edit:
Here's the list from the Wiki:
Table.Is(x => x.EntityType.Name + "Table")
PrimaryKey.Name.Is(x => "ID")
AutoImport.Never()
DefaultAccess.Field()
DefaultCascade.All()
DefaultLazy.Always()
DynamicInsert.AlwaysTrue()
DynamicUpdate.AlwaysTrue()
OptimisticLock.Is(x => x.Dirty())
Cache.Is(x => x.AsReadOnly())
ForeignKey.EndsWith("ID")
A word of warning: some of the method names in the wiki may be wrong. I edited the wiki with what I could verify (namely DefaultCascade and DefaultLazy), but can't vouch for the rest. You should be able to figure out the proper names with IntelliSense if the need arises.
Here's a full working example, similar to the Getting Started guide: https://github.com/jagregory/fluent-nhibernate/wiki/Getting-started
//=====CONSOLE MAIN
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using FluentNHibernate.Cfg;
using FluentNHibernate.Cfg.Db;
using NHibernate;
using NHibernate.Cfg;
using NHibernate.Tool.hbm2ddl;
using System.IO;
using FluentNHibernate.Automapping;
using App4.Entities;
using System.Diagnostics;
namespace App4
{
class Program
{
static void Main(string[] args)
{
// create our NHibernate session factory
var sessionFactory = CreateSessionFactory();
using (var session = sessionFactory.OpenSession())
{
// populate the database
using (var transaction = session.BeginTransaction())
{
// create a Shelf and fill it with some Products
var topShelf = new Shelf();
var sw = new Stopwatch();
sw.Start();
for (var i = 0; i < 1000; i++)
{
var potatoes = new Product { Name = "Potatoes" + i.ToString(), Price = 3.60 + i };
var meat = new Product { Name = "Meat" + i.ToString(), Price = 4.49 + i };
//session.SaveOrUpdate(potatoes); //===<<cascading save handles this :-)
//session.SaveOrUpdate(meat);
topShelf.Products.Add(meat);
topShelf.Products.Add(potatoes);
}
sw.Stop();
session.SaveOrUpdate(topShelf);
//session.SaveOrUpdate(superMart);
transaction.Commit();
Console.WriteLine("Add Items: " + sw.ElapsedMilliseconds);
}
}
using (var session = sessionFactory.OpenSession())
{
// retrieve all shelves and display them
using (session.BeginTransaction())
{
var shelves = session.CreateCriteria(typeof(Shelf)).List<Shelf>();
foreach (var store in shelves)
{
WriteShelfPretty(store);
}
}
}
Console.ReadLine();
}
private const string DbFile = "FIVEProgram.db";
private static ISessionFactory CreateSessionFactory()
{
return Fluently.Configure()
.Database(SQLiteConfiguration.Standard.UsingFile(DbFile))
.Mappings(m => m.AutoMappings
.Add(AutoMap.AssemblyOf<Shelf>(type => type.Namespace.EndsWith("Entities"))
.Override<Shelf>(map =>
{
map.HasManyToMany(x => x.Products);//.Cascade.All();
})
.Conventions.AddFromAssemblyOf<CascadeAll>()
)
) // end mappings
.ExposeConfiguration(BuildSchema)//Delete and remake db (see function below)
.BuildSessionFactory();//finalizes the whole thing to send back.
}
private static void BuildSchema(Configuration config)
{
// delete the existing db on each run
if (File.Exists(DbFile))
File.Delete(DbFile);
// this NHibernate tool takes a configuration (with mapping info in)
// and exports a database schema from it
new SchemaExport(config)
.Create(false, true);
}
private static void WriteShelfPretty(Shelf shelf)
{
Console.WriteLine(shelf.Id);
Console.WriteLine(" Products:");
foreach (var product in shelf.Products)
{
Console.WriteLine(" " + product.Name);
}
Console.WriteLine();
}
}
}
//Data Classes
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
namespace App4.Entities
{
public class Product
{
public virtual int Id { get; private set; }
public virtual string Name { get; set; }
public virtual double Price { get; set; }
}
public class Shelf
{
public virtual int Id { get; private set; }
public virtual IList<Product> Products { get; private set; }
public Shelf()
{
Products = new List<Product>();
}
}
}
//Cascade All Helper Class
using FluentNHibernate.Conventions;
using FluentNHibernate.Conventions.AcceptanceCriteria;
using FluentNHibernate.Conventions.Inspections;
using FluentNHibernate.Conventions.Instances;
using System;
using System.Collections.Generic;
namespace App4
{
public class CascadeAll :
IHasOneConvention, //Actually Apply the convention
IHasManyConvention,
IReferenceConvention,
IHasManyToManyConvention,
IHasOneConventionAcceptance, //Test to see if we should use the convention
IHasManyConventionAcceptance, //I think we could skip these since it will always be true
IReferenceConventionAcceptance, //adding them for reference later
IHasManyToManyConventionAcceptance
{
//One to One
public void Accept(IAcceptanceCriteria<IOneToOneInspector> criteria)
{
//criteria.Expect(x => (true));
}
public void Apply(IOneToOneInstance instance)
{
instance.Cascade.All();
}
//One to Many
public void Accept(IAcceptanceCriteria<IOneToManyCollectionInspector> criteria)
{
//criteria.Expect(x => (true));
}
public void Apply(IOneToManyCollectionInstance instance)
{
instance.Cascade.All();
}
//Many to One
public void Accept(IAcceptanceCriteria<IManyToOneInspector> criteria)
{
// criteria.Expect(x => (true));
}
public void Apply(IManyToOneInstance instance)
{
instance.Cascade.All();
}
//Many to Many
public void Accept(IAcceptanceCriteria<IManyToManyCollectionInspector> criteria)
{
// criteria.Expect(x => (true));
}
public void Apply(IManyToManyCollectionInstance instance)
{
instance.Cascade.All();
}
}
}
The signature for the conventions has changed. Are you not using something like ReSharper? It would have pointed you to that conclusion.
You can read more about the new conventions on the wiki.
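For reference, a minimal sketch of what the newer convention signatures look like (the full CascadeAll class above follows the same pattern; the class name here is just an illustration):
using FluentNHibernate.Conventions;
using FluentNHibernate.Conventions.Instances;

public class CascadeAllHasMany : IHasManyConvention
{
    // the newer API hands you an "instance" object rather than the old IOneToManyPart
    public void Apply(IOneToManyCollectionInstance instance)
    {
        instance.Cascade.All();
    }
}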
Related
I am using the ElasticClient.Search function to get the values of the fields.
The issue is: the code below creates the mapping correctly, but searching returns null values for the fields that were mapped before.
Main.cs
using Nest;
using System;
using System.Linq;
using System.Threading;
namespace DataAccessConsole
{
class Program
{
public static Uri node;
public static ConnectionSettings settings;
public static ElasticClient client;
static void Main(string[] args)
{
node = new Uri("http://localhost:9200");
settings = new ConnectionSettings(node).DefaultIndex("getallcommissionspermanentes");
settings.DefaultFieldNameInferrer(p => p);
client = new ElasticClient(settings);
var indexSettings = new IndexSettings();
indexSettings.NumberOfReplicas = 1;
indexSettings.NumberOfShards = 1;
client.Indices.Create("getallcommissionspermanentes", index => index
.Map<GetAllCommissionsPermanentes>(
x => x
.AutoMap<GetAllCommissionsPermanentes>()
));
client.Search<GetAllCommissionsPermanentes>(s => s
.AllIndices()
);
}
}
}
GetAllCommissionsPermanentes.cs
The table is located in an EDMX model of Entity Framework, and the data comes from a SQL Server database.
public partial class GetAllCommissionsPermanentes
{
public int ID { get; set; }
public string NomAr { get; set; }
public string NomFr { get; set; }
}
If you need more information, just leave a comment below.
Thanks
The code is correct, but .AllIndices() searches across all indices, so results that don't match the model are also returned. This code will return more accurate results:
client.Search<GetAllCommissionsPermanentes>(s => s.Index("getallcommissionspermanentes"));
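For example, something along these lines should bring the mapped fields back (a rough sketch; the MatchAll query and the loop over response.Documents are my additions, not part of the original post):
var response = client.Search<GetAllCommissionsPermanentes>(s => s
    .Index("getallcommissionspermanentes")
    .Query(q => q.MatchAll()));

// the typed hits should now have their fields populated
foreach (var doc in response.Documents)
{
    Console.WriteLine($"{doc.ID}: {doc.NomFr} / {doc.NomAr}");
}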
I'm currently learning about .NET Core 2.0 (console app) and I'm working on configuration.
I'm trying to use a JSON file to configure my app, and this JSON is bound to a class. It works fine, but it seems the binder just populates a property with its default value if the key is not in the JSON file. Is there an easy way to tell the system to throw an exception in that case?
For the moment I solved this by using reflection, but maybe someone has a better solution?
This is my code:
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.Extensions.Configuration;
namespace ConsoleApp2
{
public class EnvironmentSettings
{
public int ValueOne { get; set; }
public int ValueTwo { get; set; }
}
class Program
{
static void Main(string[] args)
{
EnvironmentSettings testsetting = new EnvironmentSettings();
var builder = new ConfigurationBuilder();
builder.SetBasePath(@"C:\Users\username\source\repos\ConsoleApp2\ConsoleApp2\");
builder.AddJsonFile("TestConfig.json", false);
var build = builder.Build();
var child = build.GetChildren();
foreach(string key in typeof(EnvironmentSettings).GetProperties().Select(s=>s.Name))
{
if(!child.Any(item => item.Key == key))
{
throw new KeyNotFoundException($"Configuration value with key '{key}' does not exist");
}
}
build.Bind(testsetting);
Console.WriteLine($"{testsetting.ValueOne} - {testsetting.ValueTwo}");
Console.ReadLine();
}
}
}
This is my JSON file:
{
"ValueOne": 1
}
And this is the part I want to change to something better, if possible:
var child = build.GetChildren();
foreach(string key in typeof(EnvironmentSettings).GetProperties().Select(s=>s.Name))
{
if(!child.Any(item => item.Key == key))
{
throw new KeyNotFoundException($"Configuration value with key '{key}' does not exist");
}
}
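If you want to keep that check but make it reusable, one option is to wrap the same reflection logic in a small extension method so any settings class can be validated before binding. This is just a sketch of the approach already used above, not a built-in API:
using System.Collections.Generic;
using System.Linq;
using Microsoft.Extensions.Configuration;

public static class ConfigurationExtensions
{
    // Throws if any public property of T has no matching key in the configuration.
    public static T BindRequired<T>(this IConfiguration configuration) where T : new()
    {
        var keys = configuration.GetChildren().Select(c => c.Key).ToList();
        foreach (var name in typeof(T).GetProperties().Select(p => p.Name))
        {
            if (!keys.Contains(name))
            {
                throw new KeyNotFoundException($"Configuration value with key '{name}' does not exist");
            }
        }

        var settings = new T();
        configuration.Bind(settings);
        return settings;
    }
}
Usage would then be a single call: var testsetting = build.BindRequired<EnvironmentSettings>();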
I have an ASP.NET Core project which needs to be able to support plugins at runtime and, as a consequence, I need to generate database tables based on what has been plugged in. The plugins are each divided into separate projects and they have their own DbContext class. The plugins to be used are not known at compile time, only at runtime.
Now, in EF Core I thought that there would be a method like "UpdateDatabase" where you can just add tables to the existing database, but I was wrong. Is there a way to accomplish this? I was able to generate a separate database for each of the plugins, but that wasn't quite what I had in mind... I need all tables in one database.
Here's the code for the "HRContext" plugin:
using System;
using System.Collections.Generic;
using System.Text;
using Microsoft.EntityFrameworkCore;
using Plugins.HR.Models.Entities;
namespace Plugins.HR.Contexts
{
public class HrContext : DbContext
{
public HrContext()
{
}
public HrContext(DbContextOptions<HrContext> contextOptions) : base(contextOptions)
{
}
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
modelBuilder.HasDefaultSchema("HR");
base.OnModelCreating(modelBuilder);
}
public DbSet<Address> Address { get; set; }
public DbSet<Attendance> Attendance { get; set; }
public DbSet<Department> Departments { get; set; }
public DbSet<Employee> Employees { get; set; }
public DbSet<JobTitle> JobTitles { get; set; }
}
}
Here's another piece of code for the "CoreContext" plugin:
using System;
using System.Collections.Generic;
using System.Text;
using Core.Data.Models;
using Microsoft.EntityFrameworkCore;
namespace Core.Data.Contexts
{
public class CoreContext : DbContext
{
public CoreContext()
{
}
public CoreContext(DbContextOptions<CoreContext> contextOptions) : base(contextOptions)
{
}
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
modelBuilder.HasDefaultSchema("Core");
base.OnModelCreating(modelBuilder);
}
public DbSet<Test> Tests { get; set; }
}
}
My ConfigureServices method in Startup.cs:
public void ConfigureServices(IServiceCollection services)
{
services.AddDbContext<CoreContext>(options => options.UseSqlServer("Data source = localhost; initial catalog = Company.Core; integrated security = true;"))
.AddDbContext<HrContext>(options => options.UseSqlServer("Data source = localhost; initial catalog = Company.HR; integrated security = true;"));
// Add framework services.
services.AddMvc();
}
If I try to change the connection strings to be the same, sooner or later I get an error that says the table for one plugin does not exist. I tried "EnsureCreated" but that didn't work either.
I had the same issue. See the solution I posted on GitHub a few days ago: EF Core Issue #9238
What you need is something like the following:
// Using an interface, so that we can swap out the implementation to support PG or MySQL, etc if we wish...
public interface IEntityFrameworkHelper
{
void EnsureTables<TContext>(TContext context)
where TContext : DbContext;
}
// Default implementation (SQL Server)
public class SqlEntityFrameworkHelper : IEntityFrameworkHelper
{
public void EnsureTables<TContext>(TContext context)
where TContext : DbContext
{
string script = context.Database.GenerateCreateScript(); // See issue #2943 for this extension method
if (!string.IsNullOrEmpty(script))
{
try
{
var connection = context.Database.GetDbConnection();
bool isConnectionClosed = connection.State == ConnectionState.Closed;
if (isConnectionClosed)
{
connection.Open();
}
var existingTableNames = new List<string>();
using (var command = connection.CreateCommand())
{
command.CommandText = "SELECT table_name from INFORMATION_SCHEMA.TABLES WHERE table_type = 'base table'";
using (var reader = command.ExecuteReader())
{
while (reader.Read())
{
existingTableNames.Add(reader.GetString(0).ToLowerInvariant());
}
}
}
var split = script.Split(new[] { "CREATE TABLE " }, StringSplitOptions.RemoveEmptyEntries);
foreach (string sql in split)
{
var tableName = sql.Substring(0, sql.IndexOf("(", StringComparison.OrdinalIgnoreCase));
tableName = tableName.Split('.').Last();
tableName = tableName.Trim().TrimStart('[').TrimEnd(']').ToLowerInvariant();
if (existingTableNames.Contains(tableName))
{
continue;
}
try
{
using (var createCommand = connection.CreateCommand())
{
createCommand.CommandText = "CREATE TABLE " + sql.Substring(0, sql.LastIndexOf(";"));
createCommand.ExecuteNonQuery();
}
}
catch (Exception)
{
// Ignore
}
}
if (isConnectionClosed)
{
connection.Close();
}
}
catch (Exception)
{
// Ignore
}
}
}
}
Then at the end of Startup.Configure(), I resolve an IEntityFrameworkHelper instance and use that with an instance of DbContext to call EnsureTables().
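Here's roughly what that wiring looks like (a sketch only; it assumes IEntityFrameworkHelper was registered in ConfigureServices, e.g. services.AddSingleton<IEntityFrameworkHelper, SqlEntityFrameworkHelper>()):
public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    // ... the usual middleware registration ...
    app.UseMvc();

    // resolve the helper and each plugin context, then create any missing tables
    // (GetRequiredService/CreateScope need using Microsoft.Extensions.DependencyInjection)
    var efHelper = app.ApplicationServices.GetRequiredService<IEntityFrameworkHelper>();
    using (var scope = app.ApplicationServices.CreateScope())
    {
        efHelper.EnsureTables(scope.ServiceProvider.GetRequiredService<CoreContext>());
        efHelper.EnsureTables(scope.ServiceProvider.GetRequiredService<HrContext>());
    }
}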
One issue is I need to still account for the parts of the script which are not CREATE TABLE statements. For example, the CREATE INDEX statements.
I requested they give us a clean solution, for example: add a CreateTable<TEntity>() method to IRelationalDatabaseCreator. Not holding my breath for that though...
EDIT
I forgot to post the code for GenerateCreateScript(). See below:
using System.Text;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.EntityFrameworkCore.Metadata;
using Microsoft.EntityFrameworkCore.Migrations;
using Microsoft.EntityFrameworkCore.Storage;
public static class DatabaseFacadeExtensions
{
public static string GenerateCreateScript(this DatabaseFacade database)
{
var model = database.GetService<IModel>();
var migrationsModelDiffer = database.GetService<IMigrationsModelDiffer>();
var migrationsSqlGenerator = database.GetService<IMigrationsSqlGenerator>();
var sqlGenerationHelper = database.GetService<ISqlGenerationHelper>();
var operations = migrationsModelDiffer.GetDifferences(null, model);
var commands = migrationsSqlGenerator.Generate(operations, model);
var stringBuilder = new StringBuilder();
foreach (var command in commands)
{
stringBuilder
.Append(command.CommandText)
.AppendLine(sqlGenerationHelper.BatchTerminator);
}
return stringBuilder.ToString();
}
}
It's based on the code found here: EF Core Issue #2943
When I run a method within my WCF service I get "An unhandled exception of type 'System.StackOverflowException' occurred in developer1SVC.dll".
No infinite loops exist and no infinite recursion is occurring. Any ideas as to why this may be happening? When I run the method through the WCF Test Client I get the results back correctly; however, hooking it up to my console app and running it breaks the application. The other methods run fine; it is just this one method. I'm just trying to get a feel for WCF services. The service breaks right after I return the accounts from the "GenerateMultipleAccounts" method.
Much help appreciated.
Service
using developer1.Core.ServiceContracts;
using developer1.Core.Data;
using developer1.Core.Dto;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Serialization;
using System.ServiceModel;
using System.ServiceModel.Web;
using System.Text;
using developer1.Core.Dto.Account;
namespace developer1.Core.Service
{
// NOTE: You can use the "Rename" command on the "Refactor" menu to change the class name "Service1" in code, svc and config file together.
// NOTE: In order to launch WCF Test Client for testing this service, please select Service1.svc or Service1.svc.cs at the Solution Explorer and start debugging.
public class Service1 : IService1
{
public string GetData(int value)
{
return string.Format("You entered: {0}", value);
}
public CompositeType GetDataUsingDataContract(CompositeType composite)
{
if (composite == null)
{
throw new ArgumentNullException("composite");
}
if (composite.BoolValue)
{
composite.StringValue += "Suffix";
}
Console.WriteLine(composite.StringValue + composite.BoolValue);
return composite;
}
public List<AccountDto> GenerateMultipleAccounts(int count)
{
List<AccountDto> accounts = new List<AccountDto>();
for (int i = 0; i < count; i++)
{
AccountDto newAccount = new AccountDto() { AccountId = Guid.NewGuid()};
accounts.Add(newAccount);
}
return accounts;
}
}
}
Console Application
using developer1.Core;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using developer1.Core.Service;
using developer1.Core.Dto.Account;
using developer1.Core.ServiceContracts;
using AccountServiceClient = developer1.TestConsole.ServiceReference1.Service1Client;
namespace developer1.TestConsole
{
class Program
{
static void Main(string[] args)
{
try
{
AccountServiceClient AccountServiceClient = new AccountServiceClient();
Guid testGuid = Guid.NewGuid();
List<AccountDto> newAccounts = new List<AccountDto>(AccountServiceClient.GenerateMultipleAccounts(2));
Console.WriteLine(testGuid);
CompositeType testDataContract = new CompositeType() { StringValue = "test", BoolValue = true };
testDataContract = AccountServiceClient.GetDataUsingDataContract(testDataContract);
Console.WriteLine(AccountServiceClient.GetData(6));
Console.WriteLine(testDataContract.StringValue);
//foreach (var item in newAccounts)
//{
// Console.WriteLine(item.AccountId);
//}
}
catch(Exception e) {
Console.WriteLine(e);
}
Console.Read();
}
}
}
Data Contract
using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Serialization;
using System.Web;
namespace developer1.Core.Dto.Account
{
[DataContract]
public class AccountDto
{
[DataMember]
public Guid AccountId { get; set; }
//get { return this.AccountId; }
//set { this.AccountId = this.AccountId == Guid.Empty ? Guid.NewGuid() : value; }
[DataMember]
public Guid UserId { get; set; }
[DataMember]
public string AccountName { get; set; }
[DataMember]
public string BankName { get; set; }
//get { return this.BankName; }
//set { this.BankName = this.BankName == null ? "Unspecified" : value; }
}
}
ANSWER:
So I have solved this dreaded issue. You must create a WCF Service Library instead of a WCF Service Application. My god, it is stupid that the application won't let you split your components outside of the interface.
I'm trying to create a simple service using the code provided, but I don't understand why I get an exception when binding.
10-19 11:42:09.148 I/mono-stdout( 1622): MvxBind:Error: 10.40 Exception thrown during the view binding
MvxBindingLayoutInflatorFactory Line 133!
I need some help please :)
My Source:
DataStore Interface:
using Core.Models;
using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Linq;
using System.Text;
namespace Core.Interfaces
{
public interface IDataStore
{
void UpdateFeed(FeedModel feedModel);
void DeleteFeed(FeedModel feedModel);
void CreateFeed(FeedModel feedModel);
FeedModel GetFeed(Uri uri);
ObservableCollection<FeedModel> Feeds { get; }
}
}
DataStore Class:
using System;
using System.Collections.Generic;
using System.Linq;
using Core.Interfaces;
using System.Collections.ObjectModel;
using Cirrious.MvvmCross.Interfaces.ServiceProvider;
using Cirrious.MvvmCross.Interfaces.Platform;
using Cirrious.MvvmCross.Interfaces.Localization;
using Cirrious.MvvmCross.ExtensionMethods;
using Core.Helpers;
using System.Xml.Serialization;
using System.Xml.Linq;
using System.IO;
namespace Core.Models
{
public class DataStore
: IDataStore
, IMvxServiceConsumer<IMvxSimpleFileStoreService>
, IMvxServiceConsumer<IMvxResourceLoader>
{
public DataStore()
{
Load();
}
public void UpdateFeed(FeedModel feedModel)
{
var toUpdate = this.m_feeds.First(feed => feed.Url == feedModel.Url);
toUpdate.CloneFrom(feedModel);
Save();
}
public void DeleteFeed(FeedModel feedModel)
{
this.m_feeds.Remove(this.m_feeds.First(feed => feed.Url == feedModel.Url));
Save();
}
public void CreateFeed(FeedModel feedModel)
{
this.m_feeds.Add(feedModel);
Save();
}
public FeedModel GetFeed(Uri uri)
{
return this.m_feeds.First(feed => feed.Url == uri);
}
private void Load()
{
var fileService = this.GetService<IMvxSimpleFileStoreService>();
if (!fileService.TryReadBinaryFile(LocationDataService.StoreFileName, LoadFrom))
{
var resourceLoader = this.GetService<IMvxResourceLoader>();
resourceLoader.GetResourceStream(LocationDataService.ResourceFileName, (inputStream) => LoadFrom(inputStream));
}
}
private bool LoadFrom(Stream inputStream)
{
try
{
var loadedData = XDocument.Load(inputStream);
if (loadedData.Root == null)
return false;
using (var reader = loadedData.Root.CreateReader())
{
var list = (List<FeedModel>)new XmlSerializer(typeof(List<FeedModel>)).Deserialize(reader);
this.m_feeds = new ObservableCollection<FeedModel>(list);
return true;
}
}
catch
{
return false;
}
}
private void Save()
{
var fileService = this.GetService<IMvxSimpleFileStoreService>();
fileService.WriteFile(LocationDataService.StoreFileName, (stream) =>
{
var serializer = new XmlSerializer(typeof(List<FeedModel>));
serializer.Serialize(stream, m_feeds.ToList());
});
}
private ObservableCollection<FeedModel> m_feeds;
public ObservableCollection<FeedModel> Feeds
{
get { return this.m_feeds; }
}
}
}
BaseViewModel:
using Cirrious.MvvmCross.Commands;
using Cirrious.MvvmCross.ExtensionMethods;
using Cirrious.MvvmCross.Interfaces.Commands;
using Cirrious.MvvmCross.Interfaces.ServiceProvider;
using Cirrious.MvvmCross.ViewModels;
using Core.Interfaces;
namespace Core.ViewModels
{
public class BaseViewModel
: MvxViewModel
, IMvxServiceConsumer<IDataStore>
{
protected IDataStore DataStore
{
get { return this.GetService<IDataStore>(); }
}
}
}
FeedManagerViewModel:
using Cirrious.MvvmCross.Commands;
using Cirrious.MvvmCross.Interfaces.Commands;
using Core.Controls;
using Core.Models;
using System;
using System.Collections.ObjectModel;
namespace Core.ViewModels
{
public class FeedsManagerViewModel
: BaseViewModel
{
public ObservableCollection<FeedModel> Feeds { get { return this.DataStore.Feeds; } }
...
}
}
View xml:
<Mvx.MvxBindableListView
android:layout_width="fill_parent"
android:layout_height="fill_parent"
local:MvxBind="{'ItemsSource':{'Path':'Feeds'}, 'ItemClick':{'Path':'DisplayItemCommand'}}"
local:MvxItemTemplate="#layout/feedlist_viewmodel" />
This is most likely an error in your XML... but it's hard to tell from just that one line of trace.
What version of MvvmCross are you running?
The tip versions of both Master and vNext show line 133 as:
MvxBindingTrace.Trace(MvxTraceLevel.Error, "Exception during creation of {0} from type {1} - exception {2}", name, viewType.FullName, exception.ToLongString());
So hopefully if you use the tip, then that should give you a lot more information about what is going wrong.
Beyond that, you can always try setting a breakpoint on the offending line to extract more information.
If the exception is on line 99 then change the error logging there from:
MvxBindingTrace.Trace(MvxTraceLevel.Error, "Exception thrown during the view binding ", exception.ToLongString());
to:
MvxBindingTrace.Trace(MvxTraceLevel.Error, "Exception thrown during the view binding {0}", exception.ToLongString());
The error will be in there somewhere :)
Another good debugging technique is to comment out lines one-by-one until the problem goes away - this helps identify where the problem is.
You've got a working development environment and a really powerful debugger - using it is a good skill to learn. 'Code Complete' is one of my favourite books ever :)