I have a legacy system that dynamically augments a table with additional columns when needed. Now I would like to access said table via C#/NHibernate.
There is no way to change the behaviour of the legacy system, and I need to work with the data in the additional columns dynamically. Therefore dynamic-component mapping is not an option, since I do not know the exact names of the additional columns at mapping time.
Is there a way to put all unmapped columns into a dictionary (column name as key)? Or, if that's not an option, put all columns into a dictionary?
Again, I do not know the names of the columns at compile time so this has to be fully dynamic.
Example:
public class History
{
public Guid Id { get; set; }
public DateTime SaveDateTime { get; set; }
public string Description { get; set; }
public IDictionary<string, object> AdditionalProperties { get; set; }
}
So if the History table contains the columns Id, SaveDateTime, Description, A, B, C and D, I would like to have "A", "B", "C" and "D" in the IDictionary. Or, if that's too hard to do, simply throw all columns in there.
For starters I would also be fine with only using string columns if that helps.
You probably need an ADO.NET query to get this data out. If you use NH, even with a SQL query using SELECT *, you won't get the column names.
You can try using SMO (SQL Server Management Objects, a .NET API for administering SQL Server) or some other way to read the table definitions. Then you build up the mapping using Fluent NHibernate with a dynamic component. I'm not sure whether you can change the mappings after you have already used the session factory; it's worth a try. Good luck :-)
I guess with the following code you can get your results in a Hashtable:
var hashTable = (Hashtable)Session.CreateSQLQuery("SELECT * FROM MyTable")
.SetResultTransformer(Transformers.AliasToEntityMap)
.UniqueResult();
Obviously all your data will be detached from the session...
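Building on that, a small helper (a sketch; the mapped-column set below comes from the History class in the question) can split such a row into the statically mapped values and everything else, which is exactly what the AdditionalProperties dictionary should hold:

```csharp
using System;
using System.Collections;
using System.Collections.Generic;

public static class RowSplitter
{
    // Copies every column that is not in the statically mapped set into a
    // dictionary keyed by column name, suitable for History.AdditionalProperties.
    public static IDictionary<string, object> ExtractAdditional(
        IDictionary row, ISet<string> mappedColumns)
    {
        var additional = new Dictionary<string, object>(StringComparer.OrdinalIgnoreCase);
        foreach (DictionaryEntry entry in row)
        {
            var column = (string)entry.Key;
            if (!mappedColumns.Contains(column))
                additional[column] = entry.Value;
        }
        return additional;
    }
}
```

For the question's History table, you would call it with the known columns, e.g. `RowSplitter.ExtractAdditional(hashTable, new HashSet<string>(StringComparer.OrdinalIgnoreCase) { "Id", "SaveDateTime", "Description" })`, and everything else (A, B, C, D, ...) lands in the dictionary.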
I think the best you can do is to find the columns at runtime, create a mapping for these extra columns, and write the output to an XML file. Once that is done you can add the mapping at runtime:
ISessionFactory sessionFactory = new Configuration()
    .Configure()
    .AddFile("myDynamicMapping.hbm.xml")
    .BuildSessionFactory();
How you would use this mapping is a good question, as you would have to create your class dynamically as well; at that point you are SOL.
good luck.
What's not possible in SQL is not possible in NHibernate.
It's not possible to write an INSERT query that inserts into unknown columns.
I assume that your program builds a single Configuration object on startup, by reading XML files, and then uses the Configuration object to build ISessionFactory objects.
Instead of reading the XML files, building the Configuration object, and calling it a day, your program can query the database for any extra columns on this table and then alter the Configuration, adding columns to the dynamic-component mapping programmatically, before compiling the Configuration object into an ISessionFactory.
NHibernate does have ways to get the database schema, provided it's supported by the database type/dialect. It is primarily used by the SchemaExport and SchemaUpdate functions.
If you're not scared of getting your hands a bit dirty, start by looking at the GenerateSchemaUpdateScript function in the Configuration class:
https://nhibernate.svn.sourceforge.net/svnroot/nhibernate/trunk/nhibernate/src/NHibernate/Cfg/Configuration.cs
In particular, you'd be interested in this class, which is referenced in that method:
https://nhibernate.svn.sourceforge.net/svnroot/nhibernate/trunk/nhibernate/src/NHibernate/Tool/hbm2ddl/DatabaseMetadata.cs
The DatabaseMetadata object will allow you to traverse the metadata for all tables and fields in the database, letting you figure out which fields are not mapped. If you look at the Configuration class again, it holds a list of its mappings in the TableMappings collection. Taking hints from the GenerateSchemaUpdateScript function, you can compare a Table object from TableMappings against any object implementing ITableMetadata returned by DatabaseMetadata.GetTableMetadata to figure out which columns are unmapped.
Use this information to rebuild, at runtime, the mapping file used by the "dynamic" class, placing all the dynamic/runtime fields in the "AdditionalProperties" dynamic-component section of the mapping file. The mapping file will need to be included as an external file rather than an embedded resource, but this is possible with the Configuration AddFile function. After it is rebuilt, reload the configuration and finally rebuild the session factory.
At this time it looks like Firebird, MsSQL Compact, MsSQL, MySQL, Oracle, SQLite, and SybaseAnywhere have implementations for ITableMetadata, so it's only possible with one of these (unless you make your own implementation).
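Once the unmapped columns are known, rebuilding the mapping file reduces to emitting one `<property>` element per column inside the `<dynamic-component>` section. A minimal sketch (treating every extra column as a string, as the question allows; the element and attribute names follow the standard hbm.xml schema):

```csharp
using System.Collections.Generic;
using System.Text;

public static class DynamicMappingWriter
{
    // Emits the <dynamic-component> section for the AdditionalProperties
    // dictionary, with one string property per unmapped column.
    public static string BuildDynamicComponent(IEnumerable<string> unmappedColumns)
    {
        var sb = new StringBuilder();
        sb.AppendLine("<dynamic-component name=\"AdditionalProperties\">");
        foreach (var column in unmappedColumns)
            sb.AppendFormat("  <property name=\"{0}\" column=\"{0}\" type=\"String\" />", column)
              .AppendLine();
        sb.Append("</dynamic-component>");
        return sb.ToString();
    }
}
```

The returned fragment would be spliced into the class mapping in `myDynamicMapping.hbm.xml` before calling AddFile and rebuilding the session factory.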
Related
I want to make some changes to a table that already exists (I am using SQLite). I want to remove the [Required] attribute from a property in a class. How can I do that? What do I need to change: the DbContext, something in the migrations folder, or something else? And which commands can I use in the Package Manager Console?
public class Appointment
{
[Required]
[MaxLength(50)]
public string Company { get; set; }
}
This is a good example of why it is better to use the fluent API than attributes to specify your database: you want the same class to be used in a different database.
The DbContext defines the database: what tables are in it, how do the tables relate towards each other, what classes do represent the tables, and what are the constraints to the tables.
For instance, maybe in one database you want Company to have a MaxLength of 50 characters, in another database you might desire a length of 100 characters. Some databases need a simple DateTime, others require a DateTime2. Or maybe you want a different precision for your decimals?
Hence it is usually better to specify these database specifics where they belong: in the definition of the database, which is the DbContext.
Back to your question
It depends a bit on the Entity Framework version you are using how the database reacts if, during migration, you remove the Required attribute and move it to the fluent API. I guess it is best to experiment with it.
In OnModelCreating, or the migration equivalent of it, you will have something like:
var entityAppointment = modelBuilder.Entity<Appointment>();
var propertyCompany = entityAppointment.Property(appointment => appointment.Company);
propertyCompany.IsOptional()
.HasMaxLength(50)
...; // other non-default specifications of Company
// like columnName, IsUnicode, etc
When migrating, the statements might be a little different, but I guess you get the gist.
I have more than 50 data tables that have nearly identical structures. Some of the tables have additional columns. I'm developing an application to help me monitor and track changes to the data contained in these tables and only need to be able to read the data contained in them. I want to create an entity framework model that will work with all of the tables and give me access to all columns that exist.
As long as the model contains only the subset of columns that exists in all of the tables, the model works and I can dynamically switch between the tables. However, I need access to the additional columns when they exist. When my model contains a column that doesn't exist in the table I switch to, I get an exception for an invalid column. Is there a way to have my model be the set of all columns and, if a column doesn't exist in a particular table, still have access to the columns that do exist? I know that using straight SQL I can do this quite easily, but I'm curious whether there is a way to do it with Entity Framework. Essentially I am looking for the equivalent of querying sys.columns to determine the structure of the table and then interacting with the table based on the columns that exist.
Sample of issue:
The 50+ tables hold data from different counties. Some of these counties have included additional data, for instance a URL link to an image or file, so there is a varchar column that contains this link. Many of the counties don't supply this type of attribute, and the column isn't part of the table for those counties. But there are 100 other reported attributes that are common to all tables. I realize one solution is to have all tables contain all possible columns; in practice, however, this has been hard to achieve due to frequent changes made to provide more to our clients in certain counties.
From the EF perspective I do not know a solution, but you can try something with an extension method like the one below:
public static DbRawSqlQuery<YourBaseModel> GetDataFromTable(this ApplicationDbContext context, string tableName)
{
    // NB: tableName is concatenated into the SQL, so it must come from a
    // trusted list of table names, never from user input.
    return context.Database.SqlQuery<YourBaseModel>("select * from " + tableName);
}
I think this will map only the columns that exist in the table to properties in your model.
This is not tested by the way but it can give you an idea of what I mean.
Entity Framework supports generating a Table per Concrete type (TPC) mapping; this lets you have a base class that contains all the shared columns, and derived classes for each specific table:
https://weblogs.asp.net/manavi/inheritance-mapping-strategies-with-entity-framework-code-first-ctp5-part-3-table-per-concrete-type-tpc-and-choosing-strategy-guidelines
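With EF6 Code First, the TPC layout from the linked article looks roughly like this (a sketch; the class and table names are made up, and MapInheritedProperties() is the EF6 call that copies the base-class columns into each concrete table's mapping):

```csharp
// Requires the EntityFramework 6 package (System.Data.Entity).
using System.Data.Entity;

public class ReportBase
{
    public int Id { get; set; }
    public string CommonColumn { get; set; }  // one of the ~100 shared attributes
}

public class CountyAReport : ReportBase
{
    public string ImageUrl { get; set; }  // extra column only this county has
}

public class ReportContext : DbContext
{
    public DbSet<CountyAReport> CountyAReports { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        modelBuilder.Entity<CountyAReport>().Map(m =>
        {
            m.MapInheritedProperties();   // pull ReportBase columns into this table
            m.ToTable("CountyAReports");  // hypothetical table name
        });
    }
}
```

Code that only needs the shared columns can then work against ReportBase, while county-specific code casts to the concrete type to reach the extra columns.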
I have been reading several examples of Sitecore DataProvider implementations on a single database and single table (using the config file parameters to specify the particular table and columns to integrate with). I wonder whether it is possible to implement a data provider working on multiple tables instead of just one. I couldn't find any examples of this, so I'm just asking for any ideas or possibilities.
The first problem I encounter when dealing with multiple tables is overriding the GetItemDefinition method, since this method returns only one item definition and needs to know which particular table to get the item information from. (This is specified in the config file when dealing with just one table.) Basically I am looking for a way to switch (dynamically) between tables without changing the config file params every time.
If you're creating a custom data provider then the implementation is left entirely up to you. If you have been following some of the examples, such as the Northwind DataProvider, then as you state the implementation acts on a single database table as specified in config. But you can specify whatever you need in the methods that you implement, and run logic to switch the SELECT statement you issue in methods such as GetItemDefinition() and GetItemFields(). You can see in the Northwind example that the SQL query is built dynamically:
StringBuilder sqlSelect = new StringBuilder();
sqlSelect.AppendFormat("SELECT {0} FROM {1}", nameField, table);
If you are building a read-only data provider then you might be able to make use of SQL views, allowing you to write a query that combines the results from several tables using the UNION operator. As long as each record has a unique ID across tables (i.e. if you are using GUIDs as the ID), this should work fine.
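One way to switch tables dynamically (a sketch; the template-to-table registry and every name in it are hypothetical, not part of the Northwind example) is to resolve the table from the item's template before building the SELECT:

```csharp
using System;
using System.Collections.Generic;

public static class TableResolver
{
    // Hypothetical registry: each Sitecore template ID is registered against
    // the database table that stores items of that template.
    static readonly Dictionary<Guid, string> TableByTemplate =
        new Dictionary<Guid, string>();

    public static void Register(Guid templateId, string table)
    {
        TableByTemplate[templateId] = table;
    }

    // Falls back to the single table from the config file when the
    // template has no registration of its own.
    public static string Resolve(Guid templateId, string defaultTable)
    {
        string table;
        return TableByTemplate.TryGetValue(templateId, out table) ? table : defaultTable;
    }
}
```

GetItemDefinition() could then call Resolve() and feed the result into the same StringBuilder pattern the Northwind example uses to assemble its SQL.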
I am trying to use entity framework code first method to connect to PostgreSQL database, and when I use entity data model wizard in visual studio to generate C# class from database, it can generate classes for each table in database successfully, but the views in database cannot be generated.
Can someone tell me where I went wrong? I am using Entity Framework 6.1.3 with Npgsql 2.2.5. The PostgreSQL database is version 9.3.6, installed on an Ubuntu server.
Thanks
I know this question is a little old now, but I'll chime in here for anyone else looking for solutions. My answer may not be exactly what the question was looking for; however, it has sufficed as a workaround for me.
The problem with views is that Entity Framework has a hard time determining the primary key column for them. In SQL Server, you can use the ISNULL() function to trick EF into thinking that a column is a key column, but the equivalent COALESCE() function in Postgres isn't good enough for EF. I also tried generating an auto-incrementing row id column, joining to other tables with primary keys, etc.; no luck with any of these.
However, something that has come close to emulating the functionality I needed (being able to query my views into my view objects) is to extend your context class with members that call Database.SqlQuery and return the result as an IQueryable.
For example:
Suppose a view in your database, "foo", with columns id, bar and baz. You can write your own POCO to hold the view data like so:
public class foo
{
public int id { get; set; }
public string bar { get; set; }
public string baz { get; set; }
}
and then extend your context class with a partial class definition like this
public partial class FooContext : DbContext
{
public IQueryable<foo> foo =>
this.Database.SqlQuery<foo>( "select * from foo" ).AsQueryable();
}
And then you can query it from your context just the same as any other table
context.foo.Where( x => x.id > 100 ).ToList(); // etc, etc
You won't be able to do inserts or use any of the extra capabilities that usually come with a standard DbSet, but views are typically used for read-only queries anyway (unless you're using some special insert triggers)...
But this gives you a base call that will query the entire view, and it doesn't hit the database immediately because it is left as a queryable, so you're free to chain other LINQ extensions on it, such as Where, to filter it down to the results you want.
I migrated from SQL Server to PostgreSQL using the Npgsql library, and this fix allowed my views to work without any changes to my program's codebase, just as if nothing had changed at all, despite the fact that the EDMX would not generate my view objects due to the lack of a (discernible) primary key.
Hope this helps!
I have two tables that have the same layout -
Report Table
ID
ReportCol1
ReportCol2
In another database I have
Reporting Table
ID
ReportCol1
ReportCol2
I want to use a single entity model called Report to load the data from both of these tables.
In my context class I have
public DbSet<Report> Report{ get; set; }
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
modelBuilder.Configurations.Add(new ReportMap());
}
In my call to the first database Report table I get the results as expected.
I change the connection string to point to the second database, but I can't change the name of the table in the table mapping.
I don't want to use stored procs for the reason outlined in my comment.
What can I do, short of renaming the tables in the database (that is not an option)?
Have you tried the fluent API modelBuilder.Entity<Report>().ToTable("Reporting"); ? You may need to write this so it applies conditionally based on which database you are connecting to. You may need your configuration to say "database A uses this mapping and connection string" and "database B uses this other mapping and connection string"; then, rather than changing the connection string, you specify the database by some name/key, and your app looks up that name to determine which mapping code to run.
if (dbMappingConfig == DbMapping.A) // some enum you create
{
modelBuilder.Entity<Report>().ToTable("Reporting");
}
If your goal is to be able to pass these entities to other methods like DisplayReport(Report r) so that you don't have to duplicate code, you could have both Reporting and Report classes implement a IReport interface.
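The interface route might look like this (a sketch; the property names are taken from the question's table layout, and ReportPrinter is a made-up consumer standing in for DisplayReport):

```csharp
public interface IReport
{
    int Id { get; }
    string ReportCol1 { get; }
    string ReportCol2 { get; }
}

// One entity per table; both satisfy the same contract.
public class Report : IReport
{
    public int Id { get; set; }
    public string ReportCol1 { get; set; }
    public string ReportCol2 { get; set; }
}

public class Reporting : IReport
{
    public int Id { get; set; }
    public string ReportCol1 { get; set; }
    public string ReportCol2 { get; set; }
}

public static class ReportPrinter
{
    // Shared code works against the interface, regardless of source table.
    public static string Describe(IReport r) => $"{r.Id}: {r.ReportCol1}/{r.ReportCol2}";
}
```

Each context maps its own concrete class to its own table, and everything downstream of the query consumes IReport, so no display or processing code is duplicated.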
EF also supports inheritance hierarchies, so you could have them inherit from the same base class, BUT I have a strong feeling that will not work across databases.
If OnModelCreating doesn't rerun, the model is probably already cached. Put modelBuilder.CacheForContextType = false; in there so it doesn't cache it in future; to clear the current cache, I think you can just do a Clean + Rebuild. This will come at the price of rebuilding the model every time instead of reusing a cache. What you'd really want is to use the cache up until the connection string changes. I don't know of a way to manually clear the cache, but there might be one. You can manage the model building yourself:
DbModelBuilder builder = new DbModelBuilder();
// Setup configurations
DbModel model = builder.Build(connection);
DbCompiledModel compiledModel = model.Compile();
DbContext context = new DbContext(connection, compiledModel);
But that will introduce additional complexities since you will need to manage the caching yourself.
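That per-connection-string caching could be handled by a small keyed cache (a generic sketch; in a real setup the factory delegate would run the DbModelBuilder code above and TValue would be DbCompiledModel):

```csharp
using System;
using System.Collections.Concurrent;

public class KeyedCache<TValue>
{
    readonly ConcurrentDictionary<string, Lazy<TValue>> cache =
        new ConcurrentDictionary<string, Lazy<TValue>>();

    // Builds the value once per key (e.g. once per connection string),
    // then reuses it for every later request with the same key.
    public TValue GetOrBuild(string key, Func<string, TValue> factory)
        => cache.GetOrAdd(key, k => new Lazy<TValue>(() => factory(k))).Value;
}
```

The Lazy wrapper makes sure the expensive Build/Compile step runs only once per connection string even under concurrent access, which is exactly the behaviour the built-in per-context-type cache gives up once CacheForContextType is disabled.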
While searching on this, I came across this question where they are trying to accomplish the same thing and have gone down the same path (see the final section in the question): How to map an Entity framework model to a table name dynamically
Are you able to create the same named view in each database and map to that instead of a variable table name?
I have two copies of the tables with different names in my solution and deal with that by having two contexts and two sets of map files (generated text templates).