Dynamic CRUD against unknown/multiple MSSQL databases - C#

I would like some ideas on how to retrieve data from an MSSQL database together with the constraints for all the columns. I'm listing all the databases on a server and letting the user choose a database, then a table to CRUD against. The data is shown in a JavaScript grid (SlickGrid) for inline editing.
It's going to be very close to what you get when you right-click a table in SQL Server Management Studio and select Edit Top 200 Rows.
Challenges:
The application should access a bunch of different databases that often change, so generating POCOs is out of the question. The databases are also often poorly designed and don't contain the foreign keys they should.
I need some server-side validation, and preferably client-side as well, for all the columns, so I need to get each column's data type, nvarchar length, nullability, and so on from the database.
I'm used to programming in C# with EF/ADO.NET, but I wouldn't mind trying another language such as Node.js if there is good support for what I'm after (not interested in PHP, though).
I was thinking of using ASP.NET MVC with ADO.NET and reading the data into models like this:
public class GridVM
{
    public IEnumerable<Column> ColumnDefinitions { get; set; }
    public IEnumerable<dynamic> Rows { get; set; }
}

public class Column
{
    public string Name { get; set; }
    public string Type { get; set; } //perhaps public Type Type?
    public bool Nullable { get; set; }
    public int MaxLength { get; set; }
}
And then building a list of dynamic objects, each with one property per entry in ColumnDefinitions, conceptually:
//illustrative only - not valid C# as written; in practice each row would be
//an ExpandoObject (or similar) carrying one property per column
dynamic
{
    public object column1 { get; set; }
    public object column2 { get; set; }
    public object column3 { get; set; }
    //etc, so that I get properties for all the columns
}
I do have some code for binding data retrieved from a DataReader to a data model, ignoring properties whose names don't correspond to column names, but here I need to do it without a known data model.
The questions:
Is this a good approach, or should I consider some other technique or method? Are there any pitfalls with this approach that I'm not seeing right now?
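To make the binding idea concrete: without a known model, the reader's own schema can drive both the column definitions and the dynamic rows. A minimal sketch, using an in-memory DataTable in place of a real SqlDataReader, with tuples standing in for the Column class so the snippet stays self-contained:

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Dynamic;

// Stand-in for a real SqlDataReader; any IDataReader works the same way.
var table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Columns.Add("Name", typeof(string));
table.Rows.Add(1, "Alpha");
table.Rows.Add(2, "Beta");

using var reader = table.CreateDataReader();

// Column definitions straight from the reader's schema table
// (name, CLR type, nullability, max length).
var columns = new List<(string Name, string Type, bool Nullable, int MaxLength)>();
foreach (DataRow c in reader.GetSchemaTable().Rows)
{
    columns.Add(((string)c["ColumnName"],
                 ((Type)c["DataType"]).Name,
                 (bool)c["AllowDBNull"],
                 (int)c["ColumnSize"]));
}

// Rows as dynamic objects, one property per column.
var rows = new List<dynamic>();
while (reader.Read())
{
    IDictionary<string, object> row = new ExpandoObject();
    for (int i = 0; i < reader.FieldCount; i++)
        row[reader.GetName(i)] = reader.IsDBNull(i) ? null : reader.GetValue(i);
    rows.Add(row);
}

Console.WriteLine($"{columns.Count} columns, {rows.Count} rows"); // prints "2 columns, 2 rows"
```

Against a real SqlDataReader, richer metadata (the SQL type name, the declared nvarchar length, and so on) can come from executing the command with CommandBehavior.KeyInfo, or from querying INFORMATION_SCHEMA.COLUMNS directly.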

Related

Run linq-to-sql queries on database with changing structure

Is it possible to run Linq-to-SQL queries when the underlying database structure changes from time to time? (I mean database updates that happen due to business requirements; since the database is shared among several apps, they may happen without any announcement to me.)
Is there any way to connect to the new database structure in Linq-to-SQL without updating the .dbml file in my source code?
If I want to run raw queries knowing that my database structure changes over time, can I still use any of Linq-to-SQL's benefits somehow?
Provided the structure you have in your classes matches your tables (at least covering all the fields you need), you can do that. For example, the Northwind Customers table has more than the 4 fields shown below in reality; provided these 4 are still in that table, this works:
// requires System.Data.Linq and System.Data.Linq.Mapping
void Main()
{
    DataContext db = new DataContext(@"server=.\SQLexpress;trusted_connection=yes;database=Northwind");
    Table<Customer> Customers = db.GetTable<Customer>();
    var data = Customers.Where(c => c.Country == "USA");
    foreach (var customer in data)
    {
        Console.WriteLine($"{customer.CustomerID}, {customer.CompanyName}");
    }
}

[Table(Name = "Customers")]
public class Customer
{
    [Column]
    public string CustomerID { get; set; }
    [Column]
    public string CompanyName { get; set; }
    [Column]
    public string ContactName { get; set; }
    [Column]
    public string Country { get; set; }
}
For raw SQL, you could again use a type covering the fields in the select list, or dynamic.
Note: for inserts to work with this approach, fields that are not in your model must either accept null or have default values.

Read and Write to database with generic DataTable using Entity Framework

Is it possible to read and write to a SQL Server database using DataTable with Entity Framework?
I have multiple code tables defined in my database such that each of them share a fixed set of properties as shown in the sample below.
For example
public class CTGender
{
    public Guid ID { get; set; }
    public string Description { get; set; }
    public string DisplayValue { get; set; }
    //...Other properties specific to CTGender
}

public class CTNationality
{
    public Guid ID { get; set; }
    public string Description { get; set; }
    public string DisplayValue { get; set; }
    //...Other properties specific to CTNationality
}
The situation I face right now is the ever-growing number of code tables; there could be another CTCountry, CTRole, and so on, for example.
I am trying to synchronise these code tables between multiple databases.
The solution is heavily dependent on Entity Framework as the data access.
Is there a generic way for Entity Framework to read and write ALL these code tables without their entity models defined, like how you can read and write generic DataTables using ADO.NET?
Yes, there are a couple of ways to do this. You can define the tables in code and have Entity Framework generate them in SQL Server via the code-first approach, or you can use the publish mechanism of a database project.
In the latter approach, you create a separate project containing the SQL for your various tables. This project should target SQL Server. Right-click the project and choose the publish option to update all your tables in SQL Server.
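Since every code table shares the same shape, another option is a common base class so one generic routine can service all of them. A minimal sketch of the idea (the CodeTable base class and Sync helper are hypothetical, not an EF API; with EF the lists would be DbSet<T> contents and changes saved via SaveChanges):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

var source = new List<CTGender>
{
    new CTGender { ID = Guid.NewGuid(), Description = "Male", DisplayValue = "M" },
    new CTGender { ID = Guid.NewGuid(), Description = "Female", DisplayValue = "F" },
};
var target = new List<CTGender>();

int added = Sync(source, target);
Console.WriteLine($"{added} row(s) copied"); // prints "2 row(s) copied"

// Copy rows missing from the target, matched by ID. The same call
// works for CTNationality, CTCountry, CTRole, ... without new code.
static int Sync<T>(IList<T> source, IList<T> target) where T : CodeTable
{
    var known = new HashSet<Guid>(target.Select(t => t.ID));
    int copied = 0;
    foreach (var row in source)
        if (known.Add(row.ID)) { target.Add(row); copied++; }
    return copied;
}

public abstract class CodeTable
{
    public Guid ID { get; set; }
    public string Description { get; set; }
    public string DisplayValue { get; set; }
}

public class CTGender : CodeTable { }
public class CTNationality : CodeTable { }
```

The table-specific extra properties would live on each derived class; the generic routine only ever touches the shared ones.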

How to insert nested object using stored procedure in sql

I have the following nested object model:
public class Product
{
    public List<ProductOffering> ProductOfferings { get; set; }
}

public class ProductOffering
{
    public int OfferingId { get; set; }
    public string OfferingDescription { get; set; }
    public string OfferingType { get; set; }
    public List<OfferingPriceRegion> PriceRegions { get; set; }
}
I want to insert a Product along with its list of ProductOffering objects (each of which in turn has a list of OfferingPriceRegion) in a single stored procedure (SPInsertProduct) using C#. What is the best approach, other than Entity Framework? The number of ProductOfferings in a Product may be large, say 400, where Entity Framework may take more time looping through the save. Please suggest.
Dapper being an ADO.NET-based object mapper, the best option would be Table-Valued Parameters (TVPs), where the complete required data is sent to the database in a single call.
The important points:
Dapper takes the TVP as a DataTable.
To convert an IEnumerable<T> to a DataTable, you can use the System.Data.DataSetExtensions method CopyToDataTable, or the FastMember NuGet package achieves the same.
A few caveats:
The number of columns, the column names, and their order must be exactly the same for the TVP and the DataTable; otherwise it will not work, and the error will not point at the issue. This mapping is not like JSON mapping, where a schema mismatch isn't a problem.
If the number of records is very high, you may want to divide them into multiple DataTables and use async/await to run the same operation concurrently.
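The IEnumerable<T>-to-DataTable step can also be done with a few lines of reflection instead of FastMember. A sketch covering scalar properties only (nested lists like PriceRegions would need their own TVP, and the dbo.ProductOfferingType name is a made-up example):

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Linq;
using System.Reflection;

var offerings = new List<ProductOffering>
{
    new ProductOffering { OfferingId = 1, OfferingDescription = "Basic", OfferingType = "Standard" },
    new ProductOffering { OfferingId = 2, OfferingDescription = "Plus", OfferingType = "Premium" },
};

DataTable tvp = ToDataTable(offerings);
Console.WriteLine($"{tvp.Columns.Count} columns, {tvp.Rows.Count} rows"); // prints "3 columns, 2 rows"

// With Dapper, the table would then be passed as a TVP, for example:
//   new { rows = tvp.AsTableValuedParameter("dbo.ProductOfferingType") }
// where the column order in dbo.ProductOfferingType must match exactly.

static DataTable ToDataTable<T>(IEnumerable<T> items)
{
    var table = new DataTable();
    PropertyInfo[] props = typeof(T).GetProperties();
    foreach (var p in props)
        table.Columns.Add(p.Name, Nullable.GetUnderlyingType(p.PropertyType) ?? p.PropertyType);
    foreach (var item in items)
        table.Rows.Add(props.Select(p => p.GetValue(item) ?? DBNull.Value).ToArray());
    return table;
}

public class ProductOffering // trimmed to the scalar columns of the TVP
{
    public int OfferingId { get; set; }
    public string OfferingDescription { get; set; }
    public string OfferingType { get; set; }
}
```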

ASP.Net - Create Entity Class From Query Fields - Visual Studio

I am very new to ASP.NET and MVC applications, so pardon me if this question has been asked or is trivial.
I know how to create an entity model class from a database table, but I want to perform a series of Joins and create a Pivot table from an SQL query and then pass this to a view.
However, I do not know a quick way to create an entity class for this.
Currently, I am doing it the long way by manually defining a model class like so:
public class OAData
{
    public int Zone { get; set; }
    public string Device { get; set; }
    public string Part { get; set; }
    ...

    //CONSTRUCTOR
    public OAData(int zone, string device, string part...)
    {
        Zone = zone;
        Device = device;
        Part = part;
        ...
    }
}
and then create a database connection in the controller, loop through all the records creating an OAData object for each, add them to a list, and pass that list to the view.
Is there an easier way to do this (the query returns many fields)? Can I create a model from a complex SQL query rather than just from a database table?
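Short of a generated entity, one common workaround is a small reflection-based mapper that fills any POCO whose property names match the query's column names. A sketch under those assumptions (OAData trimmed to three properties and given a parameterless constructor; an in-memory DataTable stands in for the real query result):

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Reflection;

// Stand-in for the IDataReader returned by the real join/pivot query.
var table = new DataTable();
table.Columns.Add("Zone", typeof(int));
table.Columns.Add("Device", typeof(string));
table.Columns.Add("Part", typeof(string));
table.Rows.Add(7, "Press", "P-100");

using var reader = table.CreateDataReader();
List<OAData> items = MapAll<OAData>(reader);
Console.WriteLine($"{items.Count} row(s), first zone {items[0].Zone}"); // prints "1 row(s), first zone 7"

static List<T> MapAll<T>(IDataReader reader) where T : new()
{
    var result = new List<T>();
    PropertyInfo[] props = typeof(T).GetProperties();
    while (reader.Read())
    {
        var item = new T();
        foreach (PropertyInfo p in props)
        {
            int ordinal = reader.GetOrdinal(p.Name); // throws if the query lacks the column
            if (!reader.IsDBNull(ordinal))
                p.SetValue(item, reader.GetValue(ordinal));
        }
        result.Add(item);
    }
    return result;
}

public class OAData
{
    public int Zone { get; set; }
    public string Device { get; set; }
    public string Part { get; set; }
}
```

This avoids both the hand-written constructor and the per-field mapping loop; libraries such as Dapper implement essentially this pattern with more polish.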

How to save an entity's change history

I have a Task entity with various navigation properties, including a Comment entity:
public class Comment
{
    public int ID { get; set; }
    public string Title { get; set; }
    public string Description { get; set; }
    public DateTime CreateDate { get; set; }
    public DateTime LastEditedDate { get; set; }
    public virtual User User { get; set; }
}
A comment can be edited, and I want to keep a history of all such changes. Think of the way Stack Overflow allows you to edit your questions/comments but keeps a history of the changes.
Keeping a history means more complexity, and the entity becomes harder to maintain. My options:
add properties such as public virtual ICollection<string> DescriptionHistory { get; set; }, and similar ones for Title, User, EditDate, etc. This gets out of hand very quickly.
keep the Title, Description, etc. properties as strings, but make them CSV strings and stuff all the changes into a smaller set of properties. This means more processing, but the entity is simpler. The problem is that it becomes tricky to associate one CSV fragment with the corresponding one from a different property, e.g. a historical title must match its historical description and date.
do both: keep the current set of properties, plus another set of nullable ones like TitleHistory, DescriptionHistory, etc., which are CSV strings of older versions, so things only get complicated when dealing with the historical data.
There are also problems around storing the user, unless I use a CSV of IDs rather than the entities.
What is the best approach to this problem? There are various techniques, such as sprocs and "insert only" tables, but I am using Entity Framework 5 and so would prefer a solution that leverages the technology I am already using.
I've settled on using an "insert only" table. Pity it's not easy to do with EF5.
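The insert-only idea can be sketched independently of EF5: an edit never mutates history, it first appends a snapshot of the old values. The CommentHistory name is hypothetical, and with EF the history list would be its own table/DbSet written in the same SaveChanges (Comment is trimmed here for brevity):

```csharp
using System;
using System.Collections.Generic;

var comment = new Comment { ID = 1, Title = "First", Description = "v1" };
var history = new List<CommentHistory>();

Edit(comment, history, "First", "v2");
Edit(comment, history, "Better title", "v3");
Console.WriteLine($"{history.Count} snapshot(s); current: {comment.Title}/{comment.Description}");
// prints "2 snapshot(s); current: Better title/v3"

// Append-only: snapshot the old values, then update the live row.
static void Edit(Comment c, List<CommentHistory> history, string newTitle, string newDescription)
{
    history.Add(new CommentHistory
    {
        CommentID = c.ID,
        Title = c.Title,
        Description = c.Description,
        EditedDate = DateTime.UtcNow
    });
    c.Title = newTitle;
    c.Description = newDescription;
    c.LastEditedDate = DateTime.UtcNow;
}

public class Comment
{
    public int ID { get; set; }
    public string Title { get; set; }
    public string Description { get; set; }
    public DateTime LastEditedDate { get; set; }
}

public class CommentHistory // one row per edit, never updated
{
    public int CommentID { get; set; }
    public string Title { get; set; }
    public string Description { get; set; }
    public DateTime EditedDate { get; set; }
}
```

Because history rows are immutable, one historical title always stays paired with its historical description and date, which is exactly what the CSV options struggle with.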
