I have a nested object model as follows:
public class Product
{
public List<ProductOffering> ProductOfferings { get; set; }
}
public class ProductOffering
{
public int OfferingId { get; set; }
public string OfferingDescription { get; set; }
public string OfferingType { get; set; }
public List<OfferingPriceRegion> PriceRegions { get; set; }
}
I want to insert a Product along with its list of ProductOffering objects (each of which in turn has a list of OfferingPriceRegion) using a single stored procedure (SPInsertProduct) from C#. What is the best approach, other than Entity Framework? The ProductOfferings list on a Product may be large, say 400 items, and Entity Framework may take more time looping through the save calls. Please suggest.
Dapper being an ADO.NET-based object mapper, the best option would be to use table-valued parameters (TVPs), where all the required data can be sent to the database in a single call.
Following are the important points:
Dapper takes the TVP as a DataTable.
To convert an IEnumerable&lt;T&gt; to a DataTable, you can use the System.Data.DataSetExtensions method CopyToDataTable (which works on an IEnumerable&lt;DataRow&gt;) or the NuGet package FastMember; see the sketch after the caveats below.
Few Caveats:
The number of columns, the column names, and their order must be exactly the same in the TVP and the DataTable; otherwise it will not work, and the error will not point to the issue. This mapping is not like JSON mapping, where a schema mismatch isn't a problem.
If the number of records is very high, you may want to split the data into multiple DataTables and use async/await to run the operations concurrently.
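For illustration, here is a minimal sketch of the C# side, assuming a user-defined table type dbo.ProductOfferingType (the type name and connectionString are hypothetical; SPInsertProduct is the procedure from the question) and using FastMember's ObjectReader for the IEnumerable-to-DataTable conversion:
// NuGet packages: Dapper, FastMember
var table = new DataTable();
using (var reader = ObjectReader.Create(product.ProductOfferings,
    "OfferingId", "OfferingDescription", "OfferingType")) // column names and order must match the TVP exactly
{
    table.Load(reader);
}
using (var connection = new SqlConnection(connectionString))
{
    connection.Execute("SPInsertProduct",
        new { offerings = table.AsTableValuedParameter("dbo.ProductOfferingType") },
        commandType: CommandType.StoredProcedure);
}
The nested PriceRegions collections would be flattened into a second DataTable (carrying the parent OfferingId) and passed as another TVP in the same call.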
Is it possible to run Linq-to-SQL queries when the underlying database structure changes from time to time? (I mean database updates that happen due to business requirements; since the database is shared among several apps, they may happen without any announcement to me.)
Is there any way I can connect to the new database structure in Linq-to-SQL without updating the .dbml file in my source code?
If I want to run raw queries, knowing that my database structure changes over time, can I still somehow use any of Linq-to-SQL's benefits?
Provided the structure you have in your classes matches your tables (at least covering all the fields you need), you can do that. For example, the Northwind Customers table has more than 4 fields in reality; provided the 4 below are still in that table, this would work:
void Main()
{
DataContext db = new DataContext(@"server=.\SQLexpress;trusted_connection=yes;database=Northwind");
Table<Customer> Customers = db.GetTable<Customer>();
var data = Customers.Where(c => c.Country == "USA");
foreach (var customer in data)
{
Console.WriteLine($"{customer.CustomerID}, {customer.CompanyName}");
}
}
[Table(Name = "Customers")]
public class Customer
{
[Column]
public string CustomerID { get; set; }
[Column]
public string CompanyName { get; set; }
[Column]
public string ContactName { get; set; }
[Column]
public string Country { get; set; }
}
For raw SQL, again you could use a type covering the fields in the select list, or dynamic.
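For instance, a minimal sketch against the same DataContext (ExecuteQuery maps the select list onto matching property names; {0} is a parameter placeholder, not string interpolation):
var usCustomers = db.ExecuteQuery<Customer>(
    "SELECT CustomerID, CompanyName, ContactName, Country FROM Customers WHERE Country = {0}", "USA");
foreach (var c in usCustomers)
{
    Console.WriteLine($"{c.CustomerID}, {c.CompanyName}");
}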
Note: for inserts to work this way, fields that are not in your model must either accept null or have default values.
I have a sample ASP.NET application, and I want to build it using an n-tier architecture, so I have a database that contains tables and stored procedures (which perform CRUD operations on those tables).
When I tried to create the data access layer, I wrote methods that use ADO.NET to call these stored procedures, but these methods return DataTables, like this one:
public DataTable getallcourcesdetailsbyid(string courseid) {
SqlParameter[] parameter = new SqlParameter[] { new SqlParameter("@courseid", courseid) };
return sqlhelper.ExecuteParamerizedSelectCommand("usp_getcoursedetailsbyid", CommandType.StoredProcedure, parameter);
}
So I found that there is a better way: create classes with properties that represent tables in the database to hold the data returned by the data access layer, like this one:
class course
{
public int courseid { get; set; }
public string coursename { get; set; }
public short specializationid { get; set; }
public short subjectid { get; set; }
public short instructorid { get; set; }
public string startdate { get; set; }
public string enddate { get; set; }
public bool isactive { get; set; }
public bool isdeleted { get; set; }
}
But these stored procedures do not always return data from one specific table. For example, the course class above represents the course table in the database, but the method getallcourcesdetailsbyid above calls a stored procedure with the following code:
select courseid,coursename,startdate,enddate,courseimgpath,specialization,firstname,lastname,subjectname,price,coursedetails,teacherimgpath
from joacademytest.course
inner join joacademytest.specialization ON joacademytest.course.specializationid = joacademytest.specialization.specializationid
inner join joacademytest.[subject] on joacademytest.course.subjectnameid=joacademytest.[subject].subjectid
inner join joacademytest.teachers on joacademytest.course.instructerid=joacademytest.teachers.teacherid
inner join dbo.courcesprices on joacademytest.course.priceid=dbo.courcesprices.priceid
where joacademytest.course.isactive=1 and joacademytest.course.isdeleted=0 and courseid = @courseid;
So the stored procedure does not return the same columns that exist in the course object; it returns columns from four joined tables. Do I have to create my entity classes based on the columns in the tables, or based on the columns returned by my stored procedures? I have searched the Internet and never found anybody mention creating entity classes based on the columns a stored procedure returns, which has left me confused.
There is nothing wrong with creating classes that map to the results of a stored procedure. In fact, many of the ORMs, like Entity Framework and NHibernate, allow you to do that. In the end it all depends on what you want to achieve (performance, maintenance, etc.), and these are very broad topics. To answer your question, keeping your current setup in mind, I would propose:
Create entities that map to stored procs if stored procs are the preferred way of getting data from the database. In fact, so that you don't re-invent the wheel, you can use one of the many ORM tools, such as Dapper or EF.
Or, instead of creating entities from your DataTables, you can return dynamic objects (see the sketch below).
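For illustration, a minimal sketch of the dynamic option using Dapper against the stored procedure from the question (the connection string is a placeholder):
using (var connection = new SqlConnection(connectionString))
{
    // Dapper's non-generic Query returns dynamic rows keyed by column name
    var rows = connection.Query("usp_getcoursedetailsbyid",
        new { courseid = "1" },
        commandType: CommandType.StoredProcedure);
    foreach (var row in rows)
    {
        Console.WriteLine($"{row.coursename}: {row.firstname} {row.lastname}");
    }
}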
hope this helps
What I always do is create classes that represent my tables in the database. In a scenario like you describe, where a stored procedure returns joined data, I move the query into a view and create a class that represents that view. If you can't do that and must keep the stored procedure, then I would just create a class around what the stored procedure returns.
It seems this problem must have been encountered before me, but I'm not finding much help online, probably because I don't really know what to search for.
My problem, in short, is that I have a db table. That table has 5 foreign keys to other tables.
I then have a model that represents this table in EF. Naturally, the object representing the db table has List<T> properties standing in for the foreign keys in the db. That in itself isn't the problem so much as the EF model holding this table representation along with the List<T> properties to other models.
The problem I am experiencing is that a call to a stored procedure to populate the main model forces additional calls to the db to populate the related List<T> models.
I am looking to improve performance, namely by eliminating the multiple calls.
My only thought to this point is to modify the stored procedure to return multiple recordsets and match each List<T> property to its corresponding recordset.
My sanitized structure is something like this.
DB:
sql_Id Int PK
sql_Status Int FK
sql_Reason Int FK
sql_GuestId Int
sql_Name varchar
sql_Created DateTime
sql_Original Int FK
EF:
public class OrderHeader : ClassBase
{
public OrderHeader()
{
TaskCodeAssignments = new List<OrderHeaderTaskCodeAssignment>();
StatusReasonCode = new OrderHeaderStatusReasonCode();
StatusCode = new OrderHeaderStatusCode();
Links = new OrderHeaderLinks();
}
public int OrderHeaderID { get; set; }
public short OrderHeaderStatusCodeID { get; set; }
public short? OrderHeaderStatusReasonCodeID { get; set; }
public short? OriginatingApplicationId { get; set; }
public string CustomerFirstName { get; set; }
public string CustomerLastName { get; set; }
public OrderHeaderStatusCode StatusCode { get; set; }
public OrderHeaderStatusReasonCode StatusReasonCode { get; set; }
public CustomerStatusCode CustomerStatusCode { get; set; }
public ICollection<OrderHeaderTaskCodeAssignment> TaskCodeAssignments { get; set; }
}
public class OrderHeaderStatusCode
{
public OrderHeaderStatusCode()
{
OrderHeaderStatusReasonCodes = new List<OrderHeaderStatusReasonCode>();
}
public ICollection<OrderHeaderStatusReasonCode> OrderHeaderStatusReasonCodes { get; set; }
public virtual ICollection<OrderHeader> OrderHeader { get; set; }
}
The other custom types like OrderHeaderStatusReasonCode are pretty similar in design so I'm leaving out for brevity.
C# Web API
public async Task<IHttpActionResult> GetOrdersHistory([FromUri]GetOrderRequestParameters orderParams)
{
....removed for brevity....
var query = await TheOrderRepository.GetOrderHistory(getOrder);
}
Order Repository:
public async Task<IQueryable<OrderHeader>> GetOrderHistory(GetOrderParameters orderParams)
{
// this is the call to stored procedure that I would modify to return multiple recordsets
var storedProcedure = StoredProcedure.Name.MyStoredProc.ToString();
var ordersHistory = await dbctx.Database.SqlQuery<OrderHeader>(...), storedProcParam).ToListAsync();
// now I jump off to fill in the other properties and their data has to come from the db
await GetOrdersData(ordersHistory, orderParams.Include);
}
private async Task GetOrdersData(List<OrderHeader> ordersHistory)
{
if (ordersHistory != null)
{
await LoadOrderStatusCodeForList(ordersHistory);
await LoadOrderStatusReasonCodeForList(ordersHistory);
await LoadCustomerStatusCodeForList(ordersHistory);
await LoadOrderHeaderTaskCodeAssignmentsForList(ordersHistory);
await LoadOrderHeaderTaskCodeForList(ordersHistory);
}
}
Again most of these awaits are similar so I'm just going to give an example of one...
private async Task LoadOrderStatusCodeForList()
{
....snipped for brevity...
await LoadOrderStatusCode(order.OrderHeaderStatusCodeID);
}
private async Task<OrderHeaderStatusCode> LoadOrderStatusCode(short orderHeaderStatusCodeId)
{
....snipped brevity....
var storedProcedure = StoredProcedure.Name.MySprocStatusCode.ToString();
return await _dbctx.Database.SqlQuery<OrderHeaderStatusCode>(...), ...).FirstOrDefaultAsync();
}
EDIT:
The crux is this: OrderHeader has properties of custom types, and those custom types in turn have List<T> properties that have to be populated. My current design repeatedly hits the db to populate those lists.
Is there a way to make one trip to the db to get all my information? As mentioned earlier, the only way I can think of is to modify the stored procedure to return multiple record sets and then match them up.
BTW the architecture may be the flaw...in which case educate me on how to properly populate a complex object like this.
TIA
The root problem is that stored procedures aren't composable. In SQL you can't join a stored procedure call with anything (a database table or another stored procedure). So EF can't do that either.
If you want to get data with loaded collections from the database, normally you'd have to use Includes. EF will translate that into the appropriate joins and figure out how to load the entities and their collections from one big result set. But, as said, joins are no option here.
There is a way to load multiple result sets from one stored procedure (sketched below). IMO it's pretty messy and very procedural, so I would keep loading the data separately as you do now if you want to keep using stored procedures. Others may suggest loading the additional data by lazy loading; unfortunately that's not as straightforward as it should be with SqlQuery.
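For reference, a minimal sketch of that multiple-result-set approach in EF6, using ObjectContext.Translate to materialize each result set in turn (the proc name and the second result set here are assumptions based on the question):
var cmd = dbctx.Database.Connection.CreateCommand();
cmd.CommandText = "MyStoredProc";
cmd.CommandType = CommandType.StoredProcedure;
dbctx.Database.Connection.Open();
using (var reader = cmd.ExecuteReader())
{
    var objectContext = ((IObjectContextAdapter)dbctx).ObjectContext;
    // first result set: the order headers
    var orders = objectContext.Translate<OrderHeader>(reader).ToList();
    // advance to the next result set and materialize it as well
    reader.NextResult();
    var statusCodes = objectContext.Translate<OrderHeaderStatusCode>(reader).ToList();
}
Stitching the lists back onto their parents is then manual matching in memory, which is the "messy and procedural" part.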
Another option of course is to start using regular DbSets (with Includes), but I can't judge if that's possible for you.
I would like to get some ideas about how to retrieve data from an MSSQL database along with the constraints for all the columns. I'm listing all the databases on a server, letting the user choose a database, and after that letting them choose a table to CRUD against. This is going to be shown in a JavaScript grid (SlickGrid) for inline editing.
It's going to be very close to what you get when you right-click a table in MSSQL Management Studio and select Edit Top 200 Rows.
Challenges:
The application should access a bunch of different databases that change often, so generating POCOs is out of the question. The databases are also very often poorly made and do not contain FKs as they should.
I do need some server-side validation, and preferably client-side as well, for all the columns, so I need to get information about the data type, nvarchar length, nullability and so on from the DB.
I'm used to programming in C# with EF/ADO.NET, but I would not mind trying another language such as Node.js if it has good support for what I'm after (not interested in PHP though).
I was thinking about using ASP.NET MVC with ADO.NET and reading the data into models like this:
public class GridVM
{
public IEnumerable<Column> ColumnDefinitions { get; set; }
public IEnumerable<dynamic> Rows { get; set; }
}
public class Column
{
public string Name { get; set; }
public string Type { get; set; } //perhaps public Type Type?
public bool Nullable { get; set; }
public int MaxLength { get; set; }
}
And then creating a list of dynamics whose properties correspond to the entries in ColumnDefinitions, conceptually:
dynamic
{
public object column1 { get; set; }
public object column2 { get; set; }
public object column3 { get; set; }
//etc, so that I get properties for all the columns
}
I do have some code for binding data retrieved from a DataReader to a data model, ignoring property names that don't correspond to the column names, but here I need to do it without a known data model.
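As a minimal sketch of the unknown-model case (assuming an open SqlConnection and a validated table name), each row can be materialized as an ExpandoObject whose members are added from the reader's columns:
var rows = new List<dynamic>();
using (var cmd = new SqlCommand("SELECT * FROM " + tableName, connection)) // tableName must be whitelisted, never raw user input
using (var reader = cmd.ExecuteReader())
{
    while (reader.Read())
    {
        // ExpandoObject implements IDictionary<string, object>, so members can be added by name
        IDictionary<string, object> row = new ExpandoObject();
        for (int i = 0; i < reader.FieldCount; i++)
            row[reader.GetName(i)] = reader.IsDBNull(i) ? null : reader.GetValue(i);
        rows.Add(row);
    }
}
The Column definitions can be filled the same way from INFORMATION_SCHEMA.COLUMNS (COLUMN_NAME, DATA_TYPE, IS_NULLABLE, CHARACTER_MAXIMUM_LENGTH).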
The questions:
Is this a good approach, or should I reconsider and use some other technique? Are there any pitfalls with this approach that I'm not seeing right now?
My application has an entity model as below and uses Dapper:
public class Goal
{
public string Text { get; set; }
public List<SubGoal> SubGoals { get; set; }
}
public class SubGoal
{
public string Text { get; set; }
public List<Practise> Practices { get; set; }
public List<Measure> Measures { get; set; }
}
and has a repository as below
public interface IGoalPlannerRepository
{
IEnumerable<Goal> FindAll();
Goal Get(int id);
void Save(Goal goal);
}
I came across two scenarios, as below:
While retrieving data (a goal entity), it needs to retrieve all the related objects in the hierarchy (all subgoals, along with their practices and measures).
When a goal is saved, all the related data needs to be inserted and/or updated.
Please suggest whether there is a better way to handle these scenarios other than looping through the collections and writing lots and lots of SQL queries.
The best way to do large batch data updates in SQL using Dapper is with compound queries.
You can retrieve all your objects in one query as a multiple resultset, like this:
CREATE PROCEDURE get_GoalAndAllChildObjects
    @goal_id int
AS
SELECT * FROM goal WHERE goal_id = @goal_id
SELECT * FROM subgoals WHERE goal_id = @goal_id
Then you write a Dapper function that retrieves the objects like this:
using (var multi = connection.QueryMultiple("get_GoalAndAllChildObjects", new { goal_id = m_goal_id }, commandType: CommandType.StoredProcedure))
{
    // each Read<T>() consumes the next result set in order
    var goal = multi.Read<Goal>().Single();
    goal.SubGoals = multi.Read<SubGoal>().ToList();
}
Next comes updating large data in batches. You do that through table-valued parameter inserts (I wrote an article on this here: http://www.altdevblogaday.com/2012/05/16/sql-server-high-performance-inserts/ ). Basically, you create one table type for each kind of data you are going to insert, then write a procedure that takes those tables as parameters and writes them to the database.
This is super high performance and about as optimized as you can get, plus the code isn't too complex.
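For context, a minimal sketch of the save side under the same approach (the table type dbo.SubGoalType and the procedure save_GoalWithSubgoals are hypothetical):
// flatten the child collection into a DataTable matching dbo.SubGoalType
var subgoalTable = new DataTable();
subgoalTable.Columns.Add("goal_id", typeof(int));
subgoalTable.Columns.Add("text", typeof(string));
foreach (var subgoal in goal.SubGoals)
    subgoalTable.Rows.Add(m_goal_id, subgoal.Text);
connection.Execute("save_GoalWithSubgoals",
    new { subgoals = subgoalTable.AsTableValuedParameter("dbo.SubGoalType") },
    commandType: CommandType.StoredProcedure);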
However, I need to ask: is there any point in keeping subgoals and all the other objects relational? One easy alternative is to create an XML or JSON document that contains your goal and all its child objects serialized into text, and just save that document to the file system. It's unbelievably high performance, very simple, very extensible, and takes very little code. The only downside is that you can't write a SQL statement to browse across all subgoals without a bit of work. Consider it - it might be worth a thought ;)