Entity Framework Core - Daily SQL Operation - C#

I have a big database created by Entity Framework Core. This database stores roughly 5 million records. To improve query speed, I'd like to aggregate the data from previous days.
Specifically, I would like to execute a SQL command once a day at 00:00 that aggregates yesterday's data.
In the past I created stored procedures that were executed by a database job in MSSQL. But those databases were created manually, and now I'd like to get similar functionality while using Entity Framework.
I've read that there shouldn't be any logic in the database. So how could I do this instead? (The article where I got that background is: Can you create sql views / stored procedure using Entity Framework 4.1 Code first approach)
So I'm looking for a good solution to run an "aggregation" function every day and store the aggregated data in the database.

Use the method you used before! This is ideally solved by SQL Agent and a stored procedure; almost anything else will have more issues and worse performance.
If you really wanted to do it differently, you would need two parts:
a scheduler: this will most likely be the OS one (e.g. Windows Task Scheduler), which has nowhere near as many features as SQL Agent
the actual program: a .NET app using EF will do this, but EF is not required; plain ADO.NET will work, as will any other library
The only reason you'd choose this route is if you had further requirements that SQL would be inappropriate for, so that you needed a more general-purpose language.
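If you do go the .NET route, here is a minimal sketch of both parts folded into one hosted service: a BackgroundService acts as the scheduler (waking at local midnight) and EF Core runs a set-based aggregation statement. AppDbContext, Measurements and DailyAggregates are hypothetical names; substitute your own context and tables.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

public class DailyAggregationService : BackgroundService
{
    private readonly IServiceScopeFactory _scopeFactory;

    public DailyAggregationService(IServiceScopeFactory scopeFactory)
        => _scopeFactory = scopeFactory;

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            // Sleep until the next local midnight.
            var now = DateTime.Now;
            await Task.Delay(now.Date.AddDays(1) - now, stoppingToken);

            // DbContext is scoped, so resolve it from a fresh scope per run.
            using var scope = _scopeFactory.CreateScope();
            var db = scope.ServiceProvider.GetRequiredService<AppDbContext>();

            // One set-based statement aggregating yesterday's rows server-side.
            await db.Database.ExecuteSqlRawAsync(@"
                INSERT INTO DailyAggregates (Day, Total)
                SELECT CAST([Timestamp] AS date), SUM([Value])
                FROM Measurements
                WHERE [Timestamp] >= CAST(GETDATE() - 1 AS date)
                  AND [Timestamp] <  CAST(GETDATE() AS date)
                GROUP BY CAST([Timestamp] AS date)", stoppingToken);
        }
    }
}
```

Register it with builder.Services.AddHostedService<DailyAggregationService>(). Keeping the aggregation as one INSERT ... SELECT keeps the work set-based on the server, which is the main thing the stored-procedure approach buys you.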

Related

New project: ADO.Net vs Entity Framework - trying to understand if EF works out

We are at the beginning of a new project, which will replace a legacy one. The legacy project is written in .NET Framework 4.0 (SOA with WCF) + SQL Server. The connection to SQL is made with ADO.NET + stored procedures. There is a structural mistake in having most of the logic in the stored procedures, and on top of that, it is a monolith.
The new project will be made with .NET 6 APIs and, in some cases, will have SQL Server as well for operational data.
So, looking at the new product, the question was raised: should we move from ADO.NET to EF? This is tempting, since it reduces development effort, but performance is a concern.
Taking a look at the technical must-haves:
Get the product to be as fast as possible (performance is a concern)
The new project is expected to live for at least the next 15 years
Operations are executed against tables with 30 to 50 million records
We must be able to run operations against the regular database, but also against the read-only one (AlwaysOn)
We must be able to apply resiliency policies, such as retries in case of deadlocks
We don't have much room for changes if we choose one path and somewhere along the way realize we should have gone with the other option
Quite honestly, IMHO, based on our tech requirements I feel we should move forward with ADO.NET + stored procedures (without any business logic) + some sort of package that translates the SQL results to my objects quickly, but I'd like to give EF a shot, at least at this stage of the process where we are investigating possibilities.
I'd like to gather opinions if possible, especially from someone out there who went with EF under requirements similar to ours, or someone who decided against EF or had to switch from EF to ADO.NET somewhere along the way.
Thanks.
The only thing in your requirements that could support using ADO.NET over EF is
Get the product to be as fast as possible (performance is a concern)
Which is a nonsense requirement, as you can always write more code and add complexity to make things marginally faster. You need a real, measurable performance requirement so you can compare the two approaches against it.
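On the deadlock-retry requirement specifically, note that EF Core ships a built-in retrying execution strategy, so retries don't by themselves force you toward hand-rolled ADO.NET plus a retry library. A hedged configuration sketch; AppDbContext is a placeholder, and treating SQL Server error 1205 (deadlock victim) as retriable is my assumption about what you'd want:

```csharp
using System;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;

// Sketch: EF Core's SQL Server provider retries transient failures for you.
// Error 1205 (deadlock victim) is added explicitly; it is not in the default
// transient-error list.
services.AddDbContext<AppDbContext>(options =>
    options.UseSqlServer(connectionString, sqlOptions =>
        sqlOptions.EnableRetryOnFailure(
            maxRetryCount: 3,
            maxRetryDelay: TimeSpan.FromSeconds(5),
            errorNumbersToAdd: new[] { 1205 })));
```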

C# Is there an easier way to create a database, empty tables, tables with data in them by default, stored procs and views?

Before I posted this question, I did some Googling on how a database could be created through C#; mostly it pointed to either SMO or SQL query files, back in the days of SQL Server 2005 and 2008.
So, in this day and age, is there an easier way to create a database with empty tables, tables with default data in them, stored procedures and views?
I need a suggestion.
I think the answer is probably Entity Framework. You can go 'code first' and use database migrations, allowing you to write your C# code and use it to generate much of the database for you.
Ultimately, though, 'easier' is subjective. I personally find EF great for the 'normal' stuff, but at the end of the day, if you need a stored procedure to do some custom logic, you still have to write that custom logic in some fashion.
Maybe have a look and see if you think it fits your needs.
https://www.asp.net/mvc/overview/getting-started/getting-started-with-ef-using-mvc/creating-an-entity-framework-data-model-for-an-asp-net-mvc-application
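For a concrete picture of what 'code first plus migrations' looks like, here is a minimal, illustrative sketch (EF6-era, matching the linked tutorial); the Blog/Post model is hypothetical:

```csharp
using System.Collections.Generic;
using System.Data.Entity; // EF6 code first

// Plain classes describe the schema; migrations generate the database.
public class Blog
{
    public int BlogId { get; set; }
    public string Name { get; set; }
    public virtual ICollection<Post> Posts { get; set; }
}

public class Post
{
    public int PostId { get; set; }
    public string Title { get; set; }
    public int BlogId { get; set; }
    public virtual Blog Blog { get; set; }
}

public class BloggingContext : DbContext
{
    public DbSet<Blog> Blogs { get; set; }
    public DbSet<Post> Posts { get; set; }
}

// Package Manager Console:
//   Enable-Migrations
//   Add-Migration InitialCreate
//   Update-Database
```

Default data ('tables with data in them by default') can go in the migrations Configuration.Seed method, which runs on every Update-Database.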
Look at the database projects in Visual Studio 2013. You create a database as a series of scripts using a familiar GUI, and changes are then published: publishing creates a unique change script targeting the connection you define. For a new database the whole thing gets created, but publish against a partial or outdated version and the generated script is a change script that brings it up to date.
You can even write unit tests against your database using specialist tools, although I do find them lacking a bit.
More on MSDN - here
It depends. Right out of the gate, for stored procedures and views your best shot is working directly in the database through a workbench. You can then capture the definitions and store them in a file to be replayed through C#.
As for tables, there are many ORMs that can generate tables via C#. Look at Entity Framework's code-first examples.
I have generated tables using EF and it works fine. I then went into the database and created the views and stored procedures.
The trick is to migrate new views and SPs into your EF model; you can Google 'Entity Framework code first adding views and SPs'.
Worst case, you create the whole database through a database workbench and create a script that can be replayed to recreate everything, then use EF's database-first approach.
In either case you end up with a good set of autogenerated code to manage CRUD and object management, plus an abstracted data model.
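To make the 'migrate views and SPs into your EF model' trick concrete: in EF6 code-first migrations you can embed the raw CREATE statements in a migration's Up/Down methods via Sql(). A rough sketch with made-up object names:

```csharp
using System.Data.Entity.Migrations;

public partial class AddReportingObjects : DbMigration
{
    public override void Up()
    {
        // Views and stored procedures ride along with the schema migrations,
        // so every environment gets them on Update-Database.
        Sql(@"CREATE VIEW dbo.ActiveBlogs AS
              SELECT BlogId, Name FROM dbo.Blogs WHERE IsActive = 1");

        Sql(@"CREATE PROCEDURE dbo.GetPostCount @BlogId int AS
              SELECT COUNT(*) FROM dbo.Posts WHERE BlogId = @BlogId");
    }

    public override void Down()
    {
        Sql("DROP PROCEDURE dbo.GetPostCount");
        Sql("DROP VIEW dbo.ActiveBlogs");
    }
}
```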

Is Entity Framework fast enough for data retrieval

I am designing a new set of projects including a WCF service that must handle as many as 50 requests per minute.
This will be a Microsoft stack using .NET 4.0 and C#.
Each request will validate the data and if it passes, retrieve data via a stored proc on a SQL Server 2008 server.
The response should be returned within 5 seconds of the request, if possible.
Both the request and the response XML are under 3K each and are fairly simple.
I plan to set up a load-balancer to handle the requests but I need to know if EF will be fast enough to pull this off or if I need to go with something else.
Note that none of this is built yet so I have the freedom to build something from scratch.
Entity Framework is relatively fast (see Performance Considerations for Entity Framework 4, 5, and 6). However, if ALL you're doing is invoking stored procedures, Dapper or some other micro-ORM will be much faster. If you need to do more complex O/RM tasks, like LINQ queries against the database, LINQ to SQL is generally faster than EF6, but EF6 supports more concepts, such as code first, that LINQ to SQL was never meant to handle.
I don't think your O/RM will be your bottleneck whichever way you go; more likely the stored procedure (or missing indexes, if you go the O/RM query route and don't figure out what indexes you need beforehand) will be your performance bottleneck.
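For the stored-procedure-only case, the Dapper route looks roughly like this; dbo.GetCustomerOrders and OrderDto are hypothetical names:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using Dapper;

public class OrderDto
{
    public int OrderId { get; set; }
    public decimal Total { get; set; }
}

public static class OrderRepository
{
    // Dapper opens the connection if needed, binds the anonymous object's
    // properties as parameters, and maps each row to an OrderDto by name.
    public static IEnumerable<OrderDto> GetOrders(string connectionString, int customerId)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            return connection.Query<OrderDto>(
                "dbo.GetCustomerOrders",
                new { CustomerId = customerId },
                commandType: CommandType.StoredProcedure);
        }
    }
}
```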

Entity Framework vs pure ADO.NET

EF is such widely used stuff, but I don't really understand how I should use it. I've met a lot of issues with EF on different projects with different approaches, so some questions have come together in my head, and the answers lead me toward pure ADO.NET with stored procedures.
So the questions are:
How to deal with EF in an n-tier application?
For example, we have some DAL built on EF. I've seen a lot of articles and projects that use the repository and unit-of-work patterns as a kind of abstraction over EF. I think such an approach kills most of the benefits that increase development speed, and it leads to one of two things:
remapping of EF query results into DTOs, which kills performance (call some select to get table data: a first loop; a second loop to map the results to some composite type generated by EF; next, filter the mapped data using LINQ; and at last, map it to some DTO). This remapping to DTOs is exactly what kills one of EF's biggest benefits;
or
tight coupling between EF (and its version) and the app. It becomes something like a 2-tier app: a DAL plus a presentation layer with the BLL in it, or a DAL with the BLL in it plus a presentation layer. I guess that's not best practice. And the loading process is the same as in the previous point, except for the mapping, so again a performance issue comes up. We could try to use EF as a DAL without any abstraction over it, but we'd run into similar issues some other way.
Should I use one context per app/thread/atomic operation? The one-context-per-app (or per-thread) approach may slightly increase performance and makes navigation properties usable, but it raises other problems: keeping that context up to date, the ever-growing set of data loaded into it, and I'm not sure about concurrency with one DbContext per app/thread. Using a context per operation leads us back to remapping EF results to our DTOs, so we're pushed back to question no. 1.
Could we try to use EF + stored procedures only? Again, we have the issues from the previous questions. And what is the point of using EF if the biggest part of its functionality will not be used?
So yes, EF is great for starting a project. It's so convenient when we have a few screens and CRUD operations.
But what comes next?
All this text is just unsorted thoughts. I know that pure ADO.NET will lead to other kinds of challenges.
So, what is your opinion about this topic?
Following the naming conventions, you will find that it's called the ADO.NET Entity Framework, which means that Entity Framework sits on top of ADO.NET, so it can't be faster. At best it performs in roughly equal time. But look at what EF provides:
You no longer get stuck writing queries without any clue as to whether what you're writing will even compile.
It lets you rely on C# (or your favorite .NET language) to write the data constraints you wish to enforce on user input directly inside your model classes.
Finally, EF and LINQ give you a lot of power for maintaining your applications later.
There are three different workflows with Entity Framework: Model First, Database First and Code First; get to know each of them.
As for the point about remapping killing performance: it's because, on first use, EF loads metadata into memory, and that takes time as it builds an in-memory representation of the model from the EDMX file.
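To illustrate the point about writing constraints inside your model classes, a small sketch using the standard data annotations (the Customer class is made up):

```csharp
using System.ComponentModel.DataAnnotations;

// Constraints live on the model itself; EF validates them before saving
// and reflects them (e.g. NOT NULL, column length) in the generated schema.
public class Customer
{
    public int CustomerId { get; set; }

    [Required, StringLength(100)]
    public string Name { get; set; }

    [Range(0, 120)]
    public int Age { get; set; }
}
```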
ADO.NET is an object-oriented framework that allows you to interact with database systems (SQL Server, Oracle, etc.).
Entity Framework is a way of manipulating data in databases as collections of objects and queries (insert into a table, select * from it, and so on) rather than raw SQL. It is used with LINQ.
Entity Framework is not efficient in any case, as with most tools or toolboxes designed to achieve 'faster' results.
Access to the database should be viewed as a separate tier, with stored procedures as the interface. There is no reason for any application to have more than the absolutely required CRUD operations; less is more. Stored procedures are easy to write, secure, and maintain, and are de facto the fastest way. It's easy to write tools that generate the desired POCO and DbContext code from the stored procedures.
A well-designed application should have a limited number of connection strings to the database, and none of them should be the almighty God account. Use schemas to scope connection rights.
Lazy loading is a false promise, added to solve a problem that should never exist, and introduced with ORMs and their plug-and-play features. Data should only be read when needed, and developers should be responsible for implementing that logic based on the application context.
If your application logic has a problem maintaining state, no tool will help. It will in fact make things worse by covering up the real problem until it's too late.
Database first is the only solution for a well-designed application. Civilization realized long ago the importance of solid aqueducts and sewer systems: high-level code can and will be replaced at any time, but data stays. Rewriting an entire application is a matter of days if the database is well designed.
Applications are just glorified database access. Still true in most cases.
This is my conclusion after many years of debugging business applications through code produced by many different tools and toolboxes. The 'faster results' advertised do not come close to covering the amount of time and energy wasted later trying to clean up the mess. Performance issues are rarely, if ever, caused by high demand; they are the sum of all the 'features' added through unusable tools.
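For what it's worth, the 'stored procedures as the interface' approach described above looks roughly like this in plain ADO.NET; the procedure and POCO names are invented:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public class Customer
{
    public int CustomerId { get; set; }
    public string Name { get; set; }
}

public static class CustomerData
{
    // The stored procedure is the only surface the application touches;
    // this mapper (hand-written here, but easily generated) turns rows
    // into POCOs.
    public static List<Customer> GetActiveCustomers(string connectionString)
    {
        var customers = new List<Customer>();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.GetActiveCustomers", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    customers.Add(new Customer
                    {
                        CustomerId = reader.GetInt32(0),
                        Name = reader.GetString(1)
                    });
                }
            }
        }
        return customers;
    }
}
```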
ADO.NET provides consistent access to data sources such as SQL Server and XML, and to data sources exposed through OLE DB and ODBC. Data-sharing consumer applications can use ADO.NET to connect to these data sources and retrieve, handle, and update the data that they contain.
Entity Framework 6 (EF6) is a tried and tested object-relational mapper (O/RM) for .NET with many years of feature development and stabilization. An ORM like EF has the following advantages:
It lets developers focus on the business logic of the application, facilitating a huge reduction in code.
It eliminates the need for repetitive SQL code and greatly speeds up development.
It saves you from writing SQL queries by hand; and many more.
In an n-tier application, it depends on the amount of data your application is handling and your database is managing. To my knowledge, DTOs don't kill performance. They are data containers for moving data between layers; they are only used to pass data and contain no business logic. They are mostly used in service classes. See DTO.
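In fact, the double-mapping overhead the question describes can be avoided by projecting straight into the DTO inside the query, so EF fetches only the needed columns and no second in-memory pass happens. A sketch with hypothetical AppDbContext/Order/OrderDto types:

```csharp
using System.Collections.Generic;
using System.Linq;

public class OrderDto
{
    public int OrderId { get; set; }
    public decimal Total { get; set; }
}

public static class OrderQueries
{
    // The Select is translated to SQL: only OrderId and Total are fetched,
    // and no intermediate entity is materialized.
    public static List<OrderDto> GetOrderSummaries(AppDbContext context, int customerId)
    {
        return context.Orders
            .Where(o => o.CustomerId == customerId)
            .Select(o => new OrderDto { OrderId = o.OrderId, Total = o.Total })
            .ToList();
    }
}
```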
One DbContext is always a best practice.
To my knowledge there is no such combination as EF + SPs (stored procedures). If you wish to use an ORM and stored procedures at the same time, try a micro-ORM like Dapper or BLToolkit; they were built for that purpose and are a whole lot faster than EF. Here is a good article on the Dapper ORM.
Here is a related thread on a similar topic: What is the difference between an orm and ADO.net?

Performance monitoring options in Entity Framework 4.1

I am developing a custom Content Management System in C# (SQL Server 2005) for my organization that operates primarily on Entity Framework 4.1. I would like some insight as to how my application is running, specifically when it comes to my EF queries.
What I'm looking for is a way to monitor the quantity, speed and actual execution (translated SQL) of queries being executed within a given period of time. Essentially I'd like to add DB profiling functionality into my application.
If at all possible, I would like to do this without implementing custom monitoring code for each one of my repository functions.
My question is this:
What is the simplest way to monitor the in/out performance of Entity Framework queries? I would like the following data:
A list of queries executed within the profiling time-span
For each query I would like to see execution time and actual SQL
If possible, the result size for each query would be helpful too
You can use an existing tool, such as the Hibernating Rhinos EF Profiler.
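If you'd rather not take a dependency on a profiler, a rough do-it-yourself option is that in EF 4.1 calling ToString() on an IQueryable returns the translated SQL, so a single generic wrapper (rather than per-repository code) can log SQL, timing and result size. A hedged sketch:

```csharp
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;

public static class QueryProfiler
{
    // Logs the translated SQL, execution time and row count for one query.
    public static List<T> Profiled<T>(IQueryable<T> query)
    {
        Debug.WriteLine(query.ToString()); // the SQL EF will send

        var watch = Stopwatch.StartNew();
        var results = query.ToList();      // forces execution
        watch.Stop();

        Debug.WriteLine(string.Format("{0} ms, {1} rows",
            watch.ElapsedMilliseconds, results.Count));
        return results;
    }
}
```

This is per-query instrumentation rather than the passive, application-wide capture a dedicated profiler gives you.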
