Foreach gets slower with every iteration - C#

We have an application that does some processing at night. Simply put, it creates some statistics for every user (about 10,000 of them).
Now we have noticed that this takes hours in the production environment, and we have been able to reproduce this using a backup of the production database.
We see that when the foreach loop starts, it generally takes around 200 ms to generate the data and save it to the database for one user. After about 1,000 users, this climbs to 700 ms per user. And after about 2,000 users, it just keeps taking longer and longer, all the way up to 2 seconds per user to generate the data and save it to the database.
Do you have any ideas why this may be? Here is the (simplified) code to show what happens:
var userService = IoC.GetInstance<IUserService>();
var users = userService.GetAll().Where(x => x.IsRegistered == true);
var statisticsService = IoC.GetInstance<IStatisticsService>();
foreach (var user in users.ToList())
{
var statistic1 = GetStatistic(1); // this returns an object
var statistic2 = GetStatistic(2);
statisticsService.Add(statistic1);
statisticsService.Add(statistic2);
statisticsService.Commit(); /* this is essentially a dbContext.SaveChanges(); */
}
Statistic GetStatistic(int statisticnumber)
{
var stat = new Statistic();
stat.Type = statisticnumber;
switch(statisticnumber) {
case 1:
stat.Value = /* query to count how many times they've logged in */
break;
case 2:
stat.Value = /* query to get their average score */
break;
...
}
return stat;
}
So far we have tried:
AsNoTracking: we select the users using AsNoTracking to make sure Entity Framework doesn't track any changes to the user itself (because there are none).
Clearing indexes on the statistics table: before this starts, we run a script that drops all indexes (except the clustered index). After generation we recreate these indexes.
Does anyone have any additional things we can test/try?

As mentioned in the comments, you need to keep the context clean, so you should dispose of it every n records (usually, in my case, n < 1000); a minimal sketch of this pattern follows the list below.
This is a good solution in most cases. But there are some issues, the most important being:
1. When you need to insert a lot of records, running raw insert (and update) statements is faster.
2. The entities you write (and their related entities) must all be in the same context.
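For reference, a minimal sketch of the dispose-every-n pattern, assuming a typical EF DbContext (MyDbContext, Statistics, and statisticsToInsert are placeholder names, not from the original post):
// Recreate the context every batchSize records so the ChangeTracker stays small.
const int batchSize = 1000;
var context = new MyDbContext();
try
{
    int count = 0;
    foreach (var statistic in statisticsToInsert)
    {
        context.Statistics.Add(statistic);
        if (++count % batchSize == 0)
        {
            context.SaveChanges();
            context.Dispose();
            context = new MyDbContext(); // fresh context, empty ChangeTracker
        }
    }
    context.SaveChanges(); // flush the final partial batch
}
finally
{
    context.Dispose();
}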
There are other libraries around for bulk operations, but they work only with SQL Server, and I think a great added value of EF is that it is DBMS-independent without significant effort.
When I need to insert several records (fewer than 1,000,000) and I want to keep the advantages of EF, I use the following methods. They generate a DML statement starting from an entity.
public int ExecuteInsertCommand(object entityObject)
{
DbCommand command = GenerateInsertCommand(entityObject);
ConnectionState oldConnectionState = command.Connection.State;
try
{
if (oldConnectionState != ConnectionState.Open)
command.Connection.Open();
int result = command.ExecuteNonQuery();
return result;
}
finally
{
if (oldConnectionState != ConnectionState.Open)
command.Connection.Close();
}
}
public DbCommand GenerateInsertCommand(object entityObject)
{
ObjectContext objectContext = ((IObjectContextAdapter)Context).ObjectContext;
var metadataWorkspace = ((EntityConnection)objectContext.Connection).GetMetadataWorkspace();
IEnumerable<EntitySetMapping> entitySetMappingCollection = metadataWorkspace.GetItems<EntityContainerMapping>(DataSpace.CSSpace).Single().EntitySetMappings;
IEnumerable<AssociationSetMapping> associationSetMappingCollection = metadataWorkspace.GetItems<EntityContainerMapping>(DataSpace.CSSpace).Single().AssociationSetMappings;
var entitySetMappings = entitySetMappingCollection.First(o => o.EntityTypeMappings.Select(e => e.EntityType.Name).Contains(entityObject.GetType().Name));
var entityTypeMapping = entitySetMappings.EntityTypeMappings[0];
string tableName = entityTypeMapping.EntitySetMapping.EntitySet.Name;
MappingFragment mappingFragment = entityTypeMapping.Fragments[0];
string sqlColumns = string.Empty;
string sqlValues = string.Empty;
int paramCount = 0;
DbCommand command = Context.Database.Connection.CreateCommand();
foreach (PropertyMapping propertyMapping in mappingFragment.PropertyMappings)
{
if (((ScalarPropertyMapping)propertyMapping).Column.StoreGeneratedPattern != StoreGeneratedPattern.None)
continue;
string columnName = ((ScalarPropertyMapping)propertyMapping).Column.Name;
object columnValue = entityObject.GetType().GetProperty(propertyMapping.Property.Name).GetValue(entityObject, null);
string paramName = string.Format("@p{0}", paramCount);
if (paramCount != 0)
{
sqlColumns += ",";
sqlValues += ",";
}
sqlColumns += SqlQuote(columnName);
sqlValues += paramName;
DbParameter parameter = command.CreateParameter();
parameter.Value = columnValue;
parameter.ParameterName = paramName;
command.Parameters.Add(parameter);
paramCount++;
}
foreach (var navigationProperty in entityTypeMapping.EntityType.NavigationProperties)
{
PropertyInfo propertyInfo = entityObject.GetType().GetProperty(navigationProperty.Name);
if (typeof(System.Collections.IEnumerable).IsAssignableFrom(propertyInfo.PropertyType))
continue;
AssociationSetMapping associationSetMapping = associationSetMappingCollection.First(a => a.AssociationSet.ElementType.FullName == navigationProperty.RelationshipType.FullName);
EndPropertyMapping propertyMappings = associationSetMapping.AssociationTypeMapping.MappingFragment.PropertyMappings.Cast<EndPropertyMapping>().First(p => p.AssociationEnd.Name.EndsWith("_Target"));
object relatedObject = propertyInfo.GetValue(entityObject, null);
foreach (ScalarPropertyMapping propertyMapping in propertyMappings.PropertyMappings)
{
string columnName = propertyMapping.Column.Name;
string paramName = string.Format("@p{0}", paramCount);
object columnValue = relatedObject == null ?
null :
relatedObject.GetType().GetProperty(propertyMapping.Property.Name).GetValue(relatedObject, null);
if (paramCount != 0)
{
sqlColumns += ",";
sqlValues += ",";
}
sqlColumns += SqlQuote(columnName);
sqlValues += paramName;
DbParameter parameter = command.CreateParameter();
parameter.Value = columnValue;
parameter.ParameterName = paramName;
command.Parameters.Add(parameter);
paramCount++;
}
}
string sql = string.Format("INSERT INTO {0} ({1}) VALUES ({2})", tableName, sqlColumns, sqlValues);
command.CommandText = sql;
foreach (DbParameter parameter in command.Parameters)
{
if (parameter.Value == null)
parameter.Value = DBNull.Value;
}
return command;
}
public int ExecuteUpdateCommand(object entityObject)
{
DbCommand command = GenerateUpdateCommand(entityObject);
ConnectionState oldConnectionState = command.Connection.State;
try
{
if (oldConnectionState != ConnectionState.Open)
command.Connection.Open();
int result = command.ExecuteNonQuery();
return result;
}
finally
{
if (oldConnectionState != ConnectionState.Open)
command.Connection.Close();
}
}
public DbCommand GenerateUpdateCommand(object entityObject)
{
ObjectContext objectContext = ((IObjectContextAdapter)Context).ObjectContext;
var metadataWorkspace = ((EntityConnection)objectContext.Connection).GetMetadataWorkspace();
IEnumerable<EntitySetMapping> entitySetMappingCollection = metadataWorkspace.GetItems<EntityContainerMapping>(DataSpace.CSSpace).Single().EntitySetMappings;
IEnumerable<AssociationSetMapping> associationSetMappingCollection = metadataWorkspace.GetItems<EntityContainerMapping>(DataSpace.CSSpace).Single().AssociationSetMappings;
string entityTypeName;
if (!entityObject.GetType().Namespace.Contains("DynamicProxi"))
entityTypeName = entityObject.GetType().Name;
else
entityTypeName = entityObject.GetType().BaseType.Name;
var entitySetMappings = entitySetMappingCollection.First(o => o.EntityTypeMappings.Select(e => e.EntityType.Name).Contains(entityTypeName));
var entityTypeMapping = entitySetMappings.EntityTypeMappings[0];
string tableName = entityTypeMapping.EntitySetMapping.EntitySet.Name;
MappingFragment mappingFragment = entityTypeMapping.Fragments[0];
string sqlColumns = string.Empty;
int paramCount = 0;
DbCommand command = Context.Database.Connection.CreateCommand();
foreach (PropertyMapping propertyMapping in mappingFragment.PropertyMappings)
{
if (((ScalarPropertyMapping)propertyMapping).Column.StoreGeneratedPattern != StoreGeneratedPattern.None)
continue;
string columnName = ((ScalarPropertyMapping)propertyMapping).Column.Name;
if (entityTypeMapping.EntityType.KeyProperties.Select(_ => _.Name).Contains(columnName))
continue;
object columnValue = entityObject.GetType().GetProperty(propertyMapping.Property.Name).GetValue(entityObject, null);
string paramName = string.Format("@p{0}", paramCount);
if (paramCount != 0)
sqlColumns += ",";
sqlColumns += string.Format("{0} = {1}", SqlQuote(columnName), paramName);
DbParameter parameter = command.CreateParameter();
parameter.Value = columnValue ?? DBNull.Value;
parameter.ParameterName = paramName;
command.Parameters.Add(parameter);
paramCount++;
}
foreach (var navigationProperty in entityTypeMapping.EntityType.NavigationProperties)
{
PropertyInfo propertyInfo = entityObject.GetType().GetProperty(navigationProperty.Name);
if (typeof(System.Collections.IEnumerable).IsAssignableFrom(propertyInfo.PropertyType))
continue;
AssociationSetMapping associationSetMapping = associationSetMappingCollection.First(a => a.AssociationSet.ElementType.FullName == navigationProperty.RelationshipType.FullName);
EndPropertyMapping propertyMappings = associationSetMapping.AssociationTypeMapping.MappingFragment.PropertyMappings.Cast<EndPropertyMapping>().First(p => p.AssociationEnd.Name.EndsWith("_Target"));
object relatedObject = propertyInfo.GetValue(entityObject, null);
foreach (ScalarPropertyMapping propertyMapping in propertyMappings.PropertyMappings)
{
string columnName = propertyMapping.Column.Name;
string paramName = string.Format("@p{0}", paramCount);
object columnValue = relatedObject == null ?
null :
relatedObject.GetType().GetProperty(propertyMapping.Property.Name).GetValue(relatedObject, null);
if (paramCount != 0)
sqlColumns += ",";
sqlColumns += string.Format("{0} = {1}", SqlQuote(columnName), paramName);
DbParameter parameter = command.CreateParameter();
parameter.Value = columnValue ?? DBNull.Value;
parameter.ParameterName = paramName;
command.Parameters.Add(parameter);
paramCount++;
}
}
string sqlWhere = string.Empty;
bool first = true;
foreach (EdmProperty keyProperty in entityTypeMapping.EntityType.KeyProperties)
{
var propertyMapping = mappingFragment.PropertyMappings.First(p => p.Property.Name == keyProperty.Name);
string columnName = ((ScalarPropertyMapping)propertyMapping).Column.Name;
object columnValue = entityObject.GetType().GetProperty(propertyMapping.Property.Name).GetValue(entityObject, null);
string paramName = string.Format("@p{0}", paramCount);
if (first)
first = false;
else
sqlWhere += " AND ";
sqlWhere += string.Format("{0} = {1}", SqlQuote(columnName), paramName);
DbParameter parameter = command.CreateParameter();
parameter.Value = columnValue;
parameter.ParameterName = paramName;
command.Parameters.Add(parameter);
paramCount++;
}
string sql = string.Format("UPDATE {0} SET {1} WHERE {2}", tableName, sqlColumns, sqlWhere);
command.CommandText = sql;
return command;
}
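For context, a hypothetical sketch of how these helpers could replace the Add/Commit calls in the question's loop (repository stands for whatever class hosts the methods above):
// Generate and run one INSERT per statistic without going through the
// DbContext ChangeTracker, so per-iteration time stays flat.
foreach (var user in users)
{
    var statistic1 = GetStatistic(1);
    var statistic2 = GetStatistic(2);
    repository.ExecuteInsertCommand(statistic1);
    repository.ExecuteInsertCommand(statistic2);
}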

Add Method
This method gets slower and slower with every iteration. In fact, it is not the Add method itself that gets slower, but the DetectChanges method that is called inside Add.
The more records the ChangeTracker contains, the slower the DetectChanges method becomes.
At around 100,000 tracked entities, it can take more than 200 ms simply to add a new entity, when adding the first entity took close to 0 ms.
Solution
There are several solutions to fix this issue such as:
USE AddRange over Add
SET AutoDetectChanges to false (see the sketch after this list)
SPLIT SaveChanges into multiple batches
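A minimal sketch of the AutoDetectChanges option on an EF6 DbContext (context and entity names are placeholders):
// Turn off automatic change detection while adding many entities;
// DetectChanges is what makes Add slow down as the tracker grows.
context.Configuration.AutoDetectChangesEnabled = false;
try
{
    foreach (var statistic in statistics)
    {
        context.Statistics.Add(statistic);
    }
    context.ChangeTracker.DetectChanges(); // run detection once, explicitly
    context.SaveChanges();
}
finally
{
    context.Configuration.AutoDetectChangesEnabled = true;
}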
In your case, re-creating the context on every loop iteration is probably the best idea, since it looks like you want to save on every iteration.
foreach (var user in users.ToList())
{
var statisticsService = IoC.GetInstance<IStatisticsService>(); // resolve a fresh service (and a fresh context) each iteration
var statistic1 = GetStatistic(1); // this returns an object
var statistic2 = GetStatistic(2);
statisticsService.Add(statistic1);
statisticsService.Add(statistic2);
statisticsService.Commit(); /* this is essentially a dbContext.SaveChanges(); */
}
Once you get rid of the poor performance caused by the DetectChanges method, you still have a performance issue caused by the number of database round-trips performed by the SaveChanges method.
If you need to save 10,000 statistics, then SaveChanges will make 10,000 database round-trips, which is insanely slow.
Disclaimer: I'm the owner of the project Entity Framework Extensions
This library allows you to perform all bulk operations:
BulkSaveChanges
BulkInsert
BulkUpdate
BulkDelete
BulkMerge
BulkSynchronize
It works with all major providers, such as:
SQL Server
SQL Compact
Oracle
MySQL
SQLite
PostgreSQL
Example:
// Easy to use
context.BulkSaveChanges();
// Easy to customize
context.BulkSaveChanges(bulk => bulk.BatchSize = 100);
// Perform Bulk Operations
context.BulkDelete(customers);
context.BulkInsert(customers);
context.BulkUpdate(customers);
// Customize Primary Key
context.BulkMerge(customers, operation => {
operation.ColumnPrimaryKeyExpression =
customer => customer.Code;
});


How do I return multiple rows of parameter data with C#

I'm using C# to extract data into T-SQL parameters; the query is:
DECLARE @Param1 INT, @Param2 NCHAR(50), @P3 REAL, @P4 BIT
SELECT @Param1 = [idx]
,@Param2 = [data]
,@P3 = [itgetsreal]
,@P4 = [wazzup]
FROM [T].[dbo].[temp];
Running this from SQL Server Management Studio yields a predictable one-row result: the last row. My C# code creates output parameters:
public class ParamData
{
    // query has executed, return parameter data in class properties
    public string[] names;
    public Object[] vals;
    public bool success = false;
    // error/status fields used by the constructor below
    public string err;
    public int errcode;
    public string errdata;
    public string errsrc;
    public ParamData(SqlCommand cmd)
    {
        if (cmd == null)
        {
            err = "SqlCommand \"cmd\" may not be NULL";
            errcode = -1; errdata = ""; errsrc = "ParamData(SqlCommand cmd)";
        }
        else if (cmd.Parameters == null)
        {
            err = "\"SqlCommand.Parameters\" may not be NULL";
            errcode = -1; errdata = cmd.CommandText; errsrc = "ParamData(SqlCommand cmd)";
        }
        else
        {
            try
            {
                err = ""; errcode = 0; errdata = ""; errsrc = "";
                int cnt = cmd.Parameters.Count;
                if (cnt > 0)
                {
                    names = new string[cnt];
                    vals = new object[cnt];
                    for (int i = 0; i < cnt; i++)
                    {
                        names[i] = cmd.Parameters[i].ParameterName;
                        vals[i] = cmd.Parameters[i].Value;
                    }
                }
                success = true;
            }
            catch (SqlException ex)
            {
                err = ex.Message;
                errcode = ex.Number;
                errdata = String.Format("Parameters name/val assignment: Query: \"{0}\"", cmd.CommandText);
                errsrc = "ParamData(SqlCommand cmd)";
            }
        }
    }
}
How do I create a SqlDataReader-type object from which I can extract each row of the dataset, and not only the last?
BTW: I already have a means of bringing in the data without parameters, but I am constrained to provide legacy compatibility and can't yet replicate the original function.
You can't return multiple rows through T-SQL variables/parameters; a parameter only holds the last value assigned to it.
The comments are spot on: you need to return result sets (not parameters) to obtain multiple rows.
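A minimal sketch of that approach against the same table (connectionString is assumed; types from System.Data.SqlClient; column types follow the DECLARE in the question):
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("SELECT [idx], [data], [itgetsreal], [wazzup] FROM [T].[dbo].[temp];", conn))
{
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read()) // one iteration per row, not just the last row
        {
            int idx = reader.GetInt32(0);
            string data = reader.GetString(1);
            float itgetsreal = reader.GetFloat(2); // REAL maps to float in .NET
            bool wazzup = reader.GetBoolean(3);    // BIT maps to bool
            // consume the row here
        }
    }
}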

BAD QUESTION/SOLVED - C# How to use a Type obtained from a variable. Error is - "is a variable but is used as a type"

I'm trying to make a function to extract the values of a specific record in a database through a raw query, and I want to return an object of the type of the entity that is requested.
(Edited: some background to better explain the purpose.)
I'm making a kind of Entity Framework add-on function to be able to request and save any object of a respective model class on a MySqlite DB created by a code-first approach. Here is the read function; the save is basically the same. Just one function for saving and one for reading, for all entities. This will also save me the trouble of creating a form for every model, as there will be only one form that handles all models on the front end. This is a WebAssembly project with a visual database management system for the admin.
I'm looking for an elegant solution for the tiny piece of code under the commented line; otherwise I'll have to use a switch over all types with the respective TryParse on the value. If anyone knows one, I'd appreciate it, and I hope this code will be useful for someone.
See the commented line:
public async Task<object> GetClassData(string entityName, int id=0)
{
var classes = AppDomain.CurrentDomain.GetAssemblies()
.SelectMany(t => t.GetTypes())
.Where(t => t.IsClass && t.Namespace == "MecanicoAppSqlite.Shared.Models");
Type genericClassType = classes.FirstOrDefault(x => x.Name.ToLower() == entityName.ToLower());
PropertyInfo[] newClassProperties = genericClassType.GetProperties();
var tempClass = System.Reflection.Assembly.GetAssembly(genericClassType).CreateInstance(genericClassType.ToString());
string query = $"SELECT * from {entityName} WHERE Id LIKE {id}";
try
{
using (var command = context.Database.GetDbConnection().CreateCommand())
{
command.CommandText = query;
command.CommandType = CommandType.Text;
await context.Database.OpenConnectionAsync();
using (var reader = await command.ExecuteReaderAsync())
{
reader.Read();
int columnCount= reader.FieldCount;
for (int h = 0; h < columnCount; h++)
{
string colName = reader.GetName(h);
PropertyInfo pinfo = newClassProperties.FirstOrDefault(x => x.Name.ToLower() == colName.ToLower());
Type tp = pinfo.GetType();
//This tp above is a Int32 or string or whatever. Related problem is in next line
var x = reader.IsDBNull(h) ? null : reader.GetFieldValueAsync<tp>(h).Result;
pinfo.SetValue(tempClass, x);
}
}
}
return StatusCode(200, tempClass); // Get all users
}
catch (Exception e)
{
return StatusCode(500, e);
}
}
You should really use something like Dapper or Entity Framework. But to answer your question, you can use MakeGenericMethod for that:
public async Task<object> GetClassData(string entityName, int id = 0)
{
var classes = AppDomain.CurrentDomain.GetAssemblies()
.SelectMany(t => t.GetTypes())
.Where(t => t.IsClass && t.Namespace == "MecanicoAppSqlite.Shared.Models");
Type genericClassType = classes.FirstOrDefault(x => x.Name.ToLower() == entityName.ToLower());
PropertyInfo[] newClassProperties = genericClassType.GetProperties();
var tempClass = System.Reflection.Assembly.GetAssembly(genericClassType).CreateInstance(genericClassType.ToString());
string query = $"SELECT * from {entityName} WHERE Id LIKE {id}";
try
{
using (var command = context.Database.GetDbConnection().CreateCommand())
{
command.CommandText = query;
command.CommandType = CommandType.Text;
await context.Database.OpenConnectionAsync();
// Here we get the MethodInfo for the generic GetFieldValue<T> instance method on DbDataReader
MethodInfo method = typeof(DbDataReader).GetMethod("GetFieldValue");
using (var reader = await command.ExecuteReaderAsync())
{
reader.Read();
int columnCount = reader.FieldCount;
for (int h = 0; h < columnCount; h++)
{
string colName = reader.GetName(h);
PropertyInfo pinfo = newClassProperties.FirstOrDefault(x => x.Name.ToLower() == colName.ToLower());
// pinfo.GetType() will give you typeof(PropertyInfo) not the type of the property
Type tp = pinfo.PropertyType;
var genericMethod = method.MakeGenericMethod(new[] { tp });
var x = reader.IsDBNull(h) ? null : genericMethod.Invoke(reader, new object[] {h});
pinfo.SetValue(tempClass, x);
}
}
}
return StatusCode(200, tempClass); // Get all users
}
catch (Exception e)
{
return StatusCode(500, e);
}
}
I'll use a switch with all the data types that I use in this application for now:
using (var reader = await command.ExecuteReaderAsync())
{
reader.Read();
var tempClass = System.Reflection.Assembly.GetAssembly(genericClassType).CreateInstance(genericClassType.ToString());
int numeroDeColunas = reader.FieldCount;
for(int h=0;h<numeroDeColunas;h++)
{
string colName = reader.GetName(h);
PropertyInfo property = newClassProperties.FirstOrDefault(x => x.Name.ToLower() == colName.ToLower());
Type propType = property.PropertyType; // PropertyType, not GetType(), as the accepted answer points out
var x = reader.IsDBNull(h) ? "" : reader.GetFieldValueAsync<string>(h).Result;
switch (propType.Name)
{
case "Int32":
int iv;
if (!Int32.TryParse(x, out iv)) throw new ApplicationException("Unable to parse at ValuesController>GetAll");
property.SetValue(tempClass, iv);
break;
case "String":
property.SetValue(tempClass, x);
break;
case "DateTime":
DateTime dv;
if (!DateTime.TryParse(x, out dv)) throw new ApplicationException("Unable to parse at ValuesController>GetAll");
property.SetValue(tempClass, dv);
break;
case "TimeSpan":
TimeSpan tv;
if (!TimeSpan.TryParse(x, out tv)) throw new ApplicationException("Unable to parse at ValuesController>GetAll");
property.SetValue(tempClass, tv);
break;
default:
Console.WriteLine("Subject is C#");
break;
}
}
}

How to optimize the time in executing the query in C# / React

I am new to React and new to Web API. I am loading data into a Tabulator table in the React front end from the values that I pass through the Web API. I pass the values through the GetReports function like this:
[HttpPost]
[Route("GetReports")]
public IHttpActionResult GetReports(string jwt, List<object> data)
{
if (!Common.VerificationToken.VerifyJWToken(jwt))
{
return null;
}
var to = data[0];
var from = data[1];
DateTime toDate = Convert.ToDateTime(to);
DateTime fromDate = Convert.ToDateTime(from);
var ReportData = db.T_CQL_COIL_DESC.Where(t => t.CCD_CREATED_ON >= toDate &&
t.CCD_CREATED_ON <= fromDate).ToList();
ReportsDTO dto = new ReportsDTO();
List<ReportsDTO> ReportDTO = new List<ReportsDTO>();
try
{
foreach (var report in ReportData)
{
List<vehicledetail> vehicle = new List<vehicledetail>();
var imgurl = "https://firebasestorage.googleapis.com/v0/b/tsl-coil-qlty-monitoring-dev.appspot.com/";
dto = new ReportsDTO();
dto.Type = report.CCD_OP_TYPE;
dto.ID = report.CCD_COIL_ID;
vehicle = GetVehicleID(dto.ID);
vehicledetail vehicledetails = vehicle[0];
dto.vehicleno = vehicledetails.vehicleno.ToString();
dto.wagonno = vehicledetails.wagonno.ToString();
dto.Active = report.CCD_ACTIVE;
dto.ImgURL = report.CCD_IMAGE_URL != null ? imgurl + report.CCD_IMAGE_URL : "NA";
dto.Desc = report.CCD_VIEW_DESC != null ? report.CCD_VIEW_DESC : "NA";
ReportDTO.Add(dto);
}
return Ok(ReportDTO);
}
catch (Exception ex)
{
return Content(HttpStatusCode.NoContent, "Something went wrong");
}
}
The vehicledetail data in vehicledetail vehicledetails = vehicle[0]; is populated by this function:
public List<vehicledetail> GetVehicleID(string coilID)
{
List<vehicledetail> vehicle = new List<vehicledetail>();
vehicledetail vehicledetails = new vehicledetail();
string oradb = Utilities.STAR_DB;
OracleConnection conn = new OracleConnection(oradb);
string query = "SELECT a.Vbeln, b.WAGON_NO FROM sapr3.lips a, sapr3.ZVTRRDA b WHERE a.MANDT='600' AND a.CHARG='" + coilID + "' AND a.LFIMG > 0 AND a.MANDT = b.MANDT AND a.VBELN = b.VBELN";
OracleDataAdapter da = new OracleDataAdapter(query, conn);
conn.Open();
DataTable dt = new DataTable();
da.Fill(dt);
foreach (DataRow row in dt.Rows)
{
vehicledetails.vehicleno = row["VBELN"].ToString();
vehicledetails.wagonno = row["WAGON_NO"].ToString();
}
conn.Close();
vehicle.Add(vehicledetails);
return (vehicle);
}
It is working fine, but it is taking 30 seconds to load the data (a screenshot of the resulting report was attached here).
How do I optimize this? Please help.
A few other things aside, the major problem seems to be that you are querying the database once for every vehicle.
In this particular scenario it might very well be better to select all the coil IDs and query the vehicle details for all of them at once, for example:
var coilIds = ReportData.Select(t => t.CCD_COIL_ID).ToList();
You can then form a single query that gets all the vehicle details together, as sketched below. This reduces the number of database calls, and it can have a massive impact on time.
Another thing to check is whether there is an index on the coil ID column in the database, as that may also help speed things up.
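A sketch of that batched lookup, assuming ODP.NET (Oracle.ManagedDataAccess.Client) and the same LIPS/ZVTRRDA tables as in the question; the method and variable names are illustrative:
// Build one parameterised IN-list query instead of one query per coil.
public Dictionary<string, vehicledetail> GetVehicleDetails(List<string> coilIds)
{
    var result = new Dictionary<string, vehicledetail>();
    if (coilIds.Count == 0) return result;
    using (var conn = new OracleConnection(Utilities.STAR_DB))
    using (var cmd = conn.CreateCommand())
    {
        // One bind variable per coil ID. Oracle caps IN lists at 1,000 items,
        // so chunk the list if a report can exceed that.
        var paramNames = coilIds.Select((_, i) => ":c" + i).ToArray();
        cmd.CommandText =
            "SELECT a.CHARG, a.VBELN, b.WAGON_NO FROM sapr3.lips a, sapr3.ZVTRRDA b " +
            "WHERE a.MANDT = '600' AND a.LFIMG > 0 AND a.MANDT = b.MANDT AND a.VBELN = b.VBELN " +
            "AND a.CHARG IN (" + string.Join(",", paramNames) + ")";
        for (int i = 0; i < coilIds.Count; i++)
            cmd.Parameters.Add(new OracleParameter(":c" + i, coilIds[i]));
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                result[reader["CHARG"].ToString()] = new vehicledetail
                {
                    vehicleno = reader["VBELN"].ToString(),
                    wagonno = reader["WAGON_NO"].ToString()
                };
            }
        }
    }
    return result; // look up each report's coil ID here instead of calling GetVehicleID per row
}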

C# MySQLDataReader returns column names instead of field values

I am using MySQLClient with a local database. I wrote a method which returns a list of data about the user, where I specify the columns I want the data from and it generates the query dynamically.
However, the reader is only returning the column names rather than the actual field values, and I don't know why, since the same method works earlier in the program when the user logs in.
I am using parameterised queries to protect against SQL injection.
Here is my code. I have removed parts which are unrelated to the problem, but I can give the full code if needed.
namespace Library_application
{
class MainProgram
{
public static Int32 user_id;
static void Main()
{
MySqlConnection conn = LoginProgram.Start();
//this is the login process and it works perfectly fine, so I won't show its code
if (conn != null)
{
//this is where things start to break
NewUser(conn);
}
Console.ReadLine();
}
static void NewUser(MySqlConnection conn)
{
//three types of users, currently only using student
string query = "SELECT user_role FROM Users WHERE user_id=#user_id";
Dictionary<string, string> vars = new Dictionary<string, string>
{
["#user_id"] = user_id.ToString()
};
MySqlDataReader reader = SQLControler.SqlQuery(conn, query, vars, 0);
if (reader.Read())
{
string user_role = reader["user_role"].ToString();
reader.Close();
//this works fine and it correctly identifies the role and creates a student
Student user = new Student(conn, user_id);
//later I will add the logic to detect and create the other users, but I just need this to work first
}
else
{
throw new Exception($"no user_role for user_id - {user_id}");
}
}
}
class SQLControler
{
public static MySqlDataReader SqlQuery(MySqlConnection conn, string query, Dictionary<string, string> vars, int type)
{
MySqlCommand cmd = new MySqlCommand(query, conn);
int count = vars.Count();
MySqlParameter[] param = new MySqlParameter[count];
//adds the parameters to the command
for (int i = 0; i < count; i++)
{
string key = vars.ElementAt(i).Key;
param[i] = new MySqlParameter(key, vars[key]);
cmd.Parameters.Add(param[i]);
}
//runs this one
if (type == 0)
{
Console.WriteLine("------------------------------------");
return cmd.ExecuteReader();
//returns the reader so I can get the data later and keep this reusable
}
else if (type == 1)
{
cmd.ExecuteNonQuery();
return null;
}
else
{
throw new Exception("incorrect type value");
}
}
}
class User
{
public List<string> GetValues(MySqlConnection conn, List<string> vals, int user_id)
{
Dictionary<string, string> vars = new Dictionary<string, string> { };
//------------------------------------------------------------------------------------
//this section is generating the query and parameters
//using parameters to protect against SQL injection; I know that it isn't essential in this scenario
//but it will be later, so if I fix it by simply removing the parameterisation then I'm just kicking the problem down the road
string args = "";
for (int i = 0; i < vals.Count(); i++)
{
args = args + "#" + vals[i];
vars.Add("#" + vals[i], vals[i]);
if ((i + 1) != vals.Count())
{
args = args + ", ";
}
}
string query = "SELECT " + args + " FROM Users WHERE user_id = #user_id";
Console.WriteLine(query);
vars.Add("#user_id", user_id.ToString());
//-------------------------------------------------------------------------------------
//sends the connection, query, parameters, and query type (0 means I use a reader (SELECT), 1 means I use a non-query (DELETE etc.))
MySqlDataReader reader = SQLControler.SqlQuery(conn, query, vars, 0);
List<string> return_vals = new List<string>();
if (reader.Read())
{
//loops through the reader and adds the value to list
for (int i = 0; i < vals.Count(); i++)
{
//vals is a list of column names in the same order they will be returned
//I think this is where it's breaking but I'm not certain
return_vals.Add(reader[vals[i]].ToString());
}
reader.Close();
return return_vals;
}
else
{
throw new Exception("no data");
}
}
}
class Student : User
{
public Student(MySqlConnection conn, int user_id)
{
Console.WriteLine("student created");
//list of the data I want to retrieve from the db
//must be the column names
List<string> vals = new List<string> { "user_forename", "user_surname", "user_role", "user_status"};
//should return a list with the values in the specified columns from the user with the matching id
List<string> return_vals = base.GetValues(conn, vals, user_id);
//for some reason I am getting back the column names rather than the values in the fields
foreach(var v in return_vals)
{
Console.WriteLine(v);
}
}
}
What I have tried:
- Using GetString
- Using an index rather than column names
- Specifying a specific column name
- Using while (reader.Read())
- Requesting a different number of columns
I have used this method during the login section and it works perfectly there (code below). I can't figure out why it doesn't work here (code above) as well.
static Boolean Login(MySqlConnection conn)
{
Console.Write("Username: ");
string username = Console.ReadLine();
Console.Write("Password: ");
string password = Console.ReadLine();
string query = "SELECT user_id, username, password FROM Users WHERE username=#username";
Dictionary<string, string> vars = new Dictionary<string, string>
{
["#username"] = username
};
MySqlDataReader reader = SQLControler.SqlQuery(conn, query, vars, 0);
Boolean valid_login = ValidLogin(reader, password);
return (valid_login);
}
static Boolean ValidLogin(MySqlDataReader reader, string password)
{
Boolean return_val;
if (reader.Read())
{
//currently just returns the password as is, I will implement the hashing later
password = PasswordHash(password);
if (password == reader["password"].ToString())
{
MainProgram.user_id = Convert.ToInt32(reader["user_id"]);
return_val = true;
}
else
{
return_val = false;
}
}
else
{
return_val = false;
}
reader.Close();
return return_val;
}
The problem is here:
string args = "";
for (int i = 0; i < vals.Count(); i++)
{
args = args + "#" + vals[i];
vars.Add("#" + vals[i], vals[i]);
// ...
}
string query = "SELECT " + args + " FROM Users WHERE user_id = #user_id";
This builds a query that looks like:
SELECT @user_forename, @user_surname, @user_role, @user_status FROM Users WHERE user_id = @user_id;
Meanwhile, vars.Add("@" + vals[i], vals[i]); ends up mapping @user_forename to the string "user_forename" in the MySqlParameterCollection for the query. Your query therefore selects the (constant) values of those parameters for each row in the database, which is why you get the column names back.
The solution is:
Don't prepend @ to the column names you're selecting.
Don't add the column names as variables to the query.
You can do this by replacing that whole loop with:
string args = string.Join(", ", vals);
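Putting it together, the query-building section of GetValues reduces to this sketch (variable names as in the question):
// Column names go into the SQL text directly; only user_id remains a parameter.
string args = string.Join(", ", vals);
string query = "SELECT " + args + " FROM Users WHERE user_id = @user_id";
vars.Add("@user_id", user_id.ToString());
// e.g. SELECT user_forename, user_surname, user_role, user_status FROM Users WHERE user_id = @user_id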

Bulk insert/Update with Petapoco

I'm using the Save() method to insert or update records, but I would like to make it perform a bulk insert and bulk update with only one database hit. How do I do this?
In my case, I took advantage of the Database.Execute() method.
I created a Sql object with the first part of my insert:
var sql = new Sql("insert into myTable(Name, Age, Gender) values");
for (int i = 0; i < pocos.Count ; ++i)
{
var p = pocos[i];
sql.Append("(#0, #1, #2)", p.Name, p.Age , p.Gender);
if(i != pocos.Count -1)
sql.Append(",");
}
Database.Execute(sql);
I tried two different methods for inserting a large quantity of rows faster than the default Insert (which is pretty slow when you have a lot of rows).
1) Building up a List<T> of the pocos first and then inserting them in a loop (inside a transaction):
using (var tr = PetaPocoDb.GetTransaction())
{
foreach (var record in listOfRecords)
{
PetaPocoDb.Insert(record);
}
tr.Complete();
}
2) SqlBulkCopy a DataTable:
var bulkCopy = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock);
bulkCopy.DestinationTableName = "SomeTable";
bulkCopy.WriteToServer(dt);
To get my List<T> into a DataTable I used Marc Gravell's "Convert generic List/Enumerable to DataTable?" function, which worked out of the box for me (after I rearranged the POCO properties to be in the exact same order as the table fields in the DB).
SqlBulkCopy was fastest: 50% or so faster than the transaction method in the (quick) perf tests I did with ~1,000 rows.
HTH
Insert in one SQL query is much faster.
Here is a custom method for the PetaPoco.Database class that adds the ability to do a bulk insert of any collection:
public void BulkInsertRecords<T>(IEnumerable<T> collection)
{
try
{
OpenSharedConnection();
using (var cmd = CreateCommand(_sharedConnection, ""))
{
var pd = Database.PocoData.ForType(typeof(T));
var tableName = EscapeTableName(pd.TableInfo.TableName);
string cols = string.Join(", ", (from c in pd.QueryColumns select tableName + "." + EscapeSqlIdentifier(c)).ToArray());
var pocoValues = new List<string>();
var index = 0;
foreach (var poco in collection)
{
var values = new List<string>();
foreach (var i in pd.Columns)
{
values.Add(string.Format("{0}{1}", _paramPrefix, index++));
AddParam(cmd, i.Value.GetValue(poco), _paramPrefix);
}
pocoValues.Add("(" + string.Join(",", values.ToArray()) + ")");
}
var sql = string.Format("INSERT INTO {0} ({1}) VALUES {2}", tableName, cols, string.Join(", ", pocoValues));
cmd.CommandText = sql;
cmd.ExecuteNonQuery();
}
}
finally
{
CloseSharedConnection();
}
}
Here is an updated version of Steve Jansen's answer that splits the collection into chunks so that no batch exceeds SQL Server's 2,100-parameter limit.
I commented out the following code as it produces duplicates in the database...
//using (var reader = cmd.ExecuteReader())
//{
// while (reader.Read())
// {
// inserted.Add(reader[0]);
// }
//}
Updated Code
/// <summary>
/// Performs an SQL Insert against a collection of pocos
/// </summary>
/// <param name="pocos">A collection of POCO objects that specifies the column values to be inserted. Assumes that every POCO is of the same type.</param>
/// <returns>An array of the auto allocated primary key of the new record, or null for non-auto-increment tables</returns>
public object BulkInsert(IEnumerable<object> pocos)
{
Sql sql;
IList<PocoColumn> columns = new List<PocoColumn>();
IList<object> parameters;
IList<object> inserted;
PocoData pd;
Type primaryKeyType;
object template;
string commandText;
string tableName;
string primaryKeyName;
bool autoIncrement;
int maxBulkInsert;
if (null == pocos)
{
return new object[] { };
}
template = pocos.First<object>();
if (null == template)
{
return null;
}
pd = PocoData.ForType(template.GetType());
tableName = pd.TableInfo.TableName;
primaryKeyName = pd.TableInfo.PrimaryKey;
autoIncrement = pd.TableInfo.AutoIncrement;
//Calculate the maximum chunk size
maxBulkInsert = 2100 / pd.Columns.Count;
IEnumerable<object> pocosToInsert = pocos.Take(maxBulkInsert);
IEnumerable<object> pocosRemaining = pocos.Skip(maxBulkInsert);
try
{
OpenSharedConnection();
try
{
var names = new List<string>();
var values = new List<string>();
var index = 0;
foreach (var i in pd.Columns)
{
// Don't insert result columns
if (i.Value.ResultColumn)
continue;
// Don't insert the primary key (except under oracle where we need bring in the next sequence value)
if (autoIncrement && primaryKeyName != null && string.Compare(i.Key, primaryKeyName, true) == 0)
{
primaryKeyType = i.Value.PropertyInfo.PropertyType;
// Setup auto increment expression
string autoIncExpression = _dbType.GetAutoIncrementExpression(pd.TableInfo);
if (autoIncExpression != null)
{
names.Add(i.Key);
values.Add(autoIncExpression);
}
continue;
}
names.Add(_dbType.EscapeSqlIdentifier(i.Key));
values.Add(string.Format("{0}{1}", _paramPrefix, index++));
columns.Add(i.Value);
}
string outputClause = String.Empty;
if (autoIncrement)
{
outputClause = _dbType.GetInsertOutputClause(primaryKeyName);
}
commandText = string.Format("INSERT INTO {0} ({1}){2} VALUES",
_dbType.EscapeTableName(tableName),
string.Join(",", names.ToArray()),
outputClause
);
sql = new Sql(commandText);
parameters = new List<object>();
string valuesText = string.Concat("(", string.Join(",", values.ToArray()), ")");
bool isFirstPoco = true;
var parameterCounter = 0;
foreach (object poco in pocosToInsert)
{
parameterCounter++;
parameters.Clear();
foreach (PocoColumn column in columns)
{
parameters.Add(column.GetValue(poco));
}
sql.Append(valuesText, parameters.ToArray<object>());
if (isFirstPoco && pocos.Count() > 1)
{
valuesText = "," + valuesText;
isFirstPoco = false;
}
}
inserted = new List<object>();
using (var cmd = CreateCommand(_sharedConnection, sql.SQL, sql.Arguments))
{
if (!autoIncrement)
{
DoPreExecute(cmd);
cmd.ExecuteNonQuery();
OnExecutedCommand(cmd);
PocoColumn pkColumn;
if (primaryKeyName != null && pd.Columns.TryGetValue(primaryKeyName, out pkColumn))
{
foreach (object poco in pocos)
{
inserted.Add(pkColumn.GetValue(poco));
}
}
return inserted.ToArray<object>();
}
object id = _dbType.ExecuteInsert(this, cmd, primaryKeyName);
if (pocosRemaining.Any())
{
return BulkInsert(pocosRemaining);
}
return id;
//using (var reader = cmd.ExecuteReader())
//{
// while (reader.Read())
// {
// inserted.Add(reader[0]);
// }
//}
//object[] primaryKeys = inserted.ToArray<object>();
//// Assign the ID back to the primary key property
//if (primaryKeyName != null)
//{
// PocoColumn pc;
// if (pd.Columns.TryGetValue(primaryKeyName, out pc))
// {
// index = 0;
// foreach (object poco in pocos)
// {
// pc.SetValue(poco, pc.ChangeType(primaryKeys[index]));
// index++;
// }
// }
//}
//return primaryKeys;
}
}
finally
{
CloseSharedConnection();
}
}
catch (Exception x)
{
if (OnException(x))
throw;
return null;
}
}
Below is a BulkInsert method of PetaPoco that expands on taylonr's very clever idea to use the SQL technique of inserting multiple rows via INSERT INTO tab(col1, col2) OUTPUT inserted.[ID] VALUES (@0, @1), (@2, @3), (@4, @5), ..., (@n-1, @n).
It also returns the auto-increment (identity) values of inserted records, which I don't believe happens in IvoTops' implementation.
NOTE: SQL Server 2012 (and below) has a limit of 2,100 parameters per query. (This is likely the source of the stack overflow exception referenced by Zelid's comment). You will need to manually split your batches up based on the number of columns that are not decorated as Ignore or Result. For example, a POCO with 21 columns should be sent in batch sizes of 99, or (2100 - 1) / 21. I may refactor this to dynamically split batches based on this limit for SQL Server; however, you will always see the best results by managing the batch size external to this method.
This method showed an approximate 50% gain in execution time over my previous technique of using a shared connection in a single transaction for all inserts.
This is one area where Massive really shines - Massive has a Save(params object[] things) that builds an array of IDbCommands, and executes each one on a shared connection. It works out of the box, and doesn't run into parameter limits.
/// <summary>
/// Performs an SQL Insert against a collection of pocos
/// </summary>
/// <param name="pocos">A collection of POCO objects that specifies the column values to be inserted. Assumes that every POCO is of the same type.</param>
/// <returns>An array of the auto allocated primary key of the new record, or null for non-auto-increment tables</returns>
/// <remarks>
/// NOTE: As of SQL Server 2012, there is a limit of 2100 parameters per query. This limitation does not seem to apply on other platforms, so
/// this method will allow more than 2100 parameters. See http://msdn.microsoft.com/en-us/library/ms143432.aspx
/// The name of the table, its primary key and whether it's an auto-allocated primary key are retrieved from the attributes of the first POCO in the collection
/// </remarks>
public object[] BulkInsert(IEnumerable<object> pocos)
{
Sql sql;
IList<PocoColumn> columns = new List<PocoColumn>();
IList<object> parameters;
IList<object> inserted;
PocoData pd;
Type primaryKeyType;
object template;
string commandText;
string tableName;
string primaryKeyName;
bool autoIncrement;
if (null == pocos)
return new object[] {};
template = pocos.First<object>();
if (null == template)
return null;
pd = PocoData.ForType(template.GetType());
tableName = pd.TableInfo.TableName;
primaryKeyName = pd.TableInfo.PrimaryKey;
autoIncrement = pd.TableInfo.AutoIncrement;
try
{
OpenSharedConnection();
try
{
var names = new List<string>();
var values = new List<string>();
var index = 0;
foreach (var i in pd.Columns)
{
// Don't insert result columns
if (i.Value.ResultColumn)
continue;
// Don't insert the primary key (except under oracle where we need bring in the next sequence value)
if (autoIncrement && primaryKeyName != null && string.Compare(i.Key, primaryKeyName, true) == 0)
{
primaryKeyType = i.Value.PropertyInfo.PropertyType;
// Setup auto increment expression
string autoIncExpression = _dbType.GetAutoIncrementExpression(pd.TableInfo);
if (autoIncExpression != null)
{
names.Add(i.Key);
values.Add(autoIncExpression);
}
continue;
}
names.Add(_dbType.EscapeSqlIdentifier(i.Key));
values.Add(string.Format("{0}{1}", _paramPrefix, index++));
columns.Add(i.Value);
}
string outputClause = String.Empty;
if (autoIncrement)
{
outputClause = _dbType.GetInsertOutputClause(primaryKeyName);
}
commandText = string.Format("INSERT INTO {0} ({1}){2} VALUES",
_dbType.EscapeTableName(tableName),
string.Join(",", names.ToArray()),
outputClause
);
sql = new Sql(commandText);
parameters = new List<object>();
string valuesText = string.Concat("(", string.Join(",", values.ToArray()), ")");
bool isFirstPoco = true;
foreach (object poco in pocos)
{
parameters.Clear();
foreach (PocoColumn column in columns)
{
parameters.Add(column.GetValue(poco));
}
sql.Append(valuesText, parameters.ToArray<object>());
if (isFirstPoco)
{
valuesText = "," + valuesText;
isFirstPoco = false;
}
}
inserted = new List<object>();
using (var cmd = CreateCommand(_sharedConnection, sql.SQL, sql.Arguments))
{
if (!autoIncrement)
{
DoPreExecute(cmd);
cmd.ExecuteNonQuery();
OnExecutedCommand(cmd);
PocoColumn pkColumn;
if (primaryKeyName != null && pd.Columns.TryGetValue(primaryKeyName, out pkColumn))
{
foreach (object poco in pocos)
{
inserted.Add(pkColumn.GetValue(poco));
}
}
return inserted.ToArray<object>();
}
// BUG: the following line reportedly causes duplicate inserts; need to confirm
//object id = _dbType.ExecuteInsert(this, cmd, primaryKeyName);
using(var reader = cmd.ExecuteReader())
{
while (reader.Read())
{
inserted.Add(reader[0]);
}
}
object[] primaryKeys = inserted.ToArray<object>();
// Assign the ID back to the primary key property
if (primaryKeyName != null)
{
PocoColumn pc;
if (pd.Columns.TryGetValue(primaryKeyName, out pc))
{
index = 0;
foreach(object poco in pocos)
{
pc.SetValue(poco, pc.ChangeType(primaryKeys[index]));
index++;
}
}
}
return primaryKeys;
}
}
finally
{
CloseSharedConnection();
}
}
catch (Exception x)
{
if (OnException(x))
throw;
return null;
}
}
Here is the code for BulkInsert that you can add to v5.01 PetaPoco.cs.
You can paste it somewhere close to the regular Insert, at line 1098.
You give it an IEnumerable of POCOs and it will send them to the database
in batches of x at a time. The code is 90% from the regular Insert.
I do not have a performance comparison; let me know :)
/// <summary>
/// Bulk inserts multiple rows to SQL
/// </summary>
/// <param name="tableName">The name of the table to insert into</param>
/// <param name="primaryKeyName">The name of the primary key column of the table</param>
/// <param name="autoIncrement">True if the primary key is automatically allocated by the DB</param>
/// <param name="pocos">The POCO objects that specifies the column values to be inserted</param>
/// <param name="batchSize">The number of POCOS to be grouped together for each database rounddtrip</param>
public void BulkInsert(string tableName, string primaryKeyName, bool autoIncrement, IEnumerable<object> pocos, int batchSize = 25)
{
try
{
OpenSharedConnection();
try
{
using (var cmd = CreateCommand(_sharedConnection, ""))
{
var pd = PocoData.ForObject(pocos.First(), primaryKeyName);
// Create list of columnnames only once
var names = new List<string>();
foreach (var i in pd.Columns)
{
// Don't insert result columns
if (i.Value.ResultColumn)
continue;
// Don't insert the primary key (except under oracle where we need bring in the next sequence value)
if (autoIncrement && primaryKeyName != null && string.Compare(i.Key, primaryKeyName, true) == 0)
{
// Setup auto increment expression
string autoIncExpression = _dbType.GetAutoIncrementExpression(pd.TableInfo);
if (autoIncExpression != null)
{
names.Add(i.Key);
}
continue;
}
names.Add(_dbType.EscapeSqlIdentifier(i.Key));
}
var namesArray = names.ToArray();
var values = new List<string>();
int count = 0;
do
{
cmd.CommandText = "";
cmd.Parameters.Clear();
var index = 0;
foreach (var poco in pocos.Skip(count).Take(batchSize))
{
values.Clear();
foreach (var i in pd.Columns)
{
// Don't insert result columns
if (i.Value.ResultColumn) continue;
// Don't insert the primary key (except under oracle where we need bring in the next sequence value)
if (autoIncrement && primaryKeyName != null && string.Compare(i.Key, primaryKeyName, true) == 0)
{
// Setup auto increment expression
string autoIncExpression = _dbType.GetAutoIncrementExpression(pd.TableInfo);
if (autoIncExpression != null)
{
values.Add(autoIncExpression);
}
continue;
}
values.Add(string.Format("{0}{1}", _paramPrefix, index++));
AddParam(cmd, i.Value.GetValue(poco), i.Value.PropertyInfo);
}
string outputClause = String.Empty;
if (autoIncrement)
{
outputClause = _dbType.GetInsertOutputClause(primaryKeyName);
}
cmd.CommandText += string.Format("INSERT INTO {0} ({1}){2} VALUES ({3})", _dbType.EscapeTableName(tableName),
string.Join(",", namesArray), outputClause, string.Join(",", values.ToArray()));
}
// Are we done?
if (cmd.CommandText == "") break;
count += batchSize;
DoPreExecute(cmd);
cmd.ExecuteNonQuery();
OnExecutedCommand(cmd);
}
while (true);
}
}
finally
{
CloseSharedConnection();
}
}
catch (Exception x)
{
if (OnException(x))
throw;
}
}
/// <summary>
/// Performs a SQL Bulk Insert
/// </summary>
/// <param name="pocos">The POCO objects that specifies the column values to be inserted</param>
/// <param name="batchSize">The number of POCOS to be grouped together for each database rounddtrip</param>
public void BulkInsert(IEnumerable<object> pocos, int batchSize = 25)
{
if (!pocos.Any()) return;
var pd = PocoData.ForType(pocos.First().GetType());
BulkInsert(pd.TableInfo.TableName, pd.TableInfo.PrimaryKey, pd.TableInfo.AutoIncrement, pocos, batchSize); // forward batchSize to the overload above
}
And in the same lines if you want BulkUpdate:
public void BulkUpdate<T>(string tableName, string primaryKeyName, IEnumerable<T> pocos, int batchSize = 25)
{
try
{
object primaryKeyValue = null;
OpenSharedConnection();
try
{
using (var cmd = CreateCommand(_sharedConnection, ""))
{
var pd = PocoData.ForObject(pocos.First(), primaryKeyName);
int count = 0;
do
{
cmd.CommandText = "";
cmd.Parameters.Clear();
var index = 0;
var cmdText = new StringBuilder();
foreach (var poco in pocos.Skip(count).Take(batchSize))
{
var sb = new StringBuilder();
var colIdx = 0;
foreach (var i in pd.Columns)
{
// Don't update the primary key, but grab the value if we don't have it
if (string.Compare(i.Key, primaryKeyName, true) == 0)
{
primaryKeyValue = i.Value.GetValue(poco);
continue;
}
// Don't update result-only columns
if (i.Value.ResultColumn)
continue;
// Build the sql
if (colIdx > 0)
sb.Append(", ");
sb.AppendFormat("{0} = {1}{2}", _dbType.EscapeSqlIdentifier(i.Key), _paramPrefix,
index++);
// Store the parameter in the command
AddParam(cmd, i.Value.GetValue(poco), i.Value.PropertyInfo);
colIdx++;
}
// Find the property info for the primary key
PropertyInfo pkpi = null;
if (primaryKeyName != null)
{
pkpi = pd.Columns[primaryKeyName].PropertyInfo;
}
cmdText.Append(string.Format("UPDATE {0} SET {1} WHERE {2} = {3}{4};\n",
_dbType.EscapeTableName(tableName), sb.ToString(),
_dbType.EscapeSqlIdentifier(primaryKeyName), _paramPrefix,
index++));
AddParam(cmd, primaryKeyValue, pkpi);
}
if (cmdText.Length == 0) break;
if (_providerName.IndexOf("oracle", StringComparison.OrdinalIgnoreCase) >= 0)
{
cmdText.Insert(0, "BEGIN\n");
cmdText.Append("\n END;");
}
DoPreExecute(cmd);
cmd.CommandText = cmdText.ToString();
count += batchSize;
cmd.ExecuteNonQuery();
OnExecutedCommand(cmd);
} while (true);
}
}
finally
{
CloseSharedConnection();
}
}
catch (Exception x)
{
if (OnException(x))
throw;
}
}
Here's a nice 2018 update using FastMember from NuGet:
private static async Task SqlBulkCopyPocoAsync<T>(PetaPoco.Database db, IEnumerable<T> data)
{
var pd = PocoData.ForType(typeof(T), db.DefaultMapper);
using (var bcp = new SqlBulkCopy(db.ConnectionString))
using (var reader = ObjectReader.Create(data))
{
// set up a mapping from the property names to the column names
var propNames = typeof(T).GetProperties().Where(p => Attribute.IsDefined(p, typeof(ResultColumnAttribute)) == false).Select(propertyInfo => propertyInfo.Name).ToArray();
foreach (var propName in propNames)
{
bcp.ColumnMappings.Add(propName, "[" + pd.GetColumnName(propName) + "]");
}
bcp.DestinationTableName = pd.TableInfo.TableName;
await bcp.WriteToServerAsync(reader).ConfigureAwait(false);
}
}
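Hypothetical usage, assuming db is your PetaPoco Database instance and records is a collection of mapped POCOs:
await SqlBulkCopyPocoAsync(db, records);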
You can just do a foreach on your records.
foreach (var record in records) {
db.Save(record);
}
