My application uses Entity Framework. Most days I feel fine about that, but I have trouble when I need to do bulk insertions. I've yet to find a way for EF to do these quickly, so I need a solution that goes outside of EF to do the inserts.
I want a method that takes a connection string, destination table name, and a generic array of source data, and performs the bulk insertions. Moreover, I'd like it to map the properties of the source data to the specific table fields, ideally without requiring attributes in the source object to designate the table field.
Thus, for this object:
public class Customer
{
    // property populated & used only in memory
    public string TempProperty { get; set; }

    // properties saved in the database
    public string Name { get; set; }
    public string Address { get; set; }
    public int Age { get; set; }
    public string Comments { get; set; }

    // various methods, constructors, etc.
}
I should be able to provide the table name Data.Customers, and the method should map Customer.Name -> Data.Customers.Name, Customer.Address -> Data.Customers.Address, etc.
I was surprised I couldn't find the code for something like this anywhere. Maybe this isn't the best approach to the problem? That possibility notwithstanding, here's the solution I came up with.
My starting outline was something like this:
Use the destination table and the source object type to create a mapping object via reflection.
Iterate through the source objects, applying the mapper, to generate the insertion data.
Insert the data into the destination table using the System.Data.SqlClient.SqlBulkCopy object.
When I see an outline like this, I try to rephrase it in terms of data types, because really all I'm doing is translating my input (T[]) into something that SqlBulkCopy will accept. But to do this, I need to decide on how to map fields.
Here's what I settled on:
var mapper = new List<Tuple<string, Func<object, string, object>>>();
I chose this to represent a table that looks like this:
+------------+----------------------------------------------+
| Field Name | Mapping Function |
+------------+----------------------------------------------+
| Name | Customer.Name -> Data.Customers.Name |
| Address | Customer.Address -> Data.Customers.Address |
| Age | Customer.Age -> Data.Customers.Age |
| Comments | Customer.Comments -> Data.Customers.Comments |
+------------+----------------------------------------------+
The fact that it's a list represents the rows. That leaves us with Tuple<string, Func<object, string, object>>, which acts as a sort of indexed dictionary: a given string (the field name) maps to a function that, when given a source object T and a source field name (e.g. Address), retrieves the appropriate value. If no corresponding property is found for a table field, we'll just return null.
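To make that shape concrete, here is a single hand-built entry for the Age field (illustration only; the real code below builds these from the table schema):
// a mapper entry pairs a column name with a function that extracts the value
var ageEntry = new Tuple<string, Func<object, string, object>>(
    "Age",
    (src, fld) => typeof(Customer).GetProperty(fld).GetValue(src));
// invoking the entry pulls Customer.Age off a source instance
object ageValue = ageEntry.Item2(new Customer { Age = 42 }, ageEntry.Item1);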
After validating that the input values are valid (connection valid, table exists, etc.), we'll create our mapping object:
// get all the column names for the table to build the mapping object
SqlCommand command = new SqlCommand($"SELECT TOP 1 * FROM {foundTableName}", conn);
SqlDataReader reader = command.ExecuteReader();

// build the mapper by iterating through the table's columns and looking for a matching property
var mapper = new List<Tuple<string, Func<object, string, object>>>();
foreach (DataRow col in reader.GetSchemaTable().Rows)
{
    // get column information
    string columnName = col.Field<string>("ColumnName");
    PropertyInfo property = typeof(T).GetProperty(columnName);
    Func<object, string, object> map;
    if (property == null)
    {
        // no matching property: fine for nullable columns, fatal otherwise
        // (GetSchemaTable exposes nullability as "AllowDBNull")
        bool nullable = col.Field<bool>("AllowDBNull");
        if (!nullable)
            return $"No corresponding property found for non-nullable field '{columnName}'.";
        map = (a, b) => null;
    }
    else
    {
        // capture the PropertyInfo so each row doesn't repeat the reflection lookup
        map = (src, fld) => property.GetValue(src);
    }
    mapper.Add(new Tuple<string, Func<object, string, object>>(columnName, map));
}
The SqlBulkCopy object accepts DataRow[] as an input, so from here we can simply create template DataRow objects from the destination table and fill them from our mapper:
// get all the data
int dataCount = sourceData.Length;
var rows = new DataRow[dataCount];
DataTable destTableDT = new DataTable();
destTableDT.Load(reader);
for (int x = 0; x < dataCount; x++)
{
    var dataRow = destTableDT.NewRow();
    dataRow.ItemArray = mapper.Select(m => m.Item2.Invoke(sourceData[x], m.Item1))
                              .ToArray();
    rows[x] = dataRow;
}
Then finish it out by writing the data:
//set up the bulk copy connection
SqlBulkCopy sbc = new SqlBulkCopy(conn, SqlBulkCopyOptions.TableLock |
SqlBulkCopyOptions.UseInternalTransaction, null);
sbc.DestinationTableName = foundTableName;
sbc.BatchSize = BATCH_SIZE;
sbc.WriteToServer(rows);
And that's it! Works like a charm, and ran too fast for me to bother benchmarking (EF was taking several minutes to run these imports).
I should note that I left out a lot of my error checking code and will probably be adding more. However, if you'd like to see my class in its entirety, here it is:
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.Common;
using System.Data.SqlClient;
using System.Linq;
using System.Reflection;

namespace DatabaseUtilities
{
    public static class RapidDataTools
    {
        const int SCHEMA_SCHEMA_NAME = 1;
        const int SCHEMA_TABLE_NAME = 2;
        const int BATCH_SIZE = 1000;

        /// <summary>
        /// Imports an array of data into a specified table. It does so by mapping object properties
        /// to table columns. Only properties with the same name as the column name will be copied;
        /// other columns will be left null. Non-nullable columns with no corresponding property will
        /// return an error.
        /// </summary>
        /// <param name="connectionString"></param>
        /// <param name="destTableName">Qualified table name (e.g. Admin.Table)</param>
        /// <param name="sourceData"></param>
        /// <returns>An empty string on success; otherwise an error message.</returns>
        public static string Import<T>(string connectionString, string destTableName, T[] sourceData)
        {
            // get destination table qualified name
            string[] tableParts = destTableName.Split('.');
            if (tableParts.Length != 2) return $"Invalid or unqualified destination table name: {destTableName}.";
            string destSchema = tableParts[0];
            string destTable = tableParts[1];

            // create the database connection
            SqlConnection conn = GetConnection(connectionString);
            if (conn == null) return "Invalid connection string.";

            // establish connection
            try { conn.Open(); }
            catch { return "Could not connect to database using provided connection string."; }

            // make sure the requested table exists
            string foundTableName = string.Empty;
            foreach (DataRow row in conn.GetSchema("Tables").Rows)
                if (row[SCHEMA_SCHEMA_NAME].ToString().Equals(destSchema, StringComparison.CurrentCultureIgnoreCase) &&
                    row[SCHEMA_TABLE_NAME].ToString().Equals(destTable, StringComparison.CurrentCultureIgnoreCase))
                {
                    foundTableName = $"{row[SCHEMA_SCHEMA_NAME]}.{row[SCHEMA_TABLE_NAME]}";
                    break;
                }
            if (foundTableName == string.Empty) return $"Specified table '{destTableName}' could not be found in the database.";

            // get all the column names for the table to build the mapping object
            SqlCommand command = new SqlCommand($"SELECT TOP 1 * FROM {foundTableName}", conn);
            SqlDataReader reader = command.ExecuteReader();

            // build the mapper by iterating through the columns and looking for a matching property
            var mapper = new List<Tuple<string, Func<object, string, object>>>();
            foreach (DataRow col in reader.GetSchemaTable().Rows)
            {
                // get column information
                string columnName = col.Field<string>("ColumnName");
                PropertyInfo property = typeof(T).GetProperty(columnName);
                Func<object, string, object> map;
                if (property == null)
                {
                    // no matching property: fine for nullable columns, fatal otherwise
                    // (GetSchemaTable exposes nullability as "AllowDBNull")
                    bool nullable = col.Field<bool>("AllowDBNull");
                    if (!nullable)
                        return $"No corresponding property found for non-nullable field '{columnName}'.";
                    map = (a, b) => null;
                }
                else
                {
                    // capture the PropertyInfo so each row doesn't repeat the reflection lookup
                    map = (src, fld) => property.GetValue(src);
                }
                mapper.Add(new Tuple<string, Func<object, string, object>>(columnName, map));
            }

            // get all the data
            int dataCount = sourceData.Length;
            var rows = new DataRow[dataCount];
            DataTable destTableDT = new DataTable();
            destTableDT.Load(reader);
            for (int x = 0; x < dataCount; x++)
            {
                var dataRow = destTableDT.NewRow();
                dataRow.ItemArray = mapper.Select(m => m.Item2.Invoke(sourceData[x], m.Item1)).ToArray();
                rows[x] = dataRow;
            }

            // close the old connection
            conn.Close();

            // set up the bulk copy connection
            SqlBulkCopy sbc = new SqlBulkCopy(conn, SqlBulkCopyOptions.TableLock | SqlBulkCopyOptions.UseInternalTransaction, null);
            sbc.DestinationTableName = foundTableName;
            sbc.BatchSize = BATCH_SIZE;

            // establish connection
            try { conn.Open(); }
            catch { return "Failed to re-establish connection to the database after reading data."; }

            // write data
            try { sbc.WriteToServer(rows); }
            catch (Exception ex) { return $"Batch write failed. Details: {ex.Message} - {ex.StackTrace}"; }

            // if we got here, everything worked!
            return string.Empty;
        }

        private static SqlConnection GetConnection(string connectionString)
        {
            DbConnectionStringBuilder csb = new DbConnectionStringBuilder();
            try { csb.ConnectionString = connectionString; }
            catch { return null; }
            return new SqlConnection(csb.ConnectionString);
        }
    }
}
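Calling it is then a one-liner. Here's a sketch of a call site (the connection string and sample data are placeholders):
var customers = new[]
{
    new Customer { Name = "Ada", Address = "1 Main St", Age = 36, Comments = "" },
    new Customer { Name = "Brian", Address = "2 Oak Ave", Age = 41, Comments = "repeat buyer" }
};

// Import returns string.Empty on success, or an error message on failure
string result = RapidDataTools.Import(
    "Server=.;Database=MyDb;Integrated Security=SSPI;",
    "Data.Customers", customers);
if (result != string.Empty)
    Console.WriteLine(result);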
Take a look at the NuGet package "EntityFramework.Utilities"; you can add it to your project, and it has features for exactly this. You'll have to do some recoding, but it will probably benefit you in the end.
References:
https://www.nuget.org/packages/EFUtilities/ (package)
https://github.com/MikaelEliasson/EntityFramework.Utilities (source + examples)
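For what it's worth, the batch insert API in that package looks roughly like the sketch below (MyContext and its Customers set are placeholder names; check the project's examples for the exact calls):
using EntityFramework.Utilities;

using (var db = new MyContext())
{
    // bypasses the EF change tracker and bulk-loads the rows
    EFBatchOperation.For(db, db.Customers).InsertAll(customerList);
}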
There are basically three approaches for performing bulk inserts in Entity Framework:
Entity Framework Extensions (Recommended)
EntityFramework.BulkInsert (Limited and unsupported)
SqlBulkCopy (Complex to make it work)
The first two libraries add extension methods to the DbContext:
using (var ctx = new EntitiesContext())
{
    ctx.BulkInsert(list);
}
You can learn more about these three techniques in this article.
Disclaimer: I'm the owner of the project Entity Framework Extensions
I've written this program:
using System;
using System.Data;

public class Program
{
    public static void Main()
    {
        using (var dt = new DataTable())
        {
            // the data in the table - one row, one column
            var person = new
            {
                Name = "ABC",
                Age = 24
            };

            // set up the data table
            dt.Columns.Add("Column1");
            var row = dt.NewRow();
            row["Column1"] = person;
            dt.Rows.Add(row);

            // assert
            var storedPerson = dt.Rows[0]["Column1"];
            if (!ReferenceEquals(dt.Rows[0].ItemArray[0], storedPerson))
                throw new Exception();
            if (!ReferenceEquals(storedPerson, person))
            {
                Console.WriteLine("What is going on?");
                Console.WriteLine($"Person is stored as a [{storedPerson.GetType().Name}]!");
                Console.WriteLine($"Why is it not of type: [{person.GetType().Name}]?");
            }
        }
    }
}
Which you can run here: https://dotnetfiddle.net/LzHPs1
I would expect no exceptions and no output, but I actually get:
What is going on?
Person is stored as a [String]!
Why is it not of type: [<>f__AnonymousType0`2]?
Is the data table class supposed to call ToString() on the objects when they get inserted into the row?
Can I avoid it and store a reference to the object? If I can, how? And if I can't, surely Microsoft could have constrained the assignment of the row value to accept only strings, made the indexer return string, and made ItemArray a string array, avoiding all this confusion?
That's because you added Column1 without specifying its type, so by default it is typed as string.
Try to explicitly set it as Object instead:
dt.Columns.Add("Column1", typeof(Object));
See Source
You can avoid it just by defining the data type of the column when you add it to your DataTable, i.e.
dt.Columns.Add("Column1", typeof(object));
If you don't specify the column's data type, it will default to string. Reference Article
Try this code and you will see the magic. :)
using System;
using System.Data;

public class Program
{
    public static void Main()
    {
        using (var dt = new DataTable())
        {
            // the data in the table - one row, one column
            var person = new
            {
                Name = "ABC",
                Age = 24
            };

            // set up the data table - note the explicit column type
            dt.Columns.Add("Column1", typeof(object));
            var row = dt.NewRow();
            row["Column1"] = person;
            dt.Rows.Add(row);

            // assert
            var storedPerson = dt.Rows[0]["Column1"];
            if (!ReferenceEquals(dt.Rows[0].ItemArray[0], storedPerson))
                throw new Exception();
            if (!ReferenceEquals(storedPerson, person))
            {
                Console.WriteLine("What is going on?");
                Console.WriteLine($"Person is stored as a [{storedPerson.GetType().Name}]!");
                Console.WriteLine($"Why is it not of type: [{person.GetType().Name}]?");
            }
        }
    }
}
Which you can run here: https://dotnetfiddle.net/y84NCh (Online .Net Compiler)
I have my classes as the following:
class SalesInvoice
{
    public int id { get; set; }
    public string number { get; set; }
    public List<InvoiceItems> items { get; set; }
}
class InvoiceItems
{
    public int id { get; set; }
    public string item_name { get; set; }
    public int price { get; set; }
}
My application is an agent that can connect to any database specified in the config file and execute a certain query. In my case it will execute the following query against one client DB: select id, number, items.id as items_id, item_name, price from transaction left join transaction_details on transactions.id = transaction_details.transaction_id
Let's say I got the data using a SqlDataReader, but I am open to other solutions as well.
What I am looking for is a way to map the results from the SqlDataReader to a list of SalesInvoice objects.
The issue I am facing is that if a transaction has a list of transaction_details, the data reader returns them to me as separate rows.
You have a choice to make: you can return the data in one result set or two. The first option returns the transaction data duplicated on every detail row (see below), so you will need to handle the duplicate transactions when reading.
select id,
number,
items.id as items_id,
item_name, price
from transaction
left join transaction_details
on transactions.id = transaction_details.transaction_id
Or return the data in two result sets (transaction and transaction detail). I am assuming that you are interested in data for a particular transaction rather than many different transactions. ADO.NET allows multiple result sets to be returned from a single command. You would need to map each result set to its object type; it's simply a case of newing up a transaction and assigning properties, as sketched below.
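A minimal sketch of the two-result-set approach, assuming an open SqlConnection conn and the classes from the question (table and column names follow the question's query):
var invoices = new Dictionary<int, SalesInvoice>();
using (var cmd = new SqlCommand(
    "select id, number from transactions; " +
    "select id, transaction_id, item_name, price from transaction_details;", conn))
using (var reader = cmd.ExecuteReader())
{
    // first result set: the transactions themselves
    while (reader.Read())
        invoices[reader.GetInt32(0)] = new SalesInvoice
        {
            id = reader.GetInt32(0),
            number = reader.GetString(1),
            items = new List<InvoiceItems>()
        };

    // second result set: the detail rows, stitched onto their parents
    reader.NextResult();
    while (reader.Read())
        invoices[reader.GetInt32(1)].items.Add(new InvoiceItems
        {
            id = reader.GetInt32(0),
            item_name = reader.GetString(2),
            price = reader.GetInt32(3)
        });
}
var result = invoices.Values.ToList();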
To solve this issue, I made my semi-general mapper. It does the following:
private List<T> mappingFun<T>(IEnumerable<Dictionary<string, object>> args, object propListObj = null) where T : Entity
{
    List<T> listObject = new List<T>();
    foreach (var arg in args)
    {
        T returnedObject = Activator.CreateInstance<T>();
        PropertyInfo[] modelProperties = returnedObject.GetType().GetProperties();

        // reuse an already-added object if one with the same name exists
        var addedObject = listObject.FirstOrDefault(x => x.name == arg[returnedObject.GetType().Name + "_name"].ToString());
        if (addedObject != null)
        {
            returnedObject = addedObject;
        }

        foreach (var prop in modelProperties)
        {
            if (prop.PropertyType == typeof(String))
            {
                // simple property: read "<TypeName>_<PropertyName>" from the row dictionary
                prop.SetValue(returnedObject, arg[returnedObject.GetType().Name + "_" + prop.Name].ToString());
            }
            else
            {
                // collection property: recurse into mappingFun for the element type
                MethodInfo mapperMethod = GetType().GetMethod("mappingFun",
                    BindingFlags.Instance | BindingFlags.NonPublic, null,
                    new Type[] { typeof(IEnumerable<Dictionary<string, object>>), typeof(object) }, null);
                var propModuleObj = mapperMethod
                    .MakeGenericMethod(prop.PropertyType.GetGenericArguments().Single())
                    .Invoke(this, new object[] { new List<Dictionary<string, object>>() { arg }, prop.GetValue(returnedObject) });
                prop.SetValue(returnedObject, propModuleObj);
            }
        }

        listObject.AddIfNotExist(returnedObject);
        if (propListObj != null)
            listObject.AddRange((List<T>)propListObj);
    }
    return listObject;
}
This helps 100%
Now I'm using Dapper + Dapper.Extensions. And yes, it's easy and awesome. But I've hit a problem: Dapper.Extensions has only an Insert command and no InsertUpdateOnDuplicateKey. I want to add such a method, but I don't see a good way to do it:
I want to make this method generic, like Insert.
I can't get the cached list of properties for a particular type, and I don't want to use reflection directly to build the raw SQL.
A possible way would be to fork it on GitHub, but I want to keep the change in my project only. Does anybody know how to extend it? I understand this feature ("insert ... update on duplicate key") is supported only in MySQL, but I can't find extension points in DapperExtensions to add this functionality from the outside.
Update: this is my fork https://github.com/MaximTkachenko/Dapper-Extensions/commits/master
This piece of code has helped me enormously in MySQL-related projects; I definitely owe you one.
I do a lot of database-related development on both MySQL and MS SQL. I also try to share as much code as possible between my projects.
MS SQL has no direct equivalent for "ON DUPLICATE KEY UPDATE", so I was previously unable to use this extension when working with MS SQL.
While migrating a web application (that leans heavily on this Dapper.Extensions tweak) from MySQL to MS SQL, I finally decided to do something about it.
This code uses the "IF EXISTS => UPDATE ELSE INSERT" approach that basically does the same as "ON DUPLICATE KEY UPDATE" on MySQL.
Please note: the snippet assumes that you are taking care of transactions outside this method. Alternatively you could append "BEGIN TRAN" to the beginning and "COMMIT" to the end of the generated sql string.
public static class SqlGeneratorExt
{
    public static string InsertUpdateOnDuplicateKey(this ISqlGenerator generator, IClassMapper classMap, bool hasIdentityKeyWithValue = false)
    {
        var columns = classMap.Properties.Where(p => !(p.Ignored || p.IsReadOnly || (p.KeyType == KeyType.Identity && !hasIdentityKeyWithValue))).ToList();
        var keys = columns.Where(c => c.KeyType != KeyType.NotAKey).Select(p => $"{generator.GetColumnName(classMap, p, false)}=@{p.Name}");
        var nonkeycolumns = classMap.Properties.Where(p => !(p.Ignored || p.IsReadOnly) && p.KeyType == KeyType.NotAKey).ToList();
        if (!columns.Any())
        {
            throw new ArgumentException("No columns were mapped.");
        }
        var tablename = generator.GetTableName(classMap);
        var columnNames = columns.Select(p => generator.GetColumnName(classMap, p, false));
        var parameters = columns.Select(p => generator.Configuration.Dialect.ParameterPrefix + p.Name);
        var valuesSetters = nonkeycolumns.Select(p => $"{generator.GetColumnName(classMap, p, false)}=@{p.Name}").ToList();
        var where = keys.AppendStrings(seperator: " and ");

        var sqlbuilder = new StringBuilder();
        sqlbuilder.AppendLine($"IF EXISTS (select * from {tablename} WITH (UPDLOCK, HOLDLOCK) WHERE ({where})) ");
        sqlbuilder.AppendLine(valuesSetters.Any() ? $"UPDATE {tablename} SET {valuesSetters.AppendStrings()} WHERE ({where}) " : "SELECT 0 ");
        sqlbuilder.AppendLine($"ELSE INSERT INTO {tablename} ({columnNames.AppendStrings()}) VALUES ({parameters.AppendStrings()}) ");
        return sqlbuilder.ToString();
    }
}
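Usage presumably mirrors the MySQL version further down: generate the SQL once from the class map, then hand it to Dapper together with the entities. A sketch, wrapped in a hypothetical helper (generator and connection come from your DapperExtensions setup):
public static void Upsert<T>(ISqlGenerator generator, IDbConnection connection, IEnumerable<T> entities) where T : class
{
    // the class map is cached by the DapperExtensions configuration
    IClassMapper classMap = generator.Configuration.GetMap<T>();
    string sql = generator.InsertUpdateOnDuplicateKey(classMap);
    connection.Execute(sql, entities, null, null, CommandType.Text);
}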
Actually, I closed my pull request and removed my fork because:
I see some open pull requests created in 2014.
I found a way to "inject" my code into Dapper.Extensions.
To restate my problem: I want to create more generic queries for Dapper.Extensions, which means I need access to the mapping cache for entities, the SqlGenerator, and so on. So here is my way. I want to add the ability to do INSERT ... ON DUPLICATE KEY UPDATE for MySQL. I created an extension method for ISqlGenerator:
public static class SqlGeneratorExt
{
    public static string InsertUpdateOnDuplicateKey(this ISqlGenerator generator, IClassMapper classMap)
    {
        var columns = classMap.Properties.Where(p => !(p.Ignored || p.IsReadOnly || p.KeyType == KeyType.Identity));
        if (!columns.Any())
        {
            throw new ArgumentException("No columns were mapped.");
        }
        var columnNames = columns.Select(p => generator.GetColumnName(classMap, p, false));
        var parameters = columns.Select(p => generator.Configuration.Dialect.ParameterPrefix + p.Name);
        var valuesSetters = columns.Select(p => string.Format("{0}=VALUES({1})", generator.GetColumnName(classMap, p, false), p.Name));
        string sql = string.Format("INSERT INTO {0} ({1}) VALUES ({2}) ON DUPLICATE KEY UPDATE {3}",
            generator.GetTableName(classMap),
            columnNames.AppendStrings(),
            parameters.AppendStrings(),
            valuesSetters.AppendStrings());
        return sql;
    }
}
One more extension method for IDapperImplementor
public static class DapperImplementorExt
{
    public static void InsertUpdateOnDuplicateKey<T>(this IDapperImplementor implementor, IDbConnection connection, IEnumerable<T> entities, int? commandTimeout = null) where T : class
    {
        IClassMapper classMap = implementor.SqlGenerator.Configuration.GetMap<T>();
        var properties = classMap.Properties.Where(p => p.KeyType != KeyType.NotAKey);
        string emptyGuidString = Guid.Empty.ToString();
        foreach (var e in entities)
        {
            foreach (var column in properties)
            {
                if (column.KeyType == KeyType.Guid)
                {
                    object value = column.PropertyInfo.GetValue(e, null);
                    string stringValue = value.ToString();
                    if (!string.IsNullOrEmpty(stringValue) && stringValue != emptyGuidString)
                    {
                        continue;
                    }
                    // assign a sequential GUID to new rows, matching the Insert behaviour
                    Guid comb = implementor.SqlGenerator.Configuration.GetNextGuid();
                    column.PropertyInfo.SetValue(e, comb, null);
                }
            }
        }
        string sql = implementor.SqlGenerator.InsertUpdateOnDuplicateKey(classMap);
        connection.Execute(sql, entities, null, commandTimeout, CommandType.Text);
    }
}
Now I can create a new class derived from the Database class to use my own SQL:
public class Db : Database
{
    private readonly IDapperImplementor _dapperIml;

    public Db(IDbConnection connection, ISqlGenerator sqlGenerator) : base(connection, sqlGenerator)
    {
        _dapperIml = new DapperImplementor(sqlGenerator);
    }

    public void InsertUpdateOnDuplicateKey<T>(IEnumerable<T> entities, int? commandTimeout) where T : class
    {
        _dapperIml.InsertUpdateOnDuplicateKey(Connection, entities, commandTimeout);
    }
}
Yeah, it's necessary to create another DapperImplementor instance because the one in the base class is private :(. So now I can use my Db class to call my own generic SQL queries as well as the native queries from Dapper.Extensions. Examples of using the Database class instead of the IDbConnection extensions can be found here.
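Wiring it up might look like this (a sketch; the DapperExtensionsConfiguration arguments shown are assumptions based on the library's defaults):
var config = new DapperExtensionsConfiguration(
    typeof(AutoClassMapper<>),   // default class mapper (assumption)
    new List<Assembly>(),
    new MySqlDialect());
var db = new Db(connection, new SqlGeneratorImpl(config));

// generic "insert or update on duplicate key" for any mapped entity type
db.InsertUpdateOnDuplicateKey(entities, commandTimeout: 300);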
I have an entity (Entity Framework 6) which maps to a table which has 100 columns named col_1, col_2 etc. The table is an existing legacy table which is used to interface to another system, so it's very generic and will never change.
So the entity has 100 properties which map to those columns.
public class DataEntity
{
    public string Column1 { get; set; }
    ...
}
I then have data like this:
var columnNumber = 1;
var data = "The data";
How can I set col_1 to "The data" without a long case statement?
var dataEntity = new DataEntity();
This is what I don't want to do:
switch (columnNumber)
{
    case 1:
        dataEntity.Column1 = data;
        break;
}
Note that it is not possible to change the structure of the entity.
If you absolutely cannot change the structure of the data and you do not want to use a switch statement, I'd probably look into some implementation of a strategy pattern:
http://www.dofactory.com/net/strategy-design-pattern
If you do not want to go that route, you could use reflection to set the values of the entity by following a convention where the columns will always be named "col_1", etc., but this is extremely fragile.
var myEntity = new MyEntity();
var value = "The data";
var columnNumber = 1;

// look up the property by the Column1..Column100 naming convention from the question
PropertyInfo propertyInfo = myEntity.GetType().GetProperty(string.Format("Column{0}", columnNumber));
propertyInfo.SetValue(myEntity, Convert.ChangeType(value, propertyInfo.PropertyType), null);
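If this runs in a tight loop, you could also pay the reflection cost once per type instead of per call. A sketch, assuming the Column1..Column100 property convention from the question:
// built once; maps a column number to its cached PropertyInfo
private static readonly Dictionary<int, PropertyInfo> ColumnProperties =
    Enumerable.Range(1, 100)
              .ToDictionary(i => i, i => typeof(DataEntity).GetProperty("Column" + i));

// later, setting a column by number costs only a dictionary lookup
ColumnProperties[columnNumber].SetValue(dataEntity, data, null);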
I would definitely use an ExpandoObject for pushing anything I want into a dummy object, then map it to a specific instance of your class...
The mapper code you can get it from this answer:
https://codereview.stackexchange.com/questions/1002/mapping-expandoobject-to-another-object-type
Let's say you have a class representing your holy-mother-of-100columns-database-tables
class LegacyTable
{
    public string Column1 { get; set; }
    public string Column2 { get; set; }
    public string Column3 { get; set; }
    // More to come
}
Now you can just work easily on a dynamic object.
dynamic expando = new ExpandoObject();
expando.Column1 = "Data 1";
expando.Column2 = "Data 2";
// Etcetera
You could also cast the expando to a dictionary for fast access (especially if your properties follow a pattern like "Column" + number):
var p = expando as IDictionary<String, object>;
p["Column" + 1] = "Data 1";
p["Column" + 2] = "Data 2";
p["Column" + 3] = "Data 3";
Now you can use the mapper to pass your props to an actual instance of the legacy class.
LegacyTable legacy = new LegacyTable();
Mapper<LegacyTable>.Map(expando, legacy);
It just works...
I am executing a stored procedure from C# code
private void DataToGrid()
{
    DataTable dt = new DataTable();
    string[,] aryPara = new string[,]
    {
        { "@pClassId", "10" },
        { "@pMediumId", "11" },
        { "@pStreamId", "12" },
        { "@pSessionId", "13" },
        { "@pSectionId", "15" },
        { "@pShiftId", "16" },
        { "@pDateId", "17" }
    };
    dt = CallStoredProcedureUsingAdapterFillDataTable("ssspAtdDailyattendance", aryPara);
    DatagridView1.DataSource = dt;
}

public DataTable CallStoredProcedureUsingAdapterFillDataTable(string StoredProcedureName, [Optional] string[,] aryParameters)
{
    SqlConnection con = new SqlConnection();
    con.ConnectionString = @"Data Source=AIS-OCTACORE\SQLserver2008r2;Initial Catalog=SchoolSoulDataTest;Integrated Security=SSPI";
    con.Open();

    SqlCommand lSQLCmd = new SqlCommand();
    SqlDataAdapter adp = new SqlDataAdapter();
    DataTable dt = new DataTable();
    lSQLCmd.CommandType = CommandType.StoredProcedure;
    lSQLCmd.CommandText = StoredProcedureName;
    lSQLCmd.CommandTimeout = 300;
    try
    {
        for (int i = 0; i < aryParameters.GetLength(0); i++)
        {
            lSQLCmd.Parameters.Add(new SqlParameter(aryParameters[i, 0], aryParameters[i, 1]));
        }
    }
    catch (Exception ex)
    {
        // swallowing the exception here hides bad parameters; at minimum, log ex
    }
    lSQLCmd.Connection = con;
    adp.SelectCommand = lSQLCmd;
    adp.Fill(dt);
    con.Close();
    return dt;
}
Here ssspAtdDailyattendance is a dynamic stored procedure, so the data it returns has a variable number of columns.
Now I want to convert the DataTable dt to a List<T>, but since dt can have a variable number of columns, the type T is not fixed.
So my question is: how can I convert dt to a List?
I would recommend you use Dapper for it instead of writing and maintaining the boilerplate code yourself. Dapper supports dynamic objects.
Execute a query and map it to a list of dynamic objects
public static IEnumerable<dynamic> Query (this IDbConnection cnn, string sql,
object param = null, SqlTransaction transaction = null, bool buffered = true)
This method will execute SQL and return a dynamic list.
Example usage:
var rows = connection.Query("select 1 A, 2 B union all select 3, 4");
((int)rows[0].A).IsEqualTo(1);
((int)rows[0].B).IsEqualTo(2);
((int)rows[1].A).IsEqualTo(3);
((int)rows[1].B).IsEqualTo(4);
Website: http://code.google.com/p/dapper-dot-net/
NuGet: http://www.nuget.org/packages/Dapper/
If I understood your question right, T can be more than one type. So, for example, the first row could be a Person and the second row a Dog? Or did you mean different types such as string, int, date and so on (value types)?
Regardless, you will need to create your "T" entities to represent the data from the DataTable and map them.
What you are looking for is potentially a mapper such as ValueInjecter or AutoMapper.
Or you can even map manually:
This answer here shows how to map a DataTable to a class using ValueInjecter. It can give you a head start.
I hope this helps.
@Douglas Jimenez has a relevant question. If only the type varies, this can easily be solved with an interface for the common fields (or object if no field is common) and different types implementing the other fields. When you build your list, you then use a factory to decide which type to build for the row (or for the whole DataTable), and you build the object from the row (a sketch follows).
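Here's a minimal sketch of that factory idea, reusing the Person/Dog example from above (all names invented for illustration):
interface IRowEntity { int Id { get; set; } }
class Person : IRowEntity { public int Id { get; set; } public string FirstName { get; set; } }
class Dog : IRowEntity { public int Id { get; set; } public string Breed { get; set; } }

static IRowEntity BuildEntity(DataRow row)
{
    // decide which concrete type fits this row, then populate it
    return row.Table.Columns.Contains("Breed")
        ? (IRowEntity)new Dog { Id = (int)row["Id"], Breed = (string)row["Breed"] }
        : new Person { Id = (int)row["Id"], FirstName = (string)row["FirstName"] };
}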
However, if the type is not known at compile time, there are many solutions:
Use the DLR with the dynamic keyword.
The ExpandoObject may do what you need.
Use a hashmap (like a Dictionary<string, object>); see the sketch after this list.
Emit the type with the Emit framework.
This solution is actually complicated and will lead you to advanced parts of the CLR. However, this works really well.
If you choose this path, this tutorial will help you greatly.
This may very well be overkill for your needs.
There may be other solutions as well. Good luck with the solution you choose.
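For the hashmap option, a minimal sketch (AsEnumerable needs a reference to System.Data.DataSetExtensions):
// each row becomes a column-name -> value map, regardless of how many columns came back
List<Dictionary<string, object>> items = dt.AsEnumerable()
    .Select(row => dt.Columns.Cast<DataColumn>()
                             .ToDictionary(c => c.ColumnName, c => row[c]))
    .ToList();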
According to the answer posted by sll, you can use something like:
IList<Class1> items = dt.AsEnumerable().Select(row =>
    new Class1
    {
        id = row.Field<string>("id"),
        name = row.Field<string>("name")
    }).ToList();
But since you have the requirement that it may contain different columns, you can have a common class with all the properties and place a check before reading a column, something like:
public class Class1
{
    private int id;
    public string name;

    public Class1(DataRow row)
    {
        // note: unboxing throws if the column is missing (null), so value types need a null check
        id = (int)GetColumnValue(row, "id");
        name = (string)GetColumnValue(row, "name");
    }

    public object GetColumnValue(DataRow row, string column)
    {
        return row.Table.Columns.Contains(column) ? row[column] : null;
    }
}
And then you may invoke the function as:
IList<Class1> items = dt.AsEnumerable().Select(row => new Class1(row)).ToList();
I faced this issue and solved it using this code from
http://musthaan.com/2016/01/18/datatable-to-dynamic-list/
public static class HelperExtensions
{
    public static List<dynamic> ToDynamic(this DataTable dt)
    {
        var dynamicDt = new List<dynamic>();
        foreach (DataRow row in dt.Rows)
        {
            dynamic dyn = new ExpandoObject();
            dynamicDt.Add(dyn);
            foreach (DataColumn column in dt.Columns)
            {
                // ExpandoObject exposes IDictionary<string, object> for member assignment by name
                var dic = (IDictionary<string, object>)dyn;
                dic[column.ColumnName] = row[column];
            }
        }
        return dynamicDt;
    }
}
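Usage is then one line; the member names come straight from the query's column names (name below is hypothetical):
List<dynamic> rows = dt.ToDynamic();
Console.WriteLine(rows[0].name);   // whatever columns your DataTable had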
}