Firstly, why do this? In a production system, if data comes in that is too big for the database column, you may want to keep running and accept the data loss when the fields are not critical, because there will be bigger problems if the record is not inserted at all.
I have this working with the code below, but it is ugly as hell, having to reflect over private .NET Framework properties.
Surely there is a better way to do this?
I saw the following code in another post for loading the metadata, but it does not work: the types loaded this way do not have the database field lengths populated. We generate the model from the database (the database-first approach), so we never have to make manual adjustments to the entity model ourselves.
var metaDataWorkspace = ((IObjectContextAdapter)db).ObjectContext.MetadataWorkspace;
var edmTypes = metaDataWorkspace.GetItems<EdmType>(DataSpace.OSpace);
The method with the logic is AutoTruncateStringToMaxLength() below.
The class has the following usings:
using System.Data.Entity;
using System.Data.Entity.Validation;
using System.Data.Entity.Core.EntityClient;
using System.Data.Entity.Core.Metadata.Edm;
The code sits in the partial entity class (e.g. public partial class MyEntities):
public override int SaveChanges()
{
    try
    {
        this.AutoTruncateStringToMaxLength();
        return base.SaveChanges();
    }
    catch (DbEntityValidationException ex)
    {
        List<string> errorMessages = new List<string>();
        foreach (DbEntityValidationResult validationResult in ex.EntityValidationErrors)
        {
            string entityName = validationResult.Entry.Entity.GetType().Name;
            foreach (DbValidationError error in validationResult.ValidationErrors)
            {
                errorMessages.Add(entityName + "." + error.PropertyName + ": " + error.ErrorMessage);
            }
        }

        // Join the list to a single string.
        var fullErrorMessage = string.Join("; ", errorMessages);

        // Combine the original exception message with the new one.
        var exceptionMessage = string.Concat(ex.Message, " The validation errors are: ", fullErrorMessage);

        // Throw a new DbEntityValidationException with the improved exception message.
        throw new DbEntityValidationException(exceptionMessage, ex.EntityValidationErrors);
    }
}
private void AutoTruncateStringToMaxLength()
{
    var entries = this.ChangeTracker?.Entries();
    if (entries == null)
    {
        return;
    }

    //********** EDM types from here do not work. MaxLength properties are not set ***************** //
    //var metaDataWorkspace = ((IObjectContextAdapter)db).ObjectContext.MetadataWorkspace;
    //var edmTypes = metaDataWorkspace.GetItems<EdmType>(DataSpace.OSpace);

    ReadOnlyMetadataCollection<EdmMember> memberMetaDataProperties = null;
    string currentLoadedEdmType = null;

    foreach (var entry in entries)
    {
        // Reflect over EF's internal entry to get at the EDM entity type and its member metadata.
        var internalEntry = entry.GetType().GetProperty("InternalEntry", System.Reflection.BindingFlags.NonPublic | System.Reflection.BindingFlags.Instance).GetValue(entry);
        var edmType = (EdmType)internalEntry.GetType().GetProperty("EdmEntityType", System.Reflection.BindingFlags.Public | System.Reflection.BindingFlags.Instance).GetValue(internalEntry);
        if (edmType != null)
        {
            if (currentLoadedEdmType == null || edmType.Name != currentLoadedEdmType)
            {
                currentLoadedEdmType = edmType.Name;
                // Seems slow to load (in debug), so cache it in case there is a performance issue.
                memberMetaDataProperties = (ReadOnlyMetadataCollection<EdmMember>)edmType.MetadataProperties.FirstOrDefault(x => x.Name == "Members").Value;
            }

            entry.Entity.GetType().GetProperties().ToList().ForEach(p =>
            {
                var matchingMemberMetaData = memberMetaDataProperties.FirstOrDefault(x => x.Name == p.Name);
                if (matchingMemberMetaData != null && matchingMemberMetaData.BuiltInTypeKind.ToString() == "EdmProperty")
                {
                    var edmProperty = (EdmProperty)matchingMemberMetaData;
                    if (edmProperty.MaxLength.HasValue && edmProperty.TypeName == "String")
                    {
                        string value = (p.GetValue(entry.Entity) ?? "").ToString();
                        if (value.Length > edmProperty.MaxLength.Value)
                        {
                            // Oops, it's too long, so truncate it (keep the first MaxLength characters).
                            p.SetValue(entry.Entity, value.Substring(0, edmProperty.MaxLength.Value));
                        }
                    }
                }
            });
        }
    }
}
EF Core has a better metadata API, e.g.:
var q = from e in this.Model.GetEntityTypes()
        from p in e.GetProperties()
        where p.ClrType == typeof(string)
        select new { EntityName = e.Name, PropertyName = p.Name, MaxLength = p.GetMaxLength() };
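For illustration, here is a minimal sketch (not from the original post, and only for an EF Core DbContext, not the EF6 code above) of how that metadata could be used to do the same truncation in a SaveChanges override:

// Hedged sketch for EF Core only: truncate string properties to their configured max length.
public override int SaveChanges()
{
    foreach (var entry in ChangeTracker.Entries()
                 .Where(e => e.State == EntityState.Added || e.State == EntityState.Modified))
    {
        foreach (var property in entry.Metadata.GetProperties()
                     .Where(p => p.ClrType == typeof(string)))
        {
            var maxLength = property.GetMaxLength();
            if (maxLength == null) continue;

            var current = entry.Property(property.Name).CurrentValue as string;
            if (current != null && current.Length > maxLength.Value)
            {
                // Keep only the first MaxLength characters.
                entry.Property(property.Name).CurrentValue = current.Substring(0, maxLength.Value);
            }
        }
    }

    return base.SaveChanges();
}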
I'm comparing materialization time between Dapper and ADO.NET. Overall, Dapper tends to be faster than ADO.NET, although the first execution of a given fetch query is slower than ADO.NET. A few results show Dapper being a little faster than ADO.NET (though almost all of the results show them to be comparable).
So I think I'm using an inefficient approach to map the results of SqlDataReader to objects.
This is my code:
var sql = "SELECT * FROM Sales.SalesOrderHeader WHERE SalesOrderID = #Id";
var conn = new SqlConnection(ConnectionString);
var stopWatch = new Stopwatch();
try
{
conn.Open();
var sqlCmd = new SqlCommand(sql, conn);
for (var i = 0; i < keys.GetLength(0); i++)
{
for (var r = 0; r < keys.GetLength(1); r++)
{
stopWatch.Restart();
sqlCmd.Parameters.Clear();
sqlCmd.Parameters.AddWithValue("#Id", keys[i, r]);
var reader = await sqlCmd.ExecuteReaderAsync();
SalesOrderHeaderSQLserver salesOrderHeader = null;
while (await reader.ReadAsync())
{
salesOrderHeader = new SalesOrderHeaderSQLserver();
salesOrderHeader.SalesOrderId = (int)reader["SalesOrderId"];
salesOrderHeader.SalesOrderNumber = reader["SalesOrderNumber"] as string;
salesOrderHeader.AccountNumber = reader["AccountNumber"] as string;
salesOrderHeader.BillToAddressID = (int)reader["BillToAddressID"];
salesOrderHeader.TotalDue = (decimal)reader["TotalDue"];
salesOrderHeader.Comment = reader["Comment"] as string;
salesOrderHeader.DueDate = (DateTime)reader["DueDate"];
salesOrderHeader.CurrencyRateID = reader["CurrencyRateID"] as int?;
salesOrderHeader.CustomerID = (int)reader["CustomerID"];
salesOrderHeader.SalesPersonID = reader["SalesPersonID"] as int?;
salesOrderHeader.CreditCardApprovalCode = reader["CreditCardApprovalCode"] as string;
salesOrderHeader.ShipDate = reader["ShipDate"] as DateTime?;
salesOrderHeader.Freight = (decimal)reader["Freight"];
salesOrderHeader.ModifiedDate = (DateTime)reader["ModifiedDate"];
salesOrderHeader.OrderDate = (DateTime)reader["OrderDate"];
salesOrderHeader.TerritoryID = reader["TerritoryID"] as int?;
salesOrderHeader.CreditCardID = reader["CreditCardID"] as int?;
salesOrderHeader.OnlineOrderFlag = (bool)reader["OnlineOrderFlag"];
salesOrderHeader.PurchaseOrderNumber = reader["PurchaseOrderNumber"] as string;
salesOrderHeader.RevisionNumber = (byte)reader["RevisionNumber"];
salesOrderHeader.Rowguid = (Guid)reader["Rowguid"];
salesOrderHeader.ShipMethodID = (int)reader["ShipMethodID"];
salesOrderHeader.ShipToAddressID = (int)reader["ShipToAddressID"];
salesOrderHeader.Status = (byte)reader["Status"];
salesOrderHeader.SubTotal = (decimal)reader["SubTotal"];
salesOrderHeader.TaxAmt = (decimal)reader["TaxAmt"];
}
stopWatch.Stop();
reader.Close();
await PrintTestFindByPKReport(stopWatch.ElapsedMilliseconds, salesOrderHeader.SalesOrderId.ToString());
}
I used the as keyword to cast the nullable columns; is that correct?
And this is the code for Dapper:
using (var conn = new SqlConnection(ConnectionString))
{
    conn.Open();
    var stopWatch = new Stopwatch();
    for (var i = 0; i < keys.GetLength(0); i++)
    {
        for (var r = 0; r < keys.GetLength(1); r++)
        {
            stopWatch.Restart();
            var result = (await conn.QueryAsync<SalesOrderHeader>("SELECT * FROM Sales.SalesOrderHeader WHERE SalesOrderID = @Id", new { Id = keys[i, r] })).FirstOrDefault();
            stopWatch.Stop();
            await PrintTestFindByPKReport(stopWatch.ElapsedMilliseconds, result.ToString());
        }
    }
}
When in doubt regarding anything DB- or reflection-related, I ask myself, "what would Marc Gravell do?".
In this case, he would use FastMember! And you should too. It's the underpinning of the data conversions in Dapper, and it can easily be used to map your own DataReader to an object (should you not want to use Dapper).
Below is an extension method converting a SqlDataReader into something of type T:
PLEASE NOTE: This code implies a dependency on FastMember and is written for .NET Core (though could easily be converted to .NET Framework/Standard compliant code).
public static T ConvertToObject<T>(this SqlDataReader rd) where T : class, new()
{
    Type type = typeof(T);
    var accessor = TypeAccessor.Create(type);
    var members = accessor.GetMembers();
    var t = new T();

    for (int i = 0; i < rd.FieldCount; i++)
    {
        if (!rd.IsDBNull(i))
        {
            string fieldName = rd.GetName(i);

            if (members.Any(m => string.Equals(m.Name, fieldName, StringComparison.OrdinalIgnoreCase)))
            {
                accessor[t, fieldName] = rd.GetValue(i);
            }
        }
    }

    return t;
}
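A minimal usage sketch, assuming an already open SqlConnection named connection and a hypothetical MyDto class whose properties match the selected columns:

// Hedged usage sketch: read every row into a list using the extension method above.
var results = new List<MyDto>();
using (var cmd = new SqlCommand("SELECT Id, Name FROM dbo.MyTable", connection))
using (var rd = cmd.ExecuteReader())
{
    while (rd.Read())
    {
        results.Add(rd.ConvertToObject<MyDto>());
    }
}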
2022 update
.NET 5 and .NET 6 are now available, and they include Source Generators - an amazing Roslyn-based feature that, basically, allows your code to... generate more code at compile time. It's essentially "AOT reflection" (ahead-of-time) that lets you generate lightning-fast mapping code with zero runtime overhead. This will surely revolutionize the ORM world.
Now, back to the question - the fastest way to map an IDataReader would be to use Source Generators. We started experimenting with this feature and we love it.
Here's a library we're working on, that does exactly that (maps DataReader to objects), feel free to "steal" some code examples: https://github.com/jitbit/MapDataReader
Previous answer that is still 100% valid
The most upvoted answer mentions @MarcGravell and his FastMember. But if you're already using Dapper, which is also his library, you can use Dapper's GetRowParser like this:
var parser = reader.GetRowParser<MyObject>(typeof(MyObject));

while (reader.Read())
{
    var myObject = parser(reader);
}
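For context, here is a hedged sketch of that parser used end to end; the connection string, table, and MyObject type are placeholders:

// Hedged sketch: materialize all rows with Dapper's GetRowParser.
var results = new List<MyObject>();
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("SELECT * FROM dbo.MyTable", conn))
{
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        var parser = reader.GetRowParser<MyObject>(typeof(MyObject));
        while (reader.Read())
        {
            results.Add(parser(reader));
        }
    }
}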
Here's a way to make your ADO.NET code faster.
When you do your select, list out the fields you are selecting rather than using select *. This lets you guarantee the order in which the fields come back, even if that order changes in the database. Then, when getting those fields from the reader, get them by index rather than by name. Using an index is faster.
Also, I'd recommend not making string database fields nullable unless there is a strong business reason; just store a blank string in the database if there is no value. Finally, I'd recommend using the typed Get methods on the DataReader so that casting isn't needed in your code. So, for example, instead of casting the DataReader[index++] value to an int, use DataReader.GetInt32(index++).
So for example, this code:
salesOrderHeader = new SalesOrderHeaderSQLserver();
salesOrderHeader.SalesOrderId = (int)reader["SalesOrderId"];
salesOrderHeader.SalesOrderNumber = reader["SalesOrderNumber"] as string;
salesOrderHeader.AccountNumber = reader["AccountNumber"] as string;
becomes
int index = 0;
salesOrderHeader = new SalesOrderHeaderSQLserver();
salesOrderHeader.SalesOrderId = reader.GetInt32(index++);
salesOrderHeader.SalesOrderNumber = reader.GetString(index++);
salesOrderHeader.AccountNumber = reader.GetString(index++);
Give that a whirl and see how it does for you.
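If you would rather not depend on column positions at all, a common middle ground (not from the answer above, just a widely used pattern) is to resolve each ordinal once with GetOrdinal and then use the typed getters:

// Hedged sketch: resolve ordinals once, then use typed getters for each row.
int idOrdinal = reader.GetOrdinal("SalesOrderID");
int numberOrdinal = reader.GetOrdinal("SalesOrderNumber");
int accountOrdinal = reader.GetOrdinal("AccountNumber");

while (reader.Read())
{
    var header = new SalesOrderHeaderSQLserver();
    header.SalesOrderId = reader.GetInt32(idOrdinal);
    header.SalesOrderNumber = reader.GetString(numberOrdinal);
    header.AccountNumber = reader.IsDBNull(accountOrdinal) ? null : reader.GetString(accountOrdinal);
    // ...the remaining columns follow the same pattern.
}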
Modified @HouseCat's solution to be case insensitive:
/// <summary>
/// Maps a SqlDataReader record to an object. Ignoring case.
/// </summary>
/// <typeparam name="T"></typeparam>
/// <param name="dataReader"></param>
/// <param name="newObject"></param>
/// <remarks>https://stackoverflow.com/a/52918088</remarks>
public static void MapDataToObject<T>(this SqlDataReader dataReader, T newObject)
{
if (newObject == null) throw new ArgumentNullException(nameof(newObject));
// Fast Member Usage
var objectMemberAccessor = TypeAccessor.Create(newObject.GetType());
var propertiesHashSet =
objectMemberAccessor
.GetMembers()
.Select(mp => mp.Name)
.ToHashSet(StringComparer.InvariantCultureIgnoreCase);
for (int i = 0; i < dataReader.FieldCount; i++)
{
var name = propertiesHashSet.FirstOrDefault(a => a.Equals(dataReader.GetName(i), StringComparison.InvariantCultureIgnoreCase));
if (!String.IsNullOrEmpty(name))
{
objectMemberAccessor[newObject, name]
= dataReader.IsDBNull(i) ? null : dataReader.GetValue(i);
}
}
}
EDIT: This does not work for List<T> or multiple tables in the results.
EDIT 2: Changing the calling function as shown below works for lists. I am just going to return a list of objects no matter what and take the first element when I was expecting a single object. I haven't looked into multiple result tables yet, but I will.
public static void MapDataToObject<T>(this SqlDataReader dataReader, T newObject)
{
    if (newObject == null) throw new ArgumentNullException(nameof(newObject));

    // Fast Member Usage
    var objectMemberAccessor = TypeAccessor.Create(newObject.GetType());
    var propertiesHashSet =
        objectMemberAccessor
            .GetMembers()
            .Select(mp => mp.Name)
            .ToHashSet(StringComparer.InvariantCultureIgnoreCase);

    for (int i = 0; i < dataReader.FieldCount; i++)
    {
        var name = propertiesHashSet.FirstOrDefault(a => a.Equals(dataReader.GetName(i), StringComparison.InvariantCultureIgnoreCase));
        if (!String.IsNullOrEmpty(name))
        {
            // Attention! If you are getting errors here, double-check that your model and SQL have matching types for the field name.
            // Check api.log for the error message!
            objectMemberAccessor[newObject, name]
                = dataReader.IsDBNull(i) ? null : dataReader.GetValue(i);
        }
    }
}
EDIT 3: Updated to show sample calling function.
public async Task<List<T>> ExecuteReaderAsync<T>(string storedProcedureName, SqlParameter[] sqlParameters = null) where T : class, new()
{
    var newListObject = new List<T>();

    using (var conn = new SqlConnection(_connectionString))
    {
        using (SqlCommand sqlCommand = GetSqlCommand(conn, storedProcedureName, sqlParameters))
        {
            await conn.OpenAsync();
            using (var dataReader = await sqlCommand.ExecuteReaderAsync(CommandBehavior.Default))
            {
                if (dataReader.HasRows)
                {
                    while (await dataReader.ReadAsync())
                    {
                        var newObject = new T();
                        dataReader.MapDataToObject(newObject);
                        newListObject.Add(newObject);
                    }
                }
            }
        }
    }

    return newListObject;
}
I took the method from pimbrouwers' answer and optimized it slightly by reducing the LINQ calls.
It maps only the properties found in both the object and the data reader's field names, and it handles DBNull. The other assumption is that your domain model property names exactly match the table column/field names.
/// <summary>
/// Maps a SqlDataReader record to an object.
/// </summary>
/// <typeparam name="T"></typeparam>
/// <param name="dataReader"></param>
/// <param name="newObject"></param>
public static void MapDataToObject<T>(this SqlDataReader dataReader, T newObject)
{
if (newObject == null) throw new ArgumentNullException(nameof(newObject));
// Fast Member Usage
var objectMemberAccessor = TypeAccessor.Create(newObject.GetType());
var propertiesHashSet =
objectMemberAccessor
.GetMembers()
.Select(mp => mp.Name)
.ToHashSet();
for (int i = 0; i < dataReader.FieldCount; i++)
{
if (propertiesHashSet.Contains(dataReader.GetName(i)))
{
objectMemberAccessor[newObject, dataReader.GetName(i)]
= dataReader.IsDBNull(i) ? null : dataReader.GetValue(i);
}
}
}
Sample Usage:
public async Task<T> GetAsync<T>(string storedProcedureName, SqlParameter[] sqlParameters = null) where T : class, new()
{
    using (var conn = new SqlConnection(_connString))
    {
        var sqlCommand = await GetSqlCommandAsync(storedProcedureName, conn, sqlParameters);
        var dataReader = await sqlCommand.ExecuteReaderAsync(CommandBehavior.CloseConnection);
        if (dataReader.HasRows)
        {
            var newObject = new T();
            if (await dataReader.ReadAsync())
            {
                dataReader.MapDataToObject(newObject);
            }
            return newObject;
        }
        else
        {
            return null;
        }
    }
}
You can install the package DbDataReaderMapper with the command Install-Package DbDataReaderMapper or using your IDE's package manager.
You can then create your data access object (I will choose a shorter example than the one you provided):
class EmployeeDao
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public int? Age { get; set; }
}
To do the automatic mapping, you can call the extension method MapToObject<T>():
var reader = await sqlCmd.ExecuteReaderAsync();
while (await reader.ReadAsync())
{
    var employeeObj = reader.MapToObject<EmployeeDao>();
}
and you will get rid of tens of lines of unreadable, hard-to-maintain code.
Step-by-step example here: https://github.com/LucaMozzo/DbDataReaderMapper
Perhaps the approach I will present isn't the most efficient, but it gets the job done with very little coding effort. The main benefit I see here is that you don't have to deal with the data structure beyond building a compatible (mappable) object.
If you convert the SqlDataReader to a DataTable and then serialize it using JsonConvert.SerializeObject, you can deserialize it to a known object type using JsonConvert.DeserializeObject.
Here is an example implementation:
SqlDataReader reader = null;
SqlConnection myConnection = new SqlConnection();
myConnection.ConnectionString = ConfigurationManager.ConnectionStrings["DatabaseConnection"].ConnectionString;

SqlCommand sqlCmd = new SqlCommand();
sqlCmd.CommandType = CommandType.Text;
sqlCmd.CommandText = "SELECT * FROM MyTable";
sqlCmd.Connection = myConnection;

myConnection.Open();
reader = sqlCmd.ExecuteReader();

var dataTable = new DataTable();
dataTable.Load(reader);

List<MyObject> myObjects = new List<MyObject>();

if (dataTable.Rows.Count > 0)
{
    var serializedMyObjects = JsonConvert.SerializeObject(dataTable);
    // Here you get the object
    myObjects = (List<MyObject>)JsonConvert.DeserializeObject(serializedMyObjects, typeof(List<MyObject>));
}

myConnection.Close();
List<T> result = new List<T>();

SqlDataReader reader = com.ExecuteReader();
while (reader.Read())
{
    Type type = typeof(T);
    T obj = (T)Activator.CreateInstance(type);
    PropertyInfo[] properties = type.GetProperties();
    foreach (PropertyInfo property in properties)
    {
        try
        {
            var value = reader[property.Name];
            // The reader returns DBNull.Value (not null) for NULL columns.
            if (value != DBNull.Value)
            {
                var targetType = Nullable.GetUnderlyingType(property.PropertyType) ?? property.PropertyType;
                property.SetValue(obj, Convert.ChangeType(value, targetType));
            }
        }
        catch
        {
            // Ignore columns that do not match a property or cannot be converted.
        }
    }
    result.Add(obj);
}
There is a SqlDataReaderMapper library on NuGet which helps you map a SqlDataReader to an object. Here is how it can be used (from the GitHub documentation):
var mappedObject = new SqlDataReaderMapper<DTOObject>(reader)
.Build();
Or, if you want a more advanced mapping:
var mappedObject = new SqlDataReaderMapper<DTOObject>(reader)
    .NameTransformers("_", "")
    .ForMember<int>("CurrencyId")
    .ForMember("CurrencyCode", "Code")
    .ForMember<string>("CreatedByUser", "User").Trim()
    .ForMemberManual("CountryCode", val => val.ToString().Substring(0, 10))
    .ForMemberManual("ZipCode", val => val.ToString().Substring(0, 5), "ZIP")
    .Build();
Advanced mapping allows you to use name transformers, change types, map fields manually, or even apply functions to the object's data, so that you can easily map objects even if they differ from the reader.
I took both pimbrouwers' and HouseCat's answers and came up with my own version. In my scenario, the column names in the database are in snake_case format.
public static T ConvertToObject<T>(string query) where T : class, new()
{
    using (var conn = new SqlConnection(AutoConfig.ConnectionString))
    {
        conn.Open();
        var cmd = new SqlCommand(query) { Connection = conn };
        var rd = cmd.ExecuteReader();

        var mappedObject = new T();
        if (!rd.HasRows) return mappedObject;

        var accessor = TypeAccessor.Create(typeof(T));
        var members = accessor.GetMembers();

        if (!rd.Read()) return mappedObject;

        for (var i = 0; i < rd.FieldCount; i++)
        {
            var columnNameFromDataTable = rd.GetName(i);
            var columnValueFromDataTable = rd.GetValue(i);

            // Convert the snake_case column name to PascalCase, e.g. "first_name" -> "FirstName".
            var splits = columnNameFromDataTable.Split('_');
            var columnName = new StringBuilder("");
            foreach (var split in splits)
            {
                columnName.Append(CultureInfo.InvariantCulture.TextInfo.ToTitleCase(split.ToLower()));
            }

            var mappedColumnName = members.FirstOrDefault(x =>
                string.Equals(x.Name, columnName.ToString(), StringComparison.OrdinalIgnoreCase));
            if (mappedColumnName == null) continue;

            var columnType = mappedColumnName.Type;
            if (columnValueFromDataTable != DBNull.Value)
            {
                // Use the member's actual name so the (case-sensitive) accessor lookup cannot miss.
                accessor[mappedObject, mappedColumnName.Name] = Convert.ChangeType(columnValueFromDataTable, columnType);
            }
        }

        return mappedObject;
    }
}
We use the following class to execute a SQL query and automatically map the rows to objects. You can easily adjust the class to fit your needs. Be aware that our approach depends on FastMember, but you could easily modify the code to use reflection.
/// <summary>
/// Mapping configuration for a specific sql table to a specific class.
/// </summary>
/// <param name="Accessor">Used to access the target class properties.</param>
/// <param name="PropToRowIdxDict">Target class property name -> database reader row idx dictionary.</param>
internal record RowMapper(TypeAccessor Accessor, IDictionary<string, int> PropToRowIdxDict);
public class RawSqlHelperService
{
/// <summary>
/// Create a new mapper for the conversion of a <see cref="DbDataReader"/> row -> <typeparamref name="T"/>.
/// </summary>
/// <typeparam name="T">Target class to use.</typeparam>
/// <param name="reader">Data reader to obtain column information from.</param>
/// <returns>Row mapper object for <see cref="DbDataReader"/> row -> <typeparamref name="T"/>.</returns>
private RowMapper GetRowMapper<T>(DbDataReader reader) where T : class, new()
{
var accessor = TypeAccessor.Create(typeof(T));
var members = accessor.GetMembers();
// Column name -> column idx dict
var columnIdxDict = Enumerable.Range(0, reader.FieldCount).ToDictionary(idx => reader.GetName(idx), idx => idx);
var propToRowIdxDict = members
.Where(m => m.GetAttribute(typeof(NotMappedAttribute), false) == null)
.Select(m =>
{
var columnAttr = m.GetAttribute(typeof(ColumnAttribute), false) as ColumnAttribute;
var columnName = columnAttr == null
? m.Name
: columnAttr.Name;
return (PropertyName: m.Name, ColumnName: columnName);
})
.ToDictionary(x => x.PropertyName, x => columnIdxDict[x.ColumnName]);
return new RowMapper(accessor, propToRowIdxDict);
}
/// <summary>
/// Read <see cref="DbDataReader"/> current row as object <typeparamref name="T"/>.
/// </summary>
/// <typeparam name="T">The class to map to.</typeparam>
/// <param name="reader">Data reader to read the current row from.</param>
/// <param name="mapper">Mapping configuration to use to perform the mapping operation.</param>
/// <returns>Resulting object of the mapping operation.</returns>
private T ReadRowAsObject<T>(DbDataReader reader, RowMapper mapper) where T : class, new()
{
var (accessor, propToRowIdxDict) = mapper;
var t = new T();
foreach (var (propertyName, columnIdx) in propToRowIdxDict)
accessor[t, propertyName] = reader.GetValue(columnIdx);
return t;
}
/// <summary>
/// Execute the specified <paramref name="sql"/> query and automatically map the resulting rows to <typeparamref name="T"/>.
/// </summary>
/// <typeparam name="T">Target class to map to.</typeparam>
/// <param name="dbContext">Database context to perform the operation on.</param>
/// <param name="sql">SQL query to execute.</param>
/// <param name="parameters">Additional list of parameters to use for the query.</param>
/// <returns>Result of the SQL query mapped to a list of <typeparamref name="T"/>.</returns>
public async Task<IEnumerable<T>> ExecuteSql<T>(DbContext dbContext, string sql, IEnumerable<DbParameter> parameters = null) where T : class, new()
{
var con = dbContext.Database.GetDbConnection();
await con.OpenAsync();
var cmd = con.CreateCommand() as OracleCommand;
cmd.BindByName = true;
cmd.CommandText = sql;
cmd.Parameters.AddRange(parameters?.ToArray() ?? new DbParameter[0]);
var reader = await cmd.ExecuteReaderAsync();
var records = new List<T>();
var mapper = GetRowMapper<T>(reader);
while (await reader.ReadAsync())
{
records.Add(ReadRowAsObject<T>(reader, mapper));
}
await con.CloseAsync();
return records;
}
}
Mapping Attributes Supported
I implemented support for the NotMapped and Column attributes, which are also used by Entity Framework.
NotMapped Attribute
Properties decorated with this attribute will be ignored by the mapper.
Column Attribute
With this attribute the column name can be customized. Without this attribute the property name is assumed to be the column name.
Example Class
private class Test
{
[Column("SDAT")]
public DateTime StartDate { get; set; } // Column name = "SDAT"
public DateTime EDAT { get; set; } // Column name = "EDAT"
[NotMapped]
public int IWillBeIgnored { get; set; }
}
Comparison with Reflection
I also compared the FastMember approach with using plain reflection.
For the comparison, I queried two date columns from a table with 1,000,000 rows. Here are the results:
Approach    | Duration in seconds
----------- | -------------------
FastMember  | ~1.6 seconds
Reflection  | ~2 seconds
Credits to user pim for inspiration.
This is based on the other answers, but I used standard reflection to read the properties of the class you want to instantiate and fill it from the data reader. You could also store the properties in a dictionary persisted between reads.
Initialize a dictionary containing the properties of the type, keyed by their names:
var type = typeof(Foo);
var properties = type.GetProperties(BindingFlags.Public | BindingFlags.Instance);
var propertyDictionary = new Dictionary<string, PropertyInfo>();
foreach (var property in properties)
{
    if (!property.CanWrite) continue;
    propertyDictionary.Add(property.Name, property);
}
The method to populate a new instance of the type from the DataReader would look like this:
var foo = new Foo();
// Retrieve the propertyDictionary for the type
for (var i = 0; i < dataReader.FieldCount; i++)
{
    var n = dataReader.GetName(i);
    PropertyInfo prop;
    if (!propertyDictionary.TryGetValue(n, out prop)) continue;
    var val = dataReader.IsDBNull(i) ? null : dataReader.GetValue(i);
    prop.SetValue(foo, val, null);
}
return foo;
If you want to write an efficient generic class dealing with multiple types, you could store each per-type dictionary in a global Dictionary<Type, Dictionary<string, PropertyInfo>>, as sketched below.
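A minimal sketch of that idea, with illustrative names (ReflectionMapper is not from the answer above):

// Hedged sketch: cache the writable-property dictionary per type and reuse it for mapping.
public static class ReflectionMapper
{
    private static readonly ConcurrentDictionary<Type, Dictionary<string, PropertyInfo>> Cache =
        new ConcurrentDictionary<Type, Dictionary<string, PropertyInfo>>();

    public static T MapRow<T>(IDataRecord record) where T : new()
    {
        var properties = Cache.GetOrAdd(typeof(T), type =>
            type.GetProperties(BindingFlags.Public | BindingFlags.Instance)
                .Where(p => p.CanWrite)
                .ToDictionary(p => p.Name));

        var item = new T();
        for (var i = 0; i < record.FieldCount; i++)
        {
            if (!properties.TryGetValue(record.GetName(i), out var prop)) continue;
            // Assumes the property types accept the raw values (e.g. nullable/reference types for NULL columns).
            prop.SetValue(item, record.IsDBNull(i) ? null : record.GetValue(i));
        }
        return item;
    }
}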
This kinda works
public static object PopulateClass(object o, SQLiteDataReader dr, Type T)
{
    Type type = o.GetType();
    PropertyInfo[] properties = type.GetProperties();
    foreach (PropertyInfo property in properties)
    {
        T.GetProperty(property.Name).SetValue(o, dr[property.Name], null);
    }
    return o;
}
Note that I'm using SQLite here, but the concept is the same. As an example, I'm filling a Game object by calling the above like this:
g = PopulateClass(g, dr, typeof(Game)) as Game;
Note that your class has to match up with the data reader 100%, so adjust your query to suit, or pass in some sort of list of fields to skip. With a SqlDataReader talking to a SQL Server database you get a pretty good type match between .NET and the database. With SQLite you have to declare your ints in your class as Int64 for this to work, and watch out for nulls being assigned to strings. But the above concept seems to work, so it should get you going. I think this is what the OP was after.
I developed a method that resets the cache when Entity Framework is saving changes. I handle this in the SavingChanges event handler, objectContext_SavingChanges:
private void objectContext_SavingChanges(object sender, EventArgs e)
{
    if (_cacheProvider != null)
    {
        var objectContext = (this as IObjectContextAdapter).ObjectContext;
        var entries =
            objectContext.ObjectStateManager.GetObjectStateEntries(EntityState.Modified | EntityState.Added);
        var result = new List<string>();
        if (entries != null && entries.Any())
        {
            foreach (var entry in entries.Where(entry => !entry.IsRelationship))
            {
                // Use the entity of the current entry, not just the first one in the collection.
                var entity = entry.Entity;
                if (entity != null)
                {
                    var genericTypeName =
                        typeof(List<>).MakeGenericType(ObjectContext.GetObjectType(entity.GetType()))?.ToString();
                    _cacheProvider.ResetCache(genericTypeName);
                }
            }

            foreach (var entry in entries.Where(entry => entry.IsRelationship))
            {
                var set = entry.EntitySet as AssociationSet;
                if (set != null)
                {
                    var firstEntitySet = set.AssociationSetEnds[0].EntitySet;
                    var secondEntitySet = set.AssociationSetEnds[1].EntitySet;
                    var firstEntitySetName = firstEntitySet.ElementType.FullName + "," + Assembly.GetExecutingAssembly().FullName;
                    var secondEntitySetName = secondEntitySet.ElementType.FullName + "," + Assembly.GetExecutingAssembly().FullName;
                    var firstGenericTypeName =
                        typeof(List<>).MakeGenericType((Type.GetType(firstEntitySetName)))?.ToString();
                    var secondGenericTypeName =
                        typeof(List<>).MakeGenericType((Type.GetType(secondEntitySetName)))?.ToString();
                    _cacheProvider.ResetCache(firstGenericTypeName);
                    _cacheProvider.ResetCache(secondGenericTypeName);
                }
            }
        }
    }
}
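For completeness, a handler like this is typically attached in the context constructor; a minimal sketch, assuming a hypothetical MyDbContext class and connection string name:

// Hedged sketch: subscribe the handler when the DbContext is constructed.
public partial class MyDbContext : DbContext
{
    public MyDbContext() : base("name=MyDbContext")
    {
        ((IObjectContextAdapter)this).ObjectContext.SavingChanges += objectContext_SavingChanges;
    }
}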
It works great for the entity entries among the object state entries, but it does not work for association sets. I need to reset the cache for the entities on both sides of the relationship. AssociationSet has a "FullName" property, but this property does not contain the real name of the type: it contains the short name of the type and the assembly name, but not the full namespace used in the solution.
For example, a type has the real full name DataAccess.Entities.CachedEntity in the solution (namespace DataAccess.Entities, short name CachedEntity), but the full name in the association set will be DataAccess.CachedEntity.
Can I get the full name of the type for each end of the relationship?
I solved this task as follows:
var clrTypeNamespace = @"http://schemas.microsoft.com/ado/2013/11/edm/customannotation:ClrType";

var firstEdmType = set.AssociationSetEnds[0]?.EntitySet?.ElementType;
var secondEdmType = set.AssociationSetEnds[1]?.EntitySet?.ElementType;

var firstType = firstEdmType?.MetadataProperties.SingleOrDefault(p => p.Name == clrTypeNamespace)?.Value as Type;
var secondType = secondEdmType?.MetadataProperties.SingleOrDefault(p => p.Name == clrTypeNamespace)?.Value as Type;
We can use the clrTypeNamespace annotation of the CLR type in the EDM and find the real entity type in MetadataProperties.
Alternatively, we can use the navigation properties for each relationship set, but this solution is only suitable when navigation properties exist on both sides of the relationship.
var asem = set.AssociationSetEnds[0].CorrespondingAssociationEndMember;
var propInfo = asem.MetadataProperties.SingleOrDefault(p => p.Name == "ClrPropertyInfo")?.Value as PropertyInfo;
I have an image site where users can tag photos, much like you can tag a question on Stack Overflow.
I have the following tables:
Images [ID, URL, etc]
Tags [ID, TagName]
ImageTag [TagID, ImageID]
I want to write a method with the signature:
public void UpdateImageTags(int imageId, IEnumerable<string> currentTags)
This method will do the following:
Create any new Tags in currentTags that don't already exist in the Tags table.
Get the old ImageTags for the image.
Delete any ImageTags that no longer exist in currentTags.
Add any ImageTags that are in currentTags but not in the old tags.
Here is my attempt at that method:
public void UpdateImageTags(int imageId, IEnumerable<string> currentTags)
{
    using (var db = new ImagesDataContext())
    {
        var oldTags = db.ImageTags.Where(it => it.ImageId == imageId).Select(it => it.Tag.TagName);
        var added = currentTags.Except(oldTags);
        var removed = oldTags.Except(currentTags);

        // Add any new tags that need created
        foreach (var tag in added)
        {
            if (!db.Tags.Any(t => t.TagName == tag))
            {
                db.Tags.InsertOnSubmit(new Tag { TagName = tag });
            }
        }
        db.SubmitChanges();

        // Delete any ImageTags that need deleted.
        var deletedImageTags = db.ImageTags.Where(it => removed.Contains(it.Tag.TagName));
        db.ImageTags.DeleteAllOnSubmit(deletedImageTags);

        // Add any ImageTags that need added.
        var addedImageTags = db.Tags.Where(t => added.Contains(t.TagName)).Select(t => new ImageTag { ImageId = imageId, TagId = t.TagId });
        db.ImageTags.InsertAllOnSubmit(addedImageTags);
        db.SubmitChanges();
    }
}
However, this fails on the line:
db.ImageTags.DeleteAllOnSubmit(deletedImageTags);
With the error:
Local sequence cannot be used in LINQ to SQL implementations of query
operators except the Contains operator.
Is there an easier way I can handle the operation of adding new tags, deleting old ImageTags, adding new ImageTags in LINQ to SQL?
It seems like this would be the easiest:
public void UpdateImageTags(int imageId, IEnumerable<string> currentTags)
{
    using (var db = new ImagesDataContext())
    {
        var image = db.Images.Where(it => it.ImageId == imageId).First();
        image.Tags.Clear();
        foreach (string s in currentTags)
        {
            image.Tags.Add(new Tag() { TagName = s });
        }
        db.SubmitChanges();
    }
}
This might have to be modified slightly for LINQ to SQL; EF is what I have been using most recently. Also, this depends on lazy loading being enabled. If it is not, you will have to force the image's tags to be loaded, as in the sketch below.
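In LINQ to SQL, one way to force that load is DataLoadOptions.LoadWith; here is a minimal sketch, assuming the Image/ImageTag/Tag classes generated from the question's tables:

// Hedged sketch: eager-load the image's tag rows so Clear()/Add() work without lazy loading.
using (var db = new ImagesDataContext())
{
    var loadOptions = new DataLoadOptions();
    loadOptions.LoadWith<Image>(i => i.ImageTags);
    loadOptions.LoadWith<ImageTag>(it => it.Tag);
    db.LoadOptions = loadOptions;

    var image = db.Images.First(i => i.ImageId == imageId);
    // image.ImageTags is now populated and can be modified as in the answer above.
}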
Here is a helper method to deal with many-to-many relationships:
public static void UpdateReferences<FK, FKV>(
    this EntitySet<FK> refs,
    Expression<Func<FK, FKV>> fkexpr,
    IEnumerable<FKV> values)
    where FK : class
    where FKV : class
{
    Func<FK, FKV> fkvalue = fkexpr.Compile();
    var fkmaker = MakeMaker(fkexpr);
    var fkdelete = MakeDeleter(fkexpr);

    var fks = refs.Select(fkvalue).ToList();
    var added = values.Except(fks);
    var removed = fks.Except(values);

    foreach (var add in added)
    {
        refs.Add(fkmaker(add));
    }

    foreach (var r in removed)
    {
        var res = refs.Single(x => fkvalue(x) == r);
        refs.Remove(res);
        fkdelete(res);
    }
}

static Func<FKV, FK> MakeMaker<FKV, FK>(Expression<Func<FK, FKV>> fkexpr)
{
    var me = fkexpr.Body as MemberExpression;
    var par = Expression.Parameter(typeof(FKV), "fkv");
    var maker = Expression.Lambda(
        Expression.MemberInit(Expression.New(typeof(FK)),
            Expression.Bind(me.Member, par)), par);
    var cmaker = maker.Compile() as Func<FKV, FK>;
    return cmaker;
}

static Action<FK> MakeDeleter<FK, FKV>(Expression<Func<FK, FKV>> fkexpr)
{
    var me = fkexpr.Body as MemberExpression;
    var pi = me.Member as PropertyInfo;
    var assoc = Attribute.GetCustomAttribute(pi, typeof(AssociationAttribute))
        as AssociationAttribute;
    if (assoc == null || !assoc.DeleteOnNull)
    {
        throw new ArgumentException("DeleteOnNull must be set to true");
    }

    var par = Expression.Parameter(typeof(FK), "fk");
    var maker = Expression.Lambda(
        Expression.Call(par, pi.GetSetMethod(),
            Expression.Convert(Expression.Constant(null), typeof(FKV))), par);
    var cmaker = maker.Compile() as Action<FK>;
    return cmaker;
}
Usage:
IEnumerable<Tag> tags = ...;
Image e = ...;
e.ImageTags.UpdateReferences(x => x.Tag, tags);
I am using this code to retrieve global option sets:
var request = new RetrieveOptionSetRequest { Name = "OptionsetNameGoesHere" };
var retrieveOptionSetResponse = (RetrieveOptionSetResponse)DynamicsHandler._serviceProxy.Execute(request);
var retrievedOptionSetMetadata = (OptionSetMetadata)retrieveOptionSetResponse.OptionSetMetadata;
var optionList = retrievedOptionSetMetadata.Options.ToArray();

foreach (var optionMetadata in optionList)
{
    Printout(optionMetadata.Label.LocalizedLabels[0].Label + "\n");
}
But how do I retrieve option sets like AccountCategory (AccountCategoryCode), so that I can bind them to a ComboBox?
You should get it with a RetrieveAttributeRequest. It will return a RetrieveAttributeResponse, which contains the property AttributeMetadata.
In your case that metadata should be a PicklistAttributeMetadata, whose OptionSet property holds the OptionSetMetadata you are looking for.
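A minimal sketch of that request, assuming the same _serviceProxy as in the question and the account entity's AccountCategoryCode attribute (the names here are illustrative):

// Hedged sketch: retrieve the local option set of account.accountcategorycode.
var attributeRequest = new RetrieveAttributeRequest
{
    EntityLogicalName = "account",
    LogicalName = "accountcategorycode",
    RetrieveAsIfPublished = true
};

var attributeResponse = (RetrieveAttributeResponse)DynamicsHandler._serviceProxy.Execute(attributeRequest);
var picklistMetadata = (PicklistAttributeMetadata)attributeResponse.AttributeMetadata;

foreach (var option in picklistMetadata.OptionSet.Options)
{
    // Bind option.Value / option.Label to the ComboBox as needed.
    Printout(option.Label.LocalizedLabels[0].Label + "\n");
}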
This is how I have solved this problem.
CRMBase is my base class with the connection to the CRM instance. Code language: C#.
public static Dictionary<int, string> GetAll(CRMBase conn, string entityName, string attributeName)
{
    OptionMetadataCollection result = RetrieveOptionSetMetaDataCollection(conn, entityName, attributeName);
    return result.Where(r => r.Value.HasValue).ToDictionary(r => r.Value.Value, r => r.Label.UserLocalizedLabel.Label);
}

// Method to retrieve the OptionSet options metadata collection.
private static OptionMetadataCollection RetrieveOptionSetMetaDataCollection(CRMBase conn, string prmEntityName, string prmAttributeName)
{
    RetrieveEntityRequest retrieveEntityRequest = new RetrieveEntityRequest();
    retrieveEntityRequest.LogicalName = prmEntityName;
    retrieveEntityRequest.EntityFilters = Microsoft.Xrm.Sdk.Metadata.EntityFilters.Attributes;

    RetrieveEntityResponse retrieveEntityResponse = (RetrieveEntityResponse)conn._orgContext.Execute(retrieveEntityRequest);

    return (from attributeMetadata in retrieveEntityResponse.EntityMetadata.Attributes
            where attributeMetadata.AttributeType == AttributeTypeCode.Picklist
                  && attributeMetadata.LogicalName == prmAttributeName
            select ((PicklistAttributeMetadata)attributeMetadata).OptionSet.Options).FirstOrDefault();
}