Mapping row data to MyObject - C#

I have my classes as the following:
class SalesInvoice
{
    public int id { get; set; }
    public string number { get; set; }
    public List<InvoiceItems> items { get; set; }
}
class InvoiceItems
{
    public int id { get; set; }
    public string item_name { get; set; }
    public int price { get; set; }
}
My application is an agent that can connect to any database specified in the config file and execute a certain query. In my case it will execute a query like the following on one client DB:
select id, number, items.id as items_id, item_name, price
from transaction
left join transaction_details on transactions.id = transaction_details.transaction_id
Let's say I got the data using a SqlDataReader, but I am open to other solutions as well.
What I am looking for here is to map the results from the SqlDataReader to a list of SalesInvoice objects.
The issue I am facing is that if a transaction has a list of transaction_details, the data reader returns them to me as separate rows.

You have a choice to make: you can return the data in one result set or two. The first will return the transaction data duplicated (see below), and you will need to manage the duplicate transaction rows coming out of the data reader.
select id,
       number,
       items.id as items_id,
       item_name, price
from transaction
left join transaction_details
on transactions.id = transaction_details.transaction_id
Or return the data in two result sets (transaction and transaction detail). I am assuming that you are interested in data for a particular transaction, not many different transactions. ADO.NET allows multiple result sets to be returned from one command. You would need to map each result set to its object type; it is simply a case of newing up a transaction and assigning its properties.
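For illustration, here is a minimal sketch of the first option (not from the original answer): reading the flat join result and grouping the duplicated transaction rows into one SalesInvoice per id. It assumes the corrected classes from the question and a SqlCommand named command running the query above on an open connection.
var invoices = new Dictionary<int, SalesInvoice>();
using (SqlDataReader reader = command.ExecuteReader())
{
    while (reader.Read())
    {
        int id = Convert.ToInt32(reader["id"]);

        // reuse the invoice if this transaction id was already seen in a previous row
        SalesInvoice invoice;
        if (!invoices.TryGetValue(id, out invoice))
        {
            invoice = new SalesInvoice
            {
                id = id,
                number = reader["number"].ToString(),
                items = new List<InvoiceItems>()
            };
            invoices.Add(id, invoice);
        }

        // left join: the detail columns are NULL when a transaction has no details
        if (reader["items_id"] != DBNull.Value)
        {
            invoice.items.Add(new InvoiceItems
            {
                id = Convert.ToInt32(reader["items_id"]),
                item_name = reader["item_name"].ToString(),
                price = Convert.ToInt32(reader["price"])
            });
        }
    }
}
List<SalesInvoice> result = invoices.Values.ToList();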

To solve this issue, I have made my semi-general mapper.
I have a mapper that does the following:
private List<T> mappingFun<T>(IEnumerable<Dictionary<string, object>> args, object propListObj = null) where T : Entity
{
    List<T> listObject = new List<T>();
    foreach (var arg in args)
    {
        T returnedObject = Activator.CreateInstance<T>();
        PropertyInfo[] modelProperties = returnedObject.GetType().GetProperties();

        // reuse an already-mapped object if a row with the same "<TypeName>_name" value was seen before
        var addedobject = listObject.FirstOrDefault(x => x.name == arg[returnedObject.GetType().Name + "_name"].ToString());
        if (addedobject != null)
        {
            returnedObject = addedobject;
        }

        foreach (var prop in modelProperties)
        {
            if (prop.PropertyType == typeof(String))
            {
                // simple property: read the "<TypeName>_<property>" key from the row dictionary
                prop.SetValue(returnedObject, arg[returnedObject.GetType().Name + "_" + prop.Name].ToString());
            }
            else
            {
                // collection property: recurse into mappingFun for the element type of the list
                var propModuleObj = GetType()
                    .GetMethod("mappingFun", BindingFlags.Instance | BindingFlags.NonPublic, null,
                        new Type[] { typeof(IEnumerable<Dictionary<string, object>>), typeof(object) }, null)
                    .MakeGenericMethod(prop.PropertyType.GetGenericArguments().Single())
                    .Invoke(this, new object[] { new List<Dictionary<string, object>>() { arg }, prop.GetValue(returnedObject) });
                prop.SetValue(returnedObject, propModuleObj);
            }
        }
        listObject.AddIfNotExist(returnedObject);
        if (propListObj != null)
            listObject.AddRange((List<T>)propListObj);
    }
    return listObject;
}
This helps 100%
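As a usage note (not part of the original post), here is a minimal sketch of feeding the mapper from a SqlDataReader, called from within the class that defines mappingFun. The mapper looks up dictionary keys of the form "<TypeName>_<property>" (e.g. "SalesInvoice_number"), so the query would need to alias its columns to match, and SalesInvoice is assumed to derive from the Entity base type the mapper requires.
// Sketch: materialize each data-reader row into the dictionary shape the mapper expects.
var rows = new List<Dictionary<string, object>>();
using (SqlDataReader reader = command.ExecuteReader())
{
    while (reader.Read())
    {
        var row = new Dictionary<string, object>();
        for (int i = 0; i < reader.FieldCount; i++)
            row[reader.GetName(i)] = reader.GetValue(i);   // key = column alias, e.g. "SalesInvoice_number"
        rows.Add(row);
    }
}
var invoices = mappingFun<SalesInvoice>(rows);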

Related

Joining datatables dynamic

I'm struggling with the following issue:
I have a DataTable that contains information retrieved from a SQL Server database. The information retrieved is the price of certain products. So, for example, the DataTable has the price of a specific product on a specific price list. One product may be present in several different price lists, and today each combination is a different row in the DataTable. For example:
I need to transform the DataTable into one that has only one row per product, with the price list information in columns:
EXAMPLE DATA:
A few notes:
I don't know how many price lists I will have, so I could have the column "cost" (for example) N times.
The product information (first part) needs to be included.
No more than one row per id_Art.
I have been working on the following, but I wanted to step back because I might be going down a rabbit hole and there might be an easier solution.
Currently, I'm creating new DataTables filtered by each price list. The idea behind this was to join the DataTables, but I got stuck.
foreach (var pricelist in pricelists)
{
    DataSet dataSetsingle = new DataSet();
    dataSetsingle = GetDataSetForMultiTariff(tarifa, dpto, secc, fam, subfam); //This will return a datatable filtered by pricelist
    System.Data.DataTable dtnew = dataSetsingle.Tables[0];
    var results = from table1 in dtinitial.AsEnumerable()
                  join table2 in dtnew.AsEnumerable()
                  on new { A = table1["ID_ART"], B = table1["ID_FORMAT"] } equals new { A = table2["ID_ART"], B = table2["ID_FORMAT"] }
                  select new
                  {
                      table1,
                      table2
                  };
}
Should I keep moving forward with this approach? I don't get a flattened result the way I'm doing it, and I'm not sure how to solve that.
I have access to the database, so I could potentially change the query.
Could pivot tables work?
Thanks a lot!
If you want to continue with the LINQ join over DataTables, this extension method can help, but I think you would be better off with a pivot. Unfortunately I can't tell from your question what you want to pivot, but I do have a pivot method for a DataTable as well.
public static class DataTableExt {
    // ***
    // *** T Extensions
    // ***
    public static IEnumerable<T> AsSingleton<T>(this T first) {
        yield return first;
    }

    // ***
    // *** MemberInfo Extensions
    // ***
    public static Type GetMemberType(this MemberInfo member) {
        switch (member) {
            case FieldInfo mfi:
                return mfi.FieldType;
            case PropertyInfo mpi:
                return mpi.PropertyType;
            case EventInfo mei:
                return mei.EventHandlerType;
            default:
                throw new ArgumentException("MemberInfo must be of type FieldInfo, PropertyInfo or EventInfo", nameof(member));
        }
    }

    public static object GetValue(this MemberInfo member, object srcObject) {
        switch (member) {
            case FieldInfo mfi:
                return mfi.GetValue(srcObject);
            case PropertyInfo mpi:
                return mpi.GetValue(srcObject);
            case MethodInfo mi:
                return mi.Invoke(srcObject, null);
            default:
                throw new ArgumentException("MemberInfo must be of type FieldInfo, PropertyInfo or MethodInfo", nameof(member));
        }
    }

    public static T GetValue<T>(this MemberInfo member, object srcObject) => (T)member.GetValue(srcObject);

    // ***
    // *** Type Extensions
    // ***
    public static List<MemberInfo> GetPropertiesOrFields(this Type t, BindingFlags bf = BindingFlags.Public | BindingFlags.Instance) =>
        t.GetMembers(bf).Where(mi => mi.MemberType == MemberTypes.Field || mi.MemberType == MemberTypes.Property).ToList();

    // ***
    // *** DataTable Extensions
    // ***
    public static IEnumerable<DataColumn> DataColumns(this DataTable aTable) => aTable.Columns.Cast<DataColumn>();
    public static IEnumerable<string> ColumnNames(this DataTable aTable) => aTable.DataColumns().Select(dc => dc.ColumnName);

    // Create new DataTable from LINQ join results on DataTable
    // Expect T to be anonymous object of form new { [DataRow or other] d1, [DataRow or other] d2, ... }
    public static DataTable FlattenToDataTable<T>(this IEnumerable<T> src) {
        var res = new DataTable();
        if (src.Any()) {
            var firstRow = src.First();
            var memberInfos = typeof(T).GetPropertiesOrFields();
            var allDC = memberInfos.SelectMany(mi => (mi.GetMemberType() == typeof(DataRow)) ? mi.GetValue<DataRow>(firstRow).Table.DataColumns() : new DataColumn(mi.Name, mi.GetMemberType()).AsSingleton());
            foreach (var dc in allDC) {
                var newColumnName = dc.ColumnName;
                if (res.ColumnNames().Contains(newColumnName)) {
                    var suffixNumber = 1;
                    while (res.ColumnNames().Contains($"{newColumnName}.{suffixNumber}"))
                        ++suffixNumber;
                    newColumnName = $"{newColumnName}.{suffixNumber}";
                }
                res.Columns.Add(new DataColumn(newColumnName, dc.DataType));
            }
            foreach (var objRows in src)
                res.Rows.Add(memberInfos.SelectMany(mi => (mi.GetMemberType() == typeof(DataRow)) ? mi.GetValue<DataRow>(objRows).ItemArray : mi.GetValue(objRows).AsSingleton()).ToArray());
        }
        return res;
    }
}
Hmmm... didn't realize how many extension methods that code used :)
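Since the question asks whether a pivot could work, here is a rough sketch of pivoting the flat DataTable in memory. This is not the answerer's own pivot method, and the PRICELIST and COST column names are assumptions standing in for whatever the real query returns.
// Pivot: one output row per ID_ART, one COST_<pricelist> column per distinct price list.
static DataTable PivotByPriceList(DataTable source)
{
    var result = new DataTable();
    result.Columns.Add("ID_ART", typeof(object));

    // add one column per distinct price list found in the source rows
    var priceLists = source.AsEnumerable()
                           .Select(r => r["PRICELIST"].ToString())
                           .Distinct()
                           .ToList();
    foreach (var pl in priceLists)
        result.Columns.Add($"COST_{pl}", typeof(object));

    // group the flat rows by product and fill that product's price per list
    foreach (var productGroup in source.AsEnumerable().GroupBy(r => r["ID_ART"]))
    {
        var newRow = result.NewRow();
        newRow["ID_ART"] = productGroup.Key;
        foreach (var row in productGroup)
            newRow[$"COST_{row["PRICELIST"]}"] = row["COST"];
        result.Rows.Add(newRow);
    }
    return result;
}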

DapperExtensions: Add "insert ... update on duplicate key"

Now I'm using Dapper + Dapper.Extensions. And yes, it's easy and awesome. But I have hit a problem: Dapper.Extensions has only an Insert command and no InsertUpdateOnDuplicateKey. I want to add such a method, but I don't see a good way to do it:
I want to make this method generic like Insert
I can't get the cached list of properties for a particular type, because I don't want to use reflection directly to build raw SQL
A possible way is to fork it on GitHub, but I want to keep this in my project only. Does anybody know how to extend it? I understand this feature ("insert ... update on duplicate key") is supported only in MySQL. But I can't find extension points in DapperExtensions to add this functionality from the outside.
Update: this is my fork https://github.com/MaximTkachenko/Dapper-Extensions/commits/master
This piece of code has helped me enormously in MySQL-related projects; I definitely owe you one.
I do a lot of database-related development on both MySQL and MS SQL. I also try to share as much code as possible between my projects.
MS SQL has no direct equivalent for "ON DUPLICATE KEY UPDATE", so I was previously unable to use this extension when working with MS SQL.
While migrating a web application (that leans heavily on this Dapper.Extensions tweak) from MySQL to MS SQL, I finally decided to do something about it.
This code uses the "IF EXISTS => UPDATE ELSE INSERT" approach, which basically does the same as "ON DUPLICATE KEY UPDATE" on MySQL.
Please note: the snippet assumes that you are taking care of transactions outside this method. Alternatively you could append "BEGIN TRAN" to the beginning and "COMMIT" to the end of the generated SQL string.
public static class SqlGeneratorExt
{
    public static string InsertUpdateOnDuplicateKey(this ISqlGenerator generator, IClassMapper classMap, bool hasIdentityKeyWithValue = false)
    {
        var columns = classMap.Properties.Where(p => !(p.Ignored || p.IsReadOnly || (p.KeyType == KeyType.Identity && !hasIdentityKeyWithValue))).ToList();
        var keys = columns.Where(c => c.KeyType != KeyType.NotAKey).Select(p => $"{generator.GetColumnName(classMap, p, false)}=@{p.Name}");
        var nonkeycolumns = classMap.Properties.Where(p => !(p.Ignored || p.IsReadOnly) && p.KeyType == KeyType.NotAKey).ToList();
        if (!columns.Any())
        {
            throw new ArgumentException("No columns were mapped.");
        }
        var tablename = generator.GetTableName(classMap);
        var columnNames = columns.Select(p => generator.GetColumnName(classMap, p, false));
        var parameters = columns.Select(p => generator.Configuration.Dialect.ParameterPrefix + p.Name);
        var valuesSetters = nonkeycolumns.Select(p => $"{generator.GetColumnName(classMap, p, false)}=@{p.Name}").ToList();
        var where = keys.AppendStrings(seperator: " and ");
        var sqlbuilder = new StringBuilder();
        sqlbuilder.AppendLine($"IF EXISTS (select * from {tablename} WITH (UPDLOCK, HOLDLOCK) WHERE ({where})) ");
        sqlbuilder.AppendLine(valuesSetters.Any() ? $"UPDATE {tablename} SET {valuesSetters.AppendStrings()} WHERE ({where}) " : "SELECT 0 ");
        sqlbuilder.AppendLine($"ELSE INSERT INTO {tablename} ({columnNames.AppendStrings()}) VALUES ({parameters.AppendStrings()}) ");
        return sqlbuilder.ToString();
    }
}
Actually, I closed my pull request and removed my fork because:
I see some open pull requests created in 2014
I found a way to "inject" my code into Dapper.Extensions.
To restate my problem: I want to create more generic queries for Dapper.Extensions. That means I need access to the mapping cache for entities, the SqlGenerator, etc. So here is my way. I want to add the ability to do INSERT ... UPDATE ON DUPLICATE KEY for MySQL. I created an extension method for ISqlGenerator:
public static class SqlGeneratorExt
{
    public static string InsertUpdateOnDuplicateKey(this ISqlGenerator generator, IClassMapper classMap)
    {
        var columns = classMap.Properties.Where(p => !(p.Ignored || p.IsReadOnly || p.KeyType == KeyType.Identity));
        if (!columns.Any())
        {
            throw new ArgumentException("No columns were mapped.");
        }
        var columnNames = columns.Select(p => generator.GetColumnName(classMap, p, false));
        var parameters = columns.Select(p => generator.Configuration.Dialect.ParameterPrefix + p.Name);
        var valuesSetters = columns.Select(p => string.Format("{0}=VALUES({1})", generator.GetColumnName(classMap, p, false), p.Name));
        string sql = string.Format("INSERT INTO {0} ({1}) VALUES ({2}) ON DUPLICATE KEY UPDATE {3}",
            generator.GetTableName(classMap),
            columnNames.AppendStrings(),
            parameters.AppendStrings(),
            valuesSetters.AppendStrings());
        return sql;
    }
}
One more extension method for IDapperImplementor
public static class DapperImplementorExt
{
    public static void InsertUpdateOnDuplicateKey<T>(this IDapperImplementor implementor, IDbConnection connection, IEnumerable<T> entities, int? commandTimeout = null) where T : class
    {
        IClassMapper classMap = implementor.SqlGenerator.Configuration.GetMap<T>();
        var properties = classMap.Properties.Where(p => p.KeyType != KeyType.NotAKey);
        string emptyGuidString = Guid.Empty.ToString();
        foreach (var e in entities)
        {
            foreach (var column in properties)
            {
                if (column.KeyType == KeyType.Guid)
                {
                    object value = column.PropertyInfo.GetValue(e, null);
                    string stringValue = value.ToString();
                    if (!string.IsNullOrEmpty(stringValue) && stringValue != emptyGuidString)
                    {
                        continue;
                    }
                    Guid comb = implementor.SqlGenerator.Configuration.GetNextGuid();
                    column.PropertyInfo.SetValue(e, comb, null);
                }
            }
        }
        string sql = implementor.SqlGenerator.InsertUpdateOnDuplicateKey(classMap);
        connection.Execute(sql, entities, null, commandTimeout, CommandType.Text);
    }
}
Now I can create a new class derived from the Database class to use my own SQL:
public class Db : Database
{
    private readonly IDapperImplementor _dapperIml;

    public Db(IDbConnection connection, ISqlGenerator sqlGenerator) : base(connection, sqlGenerator)
    {
        _dapperIml = new DapperImplementor(sqlGenerator);
    }

    public void InsertUpdateOnDuplicateKey<T>(IEnumerable<T> entities, int? commandTimeout) where T : class
    {
        _dapperIml.InsertUpdateOnDuplicateKey(Connection, entities, commandTimeout);
    }
}
Yeah, it's necessary to create another DapperImplementor instance because the DapperImplementor instance from the base class is private :(. So now I can use my Db class to call my own generic SQL queries and the native queries from Dapper.Extensions. Examples of using the Database class instead of the IDbConnection extensions can be found here.
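For reference, a rough usage sketch of the Db class above; the connection string, the entity list, and the DapperExtensions configuration/dialect choices are placeholders to adapt to your own project.
// Hypothetical wiring: build a generator for MySQL and call the custom upsert.
var config = new DapperExtensionsConfiguration(typeof(AutoClassMapper<>), new List<Assembly>(), new MySqlDialect());
var generator = new SqlGeneratorImpl(config);
using (IDbConnection connection = new MySqlConnection(connectionString))
{
    connection.Open();
    var db = new Db(connection, generator);
    db.InsertUpdateOnDuplicateKey(myEntities, commandTimeout: null);
}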

How to bulk insert generic source array into arbitrary SQL table

My application uses Entity Framework. Most days I feel fine about that, but I have trouble when I need to do bulk insertions. I've yet to find a way for EF to do these quickly. As such, I need a solution that goes outside of EF to do the updates.
I want a method that takes a connection string, destination table name, and a generic array of source data, and performs the bulk insertions. Moreover, I'd like it to map the properties of the source data to the specific table fields, ideally without requiring attributes in the source object to designate the table field.
Thus, for this object:
public class Customer
{
//property populated & used only in memory
public string TempProperty { get; set; }
//properties saved in the database
public string Name { get; set; }
public string Address { get; set; }
public int Age { get; set; }
public string Comments { get; set; }
//various methods, constructors, etc.
}
I should be able to provide the table name Data.Customers, and the method should map Customer.Name -> Data.Customers.Name, Customer.Address -> Data.Customers.Address, etc.
I was surprised I couldn't find the code for something like this anywhere. Maybe this isn't the best approach to the problem? That possibility notwithstanding, here's the solution I came up with.
My starting outline was something like this:
Use the destination table and source object type to create mapping object using reflection.
Iterate through the source objects, applying the mapper, to generate the insertion data.
Insert the data into the destination table using the System.Data.SqlClient.SqlBulkCopy object.
When I see an outline like this, I try to rephrase it in terms of data types, because really all I'm doing is translating my input (T[]) into something that SqlBulkCopy will accept. But to do this, I need to decide on how to map fields.
Here's what I settled on:
var mapper = new List<Tuple<string, Func<object, string, object>>>();
I chose this to represent a table that looks like this:
+------------+----------------------------------------------+
| Field Name | Mapping Function |
+------------+----------------------------------------------+
| Name | Customer.Name -> Data.Customers.Name |
| Address | Customer.Address -> Data.Customers.Address |
| Age | Customer.Age -> Data.Customers.Age |
| Comments | Customer.Comments -> Data.Customers.Comments |
+------------+----------------------------------------------+
The fact that it's a list represents the rows. That leaves us with
Tuple<string, Func<object, string, object>>. This creates a sort of Dictionary (but indexed) where a given string (the field name) is mapped to a function that, when given a source object T and a source field (e.g. Address), will retrieve the appropriate value. If no corresponding property is found for a table field, we'll just return null.
After validating that the input values are valid (connection valid, table exists, etc.), we'll create our mapping object:
//get all the column names for the table to build mapping object
SqlCommand command = new SqlCommand($"SELECT TOP 1 * FROM {foundTableName}", conn);
SqlDataReader reader = command.ExecuteReader();

//build mapping object by iterating through rows and verifying that there is a match in the table
var mapper = new List<Tuple<string, Func<object, string, object>>>();
foreach (DataRow col in reader.GetSchemaTable().Rows)
{
    //get column information
    string columnName = col.Field<string>("ColumnName");
    PropertyInfo property = typeof(T).GetProperty(columnName);
    Func<object, string, object> map;
    if (property == null)
    {
        //check if it's nullable and exit if not
        bool nullable = col.Field<bool>("Is_Nullable");
        if (!nullable)
            return $"No corresponding property found for Non-nullable field '{columnName}'.";
        //if it's nullable, create mapping function
        map = new Func<object, string, object>((a, b) => null);
    }
    else
        map = new Func<object, string, object>((src, fld) => typeof(T).GetProperty(fld).GetValue(src));
    //add mapping object
    mapper.Add(new Tuple<string, Func<object, string, object>>(columnName, map));
}
The SqlBulkCopy object accepts DataRow[] as an input, so from here we can simply create template DataRow objects from the destination table and fill them from our mapper:
//get all the data
int dataCount = sourceData.Count();
var rows = new DataRow[dataCount];
DataTable destTableDT = new DataTable();
destTableDT.Load(reader);
for (int x = 0; x < dataCount; x++)
{
    var dataRow = destTableDT.NewRow();
    dataRow.ItemArray = mapper.Select(m => m.Item2.Invoke(sourceData[x], m.Item1)).ToArray();
    rows[x] = dataRow;
}
Then finish it out by writing the data:
//set up the bulk copy connection
SqlBulkCopy sbc = new SqlBulkCopy(conn, SqlBulkCopyOptions.TableLock |
SqlBulkCopyOptions.UseInternalTransaction, null);
sbc.DestinationTableName = foundTableName;
sbc.BatchSize = BATCH_SIZE;
sbc.WriteToServer(rows);
And that's it! Works like a charm, and ran too fast for me to bother benchmarking (EF was taking several minutes to run these imports).
I should note that I left out a lot of my error checking code and will probably be adding more. However, if you'd like to see my class in its entirety, here it is:
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.Common;
using System.Data.SqlClient;
using System.Linq;
using System.Reflection;
namespace DatabaseUtilities
{
public static class RapidDataTools
{
const int SCHEMA_SCHEMA_NAME = 1;
const int SCHEMA_TABLE_NAME = 2;
const int BATCH_SIZE = 1000;
/// <summary>
/// Imports an array of data into a specified table. It does so by mapping object properties
/// to table columns. Only properties with the same name as the column name will be copied;
/// other columns will be left null. Non-nullable columns with no corresponding property will
/// throw an error.
/// </summary>
/// <param name="connectionString"></param>
/// <param name="destTableName">Qualified table name (e.g. Admin.Table)</param>
/// <param name="sourceData"></param>
/// <returns></returns>
public static string Import<T>(string connectionString, string destTableName, T[] sourceData)
{
//get destination table qualified name
string[] tableParts = destTableName.Split('.');
if (tableParts.Count() != 2) return $"Invalid or unqualified destination table name: {destTableName}.";
string destSchema = tableParts[0];
string destTable = tableParts[1];
//create the database connection
SqlConnection conn = GetConnection(connectionString);
if (conn == null) return "Invalid connection string.";
//establish connection
try { conn.Open(); }
catch { return "Could not connect to database using provided connection string."; }
//make sure the requested table exists
string foundTableName = string.Empty;
foreach (DataRow row in conn.GetSchema("Tables").Rows)
if (row[SCHEMA_SCHEMA_NAME].ToString().Equals(destSchema, StringComparison.CurrentCultureIgnoreCase) &&
row[SCHEMA_TABLE_NAME].ToString().Equals(destTable, StringComparison.CurrentCultureIgnoreCase))
{
foundTableName = $"{row[SCHEMA_SCHEMA_NAME]}.{row[SCHEMA_TABLE_NAME]}";
break;
}
if (foundTableName == string.Empty) return $"Specified table '{destTableName}' could not be found in the database.";
//get all the column names for the table to build mapping object
SqlCommand command = new SqlCommand($"SELECT TOP 1 * FROM {foundTableName}", conn);
SqlDataReader reader = command.ExecuteReader();
//build mapping object by iterating through rows and verifying that there is a match in the table
var mapper = new List<Tuple<string, Func<object, string, object>>>();
foreach (DataRow col in reader.GetSchemaTable().Rows)
{
//get column information
string columnName = col.Field<string>("ColumnName");
PropertyInfo property = typeof(T).GetProperty(columnName);
Func<object, string, object> map;
if (property == null)
{
//check if it's nullable and exit if not
bool nullable = col.Field<bool>("Is_Nullable");
if (!nullable)
return $"No corresponding property found for Non-nullable field '{columnName}'.";
//if it's nullable, create mapping function
map = new Func<object, string, object>((a, b) => null);
}
else
map = new Func<object, string, object>((src, fld) => typeof(T).GetProperty(fld).GetValue(src));
//add mapping object
mapper.Add(new Tuple<string, Func<object, string, object>>(columnName, map));
}
//get all the data
int dataCount = sourceData.Count();
var rows = new DataRow[dataCount];
DataTable destTableDT = new DataTable();
destTableDT.Load(reader);
for (int x = 0; x < dataCount; x++)
{
var dataRow = destTableDT.NewRow();
dataRow.ItemArray = mapper.Select(m => m.Item2.Invoke(sourceData[x], m.Item1)).ToArray();
rows[x] = dataRow;
}
//close the old connection
conn.Close();
//set up the bulk copy connection
SqlBulkCopy sbc = new SqlBulkCopy(conn, SqlBulkCopyOptions.TableLock | SqlBulkCopyOptions.UseInternalTransaction, null);
sbc.DestinationTableName = foundTableName;
sbc.BatchSize = BATCH_SIZE;
//establish connection
try { conn.Open(); }
catch { return "Failed to re-established connection to the database after reading data."; }
//write data
try { sbc.WriteToServer(rows); }
catch (Exception ex) { return $"Batch write failed. Details: {ex.Message} - {ex.StackTrace}"; }
//if we got here, everything worked!
return string.Empty;
}
private static SqlConnection GetConnection(string connectionString)
{
DbConnectionStringBuilder csb = new DbConnectionStringBuilder();
try { csb.ConnectionString = connectionString; }
catch { return null; }
return new SqlConnection(csb.ConnectionString);
}
}
}
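For completeness, a small usage sketch; the connection string and the LoadCustomersFromSomewhere helper are placeholders. The method returns an empty string on success or an error message otherwise.
// Hypothetical call site for the Import method above.
Customer[] customers = LoadCustomersFromSomewhere();
string error = RapidDataTools.Import(connectionString, "Data.Customers", customers);
if (error != string.Empty)
    Console.WriteLine($"Bulk import failed: {error}");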
Take a look at the NuGet package "EntityFramework.Utilities"; you can add it to your project. It has features for this. You'll have to do some recoding, but this will probably benefit you in the end.
References:
https://www.nuget.org/packages/EFUtilities/ (package)
https://github.com/MikaelEliasson/EntityFramework.Utilities (source +
examples)
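Based on the project's README, usage looks roughly like the following; MyDbContext, the Customers set, and customersToInsert are placeholders, so verify against the version you install.
// EFBatchOperation performs the insert in bulk instead of per-entity SaveChanges.
using (var ctx = new MyDbContext())
{
    EFBatchOperation.For(ctx, ctx.Customers).InsertAll(customersToInsert);
}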
There are basically 3 approaches for performing bulk insert in Entity Framework:
Entity Framework Extensions (Recommended)
EntityFramework.BulkInsert (Limited and unsupported)
SqlBulkCopy (Complex to make it work)
The first 2 libraries add extension methods to the DbContext:
using (var ctx = new EntitiesContext())
{
ctx.BulkInsert(list);
}
You can learn more about these three techniques in this article
Disclaimer: I'm the owner of the project Entity Framework Extensions

How to use LINQ to CRM to populate Entity Readonly Fields (ModifiedOn, Createdby, etc)?

I'm trying to run this simple query:
var appt = (from a in context.AppointmentSet
            select new Appointment { ModifiedOn = a.ModifiedOn }).First();
but I'm getting a compiler exception since ModifiedOn is readonly.
I could just return a, but then all the attributes of the Appointment entity will be returned, not just the ModifiedOn.
I could return new { a.ModifiedOn }, but then appt would be an AnonymousType and not an Appointment.
What's the suggested way to make this work?
Note: this is just an example; assume that I'm returning more than a single property from Appointment and that there is a where clause of some sort.
I always do it like that:
var appt = (from a in context.AppointmentSet
            select new Appointment
            {
                Attributes =
                {
                    { "modifiedon", a.ModifiedOn },
                    { "createdon", a.CreatedOn },
                    { "createdby", a.CreatedBy }
                }
            }).First();
It does not require any extra coding; I'm really surprised nobody has posted that here yet.
This is the most efficient way I can think of:
Create a new constructor that accepts an AnonymousType (Object)
public Appointment(object anonymousType) : this()
{
    foreach (var p in anonymousType.GetType().GetProperties())
    {
        var value = p.GetValue(anonymousType);
        if (p.PropertyType == typeof(Guid))
        {
            // Type is Guid, must be Id
            base.Id = (Guid)value;
            Attributes["opportunityid"] = base.Id;
        }
        else if (p.Name == "FormattedValues")
        {
            // Add Support for FormattedValues
            FormattedValues.AddRange((FormattedValueCollection)value);
        }
        else
        {
            Attributes[p.Name.ToLower()] = value;
        }
    }
}
Then call it like so:
var appt = (from a in context.AppointmentSet
select new Appointment(new { a.ModifiedOn })).First();
It has to reflect over all the properties of the anonymous type, but it should work and has the added benefit of not having to rewrite the property names each time.

How can I use reflection to write to the appropriate properties on the entity?

If you don't want to manually write code that writes to ColumnA, ColumnB, etc., then you could use reflection to write to the appropriate properties on the entity. But how?
You can create a new instance of this class and set its property values. These property values are mapped to columns in the SQL database table. You then pass this object to the DataContext class generated by LINQ to SQL to add a new row to the table in the database.
So, you would do something like this:
For a database table with columns "ColumnA", "ColumnB" and "ColumnC"
var myEntity = new EntityObject { ColumnA = "valueA", ColumnB = "valueB", ColumnC = "valueC" };
DataContext.InsertOnSubmit(myEntity);
DataContext.SubmitChanges();
This will insert a new row into the database with the column values specified.
Now if you don't want to manually write code to write to ColumnA, ColumnB, etc, then you could use reflection to write to the appropriate properties on the entity:
For example, with entity instance 'myEntity':
var properties = myEntity.GetType().GetProperties();
// ld is assumed to be a dictionary of column names to the values you want to set
foreach (string ky in ld.Keys)
{
    var matchingProperty = properties.Where(p => p.Name.Equals(ky)).FirstOrDefault();
    if (matchingProperty != null)
    {
        matchingProperty.SetValue(myEntity, ld[ky], null);
    }
}
I tried this but I cannot make it work. How can I do it?
Check this article: LINQ to SQL: All common operations (Insert, Update, Delete, Get) in one base class
The following might help you:
protected virtual void Update(T entity, Expression<Func<T, bool>> query)
{
    using (DC db = new DC())
    {
        object propertyValue = null;
        T entityFromDB = db.GetTable<T>().Where(query).SingleOrDefault();
        if (null == entityFromDB)
            throw new NullReferenceException("Query Supplied to Get entity from DB is invalid, NULL value returned");
        PropertyInfo[] properties = entityFromDB.GetType().GetProperties();
        foreach (PropertyInfo property in properties)
        {
            propertyValue = null;
            if (null != property.GetSetMethod())
            {
                PropertyInfo entityProperty = entity.GetType().GetProperty(property.Name);
                if (entityProperty.PropertyType.BaseType == Type.GetType("System.ValueType") ||
                    entityProperty.PropertyType == Type.GetType("System.String"))
                    propertyValue = entity.GetType().GetProperty(property.Name).GetValue(entity, null);
                if (null != propertyValue)
                    property.SetValue(entityFromDB, propertyValue, null);
            }
        }
        db.SubmitChanges();
    }
}
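For illustration, a hypothetical call from inside a repository derived from the base class above, assuming a Customer entity with Id and Name columns in the LINQ to SQL model:
// Copies the value-type/string properties of 'changed' onto the row matched by the query.
public void UpdateCustomerName(int customerId, string newName)
{
    var changed = new Customer { Id = customerId, Name = newName };
    Update(changed, c => c.Id == customerId);
}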
