Let's say I have a complex .NET class, with lots of arrays and other class object members. I need to be able to generate a deep clone of this object - so I write a Clone() method, and implement it with a simple BinaryFormatter serialize/deserialize - or perhaps I do the deep clone using some other technique which is more error prone and I'd like to make sure is tested.
OK, so now (ok, I should have done it first) I'd like to write tests which cover the cloning. All the members of the class are private, and my architecture is so good (!) that I haven't needed to write hundreds of public properties or other accessors. The class isn't IComparable or IEquatable, because that's not needed by the application. My unit tests are in a separate assembly to the production code.
What approaches do people take to testing that the cloned object is a good copy? Do you write (or rewrite once you discover the need for the clone) all your unit tests for the class so that they can be invoked with either a 'virgin' object or with a clone of it? How would you test if part of the cloning wasn't deep enough - as this is just the kind of problem which can give hideous-to-find bugs later?
Your method of testing will depend on the type of solution you come up with. If you write some custom cloning code and have to manually implement it in each cloneable type, then you should really test the cloning of each one of those types. Alternatively, if you decide to go a more generic route (where the aforementioned reflection would likely fit in), your tests would only need to cover the specific scenarios that your cloning system will have to deal with.
To answer your specific questions:
Do you write (or rewrite once you discover the need for the clone) all your unit tests for the class so that they can be invoked with either a 'virgin' object or with a clone of it?
You should have tests for all the methods that can be performed on both the original and cloned objects. Note that it should be pretty easy to set up a simple test design to support this without manually updating the logic for each test.
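One sketch of such a design, using NUnit's TestCaseSource (ComplexThing, BuildComplexThing and Clone are placeholders for your own types, not anything from the question):
using System.Collections.Generic;
using NUnit.Framework;
[TestFixture]
public class ComplexThingTests
{
    // Feed every behavioural test both a fresh object and its clone,
    // so the same assertions cover cloning for free.
    static IEnumerable<ComplexThing> Subjects()
    {
        yield return BuildComplexThing();           // the 'virgin' object
        yield return BuildComplexThing().Clone();   // its deep clone
    }
    [TestCaseSource(nameof(Subjects))]
    public void Behaviour_is_identical_for_original_and_clone(ComplexThing subject)
    {
        // ...the existing behavioural assertions run unchanged against either instance
    }
}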
How would you test if part of the cloning wasn't deep enough - as this is just the kind of problem which can give hideous-to-find bugs later?
It depends on the cloning method you choose. If you have to manually update the cloneable types then you should test that each type is cloning all (and only) the members you expect. Whereas, if you are testing a cloning framework I would create some test cloneable types to test each scenario you need to support.
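For example, a hedged sketch of a test-only probe type and test (DeepCloner and the NUnit asserts stand in for whatever cloning mechanism and test framework you actually use):
// A nested reference lets the test assert that the copy is deep, not shallow.
class CloneProbe
{
    public int[] Numbers = { 1, 2, 3 };
    public CloneProbe Child;
}
[Test]
public void Clone_copies_nested_references()
{
    var original = new CloneProbe { Child = new CloneProbe() };
    var clone = (CloneProbe)DeepCloner.Clone(original);   // DeepCloner is hypothetical
    Assert.AreNotSame(original.Child, clone.Child);       // nested object was copied, not shared
    Assert.AreNotSame(original.Numbers, clone.Numbers);   // array instance was copied too
    Assert.AreEqual(original.Numbers, clone.Numbers);     // with the same contents
}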
There's a really obvious solution that doesn't take nearly as much work:
Serialize the object into a binary format.
Clone the object.
Serialize the clone into a binary format.
Compare the bytes.
Assuming that serialization works - and it had better, because you are using it to clone - this should be easy to maintain. In fact, it will be completely insulated from changes to the structure of your class.
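A minimal sketch of that test, assuming NUnit-style asserts (and note that BinaryFormatter is obsolete in current .NET, so treat it as illustrative):
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;
using NUnit.Framework;
public static class CloneTestHelper
{
    private static byte[] Serialize(object obj)
    {
        var formatter = new BinaryFormatter();
        using (var stream = new MemoryStream())
        {
            formatter.Serialize(stream, obj);
            return stream.ToArray();
        }
    }
    public static void AssertCloneMatchesOriginal(object original, object clone)
    {
        // A faithful deep clone serializes to exactly the same bytes as the original.
        CollectionAssert.AreEqual(Serialize(original), Serialize(clone));
    }
}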
I'd just write a single test to determine if the clone was correct or not. If the class isn't sealed, you can create a harness for it by extending it, and then exposing all your internals within the child class. Alternatively, you could use reflection (yech), or use MSTest's Accessor generators.
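For the harness route, a rough sketch, assuming the members you need are protected or internal (a subclass cannot see private members); ComplexThing and its members are hypothetical:
class ComplexThingHarness : ComplexThing
{
    // Surface the state the clone test needs to inspect.
    public int[] ExposedBuffer { get { return Buffer; } }     // Buffer assumed protected
    public ChildThing ExposedChild { get { return Child; } }  // Child assumed protected
}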
You need to clone your object and then go through every single property and variable that your object has and determine if it was copied correctly or cloned correctly.
I like to write unit tests that use one of the builtin serializers on the original and the cloned object and then check the serialized representations for equality (for a binary formatter, I can just compare the byte arrays). This works great in cases where the object is still serializable, and I'm only changing to a custom deep clone for perf reasons.
Furthermore, I like to add a debug mode check to all of my Clone implementations using something like this
[Conditional("DEBUG")]
public static void DebugAssertValueEquality<T>(T current, T other, bool expected,
params string[] ignoredFields) {
if (null == current)
{ throw new ArgumentNullException("current"); }
if (null == ignoredFields)
{ ignoredFields = new string[] { }; }
FieldInfo lastField = null;
bool test;
if (object.ReferenceEquals(other, null))
{ Debug.Assert(false == expected, "The other object was null"); return; }
test = true;
foreach (FieldInfo fi in current.GetType().GetFields(BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic)) {
if (!test) { break; }
if (0 <= Array.IndexOf<string>(ignoredFields, fi.Name))
{ continue; }
lastField = fi;
object leftValue = fi.GetValue(current);
object rightValue = fi.GetValue(other);
if (object.ReferenceEquals(null, leftValue)) {
if (!object.ReferenceEquals(null, rightValue))
{ test = false; }
}
else if (object.ReferenceEquals(null, rightValue))
{ test = false; }
else {
if (!leftValue.Equals(rightValue))
{ test = false; }
}
}
Debug.Assert(test == expected, string.Format("field: {0}", lastField));
}
This method relies on an accurate implementation of Equals on any nested members, but in my case anything that is cloneable is also equatable.
I would usually implement Equals() for comparing the two objects in depth. You might not need it in your production code but it might still come in handy later and the test code is much cleaner.
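A hand-rolled deep Equals might look like the sketch below (Items and Child are hypothetical members, and System.Linq's SequenceEqual is assumed):
public override bool Equals(object obj)
{
    var other = obj as MyComplexType;
    if (other == null) return false;
    return Items.SequenceEqual(other.Items)   // element-wise, not reference, comparison
        && Equals(Child, other.Child);        // recurses into the nested member's Equals
}
public override int GetHashCode()
{
    return 0;  // placeholder override to silence the compiler warning in this sketch
}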
Here is a sample of how I implemented this a while back, although it will need to be tailored to your scenario. In this case we had a nasty object chain that could easily change, and the clone was used in a very critical prototype implementation, so I had to patch (hack) this test together.
public static class TestDeepClone
{
private static readonly List<long> objectIDs = new List<long>();
private static readonly ObjectIDGenerator objectIdGenerator = new ObjectIDGenerator();
public static bool DefaultCloneExclusionsCheck(Object obj)
{
return
obj is ValueType ||
obj is string ||
obj is Delegate ||
obj is IEnumerable;
}
/// <summary>
/// Executes various assertions to ensure the validity of a deep copy for any object including its compositions
/// </summary>
/// <param name="original">The original object</param>
/// <param name="copy">The cloned object</param>
/// <param name="checkExclude">A predicate for any exclusions to be done, i.e not to expect IPolicy items to be cloned</param>
public static void AssertDeepClone(this Object original, Object copy, Predicate<object> checkExclude)
{
bool isKnown;
if (original == null) return;
if (copy == null) Assert.Fail("Copy is null while original is not", original, copy);
var id = objectIdGenerator.GetId(original, out isKnown); //Avoid checking the same object more than once
if (!objectIDs.Contains(id))
{
objectIDs.Add(id);
}
else
{
return;
}
if (!checkExclude(original))
{
Assert.That(ReferenceEquals(original, copy) == false);
}
Type type = original.GetType();
PropertyInfo[] propertyInfos = type.GetProperties(BindingFlags.NonPublic | BindingFlags.Instance | BindingFlags.Public);
FieldInfo[] fieldInfos = type.GetFields(BindingFlags.NonPublic | BindingFlags.Instance | BindingFlags.Public);
foreach (PropertyInfo memberInfo in propertyInfos)
{
var getmethod = memberInfo.GetGetMethod();
if (getmethod == null) continue;
var originalValue = getmethod.Invoke(original, new object[] { });
var copyValue = getmethod.Invoke(copy, new object[] { });
if (originalValue == null) continue;
if (!checkExclude(originalValue))
{
Assert.That(ReferenceEquals(originalValue, copyValue) == false);
}
if (originalValue is IEnumerable && !(originalValue is string))
{
var originalValueEnumerable = originalValue as IEnumerable;
var copyValueEnumerable = copyValue as IEnumerable;
if (copyValueEnumerable == null) Assert.Fail("Copy is null while original is not", new[] { original, copy });
int count = 0;
List<object> items = copyValueEnumerable.Cast<object>().ToList();
foreach (object o in originalValueEnumerable)
{
AssertDeepClone(o, items[count], checkExclude);
count++;
}
}
else
{
//Recurse over reference types to check deep clone success
if (!checkExclude(originalValue))
{
AssertDeepClone(originalValue, copyValue, checkExclude);
}
if (originalValue is ValueType && !(originalValue is Guid))
{
//check value of non reference type
Assert.That(originalValue.Equals(copyValue));
}
}
}
foreach (FieldInfo fieldInfo in fieldInfos)
{
var originalValue = fieldInfo.GetValue(original);
var copyValue = fieldInfo.GetValue(copy);
if (originalValue == null) continue;
if (!checkExclude(originalValue))
{
Assert.That(ReferenceEquals(originalValue, copyValue) == false);
}
if (originalValue is IEnumerable && !(originalValue is string))
{
var originalValueEnumerable = originalValue as IEnumerable;
var copyValueEnumerable = copyValue as IEnumerable;
if (copyValueEnumerable == null) Assert.Fail("Copy is null while original is not", new[] { original, copy });
int count = 0;
List<object> items = copyValueEnumerable.Cast<object>().ToList();
foreach (object o in originalValueEnumerable)
{
AssertDeepClone(o, items[count], checkExclude);
count++;
}
}
else
{
//Recurse over reference types to check deep clone success
if (!checkExclude(originalValue))
{
AssertDeepClone(originalValue, copyValue, checkExclude);
}
if (originalValue is ValueType && !(originalValue is Guid))
{
//check value of non reference type
Assert.That(originalValue.Equals(copyValue));
}
}
}
}
}
I'm using an extension method to copy properties from one object to another (based on type and name). Below is my current code (inspired by this article).
I added some code to accommodate copying nullable properties to/from their underlying type when needed, but I can't seem to find an elegant way to copy over collections of different types.
For instance, how do I copy a List<T> to an ObservableCollection<T> (where T is the same type in each collection)?
All the collections are always initialized, and it is OK to clear them before copying, so that may help a bit.
Thank you for your help and guidance.
public static void CopyPropertiesFrom(this object? self, object parent){
var fromProperties = parent.GetType().GetProperties();
var toProperties = self.GetType().GetProperties();
foreach (var fromProperty in fromProperties)
foreach (var toProperty in toProperties.Where(x => x.SetMethod != null))
if (fromProperty.Name == toProperty.Name){
if (fromProperty.PropertyType == toProperty.PropertyType){
toProperty.SetValue(self, fromProperty.GetValue(parent));
break;
}
//The code below is to accommodate Nullable types and copy properties based on the underlying type
if (Nullable.GetUnderlyingType(fromProperty.PropertyType) == toProperty.PropertyType){
if (fromProperty.GetValue(parent) == null) continue;
toProperty.SetValue(self, fromProperty.GetValue(parent));
break;
}
if (Nullable.GetUnderlyingType(toProperty.PropertyType) == fromProperty.PropertyType){
toProperty.SetValue(self, fromProperty.GetValue(parent));
break;
}
if (Nullable.GetUnderlyingType(fromProperty.PropertyType) ==
Nullable.GetUnderlyingType(toProperty.PropertyType)){
//This case will be true when from and to are two collections<T>, where T is the same between each collections.
//However I'm not sure how to detect the type of collections and add the necessary logic to clear/add elements
toProperty.SetValue(self, fromProperty.GetValue(parent));
break;
}
}
}
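One illustrative way to handle the collection case (a sketch, not from the thread): detect when both property values implement ICollection<T> for the same T, then clear the target and copy the elements across instead of assigning the reference.
using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
static class CollectionCopyHelper
{
    // Returns true when it handled the copy, false when the caller should fall back to SetValue.
    public static bool TryCopyCollection(object sourceValue, object targetValue)
    {
        if (sourceValue == null || targetValue == null) return false;
        var sourceIface = sourceValue.GetType().GetInterfaces()
            .FirstOrDefault(i => i.IsGenericType && i.GetGenericTypeDefinition() == typeof(ICollection<>));
        var targetIface = targetValue.GetType().GetInterfaces()
            .FirstOrDefault(i => i.IsGenericType && i.GetGenericTypeDefinition() == typeof(ICollection<>));
        // Bail out unless both sides are ICollection<T> over the same element type.
        if (sourceIface == null || targetIface == null ||
            sourceIface.GetGenericArguments()[0] != targetIface.GetGenericArguments()[0])
            return false;
        targetIface.GetMethod("Clear").Invoke(targetValue, null);
        var add = targetIface.GetMethod("Add");
        foreach (var item in (IEnumerable)sourceValue)
            add.Invoke(targetValue, new[] { item });
        return true;
    }
}
Inside the property loop this would be tried with fromProperty.GetValue(parent) and toProperty.GetValue(self) before falling back to SetValue.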
Recently, I ran into a problem of comparing 2 objects of the same class in C#. I need to know which fields/properties have changed.
Here is the example:
class SampleClass
{
    string sampleField1;
    int sampleField2;
    CustomClass sampleField3;
}
And I have 2 SampleClass objects, object1 and object2, for example.
These 2 objects have some different field values.
Does anyone know the best approach to find which fields are different?
And how do I get the (string) names of those different fields/properties?
I heard of Reflection in .NET. Is that the best approach in this situation?
And what if we didn't have the CustomClass field? (I just made that field up for a more general approach; it does not exist in my case.)
If you want a generic way to get all changed properties, you can use this method (and yes, it uses reflection ^_^):
public List<string> GetChangedProperties(object obj1, object obj2)
{
List<string> result = new List<string>();
if(obj1 == null || obj2 == null )
// just return empty result
return result;
if (obj1.GetType() != obj2.GetType())
throw new InvalidOperationException("Two objects should be from the same type");
Type objectType = obj1.GetType();
// check if the objects are primitive types
if (objectType.IsPrimitive || objectType == typeof(Decimal) || objectType == typeof(String) )
{
// here we shouldn't get properties because its just primitive :)
if (!object.Equals(obj1, obj2))
result.Add("Value");
return result;
}
var properties = objectType.GetProperties();
foreach (var property in properties)
{
if (!object.Equals(property.GetValue(obj1), property.GetValue(obj2)))
{
result.Add(property.Name);
}
}
return result;
}
Please note that this method only detects changes in primitive-type properties; for reference-type properties it relies on Equals, so without an override it simply reports any two values that are not the same instance as changed.
EDIT: Added validation for the case where obj1 or obj2 is a primitive type (int, string, ...), because passing a string object gave an error. Also fixed a bug in the check for whether the two values are equal.
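A hedged usage example (assuming the fields from the question are exposed as public properties, since the method only inspects properties):
var before = new SampleClass { SampleField1 = "a", SampleField2 = 1 };
var after  = new SampleClass { SampleField1 = "b", SampleField2 = 1 };
List<string> changed = GetChangedProperties(before, after);
// changed now contains "SampleField1"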
A slight modification of another answer posted here, but this one works with properties that are not string types, doesn't use an internal list, and does some automatic preliminary type checking since it's generic:
public IEnumerable<string> ChangedFields<T>(T first, T second)
{
if (first.GetType() != second.GetType())
throw new ArgumentOutOfRangeException("Objects should be of the same type");
var properties = first
.GetType()
.GetProperties();
foreach (var property in properties)
{
if(!object.Equals(property.GetValue(first), property.GetValue(second)))
{
yield return property.Name;
}
}
}
If you need to compare two objects as part of your business logic, reflection is the way to go, unless of course you can write comparer classes for each type.
If you want to compare two objects at run time during debugging, there is a neat plugin called OzCode that can do that for you.
I have an Entity Framework 4 project that has built up some brute-force searching code that I'd like to reduce to more generic, more manageable chunks.
One of my Partial Classes, the Run object, contains Navigation Properties (Entity Collections) to other objects (Run.Nodes, Run.Arcs), as well as Scalars (GUID, Version #), and singular Navigation Properties (Entity Objects - Run.TimeRange).
Run.Nodes is a Base Class collection of NodeBase, with derived classes of NodeTypeA, NodeTypeB, and NodeTypeC.
Using Reflection:
public EntityObject FindDiscriminant<T>(T needle) where T : EntityObject
{
Boolean test = false;
Type sourceType = this.GetType();
String needleString = needle.GetType().BaseType.Name.ToString();
String needleStringLookup = typeDict.Where(o => o.Key == needleString).FirstOrDefault().Value;
//If we don't match anything that means that the object itself is a base class, so we need to try again
if (needleStringLookup == null)
{
needleString = needle.GetType().Name.ToString();
needleStringLookup = typeDict.Where(o => o.Key == needleString).FirstOrDefault().Value;
}
var needleProperty = Type.GetType(sourceType.FullName).GetProperty(needleStringLookup);
var runValue = needleProperty.GetValue(this, null);
if (runValue.GetType().ToString().Contains("EntityCollection"))
{
foreach (var obj in (runValue as EntityCollection<T>).ToList())
{
test = (obj as T).Discriminant(needle);
if (test == true)
return obj;
}
}
else
{
test = (runValue as EntityObject).Discriminant(needle);
if (test == true)
return (T)runValue;
}
return null;
}
This method works great for EntityCollections (except NodeBase). If I try and look for a node of NodeTypeC in Run.Nodes, runValue will be an EntityCollection of 173 NodeBase objects. But when I try and iterate over it (.ToList()), I get this error:
System.ArgumentNullException was unhandled
Value cannot be null.
Parameter name: source
My workaround is to check whether the EntityCollection is of type NodeBase and have an if statement to handle it, substituting (runValue as EntityCollection<NodeBase>).ToList() for (runValue as EntityCollection<T>).ToList().
Any suggestions?
An update to my question, for anyone searching this. The code has changed dramatically, and I'm now using Delegates as SearchActions, and have a generic FindSomething routine that uses those delegates instead of having several search routines each using their own type of input.
The things to note are:
The method of detection for determining if the object I pulled with reflection is an EntityObject or an EntityCollection
I use a private method to iterate over the EntityCollection that I pass from my generic FindSomething routine. This takes care of the base-class comparisons
By having the private method to call, I avoid having to use casting on the EntityCollection - this goes away: (runValue as EntityCollection<T>) as well as (obj as T)
I have created a dynamic Object Dictionary when I instantiate our application - I go through our collection of objects and map objects and the properties we care about so I don't have to brute-force through an entire object every search
I use dynamic instead of var - I love dynamic! And I no longer cast before doing a search.
The function is recursive - the SearchAction delegate gets called again during the iteration code in the IterateThroughEntityCollection method.
Good? Bad? Comments? Feedback? It works for me, and it's fast.
Here's the revised code:
private EntityObject FindSomething<T>(Run haystack, T needle, SearchAction<T> sa)
{
//First, assume we haven't found anything
Boolean test = false;
//Next, go through all the objects in a run and see if we find anything
//No need to go through Arcs, if the needle is a type of NodeBase, etc.
Type oldRunElementProperty = TypeReference.RunElementDictionary.Where(o => o.Key == type).Single().Key;
PropertyInfo runValuePropertyToChange = TypeReference.RunElementDictionary.Where(o => o.Key == type).Single().Value;
dynamic runValue = runValuePropertyToChange.GetValue(haystack, null);
//Check to see if we're dealing with an EntityCollection or an EntityObject. If it is an EntityObject, we can set that value
//directly. If it is a collection, we need to use the generic method
if (runValuePropertyToChange.PropertyType.IsGenericType && runValuePropertyToChange.PropertyType.GetGenericTypeDefinition() == typeof(EntityCollection<>))
{
EntityObject result = IterateThroughEntityCollection(runValue, needle, sa);
if (result != null) return result;
}
else
{
test = sa(runValue, needle); if (test == true) return runValue;
}
return null;
}
Private EntityCollection iterator.
private EntityObject IterateThroughEntityCollection<T,U>(EntityCollection<T> haystack, U needle, SearchAction<U> sa) where T: EntityObject
{
Boolean test = false;
foreach(dynamic obj in haystack)
{
test = sa(obj, needle);
if (test == true) return obj;
}
return null;
}
private Equipment GenerateDirtyPropertiesOnEntity(Equipment updatedEntity)
{
updatedEntity.DirtyProperties.Clear();
Equipment originalEntity = GetEquipmentByGuid(updatedEntity.Guid.Value);
Type myType = updatedEntity.GetType();
System.Reflection.PropertyInfo[] properties = myType.GetProperties();
foreach (System.Reflection.PropertyInfo p in properties)
{
if (p.GetValue(originalEntity, null) == null)
{
if (p.GetValue(updatedEntity, null) != null)
updatedEntity.DirtyProperties.Add(p.Name);
}
else
{
if (!(p.GetValue(originalEntity, null).Equals(p.GetValue(updatedEntity, null))))
updatedEntity.DirtyProperties.Add(p.Name);
}
}
return updatedEntity;
}
How much speed am I sacrificing when using this?
Does anyone know of a better way to do this?
Thanks in advance
You might get better performance by trying something using the INotifyPropertyChanged interface. Instead of using reflection you can use event based modeling to accomplish the same thing.
CSLA.NET is an example of a framework that takes that approach.
Example
T SomeProperty
{
    get
    {
        return _someProperty;
    }
    set
    {
        if (_someProperty != value)
        {
            _someProperty = value;
            OnPropertyChanged("SomeProperty");
        }
    }
}
and then OnPropertyChanged would look something like
OnPropertyChanged(object params)
{
DirtyProperties.Add(params);
}
Keep in mind this is total air code. I can't remember how the params were constructed, but it's not actually of type object and the name of the property was included which is how you would determine which property to add to your DirtyProperties list.
You are asking 2 questions:
How much speed are you losing?
Is there a faster way to do it
Question #1:
The answer to the first one is: it depends. Writing the property checking code by hand could be several times faster than the reflection code. However, that might not actually be a problem, depending on how often the code gets called. If the code isn't called very frequently, then you wouldn't get much for the trouble of optimizing it. If, however, it's called a lot, then optimizing it may give you big speed improvements. I would run your app under a profiler (I like JetBrains dotTrace personally) to see where the time is actually being spent. The percentage of time spent inside "GenerateDirtyPropertiesOnEntity" will give you the theoretical maximum perf gain you can get by optimizing the method. If that ends up being a small percentage, then I would just keep the code as is.
Question #2
I can think of 2 simple ways of making this faster:
Write the property comparison code by hand.
Use the DynamicMethod class to generate the comparison code
I'm assuming you don't want to do #1. I'll post some code that shows #2 in a second.
Update:
Here's the code for generating a dynamic method
class Util
{
public static Func<T, T, List<string>> CreateDirtyChecker<T>()
{
var dm =
new DynamicMethod
(
"$dirty_checker",
typeof(List<string>),
new[] { typeof(T), typeof(T) },
typeof(T)
);
var ilGen = dm.GetILGenerator();
//var retVar = new List<string>();
var retVar = ilGen.DeclareLocal(typeof(List<string>));
ilGen.Emit(OpCodes.Newobj, typeof(List<string>).GetConstructor(new Type[0]));
ilGen.Emit(OpCodes.Stloc, retVar);
var properties = typeof(T).GetProperties(BindingFlags.Public | BindingFlags.Instance);
MethodInfo objEqualsMethod = typeof(object).GetMethod("Equals", new[] { typeof(object) });
MethodInfo listAddMethod = typeof(List<string>).GetMethod("Add");
foreach (PropertyInfo prop in properties)
{
//Inject code equivalent to the following into the method:
//if (arg1.prop == null)
//{
// if (arg2.prop != null)
// {
// retVar.Add("prop")
// }
//}
//else
//{
// if (! arg1.prop.Equals(arg2.prop))
// {
// retVar.Add("prop")
// }
//}
Label endLabel = ilGen.DefineLabel();
Label elseLabel = ilGen.DefineLabel();
//if arg1.prop != null, goto elseLabel
ilGen.Emit(OpCodes.Ldarg_0);
ilGen.Emit(OpCodes.Call, prop.GetGetMethod());
ilGen.Emit(OpCodes.Brtrue, elseLabel);
//if arg2.prop != null, goto endLabel
ilGen.Emit(OpCodes.Ldarg_1);
ilGen.EmitCall(OpCodes.Call, prop.GetGetMethod(), null);
ilGen.Emit(OpCodes.Brfalse, endLabel);
//retVar.Add("prop");
ilGen.Emit(OpCodes.Ldloc, retVar);
ilGen.Emit(OpCodes.Ldstr, prop.Name);
ilGen.EmitCall(OpCodes.Callvirt, listAddMethod, null);
ilGen.Emit(OpCodes.Br, endLabel);
//elseLabel:
ilGen.MarkLabel(elseLabel);
//if (arg0.prop.Equals(arg1.prop), goto endLabel
ilGen.Emit(OpCodes.Ldarg_0);
ilGen.EmitCall(OpCodes.Call, prop.GetGetMethod(), null);
ilGen.Emit(OpCodes.Ldarg_1);
ilGen.EmitCall(OpCodes.Call, prop.GetGetMethod(), null);
ilGen.EmitCall(OpCodes.Callvirt, objEqualsMethod, null);
ilGen.Emit(OpCodes.Brtrue, endLabel);
//retVar.Add("prop")
ilGen.Emit(OpCodes.Ldloc, retVar);
ilGen.Emit(OpCodes.Ldstr, prop.Name);
ilGen.EmitCall(OpCodes.Callvirt, listAddMethod, null);
//endLAbel:
ilGen.MarkLabel(endLabel);
}
ilGen.Emit(OpCodes.Ldloc, retVar);
ilGen.Emit(OpCodes.Ret);
return (Func<T, T, List<string>>) dm.CreateDelegate(typeof(Func<T, T, List<string>>));
}
}
It takes in a generic parameter T and returns a delegate that, when given two T instances, will return a list of all the changed properties.
To get a performance boost out of it, it's a good idea to call the method once and store the result in a readonly static field. Something like this would work:
class FooBar
{
static readonly Func<FooBar,FooBar, List<string>> s_dirtyChecker;
static FooBar()
{
s_dirtyChecker = Util.CreateDirtyChecker<FooBar>();
}
public List<string> GetDirtyProperties(FooBar other)
{
return s_dirtyChecker(this, other);
}
}
I'm looking at doing something similar using PostSharp, then comparing the old and new property value when it's being set, and flagging the object as dirty. Shouldn't be too hard to do the same things on a property level.
The only way to know, without doubt, how much speed you're sacrificing would be to profile this.
In general, in my experience, reflecting on properties seems to be, at best, about 1/50th the speed of accessing them directly. At worst it can be 200x slower. Depending on the frequency of this operation and the number of properties, this may or may not be a noticeable difference, which, again, is why I'd suggest profiling to tell whether you need a different solution.
I have a requirement to map all of the field values and child collections between ObjectV1 and ObjectV2 by field name. ObjectV2 is in a different namespace to ObjectV1.
Inheritance between the template ClassV1 and ClassV2 has been discounted, as these 2 classes need to evolve independently. I have considered using both reflection (which is slow) and binary serialisation (which is also slow) to perform the mapping of the common properties.
Is there a preferred approach? Are there any other alternatives?
As an alternative to using reflection every time, you could create a helper class which dynamically creates copy methods using Reflection.Emit - this would mean you only get the performance hit on startup. This may give you the combination of flexibility and performance that you need.
As Reflection.Emit is quite clunky, I would suggest checking out this Reflector addin, which is brilliant for building this sort of code.
What version of .NET is it?
For shallow copy:
In 3.5, you can pre-compile an Expression to do this. In 2.0, you can use HyperDescriptor very easily to do the same. Both will vastly out-perform reflection.
There is a pre-canned implementation of the Expression approach in MiscUtil - PropertyCopy:
DestType clone = PropertyCopy<DestType>.CopyFrom(original);
(end shallow)
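For illustration, here is a minimal sketch (not the MiscUtil code) of how a pre-compiled Expression copier over matching, assignable property names might be built:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;
static class SimplePropertyCopy<TSource, TDest> where TDest : new()
{
    // Built once per type pair; every later copy is just a delegate call.
    private static readonly Func<TSource, TDest> copier = BuildCopier();
    public static TDest CopyFrom(TSource source) { return copier(source); }
    private static Func<TSource, TDest> BuildCopier()
    {
        var sourceParam = Expression.Parameter(typeof(TSource), "source");
        var bindings = new List<MemberBinding>();
        foreach (var destProp in typeof(TDest).GetProperties().Where(p => p.CanWrite))
        {
            var sourceProp = typeof(TSource).GetProperty(destProp.Name);
            if (sourceProp != null && sourceProp.CanRead &&
                destProp.PropertyType.IsAssignableFrom(sourceProp.PropertyType))
            {
                bindings.Add(Expression.Bind(destProp, Expression.Property(sourceParam, sourceProp)));
            }
        }
        // Equivalent to: source => new TDest { Prop1 = source.Prop1, ... }
        var body = Expression.MemberInit(Expression.New(typeof(TDest)), bindings);
        return Expression.Lambda<Func<TSource, TDest>>(body, sourceParam).Compile();
    }
}
Usage then mirrors the PropertyCopy call above, e.g. DestType clone = SimplePropertyCopy<OriginalType, DestType>.CopyFrom(original);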
BinaryFormatter (in the question) is not an option here - it simply won't work since the original and destination types are different. If the data is contract based, XmlSerializer or DataContractSerializer would work if all the contract-names match, but the two (shallow) options above would be much quicker if they are possible.
Also - if your types are marked with common serialization attributes (XmlType or DataContract), then protobuf-net can (in some cases) do a deep-copy / change-type for you:
DestType clone = Serializer.ChangeType<OriginalType, DestType>(original);
But this depends on the types having very similar schemas (in fact, it doesn't use the names, it uses the explicit "Order" etc on the attributes)
You might want to take a look at AutoMapper, a library which specializes in copying values between objects. It uses convention over configuration, so if the properties really have the exact same names it will do almost all of the work for you.
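As a rough illustration of the convention-based mapping (modern AutoMapper API; ObjectV1/ObjectV2 are borrowed from the question, sourceV1 is an existing ObjectV1 instance):
using AutoMapper;
var config = new MapperConfiguration(cfg => cfg.CreateMap<ObjectV1, ObjectV2>());
IMapper mapper = config.CreateMapper();
ObjectV2 copy = mapper.Map<ObjectV2>(sourceV1);   // matching property names are copied by convention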
Here is a solution which I built:
/// <summary>
/// Copies the data of one object to another. The target object gets properties of the first.
/// Any matching properties (by name) are written to the target.
/// </summary>
/// <param name="source">The source object to copy from</param>
/// <param name="target">The target object to copy to</param>
public static void CopyObjectData(object source, object target)
{
CopyObjectData(source, target, String.Empty, BindingFlags.Public | BindingFlags.Instance);
}
/// <summary>
/// Copies the data of one object to another. The target object gets properties of the first.
/// Any matching properties (by name) are written to the target.
/// </summary>
/// <param name="source">The source object to copy from</param>
/// <param name="target">The target object to copy to</param>
/// <param name="excludedProperties">A comma delimited list of properties that should not be copied</param>
/// <param name="memberAccess">Reflection binding access</param>
public static void CopyObjectData(object source, object target, string excludedProperties, BindingFlags memberAccess)
{
string[] excluded = null;
if (!string.IsNullOrEmpty(excludedProperties))
{
excluded = excludedProperties.Split(new char[1] { ',' }, StringSplitOptions.RemoveEmptyEntries);
}
MemberInfo[] miT = target.GetType().GetMembers(memberAccess);
foreach (MemberInfo Field in miT)
{
string name = Field.Name;
// Skip over excluded properties
if (string.IsNullOrEmpty(excludedProperties) == false
&& excluded.Contains(name))
{
continue;
}
if (Field.MemberType == MemberTypes.Field)
{
FieldInfo sourcefield = source.GetType().GetField(name);
if (sourcefield == null) { continue; }
object SourceValue = sourcefield.GetValue(source);
((FieldInfo)Field).SetValue(target, SourceValue);
}
else if (Field.MemberType == MemberTypes.Property)
{
PropertyInfo piTarget = Field as PropertyInfo;
PropertyInfo sourceField = source.GetType().GetProperty(name, memberAccess);
if (sourceField == null) { continue; }
if (piTarget.CanWrite && sourceField.CanRead)
{
object targetValue = piTarget.GetValue(target, null);
object sourceValue = sourceField.GetValue(source, null);
if (sourceValue == null) { continue; }
if (sourceField.PropertyType.IsArray
&& piTarget.PropertyType.IsArray
&& sourceValue != null )
{
CopyArray(source, target, memberAccess, piTarget, sourceField, sourceValue);
}
else
{
CopySingleData(source, target, memberAccess, piTarget, sourceField, targetValue, sourceValue);
}
}
}
}
}
private static void CopySingleData(object source, object target, BindingFlags memberAccess, PropertyInfo piTarget, PropertyInfo sourceField, object targetValue, object sourceValue)
{
//instantiate target if needed
if (targetValue == null
&& piTarget.PropertyType.IsValueType == false
&& piTarget.PropertyType != typeof(string))
{
if (piTarget.PropertyType.IsArray)
{
targetValue = Activator.CreateInstance(piTarget.PropertyType.GetElementType());
}
else
{
targetValue = Activator.CreateInstance(piTarget.PropertyType);
}
}
if (piTarget.PropertyType.IsValueType == false
&& piTarget.PropertyType != typeof(string))
{
CopyObjectData(sourceValue, targetValue, "", memberAccess);
piTarget.SetValue(target, targetValue, null);
}
else
{
if (piTarget.PropertyType.FullName == sourceField.PropertyType.FullName)
{
object tempSourceValue = sourceField.GetValue(source, null);
piTarget.SetValue(target, tempSourceValue, null);
}
else
{
CopyObjectData(piTarget, target, "", memberAccess);
}
}
}
private static void CopyArray(object source, object target, BindingFlags memberAccess, PropertyInfo piTarget, PropertyInfo sourceField, object sourceValue)
{
int sourceLength = (int)sourceValue.GetType().InvokeMember("Length", BindingFlags.GetProperty, null, sourceValue, null);
Array targetArray = Array.CreateInstance(piTarget.PropertyType.GetElementType(), sourceLength);
Array array = (Array)sourceField.GetValue(source, null);
for (int i = 0; i < array.Length; i++)
{
object o = array.GetValue(i);
object tempTarget = Activator.CreateInstance(piTarget.PropertyType.GetElementType());
CopyObjectData(o, tempTarget, "", memberAccess);
targetArray.SetValue(tempTarget, i);
}
piTarget.SetValue(target, targetArray, null);
}
If speed is an issue you could take the reflection process offline and generate code for the mapping of the common properties. You could do this at runtime using Lightweight Code Generation or completely offline by building C# code to compile.
If you control the instantiation of the destination object, try using JavaScriptSerializer. It doesn't spit out any type information.
new JavaScriptSerializer().Serialize(new NamespaceA.Person{Id = 1, Name = "A"})
returns
{Id: 1, Name: "A"}
From this it should be possible to deserialize any class with the same property names.
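The receiving side would then be something like this (NamespaceB.Person is a hypothetical target type with matching property names):
string json = new JavaScriptSerializer().Serialize(source);
var copy = new JavaScriptSerializer().Deserialize<NamespaceB.Person>(json);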
If speed is an issue, you should implement the clone methods in the classes themselves.
For a deep copy, I used Newtonsoft.Json and created a generic method such as:
public T DeepCopy<T>(T objectToCopy)
{
var objectSerialized = JsonConvert.SerializeObject(objectToCopy);
return JsonConvert.DeserializeObject<T>(objectSerialized);
}
I know it is not the most orthodox solution, but it works for me.