DataTable comparison of primary keys is not working - C#

I have written a code block which compares DataTables for schema differences.
This is the code:
private static void ValidateSchema(DataTable originalTbl, DataTable otherTbl)
{
    var primaryKeyDoesNotMatch = originalTbl.PrimaryKey != otherTbl.PrimaryKey;
    if (primaryKeyDoesNotMatch)
    {
        throw new ArgumentException("primary key does not match");
    }

    var primaryKeyDoesNotExist = originalTbl.PrimaryKey == null;
    if (primaryKeyDoesNotExist)
    {
        throw new ArgumentException("primary key does not exist");
    }

    var otherTableHasAdditionalColumns =
        (from x in otherTbl.Columns.OfType<DataColumn>()
         where !originalTbl.Columns.OfType<DataColumn>().Any(y => y.ColumnName == x.ColumnName)
         select x).Any();
    if (otherTableHasAdditionalColumns)
    {
        throw new ArgumentException("Other table does have additional columns.");
    }

    var columnsAreMissingInOtherTable =
        (from x in originalTbl.Columns.OfType<DataColumn>()
         where !otherTbl.Columns.OfType<DataColumn>().Any(y => y.ColumnName == x.ColumnName)
         select x).Any();
    if (columnsAreMissingInOtherTable)
    {
        throw new ArgumentException("Other table does not have all the columns.");
    }

    var columnDataTypesDoesNotMatch =
        (from x in otherTbl.Columns.OfType<DataColumn>()
         where originalTbl.Columns.OfType<DataColumn>().Any(y => x.DataType != y.DataType)
         select x).Any();
    if (columnDataTypesDoesNotMatch)
    {
        throw new ArgumentException("Column's data type does not match");
    }
}
I also have a unit test which covers all of these scenarios.
The problem is that even when I test the "columnDataTypesDoesNotMatch" or "columnsAreMissingInOtherTable" cases, it only runs to the first if statement and tells me "primary key does not match", even though the keys do match!
Any idea why this happens?
Your help is appreciated. Thanks in advance.

Sure. The two PrimaryKey values can never be equal to one another (well, unless it's the same table, or when neither table has a key - although even then they should be two separate new DataColumn[0] arrays), because you're performing a reference comparison, and over an array to boot.
Instead, you have to compare, for example, the names of the columns involved (and, depending on your requirements, the data types as well):
bool Compare(DataColumn[] primary, DataColumn[] secondary)
{
    if (primary.Length != secondary.Length) return false;

    var names = new HashSet<string>(secondary.Select(col => col.ColumnName));
    return primary.All(col => names.Contains(col.ColumnName));
}
You'll have to expand that as your requirements dictate, for example depending on whether you care about case sensitivity, data types, and so on.
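For instance, a stricter variant might look like this (just a sketch - PrimaryKeysMatch is a name I made up, and it assumes System.Linq and System.Data are in scope). It requires the same number of key columns, matches names case-insensitively, and also checks data types:

static bool PrimaryKeysMatch(DataColumn[] primary, DataColumn[] secondary)
{
    if (primary.Length != secondary.Length) return false;

    // Index the second key's columns by name, ignoring case.
    var byName = secondary.ToDictionary(
        col => col.ColumnName, col => col, StringComparer.OrdinalIgnoreCase);

    // Every column of the first key must exist in the second key with the same data type.
    return primary.All(col =>
        byName.TryGetValue(col.ColumnName, out var other) &&
        col.DataType == other.DataType);
}

// In ValidateSchema, the reference comparison would then become:
// var primaryKeyDoesNotMatch = !PrimaryKeysMatch(originalTbl.PrimaryKey, otherTbl.PrimaryKey);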

Related

Distinct values in HashSets and inserting them into a DataGridView

I want to display a HashSet with all the values that differ between two DataGridViews, but I have not been successful in displaying the strings. I attach images.
var dataA = new HashSet<string>();
var dataB = new HashSet<string>();
for (int i = 1; i < dgv_A.Rows.Count; i++)
{
    dataA.Add(dgv_A[8, i].Value.ToString());
}

for (int i = 1; i < dgv_B.Rows.Count; i++)
{
    dataB.Add(dgv_B[8, i].Value.ToString());
}

if (dataA == dataB)
{
    lbl_resultado.Text = "Las certificaciones estan correctas";
}
else
{
    var error = dataA.Except(dataB).Concat(dataB.Except(dataA));
    var container = new HashSet<string>(error);
    dgv_B.DataSource = container.ToList();
}
Here are the values that are needed (screenshot omitted), and the result of the code (screenshot omitted).
NOTE: the if (dataA == dataB) part is not the problem I need answered, but thanks. The part I need help with is:
else
{
    var error = dataA.Except(dataB).Concat(dataB.Except(dataA));
    var container = new HashSet<string>(error);
    dgv_B.DataSource = container.ToList();
}
NOTE 2: The main purpose of the project is the verification of documents (for example, that no user has been modified). If any value is changed, you need to know which values were changed.
With
var error = dataA.Except(dataB).Concat(dataB.Except(dataA));
var container = new HashSet<string>(error);
dgv_B.DataSource = container.ToList();
I get the distinct values from dataA and dataB, but I can't get it to display those strings in the grid.
Here's your problem:
if (dataA == dataB)
{
}
The HashSet<T> type does not override the == operator.
Consequently dataA == dataB performs only a reference equality comparison, which means the same thing as Object.ReferenceEquals( dataA, dataB ), which will always be false, as dataA and dataB are references to separate GC objects.
Instead, to check whether the two HashSets are equivalent, use the (confusingly named) SetEquals() method:
if (dataA.SetEquals(dataB))
{
    lbl_resultado.Text = "Las certificaciones estan correctas";
}
Do not use Enumerable.SequenceEqual, because that evaluates the HashSet as an ordered sequence of values. A HashSet is a mathematical set, which is unordered, and (AFAIK) its HashSet<T>.Enumerator returns its elements in undefined order, so attempting to check two sets' set-equality by comparing sequences is incorrect.
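A minimal console sketch of the difference (the DOC-xxx strings are made-up sample values, not from the question):

using System;
using System.Collections.Generic;
using System.Linq;

class SetEqualsDemo
{
    static void Main()
    {
        // Same elements, added in a different order.
        var dataA = new HashSet<string> { "DOC-001", "DOC-002", "DOC-003" };
        var dataB = new HashSet<string> { "DOC-003", "DOC-001", "DOC-002" };

        // Set equality ignores order: prints True.
        Console.WriteLine(dataA.SetEquals(dataB));

        // Sequence equality depends on enumeration order, which is undefined
        // for HashSet<T>; this may print False even though both sets hold
        // exactly the same values.
        Console.WriteLine(dataA.SequenceEqual(dataB));

        // Symmetric difference, as in the question: values present in
        // exactly one of the two sets.
        var error = dataA.Except(dataB).Concat(dataB.Except(dataA));
        Console.WriteLine(string.Join(", ", error));
    }
}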

Replace property values in a class from List<Dictionary> values

I have a method that takes a List<Dictionary<string,object>> as a parameter. The plan is to use that parameter, but only update the values held in a particular class. Here is the (partially written) method
public async Task<Errors> UpdatePageForProject(Guid projectId, List<Dictionary<string, object>> data)
{
    if (!IsValidUserIdForProject(projectId))
        return new Errors { ErrorMessage = "Project does not exist", Success = false };

    if (data.Count == 0)
        return new Errors { ErrorMessage = "No data passed to change", Success = false };

    var page = await _context.FlowPages.FirstOrDefaultAsync(t => t.ProjectId == projectId);

    foreach (var d in data)
    {
    }

    return new Errors { Success = true };
}
My original plan is to take each dictionary, check whether its key matches a property on page, and then alter the value (so I can pass in 1 dictionary or 8 dictionaries in the list and then alter page to save back to my entity database).
I'd rather not use reflection because of the speed hit (even though C# 9 is really fast, I'd still rather avoid it), but I'm not sure how else this can be done. I did consider using AutoMapper, but for now I'd rather not (it's a PoC, so it is possibly overkill).
If you want to do this without Reflection (which I agree is a good idea, not just for performance reasons) then you could use a "map" or lookup table with actions for each property.
var map = new Dictionary<string, Action<Page, object>>()
{
    { "Title", (p,o) => p.Title = (string)o },
    { "Header", (p,o) => p.Field1 = (string)o },
    { "DOB", (p,o) => p.DateOfBirth = (DateTime)o }
};
You can then iterate over your list of dictionaries and use the map to execute actions that update the page.
foreach (var dictionary in data)
{
    foreach (var entry in dictionary)
    {
        var action = map[entry.Key];
        action(page, entry.Value);
    }
}
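If the incoming dictionaries can contain keys that have no mapping, map[entry.Key] will throw a KeyNotFoundException. A small variation (a sketch, assuming unknown keys should simply be skipped):

foreach (var dictionary in data)
{
    foreach (var entry in dictionary)
    {
        // Skip keys we have no mapping for instead of throwing.
        if (map.TryGetValue(entry.Key, out var action))
        {
            action(page, entry.Value);
        }
    }
}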

Unable to get data from database by passing a string value in Entity Framework ASP.NET MVC

I am trying to get data from the database by passing a string value, but I get null instead of the data. I have tried the following code:
order getCustomerOrder(string or_n)
{
    using (foodorderingEntities db = new foodorderingEntities())
    {
        var result = db.orders.Where(or => or.order_no == or_n).FirstOrDefault();
        return result;
    }
}
Can someone please guide me to solve this problem?
Please paste your order object, so we know whether it's a reference type, a primitive, etc.; you can use the following to understand the two ways to retrieve orders.
I would recommend you read this answer and the MSDN page on string comparison; if your default culture is causing an issue in the comparison, it will help you understand what's going on.
Option 1:
// Query syntax
IEnumerable<CustomerOrder> queryResultsCustomerOrder =
    from order in orders
    where order.number == myOrderNumber
    select order;
Option 2:
// Method-based syntax
IEnumerable<CustomerOrder> queryResultsCustomerOrder2 = orders.Where(order => order.Number == myOrderNumber);
Using the above, you can now get however many orders the customer has. I am assuming your order number is a number, but you can change it to whatever you need, such as a string.
Int-based order comparison sample:
IEnumerable<CustomerOrder> getCustomerOrder(int myOrderNumber)
{
    if (myOrderNumber < 1) return null;

    using (foodorderingEntities dbContextOrderSet = new foodorderingEntities())
    {
        // Materialize with ToList() before the context is disposed.
        IEnumerable<CustomerOrder> resultsOneOrManyOrders =
            dbContextOrderSet.orders.Where(order => order.Number == myOrderNumber).ToList();
        return resultsOneOrManyOrders;
    }
}
String-based order comparison:
IEnumerable<CustomerOrder> getCustomerOrder(string myOrderNumber)
{
    // no order number supplied
    if (String.IsNullOrEmpty(myOrderNumber)) return null;

    using (foodorderingEntities dbContextOrderSet = new foodorderingEntities())
    {
        // Materialize with ToList() before the context is disposed.
        IEnumerable<CustomerOrder> resultsOneOrManyOrders =
            dbContextOrderSet.orders.Where(order => order.Number == myOrderNumber).ToList();
        // You can replace the == comparison with string.Compare here if culture is an issue.
        return resultsOneOrManyOrders;
    }
}
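If the comparison itself is the culprit (stray whitespace or casing in the stored order number, per the culture note above), here is a sketch closer to the original getCustomerOrder, reusing the names from the question:

order getCustomerOrder(string or_n)
{
    // No point hitting the database for a null or blank order number.
    if (string.IsNullOrWhiteSpace(or_n)) return null;

    using (var db = new foodorderingEntities())
    {
        // Trim the input first; a trailing space is a common reason
        // FirstOrDefault comes back null here.
        var orderNo = or_n.Trim();
        return db.orders.FirstOrDefault(or => or.order_no == orderNo);
    }
}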

How to Move to next record while building an IEnumerable if the current row has data problems

I have a method that outputs a list of RSVPs for a class. The RSVPs are pulled from a SQL table of RSVP records, based on an input parameter of the class id. Then a dictionary of all students (people who RSVP'd to that class) is built. Finally I output the IEnumerable of RSVPs based on the data for each student.
The problem I'm running into is that I've got a couple students in the data that are "bad users": They aren't in the system. Potentially because of bad record deletions, or bad creations. Either way, I need to set up error handling for "bad student records" while building the IEnumerable.
My thought was to catch the potential error when evaluating the student id on this line:
var data = x.ToRsvpData(students[x.RawAgentId]);
And then just skip that record and move on to the next one. However, I'm not sure how to do that.
Here's the complete method:
public IEnumerable<RsvpData> GetAllRsvpsFor(Guid scheduledId)
{
    var rsvps = _sors.AnyRsvpsIn(new[] { scheduledId })[scheduledId];
    var certificates = _sors.CertificatesIn(rsvps.Select(x => x.RsvpId).ToList());
    var students = _sors.StudentsBy(rsvps);

    return rsvps.Select(x => {
        var data = x.ToRsvpData(students[x.RawAgentId]);
        if (x.Completed)
        {
            data.SignatureUrl = StaticContent.S3WebPrefixFor(string.Format("/schools/signatures/{0}.jpg", x.RsvpId.ToString()));
            var cert = certificates.FirstOrDefault(c => c.Rsvp.RsvpId == x.RsvpId);
            data.CertificateId = cert != null ? cert.CertId.ToString() : "";
        }
        return data;
    }).OrderBy(x => x.LastName).ToList();
}
Update:
Here's the completed, working code:
public IEnumerable<RsvpData> GetAllRsvpsFor(Guid scheduledId)
{
    var rsvps = _sors.AnyRsvpsIn(new[] { scheduledId })[scheduledId];
    var certificates = _sors.CertificatesIn(rsvps.Select(x => x.RsvpId).ToList());
    var students = _sors.StudentsBy(rsvps);

    var cleanRsvpData = new List<RsvpData>();
    foreach (var rsvp in rsvps)
    {
        try
        {
            var data = rsvp.ToRsvpData(students[rsvp.RawAgentId]);
            if (rsvp.Completed)
            {
                data.SignatureUrl = StaticContent.S3WebPrefixFor(string.Format("/schools/signatures/{0}.jpg", rsvp.RsvpId.ToString()));
                var cert = certificates.FirstOrDefault(c => c.Rsvp.RsvpId == rsvp.RsvpId);
                data.CertificateId = cert != null ? cert.CertId.ToString() : "";
            }
            cleanRsvpData.Add(data);
        }
        catch (Exception ex)
        {
            // Bad student record
            Elmah.ErrorSignal.FromCurrentContext().Raise(ex);
        }
    }

    return cleanRsvpData.OrderBy(x => x.LastName).ToList();
}
Depending on what makes those records misbehave, simply wrapping your enumerable in another enumerable should fix the problem. Something like this:
IEnumerable<Record> GetCorrectRecords(IEnumerable<Record> records)
{
    foreach (var record in records)
        if (record.Valid) // up to you how you define this
            yield return record;
}
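Applied to the RSVP case, the filter could look like this (a sketch only - Rsvp, Student and the Guid key type are assumed names, not from the original code, and "valid" is taken to mean the RawAgentId has a matching student):

IEnumerable<Rsvp> GetCorrectRecords(IEnumerable<Rsvp> rsvps,
                                    IDictionary<Guid, Student> students)
{
    foreach (var rsvp in rsvps)
        if (students.ContainsKey(rsvp.RawAgentId))   // skip "bad student" rows
            yield return rsvp;
}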
Instead of using lambda expressions, your best bet would probably be to use a temp list variable and use a try catch block - so only the "clean" records will make it into your list:
List<RsvpData> cleanRsvpData = new List<RsvpData>();
foreach (var rsvp in rsvps)
{
    try
    {
        RsvpData data = rsvp.ToRsvpData(students[rsvp.RawAgentId]);
        if (rsvp.Completed)
        {
            data.SignatureUrl = "test";
            var cert = certificates.FirstOrDefault(c => c.Rsvp.RsvpId == rsvp.RsvpId);
            data.CertificateId = cert != null ? cert.CertId.ToString() : "";
        }
        cleanRsvpData.Add(data);
    }
    catch (Exception ex)
    {
        // handle error here
    }
}
Since the try catch is INSIDE the loop, it won't break your whole loop if one of the items in the list throws an error.
You can do that using the famous try...catch block, something like this:
public IEnumerable<int> GetAllRsvpsFor(Guid scheduledId)
{
    // all the code that precedes the loop

    // for or foreach loop
    {
        // any code that you have to perform before this block
        try
        {
            var data = x.ToRsvpData(students[x.RawAgentId]);
        }
        catch
        {
            continue; // continues to the next iteration in case of any error
        }
        // any code that you have to perform after this block
    }
}

Removing duplicates from a list with "priority"

Given a collection of records like this:
string ID1;
string ID2;
string Data1;
string Data2;
// :
string DataN
Initially Data1..N are null, and can pretty much be ignored for this question. ID1 and ID2 both uniquely identify the record. All records will have an ID2; some will also have an ID1. Given an ID2, there is a (time-consuming) method to get its corresponding ID1. Given an ID1, there is a (time-consuming) method to get Data1..N for the record. Our ultimate goal is to fill in Data1..N for all records as quickly as possible.
Our immediate goal is to (as quickly as possible) eliminate all duplicates in the list, keeping the one with more information.
For example, if Rec1 == {ID1="ABC", ID2="XYZ"}, and Rec2 = {ID1=null, ID2="XYZ"}, then these are duplicates, --- BUT we must specifically remove Rec2 and keep Rec1.
That last requirement rules out the standard ways of removing duplicates (e.g. HashSet), as they consider both sides of the "duplicate" to be interchangeable.
How about you split your original list into 3 - ones with all data, ones with ID1, and ones with just ID2.
Then do:
var unique = allData.Concat(id1Data.Except(allData))
                    .Concat(id2Data.Except(id1Data).Except(allData));
having defined equality just on the basis of ID2.
I suspect there are more efficient ways of expressing that, but the fundamental idea is sound as far as I can tell. Splitting the initial list into three is simply a matter of using GroupBy (and then calling ToList on each group to avoid repeated queries).
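For example, the split could look like this (a sketch, assuming the element type is called Record and "has all data" simply means Data1 is non-null - both of which are my assumptions, not the question's):

// Group into three buckets and materialize each one so it is only queried once.
var groups = records
    .GroupBy(r => r.Data1 != null ? "all" : r.ID1 != null ? "id1" : "id2")
    .ToDictionary(g => g.Key, g => g.ToList());

var allData = groups.ContainsKey("all") ? groups["all"] : new List<Record>();
var id1Data = groups.ContainsKey("id1") ? groups["id1"] : new List<Record>();
var id2Data = groups.ContainsKey("id2") ? groups["id2"] : new List<Record>();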
EDIT: Potentially nicer idea: split the data up as before, then do:
var result = new HashSet<...>(allData);
result.UnionWith(id1Data);
result.UnionWith(id2Data);
I believe that UnionWith keeps the existing elements rather than overwriting them with new but equal ones. On the other hand, that's not explicitly specified. It would be nice for it to be well-defined...
(Again, either make your type implement equality based on ID2, or create the hash set using an equality comparer which does so.)
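Such a comparer could be as small as this (a sketch, assuming the type is called Record and ID2 is never null):

class Id2Comparer : IEqualityComparer<Record>
{
    // Two records are "the same" when they share an ID2.
    public bool Equals(Record x, Record y) => x.ID2 == y.ID2;
    public int GetHashCode(Record obj) => obj.ID2.GetHashCode();
}

// Wiring it into the HashSet approach above:
var result = new HashSet<Record>(allData, new Id2Comparer());
result.UnionWith(id1Data);
result.UnionWith(id2Data);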
This may smell quite a bit, but I think a LINQ Distinct will still work for you if you ensure the two compared objects come out as equal. The following comparer would do this:
private class Comp : IEqualityComparer<Item>
{
    public bool Equals(Item x, Item y)
    {
        var equalityOfB = x.ID2 == y.ID2;
        if (x.ID1 == y.ID1 && equalityOfB)
            return true;
        if (x.ID1 == null && equalityOfB)
        {
            x.ID1 = y.ID1;
            return true;
        }
        if (y.ID1 == null && equalityOfB)
        {
            y.ID1 = x.ID1;
            return true;
        }
        return false;
    }

    public int GetHashCode(Item obj)
    {
        return obj.ID2.GetHashCode();
    }
}
Then you could use it on a list as such...
var l = new[] {
    new Item { ID1 = "a", ID2 = "b" },
    new Item { ID1 = null, ID2 = "b" } };

var l2 = l.Distinct(new Comp()).ToArray();
I had a similar issue a couple of months ago.
Try something like this...
public static List<T> RemoveDuplicateSections<T>(List<T> sections) where T : INamedObject
{
    Dictionary<string, int> uniqueStore = new Dictionary<string, int>();
    List<T> finalList = new List<T>();
    int i = 0;

    foreach (T currValue in sections)
    {
        if (!uniqueStore.ContainsKey(currValue.Name))
        {
            uniqueStore.Add(currValue.Name, 0);
            finalList.Add(sections[i]);
        }
        i++;
    }

    return finalList;
}
records.GroupBy(r => r, new RecordByIDsEqualityComparer())
       .Select(g => g.OrderByDescending(r => r, new RecordByFullnessComparer()).First())
or if you want to merge the records, then Aggregate instead of OrderByDescending/First.
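For reference, hedged sketches of what those two comparers might look like (assuming the type is called Record and "fullness" just means having an ID1; neither is defined in the original answer):

class RecordByIDsEqualityComparer : IEqualityComparer<Record>
{
    // Records with the same ID2 are treated as duplicates.
    public bool Equals(Record x, Record y) => x.ID2 == y.ID2;
    public int GetHashCode(Record obj) => obj.ID2.GetHashCode();
}

class RecordByFullnessComparer : IComparer<Record>
{
    // Records that have an ID1 rank higher, so OrderByDescending(...).First()
    // keeps the record with more information.
    public int Compare(Record x, Record y)
        => (x.ID1 != null).CompareTo(y.ID1 != null);
}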
