I'm writing a WinForms app that contains a simple object like this:
public class MyObject : INotifyPropertyChanged // for two-way data binding
{
public event PropertyChangedEventHandler PropertyChanged;
private void RaisePropertyChanged([CallerMemberName] string caller = "")
{
if (PropertyChanged != null)
{
PropertyChanged(this, new PropertyChangedEventArgs(caller));
}
}
private int _IndexValue;
public int IndexValue
{
get { return _IndexValue; }
set
{
if (value != _IndexValue)
{
_IndexValue = value;
RaisePropertyChanged();
}
}
}
private string _StringValue;
public string StringValue
{
get { return _StringValue; }
set
{
if (value != _StringValue)
{
_StringValue = value;
_Modified = true;
RaisePropertyChanged();
}
}
}
private bool _Modified;
public bool Modified
{
get { return _Modified; }
set
{
if (value != _Modified)
{
_Modified = value;
RaisePropertyChanged();
}
}
}
public MyObject(int indexValue)
{
IndexValue = indexValue;
StringValue = string.Empty;
Modified = false;
}
}
I have a BindingList that will contain a fixed number (100,000) of my objects as well as a BindingSource. Both of those are defined like this:
BindingList<MyObject> myListOfObjects = new BindingList<MyObject>();
BindingSource bindingSourceForObjects = new BindingSource();
bindingSourceForObjects.DataSource = myListOfObjects;
Finally, I have my DataGridView control. It has a single column ("STRINGVALUECOLUMN") which displays the StringValue property of my objects, and it is bound to the BindingSource that I just mentioned:
dataGridViewMyObjects.DataSource = bindingSourceForObjects;
When my application starts, I add 100,000 objects to myListOfObjects. Since I only have one column in my DGV and the property that it displays is initialized to string.Empty, I basically have a DGV that contains 100,000 "blank" rows. At this point, my user can begin editing the rows to enter strings. They don't have to edit them in any order, so they might put one string in the first row, the next string in row 17, the next string in row 24581, etc. Sometimes, my users will want to import strings from a text file. Since I have a fixed number of objects (100,000) and there may or may not be some existing strings already entered, I have a few checks to perform during the import process before I add a new string. In the code below, I've removed those checks, but they don't seem to impact the performance of my application. However, if I import tens of thousands of strings using the code below, it's very slow (4 or 5 minutes to import 50k lines). I have narrowed it down to something in this block of code:
// this code is inside the loop that reads each line from a file...
// does this string already exist?
int count = myListOfObjects.Count(i => i.StringValue == stringFromFile);
if (count > 0)
{
Debug.WriteLine("String already exists!"); // don't insert strings that already exist
}
else
{
// find the first object in myListOfObjects that has a .StringValue property == string.Empty and then update it with the string read from the file
MyObject myObject = myListOfObjects.FirstOrDefault(i => i.StringValue == string.Empty);
myObject.StringValue = stringFromFile;
}
It's my understanding that I need two-way binding so I can update the underlying data and have it reflected in the DGV control, but I've also read that INotifyPropertyChanged can be slow sometimes. Has anyone ever run into this problem before? If so, how did you solve it?
-- UPDATE --
Just for testing purposes, I replaced:
// does this string already exist?
int count = myListOfObjects.Count(i => i.StringValue == stringFromFile);
if (count > 0)
{
Debug.WriteLine("String already exists!"); // don't insert strings that already exist
}
else
{
// find the first object in myListOfObjects that has a .StringValue property == string.Empty and then update it with the string read from the file
MyObject myObject = myListOfObjects.FirstOrDefault(i => i.StringValue == string.Empty);
myObject.StringValue = stringFromFile;
}
with a for loop containing:
myListOfObjects[counter].StringValue = "some random string";
This is extremely fast even with 100,000 objects. However, I've now lost the ability to 1) check to see if the string that I read from the file is already assigned to an object in the list before I assign it and 2) find the first available object in the list whose StringValue property == string.Empty and then update that value accordingly. So it seems that:
int count = myListOfObjects.Count(i => i.StringValue == stringFromFile);
and
MyObject myObject = myListOfObjects.FirstOrDefault(i => i.StringValue == string.Empty);
...are the source of my performance problems. Is there a faster, more efficient way to perform these two operations against my BindingList?
The thing about LINQ is that it's really just standard loops behind the scenes; optimized, of course, but still regular old loops.
One thing that may speed your code up is this:
myListOfObjects.Any(i => i.StringValue.Equals(stringFromFile));
This returns a simple boolean: does X exist? It exits early, so it won't scan the entire collection if it doesn't have to. .Count() not only scans the whole collection but also keeps a running count.
Another thing to point out: since you are using FirstOrDefault, the result could be null. Make sure you null-check myObject before trying to use it.
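Putting those two suggestions together, the check-and-assign block from the question might look like this (a sketch using the names from the question, nothing more):
// inside the loop that reads each line from the file...
if (myListOfObjects.Any(i => i.StringValue == stringFromFile))
{
    Debug.WriteLine("String already exists!"); // skip duplicates
}
else
{
    // find the first "empty" slot; FirstOrDefault returns null if every slot is taken
    MyObject myObject = myListOfObjects.FirstOrDefault(i => i.StringValue == string.Empty);
    if (myObject != null)
    {
        myObject.StringValue = stringFromFile;
    }
}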
Finally, as suggested by Mr Saunders, check the event stack and make sure there isn't more code running than you think there is. This is a danger in operations like this. You might need to borrow a trick from the designer-generated initialization code and use this.SuspendLayout() and this.ResumeLayout().
The problem may be that when you update the underlying data, events fire to cause the grid to update. Lots of data changing == lots of updates.
It's been a long time since I've done much with Windows Forms, but check out the SuspendLayout method.
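For a grid bound through a BindingList<T>, a related technique (an assumption beyond what either answer spells out; SuspendLayout by itself only defers layout, not the per-item ListChanged notifications) is to switch off the list's change notifications for the duration of the import and reset the binding once at the end. A minimal sketch, reusing the names from the question:
dataGridViewMyObjects.SuspendLayout();
myListOfObjects.RaiseListChangedEvents = false; // stop per-item ListChanged notifications during the bulk import
try
{
    // ... import loop that assigns StringValue on each target object ...
}
finally
{
    myListOfObjects.RaiseListChangedEvents = true;
    bindingSourceForObjects.ResetBindings(false); // one refresh instead of tens of thousands
    dataGridViewMyObjects.ResumeLayout();
}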
I have a background worker that streams data and saves it to a ConcurrentQueue<T>, which is what I need since it is a thread-safe first-in, first-out collection. However, I also need to do tasks like performing simple calculations or pulling data from this collection, and I'm not sure what I need to use at this point. Here is some example pseudo-code:
public class ExampleData
{
public DateTime Date { get; set; }
public decimal Value { get; set; }
}
public ConcurrentQueue<ExampleData> QueueCol { get; set; } = new();
public void AddToQueue(DateTime date, decimal value)
{
QueueCol.Enqueue(new ExampleData() { Date = date, Value = value });
}
public void DisplayPastData()
{
var count = QueueCol.Count();
var prev1Data = count >= 2 ? QueueCol.ElementAt(count - 2) : null;
var prev2Data = count >= 3 ? QueueCol.ElementAt(count - 3) : null;
var prev3Data = count >= 4 ? QueueCol.ElementAt(count - 4) : null;
if (prev1Data != null)
{
Console.WriteLine($"Date: {prev1Data.Date} Value: {prev1Data.Value}");
}
if (prev2Data != null)
{
Console.WriteLine($"Date: {prev2Data.Date} Value: {prev2Data.Value}");
}
if (prev3Data != null)
{
Console.WriteLine($"Date: {prev3Data.Date} Value: {prev3Data.Value}");
}
}
This is a very rough example, but even when just displaying data, most of it looks correct and then I will get dates completely out of left field, like a date from the previous day in between dates from the current day. Because of ordering issues like that, I know the data isn't correct. So my question is: how do I convert the concurrent queue to a new collection that will allow me to keep the order and to work with the data without getting incorrect results?
The usage pattern you describe in your question makes a ConcurrentQueue<T> unsuitable for your scenario. As far as I can tell, the requirements are:
The producer(s) should be able to enqueue items in the collection without being blocked for any amount of time.
The consumer(s) should be able to perform calculations on a snapshot of the collection, without creating an expensive copy of the collection, and without interfering in any way with the producer(s).
The collection that seems more suitable for your scenario out of the box is ImmutableList<T>. This collection can be updated with lock-free Interlocked operations, and it is essentially a snapshot by itself (because it is immutable). Here is how you could use it in a multithreading scenario, with thread-safety and without blocking any thread:
private ImmutableList<ExampleData> _data = ImmutableList<ExampleData>.Empty;
public ImmutableList<ExampleData> Data => Volatile.Read(ref _data);
public void AddToQueue(DateTime date, decimal value)
{
var newData = new ExampleData() { Date = date, Value = value };
ImmutableInterlocked.Update(ref _data, (x, y) => x.Add(y), newData);
}
public void DisplayPastData()
{
ImmutableList<ExampleData> snapshot = Volatile.Read(ref _data);
int count = snapshot.Count;
var prev1Data = count >= 2 ? snapshot[count - 2] : null;
var prev2Data = count >= 3 ? snapshot[count - 3] : null;
var prev3Data = count >= 4 ? snapshot[count - 4] : null;
if (prev1Data != null)
{
Console.WriteLine($"Date: {prev1Data.Date} Value: {prev1Data.Value}");
}
if (prev2Data != null)
{
Console.WriteLine($"Date: {prev2Data.Date} Value: {prev2Data.Value}");
}
if (prev3Data != null)
{
Console.WriteLine($"Date: {prev3Data.Date} Value: {prev3Data.Value}");
}
}
The immutable collections are not without disadvantages. They are a lot slower than the normal collections, they require significantly more memory, and they create significantly more garbage every time they are updated.
An optimal solution for your specific scenario could be a combination of a ConcurrentQueue<ExampleData> (recent data) and a List<ExampleData> (historic data). The producer(s) would enqueue items in the ConcurrentQueue<T>, and the single consumer would dequeue all the items from the ConcurrentQueue<T> and append them to the List<T>. It would then use the List<T> to do the calculations.
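A rough sketch of that drain-and-append step, using the type from the question (the field names and the consumer method are assumptions for illustration):
private readonly ConcurrentQueue<ExampleData> _recent = new();
private readonly List<ExampleData> _history = new(); // touched only by the single consumer

public void ProcessNewData()
{
    // move everything that has arrived so far into the ordered history list
    while (_recent.TryDequeue(out ExampleData item))
    {
        _history.Add(item);
    }
    // _history now preserves enqueue order and is safe to index for calculations
}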
Hello, I want to change values inside my Acumatica cache and I would like to know how to do it.
For example, I want to programmatically change the Ext. Cost value of the first line or the second line, or check whether there is already a "Data Backup" in the transaction Descr.
public delegate void PersistDelegate();
[PXOverride]
public void Persist(PersistDelegate baseMethod)
{
if (Globalvar.GlobalBoolean == true)
{
PXCache cache = Base.Transactions.Cache;
APTran red = new APTran();
red.BranchID = Base.Transactions.Current.BranchID;
red.InventoryID = 10045;
var curyl = Convert.ToDecimal(Globalvar.Globalred);
red.CuryLineAmt = curyl * -1;
cache.Insert(red);
}
else
{
}
baseMethod();
}
This code adds a new line on persist, but if I save again it adds the same line again. I want to check whether there is already an InventoryID = 10045 in the cache.
Thank you for your help.
You can access your cache instance by using a view name or cache type, e.g. (where 'Base' is the graph instance):
Base.Transactions.Cache
or
Base.Caches<APTran>().Cache
Using the cache instance you can loop over the cached values using Cached, Inserted, Updated, or Deleted, depending on which type of record you are looking for. You can also use GetStatus() on an object to find out if it's inserted, updated, etc. Alternatively, calling PXSelect will find the results in cache (PXSelectReadOnly will not).
So you could loop your results like so:
foreach (MyDac row in Base.Caches<MyDac>().Cache.Cached)
{
// logic
}
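If you only want rows in a particular state, GetStatus() can act as a filter inside that loop; a small sketch (PXEntryStatus is the standard Acumatica status enum, assumed here):
var cache = Base.Caches<MyDac>().Cache;
foreach (MyDac row in cache.Cached)
{
    if (cache.GetStatus(row) != PXEntryStatus.Inserted)
    {
        continue; // ignore rows that are not newly inserted in this graph
    }
    // logic for freshly inserted rows only
}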
If you know the key values of the cache object you are looking for, you can use Locate to find it by key fields:
var row = (MyDac)Base.Transactions.Cache.Locate(new MyDac
{
MyKey1 = "",
MyKey2 = ""
// etc... must include each key field
});
As mentioned before, you can also just use a PXSelect statement to get the values.
Once you have the row to update, set the object properties and then call your cache's Update(row) before the base persist and you are good to go. The same applies if you need to Insert(row) or Delete(row).
So in your case you might end up with something like this in your persist:
foreach (APTran row in Base.Transactions.Cache.Cached)
{
if (Globalvar.GlobalBoolean != true || row.TranDesc == null || !row.TranDesc.Contains("Data Backup"))
{
continue;
}
//Found my row
var curyl = Convert.ToDecimal(Globalvar.Globalred);
row.CuryLineAmt = curyl * -1;
Base.Transactions.Update(row);
}
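To address the duplicate-insert part of the question (checking whether an InventoryID of 10045 is already in the cache before inserting it again), the same Cached loop can act as a guard. A sketch along those lines, reusing the names from the question's Persist override:
bool alreadyInserted = false;
foreach (APTran existing in Base.Transactions.Cache.Cached)
{
    if (existing.InventoryID == 10045)
    {
        alreadyInserted = true; // the line is already there, don't add it twice
        break;
    }
}
if (Globalvar.GlobalBoolean == true && !alreadyInserted)
{
    APTran red = new APTran();
    red.BranchID = Base.Transactions.Current.BranchID;
    red.InventoryID = 10045;
    red.CuryLineAmt = Convert.ToDecimal(Globalvar.Globalred) * -1;
    Base.Transactions.Cache.Insert(red);
}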
I have a class called Estimate and it has the following field and property:
private IList<RouteInformation> _routeMatrix;
public virtual IList<RouteInformation> RouteMatrix
{
get
{
if (_routeMatrix != null && _routeMatrix.Count > 0)
{
var routeMatrix = _routeMatrix.ToList();
routeMatrix =
routeMatrix.OrderBy(tm => tm.Level.LevelType).ThenBy(tm => tm.Level.LevelValue).ToList();
return routeMatrix;
}
else return _routeMatrix;
}
set { _routeMatrix = value; }
}
So, in the getter method, I am just sorting the _routeMatrix by Level Type and then by Level Value and returning the sorted list.
In one of my programs, I have the following code:
public void SaveApprovers(string[] approvers)
{
int i = 1;
foreach (var approver in approvers)
{
var role = Repository.Get<Role>(long.Parse(approver));
var level = new Models.Level
{
LevelType = LevelType.Approver,
LevelValue = (LevelValue)i,
Role = role
};
Repository.Save(level);
var routeInformation = new Models.RouteInformation
{
Level = level,
RouteObjectType = RouteObjectType.Estimate,
RouteObjectId = _estimate.Id
};
Repository.Save(routeInformation);
_estimate.RouteMatrix.Add(routeInformation); // <--- The problem is here
Repository.Save(_estimate);
i++;
}
}
The problem is that, if there are multiple approvers (i.e., the length of the approvers array is greater than 1), only the first routeInformation is added to the RouteMatrix. I don't know what happens to the rest of them, but the Add method doesn't give any error.
Earlier, RouteMatrix was a public field. This problem started occurring after I made it private and encapsulated it in a public property.
Your getter returns a different list; you add to that temporary list.
get
{
if (_routeMatrix != null && _routeMatrix.Count > 0)
{
var routeMatrix = _routeMatrix.ToList(); // ToList creates a _copy_ of the list
...
return routeMatrix;
}
else return _routeMatrix;
}
.....
_estimate.RouteMatrix.Add(routeInformation); // add to the result of ToList()
I think the moral here is not to make getters too complicated. The sorting is wasted effort anyway when you just want to Add().
Also, bad things will happen when _routeMatrix == null. That may not happen but then the if (_routeMatrix != null && ...) part is misleading noise.
When you apply ToList(), a completely new list is created that is not related to the original _routeMatrix list. Well, they share the same elements, but when you add or remove elements from one of the lists, it does not affect the other.
From MSDN:
You can append this method to your query in order to obtain a cached
copy of the query results.
So you have a cached copy of your _routeMatrix, and that copy is what you are successfully modifying.
To solve this issue you can return IEnumerable instead of IList (to prevent collection modifications outside of the Estimate class), and add an AddRouteInformation method to the Estimate class which adds a route information item to _routeMatrix. Use that method to add new items:
_estimate.AddRouteInformation(routeInformation);
Repository.Save(_estimate);
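A minimal sketch of what that reshaped Estimate member could look like (the exact shape is an assumption based on the description above):
private IList<RouteInformation> _routeMatrix = new List<RouteInformation>();

public virtual IEnumerable<RouteInformation> RouteMatrix
{
    get
    {
        // callers get a sorted, read-only view; they cannot add to a temporary copy by mistake
        return _routeMatrix
            .OrderBy(tm => tm.Level.LevelType)
            .ThenBy(tm => tm.Level.LevelValue);
    }
}

public virtual void AddRouteInformation(RouteInformation routeInformation)
{
    _routeMatrix.Add(routeInformation); // adds to the real backing list, not a copy
}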
The problem is that you're not actually modifying _routeMatrix; you're modifying a copy of it. Instead of returning the ToList() copy, sort _routeMatrix itself and store the sorted result back in the field. Change the get to this:
get
{
if (_routeMatrix != null && _routeMatrix.Count > 0)
{
_routeMatrix =
_routeMatrix.OrderBy(tm => tm.Level.LevelType).ThenBy(tm => tm.Level.LevelValue).ToList();
return _routeMatrix;
}
else return _routeMatrix;
}
If I try to change a value in a ComboBox's Items, it will only actually update if the new value is different from the current value after a case-insensitive compare.
Let's make a ComboBox with one item:
ComboBox cboBox = new ComboBox();
cboBox.Items.Add("Apple");
The following code will make the ComboBox still show "Apple", even though the string should look different:
cboBox.Items[0] = "APPLE";
And the naive workaround that I've been using, which will make it display correctly:
cboBox.Items[0] = "";
cboBox.Items[0] = "APPLE";
I wanted to figure out how this was happening, so I dug around with a reflector and found this. This is the ComboBox.ObjectCollection.SetItemInternal method that gets called when you try to modify a value:
internal void SetItemInternal(int index, object value)
{
...
this.InnerList[index] = value;
if (this.owner.IsHandleCreated)
{
bool flag = index == this.owner.SelectedIndex;
if (string.Compare(this.owner.GetItemText(value), this.owner.NativeGetItemText(index), true, CultureInfo.CurrentCulture) != 0)
{
this.owner.NativeRemoveAt(index);
this.owner.NativeInsert(index, value);
if (flag)
{
this.owner.SelectedIndex = index;
this.owner.UpdateText();
}
if (this.owner.AutoCompleteSource == AutoCompleteSource.ListItems)
{
this.owner.SetAutoComplete(false, false);
return;
}
}
else
{
if (flag)
{
this.owner.OnSelectedItemChanged(EventArgs.Empty);
this.owner.OnSelectedIndexChanged(EventArgs.Empty);
}
}
}
}
That true in string.Compare is telling it to ignore the case of the string. Why was this method chosen for deciding whether or not to update the value? And why didn't they expose the case sensitivity?
Is there an alternative way to update an item in an ObjectCollection so that I don't have to guess whether or not it actually gets updated?
EDIT: I should note that the DropDownStyle is set to DropDownList: this is a read-only ComboBox that occasionally needs to be updated due to actions elsewhere in the program.
Try this: add a SelectedIndexChanged event handler, and inside it put:
int index = cboBox.SelectedIndex;
if (index - 1 >= 0) {
cboBox.SelectedIndex = index - 1;
cboBox.SelectedIndex = index;
}
else if (index + 1 < cboBox.Items.Count) {
cboBox.SelectedIndex = index + 1;
cboBox.SelectedIndex = index;
}
This is probably as "naive" as your work around, but maybe worth a try?
After submitting a report to the MSDN, it was marked as "by-design" and nothing more, so that's that.
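Given that the behavior is by design, one pragmatic option is to wrap the question's own two-step workaround in a small helper, so the forced refresh isn't scattered around the code (a sketch, not an official API):
// Forces the native item text to refresh even when the new value differs only
// by case: string.Empty never matches the old text, so the remove-and-reinsert
// path inside SetItemInternal always runs for both assignments.
static void SetItemForced(ComboBox comboBox, int index, object value)
{
    comboBox.Items[index] = string.Empty;
    comboBox.Items[index] = value;
}
Calling SetItemForced(cboBox, 0, "APPLE") then updates the displayed text regardless of the case-insensitive compare.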
I realize I have to sort the collection that the ListView gathers its items from:
ListView listCollection = new ListView();
But this doesn't seem to work unless the ListView is added as a GUI control to the form, which in turn makes it very slow to add items to; that's why I have to use VirtualMode on my GUI ListView in the first place.
Anyone know how to go about this or point me in the right direction?
basically, you will need to apply sort to the data pump itself.
I did a quick search on Google for listview sort virtualmode. The first result was this page, which is where the above quote was taken from.
For example, if your datasource is a DataView, apply sorting on this instead of the ListView.
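For instance, if the underlying source were a DataView, the sort would be applied to the view and the virtual ListView simply redrawn (dataView and the column name here are placeholders):
// Sort the data source, not the ListView; virtual mode re-fetches rows by index.
dataView.Sort = "Name ASC";
listView1.VirtualListSize = dataView.Count;
listView1.Invalidate(); // visible rows are re-requested via RetrieveVirtualItem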
If it is just a question of performance when adding items, I would do as barism suggests; use BeginUpdate/EndUpdate instead of VirtualMode.
try {
listView1.BeginUpdate();
// add items
}
finally {
listView1.EndUpdate();
}
If you are using virtual mode, you have to sort your underlying data source. As you may have found, ListViewItemSorter does nothing for virtual lists.
If you are using a non-virtual ListView, you can also use AddRange(), which is significantly faster than a series of Add() calls, in addition to using BeginUpdate/EndUpdate as already described.
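For the non-virtual case, a quick sketch of AddRange combined with BeginUpdate/EndUpdate (the item contents mirror the other answer's example and are placeholders):
// Build all the items first, then hand them to the ListView in one call.
ListViewItem[] newItems = Enumerable.Range(0, 20000)
    .Select(i => new ListViewItem("abdc", 1))
    .ToArray();

listView1.BeginUpdate();
try
{
    listView1.Items.AddRange(newItems); // one bulk insert instead of 20,000 Add() calls
}
finally
{
    listView1.EndUpdate();
}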
ObjectListView (an open source wrapper around .NET WinForms ListView) already uses all these techniques to make itself fast. It is a big improvement over a normal ListView. It supports both normal mode and virtual mode listviews, and make them both much easier to use. For example, sorting is handled completely automatically.
I had the same problem when switching VirtualMode to true in an existing project, but the solution was surprisingly easy.
First, I populate a List<ListViewItem> instead of the ListView.Items collection:
private List<ListViewItem> _ListViewItems;
Then I implemented the RetrieveVirtualItem event handler:
private void mLV_RetrieveVirtualItem(object sender, RetrieveVirtualItemEventArgs e)
{
e.Item = _ListViewItems[e.ItemIndex];
}
Finally, I sort my list of ListViewItem using the same comparer class I was using before; I only had to change the base interface:
_ListViewItems.Sort((System.Collections.Generic.IComparer<ListViewItem>)new ListViewItemComparer(new int[] { e.Column }, mLV.Sorting));
This is my IComparer class implementation:
class ListViewItemComparer : System.Collections.Generic.IComparer<ListViewItem>
{
int[] mColonne;
private System.Windows.Forms.SortOrder order;
public ListViewItemComparer(int[] mCols)
{
this.mColonne = mCols;
this.order = System.Windows.Forms.SortOrder.Ascending;
}
public ListViewItemComparer(int[] mCols, System.Windows.Forms.SortOrder order)
{
this.mColonne = mCols;
this.order = order;
}
public int Compare(ListViewItem x, ListViewItem y)
{
int returnVal = -1;
foreach (int mColonna in mColonne)
{
double mNum1;
double mNum2;
String mStr1 = "";
String mStr2 = "";
if ((x.SubItems[mColonna].Text == "NULL") && (x.SubItems[mColonna].ForeColor == Color.Red))
{
mStr1 = "-1";
}
else
{
mStr1 = x.SubItems[mColonna].Text;
}
if ((y.SubItems[mColonna].Text == "NULL") && (y.SubItems[mColonna].ForeColor == Color.Red))
{
mStr2 = "-1";
}
else
{
mStr2 = y.SubItems[mColonna].Text;
}
if ((double.TryParse(mStr1, out mNum1) == true) && (double.TryParse(mStr2, out mNum2) == true))
{
if (mNum1 == mNum2)
{
returnVal = 0;
}
else if (mNum1 > mNum2)
{
returnVal = 1;
}
else
{
returnVal = -1;
}
}
else if ((double.TryParse(mStr1, out mNum1) == true) && (double.TryParse(mStr2, out mNum2) == false))
{
returnVal = -1;
}
else if ((double.TryParse(mStr1, out mNum1) == false) && (double.TryParse(mStr2, out mNum2) == true))
{
returnVal = 1;
}
else
{
returnVal = String.Compare(mStr1, mStr2);
}
if (returnVal != 0)
{
break;
}
}
// Determine whether the sort order is descending.
if (order == System.Windows.Forms.SortOrder.Descending)
{
// Invert the value returned by String.Compare.
returnVal *= -1;
}
return returnVal;
}
}
Hope this will help you.
Did you try BeginUpdate() and EndUpdate()? Adding data is much faster when you use BeginUpdate/EndUpdate (when you call BeginUpdate, the ListView doesn't redraw until you call EndUpdate).
listView1.BeginUpdate();
for (int i = 0; i < 20000; i++)
{
listView1.Items.Add("abdc", 1);
}
listView1.EndUpdate();
For very large lists, the Virtual Mode ListView is the answer for certain. In non-virtual mode it seems to draw the entire list and then clip it to the view, while in Virtual mode it simply draws the ones in the view. In my case the list was 40K+ records. In non-virtual mode an update to the ListView could take minutes. In virtual mode it was instantaneous.
To sort the list, you must sort the underlying data source, as has already been mentioned. That is the easy part. You will also need to force a refresh of the display, which is not done automatically. You can use ListView.TopItem.Index to find the index in the underlying data source that corresponds to the top row of the virtual ListView before the sort. There is also an API call that returns the number of display rows in the ListView, which you can implement as a C# function like this:
public const Int32 LVM_GETCOUNTPERPAGE = 0x1040;
public static int GetListViewRows( ListView xoView )
{
return (int)WindowsMessage.SendMessage( xoView.Handle, LVM_GETCOUNTPERPAGE, 0, 0 );
}
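The snippet above assumes a WindowsMessage helper that wraps the native call; a minimal P/Invoke declaration for it might look like this (the class and parameter names are illustrative):
using System;
using System.Runtime.InteropServices;

public static class WindowsMessage
{
    // Thin wrapper around user32's SendMessage, used here for LVM_* list view messages.
    // wParam/lParam are pointer-sized in the native signature; int is used to match the call above,
    // which is fine for these constant values.
    [DllImport("user32.dll", CharSet = CharSet.Auto)]
    public static extern IntPtr SendMessage(IntPtr hWnd, Int32 msg, int wParam, int lParam);
}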
That will let you calculate the range you have to update. About the only remaining question is how you reconcile the existing display with what will appear after the data have been sorted. If you want to leave the same data element in the top row, you must have some mechanism in place that will find its new index in the newly sorted list, so you can replace it in the top position - something essentially equivalent to an SQL IDENTITY.