I have a functioning program that calls for specific data from a server; when the data is received it parses it and adds it to the correct database row. The call made when received data lands is as follows:
public void AddData(string data)
{
    if (this.ResponseData.InvokeRequired)
        BeginInvoke(new AddDataDelegate(AddData), new object[] { data });
    else
    {
        //sets new event args and object to run method
        object sender = new object();
        DoWorkEventArgs e = new DoWorkEventArgs(data);
        //Calls the parsing and correct data row input
        DataUpdate(data);
        SystemGrid.Update();
    }
}
My parsing code is as follows:
private void DataUpdate(string e)
{
    string[] data = e.ToString().Split(new string[] { "," }, StringSplitOptions.None);
    if (data[0] == "[Q]")
    {
        int rowindex = 0;
        if (data[2] == "ATQuoteFieldLastPrice")
        {
            foreach (DataGridViewRow Row in SystemGrid.Rows)
            {
                if (data[1] == Row.Cells[0].Value.ToString())
                {
                    rowindex = Row.Index;
                    break;
                }
            }
            SystemGrid.Rows[rowindex].Cells[5].Value = data[3];
        }
    }
    if (data[0] == "[B]")
    {
        int index = 0;
        foreach (DataGridViewRow Row in SystemGrid.Rows)
        {
            if (Row.Cells[0].Value.ToString() == data[1])
            {
                index = Row.Index;
                break;
            }
        }
        SystemGrid.Rows[index].Cells[9].Value = data[3];
        SystemGrid.Rows[index].Cells[10].Value = data[4];
        SystemGrid.Rows[index].Cells[11].Value = data[5];
        SystemGrid.Rows[index].Cells[12].Value = data[6];
        SystemGrid.Rows[index].Cells[13].Value = data[7];
    }
}
When a timer goes off it requests a set of data that arrives string by string.
The first set finishes within 30 seconds. Then the second set fires off; this one takes 2 min 20 sec to 2 min 30 sec to finish, because it has over 300,000 lines of data to go through, and its contents change with every second set depending on what data changed on the server.
The third set finishes within 30 seconds.
The fourth is the same as the second set.
The fifth is the same as the first.
The sixth is the same as the second set.
Total time to update is right at 8 minutes, give or take a little.
Is there a way to thread the AddData(data) event safely, without data loss?
I've tried a BackgroundWorker, but the data comes in too fast, so that route took the same time to finish. I also tried a thread loop between two BackgroundWorkers, but continually lost data.
Is there a way to thread-pool this event as it lands, without losing data? The data will be coming in continuously.
I am trying to cut at least 3 minutes off the data processing, so that by the time the next update comes out the program is ready to fire again.
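The kind of setup I am imagining is something like this (a rough sketch, not my current code; the BlockingCollection buffer, the ParseLoop thread and the rowBySymbol dictionary are all pieces my program does not have yet). Would that be safe, and is there a better way?

// Requires System.Collections.Concurrent; rowBySymbol would be filled once from
// column 0 of SystemGrid before the updates start.
private readonly BlockingCollection<string> incoming = new BlockingCollection<string>();
private Dictionary<string, int> rowBySymbol;

public void AddData(string data)
{
    // Called from the receive thread; just buffer the raw string, no UI work here.
    incoming.Add(data);
}

private void ParseLoop()   // runs on one background thread
{
    foreach (string line in incoming.GetConsumingEnumerable())
    {
        string[] data = line.Split(',');
        int rowIndex;
        if (!rowBySymbol.TryGetValue(data[1], out rowIndex))
            continue;

        // For 300,000 lines these could be batched, several rows per BeginInvoke,
        // instead of one call per line.
        BeginInvoke((Action)(() =>
        {
            if (data[0] == "[Q]" && data[2] == "ATQuoteFieldLastPrice")
                SystemGrid.Rows[rowIndex].Cells[5].Value = data[3];
            else if (data[0] == "[B]")
                for (int c = 0; c < 5; c++)
                    SystemGrid.Rows[rowIndex].Cells[9 + c].Value = data[3 + c];
        }));
    }
}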
I am trying to load a 10k-row database XML export into a combo box in C#, but opening the window takes forever. I am sure 10k rows isn't that much to process, yet it still takes about 20 seconds to finish, maxing out one of my CPU threads. I tried creating a new thread in C# but got a cross-thread error. Is there any way for me to improve the loading time to 1 or 2 seconds?
public partial class AddGameWindow : Form
{
    public AddGameWindow()
    {
        InitializeComponent();
        Application.UseWaitCursor = true;
    }

    private void AddGameWindow_Load(object sender, EventArgs e)
    {
        var doc = XDocument.Parse(Properties.Resources.GameDB);
        var rows = doc.Descendants("table").Select(el => new Game
        {
            Name = el.Element("title").Value,
            Url = el.Element("link").Value,
            Img = el.Element("imglink").Value
        });
        if (gamesList.Items.Count == 0)
        {
            // for some reason fires again when window closed :(
            int i = 0;
            foreach (var row in rows)
            {
                // somehow get these in there so it doesn't take so long
                gamesList.Items.Insert(i, row.Name);
                i++;
            }
            Application.UseWaitCursor = false;
        }
    }
}
class Game
{
    public string Name;
    public string Url;
    public string Img;
}
You'll need to leverage BeginUpdate/EndUpdate if you're pushing a huge number of items into a list-based control one at a time.
Do heed the advice in the comments though; a combo of more than about 20 items is pretty unusable. Look at a "type in this textbox and use it as a filter" approach, possibly with "only start filtering after the user has typed 3 characters" and "only start filtering after a delay of 500 ms rather than filtering on every keypress" type behavior.
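For example, a sketch of the load handler using those calls (AddGameWindow_Load and gamesList are from the code above; Items.AddRange is used so the control is only touched once):

private void AddGameWindow_Load(object sender, EventArgs e)
{
    var doc = XDocument.Parse(Properties.Resources.GameDB);
    var names = doc.Descendants("table")
                   .Select(el => (object)el.Element("title").Value)
                   .ToArray();

    gamesList.BeginUpdate();             // suspend repainting while items are added
    try
    {
        gamesList.Items.AddRange(names); // one bulk add instead of 10k single inserts
    }
    finally
    {
        gamesList.EndUpdate();           // repaint once at the end
    }
    Application.UseWaitCursor = false;
}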
I have problems with my code. I'm trying to send one piece of data to the DB on every loop. The loop itself works fine, but the data is not the latest; it's the first value I managed to send, so my DB is full of the same data. Here is my code:
private void timer1_Tick(object sender, EventArgs e)
{
    int iIdx;
    int[] iData;
    bool[] bData;
    //float[] irData;

    if (m_bRegister) // Read registers (4X references)
    {
        // read register (4X) data from slave
        if (adamTCP.Modbus().ReadHoldingRegs(m_iStart, m_iLength, out iData))
        //if (adamTCP.Modbus().ReadInputRegs(m_iStart, m_iLength, out iData)) // Read registers (3X references)
        {
            m_iCount++; // increment the reading counter
            txtStatus.Text = "Read registers " + m_iCount.ToString() + " times...";
            // update ListView
            for (iIdx = 0; iIdx < m_iLength; iIdx++)
            {
                listViewModbusCur.Items[iIdx].SubItems[2].Text = iData[iIdx].ToString();      // show value in decimal
                listViewModbusCur.Items[iIdx].SubItems[3].Text = iData[iIdx].ToString("X04"); // show value in hexadecimal
            }
            // display label and send to DB
            label1.Text = HttpGet("http://XXX.XXX.XXX.XXX/misc/api.php?station=KL&value=" + iData[2].ToString());
        }
        else
        {
            txtStatus.Text = "Read registers failed!"; // show error message in "txtStatus"
        }
    }
}
The "label1.text = httpGet" code is the one i'm trying to send data to. And I want to set 1 minute interval for the data send.
I have a WinRT application that fires notifications every time it receives data from a device. I also have a UI control that is data-bound to an observable collection to which I want to add the new data.
While I have made it capable of updating the observable collection, it causes the UI to become very laggy because the data arrives quickly. It would therefore be better to batch the update, maybe every few hundred milliseconds.
Below is a snippet of the code. First I create the periodic timer:
TimerElapsedHandler f = new TimerElapsedHandler(batchUpdate);
ThreadPoolTimer timer = ThreadPoolTimer.CreatePeriodicTimer(f, new TimeSpan(0, 0, 3));
Below is my event handler for when new data comes in, along with the temporary list that stores the information
List<FinancialStuff> lst = new List<FinancialStuff>();

async void myData_ValueChanged(GattCharacteristic sender, GattValueChangedEventArgs args)
{
    var data = new byte[args.CharacteristicValue.Length];
    DataReader.FromBuffer(args.CharacteristicValue).ReadBytes(data);
    lst.Add(new FinancialStuff() { Time = DateTime.UtcNow.ToString("mm:ss.ffffff"), Amount = data[0] });
}
Then my batch update, which is called periodically:
private void batchUpdate(ThreadPoolTimer source)
{
    AddItem<FinancialStuff>(financialStuffList, lst);
}
Then finally, for testing, I want to clear the observable collection and the temporary items:
public async void AddItem<T>(ObservableCollection<T> oc, List<T> items)
{
    lock (items)
    {
        if (Dispatcher.HasThreadAccess)
        {
            foreach (T item in items)
                oc.Add(item);
        }
        else
        {
            Dispatcher.RunAsync(CoreDispatcherPriority.Low, () =>
            {
                oc.Clear();
                for (int i = 0; i < items.Count; i++)
                {
                    items.Count());
                    oc.Add(items[i]);
                }
                lst.Clear();
            });
        }
    }
}
While this seems to work, after a few updates the UI locks up and it updates very slowly, if at all. For testing, it's only getting a few hundred items in the list by the time the timer fires.
Can anybody enlighten me as to why this is happening? I'm presuming my design is very poor.
Thanks
You're not locking your list in the event handler
// "lst" is never locked in your event handler
List<FinancialStuff> lst = new List<FinancialStuff>();
lst.Add(new FinancialStuff() { Time = DateTime.UtcNow.ToString("mm:ss.ffffff"), Amount = data[0] });
Passing "lst" above to your async method
AddItem<FinancialStuff>(financialStuffList, lst);
You're locking "items" below, which is really "lst" above. However, you're adding to the list while your processing it. I assume the event handler has a higher priority so your processing is slower than your add. This can lead to "i < items.Count" being true forever.
public async void AddItem<T>(ObservableCollection<T> oc, List<T> items)
{
    // "lst" reference is locked here, but it wasn't locked in the event handler
    lock (items)
    {
        if (Dispatcher.HasThreadAccess)
        {
            foreach (T item in items)
                oc.Add(item);
        }
        else
        {
            Dispatcher.RunAsync(CoreDispatcherPriority.Low, () =>
            {
                oc.Clear();
                // This may never exit the for loop
                for (int i = 0; i < items.Count; i++)
                {
                    items.Count());
                    oc.Add(items[i]);
                }
                lst.Clear();
            });
        }
    }
}
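The event handler side needs to take that same lock before it touches the list, roughly like this (your handler, with only the lock added):

async void myData_ValueChanged(GattCharacteristic sender, GattValueChangedEventArgs args)
{
    var data = new byte[args.CharacteristicValue.Length];
    DataReader.FromBuffer(args.CharacteristicValue).ReadBytes(data);

    // Same object that AddItem locks on, so the list is never read and written at the same time
    lock (lst)
    {
        lst.Add(new FinancialStuff() { Time = DateTime.UtcNow.ToString("mm:ss.ffffff"), Amount = data[0] });
    }
}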
EDIT:
Do you need to view every piece of data? There is going to be some overhead when using a lock. If you're getting data faster than you can render it, you'll eventually be backed up and/or have a very large collection to render, which might also cause problems. I suggest you do some filtering to only draw the last x number of items (say 100). Also, I'm not sure why you need the if (Dispatcher.HasThreadAccess) condition either.
Try the following:
public async void AddItem<T>(ObservableCollection<T> oc, List<T> items)
{
    // "lst" reference is locked here, but it wasn't locked in the event handler
    lock (items)
    {
        // Change this to what you want
        const int maxSize = 100;

        // Make sure it doesn't index out of bounds
        int startIndex = Math.Max(0, items.Count - maxSize);
        int length = items.Count - startIndex;
        List<T> itemsToRender = items.GetRange(startIndex, length);

        // You can clear it here in your background thread. The references to the objects
        // are now in the itemsToRender list.
        lst.Clear();

        // Dispatcher.RunAsync(CoreDispatcherPriority.Low, () =>
        // Please verify this is the correct syntax
        Dispatcher.Run(() =>
        {
            // At second look, this might need to be locked too
            // EDIT: This probably will just add overhead now that it's not running async.
            // You can probably remove this lock
            lock (oc)
            {
                oc.Clear();
                for (int i = 0; i < itemsToRender.Count; i++)
                {
                    // I didn't notice it before, but why are you checking the count again?
                    // items.Count());
                    oc.Add(itemsToRender[i]);
                }
            }
        });
    }
}
EDIT2:
Since your AddItem method is already on a background thread, I don't think you need to run the Dispatcher.RunAsync. Instead, I think it might be desirable for it to block so you don't end up with multiple calls to that section of code. Try using Dispatcher.Run instead. I've updated the code example above to show the changes. You shouldn't need the lock on the oc anymore since the lock on the items is good enough. Also, verify the syntax for Dispatcher.Run is correct.
I know there are MANY similar questions, but I can't seem to get to the bottom of this.
In my program I execute a verification method which compares two ASCII HEX files with each other (one is local, the other is read from a USB device). Some code:
private void buttonVerify_Click(object sender, EventArgs e)
{
    onlyVerifying = true;
    Thread t = new Thread(verifyProgram);
    t.Start();
}
private void verifyProgram()
{
    verifying = true;
    externalFlashFile.Clear();
    // After this method is finished, the returned data will end up in
    // this.externalFlashFile since this listens to the usb's returned data
    hexFile.readExternalFlashForVerify(usbDongle, autoEvent);
    externalFlashFile.RemoveAt(0);
    //externalFlashFile.RemoveAt(externalFlashFile.Count - 1);
    hexFile.verifyProgram(externalFlashFile);
}
public void verifyProgram(List<string> externalProgram)
{
    byte[] originalFile = null; // Will be modified later with given size
    byte[] externalFile = new byte[4096];
    int k = 0, errors = 0;

    // Remove last line which contains USB command data
    externalProgram.RemoveAt(externalProgram.Count - 1);

    foreach (String currentLine in externalProgram)
    {
        for (int i = 0; i < 64; i += 2)
        {
            string currentDataByte = currentLine.Substring(i, 2);
            externalFile[k] = Convert.ToByte(currentDataByte, 16);
            k++;
        }
        progress += steps;
    }
    //... compare externalFile and originalFile
When readExternalFlashForVerify executes, the USB device responds with the requested data. This data is parsed and an event handler is called:
public void usbDongle_OnDataParsed(object sender, EventArgs e)
{
    if (verifying)
    {
        usbDongle.receivedBytesString.Trim();
        externalFlashFile.Add(usbDongle.receivedBytesString.Substring(2, 32 * 2));
        // Allow hexFile continue its thread processing
        autoEvent.Set();
    }
}
The first run always completes correctly. On the following executions, at the third or fourth iteration of the foreach, I get an extra element in externalProgram. This is not a global variable (it is an argument of the function call) and the function is not called anywhere else. This of course throws an exception.
I tried adding .ToList() to externalProgram in the foreach but that didn't make any difference. How can my externalProgram be modified during this execution?
EDIT: I never found the cause of this, but replacing the foreach with a hard-coded for-loop solved the issue at hand. Not an optimal solution, but I don't have much time to spend on this.
// The list should never be larger than 128 items
for (int j = 0; j < 0x7f; j++)
{
    string currentLine = externalProgram[j];
    // ...
Usually when you receive an exception with a message like that, it is caused by multiple threads accessing a list at the same time.
What I suggest is to use a lock when you add and remove items from that list, so you're sure the indexes into that collection are not changing under you. Think about what would happen if you tried to remove the last element (of index 3, for example) of a collection while someone else removed a previous item (changing the length of the collection to 3...).
This example: Properly locking a List<T> in MultiThreaded Scenarios? describes better what I mean.
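For example, something along these lines, with one dedicated lock object (flashLock here is an assumption, your code does not have one yet) taken by both the event handler and verifyProgram:

private readonly object flashLock = new object();

public void usbDongle_OnDataParsed(object sender, EventArgs e)
{
    if (verifying)
    {
        lock (flashLock)
        {
            externalFlashFile.Add(usbDongle.receivedBytesString.Substring(2, 32 * 2));
        }
        autoEvent.Set();
    }
}

public void verifyProgram(List<string> externalProgram)
{
    List<string> snapshot;
    lock (flashLock)
    {
        // Remove last line which contains USB command data
        externalProgram.RemoveAt(externalProgram.Count - 1);
        // Work on a copy so later additions cannot change what is being iterated
        snapshot = new List<string>(externalProgram);
    }
    foreach (string currentLine in snapshot)
    {
        // ... same parsing as before
    }
}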
Probably this line is the problem:
externalProgram.RemoveAt(externalProgram.Count - 1);
If verifyProgram is called multiple times, it will remove more and more lines from the externalProgram list, which is passed by reference.
I need to read values from a PLC and display them in a form whenever a PLC tag value changes.
There is a list of tags which I need to monitor. Whenever a tag value changes I need to call a function (a different function for each tag).
This is what I have done so far for capturing the tag value change:
After connecting to the PLC, I create the list of tags.
I read the tag values in a timer.
While reading, I check against the OLDVALUES; if there is any change in value I raise an event.
This is working fine with 4 or 5 tags. When the tag count is larger, say 100, some of the tag change events are not firing.
Here is the code:
public delegate void DataChangedEventHandler(string TagName, string NewValue);

private Timer tmr = new Timer();
public event DataChangedEventHandler OnDataChanged;

private void StartTimer(DataTable dt)
{
    AddTagList(dt);
    SetInitialVales();
    tmr.Tick += timerticks;
    tmr.Interval = 250;
    tmr.Enabled = true;
    tmr.Start();
}

private void StopTimer()
{
    tmr.Enabled = false;
    tmr.Stop();
}
I add the list of tags:
private List<string> TagValues = new List<string>();
private List<string> oldValues = new List<string>();
private List<string> newValues = new List<string>();

private void AddTagList(DataTable dt)
{
    int ILoop = 0;
    foreach (DataRow row in dt.Rows)
    {
        TagValues.Add((string)row["Path"]);
        ILoop = ILoop + 1;
    }
}
To set the initial values of the tags:
private void SetInitialVales()
{
    int iLoop = 0;
    foreach (string vals in TagValues)
    {
        var rd = ReadTag(vals);
        oldValues.Add(rd.ToString());
        newValues.Add(rd.ToString());
        iLoop = iLoop + 1;
    }
    //newValues = oldValues
}
And the main data-change part:
private void timerticks(object sender, EventArgs eventArgs)
{
    int iLoop = 0;
    foreach (string vals in TagValues)
    {
        oldValues[iLoop] = ReadTag(vals).ToString();
        if (oldValues[iLoop] != newValues[iLoop])
        {
            newValues[iLoop] = oldValues[iLoop];
            if (OnDataChanged != null)
            {
                OnDataChanged(vals, newValues[iLoop]);
            }
        }
        iLoop = iLoop + 1;
    }
}
My queries:
1. What will happen if an event is raised while a previously raised event is still in progress (its handler has not completed)? Is this the reason I am missing some data-change events?
2. How can I raise an event automatically whenever a member of the list changes value?
3. Is there any better method to handle this timer-read-raise-event approach?
What will happen if an event is raised while a previously raised event is still in progress?
The event won't be raised, not until your code is done executing the previous one. Clearly you'll run into trouble when the events you fire take too long, longer than your timer interval (250 ms here). The more tags you have, or the more of them can change within one scan, the greater the odds that these events take longer than the interval and thus miss a tag change.
You'll need to de-couple the scanning from the event processing. You can do so with a worker thread that does nothing but check for tag changes in a loop. If it sees any, it puts an update notification in a thread-safe queue. Another thread, like your UI thread, can empty the queue and process the notifications. The queue now acts as a buffer, providing enough storage to be able to keep up with a sudden burst of tag changes.
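A rough sketch of that idea, using a ConcurrentQueue<T> from System.Collections.Concurrent as the thread-safe buffer (the scanning flag and the code that starts the worker thread are left out):

// Shared, thread-safe buffer between the scanner thread and the UI thread.
private readonly ConcurrentQueue<KeyValuePair<string, string>> changes =
    new ConcurrentQueue<KeyValuePair<string, string>>();

// Worker thread: does nothing but check for tag changes and enqueue them.
private void ScanLoop()
{
    while (scanning)
    {
        for (int i = 0; i < TagValues.Count; i++)
        {
            string current = ReadTag(TagValues[i]).ToString();
            if (current != newValues[i])
            {
                newValues[i] = current;
                changes.Enqueue(new KeyValuePair<string, string>(TagValues[i], current));
            }
        }
        Thread.Sleep(250); // pace the scan
    }
}

// UI timer tick: drain the queue and raise the events, however long they take.
private void timerticks(object sender, EventArgs eventArgs)
{
    KeyValuePair<string, string> change;
    while (changes.TryDequeue(out change))
    {
        if (OnDataChanged != null)
            OnDataChanged(change.Key, change.Value);
    }
}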
Wouldn't it be better to create a class with the old and new value in it, and then a map with the tag as the key to access that old/new instance?
It seems otherwise you have a lot of things floating around that need to be kept in sync.
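Something like this, for instance (TagState and the dictionary are only a sketch of that idea):

// One object per tag instead of three parallel lists.
class TagState
{
    public string OldValue;
    public string NewValue;
}

private Dictionary<string, TagState> tags = new Dictionary<string, TagState>();

private void AddTagList(DataTable dt)
{
    foreach (DataRow row in dt.Rows)
    {
        string path = (string)row["Path"];
        string value = ReadTag(path).ToString();
        tags[path] = new TagState { OldValue = value, NewValue = value };
    }
}

private void CheckForChanges()
{
    foreach (KeyValuePair<string, TagState> entry in tags)
    {
        string current = ReadTag(entry.Key).ToString();
        if (current != entry.Value.NewValue)
        {
            entry.Value.OldValue = entry.Value.NewValue;
            entry.Value.NewValue = current;
            if (OnDataChanged != null)
                OnDataChanged(entry.Key, current);
        }
    }
}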