Yet another question.
So now my EventReceiver and its logic are working just fine, except for one thing.
Basically, it queries the whole list through CAML queries, then passes the result to a DataTable object, and later on to a DataRow object...
As usual, it works perfectly in the test environment, but in production...
What happens is that the column I need updated gets updated but isn't shown immediately. The item's column receives the value I want, but it doesn't show on the first refresh; you have to refresh the page again, and then it appears...
The only difference is that in the test environment my list has around 200 records, while in production it has almost 5,000.
Some questions:
Is there a way to define how many records you want, either in CAML or in the DataTable object? Something like "SELECT TOP 100 ..."?
If not, is there a way to make the refresh process stop and wait for my code to finish executing?
Some Info:
It's WSS 3.0, and the event I'm intercepting is ItemAdded, which explains the refresh not waiting for my code.
Oh, and changing to the ItemAdding event would be a bit of a problem, because I need to capture the ID of the record, which isn't yet available in ItemAdding since the list item hasn't been committed to the database at that point.
Thanks in advance.
The problem here was the GetDataTable() method. When I ran the CAML query and filled a DataTable with the results, it would lose the OrderBy modifier. But if I get the results with an SPListItemCollection object, the rows come back exactly how I wanted.
As seen in another post... "This is a nasty issue".
Similar question and answer here. You should be able to use the RowLimit property of SPQuery.
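For illustration, a minimal sketch against the WSS 3.0 object model; the web variable, the list name, and the ordering field are assumptions:

// Assumes an SPWeb in scope and a hypothetical list called "Records".
SPList list = web.Lists["Records"];

SPQuery query = new SPQuery();
query.Query = "<OrderBy><FieldRef Name='Modified' Ascending='FALSE'/></OrderBy>";
query.RowLimit = 100; // roughly the CAML equivalent of SELECT TOP 100

// Iterating the SPListItemCollection directly keeps the OrderBy intact;
// converting with GetDataTable() is where the ordering was lost.
SPListItemCollection items = list.GetItems(query);
foreach (SPListItem item in items)
{
    // work with item.ID, item["Title"], etc.
}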
After searching a lot, I ended up moving my code to the ItemAdding event, which is synchronous and will finish executing before SharePoint loads its page.
Even after I limited the result rows to 5, it would still load the page without the value I wanted to show.
Also, if you are considering capturing the value of a calculated field (one that has a formula in it), be careful: at least in my case, SharePoint hadn't resolved the formula by the time the event executed, so the calculated field would always return null.
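For anyone landing here later, this is roughly the shape of the synchronous receiver; the class and column names are placeholders, not from the original code:

public class MyItemReceiver : SPItemEventReceiver
{
    // ItemAdding is synchronous: SharePoint waits for it to finish
    // before rendering the page, unlike the asynchronous ItemAdded.
    public override void ItemAdding(SPItemEventProperties properties)
    {
        // No item ID exists yet; the submitted values live in AfterProperties.
        object title = properties.AfterProperties["Title"];

        // Write the computed value back so it is saved with the item.
        properties.AfterProperties["MyColumn"] = "computed value";
    }
}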
Related
I have an ASP.NET web application that calls a customer to a station. Five employees run this application simultaneously; when they see a customer walk in, they click ButtonGetCustomer to call the customer to their station.
Here is my issue. I am getting the data from SQL and storing it in a DataTable. Sometimes, when two or more clerks click at the same time, they call the same customer.
Any ideas on how to prevent this from happening?
I had a similar problem with thousands of people clicking the same button trying to claim a limited number of spots. Here is a similar solution:
When they click your button, run a stored procedure to mark that user as seen.
Your SPROC will first check whether the user is already marked as seen; if so, quit (I use RAISERROR to pass a message back and catch the SqlException in code, so you can tell them which user has already been called).
If the user hasn't been seen, the next thing your SPROC does is mark them as seen.
So the person who clicked the button either succeeds and sees the customer, or gets a message saying the customer has already been seen.
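A rough sketch of the call site, not the exact code; the stored procedure name, its parameters, and ShowMessage are all made-up placeholders:

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("dbo.ClaimCustomer", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@CustomerId", customerId);
    cmd.Parameters.AddWithValue("@ClerkId", clerkId);
    try
    {
        conn.Open();
        cmd.ExecuteNonQuery(); // the SPROC checks and marks "seen" atomically
        // Success: call the customer to this clerk's station.
    }
    catch (SqlException ex)
    {
        // The SPROC used RAISERROR: someone else got the customer first.
        ShowMessage(ex.Message); // hypothetical UI helper
    }
}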
The problem you are experiencing is a concurrency problem. Try wrapping the read of the DataTable in a lock statement. The records you plan on returning to the calling thread should be flagged so that they are not picked up by other threads. Try something like this:
private readonly object _syncObject = new object();

private DataTable YourDataReadMethod()
{
    lock (_syncObject)
    {
        // Read the records to return to the calling thread.
        // Flag the records read so they are not given to other threads;
        // you might need an expiration date in case the records are not
        // completed in a timely manner.
        DataTable records = ReadAndFlagRecords(); // hypothetical data access
        return records;
    }
}
Furthermore, if you are updating a record after a call takes place, you should compare the database's last-updated date with the date persisted in the client form; if they differ, raise an exception, because it means someone else has already updated the record. Hopefully that helps.
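For the last-updated comparison, a minimal optimistic-concurrency sketch; the table and column names are assumptions:

// The UPDATE only succeeds if nobody touched the row since we read it.
const string sql =
    @"UPDATE Customers
      SET Status = @status, LastUpdated = GETDATE()
      WHERE Id = @id AND LastUpdated = @originalLastUpdated";

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(sql, conn))
{
    cmd.Parameters.AddWithValue("@status", newStatus);
    cmd.Parameters.AddWithValue("@id", customerId);
    cmd.Parameters.AddWithValue("@originalLastUpdated", lastUpdatedFromForm);
    conn.Open();
    if (cmd.ExecuteNonQuery() == 0)
        throw new DBConcurrencyException("Record was updated by someone else.");
}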
I have an ASP.Net form and I want to send an email when the user changes their data. The email should only include data that has changed, and there are about 15 data fields total.
I don't want to use an ORM since I am updating a website that a 3rd party built for us, and all their data access calls go through a custom library of theirs.
The only ways to do this I can think of are:
Make another database call to get the old values and compare the form values one by one. If they're different, append them to the email.
Store the original data somewhere when it's first loaded (hidden field, session, etc.), then once again compare the data one field at a time and append the differences to the email.
Have someone on SO tell me there's an easier and/or simpler way that I haven't thought of
All the text boxes will have a TextChanged event; you can have them mark themselves as modified. ComboBoxes have a SelectedIndexChanged event, and so on.
Edit: each changed event can check against its initial value (even on reverted changes) and mark itself as still modified or, on a revert, as unmodified.
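A rough WebForms sketch of that approach; the control and field names are placeholders:

private readonly List<string> _changedFields = new List<string>();

protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack)
    {
        // Stash the original value so a later revert can be detected.
        ViewState["FirstName_Original"] = txtFirstName.Text;
    }
}

protected void txtFirstName_TextChanged(object sender, EventArgs e)
{
    string original = (string)ViewState["FirstName_Original"];
    if (txtFirstName.Text != original)
    {
        // Still modified: record the difference for the email body.
        _changedFields.Add("FirstName: " + original + " -> " + txtFirstName.Text);
    }
    // If the text matches the original again, it reverted, so nothing is recorded.
}

On a full postback, all the TextChanged handlers fire before the submit button's click handler, so the list is already populated when you build the email.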
Here are some suggestions that may or may not be useful:
A trigger on the database table that compares the old values (via the DELETED table) with the updated values (via the INSERTED table) and then sends the email. This may or may not be viable, and I am not a big advocate of triggers.
As you have already said, you could make another database call, which would be my recommended approach.
From what you've said, I think the only way forward is to keep a duplicate DataSet on the form to store the old data and run a comparison at the point where you want to produce the email.
You can use DataSet.Copy to copy both structure and data.
However, now that I think about it, there's always the DataSet.GetChanges() method, along with DataSet.AcceptChanges() and DataSet.HasChanges().
Example code from this link:
if (dataSet.HasChanges(DataRowState.Modified | DataRowState.Added) &&
    dataSet.HasErrors)
{
    // Use GetChanges to extract the subset of changed rows.
    changesDataSet = dataSet.GetChanges(
        DataRowState.Modified | DataRowState.Added);
    PrintValues(changesDataSet, "Subset values");

    // Insert code to reconcile errors. In this case, reject changes.
    foreach (DataTable changesTable in changesDataSet.Tables)
    {
        if (changesTable.HasErrors)
        {
            foreach (DataRow changesRow in changesTable.Rows)
            {
                //Console.WriteLine(changesRow["Item"]);
                if ((int)changesRow["Item", DataRowVersion.Current] > 100)
                {
                    changesRow.RejectChanges();
                    changesRow.ClearErrors();
                }
            }
        }
    }

    // Add a column to the changesDataSet.
    changesDataSet.Tables["Items"].Columns.Add(
        new DataColumn("newColumn"));
    PrintValues(changesDataSet, "Reconciled subset values");

    // Merge changes back to the first DataSet.
    dataSet.Merge(changesDataSet, false,
        System.Data.MissingSchemaAction.Add);
}
PrintValues(dataSet, "Merged Values");
Imagine a table and a button to add new rows to it. On each click of the button, a new row is inserted at the end of the table. The button's click event works as follows:
First, it picks out a reference row to copy.
Whatever controls and text are inside the referenced row are copied to a DataTable. Since a DataTable cannot hold controls, I convert them to strings and save them that way.
At the end, the DataTable is stored in a cache.
Finally, on each Page_Init event I re-create the table using the data in the DataTable. Everything works fine.
However, I'm curious. I have 3 to 5 tables on the page, each stored in a different cache entry with its own DataTable, and all of them are re-created during the page life-cycle events; might this cause problems in the future? By the way, note that once the user leaves the page, the cache is deleted.
I did not want to paste the whole code here since it's a bit long and may alienate people from reading the question. But I can give some statistics so that you can make some comments on it.
The class I've written is 118 lines long.
During the re-creation of the table there are 3 nested for/foreach loops, but they are not that long (the average loop count is probably 5 to 10 for each).
And finally, as mentioned above, the table is re-created from the DataTable saved in the cache.
So, I ask the question again: the code works perfectly, but is code built this way performance-friendly?
It depends entirely on the amount of data in the table (the number of rows and columns).
If it's small, like pulling down a list of 10 users and their logins and passwords, it will work just fine with no performance issues.
But if it's going to be thousands and thousands of records, it will probably start to have performance issues.
Edit: Write a script to fill the database to a "worst case" expected amount of data, and then see how it performs.
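Something as crude as this would do; the connection string, table schema, and row count are all assumptions:

using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();
    // Fill the table to the expected worst-case volume, then profile the page.
    for (int i = 0; i < 50000; i++)
    {
        using (SqlCommand cmd = new SqlCommand(
            "INSERT INTO TestTable (Name, Value) VALUES (@name, @value)", conn))
        {
            cmd.Parameters.AddWithValue("@name", "User" + i);
            cmd.Parameters.AddWithValue("@value", i);
            cmd.ExecuteNonQuery();
        }
    }
}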
Okay, this is a little hard to explain, as the title might suggest.
I have event receivers on ItemUpdated and ItemCheckedIn, both of which write custom SPAuditEntries. When a check-in occurs, though, it comes with two update entries as well (one for the added file, and one, I suspect, for a simple update to the list item).
I'd love to get rid of these entries. At first I thought it would be really simple: just put an if in the ItemUpdated event receiver and stop everything:
if (item.CheckedOut == false) { // ... do nothing }
However, I couldn't find any way to ascertain the checked-out status of the list item.
My next thought was that the events hit at almost exactly the same time, so I could crawl into the audit collection, filter down to the specific list item, user, and time (minus a second), and delete the two entries. But, sadly, I found out that audit entries can't be deleted.
Anyone got any ideas?
Checked-out status is determined via:
if (item.Level == SPFileLevel.Checkout)
{
    // the item is currently checked out
}
where item is an SPListItem.
-Oisin
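Putting that into the receiver, following the asker's intended check, might look like this sketch (the audit-writing part is elided):

public override void ItemUpdated(SPItemEventProperties properties)
{
    SPListItem item = properties.ListItem;

    // During a check-in the item is no longer checked out, so the two
    // accompanying update entries get skipped by this guard.
    if (item.Level != SPFileLevel.Checkout)
        return;

    // ... write the custom SPAuditEntry as before ...
}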
Is there a limit to the number of rows that IEnumerable.Count() (or IQueryable.Count()) will return using LINQ to SQL? For whatever reason, if my query returns more than 200 records, I only get 200 from IEnumerable.Count(). I've even tried IEnumerable.LongCount() (even though the number of results shouldn't be high enough to need it). I've also verified that running COUNT on the database returns more than 200 records.
I've checked MSDN and tried Googling it, but to no avail.
Am I going crazy, or is there something that I'm missing somewhere? I suppose this isn't really a big deal (as the program is inserting the right number of records), but it'd be nice to be able to log the number of records transferred.
Could we see some sample code?
public IEnumerable<Irms_tx_modify_profile_ban> ExtractNewAdmits()
{
    var results = from a in _dc.Applications
                  select (Irms_tx_modify_profile_ban)new RMSProfile
                  {
                      // assign column names to property names
                  };

    // Don't know why, but Bad Things happen if you don't put the
    // OfType call here.
    IEnumerable<Irms_tx_modify_profile_ban> narrowed_results =
        results.OfType<Irms_tx_modify_profile_ban>();

    Program.log.InfoFormat("Read {0} records from Banner.", narrowed_results.Count());
    return narrowed_results;
}
The reason for the comment about bad things happening is the set of issues brought up in this thread. What I did just find out is that if I call Count on narrowed_results (the IEnumerable), it returns the right amount, but if I call it on results (the IQueryable), it returns just 200. I'll go ahead and accept Skeet's answer (since he mentioned the difference between IQueryable and IEnumerable), but if anyone is able to explain why this is happening, I'd like to hear it.
I've not heard of anything like that, and it does sound very odd.
The most obvious thing to check is what query is being sent to the database. Also, it matters a great deal whether you're calling Enumerable.Count() (i.e. on an IEnumerable<T>) or Queryable.Count() (i.e. on an IQueryable<T>). The former will be iterating through the actual rows in .NET code to retrieve the count; the latter will put the count into the query.
Could we see some sample code?
EDIT: Okay, so having seen the code:
When you didn't call OfType, it was executing the count at the SQL level. That should have been visible in the SQL logged, and should be reproducible with any other SQL tool.
I suspect you didn't really have to call OfType. You could have called AsEnumerable, or just declared results as IEnumerable<Irms_tx_modify_profile_ban>. The important thing is that the type of the variable decides the extension method to use - and thus where the count is executed.
It's worth noting that your current solution is really inefficient: it's fetching all the data just to count it, ignoring everything but the count. It would be much better to get the count onto the server side, and while I know that doesn't work at the moment, I'm sure with a bit of analysis we can make it work :)
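To make the distinction concrete, a small sketch reusing the names from the question:

// Server-side count: Queryable.Count() is translated to SELECT COUNT(*).
int serverCount = _dc.Applications.Count();

// Client-side count: AsEnumerable() switches to Enumerable.Count(), which
// fetches and iterates every row in .NET code before counting.
int clientCount = _dc.Applications.AsEnumerable().Count();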