Selenium stale element exception (found when running tests and not while debugging) - c#

I am getting a stale element exception when I run the code, but not while debugging. Here is my piece of code. Can anybody help me? Thanks.
public static bool CheckListFilterResult(IList<IWebElement> gridColumns, IList<IWebElement> gridRows, string filterColumn, List<string> filters)
{
    bool checkResult = false;
    int filterColumnIndex = GetColumnIndex(gridColumns, filterColumn);
    if (gridRows.Count > 0)
    {
        foreach (IWebElement row in gridRows)
        {
            TestManager.Doc.Step("before bool match");
            bool filterMatch = filters.Contains(row.FindElements(By.TagName("td"))[filterColumnIndex].Text.Trim());
            if (filterMatch)
            {
                checkResult = true;
            }
            else
            {
                checkResult = false;
                break;
            }
        }
    }
    return checkResult;
}

From looking at your code, that line is the first line where you access something from the rows collection. My guess is that it has something to do with that. Did you pull the rows collection, then apply a filter, then call this function? If so, that's probably the problem. You need to apply the filter, then pull the rows collection, then call the function.

It could be that your IWebElement row was updated by JavaScript while your Selenium code was running.
As explained here: http://docs.seleniumhq.org/exceptions/stale_element_reference.jsp, a StaleElementReferenceException happens when the element in the UI has been refreshed but you are still trying to access the old reference. So even if the row is visible to the user, it may have been updated or replaced by JavaScript, in which case it is a different object.
The reason it happens only occasionally, and not while debugging, is that in debug mode you stop at each line of your Selenium code, but the JavaScript doesn't stop; by the time you continue, the element in the UI has already been refreshed, so your Selenium code gets a fresh copy of the IWebElement row.
There are several possible ways of fixing this problem:
Try to catch StaleElementReferenceException; if it happens, redo whatever you were trying to do.
Or you can wait for the old reference to go stale first: wait.Until(ExpectedConditions.StalenessOf(row));
Hope it helps
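The catch-and-retry option reduces to a small generic helper. This is a browser-free sketch of the pattern only: the helper name is made up, and the demo uses a stand-in exception where real code would catch StaleElementReferenceException around a re-find of the element.

```csharp
using System;

// Retry `action` until it succeeds or `maxAttempts` attempts have failed.
// TException would be StaleElementReferenceException in real Selenium code;
// here it is a type parameter so the pattern is testable without a browser.
static T RetryOn<TException, T>(Func<T> action, int maxAttempts)
    where TException : Exception
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            return action(); // e.g. re-find the row and read its text
        }
        catch (TException) when (attempt < maxAttempts)
        {
            // Reference went stale: loop around and re-locate the element.
        }
    }
}

// Demo: fail twice with a stand-in exception, succeed on the third attempt.
int calls = 0;
string text = RetryOn<InvalidOperationException, string>(() =>
{
    calls++;
    if (calls < 3) throw new InvalidOperationException("stale");
    return "cell text";
}, 3);
Console.WriteLine($"{text} after {calls} attempts"); // prints: cell text after 3 attempts
```

In real Selenium code the action passed in would call Driver.FindElement again before reading, so every attempt works on a fresh reference rather than the stale one.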

Related

The wait.until returns stale element, how to wait until DOM is stable

I have a Blazor application running fine and want some behaviour tests with Selenium. The test currently does the following:
goes to a page (directly using a URL, no page loaded before)
tries to click a button
The first step works, but the second has an issue. If I use wait.Until to wait until the button is available, I receive back an early version of the button, which is then redrawn and updated in the DOM later. This gives me the "stale element reference: element is not attached to the page document" error.
Here is the code:
var xPath = By.XPath($".//tr//td[normalize-space()=\"{name}\"]/ancestor-or-self::tr//button");
var button = _wait.Until(ExpectedConditions.ElementToBeClickable(xPath));
Thread.Sleep(1000);
button = _chromeDriver.FindElement(xPath);
button.Click();
The _wait.Until call returns an item that goes stale, while the following FindElement returns a valid one, but only after the ~1 second sleep. Without the sleep, it returns the same element as the previous line.
The final question: how can I ensure in the _wait.Until line that the returned element is the final one, so I can avoid Thread.Sleep?
There is no gentle solution for such issues.
The only solution I know is to make a loop:
Wait for the element to be clickable and click it.
In case an exception is thrown, wait for clickability again.
If the click succeeds - break the loop and continue to the outer code / return true.
Something like this:
for (int i = 0; i < 6; i++)
{
    try
    {
        wait.Until(ExpectedConditions.ElementToBeClickable(xPath)).Click();
        break;
    }
    catch (Exception)
    {
        Thread.Sleep(100);
    }
}
#golddragon007, you can wait for the element to become stale.
_wait.Until(ExpectedConditions.StalenessOf(_chromeDriver.FindElement(xPath)))
you can check the following link for more details:
https://www.selenium.dev/selenium/docs/api/dotnet/html/M_OpenQA_Selenium_Support_UI_ExpectedConditions_StalenessOf.htm
Selenium Wait.Until conditions are not 100% reliable. I was shocked when I figured this out myself, but yes, after the condition has finished you sometimes have to wait, or try clicking or hovering in a loop.
I wish to suggest 2 options here:
Wait for the element to be both displayed and enabled (the C# equivalents of clickable and enabled):
wait.Until(driver => element.Displayed && element.Enabled);
Use a better UI testing tool such as Playwright. It has a much more sophisticated waiting system, and it is also faster and very reliable.
Prophet's answer is a good approach, so have a look at that. Sometimes you just need to try something more than one time. This is an alternative that is easier to reuse and more flexible.
All of the methods on the WebDriverWait class expect a lambda expression that returns a value. Sometimes I need to call a method with a void return type, so I write an extension method on WebDriverWait to support this:
public static class WebDriverWaitExtensions
{
    public static void Until(this WebDriverWait wait, Action<IWebDriver> action)
    {
        wait.Until(driver =>
        {
            action(driver);
            return true;
        });
    }
}
This is much more flexible and easier to call from your code. It will attempt to click the element until the click succeeds. A word of caution, however: a StaleElementReferenceException will cancel a wait operation immediately, so be sure to ignore that exception type.
var xPath = By.XPath($".//tr//td[normalize-space()=\"{name}\"]/ancestor-or-self::tr//button");
_wait.IgnoreExceptionTypes(typeof(StaleElementReferenceException));
_wait.Until(driver => driver.FindElement(xPath).Click());
This should work with any web driver method that has a void return type.
It seems another solution is this:
_wait.Until(driver => (bool)((IJavaScriptExecutor)driver).ExecuteScript("return typeof Blazor.reconnect != 'undefined'"));
as that variable is only loaded once the page is fully loaded. After that it's possible to do a simple FindElement without an issue:
var button = _chromeDriver.FindElement(xPath);
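Under the hood this is the same idea WebDriverWait implements: poll a boolean probe until it flips or a timeout expires. A browser-free sketch of that loop (the helper name and timings are illustrative; in real code the probe would be the ExecuteScript call above):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

// Poll `probe` every `interval` until it returns true or `timeout` elapses.
// In real code the probe would run something like
// ExecuteScript("return typeof Blazor.reconnect != 'undefined'").
static bool WaitUntil(Func<bool> probe, TimeSpan timeout, TimeSpan interval)
{
    var clock = Stopwatch.StartNew();
    while (true)
    {
        if (probe()) return true;          // condition met: stop polling
        if (clock.Elapsed >= timeout) return false; // give up after the timeout
        Thread.Sleep(interval);            // back off before the next poll
    }
}

// Demo: the probe becomes true on its fourth call.
int polls = 0;
bool ready = WaitUntil(() => ++polls >= 4,
                       timeout: TimeSpan.FromSeconds(5),
                       interval: TimeSpan.FromMilliseconds(10));
Console.WriteLine($"ready={ready} after {polls} polls"); // prints: ready=True after 4 polls
```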
Implement Polly as a resilience framework for the Selenium methods you create; an example is below:
public static IWebElement Click(string xPath, int retries = 15, int retryInterval = 1)
{
    var element = Policy.HandleResult<IWebElement>(result => result == null)
        .WaitAndRetry(retries, interval => TimeSpan.FromSeconds(retryInterval))
        .Execute(() =>
        {
            var webElement = Searchers.FindWebElementByXPath(xPath);
            if (webElement != null)
            {
                _logger.Info("Clicked Element: " + xPath + " (" + webElement.Text + ")");
                try
                {
                    _driver.ExecuteScript("arguments[0].click();", webElement);
                    return webElement;
                }
                catch (Exception e)
                {
                    if (e is ElementClickInterceptedException or StaleElementReferenceException)
                        return null;
                }
            }
            return null;
        });

    if (element != null) return element;

    _logger.Info("Failed to click Element: " + xPath);
    throw new Exception("Failed to use JavaScript to click element with XPath: " + xPath);
}
By leveraging the power of Polly, I can make sure all of my events go through. Do not rely on the base Selenium wait methods to wait for properties to change or reach a certain value; they are not very reliable.

WinAppDriver doesn't find element Inspect can find

I am trying to assert that a certain Text element is present on a GUI I am testing. I can see the element using Inspect, yet when running my code, the element is not found. Interestingly, I can't use the "watch cursor" to select the element; I can only select it from the tree. Here is an image of the Inspect window displaying the element I am trying to assert is present:
Inspect Image
I've tried asserting the Parent element is there so I know my XPath is okay. I've tried finding all children by appending /* to the path, and this seems to return null.
Here's the part of my test code I am trying:
string xpath_ABLRegion_child = "//*/Custom[@ClassName=\"D3NodeOverview\"]/Text[9]";
var WinElem_ABLRegion_child = desktopSession.FindElementByAbsoluteXPath(xpath_ABLRegion_child);
Assert.IsTrue(WinElem_ABLRegion_child.Displayed);
Console.WriteLine($"Text of WinElem_ABLRegion_child is {WinElem_ABLRegion_child.Text}");
I expect the test to pass, as WinElem_ABLRegion_child is displayed. However, I am getting an error that the element can't be found.
Please let me know if there is any more information I can share to help you help me.
You can try this...
FYI - your XPath will be extremely brittle. Can you cut it down a bit?
using C# - Method
public static bool IsElementPresent_byXpath(string xpath)
{
    bool result;
    try { result = Driver.FindElement(By.XPath(xpath)).Displayed; }
    catch (NoSuchElementException) { return false; }
    catch (StaleElementReferenceException) { return false; }
    return result;
}
call
bool ele = Browser.IsElementPresent_byXpath("your/long/xpath");
Assert.IsTrue(ele);
I noticed that the element you are looking for is the child of another window. Did you by chance assign the parent ownership (i.e. owner = this in the implementation)?
If so WinAppDriver seems to be blind to that bit of code. See ticket.

Selenium: Stale Element Reference (works fine when debugging)

I am trying to scrape a page with Selenium in C# which has several pages that I can go through by clicking a "Next" button on the page. I am usually getting the error that there is a stale element reference, which ONLY happens if I run it without breakpoints. If I go through the program step by step, it works perfectly fine. I'm assuming that Selenium is skipping over important stuff without waiting (even though I have a wait method implemented).
To the code, this is the main logic for the problem:
foundVacancies.AddRange(FindVacanciesOnPage());
const string nextBtnXPath = "//*[@id=\"ContainerResultList\"]/div/div[3]/nav/ul/li[8]/a";
if (Driver.FindElements(By.XPath(nextBtnXPath)).Count != 0)
{
while (TryClickingNextButton(nextBtnXPath))
{
foundVacancies.AddRange(FindVacanciesOnPage());
}
}
This method first gets all items on the first page and adds them to the foundVacancies list. After that, it looks for the "Next" button, which is not always there if there are not enough items. If it is, it clicks it, scrapes the page, and clicks again until there are no pages left. This works great when debugging, but something goes very wrong when running normally.
The method for getting all items on the page, and where the error occurs:
private IEnumerable<string> FindVacanciesOnPage()
{
    var vacancies = new List<string>();
    var tableContainingAllVacancies = Driver.FindElement(By.XPath("//*[@id=\"ContainerResultList\"]/div/div[2]/div/ul"));
    var listOfVacancies = tableContainingAllVacancies.FindElements(By.XPath(".//li/article/div[1]/a"));
    foreach (var vacancy in listOfVacancies)
    {
        vacancies.Add(vacancy.FindElement(By.XPath(".//h2")).Text);
    }
    return vacancies;
}
The items are in a <ul> HTML tag and have <li> children, which I go through one by one to get their inner text. The stale element error occurs in the foreach loop. I'm assuming that the web driver didn't have time to reload the DOM, because it works when breakpointing. However, I do have a method to wait until the page is fully loaded, which is what I use when going to the next page.
private bool TryClickingNextButton(string nextButtonXPath)
{
    var nextButton = Driver.FindElement(By.XPath(nextButtonXPath));
    var currentUrl = Driver.Url;
    ScrollElementIntoView(nextButton);
    nextButton.Click();
    WaitUntilLoaded();
    var newUrl = Driver.Url;
    return !currentUrl.Equals(newUrl);
}
I am comparing new and old URL to determine if this was the last page. The WaitUntilLoaded method looks like this:
var wait = new WebDriverWait(Driver, TimeSpan.FromSeconds(30));
wait.Until(x => ((IJavaScriptExecutor) Driver).ExecuteScript("return document.readyState").Equals("complete"));
Oddly enough, sometimes the web driver just closes immediately after loading the first page, without any errors nor any results. I spent a lot of time debugging and searching on SO, but can't seem to find any information, because the code is working perfectly fine when breakpointing through it.
I have only tried Chrome, with and without headless mode, but I don't see that this could be a Chrome problem.
The "Next" button has the following HTML:
<a href="" data-jn-click="nextPage()" data-ng-class="{'disabled-element':currentPage === totalPages}" tabindex="0">
<span class="hidden-md hidden-sm hidden-xs">Next <span class="icon icon-pagination-single-forward"></span></span>
<span class="hidden-lg icon icon-pagination-forward-enable"></span>
</a>
I couldn't find out what data-jn-click is. I tried to just execute the JavaScript nextPage();, but that didn't do anything.
I don't have any experience in C#, so if I am wrong please don't mind.
You are using FindElements and storing the result in var listOfVacancies. I have referred to some sites. Why don't you use ReadOnlyCollection<IWebElement>? It is better to store all elements as a collection and iterate through it.
So the code becomes:
ReadOnlyCollection<IWebElement> listOfVacancies = tableContainingAllVacancies.FindElements(By.XPath(".//li/article/div[1]/a"));
If the elements that are going into listOfVacancies are being populated via an AJAX call, then document.readyState won't catch that. Try waiting for jQuery's active request counter to reach zero instead:
wait.Until(x => (long)((IJavaScriptExecutor)Driver).ExecuteScript("return jQuery.active") == 0);
I finally found a way to solve this issue. It's dirty, but it works. I tried many different approaches to waiting until the page is fully loaded, but none worked. So I went down the dark path of Thread.Sleep, but it's not as bad as it sounds:
private IEnumerable<string> FindVacanciesOnPage()
{
    return FindVacanciesOnPage(new List<string>(), 0, 50, 15000);
}

private IEnumerable<string> FindVacanciesOnPage(ICollection<string> foundVacancies, long waitedTime, int interval, long maxWaitTime)
{
    try
    {
        var list = Driver.FindElements(By.XPath("//*[@data-ng-bind=\"item.JobHeadline\"]"));
        foreach (var vacancy in list)
        {
            foundVacancies.Add(vacancy.Text);
        }
    }
    catch (Exception)
    {
        if (waitedTime >= maxWaitTime) throw;
        Thread.Sleep(interval);
        waitedTime += interval;
        return FindVacanciesOnPage(foundVacancies, waitedTime, interval, maxWaitTime);
    }
    return foundVacancies;
}
This will try to get the items, and if there is an Exception thrown, just waits a certain amount of time until it tries again. When a specified maximum time was waited, the exception is finally thrown.

Multi-threaded C# Selenium WebDriver automation with Uris not known beforehand

I need to perform some simultaneous webdrivers manipulation, but I am uncertain as to how to do this.
What I am asking here is:
What is the correct way to achieve this ?
What is the reason for the exception I am getting (revealed below)?
After some research I ended up with:
1. The way I see people doing this (and the one I ended up using after playing with the API, before searching) is to loop over the window handles my WebDriver has at hand, and perform a switch to and out of the window handle I want to process, closing it when I am finished.
2. Selenium Grid does not seem like an option for me - am I wrong, or is it intended for parallel processing across several machines? Since I am running everything on a single computer, it will be of no use to me.
In trying the 1st option, I have the following scenario (a code sample is available below; I skipped stuff that is not relevant or repetitive wherever I added three dots):
I have a html page, with several submit buttons, stacked.
Clicking each of them will open a new browser window/tab (interestingly enough, ChromeDriver opens tabs, while FirefoxDriver opens a separate window for each).
As a side note: I can't determine the URIs of each submit beforehand (they are determined by JavaScript, and at this point let's just assume I want to handle everything knowing nothing about the client code).
Now, after looping over all the submit buttons, and issuing webElement.Click() on the corresponding elements, the tabs/windows open. The code flows to create a list of tasks to be executed, one for each new tab/window.
The problem is: since all tasks depend upon the same WebDriver instance to switch between window handles, it seems I will need to add resource-sharing locks/control. I am uncertain whether I am correct, since I saw no mention of locks/resource access control when searching for multi-threaded WebDriver examples.
On the other hand, if I were able to determine the tabs'/windows' URIs beforehand, I could skip all the automation steps needed to reach this point, and then creating a WebDriver instance for each thread via Navigate().GoToUrl() would be straightforward. But this looks like a deadlock! I don't see the WebDriver API providing any access to a newly opened tab/window without performing a switch, and I only want to switch if I do not have to repeat all the automation steps that led me to the current window!
...
In any case, I keep getting the exception:
Element belongs to a different frame than the current one - switch to its containing frame to use it
at
IWebElement element = cell.FindElement
inside the ToDictionary() block.
I obviously checked that all my selectors are returning results, in chrome's console.
foreach (IWebElement resultSet in resultSets)
    resultSet.Click();

foreach (string windowHandle in webDriver.WindowHandles.Skip(1))
{
    dataCollectionTasks.Add(Task.Factory.StartNew<List<DataTable>>(obj =>
    {
        List<DataTable> collectedData = new List<DataTable>();
        string window = obj as string;
        if (window != null)
        {
            webDriver.SwitchTo().Window(window);
            List<IWebElement> dataSets = webDriver.FindElements(By.JQuerySelector(utils.GetAppSetting("selectors.ResultSetData"))).ToList();
            DataTable data = null;
            for (int i = 0; i < dataSets.Count; i += 2)
            {
                data = new DataTable();
                data.Columns.Add("Col1", typeof(string));
                data.Columns.Add("Col2", typeof(string));
                data.Columns.Add("Col3", typeof(string));
                //...
                //data set header
                if (i % 2 != 0)
                {
                    IWebElement headerElement = dataSets[i].FindElement(OpenQA.Selenium.By.CssSelector(utils.GetAppSetting("selectors.ResultSetDataHeader")));
                    data.TableName = string.Join(" ", headerElement.Text.Split().Take(3));
                }
                //data set records
                else
                {
                    Dictionary<string, string> cells = dataSets[i]
                        .FindElements(OpenQA.Selenium.By.CssSelector(utils.GetAppSetting("selectors.ResultSetDataCell")))
                        .ToDictionary(
                            cell =>
                            {
                                IWebElement element = cell.FindElement(OpenQA.Selenium.By.CssSelector(utils.GetAppSetting("selectors.ResultSetDataHeaderColumn")));
                                return element == null ? string.Empty : element.Text;
                            },
                            cell =>
                            {
                                return cell == null ? string.Empty : cell.Text;
                            });
                    string col1Value, col2Value, col3Value; //...
                    cells.TryGetValue("Col1", out col1Value);
                    cells.TryGetValue("Col2", out col2Value);
                    cells.TryGetValue("Col3", out col3Value);
                    //...
                    data.Rows.Add(col1Value, col2Value, col3Value /*...*/);
                }
            }
            collectedData.Add(data);
        }
        webDriver.SwitchTo().Window(mainWindow);
        webDriver.Close();
        return collectedData;
    }, windowHandle));
} //foreach

Task.WaitAll(dataCollectionTasks.ToArray());

foreach (Task<List<DataTable>> dataCollectionTask in dataCollectionTasks)
{
    results.AddRange(dataCollectionTask.Result);
}
return results;
Task.WaitAll(dataCollectionTasks.ToArray());
foreach (Task<List<DataTable>> dataCollectionTask in dataCollectionTasks)
{
results.AddRange(dataCollectionTask.Result);
}
return results;
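On the locking question raised above: the .NET WebDriver bindings are not thread-safe, so every SwitchTo-and-scrape sequence would indeed need to run under a single lock, which serializes the tasks anyway. A minimal sketch of that serialization, with a plain counter standing in for the shared driver (no Selenium involved, all names illustrative):

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;

object driverGate = new object();   // guards the shared "driver"
int activeUsers = 0;                // how many tasks are inside the critical section now
int maxConcurrent = 0;              // the highest concurrency ever observed
int total = 0;                      // how many "scrape" operations completed

var tasks = Enumerable.Range(0, 8).Select(_ => Task.Run(() =>
{
    lock (driverGate)               // only one task may touch the driver at a time
    {
        int now = ++activeUsers;
        if (now > maxConcurrent) maxConcurrent = now;
        total++;                    // stand-in for SwitchTo().Window(...) + scraping
        --activeUsers;
    }
})).ToArray();

Task.WaitAll(tasks);
Console.WriteLine($"total={total}, maxConcurrent={maxConcurrent}"); // prints: total=8, maxConcurrent=1
```

Because every task must hold the same lock for its whole switch-and-scrape sequence, maxConcurrent never exceeds 1: the parallelism buys nothing. That is why one driver instance per URI (when the URIs can be determined beforehand) is the usual answer.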

is there a better way to handle RPC_E_CALL_REJECTED exceptions when doing visual studio automation?

this is what I'm currently doing:
protected void setupProject()
{
    bool lbDone = false;
    int liCount = 0;
    while (!lbDone && liCount < pMaxRetries)
    {
        try
        {
            pProject.ProjectItems.Item("Class1.cs").Delete();
            lbDone = true;
        }
        catch (System.Runtime.InteropServices.COMException loE)
        {
            liCount++;
            if ((uint)loE.ErrorCode == 0x80010001)
            {
                // RPC_E_CALL_REJECTED - sleep half a second, then try again
                System.Threading.Thread.Sleep(pDelayBetweenRetry);
            }
        }
    }
}
Now, I have that try/catch block around most calls to the EnvDTE stuff, and it works well enough. The problem I have is when I try to loop through a collection and do something to each item once.
foreach(ProjectItem pi in pProject.ProjectItems)
{
// do something to pi
}
Sometimes I get the exception in the foreach(ProjectItem pi in pProject.ProjectItems) line.
Since I don't want to start the foreach loop over if I get the RPC_E_CALL_REJECTED exception I'm not sure what I can do.
Edit to answer comment:
Yes I'm automating VS from another program and yes I usually am using VS for something else at the same time. We have an application that reads an xml file then generates around 50 VS solutions based on the xml file. This usually takes a couple of hours so I try to do other work while this is happening.
There is a solution on this MSDN page: How to: Fix 'Application is Busy' and 'Call was Rejected By Callee' Errors. It shows how to implement a COM IOleMessageFilter interface so that it will automatically retry the call.
First, Hans doesn't want to say so but the best answer to "how to do this" is "don't do this". Just use separate instances of visual studio for your automation and your other work, if at all possible.
You need to move your problem statement out to somewhere you can handle the error. You can do this by using an integer index instead of foreach (note that EnvDTE collections are 1-based):
// You might also need try/catch for this!
int cProjectItems = pProject.ProjectItems.Count;
for (int iProjectItem = 1; iProjectItem <= cProjectItems; iProjectItem++)
{
    bool bSucceeded = false;
    while (!bSucceeded)
    {
        try
        {
            ProjectItem pi = pProject.ProjectItems.Item(iProjectItem);
            // do something with pi
            bSucceeded = true;
        }
        catch (System.Runtime.InteropServices.COMException loE)
        {
            if ((uint)loE.ErrorCode == 0x80010001)
            {
                // RPC_E_CALL_REJECTED - sleep half a second, then try again
                System.Threading.Thread.Sleep(pDelayBetweenRetry);
            }
        }
    }
}
I didn't have much luck with the recommended way from MSDN, and it seemed rather complicated. What I have done is to wrap up the re-try logic, rather like in the original post, into a generic utility function. You call it like this:
Projects projects = Utils.call( () => (m_dteSolution.Projects) );
The 'call' function calls the function (passed in as a lambda expression) and will retry if necessary. Because it is a generic function, you can use it to call any EnvDTE properties or methods, and it will return the correct type.
Here's the code for the function:
public static T call<T>(Func<T> fn)
{
    // We will try to call the function up to 100 times...
    for (int i = 0; i < 100; ++i)
    {
        try
        {
            // We call the function passed in and return the result...
            return fn();
        }
        catch (COMException)
        {
            // We've caught a COM exception, which is most likely
            // a Server is Busy exception. So we sleep for a short
            // while, and then try again...
            Thread.Sleep(1);
        }
    }
    throw new Exception("'call' failed to call function after 100 tries.");
}
As the original post says, foreach over EnvDTE collections can be a problem, as there are implicit calls during the looping. So I use my 'call' function to get the Count property and then iterate using an index. It's uglier than foreach, but the 'call' function makes it not so bad, as there aren't so many try...catches around. For example:
int numProjects = Utils.call(() => (projects.Count));
for (int i = 1; i <= numProjects; ++i)
{
    Project project = Utils.call(() => (projects.Item(i)));
    parseProject(project);
}
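The 'call' wrapper above only accepts a Func<T>, so void calls (e.g. pi.Delete()) need a second entry point. A sketch of both shapes, testable without Visual Studio by throwing COMException a couple of times; the names and retry count mirror the 'call' helper above, but are otherwise illustrative:

```csharp
using System;
using System.Runtime.InteropServices;
using System.Threading;

// Retry an EnvDTE-style call while it throws COMException (e.g. RPC_E_CALL_REJECTED).
static T Call<T>(Func<T> fn, int maxTries)
{
    for (int i = 0; i < maxTries; i++)
    {
        try { return fn(); }
        catch (COMException) { Thread.Sleep(1); } // server busy: back off briefly and retry
    }
    throw new Exception($"Call failed after {maxTries} tries.");
}

// Void overload: wrap the Action so it flows through the Func<T> version.
static void CallVoid(Action fn, int maxTries) =>
    Call<bool>(() => { fn(); return true; }, maxTries);

// Demo: an operation that is "busy" (RPC_E_CALL_REJECTED) twice, then succeeds.
int attempts = 0;
CallVoid(() =>
{
    attempts++;
    if (attempts < 3)
        throw new COMException("busy", unchecked((int)0x80010001)); // RPC_E_CALL_REJECTED
}, 100);
Console.WriteLine($"succeeded after {attempts} attempts"); // prints: succeeded after 3 attempts
```

A production version would also check loE.ErrorCode, as the original post does, so that only the rejected-call HRESULT is retried rather than every COM failure.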
I was getting the same error using C# to read/write to Excel. Oddly, it worked in debug mode but not on a deployed machine. I simply changed the Excel app to be Visible, and it works properly, albeit about twice as slow. It is annoying to have an Excel app open and close dynamically on your screen, but this seems to be the simplest work-around for Excel.
Microsoft.Office.Interop.Excel.Application oApp = new ApplicationClass();
oApp.Visible = true;
oApp.DisplayAlerts = false;
