I am trying to assert that a certain Text element is present on a GUI I am testing. I can see the element using Inspect, yet when running my code, the element is not found. Interestingly, I can't use the "watch cursor" to select the element; I can only select it from the tree. Here is an image of the Inspect window displaying the element I am trying to assert is present:
[Image: Inspect window showing the Text element to be asserted]
I've tried asserting the Parent element is there so I know my XPath is okay. I've tried finding all children by appending /* to the path, and this seems to return null.
Here's the part of my test code I am running:
string xpath_ABLRegion_child = "//*/Custom[@ClassName=\"D3NodeOverview\"]/Text[9]";
var WinElem_ABLRegion_child = desktopSession.FindElementByAbsoluteXPath(xpath_ABLRegion_child);
Assert.IsTrue(desktopSession.FindElementByAbsoluteXPath(xpath_ABLRegion_child).Displayed);
Console.WriteLine($"Text of WinElem_ABLRegion_child is {WinElem_ABLRegion_child.Text}");
I expect the test to pass, as WinElem_ABLRegion_child is displayed. However, I am getting an error that the element can't be found.
Please let me know if there is any more information I can share to help you help me.
You can try this:
FYI, your XPath will be extremely brittle. Can you cut that down a bit?
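For instance, a sketch of a less brittle locator (the Name value is a placeholder; check what Inspect actually shows for your element):
// Hypothetical shorter locator: target the Text element by its Name instead of a
// positional index, and drop the leading //*/ step.
var elem = desktopSession.FindElementByXPath(
    "//Custom[@ClassName=\"D3NodeOverview\"]/Text[@Name=\"ExpectedLabel\"]");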
Here is a C# helper method for the presence check:
public static bool IsElementPresent_byXpath(string xpath)
{
bool result;
try { result = Driver.FindElement(By.XPath(xpath)).Displayed; }
catch (NoSuchElementException) { return false; }
catch (StaleElementReferenceException) { return false; }
return result;
}
Call it like this:
bool ele = Browser.IsElementPresent_byXpath("your/long/xpath");
Assert.IsTrue(ele);
I noticed that the element you are looking for is the child of another window. Did you by chance assign the parent ownership (i.e. owner = this in the implementation)?
If so, WinAppDriver seems to be blind to that bit of code. See ticket.
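If that turns out to be the cause, one workaround is to attach a session directly to the owned window. A rough sketch only; the window locator, server URL and capability style are assumptions and depend on your Appium.WebDriver version:
// Hypothetical: attach a separate WinAppDriver session to the owned window so its
// descendants become searchable.
var ownedWindow = desktopSession.FindElementByName("Owned Window Title"); // placeholder locator
var handleHex = int.Parse(ownedWindow.GetAttribute("NativeWindowHandle")).ToString("x");
var options = new AppiumOptions();
options.AddAdditionalCapability("appTopLevelWindow", handleHex);
var childSession = new WindowsDriver<WindowsElement>(new Uri("http://127.0.0.1:4723"), options);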
Related
I have a Blazor application running fine and want to have some behavior tests with Selenium. The test currently does the following:
goes to a page (directly via a URL, with no page loaded before)
tries to click on a button
The first point works, but the second has an issue. If I use wait.Until to wait until the button is available, I receive back an early version of the button, which is then redrawn and updated in the DOM later. This gives me the "stale element reference: element is not attached to the page document" error.
Here is the code:
var xPath = By.XPath($".//tr//td[normalize-space()=\"{name}\"]/ancestor-or-self::tr//button");
var button = _wait.Until(ExpectedConditions.ElementToBeClickable(xPath));
Thread.Sleep(1000);
button = _chromeDriver.FindElement(xPath);
button.Click();
The _wait.Until call returns an item that will be stale, while the next FindElement returns a valid one, but only after the ~1 second sleep. If I don't have the sleep there, it returns the same stale element as the line above.
The final question: how can I ensure in the _wait.Until line that the returned element is the final one, so I can avoid using Thread.Sleep?
There is no elegant solution for such issues.
The only solution I know for this is to make a loop:
Wait for the element to be clickable and click it.
In case an exception is thrown, wait for clickability again.
If the click succeeds, break the loop and continue to the outer code / return true.
Something like this:
for (int i = 0; i < 6; i++)
{
    try
    {
        wait.Until(ExpectedConditions.ElementToBeClickable(xPath)).Click();
        break;
    }
    catch (Exception)
    {
        // Element went stale or was not clickable yet; wait briefly and retry.
        Thread.Sleep(100);
    }
}
@golddragon007, you can wait for the element to become stale.
_wait.Until(ExpectedConditions.StalenessOf(_chromeDriver.FindElement(xPath)))
you can check the following link for more details:
https://www.selenium.dev/selenium/docs/api/dotnet/html/M_OpenQA_Selenium_Support_UI_ExpectedConditions_StalenessOf.htm
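A minimal sketch of how that could fit the original code, assuming _wait, _chromeDriver and xPath are set up as in the question:
// Grab the early (pre-render) element, wait for Blazor to replace it,
// then re-find the fresh element and click it.
var earlyButton = _chromeDriver.FindElement(xPath);
_wait.Until(ExpectedConditions.StalenessOf(earlyButton));
var button = _wait.Until(ExpectedConditions.ElementToBeClickable(xPath));
button.Click();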
Selenium Wait.Until conditions are not 100% reliable. I was in shock when I figured this out myself, but yes, after the condition has finished you sometimes have to wait, or try clicking or hovering in a loop.
I wish to suggest 2 options here:
Wait for the element to be both Clickable and Enabled:
getWait().Until(driver => element.Enabled);
Use a much better UI testing tool like Playwright. It has a much more sophisticated waiting system and is also faster and very reliable.
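For illustration only, a rough Playwright for .NET sketch of the same click (assumes the Microsoft.Playwright package, an async test method, and a placeholder URL; the selector is adapted from the question's XPath). Playwright auto-waits for the element to be attached, visible, stable and enabled before clicking, so no explicit wait is needed:
using Microsoft.Playwright;

// Inside an async test method.
using var playwright = await Playwright.CreateAsync();
await using var browser = await playwright.Chromium.LaunchAsync();
var page = await browser.NewPageAsync();
await page.GotoAsync("https://your-blazor-app.example"); // placeholder URL
await page.Locator($"//tr[td[normalize-space()='{name}']]//button").ClickAsync();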
Prophet's answer is a good approach, so have a look at that. Sometimes you just need to try something more than one time. This is an alternative that is easier to reuse and more flexible.
All of the methods on the WebDriverWait class expect a lambda expression to return a value. Sometimes I need to call a method with a void return type, so I write an extension method on WebDriverWait to support this:
public static class WebDriverWaitExtensions
{
public static void Until(this WebDriverWait wait, Action<IWebDriver> action)
{
wait.Until(driver =>
{
action(driver);
return true;
});
}
}
This is much more flexible and easier to call from your code. It will attempt to click the element until it succeeds. A word of caution, however: a StaleElementReferenceException will cancel a wait operation immediately, so be sure to ignore this kind of exception.
var xPath = By.XPath($".//tr//td[normalize-space()=\"{name}\"]/ancestor-or-self::tr//button");
_wait.IgnoreExceptionTypes(typeof(StaleElementReferenceException));
_wait.Until(driver => driver.FindElement(xPath).Click());
This should work with any web driver method that has a void return type.
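For example, the same extension could retry typing into a field that is still being re-rendered (the locator here is just a placeholder):
// Hypothetical second use of the void-returning Until extension: SendKeys also returns void.
_wait.Until(driver => driver.FindElement(By.Id("SearchBox")).SendKeys("blazor"));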
It seems another solution is this:
_wait.Until(driver => (bool)((IJavaScriptExecutor)driver).ExecuteScript("return typeof Blazor.reconnect != 'undefined'"));
as that variable is only defined once the page is fully loaded. After that, it's possible to do a simple find without an issue:
var button = _chromeDriver.FindElement(xPath);
Implement Polly as a resilience framework around the Selenium methods you create; an example is shown below:
public static IWebElement Click(string xPath, int retries = 15,
int retryInterval = 1)
{
var clickedElement = Policy.HandleResult<IWebElement>(result => result == null)
.WaitAndRetry(retries, interval => TimeSpan.FromSeconds(retryInterval))
.Execute(() =>
{
var element = Searchers.FindWebElementByXPath(xPath);
if (element != null)
{
_logger.Info("Clicked Element: " + xPath + " (" + element.Text + ")");
try
{
_driver.ExecuteScript("arguments[0].click();", element);
return element;
}
catch (Exception e)
{
if (e is ElementClickInterceptedException or StaleElementReferenceException)
return null;
}
}
return null;
});
if (clickedElement != null) return clickedElement;
_logger.Info("Failed to click Element: " + xPath);
throw new Exception("Failed to use Javascript to click element with XPath: " + xPath);
}
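For reference, a hypothetical call to the helper above (the XPath and retry values are just examples):
var nextButton = Click("//button[normalize-space()='Next']", retries: 10, retryInterval: 2);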
By leveraging the power of Polly, I can be confident that all of my events will go through.
Do not use the base Selenium wait methods to wait for certain properties to change or reach a certain value; they are not very reliable.
I am trying to scrape a site with Selenium in C#; it has several pages that I can go through by clicking a "Next" button. I usually get a stale element reference error, which ONLY happens if I run it without breakpoints. If I step through the program, it works perfectly fine. I'm assuming that Selenium is skipping over important steps without waiting (even though I have a wait method implemented).
To the code, this is the main logic for the problem:
foundVacancies.AddRange(FindVacanciesOnPage());
const string nextBtnXPath = "//*[@id=\"ContainerResultList\"]/div/div[3]/nav/ul/li[8]/a";
if (Driver.FindElements(By.XPath(nextBtnXPath)).Count != 0)
{
while (TryClickingNextButton(nextBtnXPath))
{
foundVacancies.AddRange(FindVacanciesOnPage());
}
}
This method first gets all items on the first page and adds them to the foundVacancies list. After that, it looks for the "Next" button, which is not always there if there are not enough items. If it is there, it clicks it, scrapes the page, and clicks again until there are no pages left. This works great when debugging, but something goes very wrong when running normally.
The method for getting all items on the page, and where the error occurs:
private IEnumerable<string> FindVacanciesOnPage()
{
var vacancies = new List<string>();
var tableContainingAllVacancies = Driver.FindElement(By.XPath("//*[@id=\"ContainerResultList\"]/div/div[2]/div/ul"));
var listOfVacancies = tableContainingAllVacancies.FindElements(By.XPath(".//li/article/div[1]/a"));
foreach (var vacancy in listOfVacancies)
{
vacancies.Add(vacancy.FindElement(By.XPath(".//h2")).Text);
}
return vacancies;
}
The items are in a <ul> HTML tag with <li> children, which I go through one by one to get their inner text. The stale element error occurs in the foreach loop. I'm assuming the web driver didn't have time to reload the DOM, because it works when breakpointing. However, I do have a method to wait until the page is fully loaded, which I use when going to the next page.
private bool TryClickingNextButton(string nextButtonXPath)
{
var nextButton = Driver.FindElement(By.XPath(nextButtonXPath));
var currentUrl = Driver.Url;
ScrollElementIntoView(nextButton);
nextButton.Click();
WaitUntilLoaded();
var newUrl = Driver.Url;
return !currentUrl.Equals(newUrl);
}
I am comparing the new and old URLs to determine whether this was the last page. The WaitUntilLoaded method looks like this:
var wait = new WebDriverWait(Driver, TimeSpan.FromSeconds(30));
wait.Until(x => ((IJavaScriptExecutor) Driver).ExecuteScript("return document.readyState").Equals("complete"));
Oddly enough, sometimes the web driver just closes immediately after loading the first page, without any errors or results. I spent a lot of time debugging and searching on SO, but I can't seem to find any information, because the code works perfectly fine when breakpointing through it.
I have only tried Chrome, with and without headless mode, but I don't think this is a Chrome problem.
The "Next" button has the following HTML:
<a href="" data-jn-click="nextPage()" data-ng-class="{'disabled-element':currentPage === totalPages}" tabindex="0">
<span class="hidden-md hidden-sm hidden-xs">Next <span class="icon icon-pagination-single-forward"></span></span>
<span class="hidden-lg icon icon-pagination-forward-enable"></span>
</a>
I couldn't find out what data-jn-click is. I tried to just execute the JavaScript nextPage();, but that didn't do anything.
I don't have much experience in C#, so please don't mind if I am wrong.
You are using FindElements and storing the result in var listOfVacancies. I have referred to some sites; why don't you use ReadOnlyCollection<IWebElement>? It is better to store all the elements in a typed collection and iterate through it.
So the code becomes,
ReadOnlyCollection<IWebElement> listOfVacancies = tableContainingAllVacancies.FindElements(By.XPath(".//li/article/div[1]/a"));
If the elements that are going into listOfVacancies are being populated via an AJAX call, then document.readyState won't catch that. Try using:
wait.Until(x => (long)((IJavaScriptExecutor) Driver).ExecuteScript("return jQuery.active") == 0);
I finally found a way to solve this issue. It's dirty, but it works. I tried many different approaches to waiting until the page is fully loaded, but none worked. So I went down the dark path of Thread.Sleep, but it's not as bad as it sounds:
private IEnumerable<string> FindVacanciesOnPage()
{
return FindVacanciesOnPage(new List<string>(), 0, 50, 15000);
}
private IEnumerable<string> FindVacanciesOnPage(ICollection<string> foundVacancies, long waitedTime, int interval, long maxWaitTime)
{
try
{
var list = Driver.FindElements(By.XPath("//*[@data-ng-bind=\"item.JobHeadline\"]"));
foreach (var vacancy in list)
{
foundVacancies.Add(vacancy.Text);
}
}
catch (Exception)
{
if (waitedTime >= maxWaitTime) throw;
Thread.Sleep(interval);
waitedTime += interval;
return FindVacanciesOnPage(foundVacancies, waitedTime, interval, maxWaitTime);
}
return foundVacancies;
}
This tries to get the items, and if an exception is thrown, it waits a certain amount of time before trying again. Once the specified maximum time has been waited, the exception is finally rethrown.
I have been working on making my Selenium framework a Page Factory; however, I am struggling to get the Wait.Until commands working in my extension class.
public static void Wait(this IWebElement element, IWebDriver driver, float TimeOut)
{
WebDriverWait Wait = new WebDriverWait(driver, TimeSpan.FromSeconds(TimeOut));
return Wait.Until(ExpectedConditions.ElementIsVisible(element));
}
If I use the above code I get the error
Cannot convert from 'OpenQA.Selenium.IWebElement' to 'OpenQA.Selenium.By'
Any suggestions how can I amend the code above to make it work in the By model I am using?
There is no ExpectedConditions.ElementIsVisible(IWebElement). Unfortunately, you can only use ElementIsVisible with By objects.
If appropriate, you could substitute ExpectedConditions.ElementToBeClickable(IWebElement), which is a slightly different condition that also checks that the element is enabled in addition to being visible, but it may satisfy your requirement.
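A minimal sketch of that substitution, keeping the shape of your extension method but changing the return type from void to IWebElement (ElementToBeClickable has an IWebElement overload, unlike ElementIsVisible):
public static IWebElement Wait(this IWebElement element, IWebDriver driver, float timeOut)
{
    // Waits until the element is visible and enabled, then returns it.
    var wait = new WebDriverWait(driver, TimeSpan.FromSeconds(timeOut));
    return wait.Until(ExpectedConditions.ElementToBeClickable(element));
}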
Alternatively, you could just call element.Displayed in a custom WebDriverWait, making sure to ignore or catch the NoSuchElementException.
Here is an old implementation of this that I've used, changed for your case; there may be a cleaner way to do it now:
new WebDriverWait(driver, TimeSpan.FromSeconds(TimeOut))
{
Message = "Element was not displayed within timeout of " + TimeOut + " seconds"
}.Until(d =>
{
try
{
return element.Displayed;
}
catch(NoSuchElementException)
{
return false;
}
});
A quick explanation of the code above: it will try to evaluate element.Displayed over and over until it returns true. While the element does not exist, a NoSuchElementException is thrown and caught, returning false, so the WebDriverWait keeps executing until the element exists and element.Displayed returns true, or the TimeOut is reached.
Replace
return Wait.Until(ExpectedConditions.ElementIsVisible(element));
with the line of code below and check, as this worked for me:
wait.Until(ExpectedConditions.ElementIsVisible(By.Id("ctlLogin_UserName")));
wherein "ctlLogin_UserName" is the ID of your web element.
I am getting a stale element exception when I run the code. However, while debugging I do not get this exception.
Here is my piece of code. Can anybody help me? Thanks.
public static bool CheckListFilterResult(IList<IWebElement> gridColumns, IList<IWebElement> gridRows, string filterColumn, List<string> filters)
{
    bool checkResult = false;
    int filterColumnIndex = GetColumnIndex(gridColumns, filterColumn);
    if (gridRows.Count > 0)
    {
        foreach (IWebElement row in gridRows)
        {
            TestManager.Doc.Step("before bool match");
            bool filterMatch = filters.Contains(row.FindElements(By.TagName("td"))[filterColumnIndex].Text.Trim());
            if (filterMatch)
            {
                checkResult = true;
            }
            else
            {
                checkResult = false;
                break;
            }
        }
    }
    return checkResult;
}
From looking at your code, that line is the first line where you access something from the rows collection. My guess is that it has something to do with that. Did you pull the rows collection, then apply a filter, then call this function? If so, that's probably the problem. You need to apply the filter, then pull the rows collection, then call the function.
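A rough sketch of that order of operations; ApplyFilter and the locators are hypothetical placeholders for whatever your page objects actually do:
// Apply the filter first so the grid refreshes, then pull fresh header and row
// elements, and only then run the check against the fresh collections.
ApplyFilter(filterColumn, filters);                                       // hypothetical helper
var freshColumns = driver.FindElements(By.CssSelector("#grid thead th")); // hypothetical locators
var freshRows = driver.FindElements(By.CssSelector("#grid tbody tr"));
bool ok = CheckListFilterResult(freshColumns, freshRows, filterColumn, filters);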
It could be that your IWebElement row was updated by JavaScript while you were running your Selenium code.
As explained here: http://docs.seleniumhq.org/exceptions/stale_element_reference.jsp, a Stale Element Reference Exception happens when the object in the UI has been refreshed but you are trying to access the same object. So even if the row is visible to the user, it may have been updated/replaced by JavaScript, in which case it is a different object.
The reason it happens only occasionally, but not always, could be that when you run in debug mode you stop at each line of your Selenium code, but the JavaScript doesn't stop, so the element in the UI has already been refreshed and your Selenium code gets a fresh copy of the IWebElement row.
There are several possible ways of fixing this problem.
Try to catch StaleElementReferenceException; if it happens, redo whatever you are trying to do (a sketch follows this list).
You can do: wait.Until(ExpectedConditions.StalenessOf(row));
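A minimal sketch of the catch-and-retry idea from the first bullet; driver, rowIndex and the row locator are hypothetical placeholders from your own framework:
string cellText;
try
{
    cellText = row.FindElements(By.TagName("td"))[filterColumnIndex].Text.Trim();
}
catch (StaleElementReferenceException)
{
    // The old reference is dead: re-locate the same row and read the cell again.
    var freshRow = driver.FindElements(By.CssSelector("#grid tbody tr"))[rowIndex];
    cellText = freshRow.FindElements(By.TagName("td"))[filterColumnIndex].Text.Trim();
}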
Hope it helps
I want to find a textbox with id = UserName and give it the value sa.
There is something wrong with my test.
The error shows UnexpectedJavaScriptError.
What's going on? How can I solve this?
Here is my code.
public void SetupTest()
{
driver.Manage().Timeouts().ImplicitlyWait(new TimeSpan(0, 0, 30));
driver.Navigate().GoToUrl(**the WEBSITE url**);
}
public void Test1()
{
IJavaScriptExecutor js = (IJavaScriptExecutor)driver;
js.ExecuteScript("document.getElementById('UserName').value='sa'");
}
thanks
I use Selenium at my day job, and this sort of code should work fine.
I think the problem may be that the element isn't there when you are trying to use it.
My suggestion is to try something like this:
try {
var element = document.getElementById('UserName');
if(element) {
element.value = 'sa';
}
} catch(e) {}
I actually have a "jquery" helper method in our selenium code that uses jQuery to find elements and then returns the [0] element so that we can use it with Selenium API.
private IWebElement GetFirstElement()
{
return (IWebElement)((IJavaScriptExecutor)_driver).ExecuteScript("return $(\"" + _selector + "\")[0];", null);
}
Additionally, you might just need to wait until the element is on the screen.
An easy way to do this is to use Selenium's FindElement(By...) because Selenium will wait a configurable amount of time for the element to appear.
If you do this, it might make more sense to avoid JavaScript altogether for this case, but what you are trying to do should work in a perfect scenario.
You don't need to access the DOM via JavaScript when using Selenium WebDriver. Instead of using JavaScript, you can achieve the same thing with the code snippet below:
string text = "sa";
IWebElement element = driver.FindElement(By.Id("UserName"));
element.SendKeys(text);