I have a test case that is uploading a file and I am guessing I need a while loop to determine when the upload is complete.
There is an XPath //div[@class='media-upload-progress finished'] that matches when the file has finished uploading, and //div[@class='media-upload-progress uploading'] that matches while the file is still uploading.
I thought I could do something with a while loop and a SeleniumDriver.IsElementPresent but I have not been able to figure it out.
Any ideas?
Thanks for the help!
I would suggest you give DefaultWait a try. Its PollingInterval is exactly what you need here, since the finished element won't be present until the file is completely uploaded. The following code polls the DOM every 100 ms looking for the intended element.
// Requires OpenQA.Selenium and OpenQA.Selenium.Support.UI
By bySelector = By.XPath("//div[@class='media-upload-progress finished']");
DefaultWait<IWebDriver> wait = new DefaultWait<IWebDriver>(driver);
wait.Timeout = TimeSpan.FromSeconds(1); // increase the timeout as needed
wait.PollingInterval = TimeSpan.FromMilliseconds(100);
wait.IgnoreExceptionTypes(typeof(NoSuchElementException));
// Add more typeof() exceptions as needed
IWebElement element = wait.Until<IWebElement>((d) =>
{
    return d.FindElement(bySelector);
});
Disclaimer: I have never personally implemented this, so the code above is untested on my side, but in theory it should solve the issue you are having.
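If you also want to be sure the in-progress indicator has gone away, the same DefaultWait can poll for both conditions. A minimal sketch, assuming the two XPaths from the question; the helper name WaitForUploadToFinish is hypothetical:
// Hypothetical helper: waits until the 'uploading' indicator is gone
// and the 'finished' indicator is present. Assumes the XPaths from the question.
public static IWebElement WaitForUploadToFinish(IWebDriver driver, TimeSpan timeout)
{
    var wait = new DefaultWait<IWebDriver>(driver)
    {
        Timeout = timeout,
        PollingInterval = TimeSpan.FromMilliseconds(100)
    };
    wait.IgnoreExceptionTypes(typeof(NoSuchElementException), typeof(StaleElementReferenceException));
    return wait.Until(d =>
    {
        // Still uploading? Keep polling.
        if (d.FindElements(By.XPath("//div[@class='media-upload-progress uploading']")).Count > 0)
            return null;
        return d.FindElement(By.XPath("//div[@class='media-upload-progress finished']"));
    });
}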
I am trying to click an accept-terms checkbox while signing up for an account in Selenium. My code seems to work, but only on some of my customers' computers. On those where it doesn't, it throws a StaleElementReferenceException. I have looked online for a way to handle it, such as:
bool result = false;
int attempts = 0;
Thread.Sleep(500);
while (attempts < 100)
{
    Thread.Sleep(700);
    try
    {
        driver.Navigate().Refresh();
        IJavaScriptExecutor javascriptExecutor = (IJavaScriptExecutor)driver;
        Actions action = new Actions(driver);
        IWebElement db = driver.FindElement(By.XPath("/html/body/div[1]/div/div[2]/form/div[1]/label/span[2]"));
        action.MoveToElement(db).Build().Perform();
        javascriptExecutor.ExecuteScript("arguments[0].click();", db);
        result = true;
        break;
    }
    catch (WebDriverException)
    {
        // swallow and retry
    }
    attempts++;
}
nextPage(driver, wait);
I have also tried clicking it directly with the driver and clicking with Actions, but nothing seems to work for certain users. At this point I don't know the solution or what else to try.
UPDATE 1:
As I mentioned in the comments, I still receive a stale element, but I seem to have found the cause: it is thrown when using proxies from specific countries. The element is still there, so why would the proxies cause this?
A stale element usually means the DOM changed between the moment you found the element and the moment you used it. There are a few things you can try:
Inspect the page and try to determine what is changing for those users. Maybe MoveToElement triggers a tooltip or causes some classes to change. If you can identify such a change, you can wait for it to finish and then re-find the element.
Try using fluent waits as explained here
Try clicking the element as soon as it is found: driver.FindElement(By.XPath(xpath)).Click();
Note: the JavaScript click you are using in the snippet works well for ClickIntercepted exceptions, not so much for stale elements. The element will be stale regardless of how you click it.
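Combining the last two suggestions, here is a minimal sketch: the element is re-found on every poll, and staleness is ignored while waiting. The locator and timing values are placeholders, not values from the question:
// Minimal sketch: re-find and click on every attempt so a stale
// reference is simply retried. Locator and timings are placeholders.
var wait = new WebDriverWait(driver, TimeSpan.FromSeconds(30))
{
    PollingInterval = TimeSpan.FromMilliseconds(250)
};
wait.IgnoreExceptionTypes(typeof(StaleElementReferenceException),
                          typeof(NoSuchElementException));
wait.Until(d =>
{
    // FindElement runs inside the wait, so each poll gets a fresh reference.
    d.FindElement(By.XPath(xpath)).Click();
    return true;
});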
This might be a long shot, but I might as well try here. There is a block of C# code that is rebuilding a Solr core. The steps are as follows:
Delete all the existing documents
Get the core entities
Split the entities into batches of 1000
Spin off threads to perform the next set of processes:
Serialize each batch to JSON and write the JSON to a file on the server hosting the core
Send a command to the core to upload that file using System.Net.WebClient solrurl/corename/update/json?stream.file=myfile.json&stream.contentType=application/json;charset=utf-8
Delete the file. I've also tried deleting the files after all the batches are done, as well as not deleting the files at all
After all batches are done it commits. I've also tried committing after each batch is done.
My problem is that the last batch will not upload if it's much smaller than the batch size. It flows through as if the command was called, but nothing happens. It throws no exceptions and I see no errors in the Solr logs. My questions are: why, and how can I ensure the last batch always gets uploaded? We think it's a timing issue, but we've added Thread.Sleep(30000) in many parts of the code to test that theory and it still happens.
The only times it doesn't happen are:
if the batch is full or almost full
if we don't run it on multiple threads
if we put a breakpoint at the File.Delete line on the last batch, wait for 30 seconds or so, then continue
Here is the code for writing the file and calling the update command. This is called for each batch.
private const string
    FileUpdateCommand = "{1}/update/json?stream.file={0}&stream.contentType=application/json;charset=utf-8",
    SolrFilesDir = @"\\MYSERVER\SolrFiles",
    SolrFileNameFormat = SolrFilesDir + @"\{0}-{1}.json",
    _solrUrl = "http://MYSERVER:8983/solr/",
    CoreName = "MyCore";
public void UpdateCoreByFile(List<CoreModel> items)
{
    if (items.Count == 0)
        return;
    var settings = new JsonSerializerSettings { DateTimeZoneHandling = DateTimeZoneHandling.Utc };
    var dir = new DirectoryInfo(SolrFilesDir);
    if (!dir.Exists)
        dir.Create();
    var filename = string.Format(SolrFileNameFormat, Guid.NewGuid(), CoreName);
    using (var sw = new StreamWriter(filename))
    {
        sw.Write(JsonConvert.SerializeObject(items, settings));
    }
    var file = HttpUtility.UrlEncode(filename);
    var command = string.Format(FileUpdateCommand, file, CoreName);
    using (var client = _clientFactory.GetClient()) // System.Net.WebClient
    {
        client.DownloadData(new Uri(_solrUrl + command));
    }
    //Thread.Sleep(30000); // doesn't work if I add this
    File.Delete(filename); // works here if I add a breakpoint and wait 30 sec or so
}
I'm just trying to figure out why this is happening and how to address it. I hope this makes sense, and I have provided enough information and code. Thanks for any help.
Since changing the size of the data set and adding a breakpoint "fixes" it, this is most certainly a race condition. Since you haven't included the code that actually indexes the content, it's impossible to say what the issue really is, but my guess is that the last commit happens before all the threads have finished, and it only works when all threads are done (if you sleep the threads, the issue will still be there, since all the threads sleep for the same amount of time).
The easy fix: use commitWithin instead, and never issue explicit commits. The commitWithin parameter makes sure that the documents become available in the index within the given time frame (given in milliseconds). To make sure that the documents you submit become available within ten seconds, append &commitWithin=10000 to your URL.
If there are already documents pending a commit, the newly added documents will be committed before the ten seconds have elapsed, but even if there's just one document submitted as the last batch, it will never take more than ten seconds before it becomes visible (and no documents will be left forever in a non-committed limbo).
That way you won't have to keep your threads synchronized or issue a final commit, as long as you wait until all threads have finished before exiting your application (if it's an application that actually terminates).
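In the code from the question, this could be as simple as appending the parameter to the existing FileUpdateCommand constant. A minimal sketch of that change:
// Sketch: append commitWithin so Solr commits each batch on its own
// within ten seconds; explicit commit calls can then be dropped.
private const string FileUpdateCommand =
    "{1}/update/json?stream.file={0}" +
    "&stream.contentType=application/json;charset=utf-8" +
    "&commitWithin=10000"; // milliseconds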
I am doing an upload with Selenium; after the upload an element becomes invisible, which tells me it's completed. The problem is that this code throws a WebDriverException every time after 1 minute, and I don't know why, since I set the timeout to 5 minutes.
It would be great if someone knows a solution for this. Thank you :)
var wait = new WebDriverWait(_driver, TimeSpan.FromSeconds(300));
wait.Until(ExpectedConditions.InvisibilityOfElementLocated(By.XPath(xpath)));
I am using Selenium 2.25 WebDriver
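For what it's worth, the one-minute failure looks like the driver's underlying HTTP command timeout (which defaults to 60 seconds) rather than the WebDriverWait timeout. If that is the cause, one option is to raise the command timeout when constructing the driver; a sketch under that assumption, using the RemoteWebDriver constructor overload that accepts a command timeout (the hub address is a placeholder):
// Sketch: the third argument raises the per-command HTTP timeout,
// which is separate from (and should exceed) the WebDriverWait timeout.
var driver = new RemoteWebDriver(
    new Uri("http://localhost:4444/wd/hub"), // assumed hub address
    DesiredCapabilities.Firefox(),
    TimeSpan.FromMinutes(6));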
I'm having an issue with finding elements on the page: sometimes my test cases find the element and sometimes they don't, because the page hasn't finished loading. If I add the line below, it seems to work:
driver.Manage().Timeouts().SetPageLoadTimeout(TimeSpan.FromSeconds(2));
My question is: I don't want this line scattered throughout my code. Is there a way to centralize it in one place?
Any help would be greatly appreciated, thanks!
If you set the timeout once, it's set for the lifetime of the driver instance. You don't need to keep resetting it. You can set this immediately after creating the driver.
IWebDriver driver = new FirefoxDriver();
driver.Manage().Timeouts().SetPageLoadTimeout(TimeSpan.FromSeconds(2));
The only caveat for using this timeout is that not every browser may support it completely (IE does for sure, Firefox does too I think, but I don't think Chrome does).
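Since the timeout only needs to be set once, one way to centralize it is a small factory method that every test uses to obtain its driver. A minimal sketch; the CreateDriver name is just an illustration:
// Hypothetical factory: every test gets its driver from here,
// so the page-load timeout lives in exactly one place.
public static IWebDriver CreateDriver()
{
    IWebDriver driver = new FirefoxDriver();
    driver.Manage().Timeouts().SetPageLoadTimeout(TimeSpan.FromSeconds(2));
    return driver;
}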
You can try a workaround like this:
Observe the element that loads last in your page and find its id (or any other identifier). Then do something like this:
while (true)
{
    try
    {
        IWebElement element = driver.FindElement(By.Id(...)); // id of the last-loading element
        if (element.Displayed)
        {
            break;
        }
    }
    catch (Exception)
    {
        // element not in the DOM yet; keep polling
        continue;
    }
}
This keeps looping until the last-loaded element is displayed, then breaks. The element-not-found exception is caught and the loop continues until the element shows up.
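Note that as written the loop can spin forever if the element never appears; a deadline guard avoids that. A minimal variation, with the 30-second limit and element id chosen arbitrarily:
// Same polling idea, but bounded: give up after 30 seconds (arbitrary).
var deadline = DateTime.UtcNow + TimeSpan.FromSeconds(30);
while (DateTime.UtcNow < deadline)
{
    try
    {
        if (driver.FindElement(By.Id("lastLoadedElement")).Displayed) // hypothetical id
        {
            break;
        }
    }
    catch (NoSuchElementException)
    {
        // not in the DOM yet; keep polling
    }
    Thread.Sleep(250); // avoid hammering the driver
}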
We have some basic C# logic that iterates over a directory and returns the folders and files within. When run against a network share (\\server\share\folder) that is inaccessible or invalid, the code seems to 'hang' for about 30 seconds before returning from the call.
I'd like to end up with a method that will attempt to get folders and files from the given path, but without the timeout period. In other words, to reduce or eliminate the timeout altogether.
I've tried something as simple as validating the existence of the directory ahead of time thinking that an 'unavailable' network drive would quickly return false, but that did not work as expected.
System.IO.Directory.Exists(path) //hangs
System.IO.DirectoryInfo di = new System.IO.DirectoryInfo(path); //hangs
Any suggestions on what may help me achieve an efficient (and hopefully managed) solution?
You can use this code:
// Check existence on a worker thread; if it doesn't answer within
// 100 ms, treat the share as unreachable instead of blocking.
var task = new Task<bool>(() => { var di = new DirectoryInfo(path); return di.Exists; });
task.Start();
return task.Wait(100) && task.Result;
This runs the check on its own thread; if it doesn't come back within the allotted time, move on.
Perhaps you could try pinging the server first, and only ask for the directory info if you get a response?
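A minimal sketch of that idea, assuming the server name can be extracted from the UNC path (here hard-coded as "MYSERVER" for illustration):
using System.Net.NetworkInformation;

// Sketch: ping the host first with a short timeout; only touch the
// share if the host answers. "MYSERVER" is a placeholder.
bool serverUp;
using (var ping = new Ping())
{
    try
    {
        serverUp = ping.Send("MYSERVER", 500).Status == IPStatus.Success; // 500 ms timeout
    }
    catch (PingException)
    {
        serverUp = false; // name didn't resolve or network error
    }
}
if (serverUp)
{
    var dirs = System.IO.Directory.GetDirectories(@"\\MYSERVER\share\folder");
}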
See Faster DirectoryExists function? for a way of setting the execution time for Directory.Exists.