How to handle errors (unable to locate element) in Selenium - python

I am writing a bot with Selenium for a game that involves a lot of clicking.
Sometimes it throws an "Unable to locate element" error:
selenium.common.exceptions.NoSuchElementException: Message: Unable to locate element: /html/body/div/div[1]/header/div/div[1]/a
The XPath is correct, but the error sometimes appears in different parts of my code. It is not that I have one mistake that always fails in the same place; the errors are random. I am handling it like this:
try:
    secondPhoto = self.driver.find_element_by_xpath(
        '/html/body/span/section/main/article/div[2]/div/div[1]/div[1]')
    secondPhotoOpen = ActionChains(self.driver)
    secondPhotoOpen.move_to_element(secondPhoto)
    secondPhotoOpen.click()
    secondPhotoOpen.perform()
except:
    time.sleep(3)
    self.driver.find_element_by_xpath(
        '/html/body/span/section/main/article/div[1]/div/div/div[1]/div[2]').click()
This is not an ideal solution. It still throws errors, just less frequently.
I am also using time.sleep. The errors usually show up when I am doing something else on the internet or the page lags (which is why I added time.sleep in the first place). I now have about 50 .click() calls in my code, each wrapped in try/except, but it still does not work reliably.
Do you have an effective solution for this? How can I write code that uses .click() and is guaranteed to work regardless of lag or other browser activity?
How do I wait for the next page/image to fully load after click() (instead of using time.sleep)?

You can use WebDriverWait:
btn = WebDriverWait(driver, 10).until(EC.element_to_be_clickable((By.ID, "element_id")))
btn.click()
This waits up to 10 seconds for the element to become clickable and only then clicks it. I would also recommend reading the Selenium documentation on explicit waits. With WebDriverWait you don't need hard-coded pauses.
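For completeness, here is a minimal, self-contained sketch of that approach with the imports it needs; the URL and the element_id locator are placeholders for whatever your page actually uses:

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Firefox()
driver.get("https://example.com")  # placeholder URL

# Wait up to 10 seconds for the element to become clickable, then click it.
btn = WebDriverWait(driver, 10).until(
    EC.element_to_be_clickable((By.ID, "element_id"))  # placeholder locator
)
btn.click()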

Related

How to use selenium to navigate web pages with multiple drop downs and check boxes

I am trying to automate downloading data from a website. The code works if I run it step by step, but if I run it all at once it fails with the error:
ElementNotInteractableException: Message: element not interactable
I have worked around this with time.sleep(x amount of time), but it still fails intermittently, and I am having trouble implementing implicit waits. Any help would be appreciated. Code below.
import time

from selenium import webdriver

browser = webdriver.Chrome(executable_path=r'path\to\chromedriver.exe')
browser.get("https://map.sarig.sa.gov.au/")
browser.maximize_window()
browser.switch_to.frame(browser.find_element_by_id('MapViewer'))
browser.find_element_by_xpath('//*[@id="TourWidget"]/div[1]/span').click()
browser.find_element_by_xpath('//*[@id="menuAllMapLayers"]/div[2]/p').click()
browser.find_element_by_xpath('//*[@id="238"]/li[1]/div/div/span[1]').click()
time.sleep(3)
browser.find_element_by_xpath('//*[@id="238"]/li[1]/div/div/span[1]').click()
browser.find_element_by_xpath('//*[@id="238"]/li[3]/div/div/label/span').click()
browser.find_element_by_xpath('//*[@id="239"]/li[1]/div/div/span[1]').click()
browser.find_element_by_xpath('//*[@id="239"]/li[3]/div/div/label/span').click()
browser.find_element_by_xpath('//*[@id="menuActiveLayers"]').click()
browser.find_element_by_xpath('//*[@id="groupOptions238"]/span').click()
time.sleep(3)
browser.find_element_by_xpath('//*[@id="238"]/li[2]/div/div[3]/div[2]/span').click()
browser.find_element_by_xpath('//*[@id="groupOptions239"]/span').click()
time.sleep(3)
browser.find_element_by_xpath('//*[@id="239"]/li[2]/div/div[3]/div[2]/span').click()
Use ActionChains and its pause(3) method instead of time.sleep(3), but it could also help to use explicit waits and to check whether your elements are actually "visible" rather than merely "present" (see expected_conditions).
With that many dropdowns, the entries may not be visible all the time, but you can run these checks after doing a move_to_element() so the element actually is visible, as in the sketch below.
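A minimal sketch of that combination, reusing two of the locators from the question (they may still need adjusting for your page; the 10-second timeout and the 1-second pause are arbitrary choices):

from selenium.webdriver import ActionChains
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

wait = WebDriverWait(browser, 10)

# Open the menu, pausing briefly inside the action chain instead of calling time.sleep().
menu = wait.until(EC.visibility_of_element_located(
    (By.XPATH, '//*[@id="menuAllMapLayers"]/div[2]/p')))
ActionChains(browser).move_to_element(menu).pause(1).click(menu).perform()

# Wait until the nested entry is actually visible (not just present) before clicking it.
layer = wait.until(EC.visibility_of_element_located(
    (By.XPATH, '//*[@id="238"]/li[1]/div/div/span[1]')))
layer.click()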

Selenium weird behavior, element not clickable when it's there

I run a multithreaded Python Selenium script that gives me behavior I cannot understand. After opening a new browser I try to clear the cache with the code below:
driver.get('chrome://settings/clearBrowserData')
WebDriverWait(driver, 10).until(EC.visibility_of_element_located((By.XPATH, '//settings-ui')))
driver.find_element_by_xpath('//settings-ui').send_keys(Keys.ENTER)
90 out of 100 times this works, but sometimes it throws this error:
element not interactable
To my understanding, the page is somehow not fully loaded and Selenium is too "fast".
However, I do wait until the element is visible, which should rule that out, and to my understanding driver.get also waits until the "full get" has returned and finished.
Can someone explain why this behavior occurs?
A sub-question: is my understanding correct that driver.get(page) waits until the page has fully loaded?
The ElementNotInteractableException exception can occur for a few reasons:
1. The element that you want to interact with is disabled.
2. There's an overlay element covering the one you want to interact with, such as a loading spinner.
If it's 1, use EC.element_to_be_clickable instead of EC.visibility_of_element_located.
If it's 2, you'll need to either wait for the overlay element to go away on its own, or perform an action to make the overlay element go away.
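A sketch of both cases for the settings page above; the spinner selector in case 2 is hypothetical, so substitute whatever element is actually covering the dialog:

from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

wait = WebDriverWait(driver, 10)

# Case 1: wait until the element is enabled/clickable, not merely visible.
dialog = wait.until(EC.element_to_be_clickable((By.XPATH, '//settings-ui')))

# Case 2: wait for a covering element (e.g. a loading spinner) to disappear first.
wait.until(EC.invisibility_of_element_located((By.CSS_SELECTOR, '.loading-spinner')))  # hypothetical selector

dialog.send_keys(Keys.ENTER)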

element click intercepted exception in selenium python

I am a beginner with Selenium and am having a tough time understanding one of the errors that appears while using the Selenium chromedriver in Python.
I am trying to click an element inside an svg tag using a css_selector, but I get ElementClickInterceptedException with the error (... is not clickable at point (1281, 60). Other element would receive the click: ...). However, if I put time.sleep(5) before clicking the element, then I am able to click it. Why is that happening?
My first guess was that the element was not yet visible, or was inside an invisible box, and I had to wait. So I tried to handle it with an explicit wait, but both conditions timed out, so I assume that was not the cause. That makes it hard for me to guess the reason, since time.sleep(5) does work.
Thank you in advance.

Page seems to load but not able to access data using selenium/python

I'm making sure I wait for the page to load with WebDriverWait, but it still hits the timeout exception and I can't figure out why. I also checked that the XPath is present in the Chrome developer inspector and confirmed that it is. Here is the snippet if anyone can help me.
Thanks!
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.common.exceptions import TimeoutException

url2 = 'https://www.rotorooter.com/adelantoca/'
driver.get(url2)
delay = 3
try:
    name = WebDriverWait(driver, delay).until(EC.presence_of_element_located((By.XPATH, '/html/body/app-root/div/app-not-found/div/app-local-page/app-local-map/div/div/div/div[2]/div[4]/div[1]/strong')))
    print("Page is ready!", name)
except TimeoutException:
    print("Loading took too much time!")
Returns:
Loading took too much time!
There are several issues here.
The main problem is that the element you are checking is far below the initially visible part of the page. When the page is opened, this element is not loaded until you scroll down, which is why your WebDriverWait times out.
You should never use automatically generated locators like /html/body/app-root/div/app-not-found/div/app-local-page/app-local-map/div/div/div/div[2]/div[4]/div[1]/strong. They are extremely unreliable and brittle.
It's also recommended to use larger timeouts for WebDriverWait. Making them too short can lead to false errors caused by a slow connection or slow pages, and they add no extra runtime in the normal case, since Selenium continues immediately once the condition is met.
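A minimal sketch combining these points, assuming the block is lazy-loaded on scroll; the shortened relative XPath is derived from the one in the question and should be verified against the actual page:

from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver.get('https://www.rotorooter.com/adelantoca/')

# Scroll to the bottom so the lazily loaded section is actually rendered.
driver.execute_script("window.scrollTo(0, document.body.scrollHeight);")

# Use a generous timeout; it only costs time when something is genuinely wrong.
name = WebDriverWait(driver, 30).until(
    EC.visibility_of_element_located((By.XPATH, '//app-local-map//strong'))  # shortened locator, verify it
)
print("Page is ready!", name.text)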

Python selenium stale element error

Using Python with Selenium, the following code produces a stale element error. Can anyone see why?
def test_set_language(self):
    driver = self.driver
    driver.get("http://somewebpage.com")
    elemL = driver.find_element_by_name("selectLang")
    elemL.send_keys(Keys.DOWN)
    driver.implicitly_wait(10)
    self.assertIn("Mot", driver.page_source)
StaleElementReferenceException is thrown if, between the lookup and the interaction with that element, the page was reloaded or the DOM changed.
The lookup is the moment Selenium assigns an internal id to the element - see the Selenium execution logs for details.
If you can't find a better solution, you may want to use the generic workaround for StaleElementReferenceException: surround the interaction with a try/except block, wait a few seconds in the except block, and retry the operation. This is not an elegant solution, but it is what people have to do at times and it works just fine.
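A sketch of that retry pattern for the snippet above; the helper name, the retry count, and the 2-second pause are all arbitrary illustrations:

import time
from selenium.common.exceptions import StaleElementReferenceException
from selenium.webdriver.common.keys import Keys

def send_down_with_retry(driver, name, attempts=3):
    # Re-locate the element on every attempt and retry if it goes stale
    # between the lookup and the interaction.
    for _ in range(attempts):
        try:
            elem = driver.find_element_by_name(name)
            elem.send_keys(Keys.DOWN)
            return
        except StaleElementReferenceException:
            time.sleep(2)  # let the DOM settle, then retry
    raise StaleElementReferenceException("element kept going stale: " + name)

send_down_with_retry(driver, "selectLang")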
