WebDriver does not recognize element - python

I'm trying to make Selenium wait for a specific element (near the bottom of the page) because I have to wait until the page is fully loaded.
I'm confused by its behavior.
I'm not an expert in Selenium, but I expected this to work:
from selenium import webdriver
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
driver = webdriver.Firefox()
wait = WebDriverWait(driver, 10)
def load_page():
    driver.get('http://www.firmy.cz/?geo=0&q=hodinov%C3%BD+man%C5%BEel&thru=sug')
    wait.until(EC.visibility_of_element_located((By.PARTIAL_LINK_TEXT, 'Zobrazujeme')))
    html = driver.page_source
    print html
load_page()
TIMEOUT:
File "C:\Python27\lib\site-packages\selenium\webdriver\support\wait.py", line 78, in until
raise TimeoutException(message, screen, stacktrace)
selenium.common.exceptions.TimeoutException: Message:
I'm just trying to see the HTML of the fully loaded page. It raises TimeoutException, but I'm sure that element is already there. I've also tried another approach:
wait.until(EC.visibility_of_element_located(driver.find_element_by_xpath('//a[@class="companyTitle"]')))
But this approach raises an error too:
selenium.common.exceptions.NoSuchElementException:
Message: Unable to locate element:
{"method":"xpath","selector":"//a[@class=\"companyTitle\"]"}

Loading the site takes a long time; use implicit waiting.
In this case, since you are interested in the whole HTML, you don't have to wait for a specific element at the bottom of the page.
The load_page function will print the HTML as soon as the whole site is loaded, provided you give the browser enough time to do so with implicitly_wait().
from selenium import webdriver
driver = webdriver.Firefox()
# wait max 30 seconds till any element is located
# or the site is loaded
driver.implicitly_wait(30)
def load_page():
    driver.get('http://www.firmy.cz/?geo=0&q=hodinov%C3%BD+man%C5%BEel&thru=sug')
    html = driver.page_source
    print html
load_page()

The main issue in your code is wrong selectors.
If you want to wait until the web element with the text Zobrazujeme is loaded and then print the page source:
from selenium import webdriver
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
driver = webdriver.Firefox()
wait = WebDriverWait(driver, 10)
def load_page():
    driver.get('http://www.firmy.cz/?geo=0&q=hodinov%C3%BD+man%C5%BEel&thru=sug')
    wait.until(EC.visibility_of_element_located((By.CLASS_NAME, 'switchInfoExt')))
    html = driver.page_source
    print html
load_page()
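Note that in Selenium 4 the find_element_by_* helpers were removed and the By-based API is used throughout. A minimal sketch of the same explicit wait in the modern style (the class name switchInfoExt is taken from the answer above and may have changed on the live site; Selenium 4 requires Python 3, hence print()):

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Firefox()
wait = WebDriverWait(driver, 10)

def load_page():
    driver.get('http://www.firmy.cz/?geo=0&q=hodinov%C3%BD+man%C5%BEel&thru=sug')
    # Same explicit wait; only the element-finding API changed in Selenium 4
    wait.until(EC.visibility_of_element_located((By.CLASS_NAME, 'switchInfoExt')))
    print(driver.page_source)

load_page()
driver.quit()
```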

Related

Using Selenium to accept a splash page

When I load my site as shown below, a splash screen appears asking me to confirm I am 21 years of age. I am trying to locate the element and click Yes, but I am unable to bypass the age verification screen.
My approach was to load the page, add a sleep, find the element, and click Yes. However, it won't work.
driver.get('https://seelbachs.com/products/sagamore-spirit-cask-strength-rye-whiskey')
time.sleep(5)
element = driver.find_element_by_xpath('//*[@id="enter"]')
element.click()
Error received.
selenium.common.exceptions.NoSuchElementException: Message: no such element: Unable to locate element: {"method":"xpath","selector":"//*[@id="enter"]"}
So either my find_element is off or my time.sleep is not helping with the splash page.
It seems the problem is that the pop-up is in an iframe.
This code appears to do the trick:
from selenium import webdriver
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
from selenium.common.exceptions import NoSuchElementException
from selenium.webdriver import ActionChains
import time
driver = webdriver.Chrome('/usr/local/bin/chromedriver')
driver.get('https://seelbachs.com/products/sagamore-spirit-cask-strength-rye-whiskey')
# xpath of iframe
frame_xpath = '/html/body/div[5]/div/div/div/div/iframe'
wait = WebDriverWait(driver, 10)
# wait until iframe appears and select iframe
wait.until(EC.frame_to_be_available_and_switch_to_it((By.XPATH, frame_xpath)))
# select button
xpath = '//*[#id="enter"]'
time.sleep(2)
element = driver.find_element_by_xpath(xpath)
ActionChains(driver).move_to_element(element).click(element).perform()
(referencing this answer on Stack Overflow)
You need to switch to the frame; check the code below:
driver.get('https://seelbachs.com/products/sagamore-spirit-cask-strength-rye-whiskey')
sleep(5)
driver.switch_to.frame(driver.find_element_by_xpath("//iframe[contains(@id,'fancyboxAge-frame')]"))
OR
# Using an explicit wait
WebDriverWait(driver,10).until(EC.frame_to_be_available_and_switch_to_it((By.XPATH,"//iframe[contains(@id,'fancyboxAge-frame')]")))
element = driver.find_element_by_xpath('//*[@id="enter"]')
element.click()
Imports:
from time import sleep
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.wait import WebDriverWait
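After clicking inside the iframe, remember to return to the top-level document before locating anything outside it; otherwise later lookups keep searching inside the (now dismissed) frame. A minimal sketch, reusing the frame locator from the answer above (which is an assumption about the page's current markup):

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.wait import WebDriverWait

driver = webdriver.Chrome()
driver.get('https://seelbachs.com/products/sagamore-spirit-cask-strength-rye-whiskey')
wait = WebDriverWait(driver, 10)

# Enter the age-gate iframe and click the Yes button
wait.until(EC.frame_to_be_available_and_switch_to_it(
    (By.XPATH, "//iframe[contains(@id,'fancyboxAge-frame')]")))
wait.until(EC.element_to_be_clickable((By.ID, "enter"))).click()

# Switch back to the main document before any further scraping
driver.switch_to.default_content()
```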

I want to click on a button with Selenium

I'm trying to do some web scraping and I need to simulate a click on a button. I've tried this:
url = "https://apps5.mineco.gob.pe/transparencia/mensual/default.aspx?y=2021&ap=ActProy"
driver = webdriver.Chrome()
driver.get(url)
nivelGob = driver.find_element_by_xpath('//*[@id="ctl00_CPH1_BtnTipoGobierno"]')
nivelGob.click()
and it returns this error:
selenium.common.exceptions.NoSuchElementException: Message: no such element: Unable to locate element: {"method":"xpath","selector":"//*[@id="ctl00_CPH1_BtnTipoGobierno"]"}
(Session info: chrome=88.0.4324.190)
I've also tried finding the element by CSS selector and by class name, but nothing works.
This is the button:
I hope someone can help me. Thanks a lot.
The website is actually within another frame so you need to switch to that frame. Try this:
from selenium import webdriver
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
import time
url = "https://apps5.mineco.gob.pe/transparencia/mensual/default.aspx?y=2021&ap=ActProy"
driver = webdriver.Chrome()
driver.get(url)
time.sleep(3)
frame = driver.find_element_by_id("frame0")
driver.switch_to.frame(frame)
WebDriverWait(driver, 10).until(EC.presence_of_element_located((By.ID, "ctl00_CPH1_BtnTipoGobierno"))).click()
Perhaps the DOM isn't fully loaded yet. Try adding an implicit wait to your driver:
driver.implicitly_wait(10)  # seconds

Python Selenium Webdriver doesn't refresh html after changing dropdown value in AJAX pages

I'm trying to scrape an AJAX webpage using Python and Selenium. The problem is, when I change the dropdown value, the page content changes according to my selection, but Selenium returns the same old HTML from the page. I'd appreciate it if anyone could help. Here is my code:
from selenium import webdriver
from selenium.webdriver.support.ui import Select
import time
url = "https://myurl.com/PATH"
driver = webdriver.Chrome()
driver.get(url)
time.sleep(5)
# change the dropdown value
sprintSelect = Select(driver.find_element_by_id("dropdown-select"))
sprintSelect.select_by_visible_text("DropDown_Value2")
html = driver.execute_script("return document.documentElement.outerHTML")
print(html)
You need to wait for the AJAX call to reload the content after your selection.
Try putting an implicit or explicit wait after the selection.
driver.implicitly_wait(10) # 10 seconds
or, if you know the tag/id etc. of the web element you want, try an explicit wait:
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
element = WebDriverWait(driver, 10).until(
    EC.presence_of_element_located((By.ID, "some_ID"))
)
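Another option (a sketch, not taken from the answer above) is to wait for the old content to go stale after the selection, which signals that the AJAX refresh has replaced it. The dropdown id comes from the question; the "results" container id is hypothetical:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import Select, WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
driver.get("https://myurl.com/PATH")

# Keep a reference to an element the AJAX refresh will replace
old_content = driver.find_element(By.ID, "results")  # hypothetical container id

Select(driver.find_element(By.ID, "dropdown-select")).select_by_visible_text("DropDown_Value2")

# The old reference goes stale once that region is re-rendered
WebDriverWait(driver, 10).until(EC.staleness_of(old_content))
print(driver.execute_script("return document.documentElement.outerHTML"))
```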

How to reach inside <main> tags using selenium Python?

I am using Python 2.7.12 and Selenium 3.0.2.
I want to find the <main> tag inside a <section> tag; here is the code:
driver = webdriver.Chrome()
driver.get("https://www.semanticscholar.org/")
input_t = driver.find_element_by_xpath('//input[@type="search"]')
input_t.send_keys(keyword)
input_t.send_keys(Keys.ENTER)
target = driver.find_element_by_xpath('//main')
Running this, I got an exception:
selenium.common.exceptions.NoSuchElementException: Message: no such element: Unable to locate element: {"method":"xpath","selector":"//main"}
But there is indeed a <main> tag in the page:
<main class="main-column results" data-reactid=".dyth4mk2kg.0.1.0.1"><div class="controls" data-reactid=".dyth4mk2kg.0.1.0.1.1">
...
</main>
It's just a timing issue. You should use an Explicit Wait to wait until the main tag is loaded and visible after clicking the search button, as below:
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
driver = webdriver.Chrome()
driver.get("https://www.semanticscholar.org/")
# Now enter the text to search
driver.find_element_by_name("q").send_keys(keyword)
# Now click the search button
driver.find_element_by_css_selector(".search-bar .button").click()
# Now wait until the main tag is visible
target = WebDriverWait(driver, 30).until(EC.visibility_of_element_located((By.CSS_SELECTOR, "main.main-column.results")))

Issues clicking an element using selenium

I'm using this code to explore TripAdvisor (Portuguese comments):
from selenium import webdriver
from bs4 import BeautifulSoup
driver = webdriver.Firefox()
driver.get("https://www.tripadvisor.com/Airline_Review-d8729164-Reviews-Cheap-Flights-TAP-Portugal#review_425811350")
driver.set_window_size(1920, 1080)
Then I'm trying to click the Google Translate link:
driver.find_element_by_class_name("googleTranslation").click()
But I'm getting this error:
WebDriverException: Message: Element is not clickable at point (854.5, 10.100006103515625). Other element would receive the click: <div class="inner easyClear"></div>
So the div class="inner easyClear" is receiving the click. I tried exploring it:
from bs4 import BeautifulSoup
soup = BeautifulSoup(driver.page_source, "html.parser")
for i in soup.findAll("div", "easyClear"):
    print i
    print "================="
But I was unable to get any intuition from this as to what changes would make the Google Translate link clickable. Please help.
EDIT:
I've also tried these:
driver.execute_script("window.scrollTo(0, 1200);")
driver.find_element_by_class_name("googleTranslation").click()
Resizing the browser to full screen etc..
What worked for me was to use an Explicit Wait with the element_to_be_clickable Expected Condition, targeting the inner span element:
from selenium import webdriver
from selenium.webdriver import ActionChains
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
driver = webdriver.Chrome()
driver.get("https://www.tripadvisor.com.br/ShowUserReviews-g1-d8729164-r425802060-TAP_Portugal-World.html")
wait = WebDriverWait(driver, 10)
google_translate = wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, ".googleTranslation .link")))
actions = ActionChains(driver)
actions.move_to_element(google_translate).click().perform()
You may also be getting into a "survey" or "promotion" popup - make sure to account for those.
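One common way to account for such popups (a sketch; the .overlay .close selector is hypothetical and would need to match the site's actual markup) is a short wait that dismisses the dialog if it appears and simply moves on if it doesn't:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.common.exceptions import TimeoutException

driver = webdriver.Chrome()
driver.get("https://www.tripadvisor.com.br/ShowUserReviews-g1-d8729164-r425802060-TAP_Portugal-World.html")

try:
    # Give a possible survey/promotion overlay a few seconds to appear
    close_btn = WebDriverWait(driver, 3).until(
        EC.element_to_be_clickable((By.CSS_SELECTOR, ".overlay .close")))  # hypothetical selector
    close_btn.click()
except TimeoutException:
    pass  # no popup this time; continue with the normal flow
```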
