I want to click on a button with Selenium - python

I'm trying to scrape a website and I need to simulate a click on a button. I've tried this:
url = "https://apps5.mineco.gob.pe/transparencia/mensual/default.aspx?y=2021&ap=ActProy"
driver = webdriver.Chrome()
driver.get(url)
nivelGob = driver.find_element_by_xpath('//*[@id="ctl00_CPH1_BtnTipoGobierno"]')
nivelGob.click()
and it returns this error:
selenium.common.exceptions.NoSuchElementException: Message: no such element: Unable to locate element: {"method":"xpath","selector":"//*[@id="ctl00_CPH1_BtnTipoGobierno"]"}
(Session info: chrome=88.0.4324.190)
I've also tried finding the element by CSS selector and by class name, but nothing works.
This is the button:
I hope someone can help me. Thanks a lot.

The website content is actually inside another frame, so you need to switch to that frame first. Try this:
from selenium import webdriver
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
import time
url = "https://apps5.mineco.gob.pe/transparencia/mensual/default.aspx?y=2021&ap=ActProy"
driver = webdriver.Chrome()
driver.get(url)
time.sleep(3)
frame = driver.find_element_by_id("frame0")
driver.switch_to.frame(frame)
WebDriverWait(driver, 10).until(EC.presence_of_element_located((By.ID, "ctl00_CPH1_BtnTipoGobierno"))).click()

Perhaps the DOM isn't fully loaded yet. Try adding an implicit wait to your driver:
driver.implicitly_wait(10)  # seconds
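For intuition, both implicit and explicit waits boil down to the same poll-until-timeout loop. Here is a browser-free sketch of that idea (illustrative only, not Selenium's actual implementation):

```python
import time

def wait_until(condition, timeout=10, poll=0.5):
    """Poll `condition` until it returns a truthy value or `timeout`
    seconds pass -- the same poll-and-retry idea behind Selenium's
    implicit and explicit waits."""
    deadline = time.monotonic() + timeout
    while True:
        result = condition()
        if result:
            return result
        if time.monotonic() >= deadline:
            raise TimeoutError("condition not met within %s seconds" % timeout)
        time.sleep(poll)
```

This is why a wait fixes NoSuchElementException: the lookup is retried until the element appears in the DOM, instead of failing on the first attempt.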


How to fix: Unable to locate element (method: XPath)

This is the website on which I am trying to automate some clicks:
I have tried clicking the button using the XPath and the full XPath, but still got no luck.
This is the simple code:
w = webdriver.Chrome(executable_path='chromedriver.exe',
                     chrome_options=options)
w.get("https://quillbot.com/")
time.sleep(5)
pasteXpath = "//button[contains(@class,'outlinedPrimary') and .//span[contains(text(),'Paste Text')]]"
element = w.find_element_by_xpath(pasteXpath).click()
But it fails with this message in the console:
selenium.common.exceptions.NoSuchElementException: Message: no such element: Unable to locate element: {"method":"xpath","selector":"//*[@id="inOutContainer"]/div[2]/div[2]/div/div[1]/div/div/div[1]/div/div/div[2]/div/div/button/span[1]/div"}
Please show me how to automate this click using selenium.
I recommend using By, WebDriverWait, and expected_conditions in place of .find_element_by_xpath.
After you click the paste button you will receive a permissions prompt. See below to get past it.
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium import webdriver
from selenium.webdriver.chrome.service import Service
import time
import pyautogui
service = Service('C:\\Path_To_Your\\chromedriver.exe')
driver = webdriver.Chrome(service=service)
driver.get('https://quillbot.com/')
paste_button = WebDriverWait(driver, 3).until(EC.visibility_of_element_located(
    (By.XPATH, "//span[text()='Paste Text']")))
paste_button.click()
time.sleep(2)
pyautogui.press('tab')
pyautogui.press('tab')
pyautogui.press('enter')
This will work:
pasteXpath = "//button[contains(@class,'outlinedPrimary') and .//span[contains(text(),'Paste Text')]]"
w.find_element_by_xpath(pasteXpath).click()
Don't forget to add some wait / delay before it to make sure the page is fully loaded.
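As an aside, the shape of that locator is easy to parameterize if you need it for several buttons. This small helper is hypothetical, not part of Selenium:

```python
def button_with_label_xpath(class_fragment, label):
    """Build an XPath matching a <button> whose class attribute contains
    `class_fragment` and which has a <span> descendant whose text
    contains `label` -- the same pattern as the selector above."""
    return ("//button[contains(@class,'{0}') and "
            ".//span[contains(text(),'{1}')]]").format(class_fragment, label)
```

For example, button_with_label_xpath('outlinedPrimary', 'Paste Text') reproduces the selector used above.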
Try to use CSS selector instead:
element = w.find_element_by_css_selector('div[class*="MuiGrid-root"] > div[class="jss473"]').click()
You can find the full documentation on CSS selectors online.

How to use Selenium to scrape the website id for each row

I am working on Python scraping code to collect ids from a website. The webpage has 29 rows, each with a unique id.
Here is my code
op = webdriver.ChromeOptions()
driver = webdriver.Chrome(options=op, executable_path="/usr/local/bin/chromedriver")
driver.get(web)
driver.find_element_by_xpath('//*[@id="caseCriteria_SearchCriteria"]').send_keys(keys)
input("Press Enter to continue...")
content = driver.find_elements_by_class_name('k-detail-cell')
for c in content:
    grid = c.find_element_by_css_selector('.party-card')
    g = grid.get_attribute('id')
    print(g)
driver.close()
I was able to get the id from the first row, but then it gave an error: NoSuchElementException: Message: no such element: Unable to locate element: {"method":"css selector","selector":".party-card"} (Session info: chrome=92.0.4515.131)
I am wondering, am I doing it correctly? I attached a screenshot of the page source as well. Every row is identical for the <div class='data-party-id' ...> element.
It would be great if I could get some advice!
Thanks!
OK, here is a Selenium-based solution.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.action_chains import ActionChains
import time
op = webdriver.ChromeOptions()
driver = webdriver.Chrome(options=op, executable_path="/usr/local/bin/chromedriver")
wait = WebDriverWait(driver, 20)
actions = ActionChains(driver)
driver.get(web)
wait.until(EC.visibility_of_element_located((By.XPATH, '//*[@id="caseCriteria_SearchCriteria"]')))
# here you should pass the CAPTCHA...
wait.until(EC.visibility_of_element_located((By.XPATH, '//*[@id="caseCriteria_SearchCriteria"]'))).send_keys(keys)
driver.find_element_by_xpath("//p//input[@id='btnSSSubmit']").click()
wait.until(EC.visibility_of_element_located((By.CSS_SELECTOR, ".k-detail-cell .party-card")))
time.sleep(2)
grid = driver.find_elements_by_css_selector(".k-detail-cell .party-card")
for g in grid:
    actions.move_to_element(g).perform()
    time.sleep(0.5)
    print(g.get_attribute('id'))
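If the rows are already rendered, the same id harvesting can also be done offline on driver.page_source using the standard library's html.parser. A sketch, assuming the party-card class from the question:

```python
from html.parser import HTMLParser

class PartyCardIds(HTMLParser):
    """Collect the id attribute of every tag whose class list
    contains 'party-card'."""
    def __init__(self):
        super().__init__()
        self.ids = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if 'party-card' in a.get('class', '').split():
            self.ids.append(a.get('id'))

parser = PartyCardIds()
parser.feed('<div class="party-card" id="row-1"></div>'
            '<div class="party-card" id="row-2"></div>')
print(parser.ids)
```

In real use you would call parser.feed(driver.page_source) after the wait; parsing the static source avoids a round-trip to the browser for every row.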

How to click a div with Selenium in Python to download a file (error: rootNode.elementsFromPoint is not a function)

I tried to download the weekly YouTube charts from https://charts.youtube.com/charts/TopSongs/ as CSV (the download button is an SVG icon in the upper right corner).
I used this code and tried two ways to click the button, but both gave me this error: selenium.common.exceptions.JavascriptException: Message: javascript error: rootNode.elementsFromPoint is not a function (Session info: chrome=91.0.4472.106)
This is my code. I already made sure that I found the right HTML element with download_button.get_attribute("outerHTML"):
from selenium.webdriver.common.by import By
from selenium import webdriver
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as ec
from selenium.webdriver.common.action_chains import ActionChains
import time
driver = webdriver.Chrome()
driver.maximize_window()
driver.get('https://charts.youtube.com/charts/TopSongs/')
time.sleep(4)
#######first attempt######
download_button = driver.find_element_by_xpath("//div[@class='download-container style-scope ytmc-charts']/paper-icon-button")
action = ActionChains(driver)
action.move_to_element(download_button)
action.click()
action.perform()
#######second attempt######
wait = WebDriverWait(driver, 10)
check_box_el = wait.until(ec.visibility_of_element_located((By.XPATH, "//div[@class='download-container style-scope ytmc-charts']/paper-icon-button")))
ActionChains(driver).move_to_element(check_box_el).click().perform()
driver.quit()
Any idea about it? Thanks :)
See if this works:
download_elm = driver.find_element_by_xpath(".//*[@id='download-button']")
driver.execute_script("arguments[0].click();", download_elm)

Unable to locate element - Can I wait for it to load?

I'm trying to use Selenium to scrape prices from a few websites. I learned about XPath and thought it was a great way to select elements.
I'm having a hard time selecting the price from this page. I feel like maybe the element hasn't loaded yet, which was one of the reasons I started using Selenium.
Is xpath really as reliable as I thought?
from selenium import webdriver
DRIVER_PATH = '/usr/bin/chromedriver'
options = webdriver.ChromeOptions()
options.add_argument("--incognito")
options.add_argument("--headless")
driver = webdriver.Chrome(executable_path=DRIVER_PATH, options=options)
url = "https://www.wayfair.com/furniture/pdp/zipcode-design-evan-726-wide-square-arm-convertible-sofa-zpcd1679.html"
xpath = '//*[@id="bd"]/div[1]/div[2]/div/div[2]/div/div[1]/div[2]/div/div/div[1]/span'
driver.get(url)
price = driver.find_element_by_xpath(xpath).text
print(price)
driver.quit()
My code gives this error:
selenium.common.exceptions.NoSuchElementException: Message: no such element: Unable to locate element: {"method":"xpath","selector":"//*[@id="bd"]/div[1]/div[2]/div/div[2]/div/div[1]/div[2]/div/div/div[1]/span"}
Simply wait for the element, then print its value:
elem = WebDriverWait(driver, 10).until(EC.presence_of_element_located((By.XPATH, "//*[@id='bd']/div[1]/div[2]/div/div[2]/div/div[1]/div[2]/div/div/div[1]/span")))
print(elem.text)
Outputs
$779.99
Imports:
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

Nested divs inside iframe selenium python

I'm trying to use Selenium with Python to execute JavaScript on the element with the id g-recaptcha-response.
Picture of the HTML with the div I'm targeting
But I'm getting this error: Message: no such element: Unable to locate element. Here is the script I have so far:
import time
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.wait import WebDriverWait
from selenium import webdriver
driver = webdriver.Chrome(executable_path="C:\\chromedriver.exe")
driver.get("https://testform2020.bss.design")
#open up where the id is located
driver.find_element_by_class_name('btn-block').click()
#remove overlay
driver.execute_script("document.querySelector('body > div:nth-child(6)').style.display = 'none'")
#target the frame
iframes = driver.find_elements_by_tag_name("iframe")
driver.switch_to.frame(iframes[0])
driver.switch_to.default_content()
time.sleep(3)
container = driver.find_element_by_name('g-recaptcha-response')
driver.execute_script("arguments[0].style.display = 'block';", container)
The problem was that I had driver.switch_to.default_content(), which was switching back to the main content, but I needed to go further down into the nested frames.
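In outline, the fix is to keep descending into nested frames rather than popping back to the top. Here is a browser-free sketch of that depth-first descent, where Frame is a toy stand-in for WebDriver frame handles (hypothetical, for illustration only):

```python
class Frame:
    """Toy stand-in for a browser frame: named elements plus child frames."""
    def __init__(self, names=(), children=()):
        self.names = set(names)
        self.children = list(children)

def frame_path_to(frame, name, path=()):
    """Depth-first search: return the chain of frames to switch into
    (outermost first) to reach the element called `name`, or None."""
    if name in frame.names:
        return list(path) + [frame]
    for child in frame.children:
        found = frame_path_to(child, name, path + (frame,))
        if found is not None:
            return found
    return None
```

With real Selenium you would call driver.switch_to.frame(...) once per frame in the returned chain, instead of calling driver.switch_to.default_content() partway down.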
