I'm trying to submit a form on an .asp page but Mechanize does not recognize the name of the control. The form code is:
<form id="form1" name="frmSearchQuick" method="post">
....
<input type="button" name="btSearchTop" value="SEARCH" class="buttonctl" onClick="uf_Browse('dledir_search_quick.asp');" >
My code is as follows:
br = mechanize.Browser()
br.open(BASE_URL)
br.select_form(name='frmSearchQuick')
resp = br.click(name='btSearchTop')
I've also tried the last line as:
resp = br.submit(name='btSearchTop')
The error I get is:
raise ControlNotFoundError("no control matching "+description)
ControlNotFoundError: no control matching name 'btSearchTop', kind 'clickable'
If I print br I get this: IgnoreControl(btSearchTop=)
But I don't see that anywhere in the HTML.
Any advice on how to submit this form?
The button doesn't submit the form - it calls a JavaScript function.
Mechanize can't run JavaScript, so you can't use it to click that button.
The easy way out is to read that function yourself and see what it does - if it just submits the form, then you may be able to get around it by submitting the form without clicking anything.
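If the JavaScript turns out to just POST the form to dledir_search_quick.asp, you can build that request yourself. A minimal sketch with the standard library - the field name below is a hypothetical placeholder (list `br.form.controls` to find the real ones), and the URL is an example:

```python
from urllib.parse import urlencode

# Replicate what uf_Browse('dledir_search_quick.asp') does: POST the
# form fields directly. 'txtSearch' is an assumed field name, not one
# taken from the page.
fields = {"txtSearch": "example query"}
body = urlencode(fields)

# The button's own name/value pair is NOT needed -- the JavaScript
# submits the form without it. Then, for example:
# br.open("http://example.com/dledir_search_quick.asp", data=body)
print(body)
```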
You need to inspect the page first - did Mechanize recognize the form at all?
for form in br.forms():
print form
I created a program to fill out an HTML webpage form in Selenium, but now I want to change it to requests. However, I've come across a bit of a roadblock. I'm new to requests, and I'm not sure how to emulate a request as if a button had been pressed on the original website. Here's what I have so far -
import requests
import random

emailRandom = ''
for i in range(6):
    add = random.randint(1, 10)
    emailRandom += str(add)

payload = {
    'email': emailRandom + '#redacted',
    'state_id': '34',
    'tnc-optin': 'on',
}

r = requests.get('redacted.com', data=payload)
The button I'm trying to "click" on the webpage looks like this -
<div class="button-container">
<input type="hidden" name="recaptcha" id="recaptcha">
<button type="submit" class="button red large">ENTER NOW</button>
</div>
What is the default/"clicked" value for this button? Will I be able to use it to submit the form using my requests code?
Using Selenium and using requests are two different things: Selenium drives your browser and submits the form via the rendered HTML UI, while requests just sends the data from your Python code without any UI - there is no "clicking" of a submit button involved.
The "submit" button in this case merely triggers the browser to POST the form values.
However, the backend will validate the "recaptcha" token, so you will need to work around that.
I recommend capturing the requests with Fiddler (https://www.telerik.com/fiddler) and then recreating them.
James's answer using Selenium is slower than this approach.
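To make the "no clicking involved" point concrete, here is a sketch of recreating the POST without a browser. The field names come from the question; the email generation and the endpoint URL are placeholders of my own, not the real site's:

```python
import random
from urllib.parse import urlencode

# Build the form payload exactly as the browser would serialize it.
email_random = "".join(random.choice("0123456789") for _ in range(6))
payload = {
    "email": email_random + "@example.com",  # placeholder domain
    "state_id": "34",
    "tnc-optin": "on",
}

# A submit button with no name attribute contributes no field to the
# request body, so nothing extra is needed for the "click".
# Use POST, not GET, since the form method is POST:
# requests.post("https://example.com/enter", data=payload)
print(urlencode(payload))
```

Note the hidden `recaptcha` input: the server-side token check is the part a plain POST cannot fake, which is why the answer above flags it.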
I am working on a project and I need to validate a piece of data using a third party site. I wrote a python script using the lxml package that successfully checks if a specific piece of data is valid.
Unfortunately, the site does not have a convenient URL scheme for its data, and therefore I cannot predict the specific URL that will contain the data for each unique request. Instead, the third-party site has a query page with a standard HTML text input that redirects to the proper URL.
My question is this: is there a way to input a value into the html input and submit it all from my python script?
Yes there is.
Mechanize
Forms
List the forms
import mechanize
br = mechanize.Browser()
br.open(url)
for form in br.forms():
print "Form name:", form.name
print form
select form
br.select_form("form1")
br.form = list(br.forms())[0]
login form example
br.select_form("login")
br['login:loginUsernameField'] = user
br['login:password'] = password
br.method = "POST"
response = br.submit()
Selenium
Sending input
Given an element defined as:
<input type="text" name="passwd" id="passwd-id" />
you could find it using any of:
element = driver.find_element_by_id("passwd-id")
element = driver.find_element_by_name("passwd")
element = driver.find_element_by_xpath("//input[@id='passwd-id']")
You may want to enter some text into a text field:
element.send_keys("some text")
You can simulate pressing the arrow keys by using the “Keys” class:
element.send_keys("and some", Keys.ARROW_DOWN)
These are the two packages I'm aware of that can do what you've asked.
I am using Mechanize to fill in a filter form.
My code:
from mechanize import Browser

br = Browser()
br.open(self.domain)
br.select_form(nr=1)
br.find_control("pf_keywords").value = "Lisp"
response = br.click(type='button', nr=0)
# or
response = br.submit(label='Применить фильтр')  # "Apply filter"
At the same time, the submit button is not in the list of controls for this form.
Html code for this button:
<button type="button" class="b-button b-button_flat b-button_flat_green" onclick="$('frm').submit();">Применить фильтр</button>
Because of this, neither click() nor submit() can submit the form: both methods search the form's controls for a matching clickable control, and since the button is not among them, they raise an error:
mechanize._form.ControlNotFoundError: no control matching type 'button', kind 'clickable'
What should I do? How can I push the button and get the result?
Python: 3.4.1
Browser: Chrome
I'm trying to push a button which is located in a form using Selenium with Python. I'm fairly new to Selenium and HTML.
The HTML code is as follows:
<FORM id='QLf_437222' method='POST' action='xxxx'>
<script>document.write("<a href='javascript:void(0);' onclick='document.getElementById(\"QLf_437222\").submit();' title='xxx'>51530119</a>");</script>
<noscript><INPUT type='SUBMIT' value='51530119' title='xxx' name='xxxx'></noscript>
<INPUT type=hidden name="prodType" value="DDA"/>
<INPUT type=hidden name="BlitzToken" value="BlitzToken"/>
<INPUT type=hidden name="productInfo" value="40050951530119"/>
<INPUT type=hidden name="reDirectionURL" value="xxx"/>
</FORM>
I've been trying the following:
driver.execute("javascript:void(0)")
driver.find_element_by_xpath('//*[@id="QLf_437104"]/a').click()
driver.find_element_by_xpath('//*[@id="QLf_437104"]/a').submit()
driver.find_element_by_css_selector("#QLf_437104 > a").click()
driver.find_element_by_css_selector("#QLf_437104 > a").submit()
Python doesn't throw an exception, so it seems like I'm clicking something, but it doesn't do what I want.
In addition to this the webpage acts funny when the chrome driver is initialized from Selenium. When clicking the button in the initialized chrome driver, the webpage throws an error (888).
I'm not sure where to go from here. Might it be something with the hidden elements?
If I can provide additional information please let me know.
EDIT:
It looks like the form id changes sometimes.
What it sounds like you are trying to do is submit the form, right?
The <a> that you are pointing at simply submits that form. Since it is injected via JavaScript, it may not be present yet when you try to click it. What I'd recommend is:
driver.find_element_by_css_selector("form[id^='QLf']").submit()
That will avoid the button and submit the appropriate form.
In the CSS selector above I also used [id^=: this means "find a <form> whose id attribute starts with QLf", because it looks like the numbers after it are generated automatically (which matches your edit saying the form id changes).
I am attempting to scrape the following website flow.gassco.no as one of my first python projects. I need to bypass the splash screen which redirects to the main page. I have isolated the following action,
<form method="get" action="acceptDisclaimer">
<input type="submit" value="Accept"/>
<input type="button" name="decline" value="Decline" onclick="window.location = 'http://www.gassco.no'" />
</form>
In a browser appending 'acceptDisclaimer?' to the url redirects to the target flow.gassco.no. However if I try to replicate this in urllib, I appear to stay on the same page when outputting the source.
import urllib2

url = "http://flow.gassco.no/acceptDisclaimer?"
url2 = "http://flow.gassco.no/"

# first pass to invoke the disclaimer
req = urllib2.Request(url)
res = urllib2.urlopen(req)

# second pass to access the main page
req1 = urllib2.Request(url2)
res2 = urllib2.urlopen(req1)
data = res2.read()
print data
I suspect that I have oversimplified the problem, but would appreciate any input into how I can accept the disclaimer and continue to output the main page source.
Use a cookiejar, so the session cookie set by the disclaimer step is preserved between requests. See python: urllib2 how to send cookie with urlopen request.
Open the main URL first.
Then open /acceptDisclaimer with the same opener.
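A minimal sketch of that cookiejar setup, written with the Python 3 equivalents of urllib2 and cookielib (the actual network calls are left commented out):

```python
import urllib.request
from http.cookiejar import CookieJar

# One shared opener: any cookie the server sets on the first request
# is automatically sent with the following ones.
jar = CookieJar()
opener = urllib.request.build_opener(
    urllib.request.HTTPCookieProcessor(jar))

# 1) open the main URL first (sets the session cookie)
# opener.open("http://flow.gassco.no/")
# 2) accept the disclaimer with the SAME opener
# opener.open("http://flow.gassco.no/acceptDisclaimer")
# 3) the main page should now render past the splash screen
# data = opener.open("http://flow.gassco.no/").read()
```

Two independent urllib2.urlopen calls, as in the question, use no shared cookie state, which is why the splash screen keeps coming back.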