Python code to fill and submit a Stulish form - python

I'm trying to write Python code to submit a simple form:
http://stulish.com/soumalya01
[Edit: http://travelangkawi.com/soumalya01/
This link returns a different page on form submit, which is handy for debugging.]
I tried both mechanize and MechanicalSoup. Neither can handle the text fields: the fields have no name attribute, only an id, and I am unable to get the element by id.
Any code would do, as long as it works (fill "ABC" in the text box and hit submit).

I just followed the mechanize documentation. See the sample code below:
from mechanize import Browser
br = Browser()
br.open('http://stulish.com/soumalya01')
br.select_form(nr=0)
br.form.set_all_readonly(False)  # add this so read-only controls can be written
br.form.set_value('ABC', nr=1)
print(br.form.controls[1])
br.submit()
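One reason a field with only an id trips up form libraries: browsers only submit inputs that have a name attribute, so posting the value under the id is a guess that works only if the server (or the page's JavaScript) expects that key. A minimal stdlib sketch of that fallback, using a hypothetical form layout (txtName and token are made-up names, not taken from the real page):

```python
from html.parser import HTMLParser
from urllib.parse import urlencode

class FormFields(HTMLParser):
    """Collect <input> fields, keyed by name or, failing that, by id."""
    def __init__(self):
        super().__init__()
        self.fields = {}

    def handle_starttag(self, tag, attrs):
        if tag != "input":
            return
        a = dict(attrs)
        key = a.get("name") or a.get("id")  # fall back to id when name is missing
        if key:
            self.fields[key] = a.get("value") or ""

# Hypothetical markup standing in for the page's form.
html = ('<form action="submit.php">'
        '<input id="txtName" type="text">'
        '<input type="hidden" name="token" value="t1">'
        '</form>')
parser = FormFields()
parser.feed(html)
parser.fields["txtName"] = "ABC"  # fill the text box
payload = urlencode(parser.fields)
print(payload)
```

The resulting payload string could then be POSTed to the form's action URL with requests or urllib; whether the server accepts "txtName" as a key is exactly the assumption you'd need to verify in the browser's network tab.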

Related

Using Mechanicalsoup to navigate multiple pages / forms

I've had success using mechanicalsoup with single pages / single forms, but am having difficulty with a multistep problem. The pages I am attempting to navigate start here: https://webapps2.ncua.gov/CustomQuery/CUSelect.aspx
I get through the first page/form, but then I am not sure how to deal with the second page/form. The third page includes the result that I wish to scrape.
import requests
import urllib.parse
import mechanicalsoup
browser = mechanicalsoup.StatefulBrowser()
browser.open("https://webapps2.ncua.gov/CustomQuery/CUSelect.aspx")
form=browser.select_form()
browser["operand0"] = "State"
browser["operator0"] = "Not Equal"
browser["value0"] = "XX"
response = browser.submit_selected()
form2 = browser.get_current_form()
submit = browser.get_current_page().find('input', id='BtnAllAcct')
form2.choose_submit(submit)
browser.submit_selected()
submit = browser.get_current_page().find('input', id='Btndata1')
form2.choose_submit(submit)
browser.submit_selected()
Any ideas? This is my second attempt after first trying to interact with the API, but two separate forms is stumping me on that as well.
I was solving a similar issue; following the advice in MechanicalSoup follow a link without inside a button, I switched to Selenium.

Select and submit form with python requests library

I am trying to scrape data from this website. To access the tables, I need to click the "Search" button. I was able to successfully do this using mechanize:
br = mechanize.Browser()
br.open(url + 'Wildnew_Online_Status_New.aspx')
br.select_form(name='aspnetForm')
page = br.submit(id='ctl00_ContentPlaceHolder1_Button1')
"page" gives me the resulting webpage with the table, as needed. However, I'd like to iterate through the links to subsequent pages at the bottom, and this triggers javascript. I've heard mechanize does not support this, so I need a new strategy.
I believe I can get to subsequent pages using a post request from the requests library. However, I am not able to click "search" on the main page to get to the initial table. In other words, I want to replicate the above code using requests. I tried
s = requests.Session()
form_data = {'name': 'aspnetForm', 'id': 'ctl00_ContentPlaceHolder1_Button1'}
r = s.post('http://forestsclearance.nic.in/Wildnew_Online_Status_New.aspx', data=form_data)
Not sure why, but this returns the main page again (without clicking Search). Any help appreciated.
I think you should look into scrapy
You forgot some parameters in the POST request:
https://www.pastiebin.com/5bc6562304e3c
Check the POST request with the Chrome dev tools.
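The missing parameters on an ASP.NET page are typically the hidden fields (__VIEWSTATE, __EVENTVALIDATION, and friends): the server rejects any postback that doesn't echo them back. A stdlib sketch of collecting them from the page source before posting; the trimmed markup and values are placeholders, and the button's POST name is an assumption derived from its id:

```python
from html.parser import HTMLParser

class HiddenInputs(HTMLParser):
    """Collect the hidden inputs an ASP.NET form expects echoed back."""
    def __init__(self):
        super().__init__()
        self.data = {}

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "input" and a.get("type") == "hidden" and a.get("name"):
            self.data[a["name"]] = a.get("value") or ""

# Placeholder stand-in for the page source; the real values are much longer.
html = ('<form name="aspnetForm">'
        '<input type="hidden" name="__VIEWSTATE" value="dDw...">'
        '<input type="hidden" name="__EVENTVALIDATION" value="/wE...">'
        '</form>')
p = HiddenInputs()
p.feed(html)

form_data = dict(p.data)
# Assumed POST name for the Search button (ASP.NET usually swaps the id's
# underscores for dollar signs); confirm it in the browser's network tab.
form_data["ctl00$ContentPlaceHolder1$Button1"] = "Search"
print(sorted(form_data))
```

With requests, you would first GET the page inside a Session, feed the response text to this parser, and then POST form_data back to the same URL.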

Scraping Complex Forms using BeautifulSoup and Requests

Below is a snippet of my Python code along with HTML from a page I'm trying to scrape.
The HTML is a complex form I'm having trouble scraping. I'm using BeautifulSoup4 and Python Requests; however, when I post to the page, the form isn't properly receiving the correct inputs. I'm guessing it has something to do with all the hidden inputs above the actual select I'm trying to submit.
If I inspect the form-data being submitted while using Chrome, here's what I see.
Chrome Developer Console View
When using the page through a browser, the only field that has to be selected is the select named "sel_subj", as seen below. However, posting back to the page fails:
new_url = 'https://wl11gp.neu.edu/udcprod8/NEUCLSS.p_class_search'
requests.post(new_url, data={'STU_TERM_IN': 201730,
                             'p_msg_code': 'UNSECURED',
                             'sel_subj': 'ACCT'})
To view a live version of the page I'm trying to scrape visit this link, select "Spring 2017 Semester" and click submit: https://wl11gp.neu.edu/udcprod8/NEUCLSS.p_disp_dyn_sched

Input html form data from python script

I am working on a project and I need to validate a piece of data using a third party site. I wrote a python script using the lxml package that successfully checks if a specific piece of data is valid.
Unfortunately, the site does not have a convenient URL scheme for its data, and therefore I can not predict the specific URL that will contain the data for each unique request. Instead, the third-party site has a query page with a standard HTML text input that redirects to the proper URL.
My question is this: is there a way to input a value into the html input and submit it all from my python script?
Yes there is.
Mechanize
Forms
List the forms
import mechanize
br = mechanize.Browser()
br.open(url)
for form in br.forms():
    print("Form name:", form.name)
    print(form)
select form
br.select_form("form1")
br.form = list(br.forms())[0]
login form example
br.select_form("login")
br['login:loginUsernameField'] = user
br['login:password'] = password
br.method = "POST"
response = br.submit()
Selenium
Sending input
Given an element defined as:
<input type="text" name="passwd" id="passwd-id" />
you could find it using any of:
element = driver.find_element_by_id("passwd-id")
element = driver.find_element_by_name("passwd")
element = driver.find_element_by_xpath("//input[@id='passwd-id']")
You may want to enter some text into a text field:
element.send_keys("some text")
You can simulate pressing the arrow keys by using the “Keys” class:
element.send_keys("and some", Keys.ARROW_DOWN)
These are the two packages I'm aware of that can do what you've asked.

Filling Textboxes in Python via Mechanize Without Forms

I'm trying to fill textboxes on this site, but mechanize can't find any form. Therefore I can't do anything with those textboxes, because they sit in a form that doesn't exist.
I tried several different solutions but failed. Here is my code:
import mechanize
def kota():
    br = mechanize.Browser()
    br.open("http://www.kentkartim.com/bakiyesorgula.php")
    a = br.select_form(nr=0)
    print(a)
With this I get this result :
mechanize._mechanize.FormNotFoundError: no form matching nr 0
I'm thinking the reason for this is that the webpage builds the form with JavaScript. I still need to fill the textboxes and submit them.
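If the form is built by JavaScript, there is no &lt;form&gt; in the raw HTML for mechanize to find, so the options are driving a real browser (Selenium) or replicating the request the page's script sends, copied from the browser's network tab. A stdlib sketch of building such a POST; the field names here are hypothetical, so read the real ones from dev tools:

```python
from urllib.parse import urlencode
from urllib.request import Request

# Hypothetical field names -- take the real ones from the network tab,
# since the page source itself contains no form to inspect.
payload = urlencode({"kart_no": "1234567890", "captcha": ""}).encode()
req = Request(
    "http://www.kentkartim.com/bakiyesorgula.php",
    data=payload,
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)
print(req.get_method(), req.data)
# urlopen(req) would actually send it; if the site checks a session cookie
# or a captcha, only a real browser (Selenium) will get through.
```

Building the Request offline like this lets you verify the method and body before sending anything.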
