I'm currently trying to write a Python script that turns off all of our EC2 instances overnight; then, in the morning, my QA team can go to a webpage and press a button to turn the instances back on.
I have written the Python script that turns the servers off using boto. I also have a function which, when run, turns them back on.
I have an HTML page with buttons on it.
I'm just struggling to work out how to get these buttons to call the function. I'm using bottle rather than Flask and I have no JavaScript experience, so I would like to avoid Ajax if possible. I don't mind if the whole page has to reload after the button is pressed; after that single press the webpage isn't needed anyway.
What I ended up doing to fix this was to use bottle to create a URL that runs the needed function, and then make an HTML button that simply links to that URL.
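In case it helps anyone else, here is a minimal sketch of what that looks like. The region, instance IDs, port, and route name are placeholders, and boto is assumed to pick up AWS credentials from its usual configuration:

# bottle route that runs the boto "turn everything back on" step;
# the HTML button is just a link to this URL, so no JavaScript is needed.
import boto.ec2
from bottle import route, run

@route('/start-instances')
def start_instances():
    conn = boto.ec2.connect_to_region('us-east-1')                    # placeholder region
    conn.start_instances(instance_ids=['i-12345678', 'i-87654321'])   # placeholder instance IDs
    return "Instances are starting. You can close this page now."

run(host='0.0.0.0', port=8080)

The button itself can be as simple as <a href="/start-instances"><button>Turn instances on</button></a>, and it doesn't matter that the page reloads after the click.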
Short version:
I am making a web kiosk that loads only one site. I want the browser to automatically go "back" (return to the previous page) when it gets an error on the page it is loading. Is this possible, and if so, how would I make it work?
Long version:
I am making a web.py program that accesses a WSDL. The web.py program is accessed from a Raspberry Pi set up to function as a touch kiosk. The user will not have access to a back button. I check whether the WSDL is up and running when the program starts and when a user logs on. I do not check on every call to the WSDL because I worry the bandwidth might get overwhelming as I add users (100+). My fear is that if the WSDL goes down after log-in, Iceweasel will load an error (404?) page and strand the user on that page. Any ideas?
I would try making a custom web.py error page that simply redirects the user back to the main page (probably using JavaScript). In your code.py:
def notfound():
    # Redirect the user back to the main page instead of stranding them on the error page
    return web.notfound("<script>window.location = 'http://some.com';</script>")
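For completeness, here is roughly how that handler can be hooked up, assuming the application object is created with web.application (the URL mapping and class name below are placeholders):

import web

urls = ('/', 'MainPage')                 # placeholder URL mapping
app = web.application(urls, globals())

def notfound():
    # Bounce the kiosk back to the main page instead of stranding the user on an error page
    return web.notfound("<script>window.location = '/';</script>")

app.notfound = notfound                  # web.py will use this handler for 404 responses

class MainPage:
    def GET(self):
        return "main page"

if __name__ == '__main__':
    app.run()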
I'm new to web programming, and have recently begun looking into using Python to automate some manual processes. What I'm trying to do is log into a site, click some drop-down menus to select settings, and run a report.
I've found the acclaimed requests library: http://docs.python-requests.org/en/latest/user/advanced/#request-and-response-objects
and have been trying to figure out how to use it.
I've successfully logged in using bpbp's answer on this page: How to use Python to login to a webpage and retrieve cookies for later usage?
My understanding of "clicking" a button is to write a post() command that mimics a click: Python - clicking a javascript button
My question (since I'm new to web programming and this library) is how to go about pulling the data I need in order to construct these commands. I've been looking into [RequestObject].headers, .text, etc. Any examples would be great.
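To show the rough shape I have in mind (the URL and field names below are made up; I assume the real ones would have to be read out of the browser's developer tools while clicking the button by hand):

# Log in with a Session, then "click" the button by sending the POST request
# that the button's form would have sent. URL and field names are placeholders.
import requests

session = requests.Session()
session.post('https://example.com/login', data={'user': 'me', 'password': 'secret'})

response = session.post('https://example.com/report/run',
                        data={'report_date': '2013-06-01', 'action': 'Save and Run'})
print(response.status_code)
print(response.text[:500])   # peek at the returned HTML to see what came back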
As always, thanks for your help!
EDIT:
To make this question more concrete, here is what I'm actually trying to do: I'm on a web page with a drop-down menu of clickable dates that can be changed. My goal is to automate changing the date to the most recent one, "clicking" 'Save and Run', and downloading the report when it's finished running.
The only solution to this I have found is Selenium. If it weren't a JavaScript-heavy website you could try mechanize, but for this you need to render the JavaScript and then inject JavaScript... like Selenium does.
Upside: you can record actions in Firefox (using Selenium IDE) and then export those actions to Python. The downside is that this code has to open a browser window to run.
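A rough sketch of what the exported and cleaned-up script might look like; the URL, element id, and button text are placeholders you would replace after inspecting the real page:

# Drive a real browser with Selenium: pick the newest date from the drop-down,
# then click "Save and Run". All locators below are placeholders.
from selenium import webdriver
from selenium.webdriver.support.ui import Select

driver = webdriver.Firefox()                       # opens a visible browser window
driver.get('https://example.com/reports')          # placeholder URL

dates = Select(driver.find_element_by_id('reportDate'))   # placeholder element id
dates.select_by_index(0)                           # assuming the newest date is listed first

driver.find_element_by_xpath("//button[contains(., 'Save and Run')]").click()
# ...then wait for the report to finish and fetch the download link the same way.

driver.quit()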
I want to be able to access the elements of a webpage with Python. I use Python 2.5 with Windows XP. Currently, I am using pywinauto to try to control IE, but I could switch to a different browser or module if needed. I need to be able to click the buttons on the page and type into text boxes. So far the best I've managed is bringing IE to the page I want. If this is possible without actually clicking the coordinates of the buttons and text boxes on the page, please tell me. Thanks!
I think for interacting with a web server it is better to use cURL. All of a web server's functions are responses to GET or POST requests (or both). To call them, just request the URLs that the buttons are linked to, and/or attach POST data to the appropriate request object before sending it. cURL can retrieve and process the server's response (the page's HTML) without displaying it, which tells you which URLs the site calls when certain buttons are clicked. It also lets you see the HTML fields that carry the POST data, so you can get their names.
Python has a curl library (pycurl), which I hope is as powerful as PHP's curl, the tool I have used and described here.
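For example, a small pycurl sketch along those lines; the URL and the POST fields are placeholders you would take from the page's HTML:

# POST the same fields that a button's form would send and capture the response
# for inspection instead of displaying it. URL and field names are placeholders.
import pycurl
from io import BytesIO

buf = BytesIO()
c = pycurl.Curl()
c.setopt(c.URL, 'http://example.com/form-handler')       # placeholder URL
c.setopt(c.POSTFIELDS, 'username=me&action=submit')      # placeholder POST fields
c.setopt(c.WRITEDATA, buf)                                # collect the response body
c.perform()
c.close()

print(buf.getvalue())   # the HTML the server returned, to look for further URLs/fields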
Hope you are on the right track now.
BR
Marek
Tommy,
What I have observed so far: if you have a web application and you are able to identify the objects on the page using WinSpy (a tool for spying on the controls on a page), you can automate it. Otherwise you can't.
For example, using the following code:
from pywinauto import application

app = application.Application().connect_(title_re='Internet Explorer.*')  # attach to the running IE window
app.window_(title_re='Internet Explorer.*').OKButton.Click()
My Python program basically submits a form by loading a URL. There is a security code that seems to change every second, so that you have to actually be on the website to submit the form.
For example,
http://www.locationary.com/prizes/index.jsp?ACTION_TOKEN=index_jsp$JspView$BetAction&inTickets=125000000&inSecureCode=091823021&inCampaignId=3060745
The only solution I can think of is using something like Selenium... I don't know any other way of simulating a web browser without it being as heavy and slow as an actual browser... any ideas? Or is there a way I can do this without browser automation?
EDIT:
Response to first answer: I DID get the security code using urllib... the problem is that it seems to have already changed by the time I try to load my submission URL... so I'm guessing/assuming that it has to be done in real time...
Yes, you'll need to get the security code programmatically since it changes every time. You can do this manually with urllib, or you can use mechanize or Selenium to make things easier.
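For instance, with mechanize the idea is to load the page and submit the form in the same session, so the security code is still the one the server just issued. The form index and field name below are guesses based on the URL in the question; check the real ones first:

# Fetch the form, let mechanize keep the freshly issued security code, fill in
# the ticket amount, and submit immediately. Form index and field names are guesses.
import mechanize

br = mechanize.Browser()
br.set_handle_robots(False)
br.open('http://www.locationary.com/prizes/index.jsp')

br.select_form(nr=0)              # assumed: the prize form is the first form on the page
print(br.form)                    # inspect the fields, including the hidden secure code
br['inTickets'] = '125000000'     # field name taken from the query string in the question
response = br.submit()
print(response.read()[:500])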
I have a program for text simplification written in Python, and I need it to run in a browser as a plugin... If you click the plugin, it should take the webpage's text as input, pass that input to my text simplification program, and display the program's output in another web page...
The text simplification program takes input text and produces a simplified version of it, so now I'm planning to create a plugin which uses this program and produces a simplified version of the text on the webpage...
It would be of great help if anyone could help me through this...
You would need to use an NPAPI plugin in a Chrome extension:
http://code.google.com/chrome/extensions/npapi.html
Then you use a Content Script to get the webpage text and pass it to the Background Page via Messaging. Your NPAPI plugin then calls Python (do it however you like, since it's all in C++), and from the Background Page you send the text to the plugin.
Concerning your NPAPI plugin, you can take a look at how it is done in pyplugin, or gather ideas from here to create it.
Now the serious question, why can't you do this all in JavaScript?
If you want an easier way than trying to figure out plugins, make it run as a webservice somewhere (Google App Engine is good for Python, and free), then use a bookmarklet to send pages from the browser. As an added bonus, it works with any browser, not just Chrome.
More explanation:
Rather than running on your own computer, you make your program run on a computer at Google (or somewhere else), and access it over the web. See Google's introduction to App Engine. Then, if you want it in your browser, you make a "bookmarklet" - a little bit of JavaScript that grabs the web page you're currently on (either the code or the URL, depending on what you're trying to do) and sends it to your program over the web. You can add this to your browser's bookmark bar as a button you can click. There's some more info on this site.
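If it helps, here is a bare-bones sketch of the web-service side in Python, using bottle just to keep it short (App Engine's own framework would look a little different), with simplify_text() standing in for the existing simplification program:

# Tiny web service: the bookmarklet (or any client) POSTs the page text to
# /simplify and gets the simplified version back. simplify_text() is a stand-in.
from bottle import post, request, run

def simplify_text(text):
    return text          # placeholder for the real text simplification program

@post('/simplify')
def simplify():
    return simplify_text(request.forms.get('text', ''))

run(host='0.0.0.0', port=8080)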