Importing the mechanize module in a Python script

I tried to import the mechanize module in my Python script like this:
from mechanize import Browser
But Google App Engine throws an HTTP 500 when my script is accessed.
To make things clearer, here is a snapshot of my package structure:
root
....mechanize (where all the mechanize-related files are)
....main.py
....app.yaml
....image
....script
Can anyone help me resolve this issue?
Thanks,
Ponmalar

The mechanize main page says:
mechanize.Browser is a subclass of mechanize.UserAgentBase, which is, in turn, a subclass of urllib2.OpenerDirector
My understanding is that urllib2 is one of the sandboxed modules in GAE, with its functionality being replaced by the Google-provided urlfetch. You'd need to re-implement the mechanize.UserAgentBase class to use urlfetch, if that's at all possible.
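As a rough sketch of what that re-implementation would mean: the fetch layer is the part that has to change, while everything built on top of it could stay. The `urlfetch` branch below follows GAE's documented API but is only exercised on App Engine itself; the `fetch` helper is a name made up here for illustration.

```python
try:
    from google.appengine.api import urlfetch  # available only on GAE

    def fetch(url):
        # urlfetch.fetch returns a response object with a .content attribute
        return urlfetch.fetch(url).content
except ImportError:
    import urllib.request  # urllib2's Python 3 successor

    def fetch(url):
        return urllib.request.urlopen(url).read()
```

A wrapper like mechanize's `UserAgentBase` would then have to route all of its page loads through such a function instead of opening connections via urllib2 directly.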

When GAE throws a 500, you can see the actual error in the logs on your admin console. If that doesn't help, paste it here and we'll help further.
Also, does it work on the dev_appserver?

Related

How to get the URL from any browser (Chrome, Opera, etc.) into a variable in Python automatically?

I'm a beginner in Python, so what is a good way to get the URL from any browser (Chrome, Opera, etc.) into a variable in Python? Thanks.
To fetch the contents of a URL you can do something like this:
import urllib2
response = urllib2.urlopen('http://domainname.com/')
html = response.read()
This can work too: just grab the URL and assign it to a variable:
variable_name = "https://www.url.com"
To answer this, you need to understand what context you are trying to grab the URL in for your Python app.
Are you making a desktop app that runs in the background and tracks browser HTTP requests for different URLs? Windows (assuming you use it) already has something that does this. Open a command prompt and type ipconfig /displaydns > dnslist.txt -- voilà! :)
Are you trying to run Python code inside your web browser to do this? Browsers don't normally support Python, but if you wanted, you could install a plugin and ask your users to install it too. You can read more about that here: Chrome extension in python? In this case, however, I would strongly recommend you just use JavaScript. The plugin converts your Python into JS anyway, since browsers are made to execute only JavaScript by default...
Are you trying to make a server-side app? In that case you would always know what domain is being called, because it would be the domain of your own website! Your web server always knows the exact URL that requested it in this case.
Which is it for you?
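For the server-side case, here is a minimal sketch of how the server always sees the requested URL, using the standard library's http.server (the handler name, path, and port choice are all illustrative):

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class EchoPathHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # self.path is the exact URL path (plus query string) the client asked for
        body = self.path.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging for the demo

server = HTTPServer(("127.0.0.1", 0), EchoPathHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = "http://127.0.0.1:%d/some/page?q=1" % server.server_address[1]
with urllib.request.urlopen(url) as resp:
    seen = resp.read().decode("utf-8")
print(seen)  # the server echoed back exactly the path it was asked for

server.shutdown()
```

No browser-side tricks needed: the URL arrives in the request itself.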

Python CGI with API

I am quite new to Python. I have built some applications in Python with CGI, and I find CGI much easier compared to a framework, as I have full control over everything.
I tried to build a web API with the modules below, but it ended up serving a web page rather than an API.
import cgi
import cgitb
I would like to create a web API, and as I am familiar with CGI, I would like to build it with Python CGI. I have been looking for good documentation but didn't find any. It would be helpful if someone could give me a clue. I really appreciate your help.
CGI applications are just the backend of an HTTP-style protocol, so I guess you could naturally build a REST API on top of one: http://en.wikipedia.org/wiki/Representational_state_transfer
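A web API over CGI is mostly a matter of emitting JSON with the right Content-Type instead of HTML. A minimal sketch, assuming the web server is configured to execute the script via CGI (the `handle_request` function and the `name` parameter are made up for illustration):

```python
#!/usr/bin/env python3
import json
import os
from urllib.parse import parse_qs

def handle_request(params):
    # Application logic lives here; params maps names to value lists,
    # the shape parse_qs returns.
    name = params.get("name", ["world"])[0]
    return {"greeting": "Hello, %s!" % name}

def main():
    # A CGI server passes the query string via the environment.
    params = parse_qs(os.environ.get("QUERY_STRING", ""))
    # The only real difference from a web page: JSON header, JSON body.
    print("Content-Type: application/json")
    print()
    print(json.dumps(handle_request(params)))

if __name__ == "__main__":
    main()
```

Requesting `script.py?name=Ponmalar` would then return a JSON document instead of a page.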
Python also has good built-in HTTP support, useful for understanding things from the inside (in case you like to keep control over everything): https://docs.python.org/3/library/http.server.html#module-http.server
Anyway, once you get better at it, your best bet is to switch to a framework like these: https://www.djangoproject.com/ http://www.tornadoweb.org/en/stable/

Using python-twitter behind a proxy

I have "created" a Twitter parser in Python 2.7 which can parse pretty much everything available from the API. Like everyone else, the REST API rate limit is killing me. I am trying to build a social graph (a pretty big one, I'd say) and time is of the essence. So I thought: what if I could use a proxy? In fact I managed to with urllib, but any attempt to recreate the parser on top of urllib would throw away all the hours put into python-twitter. So my extended question is: can someone please explain how to patch twitter.py with these instructions: http://code.google.com/p/python-twitter/issues/detail?can=2&start=0&num=100&q=&colspec=ID%20Type%20Status%20Priority%20Milestone%20Owner%20Summary&groupby=&sort=&id=205
Or even better, does anyone know any similar workarounds for the REST API limit? Moreover, do any other Python modules offer OAuth and proxy support?
I'm not sure whether python-twitter supports proxies, but there is a project which implements proxy support for it. Have a look at https://github.com/dhananjaysathe/python-twitter-with-proxy
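The general mechanism such a patch relies on is the standard library's proxy handler, shown here with Python 3's urllib.request (the module is urllib2 in the Python 2 code discussed above, and the proxy address is a placeholder):

```python
import urllib.request

# Route all urllib traffic through a proxy. Old python-twitter fetched
# over the same urllib opener machinery, which is presumably what the
# linked patch hooks into.
proxy = urllib.request.ProxyHandler({
    "http": "http://10.0.0.1:8080",
    "https": "http://10.0.0.1:8080",
})
opener = urllib.request.build_opener(proxy)
urllib.request.install_opener(opener)
# From here on, urllib.request.urlopen(...) tunnels through the proxy.
```

Note this only changes where the requests come from; it does not lift the rate limit itself, which Twitter enforces per credential as well as per address.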

How to implement sessions or cookies in Python

Can anyone help me get sessions or cookies working with my code here:
http://pastebin.com/2Y2tydsF
I have tried a few session modules that I found with Google, but nothing seems to work, or I don't know how to use them.
I have also fiddled with cookies, but with no luck.
Also, what are the differences between them, what are CGI and WSGI apps, and would my code be one of the two?
Thanks
Use gae-sessions. The source includes demos that show how to use it, including how to integrate with the Users API or RPX/JanRain.
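To put the concepts in concrete terms: a WSGI app is just a callable taking `environ` and `start_response`, and a session is ultimately a cookie holding an identifier that the server maps to stored data. A minimal cookie-counting sketch, without any session library (the cookie name `visits` is made up):

```python
from http.cookies import SimpleCookie

def app(environ, start_response):
    # Read back the cookie the browser sent, if any.
    cookies = SimpleCookie(environ.get("HTTP_COOKIE", ""))
    visits = int(cookies["visits"].value) + 1 if "visits" in cookies else 1
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Set-Cookie", "visits=%d" % visits),  # browser stores and resends this
    ])
    return [("You have visited %d time(s)" % visits).encode("utf-8")]
```

A session library does the same round-trip but stores only an opaque ID in the cookie, keeping the actual data server-side. A CGI app, by contrast, is a separate process run per request that reads headers from the environment and writes the response to stdout.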

post method in Python

I want to write a script in Python that would connect and post some data to several web servers automatically.
Please show how to post and submit data, for example to Google. I just can't work it out from the Python documentation. Thank you.
Check the documentation for the urllib2 module, and have a look at "urllib2: The Missing Manual". It's all there.
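The basic shape of a POST with the standard library looks like this (Python 3 spells urllib2 as urllib.request; the URL and field names are placeholders):

```python
import urllib.parse
import urllib.request

# Encode the form fields; attaching data to a Request turns it into a POST.
data = urllib.parse.urlencode({"q": "python", "lang": "en"}).encode("ascii")
req = urllib.request.Request("http://example.com/search", data=data)
# urllib.request.urlopen(req) would submit the form and return the response.
```

The same pattern repeated over a list of URLs covers posting to several servers in one script.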
If you are open to packages outside the standard library, then mechanize is a good option for such tasks.
