I am sending requests and receiving their response objects in a hooked function.
Sometimes, I'm sending a new request out of that same function, which in turn calls the function that sends requests (with a new thread for each request).
That creates the following exception in the Requests module itself:
Exception in thread Thread-103:
Traceback (most recent call last):
File "/usr/lib/python2.7/threading.py", line 810, in __bootstrap_inner
self.run()
File "/usr/lib/python2.7/threading.py", line 763, in run
self.__target(*self.__args, **self.__kwargs)
File "/api.py", line 272, in thread_func
r = self.session.post(url, data=params, headers=headers, timeout=60, hooks=dict(response=self.http_callback))
File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 425, in post
return self.request('POST', url, data=data, **kwargs)
File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 383, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 494, in send
if r.history:
AttributeError: 'bool' object has no attribute 'history'
I find that very strange. Do you know what might be happening?
Note: I looked at this question but my problem is different: Problems with hooks using Requests Python package
The line before the exception point reads:
r = dispatch_hook('response', hooks, r, **kwargs)
Your hook returned a boolean, but it should return a response object. A boolean doesn't have a history attribute; a response object does.
Alternatively, return None if you did not mean to alter the response.
From the event hooks documentation:
If the callback function returns a value, it is assumed that it is to replace the data that was passed in. If the function doesn't return anything, nothing else is affected.
The value passed in for the response hook is the response object; the replacement value should be a response object too.
Fix your hook. From your traceback we can see that you passed in self.http_callback as the hook, if you were wondering where to look.
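To illustrate, here is a minimal sketch of a well-behaved response hook (the body is hypothetical; only the return-value contract matters):

import requests

def http_callback(response, *args, **kwargs):
    # Inspect or log the response here.
    print(response.status_code)
    # Return the response (or a replacement response object) to satisfy
    # the hook contract, or return None to leave the response untouched.
    # Returning True/False is what produces the AttributeError above.
    return response

session = requests.Session()
r = session.post('http://example.com', data={}, hooks=dict(response=http_callback))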
I have some working code that gets data from a queue, processes it, and then emits the data via Flask-SocketIO to the browser.
This works when there aren't many messages to emit; however, when the workload increases, it simply can't cope.
Additionally, I have tried processing the queue without transmitting, and this appears to work correctly.
So what I was hoping to do is, rather than emitting every time queue.get() fires, simply emit whatever the current dataset is on every ping.
In theory, this should mean that even if queue.get() fires multiple times between ping and pong, the rate of messages being sent should remain constant and won't overload the system.
It means, of course, that some data will not be sent, but the most up-to-date data as of the last ping will be sent, which is sufficient for what I need.
Hopefully that makes sense, so on to the (sort of working) code...
This works when there are not a lot of messages to emit (the socketio.sleep needs to be there; otherwise, the processing doesn't occur before it tries to emit):
def handle_message(*_args, **_kwargs):
    try:
        order_books = order_queue.get(block=True, timeout=1)
    except Empty:
        order_books = None
    market_books = market_queue.get()
    uo = update_orders(order_books, mkt_runners, profitloss, trading, eo, mb)
    update_market_book(market_books, mb, uo, market_catalogues, mkt_runners, profitloss, trading)
    socketio.sleep(0.2)
    emit('initial_market', {'message': 'initial_market', 'mb': json.dumps(mb), 'ob': json.dumps(eo)})
    socketio.sleep(0.2)
    emit('my_response2', {'message': 'pong'})

def main():
    socketio.run(app, debug=True, port=3000)
    market_stream.stop()
    order_stream.stop()

if __name__ == '__main__':
    main()
This works (but I'm not trying to emit any messages here, this is just the Python script getting from the queue and processing):
while True:
    try:
        order_books = order_queue.get(block=True, timeout=1)
    except Empty:
        order_books = None
    market_books = market_queue.get()
    uo = update_orders(order_books, mkt_runners, profitloss, trading, eo, mb)
    update_market_book(market_books, mb, uo, market_catalogues, mkt_runners, profitloss, trading)
    print(mb)
(The mb at the end is the current dataset, which is returned by the update_market_book function).
Now, I was hoping that, by having the while True loop at the end, it could simply run and the function would return the latest dataset on every ping. However, with the above, the while True loop only runs when the main() call is taken out... which of course stops the socketio section from working.
So, is there a way I can combine both of these to achieve what I am trying to do, and/or is there an alternative method I haven't considered that might work?
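For context, here is a minimal, untested sketch of the kind of combination I have in mind, assuming Flask-SocketIO's start_background_task:

def process_queues():
    # Runs forever as a server-managed background task, keeping mb fresh.
    while True:
        try:
            order_books = order_queue.get(block=True, timeout=1)
        except Empty:
            order_books = None
        market_books = market_queue.get()
        uo = update_orders(order_books, mkt_runners, profitloss, trading, eo, mb)
        update_market_book(market_books, mb, uo, market_catalogues, mkt_runners, profitloss, trading)
        socketio.sleep(0)  # yield so the server can keep handling pings

def handle_message(*_args, **_kwargs):
    # On every ping, just emit whatever the current dataset is.
    emit('initial_market', {'message': 'initial_market', 'mb': json.dumps(mb), 'ob': json.dumps(eo)})
    emit('my_response2', {'message': 'pong'})

def main():
    socketio.start_background_task(process_queues)
    socketio.run(app, debug=True, port=3000)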
As always, I appreciate your advice and if the question is not clear, please let me know, so I can clarify any bits.
Many thanks!
Just adding a stack trace as requested:
Traceback (most recent call last):
File "D:\Python37\lib\site-packages\betfairlightweight\endpoints\login.py", line 38, in request
response = session.post(self.url, data=self.data, headers=self.client.login_headers, cert=self.client.cert)
File "D:\Python37\lib\site-packages\requests\api.py", line 116, in post
return request('post', url, data=data, json=json, **kwargs)
File "D:\Python37\lib\site-packages\requests\api.py", line 60, in request
return session.request(method=method, url=url, **kwargs)
File "D:\Python37\lib\site-packages\requests\sessions.py", line 533, in request
resp = self.send(prep, **send_kwargs)
File "D:\Python37\lib\site-packages\requests\sessions.py", line 646, in send
r = adapter.send(request, **kwargs)
File "D:\Python37\lib\site-packages\requests\adapters.py", line 449, in send
timeout=timeout
File "D:\Python37\lib\site-packages\urllib3\connectionpool.py", line 600, in urlopen
chunked=chunked)
File "D:\Python37\lib\site-packages\urllib3\connectionpool.py", line 343, in _make_request
self._validate_conn(conn)
File "D:\Python37\lib\site-packages\urllib3\connectionpool.py", line 839, in _validate_conn
conn.connect()
File "D:\Python37\lib\site-packages\urllib3\connection.py", line 332, in connect
cert_reqs=resolve_cert_reqs(self.cert_reqs),
File "D:\Python37\lib\site-packages\urllib3\util\ssl_.py", line 279, in create_urllib3_context
context.options |= options
File "D:\Python37\lib\ssl.py", line 507, in options
super(SSLContext, SSLContext).options.__set__(self, value)
File "D:\Python37\lib\ssl.py", line 507, in options
super(SSLContext, SSLContext).options.__set__(self, value)
File "D:\Python37\lib\ssl.py", line 507, in options
super(SSLContext, SSLContext).options.__set__(self, value)
[Previous line repeated 489 more times]
RecursionError: maximum recursion depth exceeded while calling a Python object
macOS 10.12.3, Python 2.7.13, requests 2.13.0
I use the requests package to send a POST request. This request needs a login before posting data, so I use requests.Session() and load a logged-in cookie.
Then I use this session to send POST data in a loop.
This code used to run without errors on Windows and Linux.
Simple code:
import requests
import cookielib

s = requests.Session()
s.cookies = cookielib.LWPCookieJar('cookise')
s.cookies.load(ignore_discard=True)

for user_id in range(100, 200):
    url = 'http://xxxx'
    data = {'user': user_id, 'content': '123'}
    r = s.post(url, data)
    ...
But the program frequently crashes (at roughly regular intervals) with the error AttributeError: 'module' object has no attribute 'kqueue':
Traceback (most recent call last):
File "/Users/gasxia/Dev/Projects/TgbookSpider/kfz_send_msg.py", line 90, in send_msg
r = requests.post(url, data) # catch error if user isn't exist
File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 535, in post
return self.request('POST', url, data=data, json=json, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 488, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 609, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/adapters.py", line 423, in send
timeout=timeout
File "/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 588, in urlopen
conn = self._get_conn(timeout=pool_timeout)
File "/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 241, in _get_conn
if conn and is_connection_dropped(conn):
File "/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/util/connection.py", line 27, in is_connection_dropped
return bool(wait_for_read(sock, timeout=0.0))
File "/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/util/wait.py", line 33, in wait_for_read
return _wait_for_io_events(socks, EVENT_READ, timeout)
File "/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/util/wait.py", line 22, in _wait_for_io_events
with DefaultSelector() as selector:
File "/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/util/selectors.py", line 431, in __init__
self._kqueue = select.kqueue()
AttributeError: 'module' object has no attribute 'kqueue'
This looks like a problem that commonly arises if you're using something like eventlet or gevent, both of which monkeypatch the select module. If you're using those to achieve asynchrony, you will need to ensure that those monkeypatches are applied before importing requests. This is a known bug, being tracked in this issue.
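For example, a minimal sketch of the import ordering with eventlet (gevent is analogous, via gevent.monkey.patch_all()):

# Apply the monkeypatch before anything imports requests/urllib3,
# so the selector code sees the patched select module.
import eventlet
eventlet.monkey_patch()

import requests

r = requests.get('http://example.com')
print(r.status_code)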
I'm trying to write a redditbot; I decided to start with a simple one, to make sure I was doing things properly, and I got a RequestException.
my code (bot.py):
import praw

for s in praw.Reddit('bot1').subreddit("learnpython").hot(limit=5):
    print s.title
my praw.ini file:
# The URL prefix for OAuth-related requests.
oauth_url=https://oauth.reddit.com
# The URL prefix for regular requests.
reddit_url=https://www.reddit.com
# The URL prefix for short URLs.
short_url=https://redd.it
[bot1]
client_id=HIDDEN
client_secret=HIDDEN
password=HIDDEN
username=HIDDEN
user_agent=ILovePythonBot0.1
(where HIDDEN replaces the actual id, secret, password and username.)
my Traceback:
Traceback (most recent call last):
File "bot.py", line 3, in <module>
for s in praw.Reddit('bot1').subreddit("learnpython").hot(limit=5):
File "/usr/local/lib/python2.7/dist-packages/praw/models/listing/generator.py", line 79, in next
return self.__next__()
File "/usr/local/lib/python2.7/dist-packages/praw/models/listing/generator.py", line 52, in __next__
self._next_batch()
File "/usr/local/lib/python2.7/dist-packages/praw/models/listing/generator.py", line 62, in _next_batch
self._listing = self._reddit.get(self.url, params=self.params)
File "/usr/local/lib/python2.7/dist-packages/praw/reddit.py", line 322, in get
data = self.request('GET', path, params=params)
File "/usr/local/lib/python2.7/dist-packages/praw/reddit.py", line 406, in request
params=params)
File "/usr/local/lib/python2.7/dist-packages/prawcore/sessions.py", line 131, in request
params=params, url=url)
File "/usr/local/lib/python2.7/dist-packages/prawcore/sessions.py", line 70, in _request_with_retries
params=params)
File "/usr/local/lib/python2.7/dist-packages/prawcore/rate_limit.py", line 28, in call
response = request_function(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/prawcore/requestor.py", line 48, in request
raise RequestException(exc, args, kwargs)
prawcore.exceptions.RequestException: error with request request() got an unexpected keyword argument 'json'
Any help would be appreciated. P.S. I am using Python 2.7 on Ubuntu 14.04. Please ask me for any other information you may need.
The way I see it, you have a problem with your request to the Reddit API. Maybe try changing the user agent in your praw.ini configuration. According to PRAW's basic configuration options, the user agent should follow the format <platform>:<app ID>:<version string> (by /u/<reddit username>). Try that and see what happens.
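For example, the user_agent line in your praw.ini might look like this (platform and values are placeholders, following your HIDDEN convention):

user_agent=linux:HIDDEN:v0.1 (by /u/HIDDEN)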
I'm building an asynchronous HTTP request class with urllib2. The code looks like this:
import urllib2, threading, datetime

class ExportHandler(urllib2.HTTPHandler):
    def http_response(self, link, actualdate, nowdate):
        link += "&export=csv"
        export = urllib2.urlopen(link)

for link, actualdate in commissionSummaryLinks:
    o = urllib2.build_opener(ExportHandler())
    t = threading.Thread(target=o.open, args=(link, actualdate, datetime.datetime.now().strftime("%Y%m%d%H%M%S")))
    t.start()

print "I'm asynchronous!"
t.join()
print "ending threading"
Suffice it to say commissionSummaryLinks IS populated, and actualdate is a datetime.datetime.strptime() result.
Anyway, I'm receiving an error from all the issued threads that looks like this:
Exception in thread Thread-9:
Traceback (most recent call last):
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/threading.py", line 808, in __bootstrap_inner
self.run()
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/threading.py", line 761, in run
self.__target(*self.__args, **self.__kwargs)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 402, in open
req = meth(req)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 1123, in do_request_
'Content-length', '%d' % len(data))
TypeError: object of type 'datetime.date' has no len()
I'm running this on OS X (if it matters). Can anyone tell me what the problem here is?
When you instantiate your thread, you need to provide the arguments to o.open, not arguments to the http_response method of ExportHandler.
In this case, o.open has the following method signature:
open(self, fullurl, data=None, timeout=<object object>) method of urllib2.OpenerDirector instance
My guess is that you should only need to set args=(link,).
If you still need to use those other arguments, you'll probably want to modify the constructor of ExportHandler to take the other arguments you need, and then use them as appropriate in the http_response method. Take a look at the Python Class tutorial for more information on defining a class constructor.
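For instance, a rough, untested sketch of that constructor approach (how you use the stored dates inside http_response is up to you):

import urllib2, threading, datetime

class ExportHandler(urllib2.HTTPHandler):
    def __init__(self, actualdate, nowdate):
        urllib2.HTTPHandler.__init__(self)
        self.actualdate = actualdate
        self.nowdate = nowdate

    def http_response(self, request, response):
        # urllib2 calls this with (request, response); pull the extra
        # values from self instead of extra positional arguments.
        return response

for link, actualdate in commissionSummaryLinks:
    nowdate = datetime.datetime.now().strftime("%Y%m%d%H%M%S")
    o = urllib2.build_opener(ExportHandler(actualdate, nowdate))
    t = threading.Thread(target=o.open, args=(link,))  # only the URL
    t.start()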
I'm following the manual at https://www.dropbox.com/developers/core/start/python .
I have done everything exactly as in the manual, including creating my app in my account, copy-pasting the app keys, and authorizing the app using the key (in fact, I open the link in my browser, click Allow, and copy the confirmation code).
After this, I want to finish the authorization, but I get the following error:
>>> access_token, user_id = flow.finish(code)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/dist-packages/dropbox/client.py", line 1233, in finish
return self._finish(code, None)
File "/usr/local/lib/python2.7/dist-packages/dropbox/client.py", line 1101, in _finish
response = self.rest_client.POST(url, params=params)
File "/usr/local/lib/python2.7/dist-packages/dropbox/rest.py", line 316, in POST
return cls.IMPL.POST(*n, **kw)
File "/usr/local/lib/python2.7/dist-packages/dropbox/rest.py", line 254, in POST
post_params=params, headers=headers, raw_response=raw_response)
File "/usr/local/lib/python2.7/dist-packages/dropbox/rest.py", line 218, in request
preload_content=False
File "/usr/lib/python2.7/dist-packages/urllib3/poolmanager.py", line 112, in urlopen
conn = self.connection_from_host(u.host, port=u.port, scheme=u.scheme)
File "/usr/lib/python2.7/dist-packages/urllib3/poolmanager.py", line 84, in connection_from_host
pool = pool_cls(host, port, **self.connection_pool_kw)
TypeError: __init__() got an unexpected keyword argument 'ssl_version'
P.S. The flow object is alive => http://screencloud.net/v/nDi0
It seems you are using urllib3 version 1.5 or older. Upgrade it to 1.6 or 1.7.
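For example, assuming pip manages your install:

pip install --upgrade urllib3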