Celery Flask --- error: [Errno 111] Connection refused - python

After running through the basic example for flask-celery (which runs fine, as far as I can tell), I'm trying to integrate it into my own project. Basically, I'm using this:
from flask import Blueprint, jsonify, request, session
from flask.views import MethodView
from celery.decorators import task

blueprint = Blueprint('myapi', __name__)

class MyAPI(MethodView):
    def get(self, tag):
        return get_resource.apply_async(tag)

@task(name="get_task")
def get_resource(tag):
    pass
With the same setup as in the example, I'm getting this error:
Traceback (most recent call last):
File "/x/venv/lib/python2.7/site-packages/flask/app.py", line 1518, in __call__
return self.wsgi_app(environ, start_response)
File "/x/venv/lib/python2.7/site-packages/flask/app.py", line 1506, in wsgi_app
response = self.make_response(self.handle_exception(e))
File "/x/venv/lib/python2.7/site-packages/flask/app.py", line 1504, in wsgi_app
response = self.full_dispatch_request()
File "/x/venv/lib/python2.7/site-packages/flask/app.py", line 1264, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/x/venv/lib/python2.7/site-packages/flask/app.py", line 1262, in full_dispatch_request
rv = self.dispatch_request()
File "/x/venv/lib/python2.7/site-packages/flask/app.py", line 1248, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/x/venv/lib/python2.7/site-packages/flask/views.py", line 84, in view
return self.dispatch_request(*args, **kwargs)
File "/x/venv/lib/python2.7/site-packages/flask/views.py", line 151, in dispatch_request
return meth(*args, **kwargs)
File "/x/api/modules/document/document.py", line 14, in get
return get_resource.apply_async(tag)
File "/x/venv/lib/python2.7/site-packages/celery/app/task/__init__.py", line 449, in apply_async
publish = publisher or self.app.amqp.publisher_pool.acquire(block=True)
File "/x/venv/lib/python2.7/site-packages/kombu/connection.py", line 657, in acquire
R = self.prepare(R)
File "/x/venv/lib/python2.7/site-packages/kombu/pools.py", line 54, in prepare
p = p()
File "/x/venv/lib/python2.7/site-packages/kombu/pools.py", line 45, in <lambda>
return lambda: self.create_producer()
File "/x/venv/lib/python2.7/site-packages/celery/app/amqp.py", line 265, in create_producer
pub = self.app.amqp.TaskPublisher(conn, auto_declare=False)
File "/x/venv/lib/python2.7/site-packages/celery/app/amqp.py", line 328, in TaskPublisher
return TaskPublisher(*args, **self.app.merge(defaults, kwargs))
File "/x/venv/lib/python2.7/site-packages/celery/app/amqp.py", line 158, in __init__
super(TaskPublisher, self).__init__(*args, **kwargs)
File "/x/venv/lib/python2.7/site-packages/kombu/compat.py", line 61, in __init__
super(Publisher, self).__init__(connection, self.exchange, **kwargs)
File "/x/venv/lib/python2.7/site-packages/kombu/messaging.py", line 79, in __init__
self.revive(self.channel)
File "/x/venv/lib/python2.7/site-packages/kombu/messaging.py", line 168, in revive
channel = channel.default_channel
File "/x/venv/lib/python2.7/site-packages/kombu/connection.py", line 581, in default_channel
self.connection
File "/x/venv/lib/python2.7/site-packages/kombu/connection.py", line 574, in connection
self._connection = self._establish_connection()
File "/x/venv/lib/python2.7/site-packages/kombu/connection.py", line 533, in _establish_connection
conn = self.transport.establish_connection()
File "/x/venv/lib/python2.7/site-packages/kombu/transport/amqplib.py", line 279, in establish_connection
connect_timeout=conninfo.connect_timeout)
File "/x/venv/lib/python2.7/site-packages/kombu/transport/amqplib.py", line 89, in __init__
super(Connection, self).__init__(*args, **kwargs)
File "/x/venv/lib/python2.7/site-packages/amqplib/client_0_8/connection.py", line 129, in __init__
self.transport = create_transport(host, connect_timeout, ssl)
File "/x/venv/lib/python2.7/site-packages/amqplib/client_0_8/transport.py", line 281, in create_transport
return TCPTransport(host, connect_timeout)
File "/x/venv/lib/python2.7/site-packages/amqplib/client_0_8/transport.py", line 85, in __init__
raise socket.error, msg
error: [Errno 111] Connection refused
I'm using Redis (and if I install RabbitMQ I get a different error, which I don't understand either right now). The broker should be Redis, so why isn't it being found? Can anyone give me more of a clue about what is going on here? Do I need to import something else? The point is, there is very little beyond the bare-bones example, and this makes no sense to me.
The most I've been able to determine is that the API module has no access to the configured celery app; when it tries to send the task, Celery falls back to some defaults, which aren't set up because I'm pointing at Redis. That's just a guess. I haven't been able to import the configuration into the module; all I've determined is that referencing anything on celery from the app (for example, printing celery.conf) raises an error, although I can import celery.task.
This is the broker config the application is using, direct from the example:
BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = "redis"
CELERY_REDIS_HOST = "localhost"
CELERY_REDIS_PORT = 6379
CELERY_REDIS_DB = 0
EDIT:
If you'd like to see a demo: https://github.com/thrisp/flask-celery-example
As it turns out, having BROKER_TRANSPORT = 'redis' in your settings is important for the setup I put forth here and in the git example. I'm not entirely sure why it isn't needed in the example bits but is in the ones I've added, but it is: without it, Celery wants to dump everything onto a default AMQP queue.
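Concretely, the broker section of the settings then looks like this (the config shown above with the transport line added):
BROKER_URL = 'redis://localhost:6379/0'
BROKER_TRANSPORT = 'redis'
CELERY_RESULT_BACKEND = "redis"
CELERY_REDIS_HOST = "localhost"
CELERY_REDIS_PORT = 6379
CELERY_REDIS_DB = 0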
EDIT2:
Also, and this is rather a big deal: using the upcoming version of Celery simplifies a huge number of issues when using it with Flask, making all of this unnecessary.

You must configure Redis to bind to localhost. In /etc/redis/redis.conf, uncomment the line
bind 127.0.0.1
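To confirm Redis is actually reachable on that address before digging further into Celery, a quick check with redis-py (assuming it is installed in the same virtualenv) is:
import redis

# Prints True if Redis is listening on localhost:6379;
# raises redis.exceptions.ConnectionError otherwise.
r = redis.StrictRedis(host='localhost', port=6379, db=0)
print(r.ping())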

Related

Celery ConnectionResetError: [Errno 104] Connection reset by peer

We are creating an application that consists of a frontend (a Flask API) and a backend that uses Celery. The API starts a Celery task and retrieves the result like this:
result = data_source_tasks.add_data_point.delay(tok, uuid, source_type, datum, request_counter)
return result.get(timeout=5)
We use RabbitMQ as broker and result backend:
celery_broker_url = pyamqp://guest@localhost//
celery_result_backend = rpc://
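For context, the task module is wired up roughly like this (a simplified sketch; the module, task name, and arguments follow the call above, the rest is illustrative):
from celery import Celery

celery_app = Celery('data_source_tasks',
                    broker='pyamqp://guest@localhost//',
                    backend='rpc://')

@celery_app.task
def add_data_point(tok, uuid, source_type, datum, request_counter):
    # store the data point; the return value is what result.get() hands back
    ...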
After everything runs fine for a while (several thousand API calls), I get the following error:
Traceback (most recent call last):
File "/usr/local/lib/python3.4/dist-packages/flask/app.py", line 1982, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.4/dist-packages/flask/app.py", line 1614, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/local/lib/python3.4/dist-packages/flask/app.py", line 1517, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/usr/local/lib/python3.4/dist-packages/flask/_compat.py", line 33, in reraise
raise value
File "/usr/local/lib/python3.4/dist-packages/flask/app.py", line 1612, in full_dispatch_request
rv = self.dispatch_request()
File "/usr/local/lib/python3.4/dist-packages/flask/app.py", line 1598, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/usr/local/lib/python3.4/dist-packages/connexion/decorators/decorator.py", line 66, in wrapper
response = function(request)
File "/usr/local/lib/python3.4/dist-packages/connexion/decorators/validation.py", line 122, in wrapper
response = function(request)
File "/usr/local/lib/python3.4/dist-packages/connexion/decorators/validation.py", line 293, in wrapper
return function(request)
File "/usr/local/lib/python3.4/dist-packages/connexion/decorators/decorator.py", line 42, in wrapper
response = function(request)
File "/usr/local/lib/python3.4/dist-packages/connexion/decorators/parameter.py", line 219, in wrapper
return function(**kwargs)
File "/mynedata/lib/api/apicalls.py", line 747, in store_datum
return result.get(timeout=5)
File "/usr/local/lib/python3.4/dist-packages/celery/result.py", line 224, in get
on_message=on_message,
File "/usr/local/lib/python3.4/dist-packages/celery/backends/async.py", line 188, in wait_for_pending
for _ in self._wait_for_pending(result, **kwargs):
File "/usr/local/lib/python3.4/dist-packages/celery/backends/async.py", line 255, in _wait_for_pending
on_interval=on_interval):
File "/usr/local/lib/python3.4/dist-packages/celery/backends/async.py", line 56, in drain_events_until
yield self.wait_for(p, wait, timeout=1)
File "/usr/local/lib/python3.4/dist-packages/celery/backends/async.py", line 65, in wait_for
wait(timeout=timeout)
File "/usr/local/lib/python3.4/dist-packages/celery/backends/rpc.py", line 63, in drain_events
return self._connection.drain_events(timeout=timeout)
File "/usr/local/lib/python3.4/dist-packages/kombu/connection.py", line 301, in drain_events
return self.transport.drain_events(self.connection, **kwargs)
File "/usr/local/lib/python3.4/dist-packages/kombu/transport/pyamqp.py", line 103, in drain_events
return connection.drain_events(**kwargs)
File "/usr/local/lib/python3.4/dist-packages/amqp/connection.py", line 471, in drain_events
while not self.blocking_read(timeout):
File "/usr/local/lib/python3.4/dist-packages/amqp/connection.py", line 476, in blocking_read
frame = self.transport.read_frame()
File "/usr/local/lib/python3.4/dist-packages/amqp/transport.py", line 226, in read_frame
frame_header = read(7, True)
File "/usr/local/lib/python3.4/dist-packages/amqp/transport.py", line 401, in _read
s = recv(n - len(rbuf))
ConnectionResetError: [Errno 104] Connection reset by peer
I can see in the console where I started the Celery worker that the task (and all following tasks) succeeded; however, result.get times out for this and all following tasks. Did my connection to the result backend somehow break? If I restart the API (without restarting the Celery worker or RabbitMQ), everything works fine again.

Flask integration with Celery 3.x & SqlAlchemy

There are a number of unanswered questions about this topic.
The challenge is using Celery tasks that can access the database via SQLAlchemy. Without a proper integration with Flask, we would get the typical "working outside of application context" error message.
Hence this is my attempt to integrate Flask and Celery, which seems to be working fine. As the next step I would like to invoke that task when '/' is hit. However, I get a connection refused error (see the trace below).
wsgi_fb.py
def make_celery(the_app):
    the_celery = Celery(the_app.import_name,
                        backend=the_app.config['CELERY_RESULT_BACKEND'],
                        broker=the_app.config['BROKER_URL_CELERY'])
    the_celery.config_from_object(config_celery)
    the_celery.conf.update(the_app.config)
    task_base = the_celery.Task

    class ContextTask(task_base):
        abstract = True

        def __call__(self, *args, **kwargs):
            with the_app.app_context():
                return task_base.__call__(self, *args, **kwargs)

    the_celery.Task = ContextTask
    return the_celery

app = Flask(__name__)
app.config.from_object(config)
celery_app = make_celery(app)
celery_app.config_from_object(config_celery)
celery_app.set_current()
api = Api(app)
db.init_app(app)
api.add_resource(Index, '/')
facebook_bot.py
import tasks

class Index(Resource):
    def get(self):
        tasks.send_tag_batches.delay()
        return 'OK'
tasks.py
from celery import current_app

@current_app.task
def send_tag_batches():
    ...
    return 'ok'
StackTrace:
File "/Users/houmie/projects/chasebot/src/facebook/resource/facebook_bot.py", line 22, in get
tasks.send_tag_batches.delay()
File "/Users/houmie/.pyenv/versions/3.5.1/envs/venv35/lib/python3.5/site-packages/celery/app/task.py", line 453, in delay
return self.apply_async(args, kwargs)
File "/Users/houmie/.pyenv/versions/3.5.1/envs/venv35/lib/python3.5/site-packages/celery/app/task.py", line 565, in apply_async
**dict(self._get_exec_options(), **options)
File "/Users/houmie/.pyenv/versions/3.5.1/envs/venv35/lib/python3.5/site-packages/celery/app/base.py", line 354, in send_task
reply_to=reply_to or self.oid, **options
File "/Users/houmie/.pyenv/versions/3.5.1/envs/venv35/lib/python3.5/site-packages/celery/app/amqp.py", line 310, in publish_task
**kwargs
File "/Users/houmie/.pyenv/versions/3.5.1/envs/venv35/lib/python3.5/site-packages/kombu/messaging.py", line 172, in publish
routing_key, mandatory, immediate, exchange, declare)
File "/Users/houmie/.pyenv/versions/3.5.1/envs/venv35/lib/python3.5/site-packages/kombu/connection.py", line 457, in _ensured
interval_max)
File "/Users/houmie/.pyenv/versions/3.5.1/envs/venv35/lib/python3.5/site-packages/kombu/connection.py", line 369, in ensure_connection
interval_start, interval_step, interval_max, callback)
File "/Users/houmie/.pyenv/versions/3.5.1/envs/venv35/lib/python3.5/site-packages/kombu/utils/__init__.py", line 246, in retry_over_time
return fun(*args, **kwargs)
File "/Users/houmie/.pyenv/versions/3.5.1/envs/venv35/lib/python3.5/site-packages/kombu/connection.py", line 237, in connect
return self.connection
File "/Users/houmie/.pyenv/versions/3.5.1/envs/venv35/lib/python3.5/site-packages/kombu/connection.py", line 742, in connection
self._connection = self._establish_connection()
File "/Users/houmie/.pyenv/versions/3.5.1/envs/venv35/lib/python3.5/site-packages/kombu/connection.py", line 697, in _establish_connection
conn = self.transport.establish_connection()
File "/Users/houmie/.pyenv/versions/3.5.1/envs/venv35/lib/python3.5/site-packages/kombu/transport/pyamqp.py", line 116, in establish_connection
conn = self.Connection(**opts)
File "/Users/houmie/.pyenv/versions/3.5.1/envs/venv35/lib/python3.5/site-packages/amqp/connection.py", line 165, in __init__
self.transport = self.Transport(host, connect_timeout, ssl)
File "/Users/houmie/.pyenv/versions/3.5.1/envs/venv35/lib/python3.5/site-packages/amqp/connection.py", line 186, in Transport
return create_transport(host, connect_timeout, ssl)
File "/Users/houmie/.pyenv/versions/3.5.1/envs/venv35/lib/python3.5/site-packages/amqp/transport.py", line 299, in create_transport
return TCPTransport(host, connect_timeout)
File "/Users/houmie/.pyenv/versions/3.5.1/envs/venv35/lib/python3.5/site-packages/amqp/transport.py", line 95, in __init__
raise socket.error(last_err)
OSError: [Errno 61] Connection refused
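A quick way to see which broker the task is actually bound to (a diagnostic sketch, assuming Celery 3.x and the module layout above) is to compare the app you configured with the app the task will use:
from wsgi_fb import celery_app
import tasks

# Both should print the broker from BROKER_URL_CELERY; if the second one
# shows the default amqp://guest@localhost:5672//, the task is registered
# against an unconfigured default app instead of celery_app.
print(celery_app.conf.BROKER_URL)
print(tasks.send_tag_batches.app.conf.BROKER_URL)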

Flask-SQLAlchemy with Google App Engine only works in interactive console

I am trying to get my Google App Engine app to work with Flask-SQLAlchemy. I am getting the error:
AttributeError: 'module' object has no attribute 'paramstyle'
I have followed the instructions in this answer (Local MySQLdb connection fails with AttributeError for paramstyle when running GAE development server) and so I no longer get the error in the interactive console (when I make an app context).
However, the error still appears when I run the app in my local GAE. Why is this still happening? I am running OSX Mavericks. The full trace:
Traceback (most recent call last):
File "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/runtime/wsgi.py", line 266, in Handle
result = handler(dict(self._environ), self._StartResponse)
File "/Users/conor/Documents/Projj/lib/flask/app.py", line 1836, in __call__
return self.wsgi_app(environ, start_response)
File "/Users/conor/Documents/Projj/lib/flask/app.py", line 1820, in wsgi_app
response = self.make_response(self.handle_exception(e))
File "/Users/conor/Documents/Projj/lib/flask/app.py", line 1403, in handle_exception
reraise(exc_type, exc_value, tb)
File "/Users/conor/Documents/Projj/lib/flask/app.py", line 1817, in wsgi_app
response = self.full_dispatch_request()
File "/Users/conor/Documents/Projj/lib/flask/app.py", line 1477, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/Users/conor/Documents/Projj/lib/flask/app.py", line 1381, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/Users/conor/Documents/Projj/lib/flask/app.py", line 1475, in full_dispatch_request
rv = self.dispatch_request()
File "/Users/conor/Documents/Projj/lib/flask/app.py", line 1461, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/Users/conor/Documents/Projj/financey.py", line 117, in login_view
userObj = get_user(username)
File "/Users/conor/Documents/Projj/financey.py", line 88, in get_user
return db.session.query(User).filter_by(username=username).first()
File "/Users/conor/Documents/Projj/lib/sqlalchemy/orm/scoping.py", line 149, in do
return getattr(self.registry(), name)(*args, **kwargs)
File "/Users/conor/Documents/Projj/lib/sqlalchemy/util/_collections.py", line 864, in __call__
return self.registry.setdefault(key, self.createfunc())
File "/Users/conor/Documents/Projj/lib/flask_sqlalchemy/__init__.py", line 139, in __init__
bind=db.engine,
File "/Users/conor/Documents/Projj/lib/flask_sqlalchemy/__init__.py", line 780, in engine
return self.get_engine(self.get_app())
File "/Users/conor/Documents/Projj/lib/flask_sqlalchemy/__init__.py", line 797, in get_engine
return connector.get_engine()
File "/Users/conor/Documents/Projj/lib/flask_sqlalchemy/__init__.py", line 473, in get_engine
self._engine = rv = sqlalchemy.create_engine(info, **options)
File "/Users/conor/Documents/Projj/lib/sqlalchemy/engine/__init__.py", line 332, in create_engine
return strategy.create(*args, **kwargs)
File "/Users/conor/Documents/Projj/lib/sqlalchemy/engine/strategies.py", line 69, in create
dialect = dialect_cls(**dialect_args)
File "/Users/conor/Documents/Projj/lib/sqlalchemy/dialects/mysql/base.py", line 1985, in __init__
default.DefaultDialect.__init__(self, **kwargs)
File "/Users/conor/Documents/Projj/lib/sqlalchemy/engine/default.py", line 124, in __init__
self.paramstyle = self.dbapi.paramstyle
AttributeError: 'module' object has no attribute 'paramstyle'
EDIT: In attempting to further diagnose the issue, when I import MySQLdb in the GAE Interactive Console (in the browser) I get the error ImportError: No module named _mysql
After looking around and debugging for far too long, I finally discovered the issue. When you run the GAE development server, dev_appserver must be told exactly which MySQL instance you want to connect to; the connection string you use (e.g. in Python) doesn't select the server, it only determines the database name. However, SQLAlchemy can figure it out fine if you run from an IPython console. This may be due to a different driver being used under the hood.
Hopefully this saves someone else a lot of hair-pulling. Run your dev_appserver.py like so:
dev_appserver.py --mysql_host=<Cloud SQL IP> --mysql_user=root --mysql_password=<Cloud SQL root password> <application directory>
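For illustration only (this URI is an assumed example, not from the original setup), the SQLAlchemy connection string can keep its usual shape, since under dev_appserver the flags above decide which MySQL instance is actually used:
# Assumed example URI; only the database name ('mydb') matters here when
# running under dev_appserver -- the --mysql_* flags select the instance.
SQLALCHEMY_DATABASE_URI = 'mysql+mysqldb://root:password@localhost/mydb'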

Error 111 after following Werkzeug tutorial "shortly"

I followed the tutorial for Werkzeug "Shortly" here
And I get this error message after submitting a valid url.
Traceback (most recent call last)
File "/home/sadik/NLM/shortly/shortly.py", line 87, in __call__
return self.wsgi_app(environ, start_response)
File "/usr/local/lib/python2.7/dist-packages/Werkzeug-0.9.4-py2.7.egg/werkzeug/wsgi.py", line 579, in __call__
return self.app(environ, start_response)
File "/home/sadik/NLM/shortly/shortly.py", line 83, in wsgi_app
response = self.dispatch_request(request)
File "/home/sadik/NLM/shortly/shortly.py", line 33, in dispatch_request
return getattr(self, 'on_' + endpoint)(request, **values)
File "/home/sadik/NLM/shortly/shortly.py", line 45, in on_new_url
short_id = self.insert_url(url)
File "/home/sadik/NLM/shortly/shortly.py", line 72, in insert_url
short_id = self.redis.get('reverse-url:' + url)
File "/usr/local/lib/python2.7/dist-packages/redis-2.9.1-py2.7.egg/redis/client.py", line 705, in get
return self.execute_command('GET', name)
File "/usr/local/lib/python2.7/dist-packages/redis-2.9.1-py2.7.egg/redis/client.py", line 464, in execute_command
connection.send_command(*args)
File "/usr/local/lib/python2.7/dist-packages/redis-2.9.1-py2.7.egg/redis/connection.py", line 334, in send_command
self.send_packed_command(self.pack_command(*args))
File "/usr/local/lib/python2.7/dist-packages/redis-2.9.1-py2.7.egg/redis/connection.py", line 316, in send_packed_command
self.connect()
File "/usr/local/lib/python2.7/dist-packages/redis-2.9.1-py2.7.egg/redis/connection.py", line 253, in connect
raise ConnectionError(self._error_message(e))
ConnectionError: Error 111 connecting localhost:6379. Connection refused.
The error message indicates that there is something wrong with localhost:6379
The relevant part of code is here:
def create_app(redis_host='localhost', redis_port=6379, with_static=True):
    app = Shortly({
        'redis_host': redis_host,
        'redis_port': redis_port
    })
    if with_static:
        app.wsgi_app = SharedDataMiddleware(app.wsgi_app, {
            '/static': os.path.join(os.path.dirname(__file__), 'static')
        })
    return app

if __name__ == '__main__':
    from werkzeug.serving import run_simple
    app = create_app()
    run_simple('127.0.0.1', 5000, app, use_debugger=True, use_reloader=True)
That means of course that the server is running on localhost:5000. So why is there another port number in the create_app function? That confuses me a bit.
I'm not familiar with Shortly or Werkzeug, but it looks like you're missing the Redis server: install one using your favourite package manager and try again. As for the two ports, 5000 is where run_simple serves the web app over HTTP, while 6379 is the default port of the Redis server the app connects to as a client.
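For what it's worth, the Shortly class in the tutorial builds a Redis client from that config in its constructor, roughly like this (paraphrased from the tutorial, not copied from the question):
class Shortly(object):
    def __init__(self, config):
        # redis_host/redis_port are where the app connects as a Redis client;
        # they are unrelated to the port 5000 that run_simple serves HTTP on.
        self.redis = redis.Redis(config['redis_host'], config['redis_port'])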

Celery creating a new connection for each task

I'm using Celery with Redis to run some background tasks, but each time a task is called, it creates a new connection to Redis. I'm on Heroku and my Redis to Go plan allows for 10 connections. I'm quickly hitting that limit and getting a "max number of clients reached" error.
How can I ensure that Celery queues the tasks on a single connection rather than opening a new one each time?
EDIT - including the full traceback
File "/app/.heroku/venv/lib/python2.7/site-packages/django/core/handlers/base.py", line 111, in get_response
response = callback(request, *callback_args, **callback_kwargs)
File "/app/.heroku/venv/lib/python2.7/site-packages/newrelic-1.4.0.137/newrelic/api/object_wrapper.py", line 166, in __call__
self._nr_instance, args, kwargs)
File "/app/.heroku/venv/lib/python2.7/site-packages/newrelic-1.4.0.137/newrelic/hooks/framework_django.py", line 447, in wrapper
return wrapped(*args, **kwargs)
File "/app/.heroku/venv/lib/python2.7/site-packages/django/views/decorators/csrf.py", line 77, in wrapped_view
return view_func(*args, **kwargs)
File "/app/feedback/views.py", line 264, in zencoder_webhook_handler
tasks.process_zencoder_notification.delay(webhook)
File "/app/.heroku/venv/lib/python2.7/site-packages/celery/app/task.py", line 343, in delay
return self.apply_async(args, kwargs)
File "/app/.heroku/venv/lib/python2.7/site-packages/celery/app/task.py", line 458, in apply_async
with app.producer_or_acquire(producer) as P:
File "/usr/local/lib/python2.7/contextlib.py", line 17, in __enter__
return self.gen.next()
File "/app/.heroku/venv/lib/python2.7/site-packages/celery/app/base.py", line 247, in producer_or_acquire
with self.amqp.producer_pool.acquire(block=True) as producer:
File "/app/.heroku/venv/lib/python2.7/site-packages/kombu/connection.py", line 705, in acquire
R = self.prepare(R)
File "/app/.heroku/venv/lib/python2.7/site-packages/kombu/pools.py", line 54, in prepare
p = p()
File "/app/.heroku/venv/lib/python2.7/site-packages/kombu/pools.py", line 45, in <lambda>
return lambda: self.create_producer()
File "/app/.heroku/venv/lib/python2.7/site-packages/kombu/pools.py", line 42, in create_producer
return self.Producer(self._acquire_connection())
File "/app/.heroku/venv/lib/python2.7/site-packages/celery/app/amqp.py", line 160, in __init__
super(TaskProducer, self).__init__(channel, exchange, *args, **kwargs)
File "/app/.heroku/venv/lib/python2.7/site-packages/kombu/messaging.py", line 83, in __init__
self.revive(self.channel)
File "/app/.heroku/venv/lib/python2.7/site-packages/kombu/messaging.py", line 174, in revive
channel = self.channel = maybe_channel(channel)
File "/app/.heroku/venv/lib/python2.7/site-packages/kombu/connection.py", line 879, in maybe_channel
return channel.default_channel
File "/app/.heroku/venv/lib/python2.7/site-packages/kombu/connection.py", line 617, in default_channel
self.connection
File "/app/.heroku/venv/lib/python2.7/site-packages/kombu/connection.py", line 610, in connection
self._connection = self._establish_connection()
File "/app/.heroku/venv/lib/python2.7/site-packages/kombu/connection.py", line 569, in _establish_connection
conn = self.transport.establish_connection()
File "/app/.heroku/venv/lib/python2.7/site-packages/kombu/transport/virtual/__init__.py", line 722, in establish_connection
self._avail_channels.append(self.create_channel(self))
File "/app/.heroku/venv/lib/python2.7/site-packages/kombu/transport/virtual/__init__.py", line 705, in create_channel
channel = self.Channel(connection)
File "/app/.heroku/venv/lib/python2.7/site-packages/kombu/transport/redis.py", line 271, in __init__
self.client.info()
File "/app/.heroku/venv/lib/python2.7/site-packages/newrelic-1.4.0.137/newrelic/api/object_wrapper.py", line 166, in __call__
self._nr_instance, args, kwargs)
File "/app/.heroku/venv/lib/python2.7/site-packages/newrelic-1.4.0.137/newrelic/api/function_trace.py", line 81, in literal_wrapper
return wrapped(*args, **kwargs)
File "/app/.heroku/venv/lib/python2.7/site-packages/redis/client.py", line 344, in info
return self.execute_command('INFO')
File "/app/.heroku/venv/lib/python2.7/site-packages/kombu/transport/redis.py", line 536, in execute_command
conn.send_command(*args)
File "/app/.heroku/venv/lib/python2.7/site-packages/redis/connection.py", line 273, in send_command
self.send_packed_command(self.pack_command(*args))
File "/app/.heroku/venv/lib/python2.7/site-packages/redis/connection.py", line 256, in send_packed_command
self.connect()
File "/app/.heroku/venv/lib/python2.7/site-packages/newrelic-1.4.0.137/newrelic/api/object_wrapper.py", line 166, in __call__
self._nr_instance, args, kwargs)
File "/app/.heroku/venv/lib/python2.7/site-packages/newrelic-1.4.0.137/newrelic/api/function_trace.py", line 81, in literal_wrapper
return wrapped(*args, **kwargs)
File "/app/.heroku/venv/lib/python2.7/site-packages/redis/connection.py", line 207, in connect
self.on_connect()
File "/app/.heroku/venv/lib/python2.7/site-packages/redis/connection.py", line 233, in on_connect
if self.read_response() != 'OK':
File "/app/.heroku/venv/lib/python2.7/site-packages/redis/connection.py", line 283, in read_response
raise response
ResponseError: max number of clients reached
I ran into the same problem on Heroku with CloudAMQP. I do not know why, but I had no luck when assigning low integers to the BROKER_POOL_LIMIT setting.
Ultimately, I found that setting BROKER_POOL_LIMIT=None or BROKER_POOL_LIMIT=0 mitigated my issue. According to the Celery docs, this disables the connection pool. So far this has not caused any noticeable problems for me, though I'm not sure whether it will for you.
Link to relevant info: http://celery.readthedocs.org/en/latest/configuration.html#broker-pool-limit
I wish I was using Redis, because there is a specific option to limit the number of connections: CELERY_REDIS_MAX_CONNECTIONS.
http://docs.celeryproject.org/en/3.0/configuration.html#celery-redis-max-connections (for 3.0)
http://docs.celeryproject.org/en/latest/configuration.html#celery-redis-max-connections (for 3.1)
http://docs.celeryproject.org/en/master/configuration.html#celery-redis-max-connections (for dev)
The MongoDB has a similar backend setting.
Given these backend settings, I have no idea what BROKER_POOL_LIMIT actually does. Hopefully CELERY_REDIS_MAX_CONNECTIONS solves your problem.
I'm one of those folks using CloudAMQP, and the AMQP backend does not have its own connection limit parameter.
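Putting the two suggestions above together, a settings sketch for someone on a Redis broker/backend might look like this (the values are illustrative, not recommendations):
BROKER_POOL_LIMIT = 0              # or None: disables the broker connection pool
CELERY_REDIS_MAX_CONNECTIONS = 8   # caps the Redis result-backend connection pool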
Try these settings:
CELERY_IGNORE_RESULT = True
CELERY_STORE_ERRORS_EVEN_IF_IGNORED = True
I had a similar issue involving the number of connections and Celery. It wasn't on Heroku, though, and it was Mongo rather than Redis.
I initiated the connection outside of the task function definition at the task module level. At least for Mongo this allowed the tasks to share the connection.
Hope that helps.
https://github.com/instituteofdesign/wander/blob/master/wander/tasks.py

mongoengine.connect('stored_messages')

@celery.task(default_retry_delay = 61)
def pull(settings, google_settings, user, folder, messageid):
    '''
    Pulls a message from zimbra and stores it in Mongo
    '''
    try:
        imap = imap_connect(settings, user)
        imap.select(folder, True)
        .......
