The access log and root log for my Flask app were set up with the help of zc.buildout. That's fine. Now I'm wondering how to get logging from my own app. I know how to use the logging library, but paster just does not log it to the console or anywhere else.
Thanks
Here's my config:
[loggers]
keys = root, wsgi, myapp
[handlers]
keys = console, accesslog
[formatters]
keys = generic, accesslog
[formatter_generic]
format = %(asctime)s %(levelname)s [%(name)s] %(message)s
[formatter_accesslog]
format = %(message)s
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[handler_accesslog]
class = FileHandler
args = (os.path.join(r'.', 'access.log'), 'a')
level = INFO
formatter = accesslog
[logger_root]
level = INFO
handlers = console
[logger_wsgi]
level = INFO
handlers = accesslog
qualname = wsgi
propagate = 0
[logger_myapp]
level = DEBUG
handlers = console
qualname = myapp
[filter:translogger]
use = egg:Paste#translogger
setup_console_handler = False
logger_name = wsgi
[app:main]
use = egg:myapp#debug
filter-with = translogger
...
Here's how I tried to log:
import logging as log

def myfunc():
    log.debug("show me the log")
After a long time, this logging problem was resolved. According to the manual, Flask configures its own logger, so to log from the app, use flask.Flask.logger (i.e. app.logger).
Documentation is here:
http://flask.pocoo.org/docs/api/#flask.Flask.logger
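For reference, here is a minimal sketch of logging through Flask's own logger; the route and message are made up for illustration:

from flask import Flask

app = Flask(__name__)

@app.route('/')
def myfunc():
    # app.logger is the logger Flask configures for the application
    app.logger.debug("show me the log")
    return 'ok'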
Related
I am running a Python Flask app using uWSGI and nginx. I am having trouble getting the app modules to log. If I run the Flask app by itself, I can see the logs properly formatted, but under uWSGI I see 'no handlers could be found for logger...' and the logs are missing. print statements show up fine. Could someone help with what I am doing wrong?
Thanks
I run uwsgi as
/usr/local/bin/uwsgi --ini /root/uwsgi.ini
# cat /root/uwsgi.ini
[uwsgi]
base=/root/mainapp
app = mainapp
module = %(app)
pythonpath = %(base)
socket = /tmp/mainapp.sock
chmod-socket = 666
callable = app
logto = /var/log/mainapp/app.log
paste-logger = %p
[formatters]
keys: detailed
[handlers]
keys: console
[loggers]
keys: root, module1, module2, module3
[formatter_detailed]
format: %(asctime)s %(name)s:%(levelname)s %(module)s:%(lineno)d: %(message)s
[handler_console]
class: StreamHandler
args: []
formatter: detailed
[logger_root]
level: DEBUG
handlers:
[logger_module1]
level: DEBUG
qualname: module1
handlers: console
[logger_module2]
level: DEBUG
qualname: module2
handlers: console
[logger_module3]
level: DEBUG
qualname: module3
handlers: console
and in the module, I call
import logging
log = logging.getLogger('module1')
log.info('hello world')
Have you configured Flask to use the logger? Off the top of my head, something like this should work:
app.config['LOG_FILE'] = 'application.log'

# Configure logger.
if not app.debug:
    import logging
    from logging import FileHandler
    file_handler = FileHandler(app.config['LOG_FILE'])
    file_handler.setLevel(logging.WARNING)
    app.logger.addHandler(file_handler)
Looks like I have to instantiate Flask's app.logger before I can do anything. This did the trick: call addHandler on app.logger and then load the logging config file:
import logging
import logging.config

# attach a handler to app.logger so Flask's logger is set up
shandler = logging.StreamHandler()
shandler.setLevel(logging.DEBUG)
app.logger.addHandler(shandler)

# then load the loggers/handlers defined in the config file
logging.config.fileConfig('logging.conf')
I've been trying multiple things on this one, but with no success.
I want to save logs to a file (SQLAlchemy logs, app debug logs, stack traces on errors, etc.).
I'm starting uwsgi with the following command:
uwsgi --ini-paste-logged myapp.ini
And here is the content of the ini file (where apiservice is my package):
[loggers]
keys = root, apiservice, sqlalchemy
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = INFO
handlers = console
[logger_apiservice]
level = DEBUG
handlers =
qualname = apiservice
[logger_sqlalchemy]
level = INFO
handlers =
qualname = sqlalchemy.engine
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(asctime)s %(levelname)-5.5s [%(name)s][%(threadName)s] %(message)s
[uwsgi]
socket = /tmp/myapp-uwsgi.sock
virtualenv = /var/www/myapp/env
pidfile = ./uwsgi.pid
daemonize = ./uwsgi.log
master = true
processes = 4
The uwsgi.log contains only the request log, without any actual logging data.
I've tried INI options like:
paste: config:%p
paste-logger: %p
logto: file
Nothing seems to work.
Apparently, the uwsgi config section was fine.
After a closer look at the uwsgi.log, even though the server launched and was running successfully, you could see an error:
ImportError: No module named script.util.logging_config
I installed the following packages to solve the problem:
pip install pastescript
pip install pastedeploy
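As a quick sanity check (assuming the same virtualenv uWSGI uses), you can verify that the module the paste logger failed to import is now available; the import below is the one named in the traceback and appears to be provided by PasteScript:

# run with the same Python / virtualenv that uWSGI uses
from paste.script.util import logging_config
print(logging_config.__file__)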
I am trying to set up logging for a Python Pyramid Waitress server. I have followed the docs here:
Pyramid logging and here: Pyramid PasteDeploy logging. I have tried both methods, which have yielded no logging results from Waitress. My own logging works perfectly.
I have set the Waitress logging level to DEBUG and I get nothing, even when I remove server files; Waitress fails silently.
How do you set up logging for a Pyramid Waitress server so I can see files being requested, missing-file errors, etc.?
Method 1:
Setup from code:
import logging
logging.basicConfig()
logger = logging.getLogger('waitress')
logger.setLevel(logging.DEBUG)
Method 2:
Starting the server with pserve development.ini where the development.ini file sets up the logging as below
[app:main]
use = egg:MyProject
pyramid.reload_templates = true
pyramid.debug_authorization = false
pyramid.debug_notfound = false
pyramid.debug_routematch = false
pyramid.default_locale_name = en
pyramid.includes =
pyramid_debugtoolbar
[server:main]
use = egg:waitress#main
host = 0.0.0.0
port = 6543
[loggers]
keys = root, myproject, waitress
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = INFO
handlers = console
[logger_myproject]
level = DEBUG
handlers =
qualname = myproject
[logger_waitress]
level = DEBUG
handlers =
qualname = waitress
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(asctime)s %(levelname)-5.5s [%(name)s][%(threadName)s] %(message)s
The logging configuration actually works. Here is a simple view that emits a logging message via the waitress logger:
from pyramid.view import view_config

@view_config(route_name='hello_baby',
             request_method='GET',
             renderer='string')
def hello_baby(request):
    import logging
    logger = logging.getLogger('waitress')
    logger.info('Hello baby!')
    return 'Hi there'
You should be able to see the logging message when you hit the page. The reason you didn't see messages from Waitress is that no logging messages are emitted for its common routines; it only emits messages when something goes wrong, as you can see in its source code.
For more about Python logging, you can read my article: Good logging practice in Python.
I got console logging (for all requests) to show up by using Paste's translogger; a good example is at http://flask.pocoo.org/snippets/27/.
Here's the relevant section of my .ini:
[app:main]
use = egg:${:app}
filter-with = translogger
[filter:translogger]
use = egg:Paste#translogger
# these are the option default values (see http://pythonpaste.org/modules/translogger.html)
# logger_name='wsgi'
# format=None
# logging_level=20
# setup_console_handler=True
# set_logger_level=10
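For comparison, the same filter can be wired up in code instead of the ini; a sketch, assuming a Flask-style app object named app (roughly what the snippet linked above does):

from paste.translogger import TransLogger

# wrap the WSGI callable so every request is logged via the 'wsgi' logger;
# setup_console_handler=True (the default) attaches a console handler for it
app.wsgi_app = TransLogger(app.wsgi_app, setup_console_handler=True)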
I've read a few posts on this but I'm still confused. I have this logging setup:
import logging

class MongoHandler(logging.Handler):
    def __init__(self):
        logging.Handler.__init__(self)
        from pymongo import Connection
        self.db = Connection('db_server').db_name

    def emit(self, record):
        try:
            self.db.Logging.save(record.__dict__)
        except:
            print 'Logging Error: Unable to save log entry to db'

mh = MongoHandler()
sh = logging.StreamHandler()
formatter = logging.Formatter('%(asctime)s - %(threadName)s - %(levelname)s - %(message)s')
sh.setFormatter(formatter)
log = logging.getLogger('DeviceMonitor_%s' % hostname)
log.addHandler(mh)
log.addHandler(sh)
log.setLevel(logging.INFO)
I want to be able to set a different level for the StreamHandler and the MongoHandler. Is that possible or do I need to have a second Logger obj?
You can set a different logging level for each logging handler, but it seems you will have to set the logger's level to the "lowest" of them. In the example below I set the logger to DEBUG, the stream handler to INFO, and the TimedRotatingFileHandler to DEBUG. So the file has DEBUG entries while the stream outputs only INFO. You can't direct only DEBUG to one handler and only INFO to another; for that you'd need another logger.
import logging
import logging.handlers

logger = logging.getLogger("mylog")
formatter = logging.Formatter(
    '%(asctime)s | %(name)s | %(levelname)s: %(message)s')
logger.setLevel(logging.DEBUG)

stream_handler = logging.StreamHandler()
stream_handler.setLevel(logging.INFO)
stream_handler.setFormatter(formatter)

logFilePath = "my.log"
file_handler = logging.handlers.TimedRotatingFileHandler(
    filename=logFilePath, when='midnight', backupCount=30)
file_handler.setFormatter(formatter)
file_handler.setLevel(logging.DEBUG)

logger.addHandler(file_handler)
logger.addHandler(stream_handler)

logger.info("Started")
try:
    x = 14
    y = 0
    z = x / y
except Exception as ex:
    logger.error("Operation failed.")
    logger.debug(
        "Encountered {0} when trying to perform calculation.".format(ex))

logger.info("Ended")
It took me a while to understand the point:
Set the general logger (the result of logging.getLogger()) to a level below your handlers.
Set your handler levels equal to or above the general logger's level.
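A minimal sketch of that rule (the logger name and file name here are just for illustration):

import logging

logger = logging.getLogger('example')
logger.setLevel(logging.DEBUG)        # logger level: the lowest you want anywhere

console = logging.StreamHandler()
console.setLevel(logging.INFO)        # handler levels: equal to or above the logger's

logfile = logging.FileHandler('example.log')
logfile.setLevel(logging.DEBUG)

logger.addHandler(console)
logger.addHandler(logfile)

logger.debug('goes to the file only')
logger.info('goes to both the console and the file')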
In addition to GrantVS's answer:
I had to use
logging.basicConfig(level=logging.DEBUG)
in order for it to work. Otherwise great answer, thanks!
Had the same problem, but the solution didn't work in IPython, as the QtConsole automatically creates a handler with no level set:
import logging
root = logging.getLogger()
root.handlers
Out: [<StreamHandler <stderr> (NOTSET)>]
As a result, IPython printed both DEBUG and INFO to the console in spite of my file handler and stream handler having different levels.
This thread pointed out this issue for me: Logging module does not print in IPython
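One way to deal with that stray root handler is simply to drop it (a sketch; clearing root.handlers is also what the helper module below does):

import logging

root = logging.getLogger()
# the QtConsole-created handler sits on the root logger with level NOTSET,
# so either raise its level or remove it entirely
for handler in list(root.handlers):
    root.removeHandler(handler)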
I made a helper module (helped greatly by this stack thread!) called custom_logging.py to make logging more convenient in other modules:
import logging
from pathlib import Path
import sys


def _add_stream_handler(logger: logging.Logger):
    stream_handler = logging.StreamHandler()
    formatter = logging.Formatter('%(name)-12s: %(levelname)-8s %(message)s')
    stream_handler.setFormatter(formatter)
    stream_handler.setLevel(logging.INFO)
    logger.addHandler(stream_handler)
    return logger


def _add_file_handler(logger: logging.Logger, log_path: Path):
    file_handler = logging.FileHandler(log_path, mode='w')
    formatter = logging.Formatter(
        fmt='%(asctime)s %(name)-12s %(levelname)-8s %(message)s', datefmt='%m-%d %H:%M')
    file_handler.setFormatter(formatter)
    file_handler.setLevel(logging.DEBUG)
    logger.addHandler(file_handler)
    return logger


def create_logger(root_dir: Path, caller: str) -> logging.Logger:
    log_path = root_dir / 'logs' / f'{caller}.log'
    logger = logging.getLogger(caller)
    root = logging.getLogger()
    logger.setLevel(logging.DEBUG)

    # If haven't already launched handlers...
    if not len(logger.handlers):
        _add_file_handler(logger=logger, log_path=log_path)
        _add_stream_handler(logger=logger)
        logger.info('Logging started.')

    # Delete the Qtconsole stderr handler
    # ... as it automatically logs both DEBUG & INFO to stderr
    if root.handlers:
        root.handlers = []

    return logger


def log_dataframe(df, logger: logging.Logger, name: str = "DataFrame") -> None:
    logger.debug(
        f'''{name} head:\n {df.head()}\n----------\n''')


def log_dataframes(*args, logger: logging.Logger) -> None:
    for gdf in args:
        logger.debug(
            f'''DataFrame head:\n {gdf.head()}\n----------\n''')
You can use its functions via:
from custom_logging import create_logger, log_dataframe
or via import custom_logging and custom_logging.create_logger(), etc.
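For example (a sketch; the directory and module name are assumptions, and the logs/ folder must already exist because FileHandler does not create directories):

from pathlib import Path
from custom_logging import create_logger

logger = create_logger(root_dir=Path.cwd(), caller='my_module')
logger.debug('written to logs/my_module.log only')
logger.info('written to the log file and the console')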
Also see sections 'Multiple handlers and formatters' and 'Logging to multiple destinations' in the official logging cookbook at:
https://docs.python.org/3/howto/logging-cookbook.html#logging-cookbook
I have set up logging as follows:
import logging

def setUp():
    LOG_FORMAT = '%(asctime)s %(levelname)-8s %(name)s %(message)s'
    # LOG_FORMAT = '%(asctime)s %(name)s %(message)s'
    logging.basicConfig(level=logging.DEBUG, format=LOG_FORMAT)
    formatter = logging.Formatter(LOG_FORMAT)

    ch = logging.StreamHandler()
    ch.setLevel(logging.ERROR)
    ch.setFormatter(formatter)
    logging.getLogger().addHandler(ch)

    LOG_FILENAME = 'file.log'
    fh = logging.FileHandler(LOG_FILENAME, 'w')
    fh.setLevel(logging.DEBUG)
    fh.setFormatter(formatter)
    logging.getLogger().addHandler(fh)
However, the console still shows DEBUG messages. Am I missing something here?
Note that setting the level to ERROR on fh works fine.
I think you need to remove the call to logging.basicConfig. That function adds another logging.StreamHandler, which is probably the one printing the messages you don't want.
To check this, you can take a look at the handlers attribute of the root logger (it's a list of all the handlers in use) and verify how many logging.StreamHandlers there are. Also, the messages at logging.ERROR are probably printed twice because of the two logging.StreamHandlers.
My final advice is to avoid logging.basicConfig if you're going to explicitly configure the handlers in code.
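A sketch of the setup with that advice applied (basicConfig removed, the root level set explicitly):

import logging

def setUp():
    LOG_FORMAT = '%(asctime)s %(levelname)-8s %(name)s %(message)s'
    formatter = logging.Formatter(LOG_FORMAT)

    root = logging.getLogger()
    root.setLevel(logging.DEBUG)    # lowest level any handler should see

    ch = logging.StreamHandler()
    ch.setLevel(logging.ERROR)      # console: errors only
    ch.setFormatter(formatter)
    root.addHandler(ch)

    fh = logging.FileHandler('file.log', 'w')
    fh.setLevel(logging.DEBUG)      # file: everything
    fh.setFormatter(formatter)
    root.addHandler(fh)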
Edit: just for completeness, the source code of logging.basicConfig is as follows:
if len(root.handlers) == 0:
    filename = kwargs.get("filename")
    if filename:
        mode = kwargs.get("filemode", 'a')
        hdlr = FileHandler(filename, mode)
    else:
        stream = kwargs.get("stream")
        hdlr = StreamHandler(stream)
    fs = kwargs.get("format", BASIC_FORMAT)
    dfs = kwargs.get("datefmt", None)
    fmt = Formatter(fs, dfs)
    hdlr.setFormatter(fmt)
    root.addHandler(hdlr)
    level = kwargs.get("level")
    if level is not None:
        root.setLevel(level)
where you can see that unless filename is passed, a logging.StreamHandler is created.
From Python docs on logging.basicConfig:
Does basic configuration for the logging system by creating a
StreamHandler with a default Formatter and adding it to the root
logger.
Since you set the root logger's level to logging.DEBUG and you didn't switch off propagation of messages up to the root logger, your DEBUG messages get logged by this StreamHandler created by basicConfig.
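Alternatively, on Python 3.3+ basicConfig accepts a handlers argument; passing your own handlers means no extra StreamHandler is created, and any handler without a formatter gets the configured format. A sketch:

import logging

LOG_FORMAT = '%(asctime)s %(levelname)-8s %(name)s %(message)s'

ch = logging.StreamHandler()
ch.setLevel(logging.ERROR)

fh = logging.FileHandler('file.log', 'w')
fh.setLevel(logging.DEBUG)

# basicConfig applies LOG_FORMAT to these handlers and skips its default StreamHandler
logging.basicConfig(level=logging.DEBUG, format=LOG_FORMAT, handlers=[ch, fh])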