
I have the following logger module (logger.py):

import logging, logging.handlers
import config

log = logging.getLogger('myLog')

def start():
    "Function sets up the logging environment."
    log.setLevel(logging.DEBUG)
    formatter = logging.Formatter(fmt='%(asctime)s [%(levelname)s] %(message)s', datefmt='%d-%m-%y %H:%M:%S')

    if config.logfile_enable:
        filehandler = logging.handlers.RotatingFileHandler(config.logfile_name, maxBytes=config.logfile_maxsize, backupCount=config.logfile_backupCount)
        filehandler.setLevel(logging.DEBUG)
        filehandler.setFormatter(formatter)
        log.addHandler(filehandler)

    console = logging.StreamHandler()
    console.setLevel(logging.DEBUG)
    console.setFormatter(logging.Formatter('[%(levelname)s] %(message)s')) # nicer format for console
    log.addHandler(console)

    # Levels are: debug, info, warning, error, critical.
    log.debug("Started logging to %s [maxBytes: %d, backupCount: %d]", config.logfile_name, config.logfile_maxsize, config.logfile_backupCount)

def stop():
    "Function closes and cleans up the logging environment."
    logging.shutdown()

To log, I call logger.start() once and then do from logger import log in any file of the project, using log.debug() and log.error() wherever needed. This works fine everywhere in the script (across different classes, functions, and files), but it does not work in processes started via the multiprocessing module.

I get the following error: No handlers could be found for logger "myLog"

What can I do?
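A minimal sketch of the failure mode (file name and helpers hypothetical): on platforms where multiprocessing starts a fresh interpreter for each child (the "spawn" start method, the default on Windows), the child re-imports your modules but never runs start(), so its 'myLog' logger has no handlers attached:

```python
# repro.py -- hypothetical minimal reproduction
import logging
import multiprocessing

def start():
    log = logging.getLogger('myLog')
    log.setLevel(logging.DEBUG)
    log.addHandler(logging.StreamHandler())

def worker():
    # Under the "spawn" start method the child begins with a fresh
    # interpreter: start() never ran here, so this logger has no
    # handlers and the record is dropped (Python 2 printed the
    # "No handlers could be found" warning in this situation).
    logging.getLogger('myLog').debug("hello from the child process")

if __name__ == '__main__':
    start()  # handlers are configured in the parent process only
    p = multiprocessing.Process(target=worker)
    p.start()
    p.join()
```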


1 Answer


From the Python docs: "logging to a single file from multiple processes is not supported, because there is no standard way to serialize access to a single file across multiple processes in Python."

See: http://docs.python.org/howto/logging-cookbook.html#logging-to-a-single-file-from-multiple-processes
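On modern Python (3.2+) the cookbook's recommended pattern for this is to have worker processes push records onto a shared queue with QueueHandler, while a single QueueListener in the parent process owns the real handlers, so only one process ever writes to the file. A minimal sketch (the logger name comes from the question; everything else is illustrative):

```python
import logging
import logging.handlers
import multiprocessing

def worker(queue):
    # Workers never touch the file themselves; each record is pushed
    # onto the shared queue via QueueHandler (Python 3.2+).
    log = logging.getLogger('myLog')
    log.setLevel(logging.DEBUG)
    log.addHandler(logging.handlers.QueueHandler(queue))
    log.debug("hello from %s", multiprocessing.current_process().name)

if __name__ == '__main__':
    queue = multiprocessing.Queue()

    # Only the listener (in the parent) holds the real handlers, so a
    # single process serializes all writes to the console or file.
    console = logging.StreamHandler()
    console.setFormatter(logging.Formatter('[%(levelname)s] %(message)s'))
    listener = logging.handlers.QueueListener(queue, console)
    listener.start()

    procs = [multiprocessing.Process(target=worker, args=(queue,)) for _ in range(3)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    listener.stop()
```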

BTW: what I do in this situation is use Scribe, which is a distributed log aggregator, and I log to it over TCP. This lets me log all the servers I have to the same place, not just all the processes.
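The standard library has no built-in Scribe support (the ScribeHandler package linked in the answer wraps Scribe's own protocol), but the general "log over TCP to one central collector" idea can be sketched with logging's own SocketHandler; the hostname here is a placeholder:

```python
import logging
import logging.handlers

log = logging.getLogger('myLog')
log.setLevel(logging.DEBUG)

# Ship records over TCP to a central collector. "logs.example.com" is
# a placeholder. SocketHandler pickles each LogRecord and sends it when
# a record is emitted; the connection is opened lazily, so merely
# constructing the handler does not touch the network.
tcp = logging.handlers.SocketHandler(
    'logs.example.com',
    logging.handlers.DEFAULT_TCP_LOGGING_PORT,
)
log.addHandler(tcp)
```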

See this project: http://pypi.python.org/pypi/ScribeHandler

Answered 2012-05-19T13:02:28.750