When using the multiprocessing
module, logging becomes less useful:
sub-processes must either log to individual files/streams or risk having their
records garbled when several processes write to the same handler concurrently.
This simple module implements a Handler
that, when installed on the root
Logger,
tunnels records from sub-processes to the main process so that
they are handled correctly.
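The tunneling can be sketched roughly like this. This is an illustrative simplification, not the library's actual implementation, and every name in it is made up: a Handler pickles each record onto a multiprocessing.Queue, and a thread in the main process drains the queue into the real handler.

```python
import logging
import multiprocessing
import threading


class QueueForwardingHandler(logging.Handler):
    """Illustrative sketch only: forwards records through a
    multiprocessing.Queue so a single process emits them."""

    def __init__(self, wrapped):
        super().__init__()
        self.wrapped = wrapped  # the real handler, owned by the main process
        self.queue = multiprocessing.Queue()
        # A background thread in the main process drains the queue.
        self._receiver = threading.Thread(target=self._receive, daemon=True)
        self._receiver.start()

    def emit(self, record):
        # Resolve the message before pickling, since args may not pickle.
        record.msg = record.getMessage()
        record.args = None
        self.queue.put(record)

    def _receive(self):
        while True:
            record = self.queue.get()
            if record is None:  # sentinel: stop draining
                break
            self.wrapped.emit(record)

    def close(self):
        self.queue.put(None)
        self._receiver.join()
        self.wrapped.close()
        super().close()


# Demo: wrap a simple in-memory handler and log through the queue.
class ListHandler(logging.Handler):
    def __init__(self):
        super().__init__()
        self.records = []

    def emit(self, record):
        self.records.append(record.getMessage())


target = ListHandler()
handler = QueueForwardingHandler(target)
logger = logging.getLogger("mp-sketch")
logger.addHandler(handler)
logger.warning("hello from %s", "the sketch")
handler.close()
```

The real library additionally has to deal with handler locks, forked state, and flushing on shutdown, but the queue-plus-receiver-thread shape is the core idea.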
It's currently tested on Linux with Python 2.7 and 3.3+.
PyPy3 hangs on the tests, so I don't recommend using it;
PyPy has synchronization issues.
Users have reported it to work on Windows with Python 3.5 and 3.6.
This library was taken verbatim from a StackOverflow post and extracted into a module so that I wouldn't have to copy the code in every project.
Several improvements have since been contributed.
Before you start logging, but after you configure the logging framework (perhaps with logging.basicConfig(...)
), do the following:
import multiprocessing_logging
multiprocessing_logging.install_mp_handler()
and that's it.
When using a Pool, make sure install_mp_handler
is called before the Pool is instantiated, for example:
import logging
from multiprocessing import Pool
from multiprocessing_logging import install_mp_handler
logging.basicConfig(...)
install_mp_handler()
pool = Pool(...)