My Python code is interlaced with lots of function calls used for debugging, profiling, tracing, etc. For example:
```python
import logging
logging.root.setLevel(logging.DEBUG)
logging.debug('hello')
j = 0
for i in range(10):
    j += i
    logging.debug('i %d j %d' % (i, j))
print(j)
logging.debug('bye')
```
I want to #define these resource-consuming functions out of the code, something like the C equivalent:

```c
#define logging.debug(val)
```

Yes, I know the logging module's log-level mechanism can be used to mask out messages below the set level. But I'm asking for a general way to have the Python interpreter skip such functions entirely (functions that take time to run even when they don't do much).
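To make the concern concrete, here is a small sketch showing that even when the log level masks a message, the argument expression is still evaluated before `logging.debug` is entered. (`expensive()` is a hypothetical costly helper invented for illustration, not part of any library.)

```python
import logging

logging.root.setLevel(logging.WARNING)  # DEBUG messages are masked

calls = 0

def expensive():
    """Hypothetical costly helper; counts how often it runs."""
    global calls
    calls += 1
    return 'details'

# The message is discarded, but expensive() still runs:
logging.debug('state: %s' % expensive())
print(calls)  # 1 - the argument was evaluated anyway

# logging's deferred %-formatting avoids the formatting cost on
# masked messages, but expensive() is still called to build the
# argument list:
logging.debug('state: %s', expensive())
print(calls)  # 2
```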
One idea is to redefine the functions I want to comment out as empty functions:

```python
def lazy(*args): pass
logging.debug = lazy
```

But this still calls a function, and may create a myriad of other problems.
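For what it's worth, here is a minimal sketch of that no-op replacement and why it doesn't save much: the call itself still happens, and more importantly the argument expressions are still evaluated before the no-op is even entered.

```python
import logging

def lazy(*args, **kwargs):
    """No-op stand-in for logging.debug."""
    pass

logging.debug = lazy  # monkey-patch the module-level function

logging.debug('this goes nowhere')  # no output, but the call still happens

# The real cost often remains: the argument expression is evaluated
# before lazy() runs, so this % formatting is not skipped.
logging.debug('i %d j %d' % (10, 45))
```

A further problem with this patch: any module that did `from logging import debug` keeps a reference to the original function and is unaffected.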
Python does not have a preprocessor, although you could run your Python source through an external preprocessor to get the same effect. For example, `sed "/logging.debug/d"` will strip out all the debug logging calls. This is not very elegant though: you will end up needing some sort of build system to run all your modules through the preprocessor, and perhaps create a new directory tree of the processed .py files before running the main script.

Alternatively, if you put all your debug statements in an `if __debug__:` block, they will get optimised out when Python is run with the -O (optimise) flag.

As an aside, I checked the code with the dis module to ensure that it did get optimised away. I discovered that both

```python
if __debug__: doStuff()
```

and

```python
if 0: doStuff()
```

are optimised, but

```python
if False: doStuff()
```

is not. This is because False is a regular Python object, and you can in fact do this:

```python
False = True
```

Which seems to me a flaw in the language; hopefully it is fixed in Python 3.
Edit:

This is fixed in Python 3: assigning to True or False now gives a SyntaxError. Since True and False are constants in Python 3, it means that

```python
if False: doStuff()
```

is now optimised as well.