9. Metaprogramming and Co.
Metaprogramming is a fascinating area of programming: it deals with code that manipulates other code. It is a broad
category that covers constructs such as function decorators, class decorators, metaclasses, context managers and the use of built-ins like exec and eval.
These constructs often help to eliminate repetitive code and add new functionality to a piece of code
in elegant ways. In this chapter, decorators, metaclasses and context managers are discussed.
9.1 Decorators
A decorator is a function that wraps another function or class. It introduces new functionality to the wrapped class or function without altering the original functionality, so the interface of the class or function remains the same.
Function Decorators
A good understanding of functions as first-class objects is important in order to understand function decorators; a reader will be well served by reviewing the material on functions. Because functions are first-class objects in Python, the following apply to them:
- Functions can be passed as arguments to other functions.
- Functions can be returned from other function calls.
- Functions can be defined within other functions resulting in closures.
The properties listed above provide the foundation needed to explain function decorators. Put simply, function decorators are “wrappers” that enable the execution of code before and after the function they decorate, without modifying the function itself.
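These three properties can be demonstrated in a few lines; the helper names shout, apply_twice and make_adder below are hypothetical, chosen only to illustrate each property:

```python
def shout(text):
    return text.upper()

def apply_twice(func, value):      # a function passed as an argument
    return func(func(value))

def make_adder(n):                 # a function defined inside another function...
    def adder(x):
        return x + n               # ...closing over n
    return adder                   # ...and returned from the enclosing call

add_three = make_adder(3)
print(apply_twice(shout, "hi"))    # HI
print(add_three(4))                # 7
```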
Function decorators are not unique to Python, so to explain them the Python decorator syntax is ignored for the moment and the essence of function decorators is focused on instead. To show what decorators do, a trivial function is decorated with another trivial function that logs calls to the decorated function. This decoration is achieved using plain function composition, as shown below (follow the comments):
import datetime

# the decorator expects another function as argument
def logger(func_to_dec):
    # a wrapper function is defined on the fly
    def func_wrapper():
        # add any functionality to run before the original function executes
        print("Calling function: {} at {}".format(func_to_dec.__name__, datetime.datetime.now()))
        # execute the original function
        func_to_dec()
        # add any functionality to run after the original function executes
        print("Finished calling : {}".format(func_to_dec.__name__))
    # return the wrapper function defined on the fly. The body of the
    # wrapper function has not been executed yet but a closure
    # over func_to_dec has been created.
    return func_wrapper

def print_full_name():
    print("My name is John Doe")
# use composition to decorate the print_full_name function
>>> decorated_func = logger(print_full_name)

# the returned value, decorated_func, is a reference to a func_wrapper
>>> decorated_func
<function func_wrapper at 0x101ed2578>

>>> decorated_func()
Calling function: print_full_name at 2015-01-24 13:48:05.261413
# the original functionality is preserved
My name is John Doe
Finished calling : print_full_name
In the trivial example above, the decorator adds a new feature, printing some information before and after the original function
call, without altering the original function. The decorator, logger, takes a function to be decorated, print_full_name, and returns
a function, func_wrapper, that calls the decorated function when it is executed. The decoration process here is simply calling
the decorator with the function to be decorated as argument. The returned function, func_wrapper, is closed over the reference to the
decorated function and can thus invoke it when executing. In the above, calling
decorated_func results in print_full_name being executed in addition to the code that implements the new functionality.
This ability to add new functionality to a function without modifying the original function is the essence of function decorators; once
this is understood, decorators are understood.
Decorators in Python
Now that the essence of function decorators has been discussed, the Python constructs that make defining decorators easier can be examined. The previous section describes the essence of decorators, but applying decorators via function
composition as described there is cumbersome, so Python provides the @ symbol for decorating functions. Decorating a function using the Python
decorator syntax is achieved as shown in the following example.
@decorator
def a_stand_alone_function():
    pass
Calling a_stand_alone_function now is equivalent to calling the decorated_func function from the previous section, but there is no longer
a need to define an intermediate decorated_func.
It is important to understand what the @ symbol does with respect to decorators in Python. The @decorator line does not define a Python
decorator; rather, it can be thought of as syntactic sugar for decorating a function. Decorating a function is the
process of applying an existing decorator to that function; the decorator is the actual function, decorator, that adds the new
functionality to the original function. According to PEP 318, the following decorator
snippet
@dec2
@dec1
def func(arg1, arg2, ...):
    pass
is equivalent to
def func(arg1, arg2, ...):
    pass
func = dec2(dec1(func))
without the intermediate assignment to the variable func. In the above, @dec1 and @dec2 are the decorator invocations. Stop, think carefully and ensure you understand this: dec1 and dec2 are references to function objects, and these are the actual decorators. The name after the @ symbol can even be replaced by any function call or expression that, when evaluated, returns a function that takes another function. What is of paramount importance is that the reference following the @ symbol evaluates to a function object (for this tutorial a function object is assumed, but in reality it may be any callable object) that takes a function as argument. Understanding this fact will help in understanding Python decorators and more involved decorator topics such as decorators that take arguments.
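The PEP 318 equivalence above can be checked directly. The sketch below (with hypothetical decorators dec1 and dec2 that tag their wrapped function's result) decorates one function with the stacked @ syntax and a second, identical function by explicit composition; both produce the same result:

```python
def dec1(func):
    def wrapper(*args, **kwargs):
        return "dec1(" + func(*args, **kwargs) + ")"
    return wrapper

def dec2(func):
    def wrapper(*args, **kwargs):
        return "dec2(" + func(*args, **kwargs) + ")"
    return wrapper

@dec2
@dec1
def greet():
    return "hello"

# manual composition, equivalent to the stacked @ syntax above
def greet2():
    return "hello"
greet2 = dec2(dec1(greet2))

print(greet())   # dec2(dec1(hello))
print(greet2())  # dec2(dec1(hello))
```

Note that the decorator nearest the function definition, dec1, is applied first.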
Passing Arguments To Decorated Functions
Arguments are supplied to decorated functions by simply passing the arguments into the function that wraps the decorated function, i.e. the inner function returned when the decorator is invoked. This is illustrated with the following example.
import datetime

# the decorator expects another function as argument
def logger(func_to_decorate):
    # a wrapper function is defined on the fly
    def func_wrapper(*args, **kwargs):
        # add any functionality to run before the original function executes
        print("Calling function: {} at {}".format(func_to_decorate.__name__, datetime.datetime.now()))
        # execute the original function, forwarding the arguments
        func_to_decorate(*args, **kwargs)
        # add any functionality to run after the original function executes
        print("Finished calling : {}".format(func_to_decorate.__name__))
    # return the wrapper function defined on the fly. The body of the
    # wrapper function has not been executed yet but a closure over
    # func_to_decorate has been created.
    return func_wrapper

@logger
def print_full_name(first_name, last_name):
    print("My name is {} {}".format(first_name, last_name))

print_full_name("John", "Doe")
Calling function: print_full_name at 2015-01-24 14:36:36.691557
My name is John Doe
Finished calling : print_full_name
Note how the *args and **kwargs parameters are used in defining the inner wrapper function; this is because it cannot be known beforehand which functions are going to be decorated, and therefore what their signatures will be.
Decorator Functions with Arguments
Decorator functions can also be defined to take arguments of their own, but this is more involved than passing arguments to decorated functions. The following example illustrates this.
import datetime

# this function takes arguments and returns a function;
# the returned function is our actual decorator
def decorator_maker_with_arguments(decorator_arg1):

    # this is our actual decorator, which accepts a function
    def decorator(func_to_decorate):

        # the wrapper function takes arguments for the decorated
        # function
        def wrapped(function_arg1, function_arg2):
            # add any functionality to run before the original
            # function executes
            print("Calling function: {} at {} with decorator arguments: {} and function arguments:{} {}".format(
                func_to_decorate.__name__, datetime.datetime.now(),
                decorator_arg1, function_arg1, function_arg2))
            func_to_decorate(function_arg1, function_arg2)
            # add any functionality to run after the original
            # function executes
            print("Finished calling : {}".format(func_to_decorate.__name__))
        return wrapped
    return decorator

@decorator_maker_with_arguments("Apollo 11 Landing")
def print_name(function_arg1, function_arg2):
    print("My full name is -- {} {} --".format(function_arg1, function_arg2))
>>> print_name("Tranquility base ", "To Houston")
Calling function: print_name at 2015-01-24 15:03:23.696982 with decorator arguments: Apollo 11 Landing and function arguments:Tranquility base To Houston
My full name is -- Tranquility base To Houston --
Finished calling : print_name
As mentioned previously, the key to understanding what is going on here is to note that the reference following the @ in a function decoration can be replaced with any value that evaluates to a function object that takes another function as argument. In the above snippet, the value returned by the function call decorator_maker_with_arguments("Apollo 11 Landing") is the decorator. The call evaluates to a function, decorator, that accepts a function as argument. Thus the decoration @decorator_maker_with_arguments("Apollo 11 Landing") is equivalent to @decorator, but with the decorator, decorator, closed over the argument, Apollo 11 Landing, by the decorator_maker_with_arguments function call. Note that the arguments supplied to a decorator cannot be changed dynamically at run time: they are evaluated once, when the decorated function is defined, typically at module import.
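This evaluation order can be observed directly by adding prints at each stage; the names maker and task below are illustrative. Both messages appear exactly once, at decoration time, and only the wrapper body runs on subsequent calls:

```python
def maker(arg):
    print("maker called with", arg)           # runs once, at decoration time
    def decorator(func):
        print("decorator applied to", func.__name__)  # also runs once
        def wrapper():
            return func()
        return wrapper
    return decorator

@maker("fixed at import")                     # both prints fire here
def task():
    return "result"

print(task())   # only the wrapper runs now; maker is not called again
```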
functools.wraps
Using decorators involves swapping out one function for another. A result of this is that metadata, such as the docstring of the swapped-out function, is lost when the function is decorated. This is illustrated below:
import datetime

# the decorator expects another function as argument
def logger(func_to_decorate):
    # a wrapper function is defined on the fly
    def func_wrapper():
        # add any functionality to run before the original function executes
        print("Calling function: {} at {}".format(func_to_decorate.__name__, datetime.datetime.now()))
        # execute the original function
        func_to_decorate()
        # add any functionality to run after the original function executes
        print("Finished calling : {}".format(func_to_decorate.__name__))
    # return the wrapper function defined on the fly
    return func_wrapper

@logger
def print_full_name():
    """return john doe's full name"""
    print("My name is John Doe")
>>> print(print_full_name.__doc__)
None
>>> print(print_full_name.__name__)
func_wrapper
In the above example, an attempt to print the documentation string returns None because the decorator has swapped out the print_full_name function for the func_wrapper function, which has no documentation string.
Even the function name now references the name of the wrapper function rather than that of the actual function. This is usually not what we want when using decorators. To work around it, the functools module provides the wraps function, which happens to be a decorator itself. This decorator is applied to the wrapper function and takes the function to be decorated as argument. Its usage is illustrated in the following example.
import datetime
from functools import wraps

# the decorator expects another function as argument
def logger(func_to_decorate):
    # wraps copies metadata from the decorated function onto the wrapper
    @wraps(func_to_decorate)
    def func_wrapper(*args, **kwargs):
        # add any functionality to run before the original function executes
        print("Calling function: {} at {}".format(func_to_decorate.__name__, datetime.datetime.now()))
        # execute the original function
        func_to_decorate(*args, **kwargs)
        # add any functionality to run after the original function executes
        print("Finished calling : {}".format(func_to_decorate.__name__))
    return func_wrapper

@logger
def print_full_name(first_name, last_name):
    """return john doe's full name"""
    print("My name is {} {}".format(first_name, last_name))
>>> print(print_full_name.__doc__)
return john doe's full name
>>> print(print_full_name.__name__)
print_full_name
Class Decorators
Like functions, classes can also be decorated. Class decorators serve the same purpose as function decorators - introducing new functionality without modifying the actual classes. An example of a class decorator is the following singleton decorator, which ensures that only one instance of a decorated class is ever initialised throughout the lifetime of the program.
def singleton(cls):
    instances = {}
    def get_instance():
        if cls not in instances:
            instances[cls] = cls()
        return instances[cls]
    return get_instance
Putting the decorator to use shows how this works. In the following example, the Foo class is initialized twice; however, comparing the ids of both initialized objects shows that they refer to the same object.
@singleton
class Foo(object):
    pass
>>> x = Foo()
>>> id(x)
4310648144
>>> y = Foo()
>>> id(y)
4310648144
>>> id(y) == id(x) # both x and y are the same object
True
>>>
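One caveat of this recipe is worth noting: after decoration, the name Foo is bound to the inner get_instance function, not to a class, so tools that expect a class (for example isinstance checks against Foo) no longer work. A small self-contained sketch illustrates this:

```python
def singleton(cls):
    instances = {}
    def get_instance():
        if cls not in instances:
            instances[cls] = cls()
        return instances[cls]
    return get_instance

@singleton
class Foo(object):
    pass

x, y = Foo(), Foo()
print(x is y)       # True -- both names refer to the same instance
print(type(Foo))    # <class 'function'> -- Foo is now get_instance
# isinstance(x, Foo) would raise TypeError because Foo is no longer a class
```

The metaclass-based version shown next avoids this drawback because Foo remains a class.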
The same singleton functionality can be achieved using a metaclass by overriding the __call__ method of the metaclass as shown below:
class Singleton(type):
    _instances = {}
    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs)
        return cls._instances[cls]

class Foo(object, metaclass=Singleton):
    pass
>>> x = Foo()
>>> y = Foo()
>>> id(x)
4310648400
>>> id(y)
4310648400
>>> id(y) == id(x)
True
Applying Decorators to instance and static methods
Instance, static and class methods can also be decorated. The important thing is to take note of the order in which the decorators are placed for static and class methods: the custom decorator must be applied first, that is, placed below @staticmethod or @classmethod and closest to the def, because those built-in method decorators do not return callable objects. A valid example of method decoration is shown below.
import time
from functools import wraps

def timethis(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.time()
        r = func(*args, **kwargs)
        end = time.time()
        print(end - start)
        return r
    return wrapper

# class illustrating application of the decorator to different kinds of methods
class Spam:
    @timethis
    def instance_method(self, n):
        print(self, n)
        while n > 0:
            n -= 1

    @classmethod
    @timethis
    def class_method(cls, n):
        while n > 0:
            print(n)
            n -= 1

    @staticmethod
    @timethis
    def static_method(n):
        while n > 0:
            print(n)
            n -= 1
>>> Spam.class_method(10)
10
9
8
7
6
5
4
3
2
1
0.00019788742065429688
>>> Spam.static_method(10)
10
9
8
7
6
5
4
3
2
1
0.00014591217041015625
9.2 Decorator Recipes
Decorators have a wide range of applications in Python; this section discusses some interesting uses of decorators and provides the implementation of each. The following are just samples of the possible applications; a more comprehensive listing of recipes, including the examples discussed in this section, can be found on the Python Decorator Library website. A major benefit of many decorators is that cross-cutting concerns, such as logging, can be handled in a single place, the decorator, rather than across multiple functions. The benefit of having such functionality in one place is that changes are localised and much easier to maintain. The following recipes illustrate this.
- Decorators provide a means to log information about other functions, such as timing or argument information. An example of such a decorator is shown in the following snippet.

import functools
import logging

def log(func):
    '''Returns a wrapper that wraps func. The wrapper will log the entry
    and exit points of the function with logging.INFO level.'''
    logging.basicConfig()
    logger = logging.getLogger(func.__module__)

    @functools.wraps(func)
    def wrapper(*args, **kwds):
        logger.info("About to execute {}".format(func.__name__))
        f_result = func(*args, **kwds)
        logger.info("Finished the execution of {}".format(func.__name__))
        return f_result
    return wrapper

- A memoization decorator can be used to decorate a function that performs a calculation so that, for a given argument, if the result has been previously computed the stored value is returned, and if it has not, it is computed and stored before being returned to the caller. This kind of decorator is available in the functools module as discussed in the chapter on functions. An implementation of such a decorator is shown in the following snippet.

import collections.abc
import functools
import logging

def cache(func):
    cache = {}
    logging.basicConfig()
    logger = logging.getLogger(func.__module__)
    logger.setLevel(10)

    @functools.wraps(func)
    def wrapper(*arg, **kwds):
        if not isinstance(arg, collections.abc.Hashable):
            logger.info("Argument cannot be cached: {}".format(arg))
            return func(*arg, **kwds)
        if arg in cache:
            logger.info("Found precomputed result, {}, for argument, {}".format(cache[arg], arg))
            return cache[arg]
        else:
            logger.info("No precomputed result was found for argument, {}".format(arg))
            value = func(*arg, **kwds)
            cache[arg] = value
            return value
    return wrapper

- Decorators can also easily be used to implement functionality that retries a callable up to a maximum number of times.

from time import sleep

def retries(max_tries, delay=1, backoff=2, exceptions=(Exception,), hook=None):
    """Function decorator implementing retrying logic. The decorator will
    call the function up to max_tries times if it raises an exception."""
    def dec(func):
        def f2(*args, **kwargs):
            mydelay = delay
            tries = list(range(max_tries))
            tries.reverse()
            for tries_remaining in tries:
                try:
                    return func(*args, **kwargs)
                except exceptions as e:
                    if tries_remaining > 0:
                        if hook is not None:
                            # hook is any function we want to call
                            # when the original function fails
                            hook(tries_remaining, e, mydelay)
                        sleep(mydelay)
                        mydelay = mydelay * backoff
                    else:
                        raise
        return f2
    return dec

- Another very interesting decorator recipe is the use of decorators to enforce argument types for function calls, as shown in the following example.

def accepts(*types, **kw):
    '''Function decorator. Checks that the decorated function's arguments
    are of the expected types.

    Parameters:
    types -- The expected types of the inputs to the decorated function.
             Must specify a type for each parameter.
    kw -- Optional specification of the 'debug' level (this is the only
          valid keyword argument). debug = ( 0 | 1 | 2 )
    '''
    if not kw:
        # default level: MEDIUM
        debug = 1
    else:
        debug = kw['debug']

    def decorator(f):
        def newf(*args):
            if debug == 0:
                return f(*args)
            assert len(args) == len(types)
            argtypes = tuple(map(type, args))
            if argtypes != types:
                msg = info(f.__name__, types, argtypes, 0)
                if debug == 1:
                    raise TypeError(msg)
            return f(*args)
        newf.__name__ = f.__name__
        return newf
    return decorator

def info(fname, expected, actual, flag):
    '''Convenience function returning a nicely formatted error/warning message.'''
    format = lambda types: ', '.join([str(t).split("'")[1] for t in types])
    expected, actual = format(expected), format(actual)
    msg = "'{}' method ".format(fname) \
          + ("accepts", "returns")[flag] + " ({}), but ".format(expected) \
          + ("was given", "result is")[flag] + " ({})".format(actual)
    return msg

>>> @test_concat.accepts(int, int, int)
... def div_sum_by_two(x, y, z):
...     return sum([x, y, z]) / 2
...
>>> div_sum_by_two('obi', 'nkem', 'chuks')  # calling with wrong argument types
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/c4obi/src/test_concat.py", line 104, in newf
    raise TypeError(msg)
TypeError: 'div_sum_by_two' method accepts (int, int, int), but was given (str, str, str)

- A common use of class decorators is for registering classes as the class statements are executed, as shown in the following example.
registry = {}

def register(cls):
    registry[cls.__clsid__] = cls
    return cls

@register
class Foo(object):
    __clsid__ = ".mp3"

    def bar(self):
        pass
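Such a registry can then be used to look up and instantiate the right class at run time by its registered id; the handler class names below are hypothetical:

```python
registry = {}

def register(cls):
    # record the class under its declared id as the class statement executes
    registry[cls.__clsid__] = cls
    return cls

@register
class Mp3Handler(object):
    __clsid__ = ".mp3"

@register
class WavHandler(object):
    __clsid__ = ".wav"

# dispatch on the registered id at run time
handler = registry[".mp3"]()
print(type(handler).__name__)  # Mp3Handler
```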
9.3 Metaclasses
“Metaclasses are deeper magic than 99% of users should ever worry about. If you wonder whether you need them, you don’t”
– Tim Peters
All values in Python are objects, including classes, so every class object must itself be created from another class. Consider an instance, f, of a user-defined class Foo. The type/class of the instance, f, can be found using the built-in function type; in the case of the object f, the type of f is Foo.
>>> class Foo(object):
...     pass
...
>>> f = Foo()
>>> type(f)
<class '__main__.Foo'>
>>>
This introspection can also be extended to a class object to find out the type/class of that class. The following example shows the result of applying the type() function to the Foo class.
class Foo(object):
    pass
>>> type(Foo)
<class 'type'>
In Python, the class of all other class objects is the type class. This applies to user defined classes as shown above as well as built-in classes as shown in the following code example.
>>> type(dict)
<class 'type'>
A class, such as the type class, that is used to create other classes is called a metaclass. That is all there is to it: a metaclass is a class that is used to create other classes. Custom metaclasses are not used often in Python, but sometimes it is necessary to control the way classes are created, especially when working on big projects with big teams.
Before explaining how metaclasses are used to customize class creation, a recap of how class objects are created when a class statement is encountered during the execution of a program is provided.
The following snippet is the class definition for a simple class that every Python user is familiar with but this is not the only way a class can be defined.
# class definition
class Foo(object):
    def __init__(self, name):
        self.name = name

    def print_name(self):
        print(self.name)
The following snippet shows a more involved method for defining the same class with all the syntactic sugar provided by the class keyword stripped away. This snippet provides a better understanding of what actually goes on under the covers during the execution of a class statement.
class_name = "Foo"
class_parents = (object,)
class_body = """
def __init__(self, name):
    self.name = name

def print_name(self):
    print(self.name)
"""
# a new dict is used as the local namespace
class_dict = {}

# the body of the class is executed using the dict above as the local
# namespace
exec(class_body, globals(), class_dict)

# viewing the class dict reveals the name bindings from the class body
>>> class_dict
{'__init__': <function __init__ at 0x10066f8c8>, 'print_name': <function print_name at 0x10066fa60>}

# final step of class creation
Foo = type(class_name, class_parents, class_dict)
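To confirm that the manually assembled class behaves just like one defined with a class statement, the steps can be run end to end; the attribute value below is purely illustrative:

```python
class_dict = {}
exec("""
def __init__(self, name):
    self.name = name

def print_name(self):
    print(self.name)
""", globals(), class_dict)

# instantiate the type metaclass to create the class object
Foo = type("Foo", (object,), class_dict)

f = Foo("metaprogramming")
print(f.name)              # metaprogramming
print(isinstance(f, Foo))  # True
f.print_name()             # metaprogramming
```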
During the execution of a class statement, the interpreter carries out the following procedure behind the scenes:
- The body of the class statement is isolated in a string.
- A class dictionary representing the namespace for the class is created.
- The body of the class is executed as a set of statements within this namespace.
- As a final step, the class object is created by instantiating the type class, passing in the class name, base classes and class dictionary as arguments.

The type class used here in creating the Foo class object is the metaclass. The metaclass used in creating a class object can be explicitly specified by supplying the metaclass keyword argument in the class definition. If it is not supplied, the class statement essentially uses the type of the first entry in the tuple of base classes, if any; if no base classes are specified, the default metaclass, type, is used.
Armed with a basic understanding of metaclasses, a discussion of their applications follows.
Metaclasses in Action
It is possible to define custom metaclasses that can be used when creating classes. These custom metaclasses will normally inherit from type and re-implement certain methods such as the __init__ or __new__ methods.
Imagine that you are the chief architect for a shiny new project and you have diligently read dozens of software engineering books and style guides that have hammered on the importance of docstrings, so you want to enforce the requirement that all non-private methods in the project must have docstrings; how would you enforce this requirement?
A simple and straightforward solution is to create a custom metaclass for use across the project that enforces this requirement. The snippet that follows though not of production quality is an example of such a metaclass.
class DocMeta(type):
    def __init__(self, name, bases, attrs):
        for key, value in attrs.items():
            # skip special and private methods
            if key.startswith("__"):
                continue
            # skip any non-callable
            if not hasattr(value, "__call__"):
                continue
            # check for a docstring. A better approach may be to collect
            # all methods without a docstring and then raise an error
            # listing all of them rather than stopping at the first one
            if not getattr(value, '__doc__'):
                raise TypeError("%s must have a docstring" % key)
        type.__init__(self, name, bases, attrs)
DocMeta is a type subclass that overrides the type class __init__ method. The implemented __init__ method iterates through all the class attributes searching for non-private methods missing a docstring; if one is encountered, an exception is thrown as shown below. Note that because the metaclass's __init__ runs when the class statement executes, the exception is raised at class definition time, before any instance is ever created.
class Car(object, metaclass=DocMeta):
    def __init__(self, make, model, color):
        self.make = make
        self.model = model
        self.color = color

    def change_gear(self):
        print("Changing gear")

    def start_engine(self):
        print("Starting engine")

Traceback (most recent call last):
  File "abc.py", line 47, in <module>
    class Car(object, metaclass=DocMeta):
  File "abc.py", line 42, in __init__
    raise TypeError("%s must have a docstring" % key)
TypeError: change_gear must have a docstring
Another trivial example that illustrates an application of a metaclass is the creation of a final class, that is, a class that cannot be sub-classed. Some people may argue that final classes are unpythonic, but for illustration purposes such functionality is implemented using a metaclass in the following snippet.
class Final(type):
    def __init__(cls, name, bases, namespace):
        super().__init__(name, bases, namespace)
        for c in bases:
            if isinstance(c, Final):
                raise TypeError(c.__name__ + " is final")
>>> class B(object, metaclass=Final):
...     pass
...
>>> class C(B):
...     pass
...
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 6, in __init__
TypeError: B is final
In the example, the metaclass simply performs a check ensuring that a final class never appears among the base classes of any class being created.
Another very good example of a metaclass in action is the Abstract Base Class machinery that was discussed previously. When defining an abstract base class, the ABCMeta metaclass from the abc module is used as the metaclass of the class being defined, and the @abstractmethod and @abstractproperty decorators are used to create methods and properties that must be implemented by non-abstract subclasses.
from abc import ABCMeta, abstractmethod

class Vehicle(object, metaclass=ABCMeta):
    @abstractmethod
    def change_gear(self):
        pass

    @abstractmethod
    def start_engine(self):
        pass

class Car(Vehicle):
    def __init__(self, make, model, color):
        self.make = make
        self.model = model
        self.color = color

# abstract methods not implemented
>>> car = Car("Toyota", "Avensis", "silver")
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class Car with abstract methods change_gear, start_engine
>>>
Once a class implements all abstract methods then such a class becomes a concrete class and can be instantiated by a user.
from abc import ABCMeta, abstractmethod

class Vehicle(object, metaclass=ABCMeta):
    @abstractmethod
    def change_gear(self):
        pass

    @abstractmethod
    def start_engine(self):
        pass

class Car(Vehicle):
    def __init__(self, make, model, color):
        self.make = make
        self.model = model
        self.color = color

    def change_gear(self):
        print("Changing gear")

    def start_engine(self):
        print("Starting engine")

>>> car = Car("Toyota", "Avensis", "silver")
>>> print(isinstance(car, Vehicle))
True
Overriding __new__ vs __init__ in Custom Metaclasses
Sometimes there is confusion over whether to override the __init__ or the __new__ method when defining custom metaclasses. The decision depends on the objective of the custom metaclass. If the intent is to modify the class by changing class attributes, such as the list of base classes or attribute names, then the __new__ method should be overridden. The following example is a metaclass that checks for camel-case attribute names and converts them to names with underscores between words.
import re

class NamingMeta(type):
    def __new__(mcl, name, bases, attrs):
        new_attrs = dict()
        for key, value in attrs.items():
            updated_name = NamingMeta.convert(key)
            new_attrs[updated_name] = value
        return super(NamingMeta, mcl).__new__(mcl, name, bases, new_attrs)

    @staticmethod
    def convert(name):
        s1 = re.sub('(.)([A-Z][a-z]+)', r'\1_\2', name)
        return re.sub('([a-z0-9])([A-Z])', r'\1_\2', s1).lower()
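A usage sketch of this metaclass follows; the class is repeated so the snippet is self-contained, and Greeter and sayHello are hypothetical names. A camel-case method defined in the class body ends up accessible only under its underscore name:

```python
import re

class NamingMeta(type):
    def __new__(mcl, name, bases, attrs):
        # rebuild the attribute dict with converted names before the
        # class object is created
        new_attrs = {}
        for key, value in attrs.items():
            new_attrs[NamingMeta.convert(key)] = value
        return super(NamingMeta, mcl).__new__(mcl, name, bases, new_attrs)

    @staticmethod
    def convert(name):
        s1 = re.sub('(.)([A-Z][a-z]+)', r'\1_\2', name)
        return re.sub('([a-z0-9])([A-Z])', r'\1_\2', s1).lower()

class Greeter(metaclass=NamingMeta):
    def sayHello(self):
        return "hello"

g = Greeter()
print(g.say_hello())           # hello -- renamed by the metaclass
print(hasattr(g, "sayHello"))  # False -- the camel-case name is gone
```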
It would not be possible to modify class attributes such as the list of base classes or attribute names in the __init__ method because, as noted previously, that method is called after the class object has already been created.
On the other hand, when the intent is just to carry out initialization or validation checks, as was done with the DocMeta and Final metaclasses, then the __init__ method of the metaclass should be overridden.
9.4 Context Managers
Sometimes there is a need to execute some operations between another pair of operations: for example, open a file, read from the file and close the file; or acquire a lock on a data structure, work with the data structure and release the lock. These kinds of requirements come up especially when dealing with system resources, where the resource is acquired, worked with and then released. It is important that the acquisition and release of such resources are handled carefully so that any errors that occur are handled correctly. Writing code to handle this every time leads to a lot of repetitive and cumbersome code. Context managers provide a solution: they provide a means of abstracting away a pair of operations that are executed before and after another group of operations, using the with statement. The with statement enables a set of operations to run within a context, and the context is controlled by a context manager object.
An example of the use of the with statement is in opening files; this involves a pair of operations - opening and closing the file.
# create a context
with open('output.txt', 'w') as f:
    # carry out operations within the context
    f.write('Hi there!')
The with statement can be used with any object that implements the context management protocol. This protocol consists of two methods, __enter__ and __exit__, which are executed just before and just after the piece of code that runs within the context, respectively. Generally, the definition and use of a context manager take the form shown in the following snippet.
class context:
    def __enter__(self):
        # set resource up
        return resource

    def __exit__(self, type, value, traceback):
        # tear resource down
        ...

# the value returned by the __enter__ method is bound to name
with context() as name:
    # do some functionality
    ...
If the initialised resource is used within the context then the __enter__ method must return the resource object so that it is bound to a name within the with statement using the as clause. A resource object need not be returned if the code being executed in the context does not require a reference to the object that is set up. The following is a very trivial example of a class that implements the context management protocol.
>>> import time
>>> class Timer:
...     def __enter__(self):
...         self.start_time = time.time()
...     def __exit__(self, type, value, traceback):
...         print("Operation took {} seconds to complete".format(time.time()-self.start_time))
...
>>> with Timer():
...     print("Hey testing context managers")
...
Hey testing context managers
Operation took 0.00010395050048828125 seconds
>>>
When the with statement executes, the __enter__() method is called to create a new context; if a resource is initialized for use, it is returned here, but that is not the case in this example. After the operations within the context are executed, the __exit__() method is called with the exception type, value and traceback as arguments. If no exception is raised during the execution of the operations within the context then all three arguments are set to None. The __exit__ method returns True or False depending on whether any raised exception has been handled; when False is returned, an exception raised within the context is propagated outside of it for other code blocks to handle. Any resource clean-up is also carried out within the __exit__() method. This is all there is to context management. Now, rather than writing try...finally code to ensure that a file is closed or that a lock is released every time such a resource is used, such chores can be handled in the __exit__ method of a context manager class, thus eliminating code duplication and making the code more intelligible.
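To illustrate the __exit__ return value, the Suppress class below returns True for a chosen exception type so that such exceptions are swallowed rather than propagated; the class is a hypothetical sketch for illustration (the standard library provides a similar contextlib.suppress):

```python
class Suppress:
    """Suppress exceptions of a given type raised inside the context."""
    def __init__(self, exc_type):
        self.exc_type = exc_type

    def __enter__(self):
        return self

    def __exit__(self, type, value, traceback):
        # returning True tells the with statement the exception has been
        # handled; returning False lets it propagate out of the context
        return type is not None and issubclass(type, self.exc_type)

with Suppress(ZeroDivisionError):
    1 / 0  # raised inside the context but handled by __exit__
print("execution continues after the context")
```

Any exception that is not an instance of the given type still propagates, since __exit__ returns False for it.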
The contextlib Module
For very simple use cases, there is no need to go through the hassle of implementing our own classes with __enter__ and __exit__ methods. The Python contextlib module provides a higher-level way of implementing context managers. To define a context manager, the @contextmanager decorator from the contextlib module is used to decorate a function that handles the resource in question or carries out any initialization and clean-up; this function must, however, be a generator function. The following example illustrates this.
>>> import time
>>> from contextlib import contextmanager
>>> @contextmanager
... def time_func():
...     start_time = time.time()
...     yield
...     print("Operation took {} seconds".format(time.time()-start_time))
...
>>> with time_func():
...     print("Hey testing the context manager")
...
Hey testing the context manager
Operation took 7.009506225585938e-05 seconds
The decorated generator function, time_func in this case, must yield exactly once; if a value is to be bound to a name in the with statement's as clause, that value must be yielded. When the generator yields, the code block nested in the with statement is executed, and the generator is resumed after the block finishes execution. If an exception occurs during the execution of the block and is not handled there, the exception is re-raised inside the generator at the point where the yield occurred. If the generator catches an exception for purposes other than adequately handling it, it must re-raise that exception; otherwise the context manager will indicate to the with statement that the exception has been handled, and execution will resume normally after the context block.
Context managers, just like decorators and metaclasses, provide a clean method of abstracting away this kind of repetitive code, which can clutter programs and make their logic difficult to follow.