We continue our discussion of metaprogramming in Python. When used correctly, it allows you to implement complex design patterns quickly and elegantly. In the previous part of this article, we showed how metaclasses can be used to change the attributes of instances and classes.
Now let's see how method calls can be modified. You can learn more about metaprogramming features in the Advanced Python course.
As you have already seen, a metaclass can transform any class beyond recognition: for example, it can replace all of the class's methods with others or apply an arbitrary decorator to each method. This idea can be used to debug application performance.
The following metaclass measures the execution time of each method in the class and its instances, as well as the creation time of the instance itself:
from contextlib import contextmanager
import logging
import time

import wrapt

# note: logging must be configured (e.g. logging.basicConfig(level=logging.INFO))
# for the messages below to be visible


@contextmanager
def timing_context(operation_name):
    """Context manager that measures the execution time of an operation."""
    start_time = time.time()
    try:
        yield
    finally:
        logging.info('Operation "%s" completed in %0.2f seconds',
                     operation_name, time.time() - start_time)


@wrapt.decorator
def timing(func, instance, args, kwargs):
    """Decorator that measures the execution time of a function.
    Uses the wrapt library: https://wrapt.readthedocs.io/en/latest/
    """
    with timing_context(func.__name__):
        return func(*args, **kwargs)


class DebugMeta(type):

    def __new__(mcs, name, bases, attrs):
        for attr, method in attrs.items():
            if not attr.startswith('_'):
                # wrap every public method with the timing decorator
                attrs[attr] = timing(method)
        return super().__new__(mcs, name, bases, attrs)

    def __call__(cls, *args, **kwargs):
        with timing_context(f'{cls.__name__} instance creation'):
            # measure how long it takes to create an instance
            return super().__call__(*args, **kwargs)
Let's look at debugging in action:
class User(metaclass=DebugMeta):

    def __init__(self, name):
        self.name = name
        time.sleep(.7)

    def login(self):
        time.sleep(1)

    def logout(self):
        time.sleep(2)

    @classmethod
    def create(cls):
        time.sleep(.5)


user = User('Michael')
user.login()
user.logout()
user.create()

# INFO:__main__:Operation "User instance creation" completed in 0.70 seconds
# INFO:__main__:Operation "login" completed in 1.00 seconds
# INFO:__main__:Operation "logout" completed in 2.00 seconds
# INFO:__main__:Operation "create" completed in 0.50 seconds
As an exercise, try extending DebugMeta to also log method signatures and their stack traces.
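One possible direction, as a minimal sketch rather than a complete solution: swap the timing decorator for one that also logs the arguments each call was made with via inspect.signature (the debug_call name is made up for illustration; the stack-trace part is only hinted at in a comment):

import inspect
import logging

import wrapt


@wrapt.decorator
def debug_call(func, instance, args, kwargs):
    """Log the name and the bound arguments of every call."""
    bound = inspect.signature(func).bind(*args, **kwargs)
    logging.info('Calling %s(%s)', func.__name__, dict(bound.arguments))
    # traceback.format_stack() from the standard traceback module could
    # additionally be logged here to capture the current call stack
    return func(*args, **kwargs)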
Now let's turn to more exotic uses of metaclasses in Python projects.
Surely many of you use a plain Python module to implement the Singleton design pattern, because it is much more convenient and faster than writing the corresponding metaclass. Still, let's write one of the metaclass implementations for the sake of academic interest:
class Singleton(type):
    instance = None

    def __call__(cls, *args, **kwargs):
        if cls.instance is None:
            cls.instance = super().__call__(*args, **kwargs)
        return cls.instance


class User(metaclass=Singleton):

    def __init__(self, name):
        self.name = name

    def __repr__(self):
        return f'<User: {self.name}>'


u1 = User('Pavel')
# the instance already exists, so the constructor is not called again
u2 = User('Stepan')

>>> id(u1) == id(u2)
True
>>> u2
<User: Pavel>
>>> User.instance
<User: Pavel>
# amusing, isn't it?
>>> u1.instance.instance.instance.instance
<User: Pavel>
This implementation has an interesting nuance: since the class constructor is not invoked a second time, you can pass the wrong arguments by mistake and nothing will happen at runtime if the instance has already been created. For example:
>>> User('Roman')
<User: Roman>
>>> User('Alexey', 'Petrovich', 66)  # wrong arguments, but no error!
<User: Roman>
# if the User instance had not been created yet,
# this call would have raised a TypeError!
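For comparison, the module-based approach mentioned above requires no metaprogramming at all: a module is imported and initialized only once per process, so any object created at module level is effectively a singleton. A minimal sketch, with hypothetical module and attribute names:

# settings.py (hypothetical module)
class _Settings:
    def __init__(self):
        self.debug = False


settings = _Settings()  # created once, at first import

# client code:
#   from settings import settings
# every importer gets the same _Settings instance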
And now let's look at an even more exotic option: forbidding inheritance from a particular class.
class FinalMeta(type):

    def __new__(mcs, name, bases, attrs):
        for cls in bases:
            if isinstance(cls, FinalMeta):
                raise TypeError(f"Can't inherit {name} class from final {cls.__name__}")
        return super().__new__(mcs, name, bases, attrs)


class A(metaclass=FinalMeta):
    """You cannot inherit from this class!"""
    pass


class B(A):  # TypeError: Can't inherit B class from final A
    pass
In the previous examples, we used metaclasses to customize the creation of classes, but you can go even further and begin to parameterize the behavior of the metaclasses.
For example, you can pass a function as the metaclass parameter when declaring a class and have it return instances of different metaclasses depending on some condition:
def get_meta(name, bases, attrs):
    if SOME_SETTING:
        return MetaClass1(name, bases, attrs)
    else:
        return MetaClass2(name, bases, attrs)


class A(metaclass=get_meta):
    pass
But a more interesting example is passing extra keyword arguments when declaring classes. Suppose you want to change the behavior of certain methods in a class with the help of a metaclass, but in each class these methods may have different names. What to do? Here is one way:
# the `timing` decorator is the same one used in the `DebugMeta` example
class DebugMetaParametrized(type):

    def __new__(mcs, name, bases, attrs, **extra_kwargs):
        debug_methods = extra_kwargs.get('debug_methods', ())
        for attr, value in attrs.items():
            # wrap only the methods listed in `debug_methods`:
            if attr in debug_methods:
                attrs[attr] = timing(value)
        return super().__new__(mcs, name, bases, attrs)


class User(metaclass=DebugMetaParametrized, debug_methods=('login', 'create')):
    ...


user = User('Oleg')
user.login()
# the "logout" call is not measured
user.logout()
user.create()
In my opinion, it turned out very elegant! You can think up quite a lot of patterns for using such parameterization, but remember the main rule - everything is good in moderation.
The __prepare__ method

Finally, I will talk about possible uses of the __prepare__ method. As mentioned above, this method must return a dictionary-like object that the interpreter fills in while reading the class body. For example, if __prepare__ returns an object d = dict(), then when reading the following class:
class A:
    x = 12
    y = 'abc'
    z = {1: 2}
The interpreter will perform the following operations:
d['x'] = 12
d['y'] = 'abc'
d['z'] = {1: 2}
There are several possible uses of this feature, all of varying usefulness:

- To preserve the order in which class attributes are declared, you can return a collections.OrderedDict from the __prepare__ method. However, in recent Python versions the built-in dict already preserves the order in which keys are added, so OrderedDict is no longer necessary (a minimal sketch is shown at the end of this section).
- In the enum library, a custom dict-like object is used to detect cases where a class attribute is duplicated during the declaration. The code can be found in the enum module source.
- Finally, __prepare__ can be used to overload methods by the type of their argument. For example, consider the following class with three implementations of the same polymorphic method:
class Terminator:

    def terminate(self, x: int):
        print(f'Terminating INTEGER {x}')

    def terminate(self, x: str):
        print(f'Terminating STRING {x}')

    def terminate(self, x: dict):
        print(f'Terminating DICTIONARY {x}')


t1000 = Terminator()
t1000.terminate(10)
t1000.terminate('Hello, world!')
t1000.terminate({'hello': 'world'})

# Terminating DICTIONARY 10
# Terminating DICTIONARY Hello, world!
# Terminating DICTIONARY {'hello': 'world'}
Obviously, the last declared terminate method overwrites the implementations of the first two, while we want the implementation to be chosen depending on the type of the argument passed. To achieve this, let's write a couple of additional wrapper objects:
class PolyDict(dict):
    """Dict-like object that collects all implementations
    of a method with the same name into a PolyMethod object.
    """

    def __setitem__(self, key: str, func):
        if not key.startswith('_'):
            if key not in self:
                super().__setitem__(key, PolyMethod())
            self[key].add_implementation(func)
            return None
        return super().__setitem__(key, func)


class PolyMethod:
    """Descriptor that stores a registry of implementations and
    dispatches calls by the type of the argument.
    Supports instance methods, staticmethods and classmethods.
    """

    def __init__(self):
        self.implementations = {}
        self.instance = None
        self.cls = None

    def __get__(self, instance, cls):
        self.instance = instance
        self.cls = cls
        return self

    def _get_callable_func(self, impl):
        # "unwrap" classmethod/staticmethod objects
        return getattr(impl, '__func__', impl)

    def __call__(self, arg):
        impl = self.implementations[type(arg)]
        callable_func = self._get_callable_func(impl)
        if isinstance(impl, staticmethod):
            return callable_func(arg)
        elif self.cls and isinstance(impl, classmethod):
            return callable_func(self.cls, arg)
        else:
            return callable_func(self.instance, arg)

    def add_implementation(self, func):
        callable_func = self._get_callable_func(func)
        # we assume each implementation has exactly one annotated argument
        arg_name, arg_type = list(callable_func.__annotations__.items())[0]
        self.implementations[arg_type] = func
The most interesting thing in the code above is the PolyMethod object, which stores a registry of implementations of the same method keyed by the type of the argument passed to it. We will return a PolyDict object from the __prepare__ method and thereby save the different implementations of the methods sharing the terminate name. The important point is that when reading the body of a class and filling the attrs object, the interpreter places so-called unbound functions there; these functions do not yet know which class or instance they will be called on. We had to implement the descriptor protocol to determine the context at call time and pass either self or cls as the first parameter, or pass nothing at all if a staticmethod is being called.
As a result, we will see the following magic:
class PolyMeta(type):

    @classmethod
    def __prepare__(mcs, name, bases):
        return PolyDict()


class Terminator(metaclass=PolyMeta):
    ...  # the class body with the three terminate implementations stays the same


t1000 = Terminator()
t1000.terminate(10)
t1000.terminate('Hello, world!')
t1000.terminate({'hello': 'world'})

# Terminating INTEGER 10
# Terminating STRING Hello, world!
# Terminating DICTIONARY {'hello': 'world'}

>>> t1000.terminate
<__main__.PolyMethod object at 0xdeadcafe>
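And, returning to the first item in the list above, here is a minimal sketch of returning collections.OrderedDict from __prepare__ to record the order of attribute declaration (only interesting on older interpreters; the OrderedMeta and Form names are made up for illustration):

import collections


class OrderedMeta(type):

    @classmethod
    def __prepare__(mcs, name, bases):
        return collections.OrderedDict()

    def __new__(mcs, name, bases, attrs):
        cls = super().__new__(mcs, name, bases, dict(attrs))
        # remember the order in which public attributes were declared
        cls._declaration_order = [key for key in attrs if not key.startswith('__')]
        return cls


class Form(metaclass=OrderedMeta):
    first_name = 'text'
    last_name = 'text'
    age = 'number'


# Form._declaration_order == ['first_name', 'last_name', 'age']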
If you know any other interesting uses of the __prepare__ method, please write in the comments.
Metaprogramming is one of the many topics covered in my Advanced Python intensive. As part of the course, I also explain how to effectively apply the SOLID and GRASP principles when developing large Python projects, how to design application architecture, and how to write high-performance, high-quality code. I would be glad to see you at the Binary District!
Source: https://habr.com/ru/post/422415/