If creating an object requires complex auxiliary objects, it is more expedient to pass them to __init__ ready-made as arguments, and to construct the object itself with a factory method. This separates the business logic from the technical details of object creation.

In the example below, __init__ accepts host and port as arguments and builds the database connection itself:

```python
class Query:
    def __init__(self, host, port):
        self._connection = Connection(host, port)
```

Compare it with a version where __init__ receives a ready connection and a factory method builds it:

```python
class Query:
    def __init__(self, connection):
        self._connection = connection

    @classmethod
    def create(cls, host, port):
        return cls(Connection(host, port))
```
Now the class is easy to test: just pass a fake connection with Query(FakeConnection()). Moreover, a connection can now be obtained not only from host and port, but also by cloning another connection, reading a configuration file, using the default connection, and so on, without any changes to __init__.
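To make the benefit concrete, here is a sketch under assumed names: Connection is a stand-in, and FakeConnection and Query.from_config are hypothetical, not from the article.

```python
# A sketch of how the factory-method version pays off. Connection is a
# stand-in, and FakeConnection / Query.from_config are hypothetical names
# used here for illustration only.
import json


class Connection:
    def __init__(self, host, port):
        self.host = host
        self.port = port

    def execute(self, sql):
        ...  # real network I/O would go here


class FakeConnection:
    """Test double that records queries instead of talking to a database."""
    def __init__(self):
        self.executed = []

    def execute(self, sql):
        self.executed.append(sql)


class Query:
    def __init__(self, connection):
        self._connection = connection

    @classmethod
    def create(cls, host, port):
        return cls(Connection(host, port))

    @classmethod
    def from_config(cls, path):
        # Yet another way to obtain a connection, still without touching __init__.
        with open(path) as f:
            cfg = json.load(f)
        return cls(Connection(cfg['host'], cfg['port']))

    def run(self, sql):
        self._connection.execute(sql)


# In a test, no real database is needed:
fake = FakeConnection()
Query(fake).run('SELECT 1')
assert fake.executed == ['SELECT 1']
```

The test above never opens a real connection, and adding new ways to build a connection later does not disturb existing call sites.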
The super() function allows you to refer to the base class. This is very useful when a derived class wants to add something to a method's implementation rather than redefine it completely:

```python
from unittest import TestCase

# create_db() and create_user() are assumed helpers from the surrounding test suite.
class BaseTestCase(TestCase):
    def setUp(self):
        self._db = create_db()

class UserTestCase(BaseTestCase):
    def setUp(self):
        super().setUp()
        self._user = create_user()
```

The name is misleading, though: super() does not always refer to the base class; it can just as easily return a completely different class (a sibling, in the example below). A better name would have been next(), since it returns the next class according to the MRO:

```python
class Top:
    def foo(self):
        return 'top'

class Left(Top):
    def foo(self):
        return super().foo()

class Right(Top):
    def foo(self):
        return 'right'

class Bottom(Left, Right):
    pass

# prints 'right'
print(Bottom().foo())
```

super() can even produce different results depending on the class of the instance the method was originally called on:

```
>>> Bottom().foo()
'right'
>>> Left().foo()
'top'
```
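One way to see why is to inspect the MRO directly: the class that follows Left in Bottom's MRO is Right, so super() inside Left.foo resolves to Right rather than Top.

```python
# Bottom's MRO determines what super() returns inside Left.foo:
# the class that follows Left is Right, not Top.
print(Bottom.__mro__)
# (<class '__main__.Bottom'>, <class '__main__.Left'>,
#  <class '__main__.Right'>, <class '__main__.Top'>, <class 'object'>)
```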
The body of a class is executed in its own namespace, and this namespace (essentially what locals() returns inside the body) is used by the metaclass (by default it is type) to create the class object:

```python
class Meta(type):
    def __new__(meta, name, bases, ns):
        print(ns)
        return super().__new__(
            meta, name, bases, ns
        )

class Foo(metaclass=Meta):
    B = 2
    B = 3
```

The output is {'__module__': '__main__', '__qualname__': 'Foo', 'B': 3}.

Although the body assigns both B = 2 and B = 3, the metaclass sees only B = 3, because only this value remains in ns. This limitation stems from the fact that the metaclass begins to work only after the body has been executed.
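Since the metaclass receives nothing more than a name, bases, and that namespace, the same class object can be built by calling type directly; a minimal sketch of what effectively happens once the body has run:

```python
# Building the class "by hand": name, bases, and the namespace dict
# that the class body would have produced.
Foo = type('Foo', (), {'B': 3})
print(Foo.B)  # 3
```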
You can intervene earlier by supplying a custom namespace object with the __prepare__ method:

```python
class CustomNamespace(dict):
    def __setitem__(self, key, value):
        print(f'{key} -> {value}')
        return super().__setitem__(key, value)

class Meta(type):
    def __new__(meta, name, bases, ns):
        return super().__new__(
            meta, name, bases, ns
        )

    @classmethod
    def __prepare__(metacls, cls, bases):
        return CustomNamespace()

class Foo(metaclass=Meta):
    B = 2
    B = 3
```

The output is:

```
__module__ -> __main__
__qualname__ -> Foo
B -> 2
B -> 3
```

This is how, for example, the members of enum.Enum are protected from duplication.
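A minimal sketch of that idea (not enum's actual implementation): a namespace returned from __prepare__ can refuse to let the same name be bound twice in the class body.

```python
# A class-body namespace that forbids redefining a name.
class OneShotNamespace(dict):
    def __setitem__(self, key, value):
        if key in self:
            raise TypeError(f'{key!r} is already defined')
        super().__setitem__(key, value)


class NoDuplicatesMeta(type):
    @classmethod
    def __prepare__(metacls, name, bases):
        # The returned mapping is used while the class body executes.
        return OneShotNamespace()


class Color(metaclass=NoDuplicatesMeta):
    RED = 1
    GREEN = 2
    # RED = 3  # uncommenting this line raises TypeError at class creation
```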
matplotlib is a complex and flexible Python plotting library. It is supported by many products, including Jupyter and PyCharm. Here is an example of drawing a simple fractal with matplotlib: https://repl.it/@VadimPushtaev/myplotlib (see the title picture of this publication).
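For a self-contained taste of the same idea, here is a rough sketch (not the code behind that repl.it link) that renders a Mandelbrot-style fractal with nothing but NumPy and imshow:

```python
import numpy as np
import matplotlib.pyplot as plt

def mandelbrot(width=400, height=400, max_iter=50):
    # Complex grid covering the classic Mandelbrot viewport.
    xs = np.linspace(-2.0, 1.0, width)
    ys = np.linspace(-1.5, 1.5, height)
    c = xs[np.newaxis, :] + 1j * ys[:, np.newaxis]
    z = np.zeros_like(c)
    counts = np.zeros(c.shape, dtype=int)
    for i in range(max_iter):
        mask = np.abs(z) <= 2           # points that have not escaped yet
        z[mask] = z[mask] ** 2 + c[mask]
        counts[mask] = i                # remember the last iteration survived
    return counts

plt.imshow(mandelbrot(), cmap='viridis')
plt.axis('off')
plt.show()  # or plt.savefig('fractal.png')
```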
Python ships with the datetime library for working with dates and times. Curiously, datetime objects have a dedicated interface for time zone support (namely, the tzinfo attribute), but the module itself supports that interface only in a limited way, so some of the work is left to other modules.

The most popular of them is pytz. The catch is that pytz does not fully conform to the tzinfo interface. This is stated at the very beginning of the pytz documentation: "This library differs from the documented Python API for tzinfo implementations."

You cannot simply pass pytz timezone objects as tzinfo. If you try, you risk getting a completely insane result:

```
In : paris = pytz.timezone('Europe/Paris')
In : str(datetime(2017, 1, 1, tzinfo=paris))
Out: '2017-01-01 00:00:00+00:09'
```
Use the localize method instead:

```
In : str(paris.localize(datetime(2017, 1, 1)))
Out: '2017-01-01 00:00:00+01:00'
```
Also, after arithmetic, remember to apply normalize to your datetime objects, in case the offset needs to be corrected (for example, when the arithmetic crosses a DST boundary):

```
In : new_time = time + timedelta(days=2)
In : str(new_time)
Out: '2018-03-27 00:00:00+01:00'
In : str(paris.normalize(new_time))
Out: '2018-03-27 01:00:00+02:00'
```
Consider using dateutil.tz instead of pytz. It is fully compatible with tzinfo, can be passed directly as that attribute, and does not require normalize. The downside is that it works more slowly.
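A minimal sketch of the dateutil.tz approach, using the same Paris time zone as above:

```python
from datetime import datetime
from dateutil import tz

paris = tz.gettz('Europe/Paris')

# Unlike pytz, a dateutil timezone can be passed straight to tzinfo,
# and no normalize step is needed after arithmetic.
print(datetime(2017, 1, 1, tzinfo=paris))  # 2017-01-01 00:00:00+01:00
```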
If you want to know why pytz does not support the datetime API, or want to see more examples, read this article.
Each time next(x) is called, it returns a new value from the iterator x until an exception is thrown. If that exception is StopIteration, the iterator is exhausted and can no longer provide values. A generator that runs off the end of its body raises StopIteration automatically:

```
>>> def one_two():
...     yield 1
...     yield 2
...
>>> i = one_two()
>>> next(i)
1
>>> next(i)
2
>>> next(i)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
StopIteration
```
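The same protocol applies to hand-written iterators; a minimal sketch (the Countdown class is illustrative, not from the article):

```python
class Countdown:
    """Iterator that yields n, n-1, ..., 1 and then stops."""
    def __init__(self, n):
        self._current = n

    def __iter__(self):
        return self

    def __next__(self):
        if self._current <= 0:
            raise StopIteration        # the exhaustion signal next() relies on
        value = self._current
        self._current -= 1
        return value

print(list(Countdown(3)))  # [3, 2, 1]
```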
StopIteration is handled automatically by the tools that call next for you:

```
>>> list(one_two())
[1, 2]
```
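Under the hood, such tools do roughly the following; a sketch of the protocol, not the actual implementation of list():

```python
def collect(iterable):
    # Roughly what list() does with an arbitrary iterable.
    iterator = iter(iterable)
    result = []
    while True:
        try:
            result.append(next(iterator))
        except StopIteration:
            return result

print(collect(one_two()))  # [1, 2]
```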
But a StopIteration raised unexpectedly inside a generator can bite you:

```python
def one_two():
    yield 1
    yield 2

def one_two_repeat(n):
    for _ in range(n):
        i = one_two()
        yield next(i)
        yield next(i)
        yield next(i)

print(list(one_two_repeat(3)))
```
The third yield here is an error: the StopIteration raised there stops the iteration that list(...) performs, so the result is [1, 2]. However, in Python 3.7 this behavior changed: a StopIteration leaking out of a generator is now replaced by a RuntimeError:

```
Traceback (most recent call last):
  File "test.py", line 10, in one_two_repeat
    yield next(i)
StopIteration

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "test.py", line 12, in <module>
    print(list(one_two_repeat(3)))
RuntimeError: generator raised StopIteration
```
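One way to avoid the problem altogether, assuming the intent was to repeat the 1, 2 sequence n times (a sketch, not from the article), is to let yield from drive the inner generator instead of calling next by hand:

```python
def one_two_repeat(n):
    for _ in range(n):
        # yield from exhausts the inner generator and stops cleanly,
        # so no StopIteration can leak out of this generator.
        yield from one_two()

print(list(one_two_repeat(3)))  # [1, 2, 1, 2, 1, 2]
```

If you do need to call next manually inside a generator, wrap it in try/except StopIteration and return instead of letting the exception escape.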
You can use from __future__ import generator_stop to enable the same behavior starting with Python 3.5.

Source: https://habr.com/ru/post/422789/