
Selection @pythonetc, September 2018



This is the fourth selection of tips about Python and programming from my @pythonetc channel.








Override and Overload



There are two concepts that are easily confused: overriding and overloading.



Overriding happens when a child class defines a method already provided by a parent class, thereby replacing it. Some languages require the overriding method to be marked explicitly (the override modifier in C#), while in others the mark is optional (the @Override annotation in Java). Python neither requires a special modifier nor provides a standard way to tag such methods (some people define a custom no-op @override decorator purely for readability).
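A minimal sketch of overriding, with the hypothetical no-op @override decorator mentioned above (the class names are made up for illustration):

```python
def override(method):
    # A no-op decorator used purely for readability.
    return method


class Animal:
    def sound(self):
        return 'generic sound'


class Dog(Animal):
    @override
    def sound(self):
        # Replaces the parent implementation.
        return 'woof'


print(Animal().sound())  # generic sound
print(Dog().sound())     # woof
```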



Overloading is a different story. The term refers to a situation where several functions share the same name but have different signatures. Overloading is possible in Java and C++, where it is often used to provide default arguments:



    class Foo {
        public static void main(String[] args) {
            System.out.println(Hello());
        }

        public static String Hello() {
            return Hello("world");
        }

        public static String Hello(String name) {
            return "Hello, " + name;
        }
    }


Python dispatches functions by name only, not by signature. You can, of course, write code that explicitly inspects the types and number of arguments, but it looks awkward and this practice is best avoided:



    def quadrilateral_area(*args):
        if len(args) == 4:
            quadrilateral = Quadrilateral(*args)
        elif len(args) == 1:
            quadrilateral = args[0]
        else:
            raise TypeError()
        return quadrilateral.area()


If you need type hints for such a function, use the @overload decorator from the typing module:



    from typing import overload

    @overload
    def quadrilateral_area(q: Quadrilateral) -> float: ...

    @overload
    def quadrilateral_area(
        p1: Point, p2: Point, p3: Point, p4: Point
    ) -> float: ...
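Note that @overload stubs describe signatures for the type checker only; at runtime, a single real implementation must follow them. A minimal self-contained sketch (the Point and Quadrilateral classes here are assumptions made up for illustration):

```python
from typing import overload


class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y


class Quadrilateral:
    def __init__(self, p1, p2, p3, p4):
        self.points = (p1, p2, p3, p4)

    def area(self):
        # Shoelace formula over the four vertices.
        ps = self.points
        s = 0.0
        for a, b in zip(ps, ps[1:] + ps[:1]):
            s += a.x * b.y - b.x * a.y
        return abs(s) / 2


@overload
def quadrilateral_area(q: Quadrilateral) -> float: ...
@overload
def quadrilateral_area(p1: Point, p2: Point, p3: Point, p4: Point) -> float: ...


def quadrilateral_area(*args):
    # The single runtime implementation behind both overload stubs.
    if len(args) == 1:
        quadrilateral = args[0]
    else:
        quadrilateral = Quadrilateral(*args)
    return quadrilateral.area()


square = Quadrilateral(Point(0, 0), Point(1, 0), Point(1, 1), Point(0, 1))
print(quadrilateral_area(square))  # 1.0
```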


Autovivification



collections.defaultdict lets you create a dictionary that returns a default value when the requested key is missing (instead of raising a KeyError). To create a defaultdict you provide not a default value itself, but a factory that produces such values.



This makes it possible to create a dictionary with a virtually infinite number of nested dictionaries, supporting constructions like d[a][b][c]...[z] .



    >>> from collections import defaultdict
    >>> def infinite_dict():
    ...     return defaultdict(infinite_dict)
    ...
    >>> d = infinite_dict()
    >>> d[1][2][3][4] = 10
    >>> dict(d[1][2][3][5])
    {}


This behavior is called “autovivification,” a term that comes from Perl.



Instantiation



Instantiating an object involves two important steps. First, the class's __new__ method is called; it creates and returns a new object. Then Python calls __init__ on that object, which sets up its initial state.
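A minimal sketch of that order (the class name is made up for illustration):

```python
class Spam:
    def __new__(cls, *args):
        # Called first: creates and returns the new object.
        print('__new__ called')
        return super().__new__(cls)

    def __init__(self, value):
        # Called second, on the object __new__ returned.
        print('__init__ called')
        self.value = value


s = Spam(42)
# __new__ called
# __init__ called
print(s.value)  # 42
```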



However, __init__ is not called if __new__ returns an object that is not an instance of the original class. In that case, the object was presumably created by some other class, and so __init__ has already been called on it:



    class Foo:
        def __new__(cls, x):
            return dict(x=x)

        def __init__(self, x):
            print(x)  # Never called

    print(Foo(0))


This also means that __new__ should not create instances of the same class via the normal constructor ( Foo(...) ). Doing so can lead to __init__ running twice, or even to infinite recursion.



Infinite recursion:



    class Foo:
        def __new__(cls, x):
            return Foo(-x)  # Recursion


Double execution of __init__ :



    class Foo:
        def __new__(cls, x):
            if x < 0:
                return Foo(-x)
            return super().__new__(cls)

        def __init__(self, x):
            print(x)
            self._x = x


The right way:



    class Foo:
        def __new__(cls, x):
            if x < 0:
                return cls.__new__(cls, -x)
            return super().__new__(cls)

        def __init__(self, x):
            print(x)
            self._x = x


[] Operator and slices



In Python, you can override the [] operator by defining the __getitem__ magic method. For example, you can create an object that virtually contains an infinite number of repeating elements:



    class Cycle:
        def __init__(self, lst):
            self._lst = lst

        def __getitem__(self, index):
            return self._lst[index % len(self._lst)]

    print(Cycle(['a', 'b', 'c'])[100])  # 'b'


The unusual thing here is that the [] operator supports a special syntax. You can write not only [2] , but also [2:10] , [2:10:2] , [2::2] , and even [:] . The intended semantics are [start:stop:step], but in your own objects you are free to interpret the syntax any way you like.



But if __getitem__ is called with this syntax, what does it receive as the index parameter? That is exactly what slice objects are for.



    In : class Inspector:
    ...:     def __getitem__(self, index):
    ...:         print(index)
    ...:

    In : Inspector()[1]
    1

    In : Inspector()[1:2]
    slice(1, 2, None)

    In : Inspector()[1:2:3]
    slice(1, 2, 3)

    In : Inspector()[:]
    slice(None, None, None)


You can even combine tuple and slice syntax:



    In : Inspector()[:, 0, :]
    (slice(None, None, None), 0, slice(None, None, None))


A slice object does nothing by itself; it merely stores the start , stop and step attributes.



    In : s = slice(1, 2, 3)

    In : s.start
    Out: 1

    In : s.stop
    Out: 2

    In : s.step
    Out: 3
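One genuinely useful method a slice does provide is indices(): given a sequence length, it returns a concrete (start, stop, step) triple with None values and negative indices resolved, which is handy when implementing __getitem__ yourself. A short sketch:

```python
# None bounds are resolved against the given length.
print(slice(None, None, 2).indices(10))   # (0, 10, 2)

# Negative indices are translated to absolute positions.
print(slice(-3, None, None).indices(10))  # (7, 10, 1)

# range(*s.indices(len(seq))) yields exactly the indices the slice selects.
seq = list('abcdefghij')
s = slice(1, None, 3)
print([seq[i] for i in range(*s.indices(len(seq)))])  # ['b', 'e', 'h']
print(seq[1::3])                                      # ['b', 'e', 'h']
```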


Asyncio interruption



Any coroutine running as an asyncio task can be interrupted using the task's cancel() method. A CancelledError is then raised inside the coroutine; as a result, it and all the coroutines awaiting it are interrupted, until the error is caught and suppressed somewhere along the chain.



CancelledError is a subclass of Exception , which means you can accidentally catch it with a try ... except Exception block intended to catch "any error." To catch errors safely inside a coroutine without swallowing cancellation, re-raise it explicitly:



    try:
        await action()
    except asyncio.CancelledError:
        raise
    except Exception:
        logging.exception('action failed')
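A minimal, self-contained sketch of cancellation in action (action here is a made-up coroutine for illustration):

```python
import asyncio


async def action():
    try:
        await asyncio.sleep(10)
    except asyncio.CancelledError:
        print('cancelled, cleaning up')
        raise  # re-raise so the task really ends up cancelled


async def main():
    task = asyncio.ensure_future(action())
    await asyncio.sleep(0)  # give the task a chance to start
    task.cancel()
    try:
        await task
    except asyncio.CancelledError:
        print('the task was cancelled')
    return task


task = asyncio.get_event_loop().run_until_complete(main())
print(task.cancelled())  # True
```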


Scheduling execution



To schedule code to run at a certain time in asyncio, one usually creates a task that performs await asyncio.sleep(x) :



    import asyncio

    async def do(n=0):
        print(n)
        await asyncio.sleep(1)
        loop.create_task(do(n + 1))
        loop.create_task(do(n + 1))

    loop = asyncio.get_event_loop()
    loop.create_task(do())
    loop.run_forever()


But creating a new task can be expensive, and there is no need for one if no asynchronous operations are performed (like in the do function in my example). Instead, you can use loop.call_later and loop.call_at , which let you schedule a plain callback:



    import asyncio

    def do(n=0):
        print(n)
        loop = asyncio.get_event_loop()
        loop.call_later(1, do, n + 1)
        loop.call_later(1, do, n + 1)

    loop = asyncio.get_event_loop()
    do()
    loop.run_forever()
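loop.call_at works the same way but takes an absolute timestamp on the loop's internal clock ( loop.time() ), which avoids drift when rescheduling periodically. A sketch under the assumption that a short period and a stop after a few ticks are acceptable for demonstration:

```python
import asyncio

loop = asyncio.get_event_loop()
period = 0.1            # short period so the example finishes quickly
next_run = loop.time()  # absolute deadline on the loop's clock
ticks = []


def tick(n=0):
    global next_run
    ticks.append(n)
    print(n)
    if n < 3:
        # Advancing an absolute deadline means scheduling
        # drift does not accumulate from tick to tick.
        next_run += period
        loop.call_at(next_run, tick, n + 1)
    else:
        loop.stop()


tick()
loop.run_forever()
print(ticks)  # [0, 1, 2, 3]
```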



Source: https://habr.com/ru/post/425125/


