def foo(s: String) = s.length
def bar(x, y) = foo(x) + y
Intuitively, looking at bar, we can say that its type should be (String, Int)=>Int. It is not difficult to derive this. We simply look at the function body and see the two ways in which the x and y parameters are used. x is passed to foo, which expects a String. Therefore, x must be of type String for this code to compile. Moreover, foo returns an Int. The + method of class Int expects an Int parameter, therefore y must also be of type Int. Finally, we know that + returns Int, which means that this is the type returned by bar.

Hindley-Milner reaches the same conclusion by deriving constraints from the way values are used; for example, "foo accepts a parameter of type String" is one such constraint. Once a set of constraints has been built, the algorithm completes type inference by unifying these constraints. If the expression is well-typed, the constraints will ultimately resolve into a single unambiguous type. Otherwise, one (or several) of the constraints will conflict with each other or be unsatisfiable with the given set of types.
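For a quick illustration of conflicting constraints, here is an example of my own, written in the same annotation-free pseudo-syntax as bar and using the article's simplified view of +:

def bad(x) = foo(x) + x

Passing x to foo requires x to be a String, while passing x as the argument of Int's + method requires it to be an Int. The two constraints cannot be unified, so the definition is rejected as ill-typed.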
The algorithm starts by assigning each value (x and y) a new, as yet "non-existent" type. If we rewrote bar with these types, it would look something like this [1]:

def bar(x: X, y: Y) = foo(x) + y
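These "non-existent" types are what type-theory texts usually call fresh type variables. As a minimal sketch of how an inference engine might represent them (every name here, Type, TVar, TCon, TFun, is my own illustration, not a real compiler API):

sealed trait Type
case class TVar(name: String) extends Type          // a type variable such as X or Y
case class TCon(name: String) extends Type          // a concrete type such as String or Int
case class TFun(from: Type, to: Type) extends Type  // a function type such as String => Int

val X = TVar("X")  // the unknown type assigned to x
val Y = TVar("Y")  // the unknown type assigned to y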
The first constraint follows from the fact that we pass x to the foo method. We know that the type of foo is String=>Int, and this allows us to write the constraint:

X ↦ String

The second constraint comes from the call to the + method with the value y. Scala represents all operators as method calls, so this expression actually means "foo(x).+(y)". We already know that foo(x) is an expression of type Int (from the return type of foo), and we also know that + is defined as a method on class Int with type Int=>Int (strictly speaking, this glosses over some Scala-specific magic). This allows us to add one more constraint to our set:

X ↦ String
Y ↦ Int
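Continuing the toy representation sketched above (barConstraints is, again, my own illustrative name), these two constraints form a simple substitution map:

val barConstraints: Map[TVar, Type] = Map(
  X -> TCon("String"),  // X ↦ String
  Y -> TCon("Int")      // Y ↦ Int
)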
We now have a mapping for both X and Y. Unification, literally, is the process of walking through all the constraints in an attempt to find a single type that satisfies them all. Take the facts we have just derived: X ↦ String and Y ↦ Int. In this case the constraints are already unified: there is exactly one mapping for each type variable (X and Y), and both of them map to concrete types. The only thing left to do is to substitute "String" for "X" and "Int" for "Y".

Now let's look at a slightly more interesting example:

def baz(a, b) = a(b) :: b
Here a baz function is declared that takes as parameters a function and some other value; it calls that function on the second parameter and then prepends the result onto that same parameter. It is easy to derive a number of constraints for this function. As before, we start by creating new types for each value:

def baz(a: X, b: Y): Z = a(b) :: b
Looking at the body, we see that a must be a function that takes a value of type Y and returns a value of some new type Y'. Moreover, we know that :: [2] is a function on the List[A] class that takes a new element A and returns a new List[A]. Thus, we know that Y and Z must both be of type List[Y']. Written formally, the constraints are:

X ↦ (Y => Y')
Y ↦ List[Y']
Z ↦ List[Y']
Notice that the mapping for X depends on Y, which means that at least one more substitution step is required:

X ↦ (List[Y'] => Y')
Y ↦ List[Y']
Z ↦ List[Y']
All we did here was substitute the mapping for Y into the mapping for X. Using this substitution, we can eliminate X, Y and Z from the derived types, arriving at the following typed version of the baz function:

def baz(a: List[Y']=>Y', b: List[Y']): List[Y'] = a(b) :: b
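To make that substitution step concrete, here is a rough sketch that continues the toy representation from earlier (resolve and bazConstraints are my own illustrative names; List[Y'] is abbreviated as a plain TCon, and a real unifier would also need an occurs check):

// Follow known mappings until no more type variables can be resolved.
def resolve(t: Type, subst: Map[TVar, Type]): Type = t match {
  case v: TVar        => subst.get(v).map(resolve(_, subst)).getOrElse(v)  // follow a known mapping, if any
  case c: TCon        => c                                                 // concrete types stay as they are
  case TFun(from, to) => TFun(resolve(from, subst), resolve(to, subst))    // resolve both sides of a function type
}

// The constraints derived for baz, written with the toy model.
val bazConstraints: Map[TVar, Type] = Map(
  TVar("X") -> TFun(TVar("Y"), TVar("Y'")),  // X ↦ (Y => Y')
  TVar("Y") -> TCon("List[Y']"),             // Y ↦ List[Y']
  TVar("Z") -> TCon("List[Y']")              // Z ↦ List[Y']
)

// resolve(TVar("X"), bazConstraints) yields TFun(TCon("List[Y']"), TVar("Y'")),
// i.e. X ↦ (List[Y'] => Y'), which is exactly the extra substitution step above.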
Even if the notation Y' were syntactically correct, we would still have a problem: no such type actually exists. This happens quite often when working with the Hindley-Milner type inference algorithm: after deriving all the constraints and performing unification, we are left with "residual" type variables for which no constraints remain. The solution is to turn such residual type variables into type parameters. Here is the final version of the baz function, where we added an unbounded type parameter "A" and replaced all occurrences of Y' in the inferred types with it:

def baz[A](a: List[A]=>A, b: List[A]): List[A] = a(b) :: b
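As a quick sanity check of the inferred signature, here is a small usage example of my own:

val result = baz[Int](xs => xs.sum, List(1, 2, 3))
// a(b) evaluates to 6, so the result is 6 :: List(1, 2, 3), that is, List(6, 1, 2, 3)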
[1] The "bar" example was deliberately written in a Ruby-like syntax to show that even this much information is sufficient for type checking. This is worth remembering the next time someone claims that all statically typed languages are too verbose.

[2] "::" is the so-called cons operator in Scala. It prepends the element on its left to the list on its right.

Source: https://habr.com/ru/post/125250/