How does ML perform type inference in the following function definition?
let add a b = a + b
Is it like C++ templates, where no type checking is performed until the point of template instantiation, after which the function compiles if the type supports the necessary operations and a compilation error is thrown otherwise?
For example, the following function template
template <typename NumType>
NumType add(NumType a, NumType b) {
    return a + b;
}
will work for
add<int>(23, 11);
but won’t work for
add<ostream>(cout, fout);
Is my guess correct, or does ML type inference work differently?
PS: Sorry for my poor English; it’s not my native language.
I suggest you have a look at this article: What is Hindley-Milner? (and why is it cool)
Here is the simplest example they use to explain type inference (it’s not ML, but the idea is the same):
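(The article's own snippet is not reproduced here. As a stand-in, here is a hedged OCaml analogue of a function `bar` with the type described below; the helper `foo` is hypothetical and only there to force the types.)

```ocaml
(* Hypothetical OCaml analogue of the article's example: bar carries no
   type annotations, yet its type is fully determined by its body. *)
let foo (s : string) = String.length s  (* foo : string -> int *)
let bar s i = foo s + i                 (* inferred: bar : string -> int -> int *)

let () =
  assert (bar "hello" 2 = 7)
```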
Just looking at the definition of bar, we can easily see that its type must be (String, Int)=>Int. That’s type inference in a nutshell. Read the whole article for more information and examples.
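To connect this back to the question's definition: in OCaml, for instance, `let add a b = a + b` is inferred to have one concrete type, because `(+)` there is integer addition. The function is type-checked once at its definition, not at each use as with templates. A small sketch:

```ocaml
(* OCaml infers a single, concrete type for add from the use of (+),
   which operates only on int. *)
let add a b = a + b
(* inferred: val add : int -> int -> int *)

let () =
  assert (add 23 11 = 34)
(* add 2.5 3.5 would be rejected at compile time: float addition
   is a separate operator, (+.), so this add is not generic. *)
```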
I’m not a C++ expert, but I think templates are a different mechanism: they are closer to genericity/parametricity, which is not the same thing as type inference.
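For contrast, a hedged OCaml sketch of genericity in ML: when a function's body places no constraints on an argument, inference gives it a polymorphic type, and the function is type-checked once, for all instantiations, rather than re-checked at each instantiation the way a template is.

```ocaml
(* swap constrains nothing about its arguments, so inference yields
   a polymorphic type: val swap : 'a * 'b -> 'b * 'a *)
let swap (x, y) = (y, x)

let () =
  assert (swap (1, "one") = ("one", 1));
  assert (swap (true, 2.5) = (2.5, true))
```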