[bitc-dev] Type Classes for BitC
Mark P Jones
mpj at cse.ogi.edu
Fri Jul 22 01:32:56 EDT 2005
| Let me give a more serious case.
| (define counter
| (let (c (mutable 10)) (lambda (x) (begin (set! c (+ c x)) c))))
| with the intention that I should get various types (int 16,
| 32, 64 etc)
| of skip counters.
| However, the type of counter is (somewhat) _Num -> _Num and
| NOT Num->Num
| because of the value restriction. So we will have to define a new
| counter for every integer type. I don't think this is a very rare
| usage pattern.
I'm not sure what you are getting at with this example. Here are
some of the things that have me puzzled:
- Suppose that you didn't have overloading: then you would
also need to define a different counter for each different
type. What is different here (or why would you expect it
to be different)?
- Were you wanting a single counter that could accept input values
(x in the above) and return results with different types in
different calls? (I don't see how that would work.)
- Wouldn't a more likely use case be that I wanted to have multiple
(i.e., distinct) counters, possibly with the same type in each case?
In short, the example above seems to demonstrate exactly the behavior
I would both want and expect ...
One way to deal with an example like this would be to capture the
counter pattern in a definition like the following:
(define (makeCounter init)
  (let (c (mutable init))
    (lambda (x) (begin (set! c (+ c x)) c))))
(Please forgive syntactic snafus on my part: my Scheme is very rusty
and I'm not up to speed with all the details of BitC syntax ...)
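For comparison, here is a sketch of the same counter-generator pattern in Rust, where a generic function plays the role of the class-constrained makeCounter: each call captures its own mutable cell, and each use at a concrete numeric type gets its own monomorphized instance. (make_counter and the AddAssign/Copy bounds are my own illustration, not anything in BitC.)

```rust
use std::ops::AddAssign;

// Generic counter generator, analogous to
// (let (c (mutable init)) (lambda (x) ...)) above:
// each call creates a fresh mutable cell c.
fn make_counter<T: AddAssign + Copy>(init: T) -> impl FnMut(T) -> T {
    let mut c = init;
    move |x| {
        c += x; // (set! c (+ c x))
        c
    }
}

fn main() {
    let mut skip = make_counter(10i32);
    assert_eq!(skip(5), 15); // state persists across calls
    assert_eq!(skip(5), 20);
    let mut big = make_counter(0i64); // a second, distinct counter
    assert_eq!(big(7), 7);
}
```

Note that make_counter itself stays generic; only the individual counters built from it are pinned to a single integer type, which is the behavior the quoted example exhibits.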
In qualified types notation, the type of makeCounter might be written:
makeCounter :: forall a. Num a => a -> (a -> a)
And now you might define a bunch of counters as follows:
(define aCounter (makeCounter 0:int32))
(define anotherCounter (makeCounter 2:int32))
(define microCounter (makeCounter 0:int64))
(define countDracula (makeCounter 0))
This defines a total of four distinct counters. The first two
have type int32 -> int32, while the third has type int64 -> int64.
The fourth is more interesting (assuming that you have overloaded
numeric literals as in Haskell and hobbit), because it gets a type
a -> a (there is no forall quantifier here) that doesn't specify
a particular choice for the type a, although it does add a global
predicate/class constraint of the form (Num a) to the environment.
Only later, when you write a function that uses countDracula in a
specific context, will you fix the type of a. For example:
(define (multicount x)
(+ (countDracula x) (aCounter x)))
; now we can see that countDracula must have type int32 -> int32
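Rust's integer-literal inference behaves analogously, so the countDracula situation can be sketched there too: a counter built from a bare literal has no fixed integer type until a later use alongside a typed counter pins it down. (make_counter is again my own sketch, not part of BitC.)

```rust
use std::ops::AddAssign;

fn make_counter<T: AddAssign + Copy>(init: T) -> impl FnMut(T) -> T {
    let mut c = init;
    move |x| { c += x; c }
}

fn main() {
    let mut a_counter = make_counter(0i32);
    // The literal 0 here has no concrete integer type yet...
    let mut count_dracula = make_counter(0);
    // ...until this expression forces it to i32, much as multicount
    // fixes countDracula's type in the example above.
    let total = count_dracula(5) + a_counter(5);
    assert_eq!(total, 10);
}
```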
All the best,
PS. What I've said here assumes the ML/Scheme approach to mutation
and side effects; you probably know already, but Haskell takes a
very different approach based on "monads" ... maybe we'll get to
that later, but it's really a separate topic.