An `Addable` trait?

Previously the code worked with a deprecation warning:

implicit conformance of Int to ValueT

from collections import Set

alias ValueT = KeyElement

fn test[T: ValueT](a: T, b: T):
	s = Set[T]() # T is KeyElement

	c = a + b # T has __add__

fn main():
	test[Int](1, 2)
	test[Float64](1.0, 2.0)

Int implicitly conformed to ValueT, and a + b compiled because Int has an __add__ method.

But after a pixi update, it no longer works. I understand why, but how do I solve this now?

I tried defining an Addable trait myself, but Int does not conform to it:

from collections import Set

trait Addable:
	fn __add__(self, other: Self) -> Self:
		...

alias ValueT = KeyElement & Addable

fn test[T: ValueT](a: T, b: T):
	s = Set[T]() # T is KeyElement

	c = a + b # T is Addable

fn main():
	test[Int](1, 2)

I’d expect there to be a built-in Addable trait, or an Arithmetic trait that includes __add__, __sub__, etc., and I’d expect the built-in Int and Float64 to implement them. But there isn’t any such trait.

How can I solve this at compile time? I don’t want to use Variant, for performance reasons.
Thanks.
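
(Not an answer from the thread, just a sketch of one possible compile-time workaround: instead of requiring a trait, the addition can be passed as a compile-time function parameter. The names test_add and add_ints are made up here, and the exact syntax assumes a recent Mojo toolchain.)

from collections import Set

# Sketch of one possible workaround (not from this thread): pass the
# binary operation as a compile-time parameter instead of a trait bound.
fn test_add[T: KeyElement, add: fn (T, T) -> T](a: T, b: T):
    var s = Set[T]()   # T is KeyElement, so Set[T] works
    var c = add(a, b)  # addition is supplied as a compile-time parameter
    s.add(c)
    print(len(s))

fn add_ints(a: Int, b: Int) -> Int:
    return a + b

fn main():
    test_add[Int, add_ints](1, 2)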

There is a built-in Addable trait and Int conforms to it. I am AFK, but I think you just need to delete your own trait definition and it should work. If not, search the docs for Addable and import it from wherever it lives.

Correction. I didn’t find it either. This looks like a bug to me and I would vote for filing an issue if no one here has a better answer.

I found this: [Feature Request] Improve ergonomics for writing functions generic across integral scalars. · Issue #2776 · modular/modular · GitHub
Looks like there isn’t such a trait yet.

Unfortunately, given that we have now removed implicit trait conformance, there’s no good way to make an existing type conform to a new trait.
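
To make this concrete, a minimal sketch (assuming a recent Mojo toolchain; MyValue is a made-up example type): explicit conformance is easy to declare for a type you define yourself, but there is nothing equivalent you can write for the existing stdlib Int.

trait Addable:
    fn __add__(self, other: Self) -> Self:
        ...

# A type you own can opt in to the new trait explicitly...
struct MyValue(Addable):
    var value: Int

    fn __init__(out self, value: Int):
        self.value = value

    fn __add__(self, other: Self) -> Self:
        return Self(self.value + other.value)

fn main():
    var c = MyValue(1) + MyValue(2)
    print(c.value)  # prints 3

# ...but there is no declaration that adds the same conformance to the
# existing Int, now that implicit conformance is gone.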

It is just weird that the compiler knows about Addable if this trait does not exist somewhere. I mean, this should not have to be a new trait!

I’m not sure I understand what you mean. I meant that it’s currently impossible to add trait conformance retroactively without implicit trait conformance, so Int does not conform to Addable.

Well, there needs to be a trait for structs that have __add__. This is what the compiler complains about in the example above. This should be a built-in trait, as adding two values is a basic feature. However, I don’t see this in the standard library.

An Addable trait is blocked on parametric traits. It would ideally look like:

trait Addable[T: AnyType]:
    alias Output: AnyType
    fn __add__(self, rhs: T) -> Output:
        ...

This is necessary so one can express, e.g., String + StringSlice = String and String + String = String.
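
For illustration only, in the same hypothetical syntax as the sketch above (none of this compiles today), String could then conform to the trait for more than one right-hand-side type:

# Hypothetical, assuming parametric traits as sketched above:
# struct String(Addable[String], Addable[StringSlice]):
#     alias Output = String
#     fn __add__(self, rhs: String) -> String: ...
#     fn __add__(self, rhs: StringSlice) -> String: ...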

Ah, this makes sense, thanks. But this becomes a pressing issue now with the requirement of explicit trait conformance 🙂.

I agree, though you already currently can’t express

fn foo[U: AnyType, W: AnyType, T: Addable[U]](
    a: T, b: U
) -> W
    requires T.Output == W:
    return a + b

with our current type system. I am excited for when we can though!

One last note here:
We definitely want parametric traits, but they are not on the Q3 roadmap, and we’ll evaluate exactly when to prioritize them at a later date.
