Previously the following code worked, with a deprecation warning:
implicit conversion from Int to ValueT
from collections import Set

alias ValueT = KeyElement

fn test[T: ValueT](a: T, b: T):
    s = Set[T]()  # T is KeyElement
    c = a + b  # T has __add__

fn main():
    test[Int](1, 2)
    test[Float64](1.0, 2.0)
Int was implicitly convertible to ValueT because it has an __add__ method.
But after a pixi update, it no longer works. I understand why, but how do I solve this now?
I tried defining my own Addable trait, but Int does not conform to it.
from collections import Set

trait Addable:
    fn __add__(self, other: Self) -> Self:
        ...

alias ValueT = KeyElement & Addable

fn test[T: ValueT](a: T, b: T):
    s = Set[T]()  # T is KeyElement
    c = a + b  # T is Addable

fn main():
    test[Int](1, 2)
I’d expect there to be a built-in Addable trait, or an Arithmetic trait that includes __add__, __sub__, etc., and the built-in Int and Float64 types to implement them. But there isn’t any.
How can I solve this at compile time? I don’t want to use Variant, for performance reasons.
Thanks.
There is a built-in Addable trait, and Int conforms to it. I am afk, but I think you just need to delete your own trait definition and it should work. If not, search the docs for Addable and import it from wherever it is defined.
I’m not sure I understand what you mean. I meant that it’s currently impossible to add trait conformance retroactively without implicit trait conformance, so Int does not conform to a user-defined Addable.
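Until retroactive conformance is possible, one workaround is to wrap Int in a struct that explicitly declares conformance. This is only a sketch: the IntValue wrapper and the Addable trait here are user-defined, not from the standard library, and the exact decorators and KeyElement requirements may differ across Mojo versions.

```mojo
from collections import Set

trait Addable:
    fn __add__(self, other: Self) -> Self:
        ...

# Hypothetical wrapper: explicitly conforms to KeyElement's
# requirements (hashable, equality-comparable, copy/movable via
# @value) plus the user-defined Addable trait.
@value
struct IntValue(KeyElement, Addable):
    var value: Int

    fn __add__(self, other: Self) -> Self:
        return Self(self.value + other.value)

    fn __eq__(self, other: Self) -> Bool:
        return self.value == other.value

    fn __ne__(self, other: Self) -> Bool:
        return self.value != other.value

    fn __hash__(self) -> UInt:
        return hash(self.value)

fn test[T: KeyElement & Addable](a: T, b: T):
    s = Set[T]()  # T is KeyElement
    c = a + b  # T is Addable

fn main():
    test[IntValue](IntValue(1), IntValue(2))
```

The cost is an explicit wrap/unwrap at call sites, but the check stays at compile time and avoids Variant's runtime dispatch.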
Well, there needs to be a trait for structs that have an __add__ method; that is what the compiler complains about in the example above. It should be a built-in trait, since adding two values is a basic operation, but I don’t see one in the standard library.
One last note here:
We definitely want parametric traits, but they are not on the Q3 roadmap, and we’ll evaluate exactly when to prioritize them at a later date.