Can the parameter of Python.evaluate only be a StringLiteral, not a String?

fn initialize_with_zeros(dim: UInt) raises -> (PythonObject, UInt):
    var str = String("(12288,1)")  # an explicit String, not a StringLiteral
    print(str)
    print(str == "(12288,1)")
    var obj = Python.evaluate(str)  # this call fails at runtime
    return obj, UInt()

When I run this function, it fails at runtime with: Error loading dataset: invalid non-printable character U+0001 (<string>, line 1).

fn initialize_with_zeros(dim: UInt) raises -> (PythonObject, UInt):
    var str = "(12288,1)"  # inferred as StringLiteral
    print(str)
    print(str == "(12288,1)")
    var obj = Python.evaluate(str)
    var np = Python.import_module("numpy")
    var w = np.zeros(obj)
    var b: UInt = 0
    print("w", w)
    return w, b

The only difference between this code and the one above is that str is now a StringLiteral, and this version runs perfectly.
Does the argument of Python.evaluate only support StringLiteral? What if I need to pass a runtime String, for example one built from dim, and turn it into a PythonObject?
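
One way to sidestep Python.evaluate entirely here is to build the shape through numpy calls instead of evaluating a string. This is only a sketch, not from the thread: zeros_column is a hypothetical helper, and it assumes Int(dim) accepts a UInt and that Mojo converts Int arguments to Python objects when calling into numpy.

from python import Python, PythonObject

# Hypothetical helper: build a (dim, 1) array of zeros without Python.evaluate
# by creating a 1-D array and reshaping it into a column vector.
fn zeros_column(dim: UInt) raises -> PythonObject:
    var np = Python.import_module("numpy")
    # np.zeros(n) gives a 1-D array; reshape(n, 1) turns it into a column.
    # Assumption: Int(dim) is a valid UInt-to-Int conversion here.
    return np.zeros(Int(dim)).reshape(Int(dim), 1)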

Could you open an issue on GitHub? I think it's a null-correctness bug: Python.evaluate assumes the input value is null-terminated, so a String whose buffer lacks a trailing NUL makes it read stray bytes past the end, which would explain the U+0001 in your error.

from python import Python

fn main() raises:
  var s = String("(100, 1)")
  # s.append_byte(0)  # uncommenting this line fixes the issue
  print(Python.evaluate(s))
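
For the original runtime-shape case, the same trick extends to a string built at runtime. Again just a sketch: zeros_via_evaluate is a hypothetical name, and it assumes String(dim) stringifies a UInt in the current Mojo version.

from python import Python, PythonObject

fn zeros_via_evaluate(dim: UInt) raises -> PythonObject:
    # Build the shape expression at runtime from dim.
    var expr = String("(") + String(dim) + String(", 1)")
    # Workaround: null-terminate the buffer so Python.evaluate, which
    # assumes a trailing NUL, stops reading at the right place.
    expr.append_byte(0)
    var shape = Python.evaluate(expr)
    var np = Python.import_module("numpy")
    return np.zeros(shape)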

Thanks for noticing this; it should be fixed in the next nightly.


Thank you for replying so promptly. I am doing research with Mojo, and I will probably have many more questions in the future.