fn initialize_with_zeros(dim: UInt) raises -> (PythonObject, UInt):
var str = String("(12288,1)")
print(str)
print(str == "(12288,1)")
var obj = Python.evaluate(str)
    return obj, UInt()
When I run this function, the compiler reports: Error loading dataset: invalid non-printable character U+0001 (<string>, line 1).

fn initialize_with_zeros(dim: UInt) raises -> (PythonObject, UInt):
var str = "(12288,1)"
print(str)
print(str == "(12288,1)")
var obj = Python.evaluate(str)
var np = Python.import_module("numpy")
var w = np.zeros(obj)
var b: UInt = 0
print("w", w)
return w, b
The only difference between this code and the one above is that the variable str is a StringLiteral instead of a String, and this version runs perfectly.
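For reference, my understanding (an assumption, not verified against the Mojo source) is that Python.evaluate hands the string to CPython's eval, so the behavior the working snippet relies on can be reproduced in plain Python:

```python
import numpy as np

# eval turns the expression string into a real tuple, analogous to
# Python.evaluate("(12288,1)") yielding a PythonObject tuple
shape = eval("(12288,1)")
print(shape == (12288, 1))  # True

# np.zeros accepts that tuple as a shape, mirroring np.zeros(obj)
w = np.zeros(shape)
print(w.shape)  # (12288, 1)
```

This suggests the failure is not in the Python expression itself but in how the Mojo String is handed across to CPython.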
Does the argument of Python.evaluate only support StringLiteral? What if I need to pass a runtime String variable and get back a PythonObject?