Hi @jeshli, it’s me again. Does rust-connect-py-ai-to-ic have a tokenizer? Otherwise the input and output are just tensors, not readable text.
Yes. It’s in one of the branches.
Oh, I found it, but I noticed that tokenize_text returns vec int32 while model_inference takes vec int64. If these two datatypes don’t match, how do I use tokenizer_backend? Happy New Year, btw.
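For what it’s worth, the mismatch can be bridged with a lossless widening cast from i32 to i64 before calling inference. Below is a minimal Rust sketch; the bodies of tokenize_text and model_inference are placeholders standing in for the repo’s actual endpoints (their real signatures may differ), only the conversion step in between is the point.

```rust
/// Hypothetical stand-in for the tokenize_text endpoint (placeholder body).
fn tokenize_text(text: &str) -> Vec<i32> {
    // The real tokenizer lives in the tokenizer_backend branch.
    text.bytes().map(|b| b as i32).collect()
}

/// Hypothetical stand-in for the model_inference endpoint (placeholder body).
fn model_inference(token_ids: Vec<i64>) -> Vec<i64> {
    token_ids
}

fn main() {
    let ids_i32: Vec<i32> = tokenize_text("Happy New Year");

    // Widening each token id from i32 to i64 is lossless,
    // so the two calls can be chained with a simple cast.
    let ids_i64: Vec<i64> = ids_i32.iter().map(|&id| id as i64).collect();

    let output = model_inference(ids_i64);
    println!("{:?}", output);
}
```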