
OrangePi AiPro 20T: ms is not a valid TensorType #2162

@Terminator98

Description


I have this setup:

Board: OrangePi AiPro 20T (Ascend 310B)
CANN: 8.2.RC1
OS: Ubuntu 22.04
Python: 3.10
MindSpore: 0.5.0rc1
MindNLP: 2.7.0

I'm running the code below:

import mindspore as ms
from mindnlp.transformers import AutoModelForCausalLM, AutoTokenizer

# Suppress harmless NumPy warnings
import warnings
warnings.filterwarnings("ignore", message="The value of the smallest subnormal*")

# Set Ascend context
ms.set_device(device_target="Ascend", device_id=0)

model_name = "Qwen/Qwen2.5-0.5B-Instruct"

# Load model in float16 (required for Ascend 310B)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, ms_dtype=ms.float16)

# Prepare chat
chat = [
    {"role": "system", "content": "You are a sassy, wise-cracking robot as imagined by Hollywood circa 1986."},
    {"role": "user", "content": "Hey, can you tell me any fun things to do in New York?"}
]

# Format input using Qwen's chat template
input_text = tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(input_text, return_tensors="ms")

# Generate without GenerationConfig (uses defaults)
outputs = model.generate(
    input_ids=inputs["input_ids"],
    max_new_tokens=512,
    do_sample=True,
    top_p=0.9,
    temperature=0.7
)

# Decode output
response_text = tokenizer.decode(outputs[0], skip_special_tokens=False)

# Extract assistant's reply
import re
match = re.search(r"<\|im_start\|>assistant\s*(.*?)<\|im_end\|>", response_text, re.DOTALL)
if match:
    assistant_reply = match.group(1).strip()
else:
    # Fallback extraction
    parts = response_text.split("<|im_start|>assistant")
    if len(parts) > 1:
        assistant_reply = parts[-1].split("<|im_end|>")[0].strip()
    else:
        assistant_reply = response_text

print("\n🤖 Assistant Reply:\n")
print(assistant_reply)

and I'm getting this error:

Traceback (most recent call last):
  File "/home/HwHiAiUser/mostafa/notebooks/mindnlp_test.py", line 25, in <module>
    inputs = tokenizer(input_text, return_tensors="ms")
  File "/home/HwHiAiUser/.venv/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2911, in __call__
    encodings = self._call_one(text=text, text_pair=text_pair, **all_kwargs)
  File "/home/HwHiAiUser/.venv/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 3021, in _call_one
    return self.encode_plus(
  File "/home/HwHiAiUser/.venv/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 3096, in encode_plus
    return self._encode_plus(
  File "/home/HwHiAiUser/.venv/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 627, in _encode_plus
    batched_output = self._batch_encode_plus(
  File "/home/HwHiAiUser/.venv/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 601, in _batch_encode_plus
    return BatchEncoding(sanitized_tokens, sanitized_encodings, tensor_type=return_tensors)
  File "/home/HwHiAiUser/.venv/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 240, in __init__
    self.convert_to_tensors(tensor_type=tensor_type, prepend_batch_axis=prepend_batch_axis)
  File "/home/HwHiAiUser/.venv/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 707, in convert_to_tensors
    tensor_type = TensorType(tensor_type)
  File "/usr/lib/python3.10/enum.py", line 385, in __call__
    return cls.__new__(cls, value)
  File "/usr/lib/python3.10/enum.py", line 718, in __new__
    raise exc
  File "/usr/lib/python3.10/enum.py", line 700, in __new__
    result = cls._missing_(value)
  File "/home/HwHiAiUser/.venv/lib/python3.10/site-packages/transformers/utils/generic.py", line 501, in _missing_
    raise ValueError(
ValueError: ms is not a valid TensorType, please select one of ['pt', 'tf', 'np', 'jax', 'mlx']

How can I solve this?
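From the traceback it looks like the tokenizer call is resolving into the plain transformers package under site-packages rather than mindnlp's tokenization code, which is presumably why "ms" isn't accepted. As a reference point, here is a workaround sketch I've been considering (an assumption on my part, not a confirmed fix): request NumPy arrays from the tokenizer, which the plain transformers tokenizer does accept, and convert them to MindSpore tensors by hand before calling generate(). It reuses the tokenizer and model objects from the script above.

import mindspore as ms

# Workaround sketch (assumption): ask for NumPy arrays instead of "ms",
# then wrap them in MindSpore tensors manually before generation.
inputs = tokenizer(input_text, return_tensors="np")
input_ids = ms.Tensor(inputs["input_ids"], dtype=ms.int32)
attention_mask = ms.Tensor(inputs["attention_mask"], dtype=ms.int32)

outputs = model.generate(
    input_ids=input_ids,
    attention_mask=attention_mask,
    max_new_tokens=512,
    do_sample=True,
    top_p=0.9,
    temperature=0.7,
)

Is something like this the intended way to pass inputs to a MindNLP model, or should return_tensors="ms" work here?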
