/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/variables/functions.py:663: UserWarning: Graph break due to unsupported builtin posix.fspath. This function is either a Python builtin (e.g. _warnings.warn) or a third-party C/C++ Python extension (perhaps created with pybind). If it is a Python builtin, please file an issue on GitHub so the PyTorch team can add support for it and see the next case for a workaround. If it is a third-party C/C++ Python extension, please either wrap it into a PyTorch-understood custom operator (see https://pytorch.org/tutorials/advanced/custom_ops_landing_page.html for more details) or, if it is traceable, use torch.compiler.allow_in_graph.
torch._dynamo.utils.warn_once(msg)
/local/mnt/workspace/users/vpandya/.local/lib/python3.11/site-packages/transformers/tokenization_utils_base.py:1601: FutureWarning: `clean_up_tokenization_spaces` was not set. It will be set to `True` by default. This behavior will be depracted in transformers v4.45, and will be then set to `False` by default. For more details check this issue: https://github.com/huggingface/transformers/issues/31884
warnings.warn(
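Incidentally, the FutureWarning above can be silenced by picking the value explicitly instead of relying on the default that changes in transformers v4.45; a minimal sketch, unrelated to the crash itself:

```python
from transformers import AutoTokenizer

# Choose clean_up_tokenization_spaces explicitly so the upcoming default
# change in transformers v4.45 does not alter behavior silently.
tokenizer = AutoTokenizer.from_pretrained(
    "gpt2", clean_up_tokenization_spaces=True
)
```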
W0904 11:46:21.351000 140371205161088 torch/_dynamo/convert_frame.py:762] [6/8] torch._dynamo hit config.cache_size_limit (8)
W0904 11:46:21.351000 140371205161088 torch/_dynamo/convert_frame.py:762] [6/8] function: '__init_subclass__' (/local/mnt/workspace/users/vpandya/.local/lib/python3.11/site-packages/transformers/utils/generic.py:324)
W0904 11:46:21.351000 140371205161088 torch/_dynamo/convert_frame.py:762] [6/8] last reason: ___check_obj_id(L['cls'], 165879472)
W0904 11:46:21.351000 140371205161088 torch/_dynamo/convert_frame.py:762] [6/8] To log all recompilation reasons, use TORCH_LOGS="recompiles".
W0904 11:46:21.351000 140371205161088 torch/_dynamo/convert_frame.py:762] [6/8] To diagnose recompilation issues, see https://pytorch.org/docs/main/torch.compiler_troubleshooting.html.
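The recompile warning above already names the relevant knobs; a minimal sketch of enabling them from Python, assuming you want the reasons logged in-process (the cache limit value is an arbitrary illustration, not a recommendation):

```python
import torch._dynamo
import torch._logging

# Log every recompilation reason, equivalent to running with
# TORCH_LOGS="recompiles".
torch._logging.set_logs(recompiles=True)

# Optionally raise the per-frame cache limit; 32 is only illustrative.
torch._dynamo.config.cache_size_limit = 32
```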
/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/variables/functions.py:663: UserWarning: Graph break due to unsupported builtin posix.stat. This function is either a Python builtin (e.g. _warnings.warn) or a third-party C/C++ Python extension (perhaps created with pybind). If it is a Python builtin, please file an issue on GitHub so the PyTorch team can add support for it and see the next case for a workaround. If it is a third-party C/C++ Python extension, please either wrap it into a PyTorch-understood custom operator (see https://pytorch.org/tutorials/advanced/custom_ops_landing_page.html for more details) or, if it is traceable, use torch.compiler.allow_in_graph.
torch._dynamo.utils.warn_once(msg)
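Both graph-break warnings point at `torch.compiler.allow_in_graph` as the workaround for traceable extensions (it does not apply to C builtins such as posix.stat); a minimal sketch of that API on a hypothetical helper that is not part of the original script:

```python
import torch

# Hypothetical stand-in for a traceable third-party function.
def scale_and_shift(x):
    return x * 2.0 + 1.0

# Ask Dynamo to keep the call in the captured graph instead of breaking on it.
torch.compiler.allow_in_graph(scale_and_shift)

@torch.compile
def fn(x):
    return scale_and_shift(torch.relu(x))

fn(torch.randn(4))
```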
Traceback (most recent call last):
File "/local/mnt/workspace/users/vpandya/iree-turbine/examples/simple_gpt2.py", line 24, in <module>
test_gpt2_demo()
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/eval_frame.py", line 433, in _fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/examples/simple_gpt2.py", line 5, in test_gpt2_demo
@torch.compile(backend="turbine_cpu")
File "/local/mnt/workspace/users/vpandya/iree-turbine/examples/simple_gpt2.py", line 7, in torch_dynamo_resume_in_test_gpt2_demo_at_7
tokenizer = AutoTokenizer.from_pretrained("gpt2")
File "/local/mnt/workspace/users/vpandya/.local/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
return model_class.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/.local/lib/python3.11/site-packages/transformers/modeling_utils.py", line 2927, in from_pretrained
@classmethod
File "/local/mnt/workspace/users/vpandya/.local/lib/python3.11/site-packages/transformers/modeling_utils.py", line 3245, in torch_dynamo_resume_in_from_pretrained_at_3245
if not isinstance(config, PretrainedConfig):
File "/local/mnt/workspace/users/vpandya/.local/lib/python3.11/site-packages/transformers/modeling_utils.py", line 3352, in torch_dynamo_resume_in_from_pretrained_at_3352
if not isinstance(config, PretrainedConfig):
File "/local/mnt/workspace/users/vpandya/.local/lib/python3.11/site-packages/transformers/modeling_utils.py", line 3377, in torch_dynamo_resume_in_from_pretrained_at_3377
config = copy.deepcopy(config)
File "/local/mnt/workspace/users/vpandya/.local/lib/python3.11/site-packages/transformers/modeling_utils.py", line 3431, in torch_dynamo_resume_in_from_pretrained_at_3431
is_local = os.path.isdir(pretrained_model_name_or_path)
File "/local/mnt/workspace/users/vpandya/.local/lib/python3.11/site-packages/transformers/modeling_utils.py", line 3509, in torch_dynamo_resume_in_from_pretrained_at_3509
elif os.path.isfile(os.path.join(subfolder, pretrained_model_name_or_path)):
File "/local/mnt/workspace/users/vpandya/.local/lib/python3.11/site-packages/transformers/modeling_utils.py", line 3509, in torch_dynamo_resume_in_from_pretrained_at_3509
elif os.path.isfile(os.path.join(subfolder, pretrained_model_name_or_path)):
File "/local/mnt/workspace/users/vpandya/.local/lib/python3.11/site-packages/transformers/modeling_utils.py", line 3512, in torch_dynamo_resume_in_from_pretrained_at_3512
elif os.path.isfile(os.path.join(subfolder, pretrained_model_name_or_path + ".index")):
File "/local/mnt/workspace/users/vpandya/.local/lib/python3.11/site-packages/transformers/modeling_utils.py", line 3512, in torch_dynamo_resume_in_from_pretrained_at_3512
elif os.path.isfile(os.path.join(subfolder, pretrained_model_name_or_path + ".index")):
File "/local/mnt/workspace/users/vpandya/.local/lib/python3.11/site-packages/transformers/modeling_utils.py", line 3837, in torch_dynamo_resume_in_from_pretrained_at_3520
# Check first if we are `from_pt`
^^^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/.local/lib/python3.11/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1190, in __init__
self.transformer = GPT2Model(config)
^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/.local/lib/python3.11/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 911, in __init__
self.post_init()
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/convert_frame.py", line 1116, in __call__
return self._torchdynamo_orig_callable(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/convert_frame.py", line 948, in __call__
result = self._inner_convert(
^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/convert_frame.py", line 472, in __call__
return _compile(
^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_utils_internal.py", line 84, in wrapper_function
return StrobelightCompileTimeProfiler.profile_compile_time(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_strobelight/compile_time_profiler.py", line 129, in profile_compile_time
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/contextlib.py", line 81, in inner
return func(*args, **kwds)
^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/convert_frame.py", line 817, in _compile
guarded_code = compile_inner(code, one_graph, hooks, transform)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/utils.py", line 231, in time_wrapper
r = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/convert_frame.py", line 636, in compile_inner
out_code = transform_code_object(code, transform)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/bytecode_transformation.py", line 1185, in transform_code_object
transformations(instructions, code_options)
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/convert_frame.py", line 178, in _fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/convert_frame.py", line 582, in transform
tracer.run()
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/symbolic_convert.py", line 2451, in run
super().run()
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/symbolic_convert.py", line 893, in run
while self.step():
^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/symbolic_convert.py", line 805, in step
self.dispatch_table[inst.opcode](self, inst)
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/symbolic_convert.py", line 499, in wrapper
return inner_fn(self, inst)
^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/symbolic_convert.py", line 2059, in CALL
self.call_function(fn, args, kwargs)
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/symbolic_convert.py", line 743, in call_function
self.push(fn.call_function(self, args, kwargs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/variables/functions.py", line 344, in call_function
return super().call_function(tx, args, kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/variables/functions.py", line 293, in call_function
return super().call_function(tx, args, kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/variables/functions.py", line 90, in call_function
return tx.inline_user_function_return(self, [*self.self_args(), *args], kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/symbolic_convert.py", line 749, in inline_user_function_return
return InliningInstructionTranslator.inline_call(self, fn, args, kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/symbolic_convert.py", line 2666, in inline_call
return cls.inline_call_(parent, func, args, kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/symbolic_convert.py", line 2782, in inline_call_
tracer.run()
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/symbolic_convert.py", line 893, in run
while self.step():
^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/symbolic_convert.py", line 805, in step
self.dispatch_table[inst.opcode](self, inst)
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/symbolic_convert.py", line 1561, in LOAD_ATTR
self._load_attr(inst)
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/symbolic_convert.py", line 1551, in _load_attr
result = BuiltinVariable(getattr).call_function(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/variables/builtin.py", line 962, in call_function
return handler(tx, args, kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/variables/builtin.py", line 846, in builtin_dipatch
rv = fn(tx, args, kwargs)
^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/variables/builtin.py", line 764, in call_self_handler
result = self_handler(tx, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/variables/builtin.py", line 1626, in call_getattr
return obj.var_getattr(tx, name)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/variables/dicts.py", line 914, in var_getattr
return ConstantVariable.create(getattr(self.obj, name))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local/mnt/workspace/users/vpandya/iree-turbine/install/torch/_dynamo/variables/constant.py", line 39, in create
assert not isinstance(value, disallowed_type), reason
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AssertionError: Dict types must use ConstDictVariable.
from user code:
File "/local/mnt/workspace/users/vpandya/.local/lib/python3.11/site-packages/transformers/modeling_utils.py", line 1406, in post_init
self.init_weights()
File "/local/mnt/workspace/users/vpandya/.local/lib/python3.11/site-packages/transformers/modeling_utils.py", line 2316, in init_weights
if self.config.pruned_heads:
Set TORCH_LOGS="+dynamo" and TORCHDYNAMO_VERBOSE=1 for more information
You can suppress this exception and fall back to eager by setting:
import torch._dynamo
torch._dynamo.config.suppress_errors = True
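For completeness, the fallback the error message itself suggests, which hides the Dynamo failure rather than fixing the graph capture:

```python
import torch
import torch._dynamo

# Fall back to eager execution for frames Dynamo cannot compile.
torch._dynamo.config.suppress_errors = True
```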
That error looks more related to PyTorch itself than iree-turbine. Do other backends for torch.compile work with that code? What versions of Python packages (notably torch) are you using?
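One quick way to answer the backend question is to run the same structure under a stock backend; this sketch reconstructs the demo from the traceback frames (the exact contents of examples/simple_gpt2.py are an assumption):

```python
import torch
from transformers import AutoTokenizer, GPT2LMHeadModel

def make_demo(backend):
    # Same shape as the traceback's test_gpt2_demo: everything, including
    # from_pretrained, runs inside the compiled function.
    @torch.compile(backend=backend)
    def demo():
        tokenizer = AutoTokenizer.from_pretrained("gpt2")
        model = GPT2LMHeadModel.from_pretrained("gpt2")
        inputs = tokenizer("Hello, world", return_tensors="pt")
        return model(**inputs).logits
    return demo

# If "eager" or "inductor" hit the same AssertionError, the bug is in
# PyTorch's tracing of from_pretrained rather than in iree-turbine.
for backend in ("eager", "inductor"):
    print(backend, make_demo(backend)().shape)
```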
Python 3.11.9 (main, Apr 27 2024, 21:16:11) [GCC 13.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
I am getting the error pasted above.
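To cover the rest of the version question, here is a quick way to collect the numbers (only the packages already named in this thread):

```python
import sys
import torch
import transformers

# Report the versions asked about above: Python, torch, and transformers.
print("python      :", sys.version.split()[0])
print("torch       :", torch.__version__)
print("transformers:", transformers.__version__)
```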