
Langchain Model Deploy failed when putting input_sample #3618

Closed as not planned

Description

@prise6

Hello, when I use something like the following to log a LangChain model

input_example = {"user_message": "foobar"}
with mlflow.start_run():
    model_info = mlflow.langchain.log_model(
        lc_model=lc_model_path,
        artifact_path=artifact_path,
        input_example=input_example,
        signature=mlflow.models.infer_signature(
            model_input=input_example, model_output="output"
        ),
    )

and then try to deploy the logged model to an endpoint with the default inference image (for example azureml://registries/azureml/environments/mlflow-py39-inference/versions/2), I get this error:

 File "/opt/miniconda/envs/userenv/lib/python3.11/site-packages/azureml_inference_server_http/server/user_script.py", line 77, in load_script
   main_module_spec.loader.exec_module(user_module)
 File "<frozen importlib._bootstrap_external>", line 940, in exec_module
 File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
 File "/var/mlflow_resources/mlflow_score_script.py", line 374, in <module>
   input_param, output_param, params_param = get_parameter_type(sample_input, sample_output, sample_params)
                                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/var/mlflow_resources/mlflow_score_script.py", line 342, in get_parameter_type
   param_arg[key] = NumpyParameterType(value, enforce_shape=False)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/opt/miniconda/envs/userenv/lib/python3.11/site-packages/inference_schema/parameter_types/numpy_parameter_type.py", line 33, in __init__
   raise Exception("Invalid sample input provided, must provide a sample Numpy array.")
Exception: Invalid sample input provided, must provide a sample Numpy array.

This is because mlflow_score_script.py contains these lines:

#...
            elif isinstance(sample_input_ex, dict):
                _logger.info("sample input is a dict")
                # TODO keeping this around while _infer_schema doesn't work on dataframe string signatures
                param_arg = {}
                for key, value in sample_input_ex.items():
                    param_arg[key] = NumpyParameterType(value, enforce_shape=False)
                input_param = StandardPythonParameterType(param_arg)
#...
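Since my input example is a plain dict of strings, each value (here the string "foobar") gets passed to NumpyParameterType, which only accepts numpy arrays. A minimal sketch of the failure outside the scoring script, assuming the inference_schema package used in the traceback is installed:

import numpy as np
from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType

# Works: the sample is an actual numpy array
NumpyParameterType(np.array(["foobar"]), enforce_shape=False)

# Raises "Invalid sample input provided, must provide a sample Numpy array."
# because the dict value from my input_example is a plain Python string
NumpyParameterType("foobar", enforce_shape=False)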

Am I doing something wrong?
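One workaround I am considering (I have not verified that the scoring script handles it any better) is to log the input example as a pandas DataFrame instead of a dict, so the dict branch shown above is never taken. A minimal sketch:

import pandas as pd
import mlflow

# Hypothetical workaround: wrap the example in a DataFrame rather than a dict
input_example = pd.DataFrame([{"user_message": "foobar"}])

with mlflow.start_run():
    model_info = mlflow.langchain.log_model(
        lc_model=lc_model_path,
        artifact_path=artifact_path,
        input_example=input_example,
        signature=mlflow.models.infer_signature(
            model_input=input_example, model_output="output"
        ),
    )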

Thank you for your help.
