Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others


0 votes
966 views
in Technique by (71.8m points)

deployment - Unable to serve an mlflow model locally

I have created an mlflow model with a custom pyfunc. It returns results when I send input to the loaded model in a Jupyter notebook. However, when I try to serve it on a local port with

!mlflow models serve -m Home/miniconda3/envs/mlruns/0/baa40963927a49258c845421e3175c06/artifacts/model -p 8001

I get this error:

 Traceback (most recent call last):
  File "/home/subhojyoti/miniconda3/envs/python3-env/bin/mlflow", line 10, in <module>
    sys.exit(cli())
  File "/home/subhojyoti/miniconda3/envs/python3-env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/home/subhojyoti/miniconda3/envs/python3-env/lib/python3.6/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/home/subhojyoti/miniconda3/envs/python3-env/lib/python3.6/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/subhojyoti/miniconda3/envs/python3-env/lib/python3.6/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/subhojyoti/miniconda3/envs/python3-env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/subhojyoti/miniconda3/envs/python3-env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/home/subhojyoti/miniconda3/envs/python3-env/lib/python3.6/site-packages/mlflow/models/cli.py", line 56, in serve
    install_mlflow=install_mlflow).serve(model_uri=model_uri, port=port,
  File "/home/subhojyoti/miniconda3/envs/python3-env/lib/python3.6/site-packages/mlflow/models/cli.py", line 163, in _get_flavor_backend
    append_to_uri_path(underlying_model_uri, "MLmodel"), output_path=tmp.path())
  File "/home/subhojyoti/miniconda3/envs/python3-env/lib/python3.6/site-packages/mlflow/tracking/artifact_utils.py", line 76, in _download_artifact_from_uri
    artifact_path=artifact_path, dst_path=output_path)
  File "/home/subhojyoti/miniconda3/envs/python3-env/lib/python3.6/site-packages/mlflow/store/artifact/local_artifact_repo.py", line 67, in download_artifacts
    return super(LocalArtifactRepository, self).download_artifacts(artifact_path, dst_path)
  File "/home/subhojyoti/miniconda3/envs/python3-env/lib/python3.6/site-packages/mlflow/store/artifact/artifact_repo.py", line 140, in download_artifacts
    return download_file(artifact_path)
  File "/home/subhojyoti/miniconda3/envs/python3-env/lib/python3.6/site-packages/mlflow/store/artifact/artifact_repo.py", line 105, in download_file
    self._download_file(remote_file_path=fullpath, local_path=local_file_path)
  File "/home/subhojyoti/miniconda3/envs/python3-env/lib/python3.6/site-packages/mlflow/store/artifact/local_artifact_repo.py", line 95, in _download_file
    shutil.copyfile(remote_file_path, local_path)
  File "/home/subhojyoti/miniconda3/envs/python3-env/lib/python3.6/shutil.py", line 120, in copyfile
    with open(src, 'rb') as fsrc:
FileNotFoundError: [Errno 2] No such file or directory: 'Home/miniconda3/envs/mlruns/0/baa40963927a49258c845421e3175c06/artifacts/model/MLmodel'
question from:https://stackoverflow.com/questions/65937623/unable-to-serve-an-mlflow-model-locally


1 Answer

0 votes
by (71.8m points)

From your error traceback, the model artifact can't be located. Note that the path in the error, Home/miniconda3/..., has no leading slash, so it is resolved relative to the directory your notebook was started in. You are also executing the mlflow command from within a Jupyter notebook. I would suggest trying the following:

  1. Check that your model's artifacts actually exist at the path you are using, Home/miniconda3/envs/mlruns/0/baa40963927a49258c845421e3175c06/artifacts/model (in particular, the MLmodel file the traceback is trying to open).
  2. Open a terminal, cd to the directory that contains mlruns (based on your traceback, cd ~/miniconda3/envs), and run mlflow models serve -m ./mlruns/0/baa40963927a49258c845421e3175c06/artifacts/model -p 8001
  3. MLflow offers several ways to serve a model; you can also register your model and refer to it as "models:/{model_name}/{stage}", as described in the Model Registry docs.
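To see why the FileNotFoundError occurs before touching mlflow at all: a path without a leading "/" is relative, so it is resolved against the notebook's working directory rather than the filesystem root. A minimal Python sketch, reusing the path from the question (the expanduser line assumes the artifacts really live under your home directory, which the traceback suggests):

```python
import os

# The path from the question, copied verbatim; note the missing leading "/".
path = "Home/miniconda3/envs/mlruns/0/baa40963927a49258c845421e3175c06/artifacts/model"

# Without a leading "/", the path is relative: it is resolved against the
# current working directory, not the filesystem root or your home directory.
print(os.path.isabs(path))       # False: this is why MLmodel isn't found

# Expanding "~" (i.e. prefixing your real home directory) yields an
# absolute path that mlflow can resolve from anywhere.
home_path = os.path.expanduser("~/" + path.removeprefix("Home/"))
print(os.path.isabs(home_path))  # True
```

If the isabs check prints False for the path you pass to mlflow models serve -m, prefix it with "/" or an expanded "~" before serving.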
