
DAG Error Because of Python Script #92

Open
sungreong opened this issue Mar 2, 2023 · 3 comments
Description

I want to view the DAG of my flow script in the UI, but an error occurs in the backend's Python script.

Steps to Reproduce

  1. Build and run metaflow-ui and metaflow-service:

docker build --tag metaflow-ui:latest .
docker run -p 3000:3000 -e METAFLOW_SERVICE=http://localhost:8083/ metaflow-ui:latest
git clone https://github.com/Netflix/metaflow-service.git
cd metaflow-service
docker-compose -f docker-compose.development.yml up

  2. Run the flow script in the metaflow-ui container:

# install the required Python packages and metaflow, then:
METAFLOW_SERVICE_URL=http://0.0.0.0:8080/ METAFLOW_DEFAULT_METADATA=service python3 helloworld.py run

Expected behavior:

The DAG graph is shown in the UI.

(screenshot of the expected DAG graph omitted)

Actual behavior: the UI backend's cache server raises the following traceback instead:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/local/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/root/services/ui_backend_service/data/cache/client/cache_server.py", line 307, in <module>
    cli(auto_envvar_prefix='MFCACHE')
  File "/usr/local/lib/python3.7/site-packages/click/core.py", line 1128, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/click/core.py", line 1053, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.7/site-packages/click/core.py", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.7/site-packages/click/core.py", line 754, in invoke
    return __callback(*args, **kwargs)
  File "/root/services/ui_backend_service/data/cache/client/cache_server.py", line 301, in cli
    Scheduler(store, max_actions).loop()
  File "/root/services/ui_backend_service/data/cache/client/cache_server.py", line 199, in __init__
    maxtasksperchild=512,  # Recycle each worker once 512 tasks have been completed
  File "/usr/local/lib/python3.7/multiprocessing/context.py", line 119, in Pool
    context=self.get_context())
  File "/usr/local/lib/python3.7/multiprocessing/pool.py", line 176, in __init__
    self._repopulate_pool()
  File "/usr/local/lib/python3.7/multiprocessing/pool.py", line 241, in _repopulate_pool
    w.start()
  File "/usr/local/lib/python3.7/multiprocessing/process.py", line 112, in start
    self._popen = self._Popen(self)
  File "/usr/local/lib/python3.7/multiprocessing/context.py", line 277, in _Popen
    return Popen(process_obj)
  File "/usr/local/lib/python3.7/multiprocessing/popen_fork.py", line 20, in __init__
    self._launch(process_obj)
  File "/usr/local/lib/python3.7/multiprocessing/popen_fork.py", line 74, in _launch
    code = process_obj._bootstrap()
  File "/usr/local/lib/python3.7/multiprocessing/process.py", line 297, in _bootstrap
    self.run()
  File "/usr/local/lib/python3.7/multiprocessing/process.py", line 99, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/lib/python3.7/multiprocessing/pool.py", line 121, in worker
    result = (True, func(*args, **kwds))
  File "/root/services/ui_backend_service/data/cache/client/cache_worker.py", line 29, in execute_action
    execute(tempdir, action_cls, request)
  File "/root/services/ui_backend_service/data/cache/client/cache_worker.py", line 56, in execute
    invalidate_cache=req.get('invalidate_cache', False))
  File "/root/services/ui_backend_service/data/cache/generate_dag_action.py", line 97, in execute
    results[result_key] = json.dumps(dag)
  File "/usr/local/lib/python3.7/contextlib.py", line 130, in __exit__
    self.gen.throw(type, value, traceback)
  File "/root/services/ui_backend_service/data/cache/utils.py", line 130, in streamed_errors
    get_traceback_str()
  File "/root/services/ui_backend_service/data/cache/utils.py", line 124, in streamed_errors
    yield
  File "/root/services/ui_backend_service/data/cache/generate_dag_action.py", line 93, in execute
    dag = DataArtifact("{}/_graph_info".format(param_step.task.pathspec)).data
  File "/usr/local/lib/python3.7/site-packages/metaflow/client/core.py", line 825, in data
    obj = filecache.get_artifact(ds_type, location[6:], meta, *components)
  File "/usr/local/lib/python3.7/site-packages/metaflow/client/filecache.py", line 216, in get_artifact
    [name],
  File "/usr/local/lib/python3.7/site-packages/metaflow/datastore/task_datastore.py", line 364, in load_artifacts
    for (key, blob) in self._ca_store.load_blobs(to_load.keys()):
  File "/usr/local/lib/python3.7/site-packages/metaflow/datastore/content_addressed_store.py", line 140, in load_blobs
    with open(file_path, "rb") as f:

TypeError: expected str, bytes or os.PathLike object, not NoneType
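The final frame explains the error message: `content_addressed_store.load_blobs` ends up calling `open()` with `None` instead of a file path, because the UI backend cannot resolve where the flow's artifacts are stored. A minimal sketch of the failure mode (the `file_path = None` assignment is a stand-in for the unresolved datastore location, not Metaflow's actual code):

```python
# Sketch of the failure mode: when the UI backend cannot locate the
# datastore the flow wrote to, the blob's local file path resolves to
# None, and open() raises exactly the TypeError from the traceback.
file_path = None  # stand-in for an unresolved artifact location

try:
    with open(file_path, "rb") as f:
        f.read()
except TypeError as e:
    print(e)  # expected str, bytes or os.PathLike object, not NoneType
```

In other words, the traceback is a symptom of a datastore-resolution problem, not a bug in the flow script itself.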

Versions

Metaflow 2.8.0

Additional Information

@hermawanmulyono

+1

I'm having this issue as well.


RaulPL commented May 6, 2023

I had the same issue. I was able to see my DAG after configuring AWS S3 as the datastore in the Metaflow configuration file (see "Configuring Metaflow" in the docs) and setting my AWS_PROFILE in both metaflow-ui and metaflow-service.
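For reference, the setup RaulPL describes might look like the following. This is a hedged sketch, not from the issue: the bucket name, profile name, and volume mount are placeholders, though `METAFLOW_DEFAULT_DATASTORE` and `METAFLOW_DATASTORE_SYSROOT_S3` are real Metaflow configuration variables.

```shell
# Point Metaflow (and the UI services) at a shared S3 datastore.
# Bucket and profile names below are placeholders.
export AWS_PROFILE=my-profile
export METAFLOW_DEFAULT_DATASTORE=s3
export METAFLOW_DATASTORE_SYSROOT_S3=s3://my-bucket/metaflow

# Pass the same profile/credentials to the UI container so its cache
# workers can read the artifacts the flow wrote:
docker run -p 3000:3000 \
  -e METAFLOW_SERVICE=http://localhost:8083/ \
  -e AWS_PROFILE=my-profile \
  -v ~/.aws:/root/.aws \
  metaflow-ui:latest
```

The same environment variables (or the equivalent entries in `~/.metaflowconfig/config.json`) need to be visible to metaflow-service as well, so that every component resolves artifacts from the same location.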

@hermawanmulyono

Hi, sorry for the late reply. I fixed the issue. At my company we don't use the cloud; the problem was resolved by setting up the UI correctly, so that the UI server can read the datastore on our NAS.
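For a non-cloud setup like the one described above, the key point is that the UI backend must be able to read the very same datastore directory the flows write to. A sketch under assumed paths and image names (`/nas/metaflow` and the backend image tag are placeholders; `METAFLOW_DATASTORE_SYSROOT_LOCAL` is a real Metaflow variable):

```shell
# Flows write to a shared local/NAS datastore:
export METAFLOW_DEFAULT_DATASTORE=local
export METAFLOW_DATASTORE_SYSROOT_LOCAL=/nas/metaflow

# Mount the same directory at the same path into the UI backend
# container, so the cache workers can open the artifact files instead
# of failing with a None path:
docker run -p 8083:8083 \
  -v /nas/metaflow:/nas/metaflow \
  -e METAFLOW_DATASTORE_SYSROOT_LOCAL=/nas/metaflow \
  metaflow-ui-backend:latest
```

If the mount point or sysroot differs between the flow environment and the UI backend, artifact paths cannot be resolved and the `TypeError` from the original traceback appears.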
