
GetAllJobInfo is_running_tasks is not returning the correct value when driver starts ray #44459

Closed
sofianhnaide opened this issue Apr 3, 2024 · 2 comments · Fixed by #44626
Labels: bug (Something that is supposed to be working; but isn't), core (Issues that should be addressed in Ray Core), observability (Issues related to the Ray Dashboard, Logging, Metrics, Tracing, and/or Profiling), P0 (Issues that should be fixed in short order)

Comments

@sofianhnaide
Contributor

What happened + What you expected to happen

The is_running_tasks attribute on GetAllJobInfo does not return the expected value (True) when no Ray cluster was started beforehand, i.e. when the driver itself starts the cluster, but it does return the correct value when a cluster is already running.

e.g.

import ray

# Note: no explicit ray.init() and no pre-started cluster; the first
# f.remote() call starts a local Ray cluster from within this driver.
@ray.remote
def f(x):
    import time
    time.sleep(5)
    return 0

ray.get([f.remote(i) for i in range(4)])

To reproduce:

  1. Run ray stop -f
  2. Run the script above and query the job API (one possible way is sketched below)

Actual: while the tasks are in progress, the is_running_tasks flag is False.
Expected: it should be True. If you repeat the steps above after pre-starting Ray with ray start --head, the expected value is returned.
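
For step 2, one way to query the job API while the tasks are still sleeping is to poll is_running_tasks from a second driver attached to the same cluster. This is a minimal sketch, assuming the internal GCS client binding (global_worker.gcs_client.get_all_job_info(), the same call used in the probe further down); it is not a public API and the module path may differ between Ray versions.

import ray

# Run in a second terminal while the reproduction script's tasks are sleeping.
# address="auto" attaches to the already-running cluster instead of starting a new one.
ray.init(address="auto")

# Internal API (assumption: ray._private.worker is the module path in this Ray version).
client = ray._private.worker.global_worker.gcs_client
all_job_info = client.get_all_job_info()
for job_id in all_job_info:
    print(job_id, all_job_info[job_id].is_running_tasks)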

Versions / Dependencies

2.10

Reproduction script

import ray

@ray.remote
def f(x):
    import time
    time.sleep(5)
    return 0

ray.get([f.remote(i) for i in range(4)])

Issue Severity

Medium: It is a significant difficulty but I can work around it.

@sofianhnaide added the bug, triage, and core labels on Apr 3, 2024
@anyscalesam added the observability label on Apr 3, 2024
@jjyao added the P0 label and removed the triage label on Apr 3, 2024
@rynewang
Contributor

Can repro. The probe code:

import ray

ray.init()

def f():
    # Internal API: ask the GCS for all job info and print each job's
    # is_running_tasks flag.
    client = ray._private.worker.global_worker.gcs_client
    all_job_info = client.get_all_job_info()
    for job_id in all_job_info:
        print(job_id, all_job_info[job_id].is_running_tasks)

f()

It prints is_running_tasks = False for the job that is running tasks when the cluster is started from within the driver. It should be True.

@rynewang
Contributor

So when there is a GetAllJobInfo call, the GCS job_manager calls the core worker client's NumPendingTasks at gcs_job_manager.cc:220. When the cluster is started with ray start --head, all is good; when the cluster is started from within the driver, we get this:

[2024-04-10 20:50:27,362 D 81581 11854183] (gcs_server) core_worker_client_pool.cc:39: Connected to worker 01000000ffffffffffffffffffffffffffffffffffffffffffffffff with address 127.0.0.1:0
[2024-04-10 20:50:27,362 D 81581 11854183] (gcs_server) gcs_job_manager.cc:220: Send NumPendingTasksRequest to worker 01000000ffffffffffffffffffffffffffffffffffffffffffffffff
[2024-04-10 20:50:27,363 D 81581 11854183] (gcs_server) gcs_job_manager.cc:226: Received NumPendingTasksReply from worker 01000000ffffffffffffffffffffffffffffffffffffffffffffffff
[2024-04-10 20:50:27,363 W 81581 11854183] (gcs_server) gcs_job_manager.cc:228: Failed to get is_running_tasks from core worker: GrpcUnavailable: RPC Error message: failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:0: Can't assign requested address; RPC Error details:
[2024-04-10 20:50:27,363 D 81581 11854183] (gcs_server) gcs_job_manager.cc:185: Finished getting all job info.

Note the core worker address is 127.0.0.1:0; port 0 is never a reachable destination, hence the GrpcUnavailable error above. Will look more at where this is populated.
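
(The log lines above are debug-level gcs_server output; they can be enabled by setting RAY_BACKEND_LOG_LEVEL=debug in the environment before the cluster starts.) Below is a rough sketch of confirming the bad address from Python. It assumes the JobTableData returned by get_all_job_info() exposes the same driver_address field that gcs_job_manager.cc uses to reach the core worker; that is an internal detail and may differ between Ray versions.

import ray

ray.init(address="auto")  # attach to the cluster the driver started

# Internal API; field names below (driver_address, ip_address, port) are
# assumptions based on the gcs_job_manager.cc code path, not a public interface.
client = ray._private.worker.global_worker.gcs_client
all_job_info = client.get_all_job_info()
for job_id in all_job_info:
    info = all_job_info[job_id]
    addr = info.driver_address
    # With the bug, the port prints as 0 for the driver that started the cluster.
    print(job_id, f"{addr.ip_address}:{addr.port}", info.is_running_tasks)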
