diff --git a/doc/source/_toc.yml b/doc/source/_toc.yml index 3e6e05475e5e..beca365eee78 100644 --- a/doc/source/_toc.yml +++ b/doc/source/_toc.yml @@ -24,6 +24,7 @@ parts: - file: ray-core/examples/overview title: "Examples" sections: + - file: ray-core/examples/gentle_walkthrough - file: ray-core/examples/monte_carlo_pi - file: ray-core/examples/plot_example-a3c - file: ray-core/examples/plot_example-lm diff --git a/doc/source/ray-core/examples/gentle_walkthrough.ipynb b/doc/source/ray-core/examples/gentle_walkthrough.ipynb new file mode 100644 index 000000000000..a63c7520f3c8 --- /dev/null +++ b/doc/source/ray-core/examples/gentle_walkthrough.ipynb @@ -0,0 +1,776 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "9cdddbae", + "metadata": {}, + "source": [ + "# A Gentle Introduction to Ray Core by Example" + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "9d4d0ecd", + "metadata": {}, + "source": [ + "Implement a function in Ray Core to understand how Ray works and its basic concepts.\n", + "Python programmers, from beginners to advanced users, can start working with\n", + "distributed computing in Python by learning the Ray Core API.\n", + "\n", + "## Install Ray\n", + "Install Ray with the following command:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "6115afbb", + "metadata": {}, + "outputs": [], + "source": [ + "! pip install ray" + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "be9d3c98", + "metadata": {}, + "source": [ + "## Ray Core\n", + "\n", + "Start a local cluster by running the following commands:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "import ray\n", + "ray.init()" + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "metadata": { + "collapsed": false + }, + "source": [ + "Note the following lines in the output:" + ] + }, + { + "cell_type": "markdown", + "id": "e75826f4", + "metadata": {}, + "source": [ + "```\n", + "... INFO services.py:1263 -- View the Ray dashboard at http://127.0.0.1:8265\n", + "{'node_ip_address': '192.168.1.41',\n", + "...\n", + "'node_id': '...'}\n", + "```" + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "4c837608", + "metadata": {}, + "source": [ + "These messages indicate that the Ray cluster is working as expected. In this example output, the address of the Ray dashboard is `http://127.0.0.1:8265`. Access the Ray dashboard at the address on the first line of your output. The Ray dashboard displays information such as the number of CPU cores available and the total utilization of the current Ray application.\n", + "This is a typical output for a laptop:\n", + "\n", + "```\n", + "{'CPU': 12.0,\n", + "'memory': 14203886388.0,\n", + "'node:127.0.0.1': 1.0,\n", + "'object_store_memory': 2147483648.0}\n", + "```" + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "cf542ed9", + "metadata": {}, + "source": [ + "Next is a brief introduction to the Ray Core API, which this tutorial refers to as the Ray API.\n", + "The Ray API builds on concepts such as decorators, functions, and classes that are familiar to Python programmers.\n", + "It is a universal programming interface for distributed computing.\n", + "The Ray engine handles the complicated work, allowing developers to use Ray with existing Python libraries and systems."
+ ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "83d1a3bf", + "metadata": {}, + "source": [ + "## Your First Ray API Example\n", + "\n", + "The following function retrieves and processes\n", + "data from a database. The dummy `database` is a plain Python list containing the\n", + "words of the title of the [\"Learning Ray\" book](https://www.amazon.com/Learning-Ray-Flexible-Distributed-Machine/dp/1098117220/).\n", + "The `time.sleep` call pauses for a certain amount of time to simulate the cost of accessing and processing data from the database." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "e053331e", + "metadata": {}, + "outputs": [], + "source": [ + "import time\n", + "\n", + "database = [\n", + " \"Learning\", \"Ray\",\n", + " \"Flexible\", \"Distributed\", \"Python\", \"for\", \"Machine\", \"Learning\"\n", + "]\n", + "\n", + "\n", + "def retrieve(item):\n", + " time.sleep(item / 10.)\n", + " return item, database[item]" + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "2b518f4f", + "metadata": {}, + "source": [ + "If the item with index 5 takes half a second `(5 / 10.)`, an estimate of the total runtime to retrieve all eight items sequentially is `(0+1+2+3+4+5+6+7)/10. = 2.8` seconds.\n", + "Run the following code to get the actual time:" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "b0091149", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Runtime: 2.82 seconds, data:\n", + "(0, 'Learning')\n", + "(1, 'Ray')\n", + "(2, 'Flexible')\n", + "(3, 'Distributed')\n", + "(4, 'Python')\n", + "(5, 'for')\n", + "(6, 'Machine')\n", + "(7, 'Learning')\n" + ] + } + ], + "source": [ + "def print_runtime(input_data, start_time):\n", + " print(f'Runtime: {time.time() - start_time:.2f} seconds, data:')\n", + " print(*input_data, sep=\"\\n\")\n", + "\n", + "\n", + "start = time.time()\n", + "data = [retrieve(item) for item in range(8)]\n", + "print_runtime(data, start)" + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "97aa047d", + "metadata": {}, + "source": [ + "The total time to run the function is 2.82 seconds in this example, but the time may differ on your computer.\n", + "Note that this basic Python version runs the function calls sequentially, one after the other." + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "30291db3", + "metadata": {}, + "source": [ + "You may expect that Python list comprehensions are more efficient, but the measured runtime of 2.8 seconds is actually the worst-case scenario: every call waits for the previous one to finish.\n", + "Although this program \"sleeps\" for most of its runtime, plain Python offers no convenient way to overlap these calls in a single process, in part because multithreading is constrained by the Global Interpreter Lock (GIL)." + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "2ebb32d4", + "metadata": {}, + "source": [ + "### Ray Tasks\n", + "\n", + "This task can benefit from parallelization. If it is perfectly distributed, the runtime should not take much longer than the slowest subtask,\n", + "that is, `7/10. = 0.7` seconds.\n",
+ "To extend this example to run in parallel on Ray, start by using the `@ray.remote` decorator:" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "id": "1e21e7c7", + "metadata": {}, + "outputs": [], + "source": [ + "import ray \n", + "\n", + "\n", + "@ray.remote\n", + "def retrieve_task(item):\n", + " return retrieve(item)" + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "935a4062", + "metadata": {}, + "source": [ + "With the decorator, the function `retrieve_task` becomes a {ref}`Ray task <ray-remote-functions>`.\n", + "A Ray task is a function that Ray executes on a different process from where\n", + "it was called, and possibly on a different machine." + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "78afc80a", + "metadata": {}, + "source": [ + "Ray is convenient to use because you can continue writing Python code,\n", + "without having to significantly change your approach or programming style.\n", + "Using the `@ray.remote` decorator on the `retrieve` function is the intended use of decorators,\n", + "and you did not modify the original code in this example." + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "c5dc2e18", + "metadata": {}, + "source": [ + "To retrieve database entries and measure performance, you do not need to make many changes to the code. Here's an overview of the process:" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "id": "a34697da", + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "2022-12-20 13:52:34,632\tINFO worker.py:1529 -- Started a local Ray instance. View the dashboard at \u001B[1m\u001B[32m127.0.0.1:8265 \u001B[39m\u001B[22m\n" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Runtime: 2.82 seconds, data:\n", + "(0, 'Learning')\n", + "(1, 'Ray')\n", + "(2, 'Flexible')\n", + "(3, 'Distributed')\n", + "(4, 'Python')\n", + "(5, 'for')\n", + "(6, 'Machine')\n", + "(7, 'Learning')\n" + ] + } + ], + "source": [ + "start = time.time()\n", + "object_references = [\n", + " retrieve_task.remote(item) for item in range(8)\n", + "]\n", + "data = ray.get(object_references)\n", + "print_runtime(data, start)" + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "8df4087c", + "metadata": {}, + "source": [ + "Running the task in parallel requires two minor code modifications.\n", + "To execute your Ray task remotely, you must use a `.remote()` call.\n", + "Ray executes remote tasks asynchronously, even on a local cluster.\n", + "The items in the `object_references` list in the code snippet do not directly contain the results.\n", + "If you check the Python type of the first item using `type(object_references[0])`,\n", + "you see that it is actually an `ObjectRef`.\n", + "These object references correspond to _futures_ for which you need to request the result.\n", + "You call `ray.get()` to request the result. Whenever you call `.remote()` on a Ray task,\n", + "it immediately returns one or more object references.\n", + "Consider Ray tasks as the primary way of creating objects.\n", + "The following section is an example that links multiple tasks together and allows\n", + "Ray to pass and resolve the objects between them."
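+ ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "3f3e1b2a", + "metadata": {}, + "source": [ + "If you want to see the object references for yourself, the following optional cell is a minimal check. It assumes the previous cell already ran, so `object_references` still holds the references returned by `retrieve_task.remote()`. It prints the type of the first reference and resolves only that single reference with `ray.get()`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "6b7c8d9e", + "metadata": {}, + "outputs": [], + "source": [ + "# Inspect a single object reference instead of the whole list.\n", + "first_reference = object_references[0]\n", + "print(type(first_reference))  # an ObjectRef, not the retrieved data itself\n", + "\n", + "# Resolving the future should yield (0, 'Learning').\n", + "print(ray.get(first_reference))"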
+ ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "2373ddd9", + "metadata": {}, + "source": [ + "Let's review the previous steps.\n", + "You started with a Python function, then decorated it with `@ray.remote`, making the function a Ray task.\n", + "Instead of directly calling the original function in the code, you called `.remote(...)` on the Ray task.\n", + "Finally, you retrieved the results from the Ray cluster using `.get(...)`.\n", + "Consider creating a Ray task from one of your own functions as an additional exercise." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "e008a500", + "metadata": {}, + "source": [ + "Let's review the performance gain from using Ray tasks.\n", + "On most laptops the runtime is around 0.71 seconds,\n", + "which is slightly more than the slowest subtask, which is 0.7 seconds.\n", + "You can further improve the program by leveraging more of Ray’s API." + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "54f53644", + "metadata": {}, + "source": [ + "### Object Stores\n", + "\n", + "The retrieve definition directly accesses items from the `database`. While this works well on a local Ray cluster, consider how it functions on an actual cluster with multiple computers.\n", + "A Ray cluster has a head node with a driver process and multiple worker nodes with worker processes executing tasks.\n", + "In this scenario the database is only defined on the driver, but the worker processes need access to it to run the retrieve task.\n", + "Ray's solution for sharing objects between the driver and workers or between workers is to use\n", + "the `ray.put` function to place the data into Ray's distributed object store.\n", + "In the `retrieve_task` definition, you can add a `db` argument to pass later as the `db_object_ref` object." + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "id": "da66a836", + "metadata": {}, + "outputs": [], + "source": [ + "db_object_ref = ray.put(database)\n", + "\n", + "\n", + "@ray.remote\n", + "def retrieve_task(item, db):\n", + " time.sleep(item / 10.)\n", + " return item, db[item]" + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "72f37eb4", + "metadata": {}, + "source": [ + "By using the object store, you allow Ray to manage data access throughout the entire cluster.\n", + "Although the object store involves some overhead, it improves performance for larger datasets.\n", + "This step is crucial for a truly distributed environment.\n", + "Rerun the example with the `retrieve_task` function to confirm that it executes as you expect." 
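+ ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "9e8f7a6b", + "metadata": {}, + "source": [ + "A minimal rerun could look like the following cell. It assumes that `db_object_ref`, the updated `retrieve_task`, and the `print_runtime` helper from the earlier cells are available. Ray resolves `db_object_ref` to the `database` list before the task body runs, so the results should match the previous run, with only a small amount of object store overhead on top." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "1c2d3e4f", + "metadata": {}, + "outputs": [], + "source": [ + "start = time.time()\n", + "object_references = [\n", + "    retrieve_task.remote(item, db_object_ref) for item in range(8)\n", + "]\n", + "data = ray.get(object_references)\n", + "print_runtime(data, start)"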
+ ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "453e312f", + "metadata": {}, + "source": [ + "### Non-blocking calls\n", + "\n", + "In the previous section, you used `ray.get(object_references)` to retrieve results.\n", + "This call blocks the driver process until all results are available.\n", + "This dependency can cause problems if each database item takes several minutes to process.\n", + "More efficiency gains are possible if you allow the driver process to perform other tasks while waiting for results,\n", + "and to process results as they are completed rather than waiting for all items to finish.\n", + "Additionally, if one of the database items cannot be retrieved due to an issue like a deadlock in the database connection,\n", + "the driver hangs indefinitely.\n", + "To prevent indefinite hangs, set reasonable `timeout` values when using the `wait` function.\n", + "For example, if you want to wait less than ten times the time of the slowest data retrieval task,\n", + "use the `wait` function to stop the task after that time has passed." + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "id": "75da06ec", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Runtime: 0.11 seconds, data:\n", + "(0, 'Learning')\n", + "(1, 'Ray')\n", + "Runtime: 0.31 seconds, data:\n", + "(2, 'Flexible')\n", + "(3, 'Distributed')\n", + "Runtime: 0.51 seconds, data:\n", + "(4, 'Python')\n", + "(5, 'for')\n", + "Runtime: 0.71 seconds, data:\n", + "(6, 'Machine')\n", + "(7, 'Learning')\n", + "Runtime: 0.71 seconds, data:\n", + "(0, 'Learning')\n", + "(1, 'Ray')\n", + "(2, 'Flexible')\n", + "(3, 'Distributed')\n", + "(4, 'Python')\n", + "(5, 'for')\n", + "(6, 'Machine')\n", + "(7, 'Learning')\n" + ] + } + ], + "source": [ + "start = time.time()\n", + "object_references = [\n", + " retrieve_task.remote(item, db_object_ref) for item in range(8)\n", + "]\n", + "all_data = []\n", + "\n", + "while len(object_references) > 0:\n", + " finished, object_references = ray.wait(\n", + " object_references, timeout=7.0\n", + " )\n", + " data = ray.get(finished)\n", + " print_runtime(data, start)\n", + " all_data.extend(data)\n", + "\n", + "print_runtime(all_data, start)" + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "cf6f00c3", + "metadata": {}, + "source": [ + "Instead of printing the results, you can use the retrieved values\n", + "within the `while` loop to initiate new tasks on other workers." + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "1a9f6be5", + "metadata": {}, + "source": [ + "### Task dependencies\n", + "\n", + "You may want to perform an additional processing task on the retrieved data. For example, \n", + "use the results from the first retrieval task to query other related data from the same database (perhaps from a different table).\n", + "The code below sets up this follow-up task and executes both the `retrieve_task` and `follow_up_task` in sequence." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 9, + "id": "f5734bb1", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "((0, 'Learning'), (1, 'Ray'))\n", + "((2, 'Flexible'), (3, 'Distributed'))\n", + "((4, 'Python'), (5, 'for'))\n", + "((6, 'Machine'), (7, 'Learning'))\n" + ] + } + ], + "source": [ + "@ray.remote\n", + "def follow_up_task(retrieve_result):\n", + " original_item, _ = retrieve_result\n", + " follow_up_result = retrieve(original_item + 1)\n", + " return retrieve_result, follow_up_result\n", + "\n", + "\n", + "retrieve_refs = [retrieve_task.remote(item, db_object_ref) for item in [0, 2, 4, 6]]\n", + "follow_up_refs = [follow_up_task.remote(ref) for ref in retrieve_refs]\n", + "\n", + "result = [print(data) for data in ray.get(follow_up_refs)]" + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "6e4a6945", + "metadata": {}, + "source": [ + "If you're unfamiliar with asynchronous programming, this example may not be particularly impressive.\n", + "However, at second glance it might be surprising that the code runs at all.\n", + "The code appears to be a regular Python function with a few list comprehensions.\n", + "\n", + "The function body of `follow_up_task` expects a Python tuple for its input argument `retrieve_result`.\n", + "However, when you use the `[follow_up_task.remote(ref) for ref in retrieve_refs]` command,\n", + "you are not passing tuples to the follow-up task.\n", + "Instead, you are using the `retrieve_refs` to pass in Ray object references.\n", + "\n", + "Behind the scenes, Ray recognizes that the `follow_up_task` needs actual values,\n", + "so it _automatically_ uses the `ray.get` function to resolve these futures.\n", + "Additionally, Ray creates a dependency graph for all the tasks and executes them in a way that respects their dependencies.\n", + "You don't have to explicitly tell Ray when to wait for a previous task to be completed––it infers the order of execution.\n", + "This feature of the Ray object store is useful because you avoid copying large intermediate values\n", + "back to the driver by passing the object references to the next task and letting Ray handle the rest." + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "f8722673", + "metadata": {}, + "source": [ + "The next steps in the process are only scheduled once the tasks specifically designed to retrieve information are completed.\n", + "In fact, if `retrieve_refs` was called `retrieve_result`, you might not have noticed this crucial and intentional naming nuance. Ray allows you to concentrate on your work rather than the technicalities of cluster computing.\n", + "The dependency graph for the two tasks looks like this:" + ] + }, + { + "cell_type": "markdown", + "id": "d03b8e46", + "metadata": { + "lines_to_next_cell": 2 + }, + "source": [ + "![Task dependency](https://raw.githubusercontent.com/maxpumperla/learning_ray/main/notebooks/images/chapter_02/task_dependency.png)" + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "e4001edb", + "metadata": {}, + "source": [ + "### Ray Actors\n", + "\n", + "This example covers one more significant aspect of Ray Core.\n", + "Up until this step, everything is essentially a function.\n", + "You used the `@ray.remote` decorator to make certain functions remote, but aside from that, you only used standard Python." 
+ ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "2a7a6e69", + "metadata": {}, + "source": [ + "If you want to keep track of how often the database is being queried, you could count the results of the retrieve tasks.\n", + "However, is there a more efficient way to do this? Ideally, you want to track this in a distributed manner that can handle a large amount of data.\n", + "Ray provides a solution with actors, which run stateful computations on a cluster and can also communicate with each other.\n", + "Similar to how you create Ray tasks using decorated functions, create Ray actors using decorated Python classes.\n", + "Therefore, you can create a simple counter using a Ray actor to track the number of database calls." + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "id": "717df7d0", + "metadata": {}, + "outputs": [], + "source": [ + "@ray.remote\n", + "class DataTracker:\n", + " def __init__(self):\n", + " self._counts = 0\n", + "\n", + " def increment(self):\n", + " self._counts += 1\n", + "\n", + " def counts(self):\n", + " return self._counts" + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "2e003cc6", + "metadata": {}, + "source": [ + "The DataTracker class becomes an actor when you give it the `ray.remote` decorator. This actor is capable of tracking state,\n", + "such as a count, and its methods are Ray actor tasks that you can invoke in the same way as functions using `.remote()`.\n", + "Modify the retrieve_task to incorporate this actor." + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "id": "6843b8d9", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[(0, 'Learning'), (1, 'Ray'), (2, 'Flexible'), (3, 'Distributed'), (4, 'Python'), (5, 'for'), (6, 'Machine'), (7, 'Learning')]\n", + "8\n" + ] + } + ], + "source": [ + "@ray.remote\n", + "def retrieve_tracker_task(item, tracker, db):\n", + " time.sleep(item / 10.)\n", + " tracker.increment.remote()\n", + " return item, db[item]\n", + "\n", + "\n", + "tracker = DataTracker.remote()\n", + "\n", + "object_references = [\n", + " retrieve_tracker_task.remote(item, tracker, db_object_ref) for item in range(8)\n", + "]\n", + "data = ray.get(object_references)\n", + "\n", + "print(data)\n", + "print(ray.get(tracker.counts.remote()))" + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "ae886162", + "metadata": {}, + "source": [ + "As expected, the outcome of this calculation is 8.\n", + "Although you don't need actors to perform this calculation, this demonstrates a way to maintain state across the cluster, possibly involving multiple tasks.\n", + "In fact, you could pass the actor into any related task or even into the constructor of a different actor.\n", + "The Ray API is flexible, allowing for limitless possibilities.\n", + "It's rare for distributed Python tools to allow for stateful computations,\n", + "which is especially useful for running complex distributed algorithms such as reinforcement learning." 
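+ ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "5a6b7c8d", + "metadata": {}, + "source": [ + "To sketch the idea of passing an actor handle into another actor, the following hypothetical `TrackerClient` actor receives the existing `tracker` handle in its constructor and forwards calls to it. The class and method names are made up for illustration and are not part of the example above." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7d8e9f0a", + "metadata": {}, + "outputs": [], + "source": [ + "@ray.remote\n", + "class TrackerClient:\n", + "    def __init__(self, tracker):\n", + "        # Store a handle to the DataTracker actor passed in from the driver.\n", + "        self.tracker = tracker\n", + "\n", + "    def record_access(self):\n", + "        # Wait until the tracker has processed the increment before returning.\n", + "        return ray.get(self.tracker.increment.remote())\n", + "\n", + "\n", + "client = TrackerClient.remote(tracker)\n", + "ray.get(client.record_access.remote())\n", + "\n", + "# The count should now be one higher than in the previous cell.\n", + "print(ray.get(tracker.counts.remote()))"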
+ ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "fb8bd0f5", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this example, you used only six API methods.\n", + "These include `ray.init()` to start the cluster, `@ray.remote` to turn functions and classes into tasks and actors,\n", + "`ray.put()` to transfer values into Ray's object store, and `ray.get()` to retrieve objects from the cluster.\n", + "Additionally, you used `.remote()` to execute actor methods and tasks on the cluster, and `ray.wait()` to process results as they become available instead of blocking until all of them finish." + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "0d8693e9", + "metadata": {}, + "source": [ + "The Ray API consists of more than these six calls, but these six are powerful if you're just starting out.\n", + "To summarize more generally, the methods are as follows:" + ] + }, + { + "cell_type": "markdown", + "id": "f2fb14bd", + "metadata": {}, + "source": [ + "- `ray.init()`: Initializes your Ray cluster. Pass in an address to connect to an existing cluster.\n", + "- `@ray.remote`: Turns functions into tasks and classes into actors.\n", + "- `ray.put()`: Puts values into Ray’s object store.\n", + "- `ray.get()`: Gets values from the object store. Returns the values you’ve put there or that were computed by a task or actor.\n", + "- `.remote()`: Runs actor methods or tasks on your Ray cluster and is used to instantiate actors.\n", + "- `ray.wait()`: Returns two lists of object references, one with finished tasks that you are waiting for and one with tasks that are still pending." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": false + }, + "source": [ + "## Want to learn more?\n", + "\n", + "This example is a simplified version of the Ray Core walkthrough from [our \"Learning Ray\" book](https://maxpumperla.com/learning_ray/).\n", + "If you liked it, check out the [Ray Core Examples Gallery](./overview.rst) or some of the ML workloads in our [Use Case Gallery](../../ray-overview/use-cases.rst)." + ] + } + ], + "metadata": { + "jupytext": { + "cell_metadata_filter": "-all", + "main_language": "python", + "notebook_metadata_filter": "-all" + }, + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.9.13" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/doc/source/ray-core/examples/overview.rst b/doc/source/ray-core/examples/overview.rst index 95364707dc48..00753e8b2090 100644 --- a/doc/source/ray-core/examples/overview.rst +++ b/doc/source/ray-core/examples/overview.rst @@ -105,6 +105,15 @@ Basic Examples --- :img-top: /ray-overview/images/ray_svg_logo.svg + +++ + .. link-button:: gentle_walkthrough + :type: ref + :text: A Gentle Introduction to Ray Core by Example + :classes: btn-link btn-block stretched-link + + --- + :img-top: /ray-overview/images/ray_svg_logo.svg + +++ .. link-button:: highly_parallel :type: ref