
Commit

chore: update json tooling docs
Co-authored-by: Sasha <[email protected]>
RuanJohn and sash-a authored Feb 27, 2024
1 parent b1a6a36 commit 82248f0
Showing 1 changed file with 1 addition and 1 deletion.
docs/json_tooling_usage.md (2 changes: 1 addition & 1 deletion)
@@ -4,7 +4,7 @@

The JSON logger will write experiment data to JSON files in the format required for downstream aggregation and plotting with the MARL-eval tools. To initialise the logger the following arguments are required:

- * `path`: the path where a file called `metrics.json` will be stored which will contain all logged metrics for a given experiment. Data will be stored in `<path>/metrics.json` by default. If a JSON file already exists at a particular path, new experiment data will be appended to it. MARL-eval does currently **NOT SUPPORT** asynchronous logging. So if you intend to run distributed experiments, please create a unique `path` per experiment and concatenate all generated JSON files after all experiments have been run.
+ * `path`: the path where a file called `metrics.json` will be stored which will contain all logged metrics for a given experiment. Data will be stored in `<path>/metrics.json` by default. If a JSON file already exists at a particular path, new experiment data will be appended to it. MARL-eval currently does not support asynchronous logging, so if you intend to run distributed experiments, please create a unique `path` per experiment and, once all experiments have been run, concatenate the generated JSON files with the provided `concatenate_json_files` function.
* `algorithm_name`: the name of the algorithm being run in the current experiment.
* `task_name`: the name of the task in the current experiment.
* `environment_name`: the name of the environment in the current experiment.
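For orientation, here is a minimal sketch of the workflow the diff above describes: one logger per experiment, each with its own `path`, with the resulting files merged afterwards via `concatenate_json_files`. Only the four constructor arguments come from the docs; the import path, the example values, and the `concatenate_json_files` call signature are assumptions and may not match the installed marl-eval version.

```python
# Minimal sketch, not a verbatim example from the docs.
# Assumption: JsonLogger and concatenate_json_files are importable from
# marl_eval.json_tools; additional constructor arguments (e.g. a seed)
# may be required depending on the marl-eval version installed.
from marl_eval.json_tools import JsonLogger, concatenate_json_files

# One logger, and therefore one unique path, per experiment run, since
# asynchronous logging to a shared metrics.json is not supported.
logger = JsonLogger(
    path="results/run_0",        # metrics land in results/run_0/metrics.json
    algorithm_name="ff_ippo",    # illustrative values, not prescribed by the docs
    task_name="2s3z",
    environment_name="smax",
)

# ... log metrics during the experiment, then, once every run has finished,
# merge the per-run JSON files for downstream aggregation and plotting.
# Argument names below are assumed for illustration.
concatenate_json_files("results/", "results/concatenated/")
```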
