Continuing examples #13
This might make the formatters and linters quite sad, though.

Happy to accept a change to fix this if it's not too complicated. How would you implement a fix?
The first thing you'd probably need is the ability to group examples, e.g. by file:

```python
import itertools

import pytest
from pytest_examples import CodeExample
from pytest_examples import EvalExample
from pytest_examples import find_examples

GROUPED_EXAMPLES = []
for _, group in itertools.groupby(find_examples("path/to/examples"), lambda ex: ex.path):
    GROUPED_EXAMPLES.append(list(group))


@pytest.mark.parametrize("examples", GROUPED_EXAMPLES, ids=str)
def test_grouped_examples(examples: list[CodeExample], eval_example: EvalExample):
    for ex in examples:
        eval_example.run(ex)
```

You can also get clever with:

````markdown
```python { test_group="1" }
x = 1
```

Some text

```python { test_group="1" }
assert x == 1
```
````

Where you could then gather only the specific examples that need to be run together:

```python
import itertools

import pytest
from pytest_examples import CodeExample
from pytest_examples import EvalExample
from pytest_examples import find_examples

GROUPED_EXAMPLES = []
UNGROUPED_EXAMPLES = []
for group_id, group in itertools.groupby(
    find_examples("path/to/examples"),
    lambda ex: ex.prefix_settings().get("test_group"),
):
    if group_id:
        GROUPED_EXAMPLES.append(list(group))
    else:
        UNGROUPED_EXAMPLES.extend(group)


@pytest.mark.parametrize("examples", GROUPED_EXAMPLES, ids=str)
def test_grouped_examples(examples: list[CodeExample], eval_example: EvalExample):
    for ex in examples:
        eval_example.run(ex)


@pytest.mark.parametrize("example", UNGROUPED_EXAMPLES, ids=str)
def test_ungrouped_examples(example: CodeExample, eval_example: EvalExample):
    eval_example.run(example)
```
For reference, I'm already using this strategy to skip testing certain docstrings by marking them with a setting in the fence header.
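That skip pattern can be sketched in isolation. The dicts below stand in for each example's parsed fence-header settings (what `prefix_settings()` returns in pytest-examples), and the `skip` key name is purely hypothetical:

```python
# Stand-ins for collected examples; "settings" mimics the dict parsed
# from a fence header like ```python { skip="true" }.
examples = [
    {"name": "ex1", "settings": {}},
    {"name": "ex2", "settings": {"skip": "true"}},  # marked to be skipped
    {"name": "ex3", "settings": {"test_group": "1"}},
]

# Keep only the examples not marked with the (hypothetical) skip flag.
to_run = [ex for ex in examples if not ex["settings"].get("skip")]
```

In a real test module you'd apply the same filter before handing the list to `pytest.mark.parametrize`, or call `pytest.skip()` inside the test when the flag is present.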
Would you consider supporting "continuing" examples that feature text in between code fences?
For example:
We set `foo` to `"bar"` to indicate bar. We can then print `"Quux!"`.
Currently, I'm trying some sort of hybrid setup with pytest-markdown-docs that supports it, but that more or less has to run every example twice. Also, their `` ```python continuation `` syntax doesn't quite render right in MkDocs.