Add automated tests for demo code snippets #7

Merged: 10 commits merged into master on Sep 3, 2019

Conversation

lukpueh (Member) commented on Jul 18, 2019

This PR adds a script that extracts the demo code snippets from README.md and runs them in a shell, raising SystemExit if the output is not as expected.
Comparing outputs requires some minor modifications to the commands used (to make the output deterministic across platforms). The PR further

  • adds a Travis config file to automatically run the tests on all Python versions, and
  • updates the pinned in-toto version requirement to the latest in-toto release.

In the future, automatic builds will also be triggered when a new version of in-toto is released (using Dependabot).

In terms of testing, this is an alternative to the existing run_demo.py script (which just replicates the commands from README.md), with the advantage that the commands no longer have to be kept in sync.
If we decide to remove run_demo.py, we might want to add a simple clean command to remove files added during the demo, and consider live output (see comment about in_toto.process.run_duplicate_streams in run_demo_md.py).

RELATED WORK: This is a rather custom solution for a problem that was also discussed in theupdateframework/python-tuf#808. A more generic tool would need to allow different languages, e.g. shell and Python (using doctest), and maybe more options to customize the test success/fail conditions.
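
For illustration, here is a minimal sketch of the extract-and-compare idea described above. It is not the actual run_demo_md.py; the regex, the shell invocation, and the placeholder value of EXPECTED_STDOUT are assumptions made for the example.

```python
# Minimal sketch (not the real run_demo_md.py): extract ```shell snippets
# from README.md, run them in one shell session, and compare the combined
# output against a hardcoded expected string.
import re
import subprocess

# Placeholder; the real script would hold the full expected demo transcript.
EXPECTED_STDOUT = "..."

def main():
    with open("README.md") as readme_file:
        readme = readme_file.read()

    # Only blocks tagged ```shell are extracted; blocks tagged ```bash are
    # deliberately skipped, which is how snippets can be excluded from the test.
    snippets = re.findall(r"```shell\n(.*?)```", readme, re.DOTALL)

    proc = subprocess.run(
        ["sh", "-c", "\n".join(snippets)],
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        universal_newlines=True,
    )

    if proc.stdout != EXPECTED_STDOUT:
        raise SystemExit("FAIL: demo output does not match expected output")

if __name__ == "__main__":
    main()
```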

README.md Outdated
@@ -198,13 +202,13 @@ malicious code.

```shell
cd ../functionary_carl
echo "something evil" >> demo-project/foo.py
echo something evil >> demo-project/foo.py
```
Carl thought that this is the sane code he got from Bob and
adityasaky (Member) commented on Sep 2, 2019

Probably shouldn't be in this PR but maybe okay to sneak in?

sane -> same

lukpueh (Member, Author) replied:

I actually did mean to write "sane", but since you're the second person (#3) to stumble across this, I'll change it.

Does "genuine" sound right?

adityasaky (Member) replied:

Yes, I think genuine could work better than sane.

adityasaky (Member) commented:

I also noticed that #5 adds --verbose to in-toto-run. If the plan is to also add that to README, EXPECTED_STDOUT will have to be updated, correct? Is it worth doing it now?

lukpueh (Member, Author) commented on Sep 3, 2019

Regarding --verbose: I'd say we change this if we decide to deprecate run_demo.py.

Add a script that extracts the demo code snippets from README.md
and runs them in a shell, raising `SystemExit` if the output is
not as expected.

This is an automated testing alternative to the existing
`run_demo.py`, which replicates the commands from the demo
instructions, with the advantage that the commands don't have to be
synced.

NOTE: The script requires the in-toto version specified in
requirements.txt, i.e. 0.2.3 at the moment.

Adapt demo instructions to be used with the newly created script that
extracts snippets from fenced code blocks and runs them.

This commit marks two snippets for exclusion by specifying the snippet
language used for syntax highlighting as `bash` (`run_demo_md.py` only
extracts `shell` snippets).

Plus some minor cleanup.

`tree` behaves (sorts) differently on different platforms, which is a
problem when testing against an expected output. This requires changes
to the expected output of the automated demo run script.

This is necessary to get the same output in different environments,
which in turn is necessary to compare it to hardcoded expected
output.

Quoted strings are treated differently with `set -x` on different
systems and thus break the build, e.g. on Travis (Ubuntu):
`+ echo something evil`, and locally: `+ echo 'something evil'`.
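
For illustration only: an alternative to adjusting the README commands would have been to normalize the `set -x` trace before comparing it. The helper below is hypothetical and not part of this PR, which changes the commands instead.

```python
# Hypothetical alternative (NOT what this PR does): normalize xtrace lines
# before comparison, so platform-dependent quoting such as
# "+ echo 'something evil'" vs "+ echo something evil" does not break the
# comparison against the hardcoded expected output.
def normalize_trace(output):
    normalized = []
    for line in output.splitlines():
        if line.startswith("+ "):
            # Strip single quotes that some shells add around arguments
            # in their xtrace output.
            line = line.replace("'", "")
        normalized.append(line)
    return "\n".join(normalized)
```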
lukpueh merged commit f89c683 into master on Sep 3, 2019
cokieffebah added a commit to cokieffebah/in-toto-demo that referenced this pull request Feb 25, 2021