
Automated testing for conflicting and overridden features #38

Closed
danepowell opened this issue May 6, 2016 · 4 comments
Labels: Enhancement (A feature or feature request)
danepowell (Contributor) commented May 6, 2016

Now that we are starting to support Features for configuration management, one of the most useful things BLT could provide is an automated test for overridden and conflicting features. Right now, about 25% of the pull requests I review produce a Features conflict or override, and this is impossible to detect without manually installing the site and checking the Features UI.

The best approach would probably be a test that simply runs drush fl, although I'm not sure how easy it is to parse feature state from its output. It does support JSON output, and you can also restrict which fields to show (e.g. just the state), so I'm guessing it's doable.
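The JSON-parsing idea could look roughly like this. Note this is only a sketch: the exact JSON shape emitted by drush fl --format=json is an assumption here (a keyed map with a "state" field per feature), so the filter would need adjusting against real output on your drush version.

```shell
# Assumed output shape of `drush fl --format=json`; faked here for illustration.
fl_json='{"foo":{"state":"Default"},"bar":{"state":"Overridden"}}'
# In real use, something like: fl_json="$(drush fl --format=json)"

# Fail the check if any feature reports a non-default state.
if echo "$fl_json" | grep -Eq '"state": *"(Overridden|Conflict|Changed)"'; then
  feature_status="dirty"
else
  feature_status="clean"
fi
echo "feature status: $feature_status"
```

In a CI script, the "dirty" branch would exit non-zero to fail the build.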

You could also have a Behat test that just looks for "Changed", "Conflict", or "Added" on the Features UI page.
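A crude shell equivalent of that Behat check might scan the Features UI page markup for those state words. The admin path and the exact markup are assumptions; a real Behat scenario would log in, visit the page, and assert those strings are absent.

```shell
# Hypothetical helper: report whether the Features page HTML shows any
# non-default feature states ("Changed", "Conflict", "Added").
check_features_page() {
  # $1: page HTML; in real use: curl -s "$BASE_URL/admin/structure/features"
  if echo "$1" | grep -Eiq '>(Changed|Conflict|Added)<'; then
    echo "dirty"
  else
    echo "clean"
  fi
}

# Assumed cell markup, for illustration only.
sample_html='<td class="feature-state">Conflict</td>'
result=$(check_features_page "$sample_html")
echo "features page: $result"
```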

One reason this is so important is the need to detect schema changes (applied via updb) in core and contrib modules. If features are perpetually overridden anyway, it's much harder to detect when a legitimate schema change has occurred.

grasmash added the "ready" and "Enhancement" labels on Aug 15, 2016
seanpclark (Contributor) commented:
I'm taking a look at this. A Behat test may make more sense: from what I've seen, drush fl only shows the "Changed" state, whereas the UI page will also display "Conflict" and so on. I tried the following at first, but realized it wasn't catching the conflicts when I tested it on a project:
drush fl | grep -Ei '(changed|conflicts|added)( *)$'

seanpclark (Contributor) commented:

@grasmash or @danepowell: it doesn't look like we include Behat tests in BLT other than Examples.feature. I created a Behat test, but I'm having trouble figuring out how (or if) it fits into the BLT structure. Should we write a script in .travis.yml that checks whether features_ui is enabled and then runs the Behat tests? It seems likely that the deployment may not have the admin UI enabled.

Or would we just provide an example in template/tests/behat/features and leave it up to the developer to include the tests as needed?

danepowell (Contributor, Author) commented:

I'm guessing the second solution is the best one: provide an example that developers can include if it applies to their project.

danepowell (Contributor, Author) commented:

I incorporated this test into the feature import step.

We also need to figure out a way to test for schema updates, and that's not as easy as I initially thought, so we can defer that discussion to #842.
