Coverage report missing for untested files #2674
I think you might need to add `all: true`.
Hi, thanks so much for your comment! You're totally right, just adding `all: true` fixes it.

I should have noticed there was a "Config Reference" page, but for some reason (tiredness probably!) I missed it. I think I was expecting the coverage guide to mention this somewhere. I've created a PR to add a link, in case that's useful.

Is there any appetite for making `all: true` the default? Either that, or do what Jest does (though I'm not a fan of it): generating an empty report until a user adds files using `collectCoverageFrom`.

Open to thoughts/scrutiny and happy to help in any way I can :)
Maybe not. Currently Vitest exposes many coverage options in its API, maybe even too many. At some point it should be checked whether some of the options need to be configurable at all. The default value for `all` is currently `false`.

Jest's `collectCoverageFrom` requires users to opt files in explicitly.
I see - well, it does make sense to keep the options configurable and instead recommend that users set `all: true` themselves.
Yeah, actually I agree - perhaps this should be the recommended approach. Something like this would work:
Note, we don't need to set

```diff
 {
   ...
   coverage: {
     // Instrument all files
     all: true,
     // Include only .ts files
+    include: [
+      "**/*.ts"
+    ],
     // Optional - exclude generated source code from coverage reports
     exclude: [
       "**/*.generated.ts"
     ]
   }
 }
```

Given that
The decision for this comes down to whether you want users to be familiar with the defaults of the underlying coverage providers and their caveats, or whether you want things to "just work" out of the box. I'd personally lean towards the 2nd option, as it feels wrong to tell users to configure vitest to report coverage for all of their files - they'd expect it to do this anyway. Again, very happy to be corrected! :)
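As an aside, the `include`/`exclude` filtering in a config like the one above can be sketched roughly as follows. This is a toy matcher for illustration only - `globToRegExp` and `filterCoverageFiles` are hypothetical names, and the real coverage providers use full glob libraries:

```typescript
// Toy glob matcher: supports only the `**/` and `*` wildcards used in
// this thread. Not Vitest's real implementation.
function globToRegExp(glob: string): RegExp {
  const source = glob.replace(/\*\*\/|\*|[.+^${}()|[\]\\]/g, (m) => {
    if (m === "**/") return "(?:.*/)?"; // any directory depth, or none
    if (m === "*") return "[^/]*";      // anything within one path segment
    return "\\" + m;                    // escape regex metacharacters
  });
  return new RegExp(`^${source}$`);
}

// A file ends up in the coverage report if it matches at least one
// `include` pattern and no `exclude` pattern.
function filterCoverageFiles(
  files: string[],
  include: string[],
  exclude: string[]
): string[] {
  const inc = include.map(globToRegExp);
  const exc = exclude.map(globToRegExp);
  return files.filter(
    (f) => inc.some((r) => r.test(f)) && !exc.some((r) => r.test(f))
  );
}

// With `include: ["**/*.ts"]` and `exclude: ["**/*.generated.ts"]`:
const files = ["src/index.ts", "src/api.generated.ts", "src/style.css"];
const covered = filterCoverageFiles(files, ["**/*.ts"], ["**/*.generated.ts"]);
// covered is ["src/index.ts"]
```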
There's a good summary in
To make things easier for end users, I think we could remove the `all: false` default.
Ahh yes, that makes sense!

So I guess this should be as simple as:

```diff
 const coverageConfigDefaults = {
-  all: false,
+  all: true,
```
When you say some of the options should be checked, I think there would be most value by initially focusing on this change.
The other options could be reviewed separately? Though I do agree they should be reviewed.
Yes, exactly. There are also other configuration options that could be improved.
Sure, for now this change could be its own PR. Later, when going through other options, we might end up removing some of them entirely.
One major issue in defaulting `all` to `true`
@madeleineostoja that
Describe the bug
When running a coverage report using istanbul or c8, modules that aren't tested aren't shown in the report. This means I could have a partially tested codebase but still get a 100% code coverage report. This is particularly troublesome in a large codebase where a small number of files are not tested (e.g. after a refactor): they can quite easily be missed while the user sees 100% code coverage.
In Jest, this is usually solved with the "collectCoverageFrom" option, which allows specifying a pattern to match files that should be analysed for coverage. Perhaps a similar setting could be introduced.
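For reference, Jest's behaviour mentioned above is configured roughly like this - a sketch only, with illustrative glob patterns not taken from this issue:

```javascript
// jest.config.js - illustrative example of Jest's collectCoverageFrom.
module.exports = {
  collectCoverage: true,
  // Instrument these files even when no test imports them, so untested
  // modules still show up (at 0%) in the coverage report.
  collectCoverageFrom: [
    "src/**/*.{ts,tsx}",
    // `!` negates a pattern, excluding matched files from coverage
    "!src/**/*.d.ts",
  ],
};
```

With this, files matched by the patterns appear at 0% coverage rather than being silently absent from the report.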
Note: ErrorBoundary and FallbackComponent are used in the application but not imported by any `.test.*` module.

Reproduction
To reproduce, run a coverage report using vitest with one module that is tested and one that is not. The untested module does not appear in the coverage report, but it should, as it may still be used by the application. This can lead to false coverage reports.
If further reproduction is required, I'm happy to submit a minimal reproduction sometime soon, but I need sleep right now!
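A minimal reproduction would presumably look something like the following sketch - file names here are hypothetical:

```typescript
// Hypothetical layout:
//   src/tested.ts    - imported by src/tested.test.ts
//   src/untested.ts  - used by the app, but never imported by any test
//
// vitest.config.ts
import { defineConfig } from "vitest/config";

export default defineConfig({
  test: {
    coverage: {
      // Without `all: true`, running `vitest run --coverage` omits
      // src/untested.ts from the report entirely, so the summary can
      // show 100% despite the untested module.
      all: true,
    },
  },
});
```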
System Info
Used Package Manager
npm
Validations