yarn audit --json produces large amounts of data #7404
Comments
We're running
Looks similar to this problem, because the output was huge.
@nishils Step 3 should have
@tbezman yes, my bad. I have updated the original issue.
I am seeing
I think this error is related to this issue (jestjs/jest#8682). If other folks can confirm that they have this dependency in their problematic repos, that would be very helpful. The solution posted there, running npm audit fix, does not work for yarn, and yarn audit fix actually doesn't do anything (see this issue: #7075). There might be a way to filter test-only packages in yarn audit (see issue #6632). Another possible solution might be to create an npm package-lock.json file and convert it back to yarn.
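For reference, here is a minimal sketch of what that package-lock.json round trip could look like, assuming yarn 1.x (which ships the yarn import command) and npm 6 or later; treat it as an outline rather than a verified fix:

```bash
# Generate a package-lock.json from package.json without touching node_modules.
npm install --package-lock-only

# Let npm apply whatever compatible fixes it can, again only to the lockfile.
npm audit fix --package-lock-only

# Rebuild yarn.lock from the fixed package-lock.json.
# (yarn import refuses to run if yarn.lock already exists, so remove it first.)
rm yarn.lock
yarn import

# Drop the temporary npm lockfile if you don't want to keep both.
rm package-lock.json
```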
Can confirm that the impacted repository with the error has that dependency. However, wouldn't this impact
I think npm does a better job of filtering paths. Yarn seems to be outputting every single path whereas npm is consolidating them somehow. That is my best guess at the moment. It seems like if you run
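One way to sanity-check that guess is to count how often the same advisory is repeated across paths. This sketch assumes yarn 1.x, where yarn audit --json appears to emit one JSON object per line with a type of auditAdvisory per finding, and it assumes jq is installed; field names like data.advisory.id are assumptions from that format, not something this issue confirms:

```bash
# Capture the audit output once (it can be huge).
yarn audit --json > audit.ndjson

# Total advisory records emitted (one per vulnerable path).
jq -r 'select(.type == "auditAdvisory") | .data.advisory.id' audit.ndjson | wc -l

# How often each distinct advisory id is repeated; a large gap between this
# and the total above means most of the file is the same advisories restated.
jq -r 'select(.type == "auditAdvisory") | .data.advisory.id' audit.ndjson \
  | sort | uniq -c | sort -rn | head
```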
Just to close the loop on fixing the immediate issue. This issue is still relevant, as the amount of data that yarn can potentially output is still way too high.
Having the same problem. IMHO, as part of a bigger fix, it would be nice to group by vulnerability: if two packages share the same dependency at the same version, and that dependency has a vulnerability, it would make sense to report the vulnerability only once instead of twice. A list of paths for that vulnerability could then give more detail about which packages are vulnerable. So the idea would be to render the list grouped by vulnerability rather than by package, even though that sounds more like a feature than a bug fix. I'm suggesting this because, from what I'm seeing in a project I'm working on, the same vulnerability seems to be reported many times; I guess that could be at least one reason the JSON size keeps growing. PS: for me
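A rough sketch of that grouping, again assuming the yarn 1.x newline-delimited JSON format (auditAdvisory records with data.advisory and data.resolution.path fields) and jq; the field names are assumptions, not something confirmed in this thread:

```bash
# Group all advisory records by advisory id and print each vulnerability once,
# together with the unique dependency paths that lead to it.
yarn audit --json \
  | jq -s '
      [ .[] | select(.type == "auditAdvisory") ]
      | group_by(.data.advisory.id)
      | map({
          id: .[0].data.advisory.id,
          module: .[0].data.advisory.module_name,
          severity: .[0].data.advisory.severity,
          paths: ([ .[].data.resolution.path ] | unique)
        })
    '
```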
I totally agree, but from statements made by the current maintainers it seems that yarn audit work is being abandoned in favor of supporting it via plugins in the next generation of yarn. I'm guessing it will be a while before this gets resolved. Honestly, the easiest solution is to switch to npm, as npm seems to handle this more intelligently and also has more audit features.
- Print advisories object to reduce redundancy in JSON output (solves yarnpkg#7404)
I ended up on this issue after getting a
Of course I could add some parameters to
Adding `--groups dependencies` greatly reduces the size. This allowed me to pipe it to yarn-audit-html, which was failing my Jenkins build without it: `yarn audit --level high --groups dependencies --json | yarn-audit-html`. The output went from 5.3 GB to 346 KB.
Json output option for yarn audit is broken: yarnpkg/yarn#7404
Do you want to request a feature or report a bug?
Bug
What is the current behavior?
When running yarn audit against a couple of different projects and saving the output to a file, the resulting file size is around 30 MB. When running against the same projects but with the --json flag, the file size is around 19 GB. A factor of 10x, or even up to 100x, seems reasonable, but ~633x (roughly 19,000 MB / 30 MB) seems like a bit much.
If the current behavior is a bug, please provide the steps to reproduce.
1. Run `yarn audit > sample.txt` against this repo.
2. Run `ls -l sample.txt` to view the file size.
3. Run `yarn audit --json > sample.json` against the same repo.
4. Run `ls -l sample.json` to view the file size.

What is the expected behavior?
A file size similar to running yarn audit.
Please mention your node.js, yarn and operating system version.
yarn v1.16.0, node.js v12.6.0, up-to-date OSX.