What if tests and implementations were not included? #181
Comments
@mattcbaker Really interesting point. Having a standard set of tests helps in two ways:
A lot of our thinking with the reboot that we've been working on has been that Exercism is as much about the discussion as it is the code itself. I think most of the value comes from the feedback and refactoring cycles, rather than the initial implementation. With those caveats, I wouldn't be against a CLI flag that skips the default files and just gives you a README. Would that achieve what you're thinking?
@iHiD thanks for the reply. I believe that a CLI flag would accomplish the technical side of this. Do you think that mentors would be willing to review these kinds of submissions? PS - I agree, the value I get from Exercism is in the feedback, refactoring and mentoring. Thanks for all the great work!
I guess we'd have to see :) The whole formal mentoring thing is new (as in not yet launched) so it's all a bit guessworky (albeit informed guesswork) atm. I don't see why not though, conceptually. @kytrinyx As you're knee-deep in CLI now, would this be something you were interested in adding?
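For concreteness, here is a minimal sketch of how a "README only" download mode like the one proposed above could be wired up, using Go's standard flag package (the exercism CLI is written in Go). The flag name, the exercise slug option, and the file names are all illustrative assumptions; none of this is taken from the actual CLI code.

```go
// Hypothetical sketch only: a "README only" fetch mode for an exercise
// downloader. The flag name and file names are illustrative assumptions,
// not taken from the real exercism CLI.
package main

import (
	"flag"
	"fmt"
)

func main() {
	// --readme-only would skip the provided test suite and stub file,
	// leaving the student with just the problem description.
	readmeOnly := flag.Bool("readme-only", false, "download only the README, skipping tests and stubs")
	exercise := flag.String("exercise", "pangram", "exercise slug to fetch")
	flag.Parse()

	files := []string{"README.md"}
	if !*readmeOnly {
		// Conventional Go-track file names, used here only as an example.
		files = append(files, *exercise+"_test.go", *exercise+".go")
	}
	fmt.Printf("would download %v for %q\n", files, *exercise)
}
```

Invoked as `go run main.go -readme-only -exercise=pangram`, the sketch would list only README.md.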
Oh interesting. Had discussed whether to have implementations and suggested configurability at #114 (comment) but had not thought of using the same mechanism for the test files too. Will be interesting to see where this goes.
I think this is an interesting thing to consider. I would totally add functionality to the CLI for it, though I'd want to hold off until we settle the basic prototype before doing so (and maybe figure out some of the mentoring aspects as well). For the tests, the point of the test suites is not to guide the solution but to convey the specification. I'm not sure that we do that well across the board.
I would like to add that students should feel empowered to add tests if they feel it would help them towards the correct solution. If there is some aspect of their solution they are not sure about, that they are having a hard time working out, and an additional (intermediate) test might shed some light on the problem they are having... Maybe I am stating the obvious.
@kytrinyx could you elaborate on this?
In a poor test suite, the tests would tell you what to implement, and all the solutions would end up looking the same. In a good test suite, the tests just say what the correct outcome is, but you still have to figure out how to get to that outcome. If you look at the canonical test inputs/outputs for pangram, it's a pretty good example of just saying "it should be so", but it doesn't give you any direction as to how you should actually figure out if something is a pangram: https://github.com/exercism/problem-specifications/blob/master/exercises/pangram/canonical-data.json
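To make that distinction concrete, here is a small hand-written Go test in the same outcome-only spirit: it pins inputs to expected answers and says nothing about how the answer is computed. The `IsPangram` signature and the chosen cases are illustrative assumptions, not an excerpt from the canonical data or from any track's test suite.

```go
package pangram

import "testing"

// Outcome-only tests: each case states what the answer should be for a
// given sentence, but gives no hint about the algorithm used to decide it.
// Assumes an IsPangram(string) bool implementation exists in this package.
func TestIsPangram(t *testing.T) {
	cases := []struct {
		sentence string
		want     bool
	}{
		{"the quick brown fox jumps over the lazy dog", true},
		{"a quick movement of the enemy will jeopardize five gunboats", false}, // no 'x'
		{"", false},
	}
	for _, c := range cases {
		if got := IsPangram(c.sentence); got != c.want {
			t.Errorf("IsPangram(%q) = %v, want %v", c.sentence, got, c.want)
		}
	}
}
```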
Ah, I think I understand now. This comes back to my original question. It might be nice to encourage people to get better at coming up with the canonical list of tests for a given problem. It is definitely an important skill (and part of the critiques that I've heard); I just don't know if it is what exercism.io wants to focus on. I think I might try submitting solutions that don't use the provided test cases, just to see what happens. I'd be interested in seeing if people still take the time to review and offer feedback.
Totally agree.
Right, that's potentially separate from what we're trying to aim for (I haven't put any thought into this yet, so I don't have an answer to this question at the moment). That said, there is absolutely no reason why you shouldn't be able to base your practice off just the README.
If we were to not provide tests, the main benefit is that users would learn to write tests. However, I think it would have several not-so-great consequences:
I understand the question, but to me this is not what Exercism is about. To me, Exercism is about learning a language and trying out various approaches to a problem; it is not about learning how to write tests.
I will often implement a solution by writing my own tests from scratch, and then run against the Exercism-provided tests when I think I'm done.
I really like that pattern, @Insti.
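A sketch of that pattern, reusing the hypothetical `IsPangram` from the earlier example: the student keeps a scratch test file of their own alongside the provided one, probing whichever corner of the problem they are unsure about, and runs the Exercism-provided suite once their own tests pass.

```go
// scratch_test.go (hypothetical file name): a student's own tests, written
// before consulting the provided suite. This one probes a single uncertain
// corner of the solution (case handling) in isolation.
package pangram

import "testing"

func TestMixedCaseIsStillAPangram(t *testing.T) {
	sentence := "The Quick Brown Fox Jumps Over The Lazy Dog"
	if !IsPangram(sentence) {
		t.Errorf("expected %q to be treated as a pangram despite mixed case", sentence)
	}
}
```

Since `go test` picks up every `_test.go` file in the package, the scratch tests and the provided suite can be run together with a single command.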
Thanks, everyone, for the great discussion. Having had a year to think about this (and develop the new version of the site), I've concluded that while the idea is interesting, it's separate from what Exercism is trying to accomplish. I'm going to go ahead and close this, and if in the future there are more thoughts around this, I'd ask that you open a new issue at http://github.com/exercism/exercism.io/issues
One critique of exercism.io I've heard is that, since the test cases and initial implementations are included, it can feel like "code by numbers".
Has there been any discussion around not including test cases and initial implementations? Maybe a track for that? Or maybe just some exercises?
Or maybe exercism.io isn't interested in this at all.
What does everyone think?