The test currently requires that the student be removed from the grade(s) they're first added to and then added to the last one. Since changing the tests to better conform to the instructions would break existing implementations, I suggest changing the instructions to match the test instead.
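To make that concrete, here's a minimal Jest-style sketch of the behaviour I read the current tests as requiring; the `GradeSchool` class and its `add`/`grade` methods are just placeholders for whatever the exercise actually exposes:

```typescript
import { GradeSchool } from './grade-school' // hypothetical module path and API

it('moves a student added twice to the grade they were added to last', () => {
  const school = new GradeSchool()
  school.add('Aimee', 2)
  school.add('Aimee', 5)

  // As I read the current tests: the student is silently moved.
  expect(school.grade(2)).toEqual([]) // removed from the grade added first
  expect(school.grade(5)).toEqual(['Aimee']) // kept only in the grade added last
})
```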
I’ve created a PR which was auto-closed and sent me here…
The tests likely still need to be synced. It appears to have been several years since the last sync, and there are a few cases missing from the test suite.
If you still want to update the instructions, I'd add a track-specific instructions append file rather than editing the instructions file directly. The instructions are synced from the problem-specifications repo that all tracks use for practice exercise tests and documentation, so any track-specific edits to the instructions file would be overwritten the next time the upstream instructions change and are synced to the TypeScript repo.
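If I remember the v3 repo layout correctly, that append file would sit next to the synced instructions, roughly like this (exercise slug shown as a placeholder):

```
exercises/practice/<exercise-slug>/
└── .docs/
    ├── instructions.md          # synced from problem-specifications; edits get overwritten
    └── instructions.append.md   # track-specific additions, not touched by the sync
```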
That said, if you have edits that might be worthwhile to push to problem-specifications, more folks could benefit from that than just the TypeScript track. However, we’d definitely want to discuss those edits here on the forum before making the PR there.
I think the global instructions make more sense, so adjusting the tests to expect an exception to be raised would be preferable imho. I don’t know how much of a negative impact that would have, though. The thing is that we’d have to adjust the problem specification as well if we go that way - and thus likely most of the tracks…
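To illustrate, sketched against the same placeholder `GradeSchool` API used earlier in the thread (method names are guesses, not the actual track API), an exception-based test could look something like:

```typescript
it('rejects adding a student who is already in another grade', () => {
  const school = new GradeSchool()
  school.add('Aimee', 2)

  // Hypothetical expectation: the second add throws instead of moving the student.
  expect(() => school.add('Aimee', 5)).toThrow()
  expect(school.grade(2)).toEqual(['Aimee']) // the original placement is kept
})
```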
If you think we can change the tests, I see three options:

1. Locally override the instructions and/or the tests, diverging from the global exercise format.
2. Go for what's in the problem specification and
   - adjust the global instructions file - it's a bit confusing anyhow imho: I expected to be parsing input sentences and returning answers when I read it,
   - simply "flip" the grade we expect an empty array in (see the sketch after this list).
3. Adjust the problem spec to match the instructions and
   - adjust the tests to expect an exception,
   - notify all track maintainers to change their tests.
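For the "flip" variant, still using the placeholder API, the expectation would simply swap which grade ends up empty:

```typescript
it('keeps a student in the grade they were added to first', () => {
  const school = new GradeSchool()
  school.add('Aimee', 2)
  school.add('Aimee', 5)

  // Flipped expectation: the later add is ignored rather than moving the student.
  expect(school.grade(2)).toEqual(['Aimee'])
  expect(school.grade(5)).toEqual([]) // the empty array is now in the last grade
})
```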
To be honest, I think I'd

1. initially adjust the instructions locally to match the problem spec - it makes the exercise more interesting imho,
2. ask all track maintainers to check whether their tests match the problem spec - and, if not, potentially create an override to the instructions to keep them as-is locally once we change them globally,
3. change the instructions globally to match the ones we initially added locally and remove our local override.
What do you think? Happy to create PRs going either way.
If you intend to adjust the problem specs, you probably should do that first to reduce churn.
If you plan on adjusting the problem specs, please start a fresh discussion and lay out what you plan on adjusting and why. Changes there affect many/most tracks. There are over 60 tracks, each with their own set of maintainers. Changes which break existing solutions need to have a really good reason, as the cost of change is quite high. It literally costs money to rerun all the tests for all the submitted solutions.
If we do it this way round, then we don’t need to rerun the tests everywhere, which is a good thing.
Agree with this. I think opening a new forum post (in Exercism/Building) targeting the Prob Specs update is the right way forward.
Please copy and paste the change you’re making into that forum post so people can easily see what’s changing. That’ll make it easier/quicker to get consensus.
Please tag @erikschierboom in there, who will likely know the state of the exercise across tracks.