[Inventory Management] New test proposal for decrement_items not in inventory

[Is this the right place for this kind of issue? I read that PRs should be discussed on the forum before being opened.]

I was mentoring a solution for Inventory Management and noticed that an implementation of the decrement_items function passed the tests even though it did not handle the case of an item existing in the list but not in the inventory dictionary.

I suggest adding this test to check the above case:

    @pytest.mark.task(taskno=3)
    def test_decrement_items_not_in_inventory(self):
        self.assertEqual(decrement_items({"iron": 3, "gold": 2},
                                         ["iron", "wood", "iron", "diamond"]),
                         {"iron": 1, "gold": 2})

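For context, here is a minimal hypothetical sketch (my own illustration, not the mentored solution itself) of the kind of implementation that slips past the current tests, next to one that handles items missing from the inventory by skipping them:

    # Hypothetical sketches for illustration only.

    def decrement_items_unsafe(inventory, items):
        """Treats unknown items as having a count of 0, so it records bogus
        negative counts instead of ignoring them."""
        for item in items:
            inventory[item] = inventory.get(item, 0) - 1
        return inventory

    def decrement_items_safe(inventory, items):
        """Only decrements counts for items already present in the inventory."""
        for item in items:
            if item in inventory:
                inventory[item] -= 1
        return inventory

    # decrement_items_unsafe({"iron": 3, "gold": 2}, ["iron", "wood", "iron", "diamond"])
    # -> {"iron": 1, "gold": 2, "wood": -1, "diamond": -1}   # fails the proposed test
    # decrement_items_safe({"iron": 3, "gold": 2}, ["iron", "wood", "iron", "diamond"])
    # -> {"iron": 1, "gold": 2}                              # passes it
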
I’ve got a PR ready to open here:

I hope this issue is relevant and that this is the right way to contribute to Exercism and the Python track.

Hi @izmirli :wave:

Thanks for posting this issue, and for suggesting a test case.

Unfortunately, I can’t see the full content of your code, since the link you provided only shows me the “compare and PR” page, and not the code diff.

However, based on what you’ve entered here, I am open to adding this to the exercise. But I want to make sure you follow the conventions already in the test file (which you may have done in your branch, but I can’t see it):

    @pytest.mark.task(taskno=3)
    def test_decrement_items_not_in_inventory(self):
        actual_result = decrement_items({"iron": 3, "gold": 2},
                                        ["iron", "wood", "iron", "diamond"])

        expected = {"iron": 1, "gold": 2}
        error_message = ('Called decrement_items({"iron": 3, "gold": 2}, '
                         '["iron", "wood", "iron", "diamond"]). The function '
                         f'returned {actual_result}, but the tests '
                         f'expected {expected}.')

        self.assertEqual(actual_result, expected, msg=error_message)

Yes - it is a bit kludgy, but setting the test up in this fashion means the student sees a more directed error message in the UI (since this is a concept exercise, and they can’t actually see the tests).
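
To make that concrete (a purely hypothetical failing result, not anyone’s actual submission): if a buggy solution returned the dictionary below, the msg= string above would render roughly as shown in the comments, and that rendered text is what the student sees in the web editor:

    # Hypothetical failing result, shown only to illustrate the rendered message.
    actual_result = {"iron": 1, "gold": 2, "wood": -1, "diamond": -1}
    expected = {"iron": 1, "gold": 2}

    # error_message then evaluates to (wrapped here for readability):
    #   Called decrement_items({"iron": 3, "gold": 2}, ["iron", "wood", "iron", "diamond"]).
    #   The function returned {'iron': 1, 'gold': 2, 'wood': -1, 'diamond': -1},
    #   but the tests expected {'iron': 1, 'gold': 2}.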

Once you submit the PR, it will be auto-closed. But don’t panic :slightly_smiling_face:, I’ll get notified and re-open it.

Let me know if you have any questions or issues. :smile:

Hi @BethanyG

Thanks for taking the time to explain. At first I didn’t see the error_message (in my local tests), but I did see it later in the forked repository. Anyway, to avoid potential conflicts, I synced the repository and redid it on a new branch.

Opened a new PR for it here (as you wrote, it auto-closed):

Is it OK?

For next time, should I open a PR before posting here?

Thank you for submitting that! :star2:

Reopened, approved, and merged. :smile:

No - unless it’s small typos or some such, in which case you can post here with a link to the closed PR.

For anything larger, it’s best to discuss here first, since not all changes are ones we might want on the track. :slightly_smiling_face: