Is there somewhere I can see all the representer comments created by other mentors?
In the automation tab there's a "feedback submitted" filter, and it shows "0" for me, but I don't know whether that means I haven't created any representer comments, or that nobody in the track has.
No, you can't. But it's relatively easy for us to add, and I think it's both important and high priority, so I'll ask Erik and Aron to look at it this month.
Related to this, I got this automated feedback (from a mentor, but delivered via the representer) for little-sisters-essay in Python.
Consider what would happen if you run `print(replace_word_choice("The cat ate a mouse.", "a", "the"))`. This would print `"The cthet thete the mouse."` Is that what it ought to print? What other unexpected behaviors can be triggered with the right inputs?
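To make the quoted behaviour concrete, here's a minimal sketch. The function body is my own illustration of the kind of solution the feedback targets, not the exercise's stub; the `replace_whole_word` name and the regex approach are likewise just one possible fix:

```python
import re

# Naive version: str.replace swaps every occurrence of the substring,
# including inside other words, so "cat" becomes "cthet" and "ate" "thete".
def replace_word_choice(sentence, old_word, new_word):
    return sentence.replace(old_word, new_word)

print(replace_word_choice("The cat ate a mouse.", "a", "the"))
# The cthet thete the mouse.

# Word-aware sketch: \b word boundaries limit the match to whole words.
# (Capitalisation, punctuation, possessives, etc. would still need more work.)
def replace_whole_word(sentence, old_word, new_word):
    return re.sub(rf"\b{re.escape(old_word)}\b", new_word, sentence)

print(replace_whole_word("The cat ate a mouse.", "a", "the"))
# The cat ate the mouse.
```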
While this is kind of a good point, fixing it properly is quite a lot of extra work and isn't required by the tests or the instructions, so I think it probably isn't good feedback to show.
As far as I can tell, there is no way for the student to flag feedback to the track maintainers.
Also, track maintainers don't seem to have visibility into, or control over, the feedback, which would be useful for curating an ideal learning experience.
I actually appreciated this feedback. I initially solved the exercise quickly and didn't think to replace entire words, only plain substring occurrences. So when I received the feedback, I spent some additional time updating my solution.
While this is kind of a good point…
I think having these kinds of points as part of mentor feedback is probably the best approach. The alternatives would be to 1. not say anything; or 2. update the exercise and its tests to be stricter (a sketch of what option 2 might look like is below).
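For illustration, option 2 might be as small as one extra test case along these lines. This is hypothetical, not part of the current test suite, and the import assumes the exercise's usual `little_sisters_essay` module name:

```python
import unittest

# Assuming the exercise's usual module name; hypothetical test, not in the suite.
from little_sisters_essay import replace_word_choice


class ReplaceWordChoiceStrictTest(unittest.TestCase):
    def test_does_not_replace_substrings_inside_words(self):
        # "a" should be replaced only as a standalone word,
        # leaving "cat" and "ate" untouched.
        self.assertEqual(
            replace_word_choice("The cat ate a mouse.", "a", "the"),
            "The cat ate the mouse.",
        )


if __name__ == "__main__":
    unittest.main()
```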
I think having additional challenges to think about, even if not strictly covered by the tests, is a great way to let students who are so inclined explore the exercise further, without complicating it for everyone else.
But in general, I also agree that there should be a way to report feedback that is not helpful.