I don’t know what these numbers mean, but the overall quality of the mentoring/reviewing process appears to be very low. The vast majority of people who submit their code simply don’t care at all: you spend time reviewing and they never respond. They also don’t know that it’s their responsibility to end the session; even after multiple notices, they still don’t bother to reply. I would say this happens a good 80-90% of the time, especially with people who have < 10 rep. They come to the site, they solve a couple of problems, and that’s it. So why should we spend our time reviewing?
I would like to see a minimum rep requirement before someone can request a code review. This would of course dramatically reduce the number of reviews, but I just don’t see any other way to solve the problem I’m describing.
Because we want to provide people with value. There’s no guarantee people will engage with your feedback. You could ask simple questions like, “What sort of feedback would you like?” before putting in a bunch of time and effort if you want to minimize the cost. But sometimes students just want a second opinion and don’t want to engage and that’s fine, too!
I wonder if your expectations are maybe too high. Some students will engage, some will not. Some will have specific questions, most will not. I’ve had students disappear for literally a year before responding: we don’t know what’s going on in their lives.
Putting up barriers for new students is literally the opposite of what Exercism is for.
I agree that this is a hobby learning platform, and I’ve read plenty of explanations that there may be situations where someone won’t reply soon. All accepted. But many of my sessions (about a third) never get closed, and most of those students don’t even reply before the session is closed. You can always say there’s surely a personal situation behind it.
I claim the process is broken. That it’s not clear how it works. That there could be automatic reminders for open sessions, and so on.
More than once I’ve accepted requests from students who thought they were required to go through the review process.
While I absolutely accept that there are personal situations, they can’t be the answer to broken workflows and processes.
I’m still a big fan of automatic closing. If you really have no time for multiple weeks, what’s the problem with starting a new session once you’re back? Or you could simply reopen the last one.
And to be very honest: when I mentor, I also want the rep points. I can’t get them without cleanly closed sessions, even when the session was a great success.
I agree that it should always be a completely open platform. But that also means being open to challenging current processes. I don’t like the current process either. And while it’s OK that students disappear, I would like to see automatic reminders and cleanup.
I don’t think it can be the goal of the platform to have thousands of old, open mentoring sessions.
The goal of the platform is to provide students with the tools to get really good at programming. If that means a whole lot of open sessions that students may or may not return to in a week or in a year, then I don’t think that’s a problem. If that’s not a workflow that works for you as a mentor, that’s fine.
I too feel that we shouldn’t block people from using mentoring - it’s a unique feature we tout, and putting up barriers is probably not a good idea. However, I do get your problem, although at least for me the numbers are significantly lower than yours - I’d estimate a fifth of people don’t reply, possibly fewer. What I do is give a quick overview in the first comment, mentioning any glaring problems and asking specifically which areas the student wants help with. If the student replies, you can generally gauge from their reply how “interested” they are - after which you can give the maximum possible amount of feedback. Alternatively, if you feel that most low-rep holders respond to mentoring negatively, you can always filter them out in your selection process.
Yeah, I too would like something like this. Less about the rep, more about tying up dangling threads. Perhaps email reminders before this happens.
It isn’t a problem per se - but it leads to clutter. Imagine a student returns after a year, does an exercise, and requests mentoring. The platform says that his mentoring queue is full - because of an open session left over from the previous year. He doesn’t remember the session and has no idea why this is happening. Viewing open sessions (mentoring slots) is also not intuitive for me - I didn’t realize it was there for a long time. You have to go to the track and scroll past a long list of recent activity before you reach it. I’d rather see open sessions than the number of exercises completed or concepts learned or whatever. Do you folks think that swapping the stats and the mentor queue would help?
It’s a learning platform for programming, and the first thing you should learn (at a job) is to finish tasks - not to open ten and work on everything in parallel, but to focus. While everybody has different motivations, I can hardly believe that you come back months later knowing you want to keep working on this particular problem. Time has passed and you’ve lost context. That doesn’t make sense to me from many perspectives, and the reason for coming back after so long really doesn’t matter here.
I mean, I still can and will close sessions by myself. No problem at all. But I guess we are talking about optimizing the experience of mentoring sessions.
And yeah, maybe mentors start to dislike it so much that they don’t want to take new sessions. That can’t be what anybody wants. So we should be very open to discussing changes and not just say everything works.
Thanks to everyone who posts their thoughts, much appreciated. While i understand that people can just get busy, my main concern isn’t really about them. It’s about those who spend some time on the website and then just decide to leave because they did not like something. It could be that they did not find the content engaging enough. It could be that they found the site too difficult to navigate. It could be that the mentoring they received was inappropriate and they just decided to leave in the middle of the session. It could be that they decided mentoring wasn’t really to their liking. It could be that they did not understand how reviewing works. The problem is that if they leave, there’s 0 feedback about what went wrong. And without that feedback there’s no way to improve and learn from any mistakes. I can’t just say “they are busy” or “it’s ok to leave”, those excuses are just not good enough for me.
It’s easy to say everything is fine when we are spending too much time here, but let’s not forget that beginners often have very different opinions.
So for me at least, a session that never ends is simply demotivating. I feel responsible for my reviews, and it also takes time to go over the code, identify the issues, and post suggestions. So ideally I’d like to know what went wrong.
The actions of those two groups are identical. The only difference is the intent. It’s pretty hard to tell which group a student is in based on their actions.
If you mentor solutions here, you’re inevitably going to get some of both over time. Don’t spend too much time here. Spend the amount of time you can afford to spend, and no more. As was pointed out, there are ways to mentor solutions with a larger or smaller time investment in the first post. Also, with practice, mentoring gets easier and faster, like all other skills.
Of course the current way works. Accepted. That doesn’t mean it can’t be improved. I don’t know of any established system where requests and sessions (dialogues) stay open forever.
Support systems, ticket systems, and many others will move a session into a closed or “hibernated” state after a while. That’s really nothing special.
And it was also mentioned earlier that students often don’t even know about their open sessions, or forget about them. Why shouldn’t that be improved, just because there are people who intentionally come back 6 or 12 months later?
By the way, a student’s overall activity on Exercism could be used to choose a more or less appropriate reminder message. When there is really no activity at all, some kind of hibernation would be great. But when there is activity on other exercises, or even new mentoring requests, there’s a good chance the student forgot about their other sessions or simply isn’t going to give feedback - a good case for reminders and eventually closing the session after a while. All of this is independent of giving current sessions better visibility on the website, of course.
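To make that concrete, here’s a rough sketch of the kind of decision logic I have in mind. It’s TypeScript with entirely made-up names and thresholds - nothing here corresponds to Exercism’s actual code or data model:

```typescript
// Hypothetical sketch: decide what to do with a stale mentoring discussion
// based on how long it has been silent and whether the student has been
// active elsewhere on the site. All names and thresholds are invented.

type SessionAction = "leave_open" | "hibernate" | "send_reminder" | "auto_close";

interface StaleSession {
  daysSinceLastStudentReply: number; // silence inside this discussion
  daysSinceAnySiteActivity: number;  // submissions, other sessions, logins, ...
  remindersSent: number;
}

function nextAction(s: StaleSession): SessionAction {
  if (s.daysSinceLastStudentReply < 14) {
    return "leave_open"; // still within a normal response window
  }
  if (s.daysSinceAnySiteActivity >= 30) {
    // Student seems to be away from Exercism entirely: hibernate instead of nagging.
    return "hibernate";
  }
  // Student is active elsewhere but silent here: remind, then eventually close.
  return s.remindersSent < 2 ? "send_reminder" : "auto_close";
}

// Example: silent here for three weeks, active elsewhere, two reminders already sent.
console.log(nextAction({
  daysSinceLastStudentReply: 21,
  daysSinceAnySiteActivity: 3,
  remindersSent: 2,
})); // -> "auto_close"
```

The exact thresholds don’t matter much; the point is that “silent here but active elsewhere” and “silent everywhere” deserve different handling.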
It would be something if, when a student hasn’t logged in for a while, the platform did a “Student has been away” kind of thing, and then “activated” them again once they log in, perhaps with some kind of message:
Welcome back! It’s been 4 months since we last saw you. Here are the things awaiting your response: 1., 2., 3., etc.
And as a mentor waiting on a student, it would be informative as well - an “it’s not so much that they’re ignoring you, they just haven’t been on in a while” type of indicator.
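Roughly what I’m picturing, as a sketch only - the types, field names, and the 60-day threshold below are all invented for illustration and aren’t Exercism’s actual API:

```typescript
// Hypothetical sketch of the "away" flow described above: if a student hasn't
// logged in for a while, flag their open sessions (which the mentor would see
// as a "student has been away" indicator), then on their next login build a
// welcome-back summary of everything awaiting their response.

interface OpenSession {
  exercise: string;
  mentor: string;
  studentAway: boolean;
}

interface Student {
  handle: string;
  lastSeenDaysAgo: number;
  openSessions: OpenSession[];
}

const AWAY_AFTER_DAYS = 60; // made-up threshold

// Flag every open session when the student has been inactive long enough.
function markAwayIfInactive(student: Student): void {
  const away = student.lastSeenDaysAgo >= AWAY_AFTER_DAYS;
  for (const session of student.openSessions) {
    session.studentAway = away;
  }
}

// Build the welcome-back summary shown on the student's next login.
function welcomeBackMessage(student: Student): string {
  const months = Math.round(student.lastSeenDaysAgo / 30);
  const pending = student.openSessions
    .map((s, i) => `${i + 1}. ${s.exercise} (mentored by ${s.mentor})`)
    .join("\n");
  return (
    `Welcome back! It's been ${months} months since we last saw you.\n` +
    `Here are the things awaiting your response:\n${pending}`
  );
}

// Example usage with a student who has been gone for roughly four months.
const student: Student = {
  handle: "new_learner",
  lastSeenDaysAgo: 120,
  openSessions: [
    { exercise: "clojure/two-fer", mentor: "mentor_a", studentAway: false },
    { exercise: "java/hamming", mentor: "mentor_b", studentAway: false },
  ],
};
markAwayIfInactive(student);
console.log(welcomeBackMessage(student));
```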
Based on their actions, yes, but more often than not there are also other clues that help us make an educated guess. These days, being busy very often means “I’m not interested, so I’m going to get busy doing other things.” I almost never take this excuse seriously, because literally everyone is busy.
When I started mentoring back in February, it was Clojure, and most people were in it for the 12in23 challenge. I almost never had a problem with the sessions: most of them ended quickly and communication was never an issue.
After February, the number of requests for Clojure dropped dramatically, so I decided to add Java and JavaScript. That’s when things started going downhill: communication problems, people dropping out in the middle of the session, and people who never bother to post a single comment. So what happened? Did people suddenly get busy? Probably not. I’m guessing that the people who were in it for the 12in23 challenge were more motivated and quite possibly more experienced; they persisted even when they were facing problems. On the other hand, Java and JavaScript are popular choices for complete beginners, and there’s a much greater chance that these people will drop out if something isn’t to their liking.
So please, let’s leave the “busy” excuse for another day. Can we pinpoint any other issues that are causing beginners to drop out? @All (mentors or not), please post what you believe could be causing problems. It could be something that was mentioned during a session, or something you’ve personally observed.