
"Hold" on new users edits #889

Open
logrady opened this issue Nov 21, 2016 · 17 comments

Comments

@logrady

logrady commented Nov 21, 2016

Based on feedback from mapathon organizers and validators (and the Training Working Group) we would like to suggest that a new user have a hold placed on their account after X number of edits. This would give validators a chance to check over their work and provide feedback before they move forward.

If we compiled a list of validators they could be notified through the TM (see other suggestions about messaging systems in the TM) of new users "holds" and review their work in a timely fashion. If necessary a new user could continue on a probationary period where their edits are checked again by an experienced validator or they could move on to edit in a more independent fashion.

@bgirardot
Contributor

I thought we agreed this was a bad idea: we would not block anyone from mapping while they wait for validation.

I definitely feel we can improve the validation tools in ways that will have a big impact, so more drastic measures are not needed :)

@logrady
Author

logrady commented Nov 22, 2016

How about a modified version where, once a new user hits a certain number of edits, a validator is contacted to review their work, but the new user can continue unobstructed? I believe this could form the basis of a mentoring system, which I think has been discussed before.

@Nick-Tallguy

I'd rather see a scheme whereby:

  • The TM checks for previously completed squares, account creation date, or some other significant indicator of a 'newbie'.
  • If the TM recognises the person as a 'newbie', then any square they work on (not complete - just work on) is flagged for more urgent attention by a 'feedback validator': someone who will provide feedback, not just validate or invalidate.
  • The TM should also provide the 'feedback validator' with some indication of any other feedback already given.
  • In addition, any validator should be able to flag a 'user' of any age as needing to appear on the feedback validator list.

This would allow newbies to continue contributing while making it more likely they receive feedback at an early stage.
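A minimal sketch of this flagging scheme, assuming hypothetical per-mapper fields (`completed_tasks`, `created_at`), invented thresholds, and a validator-maintained flag list — none of these names exist in the TM today:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical thresholds; the comment above deliberately leaves the
# exact 'newbie' criteria open.
NEWBIE_MAX_COMPLETED_TASKS = 4
NEWBIE_MAX_ACCOUNT_AGE = timedelta(days=30)

@dataclass
class Mapper:
    username: str
    completed_tasks: int
    created_at: datetime

def is_newbie(mapper: Mapper, now: datetime) -> bool:
    """A mapper counts as a 'newbie' if they have completed few task
    squares OR their account was created recently."""
    return (mapper.completed_tasks < NEWBIE_MAX_COMPLETED_TASKS
            or now - mapper.created_at < NEWBIE_MAX_ACCOUNT_AGE)

def needs_feedback_review(mapper: Mapper, validator_flags: set,
                          now: datetime) -> bool:
    """Any square a newbie works on (not just completes), or any square
    worked on by a mapper a validator has manually flagged, goes to the
    'feedback validator' queue."""
    return mapper.username in validator_flags or is_newbie(mapper, now)
```

The manual flag list covers the last bullet: a validator can put a 'user' of any age onto the feedback queue regardless of the automatic newbie check.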

@RAytoun

RAytoun commented Nov 22, 2016

I am with Nick regarding some indication of a newbie, and the problem that we only know when a tile has been completed by a newbie, not about those who map a few items but never complete a tile. So there would have to be some way of flagging that newbie's changesets for us to be able to follow where and when they are mapping.
Taking into account that most mappers never go on to be consistent or prolific mappers, and many only do a few edits, it would be great if we could see whether a newbie is continuing to map or is just a 'one off' mapper. I would encourage validators to give positive feedback and encouragement to mappers who have completed about four tiles, as they have shown real interest by then and you can look at their progress over the different tiles to give a better assessment. I do not exclude newer mappers from my own validations.

But there are not enough validators to effectively review all the newbies' work.
By creating a list of accredited validators we would have a pool of people who can be invited to take on the Lead Validator role for a task. They can then start validating that task, and when they notice that a mapper working on the same task is doing good work, that mapper can be asked to assist with validating it. This not only gives that 'good' mapper extra recognition for the work they are doing, it also gives them the confidence to fix the mistakes of other mappers. The Lead Validator can keep an eye on the tiles validated by the nominee validator to ensure standards are maintained, give the nominee help where needed, and offer positive encouragement where deserved. It also means the Lead Validator takes responsibility for seeing that validation of a task progresses and that comments are noticed.

@bgirardot
Contributor

Let's talk about this.

How could we best identify a new mapper?

Fewer than some number of changesets? If so, what number? Do we need a scale or a range (1-10 edits, 11-50, ...)? Is "less than 100 changesets" a meaningful measure? If so, how?

@logrady
Author

logrady commented Nov 22, 2016

Fewer than some number of changesets? If so, what number? Do we need a scale or a range (1-10 edits, 11-50, ...)? Is "less than 100 changesets" a meaningful measure? If so, how?

I think the best way to calculate it is to create a mathematical equation or algorithm that would represent a new user. Something like "new user = time_since_registration + number_edits", but I wonder if this needs to be calculated relative to what other contributors are doing.

For example, my OSM bio currently categorizes me as a "casual mapper". I know others who are classified as heavy mappers. I don't know what other classifications exist or how this is calculated; I'm assuming it's based on number of edits over a period of time.

Perhaps a first step would be to find out how that works and go from there, as I'm thinking the new-user algorithm might need to be defined relative to these other user categories.

@logrady
Author

logrady commented Nov 22, 2016

Correction: that should be something like "new user = time_since_registration / number_edits".
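As a sketch only, the corrected ratio could look like the following. The zero-edit guard is my addition, and which direction of the ratio should count as "new" is still an open question in this thread:

```python
def newness_score(days_since_registration: float, number_edits: int) -> float:
    """The corrected formula from the comment above:
    new user = time_since_registration / number_edits.
    max(..., 1) guards against dividing by zero for accounts
    that have no edits yet."""
    return days_since_registration / max(number_edits, 1)
```

For example, `newness_score(10, 5)` gives 2.0 days per edit; any threshold on that value would need tuning against real contributor data, possibly relative to the other user categories mentioned above.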

@dalekunce
Contributor

I fundamentally disagree that we should somehow stop mappers from mapping. I would rather focus on the actual quality of the edits, via the osm-analytics toolchain, than on grading the editor.

@bgirardot
Contributor

Yes, I do not think we will block any editing. There are a lot of other gains to be had in the validation/QA process that are more productive for people giving feedback. But a "newness" rating is a valuable metric, as many validators have expressed.

@aawiseman

I love @Nick-Tallguy's and @logrady's idea of pinging a validator (or validator group) after some amount of mapping!

@bgirardot
Contributor

I think pinging validators is a really good idea as well.

But I think we need to figure out what a "new mapper" is and exactly when is a good time to ping a validator.

And that leads me to better metrics that validators can use to find mappers that they would like to focus on.

And I fear pinging validators could mean thousands of pings at busy times (I would certainly hope so!), unless the metric filtered those out - and then I am back to tools for validators to find people to work with based on varied criteria. It might be different for different validators, or, who knows, it could even change between disaster and mm-type mapping.

I do not have a good vision for the UI either. Certainly, when viewing a project, color code or list the mappers with this information available:

osm sign up date
number of days mapped
number of changesets
number of objects created
last edit date
number of checkout tasks
number of completed tasks
number of validated tasks (tasks that have been validated; validation work done by the mapper will show elsewhere)

and a UI to select on those, for example an expanded UI that lets you set filters like this:

number of days mapped is <, =, > X
and
number of completed tasks <, =, > X

Then you get a list of mappers in that project with all their stats listed.

Or, simpler: task squares they have checked out in the project are turned blue so you can select them easily to review the mapping. (Bonus if you could select all their task squares at once to review en masse in JOSM, but that is not part of this issue :)
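The filter UI described above boils down to AND-combined comparisons over per-mapper stats. A sketch with invented field names and example data (the TM exposes no such table today):

```python
import operator

# Map the UI's comparison choices to Python comparison functions.
OPS = {"<": operator.lt, "=": operator.eq, ">": operator.gt}

def filter_mappers(stats, criteria):
    """Return mappers matching every (field, op, value) criterion,
    e.g. ("days_mapped", "<", 10) AND ("completed_tasks", "<", 5)."""
    def matches(row):
        return all(OPS[op](row[field], value) for field, op, value in criteria)
    return [row for row in stats if matches(row)]

# Invented example stats, mirroring some of the fields listed above.
stats = [
    {"username": "alice", "days_mapped": 2, "completed_tasks": 1},
    {"username": "bob", "days_mapped": 120, "completed_tasks": 40},
]
new_mappers = filter_mappers(stats, [("days_mapped", "<", 10),
                                     ("completed_tasks", "<", 5)])
```

Because the criteria are plain (field, operator, value) triples, different validators could save different filter sets, which fits the point above that the useful criteria may vary by validator and by mapping context.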

@logrady
Author

logrady commented Dec 6, 2016

I'm not sure if this would work, but what if people were to self-declare their learning level? I've used this method successfully when working as a trainer, but that was in person.

Or possibly a couple of quick questions after someone logs in to the TM (for X number of times), such as, "Are you comfortable identifying and mapping buildings (huts, roadways, etc.)?"

Or both these methods.

@majkaz

majkaz commented Dec 6, 2016

Just a note: I have noticed most tools use the easily visible number of changesets to evaluate a mapper. But this can be deceptive. Some of my changesets on HOT tasks are huge - sometimes one changeset per completely mapped task, by uploading into the same changeset throughout the edits. I am not sure if this is good practice, but it happens and works well for me. I don't mind having a relatively low number of changesets visible, but we should be aware of it: it is difficult to compare a changeset with 3 new objects to one like that. I have small changesets too (a few reverts, some deletes, or edits I expect might need reverting if something goes wrong), but mostly it is several thousand changes in one changeset.

Blake's earlier suggestion of counting the number of objects created is the only metric that does not open the way to false positives. Perhaps leave it as it is (number of changesets) but allow validators to mark a mapper "up" internally? Otherwise such a mapper gets the same "warning" on every other project they switch to as well.
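One way to count objects edited rather than changesets: OSM changeset metadata includes a per-changeset change count (the `changes_count` attribute in the API's changeset XML), which can simply be summed. A sketch over plain dicts standing in for fetched changeset records:

```python
def total_changes(changesets):
    """Sum per-changeset change counts, so one huge changeset weighs the
    same as many small ones with the same total edits. Assumes each
    record exposes a changes_count field, as OSM changeset metadata does."""
    return sum(cs["changes_count"] for cs in changesets)
```

Under this metric, one 3000-change changeset and thirty 100-change changesets score identically, which addresses the deceptiveness described above.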

@logrady
Author

logrady commented Dec 6, 2016

@majkaz I've written something very similar to your comments about changesets being a deceptive measure (see #894).

@aawiseman

aawiseman commented Dec 7, 2016

Could we use existing tools like the How Do You Contribute tool? http://hdyc.neis-one.org Here's me, for example: http://hdyc.neis-one.org/?Marion%20Barry If we can somehow access it, maybe we can just use its categories.

@logrady
Author

logrady commented Dec 7, 2016

@aawiseman I've mentioned Pascal Neis' work in #894.

Note to all: I'm trying to cross-reference as many entries as possible by their 3-digit issue number so we can keep track of the various suggestions across, as well as within, this "Issues" forum.

@aawiseman

#933 has a similar idea: notifying validators when squares are completed.
