
Open Data from the Bundesamt für Kartographie und Geodäsie vs. Crowdsourced OpenStreetMap in Germany – A Comparison of Open Data

After almost 1,000 days of abstinence, (finally?) another blog post from me. Due to its content and regional focus, this one was originally written in German. English version via Google Translate?

Preamble – In autumn 2020, an Open Data working group was founded within the FOSSGIS e.V. Through various joint activities between the working group and the Bundesamt für Kartographie und Geodäsie (BKG), such as a workshop, two datasets with the locations of the state police (Landespolizei) and the public health offices (Gesundheitsämter) were released in early December 2020 for the “maintenance and extension of the OpenStreetMap database”. The BKG also offers further interesting “open data” geodata and web services, which, however, must not be used by the OpenStreetMap (OSM) project due to their license terms.

An “official open” dataset from a federal agency? Good, so how does it compare to collaboratively collected data, e.g. OpenStreetMap? Can differences in quality be identified? Are the datasets perhaps on a par, are there serious differences, and what could both sides gain from each other?

To answer at least some of these questions, the obvious choice is a classic quality analysis between the two datasets. One interesting question here: which dataset is the reference source? Is the BKG dataset the reference, or by now perhaps the OSM dataset? In nearly all quality studies known to me, the “official” dataset is taken as the reference, so the following analysis is conducted the same way.

What was the methodological approach? The two BKG datasets examined here were obtained via GitHub. The OSM elements for the comparison were extracted for Germany from a current planet file with osmium (many thanks at this point to Jochen as the maintainer of this super-fast tool, and to its supporters). The actual quality analysis examined the following characteristics: completeness, logical consistency, positional accuracy, temporal accuracy and thematic accuracy. Various Java classes were used for this, most of which can be found on my GitHub.

What do the individual results of the dataset comparison look like in detail? Let’s start with the completeness of the two datasets in comparison:

  • Number of state police objects from the BKG: 4,257
  • Number of amenity=police objects in OSM: 3,871

At first glance, the BKG dataset thus contains around 10% more locations than were recorded in OSM as of 2022-02-03. The catch, however, lies in the OSM element type and tagging used, which led to deviating completeness results in the first version of this blog post.

Here the “official” BKG dataset contains around 35% more objects than can be found in OSM at a quick glance.

Logical consistency can be checked in several ways. In my example, the BKG and the OSM dataset were each examined against themselves regarding the presence of attributes. That means: the BKG state police dataset has 11 attributes and the health offices have 12 attributes. For the state police objects, apart from Telefax (73%) and E_Mail (52%), the attributes/properties are filled for at least 97% of the objects. For the BKG health offices, apart from Telefax (80%) and E_Mail (90%), the attributes are filled for at least 99%. In OSM the picture is different. Comparable properties, i.e. tags (key-value pairs), are filled with a value for the police locations in OSM as follows: name (86%), addr:street/housenumber/postcode/city (approx. 63%), phone (27%) and fax (8%). The OSM health offices look similar: here name (100%), addr:street/housenumber/postcode/city (approx. 78%), phone (14%) and fax (7%) are filled with a value.
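These fill rates boil down to counting, per attribute, the share of records with a non-empty value. A minimal sketch in Python (the original analysis used Java classes; the sample records and values below are invented for illustration):

```python
def fill_rates(records, keys):
    """Return the share of records that carry a non-empty value per key."""
    if not records:
        return {key: 0.0 for key in keys}
    return {
        key: sum(1 for r in records if r.get(key)) / len(records)
        for key in keys
    }

# Hypothetical sample: three OSM police stations with partial tagging.
stations = [
    {"name": "Polizeirevier Mitte", "phone": "+49 30 123456"},
    {"name": "Polizeiwache Nord", "fax": "+49 30 654321"},
    {"phone": "+49 30 111111"},
]

rates = fill_rates(stations, ["name", "phone", "fax"])
# name and phone are present in 2 of 3 records, fax in 1 of 3.
```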

To compare positional accuracy, a buffer with a radius of 500 m around each BKG state police or health office location was used to search for comparable objects in OSM. Within this radius, 87% of the BKG state police locations have a corresponding police element in the OSM dataset. For the health offices, 44% have an OSM entry.
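The buffer matching can be sketched as a distance check using the haversine formula. This is a simplified Python illustration of the idea only (the original tooling was written in Java, the coordinates below are made up, and a real analysis would use a spatial index instead of a brute-force scan):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points in metres."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def has_match_within(ref, candidates, radius_m=500.0):
    """True if any candidate point lies within radius_m of the reference."""
    return any(
        haversine_m(ref[0], ref[1], c[0], c[1]) <= radius_m
        for c in candidates
    )

# Hypothetical BKG location and two OSM points: one ~150 m away, one far off.
bkg_station = (50.1100, 8.6820)
osm_points = [(50.1113, 8.6825), (50.2000, 8.9000)]
matched = has_match_within(bkg_station, osm_points)
```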

The thematic accuracy was checked only with a minimalistic approach, by comparing the names of the objects linked via the positional accuracy step. This showed that only 25% (health offices) and 32% (state police) of the names match exactly between the BKG and OSM datasets. This quality characteristic could, or rather should, be investigated more thoroughly.
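Exact string comparison is a strict criterion: even trivial differences in casing or whitespace count as mismatches. A small Python sketch (with invented names) shows how a light normalization step would already raise the match rate:

```python
def normalize(name):
    """Lowercase, strip and collapse whitespace for a forgiving comparison."""
    return " ".join(name.lower().split())

# Hypothetical BKG/OSM name pairs for linked objects.
pairs = [
    ("Gesundheitsamt Frankfurt am Main", "Gesundheitsamt Frankfurt am Main"),
    ("Gesundheitsamt  Wiesbaden", "gesundheitsamt wiesbaden"),
    ("Gesundheitsamt Kassel", "Gesundheitsamt Stadt Kassel"),
]

exact = sum(1 for a, b in pairs if a == b)
relaxed = sum(1 for a, b in pairs if normalize(a) == normalize(b))
# exact matches: 1 of 3; normalized matches: 2 of 3.
```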

The BKG datasets were published in 2021. In OpenStreetMap, the timestamp of an element’s last modification is usually used to assess its currency, i.e. its temporal accuracy.

Additional info – the contributors of the OSM project: In OpenStreetMap, at least 1,428 different members have worked on the police location data. For the health offices, at least 120 people edited or extended the elements in some form (position or attribute information).

Short summary – so what is the point of this “comparison”? This blog post makes no claim to correctness or completeness. It nevertheless shows that, besides quantity (see completeness), particular attention in OSM should apparently be paid to the attributes, i.e. the details recorded for each entry. Which approach has become established in OSM in the past? At least in Germany, and not only in my opinion, no further data imports should take place. Instead, as already successfully practiced in some cities and countries, a kind of data comparison could be offered that lets interested and committed mappers compare the individual entries.

Datasets released in this way, such as those from the BKG, are excellently suited for checking and/or extending the data collected by the OpenStreetMap project. To mention it here as well: not only collaboratively collected data but also official data can contain errors or deviations. Therefore, such data or information should, if possible, not be adopted into OSM unchecked.

PS: This blog post makes no claim of being a scientific study; it simply came about on a whim on a Sunday morning over an espresso. I hope there were still a few interesting insights in it for you!

#100 – Thank you!

While I was working on my latest blog post, I realized that I had already written 100 posts over the past nine years. All posts have one thing in common: they are about the well-known and maybe never-ending OpenStreetMap project. From time to time there are still emerging questions or issues which must be tackled by someone. This has always fascinated me about OSM. However, this particular number 100 is not about a specific subject; it’s just a tiny post to say thank you! Thank you for your continuous interest in reading, commenting and of course sometimes criticizing my work. To me it’s still awesome to see that you, a few thousand people in total, use the tools and services that I implemented on a daily basis.

It’s still incredible that many people (not all) spend their spare time contributing to the project, not only as spatial data contributors but also as software engineers, system admins or coordinators of workshops, conferences or mapping events, or by just validating or reviewing the latest map changes. Some of my webpages wouldn’t be as successful without your feedback. So, thanks again! Finally, I would like to thank all the people who I have met at the different meetups, such as FOSSGIS or the OSM hack weekends, over the past couple of years. There have always been friendly, respectful and useful chats: it’s always a pleasure.

Thanks to maɪˈæmɪ Dennis.

New metric for measuring the “qualitative nature” of OpenStreetMap activities @ How did you contribute?

Back in June we had a twitter chat about potential new features for the “How did you contribute to OpenStreetMap” (HDYC) website. One suggestion was to “show more relevant information about skills, tagging system or the quality of contributions” of a project member (by J-Louis). Overall I really like the following summary by Claudius: “HDYC started off with a strong focus on quantitative metrics and you expanded it lately a lot to reflect the qualitative nature of contributions. I think there’s value to show more about which area of data someone contributed: Auto/bike/railway/water infrastructure, amenities…”.

So I finally started searching the OpenStreetMap (OSM) wiki for any feasible information about “groups of tags” or “tag categories”. Altogether, I couldn’t discover any solution that fits perfectly for determining the areas of data a mapper contributed in. However, later I got a hint from the JOSM developers to use the presets of the well-known and popular editor. You may ask, ‘What are presets?’ “Presets in JOSM are menu-driven shortcuts to tag common object types in OpenStreetMap. They provide you with a user friendly interface to edit one or more objects at a time, suggest additional keys and values you may wish to add to those objects, and most importantly, prevent you from having to enter keys and values by hand.” You can find many different presets at the aforementioned JOSM page. However, during my data processing I utilized the “default presets”. The XML file contains many popular or established tag combinations which contributors use when they are mapping.

So far so good. As a first step, I released a new version of “Find Suspicious OpenStreetMap Changesets”. It shows the utilized presets for each changeset. This can already indicate some quality aspects such as attribute (tag) accuracy or completeness. Now, after some weeks and some minor adjustments, I started to use this collected information about applied presets to expand the metrics of a mapper’s profile. The HDYC page now also lists which presets the mapper recently utilized during her/his contributions, such as adding, modifying or removing map elements. I think this is a really useful next step towards the quality assurance that we urgently need within the OSM project.
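Matching an element’s tags to preset names can be sketched as a lookup over the fixed key/value pairs of each preset. The following Python snippet uses a tiny, hand-written XML fragment that only mimics the structure of JOSM’s default presets file; it is not the real file and not my actual processing code:

```python
import xml.etree.ElementTree as ET

# A tiny fragment in the style of JOSM's defaultpresets.xml (simplified).
PRESETS_XML = """
<presets>
  <item name="Police">
    <key key="amenity" value="police"/>
  </item>
  <item name="Bakery">
    <key key="shop" value="bakery"/>
  </item>
</presets>
"""

def load_presets(xml_text):
    """Map each fixed key/value pair of a preset item to its preset name."""
    index = {}
    for item in ET.fromstring(xml_text).iter("item"):
        for key in item.findall("key"):
            index[(key.get("key"), key.get("value"))] = item.get("name")
    return index

def presets_for(tags, index):
    """Return the preset names matched by an element's tag dictionary."""
    return {index[kv] for kv in tags.items() if kv in index}

index = load_presets(PRESETS_XML)
matched = presets_for({"amenity": "police", "name": "Wache Mitte"}, index)
# -> {'Police'}
```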

Some technical details: The database behind the “Find Suspicious OpenStreetMap Changesets” webpage uses the augmented diff files of the Overpass API. The utilized “default” preset list of the JOSM editor can be found here (Internal Preset list). The entire processing tool was developed in Java and uses a Postgres database to store the results. Currently, only presets utilized within the past 60 days of a contributor’s activity are processed and presented.

Again, thank you very much for all your feedback. I hope that it helps.

Thanks to maɪˈæmɪ Dennis.

Additional insights about OSM changeset discussions: Who requests, receives and responds?

Last year I wrote two blog posts about the OpenStreetMap (OSM) feature that allows commenting on contributor map changes within a changeset. The first blog post showed some general descriptive statistics about the number of created changeset discussions, affected countries, the origin of the commenting contributors or their mapping reputation. The second post described a newly introduced feature, where contributors can flag their changeset so that their map edits can be reviewed. This blog post will follow up on this topic and conducts some similar but updated research.

The first chart shows the number of created comments (discussed changesets) and the contributors involved over the last 15 months. The number of created comments and discussed changesets fluctuates over time, whereas the number of contributors who take part in changeset discussions stays consistent at around 1,500 per month. Each month, around 3,200 contributors received a comment on the map edits of at least one of their changesets.

After publishing the aforementioned blog post, people were asking for some numbers that show the commented changesets grouped by the editing application that was utilized. The results show that these numbers stayed more or less the same, with 2/3 of all commented changesets (almost 160,000) being edited with the iD editor. This is not very surprising, since this particular editor is used by many OSM beginners for their first edits. It’s also interesting to see whether the changeset author responded (also grouped by the OSM editor that was used). Overall, only around 32,000 contributors responded to their changeset comments. You can find some additional charts about the comments per discussed changeset in the previous blog post. Again, the majority (around 71%) of the changeset discussions contain only one comment.

Since last August, contributors can mark their changesets with a “review_requested” flag. After a few months, I think it’s time for a first look at the numbers. The following charts display the number of requested reviews by contributors and their marked changesets. First of all, almost every month around 7,000 contributors asked for at least one review. Overall, almost 36,000 changesets have been marked for review each month. If we take a closer look and filter changesets by hashtags, we can see that sometimes large numbers of the changesets are contributed by #HOTOSM or #MissingMaps members.

The following diagram shows probably the most disappointing result: the number of requested reviews that actually have been reviewed in the end. No matter whether the changeset carries the #HOTOSM or #MissingMaps tags or not, the relative share of reviewed changesets lies between only 6 and 18%. To be honest, I’m also a bit surprised that only a few #HOTOSM or #MissingMaps changesets have been reviewed so far.

So, what do you think? Do you review contributions without commenting on the changesets? Do we need more attention here, or is it just boring to look after changesets which are marked for review? I think it’s obvious that we need more contributors who review map changes, or at least “document” their work. But can we handle this? Or do we need better tools?

Thanks to maɪˈæmɪ Dennis.

Adding Indicators to OSM Map Edits Assessment

Almost two years ago I published a web service that finds suspicious OpenStreetMap (OSM) map changes. You can use the service here and find some more information in previous blog posts. Changeset discussions in particular have proven to be the more or less de facto standard for communication between contributors during map change reviews.

However, when I am inspecting map changes, I sometimes see new contributors using uncommon OSM tags. Therefore I think it could be useful to add an additional assessment parameter to the aforementioned suspicious OSM map changes page. The newly introduced indicator states the matching ratio between the contributed and the most popular OSM tags. This means that if the changeset contributor used many uncommon tags on her/his changed objects, the matching ratio will be low. If the contributor applied many common (“popular”) tags, this results in a high matching ratio towards 100%. For the calculation I used Jochen Topf’s taginfo API to get commonly used OSM tags. An API description can be found here. Furthermore, I added the average age (in days) of modified and deleted objects. This indicator shows whether the contributor edited objects which were mapped today (0 days) or have existed for a longer period of time, e.g. 1566 days. The values for the average version numbers are computed in a similar fashion.
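The matching ratio can be thought of as the share of a changeset’s key/value pairs that appear in a popular-tag list. A minimal sketch (in the real tool the popular tags come from the taginfo API; here a tiny hardcoded set stands in, and the sample tags are invented):

```python
# Stand-in for a popular-tag list; the real tool fetches this via taginfo.
POPULAR_TAGS = {
    ("building", "yes"), ("highway", "residential"),
    ("amenity", "police"), ("natural", "tree"),
}

def matching_ratio(contributed_tags):
    """Share (in %) of contributed key/value pairs found in the popular set."""
    if not contributed_tags:
        return 0.0
    hits = sum(1 for kv in contributed_tags if kv in POPULAR_TAGS)
    return 100.0 * hits / len(contributed_tags)

ratio = matching_ratio([
    ("building", "yes"),          # common
    ("highway", "residential"),   # common
    ("building", "shed123"),      # uncommon, likely a typo
    ("fixme", "check position"),  # uncommon
])
# 2 of 4 tags are popular -> 50.0
```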

Last but not least, the number of contributors affected by the changeset is calculated. If a contributor only changes objects on which she or he is the latest modifier, this number will be ‘0’. Otherwise the value represents the number of unique mappers whose contributions have been changed. I hope that the newly added indicators can be useful for identifying changesets which need a closer look. The suspicious OSM map changes website has also received some style updates. They should help to highlight the most important parameters. I also added an aggregation of the latest changesets for a specific contributor. I guess this could be really useful to see the “big picture” of individual mapping activities.

The aforementioned service is online here –> “Find Suspicious OpenStreetMap Changesets”

Thanks to maɪˈæmɪ Dennis.

Public profiles on “How did you contribute to OSM?”

The web page How did you contribute to OpenStreetMap? (HDYC) provides detailed information about individual project members. Some time ago, the page was revised so that member profiles can only be accessed when users are logged in with their OpenStreetMap (OSM) user account. This feature was implemented after a long and important discussion about “protecting user privacy in the OSM project”. The complete discussion (in German) can be found here. However, I don’t want to continue that discussion here. I still believe that any information which is available about contributors should not be hidden in project data dumps, APIs or on webpages. In my opinion, information such as contributor names or IDs and modification timestamps is essential for quality analyses and assessments that protect the project against e.g. vandalism or unintended map edits.

Anyway, after the last modification, which required the mentioned user login on HDYC, I got positive and also negative feedback. Most of the negative feedback concerned the fact that profiles are now hidden and no longer public. Because contributors want to show their mapping efforts, I implemented a new feature that allows profiles to be accessed on HDYC without a user login. So, if you want anyone to be able to access and see your OSM profile, just add a link to your HDYC profile on your OSM profile page, similar to what you may already have done for your OSM-related accounts (see blog post). The tool-chain checks the profiles of every contributor who has been active within the last 24 hours.

Additionally, the HDYC web page got several small updates. The overall ranking has been switched to more meaningful recent country rankings. The “last modifier of” amounts have been temporarily removed and replaced by detailed numbers of created and modified way elements. The changeset table now also contains some really useful hints about the words and hashtags used in changeset comments, and their frequencies. This feature was requested by a German contributor, thanks “!i!”. Most of the displayed numbers should be updated on an hourly basis. The activity areas and the information about changesets, however, are only updated every 24 hours. Some numbers also contain links to further statistics, such as detailed information about recent changesets, the ranking lists of a country and commented or discussed changesets. Overall I tried to highlight further efforts and activities, such as changeset discussions, related accounts or roles, and not “only” raw mapping element amounts.

Thanks to maɪˈæmɪ Dennis.

Review requests of OpenStreetMap contributors
– How you can assist! –

The latest version of the OpenStreetMap editor iD has a new feature: “Allow user to request feedback when saving”. This idea was mentioned in a diary post by Joost Schouppe about “Building local mapping communities” (at that time: “#pleasereview”) in 2016. The blog post also contains some additional good thoughts, definitely worth reading.

However, based on the newly implemented feature, any contributor can flag her/his changeset and ask for feedback. Now it’s your turn! How can you find and support those OSM’ers?

  • Step 1: Based on the “Find Suspicious OpenStreetMap Changesets” page you can search for flagged changesets, e.g. limited to your country only: Germany or UK.
  • Step 2: Leave a changeset comment in which you e.g. welcome the contributor and (if necessary) give her/him some feedback about the map changes. You could also add some additional information, such as links to wiki pages of tags (map features), good mapping practices, the OSM forum, OSM help or mailing lists. Based on the changeset comment, other contributors can see that the original contributor of this changeset has already been provided with some feedback.
  • Step 3: Finally you could create & save a feed URL of your changeset’s search. That’s it.

Personally, I really like this new feature. It provides an easy way to find contributors who are asking for feedback about their map edits. Thanks to all iD developers for implementing this idea. What do you think? Should I add an extra score to “How did you contribute to OpenStreetMap” where every answer to a requested feedback changeset is counted?

Some statistics? There you go: “OSM Changesets of the last 30 Days”

Thanks to maɪˈæmɪ Dennis.

Detecting vandalism in OpenStreetMap – A case study

This blog post is a summary of my talk at the FOSSGIS & OpenStreetMap conference 2017 (German slides). I guess some of the content might be suitable for a research article; however, here we go:

Vandalism is (still) an omnipresent issue for any kind of open data project. Over the past few years the OpenStreetMap (OSM) project data has been implemented in a number of applications. In my opinion, this is one of the most important reasons why we have to bring our quality assurance to the next level. Do we really have a vandalism issue after all? Yes, we do. But first we should take a closer look at the different vandalism types.

It is important to distinguish between different types of vandalism. Not each and every unusual map edit should be considered vandalism. Based on the OSM wiki page, I created the following breakdown. Generally speaking, vandalism can occur intentionally and unintentionally. Therefore we should distinguish between vandalism and bad map editing behavior. Oftentimes new contributors make mistakes which are not vandalism, simply because they lack expert mapping knowledge. In my opinion, only intentional map edits such as mass deletions or “graffiti” are real cases of vandalism.

To get an impression of the state of vandalism in the OSM project, I conducted a case study over a four-week timeframe (between January 5th and February 12th, 2017). During my study I analyzed OSM edits, mostly deletions of objects and edits by new contributors who created fictitious data or changesets for the Pokémon game. If you have not heard or read about OSM’s Pokémon phenomenon, you can read more about it here. The OSM wiki page for quality assurance lists some tools that can be used for vandalism detection. However, for this study I applied my own OSM suspicious webpage and the quite useful augmented OSM change viewer (Achavi). Furthermore, a webpage that lists the newest OSM contributors may also be of interest to you.

So what can you do when you find a strange map edit that could be a case of vandalism? The OSM help page contains an answer for that. First of all: keep calm! Use changeset comments and ask in a friendly manner about the reasons for the suspicious mapping.

Results of the study: Overall, I commented on 283 changesets in the aforementioned timeframe of four weeks. Unfortunately I did not count the number of analyzed changesets, but I assume that it should be around 1,200 (+- 200). The following chart shows the commented changesets per day. The weekends tend to have a larger number of commented/discussed changesets.

As mentioned in the introduction of the vandalism types, we should distinguish between different types of vandalism. The following image shows the numbers for each category. In my prototype study, 45% of the commented changesets were vandalism related, and 24% had already been reverted without this being documented in the changeset discussion. Sometimes I also found imported test and fictitious data which the initial contributor of the changeset didn’t revert. It should be clear to everyone that the live database should never be used for testing purposes. Interested developers can use the test API and a test database (see sandbox for editing).

Responses and spatial distribution: Overall, I received 70 responses to the discussed changesets, sadly only 20 from the owner/contributor of the changeset. But more or less every response was friendly. Most often the contributors wrote “thank you” or “I didn’t know that my changes are going to be saved in the live database”. Furthermore, if I received a response, it arrived within 24 hours.

The following map contains some clustered markers. Each one highlights areas where the discussed changesets are located. As you can see on the map, the commented changesets are spread almost all over the world. In some areas they tend to correlate with the number of active OSM’ers. However, here is some additional information about three selected areas: 1: USA – Several cases of Pokemon Go related and fictitious map edits. 2: Japan/China – Some mass deletions and 3: South Africa – Oftentimes new MissingMaps or HOT contributors tend to delete and redraw more or less the same objects such as buildings. I guess it was not explained well enough to these editors that this destroys the object history? However, the article about “Good practice” in the OSM wiki is quite useful in this case.

Conclusion: The study reveals that there is an ongoing issue with vandalism in OSM’s map data. I think we do need to simplify the tools for detecting vandalism. In particular, we should avoid duplicated work where several users review identical suspicious map edits. Maybe the best possible solution would be a tool which is integrated directly into the OSM.org infrastructure. However, my presentation also contained some statistics and charts about the OSM changeset discussion feature. This will be the content of a separate blog post in the following weeks. Also, the prototype introduced at the end of my talk will (hopefully) be presented in the next few months.

Thanks to maɪˈæmɪ Dennis.

Reviewing OpenStreetMap contributions 1.0 – Managed by changeset comments and discussions?

The OSM project still records around 650 new contributors each day (out of almost 5,000 newly registered members per day). Some countries (such as Belgium or Spain) already provide platforms to coordinate the introduction of new mappers to OSM. Others use special scripts or intense manual work to send newly registered contributors mails with useful information (Washington or the Netherlands). However, oftentimes new contributors make, as expected, beginner mistakes. Personally, I often detect unconnected ways, wrong tags or, more rarely, fictitious data. Unfortunately, sometimes (new) members also delete, intentionally or unintentionally, existing map data.

At the end of 2014, many people were anticipating the newly introduced changeset discussions feature. A few months later, I developed a page that finds the latest discussions around the world or in your country. By now, many OSM members use changeset discussions for commenting or questioning map edits of other members.

[Image: main]

However, one year ago, almost to the day, I wrote a blog post about a webpage for detecting suspicious OSM edits. In the newly updated version, I would like to combine the aforementioned changeset discussions and comments about suspicious edits to communicate with members in a more direct way. The following image shows the revised webpage.

[Image: map]

Furthermore, you can request all changesets of a contributor which have been commented on. The same page can also show all comments written by a selected contributor (with all comments of the particular changeset). I think these last two features are really helpful for keeping track of your own and other changeset discussions. This should also simplify the reviewing process of changesets and map edits.

[Image: overview]

As mentioned at the beginning of this blog post, some OSM groups send a welcoming e-mail to new contributors. I also saw that some mappers are welcoming new members in Taiwan with a changeset comment and information on their first changeset. Pretty neat stuff if you ask me.

Latest OSM Changeset Discussions: http://resultmaps.neis-one.org/osm-discussions
Find Suspicious OSM Changesets: http://resultmaps.neis-one.org/osm-suspicious

Thanks to maɪˈæmɪ Dennis.

A comparative study between different OpenStreetMap contributor groups – Outline 2016

Over the past few years I have written several blog posts about the (non-) activity of newly registered OpenStreetMap (OSM) members (2015, 2014, 2013). Similarly to the previous posts, the following image shows the gap between the number of registered and the number of active OSM members. Although the project still shows millions of new registrations, “only” several hundred thousand of these registrants actually edited at least one object. Simon showed similar results in his yearly changeset studies.

[Image: 2016members]

The following image shows that the project still has some loyal contributors. More specifically, it shows the increase in monthly active members over the past few years and their consistent data contributions, based on each member’s first and latest changeset:

[Image: 2016months]

However, this time I would like to combine the current study with some additional research. I tried to identify three different OSM contributor groups, based on the hashtag in a contributor’s comment or the utilized editor, for the following analysis:

  1. Contributors of the MissingMaps project: contributors of this project usually use #missingmaps in their changeset comments.
  2. Contributors that utilized the Maps.Me app: The ‘created_by’-tag contains ‘MAPS.ME’.
  3. All other ‘regular’ contributors of the OSM project, who neither used #missingmaps in their changesets nor the Maps.Me editor.
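The grouping above can be sketched as a simple classification over changeset metadata. A minimal Python illustration (the field names and sample changesets are hypothetical, not my actual processing schema):

```python
def classify(changeset):
    """Assign a changeset to one of the three contributor groups.

    `changeset` is a dict with 'comment' and 'created_by' entries;
    these field names are illustrative only.
    """
    if "#missingmaps" in changeset.get("comment", "").lower():
        return "missingmaps"
    if "MAPS.ME" in changeset.get("created_by", ""):
        return "mapsme"
    return "regular"

groups = [classify(c) for c in [
    {"comment": "Buildings #MissingMaps", "created_by": "JOSM/1.5"},
    {"comment": "added shop", "created_by": "MAPS.ME ios 7.1"},
    {"comment": "fixed road names", "created_by": "iD 2.0"},
]]
# -> ['missingmaps', 'mapsme', 'regular']
```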

In the past 12 months, almost 1.53 million members registered with the OSM project. So far, only 12% (181k) of them ever created at least one map edit: almost 12,000 members created at least one changeset with the #missingmaps hashtag, over 70,000 used the Maps.Me editor, and 99,000 mapped without the #missingmaps hashtag or the Maps.Me editor. The following diagram shows the number of new OSM contributors per month for the three aforementioned groups.

[Image: 2016permonth]

The release of the Maps.Me app (more specifically, its OSM editor functionality) clearly has an impact on the monthly number of new mappers. Time for a more detailed analysis of the contributions and mapping times: the majority of the members in all groups don’t show more than two mapping days. (What is a mapping day, you ask? Well, my definition would be: a mapping day is a day on which a contributor created at least one changeset.) Only around 6% of the newly active members contribute for more than 7 days.
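Counting mapping days under this definition just means counting the distinct calendar dates among a contributor’s changeset timestamps. A tiny Python sketch with invented timestamps:

```python
from datetime import datetime

def mapping_days(changeset_timestamps):
    """Number of distinct calendar days with at least one changeset."""
    return len({ts.date() for ts in changeset_timestamps})

# Hypothetical contributor: three changesets on two distinct days.
stamps = [
    datetime(2016, 5, 1, 10, 15),
    datetime(2016, 5, 1, 18, 40),
    datetime(2016, 6, 12, 9, 5),
]
days = mapping_days(stamps)  # -> 2
```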

[Image: 2016mappingdays]

Some members of the #missingmaps group also contributed some changesets without the hashtag. But many of those members (70%) only contributed #missingmaps changesets. Furthermore, 95% of this adjusted group don’t map for more than two days. Anyway, despite identifying three different contributor groups, the results look somewhat similar. Let’s have a look at the number of map changes. The relative comparison shows that the smaller #missingmaps group produces a large number of edits. The Maps.Me group only contributes small numbers of map changes to the project’s database.

[Image: 2016mapchanges]

Lastly, I conducted an analysis for three selected tag keys: building, highway and name. The comparison shows that the #missingmaps group generates a larger number of building and highway features. In contrast, “regular” OSM’ers and Maps.Me users contributed more primary attributes such as the name or amenity tag.

[Image: 2016tags]

I think the diagrams in this blog post are quite interesting because they show that the #missingmaps mapathons can activate members who contribute many map objects. But they also indicate that the majority of these elements are traced from satellite imagery without primary attributes. In contrast, the Maps.Me editor functionality proved to be successful with its in-app integration and easy usability, which resulted in a huge number of new contributors. In summary, I think it would be good to motivate contributors not only to participate in humanitarian mapathons but also to map their own neighborhood, so that they stick with the project. Also, I guess it would be great if the Maps.Me editor worked on the next steps in providing easy mapping functionality for its users (of course with some sort of validation to reduce questionable edits).

Thanks to maɪˈæmɪ Dennis.