OpenStreetMap Blogs

Saturday, 21. February 2026

OpenStreetMap User's Diaries

Improving OSRM Foot Routing with Greenery Waypoints

I have a large set of photographs I took while running. They are geotagged, since I took them with my phone camera. The compass direction is completely unreliable, but the lat/lon is more trustworthy. I thought it would be an interesting experiment to extract greenery like grass and trees from these photographs. It could be a useful addition for creating routes that are more pleasant to walk, since the eye-level point of view is not available in OSM. As this is based on my personal photographs, it has the additional benefit of recommending routes that I tend to use. The first challenge I encountered is that out of a few thousand photographs, only a handful were taken during the daytime. After deduplicating and dropping all photos that contain no greenery, this becomes a relatively small set of waypoints. I decided not to extrapolate additional points along OSM ways, to keep the dataset small and avoid adding misleading info. The greenery detection works well enough with the SegFormer model, although it is somewhat slow locally. My plan is to select waypoints from this dataset before calling OSRM. This way I get routes that are more enjoyable to walk and run, but are generally longer than the default shortest route. You can find my dataset on Kaggle.
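
To make that last step concrete, here is a minimal sketch of what calling OSRM with via-waypoints could look like. The coordinates, the local endpoint, and the use of the requests library are illustrative assumptions, not part of the diary's actual setup:

import requests

# Hypothetical greenery waypoints (lon, lat) selected from the photo dataset
greenery = [(4.8952, 52.3702), (4.9041, 52.3667)]
start, end = (4.8900, 52.3730), (4.9100, 52.3640)

# OSRM's route service takes "lon,lat;lon,lat;..." in the URL path;
# the profile segment ("foot") depends on how the server was built
coords = ";".join(f"{lon},{lat}" for lon, lat in [start, *greenery, end])
url = f"http://localhost:5000/route/v1/foot/{coords}"

resp = requests.get(url, params={"overview": "full", "geometries": "geojson"})
route = resp.json()["routes"][0]
print(f"{route['distance']:.0f} m, {route['duration'] / 60:.1f} min")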

Friday, 20. February 2026

OpenStreetMap User's Diaries

Some local changes to OSM of my area

A few quick notes on some changes I made to OSM based on local knowledge.

  1. Changed the point for the Riverside Centre building to reflect that it is now a Builder’s Corner hardware store.

  2. Added a point for the nearby Hole in the Wall Centre.

  3. Defined an area for the Somerset Lofts apartment complex and added some details for it.


Converting dash cam videos into Panoramax images

I’ve recently begun contributing street-level imagery on Mapillary and Panoramax in my local area. I figured that my dash cam was already recording anyway, so if it could be of use to anyone, why not share it?

Contributing to Mapillary was very easy; since my dash cam has an integrated GPS that encoded its data into the video file, I could just upload the video to Mapillary and their website would turn it into an image sequence. Panoramax requires you to preprocess the video into geotagged images yourself, which made it hard to contribute to. Some cameras can be configured to save periodic images instead of videos, but that didn’t work for me because I still needed the dash cam to work normally as a dash cam first and Panoramax instrument second. It took me a while to figure it out, so I’m writing this blog post to hopefully help out the next guy in the same situation.

The task involves four basic steps. I scripted a solution that works specifically for my dash cam model (Garmin 47) and operating system (Linux). If Panoramax continues to grow, I imagine that separate scripts could be written for each step to mix and match for different camera types and computing environments. The steps are:

  1. Extract the raw GPS data from the dash cam video clip(s)

  2. Along the GPS trace, create a set of evenly-spaced points

  3. Extract images from the video occurring at the evenly-spaced points, and

  4. Add the GPS and time data to the image files

One could go even further and automatically upload the images to Panoramax straight from the terminal, but that’s beyond my coding abilities.

Let’s take a look at each step in detail:

Step 1 - Getting GPS data from the video

Thankfully, Garmin makes this relatively easy to do with exiftool. Open a terminal in the directory with the video clips and run the command

exiftool GRMN<number>.MP4

The output will contain a warning:

Warning : [minor] The ExtractEmbedded option may find more tags in the media data

So we can modify the command into

exiftool -ee3 GRMN<number>.MP4

Now exiftool will output all the same information as before, as well as many repeating blocks like the following:

Sample Time                     : 0:00:58
Sample Duration                 : 1.00 s
GPS Latitude                    : XX deg YY' ZZ.ZZ" N
GPS Longitude                   : UU deg VV' WW.WW" W
GPS Speed                       : 11.2654
GPS Date/Time                   : 2026:02:13 22:24:45.000Z

Jackpot! Now we can redirect the output to a file and get our GPS coordinates. We need to have a file saved in the working directory to tell exiftool how to format the data. So I saved the following as gps_format.fmt:

#[IF]  $gpslatitude $gpslongitude
#[BODY]$gpslatitude#,$gpslongitude#,${gpsdatetime#;DateFmt("%Y-%m-%dT%H:%M:%S%f")}

Now we pass that to exiftool so it prints only the metadata we’re interested in. We’ll also add > gps.tmp to save the output to a file:

exiftool -p gps_format.fmt -ee3 GRMN<number>.MP4 > gps.tmp

And we’re done! Now we have the raw GPS information out of the video and into plain text.

Step 2 - Turn the GPS data into evenly spaced points

To do this, I use Python to linearly interpolate along the GPS trace, creating points approximately 3 meters apart. And I do mean very approximately: instead of doing a proper distance calculation, I just eyeball how many meters are in a degree. One meter is very roughly 0.000009° of latitude. A degree of longitude, however, spans fewer meters the farther you get from the equator (one meter is a larger portion of a degree of longitude near the poles), so the longitude scale needs to be adjusted based on the latitude. I blindly use the latitude of the first point of the sequence and assume it doesn’t change enough over time to matter.

from math import cos, radians
cosd = lambda x: cos(radians(x))

# lat0 is the latitude of the first GPS point in the sequence
scale_lat = 1 / 9e-6                 # meters per degree of latitude (~111 km)
scale_lon = (1 / 9e-6) * cosd(lat0)  # meters per degree of longitude at lat0

Now it is easy to use the Pythagorean Theorem to estimate the distance between two points:

dx = scale_lon * (lon1 - lon0)
dy = scale_lat * (lat1 - lat0)
dist_between_points = (dx**2 + dy**2)**0.5

Compute this distance for each consecutive pair of points along the GPS trace, and keep a running tally of the total distance traveled. For example, consider the following data after you stop at a red light, sit for a while, and then keep going:

Pt | Dist | Tot
A  | --   | 0
B  | 10   | 10
C  | 6    | 16
D  | 2    | 18
E  | 0    | 18
(sit at the red light...)
Q  | 0    | 18
R  | 1    | 19
S  | 3    | 22
T  | 7    | 29
U  | 11   | 40
V  | 14   | 54
(and so on)

Suppose you want an image spacing of about 3 meters (about 10 feet, or half a car length). So you want images at 0, 3, 6, 9, 12, 15 m, and so on. We can take point A as our first point, but we need to interpolate between GPS points to find the rest of the evenly-spaced points. I’ll use the notation X -> Y N% to mean “interpolate N% of the way from X to Y.” Then to find our desired points, we need:

Pt | Formula
0  | A
3  | A -> B 30%
6  | A -> B 60%
9  | A -> B 90%
12 | B -> C 33%
15 | B -> C 83%
18 | D
21 | R -> S 67%
24 | S -> T 29%
27 | S -> T 71%
30 | T -> U  9%
etc...

Since Garmin takes GPS measurements once per second, this also tells us exactly when each new point occurred: for the point 60% of the way from A to B, it’s just the GPS timestamp of A plus 0.60 seconds. For the latitude and longitude of the interpolated point, we can interpolate the two coordinates separately; 3 meters is not even close to far enough for great-circle paths to matter. So e.g.

lerp = lambda a, b, x: (1 - x) * a + x * b

lat_interp = lerp(latA, latB, 0.6)
lon_interp = lerp(lonA, lonB, 0.6)

# And so on for each interpolated point

Save this output to a file (I call mine processed_points.csv), and you’re done with step 2!
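
For reference, here is the whole of step 2 condensed into one rough sketch. It assumes gps.tmp holds one fix per second in the lat,lon,timestamp format produced by the .fmt file above (it uses the line number as seconds into the video rather than parsing the timestamp), so treat it as an outline rather than a polished script:

from math import cos, radians

SPACING = 3.0  # desired distance between images, in meters

# Read the trace; the line number doubles as seconds into the video,
# since the Garmin logs one fix per second
pts = []
with open("gps.tmp") as f:
    for i, line in enumerate(f):
        lat, lon = map(float, line.split(",")[:2])
        pts.append((lat, lon, float(i)))

lat0 = pts[0][0]
scale_lat = 1 / 9e-6                         # meters per degree of latitude
scale_lon = (1 / 9e-6) * cos(radians(lat0))  # meters per degree of longitude

lerp = lambda a, b, x: (1 - x) * a + x * b

out, target, total = [], 0.0, 0.0
for (lat1, lon1, t1), (lat2, lon2, t2) in zip(pts, pts[1:]):
    dx = scale_lon * (lon2 - lon1)
    dy = scale_lat * (lat2 - lat1)
    d = (dx**2 + dy**2) ** 0.5
    # Emit every 3 m mark that falls inside this segment;
    # segments with d == 0 (sitting at a red light) are skipped entirely
    while d > 0 and target <= total + d:
        frac = (target - total) / d
        out.append((lerp(lat1, lat2, frac),
                    lerp(lon1, lon2, frac),
                    lerp(t1, t2, frac)))
        target += SPACING
    total += d

with open("processed_points.csv", "w") as f:
    for lat, lon, t in out:
        f.write(f"{lat:.7f},{lon:.7f},{t:.3f}\n")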

Step 3 - Extract images from the video

It is possible to extract a single frame of a video using ffmpeg. The time should be a decimal number of seconds after the start of the video to exactly three decimal places.

ffmpeg -ss <time> -i <video>.MP4 -frames:v 1 output.jpg

By default, ffmpeg compresses the images quite a bit. It was enough that I could notice a quality difference when I put a paused frame of the video side by side with an extracted image. We can force ffmpeg to improve the quality with -q:v <number>. A smaller number produces a higher quality image at the expense of file size and processing time. I’ve settled on a value of 3, but feel free to play around with this to get the quality or file sizes you want.

ffmpeg -ss <time> -i <video>.MP4 -q:v 3 -frames:v 1 output.jpg

ffmpeg will print a bunch of text to the console that we don’t care about. To avoid flooding the screen, use the -hide_banner and -loglevel options to reduce (but not completely shut up) the amount it outputs to the console:

ffmpeg -ss <time> -i <video>.MP4 -q:v 3 -frames:v 1 -hide_banner -loglevel fatal output.jpg

Since you are going to extract many images, you’ll have to use this command in a loop with a bunch of variables that change from iteration to iteration, e.g.

ffmpeg -ss $(printf "%.3f" "$time") -i "$input_dir/DCIM/105UNSVD/GRMN$num.MP4" -q:v "$jpeg_quality" -frames:v 1 -hide_banner -loglevel fatal "$output_dir/$num-$(printf "%04d" "$img_num").jpg"

My naming convention produces file names of the format <video number>-<image number>.jpg. So, for example, the 25th image extracted from GRMN4567.MP4 would be named 4567-0025.jpg.
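
As an illustration, a driver loop for this step might look like the Python sketch below. The clip number and the processed_points.csv format (lat,lon,seconds per row, from step 2) are assumptions standing in for the actual script:

import csv
import subprocess

num = "4567"              # video clip number, i.e. GRMN4567.MP4
video = f"GRMN{num}.MP4"

with open("processed_points.csv") as f:
    for img_num, (lat, lon, t) in enumerate(csv.reader(f), start=1):
        # lat/lon are not needed here; they get written to EXIF in step 4
        out = f"{num}-{img_num:04d}.jpg"
        subprocess.run(
            ["ffmpeg", "-ss", f"{float(t):.3f}", "-i", video,
             "-q:v", "3", "-frames:v", "1",
             "-hide_banner", "-loglevel", "fatal", out],
            check=True,
        )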

And we’re almost there! Now we just need to put the metadata from step 2 into the images we just generated.

Step 4 - Add the GPS and time metadata to the images

You can write tags to files with exiftool using the format:

exiftool -<key>=<value> <file name>.jpg

You can add multiple tags in a single line.

exiftool -<key1>=<value1> -<key2>=<value2> <file name>.jpg

Note that exiftool only supports specific keys, so it won’t write the metadata if it doesn’t recognize the key. By default it also keeps a backup copy of each original file (renamed with an _original suffix), so to avoid duplicating every image, add:

exiftool -overwrite_original -<key1>=<value1> -<key2>=<value2> <file name>.jpg

This will write a confirmation line to the terminal after every single image. To avoid that, redirect the output to /dev/null. This tells the shell to throw the output into a black hole, or the wardrobe to Narnia, or anywhere else besides the terminal.

exiftool -overwrite_original -<key1>=<value1> -<key2>=<value2> <file name>.jpg > /dev/null

For Panoramax to accept your images, you need all of the following tags:

-gpslatitude=45.6789
-gpslongitude=-123.456789
-gpslatituderef=N
-gpslongituderef=W
-datetimeoriginal=2000-01-02T03:04:05

If you are missing these, Panoramax will reject your image. Note that the latitude and longitude ref tags are necessary because exiftool doesn’t understand negative coordinates as being in the southern or western hemispheres. You have to provide them separately for the GPS data to be read correctly. If you forget to add them, Panoramax may accept the image but put it in the wrong place. The date and time should be given in ISO 8601 format. If you don’t specify a time zone, Panoramax will assume local time and automatically convert it to UTC on their site.

You can theoretically add any tag in the EXIF specification. Some that I like for Panoramax are:

-subsectimeoriginal=067
-author=FeetAndInches
-make=Garmin
-model="Garmin 47 Dash Cam"

The SubSecTimeOriginal field is important for getting Panoramax to put your sequence in the right order. Since the images come from a dash cam, speeds of 10-20 m/s are common, so multiple images are taken per second of video. The DateTimeOriginal tag does not preserve fractional seconds (even if you provide them when writing the tag), so several pictures would be recorded with the same timestamp and Panoramax would have to guess their order. Note that this tag takes the digits after the decimal point, as a string. So for a time of 51.328 seconds, you would write -subsectimeoriginal=328. For a time of 51.1 seconds, you would just write -subsectimeoriginal=1. For a time of 51.001 seconds, you would need to include the leading zeroes, as -subsectimeoriginal=001.
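
That formatting rule is easy to get wrong, so here is a tiny helper matching the three examples above (328, 1, 001): keep the digits after the decimal point, trim trailing zeros, keep leading ones.

# 51.328 -> "328", 51.1 -> "1", 51.001 -> "001"
def subsec(t: float) -> str:
    digits = f"{t:.3f}".split(".")[1]  # exactly three fractional digits
    return digits.rstrip("0") or "0"   # trim trailing zeros, keep leading ones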

If you don’t use the SubSecTimeOriginal tag, you can still get Panoramax to show your images in order if you use a suitable file naming convention. You can open the sequence on the website and select the option to sort by file name.

The author tag is a nice way to keep attribution on your image even if it gets shared outside Panoramax. The make and model tags help fill in some of the camera information on Panoramax and help determine your GPS accuracy, which is used to compute the image’s quality score.

You can do step 4 in the same loop as step 3. Since the coordinates and time change for each image, the command ends up looking messy, like:

exiftool -overwrite_original -gpslongitude=$lon -gpslatitude=$lat -gpslatituderef=$ns -gpslongituderef=$ew -datetimeoriginal=$timestamp -author="$exif_author" -subsectimeoriginal="$subsec" -make="$exif_make" -model="$exif_model" -usercomment="$exif_comment" "$output_dir/$num-$(printf "%04d" "$img_num").jpg" > /dev/null

Closing Notes

This post explains the basic principles of how to turn a video into usable images on Panoramax. I plan to write a second post going into the 201 level - things like how to deal with a single missing GPS measurement, duplicated measurements, getting sent to Null Island, how to detect erroneous data, using the videos immediately before and after to interpolate better at the edges, chaining all of this across multiple video clips, etc. But for now, I hope this has been useful to you.

If anyone is interested, I can share the entire scripts that I use right now. They’re a little buggy, only partially commented, and occasionally require some babysitting to make sure they work properly. But if something is better than nothing and you are willing to try and deal with someone else’s amateur code, please let me know.

Thanks for reading,

FeetAndInches


Neighborhood Update: [Wadsa, Desaiganj]

  • I spent some time today improving the map data in my local area using the iD editor. As a local, I noticed that several roads were untraced.

  • I added roads, but I got confused while selecting presets. Then I realised that the more mapping I do, the better I will get at using presets. Each preset serves a unique purpose.

  • A few weeks ago I spent time mapping my school in my city. It was so much fun; I just wish they could use more up-to-date satellite imagery.

Thursday, 19. February 2026

Jochen Topf

OSM Spyglass

Two years ago or so I started the OSM XRAY project; later I wrote about it in this blog post. Since then I have renamed the project to “OSM Spyglass” and have kept working on it on and off.

At the State of the Map Europe 2025 in Dundee I gave a talk with the title “Everything Everywhere All At Once” about this project. You can see the video on YouTube. This got some people excited about the project; there is even some talk about putting the tool on OSMF infrastructure. Until that comes about, the tool is hosted at spyglass.jochentopf.com.

I am finally getting around to writing some more about what’s been happening since my first announcement and since the talk.

User Interface

I keep fiddling with the user interface: an optional globe view (not much work for me now that MapLibre supports it out of the box), a map that is now resizable (horizontally), display of city names at some zoom levels, improved pop-up menus for keys and tags, and much more. Generally the UI has been getting faster and more reliable.

There are still some bugs to fix and plenty of possible improvements, and I’d be happy about feedback and ideas. It’s quite a lot of information we are trying to show in limited space, so good ideas on how to do that are needed.

Caching

In the first blog post I wrote about some caching that I implemented in the database. It did work, but it turned out to be pretty useless: users want the newest data anyway, and we can keep up with minutely updates (at least at the larger zoom levels), so I removed the caching completely for vector tiles and for high-zoom rasters. Only raster images at zoom levels up to 10 are cached; currently we cannot deliver them fast enough otherwise.

Map updates

The database is updated from OSM using minutely diffs. We are usually about 3 to 5 minutes behind the OSM data; that’s just how long it takes the OSM servers to create the minutely diffs and push them out, and for our update job to download the data and apply it to the database. It is unlikely we can improve on that much further. Spyglass shows the timestamp of the latest data it has in the bottom right corner. This timestamp is updated whenever new data is loaded, i.e. when you move the map.

Vector tiles are always generated on the fly from the current database. At higher zoom levels they contain all data; at medium zoom levels only “larger” objects are shown, i.e. long ways and larger areas. At small and medium zoom levels raster tiles are shown, and they always contain all data. So at the medium zoom levels the gray raster data is overlaid with vector data in black (nodes and ways) or blue (relations): you can see everything, but only click on the larger items.

Raster tiles at small zoom levels are only updated once per day; for zoom 0 to 7 this happens by taking the zoom level 8 tiles and merging and rescaling them. I have spent quite some time optimizing this. The first version ran in the database but only generated black-and-white tiles; the current version uses code written in Go that creates grayscale images, which look much better than the black-and-white ones. And it is much faster than the GDAL tools I tried for this task. GDAL is a great tool but, as an “all-purpose tool”, it has to cope with all sorts of different data sources, projections, etc., which makes it much slower than a specialized tool for a specific use case. It now takes only a few minutes to create the low-zoom tiles from the zoom level 8 tiles. And they are no longer stored in the database but on disk, which is simpler, and they are faster to use that way, too.

Rasters are still generated in the database from the data. That is, unfortunately, not as efficient as one might think. We don’t need to copy the data from the database into another process, and the cost of actually getting the data seems not to be that huge, but the rasterizing itself costs time. This is probably something that could be improved inside PostGIS, or maybe we have to drop this idea altogether and move rendering outside the database. There is plenty of room to experiment and improve performance here.

Server

Originally I used pg_tileserv as the server to create the vector tiles from the database on the fly. It could also be tricked into creating the raster tiles. But I also needed GeoJSON output and some other API endpoints. I experimented with pg_featureserv, which did work, but having two servers with lots of specialized PL/pgSQL functions in the database, plus an ever-growing configuration for nginx (used as a reverse proxy), became too complicated and error-prone. So I decided to rewrite the server from scratch in Go. It turns out it is really easy to write robust and featureful HTTP servers in Go; it comes with everything you need, and the only external library I am using is for accessing the database. Deployment is really easy, too: just copy over one Go binary and restart the server, with no extra configuration files or database functions to update.

Filters

Everything is done three times, for nodes, ways, and relations: there are three sets of raster tiles and three sets of vector tiles. It is easy to switch those layers on and off in the UI. And then there is the key or tag filter. The vector tiles at higher zoom levels contain all the data, so the filter is applied on the client, which is very fast. For raster tiles the filtering has to be done on the server, which takes somewhat more time. Filtering is (silently) disabled at the small zoom levels, so you always see all data there. This isn’t great as a user experience; I still have to figure out a way to make this transition more user-friendly or, ideally, allow filtering on all zoom levels.

It is a lot of fun to zip around the map and look at far-away places and how they are mapped. Try it out! And if you have any problems or ideas, open an issue on Codeberg.


OpenStreetMap User's Diaries

Querying OSM objects by their shapes

There has been a very interesting question on the OSM US Slack lately.

“Does anyone have a method to search through the OSM database for a building of a particular shape? I need assistance finding OSM buildings with this specific shape. They should be located in NJ, DE, northeastern MD, eastern PA, or southern NY.”

The question quickly exploded into a huge discussion. At the time of writing, there are already 71 replies.

Someone suggested:

“You could load OSM buildings into PostGIS and then use ST_HausdorffDistance to compare the geometries.”

From there, the discussion veered into how to solve that specific puzzle and find the exact OSM building in question.

One person added, “So the strategy is: create the shape of the building you want to search for, scale it to, say, fill a 100x100 m bounding box or something. Ask Postgres to, within a search-area bounding box, take each building and scale it to a 100x100 m bounding box, compute the Hausdorff distance with the scaled input shape, and return all OSM element IDs and their Hausdorff distances, sorted in ascending order.”
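
As a rough illustration of that suggested strategy (here with Shapely standing in for the PostGIS ST_HausdorffDistance approach, and with made-up polygons), normalizing scale and position but not rotation:

from shapely.geometry import Polygon
from shapely.affinity import scale, translate

def normalize(poly, size=100.0):
    """Move the shape to the origin and stretch it into a size x size box."""
    minx, miny, maxx, maxy = poly.bounds
    poly = translate(poly, xoff=-minx, yoff=-miny)
    return scale(poly, xfact=size / (maxx - minx),
                 yfact=size / (maxy - miny), origin=(0, 0))

# An L-shaped target and a candidate that is the same shape at twice the size
target = Polygon([(0, 0), (4, 0), (4, 1), (1, 1), (1, 3), (0, 3)])
candidate = Polygon([(0, 0), (8, 0), (8, 2), (2, 2), (2, 6), (0, 6)])

# 0.0 means identical outlines after normalization; in the PostGIS version
# you would sort candidates by this value in ascending order
print(normalize(target).hausdorff_distance(normalize(candidate)))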

Another said, “What I’m currently doing is combining several shape exports into a single file with around 20,000 objects that have concavity. Concavity plus more than 10 nodes eliminates most buildings.”


At that point, instead of hunting that elusive specific OSM building, I became more interested in the generalized version of the problem.

So I added my two cents to the discussion:

“The generalized version of this problem would be: Can we represent a shape in some kind of data type that allows us to computationally check whether two objects have the same shape, regardless of rotation and scaling?

I haven’t studied the Hausdorff distance yet, but I’m wondering whether it can solve this problem, or if there’s a better alternative—Hu moments, Procrustes analysis, Fourier descriptors for contours…”

Someone replied:

“Hu moments are a good option. Elliptic Fourier Descriptors, Shape Context Histograms, Turning functions, etc. I’ve experimented with those four while trying to classify sports pitches more accurately. You can actually get pretty far with just compactness, convexity, and aspect ratio, thankfully.”

Do you have any other ideas on how to solve this problem?

Wednesday, 18. February 2026

OpenStreetMap User's Diaries

New CNEFE Tool Revolutionizes Street Name Correction in OpenStreetMap Brazil.

The community of Brazilian mappers has just gained a powerful ally to improve one of the most crucial and, at the same time, challenging data points in any map: street names. The CNEFE Verification System platform has been launched, accessible at https://cnefe.mapaslivre.com.br, a tool created by and for the OpenStreetMap (OSM) community in Brazil, aimed at validating and correcting address data using the latest information from the 2022 IBGE Census.

The project is an initiative of UMBRAOSM (Union of Brazilian OpenStreetMap Mappers) and was developed by experienced mappers Raphael de Assis, president of UMBRAOSM and member of the OpenStreetMap Foundation, and Anderson Toniazo, both active members of the OSM Brazil community. The tool arrives to solve a long-standing bottleneck in national mapping: the updating and verification of street names based on official sources.

The Challenge of Street Names in Brazil

For those mapping in Brazil, one of the biggest challenges has always been the lack of a complete, accurate, and freely accessible street database. Through the Demographic Census, IBGE compiles the National Registry of Addresses for Statistical Purposes (CNEFE). This registry is a vast list of addresses from across the country, containing street names, address types, neighborhoods, and, in many cases, geographic coordinates, especially in rural and non-residential areas.

Historically, the OSM community has used CNEFE data from previous censuses (such as 2010) to enrich the map. However, the process was complex, involving downloading text files (fixed format), cross-referencing them with census tract shapefiles, and extensive manual work to match the information with the streets already drawn on the map, in addition to correcting spelling differences.

With the recent publication of the CNEFE 2022 microdata by IBGE, the need for an efficient tool to integrate this new data into OSM became even more evident.

CNEFE System: A Bridge Between Official Data and the Collaborative Map

It is in this context that the CNEFE Verification System emerges. The platform created by Raphael de Assis and Anderson Toniazo is not just a data viewer; it is a complete work tool, designed to optimize the collaborative verification and correction workflow.

The system’s intuitive interface allows mappers of all experience levels to:

Visualize CNEFE 2022 Data: The tool presents official address data from the most recent census clearly, overlaid on the map.

Compare with OpenStreetMap: The mapper can easily identify discrepancies between a street name recorded in CNEFE and the name currently present in OSM.

Correct and Include Names: When a street in OSM is unnamed (very common in less mapped areas) or has a different name than the IBGE registry, the tool facilitates the correction and inclusion of the correct name directly on the map.

Fill Gaps: In places where IBGE registered addresses, but the corresponding streets have not yet been drawn in OSM, the application highlights these areas, encouraging the complete mapping of road geometries and, subsequently, the addition of names.

The platform is already at version 1.0, updated on January 22, 2026, and features rich support material for the community. Mappers can access a step-by-step tutorial with images, watch demonstration videos, and even download complete PDF tutorials for offline consultation, ensuring everyone can make the most of the tool.

The Strength of the Community Behind the Tool

The development of the CNEFE System is a testament to the power and organization of the OSM Brazil community. UMBRAOSM, under the leadership of Raphael de Assis, has stood out for promoting initiatives that facilitate and professionalize collaborative mapping in the country. Projects like “Mapeia Crato” have already demonstrated the capacity of unity in training new mappers and carrying out large-scale tasks.

The partnership between Raphael and Anderson in developing this tool reinforces the community’s commitment to not only use open data but also to give back, creating ecosystems that improve the quality of geospatial information available to everyone. Their work directly aligns with broader discussions within the community, such as the matching of CNEFE 2022 variables with OSM tags, a fundamental step for any data import or validation process.

A Future with More Accurate Maps

The availability of the CNEFE System marks a significant advance for Brazilian mapping. By facilitating access and comparison with official Census 2022 data, the tool not only speeds up the map update process but also increases the reliability of the OpenStreetMap database as a whole.

For the end-user, whether a driver using a navigation app, a delivery person, or a researcher, the result is more accurate maps, with correctly identified streets and addresses that are easier to locate. The CNEFE tool is, therefore, a key piece in Brazil’s open data infrastructure, built collaboratively by those who understand the subject best: the mapping community itself.

Visit https://cnefe.mapaslivre.com.br and start contributing to a more complete and correct map of Brazil.


Structured POI Enrichment in Bengaluru, Karnataka

Changeset: 178729012

Today I contributed to OpenStreetMap by improving map completeness in my local area in Bengaluru, Karnataka.

🔹 What I Worked On

  • Added a missing café using local knowledge
  • Verified placement to ensure it was mapped at the correct entrance location
  • Added appropriate tags, including amenity=cafe and name=Bean Stop Café
  • Checked for duplicate entries before uploading

🔹 Mapping Approach

I focused only on verified, ground-truth information and avoided copying from copyrighted sources. All additions were based on direct familiarity with the area.

🔹 Quality Checks

  • Ensured the point was not placed on the roadway
  • Confirmed correct spelling and capitalization
  • Reviewed surrounding features for consistency

🔹 Objective

The goal was to improve local POI completeness and contribute accurate, structured data to OpenStreetMap. This is part of my effort to make consistent, quality-focused contributions rather than large, unverified edits.


Pascal Neis

Adding the Missing Dimension: Position Tracking for Vehicle Data Logging

In one of my previous blog posts, I explored how to read live vehicle data through the OBD II port that is present in most (modern) cars. As mentioned in the outlook, the next step in my project is to combine vehicle telemetry with (accurate) positional information in order to enable more advanced analysis. To achieve this, I created a small GNSS test setup. The platform for all experiments is again a Raspberry Pi. For a first comparison, I selected two GNSS boards from Waveshare: the L76X GPS HAT and the ZED F9X GPS RTK HAT.

Why these two modules?
The L76X is an inexpensive entry-level device that is suitable for navigation, mapping, or general position tracking. It supports GPS and BDS and normally delivers a position accuracy of a few meters. The ZED F9X belongs to a completely different class. It is a multi-band GNSS receiver that supports real-time kinematic (RTK) processing. When correction data is available, it can reach accuracy in the range of centimeters, which makes it suitable for robotics, surveying, precision agriculture, or any application that requires very accurate geolocation data. The antenna systems also show clear differences. The L76X includes a simple single-band GPS antenna, while the ZED F9X works together with a multi-band active GNSS antenna that allows reception of several frequency ranges at once. This antenna design is essential for achieving the high accuracy that the ZED F9X is capable of.

From the provided software to writing my own scripts
Both modules come with example software and Python scripts on the manufacturer’s web pages. I tried using these examples first, but outdated Python versions and older code libraries quickly created compatibility problems. Because of this I moved directly to writing my own scripts, which turned out to be the better choice later on. The L76X operates at one update per second in its default configuration, but it can be configured to send up to ten updates per second. The ZED F9X can operate at even higher update rates, in some cases up to twenty-five updates per second depending on the selected messages. However, not every communication protocol supports these higher update rates. I started with NMEA, which worked well up to ten updates per second. Above that limit the protocol becomes inefficient because the messages are relatively large. For the ZED F9X, switching to UBX made much more sense because UBX uses compact binary messages. Unfortunately the L76X does not support UBX, which means NMEA remains the only option for that board.
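
For anyone attempting something similar, a minimal NMEA read loop could look like the sketch below. The serial device, baud rate, and the pynmea2 library are assumptions to adapt to your own board, not the setup described above:

import serial
import pynmea2

# Print position fixes from GGA sentences arriving on the GNSS HAT
with serial.Serial("/dev/serial0", baudrate=9600, timeout=1) as port:
    while True:
        line = port.readline().decode("ascii", errors="replace").strip()
        if line.startswith("$") and line[3:6] == "GGA":
            try:
                msg = pynmea2.parse(line)
            except pynmea2.ParseError:
                continue  # skip corrupted sentences
            print(msg.timestamp, msg.latitude, msg.longitude, msg.num_sats)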

What comes next?
With the hardware and software configured and with automated startup and first measurement routines working reliably, the next step will be real world testing inside a car. In particular, I want to find out how the speed of the vehicle affects the quality of the GNSS measurements, how different surroundings such as hills, forests and tall buildings influence the accuracy, and how big the practical performance gap is between the simple L76X with its basic antenna and the ZED F9X combined with a multi band active antenna.


OpenStreetMap User's Diaries

Automatic Pedestrian Detection at Signalised Crossings

Hi everyone,

I recently noticed that many modern pedestrian crossings are equipped with automatic detection sensors that trigger the traffic signal without requiring a push button.

Currently, in OpenStreetMap, we can tag:

  • highway=crossing and crossing=traffic_signals for signalised crossings
  • button_operated=yes/no to indicate if a manual button is present
  • traffic_signals:sound=yes/no for auditory signals

However, there is no standard way to indicate automatic activation by a detector for pedestrians or vehicles.

To address this, I have proposed a new tag on the OSM forum: detector_operated=yes/no, which would clearly indicate that a traffic signal is automatically triggered by a detector.
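
Under the proposal, a fully automatic signalised crossing node might then carry, for example:

highway=crossing
crossing=traffic_signals
button_operated=no
detector_operated=yes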

You can view and comment on the proposal here: https://community.openstreetmap.org/t/proposal-tag-traffic-signals-detector-operated-pedestrian-presence-sensor/141624

Here is an example illustration showing automatic pedestrian detection:
[Image: automatic pedestrian detection]

This tag would help improve mapping of intersections, pedestrian routing, traffic simulation, and accessibility information.

I’d love to hear your thoughts and experiences with automatic pedestrian detection at crossings in your area!


OpenCage

Interview: Nicolas Collignon - Kale AI

Interview with Nicolas Collignon of Kale AI about how they are using OpenStreetMap to build the future of urban logistics delivery

In the second 2026 edition of our OpenStreetMap interview series it was my pleasure to chat with Nicolas Collignon, co-founder and CEO of Kale AI, who are building urban routing solutions for delivery using OpenStreetMap.

[Image: screenshot of the Kale AI]

1. Who are you and what do you do? What got you into OpenStreetMap?

I’m Nico; my background is in computational cognitive science. I’m now the CEO of Kale AI, a start-up building technology for urban logistics planning. I initially got into OpenStreetMap during a side quest where I got really curious about how to better understand urban tissue, and how to represent it computationally.

2. What is Kale AI? What prompted you to create it?

Kale AI is a company focused on solving the inefficiency problem in urban logistics. We build tools to make complex logistics planning easy. It’s a very hard and interesting problem, and planning is one of the biggest weaknesses of LLMs. We’ve been focused on supporting the transition to Light EVs and cargo-bikes in modern urban logistics fleets. Light EVs are up to 2x more efficient in dense urban areas and use 95% less energy than diesel vans. They’re a multi-solution to improve urban life.

3. Why do we need special routing for urban logistics?

Different vehicles need tailored routing because urban space is becoming increasingly complex. With improving cycling infrastructure, Low Traffic Neighbourhoods, and so on, all of this can lead to improved efficiency if we route vehicles better through street networks. For example, a 2-wheeled cargo bike might be able to take a shortcut that a 3-wheeler is blocked from by a bollard. For the 2-wheeler that shortcut can save 5-10 minutes on the route, while the slightly larger vehicle has to backtrack and loses that time instead.

Most of our work doesn’t focus specifically on “navigation” but on planning, assigning deliveries to vehicles and designing the sequence of stops on those routes. Dantzig, who first proposed the Vehicle Routing Problem, explains quite well why it’s hard in his 1958 paper: “Even for small values of n the total number of routes is exceedingly large, e.g. for n = 15, there are 653,837,184,000 different routes.”
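
(The figure is easy to verify: visiting 15 points in sequence gives 15! orderings, and folding together the two travel directions leaves 15!/2 distinct routes. A quick check, in Python rather than anything from the interview:)

from math import factorial

print(factorial(15) // 2)  # 653837184000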

In our research, we found that deliverers spend 60-80% of their day not driving, but looking for parking and walking to the door. Different vehicles have different performance advantages in different parts of a city. Light EVs have a big advantage in the centre. Our work focuses on leveraging the different strengths of each vehicle type, and taking into account that diversity makes the VRP even harder to solve.

4. What are the unique challenges involved in routing with OpenStreetMap, particularly for urban logistics?

The data quality is surprisingly good in well-mapped areas. The OSM community is incredibly detail-oriented. But two challenges stand out for us.

The first is completeness and heterogeneity. Coverage varies enormously, not just between cities but within them, and sometimes between streets that are literally 300 metres apart. In our research we found a striking example in Boston where two neighbouring hexagonal cells with almost identical satellite imagery had wildly different tagging. One had 167 highway:service tags, the other just 3. In Chicago suburbs we found a municipality with the highest population density in Illinois where OSM had recorded only 8% of its buildings. That kind of patchiness is a real problem when you’re trying to build models that generalise across cities.

The second is semantic consistency. OSM relies on contributors to categorise things freely, which means the same real-world object can be tagged in multiple ways depending on who mapped it and where. We saw this clearly across our study cities. Contributors in Los Angeles tagged single-family homes as building=house, while the same homes in other cities were tagged with the catch-all building=yes. Locally that’s fine, but the moment you try to build a model that works across cities, those inconsistencies become noise you have to work around.

And beyond the map itself, OSM captures the physical world but not the operational reality of deliveries. How long it takes to park, unload, walk to a door varies enormously by urban context and is invisible to any map. In our research, service time turned out to be one of the biggest drivers of delivery efficiency, yet almost no publicly available data exists on it. That’s a gap OSM can’t fill alone, but it points to how much logistics-specific ground truth is still missing.

5. What steps could the OpenStreetMap community take to improve mapping for urban logistics?

Keep tagging surfaces, seriously. It might feel niche, but it’s one of the most operationally significant pieces of data we use. The granularity OSM brings to surface data is something you simply can’t get from commercial providers, and it makes a real difference in planning accuracy.

Beyond that, access restrictions need more attention: bollards, width restrictions, turning restrictions, loading zone locations. These are the invisible barriers that can completely change how a fleet operates in a city, and they’re often missing or under-tagged. A restriction that a small vehicle sails through might stop a larger one entirely, and right now OSM rarely has enough detail to distinguish those cases.

More broadly, mapping Low Traffic Neighbourhoods and filtered permeability in a consistent, machine-readable way would be hugely valuable. These are increasingly shaping how urban freight actually moves, and having reliable structured data on them would let us plan far more accurately.

6. Recently OpenStreetMap celebrated 20 years. Where do you think the project will be in another 20 years?

I think OSM is going to become even more foundational than it already is, but probably in ways that are less visible. A lot of the most interesting work being done today in autonomous mobility, urban planning, and logistics quietly depends on OSM as a base layer. That’s only going to grow.

What excites me is the intersection with AI. Models are getting better at extracting structured data from imagery, which could dramatically accelerate how quickly OSM reflects the real world: new infrastructure, surface changes, new access restrictions. The community’s role might shift from purely manual contribution toward curation and validation at scale.

And as cities get more complex, with more vehicle types, more restricted zones, more differentiated infrastructure, the value of a community that actually cares about tagging a bollard correctly becomes hard to overstate. That local, granular knowledge is something no corporate mapping effort has ever quite replicated.


Thank you, Nico! Wonderful to see OpenStreetMap becoming a core part of the infrastructure of modern cities. As people, companies, communities use and rely on OSM, they will in turn start editing and maintaining the data for all of us to benefit.

Forward!

Ed

Please let us know if your community would like to be part of our interview series here on our blog. If you are or know of someone we should interview, please get in touch, we’re always looking to promote people doing interesting things with open geo data.

Tuesday, 17. February 2026

OpenStreetMap User's Diaries

Initial Building Mapping and Data Refinement Session

Changeset: 176210161

In this changeset (176210161), I focused on improving building-level mapping by adding missing building outlines and refining structural details using Bing Maps aerial imagery.

The objective of this session was to enhance spatial accuracy and improve map completeness in the area. I ensured that:

  • Building footprints were aligned correctly with satellite imagery
  • Proper geometry was maintained
  • No duplicate structures were created
  • Tagging remained consistent with OSM standards

This edit was completed using the iD editor (v2.37.3), and I requested a review to ensure quality validation and community feedback.

Working on building details helped strengthen my understanding of:

  • Accurate polygon tracing
  • Satellite imagery interpretation
  • Clean data structuring
  • Version control within OSM changesets

I will continue improving structured building data and map quality in Karnataka.


Campus Mapping & Data Refinement in Yelahanka, Karnataka

Changeset number is 178690672

Today, I worked on improving map data around Yelahanka Taluku, Karnataka. I updated the official name of Sai Vidya Institute of Technology to reflect accurate real-world information and ensured proper tagging consistency.

In addition to correcting the name, I reviewed campus boundary structure, building tagging, and surrounding infrastructure to avoid duplication and maintain data integrity. I verified that the edits align with real-world sources and OSM tagging standards.

My focus during this session was on:

  • Accurate name correction
  • Structured campus boundary validation
  • Road connectivity refinement
  • POI accuracy improvement
  • Avoiding duplicate objects

This session helped reinforce the importance of precise tagging, version tracking, and reviewing live map data versus cached tiles. I will continue contributing to improving structured geospatial data across Karnataka.


[Resolved] Post-fix verification: cycle route in Toulouse (Note #5169818)

✅ Info: Problem resolved

The obsolete concrete block in Toulouse (coordinates: 43.5615376; 1.4920996) has been removed from OpenStreetMap.
The cycle route is now correct on Geovelo, with no unnecessary detour.

Context: On 17 February 2026, I resolved note #5169818, which reported a cycle-route problem in Toulouse (coordinates: 43.5615376; 1.4920996). An obsolete concrete block (left in place after the end of the roadworks) was causing an unnecessary detour in route calculations.

Actions taken: - Fix in OSM: removed the obstacle (changeset #178691426). - Awaiting the data update by Geovelo.


Problematic test route

Geovelo route with detour. Direct link for testing: Geovelo - test route


To do: - Around 19 March 2026, check whether the route has been fixed on Geovelo/OSRM. - If the problem persists, reopen the note or contact Geovelo.

Location: See on OSM


#OpenStreetMap #Toulouse #Vélo #Contribution #Geovelo


School/Work Experience Activities (formerly PCTO) in Pesaro

🗺️ Pesaro needs you!

Below are the people involved in the project “Pesaro ha bisogno di te!” (“Pesaro needs you!”).

A group of students was asked to improve the precision and accuracy of the map of their city (and the surrounding area, as some of them live in neighbouring towns).

All edits will be considered valid if, and only if, they carry the hashtag #PCTOMarconi2026 and are made by the users involved in the project, listed below (only the username is given).

📭 How to contact the project coordinator

⚠️ If needed, please contact me by email <giacomo.alessandroni@wikimedia.it> or on Telegram (@galessandroni). I am more responsive on those channels than via the internal messaging system.

Naturally, feel free to correct any vandalism or typos you may notice.

👩‍🎓👨‍🎓 Users involved in the project

N User (activity) Edits
0 Galessandroni Tutor
1 _basii 0
2
3 CoolCastle561 0
4
5
6 Lorenzo-Cecchini 0
7 FedericoCrine 0
8
9 ANTOHH 0
10
11 Pit-_- 2
12
13 Roberto Fazzini 0
14
15 ga gasparri 0
16
17 dadograss 0
18
19
20 Ariannapagnoni 36
21
22 santa222 23
23 ele stefanini 4
24
25
26 davide zagaria 0

Last updated: 20 February 2026

🛠️ Useful tools


Setting up an Overpass API server - how hard can it be?

All the hospitals in the UK and Ireland, in about 10 seconds

Many people have noticed that publicly available Overpass servers have been suffering from overuse (a typical “tragedy of the commons”). OSM usage policies generally contain the line “OpenStreetMap (OSM) data is free for everyone to use. Our tile servers are not”. Unfortunately, there have been problems with overuse of the public Overpass servers, despite the usage policy. “Just blocking cloud providers” isn’t an option, because (see here - use the translate button below) lots of different sorts of IP addresses, including residential proxy addresses, are the problem.

People who want to use e.g. Overpass Turbo do have the option to point it at a different Overpass API instance. If you’re using Overpass Turbo and you get an error due to unavailability, that is likely because the Overpass API instance it is using is overwhelmed. There are other public Overpass API instances, but they may not be complete (in terms of geography or history) or up to date.

At this point, if you’re one of the people who created the problem, you’ll likely just spin up more instances to retry after timeouts and make the problem worse. Most people reading this are, I hope, not in that category. There are commercial Overpass API providers - more details for the example in that table can be found here.

Other people (including me) might wonder whether it’s possible (without too much work) to set up an Overpass API server that just covers one or two countries. To keep it simple, let’s restrict ourselves to Britain and Ireland (2.3GB in OSM .pbf form), and let’s not worry about attic data (used for “queries about what was in OSM in the past”) or metadata.

Let’s just try and do regular Overpass queries such as you might start from this taginfo page, like this. I’ll also only target the Overpass API, and will use “settings” in an Overpass Turbo instance to point to my Overpass API server. I do want to apply updates as OSM data is changed.

I’m interested in creating a server covering the UK and Ireland. In terms of size, have a look at how much bigger or smaller your area of interest is than the 2.3GB of Britain and Ireland below and use that to judge what size server you might need.

Documentation

At this point it’s perhaps worth mentioning that the documentation around Overpass is … (and I’m channelling my inner Sir Humphrey here) “challenging”.

There’s the OSM wiki which talks about “Debian 6.0 (squeeze) or Debian 7.0 (wheezy)”, the latter of which went EOL in May 2018. There is also an HTML file on overpass-api.de. That is … (engages Sir Humphrey mode again) not entirely accurate, in that it says to run something that doesn’t exist if you’ve cloned the github repository.

One of the best documents by far is external and is by ZeLonewolf; it starts off by saying “I found the existing guides to be lacking”. It then says “This is a combination of various other guides, the official docs, Kai Johnson’s diary entry…”

(which is the other “best document”)

“… and suggestions from @mmd on Slack. This guide is intended to demonstrate how to configure a server dedicated to only running overpass”. Kai’s diary entry from 2023 is definitely worth reading (sample quote there “Running an Overpass server is not for the faint of heart. The software is really finicky and not easy to maintain. You need to have some good experience with Linux system administration and the will and patience to deal with things that don’t work the way they’re supposed to”).

Also, this github issue (and things linked from it) summarises some of the issues that I had on the way to getting my test server set up.

Where what I’m doing below differs from what the other guides say, I’ll try to say why I am doing it differently. Usually it’s because my requirements are different (e.g. an Overpass server for a small area rather than everywhere, on a VPS rather than a piece of tin, or needing only limited functionality).

Server

For my use case, we’ll need a server that is publicly accessible on the internet to do this. I’m already a customer of Hetzner, so I’ll create a test server there. Other providers are available, and may make more sense depending where you are in the world and how much you want to pay. For testing, spinning up something at one of the hyperscalers might make financial sense, but I suspect not long-term. I went with a CX43 with 160GB of SSD disk space, 16GB RAM and a rather large amount of bandwidth. This turned out to be about the right size for Britain and Ireland. I went with Debian 13 and public ipv4 and ipv6 addresses. I don’t know if Overpass releases need a particular architecture, but went with “x86” rather than “ARM” just in case.

If your needs are different you don’t have to use a cloud server for this, and Kai’s diary entry has a lot of information about physical server sourcing and setup.

Sizing was alas largely guesswork and trial and error - while I’m sure that the commercial providers know chapter and verse on this, there isn’t a lot written down about “sizing based on extract size” that isn’t “how long is a piece of string”. I found that loading even North Yorkshire (just 56MB in OSM) created a nodes file in the database area of 23GB, so that sets the minimum server size, even for very small test extracts.

The disk needs to be fast enough to apply updates in less time than the period the updates cover - if it takes 2 hours to apply 1 hour of updates, your server will never catch up. In practice I didn’t find this to be an issue with the servers at Hetzner and the relatively small extracts that I was working with.

Initial server setup

In what follows I’ll use youruseraccount, yourserver and yourdomain in place of the actual values I used.

I already have some ssh keys stored at Hetzner, so when buying the server, I chose a new name in the format “yourserver.yourdomain” and added my ssh keys. I have yourdomain registered at a DNS provider, and I added the IPV4 and IPV6 addresses there. I can now ssh in as root to “yourserver.yourdomain”, and run the usual:

ssh -l root yourserver.yourdomain
apt update
apt upgrade

and bounce the server and log back in again.

The next job is to create a non-root account for regular use and add it to the “sudo” group:

useradd -m  youruseraccount
usermod -aG sudo youruseraccount
chsh -s /bin/bash youruseraccount

I’ll create a new password in my password manager for youruseraccount on this server (obviously I used my real account name rather than youruseraccount, but you get the idea…). Next, set the new account’s password to the newly chosen one

passwd youruseraccount

and check I can login to the new server as youruseraccount with that password, and become root:

ssh -l youruseraccount yourserver.yourdomain
sudo -i 
exit

Install some initial software:

sudo apt install emacs-nox screen git tar unzip wget bzip2 net-tools curl apache2 wget g++ make expat libexpat1-dev zlib1g-dev libtool autoconf automake locate 

That list includes both software prerequisites (apache2) and things that will be really useful (screen). It also includes emacs as a text editor; you can use your preferred one instead wherever emacs is mentioned below.

To use screen you just type screen and then press return. You can manually detach from it by using ^a^d and later reattach by using “screen -r”. If there are multiple screens you can attach to you’ll see something like this:

There are several suitable screens on:
    95207.pts-2.h23 (02/15/2026 09:20:20 AM)        (Detached)
    95200.pts-2.h23 (02/15/2026 09:19:57 AM)        (Detached)
    1633.pts-2.h23  (02/14/2026 12:37:50 PM)        (Attached)
Type "screen [-d] -r [pid.]tty.host" to resume one of them.

and you can choose which one to reconnect to with e.g. “screen -r 95207”. To force a reconnection to a screen that something else is attached to, use “screen -d -r”.

In many cases below I’ll say “(in screen)” - this just means it’s a good idea to run these commands from somewhere that you can detach from and reattach to. It doesn’t mean you need to create a new screen every time.

The ssh keys that I had stored were added for root by Hetzner, but I want to add them to my new account too:

sudo -i
sudo -u youruseraccount -i
ssh-keygen -t rsa
(either use existing password for ssh passphrase, or create and store a new one)
exit

cp /root/.ssh/authorized_keys /home/youruseraccount/.ssh/
emacs /home/youruseraccount/.ssh

… emacs opens that directory in dired; use it to change the ownership of the copied authorized_keys file from root to youruseraccount.
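
If you’d rather do the ownership change from the shell than from dired, a plain chown as root does the same job (the trailing colon sets the group to youruseraccount’s login group):

chown -R youruseraccount: /home/youruseraccount/.ssh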

Next, check that you can ssh in to yourserver.yourdomain without a password. Then disable regular password access - we don’t want people to be able to brute-force password access to a server on the internet, so we can just turn it off.

sudo emacs /etc/ssh/sshd_config

Find the line that says

# To disable tunneled clear text passwords, change to "no" here!

and uncomment and change the next two lines to say

PasswordAuthentication no
PermitEmptyPasswords no

save the file and then

sudo /etc/init.d/ssh restart

and then try and login (from the shell on that machine will work as a test)

ssh 127.0.0.1

It should say Permission denied (publickey).

Setting up a certificate is the next priority. Everything on the internet these days pretty much assumes https access, so let’s do that before even thinking about overpass. I’ll use acme.sh for that. Other providers and tooling are available and you can use them if you prefer. Login as your non-root account and then:

sudo -i
cd
wget -O -  https://get.acme.sh | sh -s email=youremailaddress
exit
sudo -i
/etc/init.d/apache2 stop
acme.sh --standalone --issue -d yourserver.yourdomain -w /home/www/html  --server letsencrypt

The last lines of the output should look like:

-----END CERTIFICATE-----
[Sat Feb 14 12:51:45 AM UTC 2026] Your cert is in: /root/.acme.sh/yourserver.yourdomain_ecc/yourserver.yourdomain.cer
[Sat Feb 14 12:51:45 AM UTC 2026] Your cert key is in: /root/.acme.sh/yourserver.yourdomain_ecc/yourserver.yourdomain.key
[Sat Feb 14 12:51:45 AM UTC 2026] The intermediate CA cert is in: /root/.acme.sh/yourserver.yourdomain_ecc/ca.cer
[Sat Feb 14 12:51:45 AM UTC 2026] And the full-chain cert is in: /root/.acme.sh/yourserver.yourdomain_ecc/fullchain.cer

Next do

sudo a2ensite default-ssl
sudo a2enmod ssl
sudo systemctl reload apache2

and then edit the default site config

sudo emacs /etc/apache2/sites-enabled/default-ssl.conf

Replace the SSL references with the correct ones.

SSLCertificateFile      /root/.acme.sh/yourserver.yourdomain_ecc/fullchain.cer
SSLCertificateKeyFile   /root/.acme.sh/yourserver.yourdomain_ecc/yourserver.yourdomain.key

Restart apache

sudo systemctl restart apache2

and browse to https://yourserver.yourdomain to make sure that the certificate is working. You’ll need to arrange for that certificate to be renewed every couple of months, but let’s concentrate on overpass for now.
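
Incidentally, acme.sh normally installs a renewal cron entry for root as part of its own installation; assuming you kept that default, a quick check that it is there is:

sudo crontab -l | grep -i acme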

That is it for the initial server setup, so now would be a good time for a server snapshot or other sort of backup.

Setting up the user for Overpass

For this part, we’re going to follow parts of ZeLonewolf’s guide. I’ve reproduced that mostly as written below, although some of the software was already installed earlier.

sudo su

mkdir -p /opt/op
groupadd op
usermod -a -G op youruseraccount
useradd -d /opt/op -g op -G sudo -m -s /bin/bash op
chown -R op:op /opt/op
apt-get update
apt-get install g++ make expat libexpat1-dev zlib1g-dev apache2 liblz4-dev curl git
a2enmod cgid
a2enmod ext_filter
a2enmod headers

exit

The username that we created above is “op”. We won’t use a password for that but will just use

sudo -u op -i

when we need to change to it from our normal user account.

Configuring Apache

We already have Apache set up with a default HTTPS website that says “It works!”. We’ll use some of what’s in ZeLonewolf’s guide, but we DON’T want to completely replace our config with that one. Instead we’ll selectively copy in some sections. Edit the existing file:

sudo emacs /etc/apache2/sites-available/default-ssl.conf

Note that because we are using https with the defaults, the filename is different from the one in the example.

Find this line:

DocumentRoot /var/www/html

and after it insert this section:

# Overpass API (CGI backend)                                                                          
ScriptAlias /api/ /opt/op/cgi-bin/

<Directory "/opt/op/cgi-bin/">
        AllowOverride None
        Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
        Require all granted

        # CORS for Overpass Turbo                                                                     
        Header always set Access-Control-Allow-Origin "*"
        Header always set Access-Control-Allow-Methods "GET, POST, OPTIONS"
        Header always set Access-Control-Allow-Headers "Content-Type"
</Directory>

# Compression (for API responses)                                                                     
ExtFilterDefine gzip mode=output cmd=/bin/gzip

# Logging                                                                                             
ErrorLog /var/log/apache2/error.log
LogLevel warn
CustomLog /var/log/apache2/access.log combined

# Long-running Overpass queries                                                                       
TimeOut 300

I then deleted a bunch of lines - all comments or functional duplicates of what we had just added - down to but not including:

#   SSL Engine Switch:                                                                                

Save and restart apache:

sudo /etc/init.d/apache2 restart

and check that you can still browse to “https://yourserver.yourdomain”. It won’t look any different as the default website has not been changed; we’ll test the “cgi-bin” parts later.

Compile and Install Overpass

This is drawn directly from ZeLonewolf’s guide. Note that this downloads a release tarball - it does NOT clone the github repository and build from that. At the time of writing the latest version is “v0.7.62.10”, so you’ll see that number below.

sudo su op

cd
wget https://dev.overpass-api.de/releases/osm-3s_latest.tar.gz
tar xvzf osm-3s_latest.tar.gz

cd osm-3s_v0.7.62.10/

time ./configure CXXFLAGS="-O2" --prefix=/opt/op --enable-lz4

That took 5s when I ran it. Next:

time make install

That took 9 minutes. Next:

cp -pr cgi-bin ..
cd
chmod -R 755 cgi-bin
mkdir db
mkdir diff
mkdir log
cp -pr osm-3s_v0.7.62.10/rules db

Those three newly created directories are for the database, the diff files and the logfiles. In operation, by far the biggest will be “db” - we can expect our 2.3GB .pbf extract to create a database of initially 80GB or so. We’ll talk more about this later.

Loading OSM Data

The equivalent section of ZeLonewolf’s guide is called “Download the Planet”. We don’t actually want to do that - we just want a data extract for our area of interest.

I’ll download a Geofabrik extract in my normal user account and make sure that it is accessible to the “op” user. Firstly browse to (in my case) https://download.geofabrik.de/europe/britain-and-ireland.html . There is a link there to https://download.geofabrik.de/europe/britain-and-ireland-latest.osm.pbf and a comment that says something like “This file was last modified 22 hours ago and contains all OSM data up to 2026-02-12T21:23:29Z”.

When logged in as youruseraccount:

mkdir ~/data
cd ~/data
time wget https://download.geofabrik.de/europe/britain-and-ireland-latest.osm.pbf

I then moved the file so that the filename contained the timestamp

mv britain-and-ireland-latest.osm.pbf britain-and-ireland_2026-02-12T21:23:29Z.osm.pbf

That is a .pbf format download - that format was introduced to OSM around 2010 and is basically pretty standard now. Unfortunately, Overpass still needs the previously used .bz2 format, but we can convert it:

(in screen)
sudo apt install osmium-tool
time osmium cat britain-and-ireland_2026-02-12T21\:23\:29Z.osm.pbf -o britain-and-ireland_2026-02-12T21\:23\:29Z.osm.bz2

That took around 1 hour 20 minutes (and, frustratingly, the progress bar looks like it was written by someone from Windows 2000) - don’t cancel it if it appears to be stuck; instead have a look to see whether it is actually writing out a file. If you want to verify the resulting file:

(in screen)
time bzip2 --test britain-and-ireland_2026-02-12T21\:23\:29Z.osm.bz2 

That took around 11 minutes for me.
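
As an aside, the “data up to” timestamp used in the filename can be read from the file itself rather than from the download page. Assuming the Geofabrik file carries the usual osmosis_replication_timestamp header (they normally do), osmium’s fileinfo subcommand will print it:

osmium fileinfo -g 'header.option.osmosis_replication_timestamp' britain-and-ireland_2026-02-12T21\:23\:29Z.osm.pbf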

Still as youruseraccount, make the download area readable by the “op” user:

chmod o+rx ~
chmod o+rx ~/data

If you’re not comfortable with this then you can of course copy or move the file as root later.

Configure launch scripts

This is based on ZeLonewolf’s guide again, which in turn uses scripts that Kai Johnson wrote.

As the overpass user:

mv bin bin.bak && mkdir bin
git clone --depth=1 https://github.com/ZeLonewolf/better-overpass-scripts.git bin
rm -rf bin/.git

and we’ll need to copy some things from the build into that directory. This will include at least:

cp /opt/op/osm-3s_v0.7.62.10/bin/update_database bin/
cp /opt/op/osm-3s_v0.7.62.10/bin/update_from_dir bin/
cp /opt/op/osm-3s_v0.7.62.10/bin/osm3s_query bin/
cp /opt/op/osm-3s_v0.7.62.10/bin/dispatcher bin/

but I actually copied everything that was missing into the new “bin” directory. We installed “locate” above; if anything has been inadvertently missed you can use e.g. “locate nameofmissingthing” to find it. This is a bit messy, and it’d be great to have something a bit more solid with less of the “porcine face paint applicator” feel to it, but I did not want to go too far down that road as I was trying to set something up “without too much work”.
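
One wrinkle: locate searches a prebuilt index rather than the live filesystem, so refresh the index first if it finds nothing - for example:

sudo updatedb
locate osm3s_query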

Change the scripts to work with data extracts and no attic or meta data

We’re going to load a data extract from Geofabrik, and we’d also like to be able to update it with changes as other people update OSM. Normally the workflow that I’d suggest for this sort of thing is to download minutely updates from https://planet.osm.org, use trim_osm.py to snip them down to the area that we’re interested in and then apply those as updates.

By default, Overpass runs with planet.osm.org minutely diffs, but alas I’ve struggled to get those to work with a data extract; the updater falls over when it finds certain sorts of data in the diff files that it is not expecting (i.e. that was never originally loaded). However, Geofabrik provides daily diff files that match their extracts, so we can use those instead.

Also, we’re only interested in “now” data - we’re not creating an Overpass server with “attic” data that allows us to query data from back in 2012.

We therefore have to make a bunch of changes to scripts.

startup.sh

In there, we will change “https://planet.openstreetmap.org/replication/minute” to “https://download.geofabrik.de/europe/britain-and-ireland-updates”.

We’ll change --meta=attic to --meta=no because we’re not doing anything with “attic” data.

We’ll remove the --attic from the “dispatcher” call.
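
If you’d rather make those three edits non-interactively, something like this should do it (a sketch - it assumes the URL and the flags appear verbatim in bin/startup.sh, so eyeball a diff afterwards):

sed -i 's#https://planet.openstreetmap.org/replication/minute#https://download.geofabrik.de/europe/britain-and-ireland-updates#' bin/startup.sh
sed -i 's/--meta=attic/--meta=no/; s/ --attic//' bin/startup.sh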

apply_osc_to_db.sh

We’ll change EXPECTED_UPDATE_INTERVAL from 57 to 3557 or even longer. We’re expecting files once a day, not once a minute, but checking every hour is not too bad.
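
Again as a sketch, assuming the variable is assigned on its own line in the script:

sed -i 's/^EXPECTED_UPDATE_INTERVAL=.*/EXPECTED_UPDATE_INTERVAL=3557/' bin/apply_osc_to_db.sh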

log file management

There’s a section in ZeLoneWolf’s guide that covers this.

Log files will grow, and will eventually need a log rotation mechanism to be set up, but let’s gloss over that for now as I’m eager to see Overpass actually running!

Server automation

See ZeLoneWolf’s guide.

I have deliberately not done this yet as I don’t want to automatically do anything; rather I’d like to control it manually so that I can watch that it does what it is supposed to.

Load the data

(in screen)
time bin/init_osm3s.sh /home/youruseraccount/data/britain-and-ireland_2026-02-12T21\:23\:29Z.osm.bz2 "db/" "./" --meta=no

That took about 77 minutes for me. Lots of files will have been created in “db”. A quick check on disk usage is in order:

df .
Filesystem     1K-blocks     Used Available Use% Mounted on
/dev/sda1      157207480 76407544  74363468  51% /

du -BG db/* | sort -n -r | head
53G     db/nodes.map
9G      db/ways.map
3G      db/ways.bin
3G      db/nodes_meta.bin
3G      db/nodes.bin
2G      db/way_tags_global.bin
2G      db/ways_attic.map
2G      db/nodes_attic.map
1G      db/way_tags_local.bin.idx
1G      db/way_tags_local.bin

It’s worth noting that those are large numbers for an extract. The 2.3GB data extract has created a 53GB nodes.map file. Compression is supported, but I haven’t tested it.

Set up replicate_id

There’s a file in the db directory (which will be created if it does not already exist) that determines the point to start consuming diffs from. These sequence numbers vary by server: the number corresponding to planet.osm.org replication for a certain date will be different from the one for Geofabrik replication for the same date.

In our example we’re using Geofabrik data from 12th Feb 2026. We can browse through https://download.geofabrik.de/europe/britain-and-ireland-updates/ and https://download.geofabrik.de/europe/britain-and-ireland-updates/000/004/ until we find the immediately prior state file https://download.geofabrik.de/europe/britain-and-ireland-updates/000/004/693.state.txt , which contains sequenceNumber=4693. This means that 4693 is our magic number.

We’ll therefore edit the replicate_id file (creating it if it does not exist) and write 4693 (with a linefeed after) to it.
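
From the op account that is just (with paths relative to /opt/op, as before):

echo 4693 > db/replicate_id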

Before we do anything else, now is a good opportunity for another snapshot.

Start overpass

If this isn’t the first time you’ve started overpass you may want to take backup copies of previous “diff” directories or “log” files. Then:

bin/startup.sh

You should see something like this:

[2026-02-15 12:43:14] INFO: Starting Overpass API components...
[2026-02-15 12:43:14] INFO: Starting base_dispatcher...
[2026-02-15 12:43:14] INFO: Cleaning up stale files...
[2026-02-15 12:43:14] INFO: base_dispatcher is running (PID: 107771)
[2026-02-15 12:43:14] INFO: Starting area_dispatcher...
[2026-02-15 12:43:14] INFO: area_dispatcher is running (PID: 107783)
[2026-02-15 12:43:14] INFO: Starting apply_osc...
[2026-02-15 12:43:14] INFO: apply_osc is running (PID: 107795)
[2026-02-15 12:43:14] INFO: Starting fetch_osc...
[2026-02-15 12:43:14] INFO: fetch_osc is running (PID: 107835)
[2026-02-15 12:43:14] INFO: Performing final verification...
[2026-02-15 12:43:16] INFO: base_dispatcher verified (PID: 107771)
[2026-02-15 12:43:17] INFO: area_dispatcher verified (PID: 107783)
[2026-02-15 12:43:17] INFO: apply_osc verified (PID: 107795)
[2026-02-15 12:43:17] INFO: fetch_osc verified (PID: 107835)
[2026-02-15 12:43:17] INFO: All Overpass components started successfully

[2026-02-15 12:43:17] INFO: === Process Status ===
  base_dispatcher      PID: 107771
  area_dispatcher      PID: 107783
  apply_osc            PID: 107795
  fetch_osc            PID: 107835

In the directories below “diff”, you should see that it has downloaded daily diffs for any days since your extract, for example:

  /opt/op/diff/000/004: (56 GiB available)
  drwxrwxr-x 2 op op    4096 Feb 16 01:07 .
  -rw-rw-r-- 1 op op 3874289 Feb 16 01:07 697.osc.gz
  -rw-rw-r-- 1 op op     113 Feb 16 01:07 697.state.txt
  -rw-rw-r-- 1 op op 3033325 Feb 15 12:43 696.osc.gz
  -rw-rw-r-- 1 op op 3405594 Feb 15 12:43 695.osc.gz
  -rw-rw-r-- 1 op op 3057997 Feb 15 12:43 694.osc.gz
  -rw-rw-r-- 1 op op     113 Feb 15 12:43 695.state.txt
  -rw-rw-r-- 1 op op     113 Feb 15 12:43 696.state.txt
  -rw-rw-r-- 1 op op     113 Feb 15 12:43 694.state.txt
  drwxrwxr-x 3 op op    4096 Feb 15 12:43 ..

In “log” you should see something like:

  /opt/op/log: (56 GiB available)
  -rw-rw-r--  1 op op  12111701 Feb 16 23:39 apply_osc_to_db.out
  drwxr-xr-x 13 op op      4096 Feb 16 20:14 ..
  drwxrwxr-x  2 op op      4096 Feb 15 12:43 .
  -rw-rw-r--  1 op op         0 Feb 15 12:43 osm_base.out
  -rw-rw-r--  1 op op         0 Feb 14 14:26 fetch_osc.out
  -rw-rw-r--  1 op op         0 Feb 14 14:26 areas.out

Testing standalone

At the command line type:

bin/osm3s_query

Paste in this:

<query type="nwr"><bbox-query n="51.96" s="51.86" w="-3.31" e="-3.22"/><has-kv k="amenity" v="pub"/></query><print/>

Press return. Press ^d. A selection of data will be returned.
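
osm3s_query also understands Overpass QL as well as XML, so - assuming you’re still in /opt/op with the dispatcher running - the equivalent one-liner is:

echo 'nwr["amenity"="pub"](51.86,-3.31,51.96,-3.22);out;' | bin/osm3s_query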

Testing from Overpass Turbo

In a web browser, browse to https://overpass-turbo.eu/s/2kEW .

Click “settings”. Change “server” from “https://overpass-api.de/api/” to “https://yourserver.yourdomain/api/”. Click “run”. You should not get an error, and should get a couple of nodes and 4 ways returned.

For the avoidance of doubt - if you browse to “https://yourserver.yourdomain/” you’ll get some sort of “It works!” page. If you browse to “https://yourserver.yourdomain/api/” you’ll actually get an error - it’s designed to be accessed (see the CORS settings above) by Overpass Turbo, not a regular browser.
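
You can also hit the API directly from the command line, which is handy for scripting. This is a sketch of the same pub query in Overpass QL, POSTed to the interpreter CGI that the ScriptAlias above exposes under /api/ (note the QL bounding box order: south, west, north, east):

curl -s 'https://yourserver.yourdomain/api/interpreter' --data-urlencode 'data=[out:json];nwr["amenity"="pub"](51.86,-3.31,51.96,-3.22);out;'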

Now what?

Shutting everything down and taking a snapshot of the server is a good idea at this point. The long-term cost of snapshots is small (€0.20 per month or so). The cost of leaving a server of this specification running 24x7 isn’t that large either - around €10 a month, perhaps a couple of beers or a couple of fancy coffees.

You might also want to think about setting up an Overpass server that does include metadata and attic data - but you’re probably better off with a dedicated server for that, and better off following one of the other guides linked above.

Edit: Minor clarification re use of Overpass API URL following a question on IRC.

Monday, 16. February 2026

OpenStreetMap User's Diaries

Lincolnshire Flood Emergency Routes Out


Lincolnshire ER OUT Routes

Hello! This is my first diary entry, and I wanted to dedicate it to the forum post I made about what I believe are the UK’s only ER OUT routes, used in the case of emergencies - mainly flooding in this case.

Overview

After major flooding in 2013, the council created the Lincolnshire ER routes to enable people to evacuate quickly from flood areas. Many of you may have driven past these and never even noticed! They are red rectangular signs with the white text “ER out” and a direction to follow. They are placed at every turn, so evacuees follow the road ahead until a sign says otherwise.

Example Sign

https://www.geograph.org.uk/photo/6485754

Route End

The end of the route signifies that the evacuees are clear of the major flood risk, and (presumably) there would be further guidance at that point. The route end sign is the same as the direction signs, except that it features five black diagonal lines.

https://www.geograph.org.uk/photo/6033649

OSM Mapping

In the forum post I have included some proposed tags, along with Insert User, who has suggested some changes to the signage.

I will be unable to fully map these routes out as I rarely venture to the south of Lincolnshire. If you live near one of these routes please do help to map these! I presume it will take a while to map all of the routes but I think it will be worth it in the event of any flooding within the region!

Please do not hesitate to contribute to the forum post!


E65 CENTRAL GREECE

In applications built on OpenStreetMap (certainly in Mapy, OsmAnd and Organic Maps, perhaps elsewhere too), the E65 is WRONGLY shown as the old Lamia-Domokos-Farsala-Larissa road, rather than CORRECTLY as the Thermopylae-Kalabaka motorway (still unfinished further north, up to the junction with the Egnatia). For foreign travellers in particular this is hugely confusing.


FOSSGIS e.V. / OSM Germany

Only a few weeks until FOSSGIS 2026 in Göttingen - anticipation is building


The FOSSGIS Conference 2026 takes place from 25 to 28 March 2026 in Göttingen and online. There are only a few weeks left until the conference; anticipation is growing steadily and preparations are in full swing!

The conference is organised by the non-profit FOSSGIS e.V. and the OpenStreetMap community in cooperation with the Institute of Geography of the Georg-August-Universität Göttingen, and takes place on the university’s campus.

Once again this year, interest in the conference is shaping up to be high, with registrations rising from week to week. Fortunately the university’s central lecture hall building offers plenty of space, so this could become the biggest FOSSGIS conference yet.

FOSSGIS Konferenz 2026 Göttingen

FOSSGIS 2026 programme and schedule

The FOSSGIS team is once again looking forward to an exciting programme with numerous talks, expert Q&A sessions, demo sessions, BoFs, user group meetings, and 28 workshops. The conference programme runs from Wednesday to Friday in the central lecture hall building (ZHG) of the University of Göttingen. On Saturday, the OSM Saturday and the community sprint take place at the Faculty of Geoscience and Geography on the north campus.

https://www.fossgis-konferenz.de/2026/programm/

This year the conference already starts on Tuesday 24 March 2026 at 10 am with longer workshops (180 minutes). Choose from 7 workshops (see the programme) and travel in on the Tuesday. The workshops are aimed at beginners as well as advanced users, and places are still available. Do book a workshop and take the chance to build up knowledge on a topic in a short time.

FOSSGIS connects - user group meetings and community sprint

Around and during the conference there are numerous opportunities to network. The catering during the breaks, combined with the company exhibition and the poster exhibition, takes place in the foyer of the ZHG, as does the evening event on the first day of the conference. For subject-specific networking there are the user group meetings, expert Q&A sessions and further community sessions; online participation is possible. https://www.fossgis-konferenz.de/2026/socialevents/

A rich supporting programme

This year we are pleased to offer a varied supporting programme with exciting excursions and meetings at interesting venues around Göttingen. FOSSGIS also stands for networking, which is possible from Tuesday evening onwards: the Geochicas invite everyone to a meetup, and the unofficial kick-off, a joint dinner (self-paying), welcomes all conference participants who have already arrived.

All information can be found at https://www.fossgis-konferenz.de/2026/socialevents/

FOSSGIS Conference 2026 sponsors

Many thanks to the sponsors of the conference, whose support contributes substantially to financing the event. Become a FOSSGIS sponsor yourself - we are grateful for any further support. Information is available at https://fossgis-konferenz.de/2026/#Sponsoring

FOSSGIS Konferenz 2026 Sponsoren

FOSSGIS - a team event

FOSSGIS thrives on volunteer commitment: numerous helpers get involved and take on the most varied of tasks before and during the conference. Many thanks for that!

Helpers are still being sought, in particular for chairing sessions, supporting the speakers in the lecture halls, and helping with the catering; see https://www.fossgis-konferenz.de/2026/helfen/.

OSM Saturday and community sprint

On Saturday 28 March 2026, the OSM Saturday and the community sprint take place in the rooms of the Institute of Geography at Goldschmidtstr. 3-5, 37073 Göttingen - an opportunity to get talking, to pitch in at the community sprint, or to build up know-how. Everyone is warmly welcome to take part: https://pretalx.com/fossgis2026/talk/VVYN7A/.

Stay informed about the conference

Information about FOSSGIS can be found under the hashtag #FOSSGIS2026. We use the hashtag #FOSSGIS2026 for updates on social media - please use it too, to tie the social media activity together.

FOSSGIS conference archive

The FOSSGIS archive contains the homepages of past conferences, including programmes and videos: https://fossgis-konferenz.de/liste.html.

The FOSSGIS 2026 team wishes everyone a safe journey and looks forward to an exciting conference in Göttingen

Sunday, 15. February 2026

weeklyOSM

weeklyOSM 812


05/02/2026-11/02/2026

lead picture

[1] The osm-mapper-globe by Martijn van Exel | map data © by OpenStreetMap Contributors.

Mapping

  • User AndreaDp271 is seeking comments on their proposal of a tagging scheme for civil protection areas used in case of large scale emergencies. Please review the proposal and share your feedback to help refine the technical details and address any potential issues.
  • Voting on the tagging scheme for advisory access restriction signage on destination signs proposal is open until Saturday 21 February.

Mapping campaigns

  • Henry Wilkinson has mapped the Dundas West Station in Toronto, Canada, using the LiDAR sensor on an iPhone 17 Pro, in combination with the Niantic Scaniverse app, to capture 3D data. He then reconstructed the digital 3D scene with Meshroom and Blender before uploading the results to OpenStreetMap through JOSM (using the PicLayer plugin) to align imagery, and iD for streamlined indoor tagging. The completed work enabled detailed indoor mapping of the station, now viewable on OpenLevelUp.

Community

  • darkonus wrote in their diary about elliptical toponyms (geographical names in which generic terms disappear over time) and explained why it is important for OpenStreetMap mappers to verify the full forms of names. The author examined various cases and focused on Ukrainian microtoponyms, emphasising that abbreviations in sources or in speech do not always indicate a change in the proper name.
  • Bart Louwers and others have released the January 2026 edition of the MapLibre Newsletter.
  • Matt Whilden has made an Ultra query to create a map that renders the nickname tag of places in OpenStreetMap.

OpenStreetMap Foundation

  • Paul Norman reported that the OpenStreetMap Operations Team has recently made several improvements to the tile.openstreetmap.org raster map tile service.
  • Minh Nguyễn reported that the OSM Wiki has just switched the CAPTCHA used in the account creation process from hCaptcha to Cloudflare Turnstile, aiming to improve protection against bots.

Local chapter news

  • Oliver Rudzick announced that FOSSGIS e.V. will once again host the FOSSGIS–OSM Community Meeting at the Linuxhotel in Essen. The event is scheduled for the extended first May weekend, from Thursday 30 April to Sunday 3 May. Additional details are available on the event’s Wiki page.
  • OpenStreetMap United States 2026 board candidate nominations closed on 8 February 2026. There are five candidates and you can read their position statements on the Wiki.

Events

  • The FOSS4G 2026 organising team announced that the Call for Proposals (closing 16 March) and Travel Grant Programme (closing 28 February) are now open for the global conference to be held in Hiroshima, Japan (30 August to 5 September 2026).
  • Developers are invited to register for a free GeoSolutions instructional webinar on the Geonode 5.0 open-source software, a web-based application and platform for developing geospatial information systems and for deploying spatial data infrastructures. The webinar will be held on Tuesday 24 February at 4 pm GMT.
  • Recordings of the State of the Map CZ/SK 2025 sessions are available on its Peertube channel. The details can be found in the event’s schedule.
  • The State of the Map US call for proposals closes on 16 February. OpenStreetMap US invites you to share your presentation ideas. Looking for inspiration? Check out the recorded talks from previous conferences. They also have Mapping USA recordings available.

OSM in action

Software

  • [1] Martijn van Exel has developed the osm-mapper-globe, a visualisation dashboard that enables users to watch OpenStreetMap edits in real time on an interactive globe. The code is available on Codeberg under an ISC licence.
  • Stefan blogged about GraphHopper’s route optimisation API and ‘stop timing’, a new feature accounting for location overhead (the time spent at a stop on a delivery route).

Programming

  • Martijn van Exel has developed cosmo, a command-line tool for filtering and converting OSM PBF data into GeoJSON or Parquet formats.

Releases

  • The CoMaps team released version 2026.02.09 featuring OSM data from 7 February and automatic updates of the check_date tag when adding/editing POI. Furthermore, the search now supports Scandinavian letters (æøå), on Android you can now disable the speed limit display during car navigation, and on iOS in the EU you can now set CoMaps as your default map app.
  • Sarah Hoffmann (aka lonvia) announced enthusiastically that Photon 1.0.0 was released with a lot of improvements in its search engine. She also thanked GraphHopper, Komoot, and Entur for their continued support of Photon development, which has made this release possible.

Did you know that …

  • … the Academic Computer Club, at Umeå University, and the Oregon State University Open Source Lab contribute to OpenStreetMap by hosting tile render servers? If you also want to help OpenStreetMap run a tile server in your region, check this guide.
  • … according to OpenStreetMap’s tile usage policy, it is recommended to include a ‘Report a map issue‘ link, so your app users can help improve the data?
  • … Séverin Ménard (Les Libres Géographes) gave a keynote titled Le numérique, vecteur d’une appropriation collective des données environnementales (“Digital technology as a vector for the collective appropriation of environmental data”)? It was part of the Cycle Annuel 2025, held by the Institut des hautes études d’aménagement des territoires. He showed the application of open and collaborative data and detailed the mapping carried out in Mayotte. The explanation of this mapping effort was published in weeklyOSM issues 770 and 775.
  • … the General Bikeshare Feed Specification (GBFS) has used the OpenStreetMap opening_hours format since version 3.0? GBFS is an open data standard designed to make it easier to discover and use shared mobility services.
  • … the statistics of uMap instances are also published as a chart?

OSM in the media

  • Historian Arseniy Chuhuy outlined how place names in Crimea changed during the imperial and Soviet periods and described efforts to restore historical, especially Crimean Tatar, toponyms. He also noted practical issues in the process, including duplicate names and settlements without clearly documented historical alternatives. The article is illustrated with a map by Ukrainian OpenStreetMap contributor and cartographic designer Fedir Gontsa.

Other “geo” things

  • The Editora IVIDES is accepting applications for the selection process to form the scientific committee for volumes 2 and 3 of the series Case studies in collaborative and participatory mapping (in Portuguese). Raquel Dezidério Souto, editor of the series, reported in her OSM user diary that the first volume was released in August 2025 and has already been downloaded by over a thousand people.
  • The Government of Portugal has published a list of municipalities affected by Storm Kristin that are now under a state of public calamity. A humanitarian mapping effort is emerging, and you can learn more about this and other topics by following the Portuguese OpenStreetMap community group on Telegram and Discord.
  • The UCL Warning Research Centre invites you to their free and hybrid launch of the UCL Warning Database, which will occur on 4 March, from 3 pm to 4 pm GMT, at Room 225, Central House, Bloomsbury.
  • Miguel Alvarez has taken a look at the maps of Leonardo da Vinci. Further detail of Leonardo’s mapping career can be found in Christopher Tyler’s 2017 article.
  • Will Dunham wrote in The Japan Times about radar data which shows a cavernous underground lava tube on Venus. Lava tubes are also found in certain volcanic locations on Earth and its moon and are believed to be present on Mars.
  • Melinda Laituri, of Colorado State University, discussed women’s contributions to cartography. We previously reported that Daniel Meßner had highlighted the American geologist and cartographer Marie Tharp and her contribution to the cartographic depiction of the Mid-Atlantic Ridge.

Upcoming Events

Where Venue What When
Delhi Dosa Coffee, Model Town ILUGD Meetup × OSM Delhi Mapping Party No.26 (North Zone) 2026-02-15
Lancaster Lancaster University Welcome center 1 Lancaster FoMSF Mapathon 2026-02-16
EPN d’Arlon, rue de Diekirch 37, Arlon EPN d’Arlon – OpenStreetMap – Contribution 2026-02-17
Milano Building 3A Ground Floor – Politecnico di Milano PoliMappers Maptedì 2026-02-17
Missing Maps London: (Online) Mid-Month Mapathon [eng] 2026-02-17
Online Mappy Hour OSM España 2026-02-17
Lyon Tubà Réunion du groupe local de Lyon 2026-02-17
Bonn Dotty’s 197. OSM-Stammtisch Bonn 2026-02-17
San Jose Online South Bay Map Night 2026-02-17
Online Lüneburger Mappertreffen (online) 2026-02-17
MJC de Vienne Réunion des contributeurs de Vienne (38) 2026-02-18
Karlsruhe Chiang Mai Stammtisch Karlsruhe 2026-02-18
Bratislava Prírodovedecká fakulta UK Bratislava Missing Maps mapathon Bratislava #12 2026-02-19
OSMF Engineering Working Group meeting 2026-02-20
Karlsruhe Geofabrik, Amalienstraße 44, 76133 Karlsruhe Karlsruhe Hack Weekend February 2026 2026-02-21 – 2026-02-22
Belfast School of Geosciences, Queen’s University Belfast Belfast Mapathon 2026-02-21
TAK Kadıköy Tasarım Atölyesi OpenStreetMap Outdoor Editing 2026-02-21
Kalyani Nagar TomTom Pune Office, India OSM Mapping Party at TomTom Pune, India 2026-02-21
Atelier Vélo Utile Rencontre OSM Saint-Brieuc 2026-02-21
Toulouse Artilect – 10, Rue Tripière – Toulouse Rencontre OSM Toulouse 2026-02-21
Mumbai Third Wave Coffee Roasters, Lokhandwala Market OSM Mumbai Mapping Party No.7 (Western Line – South) 2026-02-22
Missing Maps : Mapathon en ligne – CartONG [fr] 2026-02-23
Olomouc Přírodovědecká fakulta Univerzity Palackého Únorový olomoucký mapathon 2026-02-24
City of Edinburgh Guildford Arms, Edinburgh OSM Edinburgh pub meetup 2026-02-24
Derby The Brunswick, Railway Terrace, Derby East Midlands pub meet-up 2026-02-24
Hannover Kuriosum OSM-Stammtisch Hannover 2026-02-25
Luxembourg neimënster, Luxembourg & online MSF Luxembourg hybrid Mapathon 2026-02-25
Düsseldorf Online bei https://meet.jit.si/OSM-DUS-2026 Düsseldorfer OpenStreetMap-Treffen (online) 2026-02-25
Seattle Seattle, WA, US OpenThePaths 2026: Connecting People and Places Through Sustainable Access 2026-02-26 – 2026-02-27
Santa Clara Santa Clara University Friends of MSF Mapathon 2026-02-26
Online Asamblea General Ordinaria – Asociación OpenStreetMap España 2026-02-26
Ferrara Cimitero monumentale della Certosa di Ferrara Ferrara mapping party 2026-02-28
Messina Messina Mapping Day @ Messina 2026-02-28
New Delhi Jitsi Meet (online) OSM India – Monthly Online Mapathon 2026-03-01
Chennai Corporation Mapping Party @ Chennai 2026-03-01

Note:
If you would like to see your event here, please add it to the OSM calendar. Only events entered there will appear in weeklyOSM.

This weeklyOSM was produced by IVIDES.org, MatthiasMatthias, Raquel Dezidério Souto, Strubbl, Andrew Davidson, barefootstache, darkonus, derFred, giopera, jcr83.
We welcome link suggestions for the next issue via this form and look forward to your contributions.


OpenStreetMap User's Diaries

Thinking about tagging multilingual names


Thinking about tagging multilingual names

This is the content of my presentation at the “Mappers Summit 2026”, held on 2026-02-01 at Sakura Internet’s “Blooming Camp”.

Part 1: OSM basics

This part covers the items from the presentation that are documented in the OSM wiki - the “OSM basics”.

Please don’t dismiss this because it is “basics”: almost all editing in Japan (more than 99%) violates what is written in the OSM wiki, and many of the opinions voiced on “talk-ja” and elsewhere also appear not to have understood what the OSM wiki says.

p02

Let’s think about how to tag an intersection, using a real example.

Incidentally, this is a typical intersection near my home - it is not a special case.

  • Since it is an intersection, we tag it junction=yes.

p03

We tag “虚空蔵橋際”, the name displayed on the signpost.

  • We use “虚空蔵橋際” as-is: name = 虚空蔵橋際.

p04

Next we tag the part of the sign written in the Latin alphabet.

According to @国土交通省 (the Ministry of Land, Infrastructure, Transport and Tourism), this part is there “to support internationalisation…”.

  • As it is the internationalised part of the sign, we set int_name = Kokuzobashi.

p05

The Ministry describes this part as “romanised”, but it is actually a mixture of English and romanised Japanese.

  • We have to judge whether it is English or romaji → since it reads “bashi”, it can be considered romanised Japanese, so we set name:ja-Latn = Kokuzobashi.

p06

The romanisation on guide signs omits long vowels. The OSM wiki says they must not be omitted, so we restore the omitted long vowels.

p07

The OSM wiki says that the local-language name should be duplicated into the language-explicit subkey.

p08

The romanisation on guide signs is also abbreviated. The OSM wiki says names must not be abbreviated, so we restore the omitted part.

In this example, “際” (-giwa) has been omitted.

  • We therefore set name:ja-Latn = Kokuzō-bashi-giwa.

p09

osmwiki/JA:名称 (“Names”) says “consider using names in specific languages (name:en=…)”, so we consider adding a name:en corresponding to int_name.

  • “Consider using names in specific languages (name:en=…)” @osmwiki/JA:名称

p10

As with name:ja-Latn, we restore the omitted long vowels and abbreviated parts in name:en as well.

p11

name:en, name:ja and name:ja-Latn must be kept consistent with each other. In other words, name:en is derived from name, not from int_name.

  • Since “虚空蔵橋” in “name=虚空蔵橋際” is read “ごくぞうばし” (Gokuzō-bashi), we set name:en = Gokuzoh bridge side.

p12

Now consider what happens if the romanised part of the sign is changed.

  • If the romanised part of the guide sign is changed to “Gokuzou Bridge”, we change int_name = Gokuzou Bridge.
  • There is no need to change name:en or name:ja-Latn.
    • Under the revised signage ordinance issued in 2014 (Heisei 26), about 60% of the romanised parts had been changed as of 2026. The remaining 40% or so may be changed in the future.
    • The relevant Cabinet Notification was revised in 2025 (Reiwa 7), so further changes to the romanisation are to be expected.

p13

@osmwiki:多国語の名称 (“Multilingual names”) says “do not tag names for things that do not exist” and “you should not add name: tags for all other languages!”.

p14

Although the OSM wiki says “do not tag names for things that do not exist”, @osmwiki:名称 contradicts this with “consider using names in specific languages (name:en=…)”.

The reason name:en is treated specially is that when local names are displayed with language fallback, the English name is shown whenever the requested language does not exist.

In other words, name:en is tagging for the renderer.

  • “Do not tag names for things that do not exist” @osmwiki:多国語の名称
  • “Consider using names in specific languages (name:en=…)” @osmwiki:名称
  • “When displaying local names with language fallback, display the English name” @JA:多国語の名称#理由

p15

“Tagging name:en for the renderer” goes against OSM’s principles.

As a result, the OSM wiki contains many statements that contradict its own guidance to “consider using names in specific languages (name:en=…)”.

p16

Let’s summarise the tagging so far.

name and int_name are set to exactly what is written on the sign.

However, apart from name:ja, the values of name:en and name:ja-Latn reflect the mapper’s own reasoning and preferences.
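
Collected together, the tags from the walk-through above are (my own summary, using the values given in the slides):

junction=yes
name=虚空蔵橋際
name:ja=虚空蔵橋際
name:ja-Latn=Kokuzō-bashi-giwa
name:en=Gokuzoh bridge side
int_name=Kokuzobashi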

p17

Changing perspective for a moment: we Japanese call the Golden Gate Bridge in the USA “金門橋” (Kinmon-kyō).

Can this be entered into OSM?

p18

So what should be done with information like “金門橋” that is not displayed on the ground?

It can be reflected in OSM indirectly, by linking to Wikidata.

Linking to Wikidata disables “name:XX”

Note that once an element is linked to Wikidata, the iD editor greys out “name:XX” and it can no longer be edited.

Wikidata and OSM divide up the data they handle:

  • wikidata:
    information recorded in published sources (including web data)
  • openstreetmap:
    facts on the ground (limited to things that have a location)

So name and int_name are facts on the ground
  → enter them in OpenStreetMap (the OSM wiki recommends this too)

name:XX, on the other hand, is information not displayed on the ground
  → it can be entered in Wikidata, if published sources record it
  → it cannot be entered in OpenStreetMap (the OSM wiki, too, says not to enter data that is not signposted locally)

Once a POI is linked to Wikidata, name:XX - never appropriate as OSM data in the first place - becomes redundant.
From then on there is no point editing name:XX; it is junk data.

Linking to Wikidata

If you think of Wikidata as “something like Wikipedia”, that is a mistake.
Wikidata is better understood as “something like OpenStreetMap” than as something like Wikipedia.

Wikidata and OpenStreetMap can guarantee data consistency by complementing each other, and the mechanisms and powerful tools for doing so are already in place.

  • wiki/JA:Wikidata explains the importance of linking to Wikidata:
    • 1 Why link to Wikidata?
      • 1.1 Wikidata is not Wikipedia
    • 2 Linking from OSM to Wikidata
      • 2.1 As a tag qualifier
      • 2.2 Tools
        • 2.2.1 Linking Wikidata to OSM elements
        • 2.2.2 Validation tools
        • 2.2.3 Quality assurance
      • 2.3 Wikidata user scripts
    • 3 Linking from Wikidata to OSM
      • 3.1 OpenStreetMap-related properties in Wikidata
      • 3.2 OSM tags and keys

p19

For licensing reasons, copying OSM data into Wikidata is prohibited.

  • Then again, OSM is supposed to contain only facts that exist on the ground, so there should be nothing there to copy into Wikidata, which handles documented information…

p20

For licensing reasons, copying Wikidata data into OSM is prohibited.

  • Then again, Wikidata is supposed to contain only documented information, so there should be nothing there to copy into OSM, which handles facts on the ground…

p32


One continuous year of mapping

Today marks one full year since I started mapping on OSM.

I actually created my account five years ago. Back then, the only buildings mapped were those lining the prefectural road (if I remember rightly), so I drew the buildings for the three towns where I used to live, then got bored and quit.

Then, a year ago today, I happened to look at the map again, saw how much the buildings and POIs had filled out, got hooked again, and here I am.

So far the enthusiasm hasn’t cooled.
I have an ongoing personal survey theme, and plenty of things I still want to draw.
If I’m going to keep at it, I’d eventually like to move over to the data-using side as well.
All the more reason to keep on mapping.

After all, I’ve only just begun to climb
this endlessly long slope called mapping…

Saturday, 14. February 2026

OpenStreetMap User's Diaries

Improving intersections in central Osaka where stop lines at traffic signals were tagged with “stop”


Near the centre of Osaka there are many places where the stop lines at traffic signals have been tagged with “stop” (highway=stop).

No stop sign actually exists at these locations, so the tagging is wrong - but it is still better mapping than having nothing but a traffic-signal node at the intersection.

Left as it is, however, there is a risk that novice mappers will simply delete it “because it is wrong”.

Before improvement

Improvement edits

Let’s rewrite it to “highway=traffic_signals”, as recommended by “osmwiki JA:Tag:highway=traffic_signals”!

  • Rewrite the “highway=stop” tag on each signal stop line to “highway=traffic_signals”.
  • Rewrite the “highway=traffic_signals” that was on the intersection node to “junction=yes”.
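
In tag form, the edit is simply (a restatement of the two bullets above):

stop-line node:    highway=stop             →  highway=traffic_signals
intersection node: highway=traffic_signals  →  junction=yes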

Improvement edits