Title: Crossing pollution data with school zones: QGIS & OpenLayers
Date: 2018/03/04
Tags: cartography, webmaps, openlayers, pollution, no2, geotagging, geocoding, visualisation
Author: Colm
Status: draft
Journalist Céline Gautier from Médor is continuing to report on the problematic air quality situation in Belgium and Brussels. A recent development prompted the need for mapping and visualising publicly available pollution data (NO₂ gas levels) in school zones in Wallonia and Brussels. This article details how the data was processed, how the queries were made, and how the map was assembled.
### Outline:
1. Geocoding school locations
2. Handling the datasource
3. Bringing them together
## 1. Geocoding school locations
A request was made to []() to obtain a list of addresses of all the schools in the Communauté Française. The organisation provided us with 7 CSV files grouping schools by level and by type. The CSV format includes the name of the school, an address and contact information for each institution. In order to plot each school on a map, we needed latitude and longitude coordinates for each address. This process is called geocoding, and can be done in multiple ways. Quite a lot of websites offer this as a (paying) service, but in our case, preliminary tests on [Nominatim]() (the OSM data search engine) proved to be very accurate. I made a very simple Python script that queries Nominatim with the addresses extracted from the CSV files and writes latitude and longitude to two new columns in the output document.
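To illustrate the per-row work, here is a minimal sketch of extracting a geocoding query from one of the CSV files. The column layout below (name, street, number, postcode, municipality) is hypothetical; the real files have their own headers, but the principle of concatenating the address columns is the same.

```python
import csv
import io

# Hypothetical excerpt of one of the CSV files; the real column names
# and order are assumptions for this sketch.
sample = io.StringIO(
    'etablissement,rue,numero,cp,commune\n'
    'Ecole X,Rue de la Loi,16,1000,Bruxelles\n'
)

reader = csv.reader(sample)
next(reader)  # skip the header row
for row in reader:
    # Concatenate the address columns (columns 1-4) into one query string.
    query = ', '.join(row[1:5])
    print(query)  # Rue de la Loi, 16, 1000, Bruxelles
```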
Nominatim did not return results for all addresses. If Nominatim could not find a result, I made an attempt with Google's geocoder, and if that still failed, ArcGIS seemed to always do the trick. This is all made extra easy thanks to the [geocoder]() library, which is under the MIT licence. Thanks for this great tool!
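That fallback chain can be factored into a small helper. This is a sketch of my own, not part of the original script: the `geocode_with_fallback` name and the injectable backend list are assumptions made so the logic can be tested without network access.

```python
def geocode_with_fallback(query, backends=None):
    """Return (lat, lng) from the first backend that answers, else (None, None).

    Backend order mirrors the article: Nominatim (OSM) first, then Google,
    then ArcGIS. The list is injectable so it can be swapped out in tests.
    """
    if backends is None:
        import geocoder  # third-party: pip install geocoder
        backends = [geocoder.osm, geocoder.google, geocoder.arcgis]
    for backend in backends:
        g = backend(query)
        if g.ok:
            return g.lat, g.lng
    return None, None
```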
I prioritised Nominatim because I have a soft spot for anything OSM. It's clearly not the best for geocoding, but I knew we would be using OSM tiles as a basemap in the final project, so I thought there was some sense there. One of the downsides of using Nominatim's main server for queries is that you are limited to one request per second, which is totally normal and legitimate, but I still found out about it the hard way:

```python
import csv
import sys
import time

import geocoder

# Usage: python geocode.py input.csv output.csv
ifile = open(sys.argv[1])
reader = csv.reader(ifile)
ofile = open(sys.argv[2], "w")
writer = csv.writer(ofile, delimiter=',', quotechar='"', quoting=csv.QUOTE_ALL)

rownum = 0
for row in reader:
    # Save the header row, with two new columns for the coordinates.
    if rownum == 0:
        header = row
        header.extend(['lat', 'lng'])
        writer.writerow(header)
        rownum += 1
        continue

    # Build the geocoding query from the address columns.
    query = row[1] + ', ' + row[2] + ', ' + row[3] + ', ' + row[4]
    print('query= ' + query)

    # Try Nominatim first, then Google, then ArcGIS as a last resort.
    g = geocoder.osm(query)
    time.sleep(1)  # Nominatim's main server allows one request per second
    if g.ok is False:
        print(query + ': no result with OSM')
        g = geocoder.google(query)
        if g.ok is False:
            print(query + ': no result with Google')
            g = geocoder.arcgis(query)
            if g.ok:
                print('result found with ArcGIS')

    row.extend([g.lat, g.lng])
    writer.writerow(row)
    rownum += 1
    print('line: ' + str(rownum))

ifile.close()
ofile.close()
```
## 2. Handling the datasource
Once all the addresses had latlng data available, all of the heavy lifting was done in [QGIS](). The pollution data itself came from [IRCELINE](), the Belgian authority that monitors, gathers and publishes atmospheric emissions data. We worked from the annual nationwide NO₂ reports for the year 2016, while we wait for the 2017 reports to be issued.
It is very important to say that these maps are representations: they are models built from measurements, averaged over the year.
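In QGIS, crossing the two layers is a spatial join. Whether the NO₂ layer arrives as vector zones or as a raster, assigning each geocoded school a pollution value conceptually boils down to a point-in-polygon test. Here is a pure-Python sketch of that idea, with a hypothetical square zone around central Brussels; it is not the actual QGIS workflow, just the underlying geometry.

```python
# Ray-casting point-in-polygon test: count how many polygon edges a
# horizontal ray from the point crosses; an odd count means "inside".
def point_in_polygon(lng, lat, polygon):
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the point's latitude?
        if (y1 > lat) != (y2 > lat):
            # Longitude where the edge crosses that latitude.
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lng < x_cross:
                inside = not inside
    return inside

# Hypothetical NO2 zone (lng/lat square) and two school coordinates.
zone = [(4.30, 50.80), (4.40, 50.80), (4.40, 50.90), (4.30, 50.90)]
print(point_in_polygon(4.35, 50.85, zone))  # True: school inside the zone
print(point_in_polygon(4.50, 50.85, zone))  # False: school outside
```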
## 3. Bringing them together
Tiles provided by