I’ll be moving over to chrishavlin.github.io, so head over there to look for new stuff. My introductions to shapefiles still see a surprising amount of traffic, so I’ll keep this site up, but won’t be updating it anymore.
Ok, so after my post a little while ago about converting contours to shapefiles, I spent a day adapting the approach for use in a dash-plotly app… and in general it works, and took a ~800 MB file down to a ~14 MB shapefile. But the rendering of the shapefile was still very slooooooow, and due to limitations of how data callbacks update a figure’s data in Dash, I couldn’t just render the background map once. So, after some thinking, I realized that if I “simply” calculated my own map projections, I could use the standard x-y plotting routines (which include contours, as well as the faster scattergl method for plotting a large number of points).
But then I realized that I only vaguely know what map projections are: I knew different projections vary in how they portray the 3D earth on a 2D surface, but when you start using all the mapping libraries out there, you run into a lot of jargon really fast. So here’s a super basic primer on calculating map projections in Python that explains some of that jargon!
Also, this is the first post I decided to write as a Python notebook. The notebook is in my github learning_shapefiles repo, but github has been having some issues displaying notebooks lately, so you can view the full notebook here.
To start off, pyproj is a Python library built on the PROJ library (https://proj.org/) that provides a large number of algorithms related to map projections. So go install that (simple to do with pip or conda).
In order to use pyproj to project a lat/lon point onto a map projection, it’s helpful to know a few acronyms:
- CRS: coordinate reference system
- EPSG: a database of coordinate reference systems and transformations; individual entries, covering different regional, local, or global domains, are specified by the number following EPSG. For example, EPSG:4326 is the global reference system (built on the WGS84 reference ellipse) used by modern-day GPS.
- WGS84: World Geodetic System, latest revision (a.k.a. WGS1984), same as EPSG:4326
- reference ellipse: an ellipse describing the surface of the Earth on which positions (like latitude and longitude) are defined. GPS points use a reference ellipse that approximates the Earth’s geoid (i.e., the gravitational equipotential surface that sea level would follow under the influence of gravity alone, see wiki).
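To tie those terms to code: recent versions of pyproj (2 and up — an assumption about your install) expose the EPSG database directly through a CRS class, which is a quick way to check what a given code refers to:

```python
from pyproj import CRS

# look up the EPSG:4326 entry, i.e. the WGS84 geographic CRS used by GPS
wgs84 = CRS.from_epsg(4326)
print(wgs84.name)
```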
So now that’s out of the way…. to project a single lat/lon point with pyproj, first import pyproj and then initialize a projection (Proj) object:
```python
from pyproj import Proj

p = Proj(proj='hammer', ellps='WGS84')
```
When initializing the Proj (projection) object, you give it the reference ellipse that your lat/lon is defined on, ellps='WGS84', and the projection you want to transform to, in this case a hammer projection. Once you’ve initialized Proj, you can use it to move from lat/lon to cartesian x,y:
```python
lon = -120.5
lat = 42.4
x, y = p(lon, lat)
```
This lon/lat point becomes (x,y) = (-9894366.0792, 5203184.81636). These values are in meters by default and give the cartesian x-y distance from the map center for your point.
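As a sanity check, the same Proj object can also run in reverse: calling it with inverse=True takes your x-y point back to lon/lat. A quick sketch (note this assumes a reasonably recent PROJ install, since the inverse hammer projection wasn’t always available):

```python
from pyproj import Proj

p = Proj(proj='hammer', ellps='WGS84')
lon, lat = -120.5, 42.4
x, y = p(lon, lat)                  # forward: lon/lat -> x-y in meters
lon2, lat2 = p(x, y, inverse=True)  # inverse: x-y -> back to lon/lat
```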
The Proj object is flexible enough to take lists (and numpy arrays!), so you can project many points at once. It can also take any of the parameters defined for the PROJ projections as keyword arguments. For example, the hammer projection has an argument ‘+lon_0’ to set the 0-longitude for the view, and so in pyproj you can use ‘lon_0’ as a keyword argument when using the hammer projection:
```python
p = Proj(proj='hammer', ellps='WGS84', lon_0=-50)
```
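For example, projecting a whole numpy array of points in one call might look like this (a sketch with made-up coordinates, not data from the post):

```python
import numpy as np
from pyproj import Proj

p = Proj(proj='hammer', ellps='WGS84', lon_0=-50)

# a handful of made-up lon/lat points roughly along the US west coast
lons = np.array([-125.0, -122.5, -120.0, -117.5])
lats = np.array([48.0, 45.5, 43.0, 40.5])

# x and y come back as arrays with the same shape as the input
x, y = p(lons, lats)
print(x.shape, y.shape)
```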
The rest of the notebook goes through how you could construct a grid of lat/lon lines and plot them for different projections. And at the end, I take a digital elevation model of the western US (which gives an elevation for lat/lon points), project the lat/lon points onto a hammer projection to get points of (elevation,x,y) and contour it using the standard matplotlib.pyplot.contourf to get the following:
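That gridding-and-contouring step can be sketched like so — with a synthetic elevation field standing in for the real DEM, and matplotlib’s Agg backend so no display is needed:

```python
import matplotlib
matplotlib.use('Agg')  # render off-screen, no display required
import matplotlib.pyplot as plt
import numpy as np
from pyproj import Proj

# build a lat/lon grid over the western US with a fake, wavy "elevation"
# (a synthetic stand-in for the digital elevation model in the notebook)
lon, lat = np.meshgrid(np.linspace(-125, -100, 100), np.linspace(30, 50, 80))
elev = np.sin(4 * np.radians(lon)) * np.cos(4 * np.radians(lat))

# project every grid point onto the hammer projection...
p = Proj(proj='hammer', ellps='WGS84')
x, y = p(lon, lat)

# ...then contour with the standard cartesian routine
fig, ax = plt.subplots()
ax.contourf(x, y, elev)
fig.savefig('hammer_contours.png')
```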
In my particular application, I’ll now be able to project my data to x,y and then use the cartesian contour functions of dash-plotly! The main drawback of this approach is having to write all the routines for handling rotations and plotting lines on maps… but ultimately I think it will work well with Dash, and it will be nice to be able to use scattergl with my map point data.
I recently found myself needing to get the latitude/longitude of a list of cities (for this map here) and it turns out, it’s pretty easy now that I know how to do it. Here’s a quick tutorial!
Ok, so the process of taking a city and assigning a latitude/longitude point is called geocoding. There are many services that offer this (e.g., Google or Bing Maps APIs) but most I looked at seemed overkill for a one-time task of assigning lat/lon to about 500 cities. But then I discovered OpenStreetMap’s Nominatim! You can modify the http address to return results in an xml file. For example, the following searches for Providence, RI:

https://nominatim.openstreetmap.org/search.php?q=Providence+RI+USA&format=xml

which returns:
```xml
<searchresults timestamp="Thu, 02 Feb 17 16:17:00 +0000"
    attribution="Data © OpenStreetMap contributors, ODbL 1.0. http://www.openstreetmap.org/copyright"
    querystring="Providence RI USA" polygon="false"
    exclude_place_ids="158799064,159481664"
    more_url="https://nominatim.openstreetmap.org/search.php?format=xml&exclude_place_ids=158799064,159481664&accept-language=en-US,en;q=0.8&q=Providence+RI+USA">
  <place place_id="158799064" osm_type="relation" osm_id="191210" place_rank="16"
      boundingbox="41.772414,41.861571,-71.4726669,-71.3736134"
      lat="41.8239891" lon="-71.4128342"
      display_name="Providence, Providence County, Rhode Island, United States of America"
      class="place" type="city" importance="0.80724054252736"
      icon="https://nominatim.openstreetmap.org/images/mapicons/poi_place_city.p.20.png"/>
  <place place_id="159481664" osm_type="relation" osm_id="1840541" place_rank="12"
      boundingbox="41.7232498,42.0188529,-71.7992521,-71.3177699"
      lat="41.8677428" lon="-71.5814833"
      display_name="Providence County, Rhode Island, United States of America"
      class="boundary" type="administrative" importance="0.58173948152676"
      icon="https://nominatim.openstreetmap.org/images/mapicons/poi_boundary_administrative.p.20.png"/>
</searchresults>
```
Looking at the first place element, you’ll see the attributes we’re after: lat="41.8239891" and lon="-71.4128342".
It’s pretty easy to write a python script to request and then parse the xml result for lat and lon. Here’s what that might look like (BUT DON’T DO THIS):
```python
from urllib.request import urlopen

city = 'Providence, RI'
city_parts = city.replace(' ', '').split(',')  # removes whitespace, splits city/state

# build the http address:
# (results in a string: 'https://nominatim.openstreetmap.org/search.php?q=Providence+RI+USA&polygon=1&format=xml')
osm = 'https://nominatim.openstreetmap.org/search.php?q='
fmt = '+USA&polygon=1&format=xml'
srch = osm + city_parts[0] + '+' + city_parts[1] + fmt

# now open the url and store the result as whitespace-separated strings:
response = urlopen(srch)
the_page = response.read().decode('utf-8').split()

# and now we can parse the resulting string array where the xml info is stored.
# it only stores the first lon/lat that it encounters:
Lon = 0.0
Lat = 0.0
for token in the_page:  # loop over the strings in the_page, look for lat/lon
    if 'lon=' in token and Lon == 0.0:
        Lon = float(token.split('"')[1])
    if 'lat=' in token and Lat == 0.0:
        Lat = float(token.split('"')[1])
```
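As an aside, splitting the raw response on whitespace is fragile; the standard library’s xml.etree.ElementTree parses the same result more robustly. Here’s a sketch run on a trimmed copy of the Providence response from above (no network call needed):

```python
import xml.etree.ElementTree as ET

# a trimmed copy of the Nominatim response shown earlier
xml_text = '''<searchresults querystring="Providence RI USA">
  <place place_id="158799064" lat="41.8239891" lon="-71.4128342"
      display_name="Providence, Providence County, Rhode Island, United States of America"
      class="place" type="city"/>
</searchresults>'''

root = ET.fromstring(xml_text)
first = root.find('place')  # take the first match, as in the script above
lat = float(first.get('lat'))
lon = float(first.get('lon'))
print(lat, lon)
```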
So. Why not just loop over your list of cities and repeat this exercise? Well if you check out Nominatim’s documentation page, and take a look at the usage policy, it requires: “(1) No heavy uses (an absolute maximum of 1 request per second). (2) Provide a valid HTTP Referer or User-Agent identifying the application (stock User-Agents as set by http libraries will not do). (3) Clearly display attribution as suitable for your medium. (4) Data is provided under the ODbL license which requires to share alike (although small extractions are likely to be covered by fair usage / fair dealing).” While I don’t think that my case of simply geocoding 500 or so cities falls under heavy usage and I could just delay my successive calls, I decided to look into their suggestions for other options.
In the end I settled on MapQuest’s implementation of Nominatim. It provides access to all the OpenStreetMaps data (still open source and subject to the OSM license agreements) and a MapQuest free developer account gets you 15,000 requests per month for free. Waaay more than I’d need for this project.
So to geocode a list of cities, first sign up for a MapQuest Developer Account. You’ll get an API key assigned to you. Unlike some other APIs, MapQuest doesn’t use any fancy authentication: you just include your API key directly in the http address of the request. Reaaaaally easy (but not exactly secure).
Then you can run a code very similar to that above. My implementation is here: look_up_latlons.py. But it’s kind of tied to the data that I was mapping.
Some notes on the code.
(1) The API key is passed in through a command-line argument, so when you run this code you have to type
$ python look_up_latlons.py AL1243KSFD242332552134KLJ
where that long string of letters/numbers is whatever your API key is.
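Inside the script, grabbing that key can be as simple as reading sys.argv; this is a hypothetical sketch (the helper name and usage message are mine, not necessarily what look_up_latlons.py actually does):

```python
import sys

def get_api_key(argv):
    """Return the API key passed as the first command-line argument (hypothetical helper)."""
    if len(argv) < 2:
        raise SystemExit('usage: python look_up_latlons.py <API_KEY>')
    return argv[1]

# when run as `python look_up_latlons.py AL1243KSFD242332552134KLJ`,
# argv looks like this:
API_KEY = get_api_key(['look_up_latlons.py', 'AL1243KSFD242332552134KLJ'])
```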
(2) The formatting of the http address is slightly different from the standard Nominatim API. The same search for Providence, RI looks like:

http://open.mapquestapi.com/nominatim/v1/search.php?key=API_KEY&format=xml&q=Providence+RI
where API_KEY is, again, your API key.
(3) In my implementation, I start by reading a CSV file into a pandas dataframe (called Counts), where each row contains a city name along with the number of people who marched in the Women’s Marches on Jan. 21. The meat of the code is copied below: I iterate over the rows in Counts, find the lat/lon for each row (i.e., each city), and then store that lat/lon in a new dataframe (NewCounts), since it’s bad practice to modify a dataframe while iterating over it. Here’s what that looks like:
```python
import time
import numpy as np
from urllib.request import urlopen

# API_KEY comes from the command-line argument (see note 1 above)
osm = 'http://open.mapquestapi.com/nominatim/v1/search.php?key=' + API_KEY + '&format=xml&q='
dt = 0.5  # delay between successive requests, in seconds

# loop over cities in crowd counts, find lat/lon
NewCounts = Counts.copy()
NewCounts['lon'] = np.zeros(len(Counts))  # add new column for lon
NewCounts['lat'] = np.zeros(len(Counts))  # add new column for lat

for index, row in Counts.iterrows():
    srch = osm + str(row['City']).replace(' ', '+')
    print('\n\nLooking up lat/lon for', row['City'], index)

    time.sleep(dt)
    response = urlopen(srch)
    the_page = response.read().decode('utf-8').split()

    # keep the first lon/lat encountered in the response
    for token in the_page:
        if 'lon=' in token and NewCounts.loc[index, 'lon'] == 0.0:
            NewCounts.loc[index, 'lon'] = float(token.split('"')[1])
        if 'lat=' in token and NewCounts.loc[index, 'lat'] == 0.0:
            NewCounts.loc[index, 'lat'] = float(token.split('"')[1])

    print(row['City'], NewCounts.loc[index, 'lon'], NewCounts.loc[index, 'lat'])
```
The MapQuest API didn’t have any specific usage constraints on how frequently you make a request, just an overall number per month, but I added a small delay between calls using the time.sleep() function anyway.
That’s all for now, hopefully some more posts with colorful plots coming soon!