Elijah Robison | GIS Blog

A scrapbook of GIS tricks, with emphasis on FOSS4G.

Archive for the ‘Python’ tag

Create UTFGrid Tiles from PostGIS Tables


I assume you’re where I was about a week ago. That is, you’ve heard of UTFGrid, and now you want to render your own UTFGrid tiles. Perhaps you found yourself looking at this thread over at GIS.SE, but you don’t want to jump into TileStache just now. Anyway, if you’ve got a working installation of GDAL/OGR and Mapnik 2+, complete with Python bindings, I’ll show you what worked for me..

Because this is merely an adaptation of Matthew Perry’s original solution, I highly recommend considering his original blog entry on the topic, complete with discussion points and caveats, before proceeding further!

Once you’re ready, head over to GitHub and download some code, specifically globalmaptiles.py, create_tile_shp.py, and create_utfgrids.py. [1] You don’t actually need create_tile_shp.py, but I snagged all three. To keep things simple put these files in the same folder.

Next, in that same directory, create a new Python file, I called mine createUtfgridsFromPG.py.

Where create_utfgrids.py works entirely on shapefiles, createUtfgridsFromPG.py accepts an OGR PostgreSQL connection string in place of a shapefile path. Fortunately the original OGR code didn’t need to change, but to use the Mapnik PostGIS driver, I had to iterate over the PostgreSQL connection string and store the connection parameters so I could supply them to Mapnik separately.
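
To make that concrete, here’s a minimal sketch of the translation the script performs (the connection string is hypothetical, borrowed from the usage example further down, and the script itself does the same thing with a handful of if-statements rather than a dict):

pgconn = "PG:host=127.0.0.1 port=5432 user=postgres dbname=gis password=passw0rd tables=parcels_pmerc"

# OGR happily accepts the whole string as-is..
#ds = ogr.Open(pgconn)

# ..but Mapnik's PostGIS datasource wants the parameters individually,
# so strip the "PG:" prefix and split the remainder into key/value pairs:
params = dict(kvPair.split('=', 1) for kvPair in pgconn[3:].split(' '))
#ds = mapnik.PostGIS(host=params['host'], port=params['port'], user=params['user'],
#                    password=params['password'], dbname=params['dbname'], table=params['tables'])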

Finally, copy the following code and paste it into your new file. It’s basically the same as create_utfgrids.py, but I changed shppath to pgconn and added some code to use a mapnik.PostGIS() datasource in place of a mapnik.Shapefile() datasource.

If you’re curious, the comment # ELR 2014.9.26: flags the few places I changed the original code.

createUtfgridsFromPG.py

#!/usr/bin/env python
# -*- coding: utf-8  -*-
"""
create_utfgrids.py
Author: Matthew Perry
License: BSD

Creates utfgrid .json tiles for the given polygon shapefile

Thx to Dane Springmeyer for the utfgrid spec and mapnik rendering code
and to  Klokan Petr Přidal for his MapTiler code
(http://www.maptiler.org/google-maps-coordinates-tile-bounds-projection/)

"""
import globalmaptiles
import mapnik
import ogr
import os
from optparse import OptionParser, OptionError
try:
    import simplejson as json
except ImportError:
    import json

def create_utfgrids(pgconn, minzoom, maxzoom, outdir, fields=None, layernum=0):

    # ELR 2014.9.26:
    # Original implementation pushed in a shapefile path.
    #ds = ogr.Open(shppath)
    ds = ogr.Open(pgconn)

    # ELR 2014.9.26:
    # Iterate over the PostgreSQL connection string and pull out values we need
    # to use Mapnik's PostGIS datasource constructor.
    pgConnARR = pgconn[3:].split(' ')
    for kvPair in pgConnARR:
        if kvPair.split('=')[0] == "host":
            nikHost = kvPair.split('=')[1]
        if kvPair.split('=')[0] == "port":
            nikPort = kvPair.split('=')[1]
        if kvPair.split('=')[0] == "user":
            nikUser = kvPair.split('=')[1]
        if kvPair.split('=')[0] == "password":
            nikPass = kvPair.split('=')[1]
        if kvPair.split('=')[0] == "dbname":
            nikDB = kvPair.split('=')[1]
        if kvPair.split('=')[0] == "tables":
            nikTable = kvPair.split('=')[1]

    print
    print "WARNING:"
    print " This script assumes a polygon shapefile in spherical mercator projection."
    print " If any of these assumptions are not true, don't count on the results!"
    # TODO confirm polygons
    # TODO confirm mercator
    # TODO get layernum from command line
    layer = ds.GetLayer(layernum)
    bbox = layer.GetExtent()
    print ""
    print str(bbox)

    mercator = globalmaptiles.GlobalMercator()

    m = mapnik.Map(256,256)

    # Since grids are `rendered` they need a style
    s = mapnik.Style()
    r = mapnik.Rule()
    polygon_symbolizer = mapnik.PolygonSymbolizer(mapnik.Color('#f2eff9'))
    r.symbols.append(polygon_symbolizer)
    line_symbolizer = mapnik.LineSymbolizer(mapnik.Color('rgb(50%,50%,50%)'),0.1)
    r.symbols.append(line_symbolizer)
    s.rules.append(r)
    m.append_style('My Style',s)

    print ""
    # ELR 2014.9.26:
    # Original implementation using shapefile..
    #ds = mapnik.Shapefile(file=shppath)

    # ELR 2014.9.26:
    # Parameterized PostGIS implementation..
    ds = mapnik.PostGIS(host=nikHost,port=nikPort,user=nikUser,password=nikPass,dbname=nikDB,table=nikTable)

    mlayer = mapnik.Layer('poly')
    mlayer.datasource = ds
    mlayer.styles.append('My Style')
    m.layers.append(mlayer)

    print ""
    if fields is None:
        fields = mlayer.datasource.fields()
        print "Fields were NONE. Using.."
        print fields
    else:
        print "Fields are USER PROVIDED. Using.."
        print fields
    print ""

    for tz in range(minzoom, maxzoom+1):
        print " * Processing Zoom Level %s" % tz
        tminx, tminy = mercator.MetersToTile( bbox[0], bbox[2], tz)
        tmaxx, tmaxy = mercator.MetersToTile( bbox[1], bbox[3], tz)
        for ty in range(tminy, tmaxy+1):
            for tx in range(tminx, tmaxx+1):
                output = os.path.join(outdir, str(tz), str(tx))
                if not os.path.exists(output):
                    os.makedirs(output)

                # Use top origin tile scheme (like OSM or GMaps)
                # TODO support option for TMS bottom origin scheme (ie opt to not invert)
                ymax = 1 << tz;
                invert_ty = ymax - ty - 1;

                tilefilename = os.path.join(output, "%s.json" % invert_ty) # ty for TMS bottom origin
                tilebounds = mercator.TileBounds( tx, ty, tz)
                #print tilefilename, tilebounds

                box = mapnik.Box2d(*tilebounds)
                m.zoom_to_box(box)
                grid = mapnik.Grid(m.width,m.height)
                mapnik.render_layer(m,grid,layer=0,fields=fields)
                utfgrid = grid.encode('utf',resolution=4)
                with open(tilefilename, 'w') as file:
                    file.write(json.dumps(utfgrid))

if __name__ == "__main__":
    usage = "usage: %prog [options] pg_connection_string minzoom maxzoom output_directory"
    parser = OptionParser(usage)
    parser.add_option("-f", '--fields', dest="fields", help="Comma-separated list of fields; default is all")
    (options, args) = parser.parse_args()

    if len(args) != 4:
        parser.error("Incorrect number of arguments")

    pgconn = args[0]
    minzoom, maxzoom = int(args[1]), int(args[2])
    outdir = args[3]

    if os.path.exists(outdir):
        parser.error("output directory exists already")

    if options.fields:
        fields = options.fields.split(",")
    else:
        fields = None

    create_utfgrids(pgconn, minzoom, maxzoom, outdir, fields)

Usage..

Once you’ve prepared createUtfgridsFromPG.py, you can call it from the command line like this..

C:\xDev\utfgrids\createUtfgridsFromPG.py "PG:host=127.0.0.1 port=5432 user=postgres dbname=gis password=passw0rd tables=parcels_pmerc" 12 16 "C:/xGIS/tiles/utf" -f tms,owner_name

  • Hopefully the PostgreSQL connection string ("PG:host=..") makes sense.
  • 12 and 16 represent the minimum and maximum zoom levels to be rendered, respectively.
  • The directory “C:/xGIS/tiles/utf” is where your UTFGrid tiles will be saved.
  • And -f tms,owner_name represents a comma-separated list of data fields you want in your UTFGrid.

Caveats..

  • Both create_utfgrids.py and createUtfgridsFromPG.py require your geodata table to be in a Web Mercator projection (EPSG:3857)!
  • The script assumes a top-origin tile scheme, like OSM and others.
  • The script will only work with polygons.
  • While the OGR PostgreSQL connection string has a tables parameter, this implementation will only accept one table.
  • The script will create your target directory (in the example case, utf), and it will throw an error if that directory already exists.

[1] Many thanks to Matthew Perry, Klokan Petr Přidal, and Dane Springmeyer for their collective efforts and for sharing their work.

Written by elrobis

September 26th, 2014 at 1:26 pm

Install httplib2 to your Preferred Python Runtime after ArcGIS Unceremoniously Hijacks the First Attempt


We’re going to use Google Maps Engine for some stuff, and so I thought I’d throw my first code at it using Python. Well.. Google’s Python example requires the httplib2 library, so I needed to install that. When I did (following the library’s download/install instructions), for some reason the library went into service against ArcGIS’s embedded Python runtime, rather than my favored 2.7 instance, which I use for every single thing.

::grimaces::

To fix this, I just ensured the new library’s setup.py script was called from my preferred Python runtime, rather than letting Windows, or some black magic, decide which Python instance httplib2 should integrate with.

Breaking it down:

1) cd into the unpacked httplib2 directory:

C:\>cd C:\Users\elijah\Downloads\httplib2-0.8

2) Next, qualify which Python runtime should execute the supplied setup.py install script; in my case, this installed httplib2 against that specific runtime. This was the exact command I used:

C:\Users\elijah\Downloads\httplib2-0.8>C:\Python27\python.exe setup.py install

Voila! Now when I pull up IDLE, relative to my preferred instance of Python 2.7, I can import httplib2 without issues.
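
For what it’s worth, here’s that check spelled out (a minimal sketch; run it from the IDLE that belongs to C:\Python27):

import httplib2
print httplib2.__file__   # should point into C:\Python27\Lib\site-packages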

It didn’t take too long to figure this out, but it seemed like an easy post in case someone else is stumped and starting off on the wrong foot.

Written by elrobis

August 5th, 2013 at 9:42 am

Posted in Python


Decode Google Map encoded points as Well Known Text (WKT) with Python


I had a close encounter of the 5th kind yesterday.. here’s the gist..

It started when someone “gave” me a GIS dataset (..of polygons, kind of..) that a colleague of theirs, way back in ancient history, chose to pre-cook as ASCII-encoded point pairs. Their intention was almost certainly to use the pre-cooked data in Google Maps. Anyway, being arguably sane, I wanted to return this data to a more GIS-normal format so I could put it in a database like MySQL or Postgres and use it for other stuff.

I considered a few different approaches to this problem, including creating a Google Map that could load in all of the polygons from their encodings, then iterating over the polygons, interrogating the polygon point pairs, and finally concatenating WKT features from the points and saving the geofeatures into a MySQL table. This approach offered the advantage of using Google’s existing Maps API to do the decoding for me. But let’s face it, that’s lame, uninspired, and not inventive.. it wasn’t even interesting.

Besides..  I wanted to use Python.

I expected to find a Python recipe for this looming in the misty www, but I didn’t. However, I did find a JavaScript recipe by trolling around Mark McClure’s website. Specifically, he provides a Polyline Decoder utility, and when I viewed the page source, I found the JavaScript code that actually does the decoding (opening it in Firefox will show you the code, or IE should prompt you to download the file).

Long story short, the following Python method is an adaptation of Mark McClure’s JavaScript method (twisted a little to return WKT features rather than the pure point array). If you’re somewhat comfortable with Python, you should be able to copy/paste the method right into your Python file and start calling it; just pass in the encoded point string and let the method do the rest.

Best / Elijah

———

def decodeGMapPolylineEncoding(asciiEncodedString):
    print "\nExtrapolating WKT For:"
    print asciiEncodedString

    strLen = len(asciiEncodedString)

    index = 0
    lat = 0
    lng = 0
    coordPairString = ""

    # Make it easy to close PolyWKT with the first pair.
    countOfLatLonPairs = 0
    firstLatLonPair = ""
    gotFirstPair = False

    while index < strLen:
        shift = 0
        result = 0

        stayInLoop = True
        while stayInLoop:                                                # GET THE LATITUDE
            b = ord(asciiEncodedString[index]) - 63
            result |= (b & 0x1f) << shift
            shift += 5
            index += 1

            if not b >= 0x20:
                stayInLoop = False

        # Python ternary instruction..
        dlat = ~(result >> 1) if (result & 1) else (result >> 1)
        lat += dlat

        shift = 0
        result = 0

        stayInLoop = True
        while stayInLoop:                                                # GET THE LONGITUDE
            b = ord(asciiEncodedString[index]) - 63
            result |= (b & 0x1f) << shift
            shift += 5
            index += 1

            if not b >= 0x20:
                stayInLoop = False

        # Python ternary instruction..
        dlng = ~(result >> 1) if (result & 1) else (result >> 1)
        lng += dlng

        lonNum = lng * 1e-5
        latNum = lat * 1e-5

        # Separate coordinate pairs with a comma; prepend it before every pair
        # after the first so the string never picks up a trailing comma.
        if countOfLatLonPairs > 0:
            coordPairString += ","
        coordPairString += str(lonNum) + " " + str(latNum)

        if gotFirstPair == False:
            gotFirstPair = True
            firstLatLonPair = str(lonNum) + " " + str(latNum)

        countOfLatLonPairs += 1

    # The data I was converting was rather dirty..
    # At first I expected 100% polygons, but sometimes the encodings returned only one point.
    # Clearly one point cannot represent a polygon. Nor can two points represent a polygon.
    # This was an issue because I wanted to return proper WKT for every encoding, so I chose
    # to handle the matter by screening for 1, 2, and >=3 points, and returning WKT for
    # Points, LineStrings, and Polygons, respectively.
    #
    # It's arguable that any encodings resulting in only one or two points should be rejected.
    wkt = ""
    if countOfLatLonPairs == 1:
        wkt = "POINT(" + coordPairString + ")"
    elif countOfLatLonPairs == 2:
        wkt = "LINESTRING(" + coordPairString + ")"
    elif countOfLatLonPairs >= 3:
        wkt = "POLYGON((" + coordPairString + "," + firstLatLonPair + "))"

    return wkt
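
Once the method is in your file, a quick way to exercise it is the sample encoding from Google’s polyline algorithm documentation, which decodes to three points, so the method returns POLYGON WKT:

# Sample encoding from Google's polyline algorithm documentation;
# it decodes to (38.5, -120.2), (40.7, -120.95), (43.252, -126.453).
encodedString = "_p~iF~ps|U_ulLnnqC_mqNvxq`@"
print decodeGMapPolylineEncoding(encodedString)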

Written by elrobis

October 20th, 2012 at 1:00 pm

Install GDAL on Windows


Later posts on this blog will assume a working install of GDAL/OGR, so it’s prudent that I first demonstrate how to get a fresh cut of GDAL/OGR up-and-running on a Windows system.

Windows users have a few options for installing GDAL (see this question at gis.stackexchange, which will point you to Christoph Gohlke’s binaries, the OSGeo4W Installer, and other approaches; on Gohlke’s page, scroll down to GDAL). However, I’ll guide you through what works for me, which can be summarized as follows:

1) Install Python
2) Install the GDAL binaries published by Tamas Szekeres
3) Append your environment Path variable
4) Add the GDAL_DATA environment variable
5) Finally, perform a quick test to make sure everything worked.

First, Install Python

(For reference purposes, I used version 2.7 for Win32):

· Start at the Python homepage
· At left, find “Quick Links (2.7.2)” and select “Windows Installer”
· Complete the download and start the installation
· Accept the default options suggested by the installer

Once Python is installed, launch IDLE (the built-in Python IDE) to get specific details about your Python installation:

Follow Windows (Start)→Programs→Python 2.7→IDLE (Python GUI)

This figure emphasizes where to look:

Python Version Details Output in IDLE

In this case, I’m noting the following values:
· “MSC v.1500”
· “on win32”

Your values may be different than mine. For instance, you may have v.1400? Or maybe you installed 64-bit Python on a 64-bit system? Whatever you did, you’ll need to be consistent from here on out. Since these values will be important in the next step, you may want to keep the IDLE window open for a few minutes.
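
Incidentally, if the IDLE banner has already scrolled out of view, a couple of one-liners will report the same details (just a quick sketch):

import sys, platform
print sys.version                   # the banner string, including the "MSC v.xxxx" compiler tag
print platform.architecture()[0]    # e.g. '32bit' or '64bit'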

Next Install the GDAL Binaries

From the GDAL homepage, you’d navigate to Downloads→Windows, and open Tamas Szekeres’ Windows binaries. When Tamas’ page loads, you’ll need to identify the proper download package for your version of Python, system bit-depth, and VC dependencies.

Note: VC2005, VC2008, VC2010 refer to various dependencies upon C++ redistributable libraries. To get your very own, just Google “c++ redistributable” and pick up the most current release. Alternatively, if you want to play it uber-safe, download and install Visual C++ Express 2010, and it will automagically qualify you for MSVC2010 builds (which equates to the compiler used to build GDAL).

At Tamas’ site, do the following:
· Scroll down to “GDAL and MapServer latest release versions”

· Find a release corresponding to both your installed C++ redistributable library AND your Python install. For instance, I used “release-1500-gdal-1-8-0-mapserver-6-0-0”, which corresponds to MSVC2008 (Win32) in the left column (note the figure).

Finding the right GDAL binaries at gisinternals.com

Selecting a release will take you to a new screen promoting a handful of EXE and MSI files; to use ogr2ogr, you’ll need two of these.

· Download and install gdal-18-xxxx-core.msi (where xxxx is 1310, 1400, 1500, or 1600, matching the MSC value you noted in IDLE). I recommend keeping the defaults while installing.

· Next download and install the GDAL binaries for your version of Python (i.e. 2.6, 2.7, 3.1). Your choice should match the bit-depth and version of Python you installed in the first place. In keeping with my running example, I used GDAL-1.8.0.win32-py2.7.msi. Once again, run the installer and accept the defaults.

Note: The GDAL installer should detect the Python instance installed in the previous section (see the figure below). If not, first review the steps in the previous section to see if you omitted something (this may indicate the GDAL binaries do not match your installation of Python); otherwise select “Python from another location” and proceed cautiously.

The GDAL Python bindings installer should detect your Python installation.

Set the Environment Variables

We’re nearly finished installing GDAL. In this section, we’ll configure the Windows environment variables (Path and GDAL_DATA) so the GDAL/OGR utilities will be easily accessible from the command line.

Append the Path Variable

First, append the GDAL installation directory onto the existing Path variable. Do the following to edit the Path variable:

· Open the Control Panel (Start→Control Panel), open System and select “Advanced System Settings”.

Note: Alternatively, you can open the Windows (Start) menu, right-click “Computer”, select Properties, then select “Advanced System Settings”, which I find easier.

· Select “Environment Variables” (near the bottom).

· Find “Path” among the list options and select it.

· With “Path” selected, click the “Edit” button.

· When the Edit System Variable dialog appears, scroll to the end of the series, add a semi-colon (;) to separate the prior path from the new one you’re about to enter, then add your GDAL installation path. I added exactly this:  ;C:\Program Files (x86)\GDAL

· Keep the Environment Variables dialog open for the next task.

Warning: Be sure to confirm your GDAL installation directory. I installed GDAL on Windows XP, but if you run a newer version of Windows, your installation path may be different from mine.

Appending the GDAL installation directory to the Path variable. (THIS IMAGE IS FROM A WIN XP INSTALL! A WIN 7 install will likely require "Program Files (x86)".)

Add the GDAL_DATA Variable

Next we’ll add a new System Variable, GDAL_DATA, which helps the various GDAL utilities find the gdal-data folder. Without access to the files in the gdal-data folder, GDAL may have trouble recognizing some projections and coordinate systems, among other things.

In Windows, create the GDAL_DATA variable as follows:

· In the Environment Variables dialog, click “New” (near the bottom).

· In the New Variable dialog, enter GDAL_DATA for the name value.

· For value, enter the path to your gdal-data folder; it should be one level below your GDAL directory, which you can confirm by scouting the GDAL folder. I used:  C:\Program Files (x86)\GDAL\gdal-data

Once again, confirm the gdal-data directory makes sense relative to your GDAL installation folder.

· At this point, commit your changes by selecting “OK” on each dialog.

Adding the GDAL_DATA variable. (Note! On Win 7 you'll probably need to use "Program Files (x86)". I stole this image from my older WinXP tutorial.)
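
Once you’ve committed the changes, you can confirm that both variables are visible to a freshly launched Python (a quick sketch; be sure to open a new IDLE or Command Prompt so it inherits the updated environment):

import os
print os.environ.get('GDAL_DATA')    # should echo the gdal-data path you entered
print 'GDAL' in os.environ['PATH']   # crude check that the Path edit took effect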

[Update 1 Oct, 2013]

Add the GDAL_DRIVER_PATH Variable

In the same way you added the GDAL_DATA variable, add a GDAL_DRIVER_PATH variable. This variable is necessary for the Python environment to locate various GDAL driver plugins. Like the gdal-data directory, the gdalplugins directory should be one level below your GDAL installation. Of course, confirm the path on your system. (I’m on Win 7 at this point; note this is inconsistent with the screenshot above.)

On my system, the gdalplugins directory is: C:\Program Files (x86)\GDAL\gdalplugins

For those who already have GDAL installed along with the Python bindings: if you’re trying to write custom scripts and you’re getting errors like ERROR 4: ..does not exist ..is not recognized as a supported dataset name, you may simply need to add this additional environment variable. This is especially true if you followed my tutorial for setting up GDAL on your system in the first place! In case of the latter, my apologies.
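
As a quick check that the plugins are visible to the bindings, you can ask OGR for one of the plugin drivers by name (a minimal sketch; I’m using the PostgreSQL driver as the example since it typically ships as a plugin in these builds):

from osgeo import ogr
print ogr.GetDriverByName('PostgreSQL')   # None here means the plugin wasn't found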

 

Confirming a Successful GDAL Installation

To confirm the availability of ogr2ogr, just open a new command terminal and attempt to call ogr2ogr. If your installation is successful, ogr2ogr will start running, fail to find any valid runtime arguments, and close, leaving some helpful usage hints in its wake.

To test ogr2ogr in a Windows Command Prompt, do the following:

· Open the Windows (Start) menu, and at the bottom (“Search programs and files”), type cmd and hit Enter; this will open the Command Prompt.

· With the terminal open, type ogr2ogr and hit Enter (see the image).

Because we added the GDAL installation directory to the Path variable, Windows will attempt to find a valid ogr2ogr executable in the GDAL folder. If all goes well your command prompt should resemble the following figure (the ogr2ogr command is highlighted).

Confirming a successful install by calling ogr2ogr.

————————

[Update 27 Jan, 2014]

Change in testing/confirming a successful install

My initial recommendation to just type “ogr2ogr”, which resulted in an error (which I warned you’d get), was confusing for several people. In hindsight, I agree it was a poor choice on my part. So I like this better:

At the command prompt, type this:

gdalinfo --version

And GDAL should respond with something like this:

C:\Users\elrobis>gdalinfo --version
GDAL 1.9.2, released 2012/10/08

Assuming you get a version number and a release date, and not an error, you should have a clean install of GDAL/OGR to work with. Be warned, though: getting the version output doesn’t confirm that your environment variables are set up properly; if that stuff isn’t right, you’ll likely get errors when you actually apply GDAL or OGR to some task.
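
Likewise, if you installed the Python bindings, a similar sanity check from IDLE looks something like this (a minimal sketch):

from osgeo import gdal, ogr
print gdal.VersionInfo('RELEASE_NAME')     # e.g. '1.9.2'
print ogr.GetDriverCount(), 'OGR drivers registered'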

————————

And voila! Assuming you see the Usage/Help output, you should be good to go with a fresh GDAL/OGR installation.

Cheers, Elijah

[Edit 06.22.2012]

If you plan to create your own Python utility scripts with the OGR Python bindings, I further recommend the following Python libraries:

shapely, numpy, Python Imaging Library (PIL), PostgreSQL Adapter (psycopg), MySQL Adapter (mysql-python), Python Shapefile Library (pyshp), Dbase/Xbase Adapter Library (pydbf), matplotlib

Written by elrobis

October 17th, 2011 at 9:33 pm

Posted in GDAL/OGR, Python
