{"id":136,"date":"2012-10-20T13:00:18","date_gmt":"2012-10-20T18:00:18","guid":{"rendered":"http:\/\/cartometric.com\/blog\/?p=136"},"modified":"2012-10-20T13:01:49","modified_gmt":"2012-10-20T18:01:49","slug":"decode-google-map-encoded-points-as-well-known-text-wkt-with-python","status":"publish","type":"post","link":"https:\/\/elrobis.com\/blog\/2012\/10\/20\/decode-google-map-encoded-points-as-well-known-text-wkt-with-python\/","title":{"rendered":"Decode Google Map encoded points as Well Known Text (WKT) with Python"},"content":{"rendered":"<p>I had a <a href=\"http:\/\/en.wikipedia.org\/wiki\/Close_encounter\" target=\"_blank\">close encounter of the 5th kind<\/a> yesterday.. here&#8217;s the gist..<\/p>\n<p>It started when someone &#8220;gave&#8221; me a GIS dataset (..of polygons, kind of..) that a colleague of theirs, way back in ancient history, chose to pre-cook as ASCII-encoded point pairs. Their intention was almost certainly to use the pre-cooked data in Google Maps. Anyway, being arguably sane, I wanted to return this data to a more GIS-normal format so I could put it in a database like MySQL or Post and use it for other stuff.<\/p>\n<p>I considered a few different approaches to this problem, including creating a Google Map that could load in all of the polygons from their encodings, then iterate over the polygons, interrogate the polygon point pairs, and finally concatenate WKT features from the points and save the geofeatures into a MySQL table. This approach offered the advantage of using Google&#8217;s existing Maps API to do the decoding for me. But let&#8217;s face it, that&#8217;s lame and uninspired.. it wasn&#8217;t even interesting.<\/p>\n<p>Besides.. I wanted to use Python.<\/p>\n<p>I expected to find a Python recipe for this looming in the misty www, but I didn&#8217;t. 
However, I did find a JavaScript recipe by trolling around <a href=\"http:\/\/facstaff.unca.edu\/mcmcclur\/GoogleMaps\/EncodePolyline\/\" target=\"_blank\">in Mark McClure&#8217;s website<\/a>. Specifically, he provides a <a href=\"http:\/\/facstaff.unca.edu\/mcmcclur\/GoogleMaps\/EncodePolyline\/decode.html\" target=\"_blank\">Polyline Decoder utility<\/a>, and when I viewed the page source, I found <a href=\"http:\/\/facstaff.unca.edu\/mcmcclur\/GoogleMaps\/EncodePolyline\/decode.js\" target=\"_blank\">the JavaScript code<\/a> that actually does the decoding (opening it in Firefox will show you the code; IE should prompt you to download the file).<\/p>\n<p>Long story short, the following Python method is an adaptation of Mark McClure&#8217;s JavaScript method (twisted a little to return WKT features rather than the pure point array). If you&#8217;re somewhat comfortable with Python, you should be able to copy\/paste the method right into your Python file and start calling it; just pass in the encoded point string and let the method do the rest.<\/p>\n<p>Best \/ Elijah<\/p>\n<p>&#8212;&#8212;&#8212;<\/p>\n<pre>def decodeGMapPolylineEncoding(asciiEncodedString):\r\n    print(\"\\nExtrapolating WKT for:\")\r\n    print(asciiEncodedString)\r\n\r\n    strLen = len(asciiEncodedString)\r\n\r\n    index = 0\r\n    lat = 0\r\n    lng = 0\r\n    coordPairString = \"\"\r\n\r\n    # Make it easy to close the polygon WKT with the first pair.\r\n    countOfLatLonPairs = 0\r\n    firstLatLonPair = \"\"\r\n    gotFirstPair = False\r\n\r\n    while index &lt; strLen:\r\n        shift = 0\r\n        result = 0\r\n\r\n        stayInLoop = True\r\n        while stayInLoop:    # GET THE LATITUDE\r\n            b = ord(asciiEncodedString[index]) - 63\r\n            result |= 
(b &amp; 0x1f) &lt;&lt; shift\r\n            shift += 5\r\n            index += 1\r\n\r\n            if b &lt; 0x20:\r\n                stayInLoop = False\r\n\r\n        # Python conditional expression: undo the sign (zig-zag) encoding..\r\n        dlat = ~(result &gt;&gt; 1) if (result &amp; 1) else (result &gt;&gt; 1)\r\n        lat += dlat\r\n\r\n        shift = 0\r\n        result = 0\r\n\r\n        stayInLoop = True\r\n        while stayInLoop:    # GET THE LONGITUDE\r\n            b = ord(asciiEncodedString[index]) - 63\r\n            result |= (b &amp; 0x1f) &lt;&lt; shift\r\n            shift += 5\r\n            index += 1\r\n\r\n            if b &lt; 0x20:\r\n                stayInLoop = False\r\n\r\n        # Python conditional expression: undo the sign (zig-zag) encoding..\r\n        dlng = ~(result &gt;&gt; 1) if (result &amp; 1) else (result &gt;&gt; 1)\r\n        lng += dlng\r\n\r\n        lonNum = lng * 1e-5\r\n        latNum = lat * 1e-5\r\n\r\n        # Separate pairs with a comma; adding it before each pair after\r\n        # the first avoids leaving a trailing comma at the end.\r\n        if countOfLatLonPairs &gt; 0:\r\n            coordPairString += \",\"\r\n        coordPairString += str(lonNum) + \" \" + str(latNum)\r\n\r\n        if not gotFirstPair:\r\n            gotFirstPair = True\r\n            firstLatLonPair = str(lonNum) + \" \" + str(latNum)\r\n\r\n        countOfLatLonPairs += 1\r\n\r\n    # The data I was converting was rather dirty..\r\n    # At first I expected 100% polygons, but sometimes the encodings returned only one point.\r\n    # Clearly one point cannot represent a polygon. 
Nor can two points represent a polygon.\r\n    # This was an issue because I wanted to return proper WKT for every encoding, so I chose\r\n    # to handle the matter by screening for 1, 2, and &gt;=3 points, and returning valid WKT\r\n    # for Points, LineStrings, and Polygons, respectively.\r\n    #\r\n    # It's arguable that any encodings resulting in only one or two points should be rejected.\r\n    wkt = \"\"\r\n    if countOfLatLonPairs == 1:\r\n        wkt = \"POINT(\" + coordPairString + \")\"\r\n    elif countOfLatLonPairs == 2:\r\n        # WKT's line type is LINESTRING; there is no POLYLINE type.\r\n        wkt = \"LINESTRING(\" + coordPairString + \")\"\r\n    elif countOfLatLonPairs &gt;= 3:\r\n        # Close the polygon ring by repeating the first coordinate pair.\r\n        wkt = \"POLYGON((\" + coordPairString + \",\" + firstLatLonPair + \"))\"\r\n\r\n    return wkt<\/pre>\n","protected":false},"excerpt":{"rendered":"<p>I had a close encounter of the 5th kind yesterday.. here&#8217;s the gist.. It started when someone &#8220;gave&#8221; me a GIS dataset (..of polygons, kind of..) that a colleague of theirs, way back in ancient history, chose to pre-cook as ASCII-encoded point pairs. Their intention was almost certainly to use the pre-cooked data in Google Maps. 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[43],"tags":[18,5,9,19,21,20],"_links":{"self":[{"href":"https:\/\/elrobis.com\/blog\/wp-json\/wp\/v2\/posts\/136"}],"collection":[{"href":"https:\/\/elrobis.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/elrobis.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/elrobis.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/elrobis.com\/blog\/wp-json\/wp\/v2\/comments?post=136"}],"version-history":[{"count":3,"href":"https:\/\/elrobis.com\/blog\/wp-json\/wp\/v2\/posts\/136\/revisions"}],"predecessor-version":[{"id":184,"href":"https:\/\/elrobis.com\/blog\/wp-json\/wp\/v2\/posts\/136\/revisions\/184"}],"wp:attachment":[{"href":"https:\/\/elrobis.com\/blog\/wp-json\/wp\/v2\/media?parent=136"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/elrobis.com\/blog\/wp-json\/wp\/v2\/categories?post=136"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/elrobis.com\/blog\/wp-json\/wp\/v2\/tags?post=136"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}