Spring Conferences from Coast to Coast

Every year it seems that there’s one week or month where we have more events and conferences to attend than we have employees. It’s actually a great problem to have, since we relish the opportunity to get out from behind our desks and into the wild. We love presenting the latest developments from our engineering team, teaching people how to build applications during workshops, participating in panels, and, most importantly, learning from all of the people we meet at these events. So if you’ll be at any of the events below, please stop by our booth or flag us down to begin a conversation. We hope to see you out there!

LocationTech San Francisco Meet Up

When: Thursday, March 20, 2014 at 19:00
Where: Hyatt Regency San Francisco Airport

Following his EclipseCon presentation on GeoGit, Juan Marin, our CTO, will join representatives from Eclipse, RedHat, and the Oak Ridge National Laboratory to present on LocationTech initiatives and other open source projects. This is a free event, but you have to register to attend, so if you’re in the Bay Area make sure to sign up today!

LocationTech Birthday Blast

When: Wednesday, March 26, 2014 at 18:30
Where: Boundless, Arlington, Virginia

It’s been a full year since the Eclipse Foundation joined with Oracle, IBM, Boundless, and Actuate to launch the LocationTech working group for jointly developing components that bring location awareness to enterprise IT. To celebrate, LocationTech members are hosting an event with food, drink, and presentations on projects like GeoGit, GeoMesa and GeoTrellis. Space is limited so sign up soon.

Women in Geospatial Meetup

When: Thursday, April 10, 2014 – 18:30
Where: Boundless, Arlington, Virginia

Boundless is committed to increasing diversity in our industry. While we make our best efforts, we recognize that we’re not experts, so we’re hosting a meetup next month to help turn up the volume on this issue. The evening will feature a panel discussion with well-known women from the geospatial community.

QGIS US User Group Meeting

When: Friday, April 11, 2014 – 09:00
Where: OpenGovHub, 1889 F Street NW, Washington, DC

We’re proud to help organize the first QGIS user group meeting in the United States. The day will start with presentations from notable figures within the QGIS community. After lunch, we’ll break into workshops to cover topics ranging from how to become a QGIS developer to spatial analysis with QGIS. Whether you’re already an expert or just getting started, this event has something for everyone.

State of the Map US

When: April 12–13, 2014
Where: Walter E. Washington Convention Center, Washington, DC

SotM US is headed to our nation’s capital and Boundless will be there as Silver sponsors. We’re looking forward to participating and hearing about the latest developments in the OpenStreetMap community. Boundless will have a booth at the event and Jeff Johnson will give a talk on GeoGit-Based OpenStreetMap Import Workflows. Stop by to find out more about how Boundless tools can help you work with OpenStreetMap.


CalGIS

When: April 14–16, 2014
Where: Monterey, CA

Just after State of the Map, we’re headed to the Golden State for CalGIS. This is our first time attending CalGIS but we’re excited to be Silver sponsors. We’ve seen an increasing number of conferences dedicating time to open source tools and CalGIS is no exception. This year, the agenda includes a breakout session as well as a workshop focused on open source, and Boundless will be participating in both. We’ll be at booth #209, so stop by if you’re in attendance!

GEOINT Symposium

When: April 14–17, 2014
Where: Tampa, FL

This year GEOINT is headed to Tampa and we’ll be there showing off how building web maps and apps is easy with OpenGeo Suite. If you’re attending, swing by booth #8090 or send us an email to set up a meeting.

3/21 Update: We didn’t mean to limit ourselves to North American coasts! Next month Jody Garnett will be representing Boundless at a number of Australian events.

Eclipse Developers Day Sydney

When: Wednesday, April 2, 2014
Where: 100 Walker Street, North Sydney, NSW, Australia

Jody will be joining representatives from CSIRO in adding a bit of mapping to this Eclipse community event. Jody will offer an introduction to LocationTech technologies with demos of JTS and GeoGit. This is a free event but you must register to attend.

Georabble All Stars

When: Wednesday, April 7, 2014
Where: Canberra, ACT, Australia

Georabble presents an All Star event brought to you by Boundless and LocationTech. As part of Locate14, the Georabble organizers have put together an array of seasoned All Stars plus a few new ones, led by MapStory’s Chris Tucker. The lineup includes:

  • Pia Waugh – Open Data Ninja
  • Julian Carver – LINZ
  • Denise McKenzie – OGC
  • Mike Bradford – Landgate
  • Jody Garnett – Boundless
  • Chris Tucker – MapStory

…and more. Again, this is a free event but you must register to attend.

GeoGit in Action: Distributed Versioning for Geospatial Data

Juan Marin

As we showed with the Typhoon Yolanda recovery effort, every project working with geospatial information eventually faces the problem of managing change over time. At the center of the issue is data provenance: where the data originated, whom it belongs to, and what set of individual changes were made to a particular piece of information in order to reach its current state. While versioning approaches have existed for a while, they are cumbersome and have been a challenge in many workflows, especially those that involve more than one individual. We created GeoGit to help solve these problems.

GeoGit takes concepts and lessons learned from working with code in open source communities and applies them to managing geospatial information. GeoGit allows for decentralized management of versioned data and enables new and innovative workflows for collaboration. Users are able to track edits to geospatial information by importing raw data into repositories where they can view history, revert to older versions, branch into sandboxed areas, merge back in, and push to remote repositories.

Working with GeoGit

Once installed, a simple working session might look like this (data references are from the freely available Natural Earth collection):

1. Create a repository and import raw geospatial data (from Shapefiles and spatial databases such as PostGIS, Oracle Spatial or SQL Server):

mkdir repo
cd repo
geogit init
geogit shp import ne_110m_coastline.shp

2. Add the imported data to the staging area. This command signals that this is information to be versioned and tracked and prepares it for final insertion into the repository.

geogit add

3. Commit the information to the repository. Developers familiar with Git will appreciate the familiar API and command line options. In this case, we are passing a commit message that will be associated with this change.

geogit commit -m "Add coastline"

4. In order to make changes and collaborate with others, a typical workflow involves creating branches to isolate changes from the master branch. Creating a branch in GeoGit is as easy as issuing the following command:

geogit branch branch1

GeoGit branching

This creates a new branch called branch1 where all commits will go to until another branch is chosen. Branching is an important concept in GeoGit as it enables editors of geospatial content to modify information without worrying about interfering with the quality of the main version, usually stored in the master branch.

5. When changes are ready to be brought back into the main version, they can be merged into another branch using the merge command.

geogit checkout master (switches to the master branch)
geogit merge branch1

Upon merging, a merge conflict is returned if conflicts are detected (for example, two users independently modify the same geometry with different outcomes) and a commit cannot happen until the conflict is resolved. This is an important feature that prevents geospatial data corruption and enforces workflows that involve data quality assurance.
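The merge semantics just described follow the classic three-way pattern: a conflict arises only when both sides change the same feature to different values relative to their common ancestor. Here is a minimal conceptual sketch of that rule in Python — an illustration only, not GeoGit’s actual implementation:

```python
def three_way_merge(ancestor, ours, theirs):
    """Three-way merge of feature edits against a common ancestor.

    Each argument maps feature id -> geometry/attributes (any comparable
    value; a missing id means the feature was deleted on that side).
    Returns (merged, conflicts).
    """
    merged, conflicts = {}, []
    for fid in set(ancestor) | set(ours) | set(theirs):
        a, o, t = ancestor.get(fid), ours.get(fid), theirs.get(fid)
        if o == t:            # both sides agree (or neither touched it)
            result = o
        elif o == a:          # only "theirs" changed it
            result = t
        elif t == a:          # only "ours" changed it
            result = o
        else:                 # both changed it, differently: conflict
            conflicts.append(fid)
            continue
        if result is not None:  # drop deletions from the merged result
            merged[fid] = result
    return merged, conflicts
```

In GeoGit the analogous check happens per feature, which is why most concurrent edits merge cleanly and only genuinely contradictory changes require manual resolution.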

Learn More

Anyone familiar with tools like Git, which handles distributed version control for source code, will immediately see the advantages this approach brings.

GeoGit is an open source project based on the Java platform and is developed by committers across several organizations. It has recently been submitted as a project of the LocationTech working group within the Eclipse Foundation. GeoGit has also been designed to be extensible, and there is already a Python wrapper library that makes these operations easier and enables automation.
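As a taste of what such automation can look like, here is a small hand-rolled Python sketch that shells out to the geogit command line. The helper names (`geogit`, `import_and_commit`) are hypothetical, for illustration only, and are not part of the actual wrapper library:

```python
import subprocess

def geogit(repo, *args, dry_run=False):
    """Run a geogit subcommand inside the repository directory `repo`.

    With dry_run=True the assembled command line is returned instead of
    executed, which is convenient for logging and testing.
    """
    cmd = ["geogit"] + list(args)
    if dry_run:
        return cmd
    return subprocess.run(cmd, cwd=repo, check=True,
                          capture_output=True, text=True).stdout

def import_and_commit(repo, shapefile, message, dry_run=False):
    """Scripted version of the import/add/commit session shown earlier."""
    steps = [
        ("shp", "import", shapefile),
        ("add",),
        ("commit", "-m", message),
    ]
    return [geogit(repo, *step, dry_run=dry_run) for step in steps]
```

Wrapping the CLI this way lets routine import-and-commit cycles run unattended, for example from a nightly cron job.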

To learn more, visit http://geogit.org/ or watch a full presentation on how we’re redefining geospatial data versioning with GeoGit below.

Openwashing: Kissing the Geospatial Blarney Stone

I mentioned last week that the word “open” is sometimes thrown around by closed source software companies a little… freely. But hey, what’s in a word?

When mining companies talk about “clean coal” it’s easy to see they’re engaged in “greenwashing.” Their financial interests in digging and selling lots of coal is clearly in conflict with their rhetorical support for generating less atmospheric carbon.

The blarney is not as obvious in the software industry. While proprietary software companies can launch true open source initiatives, often they open just a little to further entice users into their proprietary stack. What should we call it when a proprietary software company – whose financial interest is in selling lots of expensive closed software – starts talking about their commitment to open source? May we suggest, “openwashing”?

In honor of St. Patrick’s Day and inspired by Jeff Foxworthy, here are some handy examples to help spot openwashing in the wild:

You know they’re openwashing when…

“Esri supports extending and complementing the ArcGIS platform with open source tools, languages, integrated development environments (IDEs), libraries, and Web server technologies.” — Esri

“Esri has always supported our developer community with source samples and detailed SDK documentation for configuring and extending the ArcGIS system.” — Esri

… “using open source” means “using open source to extend our proprietary platform”.

You know they’re openwashing when…

… they release a “free viewer”!

You know they’re openwashing when…

“Popular tools such as GDAL/OGR, PostgreSQL, Dojo, MongoDB, and others have all been actively supported by Esri.” — Esri

… they start listing open source projects they use as examples of projects they support. 

You know they’re openwashing when…

“Back before open-source code was a ‘thing‘ Esri had ArcScript and many other tools that developers could download and share their code.” — Esri

… they start claiming prior art on open source.

You know they’re openwashing when…

“Word from Esri early this morning that today Esri’s education industry manager Michael Gould will announce the change in licensing of the Geoportal Extension at FOSS4G in Barcelona today.” — All Points Blog, 2010

… they start releasing failed or deprecated products as open source (but not the ArcGIS API for Flex; sorry, Flexians).

You know they’re openwashing when…

… “implementing open standards” means “getting our software ratified as the standard”.

You know they’re openwashing when…

“Microsoft has changed as a company and is becoming more open in the way that we work with and collaborate with others in the industry, in how we listen to customers, and in our approach to the cloud.” — Microsoft

“We have organically grown the understanding and utilization of open-source both internal to Esri as well as heavily promoted to our developer community.” — Esri

… they say they’ve changed! 

Know of other examples?

We had so much fun doing these we could think of about 100 more, but we’d rather hear from you. Tell us your #openwashing sightings, and we’ll share the best here.

Thoughts from GeoNext

Jody Garnett

GeoNext is an eclectic spatial conference and a highlight in the Australian calendar. This year, the event took place on the 26th of February at the Australian Technology Park, just south of downtown Sydney. The location had that wonderful startup vibe of historic buildings and new ideas, making it a fitting venue for a conference that searches out what is next.

A quick word of thanks to Maurits van der Vlugt and the organizers, sponsors, and presenters. Maurits offered a quick introduction as MC and kept a decent pace throughout the day.

GeoNext at Australian Technology Park

The Best

The highlight of the conference was a presentation by James Moody (TuShare) on “Unleashing the Sharing Economy: Mapping the idle assets all around us”. This talk started with the “next” and worked its way back to “geo”.

His viewpoint on the future was provided by a stint at the national research organisation’s (CSIRO) “futures unit”, which asked scientists for 30-year predictions in order to spot trends across disciplines. This was framed as a historical graph showing waves of innovation, market dominance, and economic collapse. To bring “geo” back into focus, the sixth wave of innovation (and all those predictions) is coalescing around handling resource scarcity, the key enabler of which is using location to drive efficiency.

While this is all heady stuff, two key examples brought the point home and provided context for the rest of the conference. The first looked at the amazing 15% engine efficiency of today’s automobiles. While this is fine if you want to move cars around, the actual goal is to move people, resulting in 1.5% efficiency (based on the weight of the car versus the weight of the people). We were then asked to consider how many of us had cars sitting unused at home (acting as idle assets). The second example was more subtle, but keenly felt given Australia’s current drought. A lot of effort goes into making Sydney’s water potable, but we only consume 2% of the water entering a home. The other 98% represents wasted effort (as we do not require drinkable water for cleaning).

The Rest

Nic Lowe (GoGet) offered a jalopy to fleet story addressing all those idle cars left at home. It was a great low-tech reality check for the conference. Location, and indeed technology, only played a small part in his story but we were quite charmed by the initial low-tech solutions (key boxes on utility poles, spreadsheets, calling everyone when the computers were down).

GoGet was also kind enough to sponsor a hackathon around their vehicle data: resulting in a great display of creativity and insight from participants. The winners were impressive, showing business sense (ask users to fill up on Tuesday to save 8% annually on fuel) and marketing insight (allow users to name their cars to build a customer relationship).

All in all, an excellent reminder that the value in location is to be found everywhere, especially once you look beyond mapping.

Ian Scrivener created a polished business dashboard backed by PostGIS, GeoServer, and Leaflet.

While GoGet focused on being ubiquitous in Sydney (down to a 100m walk for members), another organization got here first. Dr. Kurt Iveson (University of Sydney) looked at how the bus system is doing: from an initial “everything will be great” optimism 18 years ago when buses were first equipped with GPS units, through arguments with unions (buses are a workplace), to a sequence of abandoned initiatives that only a municipal government can provide. The result is an amazingly complex integration issue of varying accuracy.

I latched on to a few amusing facts: The effect of posting the time until the next bus? Patrons were more relaxed and felt their wait was half as long. More amusingly, users of transit apps did not really trust them, preferring to have the time posted at the bus stop.

The section of the talk devoted to data publication, and by extension open data, was illustrative. Brisbane, which has gone down the open data route, has a host of applications. Sydney has held a series of app competitions, resulting in a smaller selection of apps. Why the different approach? Sydney is in a position to recommend and promote the resulting apps, something they feel would be lost by open data (perhaps referring back to that idea of trusting your smartphone app).

Billy Haworth (University of Sydney) played to local strengths by combining natural disasters and social media, both of which Australia has in abundance. It is amazing to contrast the 2010 Brisbane Flood (when the police took to Twitter to communicate to the public rather than wait for journalists) and the 2013 Sydney Fires where the public already knew the drill and everything happened online. A real sense of going to the people in a time of crisis.

Kolt Luty (Pitney Bowes Software) gave a talk on location intelligence that was illuminating at several levels. Pitney Bowes is viewed with consternation in Australia, as it has acquired many companies, including MapInfo. Using retail examples, Mr. Luty gradually introduced business intelligence and then bridged the gap to using location to make sense of data mining results. The concepts, and the idea of bringing them together as an alternative to GIS, were new to the audience and resulted in lively discussion.

Rohan Fernando (HERE) gave a presentation entitled “Race for the Geospatial Overworld” that did a great job of introducing the different industries vying for the Spatial Data Infrastructure crown.

There was also an entertaining panel discussion on wearable computing in which the participants did a great job playing off each other.

I was a little let down by the academic presentations, which are usually a highlight for me. The talk on indoor location featured base stations that looked a little like Death Stars but managed to avoid both the social opportunities and business motivations that would see the technology adopted. The talk on “Aging in Place” provided a vision of how society will change, but felt repurposed for the event and did not especially tie its concepts back to location.

The Tech

To bring things back to technology, Simon Hope offered a refreshing talk entitled “Geekification of GIS”. He really worked up the crowd, drawing influences from search, change and online maps to capture the imagination. I especially liked the nod to digital natives and open source as showing the way — and the shout-out to GeoGit was welcome.

The Wrap

The conference wrapped up with drinks and a chance to view the map gallery or speak with the sponsors. A nice GeoRabble style ending to a great day.

If you missed GeoNext, you can see additional photos on flickr. If you are near Sydney, here are a couple more events to put on your radar:

  • Eclipse Day Sydney (April 2) has a bit of a BI / Big Data feel to it, but I see a few crossovers to mapping scheduled.
  • Locate14 (April 7-9) is the national GIS conference held a few hours down the road in Canberra.

Jody Garnett lives in Sydney, Australia with his wife. When he’s not working with computers you may find him painting, drawing, or taking photographs. Jody blogs at how2map.com and tweets at @jodygarnett.

Going “Open” with Esri?

Eddie Pickle

Last year, Esri released a blog post about going open source with Esri, which they recently followed up with a progress report. While we laud Esri for some genuinely open efforts like their GIS Tools for Hadoop, their use of the word “open” can be a bit… misleading. That’s not helpful to a geospatial industry that’s still trying to catch up to the rest of the IT industry in the use of real open source software.

Open source software licensing has been a boon to the IT landscape. It promotes rapid adaptation, superior interoperability, better collaboration amongst developers, a greater universe of software testers, and greater control for enterprises using the software.

Open Like a Venus Flytrap

Open source has been so successful at replacing proprietary alternatives that many closed source companies have been scrambling to buy open source companies — or else releasing “fig leaf” open source projects to increase the market share of their closed source software.

Of course we’re glad to see that Esri has more than 200 projects on GitHub, but how many of them are sample applications or frameworks that only work with their proprietary offerings? It’s nice to see that the Geotrigger SDK for Android is licensed under the Apache 2.0 license, but how can one stand up a Geotrigger service without paying Esri? Is it even an open standard?

Working in the Open

We’re also happy to see Esri adopting modern development practices like distributed version control, but it’s a bit misleading to describe using GitHub internally as ‘open source’ — for one thing, the very meaning of ‘open source’ is public access to source code, and we don’t see Esri providing much access to the code that powers their APIs. If Esri is truly committed to working in the open, why start a LIDAR format war instead of working within an existing open source community? If Esri is committed to open source, why not save the ArcGIS API for Flex by releasing the source code, as their own customers have asked?

In contrast, all the source code for OpenGeo Suite is accessible on our public GitHub repository. And we don’t just provide the source code, we’re actively involved in the governance of the open source components we depend on and substantively participate in community events like FOSS4G.

A History of Open

While Esri seems eager to market their “open” platform and their many “open” initiatives, the DNA of a company is hard to change. Esri was forged in the proprietary GIS era of the 1980s and 1990s, and it shows in the way they twist the meaning of the very word “open” to lock customers into closed source, proprietary software.

At Boundless, our world is open. We started at a non-profit and continue that legacy by building truly open communities, using open licenses, employing expert core developers, publishing Creative Commons learning materials, and helping communities that build open source software.

“Open” is more than a word and open source software and open data are more than token initiatives. We sympathize that the initial acceleration towards openness can be difficult — especially for a company with Esri’s history — and we’re hopeful that the next year will show some true progress.

Eddie Pickle has worked in the geospatial industry for more than 30 years and has been a senior executive at well-known software and data leaders including IONIC, Claritas, and now Boundless.

Boundless Connections: Gretchen Peterson

Gretchen Peterson

Say hello to Gretchen Peterson, our new Data Scientist. Gretchen has experience preparing and displaying data. Find out where Gretchen’s been, what she’s been up to, and where she plans on taking us.

Welcome to the team, we’re happy that you’re joining us. What will you be working on here at Boundless?

I’ll be developing and curating data products as well as helping to improve the overall user experience of OpenGeo Suite. I have a long history with data science — figuring out where to get data, how to manage it appropriately, and then how it would be best to gain understanding from that data. On the “understanding data” side of things, I’m adept at spatial analysis as well as cartography.

So just how does one become a data scientist?

It usually follows a circuitous path. In my case it started with a B.S. in Natural Resources, with an emphasis on GIS, from Cornell University. I later became a GISP. My career began with a lot of analyzing watersheds, salmon habitats, and other related things out in the Northwest. Along the way I got serious about cartography but never let up on the analytical side of things.

As a consultant, my clients were mostly interested in data and what it meant, so I learned as I worked. I always wanted to make sure we were using the right tools. Not just looking at maps for places where it looked like the habitat was good, but actually choosing models and nearest-neighbor algorithms and the like to truly pinpoint areas of concern and gain some knowledge of the potential for error.

You’ve also written two books that deal with the craft of GIS map making — Cartographer’s Toolkit and GIS Cartography — what inspired you to work on those?

I figured there was a need to present people with color and typography options in an easy-to-compare way, with particular emphasis on mapping, and to showcase all the wonderful and unique things people are doing with maps these days. Plus, I had some free time.

Cartographer’s Toolkit deals primarily with presentation best practices, not so much the data side. I figure that if you work so hard on data and analysis but nobody understands it, then what’s the point?

Are you more of a data person or are you more about the way it all looks?

Hmm… I’m 60% about data and 40% about how it looks. Can I be both?

You’re almost completely in balance.

<laughs> “Almost.”

What were some of your favorite projects?

One of my favorite projects was creating basemaps for use in some business software. You’d think it’d be easy to create basemaps but it’s not. First, there’s the data. We had to look through over a dozen different data sources for the major data types — admin boundaries in particular have many iterations and many sources of authority. We chose to use OpenStreetMap for much of the other data, so that was nice. It’s nice to have a whole database with so many different data types to potentially display: roads, airports, buildings by type, and so on. Only once we had decided on data sources could I style it. It took about four months to complete. It was a lot of fun, but of course had its own nuances and challenges.

What were those challenges you were facing?

Once we had developed a good basemap, I then had to tweak colors and things to create an alternate style, plus manage stylesheets for twelve regions (keeping track of different languages, different names) and three different world-views: China, India, and rest-of-the-world. Connecting to an OpenStreetMap database on Amazon Web Services via TileMill can take a while, so development of maps there needs to happen on a smaller database or with query constraints to prevent each change from taking a long time to visualize. Then you have little problems to correct — like, “Hey, there’s a weird artifact showing up at the mouth of the Thames.” Looking back it was such a challenge but lots of fun!

The sources of data must have been wildly different.

For the most part, OpenStreetMap has it covered. It was more, like, what does India call this region and do they consider that region to be disputed whereas others don’t? Another complicating factor is fonts, as you have to use different fonts for different languages with different character sets and formatting — it gets to be a little hard to test if you don’t know what it’s supposed to look like.

In that time, you must have amassed a huge knowledge of these regions without having been there.

There was a lot of that. Things I should have known but didn’t. Like the status of Taiwan, for instance. Politically, is it a part of China or its own entity? That was something I didn’t know before.

How much knowledge of a place do you feel a cartographer needs to have when documenting it?

If a cartographer doesn’t have knowledge of a place they’re working with then they can make mistakes. These might not be immediately recognizable to them, but it can lead to releasing an embarrassing product. However, if you are building a world map there are going to be things that you don’t know as much about. The responsibility of the cartographer, then, is to know when to ask local experts for help.

For example, I had to make regions out of the U.S. for one client. And they had to be regions based on where people live and work, where they are likely to spend their time. I don’t know much about the midwest so I sub-contracted that part of the work to people who live in the midwest who could double-check the regions I had made. I mean, my regions were based on good data, but nothing substitutes for a good pair of local eyes!

What are your tools of choice?

I use whatever is needed. I currently love to do finishing work with Inkscape because you can create a lot of nice effects without much fuss and the filters are extremely fun to play around with. TileMill is pretty good too in that you can make bulk changes just by copying and pasting things as opposed to clicking on a lot of buttons like you would have to do in a GIS.

Out of curiosity, why Inkscape vs Adobe Illustrator?

Because it’s free. I used to do the month-to-month Illustrator license but then I realized Inkscape is comparable but free. I wasn’t an expert at Illustrator prior to my switch, so it wasn’t painful because either way I had to learn. If you already know Illustrator, though, I could see it being difficult to switch!

What’s your experience in using open source tools?

I’d describe my experience as novice but with great enthusiasm for them. More specifically, I’ve used tools like git, GitHub, CartoCSS, GRASS, and QGIS in the past, but not to the extent that I would like. I’m pretty sure that working for Boundless will ameliorate that!

Did you start out using Esri software?

Yes, I started with ArcView 3.

Have you ever used OpenGeo Suite?

I’ve used PostGIS and QGIS. I’m excited to jump into learning all of OpenGeo Suite and I’m already starting to see the value of GeoGit.

What do you do when you aren’t at work?

I recently became interested in skiing again after a long hiatus. I used to ski as a kid but had a lot of bad experiences. Now that I’ve taken it up again, I realize that, as an adult, I can stay on the blue runs (the medium-hard runs) and have fun, though people who are out on their first day sometimes do better than me. I’m slow but it’s all about having fun, right?! I go to this place where people come in their cowboy hats and sometimes jeans. It’s so low-key. It’s great.

Are you a world traveller?

I do like to travel and I’m able to do so maybe six times a year. Lately I’ve been going to various places in California. I just came back from London, but that was just for sightseeing. It was a blast. I like visiting both cities and rural or nature areas. I love to hike and campout but I also love to see NYC, London, and San Francisco. It’s all good. Our country’s natural resources and our cultural resources are both things to be enjoyed and cherished.

Let’s finish up with an interesting fact about yourself

Hm, interesting fact. Uh, I eat like three breakfasts? Then not a lot for dinner. I couldn’t tell you why. I’m just very hungry in the mornings. <laughs>

Another interesting fact: I once had an internship where I used to walk around a little island censusing common tern babies on Oneida Lake in New York. I had to drive a boat to get there. So I had an internship that required boat commuting.

Are you a licensed boater? What kind of a boat is it?

I didn’t have a license, no, but I was probably the best boater out there. Those fishermen can be rather reckless. It was just a small little motor boat. Nothing special. It had a hole in it, and my supervisor told me that if I ever got stuck in the lake and it leaked too much, to just gun it back to shore.

Any last words you’d like to add?

Just that I’m enthusiastic about doing data science at Boundless!

Interested in joining Boundless? We have an amazing team of passionate, creative, and incredibly bright people and we’re hiring more!

Recovering from Yolanda with help from OpenStreetMap and GeoGit

We’ve had several opportunities to refine GeoGit workflows in real-world situations, but among the most fulfilling was assisting with the response to Typhoon Yolanda (also known as Typhoon Haiyan) in the Philippines. It was the strongest cyclone to make landfall in recorded history, resulting in an urgent need to share data about the damage to help with recovery and reconstruction.

Yolanda Data

To meet this need, the Global Facility for Disaster Reduction and Recovery (GFDRR) teamed up with the American Red Cross and the Humanitarian OpenStreetMap Team (HOT) and launched an open data platform to gather and share data about Yolanda. The ROGUE project, which helps develop GeoGit, was asked to help manage and distribute extracts of OpenStreetMap data. As described below, we created a powerful bidirectional workflow with OpenStreetMap that enabled us not only to derive and publish up-to-date data for response and recovery efforts but also to contribute back to OpenStreetMap.

Importing OpenStreetMap Data

Thanks largely to HOT’s efforts, a large number of damaged and destroyed buildings were mapped into OpenStreetMap using commercial satellite imagery distributed under the Next View license or the State Department’s Imagery to the Crowd program. GeoGit was used to extract data from OpenStreetMap and transform it into formats more useful to traditional GIS applications.

While GeoGit supports reading and writing OpenStreetMap data in a variety of ways, the Yolanda efforts started with the daily .pbf downloads from Geofabrik, which were then imported into a GeoGit repository using the geogit osm import command. This initial import brings the data into the standard node and way layers of a GeoGit repository, with all of the OpenStreetMap tags attached to each feature. During the first few imports we were able to find and fix some performance bottlenecks, reducing the import time from over an hour to just a few minutes.

Mapping to a Schema

Once imported, the geogit osm map command was used to map the data into more traditional sets of layers, using the tags as attributes. A JSON mapping file specified which tags were used to separate the features into layers and which attributes to assign to each feature. The key mapping involved taking nodes and ways tagged with typhoon:damage=yes and translating them into damage_polygons and damage_line layers with associated attributes. Over the course of mapping the data, we were able to make improvements to the codebase and workflow in several areas.

Sharing Up-to-Date Data

Once the repository had the data organized into the right schema, we used the geogit export pg command to load snapshots into a PostGIS database and serve them to the web. Since we wanted to provide the most current data, we used the geogit osm apply-diff command to update the repository with daily updates from OSM planet. This ensured that our repository always reflected recent edits and that layers were exported and updated on the site.

Contributing Back to OpenStreetMap

In addition to staying in sync with the global OpenStreetMap planet, GeoGit made it possible to change layers in our repository and apply those changes back to OpenStreetMap — enabling a fully round-trip, bidirectional workflow. For example, we found many misspellings and inconsistent uses of tags in the data and were able to correct them. We fixed these issues against our PostGIS snapshot, applied the changes back to the repository, generated a changeset using the geogit osm create-changeset command, and finally uploaded the changeset using JOSM. In the process, we were once again able to improve these functions based on real-world usage.
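The full round trip described above can be sketched as a sequence of commands. The command names are those mentioned in this post; the file names, layer arguments, and flags below are illustrative assumptions rather than exact invocations.

```shell
# Import a daily Geofabrik .pbf extract into the standard node and way layers
# (file name is an example).
geogit osm import philippines-latest.osm.pbf

# Map tagged features into conventional layers using a JSON mapping file.
geogit osm map mapping.json

# Export a snapshot of a mapped layer to PostGIS for serving on the web
# (arguments are illustrative).
geogit export pg damage_polygons

# Apply a daily diff from OSM planet to stay in sync with upstream edits.
geogit osm apply-diff daily-update.osc

# After fixing tags locally, generate a changeset to upload with JOSM.
geogit osm create-changeset -f changeset.xml
```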

GeoGit Advantages

These tools enable a powerful bidirectional workflow with OpenStreetMap. We demonstrated that data can be imported from OpenStreetMap into a local repository, mapped into a set of layers with a well-defined schema, and served via OGC services. Repositories can be kept in sync with OpenStreetMap over time and, if changes are made to the local repository, GeoGit enables us to produce changesets that can be contributed to the global OSM dataset. Using this same workflow, it becomes possible for users to effectively work with a local extract of OSM data for both making and applying local edits as well as incorporating upstream changes.

Jeff Johnson will present more about GeoGit-based OpenStreetMap import workflows on April 13th at State of the Map US.

OpenLayers 3 Vector Rendering: Topology-Preserving Simplification

OpenLayers 3 aims to push the boundaries when it comes to rendering vector data in a browser. Whether you’re editing a stream network, exploring congressional district boundaries, or just viewing point markers on a world map, we want you to be able to see and interact with beautifully rendered vector data even when dealing with large, complex datasets. To achieve this, the library needs to be very efficient at simplifying geometries without sacrificing the visual integrity of the data.

When a geometry has many vertices per device pixel, it is best not to waste a user’s time by trying to draw every one of them.  Developers typically try to solve this problem by simplifying polygons with an algorithm based on Douglas–Peucker (although Visvalingam has not completely been overlooked). However, these line simplification algorithms are not suitable for polygon simplification when topology needs to be preserved — meaning when geometry rules like “must not overlap” and “must not have gaps” need to be enforced.

Considering Douglas-Peucker

If, for example, you are rendering polygons that represent administrative boundaries, and your audience is going to be surprised to see gaps between adjacent boundaries, you’ll need to consider a simplification algorithm that preserves topology.  The graphic below shows national borders simplified with a Douglas-Peucker–based algorithm at a few different tolerance levels.


I’ll use units of device pixels when talking about simplification tolerance.  While these are translated to map units before simplifying, it is easier to reason about pixels when discussing rendering. As shown above, simplification using Douglas-Peucker results in a visual loss of topology at the 2px tolerance level.  At a 4px tolerance, the method is clearly unsuitable for simplification before rendering.  Faced with these results, you might conclude that you should just use a 1px tolerance or below and move on.
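For concreteness, here is a minimal recursive Douglas–Peucker sketch in JavaScript — an illustration of the algorithm under discussion, not the implementation used in OpenLayers 3. The function and variable names are my own.

```javascript
// Minimal recursive Douglas-Peucker line simplification.
// points is an array of [x, y] pairs; tolerance is in map units.
function douglasPeucker(points, tolerance) {
  if (points.length <= 2) return points.slice();
  const [x1, y1] = points[0];
  const [x2, y2] = points[points.length - 1];
  const dx = x2 - x1, dy = y2 - y1;
  const len = Math.hypot(dx, dy) || 1;
  // Find the interior vertex farthest from the segment joining the endpoints.
  let maxDist = 0, index = 0;
  for (let i = 1; i < points.length - 1; i++) {
    const d = Math.abs((points[i][0] - x1) * dy - (points[i][1] - y1) * dx) / len;
    if (d > maxDist) { maxDist = d; index = i; }
  }
  // If every interior vertex is within tolerance, keep only the endpoints;
  // otherwise keep the farthest vertex and recurse on both halves.
  if (maxDist <= tolerance) return [points[0], points[points.length - 1]];
  const left = douglasPeucker(points.slice(0, index + 1), tolerance);
  const right = douglasPeucker(points.slice(index), tolerance);
  return left.slice(0, -1).concat(right);
}
```

Note that each keep-or-drop decision depends on the positions of other vertices in the same geometry, so two adjacent polygons sharing a border can be simplified differently — which is exactly how the gaps and overlaps shown above arise.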

An aside on frame budget

When animating things like map panning or zooming, having a constant frame rate prevents the viewer from experiencing stuttering map movements.  On capable devices, sixty frames per second is a good goal, but it leaves a frame budget of only 16.7 milliseconds.  Within that budget, you will need to access all features for rendering, calculate all symbolizers to be used, do any geometry simplification, and then do the actual drawing. The Douglas-Peucker simplification above takes about 35ms to run on a moderately “low resolution” set of features with a capable browser.  Since this completely blows the target frame budget, simplification during animation is out of the question.

The answer, of course, is to cache the simplified geometries (or the draw instructions themselves) and use the cached results during animation.  However, this presents a new problem.  During animated zooming, simplified geometries that looked okay at the original resolution may show gaps and overlaps at a 2x or 4x higher resolution — this is sort of like what you see going from the 1px simplification to 4px in the graphic above.

Quantization-based simplification

So, to avoid the extra work of rendering visually insignificant vertices without the artifacts produced by Douglas-Peucker, you’ll need to consider an algorithm that preserves topology.

Quantization is basically the process of taking a continuous (or large) set of values and mapping them to a discrete (or smaller) set of values.  Rounding to the nearest integer is an example of quantization.  The reason to consider quantization as the basis for geometry simplification is that this repeatable mapping of continuous to discrete values preserves topology – two geometries that share coordinate values before simplification will still share values after.  As mentioned earlier, I’m concerned primarily with “must not overlap” and “must not have gaps” type topology rules.  Quantization will change the topology in other ways (rings may no longer have area), but it will respect topology in this aspect: if adjacent geometries don’t have gaps before, they won’t after.  Same goes for overlaps.


While quantization at the 4px tolerance level above generates undesirable results, the 1px and 2px levels look good.  Quantization itself is a form of simplification.  But we can take it a bit farther without doing much work.  The simplification steps are basically these:

  • quantize coordinate values given some tolerance
  • drop duplicate and collinear points (when more than 2)

This quantization-based simplification can be performed in a single pass over the original coordinate values.  For similar data sets and tolerances, the method is considerably faster than Douglas-Peucker [1].  It is also clearly less aggressive at simplifying [2].  However, it accomplishes our goals: quantization-based simplification avoids rendering vertices that don’t have a significant visual impact while preserving the kind of topology we care about.
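A single-pass implementation of the two steps listed above might look like the following JavaScript sketch — the idea from this post, not the actual OpenLayers 3 code; the names are my own.

```javascript
// Quantization-based simplification in a single pass.
// coords is a flat array [x0, y0, x1, y1, ...]; tolerance is in map units.
function quantizeSimplify(coords, tolerance) {
  const out = [];
  let prevX = NaN, prevY = NaN;
  for (let i = 0; i < coords.length; i += 2) {
    // Step 1: snap each coordinate to the quantization grid. The snapping is
    // repeatable, so vertices shared between geometries stay shared.
    const x = Math.round(coords[i] / tolerance) * tolerance;
    const y = Math.round(coords[i + 1] / tolerance) * tolerance;
    // Step 2a: drop consecutive duplicates produced by snapping.
    if (x === prevX && y === prevY) continue;
    // Step 2b: if the last two kept points and this one are collinear and
    // headed the same way, replace the middle point instead of appending.
    const n = out.length;
    if (n >= 4) {
      const dx1 = out[n - 2] - out[n - 4], dy1 = out[n - 1] - out[n - 3];
      const dx2 = x - out[n - 2], dy2 = y - out[n - 1];
      if (dx1 * dy2 === dy1 * dx2 && dx1 * dx2 + dy1 * dy2 > 0) {
        out[n - 2] = x;
        out[n - 1] = y;
        prevX = x; prevY = y;
        continue;
      }
    }
    out.push(x, y);
    prevX = x; prevY = y;
  }
  return out;
}
```

Because equal input coordinates always snap to equal output coordinates for a given tolerance, adjacent polygons that share an edge before simplification still share it afterward — which is the “no gaps, no overlaps” property we care about.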

Quantization-based simplification is itself rather simplistic.  While it is possible to maintain topology with methods like Visvalingam’s line simplification, doing so requires that you have a known, fixed set of vertices before simplifying.  That doesn’t work if you are progressively loading and rendering data, or if you otherwise need the simplification process to be repeatable for a single geometry regardless of which other geometries it is rendered with.

See for yourself!

Take a look at the OpenLayers 3 vector rendering examples to see this topology-preserving simplification in action — and let us know if you find any gaps or overlaps in adjacent polygons!

[1]  Quantization-based simplification performed 2–7x faster than Douglas-Peucker on the same data sets with the same tolerance in unscientific benchmarks performed while writing this post.

[2]  With a tolerance of 80,000m, the Douglas-Peucker method reduced the original number of vertices in the sample dataset by 70% while the quantization-based method reduced the number by 27%. The 80,000m tolerance corresponds to about 2 pixels at zoom level 2 in the common XYZ tiling scheme.

Boundless Connections: Rick Berry

Say hello to Rick Berry! Last week, Rick joined our Professional Services team, where he’ll develop solutions for our enterprise customers. He’s been an avid user and advocate of open source geospatial software for some time and has recently used some of those components to develop web clients for the federal government.

Welcome to the team, Rick. We’re excited for you to join.

Thanks! I’m excited to be here. I’ve been interested in coming to work for Boundless for a while. I’ve noticed that there’s been a steady stream of great talent from the geospatial community coming to work at Boundless and I’m happy to be the next in line.

What are you most excited to be doing at Boundless?

For one, I feel like I’ll be doing what I’ve always done — solving complex problems for clients — which is something I’ve always loved. My job hasn’t changed that much from any of my previous ones, only now I’m using OpenGeo Suite to solve those problems and deliver professional services. Now that I’m in the inner circle at Boundless, I want to help drive the development of all of our tools.

Have you ever used OpenGeo Suite in a project?

Yes, actually just days after it became available in 2010 — OpenGeo Suite had been released for the first time two days prior. I downloaded and deployed it on my desktop machine and, for dramatics, let’s say I blurted out loud: “Hey, this stuff just works!” <laughs> It was fun to play around with, and satisfying in that you could start serving data and maps quickly. In minutes, I had a local app server running and was playing around with client examples and interfaces. Developing with OpenGeo Suite was incredibly clean and easy to configure — all the different modules just plug and play into the client.

Where did you first become exposed to open source geospatial tools?

At my last job we were a full Esri shop, but we kept running into issues with the bloat and cumbersomeness of ArcGIS. Esri’s product had become so complex that just to deploy a server we’d actually need to have an Esri engineer on-site, looking over our shoulders and directing us. We weren’t able to just launch a WMS or WFS service without someone flying in to oversee it.

In the course of one of these installs, I did a search to see if there were any alternatives — any other way to just quickly put up a geospatial server — and I stumbled upon GeoServer. This was back in 2010, and over the last four years so many features have been added that it has really started to give Esri a run for its money. Plus, it was easy to use out of the box, as they say. It was just really helpful to be able to quickly get a server up and running.

Were you the only one in your shop who felt that way?

There was an enthusiastic groundswell of fellow engineers who wanted to deploy open source solutions. It was a natural push from engineers to get those in charge to adapt. There was also excitement from the government employees for something new — whether geospatial information systems, ‘geoweb’ services, or the like. I’d say our government clients were generally about ten years behind what was new in the industry — partially due to inertia, partially due to ignorance of what else was available, and partially due to being locked into Esri contracts.

Where do you see the geospatial web world moving in the future?

With Esri plateauing in terms of features and ease of use but forever rising in terms of cost and complexity, I believe that now is the time for the more affordable and easier-to-use folks — us! — to move in. I’ve seen a lot of desire for geospatial web enterprise infrastructure. Many of my government clients have been anticipating that the future is web geospatial information services.

What job or project did you find most interesting to work on?

I once worked at a sensor fusion company developing the hardware and software used by infrared cameras to track movements and keep inventory. One project was for an animal research lab that used the sensors to keep track of the whereabouts of lab monkeys within their habitats. Another was for an automotive parts supplier: it tracked whether the proper amounts of fluids were being added to the transmissions and differentials of cars coming off the assembly line, to ensure cars didn’t end up on dealer lots without the proper fluid levels.

What do you do away from the keyboard?

If I have time off, I’ll be in my shop — I’m a big tinkerer. I guess I’ve been inadvertently co-opted into what is now called the maker movement. <laughs> I’m really interested in 3D printing, laser cutting, and CNC machines. I even built my own CNC machine at one point. Most recently, I’ve been playing around with developing an open source watch that links to a smartphone, along with all the attendant hardware needed to do that — circuit boards, OLED displays, and the sensors needed to make it all work.

The Spatial IT Job Board

Those of you who’ve seen Paul Ramsey’s Spatial IT and the Spatial Web presentation know that we see a great future for software developers and IT professionals with an interest in geospatial. Below are several job openings we’ve seen in the past month.

Job Listings

There are plenty of opportunities for spatial IT professionals, but if we missed any relevant positions, please share them in the comments or contact us and we’ll be sure to include them in future job board posts.