Vector Tiles for All!!

What are vector tiles?

Vector tiles are packets of geographic data, packaged into predefined, roughly square “tiles” for transfer over the web. As with the widely used raster tiled web maps, map data is requested by a client as a set of tiles corresponding to square areas of land of a predefined size and location. Unlike raster tiled web maps, however, the server returns vector map data, clipped to the boundaries of each tile, instead of a pre-rendered map image.
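
For readers new to tiled maps, here is a minimal sketch of the standard “slippy map” addressing scheme that raster and vector tiles share: given a longitude, latitude, and zoom level, it computes which tile a client would request. The function name and example coordinates are ours, for illustration only.

```javascript
// Which z/x/y tile covers a given point? Standard Web Mercator tile math,
// shared by raster and vector tile schemes alike.
function lonLatToTile(lon, lat, zoom) {
  const n = Math.pow(2, zoom); // number of tiles per side at this zoom
  const x = Math.floor(((lon + 180) / 360) * n);
  const latRad = (lat * Math.PI) / 180;
  const y = Math.floor(
    ((1 - Math.log(Math.tan(latRad) + 1 / Math.cos(latRad)) / Math.PI) / 2) * n
  );
  return { zoom, x, y };
}

// e.g. the tile covering downtown Portland at zoom 12,
// which a client might fetch as /12/<x>/<y>.pbf
console.log(lonLatToTile(-122.68, 45.52, 12));
```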

Why should one use vector tiles?

Compared to an un-tiled vector map, data transfer is reduced because only the data within the current view, at the current zoom level, needs to be transferred. Vector tiles are also faster to download than a tiled raster map, as vector data is typically much smaller than a rendered bitmap.

Additionally, with a tool such as Mapbox GL JS, styling can be applied later in the process, or even in the browser itself, allowing much greater flexibility in how data is presented. It is also easy to make map features interactive, as their vector representation already exists within the client.
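
As a small illustration, restyling in the browser can be a couple of Mapbox GL JS calls at runtime. This sketch assumes an already-initialized map; the layer id and the 'height' attribute are hypothetical.

```javascript
// Repaint and refilter a vector layer on the fly; no new tiles are fetched,
// because the client already holds the vector data.
map.setPaintProperty('building-footprints', 'fill-color', '#d95f0e');
map.setFilter('building-footprints', ['>', 'height', 20]); // only taller buildings
```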

This is an example of a building footprint layer (from Oregon Metro) presented as vector tiles using Mapbox GL JS. The tiles are served quickly to the client, and with Mapbox GL JS the data can be rendered in a variety of ways. Here it’s being shown in 3D.
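
Here is a sketch of how such a 3D rendering can be wired up with Mapbox GL JS. The tile URL, source-layer name, and 'height' attribute are placeholders, not Metro's actual service.

```javascript
mapboxgl.accessToken = '<your access token>'; // required for mapbox:// styles

const map = new mapboxgl.Map({
  container: 'map',
  style: 'mapbox://styles/mapbox/light-v9',
  center: [-122.68, 45.52],
  zoom: 15,
  pitch: 60 // tilt the camera so extrusions read as 3D
});

map.on('load', () => {
  // A vector tile source; the endpoint here is hypothetical.
  map.addSource('footprints', {
    type: 'vector',
    tiles: ['https://tiles.example.com/buildings/{z}/{x}/{y}.pbf']
  });
  // Extrude each footprint by its (assumed) 'height' attribute.
  map.addLayer({
    id: 'footprints-3d',
    type: 'fill-extrusion',
    source: 'footprints',
    'source-layer': 'buildings',
    paint: {
      'fill-extrusion-color': '#aaa',
      'fill-extrusion-height': ['get', 'height'],
      'fill-extrusion-opacity': 0.85
    }
  });
});
```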

How do we use them?

Our crack team of developers has deployed a vector tile server that allows us to serve MBTiles raster files, MBTiles vector files, and ESRI bundled cache files. At this point, we are using it for most of our client-hosted applications. It’s a faster solution than anything we’ve used in the past, and it allows our dev team to create tools and functions in our JS-based Map Viewer that take advantage of the flexibility of the tiles.

We like to share

We submitted the code to GitHub for anyone to deploy. At the core of the MBTiles server are dual instances of a Node.js script that we customized and offered to the public. Everything Windows users need to get this up and running is included in the repository: the MBTiles server, the .NET reverse proxy, and the service-creation scripts.
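
To give a feel for what the core of such a server does, here is a minimal, self-contained sketch of answering a /z/x/y request from an MBTiles file. It uses the 'sqlite3' npm package and a local 'tiles.mbtiles' path as assumptions; this is an illustration, not the code in our repository.

```javascript
// An MBTiles file is just a SQLite database with a 'tiles' table.
const http = require('http');
const sqlite3 = require('sqlite3');

const db = new sqlite3.Database('tiles.mbtiles', sqlite3.OPEN_READONLY);

http.createServer((req, res) => {
  const m = req.url.match(/^\/(\d+)\/(\d+)\/(\d+)\.pbf$/);
  if (!m) { res.writeHead(404); return res.end(); }
  const [z, x, y] = m.slice(1).map(Number);
  const tmsY = Math.pow(2, z) - 1 - y; // MBTiles stores rows in flipped (TMS) order
  db.get(
    'SELECT tile_data FROM tiles WHERE zoom_level=? AND tile_column=? AND tile_row=?',
    [z, x, tmsY],
    (err, row) => {
      if (err || !row) { res.writeHead(204); return res.end(); } // empty tile
      res.writeHead(200, {
        'Content-Type': 'application/x-protobuf',
        'Content-Encoding': 'gzip' // vector tiles in MBTiles are typically gzipped
      });
      res.end(row.tile_data);
    }
  );
}).listen(8080);
```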

Once you’ve installed the MBTiles server, starting a web service is as simple as copying either raster or vector MBTiles files into the MBTiles cache directory. Setting up an ESRI bundled cache service is just as simple: copy the ArcGIS Server cache directory to your MBTiles cache directory and you’re done!

Let us know what you think, or if you have any questions!

Links

The Gartrell Group MBTiles server 

Script to install Windows services

The script we forked to create the Windows services script


We're growing!

We’re happy to report the addition of a new team member! Drew Seminara has recently joined our motley crew of developers.

Drew brings with him a great deal of experience in all things location. He has a Master’s degree in Atmospheric Science, with an emphasis on remote sensing. He spent several years working as a Geospatial Analyst for the NOAA Environmental Cooperative Science Center, an EPP-funded cooperative agreement in Florida, where he analyzed hyperspectral aerial imagery and multispectral satellite imagery.

Eventually, he decided to move to Oregon to pursue the programming side of geospatial science. After an intense programming bootcamp, Drew gained the chops he needed to start developing the types of applications he had been using as a geospatial analyst. He’s spent the past few years honing those skills at Ecotrust, a local non-profit.


He’s now on our team, and we couldn’t be happier. Having someone develop our applications who also has a deep understanding of their real-world uses is a key element in our ability to provide our clients with exactly what they need.

Bringing Drew on board has already allowed our codebase to grow, letting us push out even more updates and functionality to our customers.

Contact us if you’d like to be one of our happy customers!


A New Database Platform for Metro

Big changes!

We have recently completed a project for our friends at Metro, the Portland area’s regional planning agency, where we assisted them in “re-platforming” their major databases from Oracle to SQL Server. We seem to be making a habit of working with Metro when they are making big changes!

Big risks!

The Oracle databases were integral to many internal and public-facing web applications, as well as data-processing routines and desktop analysis tools used throughout Metro and by partner agencies. In other words, there was a lot on the line! As you might imagine, there was a fairly high degree of anxiety, not about any particular issue so much as about the sheer breadth of connections and dependencies that needed to be planned out, migrated, tested, and put into production.

Did we mention that we had a little over a month to plan out the approach and implement it?

Luckily for us, the Metro team was a pleasure to work with: diligent, detail-oriented, and mission-focused. They had done a great job of pre-planning even before we began our work. Our planning-phase tasks involved analyzing and assessing the legacy databases, including identifying system dependencies and prioritizing the migration of databases and applications accordingly.

Big rewards!

When we’d made it through the library of planning-phase tasks, sub-tasks, and sub-sub-tasks, the moment for the cutover was upon us. People from our combined teams hunkered down for a long night and proceeded step by step through the final maneuvers.

It all turned out to be a bit anticlimactic. Our teams were able to head home well ahead of schedule that evening, with no need to initiate any of the much-discussed triage activities.

We like to think that wasn’t all luck.

With the migration complete, Metro (and everyone who depends on those databases) can continue with business as usual, confident that their systems will work as expected.

Some of the interesting tidbits

  • We used Microsoft’s SQL Azure Migration Wizard, which added efficiency and produced interesting (if unverified) metrics on how much time we saved compared to a more manual approach.
  • We integrated benchmarking processes into the migration steps so we could progressively analyze performance changes as we adapted applications and code to the new platform (see the sketch after this list).
  • We load-tested the new system, throttling up to see how well it would stand the heat of simultaneous users as well as processor-intensive operations. Identifying and resolving potential bottlenecks ahead of implementation certainly helps with performance anxiety, of a certain kind.
  • When you’re in the middle of big change, it’s a good opportunity to…make changes. We worked together on a lot of housekeeping, deferred maintenance, and updates.
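
For the curious, here is a minimal sketch of the kind of query benchmarking we folded into the migration steps. The 'mssql' npm package, the table names, and the connection details are assumptions for illustration, not Metro's actual harness.

```javascript
// Time a fixed suite of queries so runs before and after migration can be compared.
const sql = require('mssql');

const queries = [
  'SELECT COUNT(*) AS n FROM taxlots',                      // hypothetical table
  'SELECT TOP 100 * FROM permits ORDER BY issued_date DESC' // hypothetical table
];

async function benchmark(config) {
  const pool = await sql.connect(config);
  for (const q of queries) {
    const start = process.hrtime.bigint();
    await pool.request().query(q);
    const ms = Number(process.hrtime.bigint() - start) / 1e6;
    console.log(`${ms.toFixed(1)} ms  ${q}`);
  }
  await pool.close();
}

// Run the same suite against the legacy and new platforms and diff the logs.
benchmark({
  user: 'bench',
  password: '<password>',
  server: 'sqlserver.example',
  database: 'metrodb'
}).catch(console.error);
```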