The Ghosts of Geospatial Past and Future
My friends are getting tired of me asking, "What is the greatest innovation that has affected your geospatial workflow in the last decade?"
Well, how would you respond? The lackluster responses I received were disappointing; we've simply come to expect the status quo, with a little web magic on top. That may be an oversimplification, but contrast it with the lively discussions I have with my colleagues at OmniSci, with all the leading-edge projects they are working on, and you'd think we were in different universes.
Cell phone RF propagation analysis in OmniSci - 50 million rays rendered interactively
For this end-of-year post, I look back over the past decade or so and share thoughts on a couple of milestones I see as historic. Then I share why I believe OmniSci has tapped into the next great innovation: geospatial analytics at scale.
But first, my two picks for biggest innovations affecting geospatial in recent years.
The biggest disruptive innovation that I recall was back in 2005. Google launched its public web-based mapping platform and we wondered how far it would go. Until that time we had to use open source mapping tech stacks from OSGeo.org to build our own solutions, even for the simplest applications. O'Reilly's Where 2.0 event promised a landslide of web-based apps coming down the pipe and pointed clearly to collaborative web-based mapping as the Next Big Thing™.
While Google Maps, and web-mapping in general, didn't really solve GIS problems beyond simply rendering points and imagery in a dynamic way, it did provide a useful tool for those who just wanted to search for things in their geographic context. Similarly, content and application-specific providers like Mapbox and OpenStreetMap have come along to bring additional data and context to the web mapping applications we enjoy today.
The more recent advancement was cloud-based computing. GIS applications, in particular, had been largely focused on the desktop workstation user. By moving (largely open source) stacks to cloud service providers we were able to start having more collaborative analytic applications, not just map rendering.
The promise of cloud computing made it look like coupling large data volumes with distributed analytics was finally going to crack the code for geospatial analytics. Application service providers like HERE and platform providers like GIS Cloud and Esri's ArcGIS Online offer more of the GIS analytical tools you'd expect, but in a hosted environment.
They serve businesses that need help building and analyzing geographic information without having to manage desktop deployments. The entire data analytics industry made the gold-rush dash to the cloud while building new workflows for getting their data back out in a meaningful way.
So What's Missing?
That is my very short list of recent innovations in geospatial; I can't point to much more, until now. What I've been waiting for is a scalable spatial database that can handle my geospatial queries and support my mapping applications. This is why I joined OmniSci: to see my spatial database dreams finally come true.
As an early PostGIS user, I thought I had the problem solved; it turns out I was 10 years and 10 teraflops too early. While databases have been great for data storage and query, they have not been good at geospatial analytics and performance at scale.
Geospatial data is the original big data. We've always had more data than we could ever analyze, in ever-larger databases and over ever-wider geographic scopes. Likewise, more types of analytics are possible than ever, but the tools remain limited to smaller datasets, having never been designed with scalability in mind.
System proliferation has also been a problem, even when cloud-based: one system as a database, one for managing geospatial files, one for analyzing data, and likely another for serving up data in apps, and another for visualizing it all for end users.
Because of all the bottlenecks in running these systems together, we've missed being able to fully explore and efficiently analyze our increasingly large datasets.
Furthermore, typical analytical tooling makes poor use of modern computing resources, relying heavily on Moore's law to bring more generalized computing power to the table even as data volumes outpace it.
You'd be forgiven if you haven't heard of GPU-accelerated analytics before. Until recently, graphics processing technology was primarily used by the gaming industry. But the magic happens when you combine the hyper-parallel processing capabilities of GPUs and modern CPUs with a next-generation SQL database and a graphical rendering engine. We have partnered with NVIDIA and Intel to achieve just that.
The result has been incredibly encouraging for me as a geospatial professional:
- More data - A high-performance spatial database with the ability to process and analyze high data volumes (billions of records) in fractions of a second
- Faster rendering - Server-side rendering on a GPU that can render maps and charts instantly, that would typically take minutes or more in a batch process
- Intuitive UI - A built-in graphical interface for designing dashboards, charts, and maps
- Scalable platform - Add more nodes as workloads grow
Ultimately, OmniSci gives you the ability to look at billions of map features at the speed of your curiosity. By addressing many of the bottlenecks of traditional approaches, exploring big data through a web application, on-prem or in the cloud, is now possible – whether you're looking at raw geodata, aggregated tabular information, heatmaps, choropleths, or even filtered datasets with spatial joins.
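As a concrete sketch of what such a query looks like in practice, here is a small Python helper that composes a point-in-polygon spatial-join aggregation (the backbone of a choropleth): counting point records per polygon region. The table and column names are hypothetical; `ST_Contains` is the standard point-in-polygon predicate found in geospatial SQL dialects such as OmniSci's and PostGIS.

```python
def choropleth_query(point_table: str, poly_table: str,
                     region_col: str = "region_name") -> str:
    """Build a SQL query that counts points per polygon region.

    Joins a table of point geometries (column `location`) against a table
    of polygon geometries (column `geom`) using ST_Contains, then groups
    by region to produce one count per polygon -- exactly the aggregate a
    choropleth layer needs. All identifiers here are illustrative.
    """
    return (
        f"SELECT p.{region_col}, COUNT(*) AS n_points "
        f"FROM {point_table} AS s "
        f"JOIN {poly_table} AS p "
        f"ON ST_Contains(p.geom, s.location) "
        f"GROUP BY p.{region_col} "
        f"ORDER BY n_points DESC"
    )

# Example: count cell-signal samples per census tract (hypothetical tables).
sql = choropleth_query("signal_samples", "census_tracts")
print(sql)
```

On a columnar GPU database, a query of this shape is what lets billions of rows collapse into a per-region summary interactively rather than in a batch job.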
These capabilities are exposed through an intuitive user interface, but also as a set of APIs and a SQL database to give maximum flexibility for application developers.
See OmniSci in action in this Telco industry demo by NVIDIA at Mobile World Congress.
Your Geo-Christmas Wishlist
What are the questions you had about your data before you realized the tools couldn't keep up? Do you remember adjusting your expectations?
We are so used to sampling, slicing, and subsetting that we naturally endure many limitations. But now we can take a step back and ask questions over an entire dataset: across years instead of only the last quarter, or globally instead of within a single country.
What would you wish for?
For example, do you find you have to pre-render data just to make it usable? Check out this FOSS4G event video showing the entirety of OpenStreetMap rendered on the fly (no tiling, and over conference Wi-Fi).
Check out our Oil & Gas Demo here!
I'd love to hear if you agree that this is the next frontier of geospatial innovation, unlocking data that was previously out of reach of geo-analytic tools. We work with some of the largest companies in the world to help unwrap insights that have been previously hidden.
What problems can such a platform solve for you?
Maybe next year I'll get more answers if I ask my friends "What do you want for geo-Christmas?" I've already got a great gift to share with them. Drop me a note and tell me what's on your list for 2020.
- Billion-Row Geospatial Datasets Finally Have a Platform
- To Jupyter and Beyond: Interactive Data Science at Scale with OmniSci