“All teams will henceforth expose their data and functionality through service interfaces.” — Jeff Bezos, 2002
If you haven’t read Jeff Bezos’ big mandate from 2002 (as told by Steve Yegge), it’s definitely worth a read. Bezos’ decree to Amazonians shifted the internal culture towards APIs and open data. Those who didn’t follow along were swiftly fired. Fast forward to 2006: AWS launched, and today it holds the leading share of the cloud infrastructure market. Looks like the mandate steered the ship in the right direction.
I’ve been thinking more about this mandate following Coda’s launch of Packs last week. In case you missed it, Packs gives you the ability to integrate with apps you are already using today like Gmail, Slack, and Instagram.
To me, using Packs is like the first time I discovered suitcases with multi-directional wheels (why did we ever think suitcases without wheels made sense?). As we were internally testing Packs, I couldn’t see a world where you would ever want to start with a blank Coda doc…or even a blank Excel spreadsheet or Word doc. I used to think copying and pasting data from different sources was just “part of the job,” and that maybe, if you were lucky, some developer had written an integration so you could see the data you wanted in Excel or Google Sheets. My introduction to this world of seeing your data from other places started with a small Excel plugin more than 10 years ago.
It’s 2008, and I’m starting a rotation on the Engineering Finance team at Google. My job was to forecast multi-million dollar budgets for Google’s Engineering department. This seemed like a non-trivial task for a 24-year-old at the time. Back then, Oracle was the de facto standard for storing a company’s financial data, and the way we interacted with the Oracle database was through a small financial management system from the Hyperion suite.
[Image: the Hyperion interface. Source: Mindstream Analytics]
All back office financial analysts just cringed a little after seeing that screenshot.
This was supposed to be the user-friendly version of Hyperion for logging budgets and other various financial data. Since we were doing all our analysis in Excel, Oracle released a plug-in for Essbase so that you could “Lock and Send” budgets to the database from Excel.
When I first saw this functionality, I was amazed! Not only could I do my analysis and forecasting in Excel, I could push the results from Excel to this Oracle database. I could also pull data from the database right into my Excel file. But just as I became familiar with this new plugin, I saw how error-prone it was. A new financial analyst could easily screw up the budgets if they didn’t know exactly where to align the columns and cells so that the data got pushed back to the database correctly. Nonetheless, this was my first taste of manipulating a data set coming from a third-party source.
Finding Places to Eat With Your Friends
A couple of years later I started working on a side project with some friends. The project was called Monje, and the problem we were trying to solve was helping you and your friends discover restaurants in NYC that you wanted to eat at (a first-world millennial problem for urban-dwellers). Thanks to the interwebs, you can still see a version of the web app online. It was an amazing experience. We developed a Chrome extension, gathered thousands of photos via cheap labor on Elance, and got high for inspiration.
One of my fondest memories from this project was building my first-ever web scraper, for Yelp. Back in 2010, Yelp was relatively easy to scrape, along with Foursquare, Streeteasy, and, to a certain extent, Craigslist. The web scraper was pretty elementary: all it did was scrape the site for a user’s favorite restaurants and their friends’ favorites, and store it all in a database.
Yelp circa 2010. Source: Yelp Blog
Seeing the data come back from Yelp in a structured format was magical to me. I could sort and filter it using Active Record, the ORM in Rails, to get only the data that mattered to me. More importantly, I knew I could automate the whole process by writing a little script to make sure our web app had the most up-to-date information from Yelp.
While this was a great learning experience, there was something inherently hard about the whole process. I was not a developer, and worked by trial and error until I got the computer to do what I wanted with the data. A whole new set of scraping rules would have to be developed for Foursquare and Zagat, two other websites we were thinking about scraping. Finally, I had no way to deal with even basic HTML changes to Yelp’s website: if they decided to put the title of the page inside a different <div>, my whole program would fail.
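That fragility is easy to see in a toy sketch. This is a hypothetical example — the page markup and the `biz-name` class are invented for illustration, not Yelp’s actual 2010 HTML — using only Python’s standard-library `html.parser`:

```python
from html.parser import HTMLParser

# Hypothetical markup, invented for illustration -- not Yelp's real 2010 HTML.
PAGE = """
<html><body>
  <div class="biz-name">Joe's Shanghai</div>
  <div class="biz-name">Shake Shack</div>
  <div class="review">Great soup dumplings!</div>
</body></html>
"""

class FavoritesScraper(HTMLParser):
    """Collects the text inside every <div class="biz-name">."""

    def __init__(self):
        super().__init__()
        self.names = []        # restaurant names found so far
        self._in_name = False  # are we currently inside a biz-name div?

    def handle_starttag(self, tag, attrs):
        if tag == "div" and ("class", "biz-name") in attrs:
            self._in_name = True

    def handle_endtag(self, tag):
        if tag == "div":
            self._in_name = False

    def handle_data(self, data):
        if self._in_name and data.strip():
            self.names.append(data.strip())

scraper = FavoritesScraper()
scraper.feed(PAGE)
print(scraper.names)  # ["Joe's Shanghai", 'Shake Shack']
```

If the site renamed that class from `biz-name` to, say, `business-name`, the same scraper would silently return an empty list — exactly the failure mode described above.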
There is no shortage of blog posts about the rise of APIs for connecting your data or service with the outside world. If Yelp or Hyperion Essbase had easy to use APIs back then, I’m sure a developer would have built a tool to help business users like myself pull data more easily. Today, most of the applications you use online have an API to make it easy for developers to build connectors between applications. Products like Gmail, Slack, and Google Calendar all have APIs that allow you to see the data in other programs.
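By contrast, an API hands you structured data directly, so there are no scraping rules to break. A minimal sketch, assuming a made-up JSON payload shaped like what a restaurant API might return (the field names and values here are invented):

```python
import json

# A made-up JSON payload, shaped like what a restaurant API might return.
response_body = """
{
  "businesses": [
    {"name": "Joe's Shanghai", "rating": 4.5, "city": "New York"},
    {"name": "Shake Shack", "rating": 4.0, "city": "New York"},
    {"name": "In-N-Out", "rating": 4.5, "city": "Los Angeles"}
  ]
}
"""

data = json.loads(response_body)

# With structured data, filtering is one expression -- no per-site scraping rules.
nyc_favorites = [
    b["name"]
    for b in data["businesses"]
    if b["city"] == "New York" and b["rating"] >= 4.5
]
print(nyc_favorites)  # ["Joe's Shanghai"]
```

The same one-liner keeps working no matter how the provider restyles its web pages, because the contract is the JSON schema, not the HTML.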
Why would you ever want to see your e-mail or Slack messages in another application besides Gmail or Slack? Doesn’t it make sense to just use these applications for what they were meant for? I was skeptical at first, until I started pulling data from these other applications into Coda and had the ability to make changes to that data without ever leaving my doc.
What excites me most about what we are doing is the speed at which our team is moving. We are constantly releasing and testing new Packs every day, and I believe there will come a day when you can see your data from any application laid out nicely in a table in Coda.
Gone are the days of exporting to CSV and hoping your columns line up nicely.
Gone are the days where you export 10,000 more rows than you need because there’s no way to filter out the data from your application’s export feature.
Your data gets laid out nicely in a structured way and you can act on it without having to open up the other application.
This quote from Yegge’s post (referenced at the beginning) summarizes the new world we are heading into quite well:
A product is useless without a platform, or more precisely and accurately, a platform-less product will always be replaced by an equivalent platform-ized product.
I hope you get to join me on this quest to free your data.