For more than a decade, business intelligence firms, publishers, integration platform companies, and database vendors have claimed the high ground on “real-time” data, reporting, and all manner of information distribution. Companies in every business segment have enjoyed great success standing atop the real-time heap. And to be fair, most real-time solutions were certainly faster than previous-generation architectures.
Most of them are stretching the truth; what they’re actually selling is more likely to be seemingly real-time, or near-real-time.
There’s a difference between [seemingly] real-time data interchange and truly real-time performance. And while we’re putting a finer point on the definition of real-time, we might also want to consider real-time architectures that also scale. After all, it’s easy to say that your solution is real-time, but if only one person at a time can get real-time updates, the definition falls short in most business contexts.
Consider a common scenario. Your server in Los Angeles has data; your sales team, spread across 120 cities, needs that data. There are many ways to deliver critical information to those sales professionals, at a variety of speeds. Delivering sales-related data in near-real-time is generally fine until, of course, your competitor updates their sales force slightly faster.
This is the moment you realize that near-real-time is adequate but not competitive.
Information delivery performance depends largely on the shelf life of the information. Most data has a relatively moderate shelf life, but some content is worthless in less than a minute.
Information architects rarely consider the shelf life of data, but emerging requirements will transform the shelf life of information into a key competitive attribute.
We live in an increasingly real-time consumer economy; we expect information instantly. In an always-on society, fresh and informative content is considered a necessity, not a luxury. This expectation is wending its way into every aspect of business.
Until recently, near-real-time was about the best we could do. Even the most modernized integration and API architectures are incapable of moving data with truly real-time performance.
This has changed in the last year; truly real-time performance will soon be ubiquitous across many data platforms and web services.
The Promise of Real-Time Data
I recently completed a project that required rapid access to a list of 2 million companies over HTTP - a basic web app.
As is often the case - and as was initially true here - the client’s IT group had built a traditional architecture: a SQL database fronted by a RESTful integration. It performed reasonably well, but it gave users no sense of zippy performance.
I modified the app to use Firebase as the data store, which allowed me to update web pages in about a quarter of a second - all without the heavy-handed request-response architecture commonly used in modern web applications.
How is this possible?
Sockets. Instead of opening a new socket for each request/response via an API, you open a single socket and maintain it for the entirety of the user’s session with the web app.
Publish-Subscribe (and then some)
Think of a socket as a pipe your database has opened - an endpoint that clients may subscribe to for specific data elements. It’s like a wormhole for data, providing a super-highway between two points.
This architecture makes it possible for dozens, hundreds, or tens of thousands of clients to receive the same (or different) data almost instantly - with no perceptible latency.
Client applications are able to specify what flows through the pipes and the real-time database is able to convey the desired information with true real-time performance.
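That filtering behavior - each client declaring what should flow through its pipe - can be sketched with a minimal hub. The names here are hypothetical; this illustrates the publish-subscribe pattern, not Firebase’s actual API:

```javascript
// Minimal publish-subscribe hub with per-client filters, showing how many
// clients can receive the same (or different) data from a single source.
class Hub {
  constructor() {
    this.subs = [];
  }
  // Each client registers a filter describing what should flow to it.
  subscribe(filter, onData) {
    this.subs.push({ filter, onData });
  }
  // Every published item is routed to exactly the clients whose filter matches.
  publish(topic, payload) {
    for (const { filter, onData } of this.subs) {
      if (filter(topic)) onData(topic, payload);
    }
  }
}

const hub = new Hub();
const sales = [];
const ops = [];
hub.subscribe((t) => t.startsWith("sales/"), (_t, p) => sales.push(p));
hub.subscribe((t) => t.startsWith("ops/"), (_t, p) => ops.push(p));

hub.publish("sales/west", { lead: "Acme" });
hub.publish("ops/errors", { code: 500 });
// Each client sees only the topics it asked for.
```

Real-time databases apply the same idea at scale: the subscription (a path or query) is the filter, and the server fans matching changes out to every open socket.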
This is the future of real-time analytics, and it’s game-changing on both sides of the process - getting data out and getting data in.
Capturing event data
This highlights a very important aspect of sockets: they are just as good at conveying information out to applications as they are at collecting information from those same applications.
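A sketch of that two-way flow, again with invented names standing in for a real SDK: a client writes an event into the same open channel, and every subscriber - a dashboard, an analytics job - sees it the moment it lands.

```javascript
// The same always-open channel carries data both ways: one client records an
// event, and every subscriber is notified immediately. Pattern sketch only.
class EventBus {
  constructor() {
    this.listeners = [];
    this.log = []; // captured event data, as a stand-in for the datastore
  }
  onEvent(fn) {
    this.listeners.push(fn);
  }
  // A client "collects" data by writing into the channel it already holds open.
  record(event) {
    this.log.push(event);
    this.listeners.forEach((fn) => fn(event));
  }
}

const bus = new EventBus();
const dashboard = [];
bus.onEvent((e) => dashboard.push(e)); // e.g., a consultant's live dashboard

bus.record({ type: "lead", name: "J. Doe" }); // a client app writes an event
// The dashboard sees the captured event the instant it is recorded.
```

Capture and delivery are the same mechanism viewed from opposite ends of the pipe, which is why a single socket architecture serves both.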
There’s absolutely no doubt that Firebase can enhance, and in some cases revolutionize, the way we build apps in the Google ecosystem. To date, I haven’t encountered a single project where this new real-time datastore couldn’t be used to improve the performance and reliability of Google Apps Script applications.
A Production Instance
I recently worked with CLoans, Inc. to build an app that delivers search results via sockets in less than 0.25 seconds. Likewise, leads flow from the client into the backend platform in under half a second. Lending consultants see new leads on their real-time dashboard even before the search results are delivered to the prospective borrowers.
And even where the integrated sockets layer isn’t leveraged, the REST interface is blisteringly fast and able to support very large data sets moving between Google Sheets and Firebase.
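A minimal Apps Script sketch of that Sheets-to-Firebase REST path might look like the following. The database URL and secret are placeholders and the function name is mine; the Realtime Database does expose a REST interface that accepts PUT/GET against a `<path>.json` URL.

```javascript
// Hypothetical Apps Script sketch: push a sheet's rows to Firebase's
// Realtime Database over its REST interface. DB_URL and AUTH are placeholders.
const DB_URL = "https://your-project.firebaseio.com"; // placeholder
const AUTH = "YOUR_DATABASE_SECRET"; // placeholder

function syncSheetToFirebase() {
  // Read all rows from the active sheet; assume [id, name] columns with a header.
  const rows = SpreadsheetApp.getActiveSheet().getDataRange().getValues();
  const body = rows.slice(1).map(([id, name]) => ({ id, name }));

  // PUT replaces the node at /companies with the sheet's contents.
  UrlFetchApp.fetch(`${DB_URL}/companies.json?auth=${AUTH}`, {
    method: "put",
    contentType: "application/json",
    payload: JSON.stringify(body),
  });
}
```

This is the batch-oriented complement to the sockets layer: the REST call moves whole data sets on demand, while the socket keeps individual records fresh in between.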
Firebase and other truly real-time platforms such as PubNub change everything. They make it faster, simpler, and easier than ever to deliver solutions that are faster than [seemingly] fast - solutions that are instant.