I always ask people…
Do you really need real-time?
This question really boils down to your product's data-access patterns. If most of what you render is static lists or content, then a simple request/response transport coupled with a caching client can serve most of your needs.
This is what I love about GraphQL technologies: they are optimized for the majority use case and give you alternative methods for keeping data fresh:
- Client Cache Invalidation
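To make the request/response-plus-cache idea concrete, here is a minimal sketch of a client cache with explicit invalidation. The `QueryCache` class and `Fetcher` type are hypothetical names for illustration, not any particular GraphQL client's API:

```typescript
// A stand-in for any request/response transport (hypothetical).
type Fetcher<T> = (key: string) => T;

class QueryCache<T> {
  private store = new Map<string, T>();

  constructor(private fetchFn: Fetcher<T>) {}

  // Return the cached value, hitting the network only on a miss.
  get(key: string): T {
    if (!this.store.has(key)) {
      this.store.set(key, this.fetchFn(key));
    }
    return this.store.get(key)!;
  }

  // Invalidate so the next read refetches fresh data.
  invalidate(key: string): void {
    this.store.delete(key);
  }
}

// Usage: static content is fetched once; invalidation forces a refresh.
let calls = 0;
const cache = new QueryCache((key) => {
  calls++;
  return `content for ${key}`;
});
cache.get("home");        // fetches
cache.get("home");        // served from cache, no refetch
cache.invalidate("home");
cache.get("home");        // refetches
console.log(calls);       // 2
```

For mostly static data, this is the whole trick: reads are cheap cache hits, and freshness is handled by deciding when to invalidate rather than by holding a connection open.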
Previous Real Time Experience
I come from the Meteor community, where data access was "real-time" via a piece of technology called DDP (the Distributed Data Protocol). It allowed engineers to create a subscription on the client for pieces of data, sent over WebSockets via "live queries".
This real-time solution works great for experiences that demand real-time feature sets, and having your data refresh on screen in response to any document change is magical, but it leads to tons of waste. In this pub/sub system you would constrain the data sent over the wire via Mongo-style "projections": a client developer specifies the fields they need to render the UI, and the system reacts to any change in those underlying fields.
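A simplified sketch of this live-query-over-a-projection model might look like the following. The names (`LiveQueryStore`, `subscribe`, `set`) are hypothetical and not Meteor's or DDP's actual API; the point is that any write to a projected field re-pushes data to the subscriber:

```typescript
type Doc = Record<string, unknown>;
type Listener = (doc: Doc) => void;

class LiveQueryStore {
  private doc: Doc = {};
  private subs: { fields: string[]; listener: Listener }[] = [];

  // Subscribe with a Mongo-style projection: only these fields are watched.
  subscribe(fields: string[], listener: Listener): void {
    this.subs.push({ fields, listener });
  }

  // Any write re-notifies every subscriber whose projection includes the field.
  set(field: string, value: unknown): void {
    this.doc[field] = value;
    for (const { fields, listener } of this.subs) {
      if (fields.includes(field)) {
        const projected: Doc = {};
        for (const f of fields) projected[f] = this.doc[f];
        listener(projected);
      }
    }
  }
}

// Usage: every change to a projected field triggers a push, whether or not
// the UI actually needed that update.
const store = new LiveQueryStore();
let pushes = 0;
store.subscribe(["title", "viewCount"], () => pushes++);
store.set("title", "Hello");   // pushed
store.set("viewCount", 1);     // pushed, even if the rendered UI barely cares
store.set("unrelated", true);  // not in the projection, no push
```

This is where the waste comes from: the system cannot tell which field changes are worth a UI update, so it pushes on all of them.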
This is where it gets suboptimal. In many user interfaces, much of the data used to present information is static. So if most of your data is static, live queries incur a higher cost than they're worth.
Ideally, we only want to update the pieces of state that we, as product developers, deem necessary to keep the data and UX consistent. This is where event-based subscriptions come in.
The most popular way to do subscriptions in GraphQL today is via event-based subscriptions. Let's look at how this works.
The current state of the art in GraphQL Subscriptions centers around a topic-based publish/subscribe model. "Messages" are published to "topics" or "channels". Subscribers in these systems receive all messages to…
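The topic-based model just described can be sketched in a few lines. This `PubSub` class is a hypothetical, minimal stand-in (real GraphQL servers typically use a library-provided pub/sub engine), but it shows the core contract, in which every subscriber on a topic receives every message published to it:

```typescript
type Handler = (payload: unknown) => void;

class PubSub {
  private topics = new Map<string, Handler[]>();

  subscribe(topic: string, handler: Handler): void {
    const handlers = this.topics.get(topic) ?? [];
    handlers.push(handler);
    this.topics.set(topic, handlers);
  }

  // Deliver the payload to every subscriber on this topic.
  publish(topic: string, payload: unknown): void {
    for (const handler of this.topics.get(topic) ?? []) {
      handler(payload);
    }
  }
}

// Usage: a mutation resolver would publish a domain event like this, and
// each client subscribed to "commentAdded" receives it.
const pubsub = new PubSub();
const received: unknown[] = [];
pubsub.subscribe("commentAdded", (p) => received.push(p));
pubsub.publish("commentAdded", { postId: 1, text: "Nice post!" });
pubsub.publish("postLiked", { postId: 1 }); // no subscribers, message is dropped
```

Note the contrast with live queries: here, nothing is pushed unless the application explicitly publishes an event it considers meaningful.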