Why the Data Economy is Subsuming the API Economy
Working together as a team requires collaboration and communication. Humans have several communication mediums – talking, writing, making gestures, taking pictures, creating videos. When computers work together, how do they communicate? Of course using bits and bytes, but just like humans, computers transmit information in many mediums – files, APIs, streams, events etc.
As every aspect of work becomes digital, be it operations, sales, marketing, logistics, or product, computers increasingly need to work together. A regular purchase has computers collaborating from taking orders on an app or a website, to triggering warehouse logistics, printing shipping labels, optimizing shipping routes, tracking packages, and even delivering goods. The bits and bytes flowing across computers are essentially data. These bits can be information, such as “List of available products”, or an instruction, such as “Print a shipping label”, or both.
How the data is delivered is important, but no one method (e.g. an API) is universally better or worse than another (e.g. a file). Just as we may choose to communicate through pictures, video, gestures, or words depending on the situation, the choice of delivery medium between computers also depends on the need. The belief that APIs are the only good delivery mechanism is mistaken.
Like any consideration in computer science (and life, for that matter), the choices are many, and there is almost always a tradeoff between speed and quantity.
The essential building block of an API is a network connection between two computers. As network connections have gotten faster and have gained capacity, APIs have become a preferred mechanism for a broad range of computer communication and data movement.
Every API call has two basic elements:
- Request: Computer A sends a request to Computer B. This essentially includes two things:
  - Authentication: tells Computer B that it can trust Computer A.
  - Data: the data itself is optional. For example, when we send a web request to “www.nytimes.com” we don’t explicitly send any data; the Nytimes computer knows it needs to send us the news. In most cases, however, data is sent (either explicitly or implicitly) and may contain:
    - An instruction that tells Computer B what to do. For example, print a shipping label.
    - Information for Computer B. For example, which shipping address to put on the label.
- Response: This is what Computer B sends back to Computer A after processing the request:
  - Status: whether the request executed successfully or failed.
  - Data: again, data is optional. But if the request was “Send me a list of all products in the store”, then the response will contain a list of product names.
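The elements above can be sketched in a few lines of Python. Everything here, from the token value to the instruction names to the handler itself, is illustrative, not a real API:

```python
# Illustrative sketch of an API call's two elements (all names are made up).
API_TOKENS = {"computer-a-secret"}  # tokens that Computer B trusts

def handle_request(request):
    """Computer B: process a request, return a response (status + data)."""
    # Authentication: can Computer B trust Computer A?
    if request.get("auth") not in API_TOKENS:
        return {"status": "failed", "error": "unauthorized"}

    # Data: an instruction, plus the information needed to act on it.
    instruction = request.get("instruction")
    if instruction == "list_products":
        return {"status": "success", "data": ["widget", "gadget"]}
    if instruction == "print_label":
        address = request.get("information", {}).get("address", "unknown")
        return {"status": "success", "data": "label printed for " + address}
    return {"status": "failed", "error": "unknown instruction"}

# Computer A sends a request; Computer B answers with status and data.
response = handle_request({"auth": "computer-a-secret",
                           "instruction": "list_products"})
```

In a real system the request and response would travel over a network connection (typically HTTP), but the anatomy is the same: authentication, an optional instruction, optional information, and a response carrying status and data.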
Service and Data APIs
Technically, all APIs follow the same request/response pattern, but for the sake of classification we can call APIs that mainly carry instructions “service APIs”, and APIs that are primarily meant to deliver data in their response “data APIs”.
Data is What Really Matters
Communication between computers is an essential part of the digital economy. So is it an API economy then? No. The economy is built on communication between computers: a good chunk of that communication is instructions, and increasingly it is data, but the delivery mechanism, while important, is not fixed.
APIs are great, but they give you data only when you ask for it, and they can carry only so much data per request. That is why the overall communication between computers uses a variety of mechanisms. Here is a small sample of the diverse ways data gets integrated in modern business ecosystems:
- Data Exchanges are growing rapidly in popularity, not only because of technological advances but also because organizations are realizing the importance of sharing data across their ecosystem of suppliers, partners, and customers. Across sectors, companies are engaging more and more with data ecosystems.
- EDI (Electronic Data Interchange) has become the standard for rapidly sharing data between entities in supply chain and logistics. Even though it has been in use for a few decades, it is finding new life today in making supply chains more automated. For example, with an EDI process, a procurement system can automatically generate an EDI-formatted purchase order (PO) when an inventory threshold is crossed; the sales order system then receives the PO, ships the item, and tracks it continuously on its way to the warehouse.
- APIs have become the de-facto solution for data transfer in many cases, largely because of how simple they are to use: define an endpoint and call it over a protocol such as HTTP, typically following a RESTful style. APIs serve the purpose of data transfer but create complexity when working with multiple data channels and users.
- Emails are a quick way of sending data directly to the recipient in the form of attachments, but they often run into attachment file size limits.
- FTP is one of the fastest ways to transfer files, and businesses can implement it without complicated deployment requirements. FTP remains popular for large file transfers because of its speed and workflow features: transfers can be scheduled and even resumed after a pause. For example, when it comes to transferring inventory data from stores to delivery services, this is still the battle-hardened way.
- Stream data is continuously generated by events occurring in operations, such as clickstreams and market data. The data is analyzed continuously in real time to give insights into service usage and customer interactions before it is sent to a data store for archival purposes.
- IoT applications tend to use the MQTT or DDS protocol, depending on whether they need a hub-and-spoke or a decentralized architecture. Sometimes the data has to be centralized for collection and analysis, where MQTT works well, and sometimes it has to be processed in a more distributed manner, as is the case with DDS.
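As a rough illustration of the EDI flow described above, here is a toy Python sketch that emits a drastically simplified X12-850-style purchase order when inventory crosses a threshold. The segment names (ST, BEG, PO1, SE) are real X12 concepts, but the output here is not a valid, standards-complete interchange, and all values (SKU, PO number, threshold) are made up:

```python
REORDER_THRESHOLD = 10  # illustrative reorder point

def maybe_generate_po(sku, on_hand, reorder_qty):
    """Emit a toy X12-850-style PO string when stock runs low, else None."""
    if on_hand >= REORDER_THRESHOLD:
        return None  # inventory is fine; no PO needed
    segments = [
        "ST*850*0001",                               # transaction set header (850 = PO)
        "BEG*00*NE*PO12345**20240101",               # beginning segment for the PO
        "PO1*1*%d*EA***VP*%s" % (reorder_qty, sku),  # line item: qty, unit, vendor part
        "SE*4*0001",                                 # transaction set trailer
    ]
    return "~".join(segments) + "~"

# Stock dropped to 3 units, below the threshold, so a PO is generated.
po = maybe_generate_po("WIDGET-01", on_hand=3, reorder_qty=50)
```

A real implementation would wrap this transaction set in interchange and group envelopes (ISA/GS) and hand it to an EDI translator, but the trigger-on-threshold pattern is the same.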
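To contrast stream-style delivery with request/response, here is a minimal Python sketch in which events are consumed continuously as they arrive. The event source and the running aggregate are illustrative stand-ins for real infrastructure such as a Kafka consumer or an MQTT subscriber:

```python
def clickstream():
    """Simulated continuous event source (stands in for a real stream)."""
    for i, page in enumerate(["home", "product", "cart", "checkout", "home"]):
        yield {"event_id": i, "page": page}

def analyze(events):
    """Consume events one at a time, keeping a running aggregate."""
    counts = {}
    for event in events:
        counts[event["page"]] = counts.get(event["page"], 0) + 1
        # a real-time insight hook would go here (alerting, dashboards, ...)
    return counts

page_counts = analyze(clickstream())
```

The key difference from an API call is that the consumer never "asks" for data; it simply processes whatever the stream produces, as it is produced.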
It’s a Data Economy After All
Ultimately, what matters is that data gets delivered safely, reliably, and in a timely manner. The exact mechanism will depend on the use case. And while APIs hold an important role, the other methods mentioned above are also very important, especially as new use cases continue to emerge in an enterprise. Data becomes much more important than any particular integration style such as APIs. In other words, the Data Economy becomes a superset of the API Economy. It is the data that ultimately holds true economic value, while the various delivery mechanisms play an essential but invisible role in the background.
What Should You Do?
Think about all the ways in which the focus on data versus delivery will impact your business. Here are three questions to ask at different levels:
- Strategy level: Are you thinking of data from files, databases, APIs, or streams as different tech stacks and different product strategies, or as different parts of a larger whole?
- Architecture level: Next time you are thinking of building an API or buying an API tool, ask this question: “How well will this solution adapt in coming years as my data volume or velocity needs change?”
- Execution level: Is this problem a data problem or a delivery problem?
The Nexla Solution
At Nexla, we have centered our focus on data, with the understanding that the right data needs to be paired with the right delivery mechanism for each use case. That is why the Nexla DataOps platform supports multi-modal data delivery, with a mechanism that automatically determines the right way to deliver data: streaming, files, APIs (push or pull), or a data exchange.
Here are six features in the Nexla DataOps platform that converge APIs into regular data flows:
- API Data Integration: Use APIs as a data source, delivering data into a warehouse, data lake, or stream. E.g. bringing customer support data into Redshift
- Reverse ETL: Nexla delivers data from any source to API systems. E.g. creating audience segments in advertising platforms from user data
- API webhook: Generate a webhook in Nexla to receive real-time data events. E.g. Receiving product or inventory events from Shopify
- No-code Data API generation: Generate and design data API endpoints from any data store.
- API Proxy: A single API interface encapsulating payload, authentication, and response mapping across multiple diverse API endpoints.
- API as Data Product: Auto-generated Data Products from APIs bring APIs within the fold of all data use cases.
For more information or questions, check out our website www.nexla.com, send us a message, or email me directly at saket at nexla.com.
Unify your data operations today!
Discover how Nexla’s powerful data operations can put an end to your data challenges with our free demo.