Blockchain scalability for supply chains
How can we achieve a scalable blockchain-based supply chain solution without sacrificing security and efficiency? You’ll need a consensus algorithm that can quickly validate the transactions being made. In this article, Dr. Vlad Trifa explains why (and how!) a blockchain-based network designed to support global supply chains must be flexible, reliable, secure, and above all, scalable.
A network's ability to scale is one of the most important factors in its success. Scaling refers to a network's capacity to grow in usage while still maintaining its original performance and security. The challenge is particularly acute when blockchain is applied to problems in global supply chains. Capital tied up in supply chains worldwide represents two-thirds of global GDP; ensuring that a solution stays performant as its usage grows is therefore a real problem, with significant impact on both producers along the supply chain and end consumers. A blockchain-based network designed to support global supply chains must be flexible, reliable, secure, and above all, scalable.
From Sensors to AMB-NET
While Ambrosus is clearly focused on the quality assurance of food and pharmaceutical products throughout their entire life-cycle in the supply chain, it is less clear exactly how this quality assurance is achieved. After all, not everyone has the technical background to understand concepts like 'Events' and 'Metadata'.
In clear terms, we can break down what happens every time an object on the supply chain is recorded by the Ambrosus Network (called AMB-NET).
- First, the object must be monitored by some type of hardware sensor at some stage of its supply chain journey. These sensors can either come from the company in charge of the product, or from Ambrosus’ own laboratory.
- Second, once a product (e.g. a bottle of extra virgin olive oil) has been connected to a sensor (e.g. a simple temperature sensor), the data recorded by that sensor is transmitted in real time to a node on the network.
Every piece of data is recorded as either an 'Asset' or an 'Event.' An 'Asset' is a globally unique digital ID recorded on the blockchain that acts as a placeholder for data. It can refer to any unique physical or logical object in the supply chain (an individual product, a batch, a shipment, etc.).
Meanwhile, an ‘Event’ refers to the conditions of an ‘Asset’ during its journey in the supply chain. In the example above, this would pertain to the temperature of the bottle of extra virgin olive oil. All ‘Events’ are recorded by the hardware sensors accompanying the specific products throughout the supply chain.
- Third, after receiving the data from a sensor, the node prepares it for the blockchain by generating a special 'hash' that is recorded as part of the entry's 'Metadata.'
To break down some of this jargon: a hash is a fixed-length alphanumeric character string that corresponds to a particular data entry (much like a wallet address). 'Metadata', meanwhile, refers to: a) the unique ID (either an Asset ID or an Event ID) of the entity being recorded, b) the author of the data, c) the timestamp of when the data was recorded, and, most importantly, d) the hash of the original data corresponding to that asset/event.
- Fourth, all of the metadata and public data of Assets and Events are then packaged into a 'Bundle'. Importantly, each 'Bundle' can contain the metadata of up to 16,384 'Assets' or 'Events', along with their public data.
- Fifth and finally, the entire 'Bundle' is dispatched to and stored by 7 Atlas nodes; a Merkle tree of all the hashes in the Bundle is created, and its root is included in the next block validated by an Apollo node. Once validated, the Bundle – and all the data contained within it – is forever etched onto the blockchain as a secure and immutable keeper of trust for all of the data that has been recorded. In this way, the quality assurance of products can be guaranteed.
The metadata of the original bottle of extra virgin olive oil, and its corresponding temperature at a specific time, would thus be publicly accessible by anyone (or, for private data, only by selected partners such as an insurance company or government authorities).
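The pipeline above can be sketched in a few lines of Python. This is a minimal illustration, not Ambrosus code: the use of SHA-256, the canonical JSON encoding, and all field names and node IDs are assumptions made for the example.

```python
import hashlib
import json

BUNDLE_CAPACITY = 16_384  # max Assets/Events per Bundle (figure from the article)

def hash_entry(data: dict) -> str:
    """Hex digest of a canonical JSON encoding of a data entry (step three)."""
    return hashlib.sha256(json.dumps(data, sort_keys=True).encode()).hexdigest()

def make_metadata(entity_id: str, author: str, timestamp: float, data: dict) -> dict:
    """Metadata as described above: unique ID, author, timestamp, hash of the raw data."""
    return {"id": entity_id, "author": author,
            "timestamp": timestamp, "dataHash": hash_entry(data)}

def merkle_root(hashes: list[str]) -> str:
    """Pairwise-hash a list of hex digests up to a single root (step five)."""
    level = list(hashes)
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])  # duplicate the last hash when the level is odd
        level = [hashlib.sha256((a + b).encode()).hexdigest()
                 for a, b in zip(level[::2], level[1::2])]
    return level[0]

# Example: one temperature Event recorded for a bottle of olive oil
event_data = {"assetId": "olive-oil-bottle-001", "type": "temperature", "value": 18.4}
meta = make_metadata("event-0001", "hermes-node-7", 1_700_000_000.0, event_data)
bundle = [meta]  # a real Bundle could hold up to BUNDLE_CAPACITY such entries
root = merkle_root([m["dataHash"] for m in bundle])
```

Only the Merkle root needs to go into a validated block; anyone holding the original reading can recompute its hash and check it against the Bundle.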
Such a process seems pretty straightforward, especially with our simple example of one bottle of extra virgin olive oil undergoing one event. But what happens when there are hundreds of thousands, or even millions, of Assets and Events being sent to the network every day, or even every minute? How can all of the data be collected and stored securely, without slowing down or clogging the network?
The Proof of Authority consensus mechanism
One solution lies in the way the data is validated by certain types of nodes on the network. Apollo Nodes are permissioned nodes that validate all transactions put onto the Ambrosus blockchain. Importantly, the 'Proof of Authority' consensus mechanism that Ambrosus adopts gives only whitelisted Apollo Nodes the authority to validate the data to be recorded.
Every time information needs to be verified, the Apollo nodes on the network come to an agreement on whether the data submitted to the network is valid. Once a sufficient share of the nodes converge on the same result, the Apollo Nodes reach consensus and together update their ledger record by creating a new block. A new block is created every 5 seconds, and each block can contain 45-50 transactions (~10 TPS). That is clearly orders of magnitude lower than what a global-scale supply chain infrastructure would require.
AMB-NET enables scalability in such a scenario by using a network of decentralised gateways – Hermes nodes – to accumulate up to 16,384 events/assets into one Bundle, which is written to the blockchain in a single transaction. Suddenly, each time an Apollo node validates a transaction, it also validates a 'Bundle' of data containing up to 16,384 Assets/Events, totalling up to 819,200 sensor readings in a block containing 50 Bundles. The innovative notion of a 'Bundle' thus allows a substantial increase in the number of sensor transmissions recorded to the blockchain. Ultimately, this enables the Ambrosus Network to multiply the throughput of the blockchain by over 10,000 times.
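To see where the multiplier comes from, the arithmetic can be written out explicitly. The figures are the ones given in the article; the variable names are just for illustration:

```python
# Figures from the article
BUNDLE_CAPACITY = 16_384     # max Assets/Events per Bundle
BUNDLES_PER_BLOCK = 50       # transactions (Bundles) per block
BLOCK_TIME_S = 5             # a new block every 5 seconds

readings_per_block = BUNDLE_CAPACITY * BUNDLES_PER_BLOCK      # 819,200
raw_tps = BUNDLES_PER_BLOCK / BLOCK_TIME_S                    # ~10 transactions/second
readings_per_second = readings_per_block / BLOCK_TIME_S       # 163,840 readings/second
multiplier = readings_per_second / raw_tps                    # 16,384x per transaction
```

At 16,384 readings per transaction, the effective throughput comfortably exceeds a 10,000-fold increase over raw transactions per second.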
The numbers behind a scalable solution
To get an idea of the magnitude and capability of the network, it's necessary to do some math.
Based on the fact that Ambrosus’ Network usage is monitored in terms of Bundles created per day (Bundles/day), we can calculate the following about the current maximum capacity of AMB-NET:
- Transacting up to 819,200 Assets/Events per block is akin to transacting 10 Bundles per second: each Bundle holds up to 16,384 Assets/Events, so a block holds up to 50 Bundles, and one block is produced every 5 seconds.
- 10 bundles per second equals 600 bundles per minute (10 bundles times 60 seconds).
- 600 bundles per minute equals 36,000 bundles per hour (600 bundles per minute times 60 minutes in each hour).
- 36,000 bundles per hour equals 864,000 bundles per day (36,000 bundles per hour times 24 hours in one day).
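The per-day figures in the list above can be checked with a few lines (constants taken from the article):

```python
BUNDLES_PER_SECOND = 10
BUNDLE_CAPACITY = 16_384

bundles_per_minute = BUNDLES_PER_SECOND * 60         # 600
bundles_per_hour = bundles_per_minute * 60           # 36,000
bundles_per_day = bundles_per_hour * 24              # 864,000
events_per_day = bundles_per_day * BUNDLE_CAPACITY   # 14,155,776,000 (~14.16 billion)
```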
As such, with the current settings of the Ambrosus Network, the blockchain is capable of processing 864,000 Bundles per day (roughly 14.16 billion assets/events per day). By increasing the number of assets/events in a block – or by not storing individual sensor readings on AMB-NET – the number of events/assets processed is limited only by the scalability of the individual Hermes nodes and how many assets/events they can process and store each day.
Finally, coming back to the original question: what can we conclude about how a scalable blockchain-based supply chain solution can be achieved without sacrificing security and efficiency?
In short, the necessary ingredient to create a scalable blockchain-based supply chain network is a consensus algorithm that can quickly validate the transactions being made. In the case of Ambrosus, the Apollo Nodes can validate the accuracy of metadata stored within much larger bundles, and thereby drastically cut down on the number of transactions needed per second. With increased scalability, more and more consumer products can be verified and monitored every second. Such a scalable supply chain network is essential for creating an efficient and sustainable solution that consumers can depend upon.