The value of zero – Olaf van Wijk
A short introduction: I am Olaf, a software engineer with experience in databases, distributed systems, edge computing, graph theory and relevant for this article: IOTA.
In relation to this, a question comes up quite often when I and others talk about IOTA: what is the value of, or even the reason for, the IOTA token? The argument usually goes: “There are no fees and transactions can be sent for free; the innovation is the Tangle, which doesn’t need the token. Value transfers can easily be done with another ledger like Bitcoin.”
In all honesty, most of the time I would even agree, though I know deep down this isn’t the whole debate. So let me try to explain.
From local to global.
When writing software you often need to store some data. Generally, this data is stored in a database, a ledger. A register where you keep records of whatever it is that you need to store for your operations. For the sake of argument, we can state that all software that is capturing ‘value’ (doing something useful) runs a ledger in one form or another.
Be it Uber, a supply chain or the grades at your school. What matters most for all these systems is the order of events. A driver gets paid after drop-off, a package gets shipped after it arrives at the dock, and a student gets a grade after they finish the exam.
Order, however, is relative to the observer. In a single-machine system the order is simple, but when two or more machines need to agree on the order of events or data they observe, things get tricky and complex very quickly. This is where consensus mechanisms come into play: mechanisms to agree on the order of events so that their interpretation becomes consistent… at reasonable speeds. Distributed ‘big data’ database systems often only work ‘out of the box’ if they run within the same datacenter and can communicate at lightning speed. These are requirements that are nigh impossible to meet in a decentralized system where you have no control over the participants or their location.
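To make the ordering problem concrete, here is a minimal sketch of Lamport logical clocks, a classic textbook mechanism for agreeing on event order between machines without a shared wall clock. This is purely an illustration of the general problem; it is not how IOTA or any blockchain orders events, and all names here are invented:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A process with a Lamport logical clock."""
    name: str
    clock: int = 0
    log: list = field(default_factory=list)

    def local_event(self, label):
        self.clock += 1
        self.log.append((self.clock, self.name, label))

    def send(self, label):
        # A tick happens on send; the timestamp travels with the message.
        self.clock += 1
        self.log.append((self.clock, self.name, label))
        return self.clock

    def receive(self, msg_clock, label):
        # Merge rule: take the max of both clocks, then tick.
        # This guarantees the receive orders after the send.
        self.clock = max(self.clock, msg_clock) + 1
        self.log.append((self.clock, self.name, label))

a, b = Node("A"), Node("B")
a.local_event("pickup")
ts = a.send("dropoff")
b.receive(ts, "payment")          # payment provably orders after dropoff
events = sorted(a.log + b.log)    # a total order consistent with causality
```

The merge rule in `receive` is the whole trick: causally related events always get increasing clock values, so the merged log can be sorted into an order everyone agrees on, without any machine knowing the “real” time.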
However, Bitcoin invented something new: the blockchain. A system where untrusted participants agree on a set of rules to determine the order of events, one block at a time, every ~10 minutes. Local precision in the ordering of events (datacenter speeds) was sacrificed to create a global order of events. No Uber, supply chain or school as the use-case, but money. The first of its kind and truly a magnificent invention.
This leads to the premise that the more local and detailed the ordering of events, the more centralized the system becomes, and the more global the ordering, the more decentralized. It is not one or the other; it is a sliding scale for almost all use-cases. With blockchain, however, we can only do global ordering, while in the real world almost all value is captured by centrally owned, ‘locally ordered’ systems called companies. To exchange their value for money, they need to expose themselves to a globally ordered system, be it banks, credit cards or a cryptocurrency. This is a hard systems disconnect; we have always done it like this, so it doesn’t seem like a problem.
…what if you could have the precision and speed of local ordering/consensus with the finality of a global one, and every degree in between?
This is exactly what IOTA offers: seamless integration between that which creates value (data) and the IOTA token to express it. Being able to instantly mix data and money in the same data structure, at whatever speed you can process it, is tremendously powerful! Oh, and of course at zero fees!
You want to act on a payment or piece of data immediately, as fast as you possibly can? Then knowing and proving order is of the utmost importance.
Imagine having a sensor measuring the temperature outside. The sensor publishes its readings onto the Tangle. You can read them and know the correct order of the readings, for free. Meanwhile, all of your neighbors attach sensors as well and start producing their own private streams of sensor data. All nice, the Tangle at its best! Free! No token required yet. The value being created here is registered in a nicely ordered data structure.

Now the municipality wants to know the average temperature in your neighborhood. Aggregation: how does the municipality know how to align the different readings? Through global ordering with local precision inside the Tangle. Because of the indirect references between the transactions inside the ledger, we can make a very strong estimation about the order of events in that area, without even knowing the data yet.

The municipality pays each sensor owner some IOTA tokens to get access to their data. Now the payment is created within the same data structure that the sensor data lives in. We KNOW the payment came after the generation of the data, and that the average calculated from it was computed after those payments. The order of events is immediately known to all participants, they can act accordingly, and there is an immediate audit log that leaves no room for interpretation.
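The idea of proving order through references can be sketched as a tiny toy DAG. Each transaction approves (references) earlier ones, and reachability through those references proves a happened-before relation. The transaction names and dictionary layout below are invented for illustration and are not real Tangle data structures:

```python
# Toy DAG: each transaction lists the earlier transactions it approves.
# (Hypothetical names; not IOTA's actual transaction format.)
tangle = {
    "reading1": [],                      # first sensor reading
    "reading2": ["reading1"],            # next reading approves the previous one
    "payment":  ["reading2"],            # payment approves the data it pays for
    "average":  ["payment", "reading1"], # aggregation approves the payment
}

def happened_before(earlier, later, dag):
    """True if `later` directly or indirectly references `earlier`."""
    stack = list(dag[later])
    seen = set()
    while stack:
        tx = stack.pop()
        if tx == earlier:
            return True
        if tx not in seen:
            seen.add(tx)
            stack.extend(dag[tx])
    return False

proof = happened_before("reading2", "payment", tangle)  # payment provably after the data
```

Because the proof is structural (a chain of references), any participant can verify it from the ledger alone, without trusting anyone’s clock, which is the property the municipality example relies on.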
Now imagine you need to pay the participants through another system, let’s say Bitcoin. First we agree on an amount to be paid, we move to Bitcoin and transact there, and once that completes I give you access to my data. Now the question arises: when did this transaction happen relative to the ordered data in the Tangle (or even a database, for that matter)? We normally use Unix timestamps for these use-cases, but they are never accurate and only describe global ordering. Even Bitcoin’s timestamping is highly inaccurate: block timestamps are only trustworthy to a precision of about two hours. For global ordering in combination with siloed systems this might be enough, but for sensors exchanging data for payments and services with other entities it is not precise enough.
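Why can’t wall-clock timestamps bridge the two systems? A small sketch with invented numbers shows the failure mode: if one machine’s clock is skewed, sorting by recorded timestamps can reverse the true order of events.

```python
# Hypothetical clock skew: machine B's clock runs 90 seconds fast, so
# sorting by recorded wall-clock timestamps reverses the true order.
events = [
    # (name, true_time, recorded_timestamp)
    ("btc_payment",   50, 50 + 90),   # stamped by machine B (clock 90s fast)
    ("data_released", 100, 100),      # stamped by machine A (accurate clock)
]

true_order    = [e[0] for e in sorted(events, key=lambda e: e[1])]
stamped_order = [e[0] for e in sorted(events, key=lambda e: e[2])]

print(true_order)     # ['btc_payment', 'data_released']
print(stamped_order)  # ['data_released', 'btc_payment'] -- timestamps disagree
```

A structural proof of order (a reference inside a shared data structure) has no such failure mode, which is the contrast this section draws between the Tangle and an external ledger glued on with timestamps.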
So if you are convinced ‘the Tangle’ is the invention, you inadvertently want to use the token at some point. If you use other tokens, coins or fiat, you are recreating the very problems the Tangle solved for you.
So the next time someone says the innovation is ‘the Tangle’ and the token is useless? Tell them that the token represents the (potential) value created by the ‘zero value’ transactions. Want to sell sensor data? You need to mix. Want to sell personal data? You need to mix. Want to add arbitrary domain-specific digital assets and make them tradable? You will need to mix.
All those zero-value transactions I have made will eventually lead to the requirement to receive payments for the services I am creating. Build the use-case first and then mix in the financial ledger. Not the other way around!
Thanks for reading!