Must-watch: 500 Billion Reasons why IOTA

Category: Explaining Series

IOTA: Infinite use-cases in a zero margin society and collaborative economy

The Internet of Things is the future, but its real character, its disruptive and transformational nature is not easy to grasp.

Needless to say, IOTA’s use-cases are rarely understood in their entirety.

Time to clear up some misconceptions. This post is inspired by Jeremy Rifkin.

The IoT is not primarily about wearables, smart fridges, smart-hamsters and the toilet that orders new paper. These popular examples are often cited by the media in order to describe the matter at hand to the new reader.

But the truth is that the focus of the IoT will be a different one. The fourth (or third, if you count out the internet) industrial revolution will make life easier, more efficient, safer and fairer. Global progress.

The future is about reduction and improvement.

Reduction of fees, a smaller ecological footprint, improvement of equality, also fewer barriers for society, better healthcare, less corruption, the list goes on and on.

This also applies to almost every other DLT, so one of the questions is: what makes IOTA and its applications unique?

In the end, it’s all about the zero margin society and sharing economy.



It is an economic landscape in which shared services and infrastructures, arising from this industrial transformation, form the basic layers.

The paradigm shift from a throw-away mentality to a sharing economy has already begun: car sharing, Airbnb, and the shift from fossil fuels to renewable energy that can be shared between neighbors.

The scope of applications has yet to be determined especially because most of the new use-cases are without existing reference.

Everything is new, everything is an uncharted area.

In order to specify possible use-cases, I’m going to list a few in their respective areas.

The transformations of technology and society are inevitable and already happening.

The biggest question is: will this industrial and social transformation succeed in a way that maintains or even raises employment levels and the standard of living?

A second issue is: can we deliver all these solutions before we reach an economic breakdown and an environmental point of no return?


Machine Economy

When machines act as customers, providers, and sellers, countless possibilities open up for the future.

Wherever a machine creates goods or services, it can sell them via microtransactions. That includes manufactured products, data, streaming services and digital rights (such as video or audio), electricity, connection access, and services such as transportation or maintenance.

Everything that relies on an expensive middleman today can be automated and distributed in the future. Centralisation will be one of tomorrow’s losers.

Vending machines, Air BnB, e-Meters, gas meters, electrical vehicle charging.

Autonomous vehicles will populate every urban area. These vehicles can be equipped with a wallet, and, to make them truly autonomous, they can even own themselves.

That is one of many zero-margin models that experts like Jeremy Rifkin sketch for the future.

The possibilities are truly endless.

Use-cases: industry, domestic applications, gastronomy, electric vehicles, payment models, pay-on-demand, infrastructure
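The machine-economy pattern above, a machine with its own wallet selling a service per use, can be sketched in a few lines. `MachineWallet` and `charge_vehicle` are hypothetical names for illustration, not the real IOTA API:

```python
# Toy sketch of a machine-to-machine micropayment. Because transfers
# carry no fee, the receiver gets exactly what the sender pays.

class MachineWallet:
    def __init__(self, owner: str, balance: int = 0):
        self.owner = owner
        self.balance = balance  # balance in iotas (smallest unit)

    def pay(self, other: "MachineWallet", amount: int) -> None:
        if amount > self.balance:
            raise ValueError("insufficient funds")
        # zero-fee transfer: no middleman takes a cut
        self.balance -= amount
        other.balance += amount

# a charging station that owns itself, and an autonomous car:
station = MachineWallet("charging-station", balance=0)
car = MachineWallet("autonomous-car", balance=1_000)

def charge_vehicle(car, station, kwh: int, price_per_kwh: int = 10):
    """The car pays the station per kWh in one microtransaction."""
    car.pay(station, kwh * price_per_kwh)

charge_vehicle(car, station, kwh=5)
print(car.balance, station.balance)  # 950 50
```

The point of the sketch is the absence of a fee line: with a zero-margin transfer, per-use pricing works even when a single payment is worth a fraction of a cent.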

Sensor Data

Data is the new oil, the new gold.

Whereas today many data packages are sent to a cloud thousands of kilometers away, in the future edge, fog, and mist computing will create new ways for industry to improve its systems and to sell data streams that are produced and processed in real time.

The sharing economy enables business opportunities, where data of any kind are the product.

IOTA’s data marketplace can therefore become a cornerstone for revenue, research, and the sharing of personal data that belongs to the people.

The GDPR is a first step in Europe to ensure that customers’ digital rights are always respected and that big companies comply with these laws. IOTA can be the solution for every single use-case.

Use-cases: sell/buy data, own your own data, monetize transport data, environmental data for research, customer behavior, biological indicators in healthcare, pollution in protected areas.

Legal Applications

Why use handwritten signatures when you have tamper-proof, unique transactions and hash values that can be as binding as legal contracts?

The advantage is that the decentralized storage of these signatures ensures that only the appropriate owners and relevant contract parties have access.

With this ability, you can ensure data protection, reduce barriers, and protect rights of owners.

As seen with NetObjects’ newest use-case, IOTA makes it possible to create a fair solution for owners of media of all kinds.

Digital rights are a big field with billions in turnover each month. Smart contracts are a technological revolution that enables all kinds of transparent yet fair solutions for hundreds of use-cases.

Use-cases: inheritance, insurance, digital rights, access rights according to age or position in a company.

Data Integrity

There is virtually no use-case here where data integrity is irrelevant.

Every single data stream needs to be protected from third parties.

When customers lose their pseudonymity and become transparent, the data they produce can be misused in many ways.

When big data companies lose control over their data, they lose their basic business model.

Data integrity is, therefore, the basic layer that must be guaranteed at all times, everywhere.

IOTA’s Masked Authenticated Messaging (MAM) will be the future solution for protecting the world of information in every application.

Use-cases: every time sensitive data is transferred; applicable in almost every single use-case.
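MAM’s real design (Merkle-tree signatures published on the Tangle) is beyond this post, but the core idea, a hash-linked stream of messages that subscribers holding the channel key can authenticate, can be sketched. This is an assumption-laden illustration of the concept, not the actual MAM protocol:

```python
import hashlib
import hmac
import os

# Sketch of an authenticated message stream: each entry carries a MAC
# and a hash linking it to its predecessor, so tampering or reordering
# is detectable by anyone who knows the channel key.

def publish(channel_key: bytes, messages):
    stream, prev_hash = [], b"\x00" * 32
    for msg in messages:
        payload = prev_hash + msg
        tag = hmac.new(channel_key, payload, hashlib.sha256).digest()
        stream.append((msg, prev_hash, tag))
        prev_hash = hashlib.sha256(payload).digest()
    return stream

def verify(channel_key: bytes, stream) -> bool:
    prev_hash = b"\x00" * 32
    for msg, linked, tag in stream:
        if linked != prev_hash:
            return False  # broken or reordered chain
        payload = prev_hash + msg
        expected = hmac.new(channel_key, payload, hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            return False  # tampered message
        prev_hash = hashlib.sha256(payload).digest()
    return True

key = os.urandom(32)
s = publish(key, [b"temp=21.5", b"temp=21.7"])
print(verify(key, s))  # True
```

Changing a single message byte, or swapping two messages, makes `verify` return `False`, which is the integrity property the section above is about.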


The cure for the evils of democracy is more democracy. (H. L. Mencken)

Elections, the structure of organizations, the right of participation.

Democracy and governmental structures should be a basic right and every human should have access to it.

Sadly, corruption and interference by the most powerful often reduce democracy to a hollow construct.

People do not feel that they hold the power that belongs to them.

In the future, DLTs can and should be used to protect elections from external interference.

One transaction could be one vote. This emancipation of the people from the big players in the world is an important step toward stopping corruption and inequality.

Transparent and credible governance ensures the people’s trust in their political representatives.

That also applies to all other elections in organizations and companies.

Use-cases: elections, countering fake news, emancipation, power to the people, verifying that a source of information is legitimate.

Sharing Economy

Given that collaborative efforts reduce the costs of energy, transportation, and basic infrastructure, the economy can transform into a zero-marginal-cost society: the sharing economy.

The combination of all other use-cases enables a world where local economic clusters are distributed all over the world.

Energy will be produced without fossil fuels and will be collected, processed and sold in neighborhood efforts, so that the energy markets become distributed and open to everyone.

Fewer central authorities and companies will dictate daily life, as people can act and trade directly with one another.

The picture I personally imagine for a city in 15 years is that we need no cash, that most systems, including shops and services, run around the clock, and that mostly autonomous machines handle the supply and maintenance of almost everything.

We have access to a free infrastructure of electricity, connection, communication and basic supply of groceries, healthcare, and education.

IOTA could become the distributed layer for data and value transactions. With its approach to data integrity, scalability and zero transaction fees, it’s designed to work as the backbone of the IoT, in almost every imaginable use-case.

Fees or missing scalability would render the majority of use-cases impossible, as the IoT will be a global network of billions or trillions of devices. No other open-source project currently delivers the necessary characteristics.

The following outstanding speech by Jeremy Rifkin outlines the IoT and its implications.


Controversial IOTA. Nine heavy questions answered.

Cryptoland is innovative, interesting, filled with opportunities, drama, unbelievable stories and fraud. We love it.

The ongoing list of scam attempts, hacks, thefts, and inside jobs, though, is part of daily life, both entertaining and hair-raising.

Decentralized systems create opportunities for bad actors that one can hardly imagine in the offline world, and yet we all believe in them.

When legitimate companies decide to collaborate with one of the ~1500 crypto-projects out there, it’s a good sign that things are going in the right direction. But to be honest: most of these companies are still learning about distributed ledgers, so they are merely a weak positive indicator for answering the question: is my money safe?

There are too many cases in which such collaborations ended in a fiasco and the devs ran away with the funds. Since bleeding-edge tech is rarely audited and mostly unregulated, we have no bulletproof insurance that the people involved are good-hearted and acting in the name of progress.

You’re reading my blog, and I write about IOTA as I perceive it, so naturally, things are drawn in a positive light here.

The reason is simple: I believed in the vision before I invested, and now I give them my full support, which, of course, means nothing to smart outside investors.

Due diligence is the most valuable currency of traders and investors, and also for companies that decide to give IOTA a chance.

The last weeks, however, have shown that there is nothing to fear and that I obviously backed the right horse. What a year!

Time to sum up the latest evidence that IOTA is a legitimate project that has the potential to change everything.

IOTA works differently from blockchains. In my view, it’s largely a matter of habit that a lot of people doubt a technology they don’t know yet.

Therefore, I’d like to sum up a few spicy questions that have been going around for quite some time.

  1. What about the alleged cryptographic vulnerability in the IOTA signing algorithm, the “MIT”-scientists wrote about?
  2. I heard negative things about the coordinator, what is it?
  3. What about the incentive to run a full node? Does the network find enough supporters?
  4. Is IOTA truly free to use?
  5. What about the bad usability of the wallet?
  6. Why is IOTA still not listed on Bittrex, Poloniex, and Kraken?
  7. There are reports that people have been hacked. What about that?
  8. I waited 20 hours for a withdrawal from an exchange. Is this IOTA’s fault?
  9. If companies and projects don’t need tokens for data-transfer, how does the token gain value?


1. What about the alleged cryptographic vulnerability in the IOTA signing algorithm that the “MIT” scientists wrote about?


What a great show! The headlines on Forbes and several other magazines alone were incredibly catchy and certainly held back countless companies and investments.

That the alleged vulnerability was never in effect, that no funds were ever at risk, that the testing conditions were ludicrous, and that everyone involved acted with a conflict of interest was not part of that article by Amy Castor.

But in mine:

Since then, a second article from the Digital Currency Initiative of the MIT Media Lab left many believers speechless, as the allegations were clouded in unsubstantiated criticism.

The IOTA foundation then decided to respond to the matter at hand, including a list of conflicts of interest.

As part of a natural decision process, you should read all parts in order to draw a conclusion.

Part 1

Part 2

Part 3

Part 4

The not-so-responsible “responsible disclosure” was led by Neha Narula of the MIT Media Lab. I have since received several direct messages from Harvard and MIT confirming that my research was accurate.

As for the future: the IOTA foundation decided to hire independent professionals to work on the mentioned signing algorithm, in order to legitimize their efforts and to develop a working, smart solution.

Please read the blog-post of IOTA to understand how serious the efforts of the IOTA foundation are.

To draw my conclusion:

- IOTA is working as intended.

- There is no vulnerability; the funds of investors were and are 100% safe at all times.

- Independent cryptographers, in addition to the competent IOTA team, are working on the maintenance and security of the Tangle.

- Competing projects try to slow down IOTA’s progress.

- To form an unbiased opinion, investors should take their time and review all the evidence to understand that several forces are working in combined efforts to harm IOTA.

- Up to this day, and after repeated requests from the IOTA team, the DCI team has still not publicly released any exploit code.



2. I heard negative things about the coordinator, what is it?

Maybe the most controversially discussed topic in the IOTA ecosystem.

A node in the possession of the IOTA foundation that sets milestones and is used to take people’s money away?

Bollocks. First of all, read the transparency compendium by the IOTA foundation.

The coordinator is a set of training wheels for the juvenile network, and its sole task is to set milestones as a Sybil-attack protection, nothing else. Also, if people want to code referencing nodes themselves, they can, because the knowledge is out there; people just have not done it yet, hence the importance of the “coordinator” security measure.

Needless to say: the foundation, which is officially registered in Germany as a non-profit organization under German law, will not exploit it. Doing so would end their entire venture, which makes zero sense.

Luckily, we humans sometimes have a brain and can handle imperfect situations for the greater good.

So here is my personal opinion as someone who has been part of the IOTA community for a long time: I do trust these people. I have known them for two years, and they are nothing but progressive and eager, so I’d rather choose them over a world where people just try to exploit everything they can, including temporary solutions that aim for security. Because it is necessary.

In a few months, when the hash power is big enough, the training wheels can be taken off step by step, and bad actors will no longer be able to attack the network with a Sybil attack.

That is a trade I’m willing to make, as someone who dislikes an inefficient Bitcoin et al., the waste of mining electricity, and the hypocrisy of the whole cryptoland, because what matters is innovation, not a buzzword like “decentralization” that will soon matter less.


3. What is the incentive to run a full node? Does the network find enough supporters?

It’s been a while since I wrote my article about the incentives. Many things have happened since then.

The number of full nodes has steadily increased; estimates put it at approximately 5,000, but there is as yet no way to pinpoint the exact number.

With the help of Zoran and the audience of my Sonntagsplausch (Sunday banter), we managed to add roughly 400 full nodes to the existing base.

The people obviously believe in the technology and are willing to spend their time and money on it, even without compensation.

A good source to set up your own full node is

But there are other mentionable ventures.

Roman Semko from Semko development in Leipzig, Germany, is creating an ecosystem for the improvement of the Tangle topology.

CarrIOTA, Nelson, and Bolero are the brand names of his proprietary software, which runs on top of IOTA: a layer that enables connections between all other Nelson participants and the Tangle itself.

Bolero even goes a step further in creating an easy 5-step installation of a full node that connects to the Nelson Network.

This solution works as a supplement which is wrapped around the Tangle. Its effects are quite helpful, especially because it significantly lowers the entry barrier for people that want to help.

His solutions are available on his website and on GitHub.

We know that several cities and companies research the tangle for their business models.

Cities like Taipei, Tokyo, and Haarlem are creating clusters in the Tangle network that rely on several hundred full nodes, which will soon be added to that number.

We hear that argument very often: IOTA cannot work, there is no incentive.

They were wrong!


4. Is IOTA truly free to use?

That is a matter of taste.

In terms of transaction fees, yes. IOTA has no transaction fees. You send one cent from Japan, the guy in Brazil gets one cent.

However, you need to keep in mind that there must be a spam mechanism to prevent malicious parties from flooding the network.

Therefore, a small proof-of-work requirement has been added as an anti-spam mechanism.

That is a countermeasure, because a system like IOTA will always be targeted with attempted double-spending attacks.

There is practically no way to perform such an attack successfully, but that doesn’t stop some antagonists from attacking the network anyway.

Mostly they aim to slow down public nodes to make it look like IOTA doesn’t work. That is a side effect of using the light-node system: you have to rely on your light-node provider.

A full node has a way better user experience.

To conclude: IOTA is free to use, but the proof of work is necessary to protect the funds. There are no fees other than a tiny amount of electricity.

Compare this to Bitcoin, the best-known cryptocurrency, where you pay ~$15 or more to send your amount.

Try sending one cent 15 times from your location to Brazil and compare Bitcoin’s accumulated transaction fees with the electricity costs of IOTA’s marginal PoW.

$225 vs 0.0001 cents for electricity.

IOTA is as free as it gets.


5. What about the bad usability of the wallet?

Before I get into that, I want to point out IOTA’s position on this, which has been repeated numerous times.

IOTA is made for the machine economy. Since competition and time are taken very seriously, the limited resources of the IOTA foundation are focused on the protocol itself, research, collaborations, and simulations.

Take a look around and compare IOTA with other projects that take a different approach: Byteball, RaiBlocks (Nano), PascalCoin.

The usability is higher in all of them, but they lack research, collaborations, simulations, and a clear vision of how and where they want to solve problems. One of them received help from IOTA co-founder Sergey Ivancheglo, who found flaws in their system and helped them correct the issues. Responsible disclosure done right: no FUD, no malice.

IOTA lacks usability and an easy-to-use wallet because the team focused on getting Bosch, Volkswagen, and Fujitsu on board. The list of collaborations and business connections is already too long for one article, but it’s hard to deny that IOTA chose the path of success and adoption.

The Trinity wallet, developed around Dr. Navin Ramachandran of University College London, Charly Varley, and a few other developers, is the answer to the usability issues people have reported.

It will include automated reattachment, address-reuse protection, a secure seed generator, and a better UI: fixes for critical issues in the wallet currently in use.

This will take a few weeks, but from then on, other projects will no longer have an advantage in terms of usability.

Investments in the cryptocurrency sphere are often directly tied to the expected future value of an asset.

If the usability of IOTA is improved vastly, what do you expect?


6. Why is IOTA still not listed on Bittrex, Poloniex, and Kraken?

There are 5 reasons.

  1. Cryptoland has seen a tsunami of new registrations, ERC20 tokens, ICOs, and DDoS attacks.
    Exchanges need to compensate for all of that. Nothing is ensured or given; this is an ecosystem in development.
  2. IOTA is not easy to implement and maintain. The IOTA foundation is developing the IXI, a library that makes it easier to implement IOTA on exchanges.
    The special abilities of IOTA need special treatment, which is why we still see just a handful of exchanges. Bitfinex, Coinone, Binance, and OKEx are not bad considering the circumstances.
  3. The focus of the foundation lies on collaborations, adoption, research, etc.
  4. The mentioned exchanges are regulated, especially the US-based companies. Adding new cryptocurrencies is not easy under the new regulations.
  5. There are other, more important markets than cryptocurrency exchanges. The IOTA foundation recently talked about the possibility of listing IOTA on the stock market. This would give IOTA access to regulated, legitimate funds, companies, and traders all over the world. A very helpful company in that regard is Advanced Blockchain AG in Germany.


7. There are reports that people have been hacked. What’s that about?

That is a misconception that has been created by the yellow press and blogging landscape of cryptoland.

IOTA has never been hacked. Seeds cannot just randomly be found: the number of possible seeds in IOTA is greater than the number of atoms in the observable universe. This is not just a phrase, it’s a fact.

If people lose their money, it’s mostly because they used a rigged online seed generator, or they gave someone access to their seed in other ways.

As long as only you know your seed, and no one else has access to it, your funds are 100% safe.


8. I waited 20 hours for a withdrawal from an exchange. Is this IOTA’s fault?

No. That’s a combination of three things: your expectations, IOTA’s usability, and the fact that exchanges are struggling to handle the insane number of requests from their user base.

IOTA is pretty young, special, and focused on corporate adoption. You shouldn’t expect it to be like consumer software with steady support and service.

The usability makes it look like IOTA is slow or broken; the truth is that if you know the appropriate workarounds, it almost always works, provided the network is operating as intended.

Attacks, snapshots, and improvements have forced the developers numerous times to halt the coordinator or to upgrade the full nodes. This will improve vastly in the future, as the growing network ensures that the coordinator will eventually be removed and snapshots will be performed locally.

Exchanges will handle IOTA deposits and withdrawals much better in the near future, as the IXI hub is in the final stages of development.

All three reasons considered, IOTA has a bright future in terms of user experience.


9. If companies and projects don’t need tokens for using the tangle, how does the token gain value?

A question that has been asked many times lately.

Remember that IOTA has basically 2 functionalities:

a) send data

b) send values

When companies use the IOTA data marketplace to sell their sensor output, they use both functions: they offer their data and, in return, demand just a few iotas for each data package.

How would they get paid if the cost of the information is a fraction of a cent? With a credit card or PayPal?

No. The established payment options cannot be used for micropayments. That is the whole idea of a machine economy.

Machines get a wallet and get paid for their work, especially when microtransactions are needed, such as electric vehicle charging.

For that, you need iotas, the currency of IOTA.

The total supply is limited, so the token rises in value as more real-world use-cases are built around it.

The IOTA foundation stated that companies working with IOTA are “forced” to use iotas rather than a “tokenized asset” on top of the Tangle, to ensure that the value of the iota token is secured.

If a company decides to use only the data transactions, it still supports the system with proof of work and a better confirmation rate.

Most importantly: The IOTA foundation is a non-profit organization (gemeinnützige Stiftung) under German law, registered in Berlin.

That means that the founders and developers of the foundation are working transparently in favor of society to be “gemeinnützig” – serving the public good.

This major step into mass adoption and building a trustworthy standard for the Internet of Things shouldn’t be forgotten.

Their latest efforts in creating educational blog posts on their official medium blog should shed light on many uncertainties.

Explaining series: The new IOTA Whitepaper (vers. 1.3) summarized

When dealing with innovative technology, it is not always easy to understand the underlying mechanisms and technological features explained in a whitepaper, as it is written by and for specialists with a thorough understanding of math and statistics.

The following article aims to outline the most important parts.

Since most of the insights of the whitepaper are not easy to digest, I could not remove all the complex parts.

Disclaimer: to catch all the information, you should read the whitepaper, as I cannot guarantee that I have included everything necessary.
Parts written in italics are quotations from the original document.

1 Introduction and description of the system

For IOTA, there are several layers of necessary background that should be kept in mind when trying to understand its field of application and functionality.

  • The IoT (Internet of Things): a global network of devices that interact with each other for different purposes. Unlike the Internet, the IoT is not organized in one topology that is synchronous and intrinsically connected, like a flat field. The IoT is more like a connected infrastructure comparable to a bumpy geomorphological landscape, with hills, holes, and continents that are connected via several connectivity types, with divided subnetworks, offline areas, and usually higher latencies compared to the Web, but also harder to hit with DDoS attacks thanks to its natural barriers and mesh-net connections.
  • Blockchains: peer-to-peer networks that enable many use-cases, often carry a monetary incentive for miners or investors, and could be a key technology for disrupting certain industries, such as fintech or big-data markets. Blockchains run on the Internet.
  • Their derivative, DLT (Distributed Ledger Technology): a technology that shares some similarities with blockchains. The difference lies in the chronological order: blockchains store information and value in blocks ordered in a chain, while distributed ledger technologies share these values or pieces of information between all nodes in their network. Although all blockchains distribute values, they are bound to the chronological order of blocks in a chain, which is not the case for DLTs in general. That difference makes IOTA’s architecture distinctive in many regards, while it keeps the advantages of blockchains.
  • The DAG (Directed Acyclic Graph): a mathematical graph used for ordering and updating nodes or edges. Important here: along its paths, dependencies exist that make it impossible for the confirmation of a transaction to “travel” back to its origin; cycles are not allowed in a DAG. That means that once information is included, it is impossible to change it afterwards, which is an important part of the consensus rule.
  • IOTA: the software/cryptocurrency/protocol based on the whitepaper of the Tangle by Prof. Serguei Popov.
    IOTA is made for the machine economy in the IoT, so its natural habitat is a mesh network where asynchronicity and offline clusters are part of the system. The Tangle, which is based on a DAG, imposes the following rule for IOTA: a node can only issue a transaction by referencing and approving two other unconfirmed transactions.
  • Therefore, users act as both miners and validators. Consensus is not decoupled from usage, as it is in most blockchains.
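The rule that each new transaction approves two existing ones can be sketched in a few lines. The uniform tip choice below is a simplification (real tip selection is weighted), and all names are illustrative:

```python
import random

# Toy tangle: each new transaction approves up to two current tips, so
# every edge points to an older transaction and no cycle can ever form.

def build_tangle(n: int, seed: int = 0):
    random.seed(seed)
    approves = {0: []}          # transaction 0 is the genesis
    tips = {0}                  # unapproved transactions
    for tx in range(1, n):
        chosen = random.sample(sorted(tips), k=min(2, len(tips)))
        approves[tx] = chosen   # tx approves up to two tips
        tips -= set(chosen)     # approved transactions stop being tips
        tips.add(tx)            # the new transaction is itself a tip
    return approves, tips

approves, tips = build_tangle(10)
# acyclic by construction: every approved transaction is older
assert all(p < tx for tx, parents in approves.items() for p in parents)
```

Because approvals always point backwards in time, the graph is a DAG by construction, which is exactly the property the bullet points above describe.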

Along their way, unconfirmed transactions are referenced. If they are referenced by good nodes, they gain “weight” and “cumulative weight” (Figure 1, bold numbers). The latter is the most important measurement for a transaction on its way to network approval. Whether good nodes reference a tip usually depends on the tip not having been created by a lazy node, i.e. on its issuer having participated in the approval of newer transactions.

Prof. Serguei Popov writes:

In order to issue a transaction, a node does the following:

• The node chooses two other transactions to approve according to an algorithm. In general, these two transactions may coincide.

• The node checks if the two transactions are not conflicting, and does not approve conflicting transactions.

• For a node to issue a valid transaction, the node must solve a cryptographic puzzle similar to those in the Bitcoin blockchain. This is achieved by finding a nonce such that the hash of that nonce concatenated with some data from the approved transaction has a particular form. In the case of the Bitcoin protocol, the hash must have at least a predefined number of leading zeros.
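The nonce puzzle in the quote can be sketched as follows; note that this sketch uses SHA-256 and hex leading zeros for readability, whereas IOTA itself uses a much lighter ternary PoW, so treat it as an illustration of the principle only:

```python
import hashlib
from itertools import count

def find_nonce(tx_data: bytes, difficulty: int = 3) -> int:
    """Search for a nonce so that sha256(nonce || tx_data) starts with
    `difficulty` leading zero hex digits (an illustrative difficulty)."""
    for nonce in count():
        digest = hashlib.sha256(str(nonce).encode() + tx_data).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce

tx_data = b"approve:tx17,tx42"   # hypothetical transaction payload
nonce = find_nonce(tx_data)
digest = hashlib.sha256(str(nonce).encode() + tx_data).hexdigest()
assert digest.startswith("000")  # the puzzle's "particular form"
```

With three hex zeros, about 16³ ≈ 4,096 hashes are needed on average, cheap enough to serve as spam protection without any mining reward.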


The IOTA network is asynchronous. In general, nodes do not necessarily see the same set of transactions. 

The tangle may contain conflicting transactions. The nodes do not have to achieve
consensus on which valid transactions have the right to be in the ledger, meaning
all of them can be in the tangle. However, in the case where there are conflicting
transactions, the nodes need to decide which transactions will become orphaned. 

The confidence level of a transaction is determined by the tip-selection algorithm (MCRW), which selects transactions for approval according to their cumulative weight: if the cumulative weight is high, the tip is more likely to be chosen. Lazy tips: a node that issues a new transaction but avoids doing much PoW by choosing older transactions for approval creates a lazy tip. Such a tip is less likely to be chosen, because its low cumulative weight means the MCRW tip-selection algorithm selects it slowly or not at all.
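A much-simplified sketch of cumulative-weight-biased tip selection: walk from the genesis toward the tips, stepping to approvers with probability proportional to their cumulative weight. The toy tangle and the linear bias are illustrative assumptions; the whitepaper's walk uses an exponential weighting.

```python
import random

approves = {0: [], 1: [0], 2: [0], 3: [1, 2], 4: [2]}  # tx -> approved txs

# reverse edges: who approves whom
approvers = {}
for tx, parents in approves.items():
    for p in parents:
        approvers.setdefault(p, []).append(tx)

def cumulative_weight(tx):
    """Own weight (1) plus the weights of all direct/indirect approvers."""
    seen, stack = set(), [tx]
    while stack:
        x = stack.pop()
        if x not in seen:
            seen.add(x)
            stack.extend(approvers.get(x, []))
    return len(seen)

def select_tip(start=0, seed=42):
    """Random walk from the genesis; heavier branches attract the walker."""
    random.seed(seed)
    pos = start
    while approvers.get(pos):  # stop when pos has no approvers: it is a tip
        step = approvers[pos]
        weights = [cumulative_weight(x) for x in step]
        pos = random.choices(step, weights=weights)[0]
    return pos

tip = select_tip()
print(tip)  # ends on one of the tips: 3 or 4
```

A lazy tip attached deep in the past would have cumulative weight 1 and would rarely be reached by such a walk, which is exactly the disincentive described above.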

2 Weights and more

The following part in the whitepaper describes the weight of a transaction and other indicators.

The weight of a transaction is proportional to the amount of work that the issuing node invested into it.

The weight is attached to a transaction as a positive integer, i.e. a whole number that can be read from the transaction’s information and from the bundle it is in.

In general, the idea is that a transaction with a larger weight is more “important” than a transaction with a smaller weight.

The basic idea of the weight is to give transactions an indicator of whether they are valid and come from a “good node”.

It’s practically impossible to generate “an abundance of transactions with acceptable weights in a short period of time“.


  • Weight:
    The weight of a transaction is proportional to the amount of work that the issuing node invested into it
  • Cumulative weight:
    The own weight of a particular transaction plus the sum of own weights of all transactions that directly or indirectly approve this transaction.
  • Tips:
    Unapproved transactions in the tangle graph.
  • Height:
    The length of the longest oriented path to the genesis.
  • Depth:
    The length of the longest reverse-oriented path to some tip.
  • Score:
    The score of a transaction is the sum of own weights of all transactions approved by this transaction plus the own weight of the transaction itself.

For a thorough understanding of these indices, I recommend reading section 2 of the whitepaper, as it describes the calculations of the cumulative weight. To draw a conclusion: the more PoW is done by honest nodes, the higher the cumulative weight gets.

3 Stability of the system, and cutsets

Under the assumption that all devices in the Tangle network have the same computational power, the approval of tips is distinguished in two different scenarios:

Low load: the typical number of tips is small, and frequently becomes 1. This may happen when the flow of transactions is so small that it is not probable that several different transactions approve the same tip. Also, if the network latency is very low and devices compute fast, it is unlikely that many tips would appear. This even holds true in the case when the flow of transactions is reasonably large. Moreover, we have to assume that there are no attackers that try to artificially inflate the number of tips.

High load:  the typical number of tips is large. This may happen when the flow of transactions is large, and computational delays together with network latency make it likely that several different transactions approve the same tip.

As written in the whitepaper, there is no clear borderline between those two scenarios, just an informal distinction in order to showcase extreme cases.


Important indicators:

  • L(t):
    The total number of tips in the system at time t.
  • h:
    The average time a device needs to perform the PoW.
  • λ:
    The rate of the Poisson point process that models the flow of incoming transactions from roughly independent entities.
  • r:
    The number of revealed tips, i.e. tips attached to the tangle before time t − h, whose PoW is already done and which are therefore visible.
  • λh:
    The number of hidden tips, i.e. tips attached during the interval [t − h, t), while their PoW was still being performed.

The stability of the total number of tips L(t) is the most important point to assess the rate of approval in both load-regimes.


  • We assume that the number of tips remains roughly stationary in time and is concentrated around a
    number L0 > 0
  • For the number of tips to stay stationary, each new transaction must on average remove one tip; that is, among the two transactions it approves, the mean number that are still unapproved tips should be equal to 1


Approval in the low regime: 

The situation in the low load regime is relatively simple. The first approval happens on an average timescale of order λ^−1 since one of the first few incoming transactions will approve a given tip.

Approval in the high regime:

One may assume that the Poisson flows of approvals to different tips are independent and have an approximate rate of 2λ/L0. Therefore, the expected time for a transaction to receive its first approval is around L0/(2λ) ≈ 1.45h
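To make these timescales concrete, here is a small numeric sketch. The values of λ and h are freely chosen for illustration; L0 is backed out of the quoted figure L0/(2λ) ≈ 1.45h.

```python
# Illustrative numbers, not from the whitepaper: lam is the transaction
# arrival rate (tx/s), h the average PoW/propagation time (s).
lam, h = 10.0, 2.0

# Low load regime: the first approval arrives on a timescale of order 1/lam.
t_low = 1.0 / lam

# High load regime: the quoted estimate L0/(2*lam) ~ 1.45h implies that
# roughly L0 ~ 2.9*lam*h tips are being maintained.
L0 = 2.9 * lam * h
t_high = L0 / (2 * lam)        # ~ 1.45 * h

print(t_low, t_high)
```

With a 2-second PoW, a transaction in the high-load regime waits about 2.9 seconds for its first approval under these assumptions.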

However, it is worth noting that for more elaborate approval strategies, it may not be a good idea to passively wait a long time until a transaction is approved by the others. This is due to the fact that “better” tips will keep appearing and will be preferred for approval. Rather, in the case when a transaction has been waiting for approval over a time interval much larger than L0/2λ, a good strategy would be to promote this latent transaction with an additional empty transaction. In other words, a node can issue an empty transaction that approves its previous transaction together with one of the “better” tips to increase the probability that the empty transaction receives approval. This promotion mechanism can also lead to an attack, described later.


1. We distinguish between two regimes, low load and high load (Figure 3).

2. There are only a few tips in the low load regime. A tip gets approved for the
first time in Θ(λ^−1) time units, where λ is the rate of the incoming flow of
transactions.

3. In the high load regime, the typical number of tips depends on the tip approval
strategy employed by the new transaction.

4. If a transaction uses the strategy of approving two random tips, the typical
number of tips is given by (equation 1). It can be shown that this strategy is optimal
with respect to the typical number of tips. However, it is not practical to adopt
this strategy because it does not encourage approving tips.

5. More elaborate strategies are needed to handle attacks and other network issues.
A family of such strategies is discussed in Section 4.1 (of the Whitepaper).

6. The typical time for a tip to be approved is Θ(h) in the high load regime,
where h is the average computation/propagation time for a node. However, if
the first approval does not occur in the above time interval, it is a good idea
for the issuer and/or receiver to promote that transaction with an additional
empty transaction.

7. It can be observed that at any fixed time t the set of transactions that were tips at some moment
s ∈ [t, t + h(L_0, N)] typically constitutes a cutset. Any path from a transaction
issued at time t’ > t to the genesis must pass through this set. It is important
that the size of a new cutset in the tangle occasionally becomes small. One may then
use the small cutsets as checkpoints for possible DAG pruning and other tasks.

3.1 How fast does the cumulative weight typically grow?

The cumulative weight is the most important indicator, because most attack vectors can be built around it.

Tip approval is mostly based on the cumulative weight, therefore, one must understand the cumulative weight adaptation rate.

Prof. Popov writes in a footnote:

In fact, the author’s feeling is that the tip approval strategy is the most important ingredient for constructing a tangle-based cryptocurrency. It is there that many attack vectors are hiding. Also, since there is usually no way to enforce a particular tip approval strategy, it must be such that the nodes would voluntarily choose to follow it knowing that at least a good proportion of other nodes does so.

The growth of cumulative weight in:

Low regime:  After a transaction gets approved several times, its cumulative weight will grow with speed λw (where w is the mean own weight) because all new transactions will indirectly reference this transaction.

High regime: In the case where the network is in the high load regime, an old transaction with a large cumulative weight will experience weight growth with speed λw because essentially all new transactions will indirectly reference it. Moreover, when the transaction is first added to the tangle, it may have to wait for some time to be approved. In this time interval, the transaction’s cumulative weight behaves in a random fashion.


  • H(t):
    The expected cumulative weight at time t (for simplicity, we start counting time at the moment when our transaction
    was revealed to the network, i.e., h time units after it was created).
  • K(t):
    The expected number of tips that approve the transaction at time t.

The whitepaper offers complicated calculations at this point that lead to the cumulative weight adaptation rate of:


1. After a transaction gets approved multiple times in the low load regime, its
cumulative weight will grow with speed λw, where w is the mean weight of a
generic transaction.

2. In the high load regime, there are two distinct growth phases. First, a transaction’s
cumulative weight H(t) grows with increasing speed during the adaptation
period according to (equation 8 – in the whitepaper). After the adaptation period is over, the cumulative
weight grows with speed λw (Figure 4). In fact, for any reasonable strategy,
the cumulative weight will grow with this speed after the end of the adaptation
period because all incoming transactions will indirectly approve the transaction
of interest.

3. One can think of the adaptation period of a transaction as the time until most
of the current tips indirectly approve that transaction. The typical length of
the adaptation period is given by (equation 7 – in the whitepaper).
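The two growth phases can be made visible with a toy simulation. This is my own simplified model, not the whitepaper's exact setup: batch arrivals stand in for network latency, BATCH transactions arrive per step, each approving two tips chosen uniformly from the tips visible at the start of the step, and every own weight is w = 1.

```python
import random

random.seed(7)
BATCH, STEPS, w = 3, 300, 1

ancestors = {0: set()}   # tx id -> every tx it directly/indirectly approves
tips = {0}
next_id = 1

def step():
    global next_id
    snapshot = sorted(tips)          # tips visible at the start of the step
    approved = set()
    for _ in range(BATCH):
        parents = random.sample(snapshot, min(2, len(snapshot)))
        ancestors[next_id] = set(parents).union(*(ancestors[p] for p in parents))
        approved.update(parents)
        tips.add(next_id)
        next_id += 1
    tips.difference_update(approved)

for _ in range(20):                  # warm-up so several tips exist
    step()

target = next_id - 1                 # watch the newest transaction
weights = []
for _ in range(STEPS):
    step()
    cw = w * (1 + sum(1 for anc in ancestors.values() if target in anc))
    weights.append(cw)

print(weights[0], weights[-1])
```

Early on, `target`'s cumulative weight creeps up slowly and erratically (the adaptation period); once virtually every tip indirectly approves it, the weight climbs by about BATCH·w per step, matching the λw growth rate described above.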

4 Possible attack scenarios

Outpacing attack/large weight attack:

1. An attacker sends a payment to a merchant and receives the goods after the
merchant decides the transaction has a sufficiently large cumulative weight.

2. The attacker issues a double-spending transaction.

3. The attacker uses their computing power to issue many small transactions that
approve the double-spending transaction, but do not approve, directly or indirectly,
the original transaction that they sent to the merchant.

4. It is possible for the attacker to have a plethora of Sybil identities which are
not required to approve tips.

5. An alternative method to item 3 would be for the attacker to issue a big double-spending
transaction using all of their computing power. This transaction would have a very large own weight,
and would approve transactions prior to the legitimate transaction used to pay the merchant.

6. The attacker hopes that their dishonest subtangle outpaces the honest subtangle.
If this happens, the main tangle continues growing from the double-spending
transaction, and the legitimate branch with the original payment to
the merchant is orphaned (fig. 5).

7. From the above discussion (calculations in the whitepaper), it is important to recognize that the inequality λ > µ
should be true for the system to be secure. In other words, the input flow of “honest”
transactions should be large compared to the attacker’s computational power.
Otherwise, the estimate (equation 12) would be useless. This indicates the need for additional
security measures, such as checkpoints, during the early days of a tangle-based
system: in IOTA, this is the role of the Coordinator.

8. When choosing a strategy for deciding which one of two conflicting transactions
is valid, one has to be careful when using cumulative weight as a decision metric.
This is due to the fact that cumulative weight can be subject to an attack similar
to the one described in Section 4.1, namely, the attacker may prepare a double-spending
transaction well in advance, build a secret subtangle referencing it, and
then broadcast that subtangle after the merchant accepts the legitimate transaction.
A better method for deciding between two conflicting transactions might be the one
described in the next section: run the tip selection algorithm and see which of the
two transactions is indirectly approved by the selected tip.

4.1 A parasite chain attack and a new tip selection algorithm

1. An attacker secretly builds a subtangle that
occasionally references the main tangle to gain a higher score. Note that the score
of honest tips is roughly the sum of all own weights in the main tangle, while the
score of the attacker’s tips also contains the sum of all own weights in the parasite
chain. Since network latency is not an issue for an attacker who builds a subtangle
alone, they might be able to give more height to the parasite tips if they use a
computer that is sufficiently strong. Moreover, the attacker can artificially increase
their tip count at the moment of the attack by broadcasting many new transactions
that approve transactions that they issued earlier on the parasite chain (Figure 6).
This will give the attacker an advantage in the case where the honest nodes use some
selection strategy that involves a simple choice between available tips.

2. To defend against this attack style, we are going to use the fact that the main
tangle is supposed to have more active hashing power than the attacker. Therefore,
the main tangle is able to produce larger increases in cumulative weight for more
transactions than the attacker. The idea is to use an MCMC algorithm to select the
two tips to reference.

3. It is easy to see why the MCMC selection algorithm will not select one of the attacker’s
tips with high probability. The reasoning is identical to the lazy tip scenario:
the sites on the parasite chain will have a cumulative weight that is much smaller
than the sites that they reference on the main tangle. Therefore, it is not probable
that the random walker will ever jump to the parasite chain unless it begins there,
and this event is not very probable either because the main tangle contains more sites than the parasite chain.

4. In any case, there is not a large incentive for the nodes to be selfish because possible gains
only amount to a slight decrease in confirmation time. This is inherently different
from other decentralized constructs, such as Bitcoin. The important fact is that nodes
do not have reasons to abandon the MCMC tip selection algorithm.
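A minimal sketch of this random walk, assuming the transition rule from Section 4.1 of the whitepaper: the walker at site x steps to a direct approver y with probability proportional to exp(−α(H_x − H_y)), where H is the cumulative weight. The six-transaction toy tangle, its weights, and the α value are invented for illustration; "lazy" plays the role of a parasite-like tip that only references an old site.

```python
import math, random

random.seed(3)

# approvers[x] = transactions that directly approve x; H[x] = cumulative
# weight of x (own weight 1 everywhere). "lazy" is a fresh tip that only
# approves the old transaction A.
approvers = {
    "genesis": ["A", "B"],
    "A": ["C", "lazy"],
    "B": ["C"],
    "C": ["D"],
    "D": [],
    "lazy": [],
}
H = {"genesis": 6, "A": 4, "B": 3, "C": 2, "D": 1, "lazy": 1}
ALPHA = 2.0

def random_walk(start="genesis"):
    x = start
    while approvers[x]:                     # stop once we reach a tip
        ys = approvers[x]
        ws = [math.exp(-ALPHA * (H[x] - H[y])) for y in ys]
        x = random.choices(ys, weights=ws)[0]
    return x

hits = [random_walk() for _ in range(10_000)]
print(hits.count("D") / len(hits))       # honest tip wins most walks
print(hits.count("lazy") / len(hits))    # lazy tip is rarely selected
```

Because the lazy tip sits behind a large cumulative-weight drop (H falls from 4 to 1), the walker strongly prefers the honest branch; raising α sharpens this preference further.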

4.2 Splitting attack

1. Aviv Zohar suggested the following attack scheme against the proposed MCMC algorithm.
In the high-load regime, an attacker can try to split the tangle into two
branches and maintain the balance between them. This would allow both branches
to continue to grow. The attacker must place at least two conflicting transactions
at the beginning of the split to prevent an honest node from effectively joining the
branches by referencing them both simultaneously. Then, the attacker hopes that
roughly half of the network would contribute to each branch so that they would be
able to “compensate” for random fluctuations, even with a relatively small amount
of personal computing power. If this technique works, the attacker would be able to
spend the same funds on the two branches.

2. To defend against such an attack, one needs to use a “sharp-threshold” rule that
makes it too hard to maintain the balance between the two branches. An example
of such a rule is selecting the longest chain on the Bitcoin network.

3. It is worth noting that the attacker’s task is very difficult because of network
synchronization issues: they may not be aware of a large number of recently issued
transactions.

4. Another effective method for defending against a splitting attack
would be for a sufficiently powerful entity to instantaneously publish a large number
of transactions on one branch, thus rapidly changing the power balance and making
it difficult for the attacker to deal with this change. If the attacker manages to maintain
the split, the most recent transactions will only have around 50% confirmation
confidence (Section 1), and the branches will not grow. In this scenario, the “honest”
nodes may decide to start selectively giving their approval to the transactions that
occurred before the bifurcation, bypassing the opportunity to approve the conflicting
transactions on the split branches.

5. One may consider other versions of the tip selection algorithm. For example, if
a node sees two big subtangles, then it chooses the one with a larger sum of own
weights before performing the MCMC tip selection algorithm outlined above.


Attack-scenario conclusion: 

1. We considered attack strategies for when an attacker tries to double-spend by
“outpacing” the system.
2. The “large weight” attack means that, in order to double-spend, the attacker
tries to give a very large weight to the double-spending transaction so that it
would outweigh the legitimate subtangle. This strategy would be a menace
to the network in the case where the allowed own weight is unbounded. As a
solution, we may limit the own weight of a transaction from above, or set it to
a constant value.
3. In the situation where the maximal own weight of a transaction is m, the best
attack strategy is to generate transactions with own weight m that reference
the double-spending transaction. When the input flow of “honest” transactions is
large enough compared to the attacker’s computational power, the probability
that the double-spending transaction has a larger cumulative weight can be
estimated using the formula (12) (see also examples below (equation 12)).
4. The attack method of building a “parasite chain” makes approval strategies
based on height or score obsolete since the attacker’s sites will have higher
values for these metrics when compared to the legitimate tangle. On the other
hand, the MCMC tip selection algorithm described in Section 4.1 seems to
provide protection against this kind of attack.
5. The MCMC tip selection algorithm also offers protection against the lazy nodes
as a bonus.

5 Resistance to quantum computations

As of today, one must check an average of 2^68 nonces to find a suitable hash that allows a
new block to be generated. It is known (see e.g. [source 15]) that a quantum computer would need
Θ(√N) operations to solve a problem that is analogous to the Bitcoin puzzle
stated above. This same problem would need Θ(N) operations on a classical computer.

Therefore, a quantum computer would be around √2^68 = 2^34 ≈ 17 billion times more
efficient at mining the Bitcoin blockchain than a classical computer. Also,
it is worth noting that if a blockchain does not increase its difficulty in response to
increased hashing power, there would be an increased rate of orphaned blocks.
For the same reason, a “large weight” attack would also be much more efficient on
a quantum computer. However, capping the weight from above, as suggested
in Section 4, would effectively prevent a quantum computer attack as well. This is
evident in IOTA because the number of nonces that one needs to check in order to find
a suitable hash for issuing a transaction is not unreasonably large. On average, it is
around 3^8. The gain of efficiency for an “ideal” quantum computer would, therefore, be
of order 3^4 = 81, which is already quite acceptable.
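The arithmetic above is easy to verify; a quick sanity check using the figures quoted from the whitepaper:

```python
# Grover-style square-root speedup: a quantum computer needs ~sqrt(N)
# operations where a classical one needs ~N.
bitcoin_nonces = 2 ** 68      # average nonces checked per Bitcoin block
iota_nonces = 3 ** 8          # average nonces checked per IOTA transaction

bitcoin_gain = int(bitcoin_nonces ** 0.5)   # 2**34 ~= 17 billion
iota_gain = int(iota_nonces ** 0.5)         # 3**4 = 81

print(bitcoin_gain, iota_gain)
```

The gap between the two gains, roughly eleven orders of magnitude, is why a capped, deliberately small PoW makes the tangle far less attractive to a quantum adversary than Bitcoin's difficulty race.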

More importantly, the algorithm used in the IOTA implementation is structured such that the time to find a nonce is not much
larger than the time needed for other tasks that are necessary to issue a
transaction. The latter part is much more resistant against quantum computing, and
therefore gives the tangle much more protection against an adversary with a quantum
computer when compared to the (Bitcoin) blockchain.

Questions about the whitepaper, the math it contains, or specific attack vectors can be discussed in the development Slack under #tanglemath


Explaining Series: Fog Computing in the Internet of Things


Fog computing: one of many new trendy terms that we see and read almost everywhere in this field.

What is it, and how can IOTA enable the perfect fog-computing landscape the IoT needs?

I'll give you a short explanation and good sources for a quick heads-up.


This roundup is an experiment that aims at a better understanding of the bigger picture. The keywords before the actual article serve as an information index.

  • Internet of Things (IoT) = Term coined at MIT by Kevin Ashton, 1999
  • Fog-Computing = Term from Cisco
  • Fog = Decentralized/Distributed
  • Cloud = Centralized
  • Realm = IoT + IIoT, B2B, M2M, IoE, Smart grids, Smart home, Smart cities, interconnected world
  • Problem = Unused sensor data, need for a distributed-network solution, costs of cloud computing, time
  • Application = Evolving markets, Quality-as-a-Service, machine communication, SCADA
  • IoT Systems = Basically two groups: 1) Identification group (sensors, data gathering) 2) Computational group (processing, data storage)
  • Limitations until now: Cloud computing (centralized, far away from consumers and devices) doesn’t fit the requirements of the IoT (distributed, in need of close storage, computational resources, instant processing); bandwidth
  • Connection types: WiFi, Bluetooth, ZigBee, 2G/3G/4G/5G, Radio, Z-Wave, 6LoWPAN, Thread, Cellular, NFC, Sigfox, Neul, LoRaWAN

The IoT

The vision of the Internet of Things is still in the making.

With the latest development in this interconnected world, new markets are emerging and a variety of requirements are born.

Wearables, smartphones, and domestic devices such as smart-home solutions for an intelligent household demand an interconnectivity solution that has yet to arrive.

It’s no secret that almost every company is also working on solutions to make it happen: a world, where data is a more valuable resource than oil. If not today, then in the near future.

This leads to a point, where technical barriers of today hinder progress for tomorrow.

The IoT, a distributed network spanning the world, is more than the Internet.

A mesh-net connected through every possible connection type. Where devices work in local clusters, it’s obvious that centralized components, sometimes on a different continent, don’t fit into the greater picture.

Sensors, cameras, and smart devices often use ad-hoc solutions to function in their specific field, such as SCADA monitoring systems that send valuable data to a nearby control center in order to optimize industrial processes.

What if these monitoring systems work in a time-sensitive manner, but the current solutions are slow and, on top of that, centralized and insecure? Productivity suffers, employees may work in a more dangerous environment, and as a result the company could face problems.

Connected facilities incentivize industrial espionage and hacks.

Distributed denial-of-service attacks are a phenomenon of the last few years, in which malicious parties attack infrastructural points of the web to cripple the communication of certain systems and services.

Sometimes as a decoy for a hack, sometimes for political or activist reasons.

Not rarely, such downtimes create financial losses, and the blockage of regional infrastructure hits not only the target but also other companies located in the surrounding area.

A problem of the Internet, not necessarily of the IoT.

Due to its distributed mesh-net characteristics, the IoT is envisioned as a self-sufficient network that can connect devices of the identification group via many paths, not only one.

An attack on central points is by definition impossible because there is no center in the IoT.

That leads to a natural resistance against DDoS attacks and other downtimes.

Legacy systems vs. new systems

An additional issue of cloud computing in the IoT is cost. Legacy systems used to ignore huge amounts of data because there was neither storage for it nor a need for it.

New systems in the IoT, with smart solutions, rely on this data, but sending it into the cloud would go beyond the scope of the IoT. Too much information is generated, and real-time analysis conflicts with centralized cloud computing, as uploading these huge amounts of data takes time and money, especially if the cloud storage is thousands of miles away.

Fog computing, however, creates a bridge between the identification group and the computational group: it forwards computational power to the edge of the network, where data is generated and results are needed.

The benefits of using Fog computing instead of legacy cloud systems are tremendous.

Varghese, Wang, et al. [2017] come to the conclusion that: “For an online game use-case, we found that the average response time for a user is improved by 20% when using the edge of the network in comparison to using a cloud-only model. It was also observed that the volume of traffic between the edge and the cloud server is reduced by over 90% for the use-case.”

This is just one use case that can be mirrored in many other settings.

In consumer markets, Quality of Service and Quality of Experience are important factors.

Another example would be the transparent customer. When a transparent customer enters a big supermarket, his views and interests could be analyzed within seconds.

Cameras can detect his interest in certain devices or components, and advertisements on monitors along his path can be adjusted to his specific needs. With old legacy systems this would be impossible due to the long processing times between the cameras, a cloud, and computational resources; with fog computing, however, the data can be processed much faster and the necessary information delivered back to the customer along his way through the mall.

To draw a simplified picture of the fog-landscape:

The distributed mesh-net grows in height z, if you will, whereas decentralized and centralized networks grow on the x and y axes. Shorter paths from the data collectors to the computational resources are the result of fog computing.

Concerns can be addressed with IOTA

Whether it’s the data-integrity, optimization or protection of the in-house Research & Development data, companies look for a lasting solution.

When data is stored centrally, hackers usually use social engineering or phishing attacks to get access to it.

Whereas centrally stored data could be captured all at once with this method, fog computing makes it possible to store sensitive information in small, distributed packets, with different passwords/keys/seeds to access them.

IOTA can deliver a unique solution here: a data stream, bound to countless seeds, in a distributed network, secured with sophisticated algorithms. Not even quantum computing would be a threat to the hashes.

As you may already know, IOTA is a distributed ledger technology that enables fee-free transactions.

For data transfer with fog computing, you wouldn’t even need tokens; the only condition would be to confirm two other transactions before sending one of your own.

A rule that enables true scalability for a billion device network on a global scale.

With Masked Authenticated Messaging, IOTA has an additional option to send and process sensitive data.

Now, a really big hurdle in the IoT is the multitude of connection types and competing standards.

If devices could be connected in a uniform way, usability would increase. A plethora of standards built for the IoT can lead to a fragmentation of the network, as companies want to stick to their own standards to support their product line or roadmap.

If IOTA were the standard settlement and data layer, free to use, the Internet of Things could be a barrier-less environment with true scalability and data integrity.

Due to the value of collected data, new markets would emerge that aim at selling this information in real time.

People could possibly sell their consumer data each time they enter a shop, with true nano-payments.

If data were collected in the fog, BigchainDB, a scalable distributed database for all kinds of data, could deliver the necessary infrastructure for customers, institutes, and companies.

A seamless solution for the IoT.

Fog computing is, therefore, the next necessary milestone in the field of the Internet of Everything and a vital part of IOTA's vision.


Video of Dominik Schiener's presentation at the Tech Open Air 2017