
Bitcoin Will Inevitably Lose Its Value, IOTA Will Take Over


The following statements, backed by scientific papers and by technological and geopolitical facts, illustrate how Bitcoin will inevitably lose its value.
IOTA was conceived and designed with a vision further into the future than most of us can comprehend. Its features will turn into major benefits as time progresses, and it is a question of when, not if, IOTA will take over as the standard in cryptocurrency and Industry 4.0.
Which limitations will be the death of Bitcoin and the uprising of IOTA? Though you will not find a definitive answer to that here, my commentary and the facts I present should substantiate this opinion.

Is this article FUD?
That’s hard to determine, because to an investor, fear, uncertainty, and doubt have come to mean anything that threatens his investment. First and foremost, an investor’s ignorance is the main culprit of FUD.
This, of course, also applies to IOTA supporters. Cryptoland and its effects, in general, have the ability to transform investors into raging mobs that ignore facts, reason, and humanity, just to protect their funds and hopes of a big return on investment.
Bandwagoning is a phenomenon that goes beyond rationality.
Ad hominem, blatant lies, death threats, I’ve seen them all. But one thing is for sure: the truth always wins.
Is this article filled with logical fallacies, bad anecdotes or lies? No, because I don’t like misinformation. Since I’m a subjective being (and also invested in IOTA), it should be evident that my fact-based opinion can’t be completely objective.
Whether or not there is an incredible number of competing projects, magazines, and investors spreading lies and false claims about IOTA, this article is not about creating deception; it is just my honest point of view.

As long as there are no major technical flaws (which can happen to every cutting-edge technology), there are a plethora of reasons why IOTA will eventually overtake Bitcoin.

Satoshi

The incredible whitepaper of 2008 opened a world of wonders.
With a new perspective on transferring money, we experienced the first sign of real emancipation from the big institutions that fostered inequality and a world where the power was not in the hands of the people.
Despite rules and laws that prohibited such concepts, when people all over the world received a tool to share their wealth, they were finally granted sovereignty over their possessions.
I think all cryptocurrency investors can agree on these points, or at least use them to explain to newcomers why they invested.

Since then, the ideological component of cryptocurrencies has been drowned in a sea of greed, return on investment and proclamations of “when Lambo“.
Bitcoin, the currency that was created from the whitepaper of Satoshi Nakamoto, differs from “a peer-to-peer electronic cash system” in many regards.
Some changes were implemented because game-theoretical aspects needed to be included, such as the small block size. Others were made to account for the growing number of users congesting the network.
The biggest change, however, was not of a technological nature: people became aware of the monetary advantages Bitcoin introduced.
Hard-forks, specialized mining hardware, bandwagoning, social media manipulation, smear campaigns, and hacks are the daily madness we all are aware of.
The incentive to earn money is so big that the initial ideological dream, the democratic advantage, and most importantly the technological advancement is almost completely ignored to a point where Bitcoin doesn’t fit into the real world anymore.

Mining Issues

Mining is a vital part of Bitcoin’s consensus and of the creation of the currency. IOTA has no mining, for several reasons.
“IOTA cannot function because there is no monetary incentive to run a full node.” (Anon)
This assumption turned out to be false, given that the number of full nodes, including Nelson nodes, is already higher than 5,000.

Bitcoin is proud to be the most secure project. There is no other project with a higher number of developers and a longer time-frame where bugs and problems have been eliminated. That is a truth that every investor should acknowledge.
There are hundreds of projects, dozens of wallets, countless corrected bugs, and a journey that has undoubtedly proven that Bitcoin is not hackable. Not anymore.
The consensus is decoupled from the user.
The holy trinity of Bitcoin’s consensus lies in the miners, validators, and users.
Mining in Bitcoin gives the network its blood-pressure and nutrients, but the heart grows too big.
The hash power is growing exponentially because the rising number of users and the incentive to earn money with mining follow mainstream adoption.
More and more people and companies, even hardware giants like Nvidia and AMD, are starting to specialize in mining cryptocurrencies.
That is an ecological tragedy and a centralized point of failure masked as an advantage and technological progress.

[Figure: Bitcoin global hashrate]

Mining, right now, is mostly performed in countries with low electricity prices: China, Iceland, India. The power that is used comes primarily from fossil energy sources such as coal and oil. Mining farms look for maximum profit.
It is a reasonable assumption that even in the future, they will use the cheapest energy source.

Iceland, however, experiences a different problem. The geothermal energy that is used to mine cryptocurrency is a limited resource, and the electric grid is already being pushed to its limits.
Politicians from the Icelandic Pirate Party have stated: “The value to Iceland … is virtually zero,” as almost no taxes can be derived from it. The opposition is growing.

Also, the consumption of energy from volcanoes still heats up the atmosphere. It may be renewable, but it is avoidable heat for the atmosphere: an ethical and technological regression.

These examples show that political decisions can shut down major parts of the hash power at any point, given that the incentive to mine threatens the environment and the electrical grid.

[Figure: Bitcoin key mining statistics]

These aspects apply to all other minable projects as well. An incentive to use cheap electricity for revenue is a dangerous path, not a technological advantage.
Since IOTA has no mining, only a small proof of work, the required electricity can be generated solely in renewable energy clusters in every city and area of the world. “Zero marginal cost” electrical power can support IOTA, which functions in a multi-connection distributed mesh net.
The incentive to use a global standard for data/value transfer and data integrity is a far better solution, both more efficient and politically accepted.
Since there are no centralized mining farms and no fossil energy usage, IOTA will be used everywhere, while Bitcoin will face serious problems.
Additional information on the energy consumption and the ecological footprint can be found here.

As a last addition to this section, I want to highlight that Cogniota, according to developers of IOTA, will make it possible to sell hash power for computational services. This combines two major advantages: IOTA can incentivize parts of the mining industry to sell their hash power in order to solve actual problems, rather than merely exchanging money.

Scaling Solutions

Scalability is not just a buzzword with minor impact. It determines whether a currency can be widely used in the future or not. IOTA is theoretically infinitely scalable (as far as bandwidth allows) due to its unique consensus mechanism.
The Lightning Network (LN) will equip Bitcoin with payment channels that will enable billions of transfers without fees. This is the missing piece of the puzzle, according to the Bitcoin evangelists.
Payment channels are a new approach that Bitcoin and Ethereum are exploring, as mainstream adoption leads to enormous transaction fees and transaction queues.
For now, the solution is in development, but we clearly understand that it is a necessity.
Those who used Bitcoin in December know that a solution is desperately needed, as transaction fees went insanely high.

Three problems, though, blur the hope and expectation the community has for LN.

  1. The block size is limited. The opening of millions of channels will lead to congested blocks again. It isn’t true scalability, just sophisticated procrastination.
  2. Fees for closing the payment channels are an additional factor that hinders true adoption.
  3. Centralized hubs (like exchanges or big services) will be a possible threat to consensus, according to Jonald Fyookball.

That means that even if the LN works as intended, the development will still not be finished.
On the contrary, an additional step will be an assessment of whether a hard fork to bigger blocks has to be performed. In the case of full mainstream adoption, even the LN won’t grant the full scalability that IOTA offers.
It is not possible to open an infinite number of payment channels, as the block size is too small right now.
Also, closing a channel and synchronizing it with the mainchain costs transaction fees.
In a world of microtransactions, transaction fees, even if they are small, are a threat to businesses and the majority of use-cases.
An exemplary calculation of transaction fees with present systems can be found here.
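To make that concrete, here is a toy calculation in Python; the fee value is a hypothetical placeholder for an on-chain fee during peak congestion, not live network data:

```python
# Toy fee-overhead comparison for a one-cent micropayment.
# btc_fee is a hypothetical placeholder, not a live network value.
payment = 0.01      # USD value of the micropayment
btc_fee = 15.00     # exemplary on-chain Bitcoin fee at peak congestion
iota_fee = 0.00     # IOTA charges no transaction fee

print(f"Bitcoin: the fee is {btc_fee / payment:.0f}x the payment itself")
print(f"IOTA:    the fee is {iota_fee / payment:.0f}x the payment")
```

With these numbers, the Bitcoin fee is 1,500 times the value being moved, which is why fee-based systems exclude micropayment use-cases entirely.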

My conclusion: Bitcoin will not work in an interconnected world; IOTA will.
Additional concerns, including a claimed mathematical proof that Bitcoin cannot run decentralized with the LN, have been raised, but it is still inconclusive whether that is true.
The bottom line is: Bitcoin is trying to develop something that IOTA already owns.
Zero transaction fees and true scalability.
On top of that, IOTA already has flash channels, which are bi-directional, feeless, and extremely convenient. That means IOTA combines far greater on- and off-chain load capacity, while Bitcoin still needs to find out whether the LN is feasible at all.

Adaptivity For The IoT

Mesh-net capabilities in the Internet of Things will have several characteristics that contradict Bitcoin’s functionality.
The economy of the future will eventually happen in local industrialized clusters, as smart cities will create economic islands of data and value streams with millions of devices each.

The effects of economic clusters can be further comprehended here in “A Historical Approach to Clustering in Emerging Economies” from the Harvard Business School (2017).
These clusters will have a demand for a DLT that is capable of functioning in this special environment.

a) Geographical distances will create latencies comparable to asynchronous networks. The network topology will resemble the streets in a mountain chain: many connections, with remote parts that are not connected to the rest of the network at all times.
Bitcoin, therefore, needs functionality that enables offline chains. Right now, this is not possible, because the Lightning Network is neither ready nor fully suitable for mainstream adoption, as outlined earlier.
IOTA, however, can work in this environment, because offline chains are part of its architecture. The synchronization of offline chains has no disadvantages and can be performed with flash channels or normal transactions.

b) Transaction fees make microtransactions uneconomical, as outlined earlier.

c) If IOTA works as intended, try to come up with a use-case that is unfitting for IOTA but perfectly suitable for Bitcoin.
Since I had no success in finding one, I can only deduce that IOTA will be chosen over Bitcoin, because the advantages are obvious.

The impact of the IoT on the global economy can be inferred from McKinsey’s latest assessment:

A $10-15 trillion market by 2034, according to GE and McKinsey

I conclude that the technology with the best abilities will likely be able to take the biggest part of the market capitalization.
It remains to be proven whether Bitcoin can survive solely as “digital gold” without intrinsic value. That would mean Bitcoin costs millions in mining and fees but has no unique feature.

Shor’s Algorithm

Quantum computing is a threat to cryptography as we know it. Though it is not yet capable of breaking the algorithms used in Bitcoin and other cryptocurrencies, big advances have been made, especially with the big D-Wave 2000Q quantum computer, which already has 2,000 qubits. Since the D-Wave is solely focused on (reverse) annealing, it cannot be used for integer factorization or other general applications.

Reverse annealing allows users to specify the problem they wish to solve along with a predicted solution in order to narrow the search space for the computation. The predicted solution may be a result of a previous quantum or classical computation or an educated guess. But it is not suitable for an efficient integer factorization that is needed in order to find collisions on standard cryptographic schemes.

Other ventures, though, such as the IBM Q project, already at 49 qubits of the gate-model type that could be used for integer factorization, show a rapid, exponential rate of progress in this field of quantum computing.

[Figure: Timeline of available qubits]

It turns out that Moore’s law also applies to the field of quantum computing. According to this whitepaper by Aggarwal et al., suitable quantum computers could be available even sooner than expected.
“If a quantum computer with a sufficient number of qubits could operate without succumbing to noise and other quantum decoherence phenomena, Shor’s algorithm could be used to break public-key cryptography schemes such as the widely used RSA scheme.” (Edward Gerjuoy, 2004)
The same applies to the secp256k1 elliptic curve in Bitcoin.
That means a quantum computer with enough qubits to perform an efficient calculation of Shor’s algorithm, with a negligible amount of quantum error, could be available in approximately 6-7 years.
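For intuition, here is a classical toy version of the reduction at the heart of Shor’s algorithm: factoring N by finding the period r of a^x mod N. On a quantum computer, the period-finding step is exponentially faster; the brute-force version below therefore only works for tiny numbers, but the surrounding logic is the same:

```python
from math import gcd
from random import randrange

def find_period(a: int, n: int) -> int:
    # smallest r with a^r = 1 (mod n); brute force stands in for the
    # quantum subroutine that makes Shor's algorithm fast
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n: int) -> int:
    while True:
        a = randrange(2, n)
        d = gcd(a, n)
        if d > 1:
            return d            # lucky guess already shares a factor
        r = find_period(a, n)
        if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
            d = gcd(pow(a, r // 2, n) - 1, n)
            if 1 < d < n:
                return d        # non-trivial factor of n

print(shor_classical(15))       # prints 3 or 5
```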

Since cryptocurrencies are not used in our daily lives to buy groceries, the majority of investments are driven by the expectation value of cryptocurrencies as a whole.
That could mean that if there is a reason to doubt Bitcoin’s success, such as a fundamental breakthrough in quantum computing, the expectation value can vanish and thus decrease Bitcoin’s value immensely.
This point on my list is certainly not the most conclusive one because it is said that solutions could be implemented rather quickly, but the comparison to IOTA raises an additional question:
Why would we use a technology that will soon be rendered insecure and needs adjustments to work again, when we already have a possible solution with the Winternitz one-time signatures that are used in IOTA?

The tradeoff right now is that addresses shouldn’t be re-used, but with the upcoming Trinity (UCL) wallet, people will gain a big portion of usability and security for the post-quantum era.
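For readers wondering what “Winternitz” means in practice, here is a heavily simplified sketch of a Winternitz-style one-time signature built from hash chains. This is an illustration of the general scheme only: IOTA’s actual parameters, ternary encoding, and the checksum that real WOTS constructions add are all omitted. It also shows why an address must not sign twice: every signature reveals intermediate values of the secret hash chains.

```python
import hashlib, os

W = 256                                   # Winternitz parameter: chain length
H = lambda b: hashlib.sha256(b).digest()

def keygen(n_bytes: int):
    sk = [os.urandom(32) for _ in range(n_bytes)]
    pk = []
    for s in sk:                          # public key: secret hashed W times
        for _ in range(W):
            s = H(s)
        pk.append(s)
    return sk, pk

def sign(msg: bytes, sk):
    sig = []
    for byte, s in zip(msg, sk):
        for _ in range(byte):             # walk the chain up to the byte value
            s = H(s)
        sig.append(s)                     # reveals an intermediate chain value
    return sig

def verify(msg: bytes, sig, pk) -> bool:
    for byte, s, p in zip(msg, sig, pk):
        for _ in range(W - byte):         # finishing the chain must reach pk
            s = H(s)
        if s != p:
            return False
    return True

sk, pk = keygen(4)
sig = sign(b"iota", sk)
print(verify(b"iota", sig, pk))           # True
# Real WOTS adds a checksum over the message digits; without it, a forger
# could extend revealed chains. Omitted here for brevity.
```

Security rests only on the one-wayness of the hash function, not on integer factorization or discrete logarithms, which is what makes such schemes candidates for the post-quantum era.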

Hope ≠ Reality

“Shilling” describes the nature of cheering for your investment. All kinds of reasons are brought up in order to emphasize how good this particular investment is. Everyone does or did it: Bitcoin, Ethereum, IOTA, Nano, you, me.
These arguments and word fights are common, boring, and unnecessary, yet everyone engages in them and everyone thinks they have a positive effect on the global cryptocurrency price.
Bitcoin especially has a major advantage that, on the other hand, doesn’t reflect reality: the network effect.
By far the most people are invested in Bitcoin. The exact number is impossible to determine, but Bitcoin has been around since 2009, and since then tens of millions of people have invested on countless platforms.
If you ask a person in an urban area what Bitcoin is, they usually answer: “Internet money.”
This is largely true, because Bitcoin has been used online for e-commerce services for years already.

Since most people are invested in Bitcoin, we normally see the most shilling for BTC in social media. This fact has nothing to do with its functionality or innovative quality. It is solely because many people know it, and many more are invested in Bitcoin than in all other cryptocurrencies.
Another factor is that Bitcoin is still the reference currency for the entirety of cryptocurrencies.
If Bitcoin’s price falls, 99% of all projects fall too.

In the future, other coins will become more accessible and liquid through fiat pairs, thereby negating much of their dependence on Bitcoin. The future demand for Bitcoin will drop.
IOTA demand, on the other hand, will explode with a growing number of real-world use cases, industry adoption, and its capacity to offer a new standard.
This means that the true innovative value of a currency cannot be judged in comparison with Bitcoin, because the “judges” are overwhelmingly biased and social media indicators merely show how many people like a crypto project. Bitcoin owns social media. But IOTA owns the best tech.
The one-sided coverage of negative events and criticism of IOTA is a big sign that the landscape is highly biased. But the truth is: companies and institutions do not care about shilling, memes, and likes.

The Adoption Race

The most conclusive point, and the one that needs no additional explanation.
What IOTA lacks in ease of use, it makes up for in adoption and innovation.
The number of companies that are convinced that IOTA is the solution is by far higher than for any other cryptocurrency, including Bitcoin. IOTA has already won this adoption race, which is the most important race, within a short time-frame of two years.
Bitcoin is in an anti-adoption period right now, a problem that arises from transaction fees and transaction queues.

Under these conditions, IOTA will take over the market capitalization of Bitcoin sooner or later. It’s inevitable.
Use-cases such as micropayments, data marketplaces, data integrity, Q (a secret project the foundation has been working on for four years), and the incentive to use hash power as computational power are the knockout for Bitcoin as soon as these applications are working.
Since expectation value is created long before these use-cases are actually deployed, I expect a major bull run right after the announcement of Q.

When I weigh both projects and their problems, I don’t see much room for Bitcoin, as its problems are way bigger than IOTA’s.
Bitcoin is still no closer to achieving its vision after 9 years of development. Yet we see IOTA rapidly progressing toward a production-ready state for multiple applications in the near future.

Usability issues on IOTA’s side versus architecture flaws on Bitcoin’s side. Bitcoin’s time is running out.
A bigger network effect and shilling cannot change that.


Thanks to Ryan and Izelkay for editing.


Concerning IOTA problems and controversies:


1) Centralization of the coordinator:

David Sønstebø addresses the roadmap and status quo here (at 29:00):

He also explains that IOTA is exploring the ternary system, but they can go back to binary at any time.

The first comment under the interview gives timestamps for all discussed topics.


2) Concerning Reclaims:

3) Additional concerns, including wallet security, address re-usage, etc.

FUD Copy Pastas from Iota


IOTA: Infinite use-cases in a zero margin society and collaborative economy


The Internet of Things is the future, but its real character, its disruptive and transformational nature is not easy to grasp.

Needless to say, IOTA’s use-cases are rarely understood in their entirety.

Time to clear up some misconceptions. This post is inspired by Jeremy Rifkin.

The IoT is not primarily about wearables, smart fridges, smart hamsters, and the toilet that orders new paper. These popular examples are often cited by the media in order to describe the matter at hand to new readers.

But the truth is that the focus of the IoT will be a different one. The fourth (or third, if you count out the internet) industrial revolution will make life easier, more efficient, safer and fairer. Global progress.

The future is about reduction and improvement.

Reduction of fees, a smaller ecological footprint, improvement of equality, also fewer barriers for society, better healthcare, less corruption, the list goes on and on.

This also applies to almost every other DLT, so one of the questions is: what makes IOTA and its applications unique?

In the end, it’s all about the zero margin society and sharing economy.


An economic landscape in which services and infrastructures are the basic layers that arise from the industrial transformation.

The paradigm shift from a throw-away mentality to a sharing economy has already begun: car sharing, Airbnb, the transformation from fossil energy to renewable energy that can be shared between neighbors.

The scope of applications has yet to be determined especially because most of the new use-cases are without existing reference.

Everything is new, everything is an uncharted area.

In order to specify possible use-cases, I’m going to list a few in their respective areas.

The transformations of technology and society are inevitable and already happening.

The biggest question is: will this industrial and social conversion be successful in a way that employment levels and the standard of living can be kept or even increased?

A second issue is: Can we provide all these solutions before we reach an economic breakdown and environmental disaster without a return ticket?


Machine Economy

When machines are customers, providers, or sellers, countless possibilities open up for the future.

Wherever a machine creates goods or services, it can sell them for microtransactions. That includes manufactured products, data, streaming services and digital rights (such as video or audio), electricity, even connection access, as well as services such as transportation or maintenance.

Everything that has an expensive middleman today can be automated and distributed in the future. Centralisation is one of the losers of tomorrow.

Vending machines, Airbnb, e-meters, gas meters, electric vehicle charging.

Autonomous vehicles will inhabit every urban area. These vehicles can be equipped with a wallet, and in order to make them truly autonomous, they could belong to themselves.

That is one of many zero margin models, experts like Jeremy Rifkin draw for the future.

The possibilities are truly endless.

Use-cases: Industry, Domestic applications, gastronomy, electrical vehicles, payment models, pay on demand, infrastructure

Sensor Data

Data is the new oil, the new gold.

Whereas today many data packages are sent to a cloud thousands of kilometers away, in the future edge, fog, and mist computing will create new ways for the industry to improve their systems and to sell these data streams, which are produced and processed in real-time.

The sharing economy enables business opportunities, where data of any kind are the product.

IOTA’s data marketplace, therefore, can be the cornerstone for revenue, research, and the sharing of personal data that belongs to the people.

The GDPR is the first step in Europe to ensure that customers’ digital rights are always respected and that big companies comply with these laws. IOTA can be the solution for every single use-case.

Use-cases: sell/buy data, be the owner of your data, monetize transport data, environmental data for research, customer behavior, biological indicators in healthcare, pollution in protected areas.

Legal Applications

Why use signatures when you have tamper-proof, unique transactions and hash values that are as binding as legal contracts?

The advantage is that the decentralized storage of these signatures ensures that only the appropriate owners and relevant contractors have access.

With this ability, you can ensure data protection, reduce barriers, and protect rights of owners.

As seen with NetObjects’ newest use-case, IOTA makes it possible to create a fair solution for owners of media of all kinds.

Digital rights are a big field with billions in turnover each month. Smart contracts are a new technological revolution that enables all kinds of transparent yet fair solutions for hundreds of use-cases.

Use-cases: Heritage, insurance, digital rights, access rights according to age or position in a company. 

Data Integrity

There is virtually no use-case here where data integrity is irrelevant.

Every single data stream needs to be protected from third parties.

When the transparent customer loses his pseudonymity, the data he produces can be misused in many ways.

When big-data companies lose control over their data, they lose their basic business model.

Data integrity is, therefore, the basic layer that needs to be given at any time, everywhere.

IOTA’s Masked Authenticated Messaging (MAM) will be the solution in the future to protect the world of information in every application.

Use-cases: every time sensitive data is transacted; applicable in almost every single use-case.

Governance

The cure for the evils of democracy is more democracy. (H. L. Mencken)

Elections, the structure of organizations, the right of participation.

Democracy and governmental structures should be a basic right and every human should have access to it.

Sadly, corruption and the interference of the mightiest often reduce democracy to an empty construct of irrelevance.

People do not feel that they hold the power that belongs to them.

In the future, DLTs can and should be used to protect elections from external interference.

One transaction could be one vote. This emancipation of the people before the big players of the world is an important step to stop corruption and inequality.

Transparent and credible governance ensures the trust of the people in their political representatives.

That also applies to all other elections in organizations and companies.

Use-cases: elections, anti-fake-news, emancipation, power to the people, ensuring that the source of information is legitimate.

Sharing Economy

Given that collaborative efforts reduce costs of energy, transportation, and basic infrastructure, the economy can transform to a zero margin cost society: the sharing economy.

The combination of all other use-cases enables a world where local economic clusters are distributed all over the world.

Energy will be produced without fossil fuels and will be collected, processed, and sold in neighborhood efforts, so that the energy markets become distributed.

Fewer central authorities and companies will dictate daily life, as people can act for themselves.

The picture I personally imagine for a city in 15 years is that we need no cash, that most systems, including shops and services, run around the clock, and that mostly autonomous machines handle the supply and maintenance of almost everything.

We have access to a free infrastructure of electricity, connection, communication and basic supply of groceries, healthcare, and education.

IOTA could become the distributed layer for data and value transactions. With its approach to data integrity, scalability and zero transaction fees, it’s designed to work as the backbone of the IoT, in almost every imaginable use-case.

Fees or missing scalability would render the majority of use-cases impossible, as the IoT will be a global network of billions or trillions of devices. No other open-source project can deliver the necessary characteristics.

The following outstanding speech by Jeremy Rifkin outlines the IoT and its implications.


Controversial IOTA. Nine heavy questions answered.


Cryptoland is innovative, interesting, filled with opportunities, drama, unbelievable stories and fraud. We love it.

The ongoing list of scam attempts, hacks, thefts, and inside jobs, though, is part of daily life here, both entertaining and hair-raising.

Decentralized systems enable opportunities for bad actors that one cannot even imagine from behind the computer, and yet we all believe in them.

When legitimate companies decide to collaborate with one of the ~1,500 crypto-projects out there, it’s a good sign that things are going in the right direction. But to be honest: most of these companies are still learning about distributed ledgers, so they are merely a positive indicator for the answer to the question: is my money safe?

There are too many cases where such collaborations ended in a fiasco and the devs ran away with the funds. Since bleeding-edge tech is largely unaudited and mostly unregulated, we have no bulletproof assurance that the people involved are good-hearted and acting in the name of progress.

You’re on tangleblog.com, and I write about IOTA like I perceive it, so naturally, things are drawn in a positive way here.

The reason is simple: I believed in the vision before I invested, and now I give them my full support -which, of course, means nothing to smart investors from outside.

Due diligence is the most valuable currency of traders and investors, and also for companies that decide to give IOTA a chance.

The last weeks, however, have shown that there is nothing to fear and that I obviously backed the right horse. What a year!

Time to sum up the latest evidence that IOTA is a legitimate project that has the potential to change everything.

IOTA works differently from blockchains in many regards; it’s just a matter of customer habit that a lot of people doubt a technology they don’t know yet.

Therefore, I’d like to address a few spicy questions that have been going around for quite some time.


  1. What about the alleged cryptographic vulnerability in the IOTA signing algorithm, the “MIT”-scientists wrote about?
  2. I heard negative things about the coordinator, what is it?
  3. What about the incentive to run a full node? Does the network find enough supporters?
  4. Is IOTA truly free to use?
  5. What about the bad usability of the wallet?
  6. Why is IOTA still not listed on Bittrex, Poloniex, and Kraken?
  7. There are reports that people have been hacked. What about that?
  8. I waited 20 hours for a withdrawal from an exchange. Is this IOTA?
  9. If companies and projects don’t need tokens for data-transfer, how does the token gain value?


1. What about the alleged cryptographic vulnerability in the IOTA signing algorithm that the “MIT” scientists wrote about?


What a great show! The headlines on Forbes and several other magazines alone were incredibly catchy and held back countless companies and investments, that’s for sure.

That the alleged vulnerability was never in effect, that no funds were ever at risk, that the testing conditions were ludicrous, and that everyone involved acted with a conflict of interest wasn’t part of that article by Amy Castor.

But in mine: https://www.tangleblog.com/2017/09/13/competitors-amy-castor-tale-reputation-usage-discredit-campaign/

Since then, a second article from the Digital Currency Initiative of the MIT Media Lab left many believers speechless, as the allegations were clouded in unsubstantiated criticism.

The IOTA Foundation then decided to respond to the matter at hand, and they included a list of conflicts of interest.

As part of a natural decision process, you should read all parts in order to draw a conclusion.

Part 1

Part 2

Part 3

Part 4

The not-so-responsible “responsible disclosure” was led by Neha Narula of the MIT Media Lab. I have since received several direct messages from Harvard and MIT confirming that my research was accurate.

As for the future: the IOTA Foundation decided to hire independent professionals (www.cybercrypt.dk) to work on the mentioned signing algorithm, in order to legitimize their efforts and to develop a working and smart solution.

Please read the blog-post of IOTA to understand how serious the efforts of the IOTA foundation are.

To draw my conclusion:

-IOTA is working as intended

-There is no vulnerability; the funds of investors were and are 100% safe at all times.

-Independent cryptographers, in addition to the competent IOTA team, are working on the maintenance and security of the tangle

-Competing projects try to slow down IOTA’s progress

-In order to get an unbiased opinion, investors should take their time and view all evidence to understand that forces are working in combined efforts to harm IOTA.

-Up to this day, and after repeated requests from the IOTA team, the DCI team has still not released any exploit code publicly.

-This is how the crypto-community thinks about it: https://www.reddit.com/r/CryptoCurrency/comments/7svr8r/mit_media_lab_dci_allegations_proven_wrong_iotas/?sort=confidence



2. I heard negative things about the coordinator, what is it?

Maybe the most controversially discussed topic in the IOTA ecosystem.

A node in the possession of the IOTA Foundation that sets milestones and is used to take people’s money away?

Bollocks. First of all, read the transparency compendium by the IOTA foundation.

The coordinator is a set of training wheels for the juvenile network, and its sole task is to set milestones as Sybil-attack resistance, nothing else. Also, if people want to code referencing nodes themselves, they can do it, because the knowledge is out there; people just have not done it yet, hence the importance of the security measure called the “coordinator”.

Needless to say, the foundation, which is officially registered in Germany as a non-profit organization under German law, will not exploit it. That would end their entire venture, which makes zero sense.

Luckily, we humans, sometimes, have a brain and can handle imperfect situations for the greater good.
So here is my personal opinion as someone that is part of the IOTA community for a long time:
I do trust these people; I have known them for two years, and they are nothing but progressive and eager. So I’d rather choose them over a world where people just try to exploit everything they can, including temporary solutions that aim for security.
Because it is necessary.
In a few months, when the hash power is big enough, the training wheels can be taken off step by step, and bad actors will no longer be able to attack the network with a Sybil attack.
That is a trade I’m willing to make, as someone who doesn’t like an inefficient Bitcoin et al., who doesn’t like the waste of mining electricity and the hypocrisy of the whole cryptoland. What matters is innovation, not a useless buzzword à la “decentralization” that won’t matter soon.


3. What is the incentive to run a full node? Does the network find enough supporters?

It’s been a while since I wrote my article about the incentives. Many things have happened since then.

The number of full nodes has steadily increased, and estimates say we are at approximately 5,000, but there is as yet no way to pinpoint the number.

With the help of Zoran and the audience of my Sonntagsplausch (Sunday banter), we managed to add roughly 400 full nodes to the existing base.

The people obviously believe in the technology and are willing to spend their time and money on it, even without compensation.

A good source to set up your own full node is IOTA.partners

But there are other mentionable ventures.

Roman Semko from Semko development in Leipzig, Germany, is creating an ecosystem for the improvement of the Tangle topology.

Carriota, Nelson, and Bolero are the brand names of his proprietary software, which runs on top of IOTA as a layer that enables connections between all other Nelson participants and the Tangle itself.

Bolero even goes a step further in creating an easy 5-step installation of a full node that connects to the Nelson Network.

This solution works as a supplement which is wrapped around the Tangle. Its effects are quite helpful, especially because it significantly lowers the entry barrier for people that want to help.

His solutions are on the website http://www.carriota.com/ and on GitHub.

We know that several cities and companies research the tangle for their business models.

Cities like Taipei, Tokyo, or Haarlem are creating clusters in the Tangle network that rely on several hundred full nodes, which will be added to that high number very soon.

We hear that argument very often: IOTA cannot work, there is no incentive.

They were wrong!


4. Is IOTA truly free to use?

That is a matter of taste.

In terms of transaction fees, yes. IOTA has no transaction fees. You send one cent from Japan, the guy in Brazil gets one cent.

However, you need to keep in mind that there must be a spam mechanism to prevent malicious parties from flooding the network.

Therefore, a low proof-of-work requirement has been added as an anti-Sybil mechanism.

That is a countermeasure because a system like IOTA will always be targeted in order to try a double-spending attack.

There is practically no way to perform such an attack successfully, but that doesn’t stop some antagonists from attacking the network anyway.

Mostly in order to slow down public nodes, to make it look like IOTA doesn’t work. That is a side effect of using the light-node system: you have to rely on your light-node provider.

A full node has a way better user experience.

To conclude: IOTA is free to use, but the proof of work is necessary to protect the funds. There are no fees other than tiny amounts of electricity.

Compare that to Bitcoin, the best-known cryptocurrency, where you recently had to pay ~$15 or more to send your amount.

Try to send one cent 15 times from your location to Brazil and compare Bitcoin’s added transaction fees with the electrical cost of IOTA’s marginal PoW:

$225 versus about 0.0001 cents of electricity.

IOTA is as free as it gets.


5.  What about the bad usability of the wallet?

Before I get into that, I want to point out IOTA’s take on this, which has been repeated numerous times.

IOTA is made for the machine economy. Since competition and time are taken very seriously, the focus of the IOTA Foundation’s limited resources therefore lies in creating the protocol, researching the technology, collaborations, and simulations.

Take a look around and compare IOTA with other projects that have a different approach: Byteball, RaiBlocks (Nano), PascalCoin.

The usability is higher in all of them, but they lack research, collaborations, simulations, and a clear vision of how and where they want to solve problems. One of them received help from IOTA co-founder Sergey Ivancheglo, because he found flaws in their system and helped them correct the issues. Responsible disclosure done right: no FUD, no malice.

IOTA lacks usability and an easy-to-use wallet because the team focused on getting Bosch, Volkswagen, and Fujitsu on board. The list of collaborations and business connections is already too long to fit in one article, but it’s hard to deny that IOTA chose the way of success and adoption.

The Trinity wallet, developed at University College London around Dr. Navin Ramachandran, Charly Varley, and a few other developers, is the solution for the usability issues people have communicated.

It will include automated re-attach functionality, address re-use protection, a secure seed generator, and a better UI: critical issues with the wallet that is in use right now.

This will take a few weeks, but from then on, other projects will no longer have an advantage in terms of usability.

Investments in the cryptocurrency sphere are often directly connected to the expectation value of an asset.

If the usability of IOTA is improved vastly, what do you expect?


6. Why is IOTA still not listed on Bittrex, Poloniex, and Kraken?

There are 5 reasons.

  1. Cryptoland has seen a tsunami of new registrations, ERC20 tokens, ICOs, and DDoS attacks.
    Exchanges need to compensate for all of that. Nothing is ensured or given; this is an ecosystem in development.
  2. IOTA is not easy to implement and maintain. The IOTA Foundation is developing the IXI, a library that makes it easier to implement IOTA on exchanges.
    The special abilities of IOTA need special treatment; therefore, we still see just a handful of exchanges. Still, Bitfinex, Coinone, Binance, and OKEx are not bad when we consider the circumstances.
  3. The focus of the Foundation lies on collaborations, adoption, research, etc.
  4. The mentioned exchanges are regulated, especially the US-based companies. Adding new cryptocurrencies is not easy under the new regulations.
  5. There are other, more important markets than cryptocurrency exchanges. The IOTA Foundation lately talked about the possibility of bringing IOTA to the stock market. This would mean that IOTA had access to regulated, legitimate funds, companies, and traders all over the world. A very helpful company in that regard is Advanced Blockchain AG in Germany.


7. There are reports that people have been hacked. What’s that about?

That is a misconception that has been created by the yellow press and blogging landscape of cryptoland.

IOTA has never been hacked. Seeds cannot just randomly be found: the number of possible seeds in the IOTA Tangle is higher than the number of atoms in the observable universe. This is not just a phrase, it’s a fact.
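A quick back-of-the-envelope check, assuming the standard 81-character seed over the 27-symbol alphabet (A-Z plus 9):

```python
# IOTA seeds: 81 characters, each drawn from a 27-symbol alphabet.
seed_space = 27 ** 81
atoms = 10 ** 80        # common order-of-magnitude estimate for the universe
print(f"{seed_space:.2e} possible seeds")   # ~8.7e+115
print(seed_space > atoms)                   # True
```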

If people lose their money, it’s mostly because they used a rigged online seed generator, or they gave someone access to their seed in other ways.

As long as only you know your seed, and no one else has access to it, your funds are 100% safe.


8. I waited 20 hours for a withdrawal from an exchange. Is this IOTA?

No. That’s a combination of three things: your expectations, IOTA’s usability, and exchanges struggling to handle the insane number of requests from their user base.

IOTA is pretty young, special, and focused on corporate adoption. You shouldn’t expect it to be like consumer software with steady support and service.

The usability makes it look like IOTA is slow and broken; the truth is that if you know the appropriate solutions to the issues, it almost always works, depending on whether the network is operating as intended.

Attacks, snapshots, and improvements have forced the developers numerous times to halt the coordinator or to upgrade the full nodes. That will improve vastly in the future, as the growing network ensures that the coordinator will eventually be removed and snapshots will be performed locally.

Exchanges will handle IOTA deposits and withdrawals much better in the near future, as the IXI hub is in its final stages of development.

All 3 reasons considered, IOTA has a bright future in terms of user experience.


9. If companies and projects don’t need tokens for using the tangle, how does the token gain value?

A question that has been asked many times lately.

Remember that IOTA has basically 2 functionalities:

a) send data

b) send values

When companies use the IOTA data marketplace to sell their sensor output, they use both functions: they offer their data and, in return, demand just a few iotas for each data package.

How would they get paid if the cost of the information is a fraction of a cent? With a credit card or PayPal?

No. The established payment options cannot be used for micropayments. That is the whole idea of a machine economy.

Machines get a wallet and get paid for their work, especially when microtransactions are needed, such as electric vehicle charging.

For that, you need iotas, the currency of IOTA.

The total supply is limited; the token therefore rises in value the more real-world use-cases are built around it.

The IOTA Foundation stated that companies working with IOTA are “forced” to use iotas, and not a “tokenized asset” on top of the Tangle, to ensure that the value of the iotas will be secured.

If a company decides to only use the data transaction, they still support the system with proof of work and a better confirmation rate.


Most importantly: The IOTA foundation is a non-profit organization (gemeinnützige Stiftung) under German law, registered in Berlin.

That means that the founders and developers of the foundation are working transparently in favor of society to be “gemeinnützig” – serving the public good.

This major step into mass adoption and building a trustworthy standard for the Internet of Things shouldn’t be forgotten.

Their latest efforts in creating educational blog posts on their official medium blog should shed light on many uncertainties.

Explaining series: The new IOTA Whitepaper (vers. 1.3) summarized


When dealing with innovative technology, it’s not always easy to understand the underlying mechanisms and technological features explained in a whitepaper, as it is written by and for specialists with a thorough understanding of math and statistics.

The following article aims to outline the most important parts.

Since most of the insights of the whitepaper are not easy to digest, I couldn’t erase all complex parts.

Disclaimer: to catch all the information, you should read the whitepaper, as I cannot guarantee that I have included everything necessary.
Parts in italics are citations from the original document.


1 Introduction and description of the system

For IOTA, there are several layers of necessary background information that should be kept in mind when trying to understand its field of application and functionality.

  • The IoT (Internet of Things), a global network of devices that interact with each other for different purposes. Unlike the Internet, the IoT is not organized in one topology that is synchronous and intrinsically connected, like a flat field. The IoT is more like a connected infrastructure comparable to a bumpy geomorphological landscape, with hills, holes, and continents that are connected via several connectivity types, with divided subnetworks, offline areas, and usually higher latencies compared to the Web. It is also harder to attack with DDoS attacks due to its natural barriers and mesh-net connections.
  • Blockchains. A peer-to-peer network that enables many use-cases, often bears a monetary incentive for miners or investors, and could be a key technology for disrupting certain industries, such as fintech or big-data markets. Blockchains run on the Internet.
  • Its derivative, DLT (Distributed Ledger Technology), a technology that shares some similarities with blockchains. The difference is the chronological order that blockchains use to store information and value in blocks and chains, while distributed ledger technologies share these values or pieces of information between all nodes in their network. That means that although all blockchains are distributed, they are usually bound to the chronological order of blocks arranged in a chain, which is not the case in DLTs. That difference makes IOTA’s architecture different in many regards, while it keeps the advantages of blockchains.
  • The DAG (Directed Acyclic Graph). A mathematical graph that is used for ordering and updating nodes or edges. Important here is that dependencies exist along its paths that make it impossible for a confirmation of a transaction to “travel” back to its origin; cycles are not allowed in a DAG. That means that once information is included, it is impossible to change it afterwards, which is an important part of the consensus rule.
  • IOTA, the software/cryptocurrency/protocol that is based on the whitepaper of the Tangle by Prof. Serguei Popov.
    IOTA is made for the machine economy in the IoT, so its natural habitat is a mesh network, where asynchronicity and offline clusters are part of the system. The Tangle, which is based on a DAG, brings up the following rule for IOTA: transactions can only be conducted if the issuing node references two other unconfirmed transactions.
  • Therefore, users act as both miners and validators. Consensus is not decoupled from usage, as it is in most blockchains.

Along the way, unconfirmed transactions are referenced. If they are referenced by good nodes, they earn “weight” and “cumulative weight” (Figure 1, bold numbers). The latter is the most important measurement for a transaction on its way to network approval. Being referenced by good nodes usually depends on the tip not being created by a lazy node, i.e., one that did participate in the approval of newer transactions.

Prof. Serguei Popov writes:

In order to issue a transaction, a node does the following:

• The node chooses two other transactions to approve according to an algorithm.
In general, these two transactions may coincide.

• The node checks if the two transactions are not conflicting, and does not approve
conflicting transactions.

• For a node to issue a valid transaction, the node must solve a cryptographic
puzzle similar to those in the Bitcoin blockchain. This is achieved by finding a
nonce such that the hash of that nonce concatenated with some data from the
approved transaction has a particular form. In the case of the Bitcoin protocol,
the hash must have at least a predefined number of leading zeros.
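A minimal sketch of such a hash puzzle, written Bitcoin-style with SHA-256 and a leading-zero-bits target for clarity (IOTA’s actual PoW runs over trytes with a different hash function, but the principle is identical):

```python
import hashlib

def do_pow(tx_data: bytes, difficulty_bits: int = 16) -> int:
    # find a nonce so the hash of (data + nonce) falls below the target
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        h = hashlib.sha256(tx_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(h, "big") < target:
            return nonce                    # puzzle solved
        nonce += 1

print(do_pow(b"approve tx1 and tx2"))       # a few tens of thousands of tries
```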


The IOTA network is asynchronous. In general, nodes do not necessarily see the same set of transactions. 

The tangle may contain conflicting transactions. The nodes do not have to achieve
consensus on which valid transactions have the right to be in the ledger, meaning
all of them can be in the tangle. However, in the case where there are conflicting
transactions, the nodes need to decide which transactions will become orphaned. 

The confidence level of a transaction is determined by the tip-selection algorithm (MCRW, a Markov chain random walk), which selects transactions for confirmation according to their cumulative weight. If the cumulative weight is high, the tip is more likely to be chosen. Lazy tips: a node that issues a new transaction and avoids doing too much PoW by choosing older transactions for approval creates a lazy tip, which is less likely to be chosen, because its low cumulative weight leads to slow selection or non-selection by the MCRW tip-selection algorithm.
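A toy sketch of such a cumulative-weight-biased random walk; the small graph, the exponential bias, and the ALPHA parameter here are illustrative assumptions, not IOTA’s production algorithm:

```python
import math, random

random.seed(7)
ALPHA = 0.5   # higher values bias the walk more strongly to heavy branches

# approvers[x] = transactions that directly approve x; walk starts at genesis
approvers = {"genesis": ["a", "b"], "a": ["c"], "b": ["c", "d"],
             "c": [], "d": []}

def cumulative_weight(tx):
    # own weight 1 plus own weights of all direct and indirect approvers
    seen, stack = set(), [tx]
    while stack:
        x = stack.pop()
        if x not in seen:
            seen.add(x)
            stack.extend(approvers[x])
    return len(seen)

def select_tip(start="genesis"):
    x = start
    while approvers[x]:                     # walk until an unapproved tip
        nxt = approvers[x]
        weights = [math.exp(ALPHA * cumulative_weight(y)) for y in nxt]
        x = random.choices(nxt, weights=weights)[0]
    return x

print([select_tip() for _ in range(5)])     # "c" is reached via more paths
```

A lazy tip attached far in the past accumulates little weight along the walk’s path, so the biased walk rarely reaches it, which is exactly the disincentive described above.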


2 Weights and more

The following part in the whitepaper describes the weight of a transaction and other indicators.

The weight of a transaction is proportional to the amount of work that the issuing node invested into it.

The weight is attached to a transaction as a positive integer, i.e., a whole number that can be read from the transaction’s information and the bundle it is in.

In general, the idea is that a transaction with a larger weight is more “important” than a transaction with a smaller weight.

The basic idea of the weight is to give transactions an indicator if they are valid and coming from a “good node”.

It’s practically impossible to generate “an abundance of transactions with acceptable weights in a short period of time“.


Definitions:

  • Weight:
    The weight of a transaction is proportional to the amount of work that the issuing node invested into it
  • Cumulative weight:
    The own weight of a particular transaction plus the sum of own weights of all transactions that directly or indirectly approve this transaction.
  • Tips:
    Unapproved transactions in the tangle graph.
  • Height:
    The length of the longest oriented path to the genesis.
  • Depth:
    The length of the longest reverse-oriented path to some tip.
  • Score:
    The score of a transaction is the sum of own weights of all transactions approved by this transaction plus the own weight of the transaction itself.

For a thorough understanding of these indices, I recommend reading section 2 of the whitepaper, as it describes the calculation of the cumulative weight. To draw a conclusion: the more PoW is done by honest nodes, the higher the cumulative weight gets.
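To make the definitions concrete, here is a toy tangle with an own weight of 1 per transaction, on which the indicators can be computed directly (a simplified sketch; in the real tangle, own weights vary with the work invested):

```python
# approves[x] = transactions that x directly approves ("g" is the genesis)
approves = {"g": [], "a": ["g"], "b": ["g"], "c": ["a", "b"], "d": ["b"]}
approvers = {x: [y for y in approves if x in approves[y]] for x in approves}
tips = [x for x in approves if not approvers[x]]   # unapproved: ["c", "d"]

def reach(x, edges):
    # all transactions reachable from x (including x itself)
    seen, stack = set(), [x]
    while stack:
        y = stack.pop()
        if y not in seen:
            seen.add(y)
            stack.extend(edges[y])
    return seen

def height(x):   # longest oriented path down to the genesis
    return 0 if not approves[x] else 1 + max(height(y) for y in approves[x])

def depth(x):    # longest reverse-oriented path up to some tip
    return 0 if not approvers[x] else 1 + max(depth(y) for y in approvers[x])

for x in sorted(approves):
    print(x, "cumulative weight:", len(reach(x, approvers)),
             "score:", len(reach(x, approves)),
             "height:", height(x), "depth:", depth(x))
```

For the genesis, this prints a cumulative weight of 5 (every transaction approves it indirectly) and a score of 1, matching the definitions above.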


3 Stability of the system, and cutsets

Under the assumption that all devices in the Tangle network have the same computational power, the approval of tips is divided into two different scenarios:

Low load: the typical number of tips is small, and frequently becomes 1. This may happen when the flow of transactions is so small that it is not probable that several different transactions approve the same tip. Also, if the network latency is very low and devices compute fast, it is unlikely that many tips would appear. This even holds true in the case when the flow of transactions is reasonably large. Moreover, we have to assume that there are no attackers that try to artificially inflate the number of tips.

High load:  the typical number of tips is large. This may happen when the flow of transactions is large, and computational delays together with network latency make it likely that several different transactions approve the same tip.

As written in the whitepaper, there is no clear borderline between those two scenarios, just an informal distinction in order to showcase extreme cases.


Important indicators:

  • L(t):
    The total number of tips in the system at time t.
  • h:
    The average time a device needs to perform the PoW
  • λ:
    The rate of the Poisson point process that models the large flow of incoming transactions from roughly independent entities
  • r:
    Revealed tips: tips that were attached to the tangle before time t − h, i.e., whose PoW is already finished
  • λh:
    Hidden tips: the roughly λh tips that were attached to the tangle while the PoW was being done, i.e., in [t − h, t)

The stability of the total number of tips L(t) is the most important point to assess the rate of approval in both load-regimes.

Assumption:

  • We assume that the number of tips remains roughly stationary in time and is concentrated around a
    number L0 > 0
  • In the stationary regime, this mean number of chosen tips should be equal to 1


Approval in the low regime: 

The situation in the low load regime is relatively simple. The first approval happens on an average timescale of order λ^−1 since one of the first few incoming transactions will approve a given tip.

Approval in the high regime:

One may assume that the Poisson flows of approvals to different tips are independent and have an approximate rate of 2λ/L0. Therefore, the expected time for a transaction to receive its first approval is around L0/(2λ) ≈ 1.45h

However, it is worth noting that for more elaborate approval strategies, it may not be a good idea to passively wait a long time until a transaction is approved by the others. This is due to the fact that “better” tips will keep appearing and will be preferred for approval. Rather, in the case when a transaction is waiting for approval over a time interval much larger than L0/(2λ), a good strategy would be to promote this latent transaction with an additional empty transaction. In other words, a node can issue an empty transaction that approves its previous transaction together with one of the “better” tips to increase the probability that the empty transaction receives approval.

This promotion mechanism can also lead to an attack, described later.
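These load regimes can also be explored empirically. The sketch below uses illustrative parameters and the uniform random two-tip strategy: it generates Poisson arrivals at rate λ, hides each transaction for the PoW delay h, and tracks L(t), which settles around a constant L0 after a transient:

```python
import random

random.seed(1)
lam, h = 50.0, 1.0            # arrival rate lambda and average PoW delay h

reveal = {0: 0.0}             # tx id -> time it became visible (0 = genesis)
tips = {0}                    # currently unapproved transactions
history = []

t = 0.0
for i in range(1, 20001):
    t += random.expovariate(lam)               # Poisson arrival of tx i
    # only tips revealed before t - h are visible; newer ones are hidden
    visible = [x for x in tips if reveal[x] <= t - h]
    for parent in (random.choices(visible, k=2) if visible else []):
        tips.discard(parent)                   # parent is now approved
    reveal[i] = t
    tips.add(i)                                # tx i becomes a new tip
    history.append(len(tips))

# average tip count after the transient: roughly the stationary L0
print(sum(history[10000:]) / 10000.0)
```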

Conclusions: 

1. We distinguish between two regimes, low load and high load (Figure 3).

2. There are only a few tips in the low load regime. A tip gets approved for the
first time in Θ(λ^−1) time units, where λ is the rate of the incoming flow of
transactions.

3. In the high load regime, the typical number of tips depends on the tip approval
strategy employed by the new transaction.

4. If a transaction uses the strategy of approving two random tips, the typical
number of tips is given by (equation 1). It can be shown that this strategy is optimal
with respect to the typical number of tips. However, it is not practical to adopt
this strategy because it does not encourage approving tips.

5. More elaborate strategies are needed to handle attacks and other network issues.
A family of such strategies is discussed in Section 4.1 (of the Whitepaper).

6. The typical time for a tip to be approved is Θ(h) in the high load regime,
where h is the average computation/propagation time for a node. However, if
the first approval does not occur in the above time interval, it is a good idea
for the issuer and/or receiver to promote that transaction with an additional
empty transaction.

7. It can be observed that at any fixed time t the set of transactions that were tips at some moment
s ∈ [t, t + h(L_0, N)] typically constitutes a cutset. Any path from a transaction
issued at time t’ > t to the genesis must pass through this set. It is important
that the size of a new cutset in the tangle occasionally becomes small. One may then
use the small cutsets as checkpoints for possible DAG pruning and other tasks.

3.1 How fast does the cumulative weight typically grow?

The cumulative weight is the most important indicator, because most attack vectors can be constructed around it.

Tip approval is mostly based on the cumulative weight; therefore, one must understand the cumulative weight adaptation rate.

Prof. Popov writes in a footnote:

In fact, the author’s feeling is that the tip approval strategy is the most important ingredient for constructing a tangle-based cryptocurrency. It is there that many attack vectors are hiding. Also, since there is usually no way to enforce a particular tip approval strategy, it must be such that the nodes would voluntarily choose to follow it knowing that at least a good proportion of other nodes does so.


The growth of cumulative weight in the:

Low load regime: After a transaction gets approved several times, its cumulative weight will grow with speed λw (where w is the mean transaction weight), because all new transactions will indirectly reference this transaction.

High load regime: When the network is in the high load regime, an old transaction with a large cumulative weight will likewise experience weight growth with speed λw, because essentially all new transactions will indirectly reference it. Moreover, when the transaction is first added to the tangle, it may have to wait for some time to be approved. In this time interval, the transaction’s cumulative weight behaves in a random fashion.

Definitions:

  • H(t): the expected cumulative weight at time t (for simplicity, we start counting time at the moment when our transaction was revealed to the network, i.e., h time units after it was created)
  • K(t): the expected number of tips that approve the transaction at time t

At this point the whitepaper goes through some involved calculations that yield the cumulative weight adaptation rate (equations 7 and 8 in the whitepaper).

Conclusions:

1. After a transaction gets approved multiple times in the low load regime, its cumulative weight will grow with speed λw, where w is the mean weight of a generic transaction.

2. In the high load regime, there are two distinct growth phases. First, a transaction’s cumulative weight H(t) grows with increasing speed during the adaptation period according to equation 8 in the whitepaper. After the adaptation period is over, the cumulative weight grows with speed λw (Figure 4). In fact, for any reasonable strategy, the cumulative weight will grow with this speed after the end of the adaptation period, because all incoming transactions will indirectly approve the transaction of interest.

3. One can think of the adaptation period of a transaction as the time until most of the current tips indirectly approve that transaction. The typical length of the adaptation period is given by equation 7 in the whitepaper.
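
The two phases of conclusion 2 can be observed in a toy simulation (hypothetical parameters, unit own weights; this only illustrates the qualitative shape, not equations 7 and 8 themselves). We reuse the random-tip model from above and track how many later transactions indirectly approve one chosen transaction:

```python
import random

random.seed(7)
LAM, H_DELAY, T_END = 20.0, 1.0, 30.0
TARGET_TIME = 5.0   # we watch the transaction issued closest to t = 5

arrival, first_appr, ancestors = [0.0], [None], [frozenset()]

def visible_tips(t):
    s = t - H_DELAY
    return [i for i in range(len(arrival))
            if arrival[i] <= s and (first_appr[i] is None or first_appr[i] > s)] or [0]

t, target, samples = 0.0, None, []
while t < T_END:
    t += random.expovariate(LAM)
    vis = visible_tips(t)
    parents = [random.choice(vis), random.choice(vis)]
    for p in parents:
        if first_appr[p] is None:
            first_appr[p] = t
    # everything a transaction approves, directly or indirectly
    anc = frozenset(parents).union(*(ancestors[p] for p in parents))
    arrival.append(t); first_appr.append(None); ancestors.append(anc)
    if target is None and t >= TARGET_TIME:
        target = len(arrival) - 1          # our transaction of interest
    elif target is not None:
        cw = 1 + sum(1 for a in ancestors if target in a)  # cumulative weight
        samples.append((round(t, 1), cw))

for t_obs, cw in samples[::40]:
    print(t_obs, cw)   # slow start (adaptation), then ~LAM new approvers/unit
```

The printout shows the cumulative weight creeping up at first and then increasing by roughly λ per time unit, matching the λw growth speed (w = 1 here).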


4 Possible attack scenarios

Outpacing attack/large weight attack:

1. An attacker sends a payment to a merchant and receives the goods after the merchant decides the transaction has a sufficiently large cumulative weight.

2. The attacker issues a double-spending transaction.

3. The attacker uses their computing power to issue many small transactions that approve the double-spending transaction, but do not approve, either directly or indirectly, the original transaction that they sent to the merchant.

4. It is possible for the attacker to have a plethora of Sybil identities, which are not required to approve tips.

5. An alternative method to item 3 would be for the attacker to issue a big double-spending transaction using all of their computing power. This transaction would have a very large own weight, and would approve transactions prior to the legitimate transaction used to pay the merchant.

6. The attacker hopes that their dishonest subtangle outpaces the honest subtangle. If this happens, the main tangle continues growing from the double-spending transaction, and the legitimate branch with the original payment to the merchant is orphaned (Figure 5).

7. From the above discussion (calculations in the whitepaper), it is important to recognize that the inequality λ > µ must hold for the system to be secure. In other words, the input flow of “honest” transactions should be large compared to the attacker’s computational power. Otherwise, the estimate (equation 12) would be useless. This indicates the need for additional security measures, such as checkpoints, during the early days of a tangle-based system; in IOTA, this role is filled by the Coordinator. A toy simulation of this outpacing race follows after this list.

8. When choosing a strategy for deciding which of two conflicting transactions is valid, one has to be careful when using cumulative weight as a decision metric. This is due to the fact that cumulative weight can be subject to an attack similar to the one described in Section 4.1: the attacker may prepare a double-spending transaction well in advance, build a secret subtangle referencing it, and then broadcast that subtangle after the merchant accepts the legitimate transaction. A better method for deciding between two conflicting transactions might be the one described in the next section: run the tip selection algorithm and see which of the two transactions is indirectly approved by the selected tip.
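
The race in items 6 and 7 can be sketched as a toy Monte-Carlo experiment (all parameters hypothetical, unit own weights; this is far simpler than the whitepaper’s equation 12 and only illustrates why λ > µ matters):

```python
import random

def race_once(lam, mu, w_accept, horizon):
    """One race: the honest branch grows at rate lam, the attacker's at rate mu."""
    honest = attacker = 0
    t = 0.0
    # Phase 1: both branches grow until the merchant accepts at weight w_accept.
    while honest < w_accept:
        t += random.expovariate(lam + mu)
        if random.random() < lam / (lam + mu):
            honest += 1
        else:
            attacker += 1   # the attacker's branch grows in secret
    # Phase 2: the attacker broadcasts; they win if their branch ever
    # outweighs the honest one before the time horizon.
    while t < horizon:
        if attacker > honest:
            return True
        t += random.expovariate(lam + mu)
        if random.random() < lam / (lam + mu):
            honest += 1
        else:
            attacker += 1
    return attacker > honest

def win_probability(lam, mu, w_accept=50, horizon=200.0, trials=500):
    return sum(race_once(lam, mu, w_accept, horizon) for _ in range(trials)) / trials

print(win_probability(lam=10.0, mu=5.0))    # lam > mu: attacker almost never wins
print(win_probability(lam=10.0, mu=12.0))   # mu > lam: attacker almost always wins
```

The sharp flip between the two printouts is the point of item 7: the honest transaction flow λ must dominate the attacker’s rate µ, otherwise no waiting rule saves the merchant.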

4.1 A parasite chain attack and a new tip selection algorithm

1. An attacker secretly builds a subtangle that occasionally references the main tangle to gain a higher score. Note that the score of honest tips is roughly the sum of all own weights in the main tangle, while the score of the attacker’s tips also contains the sum of all own weights in the parasite chain. Since network latency is not an issue for an attacker who builds a subtangle alone, they might be able to give more height to the parasite tips if they use a sufficiently strong computer. Moreover, the attacker can artificially increase their tip count at the moment of the attack by broadcasting many new transactions that approve transactions that they issued earlier on the parasite chain (Figure 6). This gives the attacker an advantage in the case where the honest nodes use some selection strategy that involves a simple choice between available tips.

2. To defend against this attack style, we use the fact that the main tangle is supposed to have more active hashing power than the attacker. Therefore, the main tangle is able to produce larger increases in cumulative weight for more transactions than the attacker. The idea is to use an MCMC (Markov chain Monte Carlo) algorithm to select the two tips to reference: a random walker starts deep in the tangle and walks toward the tips, preferring steps toward transactions with higher cumulative weight.

3. It is easy to see why the MCMC selection algorithm will, with high probability, not select one of the attacker’s tips. The reasoning is identical to the lazy tip scenario: the sites on the parasite chain will have a cumulative weight that is much smaller than the sites they reference on the main tangle. Therefore, it is unlikely that the random walker will ever jump to the parasite chain unless it begins there, and this event is not very probable either, because the main tangle contains more sites.

4. In any case, there is no large incentive for nodes to be selfish, because the possible gains only amount to a slight decrease in confirmation time. This is inherently different from other decentralized constructs, such as Bitcoin. The important fact is that nodes have no reason to abandon the MCMC tip selection algorithm.
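
Here is a minimal sketch of the biased random walk on a hand-made toy DAG (the graph and cumulative weights below are hypothetical; the transition rule exp(−α(Hx − Hy)) is the one proposed in the whitepaper, with α a configurable bias parameter):

```python
import math
import random

# approvers[x] = transactions that directly approve x; leaves are tips.
approvers = {
    "genesis": ["a", "b"],
    "a": ["c", "d"],
    "b": ["c"],
    "c": ["e"],
    "d": ["e", "f"],
    "e": [],          # tip
    "f": [],          # tip
}
# H[x] = cumulative weight of x (own weight 1 + weights of all approvers).
H = {"genesis": 7, "a": 5, "b": 3, "c": 2, "d": 3, "e": 1, "f": 1}
ALPHA = 0.5  # higher alpha makes the walk follow heavy branches more strictly

def mcmc_tip(start: str) -> str:
    """Walk from `start` toward the tips, biased toward high cumulative weight."""
    x = start
    while approvers[x]:
        ys = approvers[x]
        weights = [math.exp(-ALPHA * (H[x] - H[y])) for y in ys]
        x = random.choices(ys, weights=weights)[0]
    return x

counts = {}
for _ in range(1000):
    tip = mcmc_tip("genesis")
    counts[tip] = counts.get(tip, 0) + 1
print(counts)  # walks concentrate on tips behind heavy, well-referenced sites
```

A parasite chain attached here would consist of sites with tiny cumulative weight compared to their reference point on the main tangle, so the factor exp(−α(Hx − Hy)) makes the walker almost never step onto it.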

4.2 Splitting attack

1. Aviv Zohar suggested the following attack scheme against the proposed MCMC algorithm. In the high-load regime, an attacker can try to split the tangle into two branches and maintain the balance between them. This would allow both branches to continue to grow. The attacker must place at least two conflicting transactions at the beginning of the split to prevent an honest node from effectively joining the branches by referencing them both simultaneously. Then, the attacker hopes that roughly half of the network would contribute to each branch so that they would be able to “compensate” for random fluctuations, even with a relatively small amount of personal computing power. If this technique works, the attacker would be able to spend the same funds on the two branches.

2. To defend against such an attack, one needs to use a “sharp-threshold” rule that makes it too hard to maintain the balance between the two branches. An example of such a rule is selecting the longest chain on the Bitcoin network.

3. It is worth noting that the attacker’s task is very difficult because of network synchronization issues: they may not be aware of a large number of recently issued transactions.

4. Another effective method for defending against a splitting attack would be for a sufficiently powerful entity to instantaneously publish a large number of transactions on one branch, thus rapidly changing the power balance and making it difficult for the attacker to deal with this change. If the attacker manages to maintain the split, the most recent transactions will only have around 50% confirmation confidence (Section 1), and the branches will not grow. In this scenario, the “honest” nodes may decide to start selectively giving their approval to transactions that occurred before the bifurcation, bypassing the opportunity to approve the conflicting transactions on the split branches.

5. One may consider other versions of the tip selection algorithm. For example, if a node sees two big subtangles, it chooses the one with the larger sum of own weights before performing the MCMC tip selection algorithm outlined above; a toy illustration of why such a rule is “sharp” follows below.
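
A minimal sketch of that sharp-threshold behaviour (hypothetical numbers; honest nodes simply commit to whichever branch is currently heavier):

```python
import random

def choose_branch(weight_a: float, weight_b: float) -> str:
    """Pick the sub-tangle with the larger total own weight (coin flip on a tie)."""
    if weight_a != weight_b:
        return "A" if weight_a > weight_b else "B"
    return random.choice(["A", "B"])

# Start from the attacker's ideal: a perfectly balanced split.
wa = wb = 1000.0
for _ in range(100):                  # 100 honest transactions arrive
    if choose_branch(wa, wb) == "A":
        wa += 1.0
    else:
        wb += 1.0
print(wa, wb)  # the first random tie-break snowballs; the split collapses
```

Because every honest transaction reinforces whichever branch is ahead, the attacker has to outpace the entire honest flow to restore the balance, which ties back to the λ > µ condition.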

 

Attack-scenario conclusions:

1. We considered attack strategies for when an attacker tries to double-spend by “outpacing” the system.

2. The “large weight” attack means that, in order to double-spend, the attacker tries to give a very large weight to the double-spending transaction so that it would outweigh the legitimate subtangle. This strategy would be a menace to the network in the case where the allowed own weight is unbounded. As a solution, we may limit the own weight of a transaction from above, or set it to a constant value.

3. In the situation where the maximal own weight of a transaction is m, the best attack strategy is to generate transactions with own weight m that reference the double-spending transaction. When the input flow of “honest” transactions is large enough compared to the attacker’s computational power, the probability that the double-spending transaction achieves a larger cumulative weight can be estimated using formula (12) (see also the examples below equation 12 in the whitepaper).

4. The attack method of building a “parasite chain” makes approval strategies based on height or score obsolete, since the attacker’s sites will have higher values for these metrics when compared to the legitimate tangle. On the other hand, the MCMC tip selection algorithm described in Section 4.1 seems to provide protection against this kind of attack.

5. As a bonus, the MCMC tip selection algorithm also offers protection against lazy nodes.


5 Resistance to quantum computations

As of today, one must check an average of 2^68 nonces to find a suitable hash that allows a new block to be generated. It is known (see e.g. [source 15]) that a quantum computer would need Θ(√N) operations to solve a problem that is analogous to the Bitcoin puzzle stated above. This same problem would need Θ(N) operations on a classical computer.

Therefore, a quantum computer would be around √(2^68) = 2^34 ≈ 17 billion times more efficient at mining the Bitcoin blockchain than a classical computer. Also, it is worth noting that if a blockchain does not increase its difficulty in response to increased hashing power, there would be an increased rate of orphaned blocks. For the same reason, a “large weight” attack would also be much more efficient on a quantum computer. However, capping the weight from above, as suggested in Section 4, would effectively prevent a quantum computer attack as well. This is evident in IOTA, because the number of nonces that one needs to check in order to find a suitable hash for issuing a transaction is not unreasonably large; on average, it is around 3^8. The gain of efficiency for an “ideal” quantum computer would therefore be of order 3^4 = 81, which is quite acceptable.
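
A quick sanity check of that arithmetic (the Grover-style √N speed-up applied to both figures):

```python
import math

bitcoin_nonces = 2 ** 68   # average nonce checks per Bitcoin block
iota_nonces = 3 ** 8       # average nonce checks per IOTA transaction

# sqrt(N) is the idealized quantum search cost, so the speed-up factor is sqrt(N).
print(math.isqrt(bitcoin_nonces))  # 17179869184, i.e. 2^34, about 17 billion
print(math.isqrt(iota_nonces))     # 81, i.e. 3^4
```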

More importantly, the algorithm used in the IOTA implementation is structured so that the time to find a nonce is not much larger than the time needed for the other tasks necessary to issue a transaction. That latter part is much more resistant to quantum computing, and therefore gives the tangle much more protection against an adversary with a quantum computer than the (Bitcoin) blockchain has.



Questions about the whitepaper, the math it contains, or specific attack vectors can be discussed in the development Slack under #tanglemath.

 

Explaining Series: Fog Computing in the Internet of Things

Explaining Series: Fog Computing in the Internet of Things

Fog computing: one of the many trendy terms we see and read almost everywhere in this field.

What is it, and how can IOTA enable the perfect fog-computing landscape the IoT needs?

I’ll give you a short explanation and good sources for a quick heads-up.


Roundup:

This roundup is an experiment that aims at a better understanding of the greater picture. The keywords preceding the actual article are meant as an information index.

  • Internet of Things (IoT) = term coined by Kevin Ashton (MIT), 1999
  • Fog computing = term coined by Cisco
  • Fog = decentralized/distributed
  • Cloud = centralized
  • Realm = IoT + IIoT, B2B, M2M, IoE, smart grids, smart homes, smart cities, the interconnected world
  • Problem = unused sensor data; the need for a distributed-network solution; the costs and latency of cloud computing
  • Applications = evolving markets, quality-as-a-service, machine communication, SCADA
  • IoT systems = basically two groups: 1) identification group (sensors, data gathering) 2) computational group (processing, data storage)
  • Limitations until now = cloud computing (centralized, far away from consumers and devices) doesn’t fit the requirements of the IoT (distributed, in need of nearby storage, computational resources, and instant processing); bandwidth
  • Connection types = Wi-Fi, Bluetooth, ZigBee, cellular (2G/3G/4G/5G), radio, Z-Wave, 6LoWPAN, Thread, NFC, Sigfox, Neul, LoRaWAN

The IoT

The vision of the Internet of Things is still in the making.

With the latest developments in this interconnected world, new markets are emerging and a variety of new requirements is being born.

Wearables, smartphones, and domestic devices such as smart-home solutions for an intelligent household demand an interconnectivity solution that has yet to come.

It’s no secret that almost every company is working on solutions to make it happen: a world where data is a more valuable resource than oil, if not today, then in the near future.

This leads to a point where today’s technical barriers hinder tomorrow’s progress.

The IoT, a distributed network spanning the world, is more than the Internet: a mesh net connected via every possible connection type. Where devices work in local clusters, it’s obvious that centralized components, sometimes on a different continent, don’t fit into the greater picture.

Sensors, cameras, and smart devices often use ad-hoc solutions to function in their specific field, such as monitoring systems like SCADA that send valuable data to a nearby control center in order to optimize industrial processes.

What if these monitoring systems are time-sensitive, but the current solutions are slow and, on top of that, centralized and unsecured? Productivity could be better, employees may work in a more dangerous environment, and as a result the company could face problems.

Connected facilities incentivize industrial espionage and hacks.

Distributed denial-of-service (DDoS) attacks are a phenomenon of the last few years, in which malicious parties attack infrastructural points of the web to cripple the communication of certain systems and services.

Sometimes as a decoy for a hack, sometimes for political or activist reasons.

Not rarely, such downtimes create notable financial losses, and the blockage of regional infrastructure hits, next to the actual target, other companies located in the surrounding area.

A problem of the Internet, not necessarily of the IoT.

Due to its distributed mesh-net characteristics, the IoT is envisioned as a self-sufficient network that can connect devices of the identification group in many ways, not only one.

An attack on central points is by definition impossible, because there is no center in the IoT.

That leads to a natural resistance against DDoS attacks and other downtimes.

Legacy systems vs. new systems

An additional issue of cloud computing in the IoT is cost. Legacy systems used to ignore huge amounts of data because there was neither the storage nor the need for it.

New, smart systems in the IoT rely on this data, but sending it all into the cloud would go beyond the scope of the IoT. Too much information is generated, and real-time analysis conflicts with centralized cloud computing, as uploading these huge amounts of data takes time and money, especially if the cloud storage is thousands of miles away.

Fog computing, however, creates a bridge between the identification group and the computational group: it forwards computational power to the edge of the network, where the data is generated and the results are needed.

The benefits of using Fog computing instead of legacy cloud systems are tremendous.

Varghese, Wang, et al. [2017] come to the conclusion that, “For an online game use-case, we found that the average response time for a user is improved by 20% when using the edge of the network in comparison to using a cloud-only model. It was also observed that the volume of traffic between the edge and the cloud server is reduced by over 90% for the use-case.”

This is just one use case, and it can be mirrored in many other settings.

In consumer markets, Quality of Service and Quality of Experience are important factors.

Another example would be the transparent customer. When a transparent customer enters a big supermarket, his views and interests could be analyzed within seconds.

Cameras can detect his interest in certain devices or components, and advertisements on monitors along his path can be adjusted to his specific needs. With old legacy systems this is impossible due to the long round-trips between the cameras, a distant cloud, and the computational resources; with fog computing, however, the data can be processed far faster, delivering the relevant information back to the customer along his way through the mall.

To draw a simplified picture of the fog landscape: the distributed mesh net grows in height z, if you will, whereas decentralized and centralized networks grow on the x and y axes. The result of fog computing is shorter paths from the data collectors to the computational resources.

Concerns can be addressed with IOTA

Whether it’s data integrity, optimization, or the protection of in-house research & development data, companies are looking for a lasting solution.

When data is stored centrally, hackers usually use social engineering or phishing attacks to gain access to it.

While centrally stored data could be captured all at once with such methods, fog computing makes it possible to store sensitive information in small, distributed packets, each with different passwords/keys/seeds required to access them.
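
As a toy illustration of that idea (hypothetical code, not an IOTA API; a real deployment would use proper authenticated encryption and key management):

```python
import secrets

def split_with_keys(data: bytes, chunk_size: int = 16):
    """Split data into small packets, each sealed with its own one-time key."""
    packets = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        key = secrets.token_bytes(len(chunk))              # fresh key per packet
        cipher = bytes(a ^ b for a, b in zip(chunk, key))  # one-time-pad XOR
        packets.append((cipher, key))
    return packets

packets = split_with_keys(b"R&D telemetry: vibration=0.12g temp=77.1F")
print(len(packets), "packets, each useless without its own key")
```

Stored across different fog nodes, no single breach reveals the whole record.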

IOTA can deliver a unique solution here: a data stream, bound to countless seeds, in a distributed network, secured with sophisticated algorithms. Not even quantum computing would be a threat to the hashes.

As you may already know, IOTA is a distributed ledger technology that enables fee-free transactions.

For data transfer with fog computing, you wouldn’t even need tokens; the only condition is to confirm two other transactions before sending one of your own.

A rule that enables true scalability for a billion-device network on a global scale.
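
In code, the rule boils down to something like the following sketch (purely illustrative, not the real IOTA client API; the class and method names are invented stand-ins):

```python
import hashlib
import random
from dataclasses import dataclass

@dataclass
class Transaction:
    payload: str   # e.g. a sensor reading; zero-value, so no tokens move
    trunk: str     # hash of the first approved transaction
    branch: str    # hash of the second approved transaction

class DummyNode:
    """Stand-in for a real node; real tip selection and PoW are far more involved."""
    def __init__(self):
        self.tips = ["GENESIS"]

    def attach(self, payload: str) -> Transaction:
        # The fee replacement: approve two existing tips before you may send.
        trunk, branch = random.choice(self.tips), random.choice(self.tips)
        tx = Transaction(payload, trunk, branch)
        tx_hash = hashlib.sha256(f"{payload}{trunk}{branch}".encode()).hexdigest()
        # The approved transactions stop being tips; the new one becomes a tip.
        self.tips = [t for t in self.tips if t not in (trunk, branch)] + [tx_hash]
        return tx

node = DummyNode()
print(node.attach("temperature=21.3C"))  # a fee-free data transaction
```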

With Masked Authenticated Messaging, IOTA has an additional option for sending and processing sensitive data.

Now, a really big hurdle in the IoT is the coexistence of dozens of connection types and different norms.

If devices could be connected in a uniform way, usability would increase. The plethora of standards built for the IoT can lead to a fragmentation of the network, as companies want to stick to their own standards to support their product line or roadmap.

If IOTA were the standard settlement and data layer, free to use, the Internet of Things could become a barrier-free environment with true scalability and data integrity.

Due to the value of collected data, new markets would emerge that aim to sell this information in real time.

People might be able to sell their consumer data each time they enter a shop, paid via true nano-payments.

If data were collected in the fog, BigchainDB, a scalable distributed database for all kinds of data, could deliver the necessary infrastructure for customers, institutes, and companies.

A seamless solution for the IoT.

Fog computing is, therefore, the next necessary milestone in the field of the Internet of Everything and a vital part of the vision of IOTA.

 

Video of Dominik Schiener’s presentation at Tech Open Air 2017

 

 

 


Sources:

https://www.cisco.com/c/dam/en_us/solutions/trends/iot/docs/computing-overview.pdf

http://www.springer.com/de/book/9783319576381

https://arxiv.org/pdf/1701.05451.pdf

 

Images:

<https://www.iot-now.com/2016/01/11/40800-connectivity-on-the-edge/>

 
