The blockages in blockchain – bad advice, bad approaches and bad designs

To paraphrase the premise on determinism of the 19th-century physicist and philosopher Pierre-Simon Laplace: if the position and momentum of every atom were known, their past and future values could be calculated. He described a world in which the future could be predicted and outcomes engineered. Just as the fundamentals of quantum mechanics exposed the limitations and flaws of Laplace's assertion, the fundamentals of a working market, of mathematics and of production engineering expose the limitations and flaws of current distributed ledger technology (DLT) proofs of concept (POCs) in financial services.

The opportunity is to create an extremely low-cost, easily implemented working market, writes Paul F. Dowding: a market in which all forms of capital could be raised, issued, financed and traded with liquidity and price transparency at market volumes. Near-real-time settlement would generate balance-sheet savings and new product opportunities. However, fixated on the word "ledger", most POCs focus on payments or settlement for limited-purpose use cases with low-volume, high-latency, small-population transactions in unregistered securities. Further, the POCs cannot comprehensively record a working market's non-ledger-referenced and potentially undefined obligations, e.g. shorts, payables and financing.

Based on the Bitcoin model, the POCs have scalability issues: active-memory requirements grow continuously because transactions and balances are recorded together in their respective blocks. If balances were recorded separately from transactions, as in current financial operations, transactions could be archived and balances carried forward, allowing the system to scale. Bitcoin also assigns active and passive roles to payees and payors; similarly, the POCs have instigator and associate roles, which create latency and potential control issues. Lastly, the POCs resemble P2P networks without all of the currently regulatory- or contractually-required roles in place. Market participants still functionally require fund managers, brokers, markets and custodial/depository services, even if their processes and pricing are drastically reduced.
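The separation the author argues for can be sketched as follows. This is a minimal, hypothetical illustration (the class and method names are invented for this example, not taken from any POC): balances are kept apart from the transaction log, so settled transactions can be moved to cold storage while only the carried-forward balances remain in active memory.

```python
from collections import defaultdict

class Ledger:
    """Illustrative ledger with balances held separately from transactions."""

    def __init__(self):
        self.balances = defaultdict(int)   # account -> carried-forward balance
        self.transactions = []             # active (unarchived) transactions
        self.archive = []                  # cold storage; not needed to validate new activity

    def record(self, payor, payee, amount):
        if self.balances[payor] < amount:
            raise ValueError("insufficient balance")
        self.balances[payor] -= amount
        self.balances[payee] += amount
        self.transactions.append((payor, payee, amount))

    def roll_forward(self):
        # Archive settled transactions; the balances alone carry state
        # forward, so active memory stays flat as transaction volume grows.
        self.archive.extend(self.transactions)
        self.transactions.clear()
```

In a block-chained design, by contrast, validating new activity requires the full transaction history to remain accessible, which is the scalability problem described above.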

Within metamathematics, Alan Turing devised a theoretical computer to prove that mathematics does not possess decidability: no algorithm can decide whether an arbitrary algorithm will halt or run forever (the halting problem). Yet Turing-incomplete code with no loops, which is critical to Bitcoin's integrity, has been abandoned by the major POCs in favor of flexible, customized smart contracts, and that choice introduces potential instability and vulnerability to attack.
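The guarantee that Turing-incompleteness buys can be made concrete with a toy interpreter. This is a hypothetical sketch in the spirit of Bitcoin's Script (the opcodes and function names here are invented for illustration): because the language has no jump or loop instruction, each opcode executes at most once, so every program provably halts within `len(program)` steps and the halting problem never arises.

```python
def run(program, stack=None):
    """Execute a loop-free stack program; guaranteed to halt in one pass."""
    stack = list(stack or [])
    for op, *args in program:          # single forward pass, no backward jumps
        if op == "PUSH":
            stack.append(args[0])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "EQUAL":
            b, a = stack.pop(), stack.pop()
            stack.append(a == b)
        else:
            raise ValueError(f"unknown opcode {op}")
    return stack

# A script "validates" if it leaves True on top of the stack.
script = [("PUSH", 2), ("PUSH", 3), ("ADD",), ("PUSH", 5), ("EQUAL",)]
```

Add a loop or jump opcode and the termination guarantee disappears, which is precisely the trade-off the flexible smart-contract platforms have made.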

Considering production engineering, every transaction life-cycle is a production process. Latency cannot be reduced and capacity cannot be maximized unless processes are minimized through pre-assembled parts and run in parallel. The POCs consist of multiple, multi-party, sequential, complete-assembly processes, one of which is voting and tallying consensus on a one-dimensional ledger; collectively, these are the antithesis of an optimal production process. Moreover, downstream processes can be fulfilled only through APIs to separate systems, with the reconciliations those entail. Attempts to increase performance through "sharded" (limited-participation) consensus and "lightning networks" (side-chain processing) will never reach the industry's performance requirements for larger groups, but there is a greater problem: the voting process is fundamentally flawed. With full adoption, proof of stake would concentrate the greatest value with passive investors, leaving three untenable choices: a proxy vote for every block; delegated, litigation-vulnerable voting; or abstention, which prevents network consensus. Regardless, while game theory works for market analysis, it should not be the cornerstone of the secure recording and transfer of value.

An optimal DLT design would use standardized, algorithmically validated life-cycle scripts. Parties would independently generate self-validating contra-transactions on a multi-dimensional ledger, so that the network's capacity for broadcasting and receiving transactions is limited only by its hardware. The creation and validation of transactions and balances would be separate processes managed within each node, and downstream processes could have further data elements confidentially written and reported in real time.
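One way to read the contra-transaction idea is that validation reduces to matching rather than voting: each party independently records its own side of a deal, and the pair self-validates when the two entries mirror each other. The sketch below is an assumption-laden illustration of that principle only; the field names and matching rule are invented here, not taken from the author's design.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Entry:
    """One party's independently generated side of a deal (illustrative fields)."""
    deal_id: str
    party: str
    counterparty: str
    asset: str
    quantity: int      # signed: positive = receive, negative = deliver

def is_contra(a, b):
    # Two entries self-validate when they are exact mirror images of the
    # same deal -- no network-wide vote or tally is required.
    return (a.deal_id == b.deal_id
            and a.party == b.counterparty
            and a.counterparty == b.party
            and a.asset == b.asset
            and a.quantity == -b.quantity)

buyer  = Entry("D1", "FundA", "BrokerB", "XYZ", +100)
seller = Entry("D1", "BrokerB", "FundA", "XYZ", -100)
```

Because each node can perform this check locally on entries it receives, throughput scales with hardware rather than with the round-trips of a consensus vote, which is the property the paragraph above claims for the design.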

Determinism, while invaluable for predicting or engineering limited-scope outcomes, becomes unreliable in complex environments. Similarly, simple use cases create the illusion of greater potential for sub-optimal DLT POCs. The fundamentals require that the current POCs' core and protocol layers be redesigned. Any POC participant should ask its advisors and partners why they did not identify or understand these fundamental issues, and their insurmountable nature, in the POC designs. While experimentation and education may be useful by-products, the POCs have suffered from bad advice, bad approaches and bad designs, and are not commercially viable.

Written by Paul F. Dowding, managing director, blockchain solutions lead, strategy and solutions practice, Gartland and Mellina Group.
