Author

I am Joannes Vermorel, founder at Lokad. I am also an engineer from the Corps des Mines who initially graduated from the ENS.

I have been passionate about computer science, software matters and data mining for almost two decades.

Tuesday, Apr 10, 2018

A weirder definition of Bitcoin

While attempts have been made in the past to come up with a definition of Bitcoin, I felt that those definitions somehow failed to capture the very essence of Bitcoin, so I decided to roll my own. Enjoy!

A weirder definition of Bitcoin

Abstract: Bitcoin is best characterized as an exceedingly weird virtue-inducing artifact. Attempts at making Bitcoin less weird have only two outcomes: either the attempt fails and Bitcoin just becomes weirder; or the attempt succeeds and the result is not Bitcoin anymore. The weirdest part of Bitcoin is probably that its own excess of weirdness can itself be relied upon, as the author demonstrates by deriving corollaries of practical interest from the present definition. Anecdotally, it also explains why Satoshi Nakamoto opted to remain anonymous, as it is usually frowned upon to conjure exceedingly weird inventions.

PDF at http://media.lokad.com/bitcoin/weirder-bitcoin-2018-04-10.pdf

Friday, Apr 6, 2018

Addressing a few loose angles of Bitcoin

Two weeks ago, I had the unique privilege of meeting not one, but a whole series of truly remarkable people at Satoshi’s Vision in Tokyo. The list includes Amaury Séchet, Shammah Chancellor, Tomas van der Wansem, and quite a few others. While Bitcoin had caught my interest back in 2011, I had never taken much time to think about the Nakamoto consensus itself. In my defense, running Lokad, my company, was simply capturing my day-to-day attention. Thus, in Tokyo, I came to realize that there were some angles within Bitcoin which were perhaps not yet getting the proper solutions they deserve.

Terabyte-sized blocks, which I have been contemplating since last December, represent only a narrow angle of Bitcoin - albeit an important one.

Below, I am presenting four articles, written upon my return from Tokyo, covering angles which I believe to be relevant for Bitcoin. Those articles are all still early drafts, and I apologize in advance for their sorry state of writing. Once again, running my own company does not give me all the freedom it would take to get those ideas either polished or debunked depending on their merits.

By making those drafts public, I am primarily seeking the constructive feedback of the Bitcoin community at large. Unlike many speakers at the Satoshi’s Vision conference who can claim almost a decade of experience in those matters, I can claim only about 10 days or so. So please, take those findings with a grain of salt.

 

Ansible, practical faster-than-light secure 0-conf transactions for Bitcoin

Abstract: The Ansible is a peer-to-peer pre-consensus signal that, by itself, makes 0-conf transactions secure. The Ansible is a self-fulfilling prophecy because making it so aligns with the economic interests of the miners, who have to remain competitive. The Ansible confers two seemingly counterintuitive properties to the Nakamoto consensus. First, securing 0-conf transactions does not require any kind of retaliation against Byzantine miners, and thus no change to the Nakamoto consensus. Second, 0-conf transactions can be secured with arbitrarily low latencies on Earth, despite the fact that this proposition appears to violate the speed of light. The Ansible perspective clarifies why larger and infrequent blocks are actually highly desirable to secure 0-conf transactions.

PDF at http://media.lokad.com/bitcoin/ansible-2018-05-05.pdf

 

Midas, united non-colluding transaction fees for Bitcoin

Abstract: Transaction fees are an integral component of the Bitcoin social contract. They reward miners for playing the long game of Bitcoin, when the monetary mass of Bitcoin will not be growing anymore, not with economic relevance anyway. Through an analysis of Bitcoin looking inward, but even more importantly, looking outward, we demonstrate that Soviet economics are required in the short term to set the transaction fees, but that the transition toward market-driven fees should and will happen in the future. The author proposes Midas, a pre-consensus signal intended for transaction fees, which preserves the competition within the Bitcoin mining market. Midas unifies miners through their mutual interest in preserving the security model of Bitcoin, which includes microlatent transactions. Midas does not require any change to the Nakamoto consensus.

PDF at http://media.lokad.com/bitcoin/midas-2018-05-06.pdf

 

Tokeda, viable token-driven metadata within Bitcoin

Abstract: Tokeda addresses both the challenge of viably preserving an unbounded amount of metadata without endangering Bitcoin itself and the challenge of introducing tokens within Bitcoin by weaving the two problems through aligned economic incentives. Tokeda is compatible with stateless wallets (which include SPV wallets) and requires no consensus change. As a token scheme, Tokeda relies on a trust-but-verify security model centered around the issuer. The issuer is trusted with the relay of inbound transactions from users. The issuer takes care of routing the metadata to remedy the lack of such capabilities within the Bitcoin script. As a metadata layer, Tokeda leverages the UTX set (unspent transactions) as a purposefully pruneable key-value store, which is a superset of the UTXO set (unspent transaction outputs). Tokeda creates a market signaling mechanism at the issuer level, to foster an ecosystem of nodes which can selectively persist the metadata in the UTX set depending on the originating issuer. The author argues that Tokeda is an economically superior form of tokens compared to the code-is-law approach adopted by some of the competitors of Bitcoin. The author also argues that Tokeda is a provable way to incentivize miners to foster a token-driven economy backed by Bitcoin, instead of expecting the miners to subsidize tokens operated over Bitcoin.

PDF at http://media.lokad.com/bitcoin/tokeda-2018-04-30.pdf

 

Sakura, long term UTXO recycling mechanism for Bitcoin

Abstract: The very long term viability of Bitcoin, centuries ahead, depends on preventing the runaway growth of the UTXO set (unspent transaction outputs). Also, the ever shrinking monetary mass of Bitcoin is a complication which hinders economic agents from fully relying on perfectly predictable monetary conditions. Here, we propose Sakura, a long term recycling mechanism to prune “dead” UTXO entries, defined as entries that have remained untouched for 80 years (defined as 4,200,000 blocks). It allows those “dead” entries to re-enter the pool of mining rewards, on top of the normal halving mechanism which occurs every 210,000 blocks. Sakura comes with a trigger condition that “dead” UTXO entries should represent more than 50% of the UTXO set. This condition prevents a premature activation of the change of consensus if there are not enough economic gains to justify the change. Sakura proposes an exponential decay mechanism associated with a half-period of roughly 20 years. The paper also presents a discussion to justify why those seemingly arbitrary choices are made.

PDF at http://media.lokad.com/bitcoin/sakura-2018-05-02.pdf

 

Thursday, Mar 29, 2018

Satoshi's Vision, talk on Terabyte Blocks for Bitcoin

Last week, I was in Tokyo at the Satoshi's Vision conference. I gave a talk about Terabyte Blocks for Bitcoin. Here are the slides. Check the video too; there are some good questions raised at the end of the talk.

Overall, it was an incredible event, tremendously positive for Bitcoin. I am thrilled to see so many hard-working contributors doing their best to address all the challenges that Bitcoin is facing.

Wednesday, Jan 24, 2018

Fast 1D convolution with AVX

Convolutions are important in a variety of fields. For example, in deep learning, convolutional layers represent a critical building block for most signal processing: image, sound or both. My company Lokad also makes extensive use of convolutions as part of its own algebra of distributions.

One technical problem associated with convolutions is that they are slow. The theory tells you that the FFT can be used to obtain good asymptotic performance, but the FFT isn't typically an option when your signal has only a few dozen data points, yet you still need to process tens of millions of such signals.

It is possible to speed up convolutions with a GPU, but my latest experiments tell me that a massive speed-up can already be achieved with CPUs, by leveraging their now widespread vector instructions. This idea isn't new, and I was inspired by an original post of Henry Gomersall on the very same topic.

I have just released my own fast 1D convolution in C++ which differs substantially from the original one posted by Henry Gomersall. More specifically:

  • This implementation works with AVX (rather than SSE), for a further performance boost.
  • The approach can canonically be upgraded to AVX2 (or even larger vector instructions).
  • It delivers full convolutions rather than valid ones (NumPy terminology).
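For reference, the semantics of a full 1D convolution (NumPy terminology) can be sketched with a naive scalar implementation. This is purely illustrative Python, not the released C++/AVX code, which vectorizes the inner loop:

```python
def full_convolve(signal, kernel):
    """Naive 'full' 1D convolution: output length is len(signal) + len(kernel) - 1."""
    n, m = len(signal), len(kernel)
    out = [0.0] * (n + m - 1)
    for i in range(n):
        for j in range(m):
            out[i + j] += signal[i] * kernel[j]
    return out

# Matches numpy.convolve([1, 2, 3], [0, 1, 0.5]) in the default 'full' mode
print(full_convolve([1.0, 2.0, 3.0], [0.0, 1.0, 0.5]))  # [0.0, 1.0, 2.5, 4.0, 1.5]
```

The point of the AVX implementation is precisely to replace this doubly-nested scalar loop with vectorized multiply-adds.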

Compared to the naive C#/.NET implementation of convolution - which was my starting point - this implementation delivers roughly a 20x speed-up; but it comes at the cost of doing PInvoke from C#/.NET.

Sidenote for .NET enthusiasts: AVX intrinsics are coming to .NET. It will soon be possible to write such ultra-optimized code directly from C#/.NET.

Tuesday, Dec 26, 2017

Mankind needs fractional satoshis

Update: Dr Craig Wright points out that payment channels are a viable alternative to fractional satoshis. I am not an expert in payment channels, but they would certainly go a long way toward mitigating the problem discussed below. Then, choosing between on-chain scaling and payment channels boils down to establishing the actual limits of on-chain scaling.

Bitcoin Cash aims at becoming the world currency. As discussed previously, terabyte blocks are needed to achieve this goal. However, the Bitcoin Cash protocol needs a few changes as well. In this post, I will demonstrate why fractional satoshis are needed for Bitcoin Cash.

In the following, for the sake of concision, Bitcoin always refers to Bitcoin Cash.

Overview of the issue

A satoshi is, presently, the smallest unit of payment that can be sent across the Bitcoin network. There are 100 million satoshis in 1 bitcoin. In particular, the smallest non-zero transaction fee that can be paid is 1 satoshi. Non-zero transaction fees are desirable in order to eliminate spam; however, the original intent behind Bitcoin is clearly to keep those fees vanishingly small as far as humans are concerned.

At mankind scale, let’s assume that we have 10 billion humans, and that every human wants to do 50 transactions a day. This might seem a bit high - after all, mankind won’t reach 10 billion humans before 2050 - however, good engineering implies safety margins and thinking ahead. I firmly believe that Bitcoin must be engineered to support 10 billion humans at 50 transactions per day per human.

Let’s further assume that those transactions are secured by paying exactly 1 satoshi per transaction (1). The miners then collect 1e10 * 50 * 365 / 1e8 ≈ 1.8 million BCH per year. This amount is huge, about 10% of the total BCH that will ever be in existence (2).
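The back-of-the-envelope figure above can be checked in a few lines (a sanity check of the post's own arithmetic, nothing more):

```python
humans = 10_000_000_000       # 10 billion humans
tx_per_day = 50               # transactions per human per day
sat_per_bch = 100_000_000     # 100 million satoshis in 1 bitcoin

# Fees collected by the miners at 1 satoshi per transaction, in BCH per year
fees_bch_per_year = humans * tx_per_day * 365 / sat_per_bch
print(fees_bch_per_year)                 # 1,825,000 BCH per year
print(fees_bch_per_year / 21_000_000)    # ≈ 0.087, i.e. roughly 10% of the 21M cap
```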

Bitcoin Cash needs to be designed in such a fashion that it is possible for mankind to spend less than 0.001% of its whole monetary supply per year in order to transact freely. Over the lifetime of a human, 100 years, the total transaction fees would then remain below 0.1% of his or her average monetary capital, which feels about right.

Practical example: let’s assume that my average monetary capital is 100,000€ (just counting cash, not any other asset classes). Over the course of my 100-year lifetime, I will pay 0.1% of this amount to cover all my transaction fees, that is, 100€. On average, that’s 1€/year, that is, 0.27 cents per day. We are not far from the 1/10th of a cent per day of my previous analysis.
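Spelling that example out, with the same figures as the text:

```python
capital_eur = 100_000        # average monetary capital, in €
lifetime_years = 100

total_fees = capital_eur * 0.001          # 0.1% of capital over a lifetime
per_year = total_fees / lifetime_years    # in € per year
per_day_cents = per_year / 365 * 100      # in cents per day
print(total_fees, per_year)               # 100.0 € total, 1.0 €/year
print(round(per_day_cents, 2))            # 0.27 cents per day
```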

As such, the current Bitcoin protocol is not tenable at mankind scale. Satoshis are too large, mankind needs fractional satoshis.

14-bit left shift

Bitcoin transactions are encoded with 64-bit integers. This choice, made back in 2009, remains sound. For the foreseeable future, all CPUs will be 64-bit CPUs. However, the current Bitcoin implementation is wasteful. There are 14 bits that are wasted, as we will see below. Yet, it turns out that those 14 bits are exactly what Bitcoin needs to make transaction fees low enough at mankind scale.

Proposal: 1 bitcoin is redefined as 1,638,400,000,000 naks - nak being the shorthand for nakamoto - that is, 2^14 naks per satoshi.

Let’s demonstrate why 14 bits make sense. With 50 bits, it is possible to represent 2^50 satoshis, that is, about 11 million BCH. The richest BCH address in existence contains about 400k BCH. It’s unlikely that this address will ever grow to 1M BCH, let alone 11M BCH.

Thus, in order to represent even the richest BCH address, the protocol only needs 50 bits. While it may be theoretically possible to accumulate more than 11M BCH on a single address, it’s straightforward to add a rule to the Bitcoin protocol invalidating any transaction which would try to accumulate more than 11M BCH on a single address, forcing the owner of such a fortune to split it across two addresses instead.

Now, the protocol is left with 64 - 50 = 14 bits which are “wasted” if we want to preserve the encoding of transaction inputs and outputs as 64-bit unsigned integers. Re-encoding all amounts from satoshis to naks only requires a 14-bit shift to the left.

As 2^14 = 16384, we can revisit our initial back-of-the-envelope calculations with 10 billion humans doing 50 transactions a day. We have 1e10 * 50 * 365 / (16384 * 1e8) ≈ 111.4 BCH paid in fees to the miners per year. This is much better, about 0.0005% of the whole monetary supply paid to the miners per year, that is, 0.05% over the 100-year lifetime of a human.
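The arithmetic of the shift can be verified directly, using only figures from the text:

```python
SHIFT = 14
naks_per_sat = 2 ** SHIFT           # 16384 naks per satoshi
sat_per_bch = 100_000_000           # 100 million satoshis in 1 bitcoin

# 1 BCH expressed in naks, matching the proposal's 1,638,400,000,000 figure
assert naks_per_sat * sat_per_bch == 1_638_400_000_000

# 50 bits of satoshis cover about 11 million BCH
print(2 ** 50 / sat_per_bch)        # ≈ 11.26 million

# Mankind-scale fees at 1 nak per transaction, in BCH per year
fees = 10_000_000_000 * 50 * 365 / (naks_per_sat * sat_per_bch)
print(round(fees, 1))               # 111.4
```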

A non-urgent yet kind-of-urgent change

Fractional satoshis won’t become a problem until about 1% of mankind starts using Bitcoin to pay for everything. However, with every single day that passes, there is more software out there which depends on the current instance of the Bitcoin protocol. Thus, the Bitcoin ecosystem is accumulating technical debt.

We know that this debt will have to be paid back. Indeed, as demonstrated above, keeping satoshis as the smallest payment unit is not tenable. We also know that this debt comes with compound interest. At the present time, fixing this issue would only incur a modest friction in the ecosystem. 10 years from now, if Bitcoin has any measurable degree of success at being a currency, then it will be a huge mess. Every single piece of Bitcoin-dependent software will be broken by such a change.

Thus, I call on the Bitcoin developers to coordinate in order to introduce fractional satoshis in their mid-term roadmap.

Pre-emptive answers to questions

Will I still own the same amount of BCH? Yes, there is zero impact on your current BCH holding. If you have 1 BCH now, you will still have exactly 1 BCH after the change.

Does it change the upper limit on the number of BCH? Technically yes, but in practice, no. This change would push back the date when mining becomes purely fee-funded by 4 * 14 = 56 years; and the total amount of extra BCH which will ever be mined will be less than 0.01 BCH. Hardly noticeable.

Nitpicking, why not 18 bits?

I do feel strongly that fractional satoshis are needed, and that 14 bits of extra precision is a minimum. However, if someone has a good reason to motivate a shift beyond 14 bits, maybe up to 18 bits, then this person might be right. The discussion below is merely opinionated. This is not a demonstration.

The richest BCH address holds about 400k BCH. Thus, technically, it is still possible to adjust the protocol to free up 18 bits, with a hard cap at 703k BCH for a single BCH address. However, I do see potential edge cases in the ecosystem of Bitcoin.

If Bitcoin succeeds, then the world will start implementing accounting packages, ERP, POS, CRM …, where assets are valued in Bitcoins, or rather in naks. Most of those software developers will use int64 integers (signed) to track the valuations of those assets. Why? Just because it’s what naturally comes to mind as a developer if you need large signed integers.

As non-monetary assets are typically valued at more than monetary assets - e.g. for most people, their home is worth more than the cash they have at the bank, and the same goes for companies - those accounting books may contain values that exceed 10M BCH. Those situations, arguably rare, would trigger bugs known as numeric overflows.

With a 14-bit shift, naive financial software implementations would still work up to 5M BCH (beware, with signed integers we lose 1 bit of precision), while an 18-bit shift would cap the maximal amount at 350k BCH, that is, 16 times less. While it’s only a hunch, my take is that the numeric headroom of a 14-bit shift would be sufficient to eliminate all int64 numeric overflows in financial calculations, even when dealing with the budget of giant corporations. With an 18-bit shift, edge cases would remain somewhat possible.
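The overflow thresholds above follow from signed 64-bit arithmetic; a small sketch (the `max_bch` helper is mine, just for illustration):

```python
sat_per_bch = 100_000_000
INT64_MAX = 2 ** 63 - 1   # largest value of a signed 64-bit integer

def max_bch(shift_bits):
    """Largest amount, in BCH, that fits in a signed int64 count of naks,
    given a left shift of `shift_bits` bits from satoshis to naks."""
    naks_per_bch = (2 ** shift_bits) * sat_per_bch
    return INT64_MAX / naks_per_bch

print(max_bch(14))   # ≈ 5.6 million BCH before overflow
print(max_bch(18))   # ≈ 352k BCH, about 16 times less
```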

(1) Most Bitcoin wallets that exist today do not let you pay 1 satoshi. Instead, the minimal non-zero payable fee is 1 satoshi per byte. However, this behavior only reflects the implementation of the wallet, not a limitation of the Bitcoin protocol itself.

(2) Technically, there is a limit at 21M BCH; however, experts suspect that a few million BCH are lost forever. Anecdotal evidence: I personally know one person who has irremediably lost about 100 BCH. Thus, those estimates sound right. In any case, even if those coins were not actually lost, it would not fundamentally change the discussion above.