0x429F
13 hours ago

A common hopium promoted by crypto people is “we must decentralize everything and put it on a blockchain”, which has resulted in bizarre memes like “web3”. In reality, public blockchains are only suited to very specific use cases, when both of these criteria are fulfilled:

  • Peer-to-peer operation: software that can be run by anyone, and messages are passed directly between them

  • Strict global consensus: all peers must agree on exactly the same results

As such, blockchain apps are a niche subset of “web2”, not a new thing, and not something that can or should “replace” “web2”.

Firstly, not everything needs to be decentralized. Most services are best operated in a centralized manner. Further, in the vast majority of cases where decentralization is indeed beneficial, peer-to-peer operation is enough to achieve it. In others, local consensus is all you need; global consensus is not only unnecessary, but also wasteful.

0x429F
September 18th, 2022

This topic has been well covered by StarkWare, and recently Vitalik published a detailed post about layer 3s. So, do I have anything to add? I always thought I did not, which is why I never wrote on the topic - and it’s been a good ~10 months since I first discussed it with the StarkWare folks. But today, I think I might be able to ramble about it from a different perspective.

The first thing to understand is that “web2” runs on 100,000,000 servers around the world. “Web3” is a rather silly meme, given it is explicitly a niche subset of “web2”. But let’s assume this blockchain stuff can create its small and sustainable - but lucrative - niche, appealing to scenarios which strictly require distributed trust models and relatively minimal compute (i.e. nothing like supercomputers with custom hardware encoding millions of videos in real time). Let’s say we’ll need only 0.1% of the compute capabilities of “web2”. That’s 100,000 servers to establish a small niche.

Now, let’s consider a high-TPS monolithic chain like BNB Chain or Solana. Though it may seem impressive next to a chain that prioritizes security & decentralization, like Bitcoin, it’s necessarily limited to a mid-range server, because you must get hundreds or thousands of entities to stay in sync. Today, a higher-end server will have 128 cores, not 12; 1 TB of RAM, not 128 GB; and so on. Immediately, it seems absurd that one mediocre server will be able to meet the demand. Indeed, were it to be successful, a single fully on-chain game would probably need multiple high-end servers with 10x Solana’s compute capability.

The next step is rollups. While the design space for a specialized execution layer is wide and evolving, I’m talking about rollups with 1-of-N trust assumptions. Because of the 1-of-N assumption (as opposed to 51%-of-a-large-M), there’s no longer a need to run thousands of nodes. So, ceteris paribus, a rollup can upgrade to a more performant server. ZK rollups have a particular advantage because most nodes can simply verify validity proofs - so you only need a handful of full nodes with high-performance servers. Yes, you need provers, but these proofs only need to be generated once, and proving times continue to decrease with advancements in both software and hardware.

0x429F
September 10th, 2022

Previously, I’ve discussed why Ethereum should cancel danksharding (or at least, do it only when it’s robust and battle-tested years down the line): 4844 and Done.

Here, I’ll go even further and argue that even proto-danksharding (EIP-4844) is too complex, and that right now we should focus on a forgotten gem: EIP-4488.

I’m wildly non-technical and misinformed (OK, I’m not misinformed, just non-technical), but as I understand it from discussions with actually technical people, EIP-4488 is a relatively simple EIP with only a few lines of change required. It can be shipped within weeks, if there’s a desire to.

I’ll recommend a couple of changes to EIP-4488, though. Post-Merge, assuming Ethereum blocks are 100% calldata, we’re prepared for 77 kB/s, or 940 kB/block. I’d recommend making EIP-4488’s calldata target lower than the existing target. This will a) alleviate all fears about burst throughput, because it’ll actually be lower than what currently exists; and b) reflect the fact that there isn’t that much demand from rollups right now. We have seen transaction fees on rollups drop to $0.01-$0.05, and even sub-cent in quieter times. In these times, we’ve seen L2 fees actually start to dominate on zk rollups, and even become a significant portion on optimistic rollups. Even if we go with half the max calldata per block proposed, that’ll be enough for months or years to come, even if there’s some unforeseen sudden exponential adoption at some point.
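As a sanity check on those numbers, here’s a back-of-the-envelope sketch. The 15M gas target, 16 gas per non-zero calldata byte (EIP-2028), and 12-second slots are my assumptions for the calculation, not quoted from EIP-4488:

```python
# Max sustained calldata if blocks were 100% calldata, at the current
# (post-London) 15M gas target and 16 gas per non-zero calldata byte,
# with post-Merge 12-second slots.
GAS_TARGET = 15_000_000
GAS_PER_CALLDATA_BYTE = 16
SLOT_SECONDS = 12

bytes_per_block = GAS_TARGET // GAS_PER_CALLDATA_BYTE   # 937,500 B
bytes_per_second = bytes_per_block / SLOT_SECONDS       # 78,125 B/s

print(f"{bytes_per_block / 1000:.0f} kB/block, {bytes_per_second / 1000:.0f} kB/s")
```

This lands within rounding distance of the ~940 kB/block and ~77 kB/s figures above.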

0x429F
August 18th, 2022

There are many reasons why someone would want to go anonymous, pseudonymous, or both. Perhaps your orthonym (given name) doesn’t match your identity. You share an orthonym with an infamous person you don’t want to be confused with. It’s a difficult name, hard to pronounce, particularly for people from other cultures, so you find an easier one. You’re a performer and you need a more presentable name. You’re an artist dabbling in an experimental side-gig that you want to differentiate from your main oeuvre. You want to shitpost on the internet, but don’t want your partner to tell you off. You want your work in a certain space to be private. You want to play different characters, as all the world’s a stage. You’re a criminal who wants to evade law enforcement. You’re not a criminal, but your work falls into ethical, legal, and social grey areas. You’re a group of people wanting a collective name. You are uncomfortable being in the limelight, feel alienated by it, and just want people to focus on your work, rather than the person(s) behind it. There are many reasons to use a pseudonym!

While the title above uses the word “Anon”, there’s certainly a distinction between anonymous and pseudonymous. For example, one could use pseudonyms in their professional life for different work, but they are not anonymous - people in the industry, auditors, lawyers, workers’ unions etc. know their orthonym and legal identity. However, their online pseudonyms can also be anonymous - well, there’s a spectrum. There are some pseudonyms they will take to their grave. There are others where only a handful of people they trust will know. There are others where it’s barely a secret - for example, using an Instagram account to post photography.

The focus of this post, however, is being anonymous, particularly on the internet, and more specifically having different anonymous pseudonyms - the titular Anon Game. There’s the technical stuff - using a separate internet connection, a separate browser, a separate PC or perhaps multi-boot with a privacy-focused Linux distro, using Tor, so on and so forth. These topics have been covered widely, and there’s nothing I can add to it.

Less covered are the creative aspects of being anon. Essentially, playing the anon game is creating new characters. As such, creative writing is the essential skill to learn here. While writing prose is certainly a beneficial skill, the anon game is more aligned with playwriting or screenwriting. Now, once again, there are tons of books on those fields, so I’m not going to lecture you about how screenplays are written. As an anon, your goal is either that no one knows your orthonym, or, if you’re playing a multi-anon game, that no one knows your other pseudonyms. Either way, the same principles apply - you want to create a new character. Or maybe you don’t, or maybe something in between. But anyway, let’s talk about creating a character.

0x429F
July 22nd, 2022

At EthCC yesterday, Vitalik joked: “should we cancel sharding?”

There were no takers.

I raise my hand virtually and make the case for why Ethereum should cancel danksharding.

The danksharding dream is to enable rollups to achieve global scale while being fully secured by Ethereum. We can do it, yes, but no one asked - should we?

0x429F
May 28th, 2022

The endgame bottleneck for data layers is going to be historical storage. You can read more about it in detail here. Since then, Vitalik has some comments on the matter:

At much higher levels of history storage (eg. 500 TB per year), the risk that some data will be forgotten becomes higher (and additionally, the data availability verification system becomes more strained). This is likely the true limit of sharded blockchain scalability. However, all current proposed parameters are very far from reaching this point.

The current EIP-4844 spec calls for 2.5 TB/year of historical storage, while for the preliminary danksharding spec it’s 42 TB/year. DataLayr, a DA layer based on danksharding, is already targeting 315 TB/year on testnet, and even mentions that with 10,000 nodes it can get up to 31.5 PB/year. For context, what kind of throughput will this enable on rollups? Something on the order of 10-100 million TPS. Yeah, we are a long, long way away from requiring this sort of throughput, and the bottlenecks will be decidedly on the execution layers.
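Those TB/year figures follow directly from per-block data throughput. A quick sketch - note the ~1 MB and ~16 MB per-block data figures below are my inference from the annual numbers, not quoted from the specs:

```python
# Annual historical storage implied by per-block data throughput,
# assuming 12-second slots. Per-block figures (~1 MB for the early
# EIP-4844 spec, ~16 MB for danksharding) are inferred, not quoted.
SLOT_SECONDS = 12
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # 31,557,600

def tb_per_year(bytes_per_block: float) -> float:
    blocks_per_year = SECONDS_PER_YEAR / SLOT_SECONDS
    return bytes_per_block * blocks_per_year / 1e12

print(f"EIP-4844:     {tb_per_year(1e6):.1f} TB/year")
print(f"danksharding: {tb_per_year(16e6):.1f} TB/year")
```

These come out to roughly 2.6 and 42 TB/year, in line with the spec-level numbers cited above.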

In this post, I’ll assume other bottlenecks (distributed builders, P2P, fast KZG proving, sampling etc.) are alleviated, and I’ll explore the feasibility of historical storage. Please note this is highly speculative - I haven’t read anything on the matter before. Just another reminder: for historical storage, you need only one copy of the data to be safe.

0x429F
May 21st, 2022

Preface: Usually, for my blog posts, I just write stream-of-consciousness rambles, and don’t even bother proofreading. This is different - this is the first time I actually put in some effort. Why? To celebrate Mirror launching Writing NFTs on Optimism. I have no idea why anyone would buy NFTs for this post, but if you do: 100% of the proceeds will be directed straight to the Gitcoin multi-sig for public goods funding. I’d love to hear feedback - should I repeat this for my older posts, and future posts?

Blockchains have found their niche as a store-of-value, in DeFi applications, speculation (kwonzis?) and NFTs. Certain identity & reputation protocols like ENS are gathering steam too. Everyone’s asking - what’s next? Many are convinced it’s gaming. In this post, I’ll argue why blockchain gaming isn’t what many crypto people think; but also, how it can find an interesting niche.

Games are diverse

Before we begin: there are all sorts of games, which couldn’t be more different from one another. Likewise, different gamers have different preferences. It’s impossible for one post to cover all areas. So, when I talk about “games” and “gamers”, I’ll take a broad view.

0x429F
May 20th, 2022

I have covered this topic ad nauseam, but I haven’t written a blog post summarizing the situation since Q3 2021. So, here’s an attempt at a somewhat concise summary of why rollups (and their disaggregated execution layer cousins like validiums, volitions, AnyTrust etc.) are orders of magnitude superior to monolithic blockchains - a true 1 to 100 innovation. I can write much more, but some of these points will be enough, for now. As always, I take a long-term view.

Throughput

Consider this perspective: a rollup is simply a monolithic blockchain, but one that’s tuned for the highest throughput. It just outsources security to a different layer that focuses on that. Ceteris paribus, a rollup necessarily offers significantly higher throughput than a monolithic blockchain - unless the monolithic blockchain is totally centralized. Now, different rollup implementations will have different characteristics, but fundamentally rollups are the ultimate throughput solution.

0x429F
May 17th, 2022

This is my opinion on how Vitalik’s contradictions are healthy and necessary, and how they can be reconciled. I relate to Vitalik’s contradictions, which is why I can write a quick blog post without overthinking it:

Contradiction between my desire to see Ethereum become a more Bitcoin-like system emphasizing long-term stability and stability, including culturally, and my realization that getting there requires quite a lot of active coordinated short-term change.

To attain long-term stability, short-term change is necessary. Contrast with Bitcoin, whose premature ossification means there’ll always be fundamental flaws - and they’ll be much harder to resolve later than sooner. So: build a robust system first, ossify second.

Contradiction between my preference for reducing reliance on individuals and trying to build fixed systems that can stand the test of time and my appreciation of "live players" and their role in helping the world move forward.

0x429F
May 15th, 2022

I understand there’s a lot of in-depth research around this, and almost all proof-of-stake L1s are fundamentally designed around the assumption that staking issuance is net neutral. It does seem to make sense once you consider base L1 assets as analogous to equities or currencies, but I believe L1 assets are a completely new breed that’s a combination of currency, equity, store-of-value, speculative asset, economic bandwidth and defence budget. As such, I believe we need heuristics rather than academic research to understand why staking issuance is actually net negative. To be clear, I’m an observer, not a researcher, so please don’t take what I say too seriously - this is 100% speculative - but too many things I have seen in the crypto space have convinced me this model isn’t working, and will fail long term.

Stakers are not fully protected

Firstly, even if we assume that staking issuance is net neutral, only block producers are fully protected from this dilution. This is a small number of users, somewhere between 20 and 2,000 for all alt-L1s, and probably fewer than 10,000 for Ethereum proof-of-stake (though Rocket Pool alone has 1,150 unique node operators in ~6 months, so this can exceed 10,000 long term - but a lot of work needs to be done to enable this). Because it’s a plutocratic election in some way or another (you either directly delegate to a block producer, or you more generally choose a staking derivative or service), there are extremely strong centralization/oligopolist pressures, so what usually happens is that a very small number of entities dominate. For direct-delegation-based systems (which is everything other than Ethereum & Algorand, AFAIK), there’s a long tail of validators who are barely profitable. So, even if there are 2,000 validators on paper, 1,500 have too little stake delegated to them, and earn so little that they may have to sell most or all of their revenues to cover operating costs. It’s worth noting that many of these networks also have higher running costs. So, what do they do? The answer is that a centralized entity often subsidizes delegations. For example, even though Solana has 1,700 validators on paper, only ~150 are significantly above this subsidy. This is a complex matter with many moving parts - there’s also the matter of MEV, priority fees etc. - but the point is: only validators are fully protected.
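To illustrate the long-tail validator economics, here’s a deliberately simplified model. All the stakes, rates, and costs below are hypothetical numbers for illustration, not data from any network:

```python
# Simplified long-tail validator economics: commission revenue vs fixed
# running costs. All numbers are hypothetical, for illustration only.
def annual_profit(delegated_stake_usd: float,
                  issuance_rate: float,
                  commission: float,
                  annual_costs_usd: float) -> float:
    revenue = delegated_stake_usd * issuance_rate * commission
    return revenue - annual_costs_usd

# A top validator with $50M delegated vs a long-tail one with $500k,
# both at 7% issuance, 10% commission, and $15k/year in running costs.
top = annual_profit(50_000_000, 0.07, 0.10, 15_000)  # comfortably profitable
tail = annual_profit(500_000, 0.07, 0.10, 15_000)    # deeply unprofitable

print(f"top: ${top:,.0f}/yr, long tail: ${tail:,.0f}/yr")
```

The long-tail validator has to sell everything it earns and still runs at a loss - hence the subsidies.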

What about the average staker, then? They have to pay the block producers a commission. In some protocols this is as large as 50%, though ~10% seems to be a general average. Because of strong brand value, top block producers or staking services can charge a premium, while some of the newer players may even offer negative commission (and hope to make the money back in MEV and priority fees). Either way, the conclusion is that the vast majority (>99%) of stakers are not actually fully protected from dilution - on average, they’re only ~90% of the way there.
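The “90% of the way there” point can be sketched as a simple model: a delegator earns issuance times (1 - commission), while total supply grows at the full issuance rate, so their share of supply still erodes slowly. This ignores MEV, priority fees, and compounding details:

```python
# A delegator earns issuance * (1 - commission) while total supply grows
# at the full issuance rate, so their share of supply erodes slowly.
# Simplified: ignores MEV, priority fees, and compounding frequency.
def share_after(years: int, issuance: float, commission: float) -> float:
    """Delegator's supply share after `years`, relative to starting share."""
    return ((1 + issuance * (1 - commission)) / (1 + issuance)) ** years

# Hypothetical 7% issuance, 10% commission: ~0.65% share erosion per year.
print(f"after 1 year:   {share_after(1, 0.07, 0.10):.4f}")
print(f"after 10 years: {share_after(10, 0.07, 0.10):.4f}")
```

Small per-year erosion, but it compounds - which is the heuristic behind calling issuance net negative for everyone below the block producers.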

0x429F
May 13th, 2022

For far too long, the narrative has been dominated by short-term solutions with unsustainable designs. It’s never too late to fight back and define what makes this industry sustainable and worthwhile. I have written mostly about rollups, and mostly in 2021 - but you may have caught the subtext: social, economic, and technical sustainability. While the catastrophic failure of the Terra blockchain (aside from the UST meltdown) was an extreme case, the cold hard reality is that almost all L1s today face a similar outcome - just on a much longer timeframe. A lot of this will also apply to applications and rollups, but I’m focusing on L1s/settlement layers here.

Economic

  • Low inflation, <1%
  • Predictable security budget over the long term (note: hard caps may be dangerous)
  • Strong value accrual to the base asset, preferably canceling out most of the inflation (note: high deflation is an anomaly that implies undervaluation, and will be corrected by the markets in the long term)
  • Solutions like PBS (proposer-builder separation) are necessary so most MEV is captured by the base asset
  • Do the above, and it’ll gain monetary premium
  • Monetary premium → higher economic security, more stable settlement layer and base asset → more demand and monetary premium virtuous cycle
  • Validity proofs, statelessness, and fraud proofs for the execution layers; data availability proofs for the data layer. This may not sound economic, but to scale to millions of nodes this may be necessary for economic sustainability.
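The issuance and value-accrual bullets above reduce to a trivial identity - net inflation is issuance minus burn. A sketch with hypothetical rates:

```python
# Net inflation = staking issuance minus value accrual (fees burned).
# All rates below are hypothetical, for illustration only.
def net_inflation(issuance_rate: float, burn_rate: float) -> float:
    return issuance_rate - burn_rate

examples = {
    "sustainable": net_inflation(0.010, 0.008),    # +0.2%/yr, under the 1% target
    "high deflation": net_inflation(0.005, 0.030), # -2.5%/yr, implies undervaluation
}
for name, rate in examples.items():
    print(f"{name}: {rate * 100:+.1f}%/yr")
```

The first case fits the bullets above (low inflation, mostly canceled by accrual); the second is the deflationary anomaly the note says markets would correct.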

Social

0x429F
April 30th, 2022

I have expressed regret multiple times this year for helping propagate the “modular blockchain” meme. There are two significant problems with this: firstly, “modular” is a pretty generic term that has been used by multiple projects to mean very different things. The bigger blunder on my part was not talking about socioeconomic fragmentation enough, which led many to misunderstand “modular blockchains” as meaning different projects should do different things.

I have covered these on Twitter, and in my last post. Here, I’ll make a more concerted effort into correcting my mistakes from 2021.

First, we need a different term. A lot of time is wasted because different projects have rightfully appropriated the term for different reasons, some of them years before rollups or data availability layers were conceived. I like the term “disaggregated blockchain layers” (credit: Intel).

I continue to describe “monolithic blockchain” as a blockchain that does execution, settlement, data availability and history all within one layer secured by one honest majority consensus.

0x429F
October 5th, 2021

Please consider this a work of hard science fiction. I had written present-tense prose (from 2025’s perspective), but had to rework this post to add some future tense (i.e. the 2021 perspective) for context, so it has turned out to be a total mess! So, it’s a terrible work of fiction, but certainly more informative than it was before.

Ethereum is the global settlement layer. Or more technically, the global security and data availability layer.

There’s a flourishing ecosystem of external execution layers like rollups and volitions building on Ethereum. This is where all the users and dApps are.