
Issues Impacting Block Size Proposals

Speakers: Jeff Garzik

Transcript By: Bryan Bishop

Tags: P2P

Bitcoin was introduced as a p2p electronic cash system. That's what it says in the whitepaper, and that's what users were exposed to. The 1 MB block size limit was set for anti-spam purposes: otherwise, when bitcoin's value is low, it's trivial to DoS the network with large blocks.

Moving on to some observations. The process of finding distributed consensus takes time; it's a settlement system. The core service that bitcoin provides is censorship resistance. If you don't have censorship resistance, you don't have a stable timeline, and you don't have permissionless entry into and exit from the system. That censorship resistance in turn comes from a large network of decentralized nodes, validators, and so on.

Average block size for all time is pretty much what that graph shows. In 2015 you can see that we started to burst up towards 1 MB, but in general blocks are on average not full. The block size limit provides DoS protection; it raises the cost of attack. A bit of history: the 250k soft limit existed until about December 2013, when it was bumped to 350k; in March 2014 it was bumped to 750k. A lot of people run Bitcoin Core with the defaults.

The trend is pretty clear: we are headed towards the 1 MB block size limit. This excludes blocks that the network takes a long time to find, and it also excludes artificial stress tests. Today, blocks aren't full.

Observations about the fee market. We've had mostly zero fee competition for most of bitcoin's history. You have network events where blocks get full for a stretch: traffic tests, stress tests, traffic bursts. And right before the soft limits were bumped, there would be short periods with a little bit of fee competition, and then it would disappear. Users have experienced zero fee competition. The fee floor is not set by fee competition but by the anti-dust relay limit encoded into Bitcoin Core: you cannot get your transaction relayed unless you set this minimum fee. Miners, on the other hand, are welcome to mine any block and any transaction at any fee level, but in practice what matters is that the fee floor is set by the defaults in Bitcoin Core, which essentially filter out dust.
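
To make the relay-floor behavior concrete, here is a minimal sketch; note that the exact default minRelayTxFee has varied across Bitcoin Core releases, so the 1000 satoshis/kB figure below is illustrative rather than authoritative:

```python
# Minimal sketch of the relay fee floor described above -- not Bitcoin Core's
# actual implementation. The 1000-satoshis-per-kB figure is an assumed
# illustrative default.

MIN_RELAY_FEE_PER_KB = 1000  # satoshis per 1000 bytes (assumed)

def meets_relay_floor(tx_size_bytes: int, fee_satoshis: int) -> bool:
    """Return True if a transaction pays at least the minimum relay fee.

    Nodes running with defaults simply drop transactions below this floor,
    so in practice it -- not fee competition -- sets the price users pay.
    """
    required = MIN_RELAY_FEE_PER_KB * tx_size_bytes // 1000
    return fee_satoshis >= required

# A typical ~250-byte transaction needs ~250 satoshis to be relayed,
# regardless of whether it moves 0.001 BTC or 10,000 BTC.
assert meets_relay_floor(250, 250)
assert not meets_relay_floor(250, 100)
```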

Fees provide near-zero economic signaling today. For users it's incredibly difficult to reason about: when you're sending one bitcoin, you have no idea whether your transaction will be small in terms of byte size; the value is not at all correlated with the byte size of the transaction, and the byte size is what the fee is based on. On the miner side, fees are unpredictable. You could ramp your soft limit all the way up to 1 MB to collect as many fees as possible, but the fees are dwarfed by the 25 BTC subsidy. It's pocket change to the miner, just noise, and it doesn't change their behavior through economic signaling: they can't perform some behavior that suddenly makes them more profitable, because it depends on what transactions appear on the network.
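
A rough worked example of that size/value disconnect, using the common back-of-the-envelope size estimate for pay-to-pubkey-hash transactions; the formula and fee rate are illustrative assumptions, not wallet code:

```python
# Sketch of why fees are hard for users to reason about: the fee is a
# function of the transaction's byte size, and size depends on how many
# input coins the wallet must gather, not on the BTC value sent. The size
# formula is the common rough estimate for pay-to-pubkey-hash transactions,
# not an exact serialization.

def estimate_tx_size(n_inputs: int, n_outputs: int) -> int:
    return 148 * n_inputs + 34 * n_outputs + 10

FEE_RATE = 1000  # satoshis per kB, illustrative

def estimate_fee(n_inputs: int, n_outputs: int) -> int:
    return FEE_RATE * estimate_tx_size(n_inputs, n_outputs) // 1000

# Sending 1 BTC from a single large coin is cheap...
print(estimate_fee(1, 2))    # ~226 bytes -> ~226 satoshis
# ...while the same 1 BTC assembled from 50 dust-sized coins costs over
# thirty times as much, for the identical payment value.
print(estimate_fee(50, 2))   # ~7478 bytes -> ~7478 satoshis
```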

I view a non-contentious hard-fork as a useful check and balance. Hard-forks in general should be hard and should be rare, but I don't think they should be impossible. A natural equilibrium block size exists in the absence of the limit, which is a controversial statement: essentially, several incentives outside of maximizing fees will prevent miners from building large blocks. Another observation is that there is rapid miner and mining pool turnover year-over-year; the market shares of mining pools change month-to-month and year-to-year, and who is generating hashes changes from year to year as well. Whoever is king today will not be king six months from now, and so on. That rapid turnover actually works against centralization, because we have permissionless miner entry: anyone can show up and fail to be profitable just like everyone else.

There’s a wall at 1 MB. If blocks are consistently full on average at 1 MB, then that is a major economic policy shift to active fee competition. That is different from bitcoin’s history where there has been basically zero fee competition. From the user experience side, that is a radical change. User software is not prepared for this. User experience rapidly degrades once blocks are full. Though hopefully the spam poured on the network did force the wallets to improve their fee behavior. Out of annoyance there is good.

The wall at 1 MB creates chaos as fees shift to a new, higher equilibrium. When we hit that wall, businesses or users might be incentivized away from bitcoin by high fees. There are more problems at a high level: if we get stuck at 1 MB, if we fail to reach consensus at all because it's incredibly difficult to figure out where to go, and we just sit at 1 MB, then that strangles bitcoin growth and adoption because of how toxic we're making this. Wall Street firms and companies looking at doing bitcoin experiments think, "if I just flip the on switch, then I instantly max out bitcoin's capacity" (though why are they dumping their databases to the blockchain anyway?). Projects never get started; businesses don't join in. That is growth you would hope to measure through block size growth. People don't want to increase the block size until the traffic is there, but people don't want to put traffic there until there is a reasonable guarantee that bitcoin bandwidth will increase.

Bitcoin was built to be upgraded; we should not be stuck at version 1. The protocol is upgradable through soft-forks. They are easier to implement, and a little more insidious, in that they only require the miners to upgrade; users can't vote, because there's no voting. There's no good way to measure opinion about block size, but maybe that's okay. What if users want to go to 10 megabytes, or 1 trillion megabytes? There's no good way to figure out their collective opinion without shooting ourselves in the foot. Another problem I see in a lot of analysis is not thinking it through: there are various proposals, including an older one of mine, where you have a static jump or a static fall in the block size, and what that does is reboot the fee market from scratch, which from a market chaos and disruption standpoint is bad. Not thinking about the user market experience means that if we do nothing, as mentioned in the previous slides, a fee market will appear abruptly when 1 MB blocks are full; users are not informed about this economic policy, so it's essentially foisted upon them by surprise. "I am going to raise prices on everyone in bitcoin, please merge this": there needs to be a lot of introspection and thinking about user experience.

There is market disruption on top of market disruption: you reboot the fee market once when blocks fill, and then again when the limit jumps; from a market perspective, that's just radical, volatile disruption. As mentioned, there's zero fee competition today, and that's potentially a moral hazard: is it sustainable long-term? Users essentially have low fees because the subsidy is economically much larger than the block fees. How long can that continue? And yet it is a valid economic choice: even though it is potentially unsustainable long-term, you can subsidize adoption today and bring people onto the system, which is a rational economic choice to make.

The limit increase has costs, and there are a large number not listed here. A hard-fork is required, though there is one proposal by Adam Back which does not require a hard-fork. Larger blocks potentially push miners off the system. Increased network load is shouldered by ever fewer actors.

More problems to avoid in block size proposals: avoid high priests choosing magic values, including core devs; this should be transitioned to the free market, with a little more active role in managing that free market. Avoid cliffs, where you jump from 1 MB to 2 MB; that is a large market change that produces large market disruption. On the slightly more technical side, Bitcoin Core needs to send blocks out more smartly; it currently just sends blocks to all peers at the same time. A smarter algorithm would be more like BitTorrent, where you unchoke some of the connections and have a more optimal way to send blocks on mainnet, not just on the miner relay network.
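
A toy sketch of that BitTorrent-style idea, with invented names and parameters; this is not Bitcoin Core's actual relay logic, just an illustration of unchoking a few peers per round instead of uploading to everyone at once:

```python
# Toy scheduler: rather than pushing a new block to every peer simultaneously
# (saturating the node's uplink), "unchoke" a few peers at a time and let the
# rest fetch the block in later rounds, or from the peers who already have it.
# All names and parameters here are assumptions for illustration.

import random

def relay_rounds(peers: list[str], unchoke_slots: int = 4) -> list[list[str]]:
    """Partition peers into successive upload rounds of `unchoke_slots` each."""
    shuffled = peers[:]
    random.shuffle(shuffled)  # avoid always favoring the same peers
    return [shuffled[i:i + unchoke_slots]
            for i in range(0, len(shuffled), unchoke_slots)]

peers = [f"peer{i}" for i in range(10)]
for round_no, batch in enumerate(relay_rounds(peers)):
    print(f"round {round_no}: send block to {batch}")
```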

You have centralization at both the low end and the high end; it's a range, and we need to find a good value in the middle. At the low end, you must use centralized websites, never trustless systems. At the high end, you have reinvented the Visa datacenter.

Another key problem is a lack of data and field experience on block size changes. Another observation is that the community likes safety rails, notably a floor and a ceiling on block size, that sort of thing. Simulations only go so far; simulations are not reality. Field experience is gained by making a small change, gathering data, observing, and repeating. Speaking of simulations, these are some of the variables I have been running: lightweight node count, pruned node count, full node count, CPU cost, cost to validate blocks, data storage costs, network resource cost, etc.
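
One way to organize those simulation variables into a single parameter set; the field names and numbers below are illustrative placeholders, not values from the talk:

```python
# Grouping the simulation variables listed above into one structure.
# All numbers are placeholders for illustration.

from dataclasses import dataclass

@dataclass
class SimulationParams:
    lightweight_node_count: int   # SPV clients
    pruned_node_count: int        # full validation, partial history
    full_node_count: int          # full validation, full history
    cpu_cost_per_block: float     # cost to validate a block
    storage_cost_per_gb: float    # data storage costs
    network_cost_per_gb: float    # bandwidth / network resource cost

params = SimulationParams(
    lightweight_node_count=100_000,
    pruned_node_count=2_000,
    full_node_count=6_000,
    cpu_cost_per_block=0.01,
    storage_cost_per_gb=0.05,
    network_cost_per_gb=0.10,
)
```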

Another thing I see is analysis errors in block size proposals. A lot of this is discounting or not seeing externalities. For example, you might say a miner is always going to maximize for fees, and that might be quite erroneous, because a miner might have a private contract with a company, for a fixed fee, to publish all of their zero-fee transactions; you can only look at your own mempool. Miners must be profitable in the short term, but you can have debt-fueled or equity-fueled miners, and miners often, quite rationally, are long bitcoin: even if they are losing money short-term, they expect bitcoin's value to increase and a profit to come long-term. There are a lot of simplifications; these are microscopic analyses that aren't really looking at the macro picture. Another analysis error is assuming that selfish mining implies a broken system. You can find individually selfish mining incentives, but collectively they break down: with today's block size, or at any point in blockchain history, once greater than 50% of hashpower mines selfishly, bitcoin's value diminishes greatly. So there are subtle incentives that keep things going, and some people are missing that in their analysis.

Some more observations on potential changes. With a static increase, you need more forks later. If there is a static increase schedule, then the increase might be too big for what the system needs, or it might be too small; it doesn't take into account the needs of the free market. Feedback systems reflect the market, but they are gamable and can be manipulated by someone who simply buys a bigger block size. Another potentially very interesting, incentive-aligned proposal, which I think originated with Meni Rosenfeld, is pay-to-future-miner: you insert a transaction that pays not to you but to some future miner, a random selection process at that point, and that nicely aligns incentives. In contrast, pay-with-difficulty scrambles the incentives: because of the opportunity cost of waiting for a longer block, or of someone finding a block before you, you have to collude or lose much more than you would gain from an increased block size. A second course-correction hard-fork is likely; all the world's coffee payments won't fit on the blockchain, so you must have layer 2 and all the other scalability solutions.
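
A conceptual sketch of pay-to-future-miner; the exact construction below (an anyone-can-spend output claimable only at a future block height) is an assumed implementation for illustration, not a specification from the talk:

```python
# Conceptual sketch only. Assumption: pay-to-future-miner is built as an
# anyone-can-spend output whose sweep is height-locked, so whichever miner
# wins a block at or after the target height can claim it for themselves,
# making the recipient a quasi-random future miner.

OP_TRUE = bytes([0x51])  # script that always evaluates true: anyone can spend

def pay_to_future_miner_output(value_satoshis: int) -> dict:
    """An output any miner can sweep; dict form is for illustration only."""
    return {"value": value_satoshis, "scriptPubKey": OP_TRUE}

def claim_height(current_height: int, blocks_ahead: int) -> int:
    """Height at which the sweep becomes valid (e.g. via nLockTime on the
    sweeping transaction), so earlier miners cannot take the output."""
    return current_height + blocks_ahead

out = pay_to_future_miner_output(50_000)
print(out, "claimable at height", claim_height(370_000, 100))
```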