Fee Market
Date: October 11, 2022
Transcript By: Bryan Bishop
Tags: Fee management, Bitcoin Core
There are two times we have had sustained fees: late 2017 and early 2021. In late 2017 we saw lots of things break because people hadn’t written software to deal with variable fees or anything. I don’t know if that was as big of a problem in 2021. I do worry that this will start to become a thing. If you have no variable fee market, and you can just throw in 1 sat/vbyte for several years then it will just work until it doesn’t. So right now developers don’t have to worry about it. It can be really complex with lightning and RBF (replace-by-fee). A lot of businesses don’t track RBFs.
It seems like, and it’s weird, it’s not that blocks are empty, and it’s not that they average 200-300 kB. It’s that they hit the limit, then you get a persistent mempool, then it goes under, so it’s always right at that line. It feels like there’s an economics thing where people are targeting it. If fees are persistent and confirmation is unreliable, then I start consolidating or use something else. If blocks are empty, then there’s more space and people fill it in. It seems unlikely that blocks just randomly happen to be full but not over that size. That seems scary long-term, because how do you fund this? Someone talked about this at Scaling Bitcoin Tokyo: he said we’re paying miners too much, let’s soft-fork and push the mining reward out 20 years. That probably won’t happen, but it’s a good idea.
Other people I’ve talked with, like yesterday, have different ideas. Some people think this is no big deal: if there are no fees in 20 or 30 years, people will pay miners one way or another. Okay, but how? I don’t remember who said that. Several people have said it’s no big deal, not just here, saying we’ll just figure it out later. To me, it feels like a big concern. On the mailing list, some people argue about tail emissions, which is kind of related. There was a paper by some Princeton people arguing that bitcoin is unstable without the block reward, which is old and well disputed at this point. Their assumption wasn’t exactly that block space was unlimited; rather, it was that the mempool is completely cleared out at every block. That’s kind of what happens right now, so it’s not that crazy an assumption; it’s just not applicable to the debate at the time. In 2017 the question was: do we increase the block size or not? If we don’t, then we assume we will have some sort of fee market. They had some interesting attacks: if the mempool is being cleared out and there are no significant new coins being issued, you’re vulnerable to someone reorging you out. So you don’t want to clean out the entire mempool, even if you could fill your entire block with it.

There was a fee smoothing proposal: instead of the fees being collected by the miner that included the transaction, they get spread out across the next 100 blocks. It’s kind of like a mining pool. Aren’t the fees embedded in the coinbase output, which you can’t spend for 100 blocks? But it’s not about the maturity, it’s about who gets the money. You could soft-fork it in. If you’re mining the next block, you’re less likely to reorg because you know you’re going to get some fees from the last blocks.

From the Princeton paper, it seems like the risk is highly variable fees, which we already have today: you can see a block with basically no fees and the next block might have 0.1 BTC in fees or some decent amount. But this is overwhelmed by new coins being issued, so the actual variance is not that high, and it will stay this way for the next 3 halvings. Most of a miner’s variance comes from whether you find a block or not. So if you calculate the variance of how much you make in a day or a month, that’s where the variance comes from.
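To make the fee-smoothing idea mentioned above concrete, here is a minimal sketch. It assumes a simple model in which each block’s fees are split evenly among the miners of the next 100 blocks; the window size, the even split, and the toy fee series are illustrative assumptions, not parameters of any specific proposal.

```python
import statistics

WINDOW = 100  # illustrative smoothing window; not from any specific proposal

def direct_payouts(fees):
    """Status quo: the miner of each block keeps that block's fees."""
    return list(fees)

def smoothed_payouts(fees):
    """Each block's fees are split evenly among the miners of the next WINDOW blocks."""
    payouts = [0.0] * (len(fees) + WINDOW)
    for i, fee in enumerate(fees):
        for j in range(i + 1, i + 1 + WINDOW):
            payouts[j] += fee / WINDOW
    return payouts[:len(fees)]

# Toy fee series in BTC: mostly quiet blocks with an occasional fee spike.
fees = [0.01] * 1000
for spike in range(0, 1000, 50):
    fees[spike] = 0.5

print("payout stdev, direct:  ", round(statistics.pstdev(direct_payouts(fees)), 4))
print("payout stdev, smoothed:", round(statistics.pstdev(smoothed_payouts(fees)), 4))
```

Under this toy model the smoothed payout stream has much lower per-block variance, which is the property the proposal relies on to make reorging for fees less attractive.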
The scenario that is interesting and scary is: if the mempool gets cleared out and there are no significant new coins, then what do you mine? If you mine a new block with nothing in it, you’re wasting your electricity. So you can either try to reorg out the block before you and grab all those transactions, or people can set up weird arrangements like “hey, pay me a little so that I won’t reorg you out.” Or out-of-band payments.
Say a miner, immediately after mining, posts a transaction with some amount of fees into the mempool so that there are fees for the next miner. This also disincentivizes miners from propagating transactions across the network: making deals with exchanges to publish their transactions only through their blocks, etc. Sending transactions out-of-band has come up a few times. If you put your transaction into the mempool, you’re offering it to everyone at the same time. If you’re paying just one miner a little bit less, you might wait 10 blocks until it gets confirmed. Where is the sweet spot? Where would it be attractive to wait 10-20 blocks for confirmation? You can put in the minimum fee rate and it gets into a block… from the exchange’s point of view, they save on fees if they do this, right? What happens if they put the same offer to other mining pools too? What happens if a miner offers a huge discount? Well, then they are making less money. There’s a balance. If businesses are willing to pay to reduce variance, this is a thing: you want stability, you want your books to make sense, so you just do that.
Right now the default tx relay fee is 1 sat/vbyte. I think you would have people directly talking to miners if the relay fee were much higher than what miners were willing to accept, which might sort of be the case now. It doesn’t seem like the minrelayfee should matter, but what if you reduced it by 10x and aggregate fees went down? Or what if you increased it and aggregate fees went up?
I’m not concerned. It’s early days for things that generate blockspace demand like lightning and fedimint. I think the willingness of those systems to pay is much higher than 1 sat/vbyte. I am surprised by the behavior over the past year… like the low mempool and just being right on the line of full blocks but not over.
The payment count is up. The transaction count is up. The average amount sent in a payment in dollars is up hard: it used to be around $1,000, now it is around $10,000 per payment on average. All the metrics are increasing, but the average vsize per transaction is down and the average vsize per payment is down, because people are batching and consolidating and moving to new output types that are more blockweight-efficient. It just takes a long time for optimizations to be widely deployed. Segwit inputs are more than 50% now. There was this huge break after blockchain.com started using native segwit: blockspace demand, and the minimum fee rate, went down overnight. It might just take quite a while to get these optimizations deployed.
Another break will be multisig moving to taproot. Multisig takes up a bunch of blockspace. If it moves to pay-to-taproot, it will be roughly the size of P2PKH payments on-chain. There will be a point where we are emptying the mempool again because of this.
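As a rough illustration of the savings being described, here is a back-of-envelope comparison of per-input sizes. The vbyte figures are my own approximations of commonly cited numbers (they depend on signature lengths and script details), not figures from the discussion.

```python
# Rough, commonly cited per-input sizes in vbytes. Exact numbers depend on
# signature lengths and script details, so treat these as approximations.
INPUT_VBYTES = {
    "P2PKH single-sig (legacy)": 148,
    "P2SH 2-of-3 multisig (legacy)": 297,
    "P2WSH 2-of-3 multisig (segwit v0)": 104.5,
    "P2TR key-path spend (multisig via key aggregation)": 57.5,
}

baseline = INPUT_VBYTES["P2SH 2-of-3 multisig (legacy)"]
for name, vbytes in INPUT_VBYTES.items():
    share = 100 * vbytes / baseline
    print(f"{name:<52} {vbytes:6.1f} vbytes  ({share:3.0f}% of legacy 2-of-3)")
```

Under these assumptions, multisig wallets that move from legacy script spends to taproot key-path spends shrink their inputs by a large factor, which is the kind of step change in demand being described.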
Will there be enough demand for blockspace? But what happens if it flips to too much demand? Maybe the willingness to open lightning channels is $5/channel. I am optimistic we will reach that. But what happens if we exceed that? How does the system get into steady state? Some actors can defer, delay, choose better times, but at some point all that gets optimized. But what happens if it goes to $20/channel? What about $30/channel? Does all the demand just disappear? Or will there be ongoing willingness? What about willingness to pay for fedimint transactions? Or other custodians?
It seems like user-experience-wise, high variability in fees is not great. Businesses say we’re willing to pay but we want some assurance or bounds on this. The reduction of volatility in the blockspace market has to do with more people having a longer time-horizon than one-confirmation. In 2017, everyone wanted to be in the next block. Now most users are willing to wait 12-24 hours and a lot of businesses are targeting 6 block confirmations instead of 2. This has smoothed out the increases when there’s just a handful of slow blocks in a row.
If you want to transact often and know how much you will pay for that, then you must be on the lightning network. You have no reason to be on the blockchain if you can’t tolerate fee variability. Mining pools need to pay their users regularly, and they shouldn’t be doing those payments on-chain.
Custodial users, large exchanges, they can even come up with proprietary ways of transacting. There’s Liquid and other solutions. For non-custodial use cases, that’s not really an option. I still think non-custodial usage will grow over time and increase demand and increase fees. It is a question of what happens when we break past what people’s willingness limits are. Does the block size increase then? If the whole world decides to jump into the lightning network, it is going to be an issue. There are other solutions that can be explored.
It’s also self-regulating to a degree. There’s not much adoption of the newer output formats yet. We’re still only at an average of 3 outputs per transaction; it went up from an average of 2 to an average of 3. Now that PSBT is out, maybe it will become more popular to build multi-user transactions, which have less overhead. You can pay someone on lightning, you can make a multi-user transaction, etc. As fees go up, people will increase their efficiency.
We could soft-fork down to a 300-kilobyte block weight. We could get rid of the witness discount once we have cross-input signature aggregation, and just count witness bytes.
When does this problem need to be addressed? Is it 10 years from now? Is the problem not enough usage or too much usage? What about congestion control mechanisms for smoothing fee rates? Something like OP_CHECKTEMPLATEVERIFY where an exchange can publish a transaction and it gets unpacked later during low congestion.
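A heavily simplified sketch of the congestion-control pattern being referenced: the exchange gets one small committing output confirmed while fees are high, and the individual payouts are only expanded on-chain later when fees are low. The hash construction below is a toy stand-in for illustration only, not the actual BIP 119 (OP_CHECKTEMPLATEVERIFY) template hash, and the function and variable names are invented for this sketch.

```python
import hashlib

def commit_to_payouts(payouts):
    """Toy commitment to an ordered list of (address, amount_in_sats) payouts.
    This is NOT the BIP 119 template hash; it only illustrates the shape of
    'publish one small committing output now, expand to many outputs later'."""
    preimage = b"".join(addr.encode() + amount.to_bytes(8, "little")
                        for addr, amount in payouts)
    return hashlib.sha256(preimage).hexdigest()

# Step 1 (fees are high): one small transaction confirms, with a single output
# committing to the whole withdrawal batch.
payouts = [(f"user_address_{i}", 50_000) for i in range(500)]
commitment = commit_to_payouts(payouts)
print(f"one committing output now, covering {len(payouts)} payouts: {commitment[:16]}...")

# Step 2 (fees are low): a later transaction reveals the actual outputs; a covenant
# opcode would enforce on-chain that they match the earlier commitment.
assert commit_to_payouts(payouts) == commitment
```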
Are we concerned about too little fees in the future? Who is more concerned about scaling bitcoin and that being the biggest problem? Over the next 10 years, the subsidy will be bigger than the fees; the subsidy is bigger than our current fees, in sats, for the next 10 years. If fees go up by then, then we’re golden and we might have another problem, about scaling. But for the next 10 years, not enough fees won’t be an issue. I don’t necessarily see that there will always be enough fees to pay for the level of mining that we have right now, with all the second-layer solutions being built. Well, do we need the level of mining that we have right now? We might not, but we might need consistent fees in order for the incentives to work.
Maybe if you hit that level, people think it’s unusable like in 2017… so the scaling problem becomes more important? On this whole point about finding another solution, I think that’s where bitcoin has to prove itself. If bitcoin has unique properties versus other technical solutions, and those properties are materially true, then there aren’t blockspace substitutes. If there are other blockchains that are perfectly suitable in the long run, then that’s that. Some people are willing to give up some security to save a few bucks. If bitcoin’s security is of negligible benefit to them, then the willingness to pay fees is low.
I think that most people being on custodians and not doing on-chain stuff is maybe masking some of the actual usage that is happening on bitcoin. I can’t prove it. But what if all the major exchanges went bust and all their users moved to self-custody? I think we would see a lot of demand materialize out of nowhere. The more people focus on bettering the user experience for self-custody, to where it’s ubiquitous and as easy as having a phone, the more I think we will see demand appear out of nowhere that we never knew existed. Right now it’s just entries in Coinbase’s MySQL database without on-chain transactions.
Unfortunately currencies are not substitutable. People have a distinct preference for certain currencies. So the use case for bitcoin is moving bitcoin, not the security that the chain provides, which yes is important but is separate in users’ minds.
What do people want to pay for on-chain? Manufacturing demand is not the same thing as organic demand. If bitcoin continues to focus on the things that differentiate bitcoin, it either solves that use case or it doesn’t. Ethereum users don’t care about ethereum; they are not sticky at all and they will move to other chains. Explaining the strength of bitcoin and why it is important is a matter of marketing, and it’s a wider thing to do out there. The real use case for bitcoin hasn’t happened yet. Everyone in Europe has an amazing payment experience and has privacy in their banks; bitcoin doesn’t really solve a problem for Europeans. In developing countries, like Nigeria, there is massive bitcoin adoption because it solves a real problem. Are we building bitcoin to solve a real problem, and are we patient enough to wait for that? It’s likely to be some scenario that nobody is excited about where bitcoin becomes really useful, and we’d be happier about that than about manufacturing demand now. A lot of the use cases driving demand on other chains don’t need decentralization. Most people don’t use censorship-resistant money now, but we’re building it for when we need it in the future.
The Tokyo proposal was to push coin minting out by 10 or 20 years. It was an interesting idea. It would create a separate pool of coins and it would allow for block size to be a little bit elastic.
The subsidy is going to get down to a level comparable to the fees we’re currently collecting… a full block at 1 sat/vbyte is 1 million sats. That’s not a lot of money, it’s like $200. For the subsidy to get down to that sort of level is 5 halvings out from now, 20 years. 10 years from now, at least, the subsidy is still about 80x what a full block currently collects. In the long term, we have to rethink how we collect mining rewards. We should keep an eye on it over the next 10 years and work towards a solution. In the short term, if another use case comes in and there’s a volatility spike in fees, people can build stuff that spreads out transactions, like lightning or maybe Liquid and other things that have been built in that regard. That’s probably going to be more relevant in the next 10 years. If it hurts enough, people will adopt other stuff. If we’re still around in 20 years, then either it’s super popular and we have this problem, or we’re probably working on something else by then.
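The arithmetic behind those figures, as a quick sanity check. The 6.25 BTC subsidy is the value at the time of this discussion; the dollar figure assumes roughly $20,000 per BTC and the 1 million vbyte full block is an approximation.

```python
SUBSIDY_SATS = 625_000_000          # 6.25 BTC per block at the time of this discussion
FULL_BLOCK_AT_1_SAT_VB = 1_000_000  # ~1M vbytes * 1 sat/vbyte ~= 1M sats (~$200 at ~$20k/BTC)

for halvings in range(0, 8):
    subsidy = SUBSIDY_SATS / 2 ** halvings
    ratio = subsidy / FULL_BLOCK_AT_1_SAT_VB
    print(f"{halvings} halvings (~{4 * halvings:2d} years out): "
          f"subsidy {subsidy:>13,.0f} sats = {ratio:5.1f}x a full 1 sat/vbyte block")
```

After roughly 3 halvings (about 10-12 years) the subsidy is still around 78x a full 1 sat/vbyte block, which matches the “80x” figure above.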
It’s easy to get stuck in short timeframes because tech moves quickly and cryptocurrency moves very quickly, but outside of this the only meaningful unit of time is a decade. When we think about tech development, we should think of one unit of time being one decade. Bitcoin has now existed for one decade, which is almost no time. We only have another decade and a half before this becomes an issue. That’s one whole unit of time and another half to do this work and figure it out. What we have learned in the last 10 years is that we don’t produce transaction demand. Over the next 10 years, we will have many more developers and more users. The amount of value pushed around in the system will be orders of magnitude more.
I’m more worried about a kicking-the-hornet’s-nest moment where we want more users, and then the users start flooding in, and we go “oh shit, this wasn’t actually ready”. Someone gave me a stat the other day: if you took 5 billion people and wanted to open a lightning channel for each of them, with 1-input, thousand-output transactions, I think it would take something like 3 years. Humans have a hard time conceptualizing exponential growth, and many of these things happen very quickly. All of a sudden some government does something stupid, and now you have 5 million people trying to get into bitcoin real fast, and lightning won’t be ready, and then we’re working under pressure. Having a trustless, permissionless layer 2 network is contingent on having transactions embedded in the base layer.
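A back-of-envelope version of that stat, assuming every channel open is batched into 1-input, 1,000-output taproot transactions and that they get all of the block space. The sizes and constants are rough assumptions of mine, not numbers from the discussion.

```python
USERS = 5_000_000_000
OUTPUTS_PER_TX = 1_000

# Rough size assumptions, in vbytes: one taproot input (~57.5), taproot
# outputs (~43 each), plus ~10.5 of transaction overhead.
TX_VBYTES = 57.5 + OUTPUTS_PER_TX * 43 + 10.5

BLOCK_VBYTES = 1_000_000  # roughly one full block of space
BLOCKS_PER_DAY = 144

txs_needed = USERS / OUTPUTS_PER_TX
total_vbytes = txs_needed * TX_VBYTES
days = total_vbytes / (BLOCK_VBYTES * BLOCKS_PER_DAY)
print(f"{txs_needed:,.0f} transactions, ~{total_vbytes / 1e9:.0f} billion vbytes, "
      f"~{days / 365:.1f} years of completely full blocks")
```

With these assumptions the answer comes out to roughly 4 years of nothing but channel opens, which is the same ballpark as the 3-year figure quoted above.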
If you look around, a lot of people are comfortable having a bank account and don’t care about centralization. It’s not a problem for them. Bitcoin becomes gold and then it’s re-hypothecated. I think there will be different user patterns and there might be bitcoin banks like Coinbase or Fedimint. Fedimint scales better than just a UTXO per user. It’s custodial, but it might be run by small groups, and it can scale so much better. We don’t need to put 5 billion people on lightning. We have to enable 5 billion people…. I’m not trying to shit on the lightning folks, but we just saw a massive problem with lightning yesterday because of a single transaction. We should consider low fees a blessing: we still have time to build solutions. If you think bitcoin solves a real problem, then you need to know that users will come and get that problem solved. Entrepreneurship and products require scale. The solution is the product. The problem has to exist. You need a great product that solves a problem, and people won’t invest in those solutions if they know that a great product that attracts millions of people will break. There’s a linkage between enabling the people that will create the demand and enabling the scale and systems on which they build.
We need to be able to share one UTXO with multiple users. There are coinpools and fedimint; coinpools are totally unworkable. There are ideas that people are thinking about, and we need to continue thinking about them. We can’t give a UTXO to every single person on earth, but I also don’t think everyone will demand one. There needs to be an osmotic pressure to go on-chain. You have to be able to withdraw; there has to be a credible threat. Maybe not everyone needs a UTXO.
I have another unpopular opinion: I don’t think rehypothecation is a problem as long as we can clearly distinguish rehypothecated bitcoin from other bitcoin. People can pick their threat model.
We should know what the theoretical upper bounds of the system are. 5 billion people trying to open a lightning channel, sure, that will break stuff. But in terms of scaling, we should be able to theorize what will happen at these different levels. Then we can reason better about where to put effort. If lightning could scale to 5 billion channels, then I wouldn’t invest in semi-federated custodial systems, because those other solutions would work fine.
Would you want your grandmother to have self-custody over bitcoin? I would want her to have the option. But let’s be realistic: you’d probably custody it yourself before you’d have your grandmother custody it. That’s scalability right there. Well, that’s like the iPhone: grandma is never going to use cellphones or smartphones, and then the iPhone comes out and yeah, grandma is using an iPhone now.
Nobody predicted that Facebook would have a billion users, but they did. We need to think in terms of billions of users. If the world has such a serious problem that it can only be solved by billions of people using bitcoin, then that seems like a really strange world to live in. What if bitcoin is useful and only a few million people are using it? Would that really be so bad? Is it possible that the other several billion people will be okay and live productive lives without having direct access to bitcoin? Why would there be a problem so pressing that billions of people could only solve it with bitcoin?
What about some sort of degradation plan for when fees are low and don’t pay for security, instead of just shrugging and saying sorry, it’s all over? There could be checkpoints, or some companies might do altruistic mining, which might be icky or maybe not. Or maybe there are other ways of securing the chain than mining. What about transitioning to something like Liquid’s dynamic federation (dynafed) concept for block signing? It’s not proof-of-stake, but it’s not proof-of-work either.
A lot of people that we think of as natural users of bitcoin want to flee from inflation of their local currency, and they would be better served with USDC or something. They want something more stable than their own currency; bitcoin might be too volatile for their appetite when fleeing their native currency. Taro sounds interesting to me as a way of getting an actual decentralized market on top of bitcoin, because people who have an asset channel will be able to … I don’t see how Taro can increase demand on bitcoin layer 1. It’s not consensus-layer bitcoin.