Why Bitcoin Needs A Measured Approach To Scaling
Speakers: Adam Back
Date: September 7, 2015
Transcript By: Bryan Bishop
Media: https://youtube.com/watch?v=wYHyR2E5Pic
BC: Adam Back is here for the second time on this show. Some of you will know Adam Back as the inventor of proof-of-work, or hashcash as it was then. He was one of the few people cited in the original Bitcoin whitepaper. More recently he has been involved in sidechains and Blockstream, where he is a founder. He has been vocal in the bitcoin block size debate and how scaling should be done. We're excited to have him on. We've had perhaps too much of the other side, so we're having him on to talk about the block size scaling issues.
AB: It’s good to be on.
BC: You mentioned that it's important to talk about what bitcoin actually is. I don't know if we've ever had that question asked on this podcast; we always sort of presumed that people knew what bitcoin is. Why don't you share what you see in bitcoin and what you would like to see?
AB: It sounds like a silly question, right, what is bitcoin? There have been quite a lot of users for a number of years now, increasingly in the mass market, and all kinds of interesting applications. There were a number of electronic cash systems before bitcoin. We also have PayPal and things like that. A lot of people may not know the history, but as far as I understand it, PayPal started as a bearer electronic cash protocol. It migrated for practical and regulatory reasons into a central web service where the web service has your balance. PayPal is grappling with complex issues like fraud and Sybil attacks jamming their system. So they ended up in a situation where they have to apply policy, and now we have users complaining about PayPal freezing their money. More often than not, those users did not do anything particularly wrong; they just tripped a semi-automated abuse policy. What's interesting about bitcoin is that there's no central party that can apply force to your ownership. If you have electronic cash, generally it means that you should be in sole control; it should be the electronic analog of paper notes in your pocket, whether U.S. dollar notes or British pound notes. There's nobody that can make a decision and suddenly the money in your pocket stops working. There are good reasons why this should be the case. If people get worried about the validity of their notes, they try to take them to the bank as fast as possible. This is the concept of fungibility: in an electronic cash system, it's very important that it is highly fungible. This dates back, and perhaps there are precedents in other countries, to a court case in Scotland in the 17th century where a merchant sent some large denomination bank notes to a business contact and was a little worried that they would be stolen in transit. He made marks on them so that he would recognize them.
Unfortunately the notes didn't arrive; he reported the loss, and the marked notes eventually turned up. He sued the bank and tried to get the notes back. The courts eventually decided that it was more important that the currency remain fungible than that this merchant be made whole. Their reasoning was interesting: the economy and confidence in money would be seriously damaged if the reverse decision were made. And you also have this general problem where money changes hands many times and the discovery of the crime or fraud is often delayed. If someone robs a convenience store at gunpoint and makes off with some high-denomination notes, and they circulate through the economy a few times, and you then try to deposit one at the bank, chances are the robbery had nothing to do with you. The crime may not be reported for a day or something, and investigation takes a while. So the general principle is that electronic cash should be fungible. The other interesting thing you get with bitcoin is a sort of permissionless innovation. Anyone, any startup, any developer or power user can pick up some open-source software and start writing applications. There's an analogy people draw with the early days of the internet. Before the internet there were large telecom companies that were sometimes state-run government monopolies. To bring a new application to the market, you would have to negotiate a contract with a very large company, and perhaps they would say no if it competed with one of their strategies. Once we had the internet and open innovation, many individuals playing with open-source software were able to get us to where we are today, with everyone enjoying the internet and the fast pace of innovation it enables. So one of the exciting things about bitcoin is that it's very open to innovation. What if Satoshi had not been anonymous and had obtained some venture capital money and started bitcoin dot com?
Chances are that bitcoin wouldn't exist today; at the first sign of political controversy, that company would have been shut down. DigiCash went through something like this. It was an electronic cash system, similar in spirit to zerocash, but based on David Chaum's blind signature work. He started a company, and at some stage they had a demo server with an internet connection to the existing banking system; the Dutch banking regulators said they couldn't just set up services like that. There were people trading these coins with the viewpoint that if people sold small things for them, they might create a floating value. They promised that on the demo server they would only release 1 million of these tokens. So there was a bit of trade, and it was an analog of how bitcoin bootstrapped; but unfortunately, DigiCash went bankrupt. People had coins, but they were useless because the central server went offline. So you can see in these three stories that decentralization is the distinguishing feature of bitcoin and why it succeeded where DigiCash didn't; PayPal went centralized and lost its ecash properties, and it has been grappling with the fallout to users ever since, with 6-month-long arbitration dispute processes. Some companies tend to act in a proprietary way and will not give you access to their APIs. There are examples of startups that started with open APIs but then closed them because third-party applications competed with their own products. If you have something that is centrally controlled, you are beholden to that central control point and those associated behaviors.
Decentralization is why bitcoin was able to bootstrap. You can build centralized services on top of bitcoin, like coinbase.com or whatever, where they have custody of your coins in some form and can apply policy. You can build those on top of a decentralized system, but you can't build a decentralized system on top of a centralized system. So what we need to do is hold on to and retain the decentralized nature of bitcoin and its permissionless properties. Now of course there are questions of tradeoffs here, like scale against security or decentralization, and that's really what the core discussion in the technical community has been about.
BC: I think you have touched on an interesting aspect here, permissionless innovation. We have talked about this so many times. There's no question that it is revolutionary and that there is so much potential there; other systems just can't compete with this. And fungibility and that sort of ecash aspect: I think it's very fitting that in the whitepaper, he emphasized that it's electronic cash as opposed to some settlement network or payment system. Probably most people in the bitcoin space would agree with you that those are fundamental, critical features of bitcoin and that they should be preserved and guarded. At the same time, with the block size debate, there has been a lot of disagreement and heated controversy. There are people being quite rude to each other at times. Even if one agrees with some of these large principles, and I do think there's a lot of agreement there, it doesn't help with the actual decision making. We've been having this conversation more and more. How do you think decisions should be made? You mentioned some principles. Should everyone hold those principles? What if people disagree with them? Decisions about where bitcoin is going to go: who should make those decisions?
AB: Right. So I'll step back briefly. If you are talking about the properties of bitcoin, there's relatively wide agreement, and when you talk about scalability, clearly everyone wants to scale bitcoin, because in the security field there's a concept of delivered security: the security you deliver is the amount of security multiplied by the number of users that actually use the system. This introduces considerations like usability and scale into security; it's not interesting if only a few hundred users can use the system. There are millions of users of bitcoin, but it could be criticized that it can't scale that well right now; with the current parameters and security-scale tradeoff you have maybe 7 transactions per second or thereabouts. While that's quite a lot of transactions on a yearly basis, blocks aren't actually full now, they're roughly one-third to 45% full, and Rusty Russell did some statistical analysis showing that about 45% of the transactions in those blocks are less than $1 in value. You could make a good argument that many of those payments, like ChangeTip tips, are going through moderately centralized services anyway; if policy crept in and someone seized a couple of dollars on ChangeTip, that wouldn't be as concerning because users could switch to another wallet. Whereas if bitcoin itself were centralized to that degree, it would present a huge problem. So if you look at scaling as a tradeoff against security or decentralization, the sensible thing to do is to pick a pragmatic starting point. We've had a number of proposals, like jgarzik's BIP100 which was published first, then Gavin's BIP101, and then someone did BIP102 with 2 megabyte blocks. BIP100 has miner voting, intended as a balance so that the block size wouldn't automatically grow; miners would have to collude to grow it. More recently there is BIP103 from sipa.
The other quite interesting one is the flexcap proposals, which allow bursting of the block size in reaction to sudden demand, with miners paying for bigger blocks. If the blocks are too small to meet demand, that means there are excess transaction fees left on the table. If miners can see excess transaction fees they could collect, they can increase their profitability by increasing the block size and taking those fees.
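As a rough sketch of the incentive Adam describes, here is a toy model of a miner deciding whether collecting excess fees justifies a bigger block. All numbers are made up, and the real flexcap proposals price extra block space differently (for example in forfeited subsidy or added difficulty); this only illustrates the fees-left-on-the-table logic.

```python
# Toy flexcap incentive model (hypothetical numbers and cost model).

def extra_size_worth_mining(mempool_txs, base_limit, cost_per_byte):
    """Return how many extra bytes a rational miner would add.

    mempool_txs:   list of (fee_satoshi, size_bytes), sorted by fee
                   rate, highest first.
    base_limit:    bytes that fit in a normal block for free.
    cost_per_byte: satoshis the miner must 'pay' per extra byte.
    """
    used = 0
    extra = 0
    for fee, size in mempool_txs:
        if used + size <= base_limit:
            used += size            # fits in the normal block for free
        elif fee > size * cost_per_byte:
            extra += size           # excess fee exceeds the flexcap cost
    return extra

txs = [(50_000, 500), (40_000, 500), (10_000, 500)]  # (fee, size)
print(extra_size_worth_mining(txs, 1_000, 10))  # → 500
```

With a 1000-byte base limit, the first two transactions fit for free; the third pays 10,000 satoshis against a 5,000-satoshi flexcap cost, so the miner profitably bursts the block by 500 bytes.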
BC: Let's wait a sec before going into that. How are decisions going to be made? When you talk about BIPs, those are basically proposals, outlines of changes to be made, and then they might be integrated into Bitcoin Core. One of the interesting things about bitcoin is that Bitcoin Core has a lot of weight in the ecosystem, because that's what most people use, but in the end it's about what software people run. Do you think the right way is to discuss which BIP is right and then integrate that into Bitcoin Core? Or maybe the better way is for people to go different ways and then try to convince the miners and wallet providers and exchanges to switch? How do you think about that?
AB: Right. So, not everyone is aware of this because it's a small technical community, but there is a relatively formalized BIP review process. It is described in BIP 1, which is sort of a charter and framework for how BIPs are set up. There is a process for how BIPs go through outlines and reviews: you discuss the idea on the bitcoin development mailing list, and if it's not obviously broken or defective then a BIP number is assigned. Then you specify it in more detail, you draw up an implementation, there's testing, then there's planning of how to deploy it. All of the changes so far have been soft-forks, with the exception of an accidental hard-fork that was fixed in a rush in March 2013. With soft-forks, you have miners bringing the change in and triggering it, so soft-forks are backwards-compatible; changing the scaling in this way is a hard-fork, and it would be the first planned hard-fork. The average time for a soft-fork until it's triggered is something like 6 months. BIP 66 was the most recent one. Upgrading the bitcoin network is potentially a high-risk thing, and in fact when BIP 66 finally triggered it nearly created a network fork; in fact it did, but people manually fixed it. Everybody agreed with it, it was just fixing bugs and adding features; even that can go wrong, and it did. The other thing to say is that there's some confusion about the difference between a software fork and a network protocol fork. In open-source software it's understood that people can fork software; that's why there are open-source licenses. The idea there is that if some group of users decides that the bitcoin user interface should focus on a different feature set, they can go right ahead and do that, and whatever becomes popular will win in the market.
It's not a binary decision either; both versions can exist and users can choose between them based on their preferences. With the bitcoin blockchain, a software fork is not particularly concerning; having different implementations is not so concerning.
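The miner-triggered soft-fork rollout Adam mentions can be sketched numerically. This is a simplified model assuming the version-supermajority thresholds used by BIP 34 and BIP 66: new rules start applying once 750 of the last 1000 blocks signal the new version, and old-version blocks are rejected once 950 do.

```python
# Simplified BIP 34/66-style soft-fork activation: miners signal by
# block version; thresholds are over the most recent 1000 blocks.

def softfork_state(last_1000_versions, new_version=3):
    signalling = sum(1 for v in last_1000_versions if v >= new_version)
    if signalling >= 950:
        return "enforced"    # old-version blocks are now rejected
    if signalling >= 750:
        return "activated"   # new rules apply to new-version blocks
    return "pending"

print(softfork_state([3] * 760 + [2] * 240))  # → activated
```

This gradual, miner-signalled rollout is what makes soft-forks backwards-compatible, and it is also why they take months to trigger: the thresholds are deliberately high so that non-upgraded nodes are not forked off prematurely.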
BC: A network fork is a new altcoin?
AB: That probably wouldn't be a very popular way to express it… it's certainly the case that bitcoin's consensus model requires the vast majority of the nodes on the network to run a bit-wise compatible protocol.
BC: So you mentioned the BIP process and there's some formalization around that; but with regards to more controversial changes like the block size, there are multiple BIPs trying to address it, coming from different ideological backgrounds or different visions of what bitcoin is or should be. In this case it doesn't seem like a clear process for how to implement a BIP would help establish consensus on what we should do. We have to come back to the higher-level discussion about what bitcoin is or what it should be; how can we come to consensus on these sorts of decisions?
AB: Because of the nature of online discussion forums, twitter being a very concise communication medium and people on reddit sometimes getting into heated debates, there tends to be black-and-white thinking that some bitcoin developers want 1 MB blocks only while others want to increase the size. But this is a mischaracterization. gmaxwell has proposed flexcap; sipa has proposed an increase; jgarzik has proposed another one, and I have proposed a few myself. Everyone wants to scale bitcoin. It's really complicated. There's no silver bullet. The technical community is trying to figure out the pragmatic, safe, secure way to do this, and evaluating the BIP proposals is a decision that frequently can happen within the BIP process. That has happened many times in the last 4 years as security defects were fixed and features added that way. There's no reason to assume that wouldn't continue to be the case. Another concept that has been put forward is that maybe things would move faster if there were a single final decision maker. I think Mike Hearn was the first person to propose this. The danger with bitcoin is that there's a lot of other people's money at stake; it's a $4B economy. As we move into the future, if we see bitcoin gaining adoption in some segment like international settlement or gold, then turn around and think for a few minutes: would you want to be the final arbiter? You would be subjected to intense international pressure, blackmail, perhaps you would be physically unsafe; even governments and central banks employing economists struggle to avoid moral hazards, and the pressures on them are already immense. Bitcoin is trying to avoid those security risks.
Another kind of risk is that some individual might have a hidden agenda, or a foreign government might have a conflicting interest. A peer review process, where everyone has to review changes, is the best way we know to avoid those influences creeping in.
BC: So what's different from other open-source projects is that there's so much at stake, and directly at stake, so the benevolent dictator model may be difficult to implement for bitcoin. But the BIP process takes into account only technical considerations and only the technical community. There's also voting with mining, which includes other industry actors' opinions, but only the miners; not payment processors or companies providing services to bitcoin. What about a working group like in the Internet Engineering Task Force, or like with HTML or whatever, where there are working groups with different industry actors weighing in and a review process that takes those views into consideration?
AB: I think that's what the BIP process does. There's a technical workshop in Montreal on 12-13 September which is basically a physical meeting, a physical continuation of the BIP review process. There are certainly many types of people going; there are something like 14 or 15 commercial and academic sponsors, ranging from bitcoin technical enthusiasts to companies sponsoring and sending technical people, people from academia, people from the mining community, etc. While the core developers have to implement things, they would also be the first to tell you that they are not going to make decisions on behalf of users. They are there to hear and balance the requirements from different constituents. The main thing to keep in mind is that it's a tradeoff between scale, security and decentralization, and there are multiple parties involved: users, bitcoin ecosystem companies, and miners. If you favor one particular constituency, you will disenfranchise others; miner profits might fall, for example.
BC: Adam, so, of course this all sounds good and reasonable. But what then happens? We’ve been talking about a variety of different BIPs. What happens when people just can’t agree? When there are positions that cannot be reconciled? Is there just a stalemate, or is there some process after that which will resolve this?
AB: The task at hand is complicated. There is also value in not acting in a hasty or rash fashion, because there's nothing worse than rushing a change and breaking bitcoin by accidentally introducing a bug or something. I do think there are plenty of prior examples of complicated technical discussions that reached a consensus and got implemented.
BC: What if no consensus is reached?
AB: I don’t think we’re there yet.
BC: But let's just say. To stay on the governance thing, it doesn't matter whether we reach consensus on this one; there will be lots of decisions about the future of bitcoin with no consensus. Maybe with the block size you could say it's an engineering question; personally I agree with you that everyone wants to scale bitcoin. I don't think that Blockstream wants to keep blocks small for its own revenue reasons. I think everyone wants to keep it decentralized. But let's say in the future there are people who want bitcoin to do different things. Consensus will not be enough; we won't always be able to reach it.
AB: I think there are some short-term and long-term trends that help with this. To be clear, sidechains arose significantly before Blockstream. The reason I became interested in sidechains was that when I caught up with bitcoin, I was trying to find ways to improve it. I had a background in electronic cash, so I thought I had a bunch of good ideas. When I talked with bitcoin developers, it became apparent that my changes were fairly complicated, somewhat high-risk changes; they were interested in the features, but they would be difficult to implement. The core protocol is about providing a secure and usable base that people can build things on top of. It's like TCP/IP: all of the internet stuff is built on TCP/IP, but the basic protocol, once the basic R&D was finished, has been static for decades. Once I saw this problem, as many others have as well… bitcoin does have a lot of code being written and new features do get in, but there's a bottleneck around security assurance, careful validation and focusing on the most important things first. It doesn't progress as fast as the general internet stuff people are accustomed to. So I started to focus on adding a layer to do this faster; that's what sidechains are: a way for people to work on different features independently without directly modifying the blockchain. If some generalized extension mechanism that lets people implement novel features, or even features that are not mutually compatible, whether extension blocks or sidechains or something else, had been implemented before we got close to a scaling issue with the block size, the answer would have been different: we could have used the extension mechanism to make a high-scale opt-in extension. The further future is clearer: it's more reasonable to have additional layers with extensibility and additional ways to get scalability, such as the Lightning Network proposal, which preserves bitcoin's properties.
The way to bridge from short-term to long-term is to focus on short-term solutions that create time and space in scalability terms, so that we don't hit scalability problems, which gives the technical community time to develop and validate features like the Lightning Network and be sure that they scale.
BC: I think the sidechain vision has been very compelling. You can have this integration of different chains and move BTC there so that you have the same network effects and the same value. I do agree that once sidechains are implemented and functioning, that partially solves the issue of reaching consensus on adding features, because you can do it in the sidechain and you don't need global agreement. But when you talk about the core bitcoin protocol, the implication of what you are saying is that it should basically remain the way it is, or change as little as possible. Then we still have the issue: what about those changes? How are those decisions made?
AB: I wouldn't say as little as possible, just as much as is safe. And there are a lot of changes that have gone into bitcoin in the last year, like libsecp256k1, which Pieter Wuille and gmaxwell worked on, which increases digital signature validation speed by a factor of 6 to 8. When you're talking about scaling bitcoin, if it weren't for that work, increasing the block size wouldn't matter, because the CPU would be the bottleneck. There is progress. There are new features: there's checklocktimeverify, version bits, relative checklocktimeverify which helps make Lightning Network more efficient; and looking backwards in time there are things like P2SH. Bitcoin does have progress; there is work happening. But it's interesting to try to structure bitcoin in the longer term such that there is a base with core features, and then other layers that allow people to do things more quickly without endangering the lower levels. Applications built using bitcoin's P2SH features, like trustless escrow or trustless custody or trustless exchange, are built on top. We want the core to be flexible enough for people to build things directly on top of it and to extend bitcoin. At the moment it's a one-size-fits-all solution. You have to trade off the interests of different people: miners want to maximize fees, while users and merchants would prefer low fees. They can't both have what they want, so it's a general compromise. So you have people who want to scale bitcoin for micropayments and are maybe willing to trade off decentralization to get there, and then you have people who value the permissionless features you get from decentralization. You can't swing to one extreme, so you have to walk the middle.
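To put the validation speedup in perspective, here is a back-of-the-envelope calculation. Every number except the 6-8x range Adam cites is hypothetical, chosen only to show how a per-signature speedup moves the CPU bottleneck when considering larger blocks.

```python
# Back-of-the-envelope: effect of a 6-8x signature-verification
# speedup on block validation time. All inputs are illustrative.

block_sigs = 5000       # hypothetical signatures in a block
old_verify_ms = 2.0     # hypothetical per-signature verify time before
speedup = 7             # midpoint of the 6-8x range Adam mentions

old_total = block_sigs * old_verify_ms / 1000   # seconds per block
new_total = old_total / speedup

print(f"before: {old_total:.1f}s  after: {new_total:.2f}s per block")
```

Under these made-up inputs, a block that cost ten seconds of pure CPU to verify drops to under a second and a half, which is why Adam says that without this work a block size increase would simply hit a CPU wall instead of a bandwidth one.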
BC: I'd like to talk about decentralization. One interesting thing is that unlike centralization, you have to keep fighting to keep it. We see the bitcoin community continually working to keep bitcoin decentralized, whereas a centralized system stays centralized for as long as you want. We recently had Stefan Thomas on our show, who argued that decentralization was not important in itself; rather, censorship resistance was what mattered.
AB: Yes, that's true, but decentralization is the only way we know how to achieve that right now. There are ecash protocols like David Chaum's; he was the first person to formalize this. That design has much more robust cryptographic fungibility than bitcoin, but it's centralized. DigiCash is long bankrupt and people who had DigiCash coins lost them, while bitcoin is here and running today; DigiCash wasn't able to reach this scale. The other story I described was PayPal, which started from a sort of bearer electronic cash concept and ended up as a centralized service; it succeeded, but has many common grievances from users about account freezing. Centralized things are inherently vulnerable to policy choices by central parties or to permission-seeking; they can cut you off whenever they want. We see that in the wider internet ecosystem. Decentralization is the key differentiator for bitcoin. It's easy to build centralized systems on decentralized ones, but not the other way around. Consider the hosted bitcoin wallets: many of them provide in-service netting. If we're both users of circle.com and I pay you, that transaction doesn't need to hit the blockchain, and the same for two users of coinbase.com. You can go beyond this and say that these services could provide netting between coinbase.com and circle.com: if I'm using Coinbase and you're using Circle, they could send private messages to each other, and once everything is netted out in one direction or the other, they could send one transaction per day between them. This would allow them to scale very well and would handle a large portion of bitcoin users. Lightning Network does something like that while preserving the decentralization properties of bitcoin; it makes that sort of transaction scheme more generalized, so you don't have to trust the hosted wallet services.
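The inter-service netting Adam sketches can be made concrete. This is a toy illustration only: the service names come from the conversation, the amounts (in satoshis) are invented, and real settlement would of course involve signing and broadcasting an actual transaction.

```python
# Toy model of daily netting between two hosted wallet services:
# tally cross-service payments privately, then settle the net
# difference with a single on-chain transaction.

def daily_settlement(payments):
    """payments: list of (sender_service, amount_satoshi) for the day's
    cross-service payments. Returns (payer, net_satoshi), or None if
    the flows cancel out exactly."""
    a_to_b = sum(amt for svc, amt in payments if svc == "coinbase")
    b_to_a = sum(amt for svc, amt in payments if svc == "circle")
    net = a_to_b - b_to_a
    if net == 0:
        return None                    # nothing needs to hit the chain
    payer = "coinbase" if net > 0 else "circle"
    return payer, abs(net)             # one transaction settles it all

day = [("coinbase", 150_000_000), ("circle", 40_000_000),
       ("coinbase", 10_000_000)]
print(daily_settlement(day))  # → ('coinbase', 120000000)
```

Any number of individual payments between the two user bases collapses to at most one on-chain transaction per day, which is the scaling win; the cost, as Adam notes, is that users must trust the hosted services, which is what Lightning-style channels remove.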
BC: This looks like how banks settle their balances with each other. On decentralization there is one other question I would like to ask. We look at different things to see whether bitcoin is decentralized, like the number of full nodes or the number of full wallet implementations. What are the most important metrics to properly assess how decentralized bitcoin is or isn't?
AB: Yeah, it's really an interesting question, because unlike conventional security measures, where you can fairly accurately measure the effort to hypothetically bruteforce something, which we understand quite well at this point, decentralization metrics have a number of factors and don't translate into a single number. There are a number of indicators that bitcoin decentralization is still functioning, but it may be at an ebb right now. Look at the main ones, like the number of mining pools or the number of vertically-integrated miners. There are a number of quite high hashrate pools and vertically-integrated miners, such that it would only take 3 to 6 of them to fairly effectively implement a policy decision. Part of the way that bitcoin achieves policy neutrality is that different people process transactions. Even if 75% of the hashrate wanted to freeze or block a bitcoin payment, say someone wanted to stop a coin being spent, the remaining 25% would eventually process the transaction, so it would just be delayed and not blocked. There's a degree of decentralization there, though it's much more centralized than people would have hoped if you had asked them a couple of years ago. Another metric is the number of ASIC manufacturers selling direct to the public or to small businesses that would spend $10k-$100k on mining equipment. The number of independent ASIC manufacturers is, I think, decreasing. There are still a few that will sell to the public, but most have turned their attention to vertical integration, and a few have gone bankrupt due to poor market timing. Another interesting form of decentralization, one that people may not be aware of and which is actually behind some differences of opinion at the protocol level, is this concept of running a full-node, or an auditing node.
That is, the percentage of economic interest in the network that validates the transactions it receives with its own full nodes; I think we're seeing evidence that this is falling as well. This is necessary for the bitcoin security picture: some of the economic interest in the ecosystem must have a full-node that it trusts and uses. If it falls too low, there is no longer a security assurance for users, because miners are providing a service to users, typically to SPV users, and if there are no auditors then there are no checks. Miners balance against other miners; that is to say, if we have a high ratio of economically independent auditors, we can tolerate a more centralized set of large miners, because miners fighting each other for block rewards partly keep each other honest through the policy of not building on top of bad blocks, which is part of the consensus rules; and what makes that a rule that sticks is that full-node auditors are looking for it. In some of the more aggressive block size proposals, there's an emphasis on moving away from full-node auditors, the idea that over time they should increasingly be located in high-bandwidth datacenters. Of course you don't want to constrain the network to GSM modems, and something really low-powered like a raspberry pi is too constraining a target, but it is a balance, and we do want to make it easy and not too onerous to run full-node auditors. The majority of medium-sized companies and most power users should run full-nodes to ensure the security of the system. This is the whole point of the system: they run it for their own security, and the fact that they do this is something that holds the system together. They reject payments that are invalid, and this information flows back to SPV users.
If you boil it down, going from the requirements, what bitcoin is and why decentralization and permissionless innovation matter, you can translate that into the mechanisms that make bitcoin secure, and the full-node auditors are central. It's not just running a full-node; you have to actually use it for your transactions. It's the amount of economic interest that is relying on full-nodes and has direct trust and control of those full-nodes. This is what holds the system to a higher standard. The things that degrade this are block sizes getting full, memory bottlenecks, CPU validation bottlenecks, and companies outsourcing the running of full-nodes to third parties, or running their entire bitcoin business by API through a third party. There are some tradeoffs here; the software is somewhat technical to run, and some startups may not have the expertise to run any software at all. But I do think we need a high proportion of full-nodes.
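Adam's earlier point, that a censoring hashrate majority only delays a transaction rather than blocking it as long as some miners will still include it, can be quantified with a simple expected-wait model, assuming the usual 10-minute average block interval.

```python
# Expected confirmation delay when only a fraction of the hashrate is
# willing to include a transaction (Adam's 75%/25% example). Blocks
# arrive every ~10 minutes on average, and only non-censoring blocks
# can carry the tx, so the expected wait scales as 1/honest_fraction.

def expected_wait_minutes(honest_fraction, block_interval_min=10):
    return block_interval_min / honest_fraction

print(expected_wait_minutes(1.00))  # → 10.0 (no censorship)
print(expected_wait_minutes(0.25))  # → 40.0 (75% censoring: delayed, not blocked)
```

Even with three-quarters of the hashrate censoring, the expected wait only rises from one block interval to four, which is exactly the "delayed, not blocked" property, and it holds only so long as some independent miners remain and full-node auditors enforce the rules.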
BC: Is there any way of getting it back? It seems like the vast majority of users today, maybe not business users, just aren't running full nodes. I've tried it; I used to run a wallet with a full node, but I use a laptop and it just doesn't work. Can we get this back?
AB: There are some inherent constraints here, like the speed of light, latency, and network capacity. But there are also software and protocol constraints, and people have been working hard on improving these. One is an optimization of the CPU signature-validation bottleneck, which isn't in bitcoin yet but which I think will be in the next major release; it will make it much less intensive to catch up. Another optimization is headers-first; syncing is now an order of magnitude faster than it used to be, just in terms of the p2p protocol transfer. The first version was quite slow, and people would use bittorrent to catch up. Also, you don't need to run a full node all the time. If you get involved in a larger, higher-value transaction, you could just turn your desktop or laptop on and tell it to sync, and it will take maybe an hour or something, but you will have high assurance that the transaction really cleared. I think it's important that businesses do this. Ultimately, if businesses have custody of client funds, or they are selling things and they have the expertise, then it's certainly a useful thing to do. There's one form of security standardization in the bitcoin ecosystem, a cryptocurrency security standard with some acronym, which has been proposed. I think there are tiers of security; to go beyond the simplest security minimum bar to the middle one, it's strongly recommended to run a full node. The protocols are also getting more efficient. The other form of decentralization pressure comes from the block transfer time. There are protocols now to synchronize blocks much faster, such as the relay network protocol that BlueMatt implemented; that's kind of an add-on, and I think at this point the majority of the hashrate is using it.
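The headers-first idea mentioned above can be sketched in miniature: headers are only about 80 bytes each, so a node can verify the whole chain structure before fetching any multi-megabyte block bodies. This is a toy model with an invented header format (a 32-byte link plus an arbitrary payload), not Bitcoin's real serialization; only the double-SHA256 linkage pattern matches the real protocol.

```python
import hashlib

def dhash(b: bytes) -> bytes:
    # Bitcoin-style double SHA-256
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def make_header(prev_hash: bytes, payload: bytes) -> bytes:
    # toy header: 32-byte link to the previous header, then arbitrary payload
    return prev_hash + payload

def check_header_chain(headers):
    # Verify each header commits to the hash of its predecessor, so the
    # chain's structure can be checked up front, before any large block
    # bodies are downloaded (real headers are ~80 bytes vs ~1 MB blocks).
    for prev, cur in zip(headers, headers[1:]):
        if cur[:32] != dhash(prev):
            return False
    return True

# build a toy three-header chain
genesis = make_header(b"\x00" * 32, b"genesis")
h1 = make_header(dhash(genesis), b"block 1")
h2 = make_header(dhash(h1), b"block 2")
```

Once the header chain checks out, the node can fetch the corresponding block bodies in parallel from multiple peers, which is where most of the sync speedup comes from.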
What happens there is that all transactions are already broadcast; the native bitcoin block transfer mechanism then sends all of the transactions over the network a second time, which doubles the bandwidth utilization, and since you've already received and validated them, receiving them again is wasteful. The relay protocol makes use of the fact that you already have most of the transactions, and typically the block relay then fits in a single TCP packet, so the relay latency is massively lower. This is not quite standardized yet, and it hasn't been integrated into Bitcoin Core yet, but I think that is something people have expressed interest in doing. We could increase scale or improve some other features with that sort of addition, because block latency is what affects centralization. There's a ratio between the block relay time, how long it takes to transfer a block across the p2p network, which seems to be 10 to 15 seconds (with the relay network it's half a second), and the block interval; you want that ratio to be low. If the block transfer time and the block validation time, in terms of validating signatures, become a big proportion of the block interval, your mining is basically dead. It favors larger miners, because if they produce a block they already know it's valid, so they can start building on top of it immediately; so you tend to get large miners building off of each other's blocks, while people building off of other blocks, or with slower network connections, get disadvantaged by the network protocol. Gavin also worked on IBLT, which is another method to do this, and a few other people are working on it as well. There are a number of things happening in the protocol to improve the experience of running full nodes: making it much more lightweight to run a full node, and making block relaying more efficient and more intelligent on the centralization side. So things are progressing, I think, generally.
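The ratio Adam describes, relay time over block interval, can be turned into a rough orphan-risk number. A minimal sketch, assuming blocks arrive as a Poisson process with a 600-second mean interval: the chance a competing block is found while yours is still propagating is roughly 1 - exp(-t/T). The 15 s and 0.5 s figures are the ones quoted in the interview.

```python
import math

def stale_risk(relay_seconds: float, block_interval_seconds: float = 600.0) -> float:
    # Probability that another block is found somewhere on the network
    # during the relay window, i.e. that this block risks being orphaned.
    # Assumes Poisson block arrivals with the given mean interval.
    return 1.0 - math.exp(-relay_seconds / block_interval_seconds)

# plain p2p relay (~15 s) vs. the relay network (~0.5 s)
p2p_risk = stale_risk(15.0)   # about 2.5% of blocks at risk
fast_risk = stale_risk(0.5)   # well under 0.1%
```

The gap between those two numbers is the centralization pressure: a miner paying the 15-second penalty loses roughly 2.5% of revenue relative to one who doesn't, which is a large margin in a low-margin business.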
BC: There has been a lot of heated discussion. If you have seen the discussions on reddit, we've mostly seen one side; Gavin and Mike have done a lot of work to articulate their point of view. We have seen less of that, maybe it has happened on the mailing list but not in the wider community, from the people working at Blockstream and some of the other core developers. What would you like people to take away from this, people who are not deeply in the technical process of figuring out how to improve bitcoin?
AB: I think what you said is true; there has been some attempt to popularize a given implementation amongst companies. Hard-forks are risky. Soft-forks are already risky, see for example the July fork, and hard-forks are even more risky. The best hope of minimizing risk is for everyone to be fully on board with making the upgrade. With a hard-fork, it's not backwards-compatible. You need all of the full nodes on the network to upgrade. Everyone. The upgrade lifecycle, for soft-forks at least, is typically 6 months. I think we'll actually get to a safer and faster scalability upgrade if we take a month or so to get people to agree on the best way to do it. Once everyone is in agreement, you can do an upgrade much faster. I think there's also some misconception here: a soft-fork upgrade is triggered by miners, but a hard-fork is really triggered by an economic supermajority upgrading their software. It's good for miners to indicate that they have upgraded, but that's not sufficient. If miners upgraded but users didn't, then the hashrate running on the users' network would fall and they would be vulnerable to attack. It's really the case that you want everything to be upgraded at the same time. I have talked about simple proposals, something short-term that happens over a 4-year time frame instead of a 20-year time frame: 2 MB immediately, 4 MB after 2 years, then re-evaluate once we see how some of these new scaling features coming online work out. We're seeing how lightning network works out, and how decentralization progresses; there are some ways to improve mining decentralization with protocols like a getblocktemplate-like extension, which gives you variance reduction, which is one of the reasons people pool into mining pools. You can do that work yourself and still have your pool help with variance reduction. Those are the kinds of things we're working on, and I think they will be online within 1 to 4 years.
Within 3 years we should start evaluating where things are; it's going to look very different. 4 years is basically an eternity for bitcoin. I think that's plenty of time. An 8 MB cap will have been well-validated by that time; it's difficult to forecast the future. Like weather forecasting, predicting more than a short period out gets increasingly uncertain: either you will overshoot, and bitcoin will be insecure because miners will be unprofitable and won't choose to secure the network, or you will undershoot and get congestion. It's much more predictable, and much more likely that everyone will get on board with the same proposal, and it would be good to have a little focus on seeing how the flexcap proposal works out; otherwise we're risking people jumping on the first proposal that was out there just to change the parameters. Jumping on the first proposal dumped on the public community is a bad idea. The flexcap proposal could balance security against scalability while providing bursting capability. It also protects against denial-of-service risks: miners could scale up when it's more profitable for them to do so, which is to say burst, while still keeping denial-of-service expensive. Otherwise miners could potentially engage in different sorts of network optimization: blocks could be constrained to push up fees, or miners could ping between themselves and push the block size up to the maximum to squeeze out smaller miners, or something like that. I think jgarzik's proposal and flexcap are interesting, but we need more testing and more analysis. The parameters for flexcap need to be selected and we need informed debate. I think the scalingbitcoin.org workshop is going to be interesting there.
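The flexcap idea, as described here, is that miners can burst above a baseline size but must pay for it, so bursting is possible when demand makes it profitable while sustained bloat and denial-of-service stay expensive. The following is a toy cost curve of my own construction to illustrate that shape; the superlinear penalty, the parameters, and the hard burst limit are all illustrative assumptions, not the parameters of any actual flexcap proposal.

```python
def flexcap_difficulty_multiplier(block_size_mb: float, base_cap_mb: float,
                                  max_burst: float = 2.0,
                                  penalty: float = 2.0) -> float:
    # Toy flexcap cost curve (illustrative only): blocks up to the base
    # cap cost nothing extra; above it, the miner must meet a
    # superlinearly higher difficulty, so occasional profitable bursts
    # are possible but sustained oversize blocks are expensive.
    if block_size_mb <= base_cap_mb:
        return 1.0
    overshoot = (block_size_mb - base_cap_mb) / base_cap_mb
    if overshoot > max_burst - 1.0:
        raise ValueError("block exceeds the hard burst limit")
    return 1.0 + penalty * overshoot ** 2
```

A miner would only burst when the extra fees in the oversized block outweigh the expected revenue lost to the higher difficulty, which is the balancing mechanism Adam alludes to.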
We are going to be talking about selection criteria, and seeing network measurements of latency and bandwidth. It's more like the way NIST went about selecting the AES standard or the SHA-3 standard, where there were a number of proposals and a series of workshops. It's analogous in the sense that in any encryption or hashing standard there's a tradeoff between security and speed. People put a lot of effort into benchmarking and measuring speed; they make hardware implementations and simulators and try to see which is most efficient, and they try to analyze the security properties; and the resulting standards are much more able to stand the test of time. It's extremely expensive to have a failure in the field. Bitcoin is far more complicated than SHA-3 or DSA. There's all this pressure to jump on the first proposal, or to curtail review of potentially better proposals; it's unfortunate, but I think that if we do this sort of validation of the proposals, we can still get to scalability, and get the network scaling faster, because the trigger mechanism is getting everyone to upgrade, not so much miner hashrate. If we get everybody, all the companies, all the miners, and all the users, on the same page on a sensible compromise that doesn't have any obvious attacks, then we can see upgrades in a much shorter, compressed period of time.