Gavin Andresen recently added some new commits to Bitcoin Core on GitHub, continuing his work on the original Bitcoin protocol. This work may raise some eyebrows, considering Core is a direct competitor with Bitcoin Classic on scalability — and Andresen happens to be a lead developer of Classic.
Also read: The Need to End Bitcoin Discrimination
Gavin Andresen Returning to Bitcoin Core?
Gavin Andresen has been outspoken in his support for increasing the block size as a way to make Bitcoin more scalable. He sparked a controversy when he first proposed a hard fork as a way to increase the block size limit in 2014, then again in 2016 when he proposed the very same fork. Bitcoin Classic is a result of this proposal: a client whose hard fork of the Bitcoin blockchain would increase the block size limit to 2MB.
Bitcoin Core is the term that has been adopted for the current Bitcoin client, which enforces the 1MB maximum block size limit originally set by Satoshi Nakamoto. Andresen has been critical of Bitcoin Core in the past, and believes the only way forward is to somehow increase the block size limit. In the long term, as more people begin to use bitcoin — accepting and making payments through the Bitcoin network — more data will pass through it. The number of transactions will increase, and it is Andresen’s belief that this growth will require the Bitcoin network’s transactions-per-second (TPS) rate to go up to handle the larger data load.
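The relationship between block size and throughput can be sketched with some back-of-the-envelope arithmetic. The 1MB limit and the roughly ten-minute block interval are protocol facts; the average transaction size used here (~250 bytes) is an illustrative assumption, as real transactions vary widely:

```python
# Rough upper bound on Bitcoin transaction throughput.
# Block size limit and block interval are protocol parameters;
# the average transaction size is an assumption for illustration.

MAX_BLOCK_SIZE_BYTES = 1_000_000   # 1MB block size limit
BLOCK_INTERVAL_SECONDS = 600       # blocks arrive roughly every 10 minutes
AVG_TX_SIZE_BYTES = 250            # assumed average transaction size

def max_tps(block_size_bytes: int) -> float:
    """Estimate the maximum transactions per second for a given block size."""
    txs_per_block = block_size_bytes / AVG_TX_SIZE_BYTES
    return txs_per_block / BLOCK_INTERVAL_SECONDS

print(round(max_tps(MAX_BLOCK_SIZE_BYTES), 1))      # ~6.7 TPS at 1MB
print(round(max_tps(2 * MAX_BLOCK_SIZE_BYTES), 1))  # ~13.3 TPS at 2MB
```

Under these assumptions, doubling the block size to 2MB — as Bitcoin Classic proposed — roughly doubles the throughput ceiling, which is the core of the scaling argument.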
Andresen has said the only way for Bitcoin to handle larger amounts of transactions is to increase its block size limit:
I think the maximum block size must be increased for the same reason the limit of 21 million coins must NEVER be increased: because people were told that the system would scale up to handle lots of transactions, just as they were told that there will only ever be 21 million bitcoins.
Some may say Andresen has taken his advocacy to extremes, even going so far as to say that he would accept unlimited block sizes if they were implemented. This, of course, has made him a target of scrutiny among many Bitcoiners, as the block size has been one of the most heated topics of discussion across the Bitcoin community.
Those opposed to increasing the block size limit say that a variety of negative consequences will follow: larger blocks increase mining costs, and growing blockchain storage requirements make it more costly for individuals to run full nodes. The ultimate effect, they say, is to damage the decentralized nature of the Bitcoin network — the very feature that makes Bitcoin and other cryptocurrencies advantageous over more traditional currencies.
Proponents of increasing the Bitcoin block size limit, including Andresen, say that keeping the limit in place will produce a worse outcome than raising it. They also argue that the increase in mining costs is an inevitable effect of economies of scale, driven by a combination of larger transaction volumes and the limits of current mining technology. As for rising data storage costs, proponents would say that anyone worried about them is simply not accounting for Moore’s Law.
Regardless, Andresen’s continued work on Core is curious considering his past criticism of it and his vocal advocacy of increasing the block size limit. It may be that alternative clients like Classic and XT are simply having a difficult time gaining acceptance from the Bitcoin community at large, and as a result Andresen is beginning to believe that a hard fork may no longer be a viable way to increase the block size limit.
What do you think of Gavin Andresen submitting new commits to Core? Let us know in the comments below!
Images courtesy of Pixabay, WebSummity via Flickr