Thanks for your response.
Natural Block Size Growth--the chart below shows growth in the average block size. Since this is a log chart, the slope of the curve represents the percentage growth per year. Until very recently, Q* was to the left of Qmax (refer to the charts in my talk), and so the production quota had no economic impact on the equilibrium quantity of block space produced. In other words, had the limit been 2 MB instead, the graph shown below would very likely have been almost identical.
If we project this historical growth rate forward (refer to the region indicated by the red circle above), nothing too scary happens. Even in 2020 (5 years from now), it is quite possible that the average block size would still be under 8 MB, even without a hard limit.
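As an aside, the arithmetic of that projection is easy to sanity-check. A quick sketch, with a starting figure I'm assuming for illustration rather than reading off the chart:

    # What sustained yearly growth rate takes an assumed ~0.45 MB average
    # block to 8 MB in 5 years? Solve target = today * (1 + r)**years for r.
    avg_today_mb, target_mb, years = 0.45, 8.0, 5
    r = (target_mb / avg_today_mb) ** (1.0 / years) - 1
    print("break-even growth: %.0f%% per year" % (100 * r))
    # -> roughly 78%/yr; anything slower keeps the 2020 average under 8 MB.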
Still, I'm not particularly interested in how the block size has grown in the past because, as the saying goes, past results are no indication of the future. From the perspective of participating in this network as an independent auditor, I must consider it likely - had there been no limit - that the block size could grow 1000-fold, or to whatever size prohibits my ability to validate the network, within the next year or so.
I can think of any number of ways the blockchain might naively (and, in my opinion, wrongly) be used - abused, rather - to solve all manner of real-world problems, had there been no limitation on the block size, resulting in a system that I and many others would simply be unable to validate. That, to me, is unacceptable, as it would destroy the utility of Bitcoin as a decentralized system. I would be more than willing to scale up considerably, but there are limits to what is realistic - and I believe the result of your paper would lead to block sizes far beyond any such limit.
The current limit, unfortunate as it may be, is what keeps all of this at bay, and it forces people to explore ways of employing the blockchain such that it does scale. Factom is an interesting example, as is the proposed Lightning Network.
Schemes to Achieve Coding Gain--Right now I normally pay about 0.0001 BTC to make a transaction. Let's call it 2 cents per TX. Imagine we experience another growth spurt and the price of 1 bitcoin increases to $10,000. Given today's network propagation impedance (about 7 sec / MB according to my estimates), each transaction would then cost $1.00!
Now, imagine that with improvements to network propagation, we can decrease the propagation impedance by a factor of 100 (say we get 25x through coding gain, and 4x through faster bit rates). This would then drop the price of a transaction from $1.00 to $0.01. At 1 cent per TX, Bitcoin could still be used as an efficient payment network.
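As I read your paper, those figures fall out of the orphan-risk fee floor: the smallest fee a rational miner accepts covers the marginal orphan risk of including the transaction. A rough sketch of that arithmetic (the exact marginal-cost form and the 250-byte transaction size are my own assumptions, not quotes from the paper):

    # Fee floor ~ reward * (tx_size * impedance) / block_interval
    reward_btc = 25.0     # current block subsidy
    interval_s = 600.0    # target block interval
    tx_size_mb = 250e-6   # a typical ~250-byte transaction (assumed)

    def min_fee_btc(z_s_per_mb):
        """Fee floor implied by orphan risk at propagation impedance z."""
        return reward_btc * tx_size_mb * z_s_per_mb / interval_s

    for price_usd, z in [(200.0, 7.0), (10000.0, 7.0), (10000.0, 0.07)]:
        print("BTC at $%5.0f, z = %4.2f s/MB -> fee ~ $%.2f"
              % (price_usd, z, min_fee_btc(z) * price_usd))
    # -> ~$0.01 at today's price, ~$0.73 at $10,000, and back to ~$0.01
    #    after a 100x drop in impedance - matching the figures above.

That arithmetic holds only as long as the cost of relaying a block actually scales with its size - which brings me to my objection.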
Schemes such as IBLT (and, to some degree, the relay network) don't behave like '7 seconds per MB'; they behave like '7 seconds for the full block - regardless of its size'.
Yes, it still takes time for this constant-sized message to propagate through the network, but the block size no longer affects the orphan rate; in these schemes, the size of the block has essentially nothing to do with the size of the message used to communicate it. The Shannon-Hartley theorem is not violated, because these schemes effectively let us tell the recipient which previously-communicated payload to 'select' in order to reconstruct the intended message.
Your paper outlines the way block propagation currently happens in the peer-to-peer network, and its result would hold completely were it not for the fact that the contents of a block - the transactions - are already known to every participant, since they, like blocks, propagate through the network. This last bit is what makes efficient, constant-size block announcements such as IBLT a reality, and it, frankly, rather breaks the result of your paper.
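To make that concrete, here is a toy, key-only IBLT in Python - purely an illustrative sketch of the idea with made-up parameters, not the actual proposal:

    import hashlib, os

    # Toy IBLT: the sender XORs each txid into a handful of cells; the
    # receiver XORs in everything already in its mempool, and whatever
    # "peels" out is the difference between the two sets.
    CELLS, HASHES, TXID_LEN = 80, 3, 32

    def _h(txid, i):
        return int.from_bytes(hashlib.sha256(bytes([i]) + txid).digest()[:8], "big")

    def toggle(table, txid, sign):
        key, chk = int.from_bytes(txid, "big"), _h(txid, HASHES)
        for i in range(HASHES):
            cell = table[_h(txid, i) % CELLS]
            cell[0] += sign; cell[1] ^= key; cell[2] ^= chk

    def decode(table):
        recovered, progress = [], True
        while progress:
            progress = False
            for cell in table:
                if abs(cell[0]) == 1:                   # cell holds one item
                    txid = cell[1].to_bytes(TXID_LEN, "big")
                    if _h(txid, HASHES) == cell[2]:     # checksum says clean
                        recovered.append(txid)
                        toggle(table, txid, -cell[0])   # peel it out
                        progress = True
        return recovered

    # A 10,000-tx block; the receiving peer already has all but 5 of them.
    block   = [os.urandom(TXID_LEN) for _ in range(10_000)]
    mempool = block[5:]

    table = [[0, 0, 0] for _ in range(CELLS)]
    for tx in block:   toggle(table, tx, +1)   # sender encodes the block
    for tx in mempool: toggle(table, tx, -1)   # receiver subtracts its view
    print(len(decode(table)), "missing txids recovered from", CELLS, "cells")

Grow the block a thousand-fold and, so long as the receiver's mempool keeps pace, that same 80-cell table still carries the announcement; only the mempool *difference* costs bandwidth.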
At 1 cent per TX, Bitcoin could still be used as an efficient payment network.
My interest is not in the cost of pushing a transaction onto the blockchain. My interest is in being able to validate the transactions that have been pushed onto it and to *know* the system *is* what *the system is*. (I'm fine with scaling up considerably; BIP 101 is somewhat on the 'far too optimistic' side of things, and rather uninspired if you ask me - but I could roll with it if the desire were great enough (luckily it isn't), albeit under protest.)
At 100x more throughput (which is what the 100x decrease in propagation impedance you mention would yield), we would have an effective block size of some 30 MB. That's probably fine.
But what if the propagation impedance drops to, effectively, zero - as an IBLT or some such scheme could pull off? What then? Do we have a boundless block size? Would transaction fees not drop to 0 satoshi? And would we not end up with a throughput that not a single entity in the world could validate?
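Push the same back-of-the-envelope from above to its limit (again, assuming my marginal-cost reading of the paper is right):

    # The same fee floor, taken to the limit: as impedance z goes to
    # zero, the model's rational minimum fee goes to zero with it.
    reward_btc, interval_s, tx_size_mb = 25.0, 600.0, 250e-6
    for z in [7.0, 0.07, 0.0007, 0.0]:
        fee_btc = reward_btc * tx_size_mb * z / interval_s
        print("z = %8.4f s/MB -> fee floor = %.10f BTC" % (z, fee_btc))
    # Nothing in the model props up the fee or bounds Q* once block
    # propagation is effectively free.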
----
And as an aside, I would argue against Bitcoin being an efficient *payment* network (now or ever), for reasons unrelated to this debate. We're *using* it as a payment network, but it has become quite clear that this has problems. Bitcoin is a settlement network, plain and simple. The analogy of Bitcoin as a digital form of cash payments sounds nice, but it's false and breaks down rather quickly: when I hand you a dollar bill, I don't need to wait ~10 minutes for you to receive it, nor does the entire world see that transaction.
Waiting for payment is waiting for settlement. Bitcoin is a settlement network, and not even an efficient one at that. But it is secure, independent of everything - decentralized - and *relatively* fast (which isn't hard, considering the competition). Payment networks may be built on top of it, as the Lightning Network proposal demonstrates, but Bitcoin itself is not one.