Peter R said:
However, if you agree that O(1) block propagation will not exist, then this means:
I can agree that propagation impedance will effectively be non-zero (though, I suspect, indistinguishably so) because of some simple things that are currently fact (and this holds even if miners don't compose blocks of their own volition), for example:
- Reconstructing a block after a reconciled IBLT set involves computing a Merkle root, which takes O(n) hash operations (over a tree of depth O(log n)), making propagation impedance non-zero.
- IBLTs are probabilistic and only reach a 0% chance of decode failure as the transmission grows infinitely large; assuming an infinitely sized transmission is impossible, reconciliation will fail in some fixed percentage of cases, making propagation impedance non-zero.
- Forgoing full block reconstruction and 'SPV' mining, however short-lived, carries added risk for a duration proportional to the size of the block, which makes propagation impedance non-zero whenever the assumption that the non-validated block is valid turns out to be false.
- Only after a weak-block-propagated valid block is discovered (0 impedance) can new weak blocks be generated and distributed across the network, which takes time proportional to block size; propagation impedance is therefore non-zero whenever a new valid block is found before the weak blocks have propagated.
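The IBLT bullet can be made concrete with a toy sketch. This is not Bitcoin's actual IBLT proposal - the cell layout, hashing, and parameters below are simplified assumptions - but it shows the relevant property: the decode failure rate falls as the table grows, yet never reaches zero at any finite size.

```python
import random

def iblt_decode(items, cells, k=3, seed=1):
    """Toy insertion-only IBLT: insert each item into k cells (one per
    subtable), then recover the set by 'peeling' cells that hold exactly
    one item. Returns True if every item was recovered."""
    rnd = random.Random(seed)
    sub = cells // k                      # cells per subtable
    salts = [rnd.randrange(1, 1 << 30) for _ in range(k)]
    count = [0] * (sub * k)
    keysum = [0] * (sub * k)

    def indices(x):
        # one well-mixed cell index per subtable
        return [i * sub + hash((salts[i], x)) % sub for i in range(k)]

    for x in items:
        for j in indices(x):
            count[j] += 1
            keysum[j] ^= x

    recovered = set()
    progress = True
    while progress:
        progress = False
        for j in range(sub * k):
            if count[j] == 1:             # a "pure" cell: exactly one item left
                x = keysum[j]
                recovered.add(x)
                for jj in indices(x):     # peel the item out of all its cells
                    count[jj] -= 1
                    keysum[jj] ^= x
                progress = True
    return recovered == set(items)

# Failure rate drops sharply as the table grows, but never hits exactly zero:
m = 40
items = list(range(1, m + 1))
small = sum(not iblt_decode(items, cells=m, seed=s) for s in range(20))      # undersized table
large = sum(not iblt_decode(items, cells=6 * m, seed=s) for s in range(20))  # generous table
print("failures out of 20 trials - small table:", small, "large table:", large)
```

Pushing the failure rate down costs transmission size (more cells), which is exactly the trade-off that keeps impedance non-zero.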
Barring other (not yet discovered / described) proposals from this analysis, I suppose there may very well always be some factor, however minor, that results in a non-zero-impedance gotcha.
In other words: a fee market exists at some point.
Now that we got that out of the way, we can, as you say, talk about what that might mean.
----
Peter (or, more specifically, others in this thread), don't get me wrong - I was incredibly excited when I read your paper; my objection is not to the concept but to its application. Thinking of block space as a commodity was a novel concept to me, and I think it might very well be key to sprouting a fee market out of dust.
As to whether block-size-dependent orphan risk is the right measure for reaching an appropriate equilibrium, I am doubtful, for the primary concern outlined. More specifically, I think propagation impedance might be indistinguishably close to zero, such that the resulting equilibrium is irrelevant in practice.
In order to explain this concern better, some further context on the perspective from which I observe the debate over the holy block size limit:
Miners are given incentives to act rationally and honestly, and they build a highly secure and increasingly irreversible journal / record as a result. They can put in place whatever infrastructure they desire to overcome whatever unfortunate scaling properties inherently exist (O(n) per-node validation will never be a problem for them, for example), or expend massive resources to actually solve scalability (notably block propagation scalability), if that were possible in the first place. Indeed, BIP 101 would be entirely feasible to roll out if miners were the only ones such a change would affect. For example, they can afford to purchase redundant connectivity at over-the-top bandwidths to account for a massive number (n) of transactions, they can distribute a large number of highly connected nodes across the globe to push out their block solutions quickly (or, of course, they could just patch into the block relay network), and they can account for any number of unforeseen problems that would otherwise prohibit them from scaling. In and of itself this is great - but not really, as miners are not the only ones concerned with capacity.
Participants that are 'mere' auditors of the miners' efforts - full nodes - are given as an incentive only the confidence and certainty that their money (assets / property / past efforts) exists according to the rules they enforce. These participants are what keep the network useful: not because they secure the block chain like miners do, but because they preserve the value contained within. They are economically dependent on the system working as advertised, and so long as they are able to audit it, the system continues to function, be useful, and be decentralized (which, I should note, indirectly secures it as well). While this incentive is large, it is not boundless, and it does not allow many such auditors to appropriately react to changing parameters, presumably including those proposed in BIP 101.
Individual auditors face 2 main scaling problems:
1) Per-node continuing validation complexity. This is O(n), meaning they need to be able to keep up with the current rate of transactions.
2) Per-node full validation complexity. This is O(n^k), where n is the number of transactions and k reflects the rate at which the transaction rate itself increases (or decreases). It is the complexity of a full block chain validation from genesis, and it no longer applies to nodes that are already running. Under BIP 101, k would be ~1.4 (for a year's worth) in the worst case.
There's also network-wide broadcast complexity, which is O(n^2) and not great at all, but auditors are not specifically concerned with this - network architects are.
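Problem 2 can be illustrated with some toy arithmetic. The every-two-years doubling is BIP 101's schedule; the base transaction rate and block cadence below are illustrative assumptions, not measurements.

```python
# Toy model of full-validation cost from genesis under a BIP 101-style
# schedule where per-block capacity doubles every 2 years.
BLOCKS_PER_YEAR = 52560  # ~10-minute blocks

def txs_in_year(y, base_tx_per_block=5000):
    """Transactions processed during year y (0-indexed), at full capacity."""
    return base_tx_per_block * 2 ** (y // 2) * BLOCKS_PER_YEAR

def cumulative_txs(years, base_tx_per_block=5000):
    """Total transactions a brand-new auditor must validate from genesis."""
    return sum(txs_in_year(y, base_tx_per_block) for y in range(years))

# A node that is already running only keeps up with the current year's
# rate (problem 1); a new auditor must replay everything (problem 2):
for years in (2, 6, 10):
    print(years, "yrs:", txs_in_year(years - 1), "per year vs",
          cumulative_txs(years), "from genesis")
```

The gap between the two numbers is the extra barrier a prospective auditor faces that an incumbent does not, and it compounds with every doubling.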
More economically dependent auditors is generally better, for obvious reasons. Economically-dependent-auditor count is not analogous to full-node count, however; the number of economically dependent auditors is, it seems, impossible - or at the very least impractical - to measure.
Since we cannot produce a useful number here, it would be useful if the barrier to fully validating (and becoming an auditor) were sufficiently low for a large enough number of them to 'naturally' exist. Increasing the complexity of the 2 scaling problems they primarily need to deal with (by increasing n or k) to values where this barrier is no longer sufficiently low is - I perceive - detrimental to their existence.
One way this barrier could be raised is by allowing block sizes that are large; if a natural, healthy fee market exists in a region that actually allows blocks to be large ('large' is a subjective term here, but I think you'll catch my drift), then this would presumably raise the barrier to fully validating, hurt the number of economically dependent auditors as a result, and make the network less useful and less decentralized (diminishing its utility as a censorship- and corruption-resistant network).
That's my perspective.
Now, using block propagation impedance as a measure to model the cost of block space, and to arrive at a natural fee market based on it, relies on a value that its producers (miners) can affect to their advantage - and they are given huge incentives to actively push that value toward zero, which would allow a fee market to exist at block sizes that are 'large'. In and of itself this is not a problem, were it not for the fact that they are, again, not the only ones who have to deal with block sizes being large.
Auditors are wholly out of the loop and cannot affect propagation impedance among miners; therefore, using it as the measure that naturally determines block sizes could result in blocks large enough to make the scaling problems auditors endure infeasible to overcome, forcing them to stop auditing and making the network less useful, as outlined above. That is my concern.
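For reference, the orphaning-cost mechanism can be sketched roughly as follows. This is a simplified model assuming propagation delay grows linearly with block size and Poisson block arrival; the impedance values are made-up placeholders, not measurements of any real relay path.

```python
import math

def marginal_orphan_cost(q_mb, impedance_s_per_mb,
                         reward_btc=25.0, block_interval_s=600.0):
    """Marginal cost (BTC per extra MB) of growing a block past q_mb MB.
    Expected orphan loss is reward * (1 - exp(-tau / T)) with propagation
    delay tau = impedance * q and expected block interval T; this returns
    the derivative of that loss with respect to block size."""
    tau = impedance_s_per_mb * q_mb
    return (reward_btc * (impedance_s_per_mb / block_interval_s)
            * math.exp(-tau / block_interval_s))

# A rational miner keeps adding fee-paying bytes while fees cover this
# marginal cost, so lower impedance pushes the break-even size outward:
for imp in (10.0, 1.0, 0.1):  # seconds per MB: plain p2p vs. faster relay schemes
    print(f"impedance {imp:5.1f} s/MB -> marginal cost at 1 MB:",
          marginal_orphan_cost(1.0, imp), "BTC/MB")
```

The pattern in the output is the point of my concern: each order-of-magnitude drop in impedance drops the marginal cost by roughly the same factor, so the block size at which fees balance orphan cost grows correspondingly, with auditors having no input into any of these parameters.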
I would be very interested in seeing concrete numbers on where along the 'block size axis' a fee market would potentially form using current P2P propagation metrics, fast relay network metrics, and - if possible - what we currently know about IBLT and/or weak blocks.
As far as speculating on them goes: I suspect the first (P2P propagation) would be somewhat suitable, and that, as propagation impedance is pushed toward zero by the last 3 methods of propagation, the resulting block sizes would become increasingly unsuitable - and exponentially so as technology (base latency and throughput, for example) moves forward.