Block space as a commodity (a transaction fee market exists without a block size limit)

dgenr8

Member
Sep 18, 2015
62
114
The ability for anyone to audit the blockchain (at reasonable cost) is what makes Bitcoin Bitcoin.
@Yoghurt114 You sound knowledgeable enough to be someone who is involved in developing bitcoin. So let me put it to you as a development challenge. Help to design a network that can be audited by the public, without an arbitrary member of the public needing to verify the entire history themselves.

There are only two things needed:
  • The ability for an arbitrary member of the public to audit a self-chosen and selectable-sized slice of the blockchain history
  • A transparent system, like the one we have now, whose data is public and whose operational rules can be verified

This has been done before. Companies used to shut down for a day, weekend, or longer to perform an "exhaustive physical inventory." During this procedure, the company stopped all physical i/o activity and literally put everyone to work counting things. Then, for 1 millisecond on Monday morning, they knew exactly what their inventory was. Yay! Then things got worse again (through errors etc) for another year.

Then, at some point "cycle counting" was invented. Cycle counts are partial counts done at more frequent intervals under rules designed to achieve the required level of confidence. When a company has proper cycle counting procedures, official auditors can allow that company to skip exhaustive physical inventories forever.

The cycle counting method requires better software, but is actually preferred by official auditors. It produces higher confidence in the ongoing accuracy of the inventory database.

Getting back to the blockchain, slices could be taken in various orthogonal directions to effectively cross-check each other (i.e., by txid, script, or block, making different decisions about which data not to audit).
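The confidence math behind partial audits can be sketched quickly. A hypothetical toy model, assuming auditors sample blocks uniformly at random (the function name and all numbers are illustrative, not from any proposal):

```python
def detection_probability(total_blocks, bad_blocks, audited_blocks):
    """Chance that a uniform random audit of `audited_blocks` out of
    `total_blocks` touches at least one of `bad_blocks` corrupted
    blocks (hypergeometric 'miss' probability, complemented)."""
    miss = 1.0
    for i in range(audited_blocks):
        miss *= (total_blocks - bad_blocks - i) / (total_blocks - i)
    return 1.0 - miss

# Auditing just 5% of a 400,000-block history that hides 100 bad
# blocks still detects the corruption with ~99% probability.
p = detection_probability(400_000, 100, 20_000)
```

This is the cycle-counting intuition in miniature: modest, ongoing samples compound into high confidence without anyone replaying the whole history.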

Building this may take a while. So be sure to promote BIP101 in the meantime.
 

dgenr8

Member
Sep 18, 2015
62
114
So far it hasn't come from a rising exchange rate, it's come from inflation due to the generation of new coins.
You missed the point. The rising exchange rate has more than made up for the decrease in coin generation. Drastically so. The risk to miner incentives today is far less than it was in 2012.
 

dgenr8

Member
Sep 18, 2015
62
114
Now I receive it, unpack it, see I'm missing transactions (because they didn't propagate all the way to me yet, I didn't see them, or something along these lines) and have to get those transactions before I can verify it and pass it along. So it's not just 4 bytes, it could be many.
@humanitee Yes. When we hear that IBLT, weak blocks, or the relay network are O(1), the claimants are completely ignoring this. They are actually presuming that the transaction content itself is already globally shared, consistent, and validated when they make the claim that a few bits are all that need be transmitted to the next miner -- meaning all miners, since the identity of the next miner is unknown.
 
  • Like
Reactions: awemany and Peter R

dgenr8

Member
Sep 18, 2015
62
114
Anyways, this is totally academic until we're flying around in spaceships at speeds approaching the speed of light.
Quantifying the time offset of different nodes' observations of the same transaction isn't just academic. In this proposal I do just that in order to develop a miner policy that would make relying on unconfirmed transactions safe after waiting a little while (paper says 30s, now closer to 15s). The source of differing observations isn't relativity of course, just unpredictable networks.
 
  • Like
Reactions: Bloomie and Peter R

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
Yes, but I hope that just a certain size of the network and random incoming transactions are enough to show a floor on the necessary information to synchronize. Because a certain size of the network would also imply that there's more than a single miner (combined ownership aside).
I really like your idea to associate a size s with the network! It is great for explaining how real-time (100%) consensus is impossible.

Quantifying the time offset of different nodes' observations of the same transaction isn't just academic. In this proposal I do just that in order to develop a miner policy that would make relying on unconfirmed transactions safe after waiting a little while (paper says 30s, now closer to 15s). The source of differing observations isn't relativity of course, just unpredictable networks.
I didn't realize who you were until now. You're a consistent voice of reason on the developer mailing list!



@Yoghurt114

I think we've convinced you that IBLTs (at least as proposed) are not true O(1) block propagation. Assuming you've been following the last few pages in this thread, have we convinced you also that:

1. "Weak blocks" would not result in true O(1) block propagation, and
2. True O(1) block propagation is impossible if more than a single miner/mining pool exists?
 
Last edited:
  • Like
Reactions: awemany and Bloomie

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
I think we've convinced you that IBLTs (at least as proposed) are not true O(1) block propagation. Assuming you've been following the last few pages in this thread, have we convinced you also that:

1. "Weak blocks" would not result in true O(1) block propagation, and
2. True O(1) block propagation is impossible if more than a single miner/mining pool exists?
I am pretty convinced now myself that O(1) does not exist. However, I would still like to show this in detail, by showing that the entropy in the time uncertainty times the transaction rate is a proportional factor in any efficient block propagation scheme. I am kind of stuck there with regard to the math, because it becomes really weird statistics when allowing any block selection scheme. Basically, I am trying to find a formula for the mean minimum bin of a histogram of Poisson-distributed values (the average minimum number of transactions overlapping with uncertainty dt, assuming two nodes can magically 0-bit agree upon selecting the point in time with minimum overlap of uncertainty windows - which they actually can't, of course). If that even has a closed-form expression...

I am thinking about some simple monte-carlo stuff to explore this but that will take a while. Further ideas appreciated!
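For the Monte Carlo side, a minimal sketch of the quantity described above (parameters are assumed for illustration; Poisson draws use Knuth's multiplication method so only the stdlib is needed):

```python
import math
import random

def mean_min_poisson_bin(rate, bins, trials, seed=42):
    """Monte-Carlo estimate of E[min of `bins` iid Poisson(rate)
    counts] -- the 'mean minimum bin' quantity described above."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        counts = []
        for _ in range(bins):
            # Knuth's method: multiply uniforms until below exp(-rate).
            limit = math.exp(-rate)
            k, p = 0, 1.0
            while True:
                p *= rng.random()
                if p <= limit:
                    break
                k += 1
            counts.append(k)
        total += min(counts)
    return total / trials

# e.g. 10 nodes/bins, each seeing Poisson(5) uncertain transactions:
est = mean_min_poisson_bin(rate=5.0, bins=10, trials=2000)
```

For rate 5 and 10 bins this lands near 1.9, well below the single-bin mean of 5 - the magic 0-bit agreement buys a lot, which is exactly why it is unrealistic.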
 

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
"I am pretty convinced now myself that O(1) does not exist."

New thought experiment:

A miner (on one edge of the network) receives a constant stream of new transactions (that no other miner has yet seen). There must exist some fee/kB that a new transaction could pay to entice the miner to immediately include it in the block he's working on. There's also a nonzero chance the miner solves this new block before the newly-included transaction has propagated across the network, in which case the miner must communicate the (assumed) O(1) component of the block solution plus information that describes this new transaction. Thus O(1) block propagation will not exist if miners are free to build blocks according to their own volition.
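The break-even point in this thought experiment can be made concrete. A toy calculation, assuming (as a linearization, not stated in the post) that a small extra propagation delay dt raises the orphan probability by roughly dt/T:

```python
# Toy expected-value check for the thought experiment above: a rational
# miner includes a fresh transaction iff its fee beats the extra orphan
# risk it creates.  All numbers are illustrative assumptions.

BLOCK_REWARD = 25.0      # BTC, 2015-era subsidy
BLOCK_INTERVAL = 600.0   # seconds, mean time between blocks

def min_fee_to_include(extra_propagation_s):
    """Fee (BTC) at which inclusion breaks even: for a small extra
    delay dt, the marginal orphan probability is roughly dt / T."""
    marginal_orphan_prob = extra_propagation_s / BLOCK_INTERVAL
    return BLOCK_REWARD * marginal_orphan_prob

fee_floor = min_fee_to_include(0.1)
```

At the 2015 subsidy of 25 BTC, a transaction adding 0.1 s of propagation delay needs to pay more than ~0.004 BTC before immediate inclusion is rational - a nonzero fee floor, hence no true O(1).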
 
Last edited:

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
Greg Maxwell agrees that weak blocks do not permit true O(1) propagation (i.e., a fee market must still exist):

"Unless the weak block transaction list can be a superset of the block transaction list size proportional propagation costs are not totally eliminated." -- Greg Maxwell
 
  • Like
Reactions: awemany

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
"Unless the weak block transaction list can be a superset of the block transaction list size proportional propagation costs are not totally eliminated." -- Greg Maxwell
@Peter R: Big blocks take long to propagate, or propagate quickly - whatever suits his argument at the moment. But it's still good that he's on the record with that; maybe it's enough to shut up his fanboys on Reddit. Thanks for the link.


A miner (on one edge of the network) receives a constant stream of new transactions (that no other miner has yet seen). There must exist some fee/kB that a new transaction could pay to entice the miner to immediately include it in the block he's working on. There's also a nonzero chance the miner solves this new block before the newly-included transaction has propagated across the network, in which case the miner must communicate the (assumed) O(1) component of the block solution plus information that describes this new transaction. Thus O(1) block propagation will not exist if miners are free to build blocks according to their own volition.
Yes, something I also pondered. I think @cypherdoc mentions something similar when people invoke the bloatblock scare as the true reason for the blocksize limit and somehow argue it will forever be valid - but bloatblocks are much more costly to propagate if you have to push new transactions along with them.

The 'problem' with your argument is that it invokes the market. And funnily, people have seemingly been very successful in using any market/game theory argument to sway people (not necessarily to be right).

For example, Peter Todd argues that cut-throat, narrow-minded, small-scope business games will be unavoidably harmful to Bitcoin at various points without a blocksize limit. But he fails to see or mention that he himself is only a market participant, too - and his arguing against bigger blocks and supporting the surrounding core dev actions (+ theymos) is just such a game, only with a bigger scope.
 
  • Like
Reactions: Peter R

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
"The 'problem' with your argument is that it invokes the market. And funnily, people have seemingly been very successful in using any market/game theory argument to sway people (not necessarily to be right)."

I agree it would be better to prove that O(1) propagation is impossible without an economic argument, but I'm not sure we can. For example, your diagram convinces me that it's not possible for miners to come to consensus about the contents of blocks without the communication of information about the contents of said blocks; however, I'm unconvinced that--if we also allow the miners to behave in absurd ways--this communication can't take place entirely before the block solution announcement.

In fact, I think I can prove that O(1) block solution propagation is possible if we allow miners to behave in absurd ways. For example, imagine that all the miners agree to mine a practice blockchain in addition to the real blockchain. Let's also imagine that the real blockchain is always 10 blocks behind the practice chain. Everyone agrees that the next block in the real blockchain contains exactly the same transactions, in the same order, as the block 10 earlier in the practice chain. This type of absurd scheme would allow for O(1) block solution propagation, but only because the miners have already come to consensus (which means the second blockchain served no purpose).
 
  • Like
Reactions: awemany

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
@Peter R.: Yes, that's a good point.

Floating transactions around is not part of the validation step, but it clearly needs n transmissions for n transactions (funnily, for that part the BS guys say it is O(n^2) at times...).

Yes, the validation step can be O(1) - but only when the agreement on inclusion has been found before the block announcement. So, to rephrase what I wrote above, I am/was trying to show that the cost of agreeing on a validated subset of the transactions that have already come around is again something more complex than O(1). And as I wrote, I believe that is true because of the constant transaction arrival time uncertainty, so the rate of transactions with uncertain ordering is proportional to the total transaction rate...

In that light, your example of a 'practice chain' would just be another channel that is used to transmit that information.

But you are right, the value of doing that is actually doubtful in the bigger context.

In a way, any argument about 'O(1) dangers' is saying, in other words, that somehow it is good for a miner to have a cutoff on how recent a transaction to include. I think that's true to some extent, especially with O(1) propagation - but it is rather a market of miners competing to have the most transactions in a block that still propagates easily than any danger. Very recent transactions will encounter higher propagation impedance and are thus more likely to be excluded.

But as soon as someone argues that a miner will do O(1) attacks by going into the past, that means he's bounded by the size of the O(n)-transmitted transaction set.

In the end, can't we just say that, for the fee market to exist, transactions that are validated (included in blocks) have to be gossiped first - and that is clearly an Omega(n) process? So, because mining depends on gossiping, the total information transmission cost of mining a transaction is always at least a constant?

What is the counterargument here?

EDIT: On the matter of doing absurd block propagation schemes - when allowing probabilistic set reconciliation, the miner could just send the block header with the merkle root hash and say 'guys, figure it out yourself which transactions I am thinking you should be taking from your mempool'. That's in O(1) for the communication overhead but in O(2^n) on the client for actually figuring out WTF the miner intended to do :D
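The 'figure it out yourself' scheme can be demonstrated directly. A toy sketch (the names and the string-hash commitment are invented for illustration): the sender transmits only a fixed-size digest, and the receiver enumerates up to 2^n mempool subsets to reconstruct the intended block.

```python
import hashlib
from itertools import combinations

def find_block_subset(mempool, commitment, prefix_hex=16):
    """Brute-force which subset of `mempool` a commitment refers to.
    Sending the commitment is O(1); solving it scans up to 2^n
    candidate subsets -- the processing-for-communication tradeoff."""
    for r in range(len(mempool) + 1):
        for subset in combinations(sorted(mempool), r):
            digest = hashlib.sha256("|".join(subset).encode()).hexdigest()
            if digest[:prefix_hex] == commitment[:prefix_hex]:
                return set(subset)
    return None

mempool = ["tx%d" % i for i in range(8)]
intended = {"tx1", "tx3", "tx6"}
commitment = hashlib.sha256("|".join(sorted(intended)).encode()).hexdigest()
recovered = find_block_subset(mempool, commitment)
```

Shortening the transmitted prefix shrinks communication even further but raises the odds the receiver settles on a wrong subset - the same tradeoff, just made probabilistic.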
 
Last edited:

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
@sickpig: Yes, well, kind of. But it does show that, using probabilistic algorithms, one can indeed make tradeoffs going arbitrarily far in the direction of O(1) communication overhead - but bought with a huge processing overhead.

Funnily, 'being reasonable' comes into the equation here, too: No sane full node will accept that algorithm (which is - as you say - rather closer to another POW challenge than a real, usable set reconciliation algorithm) and so that block without a transaction body would be orphaned. However, you might be able to replace 'being reasonable' with the cost of processing time overhead.
 

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
"one can indeed make tradeoffs going arbitrarily close into the direction of O(1) communication overhead, but bought with a huge processing overhead."

Who knows, maybe that could turn out to be another one of your uncertainty principles. Something like:

(communication overhead) x (processing overhead) > h.
 
  • Like
Reactions: awemany

Yoghurt114

New Member
Sep 17, 2015
21
8
@Yoghurt114 You sound knowledgeable enough to be someone who is involved in developing bitcoin.
Thanks, but I am a mere interested observer.

So let me put it to you as a development challenge. Help to design a network that can be audited by the public, without an arbitrary member of the public needing to verify the entire history themselves.

There are only two things needed:
  • The ability for an arbitrary member of the public to audit a self-chosen and selectable-sized slice of the blockchain history
  • A transparent system, like the one we have now, whose data is public and whose operational rules can be verified
You're describing the system that currently exists.

Then, at some point "cycle counting" was invented. Cycle counts are partial counts done at more frequent intervals under rules designed to achieve the required level of confidence. When a company has proper cycle counting procedures, official auditors can allow that company to skip exhaustive physical inventories forever.
Such a cycle is precisely analogous to the discovery of a valid new best block.

Peter R said:
I think we've convinced you that IBLTs (at least as proposed) are not true O(1) block propagation. Assuming you've been following the last few pages in this thread, have we convinced you also that:

1. "Weak blocks" would not result in true O(1) block propagation, and
2. True O(1) block propagation is impossible if more than a single miner/mining pool exists?
Yes. I'd like to add some important nuance: if O(1) block propagation does happen, you would always be able to conflate its implementation into the *interpretation* of a single mining pool. Which would make the above points always hold.

I feel, though, that the essence of the primary concern I rather specifically formulated when I first entered this thread, and have only repeated since, is being bypassed in favor of a discussion about whether or not O(1) propagation is possible in the first place.

I never claimed that the impossibility of O(1) block propagation would automatically make the results in the paper feasible enough to be utilized in Bitcoin as it exists today (absent a block size limit). I'll stress once more:

My concern is that coming to a natural block size limit the way described in the paper would make for block sizes that are too large for an independent auditor to possibly validate.

A journey into the past:

https://bitco.in/forum/threads/block-space-as-a-commodity-a-transaction-fee-market-exists-without-a-block-size-limit.58/#post-1051

So after pages of back and forth your response was this:

https://bitco.in/forum/threads/block-space-as-a-commodity-a-transaction-fee-market-exists-without-a-block-size-limit.58/page-2#post-1365

Peter R said:
I do, but, like LN, it is not relevant to this thread.
So perhaps my concern is irrelevant to figuring out whether or not block space can be thought of as a commodity.

But I don't think it's irrelevant to the real discussion: whether or not the result in your paper warrants removal of the limit, or significant-increase proposals such as BIP101 that are suited to it.

If this is not what we are discussing here, then I propose we move that discussion to another thread. I have some comments on that discussion that I think might be of interest and might add some nuance, but I don't want to hijack this thread.
 

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
Yoghurt114 said:
Yes. I'd like to add some important nuance: if O(1) block propagation does happen, you would always be able to conflate its implementation into the *interpretation* of a single mining pool. Which would make the above points always hold.
I'm not trying to play games with semantics here. I agree that if O(1) block propagation does happen, it requires a single mining pool, but my "interpretation" of a mining pool means at least one useful thing: it means that miners must forfeit their right to build blocks according to their own volition and must in fact agree not to build blocks in certain ways, even when doing so may directly increase their profitability (for example, miners cannot immediately include a new TX with a juicy fee in the block they are working on, even though doing so would be the rational thing to do).

So I don't think there is any sort of semantic trickery here. True network-wide O(1) block propagation will not exist if miners are free to build blocks according to their own volition and act rationally to maximize their profit.

Yoghurt114 said:
My concern is that coming to a natural block size limit the way described in the paper would make for block sizes that are too large for an independent auditor to possibly validate.
It is pointless to discuss what the equilibrium block size could become from the perspective presented in my paper if you believe that network-wide O(1) block propagation is somehow inevitable. However, if you agree that O(1) block propagation will not exist, then this means:

(a) the network propagation impedance is a useful concept
(b) a fee market will exist

If we can debate from this perspective (that the framework in my paper is correct and we're discussing the scale of effects), then I am happy to discuss what the equilibrium block size could become under different scenarios.

For example, it would be very interesting to work out mathematically how an implementation of the "Weak Block" idea would affect the network propagation impedance and the transaction fee market.
 
Last edited:
  • Like
Reactions: cypherdoc

Yoghurt114

New Member
Sep 17, 2015
21
8
Peter R said:
However, if you agree that O(1) block propagation will not exist, then this means:
I can agree that propagation impedance will effectively be non-zero (though I suspect indistinguishably so) due to some simple things that are currently fact (and this will hold even if miners don't compose a block according to their own volition), for example:

- Reconstructing a block after a reconciled IBLT set involves the computation of a merkle root which scales like O(log n), making propagation impedance non-zero.

- IBLTs are probabilistic and have zero chance of failure only if the transmission is of infinite size - assuming such an infinitely sized transmission is impossible, propagation impedance will be non-zero in a set percentage of occurrences.

- Foregoing fully reconstructing a block and 'SPV' mining, however short-lived, involves some added risk for a duration proportional to the size of the block, which would make propagation impedance non-zero if the assumption that the non-validated block is valid turns out to not be true.

- Only after a weak-block-propagated-valid-block is discovered (0 impedance) can new weak blocks be generated and distributed in the network, which takes time proportional to the size of blocks, which would make propagation impedance non-zero if a new valid block is found before weak blocks have propagated.

Barring other (not yet discovered / described) proposals from analysis, I suppose there may very well always be some factor, however minor, that will result in a non-zero-impedance-gotcha.

In other words: a fee market exists at some point.
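For the merkle-root item in the list above, a minimal Bitcoin-style sketch (double-SHA256, little-endian txids, last node duplicated on odd levels). One nuance worth noting: the tree is log2(n) levels deep, but recomputing the root from scratch still hashes all n leaves; only verifying a single branch is O(log n).

```python
import hashlib

def sha256d(b):
    """Bitcoin's double SHA-256."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def merkle_root(txids):
    """Bitcoin-style merkle root from hex txids: each level hashes
    concatenated pairs, duplicating the last node when a level has an
    odd count.  Full recomputation touches every leaf."""
    level = [bytes.fromhex(t)[::-1] for t in txids]  # little-endian ids
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [sha256d(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0][::-1].hex()

txids = [hashlib.sha256(bytes([i])).hexdigest() for i in range(4)]
root = merkle_root(txids)
```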

Now that we got that out of the way, we can, as you say, talk about what that might mean.

----

Peter (or more specifically; others in this thread), don't get me wrong, I was incredibly excited when I read your paper; my objection is not with the concept but with its application. Thinking of block space as a commodity was a novel concept to me and I think it might very well be key to sprouting a fee market out of dust.

Whether block-size-dependent orphan risk is the right measure to arrive at an appropriate equilibrium, I am doubtful, for the primary concern outlined. More specifically, I think propagation impedance might be indistinguishably close to zero, such that the resulting equilibrium is irrelevant for application.

In order to explain this concern better, some further context on the perspective from which I observe the debate of the holy block size limit:

Miners are given incentives to act rationally and honestly, and they build a highly secure and increasingly irreversible journal / record as a result. They can put in place whatever infrastructure they desire to overcome whatever unfortunate scalability problems inherently exist (per-node O(n) validation will never be a problem for them, for example), or expend massive resources to actually solve scalability (notably block propagation scalability), if that were possible in the first place. Indeed, BIP 101 would be entirely feasible to roll out if miners were the only ones such a change would affect. For example, they can afford to purchase redundant connectivity at over-the-top bandwidths to account for the massive amount of transactions, they can distribute a high number of highly connected nodes across the globe to push out their block solutions quickly (or, of course, they could just patch into the block relay network), and they can account for any number of unforeseen problems that would prohibit them from scaling. In and of itself, this is great - but not really, as miners are not the only ones concerned with capacity.

Participants that are 'mere' auditors of the miners' efforts - full nodes - are given (as an incentive) only the confidence and certainty that their money (assets / properties / past efforts) exists according to the rules they enforce. These participants are what keep the network useful, not because they secure the block chain like miners do, but because they preserve the value contained within - they are economically dependent on the system working as advertised, and only so long as they are able to audit does the system continue to function, be useful, and be decentralized (which, I should note, indirectly secures it as well). While this incentive is large, it is not boundless, and it does not allow many such auditors to appropriately react to changing parameters, presumably including those proposed in BIP 101.

Individual auditors are facing 2 main scaling problems:

1) Per-node continuing validation complexity. This is O(n) and means they need to be able to keep up with the rate of transactions.

2) Per-node full validation complexity. This is O(n^k), where n is the number of transactions and k is the rate at which their rate increases (or decreases) - it's the complexity of a full block chain validation from genesis, and it no longer applies to nodes that are currently running. In BIP101, k would be ~1.4 (for a year's worth) in the worst case.

There's also network-wide broadcast complexity, which is O(n^2) and not great at all, but auditors are not specifically concerned with this - network architects are.
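The from-genesis cost in point 2 can be illustrated under the BIP101 schedule (capacity doubling every two years, i.e. ~1.41x per year, matching the k ~ 1.4 figure above). A toy sum, with units normalized to year-0 throughput:

```python
def cumulative_validation_work(years, doubling_years=2.0):
    """Total transactions a from-genesis validator must replay when
    per-year throughput doubles every `doubling_years`, in units of
    year-0 throughput (a geometric series, so growth is superlinear)."""
    total, rate = 0.0, 1.0
    for _ in range(int(years)):
        total += rate
        rate *= 2.0 ** (1.0 / doubling_years)
    return total

work_4y = cumulative_validation_work(4)   # ~7.24 year-0 units
work_8y = cumulative_validation_work(8)   # ~36.21: 5x, not 2x, the 4y cost
```

Doubling the elapsed time quintuples the replay cost here, which is the sense in which the barrier to becoming a new auditor ratchets up under exponential capacity growth.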

More economically dependent auditors are generally better, for obvious reasons. Economically dependent auditor count is not analogous to full node count; the number of economically dependent auditors is - as it seems - impossible, or at the very least impractical, to measure.

As a result of the fact that we cannot produce a useful number here, it would be useful if the barrier to fully validating (and becoming an auditor) were sufficiently low, in order for a large enough amount of them to 'naturally' exist. Increasing the complexity of the 2 scaling problems they primarily need to deal with (by increasing n, or k) to values where this barrier is no longer sufficiently low, is - I perceive - detrimental to their existence.

One way to potentially increase this barrier is to allow for block sizes that are large; if a natural, healthy fee market exists in a region which actually allows for blocks to be large ('large' is kind of a subjective term here, but I think you'll catch my drift), then this would presumably increase the barrier to fully validating and hurt the number of economically dependent auditors as a result, making the network less useful and less decentralized (it makes for diminished utility in its application as a censorship- and corruption-resistant network).

That's my perspective.

Now, using block propagation impedance as a measure to model the cost of block space, and come to a natural fee market based on it, is something the producers thereof (miners) can affect to their advantage, and they are given huge incentives to actively work toward reducing that value toward zero, which would allow for a fee market to exist at block sizes that are 'large'. In and of itself this is not a problem, if it weren't for the fact they are, again, not the only ones that need to deal with block sizes being large.

Auditors are wholly out of the loop and cannot affect propagation impedance among miners; therefore, using it as a measure to naturally determine the size blocks will become could potentially result in block sizes ending up large enough to make the scaling problems auditors endure infeasible to overcome, forcing them to stop auditing and making the network less useful, as outlined above. That is my concern.

I would be very interested in seeing concrete numbers on where in the 'block size axis' a fee market would potentially form using current P2P propagation metrics, fast relay network metrics, and - if possible - what we currently know about IBLT and/or weak blocks.

As far as speculating on them goes: I suspect the first (p2p propagation) would be somewhat suitable, and as propagation impedance is pushed toward zero by the other three methods of propagation, the resulting block sizes would become increasingly unsuitable - exponentially so as technology (base latency and throughput, for example) moves forward.
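A back-of-envelope version of the numbers requested above, using the linearized fee floor implied by Peter R's paper (minimum fee density ~ R·z/T, with z the propagation impedance in s/MB); the impedance values are assumptions for illustration, not measurements:

```python
def min_fee_density(z_s_per_mb, reward_btc=25.0, interval_s=600.0):
    """Approximate fee density (BTC/MB) below which adding block space
    costs a miner more in expected orphans than it pays, neglecting
    fees already in the block: rho ~ R * z / T."""
    return reward_btc * z_s_per_mb / interval_s

p2p_floor = min_fee_density(10.0)    # assumed ~10 s/MB plain p2p relay
relay_floor = min_fee_density(0.5)   # assumed ~0.5 s/MB fast relay network
```

With these assumptions the fee floor drops from ~0.42 to ~0.02 BTC/MB as impedance falls - consistent with the worry that better relay technology pushes the equilibrium toward much larger blocks.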
 
Last edited:
  • Like
Reactions: Bloomie

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
@Yoghurt114: Another perspective for a future with smart block propagation is kind of the other way around:

Miners in such a scenario are very much dependent on a) getting transactions from full nodes, and b) forming their blocks only from well-known transactions (or else their block impedance, and thus cost, rises sharply).
 

dgenr8

Member
Sep 18, 2015
62
114
@Yoghurt114 Thanks for the constructive post. Each of the world's volunteer auditors cannot audit all the world's coffee transactions, agreed.

Only one direction has been explored to solve this problem, namely: LIMITING the capacity of the entire system.

I'm not sure you got the meaning of my earlier post. It suggested another direction: developing a scalable auditing system.
 

Justus Ranvier

Active Member
Aug 28, 2015
875
3,746
Everybody who talks about coffee is missing the point to such an extent that adjectives are insufficient for the task of describing how badly they've missed it.

Who cares most about the integrity of the world's billion daily coffee sales?

It's not the coffee sellers.
It's not the coffee consumers.

It's the savers in the currency being used to buy coffee.

Savers care very much that the purchasing power of their deferred consumption is not being diluted by counterfeit currency.

A billion sales of coffee every day is a billion daily opportunities for them to be robbed via inflation.

Anyone who doesn't get this doesn't understand the purpose of money.
 
  • Like
Reactions: dgenr8