Block space as a commodity (a transaction fee market exists without a block size limit)

Zangelbert Bingledack

Well-Known Member
Aug 29, 2015
1,485
5,585
@Justus Ranvier

Nevertheless, what you are talking about here is one-time dilution by fractional reserve on the part of various possibly independent private entities, rather than continuous dilution by central banks, correct?
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,994
Everybody who talks about coffee is missing the point to such an extent that adjectives are insufficient for the task of describing how badly they've missed.

Who cares most about the integrity of the world's billion daily coffee sales?

It's not the coffee sellers.
It's not the coffee consumers.

It's the savers in the currency being used to buy coffee.

Savers care very much that the purchasing power of their deferred consumption is not being diluted by counterfeit currency.

A billion sales of coffee every day is a billion daily opportunities for them to be robbed via inflation.

Anyone who doesn't get this doesn't understand the purpose of money.
As a saver in that coffee currency, I very much want to see those billions of transactions go through; they create a continuous demand for the currency of my choice, which supports and drives its ultimate value.
 
  • Like
Reactions: Peter R

Yoghurt114

New Member
Sep 17, 2015
21
8
@Yoghurt114 Thanks for the constructive post. Each of the world's volunteer auditors cannot audit all the world's coffee transactions, agreed.
I'm arguing that they should [be able to]; we know of no way to drop full auditability as a requirement without compromising the system's integrity.

Only one direction has been explored to solve this problem, namely: LIMITING the capacity of the entire system.
Yes.

I'm not sure you got the meaning of my earlier post. It suggested another direction: developing a scalable auditing system.
So the direction you propose exploring further is anything that scales more favorably than O(n), i.e., better than fully validating everything.

Bear in mind that 'fully validate everything' inherently scales as O(n) per validator, so it can reasonably be assumed that an all-encompassing solution is impossible without completely rethinking things.

But there are a ton of proposals that try to address this; none of them appear to work without compromising at least one of the security pillars we currently have.

- UTXO commitments, where miners commit to the composition of the UTXO set in each block, allow nodes to fetch the UTXO set directly (which, really, is all that's needed to fully validate going forward). The compromise is that you have to trust that miners were not dishonest at any point in the past (which is reasonably safe because presumably many auditors existed in the past to check up on miners). In any case, UTXO commitments have other advantages: they allow SPV wallets to determine whether an output exists without checking the full branch of ownership, and they will help full nodes get up and running faster while they're validating the past. They should be implemented regardless, for those reasons alone. (Exactly how to commit is still an open question; a rough sketch follows this list.)

- zkSNARKs, an extremely sketchy branch of mathematics that I can't quite wrap my head around (despite trying). They would allow retrieving a proof that the full blockchain was validated in O(1) time. If I understand correctly, the conditions under which this proof is constructed need to be honest, which, again, requires you to trust that they were. A similar compromise to UTXO commitments.

- Sharding, similar in concept to Treechains, is essentially a full p2p and conceptual rewrite. The idea, in a sentence, is that no one validates everything; everyone validates a subset. Further, miners don't even mine everything; they mine a subset. Validation complexity under this model can be thought of as O(sqrt(n)), so that's great. As a downside, it's a full rewrite, unfinished (in concept), untested, unproven, and tends to depend on some clever economics to solve edge-case problems such as how fees work and what happens with reorganisations/forks at the tip. (As I'm writing this, I discover I need to do more research on this concept - so thanks ;))

[edited the following into post]

- Bitcoin-NG isn't a 'fully validate everything' scalability solution, but it's an interesting proposal nonetheless. Also a full p2p rewrite. PoW is repurposed as a 'leader election': miners still mine and do work as they do now, but when one finds a block, everyone agrees that until the next block is found, that miner is the sole entity allowed to add transactions to the block chain, in so-called microblocks it broadcasts every 10 seconds or so. It primarily solves the propagation problem and makes for more streamlined network activity. (This gives roughly zero propagation impedance and full freedom in block composition - well, not quite, because microblocks still need to propagate, but impedance can be made arbitrarily low under this model.) But it doesn't work as proposed, because the incentives are skewed in favor of paying a miner out-of-band, in which case there is no incentive to maintain the longest/best chain.

[/edit]
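
To make the UTXO commitment idea concrete, here is a minimal Python sketch of the mechanism described in the first item above. The leaf serialization, sorting rule, and function names are my own illustration, not any proposed standard:

[code]
import hashlib

def sha256d(data: bytes) -> bytes:
    """Bitcoin-style double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def utxo_leaf(txid: bytes, vout: int, amount_sats: int, script: bytes) -> bytes:
    """Hash one unspent output; the field layout here is illustrative only."""
    return sha256d(txid + vout.to_bytes(4, "little")
                   + amount_sats.to_bytes(8, "little") + script)

def merkle_root(leaves: list[bytes]) -> bytes:
    """Fold leaves pairwise into a single root, duplicating the last odd
    leaf as Bitcoin's transaction Merkle tree does."""
    if not leaves:
        return sha256d(b"")
    level = list(leaves)
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [sha256d(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

# Toy UTXO set: (txid, output index, amount in satoshis, scriptPubKey).
utxo_set = [
    (b"\x11" * 32, 0, 50_000, b"\x76\xa9\x14" + b"\xaa" * 20 + b"\x88\xac"),
    (b"\x22" * 32, 1, 75_000, b"\x76\xa9\x14" + b"\xbb" * 20 + b"\x88\xac"),
]

# A miner would commit this root in the coinbase (or a header field); a
# bootstrapping node fetches a snapshot from anyone, recomputes the root,
# and checks it against the chain's committed value.
commitment = merkle_root(sorted(utxo_leaf(*u) for u in utxo_set))
print(commitment.hex())
[/code]

A node that fetches a snapshot from an untrusted peer only has to recompute this root and compare it against what the chain commits to; the remaining trust assumption shifts entirely onto the historical honesty of the commitments, which is exactly the compromise described above.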

Only UTXO commitments are anywhere near a reality, the compromise there is well understood - and whether or not it's an acceptable one remains to be seen. I don't think it'd be wise to scale up hugely under the assumption that it is.

In any case, until such time that scalability solutions exist and actually work, we need to live with the current model and its inherent problems, not ignore them.
 
Last edited:

dgenr8

Member
Sep 18, 2015
62
114
@Justus Ranvier So your concern is with offchain accounts (denominated in the currency) settling with each other, becoming trusted, and then abusing that trust to create balances from thin air?

Do you see a way to prevent that? Otherwise, it sounds like you agree -- the more direct settlement between buyer and seller, the better.


But there are a ton of proposals that try to address this; none of them appear to work without compromising at least one of the security pillars we currently have.
@Yoghurt114 Exactly.

If a system can be built such that both the chance and the consequences of an auditing failure can be quantified and shoved into asymptotic irrelevance, we're there. We already tolerate the possibility of hash collisions, never-ending block races, etc.
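
For a sense of how irrelevant the already-tolerated failure modes are, a quick birthday-bound calculation (my own back-of-envelope, in Python):

[code]
# Birthday bound: with n random k-bit hashes, P(collision) ~ n^2 / 2^(k+1).
n = 10**9 * 365 * 100      # a billion coffee transactions/day for a century
k = 256                    # SHA-256 output size
p = n * n / 2 ** (k + 1)
print(f"{p:.1e}")          # ~5.8e-51: negligible on any human timescale
[/code]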
 
  • Like
Reactions: awemany

Justus Ranvier

Active Member
Aug 28, 2015
875
3,746
Nevertheless, what you are talking about here is one-time dilution by fractional reserve on the part of various possibly independent private entities, rather than continuous dilution by central banks, correct?
Both types harm the saver.

So your concern is with offchain accounts (denominated in the currency) settling with each other, becoming trusted, and then abusing that trust to create balances from thin air?
You've described a specific example of one of the things that can go wrong when buyers and sellers don't use Bitcoin.

My concern is that the vision of a future world split between a high-security "settlement currency" and a "transactional currency" is an exercise in magical thinking, based on the blind assumption that one particular historical pattern will repeat itself in spite of none of the conditions which caused that pattern to emerge being valid in the present.


Do you see a way to prevent that? Otherwise, it sounds like you agree -- the more direct settlement between buyer and seller, the better.
Make Bitcoin the best money to use so that people don't conduct their transactions in a medium of exchange other than Bitcoin. Especially don't intentionally cripple Bitcoin because of some unfounded belief in monetary reincarnation.

No form of money is a store of value. Some currencies exhibit store of value behaviour under the right conditions.

First among those conditions is: be a currency.
 
Last edited:

ladoga

Member
Sep 17, 2015
50
63
Wasn't gold also more valuable when it was used as a "global" currency?

During the Middle Ages one could trade in the gold solidus anywhere from Europe to China and everyone would accept it as a form of payment (the word "soldier" is a holdover from that era, originally meaning mercenary). Much of gold's current value could simply be a follow-on effect of its long history as a currency: people got used to the idea of its value, and that has kept the demand going even though most of its utility is long gone.

Bitcoin has never had such a dominant position as a currency. I suspect that if its utility as a currency were removed, it would have no potential as a store of value.
 
Last edited:

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
If a system can be built such that both the chance and the consequences of an auditing failure can be quantified and shoved into asymptotic irrelevance, we're there. We already tolerate the possibility of hash collisions, never-ending block races, etc.
And here I'd argue that with UTXO commitments, a year of transaction history is asymptotically as safe as running from the Genesis block, because the majority of hashing power would have to lie to you for a whole year.
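
A back-of-envelope version of that claim (Python; 25 BTC is the 2015 block subsidy):

[code]
blocks_per_year = 6 * 24 * 365       # one block per ~10 minutes
subsidy_btc = 25                     # block reward as of 2015
forfeited = blocks_per_year * subsidy_btc
print(blocks_per_year)               # 52,560 consecutive blocks of lying
print(forfeited)                     # 1,314,000 BTC of honest rewards at risk
[/code]

Sustaining the lie means mining roughly 52,560 consecutive false blocks on a chain that honest auditors will reject, putting on the order of 1.3 million BTC of rewards at risk.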
 
  • Like
Reactions: cypherdoc

Justus Ranvier

Active Member
Aug 28, 2015
875
3,746
Committed UTXO sets would need to be constructed in a way that makes them easily falsifiable by fraud proofs.

We can do a lot better than relying on a majority of hashing power not to lie.

It should be the case that even if the majority of hashing power lies for an entire year, as long as one miner, somewhere in the world, is willing to tell the truth, they can demonstrate that a year's worth of blocks are all invalid with a compact and easily verifiable proof.
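
As a sketch of what "compact and easily verifiable" could mean (Python; the structure and names are my illustration, not a concrete proposal): it suffices to ship one offending transaction plus the Merkle branch linking it to the block's committed root, so the proof is O(log n) in the block size rather than the whole block:

[code]
import hashlib
from dataclasses import dataclass

def sha256d(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

@dataclass
class FraudProof:
    """One offending transaction plus its Merkle branch: O(log n) data."""
    bad_tx: bytes            # serialized transaction that breaks a rule
    branch: list[bytes]      # sibling hashes from leaf to root
    right: list[bool]        # True where the sibling sits on the right

def links_to_root(proof: FraudProof, merkle_root: bytes) -> bool:
    """Step 1 of verification: the bad transaction really is in the block.
    Step 2 (not shown) checks the rule violation itself, e.g. a spend of
    a nonexistent output against a committed UTXO set."""
    h = sha256d(proof.bad_tx)
    for sibling, sibling_on_right in zip(proof.branch, proof.right):
        h = sha256d(h + sibling) if sibling_on_right else sha256d(sibling + h)
    return h == merkle_root
[/code]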
 
  • Like
Reactions: awemany

Yoghurt114

New Member
Sep 17, 2015
21
8
If a system can be built such that both the chance and the consequences of an auditing failure can be quantified and shoved into asymptotic irrelevance, we're there. We already tolerate the possibility of hash collisions, never-ending block races, etc.
It's important to point out here that the current assumptions about the 'impossibility of things' - hash collisions, ECDSA breaking, the chance of reversing the block chain never being zero yet the chain being 'plenty' irreversible nonetheless - are highly objective assumptions. They can be justified from a purely scientific perspective.

The majority of miners being honest is perhaps the most subjective assumption we're making in the whole of the system; it is 'safe enough' only because many auditors exist that keep close check on their honesty, along with strong economic incentives that reward honesty. To stretch this assumption is not something one would do happily. In other words, removing the incentives, or turning the 'many auditors' into 'not many auditors', would diminish the security the system can provide.

Justus Ranvier said:
Committed UTXO sets would need to be constructed in a way that makes them easily falsifiable by fraud proofs.

We can do a lot better than relying on a majority of hashing power not to lie.
Yes. This is all the more reason not to increase the barrier to fully validating (too much); I would dread the day when someone asks me "but how do you know the system works correctly" and my answer would be "well, surely someone would have published a fraud proof by now..?".

It's a heck of a lot better than having no assurance at all, which is reason enough UTXO commitments would be highly desirable (in the next few years), but it's no reason to make full auditing infeasible.
 

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
Yes. This is all the more reason not to increase the barrier to fully validating (too much); I would dread the day when someone asks me "but how do you know the system works correctly" and my answer would be "well, surely someone would have published a fraud proof by now..?".
Given that today's CPUs can validate 4k txn/s, I think we already have a lot of room to grow with regard to necessary processing bandwidth - all arguments about network bandwidth aside.
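
To put a rough number on that headroom (my arithmetic, taking the 4k txn/s figure at face value; the average transaction size is an assumption):

[code]
cpu_tps = 4_000              # the quoted single-machine validation rate
block_interval_s = 600       # target seconds between blocks
avg_tx_bytes = 500           # assumed 2015-era average transaction size

tx_per_block = cpu_tps * block_interval_s        # 2,400,000 transactions
block_mb = tx_per_block * avg_tx_bytes / 1e6
print(tx_per_block, block_mb)                    # 2.4M txns ~ 1,200 MB blocks
[/code]

CPU alone would permit gigabyte-scale blocks, three orders of magnitude above the current 1 MB limit, which is why network bandwidth is the binding constraint.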

But, yes, I do like to keep Bitcoin core level-0 simple, too. Which means, for example, and as I said elsewhere, that I really prefer Bitcoin to not learn any opcodes (e.g. for LN) that might entangle the blockchain more than it is now and might need additional indexes or similar - and might hinder scaling up on level-0. Before someone gets it wrong, I am not at all opposed to LN per se.
 

Yoghurt114

New Member
Sep 17, 2015
21
8
awemany said:
Given that today's CPUs can validate 4ktxn/s, I think we already have a lot of room to grow with regards to necessary processing bandwidth
Oh absolutely, CPU-wise there's plenty of room beyond the current limit with current (or even yesterday's) technology. And with the upcoming libsecp256k1 power-up we can expect another 5x+ increase. Though, as you indicate, bandwidth is the primary bottleneck (to note: there's room there too).

awemany said:
But, yes, I do like to keep Bitcoin core level-0 simple, too. Which means, for example, and as I said elsewhere, that I really prefer Bitcoin to not learn any opcodes (e.g. for LN) that might entangle the blockchain more than it is now and might need additional indexes or similar - and might hinder scaling up on level-0. Before someone gets it wrong, I am not at all opposed to LN per se.
Luckily, those particular opcodes do not require a new index or anything else not readily available. (Though arguably they are more complex architecture-wise: they require knowledge of the block (relative lock time) or transaction (lock time) they are contained in, unlike all other opcodes, which can be run as a standalone script with no context.) They do not introduce any unnecessary or unwarranted complexity as far as I've reviewed them.

Things like sidechains, on the other hand, require nodes to either fully validate the sidechain (which simply would not be practical in the long run) or accept an SPV proof. (Whether that is a problem at all in this instance - seeing that sidechains still would never be able to inflate Bitcoin or do anything nasty at all - I am undecided on.) But if anything, those would be the controversial opcodes, not the LN ones.
 

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
I think some of the "heat" regarding the block size limit and multiple implementations comes down to the existence of two opposing visions of what Bitcoin is. Some think Bitcoin is ultimately governed by mathematics; others think it's the market.

My opinion is that Bitcoin is ultimately governed by the market; only over the short term is it governed by mathematics (the source code). Because this is my viewpoint, multiple implementations are a good thing to me: they remove friction and allow the market to more easily meet its demands (e.g., a larger block size limit). If someone has the opposing view, then I could understand why they might view multiple implementations as a danger (e.g., the rules could be changed by the market).

What's interesting is that the answer to whose vision is correct is, at least partly, testable. If the market-governance theory is correct, then it should not be possible to drive a fee market significantly above the free-market equilibrium. The market will tolerate a block size limit to the right of Q* (i.e., a limit serving as an anti-spam measure but not affecting the free-market dynamics).



However, the market will not tolerate a limit to the left of Q* (a limit that results in a deadweight loss of economic activity).



If the market wants to be at Q*, how can that same market force it to be at Qmax instead?

If the market theory is correct, then total miner fees should never significantly exceed the total coins lost to orphaning over a sustained period of time; observing otherwise would be one way to refute the market-governance hypothesis. Since Q* is now very close to bumping into Qmax, perhaps we'll have our answer within a year or so.
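
The supply-side logic behind Q* can be sketched in a few lines (Python; the propagation parameter z is purely illustrative, and the exponential orphan model follows the usual Poisson block-arrival assumption):

[code]
import math

R = 25.0     # block reward in BTC (2015 subsidy)
T = 600.0    # mean block interval, seconds
z = 10.0     # assumed propagation delay per MB, seconds

def orphan_prob(q_mb: float) -> float:
    """Chance a competing block is found while ours, of size q, propagates."""
    return 1.0 - math.exp(-z * q_mb / T)

def marginal_cost_per_mb(q_mb: float) -> float:
    """Extra expected orphan loss per additional MB: R * d/dq orphan_prob."""
    return R * (z / T) * math.exp(-z * q_mb / T)

# Evaluate the marginal orphan cost at a few block sizes:
for q in (0.5, 1.0, 4.0, 8.0):
    print(f"Q = {q} MB -> required fee >= {marginal_cost_per_mb(q):.4f} BTC/MB")
[/code]

A profit-maximizing miner keeps adding fee-paying bytes until the offered fee per MB falls below this marginal orphan cost; that crossing point is Q*, with or without a protocol-imposed limit, which is precisely the claim being tested.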
 
Last edited:

Yoghurt114

New Member
Sep 17, 2015
21
8
Peter R said:
Some think Bitcoin is ultimately governed by mathematics
I can't think of anyone who holds this position successfully, nor anyone - especially in the technical community - who defends it at all, in either 'camp'. It is a sentiment that, I suppose, perpetuates primarily on Reddit or at ELI5 'reel-em-in' cheerleader conferences and the like: "Bitcoin is regulated by maths". It sounds appealing, it's a great battle cry, but it's obvious nonsense.

The whole reason we have such a thing as decentralized Byzantine consensus in Bitcoin is that a healthy competitive market can provide it; the market is the loophole that makes this whole thing possible.

But that doesn't mean it is responsible for 'the market' to venture outside the realm of clear limitations set by maths and technology, although it is certainly possible - which, at some point, comes at the cost of the decentralized bit we're all so fond of.

As for your viewpoint on multiple implementations of the same consensus ruleset: again, I think most people would agree with you that that would be a good thing. This is why libbitcoin-consensus is a thing, has been for a while, and consensus code is (slowly, but steadily) moving out of Core and into it; it makes building alternate implementations safe and easy rather than dangerous and cumbersome.

I think if multiple implementations are something you'd like to see happen, it'd be generally better to contribute to the existing libbitcoin-consensus effort, rather than post arbitrarily composed animated gifs of what the result might look like.
 

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
"I can't think of anyone holding this position successfully, nor anyone in especially the technical community who defends it at all, in either 'camp'. This is a sentiment I suppose primarily perpetuates on Reddit or ELI5-'reel-em-in'-cheerleader-conferences and the like; "Bitcoin is regulated by maths". It sounds appealing, it's a great battle-cry, but it's obvious nonsense."

This image shows an example from five minutes ago of someone holding this position. In my opinion, anyone who believes that a block size limit can be used to drive fees far above the free-market equilibrium falls into the "governance by mathematics" category. The nice thing is that the hypothesis is testable: if the market governance theory is correct, then I believe that we shouldn't see sustained periods where aggregate fees are significantly greater than the aggregate losses due to orphaning.



"I think if multiple implementations are something you'd like to see happen, it'd be generally better to contribute to the existing libbitcoin-consensus effort, rather than post arbitrarily composed animated gifs of what the result might look like."

Clearly a lot of people agreed with you. Did you support the censoring of that animated GIF and my ban from /r/bitcoin?
 

Yoghurt114

New Member
Sep 17, 2015
21
8
Peter R said:
anyone who believes that a block size limit can be used to drive fees far above the free-market equilibrium falls into the "governance by mathematics" category
That's not a very nuanced categorisation protocol, if you ask me.

For what it's worth, as far as protecting decentralisation goes - which is what makes Bitcoin interesting to me personally - I do tend to favor a block size limit set such that decentralisation inherently remains protected. If a suitable block size limit can be set (as a de-facto dynamic limit) through a free-market equilibrium: great. If the free-market equilibrium as it currently exists would make for such a suitable limit: even better. But I don't think that's the case; the free-market equilibrium your proposal would result in makes for blocks that are too large, I fear. Demand is huge, and block space supply can be made arbitrarily cheap to come by, which would push the system into the region of infeasibility. Bummer. I would be glad to be proven wrong with concrete numbers.

Dynamically setting some block size limit based on a deterministic measure of decentralisation of the network would have my ideal preference. Unfortunately, I can think of no way to produce such a measure in a way it is meaningful or cannot be gamed; I doubt it is possible. As a result, a reasonable growth pattern following technical advancements holds my current preference.

Peter R said:
Clearly a lot of people agreed with you. Did you support the censoring of that animated GIF and my ban from /r/bitcoin?
I was unaware until now that you were banned as a result of that. However, from observation, it seems it was due to provoking vote brigading out of /r/bitcoinxt, not the content. Vote brigading is one among a very short list of rules Reddit cares deeply about, so I suppose one should not be surprised. As for whether the ban was justified: I am not a moderator, so I withhold judgement. Reddit is a centralized platform of discussion, after all.
 

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
"Vote brigading is one among a very short list of rules Reddit cares deeply about, so I suppose one should not be surprised."

It seems to me that if the moderators get to make up the reason, then, according to the reasons they provide, content would never be "censored." Instead submissions would be removed because they were "off-topic," about "alt-coins," or involved "vote brigading." Do you think the Soviet Communist regime admitted to censoring legitimate content? Of course not; they would always make up a reasonable-sounding excuse to justify their actions.

You seem like an open-minded and objective person. Why do you tolerate the censorship? You don't think it's ridiculous that--even now that my ban has expired--I'm still not allowed to post the images of those supply and demand curves that I just shared here?

For the record, I never actually engaged in vote brigading by any sensible definition of that term and I was banned by /r/bitcoin mods and not by Reddit admin. I did share links to my submission here, at bitcointalk and at /r/bitcoinxt to encourage participation in what I believe is an important topic. I never once suggested that people up-vote or down-vote certain comments. The censors at /r/bitcoin however claimed I was vote-brigading "by proxy."
 
Last edited:
  • Like
Reactions: cypherdoc

Yoghurt114

New Member
Sep 17, 2015
21
8
Peter R said:
Why do you tolerate the censorship [in /r/bitcoin]?
To be honest, I have paid no heed to the moderation notice, because I do not believe it is an effective measure. Then again, I have also not spoken out against it. Personally, I have participated in numerous discussions concerning XT both before and after the policy change, often (though not always) with XT proponents. I have not noticed any peers in such discussions be censored or otherwise moderated, and I found many of these discussions useful - they were often visible enough to get some attention, so presumably they were read by moderators. I don't perceive normal grown-up discussion to be affected by the policy change. What I did notice was a strong reduction in the number of XT-related and unrelated comments that lacked nuance, sense or utility. They were hostile, unproductive and unconstructive. Frankly, I am glad they are gone.

That said, Reddit is just a terrible, terrible platform for Bitcoin to be garnering sentiment of any kind. It is a self-censoring swine-pool of swarming locusts that do nothing but swiftly adapt to one imaginary groupthink status quo after another, each so far from reality that it is pointless to keep up. I try to limit my time there - often unsuccessfully - to the weekly Monday threads, which were very informative for everyone involved up until a few months or so ago.
 

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
There is a quote from Greg Maxwell taken from this email that has been cited a lot recently by the small block side of the debate. The quote suggests that fees will not add to network security without an artificial constraint on block size. However, the argument is wrong:
Greg Maxwell said:
For fees to achieve this purpose [added security], there seemingly must be an effective scarcity of capacity. The fact that verifying and transmitting transactions has a cost isn't enough, because all the funds go to pay that cost and none to the POW "artificial" cost; e.g., if verification costs 1 then the market price for fees should converge to 1, and POW cost will converge towards zero because they adapt to whatever is being applied.
It is easy to show that this is wrong with a simple diagram. The fee revenue goes to three things: miners' profit, orphaning cost, and increased security. The exact mix between profit, orphaning cost and increased security depends on the competitiveness of the market and the concavity of the two curves. The point is that fee revenue will always add to security and will never subtract from it.



I need to give credit to @Mengerian because he originally made a similar diagram months ago in the locked thread.
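
A toy decomposition of that point (Python; every figure here is invented purely for illustration): whatever fee revenue is not consumed by orphan risk or retained as profit must, in a competitive market, be bid into additional proof-of-work:

[code]
# Illustrative per-block decomposition; all figures invented.
fee_revenue  = 2.0    # BTC of fees in a block
orphan_cost  = 0.4    # expected loss to slower propagation
miner_profit = 0.3    # economic profit under imperfect competition

extra_pow = fee_revenue - orphan_cost - miner_profit
assert extra_pow > 0   # the remainder is competed away into hash power
print(f"{extra_pow:.1f} BTC/block of fee revenue ends up buying security")
[/code]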
 

Justus Ranvier

Active Member
Aug 28, 2015
875
3,746
There is a quote from Greg Maxwell taken from this email that has been cited a lot recently by the small block side of the debate. The quote suggests that fees will not add to network security without an artificial constraint on block size. However, the argument is wrong:
Another reason that he's wrong is that fees can add to network security as long as users are able to craft transactions which are only mineable at a specified block height or higher.

From what I've been told, some of the new opcodes they want to add for LN make it possible to create those types of transactions via the new locktime semantics.

As long as most clients use those opcodes to make sure their transactions cannot be mined in the past (relative to the largest block height the client knew about when it constructed the transaction), then any fees their transactions pay will be just as effective as the block reward at causing miners to lose revenue by choosing to double spend rather than extend the chain.
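
A simplified Python mirror of the relevant height/time finality rule (the helper name is mine; the real rule in Bitcoin also has an nSequence-based override, omitted here):

[code]
LOCKTIME_THRESHOLD = 500_000_000   # below this, nLockTime means block height

def is_minable_at(locktime: int, block_height: int, block_time: int) -> bool:
    """Simplified mirror of Bitcoin's IsFinalTx height/time check
    (ignores the nSequence-based finality override)."""
    if locktime == 0:
        return True                          # no lock at all
    limit = block_height if locktime < LOCKTIME_THRESHOLD else block_time
    return locktime < limit                  # strictly-less, as in Core

# A wallet that sets nLockTime to the current tip height H makes the
# transaction minable only at heights > H, so its fee cannot be collected
# in any rewritten block at or below H.
assert is_minable_at(400_000, 400_001, 0)
assert not is_minable_at(400_000, 400_000, 0)
[/code]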

Strange how Gregory Maxwell isn't mentioning that the changes they want to make to Bitcoin's scripting language to support the Lightning Network will solve the problem that supposedly makes LN necessary, without actually requiring LN at all.