Gold collapsing. Bitcoin UP.

albin

Active Member
Nov 8, 2015
931
4,008
@sickpig

Am I dreaming, or didn't he just admit that a specific piece of development was done not because they're the pre-eminent technical wizards of the universe doing everything right because science, but for very concrete political reasons??
 

Zangelbert Bingledack

Well-Known Member
Aug 29, 2015
1,485
5,585
Faced with BU and Classic closing in, the next narrative Core is trying to push is "divergence from Core" / "behindness in commits."

This is a softer version of the possible eventual tactic of hardcore code obfuscation/complexification where they optimize more directly for making the changes difficult, confusing, or hazardous to pull in. I imagine they can keep this up for quite a while, so to me it seems more expedient to show people the pattern so they can see this tactic is a lost cause in the long term.

If people can be convinced that Core's overwhelming technical superiority means dangerous centralization of decision-making in Bitcoin, these tactics will only harm them.
 

Richy_T

Well-Known Member
Dec 27, 2015
1,085
2,741
@Richy_T

You might be right to a certain extent, but threads like that advance the knowledge of many who just wish to understand. You're right that it can be a waste of time for some, but many responses make good examples for the wiki, and they could be collated to avoid future repetition.

A general debunking every few weeks clears the air and strengthens the resolve.
I may have to work out some kind of bot. :)
 

Richy_T

Well-Known Member
Dec 27, 2015
1,085
2,741
Can anyone enlighten me about what the problem is with pinning consensus on a block that's typically buried under months of work?
The perils of code as standard. Quite possibly a "Core" implementation should fully validate, but third-party clients should feel free to make trade-offs.

The Core software is trying to wear two hats and finding that neither fits quite right.
This is a softer version of the possible eventual tactic of hardcore code obfuscation/complexification where they optimize more directly for making the changes difficult, confusing, or hazardous to pull in. I imagine they can keep this up for quite a while, so to me it seems more expedient to show people the pattern so they can see this tactic is a lost cause in the long term.
Possibly. I've been pondering for a while how Bitcoin should be much more modular and I've been thinking that, alongside the current BU effort, it might be a good idea to have a parallel development of a client from "scratch" which adheres to existing Bitcoin and BU philosophy but with some significant refactoring and code pruning and a whole lot of documenting of what the hell is going on. Good or bad idea?

(When I say "scratch", I still expect a good bit of code reuse).
 

AdrianX

Well-Known Member
Aug 28, 2015
2,097
5,797
bitco.in
"behindness in commits."
Maybe a miner should ask him to add a user-defined block size limit to Core. He seemed quite willing to change Core for the miners.

This may be an attack on BU to force segwit adoption, but BU doesn't have segwit or its dependent commits, so it's a feature, not a bug.

The issue is not the number of commits but the user-defined block size. Miners have to acknowledge that and ask GMax when he plans to implement a user-adjustable block size. BU is ahead on the changes that count.
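For anyone who hasn't tried BU: the user-adjustable limit is just a pair of bitcoin.conf settings. Roughly like this (a sketch from memory, so check the release notes for the exact option names):

[code]
# bitcoin.conf (Bitcoin Unlimited) -- sketch; verify names against your release
# EB: the largest block size, in bytes, this node treats as non-excessive
excessiveblocksize=16000000
# AD: how many blocks must be built on top of an "excessive" block before
# this node gives up and follows that chain anyway
excessiveacceptdepth=4
[/code]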
 

albin

Active Member
Nov 8, 2015
931
4,008
One aspect of the "fee market" paradigm I think is interesting --

If you just look crudely at Blockchain.info's charts of total tx fees as a time series, denominated in both USD and BTC, you can very clearly see that tx fees are pretty damn sticky in their BTC denomination.

What we can probably infer from this is that a substantial portion of Bitcoin transactions are made by fee payers experiencing an income effect from holding BTC. It isn't the Bitcoin Uncensored fantasy of people hopping in and out to facilitate their illicit transactions, because those people would have preference curves that track the fiat-equivalent tx fees.

Consequently, it would follow that Core's resounding claims of fee-market success are significantly and unsustainably propped up by price appreciation.
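You can eyeball this on the charts, but it's also easy to quantify. A quick Python sketch against Blockchain.info's charts API (endpoint names as I remember them; adjust if they differ): if fees really are sticky in BTC terms, the BTC-denominated series should show much less relative variation than the USD one.

[code]
# Sketch: compare variability of total daily tx fees in BTC vs USD terms.
# Chart endpoint names are assumed from Blockchain.info's charts API.
import json
import statistics
import urllib.request

def fetch_series(chart):
    url = ("https://api.blockchain.info/charts/%s"
           "?timespan=1year&format=json" % chart)
    with urllib.request.urlopen(url) as resp:
        return [p["y"] for p in json.load(resp)["values"]]

fees_btc = fetch_series("transaction-fees")      # daily total fees, BTC
fees_usd = fetch_series("transaction-fees-usd")  # daily total fees, USD

# Coefficient of variation: stdev relative to the mean.
def cv(xs):
    return statistics.stdev(xs) / statistics.mean(xs)

print("CV of fees (BTC):", round(cv(fees_btc), 3))
print("CV of fees (USD):", round(cv(fees_usd), 3))
[/code]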
 

Roger_Murdock

Active Member
Dec 17, 2015
223
1,453
A few recent entries from the "Small-Blockers Say the Craziest Stuff" files:
  • Here's /u/jonny1000 claiming that: "The whole point of Bitcoin is automatic machine consensus, if you want people always in the driving seat then Bitcoin is nothing new." (I think my reply here made a pretty good point regarding the actual significance of "machine consensus.")
  • Here's /u/jratcliff63367 explaining why it's dangerous to "allow" people to modify block size limit related settings via BU and how we instead need to rely on "officially released code."
  • Here's /u/nullc with this head-scratcher: "The rules define hashpower. If you mine an invalid block you are not hashpower anymore. Thus, a hardfork can never have the majority hashpower, it has no hashpower."
  • And finally, here's /u/djpnewton with another contender for what the "whole point of Bitcoin is" -- evidently it's "to present a system in which the rules are fixed and objective."
It's just bizarre to me that you can have people who, I would assume, understand decentralization well enough not to dismiss Bitcoin entirely and instead to actually be excited about it -- and yet who simultaneously fail to understand decentralization so badly that they can talk with a straight face about not "allowing" people to modify open-source software, the need to rely on "officially released code," and Bitcoin as a system with supposedly "fixed and objective rules."
 

sickpig

Active Member
Aug 28, 2015
926
2,541
@Peter R, @Roger_Murdock, @AdrianX, all

Matt Corallo thinks that block space has zero cost of production, hence block space is not a commodity, hence the marginal cost of adding a tx to a block is zero.

He seems to think that producing a 1MB block and a 100MB block has the same cost.
As if the amount of information contained in a block weren't correlated with the block propagation time.

Any thoughts?

 

Impulse

New Member
Nov 20, 2015
19
71
@Peter R, @Roger_Murdock, @AdrianX, all

Matt Corallo thinks that block space has zero cost of production, hence block space is not a commodity, hence the marginal cost of adding a tx to a block is zero.

He seems to think that producing a 1MB block and a 100MB block has the same cost.
As if the amount of information contained in a block weren't correlated with the block propagation time.

Any thoughts?

This is one of the many economically ignorant things the Core camp says. They suggest that the marginal cost is literally zero, but that is obviously not possible; it makes about as much sense mathematically as saying the cost is infinite. If they said it was "near-zero," they might have an argument, but they would still have to prove why.

There is much academic work on the marginal cost of digital goods, and as far as I know, none of it suggests that the marginal cost of a digital good is zero. Only when we develop technology that is not limited by the laws of thermodynamics could the marginal cost be considered zero. In other words, never.
 

Impulse

New Member
Nov 20, 2015
19
71
Further to my previous post: if the marginal cost were so low that for all practical purposes we could consider it "zero", that would imply the cost is so infinitesimally small that even in the worst case (think LukeJr-grade bandwidth problems) it would be irrelevant. If that were the case, then the whole discussion about node resources and "centralization" would be moot anyway.
 

Impulse

New Member
Nov 20, 2015
19
71
@Justus Ranvier Their arguments have become so bizarre and full of holes that the situation is almost unbelievable. In this case they are literally saying that we can't make blocks larger than 1MB because doing so would stress nodes beyond their ability to operate, but at the same time the marginal cost of including a transaction is zero. In other words, there is no difference from a miner's perspective between producing blocks that are 1MB in size and producing blocks that are 1 petabyte or 1 exabyte in size. How does that make any sense?
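To put a toy number on it: pick any per-byte cost you like, however "practically zero", and scale it up. The per-byte figure below is completely made up; only the scaling matters.

[code]
# Toy reductio: a "practically zero" per-byte cost is not zero at scale.
COST_PER_BYTE = 1e-12  # dollars to relay/validate one byte (illustrative)

for label, size_bytes in [("1 MB", 1e6), ("1 GB", 1e9), ("1 PB", 1e15)]:
    print("%s block: ~$%g" % (label, COST_PER_BYTE * size_bytes))

# 1 MB block: ~$1e-06   (negligible)
# 1 GB block: ~$0.001
# 1 PB block: ~$1000    (not negligible -- so the cost was never "zero")
[/code]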
 

Justus Ranvier

Active Member
Aug 28, 2015
875
3,746
@Impulse It doesn't make sense, and it hasn't made sense since 2012.

Every single argument they make was refuted years ago, and they are fully aware of this, and they continue to make them.

Ergo, they are pathological liars.

You cannot successfully interact with pathological liars as if they were sincere people with a difference of opinion. The only solution that will work is ostracism.

People don't like enacting ostracism, so they try to pretend there are other available options.

There are no viable alternatives though.
 

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
@Peter R, @Roger_Murdock, @AdrianX, all

Matt Corallo thinks that block space has zero cost of production, hence block space is not a commodity, hence the marginal cost of adding a tx to a block is zero.

He seems to think that producing a 1MB block and a 100MB block has the same cost.
As if the amount of information contained in a block weren't correlated with the block propagation time.

Any thoughts?

I've been having this debate with them for years. It goes like so:

BS/Core: "Block space has zero cost of production and so it's not a commodity."

Me: "That's not true. There are several factors that add cost--orphaning risk for instance. Here's a paper that shows that the cost of production is non-zero and here's a talk that argues that block space behaves just like a normal commodity."

BS/Core: "Your fee-market paper is fundamentally flawed. You're assuming that larger blocks take longer to propagate and are thus more likely to be orphaned. That is not true."

Me: "It is an empirical fact that larger blocks were more likely to be orphaned. This has been true for the entire time period where data is available. Here is a chart to illustrate this fact."

BS/Core: "Larger blocks might take longer to propagate now, but with 'pre-consensus' the propagation time will no longer depend on block size."

Me: "So you agree that right now the cost of production of block space is non zero and behaves like a commodity?"

BS/Core: "You're an idiot. What matters is what might happen in the future."

Me: "OK, I analyzed one such pre-consensus technique based on weak-blocks that I called subchains. I was able to show that although the technique allows miners to produce larger blocks for a given level of orphan risk, there is still a cost of production based on orphan risk. See Section 5, for instance."

BS/Core: "Your subchains paper is fundamentally flawed. And you 'plagerized' it from Greg."

Peter R has been banned from #bitcoin-wizards...
...days later I return

BS/Core: "Clearly the miners can agree ahead of time to produce any sized block they like."

Me: "Not in any practical sense. I suppose they could agree ahead of time to produce this jumbo 1 TB block that they've all pre-validated, but why would they do that?"

BS/Core: "You're not thinking adversarially. They'll do it to bloat the chain and hurt Bitcoin. So do you now admit that you're wrong and miners can produce any sized block they like for zero cost?"

Me: "If the majority of the hash power is colluding to hurt Bitcoin, then yes, I would agree. But I would argue that under such assumptions (the majority of the hash power is dishonest) that Bitcoin is insecure anyways. So basically this is a twist on the classic 51% attack."

BS/Core: "Bitcoin is secure even if 51% of the miners are dishonest. This is why full nodes are so important and why we need to keep the block size small so full nodes are affordable to run."

....I could go on and on, but you get the idea.
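For anyone who wants to check the orphaning claim themselves, here is a minimal numerical sketch of the model from my fee-market paper, simplified: propagation time grows with block size, block arrivals are Poisson, so the expected orphan loss on a block that takes tau seconds to propagate is R(1 - e^(-tau/T)). All parameter values are illustrative, and the propagation rate C is an assumed stand-in for real network measurements.

[code]
# Minimal sketch of the orphan-risk cost model (parameters illustrative).
import math

T = 600.0   # mean block interval, seconds
R = 12.5    # block reward, BTC
C = 1e6     # effective block propagation rate, bytes/second (assumed)

def expected_orphan_loss(block_bytes):
    tau = block_bytes / C                  # propagation time, seconds
    p_orphan = 1 - math.exp(-tau / T)      # a competing block arrives in tau
    return R * p_orphan                    # expected loss, BTC

# Marginal cost of appending one 250-byte tx to a 1 MB block:
marginal = (expected_orphan_loss(1_000_000 + 250)
            - expected_orphan_loss(1_000_000))
print("%.3e BTC" % marginal)   # small, but strictly greater than zero
[/code]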
 

chriswilmer

Active Member
Sep 21, 2015
146
431
@Peter R :

'
BS/Core: "Larger blocks might take longer to propagate now, but with 'pre-consensus' the propagation time will no longer depend on block size."

Me: "So you agree that right now the cost of production of block space is non zero and behaves like a commodity?"
'

As I was reading this, I thought this was where you were going to invoke that physics/signals theorem. Isn't that the relevant argument about future propagation time: that, based on our current understanding of physics, propagation time always scales (at least) linearly with information content, independent of any compression scheme?
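For concreteness, the version of the argument I have in mind (sketching from memory):

$$C = B \log_2\!\left(1 + \frac{S}{N}\right) \ \text{bits/s}, \qquad t_{\text{prop}} \;\ge\; \frac{H}{C}$$

where B is the channel bandwidth, S/N the signal-to-noise ratio, C the resulting Shannon capacity, and H the information content of the block. Since no compression scheme can shrink a block below H bits, the time to propagate it is bounded below by H/C: linear in information content, regardless of encoding.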