Gold collapsing. Bitcoin UP.

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
My response to Pieter Wuille on the Dev-List has once again been censored, perhaps because I spoke favourably of Bitcoin Unlimited and pointed out misunderstandings by Maxwell and Back. (All of my emails since Maxwell unsubscribed after our debate have been censored). Here it is for anyone interested:

Hi Pieter:
Pieter Wuille said:
That's exactly the point: a hard fork does not just affect miners, and cannot just get decided by miners. All full nodes must have accepted the new rules, or they will be forked off when the hashrate percentage triggers.
And with the recent launch of Bitcoin Unlimited, nodes are taking matters into their own hands. There are now 34 nodes that will accept blocks larger than 1 MB *today*. They are not bothering to wait for BIP101 to activate. Instead, they are saying loud and clear: “if anyone produces a large block, we’ll follow you!”

Bitcoin Unlimited has now shown empirically that all nodes do NOT need to have the same rules regarding the block size limit. This was previously not understood even by thought leaders such as Maxwell and Back (who argued that Bitcoin was a delicate consensus system and all nodes must have identical rules).

Right now, no miner dares produce a larger block because it would certainly be orphaned. But eventually, when enough nodes have increased the block size they will accept, a brave miner will produce a large block, it will be accepted into the blockchain, and we'll look back at this debate and laugh.
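
To make the mechanism concrete, here is a toy sketch of the idea (illustrative Python only, not Bitcoin Unlimited's actual implementation; the names are made up):

```python
# Toy sketch only -- not Bitcoin Unlimited's actual code.
# Each node operator picks their own limit; the node follows any block
# that fits under that locally configured value.

DEFAULT_LIMIT_BYTES = 1_000_000  # the limit Core currently hard-codes

class Node:
    def __init__(self, user_limit_bytes=DEFAULT_LIMIT_BYTES):
        # Chosen by the operator, not dictated by the protocol.
        self.user_limit_bytes = user_limit_bytes

    def accepts(self, block_size_bytes):
        return block_size_bytes <= self.user_limit_bytes

# A node with a raised limit follows a 2 MB block today;
# a node still at 1 MB simply ignores it (hence the orphan risk for miners).
big_block = 2_000_000
print(Node(user_limit_bytes=8_000_000).accepts(big_block))  # True
print(Node().accepts(big_block))                            # False
```
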
Pieter Wuille said:
Furthermore, 75% is pretty terrible as a switchover point, as it guarantees that old nodes will still see a 25% forked off chain temporarily.
Bitcoin Unlimited has given us experience with this too. With Bitcoin Unlimited you can intentionally fork yourself off the chain by setting your block size limit very low. What happens is simply that you no longer see new blocks, and there are warnings that you may not be tracking consensus.

In other words, you don’t lose any coins and it’s obvious that something is wrong.
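
To make that failure mode concrete, a toy sketch (not real node code; the 90-minute threshold is an arbitrary number chosen for illustration):

```python
# Toy sketch only -- not actual Bitcoin Unlimited or Core code.
# A node whose self-imposed limit is below what the network is producing
# simply stops accepting new blocks; it loses nothing, and the stall is obvious.
import time

STALL_WARNING_SECS = 90 * 60  # arbitrary: roughly nine expected block intervals

class ConsensusWatch:
    def __init__(self):
        self.last_accepted_time = time.time()

    def record_accepted_block(self):
        self.last_accepted_time = time.time()

    def warning(self):
        stalled_for = time.time() - self.last_accepted_time
        if stalled_for > STALL_WARNING_SECS:
            return ("Warning: no acceptable blocks for %.0f minutes; "
                    "this node may not be tracking consensus." % (stalled_for / 60))
        return None
```
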
Pieter Wuille said:
My opinion is that the role of Bitcoin Core maintainers is judging whether consensus for a hard fork exists, and is technically necessary and safe. We don't need a hashpower vote to decide whether a hardfork is accepted or not, we need to be sure that full nodes will accept it, and adopt it in time. A hashpower vote can still be used to be sure that miners _also_ agree.
I am very happy to see the Core team begin to talk about ideas like “Emergent Consensus” and express that the role of developers is *not* to dictate terms to the user base. However, for consensus to efficiently emerge in the first place, we need efficient signalling and communication across the community. We need to be able to talk openly about how we want the network to evolve, and that means we need to be able to promote things like Bitcoin Unlimited. It has already given us a lot of experience about just how safe it is for an individual node to go ahead and increase its block size limit: the node will still track consensus! This is now known both theoretically and empirically.

I would like to see more leaders, like several of you on this mailing list, speak out against the censorship and avoid places like /r/bitcoin where it is so prevalent. For consensus to emerge, we need reliable communication channels.

Best regards,
Peter
 

Zangelbert Bingledack

Well-Known Member
Aug 29, 2015
1,485
5,585
@Peter R

Right, we never actually know what the state of the ledger is for certain, just with more and more certainty as we go back through the chain.

So what happens if there is a divergence and then a reconvergence, like in /u/Taek42's China scenario, where they would supposedly mine a bunch of blocks among themselves behind the GFW and then come back and go, "Ta-daaaa! You all have to accept our better/longer/harder(?) chain now."?

What's the objective measurement of which chain is right?
 
Last edited:

solex

Moderator
Staff member
Aug 22, 2015
1,558
4,693
Peter, you are really carrying the torch. It is disgusting that your posts are censored on the dev list, and that is certainly the work of Btcdrak "Wormtongue the servant of Saruman".

Edit: Kanzure did the deed, but notified Btcdrak.
 
Last edited:

Zangelbert Bingledack

Well-Known Member
Aug 29, 2015
1,485
5,585
From Pieter Wuille:
My opinion is that the role of Bitcoin Core maintainers is judging whether consensus for a hard fork exists, and is technically necessary and safe. We don't need a hashpower vote to decide whether a hardfork is accepted or not, we need to be sure that full nodes will accept it, and adopt it in time. A hashpower vote can still be used to be sure that miners _also_ agree.
Why should "we" (the devs) be the ones judging whether consensus exists?? What special expertise or position on the ground do they have? Why not miners and nodes?
Likely the censorship was for talking about moderation, which is considered off-topic. I don't think the ML has the Theymosian policy of "no promotion of implementations that don't have overwhelming consensus," but if so then saying "we need to be able to promote things like Bitcoin Unlimited" would likely be construed as such - especially with drak at the helm.
 
  • Like
Reactions: majamalu

sickpig

Active Member
Aug 28, 2015
926
2,541
solex said:
Peter, you are really carrying the torch. It is disgusting that your posts are censored on the dev list, and that is certainly the work of Btcdrak "Wormtongue the servant of Saruman".
Sadly enough, if memory serves, Jeff Garzik has the final say on the moderation of posts on the btc-dev ML. Rusty Russell surely has a role too, since he was the one who posted the moderation rules to the list.

I've posted twice since moderation became active on the list; eventually the messages went through (both posts were about SegWit). @Peter R, if they decide to permanently discard your message, do you receive some sort of notification?
 
  • Like
Reactions: AdrianX

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
@sickpig:

Yes, you get an email that says the message was not approved and sometimes an explanation. Here is the explanation I received:
Moderator of Bitcoin-Dev said:
Your request to the bitcoin-dev mailing list

Posting of your message titled "Re: [bitcoin-dev] Block size: It's
economics & user preparation & moral hazard"

has been rejected by the list moderator. The moderator gave the
following reason for rejecting your request:

"Your message was deemed inappropriate by one of the moderators.

The argument that has been made to you before is that "tracking a
big-block consensus" is not compatible with a low-bandwidth network.
Your email does not contain an (even an attempt at a) refutation of
this.

The discussion has never been "whether a client can be modified to
accept larger blocks into its local blockstore" (not sure why you
brought this up).

Generally, the moderators would prefer to not backtrack in discussions
and would prefer forward progress rather than reiteration of your
already-known disagreements."

Any questions or comments should be directed to the list administrator
at:

bitcoin-dev-owner@lists.linuxfoundation.org
 

Zangelbert Bingledack

Well-Known Member
Aug 29, 2015
1,485
5,585
Sounds like kanzure's writing. I don't think Rusty would give such reasons.

I guess you could just re-submit while addressing the point to kanzure/drak's satisfaction?
 

solex

Moderator
Staff member
Aug 22, 2015
1,558
4,693
"tracking a big-block consensus" is not compatible with a low-bandwidth network.
This is so subjective that it is meaningless.

Who is to say that the user settings on a BU majority network might not converge on a limit as low as 1.5MB, while the physical network itself may have plenty of bandwidth to be fine with 3MB?

If the users (full node owners) collectively set their limit too high and large volumes cause instability then users will collectively reduce their limits. Clearly the implication is that the core devs do not trust the users.
 
Last edited:

Justus Ranvier

Active Member
Aug 28, 2015
875
3,746
Why should "we" (the devs) be the ones judging whether consensus exists?? What special expertise or position on the ground do they have? Why not miners and nodes?
If they wanted to prove that no consensus for a block size limit increase (or removal) exists, the way to prove it would be to attempt to falsify the hypothesis. They'd be the ones informing Bitcoin users about alternatives, refraining from scaremongering, and generally giving Bitcoin users every opportunity to switch without fear of reprisal, DDoS attack, or dirty tricks.

If they were sure they'd properly identified the consensus as being on their side, and if their motives were as pure as they imply, this would be the best way to prove it.
 

solex

Moderator
Staff member
Aug 22, 2015
1,558
4,693
@Justus Ranvier
Exactly. We, like Core Dev, are assuming that BU will give a block limit increase. BU is not directly a big-blocks solution; it is a user-defined solution to the block hard-limit problem. For all we know, BU might converge on 500KB as the block limit and Luke-Jr will be doing handstands while we big-blockers hang our heads in resigned defeat.

PS. I severely doubt that, but at least we would be taking that risk by giving the final say to the whole network!
 

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
Looks like it was indeed Kanzure who axed my email to the dev-list:


He is the same one who recently accused me of "wasting developers' time" by writing and posting my subchains paper (even though I never submitted it to bitcoin-dev). I still can't determine what the "invalidity" is that he keeps talking about. It seems to be my assumption that big blocks take longer to propagate than small blocks. Yet they themselves frequently claim that bigger blocks propagate more slowly (in fact, that is one of their reasons for wanting to keep the block size limit small).
 

Roger_Murdock

Active Member
Dec 17, 2015
223
1,453
A miner wants to mine on a block that will be included in the longest chain--but he can't know with certainty what that chain will be. If the longest chain the miner is aware of includes--at its tip--a stupidly-big block that would hinder the chances of that chain persisting as the longest, then why should the miner bother with it?
Yeah, that's how I was thinking about it. The longest valid chain is just a proxy for what miners really care about, which is the most-likely-to-win chain (i.e., the chain that is most likely to stay the longest). That's what they want to extend. When block sizes and propagation delays are small, it's a very good proxy. But in a world with no hard-coded cap and bigger, slower-to-propagate blocks, presumably it becomes less reliable. So in that world, can't we assume that miners would attempt to come up with some kind of dynamic algorithm for figuring out which chain is most likely to win, based on both its length and the propagation delay implied by the size of the blocks it contains? In other words, it's not just the length of your chain that's important; "girth" matters too. (Sorry, couldn't resist.)

Another thought I had was that you can usually think of a chain that's 1-block longer than another chain as being "ten minutes ahead." But again, that's only true when you can ignore propagation delays, right? So let's say the network agrees on the state of the blockchain up to block 100. And let's say you have a two-block extension of that chain (101A, 102A) competing against a one-block extension (101B). If blocks 101A and 102A are really big such that they'll each take 7 minutes to propagate, and 101B is really small such that its propagation time is negligible, might the shorter 'B' chain not actually be, in some sense, "four minutes ahead" of the longer 'A' chain?
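
Putting rough numbers on that intuition (back-of-the-envelope only; the one assumption is valuing each block of length at ten minutes of expected work):

```python
# Back-of-the-envelope check of the example above: score a chain's lead as
# blocks-ahead times ten minutes, minus the extra propagation minutes its
# tip blocks still cost relative to the competing chain.
BLOCK_INTERVAL_MIN = 10

def effective_lead_min(blocks_ahead, extra_propagation_min):
    return blocks_ahead * BLOCK_INTERVAL_MIN - extra_propagation_min

# Chain A: blocks 101A and 102A, ~7 minutes of propagation each.
# Chain B: block 101B, negligible propagation.
print(effective_lead_min(blocks_ahead=1, extra_propagation_min=2 * 7 - 0))
# -4  -> the "shorter" B chain is effectively about four minutes ahead.
```
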
 

Inca

Moderator
Staff member
Aug 28, 2015
517
1,679
Well, 35 nodes are now rejecting Core without any miner consensus.

I think we should try to get momentum towards converting all XT nodes to BU.

The reason is that BU is a 'clean upgrade' without the controversies that XT entails - just the max_blocksize increase. Plus, it now seems unlikely that miners will switch to XT. Additionally, it does not integrate opt-in RBF.

Getting dispirited with bitcoin being poisoned like this :/
 
  • Like
Reactions: majamalu

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,995
I'm curious how they feel about whether repeatedly re-submitting duplicate transactions with progressively higher fees is "compatible with a low-bandwidth network".
Am I right in assuming that if they submit an RBF tx, it would not do anything to the original tx in the mempool and would thus double the data requirements?
 

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
Roger_Murdock said:
Yeah, that's how I was thinking about it. The longest valid chain is just a proxy for what miners really care about, which is the most-likely-to-win chain (i.e., the chain that is most likely to stay the longest). That's what they want to extend. When block sizes and propagation delays are small, it's a very good proxy. But in a world with no hard-coded cap and bigger, slower-to-propagate blocks, presumably it becomes less reliable. So in that world, can't we assume that miners would attempt to come up with some kind of dynamic algorithm for figuring out which chain is most likely to win, based on both its length and the propagation delay implied by the size of the blocks it contains? In other words, it's not just the length of your chain that's important; "girth" matters too. (Sorry, couldn't resist.)

Another thought I had was that you can usually think of a chain that's 1-block longer than another chain as being "ten minutes ahead." But again, that's only true when you can ignore propagation delays, right? So let's say the network agrees on the state of the blockchain up to block 100. And let's say you have a two-block extension of that chain (101A, 102A) competing against a one-block extension (101B). If blocks 101A and 102A are really big such that they'll each take 7 minutes to propagate, and 101B is really small such that its propagation time is negligible, might the shorter 'B' chain not actually be, in some sense, "four minutes ahead" of the longer 'A' chain?
Brilliant!

So a miner always mines on the most extended chain. However, since he only has his local view of reality to work with, he must discount chains that contain large blocks at their tip ("girthy chains") by an amount of time given by:

t = - z Q

where z is the propagation impedance in sec/MB and Q is the block size.

Then, above a certain block size Qc, t becomes less than -10 min (the discount exceeds a full block interval), so it is actually better to build off the block's parent, since that chain is then the more "extended" one. It turns out that Qc is exactly the Stone capacity we discussed earlier today:

Qc = T / z

where T is the 10-minute block target. Both Andrew Stone and Nicolas Houy have already argued, based on game theory, that miners should ignore blocks larger than this.
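
For concreteness, with made-up numbers (the value of z below is purely illustrative):

```python
# Numerical illustration of the discount above -- not mining code, just the
# arithmetic, with an invented propagation impedance z.
T = 600.0   # block target in seconds (10 minutes)
z = 420.0   # illustrative propagation impedance, sec/MB (7 minutes per MB)

def tip_discount(Q, z=z):
    """t = -z*Q: time discount for a chain with a Q-MB block at its tip."""
    return -z * Q

def stone_capacity(T=T, z=z):
    """Qc = T/z: block size at which the discount equals one block interval."""
    return T / z

print(tip_discount(2.0))   # -840 s: worse than simply being one block behind
print(stone_capacity())    # ~1.43 MB with these made-up numbers
```
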

@Roger_Murdock: Thank you for giving us a way to visualize what is happening here!
 
Last edited: