Gold collapsing. Bitcoin UP.

lunar

Well-Known Member
Aug 28, 2015
1,001
4,290
So I can't quite get my head around this controversy over the orphaned/invalid block that's being spread.

Bitcoin.com just produced a large block, but also an Excessive Block (this may have been down to an unforeseen bug, but the result is the same). To all intents and purposes, this was a larger-than-1MB block, and BU nodes relayed it accordingly, meaning it was accepted as valid and relayed until the longer chain (without the excessive block) overtook it and a reorg occurred. But the block was not mined on top of, because it was larger than the EB1 setting used by all of the BU mining pools?
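The acceptance behaviour described above can be sketched roughly as follows. EB (excessive block size) and AD (acceptance depth) are real BU parameters, but this function and the values used are a simplified, hypothetical model, not BU's actual implementation:

```python
# Simplified sketch (assumption, not BU's real code) of how a BU node
# treats an "excessive" block: it is tracked, not rejected as invalid,
# but the node won't mine on it until AD blocks are built on top.

EB = 1_000_000   # excessive block size in bytes (EB1 = 1 MB)
AD = 4           # acceptance depth (BU's default, per BU's docs)

def is_excessive(block_size: int) -> bool:
    """A block above EB is 'excessive' -- not invalid."""
    return block_size > EB

def should_accept(block_size: int, depth_built_on_top: int) -> bool:
    """A normal block is accepted immediately; an excessive block is
    only accepted once AD blocks have been built on top of it."""
    if not is_excessive(block_size):
        return True
    return depth_built_on_top >= AD

# The bitcoin.com block: >1 MB, and the rest of the network never
# built AD blocks on top of it, so its branch was re-orged away.
assert is_excessive(1_000_050)
assert not should_accept(1_000_050, 0)   # not mined on immediately
assert should_accept(1_000_050, 4)       # accepted once buried AD deep
```

Under this model, "orphaned" vs. "invalid" really is a matter of the observer's EB setting, which is the point being made above.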


If I'm following this correctly, it seems that this accident is actually live validation that BU works as intended. So the difference between Orphaned and Invalid is purely a matter of perspective?

Should at the very least make for some good test data?
 
Last edited:

AdrianX

Well-Known Member
Aug 28, 2015
2,097
5,797
bitco.in
@lunar you can get it from the horse's mouth; just follow my discussion, he illustrates it.


And yes, if bitcoin.com produced a >1MB block that was orphaned, it is confirmation that BU works fine with regard to the "orphaned/invalid block FUD being spread". It is concerning if BU produced a >1MB block when it was not supposed to, but that's a separate issue; intentional or not, it is concerning.
 

adamstgbit

Well-Known Member
Mar 13, 2016
1,206
2,650
That's one way of looking at it...
another way is that this is a disaster!

It appears BU was quick to put out this fire though, good job. I think this incident won't hurt node count/hash rate too much, but it's not "good news", that's for sure.

Can't win them all, I guess...
 

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
I am very glad that @theZerg and @Peter Tschipper (and anyone else involved) handled this incident quickly and professionally, and I am also grateful for all the work. But we definitely need to learn from this!

I believe this incident shows that there's a need for a slower pace with more testing, especially while we're still so short on manpower. @Gavin Andresen's initial criticism comes to my mind.

We have everything in the code to leave 1MB behind.


I think the focus now should be to make that as smooth as possible.
 

freetrader

Moderator
Staff member
Dec 16, 2015
2,806
6,088
In the short term, it is important to the credibility of the project that we put out a hotfix release for this. Even if there is a configuration workaround, you can't rely on newcomers who download the 1.0.0 software to spot something related to this issue on a forum. And you never know when that next downloader is a miner who wants to try things out. Anyhow, I'm sure @theZerg is already planning such a spot release.

In general I heartily agree with @awemany's point about more testing, and will add what others have said before: a stricter code review process. Ensuring that the necessary test coverage is there is just one part of that.
 

satoshis_sockpuppet

Active Member
Feb 22, 2016
776
3,312
My 2 cents as an outsider who has no knowledge of the Bitcoin code (apart from it being kludgy):

First, let core have their few days of celebration. In the long run they might have
won another two weeks of being in charge, but after all, the bug showed that the
idea behind BU (as described in the Bitcoin whitepaper..) works as intended.

Without being involved, I agree very much with @awemany and @freetrader:
this bug should not have happened and could easily have been prevented by a more
careful approach. It looks like the pull request in question was already criticised
back then for changing too much at once, and it didn't get much review.

No idea if there was a BUIP for the change, but shouldn't there have been one for
changing constants like these? Everybody knows that a simple-looking change on the
surface can have a lot of consequences, even more so if it's on the side of block
creation.

I also remember Gavin's early criticism on BU's commit history.

But I guess @theZerg already gets a lot of heat these days, so I'd like to say
that I hope he doesn't take the criticism the wrong way. Without him there
probably wouldn't be BU, and all the work he has put into BU so far shouldn't be
dragged down by a bug which, in the end, after the smoke cleared, won't have
any long-term impact on Bitcoin or BU. (Except on code review and "quality control",
I hope.)


The very positive thing:

This bug is actually an opportunity to show the world that
the FUD from core about BU being dangerous is bullshit. A miner produced a block
too big for the network (it doesn't matter if intentionally or unintentionally),
and it was orphaned. That is exactly how Bitcoin works. There was no disaster,
no global breakdown of the Bitcoin network. A miner was punished for trying to
override the rest of the network. Something BU-haters don't like to hear.

The bug was ugly, it cost a miner quite some money, and it gives core a week
full of FUD propaganda, but it also showed that, despite the FUD, BU is not
dangerous to the network at all.

tl;dr: Please be much more careful and accept criticism about BU's coding process, but
also try to sell the positive point: BU's mechanisms work perfectly, and
a miner can't burn the world by creating a block > 1 MB.
So the difference between Orphaned and Invalid is purely a matter of perspective?

Exactly. The FUDers naturally switched to saying "invalid" so as not to endorse BU's idea. Another propaganda trick. Otherwise they would have to agree that a too-big block was orphaned, exactly as BU predicted.
 

adamstgbit

Well-Known Member
Mar 13, 2016
1,206
2,650
@chriswilmer
It seems normal; with the higher price (in the $900s), probably all miners/pools are expanding.
As much as we'd like to see segwit's hash rate wither and die, it's backed by profitable businesses, so...
 

Richy_T

Well-Known Member
Dec 27, 2015
1,085
2,741
It's numbers like these that make me confident that we can scale on chain without problems. Please tell me how I get the math wrong here, if I do: (Yes, it's my post. You might also read the OP comment to get the full context.)

I think if we saw increased adoption and a return to reasonable fees, we might start seeing Bitcoin be the impetus for mesh networking. Certainly, the current state of the internet is OK for looking at pictures of cats but it's not suitable for a world-spanning network-based currency. And the nice thing about Bitcoin is it makes it easy to pay for and receive a fee for access.
I think 1GB is a little bit much for now, but definitely doable in 17 years or so.

I think you got the numbers mostly right. Maybe the calculation is off by a factor of two (upstream & downstream).

We want UTXO commitments though, as few people are going to want to dig all the way through the chain at these rates. Throw the old stuff away.
We can almost sort of do that now. You could create a pruned node and just share it as a torrent. There would be trust issues, of course, but there are ways to be pretty sure you're in good shape.
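A quick back-of-the-envelope check of the 1 GB block bandwidth numbers discussed above, including the factor of two for relaying each block once in and once out. The single-relay assumption is mine; a well-connected node would relay to more peers:

```python
# Bandwidth needed to keep up with 1 GB blocks every 10 minutes,
# assuming each block crosses the link once down and once up.

block_bytes = 1_000_000_000        # 1 GB block
interval_s  = 600                  # 10-minute block target

one_way = block_bytes / interval_s   # bytes/second in one direction
both    = 2 * one_way                # upstream + downstream combined

print(one_way / 1e6)     # ~1.67 MB/s one way
print(both * 8 / 1e6)    # ~26.7 Mbit/s up + down combined
```

So sustained 1 GB blocks need on the order of a 30 Mbit/s symmetric link under these assumptions, before any relay fan-out.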
And @Peter R. assumed just 10 UTXOs per person, even more 'small-UTXO' than I am (I consider 50 to be a good and realistic number, though without too strong an opinion; I just think it will be >>1). I do really wonder about the other small blockers and their ideas, given that Xekyo is 'bigger-UTXO' than both me and @Peter R. I think we should ask around, on reddit and elsewhere.
If UTXOs are really a concern, let's find a way to really encourage consolidation. Perhaps multiple TXs -> 1TX could get a direct discount or even be free (up to a small fraction of the block). Miners could choose not to include these, of course.

The point being, if you want to discourage UTXO growth, do it directly, not with some hand-waving BS claims for a technology you're pushing for different reasons. If I catch you crapping in my yard, don't tell me my roses will come up lovely.
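To put rough numbers on the per-person UTXO counts discussed above: the calculation below is a sketch, and the 50 bytes per stored UTXO is my assumption (outpoint + script + amount); real database overhead varies:

```python
# Rough UTXO-set sizing under the per-person counts discussed above.
# bytes_per_utxo = 50 is an illustrative assumption, not a measurement.

def utxo_set_bytes(population, utxos_per_person, bytes_per_utxo=50):
    return population * utxos_per_person * bytes_per_utxo

people = 7_500_000_000   # assumed world population

print(utxo_set_bytes(people, 10) / 1e12)   # Peter R's 10/person: ~3.75 TB
print(utxo_set_bytes(people, 50) / 1e12)   # 50/person: ~18.75 TB
```

Even at the high end this is commodity-storage territory, which is why the argument above is that UTXO growth, if it matters, should be priced directly.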
 

Norway

Well-Known Member
Sep 29, 2015
2,424
6,410
@Richy_T
I certainly think bitcoin could be the impetus (learned the word now, had to look it up) for mesh networking. A mesh network needs economic incentives and ledgers of financial transactions to work, in my opinion.

What do you think about my post(s) that you quoted? Am I far out in la-la-land, or do you think the estimates and math about on-chain scaling are reasonable?

I'm so fed up with people telling me that a 50 MB block of data every 10 minutes is huge and insane, while my new PS4 downloads 8-12 GB updates for games I have already bought and installed without even asking me if it's OK!

EDIT: I see you added more to your post while I was typing my answer. Great! We are onto something here! Using awemany's estimates of output sizes, this memory could hold 1 UTXO for every person on earth without any disk cache and with plenty of free space, as just 70 GB is needed, for just 1000 USD.

https://www.komplett.no/product/872742/datautstyr/minnebrikker/kingston-ddr4-2133mhz-128gb-eccreg#

@Peter R Please chime in on this debate; some calculations and estimates should be put together. I like the idea of seeing how expensive a node serving 1 billion people would be. I'm not sure if lightning even deserves to be part of such a model yet :)
 

adamstgbit

Well-Known Member
Mar 13, 2016
1,206
2,650
50 MB block of data every 10 minutes is huge
It is!
A 50MB block would take 50X the time to validate,
make the blockchain grow 50X faster,
and greatly reduce the full node count.

I think it's great that we think FAR ahead and see if GB blocks can be viable.
But realize that all this talk about GB blocks being potentially safe given XYZ has given BU a bad rep.
A lot of people still believe BU means unlimited blocksize, and if we talk about 50MB / GB blocks we're just giving the wrong impression.

1.256MB blocks
That should be the immediate goal, and BU should put that as the default setting. What is it now, 16MB? (way too big)

Only idiots believe you can hard-fork the network from 1MB to 1GB; the internet can't even handle that kind of size.
Had we been talking about how it's safe to double the limit, and speculating about what would happen with a 10X increase, they would have to say things like "only idiots believe 10MB blocks are viable".
 
  • Like
Reactions: bluemoon and Norway

Norway

Well-Known Member
Sep 29, 2015
2,424
6,410
@adamstgbit
I assume pruning. There is no need to require that all nodes have a copy of the full transaction history.

Some actors will of course store this data (IRS, CIA, KGB), but that's no reason to make every node a museum.

And what is the consequence of that? Well, it actually means that a node only needs to keep track of "accounts", i.e. the unspent outputs (UTXOs). 1 output per person could technically be enough, but many will argue that more is needed for privacy. @awemany is asking people in forums "how many outputs are needed per person", which is a cool approach in my opinion.

If you think validation time is the bottleneck, great! But we need to put a number on it. We can't just say "a lot" is too much if we don't bother to do the math.
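In that spirit, here is the simplest possible number to argue about. Both the linear-scaling assumption and the 2-second baseline for a full 1 MB block are illustrative assumptions of mine, not measurements; replace them with benchmarked values:

```python
# Putting a number on validation time. Assumes validation cost scales
# linearly with block size, and a (hypothetical) baseline of 2 seconds
# to fully validate a 1 MB block on an ordinary machine.

baseline_mb   = 1.0    # MB
baseline_secs = 2.0    # assumed, not measured

def validation_secs(block_mb):
    return baseline_secs * (block_mb / baseline_mb)

t50 = validation_secs(50)
print(t50)          # 100 s under these assumptions
print(t50 / 600)    # fraction of the 10-minute block interval (~17%)
```

If a 50 MB block consumes ~17% of the block interval in validation under these assumptions, that is the kind of concrete figure the debate should be about, rather than "a lot".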