Unlimited really should be Unlimited...

Roy Badami

Yes, I think that I alluded to this earlier. You'd need to be able to track proof of work in a chain you weren't validating (yet). I don't think the infrastructure is there to do that at the moment, is it?

Ideally, you'd also want to be able to resume the aborted validation where you left off, rather than having to revalidate the entire block.
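To illustrate what headers-only tracking could look like, here's a minimal Python sketch. The nBits decoding and the work formula match Bitcoin Core's SetCompact/GetBlockProof, and the field offset follows the standard 80-byte header layout; the function names and the input iterable are just illustrative:

```python
# Minimal sketch: track cumulative proof of work from raw 80-byte block
# headers alone, without validating block contents. The nBits decoding and
# work formula follow Bitcoin Core (SetCompact / GetBlockProof); everything
# else here is illustrative.
import struct

def compact_to_target(nbits):
    """Expand the compact 'nBits' difficulty encoding to the full 256-bit target."""
    exponent = nbits >> 24
    mantissa = nbits & 0x007FFFFF
    if exponent <= 3:
        return mantissa >> (8 * (3 - exponent))
    return mantissa << (8 * (exponent - 3))

def block_work(nbits):
    """Expected number of hashes to find a block at this target: 2**256 // (target + 1)."""
    return (1 << 256) // (compact_to_target(nbits) + 1)

def cumulative_work(raw_headers):
    """Sum the work of a header chain we are not (yet) validating."""
    total = 0
    for header in raw_headers:                           # each is 80 raw bytes
        nbits = struct.unpack_from("<I", header, 72)[0]  # nBits sits at offset 72
        total += block_work(nbits)
    return total
```

A node could keep that running total for a competing header chain and only commit to full validation (or resumed revalidation) once the competing chain's cumulative work actually exceeds its active tip's.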
 

solex

It is good to see this on the dev mailing list, and it has passed the first hurdle of remaining uncensored. There were a few considered replies.

http://lists.linuxfoundation.org/pipermail/bitcoin-discuss/2016-February/000049.html
I apologize for not giving credit to whoever brought this up at/after the
Montreal meeting as a key question to ask: I'm really curious to get a
general idea of what people think the size limits are for.

Do you think they should be purely a denial-of-service prevention measure?

Do you think they serve an economic purpose (e.g. to "create a transaction
fee market")?

Do you think they serve a security function (e.g. "prevent centralization
that might make the network vulnerable to attack")?

Something else?

What do you think is the best long-term option for the limits that you've
heard?

I'll go first:

I think they were put in place originally purely as a DoS-prevention
mechanism. Certainly MAX_BLOCK_SIGOPS serves (only) that purpose.

I don't think they *should* be used to influence the economics of the
system.

I don't think they increase the security of the system by any measurable
amount.

My preference for the limits is no hard-coded limits at all, just a DoS
prevention mechanism that rejects too-expensive-to-validate blocks, where
"too expensive" is an emergent property of the network that evolves over
time.


I'm genuinely curious to see what other people think, so let's not try to
convince each other we're being ignorant and stupid-- I'd just like to know
what people are thinking.

--
Gavin Andresen
Interestingly, Anthony Towns thinks the emergent consensus won't work properly:
I don't think having it be an "emergent property" is a stable system with economically rational actors -- miners would continually run the risk of losing money because the "network" would mysteriously decide the block they just mined was "too big" and refuse to build on top of it, so I'd expect them to get together in a backroom (or just decide via soft-fork) to set some fixed limits instead.
That said, if it were possible, I think it would be my preference too.
I think he is falling into the common trap of assuming that the average block size will always be near the prevailing maximum. That is the case now, with the low hard limit, but in a normal situation the maximum (whether algorithmic or emergent) would be several times larger than the average block size. Of course miners will occasionally test the limit with a large block, but it won't be mysterious if it is orphaned.

The existing situation of a choked network is forcing miners to discuss limits, in a cartel-like manner, far more than would otherwise be the case.
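To make "emergent" a bit more concrete, here is a toy Python sketch of the kind of local acceptance policy being discussed, loosely modelled on Bitcoin Unlimited's excessive-block / acceptance-depth idea. The constants and names are illustrative only, not anyone's actual implementation:

```python
# Toy sketch of an "emergent limit" acceptance policy, loosely modelled on
# Bitcoin Unlimited's excessive-block (EB) / acceptance-depth (AD) idea.
# EXCESSIVE_BLOCK_SIZE and ACCEPTANCE_DEPTH are illustrative local settings
# chosen independently by each node; they are not consensus rules.
from dataclasses import dataclass

EXCESSIVE_BLOCK_SIZE = 16_000_000   # bytes this node considers "too big"
ACCEPTANCE_DEPTH = 4                # confirmations on top before relenting

@dataclass
class Block:
    height: int   # position in the chain under consideration
    size: int     # serialized size in bytes

def follow_chain(tip, chain):
    """Follow the tip unless an insufficiently buried 'excessive' block sits below it.

    An oversized block is ignored at first, but once enough blocks are
    piled on top of it this node falls in line with the rest of the
    network rather than staying forked off on its own.
    """
    for block in chain:
        if block.size > EXCESSIVE_BLOCK_SIZE:
            if tip.height - block.height < ACCEPTANCE_DEPTH:
                return False   # big block not yet buried: hold back for now
    return True                # no excessive blocks, or all buried deeply enough
```

On a policy like this, a miner who tests the limit faces a bounded, visible orphan risk: either the block gets buried and the network converges on it, or it doesn't and the reason it was orphaned is plain to see.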

introducing eGADS...
http://lists.linuxfoundation.org/pipermail/bitcoin-discuss/2016-February/000057.html
 