Gold collapsing. Bitcoin UP.

BldSwtTrs

Active Member
Sep 10, 2015
196
583
Shit small blockers say:

*They think "Nakamoto consensus" is something other than the consensus algorithm described by Satoshi Nakamoto in the whitepaper published on October 31, 2008.

**They think "economic majority" is somehow a meaningful concept. It doesn't bother them in the least that the economic majority is using fiat currencies.

They basically don't have a fucking clue what Bitcoin is.
------------------------------------------------------------------------

Bitcoin is the ledger. Nakamoto consensus is the consensus algorithm that allows the ledger to converge.

The ledger has value because of a social phenomenon which has nothing to do with the "economic majority" but which is more accurately described by the concept of "network effect".

When one thinks in terms of economic majority it favors the status quo conclusion. High-ranking people (i.e. the economic majority) have no interest in challenging the current state of things since they are already at the top of the ladder.

When one thinks in terms of network effect it favors the growth pathway. Growth threatens established positions.

If POW coins are the ones at the higher positions on coinmarketcap, it's because POW is a consensus algorithm which favors the network effect, while POS is a consensus algorithm which hinders it. The debate focusing on the technical merits of both misses the most important point.
 
Last edited:

Richy_T

Well-Known Member
Dec 27, 2015
1,085
2,741
IMO, pushing AD was a terrible mistake by BU. There is some risk of the drawbacks you mention, but on the other side, AD is a pretty lame setting, and continuing to promote it does continuous damage to the BU brand. It'd be one thing if users were clamoring for AD, but as far as I can tell, AD is just some random idea by one BU dev which became popular because people associate it with the idea of user choice.
Just to be clear, I have no problem with the idea of removing AD, I just have issues with trying to do it *right now*.

Edit: However, were a small team to fork BU with such changes in place in order to attract those who don't like AD, that would probably be a good option.
 
Last edited:

lunar

Well-Known Member
Aug 28, 2015
1,001
4,290
I saw this on the BU slack channel from @freetrader and thought it should be somewhere more permanent, lest it be overlooked. It seemed to ring true, but it would be nice to see what others think.


"if consensus change soft forks are backward compatible (except for miners),
and if everyone would be a miner again one day,
then would consensus change soft forks become impossible?
Are they fundamentally an aberration introduced by way of non-mining nodes?"
 

AdrianX

Well-Known Member
Aug 28, 2015
2,097
5,797
bitco.in
From some figures I've seen, relay nodes make up less than 0.1% of users, and if LukeJr's projections are correct, non-relaying nodes make up less than 4%.

"If consensus change soft forks are backward compatible (except for miners)" miners who don't upgrade could be forked off - so it's a hard fork from their perspective.

So why not just look at end users private keys and wallets as the important metric to avoid forking?

When defining upgrades as safe or not, the question should be: are you forking off any users when the nodes upgrade?
 

albin

Active Member
Nov 8, 2015
931
4,008
I don't mean this to denigrate people with CS backgrounds, but I had a thought and I'm wondering if there might be something to this.

Consider how very blatantly Luke-jr and his ilk like to argue by just categorically making assertions like "well, A is actually B", where there might be some interesting similarities between concept A and concept B, but there are also arguably differences, and the real debate is whether the similarities or the differences are what's important.

Is it possible that CompSci itself biases people toward these kinds of outcomes? Specifically, the concept of "reductions" in complexity theory. From what I understand, major wins quite rightly come from demonstrating that one problem is really a special case of a more general problem that you know how to solve.

Obviously that makes a tremendous amount of sense in the subject matter domain of applying algorithms to problems, but if you generalize this kind of thinking to other avenues in intellectual life that aren't necessarily as concrete, isn't that exactly the cautionary tale of Maslow's hammer? ("I suppose it is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail.")
 

BldSwtTrs

Active Member
Sep 10, 2015
196
583
From some figures I've seen, relay nodes make up less than 0.1% of users, and if LukeJr's projections are correct, non-relaying nodes make up less than 4%.

"If consensus change soft forks are backward compatible (except for miners)" miners who don't upgrade could be forked off - so it's a hard fork from their perspective.

So why not just look at end users private keys and wallets as the important metric to avoid forking?

When defining upgrades as safe or not, the question should be: are you forking off any users when the nodes upgrade?
I agree the only relevant metric is the users. The whole argument regarding the "burden of upgrading" is nonsensical.

What should we think of a company which refuses to change any of its processes to please its customers because "making a change would entail work and hassle for our employees"?
 
I don't mean this to denigrate people with CS backgrounds, but I had a thought and I'm wondering if there might be something to this.

Consider how very blatantly Luke-jr and his ilk like to argue by just categorically making assertions like "well, A is actually B", where there might be some interesting similarities between concept A and concept B, but there are also arguably differences, and the real debate is whether the similarities or the differences are what's important.

Is it possible that CompSci itself biases people toward these kinds of outcomes? Specifically, the concept of "reductions" in complexity theory. From what I understand, major wins quite rightly come from demonstrating that one problem is really a special case of a more general problem that you know how to solve.

Obviously that makes a tremendous amount of sense in the subject matter domain of applying algorithms to problems, but if you generalize this kind of thinking to other avenues in intellectual life that aren't necessarily as concrete, isn't that exactly the cautionary tale of Maslow's hammer? ("I suppose it is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail.")
The mathematicians I know say things like: "I don't care about numbers. I never use numbers, just formulas."

Now look at the formula for how Bitcoin scales, something like traffic = transactions * nodes and blockchain = transactions * time. Now if you take offchain scaling, like Lightning, you have other formulas: traffic = transactions * a few nodes, and blockchain = a few transactions * time. From the view of a mathematician, it makes sense to say: why do a hard fork to scale the old formula, when we can work without a risky hard fork to enable the better formula for scaling?

I know this perspective misses a lot: economy, marketing, user experience and so on. But mathematically, it might be a good perspective.
 

freetrader

Moderator
Staff member
Dec 16, 2015
2,806
6,088
Is it possible that CompSci itself biases people toward these kinds of outcomes?
I'll offer a semi-qualified 'no' on that, from the perspective of a CompSci graduate.

Luke-jr is certainly a talented software developer. But I find no signs of an academic record indicating that he studied CompSci to a graduate level. If you know of evidence to the contrary, please correct me.
There is of course nothing wrong with a non-academic career path - a lot of highly talented software developers choose it, and do well.

However, please bear this in mind - it is helpful to check someone's academic qualifications before believing that what they say reflects in any way on established science.

Now to the science part.

CompSci is a hard science. I would call it a branch of mathematics that concerns itself with computation in all its forms. It is a totally different subject, a world removed from and not at all concerned with the craft of software development.

If science does anything, it opens minds, and makes one aware that there is a vast domain of 'the unknown' - conjectures not yet proven or disproven, hypotheses not yet tested, and minutiae that matter. For CompSci in particular, this becomes very apparent very soon in the course of study.

Again from personal experience: if anything has the opposite effect, encouraging indiscriminate generalization and the erroneous equating of distinct things, it is the religious frame of mind. I don't wish to discount the value of religious experiences or beliefs, but I think they must be carefully checked if one wishes to remain on a scientific path in one's quest for truth.
 
Last edited:

Erdogan

Active Member
Aug 30, 2015
476
856
> to make absolutely sure the block is part of the main chain before it can be spent.

The reason is the possibility of a reorg. To a normal user, a reorg is not that big a problem; his transaction is eventually confirmed. But if coinbase coins are spent, a reorg makes a real mess of the recent history, as those spends are not valid in the new chain.
 

SanchoPanza

Member
Mar 29, 2017
47
65
After a block is created, the maturation time of 120 blocks is to make absolutely sure the block is part of the main chain before it can be spent. Your node isn't doing anything with the block during that time, just waiting for other blocks to be added after yours. You don't have to be online during that time.

http://satoshi.nakamotoinstitute.org/posts/bitcointalk/8/#selection-37.0-37.313

Maybe the ideal AD setting is 120?
Current maturity is 100 blocks:
Code:
consensus/consensus.h:static const int COINBASE_MATURITY = 100;
The old 0.1.0 client had some odd code that might have given rise to the '120' value on the page you quoted.
Code:
// Number of blocks remaining before this coinbase can be spent.
// Consensus maturity is 100, but the old wallet added a 20-block safety
// margin, so spending was only allowed after 120 confirmations - likely
// the source of the '120' in the quoted post.
int CMerkleTx::GetBlocksToMaturity() const
{
    if (!IsCoinBase())
        return 0;  // ordinary transactions have no maturity requirement
    // Counts down to 0 as the coinbase gains confirmations.
    return max(0, (COINBASE_MATURITY+20) - GetDepthInMainChain());
}
 

albin

Active Member
Nov 8, 2015
931
4,008
If science does anything, it opens minds, and makes one aware that there is a vast domain of 'the unknown' - conjectures not yet proven or disproven, hypotheses not yet tested, and minutiae that matter. For CompSci in particular, this becomes very apparent very soon in the course of study.
That makes sense, it sounds like the genuine academic experience is actually very humbling.
 

steffen

Active Member
Nov 22, 2015
118
163
Does anyone know why F2Pool mined 22 BU blocks from block 459811 to block 459942 but then stopped again? It seems worth following up on, @theZerg.
 

SanchoPanza

Member
Mar 29, 2017
47
65
My suspicion of censorship on the part of bitcoin-dev w.r.t. my BIP proposal was mistaken.
My apologies to the mods there. I look forward to constructive discussion:

https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2017-April/013969.html

P.S. It turned out that my initial attempt to register must have gone wrong. After another, confirmed successful attempt at registration, my submission was accepted. I describe some improvements which could be made to prevent this from happening to other newcomers.

Thanks to the person (not a list moderator or admin) who pointed out to me what kind of messages I ought to have received. This person solved the problem.
 

Norway

Well-Known Member
Sep 29, 2015
2,424
6,410
I have a question for you guys...

Can a miner/pool run a pruned node? Why/why not?
 

Norway

Well-Known Member
Sep 29, 2015
2,424
6,410
Thanks, @Richy_T
So in theory, all nodes on the planet could be pruned, and bitcoin as money would work just fine.

That's interesting, because it means that the metadata in a transaction, like a hash serving as a timestamp of something, would not be carved in stone for all eternity.

Not a problem for me, as I think job number one for bitcoin is to be money. But in theory, it would be bad for people who want to use the blockchain to store other info.

(Not a very likely scenario though; someone will always be able to store the full chain.)
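For reference, a pruned node is a one-line configuration change in Bitcoin Core; the 550 value is the documented minimum, everything else here is just a sketch of a typical setup.

```shell
# bitcoin.conf - minimal pruned-node setup (sketch)
# Keep at least 550 MiB of recent block files, discard older ones.
# The node still validates the full chain on initial sync; it only
# deletes old raw block data afterwards.
prune=550
```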