Gold collapsing. Bitcoin UP.

Aquent

Active Member
Aug 19, 2015
252
667
How much money did Circle raise? $76 million, and that's just one company. And Coinbase raised what, $100 million or so. Not sure how much BitGo raised, or Blockchain, or Xapo, just to name a few, but 21 is the richest bitcoin-related company. So we're looking at what, half a billion, maybe a billion?

Relax. Let's change the atmosphere/focus. Let's bring some positivity back. As I've said, what is self-evident and obvious is, well, self-evident and obvious. No point beating a dead horse.

We live in a historic time and I feel the wind is changing. Looking at the wider picture, it feels like we are poised for a historic shift which, if it happens (and I am terrible at predictions), will make our little debate look as insignificant as P2 or whatever.
 

VeritasSapere

Active Member
Nov 16, 2015
511
1,266
@Aquent Over a billion dollars of investment in Bitcoin companies in 2015 alone.

@Norway The ancient Scythians used to say that important decisions and ideas should always be considered both drunk and sober before they are carried out. So if you have a great idea while drunk, reconsider it when you are sober; conversely, if the idea comes to you while you are sober, make sure you reconsider it while drunk. ;)
 

Dusty

Active Member
Mar 14, 2016
362
1,172
I don't know what makes a fork soft. Is it like an auto update?
A soft fork is a way to enforce new, stricter rules in the system without requiring that all nodes upgrade their software; it needs only a majority of miners to be in agreement.
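
To make that concrete, here is a toy sketch (Python, purely illustrative, not real node code; the specific rules and numbers are made up) of why old nodes keep following the chain under a soft fork:

```python
# Toy illustration (not real node code): a soft fork only tightens the
# validity rules, so blocks built by upgraded miners still pass the old checks.

def old_rules(block):
    # the rules every already-deployed node enforces
    return block["size"] <= 1_000_000

def new_rules(block):
    # soft fork: everything the old rules require, PLUS a stricter check
    return old_rules(block) and block["version"] >= 4

upgraded_block = {"size": 900_000, "version": 4}

assert new_rules(upgraded_block)  # upgraded nodes accept it
assert old_rules(upgraded_block)  # old nodes accept it too, never noticing
                                  # that anything changed: that's the "soft"
```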

@Dusty
not at all b/c multisig seemed to be a good thing at the time.
But your knowledge of the system was waaay less then than it is now (as was mine, btw).

Also, P2SH was way, way more than only enabling multisig: it was a hack that made it possible to create an address for arbitrarily complex scripts, using all enabled opcodes.
2-of-3 multisig was only the simplest case, and the easiest to explain to non-technical people.
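
For the non-technical readers: the hack works because the address commits only to the *hash* of the script, so a script of any complexity fits in a normal-looking address. A rough sketch of the mechanics (Python, illustrative only; the 0x05 version byte is the real mainnet P2SH prefix, while the one-byte redeem script is just a made-up placeholder):

```python
import hashlib

def hash160(data: bytes) -> bytes:
    # RIPEMD160(SHA256(data)), the hash the address commits to
    # (needs an OpenSSL build with ripemd160 enabled)
    return hashlib.new("ripemd160", hashlib.sha256(data).digest()).digest()

def base58check(version: bytes, payload: bytes) -> str:
    # Base58Check, the encoding used by Bitcoin addresses
    alphabet = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"
    data = version + payload
    data += hashlib.sha256(hashlib.sha256(data).digest()).digest()[:4]
    n = int.from_bytes(data, "big")
    out = ""
    while n:
        n, r = divmod(n, 58)
        out = alphabet[r] + out
    pad = len(data) - len(data.lstrip(b"\x00"))  # leading zero bytes -> '1'
    return "1" * pad + out

# the redeem script can be *anything*: a 2-of-3 multisig, an escrow
# contract, etc.; this one-byte blob is just a placeholder
redeem_script = b"\x52"

# mainnet P2SH version byte is 0x05, which is why these addresses start
# with '3'; the address never reveals the script, only its hash
print(base58check(b"\x05", hash160(redeem_script)))
```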

plus, i know i didn't really understand the ANYONECANSPEND implications back then.
That, and also the fact that entering the path of soft forking, i.e., introducing features without forcing all the nodes of the network to upgrade their client, was just the first step in a certain direction, and more and more soft forks would follow (stricter malleability rules, OP_CLTV, OP_CSV, etc.).
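
And to spell out the ANYONECANSPEND point in toy form (again an illustration, not the real interpreter; run_script is a stub standing in for actual script execution): to a pre-BIP16 node, spending a P2SH output only requires presenting *any* script whose hash matches, because the inner script is never executed:

```python
import hashlib

def hash160(data: bytes) -> bytes:
    # RIPEMD160(SHA256(data)); needs an OpenSSL build with ripemd160
    return hashlib.new("ripemd160", hashlib.sha256(data).digest()).digest()

def run_script(redeem_script: bytes, signatures) -> bool:
    # stub standing in for real script execution (signature checks etc.)
    return bool(signatures)

def old_node_accepts(committed_hash: bytes, redeem_script: bytes) -> bool:
    # pre-BIP16 nodes only evaluate OP_HASH160 <hash> OP_EQUAL, i.e.
    # "does the presented script hash to the committed value?"
    # anyone who knows the script can satisfy this: hence ANYONECANSPEND
    return hash160(redeem_script) == committed_hash

def new_node_accepts(committed_hash, redeem_script, signatures) -> bool:
    # BIP16 nodes additionally execute the redeem script itself
    return (hash160(redeem_script) == committed_hash
            and run_script(redeem_script, signatures))

script = b"\x52"  # placeholder redeem script
h = hash160(script)
print(old_node_accepts(h, script))      # True: no signatures needed at all
print(new_node_accepts(h, script, []))  # False: the inner script must pass
```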

I remember that at the time there were all sorts of fights between various camps: the ones who would have preferred hard forks, the ones who preferred OP_EVAL or CHECKHASHVERIFY (BIP17) or P2SH (BIP16), etc.

The only difference was that the bitcoin space was 1/100 or maybe 1/1000 of what it is now, so the debate did not reach the public; it was discussed mainly among experts.

this is a totally different situation.
Is it? (please note I'm talking about segwit now, not block size)

In common with the old debate, we have a group of people who want to soft fork the network to enable a whole new scripting capability, while other groups would like a cleaner approach: a hard fork, fewer changes, etc.

I find that the most important difference between now and then is not technical but political: while in the old days there were a bunch of people speaking for themselves, now there are big firms coordinating people and ideas. Also, where before the discussion was mostly between very technical people, now a lot of the general public is giving opinions (on one side or the other) without really knowing how the specific things work.

Soft forks should never be used for changing consensus rules, regardless of the proposed change.
Soft forks are used only to change consensus rules. If no consensus rules have to be changed, no hard or soft fork is necessary. There is no use for a soft fork without changing consensus rules.

I think there's a very fundamental difference between softfork segwit and softfork P2SH.

P2SH was not intended as an immediate fix to an emergency need the community had right then and there, and the community had essentially years to understand what it was all about and to design UX and workflow around it. It truly was opt-in because there was essentially no systemic risk in introducing it.
Actually, if you read the old threads, the main criticism was "why all this rush? We need more time to discuss all the implications!"; only at a certain point Gavin got fed up with all the time wasted talking and decided to act. Since he was the undisputed leader at the time, he committed his P2SH patch and released the new software.

sickpig said:
Back then I was naive enough not to be able to grasp the perniciousness of this kind of deployment mechanism.
Yes. Still, in retrospect, we can say the P2SH experience turned out well: it did not break the network, and it allowed us to work out complex scripts, with multiuser or server-side signatures, automated escrow and a lot of other nice things, without needing any other modifications to the consensus.

So, I ask myself (I'm really thinking out loud and trying to play the devil's advocate): should we try to learn from the past and be more open to this kind of innovation?
 

albin

Active Member
Nov 8, 2015
931
4,008
@Dusty

It's still different despite the similarity in rhetoric, because any "why all this rush?" person wrt P2SH could simply not use it and continue to advocate OP_EVAL, and functionally it would be the same experience as P2SH not existing. Worst case scenario is P2SH becoming deprecated. This is true opt-in.

In this case because they're repurposing segwit as a capacity increase, the opportunity cost here is a safer capacity increase, so it really isn't opt-in in the same way.
 

Dusty

Active Member
Mar 14, 2016
362
1,172
It's still different despite the similarity in rhetoric, because any "why all this rush?" person wrt P2SH could simply not use it and continue to advocate OP_EVAL, and functionally it would be the same experience as P2SH not existing.
How so? Using OP_EVAL would in any case have needed a soft fork repurposing OP_NOP1.
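
For those who weren't around then, the OP_NOP repurposing trick is the standard soft-fork pattern; a toy sketch of it (my own illustration, not interpreter code; this is the same trick BIP65 later used to turn OP_NOP2 into OP_CHECKLOCKTIMEVERIFY, and OP_EVAL would have done the same with OP_NOP1):

```python
# Toy sketch of the OP_NOP repurposing pattern (illustrative only).

def new_rule_holds(stack, tx):
    # placeholder for the repurposed opcode's new semantics
    return False

def old_interpreter(op, stack, tx) -> bool:
    if op == "OP_NOP1":
        return True  # pre-fork nodes: literally a no-op, always succeeds
    return True

def new_interpreter(op, stack, tx) -> bool:
    if op == "OP_NOP1":
        # post-fork nodes enforce the new rule; scripts an old node would
        # accept can now fail, never the reverse, which keeps it "soft"
        return new_rule_holds(stack, tx)
    return True

print(old_interpreter("OP_NOP1", [], None))  # True: old nodes wave it through
print(new_interpreter("OP_NOP1", [], None))  # False: new nodes enforce it
```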

Worst case scenario is P2SH becoming deprecated. This is true opt-in.
Can't follow you here, what do you mean exactly?
 

albin

Active Member
Nov 8, 2015
931
4,008
P2SH was not deployed to alleviate any time-sensitive issue, so if advocates of OP_EVAL at some point had a compelling argument that led to implementing it, the two could coexist, or P2SH could just fall into disuse. This was a feature that Bitcoin didn't have before, so the consequences of using it were contained to people making these kinds of transactions, and there was essentially no motivation to use these features if you didn't specifically need them.

The difference with using segwit softforked as a capacity increase is that everybody across the board needs capacity in Bitcoin, so tying the capacity increase to segwit being implemented exposes everybody to risk, unless they are intentionally willing to throttle their own ability to make transactions in the face of increasing transaction demand. Hardforking 2MB first, then implementing segwit later as block + witness <= 2MB would not have this characteristic.
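
To spell out the two accounting rules being contrasted, here is a sketch with made-up numbers (illustrative Python, not actual consensus code):

```python
# Sketch of the two capacity rules being contrasted (illustrative only).

MB = 1_000_000

def valid_simple_hardfork(base_size: int) -> bool:
    # a plain 2MB hard fork: one number, one limit
    return base_size <= 2 * MB

def valid_hf_then_segwit(base_size: int, witness_size: int) -> bool:
    # the ordering suggested above: hard fork first, then add segwit
    # later with base + witness counted against the same 2MB limit
    return base_size + witness_size <= 2 * MB

print(valid_simple_hardfork(1_900_000))          # True
print(valid_hf_then_segwit(1_500_000, 400_000))  # True
print(valid_hf_then_segwit(1_500_000, 600_000))  # False: witness counts too
```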
 

lunar

Well-Known Member
Aug 28, 2015
1,001
4,290
But the price of bitcoin has to take a dive, because nobody wants to invest in "the future of money" as long as the transaction capacity of the system is already full or almost full.
@Norway
I think I'd be a little more optimistic than this. We've had several rallies cut short due to the bottling up of transactions and UX crises, plus a year-long development civil war. Yet the price still hasn't tanked. For all its faults, bitcoin is still a great store of value in these economically troubled times. I think we might see several more rallies cut short as transactions flood the network.

With each failed breakout, the pressure on Core and miners to do something will increase. If the system is to work, eventually the major economic actors will have to do what's in their own best interests; that means main-chain scaling as a priority, and soon. Core will be forced to capitulate or get bypassed. At which point it's lift-off time.
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,995
A soft fork is a way to enforce new, stricter rules in the system without requiring that all nodes upgrade their software; it needs only a majority of miners to be in agreement.
i really don't like that definition. when i step back and look at the relative differences between a simple 2MB HF and SW, i see a huge relaxation of the rules when it comes to SW. all you have to do is listen to Lombrozo's description of the wide-open script versioning that will allow core dev to SF "all sorts of changes" to understand what i mean. not to mention the sheer magnitude of all the other changes SW allows, some of which are even good.

But your knowledge of the system was waaay less then than it is now (as was mine, btw).
well, here's the thing. if my knowledge level had been up to snuff at that time, maybe i would've pushed hard against the ANYONECANSPEND aspects of p2sh.

2-of-3 multisig was only the simplest case, and the easiest to explain to non-technical people.
yes it was. and in some respects, it opened up a can of worms: core dev now thinks it can exploit this SF mechanism in bad ways it might not even realize it is doing.

was just the first step in a certain direction, and more and more soft forks would follow (stricter malleability rules, OP_CLTV, OP_CSV, etc.).
i will tell you for sure that the SF'ing that's been done over the last year for things the community has no interest in, like CSV, CLTV, and opt-in RBF, has caused a lot of anger. now we have LN & SW being pushed in a similar SF manner. the community is fed up with core dev not listening to the users.

Also, where before the discussion was mostly between very technical people, now a lot of the general public is giving opinions (on one side or the other) without really knowing how the specific things work.
well, i think i know how things work. as i think a lot of ppl in this thread do too. sure, i might get a detail here and there mixed up (like this compatibility issue between old & new wallet outputs) but i certainly am capable of understanding the big picture. unfortunately, when Blockstream took core dev into the for-profit arena of self-indulgence, they started the shitstorm. when Greg Maxwell decides to choose his CTO position at Blockstream over his Bitcoin github commit privileges, that tells you something. when they shamelessly attack and censor users who advocate bigger blocks, they cross the line. when Adam outright lies and misleads, it pisses ppl off. so what do you expect? yeah, more non-tech ppl are going to get involved when they see abuses of what was a more fair and balanced system.

Since he was the undisputed leader at the time, he committed his P2SH patch and released the new software.
interesting, i didn't know that.

So, I ask myself (I'm really thinking out loud and trying to play the devil's advocate): should we try to learn from the past and be more open to this kind of innovation?
get rid of the discount, make it a + b <= 4MB, and i might be convinced.
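
for anyone following along, here's the arithmetic behind that objection, as a sketch using the weight formula segwit proposes (illustrative python, not consensus code):

```python
# the discount, in numbers (illustrative sketch): segwit as proposed counts
# a base byte as 4 weight units and a witness byte as 1 (the 75% discount)

MB = 1_000_000

def segwit_valid(base: int, witness: int) -> bool:
    weight = 4 * base + witness
    return weight <= 4 * MB

def flat_valid(base: int, witness: int) -> bool:
    # the alternative above: no discount, just a + b <= 4MB
    return base + witness <= 4 * MB

# witness-heavy block: 0.5MB of transaction data + 2MB of witness data
print(segwit_valid(500_000, 2 * MB))  # True: weight is exactly 4M
# the same 2.5MB spent on plain transaction data instead
print(segwit_valid(2 * MB, 500_000))  # False: weight 8.5M, way over
print(flat_valid(2 * MB, 500_000))    # True: a flat limit doesn't care
```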
 

Norway

Well-Known Member
Sep 29, 2015
2,424
6,410
A soft fork is a way to enforce new, stricter rules in the system without requiring that all nodes upgrade their software; it needs only a majority of miners to be in agreement.
I wasn't asking for your interpretation/prediction of the consequences of a so-called "soft fork" in the bitcoin system. My question was aimed at the definition of a soft fork. What makes the fork "soft"? Is it a definition from a core dev? Is it a definition with roots in peer-to-peer systems? Has the term been used in other open source projects? And what is the f**king normal definition of a soft fork? :confused:
@freetrader
The milk for my White Russians would have gone sour and spoiled my drink... unless Louis Pasteur had discovered the process that preserves it and keeps it fresh!

By the way...
I really liked the quotes from @Justus Ranvier about how money printing makes for an unhealthy market.

And here is my take on the same theme:

Printing more money is the disease, not the remedy.
 

AdrianX

Well-Known Member
Aug 28, 2015
2,097
5,797
bitco.in
Thanks @Dusty for the question. It seems evident that knowledge is evolving as we go, and so are the circumstances, but individuals' positions are changing a lot slower. I sure hope Gavin would make a different BIP16 decision in hindsight, given where we are today.

I've concluded for myself that bitcoin is primarily a value exchange protocol; it seems evident that this is the primary function to preserve if this technology is to have maximum impact.

What makes Bitcoin so exciting for me is that it's a 21st-century solution to an age-old problem: money, which has been at the centre of power, and of almost all civilizations' problems, throughout history.

It's so profound it has the potential to catalyze dramatic change. Fundamentally, the most important element in measuring value is access to a stable, universal unit of measure, as value itself is nebulous. Bitcoin gives us that, leaving us humans to cooperate, prioritize and pursue our diverse values.

I started my bitcoin education by reading the writings of von Mises, and even The Wealth of Nations, to better understand what Bitcoin was. (I studied business economics at school and university, and yet what I read was all new to me.)

That said, how does this apply to bitcoin as a principle?

Bitcoin is loosely described as a design of incentives that uses technology to define a finite unit of measure free of the control of a centralized authority.

The control the developers (experts) have over the technology that governs the incentive design gives the illusion that the technology is the protocol. Some developers still think the protocol is the code and they are its keepers, but it's not: the protocol is the considered design of the incentives; the technology is merely what makes it possible.

The understanding I come away with is that our values are all different. Sure, we all value water, but its value changes relative to one's relationship to it.

Bitcoin is no different: the work that is done as part of the incentives that make it function as a good value exchange protocol, a unit of measure, needs to be universally consistent for everyone who uses it. As a principle, Bitcoin must do only what is necessary to be good money.

For Bitcoin to fulfill the value proposition of money, the technology that makes Bitcoin work must perform only those services that are, by definition, the minimum necessary to make bitcoin a unit of measure for value. All other services and features are subjectively evaluated and have different utility for different people.

Should any extra services be performed by the technology, some users get more benefit from the money than others: in effect, all users collectively contribute to the cost of maintaining the value exchange protocol, but some users get added benefits at the expense of other users.

The resulting unit of measure then becomes relatively subjective instead of being universal and erodes the potential value of bitcoin.

I'm all for soft forks like BIP16, but they should not be debated; they should be released in separate implementations. The kernel, the base implementation of Bitcoin, should be as simple as is viably possible.
 

johnyj

Member
Mar 3, 2016
89
189
So, I ask myself (I'm really thinking out loud and trying to play the devil's advocate): should we try to learn from the past and be more open to this kind of innovation?
Cheating is not innovation; it is the most natural form of shortcut, taken by not doing all the required hard work.

The reason soft-fork cheating has been working is that people had no idea what was happening. A soft fork defeats the very purpose of running a node, so if there is soft fork after soft fork, no one will be interested in running a node, because that node cannot be trusted most of the time.
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,995
Does anyone else see the hypocrisy of core dev exclaiming the importance of full-node verification and decentralization over the last several years versus what I consider a complete about-face when it comes to SW and its consequences: SPV degradation for all old nodes, and even the promotion and growth of "partially validating SPV nodes" going forward?
 

solex

Moderator
Staff member
Aug 22, 2015
1,558
4,693
Indeed, their hypocrisy is evident, but it is a price they are willing to pay to avoid the Core node count having to restart from zero while they implement all the big technical changes which they *think* Bitcoin needs.

Of course, the news that Bitcoin v0.12 + BIP101 is sufficient for a global digital cash which can scale significantly escapes them, because it does not scale perfectly.
 

Roger_Murdock

Active Member
Dec 17, 2015
223
1,453
i really don't like that definition. when i step back and look at the relative differences between a simple 2MB HF and SW, i see a huge relaxation of the rules when it comes to SW. all you have to do is listen to Lombrozo's description of the wide-open script versioning that will allow core dev to SF "all sorts of changes" to understand what i mean. not to mention the sheer magnitude of all the other changes SW allows, some of which are even good.
I finally got around to trying to wrap my head around the "soft fork" v. "hard fork" thing. As I understand it, a "soft fork" involves enforcing additional / more stringent rules whereas a "hard fork" involves removing / relaxing at least one rule. So if the pre-fork rules are 'ABC,' a soft fork might involve nodes / miners beginning to apply the rule set 'ABCD' whereas a hard fork might involve them beginning to apply the rule set 'ABE' (where E encompasses scenarios that don't satisfy C).

I've also seen it claimed that anything that can be done with a hard fork can also be done with a soft fork (and maybe the converse is also true?). But it seems that there is always (or at least usually) a more "natural" approach based on the nature of the change you're seeking to accomplish. So for example, if you want to allow for larger blocks, the natural option is a hard fork because what you're really trying to do is to relax a rule. On the other hand if you wanted to (God forbid) reduce the block size limit, that would seem to lend itself naturally to a soft fork since what you're really trying to do is apply a stricter rule.
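
That framing translates almost directly into code. Here's a toy sketch (my own illustration, not real node code; the rules and numbers are stand-ins) of why one direction is "soft" and the other "hard":

```python
# Toy model of the ABC / ABCD / ABE framing (illustrative only).

def A(blk): return True                       # stand-ins for existing rules
def B(blk): return True
def C(blk): return blk["size"] <= 1_000_000   # the rule being changed
def D(blk): return blk["version"] >= 4        # an extra, stricter rule
def E(blk): return blk["size"] <= 2_000_000   # a relaxation of C

def old_rules(blk): return A(blk) and B(blk) and C(blk)             # 'ABC'
def soft_fork(blk): return A(blk) and B(blk) and C(blk) and D(blk)  # 'ABCD'
def hard_fork(blk): return A(blk) and B(blk) and E(blk)             # 'ABE'

big_block = {"size": 1_500_000, "version": 4}

# anything ABCD-valid is automatically ABC-valid, so under a soft fork the
# old nodes follow along without noticing; under a hard fork they fork off:
print(hard_fork(big_block))  # True:  upgraded nodes accept the bigger block
print(old_rules(big_block))  # False: old nodes reject it
```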

So maybe it's not that "soft forks are always better" or "hard forks are always better," maybe the issue is choosing the one that makes more sense for a particular change. I'm not much of a coder but my strong intuition is that attempting to accomplish via a "soft fork" what is naturally / logically a "hard fork" is a bad idea because it makes for some really kludgy and complex code. And for no real benefit, correct? It's just so that non-upgraded nodes will still "work" or rather, think they're still working despite their newfound zombie status? But how much value do non-validating nodes really provide to the network?

As I was typing this up, I finally got around to reading Mike Hearn's "On consensus and forks" article which is really helpful. So, based on the article, it sounds like he'd take the position that even things that naturally lend themselves to soft forks should be implemented as hard forks. (And I'd guess that, unlike cramming a natural hard fork into a soft fork, making a natural soft fork into a hard fork is always trivial? Just require inclusion of some new kind of "header element" / "opcode" / computer science thingamajig that old nodes will recognize as something they can't interpret and thus reject?) And the rationale for this approach is greater transparency and ensuring explicit consensus on rule changes. It's a way of communicating: "I am applying a new rule set. Are you ok with this?"

EDIT: This quote from Hearn in the comments of his article distinguishing between the "actual social rules" of Bitcoin and the "protocol rules" is helpful.
The idea that the fork types differ in whether they add rules or remove them is a distraction …. the problem is that the social rules Bitcoin encodes don’t map 1:1 to protocol rules.

Whilst P2SH only “added rules” in some extremely pedantic technical sense, that’s not a useful way to think about it because it ended up breaking the actual social rule people want: “coins may not be stolen by random people”.

The specific rules that make up the scripting language and so on are all working towards implementing that social rule, so if you break the social rule whilst claiming you are still following the technical rules, you’re kind of missing the point.
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,995
So maybe it's not that "soft forks are always better" or "hard forks are always better," maybe the issue is choosing the one that makes more sense for a particular change.
this is exactly right.

i've been saying for a while now that at this point, this debate has moved beyond which is safer, a soft or a hard fork. what matters is the details proposed by both and which is "better" for the system in the long run. everybody and their mother in Bitcoin has heard about the controversy, and your mother has her hand over the button, ready to upgrade before you do.
 

sickpig

Active Member
Aug 28, 2015
926
2,541
Soft forks are used only to change consensus rules. If no consensus rules have to be changed, no hard or soft fork is necessary. There is no use for a soft fork without changing consensus rules.
you miss my point completely.

I'm not against changes of consensus rules, I'm against soft forks.

Just to prove my point: I think multisig txs are extremely valuable.
 

sickpig

Active Member
Aug 28, 2015
926
2,541
The race to disprove the coming proof?


Brito is a newbie compared to Peter Todd, who started asking for such info almost 2 years ago:

https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2014-September/006606.html
(mail subject: "Does anyone have anything at all signed by Satoshi's PGP key?")

Peter Todd said:
So far I have zero evidence that the common claim that "Satoshi PGP signed everything" was true; I have no evidence he ever cryptographically signed any communications at all.
 

Inca

Moderator
Staff member
Aug 28, 2015
517
1,679
I've joked that Satoshi's re-emergence could occur in relation to this governance attack.

You can almost smell the fear. Satoshi stating categorically that the max_blocksize was instituted as a temporary measure and that developers of Core have taken the wrong path (followed the wrong leader, a person as manipulative and mendacious as Maxwell) would end this controversy once and for all.