Gold collapsing. Bitcoin UP.

MoonShot

New Member
Jul 23, 2016
16
51
It's called a "reorg," and reorgs happen at a rate of approximately once per day.
Thanks for confirming that you've been lying about "strong consensus" all this time.

You've been concern trolling about the 75% BIP activation threshold when you knew perfectly well that it's more than sufficient.

You're just here to waste our time and to discourage anyone who's trying to save Bitcoin from the Core monopoly.
Good call. If the guy really believed in his 'theories' he'd be out there profiting from the ETH/ETC split and preparing for the same in BTC.
I am sorry you feel that way, but the truth hurts. I know it is hard to take, and I know it's not the behavior you want your client to have in your heart, but the fact is that this is how Bitcoin Classic nodes are configured to behave. My opposition to Classic is based on the actual code, not on the beliefs in the minds and hearts of the people who run the Classic nodes. As soon as Core gets the lead, the Classic coins you bought on an exchange after the hardfork would vanish from your wallet.

As this truth gradually sinks in and you learn from the ETH/ETC situation, I hope you slowly start to understand why there was really such a strong and genuine opposition to Bitcoin Classic.
so you know that Core & Blockstream developers have prepared an attack on Classic chain after a Fork happens. All well-hidden and deniable, ofc.
 

satoshis_sockpuppet

Active Member
Feb 22, 2016
776
3,312
Forecast? Do you really believe that an event is not caused if homo sapiens is not able to forecast it? The meteorologists can forecast the weather for the next 3-5 days more or less, because it is deterministic. Do you really believe the weather begins to behave indeterministically (creationistically) from day 5?
There is a difference between the two. One is the weather, which you can't forecast because you can't process enough data (yet). The other is impossible to forecast even if you have all measurable data.
Our current knowledge says that you can only make statistical statements about certain effects. There is no way to know how things behave at a time T (and that's not because you don't have enough data; with the current theoretical and experimental knowledge you can never know).

As I 'proved' for you already: It's just the silly Copenhagen interpretation that postulates indeterminism (creationism). The majority of the "leading cosmologists and other quantum field theorists" believe in the many-worlds interpretation, which is deterministic; others believe in Bohmian mechanics, which is deterministic too.
First, if you believe certain things because Einstein believed them, it's not a good idea to call a theory silly when people like Heisenberg and Bohr believed in it. ;)
However you twist it, the current models and what we know allow us to give probabilities about some "facts". Not more. Believing that these probabilities are in reality deterministic doesn't help you, because you don't have anything to support that idea.

The opposite. Effect without cause is beyond any logic. It's creationism.
It's pretty easy: people applying the (logically consistent) probabilistic models can make forecasts within a margin because they accept that the effects they observe are random. Try to find someone who has formulated a (logically consistent) deterministic model for quantum physics. It doesn't exist. Therefore, with today's knowledge, it's reasonable to view the world as not deterministic.

And if everything has a cause, what is the cause for natural constants? Where do they come from?

If I can't convince you that the world, from all we know, probably isn't deterministic, maybe I can at least convince you that there is good reason not to say "The world is deterministic. Period."

Anyway, this has about nothing to do with Bitcoin. ;) Although I know some guys at Core also have their problems accepting probabilistic models. :D
 

Justus Ranvier

Active Member
Aug 28, 2015
875
3,746
I know, and in those situations some unnecessary fill transactions would be needed to reach 1 MB. That would primarily be an annoyance for attackers trying to prevent real user transactions from being confirmed, because they have the problem constantly. The miners mining real user transactions seldom have the problem, only when they find a new block very quickly.
I still wouldn't be in favour of it because the set of consensus rules should be the smallest possible set which allows the system to operate.

Adding a block size limit in the first place already started a slippery slope toward unbounded complexity when it changed the consensus definition from "a block is valid if all the transactions in it are valid" to "a block is valid if the transactions in it are valid and also if it follows some other arbitrary rules we tacked on later."

We should be trying to achieve a state not of there being no more consensus rules left to add to Bitcoin, but a state of there being no more consensus rules left which can be removed.
 

xhiggy

Active Member
Mar 29, 2016
124
277
Whether it rains tomorrow is not an undecided chance. It is decided already, but the computers are not filled with all the data that are needed to predict it with 100 pct hit rate.
It's off topic, but are you sure that this is true?
As soon as Core gets the lead, the Classic coins you bought on an exchange after the hardfork, would vanish from your wallet.
Blatant fear mongering.
 

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
There is a difference between the two. One is the weather, which you can't forecast because you can't process enough data (yet). The other is impossible to forecast even if you have all measurable data.

Our current knowledge says that you can only make statistical statements about certain effects. There is no way to know how things behave at a time T (and that's not because you don't have enough data; with the current theoretical and experimental knowledge you can never know).

First, if you believe certain things because Einstein believed them, it's not a good idea to call a theory silly when people like Heisenberg and Bohr believed in it. ;)
However you twist it, the current models and what we know allow us to give probabilities about some "facts". Not more. Believing that these probabilities are in reality deterministic doesn't help you, because you don't have anything to support that idea.

It's pretty easy: people applying the (logically consistent) probabilistic models can make forecasts within a margin because they accept that the effects they observe are random. Try to find someone who has formulated a (logically consistent) deterministic model for quantum physics. It doesn't exist. Therefore, with today's knowledge, it's reasonable to view the world as not deterministic.

And if everything has a cause, what is the cause for natural constants? Where do they come from?

If I can't convince you that the world, from all we know, probably isn't deterministic, maybe I can at least convince you that there is good reason not to say "The world is deterministic. Period."

Anyway, this has about nothing to do with Bitcoin. ;) Although I know some guys at Core also have their problems accepting probabilistic models.
The way I see it, there are two different questions:

(1) Are the laws of the universe deterministic?

(2) Can you predict the future state of a deterministic system if you can describe its initial conditions perfectly?

In my opinion, the answer to #1 is "we don't know." People who claim that the answer is "no: the universe is not deterministic" will often cite Bell's theorem as their proof:

"No physical theory of local hidden variables can ever reproduce all of the predictions of quantum mechanics."

That is, it's been proven that no set of deterministic mathematical equations exists that could explain the observations from quantum mechanics if we assume locality (i.e., that there are no "faster than light" interactions between events).

But why must space be local? What is space at its most fundamental level? No one really knows, but the theory that I like best is that space is all there is. The universe is a giant network, and what we consider "empty space" is just straightforward, boring connections between nodes separated by a distance on the order of the Planck length. Particles like electrons or photons are patterns or "twists" in this network, and these twists propagate through the network according to certain transformation rules. Stephen Wolfram (creator of Mathematica) explains why the idea of locality doesn't even make sense if you think of space as a network:



With respect to the second question, #2, we already know that the answer is "not really." We can only make predictions for systems where there is some sort of computational reducibility. For other systems, the only way to figure out the state at time t = t1 is to allow the system to evolve from t0 to t1.

If the system is something like the Rule 30 cellular automaton below, then although the transformation rules are extremely simple, there is no way to predict whether the square in the center column at row N=100 will be black or white other than by working out the color of all the squares in all the rows above it!
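This irreducibility is easy to see in code. Here is a minimal Python sketch (my own illustration, not Wolfram's code) that evolves Rule 30 from a single black cell; the only known way to obtain the center column is to compute every row above it.

```python
# Rule 30 update: each new cell = left XOR (center OR right).
# To learn the center cell's color at row N we must evolve all N rows;
# no shortcut is known -- this is computational irreducibility.
def rule30_center_column(n_rows):
    """Evolve Rule 30 from a single black cell; return the center column."""
    width = 2 * n_rows + 3          # wide enough that the edges never interfere
    cells = [0] * width
    center = width // 2
    cells[center] = 1               # initial condition: one black cell
    column = [1]
    for _ in range(n_rows):
        cells = [cells[i - 1] ^ (cells[i] | cells[(i + 1) % width])
                 for i in range(width)]
        column.append(cells[center])
    return column

print(rule30_center_column(15))     # 1 = black square, 0 = white square
```

The resulting center column looks statistically random even though the rule is fully deterministic, which is exactly the point of the example.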



Extending this analogy to the physical world, at a certain level of complexity, there is no faster way to predict the manner in which a physical system will evolve than by watching it do so. You could model the system given its initial conditions and the physical laws it obeys, but your simulation will necessarily be slower than the "computations" performed by the universe itself on the actual system.

An example of this is "what will your day be like tomorrow?" Even if the universe were completely deterministic and we knew its laws, and even if we could describe its present state perfectly, there would still be no way to predict precisely what's going to happen tomorrow other than by waiting for tomorrow to come.
 

Zarathustra

Well-Known Member
Aug 28, 2015
1,439
3,797
There is a difference between the two. One you can't forecast the weather because you can't process enough data (yet). The other is impossible to forecast, even if you have all measurable data.
Yes, but 'all measurable data' is not the same as all data. Laplace's demon has all data, and therefore he can predict the future in all details.

It is the current knowledge, that you can only make statistical statements about certain effects. There is no (and that's not because you don't have enough data, it is because with the current theoretical and experimental knowledge you can never know) way to know how things behave at a time T.
Yes, but determinism is not dependent on the knowledge and the computers of homo sapiens. The better the computers and the more data available, the better the prediction about the deterministic weather.

First, if you believe certain things because Einstein believed them, it's not a good idea to call a theory silly, when people like Heisenberg and Bohr believed in it. ;)
However you twist it, the current models and what we know, allow us to give probabilities about some "facts". Not more. Believing, that these probabilities are in reality deterministic doesn't help you because you don't have anything to support that idea.
Except the universal rule of cause and effect. That's the reason why an overwhelming majority of the wise philosophers and physicists are convinced that there is no such thing as indeterminism (creationism).

It's pretty easy: people applying the (logically consistent) probabilistic models can make forecasts within a margin because they accept that the effects they observe are random. Try to find someone who has formulated a (logically consistent) deterministic model for quantum physics. It doesn't exist. Therefore, with today's knowledge, it's reasonable to view the world as not deterministic.
As I have told you twice already, the Bohmian as well as the many-worlds model are deterministic models for quantum physics.

And if everything has a cause, what is the cause for natural constants? Where do they come from?
They are eternal.

If I can't convince you that the world, from all we know, probably isn't deterministic, maybe I can at least convince you that there is good reason not to say "The world is deterministic. Period."
No, there is no good reason to postulate creationism instead of causalism. There never has been.
Neither the past nor the future has contingents. There is only one past and one future: the one and only possible.

Anyway, this has about nothing to do with Bitcoin. ;) Although I know some guys at Core also have their problems accepting probabilistic models. :D
I guess most of them believe in fairytales such as free will and a future that is still undecided. I don't think they are able to follow the greatest minds in history.

“Honestly, I cannot understand what people mean when they talk about the freedom of the human will. I have a feeling, for instance, that I will something or other; but what relation this has with freedom I cannot understand at all. I feel that I will to light my pipe and I do it; but how can I connect this up with the idea of freedom? What is behind the act of willing to light the pipe? Another act of willing? Schopenhauer once said: Der Mensch kann tun was er will; aber er kann nicht wollen was er will (Man can do what he will but he cannot will what he wills).” Einstein


The so called 'master argument' of Diodoros Kronos:

"For Diodorus, if a future event is not going to happen, then it was true in the past that it would not happen. Since every past truth is necessary (proposition 1), it was necessary that in the past it would not happen. Since the impossible cannot follow from the possible (proposition 2), it must have always been impossible for the event to occur. Therefore if something will not be true, it will never be possible for it to be true, and thus proposition 3 is shown to be false."
It's off topic, but are you sure that this is true?
I have 100 percent consensus with Einstein, Laplace and Diodoros Kronos. (y)
 

Zangelbert Bingledack

Well-Known Member
Aug 29, 2015
1,485
5,585
@jonny1000

OK, so just a clean flag day with no threshold and some way of distinguishing between the two chains should be sufficient, right?

The way to sell this is to say Classic (and BU) is too nice. The right way is with an uncompromising fork that just says "no way!" to Core, majority (let alone strong consensus) be damned. Looking back, you did actually say this in some roundabout way but it wasn't getting through because of the weird and contradictory way you were presenting it.

I'd feel a lot more comfortable with an economic solution rather than a code-based one, which is why I think the code changes should be limited to simply what allows for fork arbitrage (preferably futures trading) between the two branches of the fork. Waiting 6 months seems pointless, and so does waiting 100 blocks for maturity (perhaps I disagree with both you and @Justus Ranvier on this; not sure yet).
 

Justus Ranvier

Active Member
Aug 28, 2015
875
3,746
Waiting 6 months seems pointless, and so does waiting 100 blocks for maturity (perhaps I disagree with both you and @Justus Ranvier on this; not sure yet).
If an exchange wants to enable trading between the small block fork and the large block fork, it is possible to do so.

However, if they go about it in a naive way they can very easily end up insolvent.

The safest approach is to only credit small block deposits if they have a generation transaction ancestor on the small block fork. Exchanges can create this type of output themselves after they receive the first deposit of a matured small block generation output.

This means it's not safe to take deposits for small block bitcoins until the minority fork has matured its first post-fork generation output.
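That crediting rule can be sketched in Python. Everything below is a hypothetical illustration (toy types, made-up fork height, not any real exchange or node API): a deposit is safe to credit on one fork only if its ancestry reaches a post-fork generation (coinbase) transaction on that fork, because such a transaction cannot exist on the other branch.

```python
from dataclasses import dataclass, field

FORK_HEIGHT = 480_000   # hypothetical fork activation height, illustration only

@dataclass
class Tx:
    txid: str
    is_coinbase: bool = False
    height: int = 0                               # block height (coinbases)
    inputs: list = field(default_factory=list)    # txids of parent transactions

class Chain:
    """Toy transaction index standing in for a node's view of one fork."""
    def __init__(self):
        self.txs = {}
    def add(self, tx):
        self.txs[tx.txid] = tx
    def parent(self, txid):
        return self.txs[txid]

def safe_to_credit(tx, chain):
    """True if tx descends from a post-fork generation (coinbase) output on
    this chain, so the same tx cannot also confirm on the other branch."""
    if tx.is_coinbase:
        return tx.height >= FORK_HEIGHT
    return any(safe_to_credit(chain.parent(i), chain) for i in tx.inputs)
```

A deposit spending only pre-fork coins fails this check, which is exactly why deposits can't safely be taken until the minority fork has matured its first post-fork generation output.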
 

go1111111

Active Member
...and then they'd reappear again immediately as soon as the majority Classic fork pulled ahead again as it's mathematically guaranteed to do, OR when the transaction that paid you was also mined in the Core fork. And after a few hours the risk of a reorg would drop to negligible levels so if everybody just went to sleep after the hard fork began everything would be fine in the morning when they woke up.
I think this is the first time that I've agreed with @jonny1000 in one of his debates against Justus.

If we assume that the big block fork will retain > 51% hashpower forever, then Justus's argument is solid. Jonny's point seems to be that we shouldn't be that confident in this, because hashpower can shift in the uncertainty following a fork.

Look at ETHF/ETC. There are many people (including me) who think that ETC actually has a decent shot (maybe 20%) at eventually overtaking ETHF in hashpower. It might take many months or even years for this to happen. Just because ETHF has more hashpower now doesn't mean we can be sure that it will continue to. A similar dynamic could play out in a Classic-like fork.

Justus's argument that if the forking chain was reorged the transactions could just be reincorporated later into the original chain gets weaker the more time that has passed between the fork and the reorg. The reason is that the system won't have enough throughput to process all the reorged transactions immediately, so an opportunity for double spending arises. Let's say I buy Justus's boat for 100 BTC on the big block forked chain with a tx fee of 0.001 BTC. Then a month later this tx gets reorged. I see this and immediately send out another tx on the original chain spending this same 100 BTC but with a 1 BTC fee. A miner mines the new tx because the fee is higher. Now I have Justus's boat and he has none of my coins on any chain. Actually I don't even have to wait for the re-org. I can just spend the small block version of those outputs any time, to block my tx to Justus from ever being included in the small block chain.

I do suspect that Jonny has some antagonism toward Classic supporters that doesn't just stem from this concern about asymmetric forks, but if we put that aside, I agree with him that symmetric forks are less risky.

What I see from people opposing Jonny's arguments is just "That's an unnecessary precaution. You're being paranoid" (admittedly I haven't read every post in this discussion). Does anyone actually have a solid reason why symmetric hard forks are more harmful than asymmetric ones, instead of just being unnecessary?

One argument against them might be that SPV users may have wanted to follow the majority fork, but many ways of doing a symmetric fork would cause SPV wallets not to recognize it. However it seems possible to do symmetric forks in a way where SPV wallets recognize both (maybe depending on a setting that the wallet user can configure).
 

freetrader

Moderator
Staff member
Dec 16, 2015
2,806
6,088
@Zangelbert Bingledack : PoW change doesn't fix that - assume the exchange is running both clients and receives an incoming tx consisting of pre-fork coins only.

If neither of the forks has replay attack protection (i.e., a change to tx signatures), this tx could be mined on either or both forks. It is up to the sender to take care that they send funds which can be collected by the recipient on the intended fork chain (i.e. mined on that chain), regardless of its PoW.

If there is replay attack protection, it works for both sides as far as I can tell, so an exchange would not need to taint coins before sending out a withdrawal.

Without replay protection, an exchange needs to be diligent and taint those withdrawn coins to prevent loss.
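The tainting idea can be sketched as a toy model (hypothetical names; a real implementation would operate on actual UTXO sets and signatures): a transaction can be replayed onto a branch only if every output it spends exists there, so mixing in one fork-specific input pins the withdrawal to a single chain.

```python
def can_confirm_on(tx_inputs, utxo_set):
    """A tx is valid on a branch only if every outpoint it spends exists in
    that branch's UTXO set; otherwise consensus rules reject it there."""
    return all(outpoint in utxo_set for outpoint in tx_inputs)

# Pre-fork coins exist on both branches; "taint:0" descends from a
# post-fork coinbase and therefore exists only on the large-block branch.
small_block_utxos = {"prefork:0", "prefork:1"}
large_block_utxos = {"prefork:0", "prefork:1", "taint:0"}

plain_withdrawal   = ["prefork:0"]              # replayable on both chains
tainted_withdrawal = ["prefork:1", "taint:0"]   # confirms on one chain only
```

This is the loss scenario for a careless exchange: the plain withdrawal confirms on both branches, paying the user twice.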
 

sickpig

Active Member
Aug 28, 2015
926
2,541
Look at ETHF/ETC. There are many people (including me) who think that ETC actually has a decent shot (maybe 20%) at eventually overtaking ETHF in hashpower
Slight topic change: do you think ETC would have turned out the way it has if the PoW difficulty retarget had been similar to the one we have in Bitcoin (every 2016 blocks at 10 minutes per block)?
SF meeting between core devs and Chinese miners just started:

 

Zarathustra

Well-Known Member
Aug 28, 2015
1,439
3,797
The fact that you are involved in a debate implies that you believe in the possibility of changing someone's mind. You don't have debates with your washing machine, do you?
Yes, I believe that minds change, more or less. A changed mind is an effect, an effect that is caused.
Minds don't change ex nihilo. I am convinced that it is already decided if and when 2MB blocks will be mined, which minds will change, and when exactly. We just don't know it because we have much less information and data than Laplace's demon. He even knows what Maxwell's demon is determined to try.
Laplace's demon also knows if Jihan will act or if he will talk in August. He possesses all data. He knows @Jihan better than we do. We can just speculate.
 

Zarathustra

Well-Known Member
Aug 28, 2015
1,439
3,797
@majamalu

"Then, to be consistent, you should (at least) stop trying to convince people. I'm not saying you should, I'm just pointing out a contradiction."

I would contradict myself if I stopped trying to convince people. I believe that I am (pre-)determined to convince people. Why should I break that determinism?
Extending this analogy to the physical world, at a certain level of complexity, there is no faster way to predict the manner in which a physical system will evolve than by watching it do so. You could model the system given its initial conditions and the physical laws it obeys, but your simulation will necessarily be slower than the "computations" performed by the universe itself on the actual system.

An example of this is "what will your day be like tomorrow?" Even if the universe were completely deterministic and we knew its laws, and even if we could describe its present state perfectly, there would still be no way to predict precisely what's going to happen tomorrow other than by waiting for tomorrow to come.
I like your post, Peter. I like all your posts. The problem that you explain here is solved in your non-local world:

There has recently been proposed a limit on the computational power of the universe, i.e. the ability of Laplace's demon to process an infinite amount of information. The limit is based on the maximum entropy of the universe, the speed of light, and the minimum amount of time taken to move information across the Planck length, and the figure was shown to be about 10^120 bits.[11] Accordingly, anything that requires more than this amount of data cannot be computed in the amount of time that has elapsed so far in the universe.


Another theory suggests that if Laplace's demon were to occupy a parallel universe or alternate dimension from which it could determine the implied data and do the necessary calculations on an alternate and greater time line, the aforementioned time limitation would not apply. This position is for instance explained in The Fabric of Reality by David Deutsch, who says that realizing a 300-qubit quantum computer would prove the existence of parallel universes carrying the computation.

https://en.wikipedia.org/wiki/Laplace's_demon#Recent_views