Gold collapsing. Bitcoin UP.

Norway

Well-Known Member
Sep 29, 2015
2,424
6,410
meanwhile gemini announced yesterday that it would wait at least until after the fork was resolved to list BCH. they listed LTC instead.
This is just unbelievably sad. Let's get the default values out of the hands of developer politics and freeze the protocol as soon as possible. Time is not on our side.
 

Zarathustra

Well-Known Member
Aug 28, 2015
1,439
3,797
I need to reiterate this from time to time, since some people obviously have not heard it: "developer's authority" has never existed and can never exist. Developers write software, and they convince miners and the ecosystem (exchanges, merchants) to adopt them, to follow one schedule or another, etc.; their "power" only goes so far as their ability to persuade. Assuming otherwise is like assuming clerks of congresspeople, the ones who "actually write the laws", have all the power. It's quite absurd to yell at the clerks for writing laws you or I dislike rather than the congresspeople who actually turn them into reality.
I need to reiterate this from time to time, since some people obviously have not heard it: "congresspeople's authority" has never existed and can never exist. Congresspeople propose the laws, and they convince voters and the ecosystem to adopt them, to follow one schedule or another, etc.; their "power" only goes so far as their ability to persuade. It's quite absurd to yell at the congresspeople for writing laws you or I dislike rather than the voters who actually turn them into reality.

Putting forward pointless bullshit like BUIP101 erodes that - thankfully it was not passed.
Who are you to lecture 50% of the voters with your truth?
 

freetrader

Moderator
Staff member
Dec 16, 2015
2,806
6,088
48.48% (beautifully recurring) please, let's be accurate. I also hope that we never get a vote where the result falls within (50%, 51%), otherwise we'll be in a constitutional crisis.

One of the great things about open source is that those who voted for BUIP101 can run BU up to the current 32-bit limit without really lifting a finger, and with a bit more work can convert the implementation to a 64-bit value where they can have an EB10000000.
Put up a bounty for it if you feel the proposal is so important that it needs to be done ASAP.
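
For a sense of scale, a quick back-of-the-envelope sketch (assuming EB is quoted in MB as in BU's usual convention, and treating the current limit as a 32-bit byte count; both are illustrative, not a statement about BU's internals):

Code:
# What a 32-bit byte counter can hold vs. what EB10000000 (10,000,000 MB = 10 TB) needs.
MB = 1_000_000

max_32bit_bytes = 2**32 - 1        # ~4.29 GB ceiling
eb_10tb_bytes = 10_000_000 * MB    # the BUIP101-style 10 TB excessive block size

print(f"32-bit ceiling: ~{max_32bit_bytes / MB:,.0f} MB (roughly EB{max_32bit_bytes // MB})")
print(f"EB10000000 needs {eb_10tb_bytes:,} bytes; fits in 64 bits: {eb_10tb_bytes < 2**64}")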

What makes NO sense to me is that none of the people who voted for 10TB has raised an issue to request these features on SV, now that BU membership has expressed a preference against it.

They are also very uncreative for not symbolically signalling such an EB yet. That is literally one of the simplest code changes one can make. Of course, not getting any blocks mined with it would indicate that there is no actual miner support for this proposal.
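
To illustrate just how small that code change is, here is a minimal Python sketch that builds and parses an "EB<size in MB>/AD<depth>" style coinbase tag (BU-mined blocks carry text of this shape in their coinbase; the exact string any given client emits is an assumption here):

Code:
import re

# Illustrative only: build and parse a BU-style "EB<MB>/AD<depth>" coinbase tag.

def build_tag(eb_mb: int, ad: int) -> str:
    return f"EB{eb_mb}/AD{ad}"

def parse_tag(coinbase_text: str):
    m = re.search(r"EB(\d+)/AD(\d+)", coinbase_text)
    return (int(m.group(1)), int(m.group(2))) if m else None

print(build_tag(10_000_000, 12))                     # "EB10000000/AD12"
print(parse_tag("/pool.example/EB10000000/AD12/"))   # (10000000, 12)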

Who are you to lecture 50% of the voters with your truth?
Seems to me @imaginary_username is a BU member who has put forward highly valuable proposals for making Bitcoin Cash even better, is developing and running actual simulations and analyses in the realm of double-spend relaying, and has implemented several Electron Cash plugins, such as scheduled payments and change donation.

Just some examples of his work for Bitcoin Cash that speak for themselves

https://www.yours.org/content/standard-priority--sp--miner-policy-and-double-spend-proof-relay-abdcb2ee8555

https://www.yours.org/content/scheduled-payment-plugin-for-electron-cash-38d991413f56

https://www.yours.org/content/donate-spare-change-to-charity-and-improve-privacy--plugin-for-electro-372eaf8f203a

https://www.yours.org/content/a-simple-doublespend-relay-mining-probability-simulator-9cb0e92cc32a

https://www.yours.org/content/why-orphaning-doublespends-in-mempool-doesn-t-work-c5e07d0775d6

https://github.com/imaginaryusername/ds_simulator

https://github.com/imaginaryusername/dp_simulator
 

Norway

Well-Known Member
Sep 29, 2015
2,424
6,410
The most effective way to hold bitcoin down so far has been to restrict the max blocksize cap. This artificial limit is supposed to be a protection for miners, but it's really the enemy of miners. It's a very short-term protection from orphans (a miner could lose minutes), but it destroys the network effect that is crucial for bitcoin's survival, and with it bitcoin's long-term value.

Muammar Gaddafi was killed seven years ago and his country bombed by many countries, Norway included. This happened because he wanted to start a gold-backed currency that could replace the fiat French franc in half of Africa's over 50 countries. As Libya is an oil-producing country, they also wanted to trade oil with this gold-backed currency, threatening the USD as the oil currency at the same time.

Bitcoin was just a joke to TPTB in the beginning. That has changed over 9 years. It's fair to assume that bitcoin is drawing a lot more attention from the same interests now. They have huge organizations working for them. And we have learned from history the game they play.

Infiltration, propaganda, bribes and violence are the tools of the trade. Infiltration is done by the infiltrator making himself useful.

Be vigilant when people try to control the transaction capacity of bitcoin.
 

NewLiberty

Member
Aug 28, 2015
70
442
Ultimately, if miners start using the direct signalling protocol recently proposed by shadders (a Bitmessage-like communication protocol in which intentions are signed with the coinbase pubkeys of the last 1000 blocks, since intentions may certainly change after a miner's last block was produced), then EB coinbase settings can likely be deprecated.
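
Purely as a thought experiment (the proposal is only sketched, so the message format, the use of raw coinbase pubkeys, and the verification scheme below are all assumptions, not a description of anything implemented), such a check might look roughly like this in Python:

Code:
from ecdsa import SECP256k1, VerifyingKey

# Hypothetical sketch only. Keying credibility off the coinbase pubkeys of the
# last ~1000 blocks, and using plain ECDSA over secp256k1, are illustrative
# assumptions rather than the actual protocol details.

def intention_is_credible(message: bytes, signature: bytes, pubkey: bytes,
                          recent_coinbase_pubkeys: set) -> bool:
    """Accept an intention message only if it is signed by a key that recently
    produced a block (intentions may change after a miner's last block)."""
    if pubkey not in recent_coinbase_pubkeys:
        return False
    try:
        vk = VerifyingKey.from_string(pubkey, curve=SECP256k1)
        return vk.verify(signature, message)
    except Exception:  # malformed key or bad signature
        return False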
[doublepost=1539479637][/doublepost]
Have either of these papers been published in a peer reviewed journal?
Where can one find the evidence?
You are his peer, review it.
 

freetrader

Moderator
Staff member
Dec 16, 2015
2,806
6,088
You are his peer, review it.
I'll be his peer on the network, and that's it.
I looked at the first couple of pages with more intensity, and briefly skimmed the rest to see if it was of similar quality. Waste of time.
I. Introduction
...
Any program that is Turing complete is by necessity Finite.
...
Can you explain what a Turing complete program is?
No - neither can the informed reader at that point, because Turing completeness has a well defined meaning which applies to 'a system of data-manipulation rules (such as a computer's instruction set, a programming language, or a cellular automaton)'. The term is not usually applied to programs (sentences in a programming language).

No, you have to indulge Craig's fancy of using an undefined term in the introduction, only to find out on p. 2 that he defines it informally as:
First, we define any Turing complete program to be a program that halts.
If at this point you haven't figured out that his earlier statement is complete bullshit, that may be because you have already forgotten it, since you had to read a whole page of other stuff just to get to the definition. Or it may be because you can trivially think of a program that halts, but does not need to be finite.

Let's use BASIC to construct a program that clearly halts, but is infinite.
Code:
10 PRINT "THOU SHALT NOT CONFUSE THE READER"
20 END
30 PRINT "BUT I WANT TO!"
40 PRINT "NO, ITS NEVER OK"
50 PRINT "BUT I WANT TO!"
60 PRINT "ARE YOU GOING TO PLAY THIS GAME FOREVER?"
70 PRINT "I WILL IF YOU WILL!"
80 PRINT
90 PRINT
100 PRINT
... (and so on, for an infinite number of PRINT statements)
So I just demonstrated a "Turing complete program" that is not finite.
Hence clearly not "Any program that is Turing complete is by necessity Finite".
I did not really feel further rigorous review was deserved after such a mistake.

After all, from falsehood anything follows.

Real scientists don't feel a need to take an established term like Turing completeness and use it in a redefinition like 'Turing complete program' when all that's meant is a program that halts.
If you mean a program that halts, call it a halting program like everyone else.

The rest of the paper comes across as an elaborate attempt to befuddle, or maybe just the expression of someone who is themselves befuddled, but thinks introducing new terms which borrow from established terms can sow enough confusion to make it seem like something very important has been said.

Sorry if this offends anyone here who can't tell the difference. That's my brief review.
[doublepost=1539483742][/doublepost]But I do have a question I can't seem to resolve.

If Craig is so smart, and is basically surrounded by people with PhD's themselves, why doesn't anyone help him to construct a paper which does not insult the reader?
 
Last edited:

imaginary_username

Active Member
Aug 19, 2015
101
174
we have no valid criterion for what is the real coin.
I, uh, need to remind people that whatever Coinbase, Binance, Bitpay and perhaps a couple other exchanges collectively decide to give the ticker to will "get" the ticker regardless of hashwar or philosophy or desired architecture. One can loudly complain about whatever "legitimacy" they have in their hearts, but end of the day it won't matter one bit.

Not saying it's the right thing or anything, but that's kinda how it works. And there's really nothing you or I or any other dozen of loud people can do about it. ;P
 
@imaginary_username

That's why exchanges don't like forks. They don't want to make that decision. It's a bet; being wrong could cost you a lot of money. Like us, they have no criterion for the real BCH, and they don't like to spend time on a minor coin.

Preparing for a controversial fork is a lot of work, especially for smaller exchanges. And ABC already announced the next fork.

I guess we end up with exchanges just saying ABC is the real coin. Decentralized development, Nakamoto consensus, my ass. Core over and again.

I hope exchanges will list the no-change version, though
 

Zarathustra

Well-Known Member
Aug 28, 2015
1,439
3,797
48.48% (beautifully recurring) please, let's be accurate.
Oh, pardon!


What makes NO sense to me is that none of the people who voted for 10TB has raised an issue to request these features on SV, now that BU membership has expressed a preference against it.
Why should BU members make requests on SV? It would make more sense to request removing the default now, as @awemany suggested. @Norway would already win on the second try. Or is such a request also 'pointless bullshit'?
[doublepost=1539500888,1539500122][/doublepost]
The miners have apparently already voted https://fork.lol/pow/work
The market value represents the voters. Most miners are mining in relation to the market value. Some miners don't. Coingeek, for example, doesn't mine the North Korean fork, while Slush is mining it exclusively.
 
Last edited:
  • Like
Reactions: AdrianX and Norway

freetrader

Moderator
Staff member
Dec 16, 2015
2,806
6,088
Why should BU members make requests on SV?
Why shouldn't they?
BU members can contribute to other projects; SV is an open source project, and so is ABC.

Besides, I told you already why I think this is an important test for me.
If I think something is that important, I would not stop at a rejected BU vote.
Implementing on another client would be a distinct option, especially a client whose stewards have already said they're also in favor of abolishing the limit entirely.

I suggested SV because I know that a lot of those in favor of this BUIP were vocal in their support for it.
It would make more sense to request removing the default now
Maybe it would. I've outlined a few other options available, since client development is a market where options range from DIY through bounties to employing others.
Have either of these papers been published in a peer reviewed journal?
So apart from throwing the question back at me, is no one from Craig's circle here able to give a decisive answer to this simple question?

I've purposely limited myself to only the two papers that were introduced to this thread.

After my review, can any one of you point to *evidence* that you reviewed them (e.g. any critical comments about either paper that you provably shared with the world)?
And do you have something to reply about the logical error I pointed out on the first two pages?
 
Last edited:

lunar

Well-Known Member
Aug 28, 2015
1,001
4,290
No - neither can the informed reader at that point, because Turing completeness has a well defined meaning which applies to 'a system of data-manipulation rules (such as a computer's instruction set, a programming language, or a cellular automaton)'. The term is not usually applied to programs (sentences in a programming language).
I'm not going to pretend I understand all the complexities of this debate, but I do think some latitude on terms can be forgiven. Bitcoin is not a software development project; it's a new economic incentive system, more akin to the SEIR model in epidemiology. Each new action burns resources like a virus in a petri dish. These actions are either efficient and thus create the necessary incentives for reproduction (value is created and causes further use), or they are not and perish. Your example would perish (who would fund such a nonsensical program to run infinitely on the blockchain?), so does this mean your program is not infinite within the definition of the system? I don't know.

What I do know is that we're in new disciplines of science, economics and possibly even biology here. I'd hope that those who have the ability would give other authors their best attempt at understanding the intent of statements made. Otherwise it becomes a semantic battle and we make no progress.

I fully agree with your motivation: let's fix the blocksize bug once and for all and freeze that protocol. But imho BU did this in early 2016 by enabling the ecosystem to achieve an emergent consensus. If the blocksize becomes a burden, it will be fixable.
I think this is too slow. Imho Bitcoin Cash should be proving itself to the world, not waiting for the world to recognise its utility. I've been enjoying these stress tests, as they are great at highlighting weaknesses in the current swathe of services. Perhaps a better course of action would be daily, incrementally larger stress blocks at a 1 MB increase per day? 180 MB in six months seems like a realistic target for the biggest block ever produced.
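
The arithmetic behind that ballpark is easy to check (numbers are illustrative, not a concrete schedule):

Code:
# Back-of-the-envelope: a daily stress block that grows by 1 MB each day.
daily_increase_mb = 1
days = 182  # roughly six months
print(f"biggest block after ~6 months: ~{daily_increase_mb * days} MB")  # the ~180 MB ballpark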
[doublepost=1539518354][/doublepost]Turing did define his terms in the original paper though, right there in the opening line.

https://www.cs.virginia.edu/~robins/Turing_Paper_1936.pdf

"The "computable" numbers may be described briefly as the real numbers whose expressions as a decimal are calculable by finite means."

"Once it is granted that computable numbers are all "computable" several other propositions of the same character follow. "
 
Last edited:
  • Like
Reactions: AdrianX and Norway

freetrader

Moderator
Staff member
Dec 16, 2015
2,806
6,088
[...] some latitude on terms can be forgiven though
Maybe in a blog post, but not in a rigorous scientific paper that's trying to borrow from the established science and build upon it.

Actual false statements based on your definitions should not make it past peer review.

I recommend an incentive system, modeled on Knuth's or otherwise, to eradicate errors from these papers.

Turing did define his terms in the original paper though, right there in the opening line.
I believe you. Turing's work needs no further review at this point. NewLiberty asked me to look at Craig's paper.
 
Last edited:
  • Like
Reactions: NewLiberty

Inca

Moderator
Staff member
Aug 28, 2015
517
1,679
I remember vigorous discussions from years ago, when Blockstream released the white paper, about using pegged sidechains to migrate transactions, economic activity and users off bitcoin.

Liquid will likely fail IMO, but at least the cat is out of the bag now, with attempts to garner support for LBTC being openly discussed.

I expect a strong push to keep BTC solely as a store of value, whilst attacks on BCH will intensify - expect further dirty tricks and worse from the adversaries of economic freedom - blockchain doesn't scale, remember?
 
interesting thread, thanks @Norway.

First, JToomin says:
"Arguments of technical merit are the only ones I care about."
Which is a bit one-sided when it comes to Bitcoin; it's like discussing a political party just in terms of constitutional arguments; or a bestselling book in terms of grammar; or love just with arguments of biological merits ...

But in the next tweet:
"By the time we have 1 GB blocks, it will be too late to fork in CTOR."
Which is a completely non-technical, social hypothesis, one he should not care about. Note: the only argument for doing CTOR now is one that, in his own words, he does not care about.

Oh, two tweets later:
"The ABC devs were trolling BU on the 10 TB vote, btw."
Nice euphemism for vandalizing.

The rest, gigabyte blocks, selfish mining ... we are fully back to where we were with Bitcoin in 2015/16 ... discussing current needs in terms of fantasy demand and requiring a non-technical system to be fully controllable in technical terms ... good night :(

Edit: Nevertheless, his insights on orphans and p2pool are quite interesting.
 
Last edited:

Richy_T

Well-Known Member
Dec 27, 2015
1,085
2,741
They don't want to make that decision. It's a bet, being wrong could cost you a lot of money.
Then you do something like call Bitcoin Cash BCC, like people are telling you, but suddenly it all settles as BCH and you get people shouting at you for getting it wrong.

There was bound to be a bit of a mess at the fork but it seems that the "BCH community" want to remain in a state of permanent shit-show. BU and this forum seem to be an oasis of relative sanity and thoughtfulness and I wish freetrader's team had managed to pull off the fork instead of how it happened.
 

freetrader

Moderator
Staff member
Dec 16, 2015
2,806
6,088
Completely different topic, but I feel that it needs mentioning.

The minimum fees that need to be paid for transactions are an important Schelling point.
We've already seen how fragmentation here can make 0-confirmation transactions more risky, i.e. increase the risk of successful double spending.

I'm hoping that miners and developers of software clients will keep the economic aspect in mind here.

Increased complexity in fees is a deterrent to growth of the system. It requires increased complexity in a lot of software that merely wants to be able to send transactions.
We should really be aiming to keep things as simple as possible on the fee side to spur wide adoption.

Make it so that the minimum fees are always simply configurable by the node operators, so that in case of some drift in consensus, we can converge on a new Schelling point relatively quickly.
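
As a sketch of how simple that can remain (the option name, units and default below are illustrative, not any particular client's actual setting):

Code:
# Minimal sketch of an operator-configurable minimum relay fee check.
# "min_fee_sat_per_byte" is an illustrative knob, not a real config option.

def accept_for_relay(tx_size_bytes: int, tx_fee_sats: int,
                     min_fee_sat_per_byte: float = 1.0) -> bool:
    """Relay a transaction only if it pays at least the operator's configured fee rate."""
    return tx_fee_sats >= tx_size_bytes * min_fee_sat_per_byte

print(accept_for_relay(tx_size_bytes=250, tx_fee_sats=250))  # True at 1 sat/byte
print(accept_for_relay(tx_size_bytes=250, tx_fee_sats=100))  # False: below the floor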

---

Resist the urge to centrally plan new incentive or disincentive schemes.
Cheap, easy-to-understand transaction fees are the best tool we have in our fight against the competition AND for increasing use of the chain, and they can already be used to encourage UTXO consolidation.
I see more need for wallets that do this for their users in safe but automatic or guided ways than for new incentive schemes at the client layer.
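
A rough illustration of why cheap, flat fees already make wallet-driven consolidation practical (the size constants are ballpark P2PKH figures, assumed for the example):

Code:
# Ballpark cost of sweeping many small UTXOs into a single output at a low flat fee rate.
# Input/output sizes are typical P2PKH estimates, not exact transaction accounting.

INPUT_BYTES = 148
OUTPUT_BYTES = 34
OVERHEAD_BYTES = 10

def consolidation_fee_sats(num_inputs: int, fee_sat_per_byte: float = 1.0) -> int:
    size = OVERHEAD_BYTES + num_inputs * INPUT_BYTES + OUTPUT_BYTES
    return int(size * fee_sat_per_byte)

# Sweeping 100 dust outputs costs on the order of 15,000 satoshis at 1 sat/byte,
# cheap enough for a wallet to do automatically or with a single guided prompt.
print(consolidation_fee_sats(100))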
 
Last edited: