Gold collapsing. Bitcoin UP.

majamalu

Active Member
Aug 28, 2015
144
775
In times like this, I congratulate myself for not having sold my gold. You can say whatever you want about this barbarous relic, but you can be absolutely sure that its properties will not change due to human intervention, and that its network effect will not be risked in some trivial controversy.
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,998
In times like this, I congratulate myself for not having sold my gold. You can say whatever you want about this barbarous relic, but you can be absolutely sure that its properties will not change due to human intervention, and that its network effect will not be risked in some trivial controversy.
while the BCH dev teams are more than willing to part ways, i seriously doubt the BCH miners will do so.
 

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
The hardware in the Gigablock Testnet was not very expensive AFAIK. And it gets cheaper and faster every year.
We had high-end equipment as well as mid-grade consumer stuff: 40 Gbit connections, 20 cores, 64 GB RAM, etc. In fact, the Gigablock Testnet was probably closer to a "complete graph" than the real network.

What the tests showed was that we were not limited by the hardware. Nor were we limited by the speed of the internet connections, nor by the protocol.

We were limited by the current implementation of the protocol, namely inefficiencies inherited from the original Satoshi code base.

It doesn't matter how fast your internet is or how many CPU cores your server has, if the bottleneck is a block validation algorithm that does everything in a single thread.
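To illustrate, here is a minimal sketch in C++ of the serial-versus-parallel difference. This is not BU's actual validation code; `Tx` and `CheckTx` are hypothetical stand-ins for the real data structures and script/signature checks.

```cpp
// Minimal sketch, not BU's actual code. CheckTx() is a hypothetical
// stand-in for CPU-bound script/signature verification.
#include <algorithm>
#include <future>
#include <vector>

struct Tx { /* inputs, outputs, scripts ... */ };

bool CheckTx(const Tx&) { return true; }  // placeholder for real checks

// Single-threaded: validation time grows linearly with block size,
// no matter how many cores the machine has.
bool ValidateBlockSerial(const std::vector<Tx>& txs) {
    for (const auto& tx : txs)
        if (!CheckTx(tx)) return false;
    return true;
}

// Parallel: split the transactions across worker threads. (Real code
// must also respect intra-block dependencies -- a tx can spend an
// output created earlier in the same block -- which this sketch ignores.)
bool ValidateBlockParallel(const std::vector<Tx>& txs, size_t nThreads) {
    if (txs.empty() || nThreads == 0) return true;
    const size_t chunk = (txs.size() + nThreads - 1) / nThreads;
    std::vector<std::future<bool>> parts;
    for (size_t i = 0; i < txs.size(); i += chunk) {
        const size_t end = std::min(i + chunk, txs.size());
        parts.push_back(std::async(std::launch::async, [&txs, i, end] {
            for (size_t j = i; j < end; ++j)
                if (!CheckTx(txs[j])) return false;
            return true;
        }));
    }
    bool ok = true;
    for (auto& p : parts) ok = p.get() && ok;  // drain every future
    return ok;
}
```

With the serial loop, a 10x bigger block takes 10x longer no matter the hardware; the parallel version lets wall-clock time fall with core count, which is the point of the parallelized validation work mentioned below.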

I am confident we will achieve massive blocks. We've already made significant progress! When BU began, the network would have trouble with sustained 4 MB blocks. Now we're probably good up to sustained 32 MB blocks, and with the work @theZerg already did last October/November, we have a well-defined path to 150 MB blocks. Then with improvements to the network layer and parallelized block validation, I think we'll be at Visa level and GB blocks. And I think this will also be doable on mid-range consumer hardware. Specialized hardware will take us several orders of magnitude beyond. And that's with technology that exists today! With tomorrow's technology, we might look back at TB blocks as small.
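Back-of-envelope on "Visa level" (assuming, as rough ballpark figures, a Visa average of ~2,000 TPS and ~250 bytes per transaction):

2,000 tx/s x 250 B/tx x 600 s/block = ~300 MB per block

So sustained Visa-level throughput lands between today's 32 MB and the GB blocks mentioned above.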

I think sometimes when someone technical points out a bottleneck to scaling, for example "we can't get more than 100 TPS into mempool due to lack of parallelization in BU," other people hear "we can't scale past 100 TPS" and feel like the person pointing this out is trying to handicap Bitcoin. But at least when I say it, it's because I'm trying to understand the problems so that I can help fix them.

256 MB until 2024 (512 MB after 2024) is crazy tiny. At that pace, bitcoin has probably failed. This is just setting us up for the same failure as Core.
It very well could be too small. What do you suggest as a method the miners could use to keep the block size limit above demand, without requiring bickering and politics to increase it when it needs increasing?
 

Norway

Well-Known Member
Sep 29, 2015
2,424
6,410
It very well could be too small. What do you suggest as a method the miners could use to keep the block size limit above demand, without requiring bickering and politics to increase it when it needs increasing?
I'm getting more and more convinced that the best thing is to remove it. But an algorithm that prevents extreme spikes would also work.
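For concreteness, here is one shape such a spike-preventing algorithm could take (a hypothetical sketch, not any team's actual proposal): cap each block at a multiple of the median of recent block sizes, with a hard floor, so the limit grows with organic demand but a single outlier can't yank it around.

```cpp
// Hypothetical sketch of a spike-damping block size limit -- not any
// team's actual proposal. The next block may be at most k times the
// median size of the last N blocks, and never below a hard floor.
#include <algorithm>
#include <cstdint>
#include <vector>

uint64_t NextBlockSizeLimit(std::vector<uint64_t> recentSizes, // last N block sizes (bytes)
                            uint64_t floorBytes,               // e.g. 32000000
                            uint64_t k) {                      // e.g. 10
    if (recentSizes.empty()) return floorBytes;
    auto mid = recentSizes.begin() + recentSizes.size() / 2;
    std::nth_element(recentSizes.begin(), mid, recentSizes.end());
    return std::max(floorBytes, k * *mid);
}
```

With k = 10 and a 32 MB floor (both arbitrary here), the limit stays far above honest demand, yet a lone "poison block" of absurd size would simply be invalid.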

I think a set path like doubling every 18 months is very bad, and could get us back into the 1 MB problems.

I also think voting/signaling is bad if 49% of the miners can hold the 51% back. That works against competition and progress.

Nobody will invest in and build specialized node ASICs if they can just relax and harvest high transaction fees with full blocks.

EDIT: To elaborate more on the "no cap" scenarios:

If a minority miner (whether evil or motivated by tx fees) chooses to mine a block so large that only 49% or less of the network can handle it, it will be auto-orphaned. The other miners have no choice but to orphan it.

If an evil majority miner does it to destroy bitcoin, there are other, easier ways to do that with a 51% attack.

If a majority of miners (51%) are nice but want the fees and mine a block that the other 49% can't handle, it's really just bitcoin growing, and the minority miners lose money until they solve their problems or go bankrupt.

I don't see how huge blocks are "dangerous" in any way.

EDIT 2:

If there are no risks/dangers for individual miners regarding block space, there will be no motivation for technical progress in increasing the block space.
 
Last edited:

lunar

Well-Known Member
Aug 28, 2015
1,001
4,290
Warning, conjecture ahead.

So, some interesting speculation that might explain the recent power struggle and brinkmanship w.r.t. the talk of a contentious hard fork on BCH.

First this tweet.


Bitmain has possibly been making promises it can't keep, and is maybe financially overextended? The IPO had my spidey senses tingling. (Why would such a wealthy and, allegedly, extremely profitable company be doing an IPO?) Frequently these are done either to raise capital or to quietly unload risk in hard times. Are they under a lot of investor pressure, or struggling with cash flow?

Then


Is the ABC development path being partially dictated by Bitmain and its shareholders in an attempt to shoehorn changes into the protocol for short-term IPO and Wormhole profits? This raises the question: have the other prominent people (several BU members included) who have been calling for 1-minute blocks and other controversial changes also been duped by, or been complicit with, this line of thinking?

Overall I'm of the belief that this power struggle is a hugely positive indicator. Miners are now fully awake and making moves to do what they are supposed to do: vote with their hash power. Long gone are the days of quiet, subservient and seemingly disinterested mining groups watching a slow-motion train wreck happen while doing nothing about it. These are the days of alert and ruthless competitive capitalism. Bitcoin is working as intended.

Will we see a hashpower war come November? Possibly, but more likely this aggressive signalling will be enough to persuade one side to swerve before a head-on collision. I'm firmly in the nChain camp on this one, and will vote accordingly. I would very much like to see the protocol uncapped and locked down w.r.t. the economics at v0.1, and given the respect and chance it deserves to fulfil Satoshi's Vision.

I'd like to take the opportunity to praise Bitcoin Unlimited's Strategy for the November 2018 Hard Fork: give miners the options and freedom to choose their path by not forcing any choices at the client/dev level. Very well played.
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,998
@lunar

>I'm firmly in the nChain camp on this one, and will vote accordingly. I would very much like to see the protocol uncapped and locked down w.r.t. the economics at v0.1, and given the respect and chance it deserves to fulfil Satoshi's Vision.

this is my feeling as well, altho i won't say i'm in "nChain's camp". but, in fact, i've decided i'm going to take an entirely different tack by supporting Bitcoin SV for now (being in Coingeek's camp), a miner-generated implementation. i'm honestly tired of all the bickering between the status quo implementation devs, who don't have nearly as much skin in the game as these large miners and who have a huge penchant for complexity. i believe i've been able to foresee many of these problems with my pessimistic memes of "devs gotta dev" & "the geeks fail to understand that which Satoshi hath created", which go back years to the beginning of this thread. so i'm at least consistent, even if you disagree with me.

i give high marks to Coingeek, who solely mines BCH to the exclusion of BTC, as well as to Ayre, who has put up $millions of his own money to take this huge risk. he is a natural-born rebel/libertarian who hides offshore and whose philosophical priority of unlimiting the blocksize aligns with my highly consistent, years-long push to uncap tx throughput onchain and to see Satoshi's original vision (unlimited blocksizes) through to its end, whether by success or failure.

this is not to denigrate BU, whose devs i'm sure i would like personally if i ever got around to meeting them (something i have consciously resisted), and whom i trust to be honestly pushing what they feel is right for the protocol to the exclusion of proprietary interests. unfortunately, the push for complexity, the economic goals i fail to understand, and the unfortunate 3 node crashes from xthin bugs have forced me to reconsider my support. this pains me, as i helped conceive BU, heavily promoted it, and contributed monetarily to establishing its original 5-6 worldwide testing nodes. i must say tho that i could change my mind between now and November, as things continue to be fluid in this space. good luck to all.
as far as nChain's involvement with SV goes, i don't see it as a problem for now. the proposed upgrade is limited to old Satoshi op_codes, 128 MB blocks, and the lifting of a couple of other basic protocol parameters. nothing specifically proprietary from nChain, afaict. sure, i bet CSW will push Ayre to support certain patented tech in the future that could be inserted into the protocol, so we'll have to see. but i'm confident Ayre won't do anything highly detrimental to the overarching BCH philosophy just to placate CSW. as it is, it's controversial what detrimental effect CSW's patents would even have on BCH if he's specifically allowing free and open use to those wanting to employ them on BCH, while at the same time enforcing his rights to keep potentially valuable tech out of the hands of non-BCH implementations.

if you'll notice, i've never once made a comment or taken a position on CSW's personality afflictions or claims. the only mention i've made of him, in a roundabout way, was in regard to the patents and his apparent plagiarism, both of which i'm opposed to in general as concepts. thus, no one can claim i'm a CSW fanboy.
 
Last edited:

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,998
looks to me like even @Jihan doesn't want any part of this ~12127 BCH wall:
 

_bc

Member
Mar 17, 2017
33
130
That's a very interesting post.

"Everything in Bitcoin that falls under the purview of cutthroat market competition works, and everything that doesn't, doesn't."
...
"The error here is this is seen as a reason not to lift the cap. 'We cannot raise the cap or miners would be forced to do work!' This is stated un-ironically, with no awareness that some miners being left behind and some miners making it is exactly how Bitcoin always had to work."
...
"And if your response is, 'But that means some miners might get orphaned unexpectedly and cry foul,' then once again I say, that's a good thing."

So, taking this only half a step further: if we stop coddling the miners (by worrying about orphan rates for them), if able miners stop over-coddling the weaker miners, and if we show the world that the press has it wrong (Bitcoin can scale, and at a rapid clip), then we can incite/inspire the greed of some aspiring FANG-like company, and they'll adopt BCH with haste (to front-run their competitors).

Fortune favors the bold.
 

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
For the people saying "we should just remove the block size limit," who is "we"?

If "we" is BU, how can we do this? If we make it impossible for miners to set a block size limit, they probably won't run our software. So instead we give them a tool to set their block size limit to whatever they want. And by default, we set the limit to whatever the majority of miners is currently enforcing. We even run experiments to measure at what block sizes our software starts to fall apart, to provide empirical evidence of the max sustained load a network of BU nodes could support.
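Concretely, that tool is a pair of operator settings in BU's Emergent Consensus. The option names are BU's; the values below are purely illustrative, not recommendations:

```
# bitcoin.conf -- illustrative values, not recommendations
excessiveblocksize=32000000   # EB: largest block (in bytes) this node accepts outright
excessiveacceptdepth=12       # AD: accept a larger block anyway once buried this deep
```

A miner who wants a bigger limit just raises EB; AD means even a too-big block is eventually followed if the majority hashpower keeps building on it.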

What else can BU do?
 

Mengerian

Moderator
Staff member
Aug 29, 2015
536
2,597
@Peter R You hit the nail on the head.

It's the same with ABC. Node operators can easily change the block size limit setting if they want to. The developers just set a default at what they think works well before running into software problems.

So, it seems to me, the thing to focus effort on is removing the technical bottlenecks. The fixes that are needed are fairly straightforward (and the Gigablock project has done a good job of identifying them).

So I don't really know what the "debate" is about, since everyone agrees the limit should be raised. But miners will only run with a higher limit when they are confident the software they are running can handle it.

 

Tom Zander

Active Member
Jun 2, 2016
208
455
On the topic of nchain and the ABC hard forks.

I want to point out that the two hard fork proposals are very similar to each other.

Both ABC and nChain are trying to hard fork.

Both of them fail to give a technical rationale for why.

Both of them are completely unresponsive to feedback and compromise requests from the rest of the ecosystem.

Don't make the mistake of thinking that people have to choose between ABC and nChain. There is a 3rd option, which is to NOT hard fork and to continue on the path of growth and enabling more users.

The 3rd option is to reject reckless developers.
 

bitsko

Active Member
Aug 31, 2015
730
1,532
There is a fourth option: Cobra Client

hahahahaha

I stand before the obelisk as an ape, thinking it's probably more so that the tool itself doesn't need improvement as much as man's understanding of how to operate it properly.

There aren't yet enough interested polymaths to break it down; perhaps their presence on earth is too infrequent for the timeframes. The problem has got to be in the operators' understanding of, or desire to have, their necessary role.

Things are undoubtedly hard enough as they are; physics and comfort prohibit an individual from lifting themselves up by their bootstraps. But get the herd all running in one direction while they can see, in the back of their vision, the weak getting culled, and perhaps they will take greater care to run and to scan the horizon for the best place to run to.
 
Last edited:

Norway

Well-Known Member
Sep 29, 2015
2,424
6,410
Default values: Are they more important than we think?

And by default, we set the limit to whatever the majority of miners is currently enforcing.
I believe @Peter R's proposal for the default value just hardens the blocksize cap. The miners feel too safe with the status quo. It's just how most humans think. And judging by the past, the miners look to devs to get the "right" number.

The developers just set a default at what they think works well before running into software problems.
@Mengerian's proposal for the default value is different. This proposal is based on the assumption that the developers know what kind of hardware and network connections the miners and businesses have invested in, and it makes sure they don't have to upgrade.

Both proposals from @Peter R and @Mengerian are trying to create a safe space for miners where they don't have to step up their game to keep up with demand.

My proposal for the default value is 10 TB. The miners may adjust it if they don't like it.


Yes, let’s eliminate the limit. Nothing bad will happen if we do. And if I’m wrong, the bad things would be mild annoyances, not existential risks, much less risky than operating a network near 100% capacity.

 
Last edited:

lunar

Well-Known Member
Aug 28, 2015
1,001
4,290
@AdrianX i'm not keeping lists ;)

Because this shouldn't turn into a "who said what" discussion. It's about the debate of ideas going back over the last 6+ months.

BU have discussed it here: https://www.bitcoinunlimited.info/cash-development-plan
and recently on twitter, TheRealBitcoin5 gave a couple of good examples.

I made a post in December, about the time of the initial discussion. Odd how these same things keep coming around.

Very few people can explain these subjects so clearly. Very well written sir. (y)

So-called "Poison Blocks" (what Greg Maxwell called the "big block attack") are the way Bitcoin was designed to scale and the ONLY way it ever can.
 

Tom Zander

Active Member
Jun 2, 2016
208
455
And judging by the past, the miners look to devs to get the "right" number.
The past doesn't really show that at all. That said, it is the sane thing to do to look to the devs for a good maximum, because software needs to be tested against a block size and known to work. Otherwise you end up just stating that your car can go 400 km/h while the manufacturer put a 250 km/h maximum on it. That's not useful; that is plain irresponsible.

Why the push to go to a huge max-blocksize THIS YEAR???