Gold collapsing. Bitcoin UP.

lunar

Well-Known Member
Aug 28, 2015
1,001
4,290
@Mengerian odd timing ... I just noticed that too. I suspect they've now switched to only mining Segwit with hashpower that has directly voted for Core. So about 25%, down from about 50-60% when including 'don't care'.

It would be good to see them combine the Unlimited and Classic blocks.


Don't know if anyone noticed, but a great new resource page has sprung up at http://nodecounter.com/bu_settings.php, giving some nice stats on BU settings.

also these


 
Last edited:

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
@Richy_T
We can almost sort-of do that now. You could create a pruned node and just share it as a torrent. There would be trust issues, of course, but there are ways to be pretty sure you're in good shape.
Right. Someone should do that (TM) :D

I still think commitments that are properly stamped and validated by all will be extremely useful for on-chain scaling.

Maybe even a third-party process that goes through the blocks outside bitcoind and calculates UTXO stamps would be a start? This would allow decoupling experiments with a good UTXO summarizing method from having to play with the messier parts of bitcoind.

If UTXOs are really a concern, let's find a way to actively encourage consolidation. Perhaps many-TXs-to-one-TX consolidations could get a direct discount, or even be free (up to a small fraction of the block). Miners could choose not to include these, of course.
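A consolidation discount like that could be a very simple miner-side fee policy. Here is a minimal sketch; the function name, the 50% discount, and the free-space cap are all made up for illustration, not an existing rule in any client:

```python
def consolidation_feerate(n_inputs, n_outputs, base_feerate,
                          discount=0.5, free_budget_left=0):
    """Hypothetical fee policy sketch: transactions that consolidate
    many inputs into fewer outputs (a net UTXO-set reduction) pay a
    discounted feerate, or nothing while a small free-space budget
    for the block lasts. All parameters here are illustrative."""
    if n_inputs > n_outputs:          # transaction shrinks the UTXO set
        if free_budget_left > 0:      # miner still has free space to give
            return 0.0
        return base_feerate * discount
    return base_feerate               # everything else pays full rate
```

For example, a 10-input, 1-output sweep would pay half the base rate (or nothing while the free budget lasts), while an ordinary 1-in, 2-out payment pays full rate.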

The point being, if you want to discourage UTXO growth, do it directly, not with some hand-waving BS claims for a technology you're pushing for different reasons. If I catch you crapping in my yard, don't tell me my roses will come up lovely.
Absolutely excellent point. In a way, I am still assuming some goodwill in Greg and am thus open to his tactics. Along these lines, the requirements that led to SegWit have never been properly formulated.
 

solex

Moderator
Staff member
Aug 22, 2015
1,558
4,693
Our accidental 1.000023MB block was unfortunate and we will endeavour to do better in future. We offer apologies to bitcoin.com for their loss of revenue on the orphaned block. They are blazing the trail for BU and getting some knocks for their efforts.
Special thanks to @Mengerian and @Peter R with input from many people for the BU Incident Report
 
Last edited:

freetrader

Moderator
Staff member
Dec 16, 2015
2,806
6,088
Yonatan Zunger has been writing a couple of great blog articles recently.

I really enjoyed this one, with the provocative title "Tolerance is not a moral precept":

https://extranewsfeed.com/tolerance-is-not-a-moral-precept-1af7007d6376

It made me reflect on the situation in our Bitcoin community, and on how we can better think about and deal with others who may hold different opinions and act on them against us.
 

xhiggy

Active Member
Mar 29, 2016
124
277
Are there any statistical simulations exploring BU's EB/AD dynamic?
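Something like this toy Monte Carlo might be a starting point. It models only the bare dynamic: a fraction q of hashpower accepts an excessive block immediately, while AD holdouts switch once the excessive branch has AD blocks on top of it; the excessive branch dies if the compliant branch pulls ahead first. All parameters and the acceptance rule are simplifying assumptions, not BU's actual implementation:

```python
import random

def simulate(q=0.6, ad=4, trials=10000, seed=1):
    """Toy EB/AD model: returns the fraction of trials in which an
    excessive block survives. Each step, a block is found on the
    excessive branch with probability q (the hashpower that accepts
    it), otherwise on the compliant branch. Holdouts accept once the
    excessive branch has `ad` blocks on top of the excessive block;
    big-EB miners reorg away if the compliant branch gets ahead."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        e, c = 1, 0                # branch lengths since the fork
        while True:
            if e - 1 >= ad:        # acceptance depth reached: all switch
                wins += 1
                break
            if c > e:              # compliant branch ahead: reorg, orphaned
                break
            if rng.random() < q:
                e += 1
            else:
                c += 1
    return wins / trials
```

Running it shows the intuitive result: with a solid hashpower majority mining the excessive block, survival probability is high even at AD=4, and it drops off sharply as that fraction falls below one half.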
 
Last edited:

lunar

Well-Known Member
Aug 28, 2015
1,001
4,290
I like the penguin analogy because it perfectly illustrates evolutionary emergent consensus.

How did they get there, you ask? They got there through millions of years of trial and error and an almost Sun Tzu-like understanding of war. In this case war is survival. They have collectively realised that by choosing the battlefield and knowing your enemy, they can dictate the terms and thus win the war.

They have moved the battlefield to a location where they only have to beat one enemy. The Cold.

Doing this has a huge evolutionary advantage. It means there are absolutely no predators to eat their eggs, and when the chicks are born they have the shortest possible distance to commute to the teeming, nutrient-rich waters around the coast of Antarctica when spring arrives, giving them a big head start. Everyone else has to migrate, which is not so easy with babies in tow.
 

Richy_T

Well-Known Member
Dec 27, 2015
1,085
2,741
@Richy_T
What do you think about my post(s) that you quoted? Am I far out in la-la-land, or do you think the estimates and math about on chain scaling is reasonable?
Yes. I think we're around two orders of magnitude from where the limit could be for "reasonable" levels of technology which the vast majority of existing Bitcoin users could manage today. There is a sliding scale, of course, but there is also a sliding scale of blocksize vs adoption. These counteract each other somewhat and you don't want to be too far one way or the other, but we without a doubt currently have the slider slid all the way over to "too small".

The only valid concern, in my opinion, is the n^2 issue, and that could be band-aided by restricting n and by parallel validation until there is a proper hard-forked solution.

I also think, from what I have read, that there are some steps people with poor connections could take which would help with their bandwidth. Lowering maxconnections is one; running blocksonly is another (though it's somewhat alleviated by Xthin and friends), and there are likely other protocol optimizations that could be made. There is little reason for the bandwidth used to be much more than the sum of the transaction sizes plus some minor overhead for a node that has to work under restrictions. I would like to know what the overhead on the current network is.
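For reference, both of those knobs are ordinary bitcoin.conf settings; the values below are just examples, not recommendations:

```
# bitcoin.conf -- example values only
# fewer peers means less relay overhead:
maxconnections=8
# skip loose transaction relay entirely:
blocksonly=1
```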

Just for good measure: I run the bitcoind that I use for ChartBuddy on my mail/webserver box, which is outdated hardware, and it runs just fine. I tried it on a Raspberry Pi and it struggled. So that is the bottom of the barrel we are looking at there. You could pick up a used PC for under $100 that would be fully capable of keeping up, and some of the Pi-class machines that are coming out are much more capable too.

Mesh networking will definitely be a way forward *if* we can get enough adoption. Why download 100+GB through your ISP (that's a substantial proportion of my cap) when your neighbor will provide it for free/cheap and it won't mess up your online gaming?
Absolutely excellent point. In a way, I am still assuming some goodwill in Greg and am thus open to his tactics. Along these lines, the requirements that led to SegWit have never been properly formulated.
I stopped assuming goodwill from Greg a long time ago. Even if internally, he's actually trying to do the right thing, by the time it's filtered through his personality, it's not.
 

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
Speaking of UTXOs: is there any reason the already-implemented 'hash_serialized' from gettxoutsetinfo cannot be used for (a cheap variant of delayed) UTXO commitments until a better scheme can be implemented?

Since Core 0.13.0, it is deemed to be platform-independent. Any reason not to use it?
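The property that matters is determinism: every node holding the same set must compute the same digest. A toy stand-in for the idea (the real hash_serialized field hashes Core's own coin serialization, whereas this sketch just hashes a canonically sorted toy representation):

```python
import hashlib

def utxo_set_hash(utxos):
    """Toy illustration of a deterministic UTXO-set hash, in the
    spirit of gettxoutsetinfo's hash_serialized. `utxos` maps
    (txid_hex, vout) -> (value_in_sats, script_hex). Sorting the
    entries gives a canonical order, so any node with the same set
    computes the same digest regardless of insertion order."""
    h = hashlib.sha256()
    for (txid, vout), (value, script) in sorted(utxos.items()):
        h.update(bytes.fromhex(txid))
        h.update(vout.to_bytes(4, 'little'))
        h.update(value.to_bytes(8, 'little'))
        h.update(bytes.fromhex(script))
    return h.hexdigest()
```

Two nodes that built the same set in different orders get identical digests, and any single-satoshi difference changes the digest, which is exactly what makes such a hash usable as a cheap commitment.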

EDIT: Also, it takes less than one minute on my six-year-old laptop to calculate it for the tip.

EDIT2: Is there a reason not to allow queries for slightly older coins than the tip? Due to possible reorgs, those should all still be available, shouldn't they?

If we wait 2016 blocks (about two weeks) before calculating the UTXO set for a given height, that leaves roughly one week of safety margin. At 1.6 GiB per 60 s, a week of compute time on my old laptop would cover a UTXO set of (1.6 GiB / 60 s) * ~1 week ≈ 16 TiB. And that is calculated single-threaded and without further optimization.

(Of course, this excludes RAM and SSD access issues, which are also covered in orphan cost. And this is literally the laziest approach there can be: no summing of just the deltas, but hashing the full UTXO set every time. But you get the idea ...)

EDIT3: Oh, and at 55 bytes/UTXO, 16 TB of UTXO data amounts to some 300 billion UTXOs. Remember that I thought 50 per person would be enough? Well, others, small-blockers even, want a couple hundred. In any case, 300 billion UTXOs already amounts to about 40 per person!
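Back-of-envelope, the numbers above check out (taking GiB/TiB as 2^30/2^40 bytes, and a world population of ~7.5 billion):

```python
# One week of single-threaded hashing at 1.6 GiB per 60 seconds:
rate = 1.6 * 2**30 / 60        # bytes of UTXO set hashed per second
week = 7 * 24 * 3600           # one week of safety margin, in seconds
max_set = rate * week          # largest set one week can cover
print(max_set / 2**40)         # 15.75 -> roughly 16 TiB

# 16 TB (taken here as 16e12 bytes) at ~55 bytes per UTXO:
utxos = 16e12 / 55
print(utxos / 1e9)             # ~291, i.e. some 300 billion UTXOs
print(utxos / 7.5e9)           # ~39 UTXOs per person
```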

Also note that 10 TB consumer HDDs have been shipping for a while now.
 
Last edited:

lunar

Well-Known Member
Aug 28, 2015
1,001
4,290
Interesting writeup from magma.
https://forum.bitcoin.com/mining/bitcoin-com-s-excessive-block-pool-analysis-t16844.html

Fascinating that many pools started mining on the "invalid" block


  • 06:58:48 Bitcoin.com mines an excessive block at height 450529
  • 06:58:49 ViaBTC starts mining a block at height 450530
  • 06:58:49 BTC.TOP starts mining a block at height 450530
  • 06:58:49 BTC.com starts mining a block at height 450530
  • 06:58:49 HaoBTC starts mining a block at height 450530
  • 06:58:49 BTCC starts mining a block at height 450530
  • 06:58:49 F2pool starts mining a block at height 450530
  • 06:58:50 BTCC starts mining a block at height 450529
  • 06:59:03 ViaBTC starts mining a block at height 450529
  • 06:59:20 BTC.TOP starts mining a block at height 450529
  • 06:59:30 F2pool starts mining a block at height 450529
  • 06:59:50 HaoBTC starts mining a block at height 450529
  • 07:20:19 Bitclub Network finds a 998.2kb block at height 450529
I assume this means they are all running software compatible with an increased blocksize? Or is this just an artefact of headers-first mining?