Gold collapsing. Bitcoin UP.

Richy_T

Well-Known Member
Dec 27, 2015
1,085
2,741
This post really is worth a second look:
How do we really know that decentralization is more important than hard money? Just because he says so?
"Decentralization" is meaningless. Bitcoin relies on sufficient decentralization. What is sufficient decentralization? Who knows? Let's just cripple the network anyhow.
Is P2Pool mostly a solo-miner/a few hashing whales, or composed of many independent parties?
P2Pool (mostly) pays directly to the miners so you can tell from looking at the coinbase payouts. It's a fairly good distribution.

I say mostly because there is someone experimenting with some kind of proxy implementation where miners don't get paid directly.
 

Justus Ranvier

Active Member
Aug 28, 2015
875
3,746
Someone explain WTF the MP is talking about (layman terms)
He wants to add a proof of storage to blocks in addition to proof of work.

The only way to produce the proof of storage is to have (access to) complete copies of all preceding blocks.

It's also a proof which can only be verified by entities with complete copies of all preceding blocks, unlike proof of work which can be verified using block headers alone.
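In code terms, a minimal sketch of the shape of such a scheme (hypothetical, not MP's actual construction): the challenge is derived from the new block's header and samples bytes from every preceding block, so neither producing nor verifying the proof is possible without the full chain.

```python
import hashlib

def storage_proof(header_hash: bytes, blocks: list[bytes]) -> bytes:
    """Hypothetical proof-of-storage sketch (not MP's actual construction).

    The challenge (the new block's header hash) selects a pseudo-random
    byte in every preceding block; the proof hashes all sampled bytes.
    The samples depend on the challenge, so they can't be precomputed
    and the blocks then discarded.
    """
    h = hashlib.sha256(header_hash)
    for i, block in enumerate(blocks):
        seed = hashlib.sha256(header_hash + i.to_bytes(4, "big")).digest()
        offset = int.from_bytes(seed, "big") % len(block)
        h.update(block[offset:offset + 1])
    return h.digest()

def verify_storage_proof(header_hash: bytes, blocks: list[bytes],
                         proof: bytes) -> bool:
    # Verification recomputes the proof, so the verifier also needs a
    # complete copy of all preceding blocks -- unlike proof of work,
    # which is checkable from the 80-byte headers alone.
    return storage_proof(header_hash, blocks) == proof
```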
 
  • Like
Reactions: bitsko and majamalu

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
@Justus Ranvier:

Why so? All you need to do is ask a node for specific transaction data in a block, plus the intermediate Merkle tree hashes, and see whether it matches up to the Merkle root in the block headers.

That should be easy to do - if you just ask for random blocks, it should even be doable with today's protocol.
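A minimal sketch of that check (standard Merkle-branch verification with Bitcoin's double-SHA256; `branch` and `index` are hypothetical parameter names for the sibling hashes from the transaction up to the root and the transaction's position in the block):

```python
import hashlib

def dsha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def verify_merkle_branch(tx_hash: bytes, branch: list[bytes],
                         index: int, merkle_root: bytes) -> bool:
    """Check a transaction against a header's Merkle root using only
    the log2(n) sibling hashes along its branch -- no full block needed."""
    h = tx_hash
    for sibling in branch:
        if index & 1:          # we are the right-hand child at this level
            h = dsha256(sibling + h)
        else:
            h = dsha256(h + sibling)
        index >>= 1
    return h == merkle_root
```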

However, it is also quite easy to fake: Simply proxy the query and ask another node for the data.

I cannot see how proof of storage solves anything.
 

yrral86

Active Member
Sep 4, 2015
148
271
"Decentralization" is meaningless. Bitcoin relies on sufficient decentralization. What is sufficient decentralization? Who knows? Let's just cripple the network anyhow.

P2Pool (mostly) pays directly to the miners so you can tell from looking at the coinbase payouts. It's a fairly good distribution.

I say mostly because there is someone experimenting with some kind of proxy implementation where miners don't get paid directly.
I don't know if anyone is actually running a p2pool proxy-backed pool, but I did run one as an experiment at one point. I was getting to the point where I was too small to get regular payouts from p2pool, and that helped a bit, but I didn't take the time to develop it into a real pool.
 

sickpig

Active Member
Aug 28, 2015
926
2,541
He wants to add a proof of storage to blocks in addition to proof of work.

The only way to produce the proof of storage is to have (access to) complete copies of all preceding blocks.

It's also a proof which can only be verified by entities with complete copies of all preceding blocks, unlike proof of work which can be verified using block headers alone.
Hence, as of Bitcoin 0.11.x, a pruned full node can't produce a valid proof of storage.

That said, I find this move of MP's quite surprising, and I'm also a little bit worried by the fact that I agree with a not-insignificant part of the article, namely: increasing the block size and paying (non-pruned) full nodes for their services.
 
  • Like
Reactions: majamalu

JVWVU

Active Member
Oct 10, 2015
142
177
So he is saying he is open to 2MB as long as the SegWit SF is not the option, and the increase is a true 2MB increase?
 

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
@Justus Ranvier: The fact that BIP21 has been changed before? (And, as an extensible URI scheme, is expected to change again in the future.) The words 'snuck in', even though you clearly said you were going to update BIP21?

All in all, it looks like their/his effort at further digging trenches in Bitcoinland is ongoing.
@sickpig: Agreeing with MP is worrying, I agree :)
 

Justus Ranvier

Active Member
Aug 28, 2015
875
3,746
namely: increasing the block size and paying (non-pruned) full nodes for their services.
Why do we need to store the entire blockchain in the first place?

Because a complete copy of the blockchain is necessary in order to verify the validity of new blocks. This is the only way to check for certain types of fraud.

The cost of storing a complete copy of the blockchain grows linearly with the number of transactions performed since the genesis block.
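A rough back-of-envelope of that growth, assuming full 1 MB blocks every ten minutes:

```python
# Back-of-envelope chain growth, assuming full 1 MB blocks every 10 minutes.
block_size_mb = 1.0
blocks_per_day = 24 * 60 / 10                 # 144 blocks/day
growth_gb_per_year = block_size_mb * blocks_per_day * 365 / 1024
print(f"~{growth_gb_per_year:.0f} GB/year")   # ~51 GB/year
```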

It's clear why that worries some people, but proof of storage is the opposite of a solution.

The root cause of the problem is that the proof system in Bitcoin has this requirement for history to be stored forever. That's a great thing to try to fix, not to permanently enshrine even more firmly into the protocol.

A perfect proof system would have the property of giving us complete assurance of the integrity of the money supply, while allowing most of the prior transactions to be forgotten, even for nodes that never saw the forgotten transactions.

AFAIK, this isn't possible yet, but it might be in theory.
@Justus Ranvier: The fact that BIP21 has been changed before? (And as an extensible URI scheme, is expected to do so in the future) The words 'snuck in' even though you clearly said you're going to update BIP21?
And that I updated it in response to a request from someone else.

What's hilarious is that yesterday on Reddit I made a comment about how Core developers and supporters are known to be vindictive, and gave an example.

Now I have another example.
 
He wants to add a proof of storage to blocks in addition to proof of work.

The only way to produce the proof of storage is to have (access to) complete copies of all preceding blocks.

It's also a proof which can only be verified by entities with complete copies of all preceding blocks, unlike proof of work which can be verified using block headers alone.
interesting.

Am I wrong, or did mp offer a compromise?

I will however say that details [viii] are negotiable, on one hand, and that I am open to considering other changes be bundled with this change, possibly including an increase of the blocksize.
Has it become easier to reason with mp than with core?
 

Justus Ranvier

Active Member
Aug 28, 2015
875
3,746
wtf did he really revert BIP 70 implementation?

Wallet functions aren't my preferred area of study, but I can't parse what he's saying.
No, he reverted the change to BIP-21 which was introduced by BIP-47, without reverting the more than half a dozen other changes which had been made to that BIP since it was submitted.

If you look at the rest of his activity today, he certainly is busy doing maintenance and cleanup of the BIP repository. Very productive day today.

Hmm, that reminds me of something. When the news broke about how Luke and Maxwell had broken thin blocks, what was one of the first defences anyone brought up?


Why isn't the simpler explanation that Greg didn't pay attention to the XT repo (why would he?), Core merged work they thought was needed, then a few weeks later Tom Harding cherry-picks stuff he doesn't understand for off-label usage and breaks things?
It was just routine maintenance. How were they supposed to know it would break something else?

Maxwell really did spin a good tale, a very plausible one.

It really is just an unrelated coincidence that the very next day Luke-jr performs some other routine maintenance and just happens to revert a single change for indecipherable reasons which just happens to look like punishment for speaking out against his employer's (technically I suppose it's his client's) interests?

Small world, right?

If I didn't believe this was purely coincidental, I'd use it as an example of how they repeatedly demonstrate that they should never be allowed anywhere near project leadership or control, because they're willing to trash what should be a useful resource (the BIP repository) over petty personal grudges.
 

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
@Justus Ranvier: You know that I like UTXO commitments and keeping/validating a reasonable amount of history (1 year, maybe 5 or 10) to help with/solve that. Maybe the cost of storage will stay so low that this is all unnecessary; we'll see. If I had more time and money, this is one of the first things I'd work on and test. Not because it is an urgent problem, but because I personally see it as one of the two persisting scalability roadblocks to Bitcoin. The other is getting schemes working to pay full nodes for their service.


I am not at all convinced by the bandwidth scare. There is already FTTH, for fuck's sake. That is clearly enough, already today, for all the people on the planet to bounce a transaction or two per day around and still have all of them delivered to your home.
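A quick sanity check of that claim, with assumed round numbers (7 billion people, two transactions a day, ~300 bytes per transaction):

```python
# Sanity check: can a single FTTH link carry everyone's transactions?
# All numbers below are assumed round figures, not measurements.
people = 7e9
tx_per_person_per_day = 2
tx_size_bytes = 300                      # roughly a typical transaction size

bits_per_second = people * tx_per_person_per_day * tx_size_bytes * 8 / 86400
print(f"~{bits_per_second / 1e6:.0f} Mbit/s")  # ~389 Mbit/s -- under 1 Gbit/s FTTH
```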

And for the rest and the microstuff, we might have centralized providers (many of them, so rather hub-and-spoke decentralized) or maybe LN.
 
  • Like
Reactions: majamalu

Justus Ranvier

Active Member
Aug 28, 2015
875
3,746
@awemany UTXO commitments are scary because if somebody wants to create a bunch of counterfeit bitcoins for themselves, trying to break the UTXO commitment scheme to add their fraudulent coins is the obvious attack vector, and it's hard to construct countermeasures that don't put you right back to needing the entire chain again.

If every full node parses the entire blockchain, a majority of miners can't print unlimited amounts of Bitcoin for themselves.

If nodes start relying on committed UTXO sets instead of parsing the entire blockchain, then a majority of miners can sometimes print themselves unlimited Bitcoins, and it may or may not be efficiently provable.

tl;dr: It has to be very carefully implemented.
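For concreteness, a toy UTXO commitment, just a Merkle root over the sorted, serialized UTXO set (illustrative only; a real scheme also needs inclusion proofs, efficient incremental updates, and the countermeasures above):

```python
import hashlib

def dsha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def utxo_commitment(utxos: dict[tuple[bytes, int], bytes]) -> bytes:
    """Toy commitment: Merkle root over the sorted UTXO set, keyed by
    (txid, vout) and mapping to the serialized output (amount + script).
    Illustrative only -- a real scheme needs inclusion proofs, efficient
    incremental updates, and defences against committing fraudulent sets."""
    leaves = [dsha256(txid + vout.to_bytes(4, "little") + out)
              for (txid, vout), out in sorted(utxos.items())]
    if not leaves:
        return b"\x00" * 32
    while len(leaves) > 1:
        if len(leaves) % 2:                # odd count: duplicate the last leaf
            leaves.append(leaves[-1])
        leaves = [dsha256(leaves[i] + leaves[i + 1])
                  for i in range(0, len(leaves), 2)]
    return leaves[0]
```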
 

sickpig

Active Member
Aug 28, 2015
926
2,541
Greg is adept at psychological manipulation. He successfully demotivated people from implementing working improvements to scalability for years. And I am not talking about the hard limit, I am talking about the bandwidth reduction of more efficient block transmission. (Of course, he quite likely did that for the sake of his hard blocksize limit argument...)

Yet his tactics have become quite ridiculous and funny since people started to ignore him and actually implement these solutions:

As I said before: I have worked with characters like him. My level of disgust is pretty high.
Every time I read gmax's walls of text, especially when such walls contradict each other in a relatively small time span, I can't stop thinking of a Daniel J. Bernstein presentation I saw in October 2014. The title is quite telling:

"Making sure crypto stays insecure"


http://cr.yp.to/talks/2014.10.18/slides-djb-20141018-a4.pdf

In the presentation he tried to classify a few of the social engineering tools used to weaken crypto algorithm implementations. I find significant overlap between gmax's toolbox and the tricks described by djb. Maybe it's just me, and linking gmax's behaviour to such practices is far-fetched, but as I already said, it's something I can't stop thinking about :)
 
  • Like
Reactions: AdrianX