Gold collapsing. Bitcoin UP.

Justus Ranvier

Active Member
Aug 28, 2015
875
3,746
I always thought it should be possible to make something that looks like a completely normal web forum, with content that was automatically generated by chatbots to look innocuous to a human censor, while carrying steganographic payloads.

The censors would see a politically safe web site where users appear to be chatting about panda pictures or something similarly harmless, while patterns such as the order in which a client views threads, and what they post in those threads, would all convey encrypted data.
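The ordering channel described here can be made concrete: with n distinguishable threads there are n! possible visit orders, so a single browsing session can carry log2(n!) bits without posting anything unusual. A minimal sketch using the factorial number system (thread names are hypothetical, and this ignores the encryption layer entirely):

```python
import math

def int_to_permutation(value, items):
    """Hide an integer in an ordering of `items` via the factorial
    number system (Lehmer code). With n items, any value below n!
    fits in a single visit order."""
    items = list(items)
    assert 0 <= value < math.factorial(len(items))
    order = []
    for i in range(len(items), 0, -1):
        idx, value = divmod(value, math.factorial(i - 1))
        order.append(items.pop(idx))
    return order

def permutation_to_int(order, items):
    """Recover the hidden integer from the observed visit order."""
    items = list(items)
    value = 0
    for thread in order:
        idx = items.index(thread)
        value += idx * math.factorial(len(items) - 1)
        items.pop(idx)
    return value
```

A sender and observer who agree on the canonical thread list can round-trip any value below n! this way; post contents and timing would offer further independent channels.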
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,994
@cypherdoc

I suppose the reason most miners don't understand the game theory and economics is the same reason an investor who's never lost much money doesn't understand investing.
so much truth to this.

i have lost money investing and it is painful yet a much needed learning experience. i've invested in a variety of assets: stocks, bonds, real estate, gold, silver, my own biz, my education, professional equipment, IPOs, and other such speculations. experience is important. how many core devs ever took out a mortgage? i'll bet none. but it is extremely educational to deal with one of the most common funding schemes in our society: debt, and the largest type of debt one usually takes on in his/her life. it involves quite a bit of risk and responsibility and usually involves other individuals, like a wife and children, who depend on your decisions.

i look at a guy like Peter Todd during his presentation and see someone who has no clue yet who freely pontificates on the decision making of a variety of economic actors in businesses he's never participated in and in geographic locations of the world he's never been to. i mean, c'mon, how could he be so surprised to find out how slow his internet connection in China was?
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,994
oil back under $40. the sweet smell of deflation. this downstroke could get interesting:

 
  • Like
Reactions: majamalu

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,994
Joseph Poon - oh yeah, "100kB will confine Bitcoin to institutional use". right.

it would be the end of Bitcoin.
 

Justus Ranvier

Active Member
Aug 28, 2015
875
3,746
Good to see they are catching up to what I published in July.

http://diyhpl.us/wiki/transcripts/scalingbitcoin/hong-kong/segregated-witness-and-its-impact-on-scalability/

But there's an easier solution here, and I felt stupid when gmaxwell told me about this. Oh, you told gmaxwell? Hm. Well, you can instead make the witness contain the height of the block which produced the outputs, and the position within the block, and if it's wrong you can just show a proof that the transaction you claimed was there wasn't there.
http://bitcoin-development.narkive.com/p5wEUweK/fork-of-invalid-blocks-due-to-bip66-violations#post16
 

rocks

Active Member
Sep 24, 2015
586
2,284
@cypherdoc
@rocks
Not being a coder myself I can't speak to the Core devs' competence on coding, though I did see Gmax chide XT over its recent issue, saying supporters of bigger blocks seem unable to release working code. I would love to see a candid assessment of their abilities in general, though I get a sense other coders are reluctant to do that for whatever reason, because I hardly ever see it. I only recall Rusty Russell being mentioned as an exceptionally talented coder.
The issue relates to where participation in Bitcoin development is coming from and how that compares to other open source projects, rather than competency.

In almost every large-scale open source effort I've seen, development is largely user-led, and companies that rely on the project allocate developer resources to it to move it forward and ensure alignment. By having users lead development, development follows the interests and needs of the user base.

The issue we are seeing today is that the Blockstream development team is not taking Bitcoin in a direction the user base wants. But the problem is that the user base is not contributing towards development; this creates a vacuum that is filled by Greg and team.

If we want to see a different direction, then Bitcoin entities need to step up, start to take over development, and contribute much more than they have been. Imagine if miners such as KnC took the lead and just went ahead and implemented IBLT and other scaling techniques, instead of waiting for Blockstream to do so; we'd actually see movement on scaling for the first time in years.

We've been complaining that Blockstream has taken over and is forcing their way, but that is a symptom. The problem and root cause is the lack of merchant and miner contributions in code.

On a side note, I think this is why Greg and team are taking such an aggressive and condescending attitude. Doing so scares others away from getting involved; their behavior is designed to discourage participation, which in effect makes the system more reliant on them.
 

rocks

Active Member
Sep 24, 2015
586
2,284
Has anyone looked into Segregated Witness as a soft-fork mechanism to enable larger blocks without changing the 1MB limit?

https://www.reddit.com/r/btc/comments/3vqkat/segregated_witness_allows_us_to_soft_fork_a_block/

The concept seems to be simply creating additional large blocks of transactions and connecting those blocks to the mined block through signed hashes.

Just started to consider this, and although transactions stored in a side block can be verified to be connected back to the main block's header, the issue seems to be that every node on the network has to agree to read and validate the side block and include those new outputs in the UTXO set. That seems to be more of a hard fork IMHO, but it does preserve the current block chain structure.

Curious if anyone knows more about this.
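The side-block commitment structure described above can be sketched roughly as follows. This is not the actual SegWit design, just the "hash of a side block embedded in the main block" idea, with all names hypothetical:

```python
import hashlib

def dsha256(data: bytes) -> bytes:
    """Bitcoin-style double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def side_block_commitment(side_txs):
    """Commit to a side block's transactions with a single hash that
    the miner embeds in the main (1MB) block, e.g. in an unspendable
    coinbase output. Old nodes see only a normal-looking block."""
    return dsha256(b"".join(side_txs))

def verify_side_block(commitment, side_txs):
    """Upgraded nodes fetch the side block and check it against the
    commitment found in the main block. This is the step every
    validating node must opt into, which is why the change feels
    closer to a hard fork in practice."""
    return side_block_commitment(side_txs) == commitment
```

A real design would commit to a merkle root of the side transactions rather than a flat concatenation, so that individual transactions could be proven against the commitment.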
 
  • Like
Reactions: majamalu

Zangelbert Bingledack

Well-Known Member
Aug 29, 2015
1,485
5,585
@rocks

As the perception grows that Core is not taking Bitcoin where users want it to go, and as blocks fill up, for the first time we will have a lot of incentive for the development of things like thin blocks and for dev to be taken into the hands of more disparate groups.
 

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
Yesterday, I stumbled upon something interesting on reddit, voting with your money:

http://bitcoinocracy.com/

I am not yet sure I want to trust my client doing the signing right (without revealing private key entropy).

But it is an interesting idea.
 
  • Like
Reactions: majamalu

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
To add something:

Maybe now would be the time to use that proof-of-stake vote platform,

but first with all the devs getting together and at least agreeing on a common protocol for how votes on the blocksize would be counted?

Maybe this might be the solution to the blocksize issue?
 

solex

Moderator
Staff member
Aug 22, 2015
1,558
4,693
@rocks

I didn't notice the significance of this at first.
Blockstream has its cards on the table: keep the 1MB for some unknown period, but create a type of extension block of "segregated" data which has a max of 4MB.

Putting the main highlights, by Pieter Wuille, for reference here:
------------
This is my proposal that we do right now. We implement segregated witness right now, soon. What we do is discount the witness data by 75% for block size. So this enables us to say we allow 4x as many signatures in the chain. What this normally corresponds to, with a difficult transaction load, this is around 75% capacity increase for transactions that choose to use it. Another way of looking at it, is that we raise the block size to 4 MB for the witness part, but the non-witness has same size. The reason for doing the discount, last slide, the reason for doing this discount is that it disincentivizes UTXO impact. A signature that doesn't go into the UTXO set, can be pruned.

Why 4x? Well, 0.12 just made transaction validation like 7x faster I believe with libsecp256k1. The impact on relay, there are technologies that have been discussed earlier like IBLT and weak blocks. So segwit fixes malleability, allows more Script upgrades, allows fraud proofs, allows pruning blocks for historical data, improves bandwidth usage for light nodes and historical sync, and it's P2SH compatible for old senders so that non-upgraders can still send funds. This gives us time for IBLT and weak blocks to develop, so that we can see whether the relay and propagation stuff can have time to work.
------------
https://prezi.com/lyghixkrguao/segregated-witness-and-deploying-it-for-bitcoin/
http://diyhpl.us/wiki/transcripts/scalingbitcoin/hong-kong/segregated-witness-and-its-impact-on-scalability/


It sounds good with its pluses, but it also adds significant complexity, with some very new work being pressed into service. I struggle to see how this buys time for IBLT, which is a less radical change (a relay improvement) and should instead be used with a block limit increase to buy time for something like segregated witness.

Pieter deserves respect as he has done a phenomenal amount to improve Bitcoin, but it is clear that the Core devs are so afraid that the network is near capacity limits that they will only tolerate scaling as far as code optimizations will allow.

This is a radical departure from the BIP101 perspective, which is to keep doing more of what already works, as much as technological improvements allow. Irreconcilable differences coming up.
 
Last edited:

theZerg

Moderator
Staff member
Aug 28, 2015
1,012
2,327
I haven't looked closely at separating signatures from the rest -- "Segregated witness" -- but you have to ask:

1. will it encourage even less validation?
2. Fully validating nodes still need to download the "witness" part, right? So the bandwidth requirements for full nodes don't change? The only thing that changes is disk space because after a few years you can forget the witness. But disk space is not an issue.
3. It seems like this "soft" fork is so huge that it is a de facto hard fork. It's one thing to create a "soft" fork for a new transaction type that you feel might be awesome in the future but that does not affect normal bitcoin transactions; old clients can still understand 99% of what new clients are generating. It's another thing to do that with every bitcoin transaction. Old clients won't be able to understand any of the new-format transactions, which is all the transactions new clients generate. So sure, they won't fork... they just won't understand a single thing going on on the blockchain. With a soft fork this big we could "soft fork" an equivalent to BIP101 too, couldn't we? Add a "txout" to the coinbase that is not really a transaction; it contains the hash of an extension block that contains more transactions.
4. It also seemed like the soft fork was accomplished in a very complex manner... additional complexity causes bugs; better to just hard fork.

I still vote for the simpler BIP101 now.

But long term, I'd vote for a hard fork with a clean segregated witness and other efficiency optimizations, supply side flexibility (unlimited blocks or high txn fees can be added "beyond" the max which is a simple version of "flex-cap"), Justus Ranvier's fraud proofs, and a fee pool to fix all the issues around miners self-mining fake txns.
 

Justus Ranvier

Active Member
Aug 28, 2015
875
3,746
Any approach that allows for effective fraud proofs reduces the downsides of a large blockchain.

If the blockchain is going to get so large that not everybody can run a full node, this becomes far less of a problem when the security model is no longer "all or nothing".

Segregated witness could have been presented as a compromise position: "we're not comfortable with allowing the blockchain to grow at higher rates as it's currently composed, but with this change in place to improve the security model, we would be."

Was it presented that way?
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,994
Ok. Making some hay here with DXD. Let's hope it holds.
 

theZerg

Moderator
Staff member
Aug 28, 2015
1,012
2,327
@Justus Ranvier No, as far as I remember nobody used "we". It's just another proposal to derail BIP101. I'm not going to look into it much more right now.

I've said before that "we" the large-block supporters should not get derailed (like we were with BIP100) until "they" present some kind of formal proposal that has broad, signed support within the small-block devs. After all, this "proposal" requires bandwidth increases (seems like a one-time hop to 5MB), which is a big issue for the small blockers.

There's no formal machinery to do this AFAIK... it's why I wrote the BU Articles. But they could still do it even without a formal process.

EDIT: don't be put off because nobody talks about yr fraud proofs. IMHO they are so obviously good there's nothing to talk about :)
 
Last edited:

Justus Ranvier

Active Member
Aug 28, 2015
875
3,746
EDIT: don't be put off because nobody talks about yr fraud proofs. IMHO they are so obviously good there's nothing to talk about :)
http://diyhpl.us/wiki/transcripts/scalingbitcoin/hong-kong/segregated-witness-and-its-impact-on-scalability/

But there's an easier solution here, and I felt stupid when gmaxwell told me about this. Oh, you told gmaxwell? Hm. Well, you can instead make the witness contain the height of the block which produced the outputs, and the position within the block, and if it's wrong you can just show a proof that the transaction you claimed was there wasn't there.
I just want to know if anyone publicly proposed that solution prior to July when I posted it to the bitcoin-dev mailing list.

I'm not aware of anyone else suggesting it, but it would be nice to know for sure.
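The mechanism in the quoted passage rests on standard merkle inclusion proofs: given a block height and a position, anyone holding the block can prove which transaction actually sits there, refuting a false claim. A minimal sketch of the verification step (hashing details simplified; real txids are themselves double-SHA256 of the serialized transaction):

```python
import hashlib

def dsha256(data: bytes) -> bytes:
    """Bitcoin-style double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root_from_branch(txid: bytes, index: int, branch) -> bytes:
    """Recompute the merkle root from a leaf hash, its position in the
    block, and the sibling hashes along the path. Comparing the result
    to the block header's merkle root establishes what transaction
    really occupies that position -- the basis of the 'show what was
    actually there' proof in the quoted passage."""
    h = txid
    for sibling in branch:
        if index & 1:          # our node is a right child at this level
            h = dsha256(sibling + h)
        else:                  # our node is a left child
            h = dsha256(h + sibling)
        index >>= 1
    return h
```

Such a proof is logarithmic in the block size, which is what makes "not everyone runs a full node" tolerable: a light client can check a single claimed (height, position) reference without holding the whole chain.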
 
  • Like
Reactions: rocks