Gold collapsing. Bitcoin UP.

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
@Zangelbert Bingledack:

I was talking more about a potential scenario where Blockstream could accept a compromise. But that door seems to be closed...

In the event of a fork, your way would make more sense.

What do you mean by 'Damage Control?' Which damage are you referring to?
 

Zangelbert Bingledack

Well-Known Member
Aug 29, 2015
1,485
5,585
Damage control: Gmax trying to avoid having thunder stolen from the Segwit revelation, and to preempt people even temporarily thinking that Mr. Huge Blocks Craig Wright is really Satoshi. If it turns out he or Kleiman is Satoshi and they clearly support bigger blocks, the next move will be to discredit them. I can almost hear him screaming at the computer "HOLD THE LINE!!! 1MB MUST HOLD!!!"
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,995
If Joseph Poon misunderstands economics so badly, which he does, what does that say about the economic assumptions behind LN?
 

lunar

Well-Known Member
Aug 28, 2015
1,001
4,290
Gavin- Segregated witness is cool.

Great, positive little write-up.

Key takeaways
Well, once all the details are worked out, and the soft or hard fork is past, and a significant fraction of transactions are spending segregated witness-locked outputs… more transactions will fit into the 1 megabyte hard limit.
So once everybody has moved their coins to segregated witness-locked outputs and all transactions are using segregated witness, two or three times as many transactions would squeeze into the one megabyte block limit.
Segregated witness transactions won’t help with the current scaling bottleneck, which is how long it takes a one-megabyte 'block’ message to propagate across the network– they will take just as much bandwidth as before.
I think it is wise to design for success. Segregated witness is cool, but it isn’t a short-term solution to the problems we’re already seeing as we run into the one-megabyte block size limit.
Multisig was rolled out in 2012, yet in 2015 89% of transactions are still 'basic' type (TradeBlock blog).

I point this out because both are clear improvements on the current format, but individual actors need to start using them, so there is a relatively slow adoption rate.

If you follow my logic here: Segregated Witness transactions will require client, individual and company software/procedures to be updated, so we might expect similar adoption curves. Since SW only creates new 'block space' with adoption, it follows that we will not see the touted 4x block-space improvement for a minimum of two years, possibly much longer?
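The adoption argument above can be sketched as a toy model. The 2x density factor and the adoption fractions are assumptions for illustration (Gavin's post says "two or three times"), not measurements:

```python
def effective_capacity(adoption, sw_factor=2.0, base_limit_mb=1.0):
    """Toy model: effective capacity (in MB-equivalents of today's
    transactions) when a fraction `adoption` of transaction volume uses
    segwit outputs that pack `sw_factor` times more densely."""
    # Legacy volume costs full space; SW volume costs 1/sw_factor of it.
    cost_per_unit = (1.0 - adoption) + adoption / sw_factor
    return base_limit_mb / cost_per_unit

# 11% adoption mirrors multisig's uptake three years after rollout.
for a in (0.0, 0.11, 0.5, 1.0):
    print(f"{a:>4.0%} adoption -> {effective_capacity(a):.2f} MB-equivalent")
```

At multisig-like uptake (11%), effective capacity barely moves (~1.06 MB-equivalent); the full 2x only arrives at 100% adoption, which is the point being made.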

edit: @sickpig sorry for the double, I must have been composing this when you posted.
 

albin

Active Member
Nov 8, 2015
931
4,008
Oh boy, here we go:

Pieter proposes to give segregated witness transactions a discount on transaction fees, by not completely counting the segregated witness data when figuring out the fee-per-kilobyte transaction charge.
 

sickpig

Active Member
Aug 28, 2015
926
2,541
@lunarboy don't worry about the double. your summary is very valuable.

if I might add something on segwit.

there are two really important things that segwit brings on a tech level: a fix for malleability, and fraud proofs.

the former is a key piece for a famous layer 2 solution: namely LN (among others).

of course fraud proofs are also a massive feature.
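The malleability fix works by excluding witness data (signatures) from the txid, so a third party who re-encodes a signature can no longer change the txid that child transactions depend on — the property LN-style chains of unconfirmed transactions need. A toy illustration, not real Bitcoin serialization:

```python
import hashlib

def dsha(data: bytes) -> str:
    """Double-SHA256, as Bitcoin uses for transaction ids."""
    return hashlib.sha256(hashlib.sha256(data).digest()).hexdigest()

def legacy_txid(base: bytes, sig: bytes) -> str:
    # Pre-segwit: signatures live inside the hashed serialization,
    # so re-encoding a signature changes the txid.
    return dsha(base + sig)

def segwit_txid(base: bytes, sig: bytes) -> str:
    # Segwit: the witness (signature) is excluded from the txid.
    return dsha(base)

base = b"version|inputs|outputs|locktime"
sig_a = b"signature-encoding-A"
sig_b = b"signature-encoding-B"   # e.g. a malleated re-encoding of sig_a

assert legacy_txid(base, sig_a) != legacy_txid(base, sig_b)   # malleable
assert segwit_txid(base, sig_a) == segwit_txid(base, sig_b)   # fixed
```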
 

Melbustus

Active Member
Aug 28, 2015
237
884
@awemany - "He still clearly doesn't get it. It's up to the miner to decide cost for a transaction, not pwuille."

Yes, exactly. That's essentially the key folly in core-dev's approach these days. Unfortunately the Hong Kong conference was a terrible outcome for this viewpoint: most miners essentially refused to exercise a vote because they're ignorant, which just reinforces core-dev's impulse to architect everything from the start.

We need more miners/pools that are willing to implement their own code, and we need them soon.
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,995
@Melbustus

not one miner on the panel from the US. what a shame.

i think it is time for a US based BU nonprofit pool.
Oh boy, here we go:
link please
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,995

rocks

Active Member
Sep 24, 2015
586
2,284
Regarding backwards compatibility with SW, is the thinking below right or am I misunderstanding how it works?

My understanding is SW essentially moves "prunable data" for a transaction from the main block to a side block connected by a hash. Which means this is simply an intelligent form of pruning.

But it also means that to receive a SW transaction, the receiver needs to have upgraded to a SW-enabled client, or else they cannot use or receive the output, since the transaction's data is in the side block, which is not viewable or verifiable by old clients.

The bigger part I am trying to understand is: can a SW transaction be moved back to the main block chain? A SW transaction requires a client that understands the SW side block. So it is logical to assume that spending or validating a new transaction originating from an output in a SW side block requires updated clients. Since the main block is validated by old clients, this means that SW transactions cannot be moved back to the main chain, but are locked into SW-only side blocks forever.

Is this right? If so, I can see this being a deal-breaker for SW. Imagine the reaction of people who upgrade, receive a SW transaction and then try to send it to someone else, only to find out they can't because that person either did not upgrade or only accepts transactions on the main chain. Now imagine that counterparty is Coinbase, Bitstamp or any number of merchants.

It also means that SW is a hard fork hidden inside a soft fork. Everyone has to upgrade for it to work; that is a hard fork even if you haven't changed the main chain.

Worse, it is also a pathway to make people comfortable with off-chain solutions. With SW your BTC are not stored in the main chain but in something else. If that something else is a SW side block, LN or SC, what's the difference? They're all off-chain. SW is looking like a poison pill or gateway drug IMHO.
 

sickpig

Active Member
Aug 28, 2015
926
2,541
@cypherdoc I think Gavin is simply stating Pieter's strategy for stimulating a migration of wallets/nodes to segwit.

I don't think he's endorsing Pieter's reasoning; after all, he thinks a hard fork is a better way to deploy segwit, and by definition that means wallets/nodes have to upgrade.
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,995
can it be that Gavin has rendered an opinion prematurely?:

@sickpig

yeah, but the blog post will be interpreted by everyone as an endorsement. he hasn't even reviewed the code?
double bottom, prepare for launch?:



paging Janet:

 

sickpig

Active Member
Aug 28, 2015
926
2,541
@cypherdoc I think he did review the code somewhat. Just look at his replies to gmax's "state of the union" on the btc dev ml, namely:

http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/011903.html
http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/011920.html
Someone mentioned here a while back that Greg previously wrote a paper saying that decentralized consensus was not possible, and then when Bitcoin worked later claimed that it was because of the 1MB cap.

Does anyone remember this, and if so have a link to the paper or anything else relevant. Am curious. Thanks
A few weeks ago I posted a YouTube video where gmax said he proved distributed consensus is impossible. Just check my post history.

Don't know if a paper exists on that matter, though.
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,995
@sickpig

thx for those links. seems to be an inconsistency btwn what he's writing on Reddit today vs those posts. did anyone respond to his proposed attack? isn't that attack a variation on the f2pool single tx multi input attack?:

Here's the attack:

Create a 1-megabyte transaction, with all of its inputs spending
segwitness-spending SIGHASH_ALL inputs.

Because the segwitness inputs are smaller in the block, you can fit more of
them into 1 megabyte. Each will hash very close to one megabyte of data.

That will be O(n^2) worse than the worst case of a 1-megabyte transaction
with signatures in the scriptSigs.

Did I misunderstand something or miss something about the 1-mb transaction
data and 3-mb segwitness data proposal that would make this attack not
possible?

RE: fraud proof data being deterministic: yes, I see, the data can be
computed instead of broadcast with the block.

RE: emerging consensus of Core:

I think it is a huge mistake not to "design for success" (see
http://gavinandresen.ninja/designing-for-success ).

I think it is a huge mistake to pile on technical debt in
consensus-critical code. I think we should be working harder to make things
simpler, not more complex, whenever possible.

And I think there are pretty big self-inflicted current problems because
worries about theoretical future problems have prevented us from coming to
consensus on simple solutions.

--
Gavin Andresen
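The quadratic blow-up Gavin describes in the quoted mail is easy to put numbers on: under legacy SIGHASH_ALL, each input re-hashes roughly the whole transaction, so smaller effective inputs mean more inputs fit while each hashing pass stays just as large. A back-of-the-envelope sketch (the per-input byte sizes are illustrative assumptions, not consensus constants):

```python
def sighash_bytes(tx_bytes, bytes_per_input):
    """Rough total bytes hashed to verify one legacy SIGHASH_ALL
    transaction: every input re-hashes (roughly) the whole transaction."""
    n_inputs = tx_bytes // bytes_per_input
    return n_inputs * tx_bytes          # O(n^2) in transaction size

ONE_MB = 1_000_000
# Smaller effective inputs (as with discounted witness space) mean more
# inputs fit into the same 1 MB, so more full-size hashing passes:
for size in (300, 150, 100):
    gb = sighash_bytes(ONE_MB, size) / 1e9
    print(f"{size}-byte inputs: {ONE_MB // size} inputs, ~{gb:.1f} GB hashed")
```

Halving the per-input footprint doubles the total bytes hashed, which is why moving signature bytes out of the counted transaction size makes the worst case worse rather than better.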
 

Melbustus

Active Member
Aug 28, 2015
237
884
@sickpig - from that first link:

Gavin:
I think it is a huge mistake not to "design for success" (see
http://gavinandresen.ninja/designing-for-success ).

I think it is a huge mistake to pile on technical debt in
consensus-critical code. I think we should be working harder to make things
simpler, not more complex, whenever possible.

And I think there are pretty big self-inflicted current problems because
worries about theoretical future problems have prevented us from coming to
consensus on simple solutions.
^ This, this, and this.

Such a shame that Gavin stepped back from core-dev.
 

rocks

Active Member
Sep 24, 2015
586
2,284
GBTC hit a high of $72.5 today, for roughly $760 per BTC.

The twins need to get that ETF out.

I also don't understand why GBTC is allowing this premium. They could easily buy coins on the open market, create GBTC shares from those coins, and sell those shares at a huge premium. That's what most closed-end funds do when there is a large premium over the underlying asset.
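The arbitrage described above can be put in numbers using the figures from the post plus an assumed spot price (the ~$415 spot used below is an illustration, not from the thread):

```python
def premium_over_nav(share_price, btc_per_share, spot_price):
    """Premium of a fund share over its net asset value (NAV)."""
    nav = btc_per_share * spot_price
    return share_price / nav - 1.0

share_price = 72.5                                # from the post
implied_btc_price = 760.0                         # from the post
btc_per_share = share_price / implied_btc_price   # ~0.095 BTC per share
spot = 415.0                                      # assumed spot (illustration)

print(f"premium over NAV: {premium_over_nav(share_price, btc_per_share, spot):.0%}")
# A sponsor could buy BTC at spot, create shares at NAV, and sell at the
# premium -- the creation/redemption flow that normally compresses
# closed-end fund premiums.
```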