Gold collapsing. Bitcoin UP.

bluemoon

Active Member
Jan 15, 2016
215
966
@cypherdoc

I agree. I'm afraid I've laid off the donuts, but I can hardly believe how the miners continue to be willing to gamble their futures on an eventual favourable outcome to Blockstream's convoluted strategy, when instead they could be taking the direct course of increasing the blocksize, keeping control in their own hands and likely getting themselves a bigger share of a bigger cake.
 

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
Peter Todd is proposing adding a hard limit on the size of the unspent transaction output (UTXO) set:

https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2016-May/012715.html

Ignoring the central planning of yet another Core-dev defined limit, the scheme he proposes for "low-latency delayed TXO commitments" does seem interesting (I need to read his email a few more times to make proper sense of it).
 
  • Like
Reactions: AdrianX

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,994
there's no denying that there's trouble afoot in the stock mkt. i've pointed out a few times here how JWN was the first stock i could identify back in Feb 2007 to roll in anticipation of the crisis of 2008-9. you can see it clearly here on the 10y weekly chart way over on the left there. we've clearly begun a similar decline. Sell in May and Go Away! Bitcoin will benefit!:


Peter Todd is proposing adding a hard limit on the size of the unspent transaction output (UTXO) set:

https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2016-May/012715.html

Ignoring the central planning of yet another Core-dev defined limit, the scheme he proposes for "low-latency delayed TXO commitments" does seem interesting (I need to read his email a few more times to make proper sense of it).
how is that any different than what we've discussed here many times with @awemany & @rocks?
here's the 20y weekly Nordstrom (JWN) chart. we're already significantly below 2/07. imo, this is forecasting a BIG deflationary wave down for the $DJI and major indices to come. also look how steep this latest drop has been from the top compared to 2007-9:

 
Last edited:

satoshis_sockpuppet

Active Member
Feb 22, 2016
776
3,312
Peter Todd is proposing adding a hard limit on the size of the unspent transaction output (UTXO) set:

https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2016-May/012715.html

Ignoring the central planning of yet another Core-dev defined limit, the scheme he proposes for "low-latency delayed TXO commitments" does seem interesting (I need to read his email a few more times to make proper sense of it).
Is that very different from this idea: https://www.reddit.com/r/Bitcoin/comments/35du4c/an_idea_for_attacking_the_utxo_set_size_growth/ ?

BTW, I think the way Blockchain introduced their Lightning implementation was pretty cool. Just do your stuff and release a somewhat-working alpha version to show something that actually works. Did they beat Blockstream? :)
Maybe some miners will start to realize that they won't necessarily be the operators of the Lightning hubs* and that the Lightning tx fees are actual losses for them...

*As Blockstream probably sold them the idea.
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,994
ok, $DJI trouble confirming the lagging and forewarning $DJT; we just broke support of the last daily low of April 7. we got a L translated daily formation going and the Primary Dow Theory Bear Trend is back in full force. look the hell out:

 

79b79aa8

Well-Known Member
Sep 22, 2015
1,031
3,440
. . . I can hardly believe how the miners continue to be willing to gamble their futures on an eventual favourable outcome to Blockstream's convoluted strategy, when instead they could be taking the direct course of increasing the blocksize, keeping control in their own hands and likely getting themselves a bigger share of a bigger cake.
It's simple: presently, big miners are in control of the hashrate, but also, to an important degree, they are in control of the code, indirectly via Blockstream-core. Having a degree of control over the code protects big miner investment. For example, miners behind the Great Firewall need to protect themselves from bandwidth disadvantage. And big miners want to avoid complications with forks / alternative implementations / disruption / uncertainty. In both cases, Blockstream delivers. Meanwhile, BS need to make money for themselves, so they build their consultancy and proprietary software, leveraging both with the advantage of having direct control of the reference code. Normally this would not happen in an open-source environment, but in this case it does, as the code that prevails is the one that is backed up by the hashrate.

With their interests thus aligned, BS and the miners negotiate which changes to introduce, in exchange for what, under what timetable, etc., with both expecting to gain advantage or profit.
 
Last edited:

Nat-go

New Member
Apr 2, 2016
21
30
Hamburg, Germany
i think LN competition will become a factor. why? b/c there are fees/money involved. in fact, bc.i's jumping the gun w/o CSV or SW implemented yet is their attempt to grab market share. bc.i wants to become a major LN hub (Thunder) with its own brand. why should they cooperate on interoperability if they are the first?:

https://www.reddit.com/r/btc/comments/4jqa7j/how_competition_may_kill_the_lightning_network/
Thunder needs CSV & SW to run properly:

https://github.com/blockchain/thunder
Outlook
thunder.network uses a commitment-transaction design that needs both CSV and Segregated Witness to be completed. Otherwise the payments are not enforceable on the blockchain and are bad promises at best.
 
  • Like
Reactions: bluemoon

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
Peter Todd's email on the dev-list got me thinking that there are two types of protocol rules:

CATEGORY #1. Rules that we all agree we need in order for Bitcoin to function as sound money (e.g., you can't create coins out of thin air or spend coins that aren't yours).

CATEGORY #2. Rules that some people think we need as a crutch for current technology limitations (e.g., the max block size, max sigops, max bytes hashed, fee discount for segwit, max utxo set size, etc.)

Blockstream/Core and Unlimited both agree on the importance of unambiguously enforcing the rules from Category 1. However, our approach to Category 2 is completely different.

Blockstream/Core wants to apply a "top down" approach where Category #2 rules are strict and decided upon by Core-devs and tweaked as necessary to engineer the desired overall system behaviour. More rules are seen as a good thing to them because it gives them more switches to flip and knobs to turn.

These people do not trust market mechanisms or the wisdom of crowds, and so they feel it is important to limit the ability of the end user to adjust the behaviour of his node in ways that might affect the emergent network behaviour. They believe that Bitcoin's success is dependent on a group of talented developers (themselves) evaluating tradeoffs, making wise decisions for how nodes ought to behave, and then enforcing that behaviour by whatever means are necessary.

Unlimited wants to allow a "bottom up" approach where Category #2 rules are fuzzy and emerge organically from within the system itself. Rather than a strict block size limit, Unlimited allows nodes to safely set their own limit but with a mechanism to ensure convergence upon a single chain. This same philosophy can be extended to all of the Category #2 rules. As @Gavin Andresen pointed out, if nodes simply choose to drop blocks that take longer than (e.g.) 20 seconds to download and fully validate, then this single rule (which could even vary from node to node) would effectively place limits on block size, sigops, bytes hashed, UTXO lookups, etc., but in a way that naturally scales as the network becomes faster and more powerful. Convergence can be ensured if nodes simply try for longer to download and verify blocks the further from the tip those blocks are.
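The single-timing-rule idea above can be sketched as a toy model. Everything here is an illustrative assumption (the function names, the 20-second base, and the doubling schedule for blocks deeper in the chain), not actual Bitcoin Unlimited code:

```python
BASE_TIMEOUT = 20.0  # seconds allowed for a block right at the chain tip (example value)

def acceptance_timeout(depth_from_tip: int) -> float:
    """Allow more time the further a block is behind the tip, so nodes that
    initially dropped a 'too slow' block eventually accept it and converge
    on the longest chain.  The doubling schedule is an arbitrary choice."""
    return BASE_TIMEOUT * (2 ** depth_from_tip)

def accept_block(download_and_validate_seconds: float, depth_from_tip: int) -> bool:
    # One timing rule implicitly bounds block size, sigops, bytes hashed,
    # UTXO lookups, etc., and loosens automatically as hardware improves.
    return download_and_validate_seconds <= acceptance_timeout(depth_from_tip)
```

For example, a block that takes 35 seconds to fetch and validate is dropped at the tip but accepted once it is one block deep (timeout 40 s), which is the convergence mechanism described above.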

The people who support Unlimited trust market mechanisms and so they want to expand the ability of the end user to adjust the behaviour of his node--especially in ways that might affect emergent network behaviour. They believe it is the aggregate effect of a bunch of small decisions made each day by thousands of node operators and millions of bitcoin holders that will best design the strongest network. There is no prescribed behavior to enforce, beyond fighting to tear down barriers that limit the community's access to information and the community's ability to freely adjust the behaviour of their nodes.

So which is the future for Bitcoin? Will we see more and more central planning and control? Or will we see a relinquishing of that control into the capable hands of the market?
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,994
how many times have we heard statements from small blockists like, "the mass of users are ignorant", "we can't have the majority dominate the minority", "ordinary users are technically ignorant & thus should have no input to how Bitcoin scales", "the technical experts have unanimously agreed". and on and on.
 

79b79aa8

Well-Known Member
Sep 22, 2015
1,031
3,440
Blockstream/Core [. . .] do not trust market mechanisms or the wisdom of crowds, and so they feel it is important to limit the ability of the end user to adjust the behaviour of his node in ways that might affect the emergent network behaviour. They believe that Bitcoin's success is dependent on a group of talented developers (themselves) evaluating tradeoffs, making wise decisions for how nodes ought to behave, and then enforcing that behaviour by whatever means are necessary.
I don't think this is the explanation (although it may be a rationalization). It is not that BS-C do not trust the market. It is that they gain money by controlling the code. And the way the mining industry has developed (> 1/2 of the hashrate is controlled by Chinese miners that do seek protection from the open market) has so far allowed them to do so. Meanwhile the core devs appease their conscience by convincing themselves their solutions really are the best and in the interest of most.
 
Last edited:

albin

Active Member
Nov 8, 2015
931
4,008
What an amazing coincidence that there are no tradeoffs with the soft-fork segwit approach: the witness discount, for example, is necessary as an accounting trick to allow segwit to be performed as a softfork, and amazingly it also happens to exactly "solve" an incentives problem with the UTXO set.

Magically solving everything conceivable, with no potential drawbacks, through after-the-fact rationalizations: that can't possibly be massive cognitive bias, right??
 
Last edited:

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
I don't think this is the explanation (although it may be a rationalization). It is not that BS-C do not trust the market. It is that they gain money by controlling the code. And the way the mining industry has developed (large Chinese miners that do seek protection from the open market) has so far allowed them to do so. Meanwhile the core devs appease their conscience by convincing themselves their solutions really are the best and in the interest of most.
Interesting. Let's consider the economic theories of John Maynard Keynes as an analogy for Maxwellian economics. Copy/paste from Wikipedia:

"Keynesian economists often argue that private sector decisions sometimes lead to inefficient macroeconomic outcomes which require active policy responses by the public sector, in particular, monetary policy actions by the central bank and fiscal policy actions by the government, in order to stabilize output over the business cycle."

"Maxwellian developers often argue that node operators' independent decisions sometimes lead to inefficient use of blockchain resources which require active protocol rule changes by the Core developers, in particular, placing limits on the use of block space or the size of the unspent transaction output set, in order to guarantee the security of the network over adoption cycles."

I think what made Keynesianism so dangerous is that it both (a) sounds good, and (b) gives the people who can implement it increased power and control. Did Keynes (and in general do the proponents of Keynesian economics) believe that they are doing what is best for the economy? Or is the theory just a rationalization for increased personal power and control? Whatever the answer is (IMO it's a bit of both), I think it's the same answer for both the Keynesians and the Maxwellians.
 
Last edited:

Inca

Moderator
Staff member
Aug 28, 2015
517
1,679
/r/bitcoin providing pathetically transparent sock trolling of garzik's latest tweet.

"Garzik is a great guy, but I've struggled to follow is reasoning lately." says a concern troll sock created entirely for the blocksize debate.

"get lost, Economics are irrelevant to computer science. In addition, I think it's funny that you and Gavin still think you're relavent in this space any longer. Find a new job, because every time you open your mouth, bitcoin dies a little." says 'pizzaface18'.

Are people really this stupid? Scratch that. We know they are. That subreddit can't get any lower.
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,994
@Inca

they actually pride themselves on their meanness. i was watching Core Slack yesterday and they were deriding someone for being too nice. not mean enough.

that's what we're up against. classless ppl.
 

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
On another topic, I remember when I first released my transaction fee market paper, Maxwell and Co. said that orphaning wasn't a deterrent against larger blocks because miners use Corallo's Relay Network and thus don't suffer block-size-dependent orphaning risk (and then later they said that we can't have bigger blocks because orphaning risk would be too high blah blah blah but let's ignore that contradiction for now).

Blockchain.info keeps tabs on orphaned blocks, and I wrote a script to scrape their site and import the raw data into Mathematica. The data goes back about two years; the only gap appears to be from last summer.

If larger blocks really are more likely to be orphaned than small blocks, there should be evidence of it in real orphan data. The figure below shows that there is: orphaned blocks are consistently larger (on average) than their neighbouring main-chain counterparts.
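The comparison described above could be computed along these lines. This is a hypothetical sketch: the function name and the {height: size} data layout are my own assumptions, not the actual scraping script:

```python
def neighbour_size_gap(orphans, main_chain):
    """For each orphaned block at height h, compare its size to the mean
    size of its main-chain neighbours at heights h-1, h, h+1.  Both inputs
    are {height: size_in_bytes} dicts (assumed layout).  Returns the
    average (orphan minus neighbours) difference in bytes; a positive
    result means orphaned blocks tend to be larger."""
    gaps = []
    for height, orphan_size in orphans.items():
        neighbours = [main_chain[h] for h in (height - 1, height, height + 1)
                      if h in main_chain]
        if neighbours:
            gaps.append(orphan_size - sum(neighbours) / len(neighbours))
    return sum(gaps) / len(gaps) if gaps else 0.0
```

With a 900 kB orphan next to 500/600/700 kB main-chain blocks, this reports a +300 kB gap, the kind of signal the figure shows in aggregate.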



Miners incur a real cost by producing extra block space; the marginal cost of block space is nonzero and the supply curve has a positive slope (i.e., block space behaves like a normal commodity).
 

jonny1000

Active Member
Nov 11, 2015
380
101
“On another topic, I remember when I first released my transaction fee market paper, Maxwell and Co. said that orphaning wasn't a deterrent against larger blocks because miners use Corallo's Relay Network and thus don't suffer block-size-dependent orphaning risk (and then later they said that we can't have bigger blocks because orphaning risk would be too high blah blah blah but let's ignore that contradiction for now).”


Dear Peter R


I have discussed this issue with you many times. That is a mischaracterisation of the criticism of your paper. It is both true that higher orphan risk probably won't be a deterrent to larger blocks ***and*** that, if it were, it would have potentially catastrophic consequences for the network. These things are both true at the same time; it is not a contradiction.


Higher orphan risk probably will not be a significant deterrent to larger blocks due to Matt’s relay network and other technologies like thin blocks. If this technology proves ineffective or insufficient, then relying on orphan risk as a tool to keep blocks small and drive the fee market is a terrible idea for the following reasons:


· It increases orphan risk, which should be minimized for many reasons (for example “wasted work” which I had to explain to you). A healthy network has lower orphan risk

· It ensures orphan risk is high relative to fee revenue

· Mining centralisation is locked in, since orphan risk costs are higher (comparatively speaking) for smaller miners, since they need to propagate to a larger proportion of the network

· If mining incentives and fees are driven by orphan risk cost, there may be insufficient subsidy carrying over to finance hashing, which may result in a lower equilibrium difficulty.


Many thanks

Jon
 

Peter R

Well-Known Member
Aug 28, 2015
1,398
5,595
Good to see you pop your head back in here, Jon.
Higher orphan risk probably will not be a significant deterrent to larger blocks due to Matt’s relay network...
But I just showed empirically that, even given Matt's relay network, the blocks that got orphaned tended to be larger blocks. Do you disagree that (holding all else constant) larger blocks are more likely to be orphaned than smaller blocks?
...and other technologies like thin blocks.
Thin blocks decrease propagation time, but they don't suddenly make large blocks propagate as fast as small blocks. Instead they make a 10 MB block propagate like a 1 MB block, and a 2 MB block propagate like a 200 kB block. In fact, Xthin increases the relative cost of a spam-block attack, since Xthin only works for blocks composed of transactions that are permissible according to standard mempool policies. A miner would typically construct a spam block from a bunch of zero-fee spam transactions that wouldn't be accepted into a typical node's mempool, so the spam block would propagate extra slowly.
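A toy model of why spam blocks lose the thin-block advantage. All names and byte counts here are illustrative assumptions, not the real Xthin wire encoding:

```python
def xthin_bytes_on_wire(block_tx_sizes, in_peer_mempool, short_id_bytes=8):
    """Rough model: a transaction the peer already has in its mempool costs
    only a short hash on the wire; a transaction the peer never accepted
    (e.g. zero-fee spam) must be sent in full.  Returns total bytes sent."""
    total = 0
    for size, known in zip(block_tx_sizes, in_peer_mempool):
        total += short_id_bytes if known else size
    return total
```

Under this model, a 1 MB block of 2,500 ordinary 400-byte transactions all present in the peer's mempool costs about 20 kB to relay, while the same-size block built from unknown spam transactions costs the full 1 MB, which is the asymmetry that penalises spam blocks.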
If this technology proves ineffective or insufficient, then relying on orphan risk as a tool to keep blocks small and drive the fee market is a terrible idea for the following reasons:

· It increases orphan risk, which should be minimized for many reasons (for example “wasted work” which I had to explain to you). A healthy network has lower orphan risk.
This reads as a non sequitur to me. Do you mean it would increase orphan rates? [which I do not believe would be true]. Orphan risk is how likely your block is to be orphaned multiplied by the cost you would bear if you lost it. How does relying on orphan risk to limit the block size increase orphan risk itself?

Regarding "wasted work," this is a rounding error at the historical orphan rates of 1 or 2%.
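For concreteness, orphan risk as defined above (probability of losing the block times the value at stake) can be estimated with a standard Poisson-style simplification. The function names and numbers are illustrative assumptions, not anyone's actual model:

```python
import math

BLOCK_INTERVAL = 600.0  # seconds, expected time between blocks

def orphan_probability(propagation_seconds: float) -> float:
    """Poisson-process estimate: your block is orphaned if a competing
    block is found somewhere during the time yours takes to propagate."""
    return 1.0 - math.exp(-propagation_seconds / BLOCK_INTERVAL)

def orphan_risk(propagation_seconds: float, block_value_btc: float) -> float:
    """Risk = probability of losing the block x value you would forfeit."""
    return orphan_probability(propagation_seconds) * block_value_btc
```

At a 6-second propagation time this gives roughly a 1% orphan probability, in line with the historical 1-2% rates discussed here; with 25 BTC at stake, the expected loss is about 0.25 BTC per block.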
· It ensures orphan risk is high relative to fee revenue
Firstly, this doesn't matter until the block reward begins to phase out in a few decades. Secondly, this is only true if we're still building blocks as inefficiently as we do today when that time comes. I showed in my subchains paper how orphan risk can be greatly reduced without a corresponding reduction in fee revenue (which we've discussed in the past). This criticism of yours (and Greg's) was one of my motivations for exploring how pre-consensus schemes (e.g., those based around weak blocks) might affect the fee market. Greg claimed that they would kill the fee market completely; I believe they make the fee market stronger.



· Mining centralisation is locked in, since orphan risk costs are higher (comparatively speaking) for smaller miners, since they need to propagate to a larger proportion of the network
Yes, mining pools and solo miners that control a greater portion of the network hash rate have a small theoretical advantage due to the fact that they are more likely to solve two blocks in a row. At historical orphan rates of 1 - 2%, this effect is very small compared to the variation in the many other variables of the miner's profitability equation such as the cost of electricity.
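The two-blocks-in-a-row advantage can be quantified with a toy model (a deliberate simplification that ignores propagation geography; the function name and numbers are illustrative):

```python
def effective_orphan_rate(hashrate_share: float, network_orphan_rate: float) -> float:
    """A pool never orphans its own block in the races it wins itself:
    when it finds the next block too, its previous block cannot lose.
    So its effective orphan rate scales down by (1 - hashrate_share)."""
    return (1.0 - hashrate_share) * network_orphan_rate
```

At a 2% network orphan rate, a 30% pool sees an effective rate of 1.4% versus 1.98% for a 1% miner: an advantage of roughly 0.6 percentage points of revenue, small next to the variation in electricity cost and the other terms of the profitability equation.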
· If mining incentives and fees are driven by orphan risk cost, there may be insufficient subsidy carrying over to finance hashing, which may result in a lower equilibrium difficulty.
This is not true either. The figure above illustrates how fees can contribute to hashing cost.
 
Last edited: