Gold collapsing. Bitcoin UP.

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
Society (collectivistic surplus production to pay tribute) is per se a doom scenario. It always has been.
Tainter's law: a society is a 'problem-solving' construct facing 'diminishing returns on additional investment in complexity/debt/energy'. Doomed from the start.
Doomed from the start only insofar as it always seems to decay into some longing for a perfect utopia and then fails to deliver it, because the initial premises are wrong. Recent example: feminism.

It appears to me that we humans are bad at figuring out our own complexities, but all too good at oversimplifying them.

And I think 'Anarcho-primitivism' is a form of such an impossible utopia as well.
 
Last edited:

Zarathustra

Well-Known Member
Aug 28, 2015
1,439
3,797
@awemany

Feminism is the effect of patriarchy, which is the opposite of anarchy.

"And I think 'Anarcho-primitivism' is a form of such an impossible utopia as well."

Yes, but one that worked for a much longer time than collectivism: a million years.

The citizen is a drug-addicted species that has lost the will to prefer freedom over drugs.

The citizen never willfully returns to freedom and 'primitivism'. He has always been catapulted out of collectivism and citizenship when his unnatural construct collapsed. Catapulted out of hierarchy (patriarchy) back into anarchy.
Will he ever learn the lesson? What will it take? Collapsing nuclear infrastructure instead of collapsing aqueducts?
 
Last edited:

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
@freetrader:

Jstolfi says there:
My guess is that, before that happens, Core/Blockstream will relent and do a 2 MB fork, rather than be tossed aside. People in their team must already be having second thoughts about the sanity of Greg's "fee market".
I have not seen hints of those kinds of second thoughts yet. I do, and did, have second thoughts regarding big blocks (as in >>8MB very soon), though I still think even that would be fine.

If only both sides included people having second thoughts, I think we'd all have reached a very reasonable path forward. Sadly, they do not.

If the miners force Borgstream to go up to 2MB in August, I consider that a victory - mainly because I think it means that the miners have finally woken up and shown themselves to be their own independently thinking, incentive-driven part of the ecosystem.
 

lunar

Well-Known Member
Aug 28, 2015
1,001
4,290
@Tom Zander

hitting a home run:

https://zander.github.io/posts/Scaling Bitcoin/

Conclusions
The goal I tried to argue from is 50 million users per day. This goal is a huge increase from today. But to make sure we do it properly, my goal is set 5 years in the future.

Scaling Bitcoin nodes is ultimately boring work requiring very little effort, because it turns out that a simple modern system can already easily scale 10,000 times higher than the current maximum allowed size.

Scaling the entire system takes a little more work, but mostly because miners have not received a lot of new features that they would need in order to make scaling safe for them. Most of those features could be added in a matter of months, with technologies like xthin blocks and optimistic mining already well underway.

The conclusion that I have to draw is that the goal of 50M/day is not just reachable, the timeline of 5 years is likely one that we will beat quite easily.

Smart tricks like Lightning network are not mentioned at all in this document because there is no need for them. Bitcoin can scale on-chain quite easily with almost no risk. Ideas like Lightning are quite high risk because there are so many unknowns.

By far the biggest problem with regards to scaling today is the protocol limit of 1MB block size. This should be removed as soon as possible.
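For a rough sense of what the 50M/day goal implies, here is my own back-of-the-envelope arithmetic (not from Zander's post; the ~250-byte average transaction size and the 10-minute block interval are my assumptions):

```python
# Back-of-the-envelope: block size implied by 50 million transactions/day.
# Assumptions (mine, not Zander's): ~250 bytes per average transaction,
# one block every ~10 minutes, i.e. 144 blocks/day.
TX_PER_DAY = 50_000_000
BYTES_PER_TX = 250
BLOCKS_PER_DAY = 24 * 6

tx_per_block = TX_PER_DAY / BLOCKS_PER_DAY
block_megabytes = tx_per_block * BYTES_PER_TX / 1e6
print(f"{tx_per_block:,.0f} tx/block ~= {block_megabytes:.0f} MB/block")
# -> 347,222 tx/block ~= 87 MB/block: roughly 87x today's 1 MB limit,
#    well inside the "10000x" headroom Zander claims for a modern node.
```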
 

AdrianX

Well-Known Member
Aug 28, 2015
2,097
5,797
bitco.in
@awemany

Society (collectivistic surplus production to pay tribute) is per se a doom scenario. It always has been.
Tainter's law: a society is a 'problem-solving' construct facing 'diminishing returns on additional investment in complexity/debt/energy'. Doomed from the start.

Islamism, Catholicism, Austrianism, Keynesianism, socialism or capitalism are just utopias that will never overcome nature's law.

And it doesn't help if you are virtually doxxed, extorted, denounced, threatened and destroyed. Where is u/hellobitcoinworld?
We just don't know, and I lean toward the principle that societies exist as a result of cooperation. And yes, I agree with you to a large extent that things are the way they are because of the capture of the money supply.

This epic thread is leading us out - I have always thought of myself as a socialist - however, I seem to have more fundamentals in common with libertarians than with any other pigeonhole classification.

Whatever the solution, it's been kick-started with Bitcoin.
 

solex

Moderator
Staff member
Aug 22, 2015
1,558
4,693
@Zarathustra
The problem with comparing Roman data to current era data is that the rate of change in technology is vastly different. The rise and fall of the Roman Empire spanned about 1100 years and yet applied tech changed very slowly. Woodworking, metallurgy, pottery, clothing, transport, writing, food production had many of the same principles employed century after century with just glacial improvements.

Today, we see the rate of technological change making new products obsolete in a decade or two (remember the Walkman, typewriters, amalgam fillings, etc. etc.) The doubling time of knowledge is now less than 10 years. Technology really solves a lot of the "Peak [insert consumable]" problems. We haven't begun to see the impact of graphene sheets on consumer products or self-assembly nanotech for production or gene-editing for general medicine. Even something like climate change could be addressed in a few decades with artificial trees removing excess CO2 from the air. A chart reaching out to 2100 can't assume static technology.
 
Last edited:

Roger_Murdock

Active Member
Dec 17, 2015
223
1,453
Yet another (but maybe simpler) way of framing the block size debate:

We should first acknowledge that there are indeed benefits associated with a constraining block size limit -- not necessarily net benefits, but certainly "all-else-equal"-type benefits. The most prominent examples of these benefits are the reduced cost of running a full node and the reduced relative importance (and thus, theoretically, the reduced "centralizing" effect) of large miners' "self-propagation advantage" vis-a-vis orphaning risk.

Unfortunately, "all else is never equal" and there are also very significant costs associated with a constraining block size limit: higher fees, less predictable fees, slower and/or less predictable confirmation times, and lower on-chain throughput forcing users to use "off-chain" solutions that are inherently riskier.

Obviously, moving from a block size limit of 0 kb (Bitcoin doesn't exist) to a block size limit of maybe 10 kb represents a situation where the benefits outweigh the costs. Similarly, almost everyone agrees that moving from a block size limit of 10 kb (allowing for perhaps a literal handful of transactions per block) to the current 1-MB limit represents a significant net benefit.

The question is whether or not there's a point at which the marginal costs of increasing the block size limit begin to outweigh the marginal benefits. If so, that transition point is obviously going to be the optimum place to set the limit. Of course, we should note the possibility that the answer to this question is actually no, because at a certain point the costs of an increased consensus-type limit go to zero. Why? Because the consensus-type limit ceases to have any effect at the point at which it becomes greater than the natural, equilibrium block size limit. These are the conditions under which Bitcoin operated for several years when the 1-MB limit was very large relative to actual transactional demand.

But let's assume for the moment that the answer to that question is yes and thus that a constraining block size limit makes sense. In terms of thinking about where that limit is likely to be, I think it's important to note that the costs of a particular limit should be expected to increase over time as adoption and transactional demand increase. Meanwhile, the benefits of a particular limit should be expected to decrease over time as a result of a) general technological improvements in, e.g., storage / bandwidth and b) Bitcoin-specific advances like Xthin / Compact Blocks. So you've actually got two forces working together to increase where the optimum limit is going to fall over time (again, assuming such a limit is even needed in the first place), which should really drive home the utter insanity of the "1MB4EVA" position.
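A toy way to see the marginal argument (the curves below are hypothetical and purely illustrative; they are mine, not anything from the post above): keep raising the limit while the marginal benefit of relieved congestion exceeds the marginal cost, and note that once the limit exceeds organic demand, both drop to zero.

```python
# Toy sketch of the marginal-cost vs. marginal-benefit framing above.
# The curve shapes are hypothetical, chosen only to illustrate the logic.

def marginal_cost(limit_mb: float, demand_mb: float) -> float:
    """Toy cost of one more MB of limit (node burden, orphan risk).
    A limit above organic demand is non-binding, so the cost is zero."""
    return 0.2 * limit_mb if limit_mb < demand_mb else 0.0

def marginal_benefit(limit_mb: float, demand_mb: float) -> float:
    """Toy benefit of one more MB of limit: the congestion relieved,
    i.e. the demand still being squeezed out by the cap."""
    return max(0.0, demand_mb - limit_mb)

demand = 8.0  # hypothetical organic demand per block, in MB
for limit in (1.0, 2.0, 4.0, 7.0, 8.0, 16.0):
    cost = marginal_cost(limit, demand)
    benefit = marginal_benefit(limit, demand)
    if benefit > cost:
        verdict = "raise it"
    elif benefit == cost == 0.0:
        verdict = "limit no longer binds"
    else:
        verdict = "optimum lies below"
    print(f"limit {limit:>4.1f} MB: benefit={benefit:4.1f} cost={cost:4.1f} -> {verdict}")
```

In this picture the time argument is just two curve shifts: growing demand pushes the benefit curve up, and improving technology pushes the cost curve down, so the crossover (the optimum limit) moves steadily higher.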

 
Last edited:

Zarathustra

Well-Known Member
Aug 28, 2015
1,439
3,797
@Zarathustra
The problem with comparing Roman data to current era data is that the rate of change in technology is vastly different. The rise and fall of the Roman Empire spanned about 1100 years and yet applied tech changed very slowly. Woodworking, metallurgy, pottery, clothing, transport, writing, food production had many of the same principles employed century after century with just glacial improvements.

Today, we see the rate of technological change making new products obsolete in a decade or two (remember the Walkman, typewriters, amalgam fillings, etc. etc.) The doubling time of knowledge is now less than 10 years. Technology really solves a lot of the "Peak [insert consumable]" problems. We haven't begun to see the impact of graphene sheets on consumer products or self-assembly nanotech for production or gene-editing for general medicine. Even something like climate change could be addressed in a few decades with artificial trees removing excess CO2 from the air. A chart reaching out to 2100 can't assume static technology.
Yes, each new civilization is even crazier, even more complex, with an even greater height to fall from. Nature is far too complex for me to ever trust geoengineering to solve the problems. It's the opposite: the sixth great extinction is happening at exponentially increasing speed.

"Do not disturb a complex system, since we do not know the consequences of our actions owing to complicated causal webs. I explicitly explained the need to 'leave the planet the way we got it'." (Nassim Taleb)

http://www.huffingtonpost.com/nassim-nicholas-taleb/my-letter-addressing-the_b_270737.html
 
Last edited:

21isthenew42

New Member
Sep 21, 2015
2
6
Yet another (but maybe simpler) way of framing the block size debate:

We should first acknowledge that there are indeed benefits associated with a constraining block size limit -- not necessarily net benefits, but certainly "all-else-equal"-type benefits. The most prominent examples of these benefits are the reduced cost of running a full node and the reduced relative importance (and thus, theoretically, the reduced "centralizing" effect) of large miners' "self-propagation advantage" vis-a-vis orphaning risk.

Unfortunately, "all else is never equal" and there are also very significant costs associated with a constraining block size limit: higher fees, less predictable fees, slower and/or less predictable confirmation times, and lower on-chain throughput forcing users to use "off-chain" solutions that are inherently riskier.

Obviously, moving from a block size limit of 0 kb (Bitcoin doesn't exist) to a block size limit of maybe 10 kb represents a situation where the benefits outweigh the costs. Similarly, almost everyone agrees that moving from a block size limit of 10 kb (allowing for perhaps a literal handful of transactions per block) to the current 1-MB limit represents a significant net benefit.

The question is whether or not there's a point at which the marginal costs of increasing the block size limit begin to outweigh the marginal benefits. If so, that transition point is obviously going to be the optimum place to set the limit. Of course, we should note the possibility that the answer to this question is actually no, because at a certain point the costs of an increased consensus-type limit go to zero. Why? Because the consensus-type limit ceases to have any effect at the point at which it becomes greater than the natural, equilibrium block size limit. These are the conditions under which Bitcoin operated for several years when the 1-MB limit was very large relative to actual transactional demand.

But let's assume for the moment that the answer to that question is yes and thus that a constraining block size limit makes sense. In terms of thinking about where that limit is likely to be, I think it's important to note that the costs of a particular limit should be expected to increase over time as adoption and transactional demand increase. Meanwhile, the benefits of a particular limit should be expected to decrease over time as a result of a) general technological improvements in, e.g., storage / bandwidth and b) Bitcoin-specific advances like Xthin / Compact Blocks. So you've actually got two forces working together to increase where the optimum limit is going to fall over time (again, assuming such a limit is even needed in the first place), which should really drive home the utter insanity of the "1MB4EVA" position.

You might like this... "defining f(decentralization)"

https://medium.com/@DominikSennes/defining-f-decentralization-for-the-bitcoin-network-and-scaling-the-scale-5d65d7ffe15c#.399fl6x9f