Gold collapsing. Bitcoin UP.

jonny1000

Active Member
Nov 11, 2015
380
101
yrral86 said:
in a perfectly competitive market you can ignore fixed costs
In a competitive environment, the rational miner decides whether or not to include each transaction based only on the extra cost of including that particular transaction; this maximizes profit in the short term. That extra cost is not likely to include any significant electricity cost, and by definition it excludes fixed costs. This is rational behavior in a competitive market.

yrral86 said:
Perhaps your definition of "marginal cost" already includes amortized fixed costs.
No it does not. By marginal cost, I mean the addition to total costs of adding one extra transaction to the block, compared to excluding that transaction.

Let me show the maths:

Marginal cost of including a transaction = total cost after inclusion - total costs before inclusion

Electricity and amortized hardware costs are likely to have to be paid regardless of the decision to add one extra transaction. Fixed costs are by definition excluded. This marginal cost probably contains nothing significant, except orphan risk costs.
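To put a rough number on that, here is a toy sketch (my own made-up figures and a deliberately simple Poisson propagation model, not anyone's real data) of what the marginal cost looks like when the only per-transaction cost is orphan risk:

```
import math

BLOCK_INTERVAL = 600.0  # seconds, average time between blocks

def orphan_probability(propagation_delay):
    """Chance a competing block appears while ours is still propagating
    (simple Poisson model: P = 1 - exp(-t / 600))."""
    return 1.0 - math.exp(-propagation_delay / BLOCK_INTERVAL)

def marginal_cost_of_tx(tx_size_bytes, block_value_btc,
                        current_delay_s, bytes_per_second):
    """Increase in expected orphan loss from adding one more transaction.
    Electricity and hardware costs do not appear here: they are paid
    whether or not this transaction is included."""
    extra_delay = tx_size_bytes / bytes_per_second
    p_before = orphan_probability(current_delay_s)
    p_after = orphan_probability(current_delay_s + extra_delay)
    return block_value_btc * (p_after - p_before)

# Toy numbers: a 500-byte tx, 12.5 BTC of block value, a block that already
# takes 2 s to propagate, and 1 MB/s of effective relay throughput.
print(marginal_cost_of_tx(500, 12.5, 2.0, 1_000_000))
# ~0.00001 BTC: tiny, but not zero, and all of it is orphan risk.
```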

yrral86 said:
If that is not the case, can you please try to explain in simple terms how a competitive market magically forces producers to operate at a loss
It is not magic, but normal economics. If fixed costs are high and marginal costs are sufficiently low, in a perfectly competitive environment, the industry makes losses. There is nothing "magical" about this, the industry has become nonviable and that happens all the time in the free market. Some large blockers have this odd view that the free market means every industry remains viable.

Please keep in mind that it is common for some industries to go through periods of mass losses and bankruptcies, particularly traditional commodity mining. This is usually not a problem; it is a normal part of economics and of the reallocation of resources. The free market does sometimes result in companies operating at a loss and that is totally fine and healthy. If the industry becomes nonviable and loss-making, that is not a problem: the industry just goes away. This process has happened countless times in human history.

The Bitcoin mining industry has a positive externality, which is the security of the Bitcoin network. We need to ensure Bitcoin mining is always viable and the state of the industry is healthy, to protect the network.

Miners will not be "forced" to operate at a loss. Rational miners will keep operating at a loss as long as their revenue still makes a contribution towards their fixed costs. Eventually such a miner will need to shut down.
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,995
furthermore, at 12.5BTC reward, this still dwarfs total block tx fees and miners will continue to emphasize near network average blocksizes (in a 2MB environment) to minimize orphaning and maximize reward.
 

yrral86

Active Member
Sep 4, 2015
148
271
@jonny1000

Thank you. I believe I understand your concern now. It is rational for miners to include any transaction whose fee meets or exceeds their marginal cost. If they do not include it, any miner with lower costs can include it and work towards paying for their fixed costs, so it would be a benefit to their competitor. Only the miner with the lowest fixed costs can afford to not include a transaction that only meets their marginal costs (since anyone else including it would be taking a loss) and it is rational for them to accept any transaction that exceeds their marginal cost by the difference between their amortized fixed costs and the same costs of the next most efficient miner.

Without a subsidy, there is the potential for a race to the bottom leaving no room for the fixed costs. I know this will be an unpopular statement, but it seems like the only way to fix this is to have a constant level of inflation that pays for the security of the chain.

However, coming back to Bitcoin (which will most likely never agree to adding inflation), how is this concern influenced by block size? Potentially, all fixed costs are a liability. Any rise in fixed costs makes it harder for miners to participate. Increasing blocksize will increase fixed costs related to connectivity and storage. However, by that logic decreasing blocksize will decrease fixed costs and make the network more likely to survive.

@all
Are the exponentially decreasing subsidy and the hard cap on bitcoin actually a fatal bug? Or is there some other mechanism that will ensure that security is paid for in a post-subsidy era? It seems like the only way Bitcoin can continue to work is if transactors are willing to pay more than the marginal cost for their transaction's inclusion. On the one hand, this is irrational because it means they are giving up more money than is absolutely necessary to get their transaction in a block. However, it seems like it might be the rational thing to do because it is the only way to ensure the network can afford to operate in a secure fashion.

This of course leaves us in the position of relying on transactors to pay more than is strictly necessary. It seems like it could work out okay, but I can't see any way to prove it. It leaves us in a position of trusting transactors to not get too greedy.

I hope someone can point out where I've gone wrong in this line of thinking.
 

jonny1000

Active Member
Nov 11, 2015
380
101
yrral86 said:
Only the miner with the lowest fixed costs can afford to not include a transaction that only meets their marginal costs (since anyone else including it would be taking a loss) and it is rational for them to accept any transaction that exceeds their marginal cost by the difference between their amortized fixed costs and the same costs of the next most efficient miner.
In a competitive environment, it is perfectly possible for all miners, even the one with the lowest amortized fixed costs, to keep including more transactions up to the point where the fee equals the marginal cost. I do not get your point here.

yrral86 said:
Without a subsidy, there is the potential for a race to the bottom leaving no room for the fixed costs. I know this will be an unpopular statement, but it seems like the only way to fix this is to have a constant level of inflation that pays for the security of the chain
I think you are finally getting it! I, and many others, have always assumed the solution to this problem was to have an economically relevant blocksize limit. Luckily there already is a blocksize limit. That is why we need to fight so hard to defend it! That certainly does not mean we need 1MB forever, that would be mad. What it does mean is that we eventually need a limit that "comes into play" and makes the transaction volume less than it would otherwise have been without the limit. This appears to be exactly what you guys here are so opposed to.

A constant level of inflation has all kinds of problems (setting aside the usual ideological and economic theory objections):
  • The arbitrary level that will need to be chosen
  • Artificially high level of environmental damage
  • Lack of connection between miners and their customers, meaning the mining industry could be unresponsive to the needs of users
I think a blocksize limit is an excellent solution to this problem. We could still have a multi-gigabyte limit in many years' time. We just need to realize that the idea of totally removing the limit is not workable.
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,995
first off, it's not possible to figure out today what the exact economics of Bitcoin will be in 2140 when the subsidy runs out. what we can reasonably determine is what is happening today; full blocks with delays and unreliability causing a slowing of new users. fixing that should be the #1 goal. the easiest way to do that is to increase the blocksize to re-establish a limit well above demand, even if it means a small bump to 2MB. as i said above, in the shorter term at 12.5BTC subsidy, miners have no incentive to recklessly make huge blocks even in the absence of a limit. they want the subsidy to the point where they will even continue to mine 0-1 tx blocks.

what i see missing in the analyses above is the importance of price appreciation of the coin. miners mine BTC, not dollars. holding back some coin in reserve should be part of a miner's long term strategy. they should be leveraging the power of a fixed supply currency. as bigger blocks allow more new users attracted by low tx fees, price should continue to go up and help pay expenses for miners. i mined for 3 yrs much of the time at a loss. time and price appreciation fixed that and good. after all, there should be no question who has won the investing game in Bitcoin to this point; the buyers and hodlers of coin. as more new users enter the system, Bitcoin will eventually acquire its "unit of account" status to truly become real money. it already has the two requirements of medium of exchange and store of value. there can be no price estimation on the value of becoming a steward in that system (mining).

@jonny1000 is advocating a limit. ok, who decides what the limit is? kore dev? i don't think so. only the miners and users engaging in the economic tx can determine what their fees will be. there are a myriad of factors that will go into this and no amount of theoretical reasoning by @jonny1000 can determine the outcome. there will not be a Tragedy of the Commons. let the free market work it out and everything will be fine.
 

jonny1000

Active Member
Nov 11, 2015
380
101
first off, it's not possible to figure out today what the exact economics of Bitcoin will be in 2140 when the subsidy runs out. what we can reasonably determine is what is happening today; full blocks with delays and unreliability causing a slowing of new users. fixing that should be the #1 goal. the easiest way to do that is to increase the blocksize to re-establish a limit well above demand, even if it means a small bump to 2MB.
I totally agree. I am perfectly fine with kicking the can down the road several years. I am totally fine with 2MB of non witness data now. However, that is also why I opposed XT so strongly, because locking in 8GB now is not sensible either. Now that we agree on this, can we please work to get 2MB in a collaborative way and end the unnecessary and divisive confrontation?

@jonny1000 is advocating a limit. ok, who decides what the limit is? kore dev? i don't think so.
I agree having development teams decide on the limit is a bad idea. This is why I support BIP100 and allowing the mining industry to vote on the limit over a long period of time. This solves the tragedy of the commons and other economic concerns mentioned recently in this thread, and results in dynamic, market-driven increases in the blocksize, which I hope you guys want. Unfortunately, it seems the current situation is too confrontational and difficult to get something like that through right now. Therefore let's kick the can down the road several years and work on these dynamic solutions in the meantime.
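Purely as an illustration of the general shape of a miner-voted limit (the actual BIP100 rules and parameters differ from this; the percentile and the 5% step here are placeholders of mine):

```
def next_limit(current_limit, votes, max_step=1.05):
    """Toy version of a miner-voted block size limit.

    `votes` holds one size vote per block from the last adjustment period.
    Take a low percentile so a small minority cannot force the limit up,
    and bound how far the limit can move per period. The real BIP100
    rules and parameters differ; this only shows the general shape.
    """
    votes = sorted(votes)
    quorum_vote = votes[len(votes) // 5]          # 20th percentile of votes
    raised = min(quorum_vote, current_limit * max_step)
    return max(raised, current_limit / max_step)

# Example: most miners vote 2 MB, a few vote much higher.
print(next_limit(1_000_000, [2_000_000] * 1600 + [8_000_000] * 416))
# -> 1050000.0 (the limit moves by at most 5% per period)
```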

let the free market work it out and everything will be fine.
I have already explained that without a limit the economics can break down. It appears you have an ideological attachment to the idea of a "free market", even in this very niche and specific situation where it is not relevant.
 

Zarathustra

Well-Known Member
Aug 28, 2015
1,439
3,797
there will not be a Tragedy of the Commons. let the free market work it out and everything will be fine.
There already is a Tragedy. We are forced to interact with the totalitarian traitors and their supporters within the community, which became a fractious society, which is a tautology. Society means fraction/fracture. That's why we have to fork away as far as possible from those people, to reanimate the common.
 

Inca

Moderator
Staff member
Aug 28, 2015
517
1,679
@jonny1000

I'm a computer scientist, not an economist, but I don't understand how you can possibly claim that in a perfectly competitive market you can ignore fixed costs. If that were true, every miner would be operating at a loss. The hardware and the electricity and the connectivity and the cooling and the staff and the building all have to be paid for regardless of how many transactions are put into the block.

Perhaps your definition of "marginal cost" already includes amortized fixed costs. If that is the case, my point holds and fees will never go to zero. If that is not the case, can you please try to explain in simple terms how a competitive market magically forces producers to operate at a loss. If it does somehow force producers to operate at a loss, then I see your point.
This was the point I was stumbling through logically, but which you make eloquently.
The marginal cost isn't particularly important - if miners are to maintain profitability then they will act in their own interests to mine as many fees as possible to cover their costs, especially when in the future costs trend towards fees. Miners' costs are to a degree fixed (they can of course turn off during peak electricity times), whilst block sizes may vary based upon transactional demand or perceived orphan risk.

Thus instead of trying to calculate a marginal cost per additional transaction for a miner it makes sense to think like a miner and work from their fixed costs backwards instead to arrive at a reasonable calculation for how much income a block needs to generate. Then using the exchange price and the number of transactions in a block the actual fee to maintain profitability can be calculated. This process is dynamic, and in a true market miners will compete to offer the lowest fees.
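As a rough sketch of that backwards calculation (all figures invented):

```
def breakeven_fee_per_tx(fixed_cost_usd_per_day, blocks_won_per_day,
                         txs_per_block, btc_price_usd, subsidy_btc):
    """Work backwards from a miner's fixed costs: how much fee income does
    each transaction need to contribute so a block covers its share of
    the daily bill?"""
    cost_per_block_usd = fixed_cost_usd_per_day / blocks_won_per_day
    subsidy_usd = subsidy_btc * btc_price_usd
    fee_income_needed_usd = max(0.0, cost_per_block_usd - subsidy_usd)
    return fee_income_needed_usd / txs_per_block

# Made-up example: a farm spending $50,000/day that wins 10 blocks/day,
# 2,000 txs per block, $650 per BTC.
print(breakeven_fee_per_tx(50_000, 10, 2_000, 650, 12.5))  # -> 0.0 (subsidy covers it)
print(breakeven_fee_per_tx(50_000, 10, 2_000, 650, 0))     # -> 2.5 USD per tx without subsidy
```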

There is no need for a blocksize limit in this context.

Furthermore, limiting the blocksize to prevent mining centralisation (which has actually already occurred due to economies of scale / ASICs) is illogical unless the maximum blocksize limit chosen is based upon dynamic experimental network orphan risk levels - and makes zero sense at 1mb.

Neither of these justifications for continuing a blocksize limit at 1mb is remotely convincing, especially when the block reward will make up the majority of mining income for many years into the future.

Which leaves the real reasons.

Without a subsidy, there is the potential for a race to the bottom leaving no room for the fixed costs. I know this will be an unpopular statement, but it seems like the only way to fix this is to have a constant level of inflation that pays for the security of the chain.
What you are describing (as mining matures and silicon reaches a performance and cost plateau) is the inevitable charge towards costs of energy almost entirely driving mining. It isn't hard to foresee a situation where miners switch on and off constantly based upon global electricity costs throughout the day. Remember, miners are in business and in the longer run they have to be profitable or they go bust; let the market decide :)
 

jonny1000

Active Member
Nov 11, 2015
380
101
Inca said:
Thus instead of trying to calculate a marginal cost per additional transaction for a miner it makes sense to think like a miner and work from their fixed costs backwards instead to arrive at a reasonable calculation for how much income a block needs to generate.
As I keep repeating, I mostly agree with you here. In most circumstances this logic is likely to be correct: miners will not include more transactions to maximize short term profit, because they want to keep the fee level high to cover their fixed costs in the longer term.

The above logic is likely to be false in the scenario where the mining industry is highly competitive. In this scenario miners are likely to focus on maximizing short term profit, or in other words "next block profit maximization". In this case miners will keep adding more and more transactions to a block, up until the point where the fee equals the marginal cost; the revenue may then not cover the fixed costs, and orphan risk costs will be high relative to revenue. This could have catastrophic consequences on the network.

What is your argument? Are you saying the situation is exactly the same regardless of the level of competition in the industry? Please think carefully about the next block profit maximization strategy. Once you assume this strategy, it is clear miners keep including transactions up to the point where the fee equals the marginal cost, ignoring fixed costs.
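A minimal sketch of that next block profit maximization strategy, with toy fees and a flat, assumed marginal cost per transaction:

```
def build_block_next_block_maximizer(mempool, marginal_cost_of):
    """'Next block profit maximization': include every transaction whose fee
    at least covers the marginal (orphan-risk) cost of including it.
    Fixed costs never enter the comparison; they are paid either way."""
    return [tx for tx in mempool if tx["fee"] >= marginal_cost_of(tx)]

# Toy mempool and a flat 0.00001 BTC marginal cost per transaction.
mempool = [
    {"txid": "a", "fee": 0.00050},
    {"txid": "b", "fee": 0.00002},
    {"txid": "c", "fee": 0.000005},   # below marginal cost: left out
]
block = build_block_next_block_maximizer(mempool, lambda tx: 0.00001)
print([tx["txid"] for tx in block])   # -> ['a', 'b']
```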

Inca said:
Neither of these justifications for continuing a blocksize limit at 1mb is remotely convincing,
Agreed, I am not arguing for a 1MB limit. Does this comment from you mean you do see this argument has some merit in arguing against no limit?
 

freetrader

Moderator
Staff member
Dec 16, 2015
2,806
6,088
This could have catastrophic consequences on the network.
Please could you elaborate on these "catastrophic consequences" that you envision.

It's also not clear to me how miners will ignore their fixed costs. Seems to me that most miners would have a very clear idea about those. One cannot rule out the occasional irrational actor, but your statement assumes irrationality for all under certain assumptions. I don't think that's rational.
 

jonny1000

Active Member
Nov 11, 2015
380
101
freetrader said:
Please could you elaborate on these "catastrophic consequences" that you envision.
I explained above the negative consequences if total orphan risk cost / total miner revenue is high:
  • Mining centralization pressure, resulting in the clustering of the mining network in a few or one geographic locations
  • Other negative consequences of high orphan risk such as wasted work

freetrader said:
It's also not clear to me how miners will ignore their fixed costs. Seems to me that most miners would have a very clear idea about those. One cannot rule out the occasional irrational actor, but your statement assumes irrationality for all under certain assumptions.
I keep repeating, this does not assume irrational miners. It assumes miners maximize short term profit, which is rational in some circumstances.

When constructing the block the miner may not consider the fixed costs, because the fixed costs do not change depending on how they construct the block. Instead miners may consider only the marginal items that do change depending on how they construct their block. They may focus on short term profit maximization to maximize the contribution to their fixed costs. This would mean including every transaction up until the point where the marginal cost equals the fee.
 

freetrader

Moderator
Staff member
Dec 16, 2015
2,806
6,088
@jonny1000
  • Mining centralization pressure, resulting in the clustering of the mining network in a few or one geographic locations
  • Other negative consequences of high orphan risk such as wasted work
We have both of these today.
Do you consider Bitcoin to be in a catastrophic situation due to these today?
If so, what's your suggestion for extracting Bitcoin from this situation?
Because I keep hearing that a block size limit is a good idea, and we already have that.
It doesn't seem to help.

They may focus on short term profit maximization to maximize the contribution to their fixed costs. This would mean including every transaction up until the point where the marginal cost equals the fee.
Excuse my ignorance, but isn't this what one would expect from a rational miner?

Is your theory that by including fewer transactions miners would supposedly be acting in their longer term interest, I assume by raising average fees?
If that's true (and I can see it being a valid dynamic that miners will take into account), then rational miners would do that as needed, with or without a block size limit, no?
 

Inca

Moderator
Staff member
Aug 28, 2015
517
1,679
The above logic is likely to be false in the scenario where the mining industry is highly competitive. In this scenario miners are likely to focus on maximizing short term profit, or in other words "next block profit maximization". In this case miners will keep adding more and more transactions to a block, up until the point where the fee equals the marginal cost; the revenue may then not cover the fixed costs, and orphan risk costs will be high relative to revenue. This could have catastrophic consequences on the network.
The only consequence of the above scenario is that profit margins will be cut to the bone, making the bitcoin network cheaper for users. This is a good thing. You could argue that will lead to further centralisation of the network but that is impossible to say, as this is going to take place in the far future when the block reward is far reduced. It is highly possible, as @cypherdoc has posited, that with a long term peak efficiency of mining silicon, mining then becomes more decentralised. (Which makes it more strange that Core are against the 21co approach of decentralising mining.)

Does this comment from you mean you do see this argument has some merit in arguing against no limit?
I am afraid I do not see a blocksize limit doing anything other than limiting income for miners and limiting transaction volume for the network unnecessarily, at present or for the foreseeable future. I say unnecessarily because the network is limited by transaction demand in the mempool and by orphan risk anyway, and the latter should be sufficient to set the upper bound of the blocksize for the foreseeable future.

Now if it became clear that a significant deterioration in the decentralisation of the network were occurring to the detriment of bitcoin because bandwidth requirements rose precipitously then I can see an argument for enforcing soft limits to keep the bitcoin network from failing. But only for technical reasons, never as an economic incentive. If bitcoin grows to a point where it reaches the limit of what is possible from a decentralised p2p network then this is a great problem to have because bitcoin is being used by hundreds of millions of people.

This leads back to the idea of Core doing something objective for once and allowing the community to set a basic set of specifications for a validating node and allow the network performance to rise up towards those specifications over a time period before they are revised in the future as technology has marched forward*. Instead we have this ridiculous situation where Gregory Maxwell is pretending 1mb is some magic number (forever) and hiding behind censorship to push on with an overly complicated and extremely economically naive scaling initiative that has led many to believe that he is untrustworthy.

EDIT: I appreciate your continued engagement on this thread by the way @jonny1000.

* @theZerg I think it would be really neat to add a few routines to the unlimited codebase to make an assessment of cpu performance, memory availability, storage capacity and run some network bandwidth performance testing to allow a robust assessment to be made of the performance of each unlimited node on the network. This could then be used to make objective assessments of how much more blocksize could reliably be tolerated by the network, potentially allowing the unlimited side of the network to state with some accuracy that, for example, 99% of the unlimited nodes are 4mb blocksize safe? This would undoubtedly encourage users to move their node to unlimited and force a response from Core along the same lines..
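Purely as a sketch of the idea (hypothetical Python, not actual BU code; the names and the thresholds are invented):

```
import shutil
import time

def benchmark_node(test_bytes, download_fn):
    """Measure what this machine can actually do, then map that to the
    largest block size it could comfortably keep up with. The rules of
    thumb below (30 s propagation budget, one year of blocks on disk)
    are invented for illustration."""
    start = time.time()
    download_fn(test_bytes)                      # e.g. fetch test data from a peer
    bandwidth_bps = test_bytes / (time.time() - start)
    free_disk_bytes = shutil.disk_usage("/").free

    safe_by_bandwidth = bandwidth_bps * 30            # propagate well inside 600 s
    safe_by_storage = free_disk_bytes / (144 * 365)   # ~blocks mined per year
    return min(safe_by_bandwidth, safe_by_storage)

# Toy run with a fake 1 MB/s "download".
print(benchmark_node(1_000_000, lambda n: time.sleep(n / 1_000_000)))
```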

EDIT: I will write a BUIP to this effect..
 

pekatete

Active Member
Jun 1, 2016
123
368
London, England
icreateofx.com
It assumes miners maximize short term profit, which is rational in some circumstances.
This would amount to fudged economics when applied to any sector, but more so in the bitcoin mining sector, where the biggest miners are also the biggest manufacturers of bitcoin mining equipment. In fact, the circumstances where this would be a rational assumption are statistically irrelevant.
 

xhiggy

Active Member
Mar 29, 2016
124
277
Thus instead of trying to calculate a marginal cost per additional transaction for a miner it makes sense to think like a miner and work from their fixed costs backwards instead to arrive at a reasonable calculation for how much income a block needs to generate. Then using the exchange price and the number of transactions in a block the actual fee to maintain profitability can be calculated. This process is dynamic, and in a true market miners will compete to offer the lowest fees.

There is no need for a blocksize limit in this context.


Which leaves the real reasons.
Yes exactly, how would this not happen? Isn't this what a logical miner would do?
The above logic is likely to be false in the scenario where the mining industry is highly competitive. In this scenario miners are likely to focus on maximizing short term profit, or in other words "next block profit maximization". In this case miners will keep adding more and more transactions to a block, up until the point where the fee equals the marginal cost; the revenue may then not cover the fixed costs, and orphan risk costs will be high relative to revenue. This could have catastrophic consequences on the network.
Why would a miner, who relies on the network to make money, do something that has been so publicly stated to be bad for the network, and therefore themselves?
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,995
Watch the recent onchain video. The guys are calculating that BU nodes will allow 20mb blocks today.

The only consequence of the above scenario is that profit margins will be cut to the bone, making the bitcoin network cheaper for users. This is a good thing. You could argue that will lead to further centralisation of the network but that is impossible to say, as this is going to take place in the far future when the block reward is far reduced. It is highly possible, as @cypherdoc has posited, that with a long term peak efficiency of mining silicon, mining then becomes more decentralised. (Which makes it more strange that Core are against the 21co approach of decentralising mining.)



I am afraid I do not see a blocksize limit doing anything other than limiting income for miners and limiting transaction volume for the network unnecessarily, at present or for the foreseeable future. I say unnecessarily because the network is limited by transaction demand in the mempool and by orphan risk anyway, and the latter should be sufficient to set the upper bound of the blocksize for the foreseeable future.

Now if it became clear that a significant deterioration in the decentralisation of the network were occurring to the detriment of bitcoin because bandwidth requirements rose precipitously then I can see an argument for enforcing soft limits to keep the bitcoin network from failing. But only for technical reasons, never as an economic incentive. If bitcoin grows to a point where it reaches the limit of what is possible from a decentralised p2p network then this is a great problem to have because bitcoin is being used by hundreds of millions of people.

This leads back to the idea of Core doing something objective for once and allowing the community to set a basic set of specifications for a validating node and allow the network performance to rise up towards those specifications over a time period before they are revised in the future as technology has marched forward*. Instead we have this ridiculous situation where Gregory Maxwell is pretending 1mb is some magic number (forever) and hiding behind censorship to push on with an overly complicated and extremely economically naive scaling initiative that has led many to believe that he is untrustworthy.

EDIT: I appreciate your continued engagement on this thread by the way @jonny1000.

* @theZerg I think it would be really neat to add a few routines to the unlimited codebase to make an assessment of cpu performance, memory availability, storage capacity and run some network bandwidth performance testing to allow a robust assessment to be made of the performance of each unlimited node on the network. This could then be used to make objective assessments of how much more blocksize could reliably be tolerated by the network, potentially allowing the unlimited side of the network to state with some accuracy that, for example, 99% of the unlimited nodes are 4mb blocksize safe? This would undoubtedly encourage users to move their node to unlimited and force a response from Core along the same lines..

EDIT: I will write a BUIP to this effect..
Wat
@jonny1000

Yes, I believe the "free market" is what got us here today, well beyond where guys like you ever envisioned. It's a hard concept, I'll grant you. Unprofitable miners will come and go but that's normal and not to be feared. We see it all the time in the commodity industry, which is very competitive. One dynamic you seem to not have considered, and which I consider quite likely, is government mining. Envision a scenario where we only have governmental mining. That actually could work and be a stable situation (not ideal) where costs may never matter.

Btw, it strikes me that your Tragedy of the Commons argument is a very old one, going back to 2009. 7.5 years have already proven you wrong. I'm trying to dig up those old threads on BCT.

Also, fuck Android, this site, and SwiftKey until they get this formatting fixed.
 

Inca

Moderator
Staff member
Aug 28, 2015
517
1,679
I have made a start on BUIP 21 which aims to add a very simple set of benchmarking tests to the Unlimited node software.

The idea being that it will allow specifications of all the nodes on the network in terms of CPU/memory/storage and network performance to be recorded and collated.

Each node will then be able to announce itself 'safe' for a block size at a higher level than 1mb.

If people move over to BU or the code is ported to Classic (or it is invented again differently in Core from an idea Maxwell originally had in 2003 :)) then this would enable the network to constantly be assessed objectively to see what impact rising block sizes would have on decentralisation and allow 'healthy' performance increases.

Thinking further ahead, miner thresholds could be set to only raise the block size when a certain proportion of the network was able to handle it (though this could be gamed with a sybil attack!).

Please add your commentary. What is needed particularly is some hard figures for bandwidth requirements from BU nodes operating on the testnet at various block sizes above 1mb.
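Collating the reports could then be as simple as something like this (made-up numbers, just to show the kind of statement the collected data would support):

```
def share_of_nodes_safe_at(reported_safe_sizes_mb, candidate_size_mb):
    """Collate per-node reports: what fraction of reporting nodes say they
    can handle blocks of `candidate_size_mb`? (Sybil resistance is a
    separate problem, as noted above.)"""
    ok = sum(1 for s in reported_safe_sizes_mb if s >= candidate_size_mb)
    return ok / len(reported_safe_sizes_mb)

# Made-up reports from 8 nodes, in MB.
reports = [20, 16, 8, 8, 6, 4, 4, 2]
print(share_of_nodes_safe_at(reports, 4))   # -> 0.875, i.e. 87.5% are "4 MB safe"
```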
 

cliff

Active Member
Dec 15, 2015
345
854
@jonny1000 - Again, I appreciate your answer. I'll come back to your post later today when I'm home and have had a chance to read through yesterday's posts more thoroughly - and I'll lay off the foul language as you requested.

I heard about the concept of fluctuating toll roads this weekend in talking with a neighbor about traffic in my area. The concept is a toll road with toll prices that fluctuate based on congestion. Texas has just recently started experimenting with this, and maybe some other states as well - not finding a lot of data out there atm. I'll see what I can dig up - there might be interesting insights/parallels for btc folks in this literature base.

http://kxan.com/2016/05/03/rates-for-mopac-express-lanes-adopted/
 

jonny1000

Active Member
Nov 11, 2015
380
101
Why would a miner, who relies on the network to make money, do something that has been so publicly stated to be bad for the network, and therefore themselves?
Yes, that is a great point xhiggy, this is a good response and a classic one. Let me try to explain further below.

Is your theory that by including fewer transactions miners would supposedly be acting in their longer term interest, I assume by raising average fees?
Yes this is a theory, though it is certainly not mine. The theory is that miners would include fewer transactions to benefit the overall network's long term interests and therefore their own long term interests. This is a typical viewpoint of large blockers, and Mike Hearn expressed this in April 2011 below:

Mike Hearn on 23 April 2011 said:
The death spiral argument assumes that I would include all transactions no matter how low their fee/priority, because it costs me nothing to do so and why would I not take the free money? Yet real life is full of companies that could do this but don't, because they understand it would undermine their own business.
Source: https://bitcointalk.org/index.php?topic=6284.msg92907#msg92907

Ironically, the day before, he thought this "death spiral" idea did seem feasible, but then changed his mind after deciding miners do care about their long term interests:

Mike Hearn on 22 April 2011 said:
The death spiral failure mode seems plausible.
Source: https://bitcointalk.org/index.php?topic=6284.msg92348#msg92348

This is one of the key parts of the blocksize debate: large blockers, like Mike Hearn, tend to think miners care about their long term future and therefore will not keep adding transactions up to the point where the fee is equal to the marginal cost. Mike's thought process is that miners are businesses and are run by people, who will make judgment calls about the long term future, the "game of life" if you will, or as he puts it, "real life". In contrast, small blockers sometimes focus on short term profit maximization, strictly rational behavior, or as I sometimes call it "next block game theory"; under this model, miners do indeed keep adding transactions up until the point where the marginal cost is equal to the fee. In this model miners do not care about the long term interests of the system and only want to benefit themselves, selfishly.

I tend to lean more towards the large blocker, Mike Hearn "game of life" idea. However, in reality, nobody is correct and nobody is wrong. The mining industry is a spectrum of miners, with some miners towards the long term end and some towards the short term end. There are some circumstances in which more miners will move towards the short term end of the spectrum; these circumstances are likely to be when the industry is highly competitive, perhaps also when the industry is more professional or has high levels of debt financing. In my view we need to ensure the industry is robust in as many situations as possible, and therefore we should keep an economically relevant blocksize limit, as a defense for when this competitive industry structure emerges. In my view there could be cycles in the industry and one day this structure will emerge. Of course, I could be wrong, but I do not think now is the time to decide to risk it.
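To make the two ends of the spectrum concrete, here is a toy comparison of the two inclusion policies (all numbers invented; by construction the next-block maximizer collects at least as much in the very next block, and nothing this simple can say which strategy does better over many blocks):

```
# A toy mempool: many low-fee transactions, a few high-fee ones (all invented).
mempool = [{"fee": 0.00001}] * 1800 + [{"fee": 0.0005}] * 200

marginal_cost = 0.000005   # assumed per-tx orphan-risk cost
reserve_price = 0.0002     # assumed self-imposed fee floor

# Short-term end of the spectrum: take everything above marginal cost.
short_term = [tx for tx in mempool if tx["fee"] >= marginal_cost]
# Long-term end (the Hearn view): leave cheap transactions out to keep fees up.
long_term = [tx for tx in mempool if tx["fee"] >= reserve_price]

def revenue(txs):
    return round(sum(tx["fee"] for tx in txs), 6)

print(revenue(short_term), len(short_term))   # ~0.118 BTC from 2000 txs
print(revenue(long_term), len(long_term))     # ~0.1 BTC from 200 txs
```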

Do you consider Bitcoin to be in a catastrophic situation due to these today?
At the moment we still have a high block subsidy to defend against these concerns.
 