Lightning Network

John W. Ratcliff

New Member
Dec 31, 2015
Gavin,

I have a question that I would like your expert opinion on. Based on slide #52 of the Lightning Network presentation (https://lightning.network/lightning-network.pdf), many are claiming that 50 million people can use the existing bitcoin network with just 1mb blocks. In fact, some have even been arguing that we should lower the blocksize limit below 1mb. This claim has been repeated over and over again these past months as the main reason why there is no need to raise the blocksize limit at all for years to come, since the current number of 'active' users is probably well below a million.

This claim is based on the expectation that 100% of all transactions on the bitcoin network will be only 250 bytes in size (currently they average 600 bytes) and that 'active users' will only need to perform 4 transactions per year to open and close just two payment channels. Even ignoring the question of whether opening and closing just two channels a year is a reasonable use case, and ignoring the fact that these numbers fail to account for how anyone acquired bitcoin in the first place to open their initial channel, the claim raises still further questions.
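
For reference, here is a back-of-the-envelope sketch of where that figure appears to come from, using the slide's own assumptions plus a 10-minute average block interval (none of which I am endorsing):

```python
# Reproducing the "~50 million users" arithmetic under the slide's
# assumptions: 1 MB blocks, ~10-minute block interval, 250-byte
# transactions, and 4 on-chain transactions per user per year.

BLOCK_SIZE_BYTES = 1_000_000       # 1 MB blocksize limit
BLOCK_INTERVAL_MIN = 10            # average minutes between blocks
TX_SIZE_BYTES = 250                # the slide's optimistic transaction size
TXS_PER_USER_PER_YEAR = 4          # open and close two channels per year

txs_per_block = BLOCK_SIZE_BYTES // TX_SIZE_BYTES        # 4,000
blocks_per_year = 365 * 24 * 60 // BLOCK_INTERVAL_MIN    # 52,560
txs_per_year = txs_per_block * blocks_per_year           # ~210 million
users_supported = txs_per_year // TXS_PER_USER_PER_YEAR  # ~52.6 million

print(f"{txs_per_block=}  {txs_per_year=:,}  {users_supported=:,}")

# With today's ~600-byte average transaction the same arithmetic gives
# roughly 88 million transactions per year, or about 22 million such users.
print(f"{BLOCK_SIZE_BYTES // 600 * blocks_per_year // TXS_PER_USER_PER_YEAR:,}")
```

Note that this is purely a throughput ceiling; it says nothing about how long anyone would actually have to wait, which is the question below.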

I believe it is true that if 50 million people are each supposed to post two transactions every six months, consuming 100% of the entire bitcoin network capacity, then the typical new user in this situation will face an average backlog of roughly three months (half the six-month sampling period).

Is this correct? Would not the 'average' wait time in this highly congested, over-saturated system be on this order of magnitude?

I believe this is the case. And, if so, it raises a number of other issues. If there is an average wait time of 3 months to get a transaction processed, obviously that breaks all kinds of things, including payment channels themselves. Not to mention the fact that having to wait 3 months simply to open a channel would arguably make the bitcoin network somewhat useless.

I'm just trying to bring a bit of a 'reality check' to some of the more extreme claims that I have heard. The problem here is that non-technical people keep repeating these numbers in public forums as if they were gospel truth, simply because they appear on the LN slides.

Many thanks for all of your great work and contributions, and I look forward to your response.

John
 

Bloomie

Administrator
Staff member
Aug 19, 2015
Hi John, great to see you on the forum. I'm splitting this off into a separate discussion and tagging @Gavin Andresen so he can see it.
 

John W. Ratcliff

New Member
Dec 31, 2015
Thanks, I should have done that myself. I'm very curious what people have to say on this topic. I sometimes feel like I'm the only one who sees this as an issue, because everyone else keeps repeating these claims as if they were facts.
 

Matthew Light

Active Member
Dec 25, 2015
Anyone who thinks Bitcoin is destined to be a niche network (50 million users) ought to sell their coins now and move on.
 

John W. Ratcliff

New Member
Dec 31, 2015
>>Anyone who thinks Bitcoin is destined to be a niche network (50 million users) ought to sell their coins now and move on.

Thanks for your reply, but it doesn't really answer the question about the relative backlog if 50 million people were trying to perform two transactions every six months.

Regarding your point, you do realize that barely 2 million people can 'actively' use the existing bitcoin network, and that is using a very loose definition of the word 'active' as performing only two transactions a month?

The purpose of the Lightning Network is to act as a force multiplier on top of the existing bitcoin blockchain: a compromise that lets users hold value on the main network while shifting payment transactions to a layer-2 network. I happen to think that this is an excellent idea and something we should move towards.

The goal of my post was to 'fact check' some of the force multiplier claims on the LN website which appear to me to be off by nearly an order of magnitude. Also, it does seem to me that if we ever tried to support that many users with the current blocksize there would be an average backlog of three months simply to open a channel. I might be wrong on that, thus the question to Gavin.

The plan of record, regarding the LN, is that the blocksize will be scaled up to accommodate more users once we actually reach so many users (not payment transactions but users) that they cannot open or close channels in a timely fashion. I actually agree with this long-term plan; I don't think using the main bitcoin network for micro-transactions or low-value payments is a good idea. We are using it that way now, as the fees are still only around 8 cents per transaction, but if we ever grow to more than a few hundred thousand active users that use case obviously fails.

We will need the LN to offload day-to-day payment transactions or, if not the LN, some other layer-2 solution which pegs value against the main blockchain. The issue, as far as I am concerned, is that none of those layer-2 solutions are ready today.

In the meantime, I think it is both relevant and important to call out LN if they are claiming a force multiplier that is off by more than an order of magnitude and would actually break the bitcoin network if you ever tried to use it the way they suggest.
 

Matthew Light

Active Member
Dec 25, 2015
>>Regarding your point, you do realize that barely 2 million people can 'actively' use the existing bitcoin network, and that is using a very loose definition of the word 'active' as performing only two transactions a month?
The Lightning network is vaporware at this point, and we don't know if or when it will ship, how well it will work and what actual usage will look like.

In the meantime, we have a Bitcoin network that works great right now; it simply needs a constant value increased or, better yet, removed and converted into a setting (a la BU).

I have nothing against a lightning-style overlay in theory. I have everything against using central planning to force it down everyone's throat as an alternative to larger blocks. We absolutely need larger blocks, no matter what, and we need to let proposals like LN, etc. stand or fall on their own merits and user uptake.

>>I don't think using the main bitcoin network for micro-transactions or low-value payments is a good idea. We are using it that way now, as the fees are still only around 8 cents per transaction, but if we ever grow to more than a few hundred thousand active users that use case obviously fails.
I don't know whether it's a good idea or not, but I want thousands / millions of Bitcoin nodes and users to make that decision, not the Federal Reserve Core developers.
 

John W. Ratcliff

New Member
Dec 31, 2015
Yeah, yeah, I get that. My question, however, was very specific. Let me just leave the LN totally out of it.

My question is: if 50 million people wanted to perform two transactions every six months, consuming exactly 100% of the total capacity of the network, what would be the average length of the backlog to get a transaction processed (assuming the current 1mb blocksize limit)? I believe the average wait time for that scenario would be three months, but maybe I'm wrong. Let's just leave the LN out of the question for a moment; this is a question of how long an average user would have to wait in such an over-capacity situation.

If you had exactly <n> people each perform exactly one transaction at perfectly spaced, regular intervals, then there wouldn't be a backlog. However, the reality is that users do not operate that way. For example, if 50 million people all submitted one transaction at the exact same time, it would obviously take six months for the lowest-priority/last transaction to get processed. I believe, and I may be interpreting this wrong, that the 'average' wait time would be half of that, using the notion of the Nyquist sampling frequency.
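
To put rough numbers on that thought experiment, here is a tiny sketch of how long a one-time burst of transactions would take to drain if the entire block capacity were devoted to it, again using the slide's 250-byte / 1 MB assumptions (so roughly 4,000 transactions per 10-minute block):

```python
# How long a one-off burst of N transactions takes to drain if every block
# is filled with nothing but that burst. Illustrative only; it ignores the
# ongoing regular demand and fee priority.

TXS_PER_BLOCK = 4_000        # 1 MB blocks / 250-byte transactions
BLOCK_INTERVAL_MIN = 10

def months_to_drain(n_txs: int) -> float:
    blocks_needed = n_txs / TXS_PER_BLOCK
    minutes = blocks_needed * BLOCK_INTERVAL_MIN
    return minutes / (60 * 24 * 30)          # rough 30-day months

print(months_to_drain(50_000_000))    # ~2.9 months for a 50-million-tx burst
print(months_to_drain(100_000_000))   # ~5.8 months for the full six-month demand
```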

Maybe a real-life analogy would help here. Take a highway and estimate its theoretical capacity by dividing its length by the space each car needs. In theory, all of those cars could operate at full speed with zero congestion, so long as they all got onto the highway at perfectly spaced intervals, all drove at exactly the same speed, and all got off the highway at exactly the same rate.

However, anyone who has ever driven on an actual highway knows that once the number of cars passes a certain point, even if there is plenty of 'space' between them, it turns into a massive traffic jam slowing everyone down. I think this analogy holds for the bitcoin network, with the network being the highway and the transactions being the cars.
 

Mengerian

Moderator
Staff member
Aug 29, 2015
@John W. Ratcliff "My question is, if 50 million people wanted to perform two transactions every six months consuming exactly 100% of the total capacity of the network, what would be the average length of the backlog to get a transaction processed (assuming the current 1mb blocksize limit)?"

This is a queuing theory question.

If demand is exactly equal to throughput capacity, with some random variation, then there is no well-defined average wait time: the backlog behaves like a random walk and can grow without bound.

Think of it this way: if demand is below capacity, there will be some average wait time that can be derived mathematically. If demand is greater than capacity, the wait time will grow linearly over time. The situation you describe is the boundary between these two regimes.
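
To make the shape of this concrete, here is a toy single-server M/D/1 queue (Poisson arrivals, deterministic service); it is only a textbook approximation and ignores block batching entirely, but it shows how the average wait behaves as utilization approaches 100%:

```python
# Mean M/D/1 queueing delay, measured in units of one service time:
#   W_q * mu = rho / (2 * (1 - rho))
# Finite below capacity, divergent as rho -> 1 (the boundary case above);
# above capacity (rho > 1) the backlog simply grows linearly over time.

def mean_wait_in_service_times(rho: float) -> float:
    return rho / (2 * (1 - rho))

for rho in (0.5, 0.9, 0.99, 0.999):
    print(f"utilization {rho:.3f} -> mean wait {mean_wait_in_service_times(rho):.1f} service times")
# e.g. 0.5 at 50% utilization, 4.5 at 90%, 49.5 at 99%, 499.5 at 99.9%
```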
 

John W. Ratcliff

New Member
Dec 31, 2015
Ok, thanks. That seems like a rather involved process to get an answer, but a hell of a lot more accurate than my guess of 1/2 the sampling period. When I find time (???) I think I will write a simulator for the bitcoin network to try to get some real-world answers based on known constraints (RBF, fee amounts, etc.).
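
As a rough starting point, something along these lines is what I have in mind; every parameter here is a placeholder, and RBF, propagation delay, and miner policy differences are left out entirely:

```python
# Minimal mempool simulator: Poisson transaction arrivals with random fee
# rates, Poisson block intervals, and highest-fee-first selection up to the
# blocksize limit. Records confirmation delays and the leftover backlog.

import heapq
import random

BLOCK_SIZE = 1_000_000          # bytes
MEAN_BLOCK_INTERVAL = 600.0     # seconds
TX_SIZE = 250                   # bytes (the slide's optimistic assumption)
DEMAND_RATIO = 1.0              # arrival rate as a fraction of capacity
SIM_BLOCKS = 5_000

ARRIVAL_RATE = DEMAND_RATIO * (BLOCK_SIZE / TX_SIZE) / MEAN_BLOCK_INTERVAL

def simulate(seed: int = 1) -> None:
    rng = random.Random(seed)
    mempool = []                          # max-heap keyed by fee rate
    now = 0.0
    next_tx = rng.expovariate(ARRIVAL_RATE)
    delays = []
    for _ in range(SIM_BLOCKS):
        now += rng.expovariate(1.0 / MEAN_BLOCK_INTERVAL)   # next block found
        while next_tx <= now:                               # arrivals since last block
            fee_rate = rng.lognormvariate(0.0, 1.0)         # arbitrary fee distribution
            heapq.heappush(mempool, (-fee_rate, next_tx))
            next_tx += rng.expovariate(ARRIVAL_RATE)
        for _ in range(min(BLOCK_SIZE // TX_SIZE, len(mempool))):
            _, arrived = heapq.heappop(mempool)             # best-paying tx first
            delays.append(now - arrived)
    print(f"backlog after {SIM_BLOCKS:,} blocks: {len(mempool):,} transactions")
    print(f"mean confirmation delay: {sum(delays) / len(delays) / 60:.1f} minutes")

simulate()
```

With DEMAND_RATIO set below 1.0 the backlog stays small and the mean delay settles down; at 1.0 or above it does not, which matches the queueing-theory answer above.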
 

tynwald

Member
Dec 8, 2015
It could be modelled as the difference between two Poisson processes: transactions being created (block demand) and transactions being processed (block supply). Some adjustment for the network capacity being hit on an almost continuous basis also needs to be factored in.
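
Taking that description literally, a minimal sketch might look like this (treating the supply side as a Poisson number of transaction slots per block interval, which is itself a simplification, and reflecting the backlog at zero since it can never go negative):

```python
# Backlog as the running difference between two Poisson processes:
# transactions created (demand) versus transaction slots processed (supply),
# reflected at zero. With the two rates equal, the backlog does not settle
# around any average; it wanders like a random walk.

import numpy as np

def backlog_path(steps: int, demand: float, supply: float, seed: int = 7):
    rng = np.random.default_rng(seed)
    created = rng.poisson(demand, steps)      # txs created per block interval
    processed = rng.poisson(supply, steps)    # tx slots available per block interval
    level, path = 0, []
    for c, p in zip(created, processed):
        level = max(0, level + int(c) - int(p))
        path.append(level)
    return path

# Demand exactly equal to supply (an average of 4,000 tx per block):
path = backlog_path(steps=52_560, demand=4_000, supply=4_000)   # ~one year of blocks
print(f"backlog after a simulated year: {path[-1]:,} transactions")
print(f"largest backlog seen:           {max(path):,} transactions")
```

With the demand and supply rates exactly equal, the final and peak backlog figures vary wildly from seed to seed, which is the practical meaning of the earlier point that no average wait time can be determined in that regime.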