Gold collapsing. Bitcoin UP.

theZerg

Moderator
Staff member
It's very interesting.

The DAC concept was invented by Daniel "bytemaster" Larimer, the founder of Bitshares. At that time he was explaining that POS is a better mechanism than POW mainly because...
Where did you get the idea that Larimer invented DACs?
 

lunar

Well-Known Member
@Norway Show just starting now. Seems very different to the one he gave on Tuesday.


I attended the meetup at the Institute of Cryptoanarchy in Prague on Tuesday night to see Andreas speak. His talk was good, specifically about scaling Bitcoin. He used an extended analogy around Usenet and how the internet began to scale: how many people kept saying it would never scale as each bump in users and use cases was added, and how essentially the internet has been 'failing to scale for 35 years.'

He runs several Core and several Classic nodes. My main takeaways were how taken he was by SegWit and Lightning ("bitcoin has just acquired a PoS system"). His predictions were that we would see both SegWit and a 2MB hard fork this year. He clearly doesn't buy the Blockstream Core "conspiracy" theories even though the clear 'conflict of interest' was pointed out to him. His justification was that the main developers concerned were just 'poor communicators' due to very analytical brains.

He came across as very calm and was still utterly convinced Bitcoin's network effect would overcome ALL obstacles. I'm inclined to agree.

Most of the good stuff was said after the show, as we all had a chance to mingle and ask a few questions. I asked if he had had a chance to check out Unlimited, and he said he loved the idea, but it was clear to me that he hasn't investigated it properly, as he didn't understand how setting the 'acceptance depth' on a node acts as resistance to the propagation of large blocks, and he seemed to think nodes/the network would be frequently forked. It's a complicated subject and I tried to point him in the right direction, but it was difficult to get much conversation going due to the crowd.
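
For anyone reading along, the mechanism I was trying to describe works roughly like this (a toy sketch only, with made-up names and numbers, not Bitcoin Unlimited's actual code): a node doesn't reject an oversized block outright, it just refuses to build on it until the chain containing it is buried 'acceptance depth' blocks deep.

```python
# Toy sketch of the "excessive block / acceptance depth" idea (not BU's real code).
# Oversized blocks are not rejected outright; the node simply resists following
# the chain containing them until enough work has been piled on top.

EXCESSIVE_BLOCK_SIZE = 1_000_000   # EB: largest block this node treats as "normal" (bytes)
ACCEPTANCE_DEPTH = 4               # AD: depth at which an excessive chain is accepted anyway

def chain_is_acceptable(chain):
    """`chain` is a list of blocks, oldest first; each block has a .size_bytes attribute."""
    for i, block in enumerate(chain):
        if block.size_bytes > EXCESSIVE_BLOCK_SIZE:
            depth_on_top = len(chain) - 1 - i      # blocks mined on top of the big one
            if depth_on_top < ACCEPTANCE_DEPTH:
                return False                       # resist: don't follow this chain yet
    return True                                    # buried deeply enough; accept it
```

So a lone huge block meets resistance, but a chain that the rest of the network keeps extending eventually gets accepted, which is why it doesn't cause the permanent forks he seemed to be imagining.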

It would be great to see him here, as the signal to noise is so high with many unique scaling options being developed within these very walls.

Overall, a great talk. Andreas, please come and join the discussion away from the nasty Reddit drama.
 

satoshis_sockpuppet

Active Member
He clearly doesn't buy the Blockstream Core "conspiracy" theories even though the clear 'conflict of interest' was pointed out to him. His justification was that the main developers concerned were just 'poor communicators' due to very analytical brains.
... Is he really that stupid?
"Analytical brains". Really? How can someone with even one brain cell still play the good faith game with these assholes?

Fuck this. We won't have Bitcoin if people like him can't comprehend what is going on.
Gavin is the most experienced Bitcoin dev and has shown that he creates solutions instead of problems. Code written by Jeff has been running on thousands of computers worldwide for years.
And still the current "core developers" are somehow the masterminds and bitcoin "wizards"?
Has he been living under a rock for the last few months??
 

BldSwtTrs

Active Member
Where did you get the idea that Larimer invented DACs?
I just witnessed it.

I followed the altcoin space closely in those days, and in particular the creation of Bitshares.

Some people talked about DAOs thereafter, but didn't expand on the concept as much as Larimer did, with all the framing of holders as shareholders and money-supply inflation as an avoidable cost.

I am pretty sure one should give bytemaster credit for both the name and the theoretical thinking.

If you don't believe me, you might believe Hoskinson, who was one of the founders of Invictus Innovations:
"Invictus Innovations Inc. (I3) is the launching entity for the Bitshares platform, the inventor of the term DAC"
https://www.linkedin.com/in/charles-hoskinson-1a95a4b4

And I think the following article is the first where the term appeared on the internet (granted, it is written by Stan, the father of Dan, but I think the mind this came from is the son's):
https://letstalkbitcoin.com/bitcoin-and-the-three-laws-of-robotics
 

theZerg

Moderator
Staff member
Sounds like they claim to have invented only the term, not the concept... I still have a hard time believing even that, but don't care enough to research it. They certainly did not invent the concept.

This blog post, for example, is dated March 2009 and describes the idea that anyone could contribute to a software project (without permission) and be semi-automatically paid, out of the proceeds of the project, according to some quantitative/qualitative measurement of his/her contributions:

http://effluviaofascatteredmind.blogspot.com/2009/03/thoughts-on-gpl-open-company-concept.html

At that point Bitcoin was almost entirely unknown. But applying a crypto-currency as either/both the payout $ and as a "stock" surrogate (rewarding contributors with a worthless token and then subsequently paying out in proportion to that token) is a pretty obvious application...
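
Just to make the payout mechanics concrete, a trivial worked example (made-up numbers, nothing to do with any actual DAC):

```python
# Toy example: distribute a period's proceeds pro rata to token holders.
token_balances = {"alice": 600, "bob": 300, "carol": 100}   # the "stock" surrogate
proceeds = 50.0                                             # e.g. 50 BTC of revenue this period

total_tokens = sum(token_balances.values())
payouts = {holder: proceeds * balance / total_tokens
           for holder, balance in token_balances.items()}
print(payouts)   # {'alice': 30.0, 'bob': 15.0, 'carol': 5.0}
```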
 

albin

Active Member
Larimer basically had the best timing (possibly some combination of design and luck?) to become the guy associated with the concept.

The first portion of the initial distribution was launched as "Protoshares" (which were later convertible to Bitshares); I remember it launching maybe around Oct or Nov 2013. He got tremendous coverage, even being interviewed for a full episode of Let's Talk Bitcoin.

This was a time when a zillion people were piling into Bitcoin because of the price action obviously, and a certain proportion were undoubtedly miffed that the ASIC era had already started, so Protoshares was well-positioned to appeal to folks enamored with the fantasy of being on the very ground floor of something like Bitcoin that you could still get by just running a program on your PC.
 

Roger_Murdock

Active Member
Am I the only one who hates the phrase “off-chain scaling”? I guess my reaction when I hear that phrase is, well yeah, of course Bitcoin can "scale" off-chain. If you want to scale to “VISA levels,” and you're willing to do it off-chain, it can be pretty simple. Take the existing VISA infrastructure (or recreate it) and, instead of using it to process fiat-denominated IOUs, use it to process Bitcoin-denominated IOUs. Now obviously the “Bitcoin as settlement network” camp will say: “hey, that’s not fair! We’re not talking about simply recreating the custodial, trust-based credit and banking models of the fiat world in Bitcoin. We're talking about the Lightning Network which uses crypto-magic to enable off-chain transactions while still avoiding centralization and custodial risk. Lightning transactions are Bitcoin transactions.”

Well, no. They’re not. I’m far from an expert on the Lightning Network, but as I understand it, the proposal involves the repeated exchange of unconfirmed, unbroadcast Bitcoin transactions, i.e. potential Bitcoin transactions. Now there are some people who are very excited about the LN’s potential, and there are others who are very skeptical. But for purposes of the point I’m making it doesn’t really matter who’s right. I have no doubt that Bitcoin, by its very nature as an open, extensible network, enables the creation of some really novel, really useful “layer two” solutions (whether or not the LN proves to be one of them). But the fact remains: when you move transactions to a “layer two” solution, you have – by definition – added a layer of risk. Your layer two solution might be really, really great and a huge improvement over the traditional banking model such that the added layer of risk is relatively thin. And that’s awesome… but it’s still there. So you can’t just say: “who cares if Bitcoin’s layer one is artificially crippled? We’ll just move everything to layer two.” To the extent that on-chain scaling is artificially restricted (rather than being constrained only by technological limits), there is going to be an unavoidable deadweight loss. And all that can do is open the door for Bitcoin’s competitors.
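
To make concrete what I mean by "repeated exchange of unconfirmed, unbroadcast transactions," here is a grossly simplified sketch of a payment channel (my own illustration, not the actual Lightning protocol):

```python
# Grossly simplified payment-channel sketch: the two parties keep signing new
# commitment transactions off-chain; only the funding transaction and the final
# (or disputed) commitment ever need to be broadcast to the blockchain.

class Channel:
    def __init__(self, alice_funds, bob_funds):
        self.balances = {"alice": alice_funds, "bob": bob_funds}
        self.commitments = []                      # signed but UNBROADCAST transactions

    def pay(self, sender, receiver, amount):
        assert self.balances[sender] >= amount, "insufficient channel balance"
        self.balances[sender] -= amount
        self.balances[receiver] += amount
        # each update supersedes the previous commitment; none are broadcast
        self.commitments.append(("commitment", dict(self.balances)))

    def close(self):
        # only now does anything actually hit the chain
        return ("broadcast on-chain", self.commitments[-1])

channel = Channel(alice_funds=0.5, bob_funds=0.5)
channel.pay("alice", "bob", 0.1)
channel.pay("bob", "alice", 0.03)
print(channel.close())
```

Until that final broadcast, what each party holds is a *potential* Bitcoin transaction, which is exactly the extra layer of risk I'm talking about.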

Also, it seems to me that if there were no downside to using a “layer two” solution to make a particular payment, if it were really true that “Lightning transactions have the full security of on chain transactions” (as I saw one redditor claim), that would be a very dangerous state of affairs. If everyone could get all of the benefits of an on-blockchain transaction without actually using the blockchain (and thus paying the fees to secure it), that would seem to create a tragedy of the commons.

Edit: created a reddit thread
 

cypherdoc

Well-Known Member
/u/tsontar calls an LN payment channel one huge 0-conf tx with a time limit. If you can't close it some months later due to a spam attack or whatever, someone is going to get screwed.
 

cypherdoc

Well-Known Member
Try reading this from Rusty. It's nuts. They don't even have this part worked out:

For lightning commitment transactions, neither RBF nor CPFP apply: RBF isn't possible because you only publish the commitment transaction when the other side has vanished (so they can't sign a new transaction for you), and the outputs you can spend are all timelocked so CPFP won't incentivize a miner. We may eventually end up with a trivial non-timelocked output just to apply CPFP, but meanwhile we rely on the fact that commitment transactions are usually not published, so fees can be safely higher than normal.

This almost works, but what if one side can’t afford it? This happens initially when one side has created the channel and holds all the funds. It can also happen later when fees increase (unavoidably: your additional HTLCs might be in flight while I send the new update_fee message). While fairness is important, timely inclusion is vital, so in this case the side which can afford it tops up the fee. It’s still possible for a fee hike plus in-flight HTLCs to mean neither of us can afford the fee, but it’ll be no worse than the old fee rate.

Obviously, you can't offer a new HTLC if you can't afford the fee; in fact the amount from the first HTLC the receiver fulfils may go entirely towards paying their share of the transaction fee.

The spec currently recommends fees like so:

As the commitment transaction is only used in failure cases, it is suggested that fee_rate be twice the amount estimated to allow entry into the next block, and that nodes accept a fee_rate up to ten times that same estimate.


https://medium.com/@rusty_lightning/the-joy-of-bitcoin-transaction-fees-b2dd2de7f818#.7wo3phsuy
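
Just to spell out what that recommendation amounts to in numbers (my own back-of-the-envelope sketch, with a made-up fee estimate, not anything from Rusty's code):

```python
# What the quoted recommendation works out to, given some estimate of the
# fee rate needed for next-block inclusion (the number below is hypothetical).
next_block_fee_rate = 40                                   # sat/byte, estimated

suggested_commitment_fee_rate = 2 * next_block_fee_rate    # what a node should propose: 80
max_acceptable_fee_rate = 10 * next_block_fee_rate         # highest it should accept: 400
```

so even the "reasonable" range spans a factor of five, and the whole thing still leans on a fee estimate that can move out from under you while HTLCs are in flight.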

Rusty betrays his own feelings on LN fee determination at the end. Hey, you do what you're paid to do; code:

Conclusion
You can read the work-in-progress BOLT#2 in my repository here which contains all the gory details. My implementation is slowly approaching it as well, but expect both to change over time.
 

albin

Active Member
@cypherdoc

Does this mean that they've dismissed Maxwell's insane timelock-freezing scheme?

This is like Ptolemy building his epicycles (which I'm going to assume Luke actually believes in, incidentally?). How much leading to water does this horse need before it realizes that Lightning only has any hope of working in concert with Satoshi's intent that the block size not be artificially constrained?
 

solex

Moderator
Staff member
@cypherdoc
Thanks for the good inside info from Rusty. Since LN is being touted as a scaling solution, it should not still be in the design phase when blocks are full; it should be in the pilot phase, handling decent real-world volume. We hear that SegWit is not really a scaling solution in itself, quite apart from the problem that it may activate very slowly, if at all; and even afterwards, the head-room increase for most txns is marginal.

Core Dev need to step back and recognize that they are not in a position to allow Bitcoin's network effect to grow. They need to admit that they have run out of time and demonstrate, by accepting Classic's 2MB patch, that they are not irresponsible.
 

albin

Active Member
If I'm remembering correctly, Lightning first came to the attention of the community because of a video explaining it from some kind of conference or meetup, and it was originally presented as a way to use smart contracting to replace zero-conf, hence "Lightning" as in fast. That actually makes sense to me.

The rebranding as the messianic scaling solution I only remember popping up because of rhetoric like Peter Todd's characterization of Lightning as a "rocket ship".
 

AdrianX

Well-Known Member
... Is he really that stupid?
"Analytical brains". Really? How can someone with even one brain cell still play the good faith game with these assholes?

Fuck this. We won't have Bitcoin if people like him can't comprehend what is going on.
Gavin is the most experienced Bitcoin dev and has shown that he creates solutions instead of problems. Code written by Jeff has been running on thousands of computers worldwide for years.
And still the current "core developers" are somehow the masterminds and bitcoin "wizards"?
Has he been living under a rock for the last few months??
Bang-on post, but to me it sounded a little more like he was trying to be a good diplomat.
Am I the only one who hates the phrase “off-chain scaling”? I guess my reaction when I hear that phrase is, well yeah, of course Bitcoin can "scale" off-chain. If you want to scale to “VISA levels,” and you're willing to do it off-chain, it can be pretty simple. Take the existing VISA infrastructure (or recreate it) and, instead of using it to process fiat-denominated IOUs, use it to process Bitcoin-denominated IOUs. Now obviously the “Bitcoin as settlement network” camp will say: “hey, that’s not fair! We’re not talking about simply recreating the custodial, trust-based credit and banking models of the fiat world in Bitcoin. We're talking about the Lightning Network which uses crypto-magic to enable off-chain transactions while still avoiding centralization and custodial risk. Lightning transactions are Bitcoin transactions.”

Well, no. They’re not. I’m far from an expert on the Lightning Network, but as I understand it, the proposal involves the repeated exchange of unconfirmed, unbroadcast Bitcoin transactions, i.e. potential Bitcoin transactions. Now there are some people who are very excited about the LN’s potential, and there are others who are very skeptical. But for purposes of the point I’m making it doesn’t really matter who’s right. I have no doubt that Bitcoin, by its very nature as an open, extensible network, enables the creation of some really novel, really useful “layer two” solutions (whether or not the LN proves to be one of them). But the fact remains: when you move transactions to a “layer two” solution, you have – by definition – added a layer of risk. Your layer two solution might be really, really great and a huge improvement over the traditional banking model such that the added layer of risk is relatively thin. And that’s awesome… but it’s still there. So you can’t just say: “who cares if Bitcoin’s layer one is artificially crippled? We’ll just move everything to layer two.” To the extent that on-chain scaling is artificially restricted (rather than being constrained only by technological limits), there is going to be an unavoidable deadweight loss. And all that can do is open the door for Bitcoin’s competitors.

Also, it seems to me that if there were no downside to using a “layer two” solution to make a particular payment, if it were really true that “Lightning transactions have the full security of on chain transactions” (as I saw one redditor claim), that would be a very dangerous state of affairs. If everyone could get all of the benefits of an on-blockchain transaction without actually using the blockchain (and thus paying the fees to secure it), that would seem to create a tragedy of the commons.
@Roger_Murdock and that's how you transformed PoW Bitcoin into PoS Bitcoin.
 

adamstgbit

Well-Known Member
I think you greatly underestimate Classic's support...

f2pool's growing support is good, and 4.6% of the total hashing power is quite formidable.

You have to understand that many miners support Classic's scaling vision, but still run Core simply because they do not like the idea of a "change of government".

Also, Core is promising an effective block size increase in April and then a 2MB HF later; these PROMISES are keeping a lot of hashing power pointed at Core.

What happens when SegWit is delayed?
What happens when SegWit's "effective block size increase" isn't as effective as promised?
What happens when it is clear that Core never had any intention of ever increasing to 2MB?

If/when Classic gains >51% of the hash rate, it'll happen seemingly overnight. At some point people will have had enough, and the fireworks follow.

4.6% has turned to 6.3% in a matter of days!

bullish?

(y)
 

Melbustus

Active Member
@cypherdoc
Thanks for the good inside info from Rusty. Since LN is being touted as a scaling solution, it should not still be in the design phase when blocks are full; it should be in the pilot phase, handling decent real-world volume. We hear that SegWit is not really a scaling solution in itself, quite apart from the problem that it may activate very slowly, if at all; and even afterwards, the head-room increase for most txns is marginal.

Core Dev need to step back and recognize that they are not in a position to allow Bitcoin's network effect to grow. They need to admit that they have run out of time and demonstrate, by accepting Classic's 2MB patch, that they are not irresponsible.

That's assuming they *want* to scale now, which I really don't think they do. What they want is higher on-chain fees as soon as they can make that happen. They're confident it'll be necessary eventually no matter what, so they think it's their job (as the central-planning wise shepherds of Bitcoin, I guess) to ease us peons into it. For anyone who hasn't read Elliot Olds' collection of quotes, it's illuminating (and upsetting).

So it's clear that those guys are rigid thinkers, with a very particular, calcified, narrow, linear view of Bitcoin. Further, they don't listen to users and there's essentially zero chance of their opinions changing. They have a near-religious view of things.

That presents a problem beyond blocksize. I used to think that the primary issue was just getting more blockspace; that it'd be fine if Core did it, or XT, or whatever. Now my nightmare scenario is Core continuing to stonewall up until it becomes clear that Classic very well may win, and then magnanimously agreeing to a 2MB hard-fork in the near-term as the saviors of our land; "look how much we listened to our users and compromised!", they'll say. Thus they'd retain control of protocol dev, buying time to engineer artificially rising main-chain fees one way or another...

So to be clear, Bitcoin is at risk of losing the digital-cash use-case as long as these guys are driving protocol development, no matter what happens with MAX_BLOCK_SIZE at this point.