Gold collapsing. Bitcoin UP.

rocks

Active Member
Sep 24, 2015
586
2,284
I'll take that a step further and say that we can and will get all the theoretical speed and privacy benefits that a sidechain could offer on the main chain, via client improvements alone.

On the privacy front, next year we should finally start to make progress on address reuse as BIP47 implementations start to roll out and as JoinMarket matures and gains wider usage.

We should be able to deal the blockchain analytics companies a fairly significant setback in any case.
Great to hear BIP47 is moving forward.

Out of curiosity, are you aware of any efforts to add mixing functionality such as CoinShuffle? If the core client (or any widely used client, for that matter) added this functionality, it would effectively create a meeting place for large numbers of people to securely mix their coins.
http://crypsys.mmci.uni-saarland.de/projects/CoinShuffle/coinshuffle.pdf
https://bitcointalk.org/index.php?topic=567625.msg6370451#msg6370451
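For a rough sense of the mechanism the paper describes, here is a toy sketch of the layered decrypt-and-shuffle at the core of CoinShuffle. It is not the real protocol: the signing, equivocation checks, and blame phase are omitted, everything runs in a single process, and symmetric Fernet keys (from the third-party cryptography package) stand in for the public-key encryption; the Participant class and run_shuffle helper are invented purely for illustration.

```python
# Toy simulation of CoinShuffle's core idea (layered encryption + shuffling).
# NOT the real protocol: no signatures, no equivocation checks, no blame phase,
# and symmetric Fernet keys stand in for public-key encryption because all
# "participants" live in one process here. Requires the 'cryptography' package.
import random
from cryptography.fernet import Fernet

class Participant:
    def __init__(self, name, fresh_address):
        self.name = name
        self.output = fresh_address.encode()      # the address to be unlinked
        self.key = Fernet(Fernet.generate_key())  # stand-in for a real keypair

def run_shuffle(participants):
    """Pass a growing list of onions down the chain of participants."""
    in_transit = []
    for i, p in enumerate(participants):
        # Strip one encryption layer from every onion received so far.
        in_transit = [p.key.decrypt(onion) for onion in in_transit]
        # Wrap our own output once per remaining downstream peer:
        # innermost layer for the last peer, outermost for the next peer.
        blob = p.output
        for downstream in reversed(participants[i + 1:]):
            blob = downstream.key.encrypt(blob)
        in_transit.append(blob)
        random.shuffle(in_transit)                # hide our position in the list
    # After the last peer all layers are gone: plaintext outputs, unlinkable order.
    return sorted(b.decode() for b in in_transit)

peers = [Participant(f"p{i}", f"fresh-address-{i}") for i in range(4)]
print(run_shuffle(peers))  # all four output addresses, input->output links hidden
```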

Another thought, have you considered adding BIP47 to Bitpay's XT client and others, while not including it in core? It would be interesting to see non-core-controlled clients add functionality people are interested in while core does not, since the devs are focused on LN....
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,995
It doesn't even have to be that bad: what if they get LN to function correctly, but no one wants to use it?
there you go. this is why i concluded back in the old BCT thread that, as much as i rail on SC's & LN, i shouldn't worry about them so much, b/c i don't think there will be demand for them. the reason i do rail is b/c they're being used as an excuse to block bigger blocks and growth on the MC, and they're acting as a major diversion. we really don't have the luxury of wasting that time.

In their respective podcasts and blog posts, Dryja, Poon, & now Sztorc have all admitted that their ideas depend on keeping blocks fixed at 1MB.
 

Justus Ranvier

Active Member
Aug 28, 2015
875
3,746
Out of curiosity, are you aware of any efforts to add mixing functionality such as CoinShuffle?
I've heard that at least one person is looking into adapting the CoinShuffle protocol to JoinMarket.

Personally, I think JM is the mixing protocol with a future, since its developers are the only ones who figured out that the best way to make sure liquidity will exist is to allow people to pay for it.

Another thought, have you considered adding BIP47 to Bitpay's XT client
The only actual coding I have time for right now is on Open-Transactions.

I've been advising wallet developers who want to deploy BIP47, and it looks like Samourai Wallet will be the first one to launch.

If someone else wants to code a BIP47 implementation for Bitcoin XT (or any wallet), I'm available to answer questions.
 
  • Like
Reactions: rocks and majamalu

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,995
remember we're still way above the 200DMA which is still heading up. this is a bull until proven otherwise:

 

Zangelbert Bingledack

Well-Known Member
Aug 29, 2015
1,485
5,585
Paul Sztorc at least speaks his mind no matter what. He's not a joiner, even if he does have a weakness for some high-falutin received ideas from others.

Notably, Sztorc is not necessarily a microblockist, because he says that if TOR can be sped up, blocksizes can increase. And he thinks TOR can be sped up dramatically through incentives made possible by Bitcoin (pay for TOR nodes using BTC instead of them all being volunteers). Unlike some, he seems to welcome larger blocks as long as certain conditions are met. Things like thin blocks, weak blocks, IBLTs, etc. could of course also make that possible. I don't get the sense he would be the type to delay working on such things to create motivation for sidechains/LN.

Also, since he understands prediction markets, he's in a good position to understand why fork arbitrage invalidates his concerns about controversial hard forks wrecking decentralization. Since fork arbing creates an effective prediction market for the viability of the two forks, non-experts don't have to be beholden to any expert recommendations about which fork to choose (an argument I already found silly, but I guess he didn't). That makes hard forks the ultimate decentralized decision-making tool, *especially* for controversial decisions.
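As a toy illustration of that prediction-market reading of fork arbitrage (all prices and the implied_share helper below are hypothetical, invented for the example): holders of pre-fork coins own both sides of a split, so by selling the fork they consider less viable and buying the other, they produce prices whose ratio anyone can read as the market's aggregate judgement.

```python
# Toy illustration of reading fork arbitrage as a prediction market.
# Prices are hypothetical; implied_share is just price_a / (price_a + price_b),
# i.e. the market's implied relative valuation of each fork.

def implied_share(price_a: float, price_b: float) -> float:
    """Fraction of the combined value the market assigns to fork A."""
    return price_a / (price_a + price_b)

# Say post-split coins (or pre-fork futures) trade at these made-up prices:
fork_a = 310.0   # hypothetical price of coins on fork A
fork_b = 40.0    # hypothetical price of coins on fork B

share = implied_share(fork_a, fork_b)
print(f"Market assigns {share:.0%} of the combined value to fork A")
# Holders of pre-fork coins own both sides, so they can sell the fork they
# consider less viable and buy more of the other ('fork arbing'); the resulting
# prices aggregate everyone's judgement, expert or not.
```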
 
  • Like
Reactions: majamalu

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,995
 
  • Like
Reactions: majamalu

lunar

Well-Known Member
Aug 28, 2015
1,001
4,290
And he thinks TOR can be sped up dramatically through incentives made possible by Bitcoin (pay for TOR nodes using BTC instead of them all being volunteers).
I've wondered about this before. Does the decentralized future belong to sponsored nodes? Some kind of bandwidth and storage arbitrage?

Is it possible we'll see a scenario where it's profitable to set up a multi-node? (let's call it a Swiss Army Knode). Essentially you're running a combined Bitcoin/TOR/Open-Transactions/OpenBazaar/MaidSafe/torrent node, etc., and you're paid in millibits by those that need the storage, bandwidth, or encrypted relay/mixing potential.
 

Zangelbert Bingledack

Well-Known Member
Aug 29, 2015
1,485
5,585
@lunarboy

And meshnet or wi-fi sharing. The endgame to me is to maximize utilization of all idle resources. Not just make sure that all usable capacity is being used at all times, but that it's being used for the highest-value purposes as judged by the market.

Do you allocate your bandwidth to helping TOR, meshnet, Drivespace, torrents, or your own stuff? Most people would default to whatever pays the most (done automatically, like how I hear mining of altcoins is done now), and start considering how much they really need all 27 seasons of The Simpsons in 4K (but get it faster and cheaper if they really do need it).

Economization! Something that only happens when resources are integrated into the price system with sufficient granularity.

What happens when, say, 5% disk-space capacity utilization goes to 80%? Roughly 16x less disk space actually needs to be manufactured in the world, and what is made gets put to uses that satisfy more people than before. Huge win for the environment and much lower costs across the board.
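A minimal sketch of the "default to whatever pays the most" idea, with invented service names and prices; the allocate helper just hands every idle unit to the highest bidder, and the last line repeats the 5% to 80% utilization arithmetic from above.

```python
# Minimal sketch of auto-allocating idle resources to whatever pays the most.
# Service names and per-unit prices (millibits per GB-day) are made up.

def allocate(idle_units: int, offers: dict[str, float]) -> dict[str, int]:
    """Greedy allocation: give every idle unit to the highest-paying use."""
    best_use = max(offers, key=offers.get)
    return {best_use: idle_units}

offers = {                       # hypothetical going rates, mBTC per GB-day
    "tor_relay": 0.8,
    "storage_network": 1.3,
    "torrent_seeding": 0.2,
    "own_media_library": 0.0,    # opportunity cost of keeping 27 seasons in 4K
}
print(allocate(idle_units=500, offers=offers))  # -> {'storage_network': 500}

# The utilization arithmetic: raising disk utilization from 5% to 80% means
# roughly 0.80 / 0.05 = 16x less raw capacity is needed for the same work.
print(f"{0.80 / 0.05:.0f}x")
```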
 

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
Greg on reddit showing his sentiment towards bitcoind users, as well as his 'inability' to understand the different levels of discussion.
As far as I have been able to see, Satoshi constantly worked to remove potential barriers to bitcoin's success. It seems doubtful that he would intentionally put in something that creates a serious early point of failure, one that Bitcoin has to overcome and that would cause Bitcoin's failure if it didn't.

Without the blocksize issue Bitcoin would have the opportunity to grow without any impediments. Even if the developers start to take an anti-Bitcoin approach, there is a limit to what they can do.

I think Satoshi just assumed that either he or Gavin would be able to remove the cap. And then unfortunately didn't before giving up control.

Satoshi is a brilliant person, but not an oracle. I don't think he could have foreseen the blocksize cap becoming the issue it is today, and I don't think he thought anyone supposedly aligned with Bitcoin would want to keep it as a way to limit Bitcoin.
Yes, that's a good point; he's no oracle, of course. However, he's an intelligent being, and so I think he could have seen what caveden clearly saw. I agree, he might have been simply blind to this (everyone has limited attention), but he otherwise has a very good and wide grasp of people, dynamics & incentives, so that seemed odd to me. Hard to say, and pointless to argue about what he really thinks when that guy sits in his mountain hut and has broken off all contact :D

In all reality, if governments the world over worked to shut down all nodes (control all nodes), the network would notice and Bitcoin would be affected. It would probably result in fewer transactions, and blocks could shrink to a size where you could run a node over TOR.
If western governments turn truly evil, you'd get shot for running a tor node or even for using a computer 'unattended'. Someone said Bitcoin needs at least one country in the world to be at least halfway free.
 

sickpig

Active Member
Aug 28, 2015
926
2,541
I've wondered about this before. Does the decentralized future belong to sponsored nodes? Some kind of bandwidth and storage arbitrage?
Yes, it does.

Economic incentives are the *most* viable way to guarantee network decentralization: the same reason why bitcoin is up and running today. @Justus Ranvier theorized about it at length.

It was already the case in the CPU-mining era: full node operators were rewarded with the block subsidy. Once GPU mining arrived, running a full node and mining decoupled, and hence the need for an economic incentive to run a node arose. The lack of one has been the main cause of the decreasing number of nodes.

That said, expanding this mechanism to as many resources as possible is something extremely valuable, as @Zangelbert Bingledack noted, and in this case it could even remove the small blockers' last bastion: operating a node through TOR.

Let me offer one last thought, slightly related to the issue at hand. Does everybody know that gmax proved that distributed consensus mechanisms were impossible before he joined bitcoin development? He joined bitcoin because reality proved him wrong, and he was wrong because he didn't properly take economic incentives into account. Unfortunately, he's underestimating the free market once again.
 

theZerg

Moderator
Staff member
Aug 28, 2015
1,012
2,327
It wasn't gmax, it was Michael Fischer. Google his name and 'distributed consensus'.

And you know what? He's right, yet bitcoin still exists. How can that be? Because bitcoin never actually achieves consensus! It just reduces the probability of a reversal. For example, high-tech aliens could calculate an alternative blockchain and completely rewrite blockchain history.

It's awesome how human ingenuity can work around fundamental truths.
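That "reduces probability" point can be made concrete with the attacker catch-up calculation from section 11 of the Bitcoin whitepaper: the chance that an attacker with a minority of the hash power ever rewrites z confirmed blocks shrinks exponentially with z but never reaches zero. A small sketch (the variable names are mine):

```python
# Probability that an attacker with hash-power share q ever catches up from z
# blocks behind -- the calculation from section 11 of the Bitcoin whitepaper.
# It never reaches 0: confirmations shrink the risk, they don't eliminate it.
from math import exp, factorial

def attacker_success(q: float, z: int) -> float:
    p = 1.0 - q
    if q >= p:
        return 1.0                          # a majority attacker always catches up
    lam = z * (q / p)                       # expected attacker progress (Poisson)
    prob = 1.0
    for k in range(z + 1):
        poisson = lam**k * exp(-lam) / factorial(k)
        prob -= poisson * (1 - (q / p) ** (z - k))
    return prob

for z in (0, 1, 2, 6, 10):
    print(f"q=10%, z={z:2d}: {attacker_success(0.10, z):.6f}")
# Matches the whitepaper's table for q=0.1 (e.g. z=6 -> ~0.0002): small, never zero.
```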
 
  • Like
Reactions: AdrianX and awemany

sickpig

Active Member
Aug 28, 2015
926
2,541
@theZerg so he went on record misattributing to himself work done by others. wow.

edit: grammar
 
Last edited:
  • Like
Reactions: AdrianX

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
It wasn't gmax, it was Michael Fischer. Google his name and 'distributed consensus'.
In the video, GMax asserts it was him, though, or am I missing something?

And, yes, you're right about ingenuity: probabilistic algorithms (even just in the form of hash functions) can apparently do some amazing stuff. Look at git and deduplicating backups such as bup/attic for other nice examples...

I think it actually took quite a while for 'the industry' and CS people in general to warm up to probabilistic methods. I have overheard and participated in quite a few discussions where 'but an identical hash might still happen!' was the worry many expressed.

But I think we all got to the point where SHA256 and its siblings are considered trap-door-ey enough that everyone believes the data you generated a hash from is the only data in existence that generates that hash...
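For what it's worth, the "an identical hash might still happen!" worry can be put into numbers with the standard birthday approximation; the object count below is an arbitrary, deliberately generous example.

```python
# Back-of-the-envelope birthday bound: the probability of at least one collision
# among n random b-bit hashes is roughly n^2 / 2^(b+1) when that value is small.
def collision_probability(n: float, bits: int) -> float:
    return n * n / 2 ** (bits + 1)

# Even a wildly generous 2^80 hashed objects (far beyond every git object,
# backup chunk, and transaction ever created) leaves the odds negligible:
print(collision_probability(2**80, 256))   # ~6e-30
```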

I think a similar feeling of worry (though on a somewhat different level) is going on with Bitcoin and its probabilistic, decentralized way of finding consensus. Some smallblockers might still need the learning experience that a market-based, distributed consensus is actually good enough for what we are doing.

Though it appears easy enough to fall into old modes of thinking aka 'This thing cannot possibly work'.
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,995
divergence widening *again*. $DJI desperately trying to inflict max pain on bears (like me). i seriously think this will end badly:


is it possible that 1062 represents the low for this short-term daily cycle? if so, a failed bounce this soon would be exceedingly bearish:


the drop in gold and silver is also bearish for stocks *overall* (meaning they won't always move in tandem). we're overdue for a bounce imo, which makes this persistent downward movement perplexing.
 
Last edited:
  • Like
Reactions: majamalu

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,995
You're not missing anything.
any resolution to the BIP Editor breach by gmax on the dev mailing list?
one of my all-time favorite charts: National Bank of Greece, or NBG, which just hit a new all-time low. now that is what i call Deflation:

 
Last edited:

sickpig

Active Member
Aug 28, 2015
926
2,541
@cypherdoc not that I'm aware of. take into account that traffic on the dev ML has decreased drastically since moderation was turned on.

maybe the btc-discuss ML is the new venue for this kind of issue. I'm not subscribed to it, though.
LOL. Nice catch. So what should I think of his latest academic assertions on reddit, then?

And for someone so worried about big blocks, shouldn't optimizing the ECC parts (which are already reasonably fast) be quite far down the priority list anyway?

Not that I'm complaining that they wrote libsecp256k1.
dunno what to think. the more time passes, the more difficult gmax's personality becomes for me to understand. so I'm not the right person to answer this question.

re libsecp256k1: I'm neither skilled nor smart enough to audit it. still, I'm quite confident in the coding and math abilities of sipa, its main author; more to the point, tests they performed showed that a freshly installed bitcoind takes 3.5 hours to download *and* validate the entire block chain.
 

cypherdoc

Well-Known Member
Aug 26, 2015
5,257
12,995
GBTC up 8.95% today. we're getting close to a boing:

 

awemany

Well-Known Member
Aug 19, 2015
1,387
5,054
Yes, and that's great! However, the startup cost of a full node (once it is no longer stale, validation time doesn't really matter) is something that doesn't really need optimization right now.

Greg himself said bandwidth cost is the main issue, and Gavin's IBLTs or similar would help much more there.
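For reference, here is a compact toy of the data structure behind the IBLT idea for block propagation: each side folds its txids into a small invertible Bloom lookup table, the receiver subtracts what it already has, and decoding recovers only the symmetric difference, so bandwidth scales with the difference rather than with the block. This is a simplified sketch under my own assumptions (fixed three hash functions, small integers standing in for txids, no table-sizing heuristics), not Gavin's actual proposal or its wire format.

```python
# Toy invertible Bloom lookup table (IBLT) for set reconciliation. Simplified:
# 3 hash functions, integer "txids", no sizing heuristics or value payloads.
import hashlib

K = 3  # hash functions per key

def _h(data: bytes, salt: int) -> int:
    return int.from_bytes(hashlib.sha256(bytes([salt]) + data).digest(), "big")

class IBLT:
    def __init__(self, m: int):
        self.m = m
        self.count = [0] * m
        self.key_sum = [0] * m
        self.check_sum = [0] * m

    def _cells(self, key: int):
        kb = key.to_bytes(32, "big")
        return [_h(kb, i) % self.m for i in range(K)], _h(kb, 0xFF)

    def _update(self, key: int, sign: int):
        cells, check = self._cells(key)
        for c in cells:
            self.count[c] += sign
            self.key_sum[c] ^= key
            self.check_sum[c] ^= check

    def insert(self, key: int):
        self._update(key, +1)

    def delete(self, key: int):
        self._update(key, -1)

    def decode(self):
        """Repeatedly peel 'pure' cells to recover the symmetric difference."""
        only_inserted, only_deleted = set(), set()
        progress = True
        while progress:
            progress = False
            for c in range(self.m):
                if self.count[c] not in (1, -1):
                    continue
                key = self.key_sum[c]
                _, check = self._cells(key)
                if self.check_sum[c] != check:
                    continue  # count is +-1 by accident; not a pure cell
                if self.count[c] == 1:
                    only_inserted.add(key)
                    self.delete(key)
                else:
                    only_deleted.add(key)
                    self.insert(key)
                progress = True
        return only_inserted, only_deleted

# Sender folds its block's txids in; the receiver removes everything it already has.
table = IBLT(m=64)
for txid in range(100, 110):    # ten "txids" in the sender's block
    table.insert(txid)
for txid in range(100, 108):    # eight of them already sit in the receiver's mempool
    table.delete(txid)
missing, extra = table.decode()
print(missing, extra)           # missing == {108, 109}: only the difference is recovered
```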