The Best Bitcoin Alternatives - 4 Altcoins on the Rise ...

Bitcoin donations to Voat, a Reddit alternative, 04/01/2015 - 07/03/2015 [OC]

submitted by rapemyradish to dataisbeautiful [link] [comments]

Vox.com: "2015 will be a crucial test for the alternative payment network. If Bitcoin does as poorly in 2015 as it did in 2014, a lot more people could question whether the technology really has a bright future."

submitted by montseayo to Bitcoin [link] [comments]

Bitcoin: The End of Money as We Know It (2015) - Is Bitcoin an alternative to national currencies backed by debt?

submitted by dakecan to Documentaries [link] [comments]

Russia to launch alternative to SWIFT bank transaction system in spring 2015 Why don't they just use Bitcoin?

submitted by daanbarnard to Bitcoin [link] [comments]

How a $64M hack changed the fate of Ethereum, Bitcoin's closest competitor: Cryptocurrency alternative to bitcoin was co-founded by 19-year-old Canadian-Russian in 2015

submitted by BobsBurgers4Bitcoin to btc [link] [comments]

How a $64M hack changed the fate of Ethereum, Bitcoin's closest competitor: Cryptocurrency alternative to bitcoin was co-founded by 19-year-old Canadian-Russian in 2015

submitted by BitcoinAllBot to BitcoinAll [link] [comments]

(x-post /r/Dogecoin) Sobering BTC article - tl:dr; payment alternatives offer little advantage over existing methods. (But... but we offer speed, community and pineapples! ;_; ) ~ 2014 was a bad year for Bitcoin and 2015 is unlikely to be much better ~

submitted by bit_moon to Bitcoin [link] [comments]

Proposed alternatives to the 20MB stepfunction | Raystonn . | May 28 2015 /r/bitcoin_devlist

submitted by BitcoinAllBot to BitcoinAll [link] [comments]

block-size tradeoffs & hypothetical alternatives (Re: Block size increase oppositionists: please clearly define what you need done to increase block size to a static 8MB, and help do it) | Adam Back | Jun 30 2015 /r/bitcoin_devlist

submitted by BitcoinAllBot to BitcoinAll [link] [comments]

Proposed alternatives to the 20MB step function | Gavin Andresen | May 28 2015 /r/bitcoin_devlist

submitted by BitcoinAllBot to BitcoinAll [link] [comments]

alternatives to the 20MB block limit, measure first! | Ron | May 25 2015 /r/bitcoin_devlist

submitted by BitcoinAllBot to BitcoinAll [link] [comments]

Proposed alternatives to the 20MB step | Jérôme Legoupil | Jun 01 2015 /r/bitcoin_devlist

submitted by BitcoinAllBot to BitcoinAll [link] [comments]

Alternative name for CHECKSEQUENCEVERIFY (BIP112) | Btc Drak | Nov 24 2015 /r/bitcoin_devlist

submitted by BitcoinAllBot to BitcoinAll [link] [comments]

soft-fork block size increase (extension blocks) Re: Proposed alternatives to the 20MB stepfunction | Adam Back | May 30 2015 /r/bitcoin_devlist

submitted by BitcoinAllBot to BitcoinAll [link] [comments]

QR code alternatives (was: Proposal: extend bip70 with OpenAlias) | Mike Hearn | Jul 20 2015 /r/bitcoin_devlist

submitted by BitcoinAllBot to BitcoinAll [link] [comments]

Alternate HD path structure: BIP, blog, or wat? | Matt Smith | Jun 19 2015 /r/bitcoin_devlist

submitted by BitcoinAllBot to BitcoinAll [link] [comments]

alternate proposal opt-in miner takes double-spend (Re: replace-by-fee v0.10.0rc4) | Adam Back | Feb 22 2015 /r/bitcoin_devlist

submitted by BitcoinAllBot to BitcoinAll [link] [comments]

Technical: The Path to Taproot Activation

Taproot! Everybody wants to have it, somebody wants to make it, nobody knows how to get it!
(If you are asking why everybody wants it, see: Technical: Taproot: Why Activate?)
(Pedants: I mostly elide over lockin times)
Briefly, Taproot is that neat new thing that gets us:
So yes, let's activate taproot!

The SegWit Wars

The biggest problem with activating Taproot is PTSD from the previous softfork, SegWit. Pieter Wuille, one of the authors of the current Taproot proposal, has consistently held the position that he will not discuss activation, and will accept whatever activation process is imposed on Taproot. Other developers have expressed similar opinions.
So what happened with SegWit activation that was so traumatic? SegWit used the BIP9 activation method. Let's dive into BIP9!

BIP9 Miner-Activated Soft Fork

Basically, BIP9 has two key parameters:
  1. bit — the nVersion bit that miners set to signal readiness for the softfork.
  2. timeout — a deadline; if the softfork has not activated by this time, it fails.
Now there are other parameters (name, starttime) but they are not anywhere near as important as the above two.
A number that is not a parameter, is 95%. Basically, activation of a BIP9 softfork is considered as actually succeeding if at least 95% of blocks in the last 2 weeks had the specified bit in the nVersion set. If less than 95% had this bit set before the timeout, then the upgrade fails and never goes into the network. This is not a parameter: it is a constant defined by BIP9, and developers using BIP9 activation cannot change this.
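As a rough illustration (a simplified sketch, not Bitcoin Core's actual implementation), the check looks something like this. Real nodes evaluate signalling per 2016-block retarget period (roughly 2 weeks), where the fixed 95% threshold works out to 1916 blocks:

```python
# Simplified sketch of the BIP9 signalling check over one
# 2016-block retarget period. 95% of 2016 blocks = 1916 blocks.
PERIOD = 2016
THRESHOLD = 1916  # the fixed 95% constant discussed above

def period_locks_in(nversions, bit):
    """True if enough blocks in the period signal `bit` in nVersion."""
    assert len(nversions) == PERIOD
    signalling = sum(1 for v in nversions if (v >> bit) & 1)
    return signalling >= THRESHOLD

# Example: 1900 signalling blocks fall just short of the threshold.
base = 0x20000000            # BIP9 version prefix (top bits 001)
period = [base | (1 << 1)] * 1900 + [base] * 116
print(period_locks_in(period, 1))  # False
```

Note this sketch ignores the lock-in delay and state machine that BIP9 also specifies; it only shows the threshold arithmetic.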
So, first some simple questions and their answers:

The Great Battles of the SegWit Wars

SegWit not only fixed transaction malleability, it also created a practical softforkable blocksize increase that also rebalanced weights so that the cost of spending a UTXO is about the same as the cost of creating UTXOs (and spending UTXOs is "better" since it limits the size of the UTXO set that every fullnode has to maintain).
So SegWit was written, the activation was decided to be BIP9, and then.... miner signalling stalled at below 75%.
Thus were the Great SegWit Wars started.

BIP9 Feature Hostage

If you are a miner with at least 5% global hashpower, you can hold a BIP9-activated softfork hostage.
You might even secretly want the softfork to actually push through. But you might want to extract concessions from the users and the developers. Like removing the halvening. Or raising or even removing the block size caps (which helps larger miners more than smaller miners, making it easier to become a bigger fish that eats all the smaller fishes). Or whatever.
With BIP9, you can hold the softfork hostage. You just hold out and refuse to signal. You tell everyone you will signal, if and only if certain concessions are given to you.
This ability by miners to hold a feature hostage was enabled because of the miner-exit allowed by the timeout on BIP9. Prior to that, miners were considered little more than expendable security guards, paid for the risk they take to secure the network, but not special in the grand scheme of Bitcoin.

Covert ASICBoost

ASICBoost was a novel way of optimizing SHA256 mining, by taking advantage of the structure of the 80-byte header that is hashed in order to perform proof-of-work. The details of ASICBoost are out-of-scope here, but you can read about it elsewhere.
Here is a short summary of the two types of ASICBoost, relevant to the activation discussion.
Now, "overt" means "obvious", while "covert" means hidden. Overt ASICBoost is obvious because nVersion bits that are not currently in use for BIP9 activations are usually 0 by default, so setting those bits to 1 makes it obvious that you are doing something weird (namely, Overt ASICBoost). Covert ASICBoost is non-obvious because the order of transactions in a block is up to the miner anyway, so the miner rearranging the transactions in order to get lower power consumption is not going to be detected.
Unfortunately, while Overt ASICBoost was compatible with SegWit, Covert ASICBoost was not. This is because, pre-SegWit, only the block header Merkle tree committed to the transaction ordering. However, with SegWit, another Merkle tree exists, which commits to transaction ordering as well. Covert ASICBoost would require more computation to manipulate two Merkle trees, obviating the power benefits of Covert ASICBoost anyway.
Now, miners want to use ASICBoost (indeed, about 60% to 70% of current miners probably use Overt ASICBoost nowadays; if you have a Bitcoin fullnode running you will see logs with lots of "60 of last 100 blocks had unexpected versions", which is exactly what you would see with the nVersion manipulation that Overt ASICBoost does). But remember: ASICBoost was, at the time, a novel improvement. Not all miners had ASICBoost hardware. Those who did, did not want it known that they had ASICBoost hardware, and wanted to do Covert ASICBoost!
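To see why version grinding shows up in logs as "unexpected versions", here is a toy illustration (not real mining code; the header is a dummy 80-byte stand-in, and the specific bits ground here are chosen arbitrarily for the example): flipping nVersion bits that no BIP9 deployment uses changes the header hash without touching the merkle root, giving miners extra nonce space.

```python
import hashlib

# Toy illustration of overt ASICBoost's version grinding: the 4-byte
# nVersion field is part of the 80-byte header that is double-SHA256'd,
# so unused version bits act as extra nonce space.
def header_hash(version, rest=b"\x00" * 76):
    header = version.to_bytes(4, "little") + rest
    return hashlib.sha256(hashlib.sha256(header).digest()).digest()

base = 0x20000000                 # BIP9 version prefix (top bits 001)
hashes = set()
for grind in range(4):
    v = base | (grind << 13)      # flip bits assumed unused by deployments
    hashes.add(header_hash(v))
    print(hex(v), header_hash(v).hex()[:16])
print(len(hashes))  # 4 distinct candidate hashes from one merkle root
```

Each distinct version value yields a fresh header hash to test against the difficulty target, which is exactly why honest-but-grinding miners produce blocks with version bits a fullnode does not expect.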
But Covert ASICBoost is incompatible with SegWit, because SegWit actually has two Merkle trees of transaction data, and Covert ASICBoost works by fudging around with transaction ordering in a block, and recomputing two Merkle Trees is more expensive than recomputing just one (and loses the ASICBoost advantage).
Of course, those miners that wanted Covert ASICBoost did not want to openly admit that they had ASICBoost hardware; they wanted to keep their advantage secret because miners are strongly competitive in a very tight market. And doing ASICBoost covertly was just the ticket, but it could not work post-SegWit.
Fortunately, due to the BIP9 activation process, they could hold SegWit hostage while covertly taking advantage of Covert ASICBoost!

UASF: BIP148 and BIP8

When the incompatibility between Covert ASICBoost and SegWit was realized, activation of SegWit remained stalled, and miners were still not openly admitting that ASICBoost was related to the non-activation of SegWit.
Eventually, a new proposal was created: BIP148. With this rule, 3 months before the end of the SegWit timeout, nodes would reject blocks that did not signal SegWit. Thus, 3 months before SegWit timeout, BIP148 would force activation of SegWit.
This proposal was not accepted by Bitcoin Core, due to the shortening of the timeout (it effectively times out 3 months before the initial SegWit timeout). Instead, a fork of Bitcoin Core was created which added the patch to comply with BIP148. This was claimed as a User Activated Soft Fork, UASF, since users could freely download the alternate fork rather than sticking with the developers of Bitcoin Core.
Now, BIP148 effectively is just a BIP9 activation, except at its (earlier) timeout, the new rules would be activated anyway (instead of the BIP9-mandated behavior that the upgrade is cancelled at the end of the timeout).
BIP148 was actually inspired by the BIP8 proposal (the link here is a historical version; BIP8 has been updated recently, precisely in preparation for Taproot activation). BIP8 is basically BIP9, but at the end of timeout, the softfork is activated anyway rather than cancelled.
This removed the ability of miners to hold the softfork hostage. At best, they can delay the activation, but not stop it entirely by holding out as in BIP9.
Of course, this implies risk that not all miners have upgraded before activation, leading to possible losses for SPV users, as well as again re-pressuring miners to signal activation, possibly without the miners actually upgrading their software to properly impose the new softfork rules.

BIP91, SegWit2X, and The Aftermath

BIP148 inspired countermeasures, possibly from the Covert ASICBoost miners, possibly from concerned users who wanted to offer concessions to miners. To this day, the common name for BIP148 - UASF - remains an emotionally-charged rallying cry for parts of the Bitcoin community.
One of these was SegWit2X. This was brokered in a deal between some Bitcoin personalities at a conference in New York, and thus part of the so-called "New York Agreement" or NYA, another emotionally-charged acronym.
The text of the NYA was basically:
  1. Set up a new activation threshold at 80% signalled at bit 4 (vs bit 1 for SegWit).
    • When this 80% signalling was reached, miners would require that bit 1 for SegWit be signalled to achieve the 95% activation needed for SegWit.
  2. If the bit 4 signalling reached 80%, increase the block weight limit from the SegWit 4000000 to the SegWit2X 8000000, 6 months after bit 1 activation.
The first item above was coded in BIP91.
Unfortunately, if you read BIP91 independently of NYA, you might come to the conclusion that BIP91 was only about lowering the threshold to 80%. In particular, BIP91 never mentions anything about the second point above: it never mentions that the bit 4 80% threshold would also signal for a later hardfork increase in the weight limit.
Because of this, even though there are claims that NYA (SegWit2X) reached 80% dominance, a close reading of BIP91 shows that the 80% dominance was only for SegWit activation, without necessarily a later 2x capacity hardfork (SegWit2X).
This ambiguity of bit 4 (NYA says it includes a 2x capacity hardfork, BIP91 says it does not) has continued to be a thorn in later blocksize debates. Economically speaking, Bitcoin futures between SegWit and SegWit2X showed strong economic dominance in favor of SegWit (SegWit2X futures traded at a fraction of the value of SegWit futures: I personally made a tidy but small amount of money betting against SegWit2X in the futures market), so suggesting that NYA achieved 80% dominance even in mining is laughable, but the NYA text that ties bit 4 to SegWit2X still exists.
Historically, BIP91 triggered, which caused SegWit to activate before the BIP148 shorter timeout. BIP148 proponents hold to this day that it was the BIP148 shorter timeout and no-compromises-activate-on-August-1 stance that made miners flock to BIP91 as a face-saving tactic, which actually removed the second clause of NYA. NYA supporters keep pointing to the bit 4 text in the NYA and the historical activation of BIP91 as a failed promise by Bitcoin developers.

Taproot Activation Proposals

There are two primary proposals I can see for Taproot activation:
  1. BIP8.
  2. Modern Softfork Activation.
We have discussed BIP8: roughly, it has a bit and a timeout; if 95% of miners signal the bit it activates, and at the end of the timeout it activates anyway. (EDIT: BIP8 has had recent updates: at the end of the timeout it can now either activate or fail. For the most part, in the text below "BIP8" means BIP8-and-activate-at-timeout, and "BIP9" means BIP8-and-fail-at-timeout.)
So let's take a look at Modern Softfork Activation!

Modern Softfork Activation

This is a more complex activation method, composed of BIP9 and BIP8 as subcomponents.
  1. First have a 12-month BIP9 (fail at timeout).
  2. If the above fails to activate, have a 6-month discussion period during which users and developers and miners discuss whether to continue to step 3.
  3. Have a 24-month BIP8 (activate at timeout).
The total above is 42 months, if you are counting: 3.5 years worst-case activation.
The logic here is that if there are no problems, BIP9 will work just fine anyway. And if there are problems, the 6-month period should weed it out. Finally, miners cannot hold the feature hostage since the 24-month BIP8 period will exist anyway.
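The three-phase schedule above can be sketched as a simple timeline function (phase names and month-based granularity are mine, purely for illustration of the proposal's structure):

```python
# Sketch of the Modern Softfork Activation schedule: 12-month BIP9,
# then a 6-month discussion period, then a 24-month BIP8.
# Months are counted from the start of phase 1.
def activation_phase(month):
    if month < 12:
        return "phase 1: BIP9 (fail at timeout)"
    if month < 18:
        return "phase 2: discussion period"
    if month < 42:
        return "phase 3: BIP8 (activate at timeout)"
    return "active (worst case: 42 months, i.e. 3.5 years)"

for m in (0, 12, 18, 41, 42):
    print(m, activation_phase(m))
```

Of course, in any phase the feature can activate early via miner signalling; this sketch only shows the worst-case progression.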

PSA: Being Resilient to Upgrades

Software is very brittle.
Anyone who has been using software for a long time has experienced something like this:
  1. You hear a new version of your favorite software has a nice new feature.
  2. Excited, you install the new version.
  3. You find that the new version has subtle incompatibilities with your current workflow.
  4. You are sad and downgrade to the older version.
  5. You find out that the new version has changed your files in incompatible ways that the old version cannot work with anymore.
  6. You tearfully reinstall the newer version and figure out how to recover your lost productivity now that you have to adapt to a new workflow.
If you are a technically-competent user, you might codify your workflow into a bunch of programs. And then you upgrade one of the external pieces of software you are using, and find that it has a subtle incompatibility with your current workflow, which is based on a bunch of simple programs you wrote yourself. And if those simple programs are used as the basis of some important production system, you have just screwed up because you upgraded software on an important production system.
And well, one of the issues with new softfork activation is that if not enough people (users and miners) upgrade to the newest Bitcoin software, the security of the new softfork rules are at risk.
Upgrading software of any kind is always a risk, and the more software you build on top of the software-being-upgraded, the greater you risk your tower of software collapsing while you change its foundations.
So if you have some complex Bitcoin-manipulating system with Bitcoin somewhere at the foundations, consider running two Bitcoin nodes:
  1. One is a "stable-version" Bitcoin node. Once it has synced, set it up to connect=x.x.x.x to the second node below (so that your ISP bandwidth is only spent on the second node). Use this node to run all your software: it's a stable version that you don't change for long periods of time. Enable txindex, disable pruning, whatever your software needs.
  2. The other is an "always-up-to-date" Bitcoin node. Keep its storage down with pruning (initially sync it off the "stable-version" node). You can't use blocksonly if your "stable-version" node needs to send transactions, but otherwise this "always-up-to-date" Bitcoin node can be kept as a low-resource node, so you can run both nodes on the same machine.
When a new Bitcoin version comes out, you just upgrade the "always-up-to-date" Bitcoin node. This protects you: if a future softfork activates, you will only receive valid Bitcoin blocks and transactions. Since this node has nothing running on top of it and is just a special peer of the "stable-version" node, software incompatibilities with your system software do not exist.
Your "stable-version" Bitcoin node remains the same version until you are ready to actually upgrade this node and are prepared to rewrite most of the software you have running on top of it due to version compatibility problems.
When upgrading the "always-up-to-date" node, you can bring it down safely and then start it later. Your "stable-version" will keep running, disconnected from the network, but otherwise still available for whatever queries. You do need some system to stop the "always-up-to-date" node if for any reason the "stable-version" goes down (otherwise, if the "always-up-to-date" advances its pruning window past what your "stable-version" has, the "stable-version" cannot sync afterwards), but if you are technically competent enough that you need to do this, you are technically competent enough to write such a trivial monitor program (EDIT: gmax notes you can adjust the pruning window by RPC commands to help with this as well).
This recommendation is from gmaxwell on IRC, by the way.
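A minimal sketch of the two configurations (the port and loopback address are placeholders; the options shown — connect, txindex, prune, blocksonly — are standard Bitcoin Core settings, but verify against your version's documentation):

```ini
# bitcoin.conf for the "stable-version" node
txindex=1                 # or whatever your software on top needs
prune=0                   # pruning disabled
connect=127.0.0.1:8444    # only peer: the always-up-to-date node

# bitcoin.conf for the "always-up-to-date" node (listening on 8444)
port=8444
prune=550                 # keep storage down
# blocksonly=1            # only if the stable node never sends transactions
```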
submitted by almkglor to Bitcoin [link] [comments]

Can Blockchain Gaming Drive Cryptocurrency Adoption?

The gaming industry, with its approximately 2.5 billion gamers worldwide, is a lucrative target and an immense field of application for blockchain itself, Bitcoin and other cryptocurrencies, and could no doubt give a mighty push toward making the technology mainstream. Honestly, this is not exactly news, as the efforts to establish cryptocurrencies in the entertainment sector have gone a long way, with varying degrees of success.
by StealthEX
What those efforts were, how they fared, and where things are going now – these questions deserve their own inquiry. So let's take a look at how gaming facilitates cryptocurrency adoption, in what ways, and whether exposing blockchain tech to a user base of a third of the world's population would help oil the wheels of this sportster in a major way and ultimately cause a tectonic shift in the gaming industry itself.

A Little Bit of History

As Bitcoin kicked off in late 2008, with its first transaction hitting, or effectively starting, the blockchain in early January 2009, it took well over two years before the cryptocurrency got involved in online gambling. The pioneer was the now-defunct mobile poker platform Switchpoker, which started to accept Bitcoin as a deposit and payment option. You can still find a topic on Bitcointalk.org about this news, dated November 23, 2011.
In April 2012, Erik Voorhees, an American entrepreneur and early Bitcoin adopter, founded Satoshi Dice, arguably the oldest online cryptocasino on the block, which is still pretty much alive today, although Voorhees sold it within a year. What makes it truly intriguing is the fact that during its early years the casino was generating half of all the transactions on the Bitcoin network. In short, online gambling was critically important in Bitcoin's infancy, as it helped promote cryptocurrency awareness that led to future growth and expansion into other areas.
Some folks are certainly going to argue that gambling is not the same thing as gaming. The commonly accepted view is that gaming is based on skill while gambling is based on chance. We won't debate this point. However, as every poker player knows, the outcome of a poker game depends not only on luck, but also on skill and expertise. Put simply, there are large gray areas and overlaps. All things considered, our exposition would be missing a big chunk of significant history without giving due credit to gambling and how it helped Bitcoin adoption.
Now that online gambling is off our chest, we can safely turn to gaming as it is understood in the industry, and look at how it helped the blockchain space. One of the first uses of Bitcoin in a major game that we are aware of started in 2014 with the launch of BitQuest, a Minecraft server that used Bitcoin for in-game transactions. Within the gaming environment you could buy valuable in-game stuff from other users with the so-called bits, small fractions of a Bitcoin, and earn them by completing in-game tasks or challenges like killing local monsters.
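For concreteness (these are standard Bitcoin unit definitions, not anything specific to BitQuest): a "bit" is one millionth of a bitcoin, i.e. 100 satoshis. A quick conversion helper:

```python
# Standard Bitcoin unit arithmetic:
# 1 BTC = 100,000,000 satoshis, and 1 "bit" = 100 satoshis = 0.000001 BTC.
SATS_PER_BTC = 100_000_000
SATS_PER_BIT = 100

def bits_to_sats(bits):
    return bits * SATS_PER_BIT

def bits_to_btc(bits):
    return bits * SATS_PER_BIT / SATS_PER_BTC

print(bits_to_sats(2500))  # 250000 satoshis
print(bits_to_btc(2500))   # 0.0025 BTC
```

The appeal of denominating in-game prices in bits is that the numbers stay human-sized even when one whole bitcoin is worth thousands of dollars.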
BitQuest closed the server in summer of 2019, and its brand name now belongs to a different entity not involved with gaming, but it still produced an impact. In essence, this effort successfully demonstrated how a cryptocurrency, in this case Bitcoin, can be used in lieu of a native in-game currency that players can earn, buy and spend as well as withdraw. This has serious implications for two main reasons. First, Bitcoin, unlike any other purely in-game currency, has uses outside the game and its ecosystem, and, second, its supply cannot be manipulated by the game developers, which makes the game by far more fair and square.
Needless to say, the example that BitQuest had set encouraged other market participants to look into Bitcoin as an alternative option for in-game currencies. Another popular Minecraft server, PlayMC, also introduced Bitcoin into its world in 2015, but ceased operations just two years later. There were a few other servers experimenting with altcoins, more specifically, Dogecoin, but most of them disappeared from the scene shortly thereafter, failing to attract enough die-hard Minecraft fans.

What Has Changed?

With the arrival of smart contract-enabled blockchains such as Ethereum, EOS and TRON, the phrase “blockchain gaming” has taken on a more literal meaning as these blockchains allow games to be designed and played entirely on-chain in much the same manner trades are made on a decentralized exchange. While TRON stands for “The Real-time Operating system Nucleus”, there is an obvious reference to a once popular arcade game based on a titular 1982 science fiction film that ultimately garnered a cult following.
CryptoKitties is likely the most popular game ever released in the Ethereum ecosystem and probably in the whole crypto space so far. Its test version was made available on October 19, 2017, and it was an instant success. By the end of 2017 over 200,000 people signed up for the game, spending over $20 million in Ether. We won’t delve into its “gameplay” as it is beyond the scope of this article, and most certainly you are well familiar with it anyway. But what we absolutely should write about is the effect it made and the repercussions it produced.
It could be said that CryptoKitties was to the Ethereum blockchain what Satoshi Dice had been to Bitcoin in the early days of crypto. At the peak of its popularity the game reportedly accounted for 20-25% of all Ethereum’s traffic that clogged the entire Ethereum network, with transaction fees skyrocketing. No wonder lots of people got pissed off with this turn of events. However, despite all the rage and fury, CryptoKitties amply demonstrated what a success means in the blockchain gaming field, how it looks and feels in practice.
It is hard to estimate how much CryptoKitties contributed to cryptocurrency adoption. But given that a few hundred thousand people got involved in this game alone and many more with dozens of blockchain games that it has spawned, like Etherbots, Gods Unchained, The Six Dragons, etc, this indisputable triumph surely counts as a massive contribution by any definition or metric. Moreover, it also revealed the weaknesses of the contemporary blockchain solutions and what exactly should be done to overcome them.
Evolution never goes linearly. In fact, it generally doesn't go in curves, circles, or zigzags, either. It always moves along very diverse routes, directions and entire dimensions like plants and animals, viruses and bacteria, and, well, dinosaurs and mammals. The evolution of gaming in the crypto space is no different. CryptoKitties and other games share essentially the same tech under the hood – building games on some advanced general-purpose blockchain such as Ethereum. But it is not the only front that crypto gaming has been advancing on, nor is it the only way to introduce gaming to cryptocurrencies, and vice versa.
A more recent approach is based on designing either a standalone cryptocurrency or a token on a smart contract-enabled blockchain to be used across many games that support it as an in-game currency. As a result, gamers can enjoy true ownership of their in-game assets (the so-called non-fungible tokens, or NFTs), safe item trading outside the game, and cross-game compatibility. This path has been taken by such projects as Enjin (ENJ), GAME Credits (GAME), Decentraland (MANA), WAX (WAXP) and others, with their respective cryptocurrencies fueling a range of games.
A somewhat different avenue is taken by Funfair (FUN), Chromia (CHR) and Lucid Sight, which are offering platforms that blockchain games can be built on. Thus, Lucid Sight’s Scarcity Engine is focused more on game creators than end users, that is to say, gamers, allowing developers to integrate blockchain into their games. It aims to obliterate the difference between blockchain-based games and traditional gaming platforms. Funfair, on the other hand, leans more toward creating custom-built blockchain casinos, with its FUN token as a casino “chip”. So much for no more gambling, huh.
Our account of events would be incomplete if we didn’t mention yet another attempt to make use of Minecraft for the purpose of introducing cryptocurrencies to the gaming public. This time, a new Minecraft mod called SatoshiQuest has emerged. To participate in it, the gamers pay $1 in Bitcoin and get one in-game life. The pooled coins make up the loot, and the challenge is to find a minimum of 400 key fragments into which the keys to the Bitcoin wallet containing the prize are divided. And who said that evolution doesn’t loop?

Challenges and Future Prospects

The knockout popularity of CryptoKitties has clearly shown the scale of cryptocurrency mass adoption that blockchain gaming can trigger. As the game developers themselves put it, their "goal is to drive mainstream adoption of blockchain technology". They believe that "the technology has immense benefits for consumers, but for those benefits to be realized, it needs to be experienced to be understood". Speaking more broadly, as more people start using cryptocurrencies for gaming, they may eventually become interested in using their coins for purposes other than playing one game or another.
With that said, it is just as clear that there are two main barriers in the way. The first is the limitations of the blockchain tech itself, which essentially limit blockchain gaming to NFTs, in-game currencies, streamlined payments, and similar stuff. This is mostly a technical challenge, and we can realistically expect it to be solved sooner or later. The other issue applies to the gaming industry as a whole: people en masse will only play games that are truly engaging and immersive, technical issues aside.
So the bottom line is that we need the convergence of these two vectors to make blockchain a dominating force in the gaming industry. First, the blockchain tech should have the capacity for running multiplayer games of the kind that major video game developers like Blizzard, Valve and Ubisoft produce, no trade-offs here. Then we actually need games like Warcraft, Counter-Strike or Far Cry that can be played on blockchain, to make it matter. Only when we get there will the gaming industry likely become a primary driver of cryptocurrency adoption.
What are your thoughts on how gaming facilitates cryptocurrency adoption? Tell us your ideas in the comments below.
And remember, if you need to exchange your coins, StealthEX is here for you. We provide a selection of more than 250 coins and are constantly updating the list so that our customers can find a suitable option. Our service does not require registration and allows you to remain anonymous. Why don't you check it out? Just go to StealthEX and follow these easy steps:
✔ Choose the pair and the amount for your exchange. For example BTC to ETH.
✔ Press the “Start exchange” button.
✔ Provide the recipient address to which the coins will be transferred.
✔ Move your cryptocurrency for the exchange.
✔ Receive your coins.
Follow us on Medium, Twitter, Facebook, and Reddit to get StealthEX.io updates and the latest news about the crypto world. For all requests message us via [[email protected]](mailto:[email protected]).
The views and opinions expressed here are solely those of the author. Every investment and trading move involves risk. You should conduct your own research when making a decision.
Original article was posted on https://stealthex.io/blog/2020/09/22/can-blockchain-gaming-drive-cryptocurrency-adoption/
submitted by Stealthex_io to StealthEX [link] [comments]

A criticism of the article "Six monetarist errors: why emission won't feed inflation"

(Be gentle, it's my first RI attempt :P. I hope I can do justice to the subject; this is my layman understanding of many macro topics, which may be flawed. I hope you can enlighten me if I have fallen short of a good RI.)
Introduction
So, today a heterodox-leaning Argentinian newspaper, Ambito Financiero, published an article criticizing monetarism called "Six monetarist errors: why emission won't feed inflation". I find that it doesn't properly address monetarism, confuses it with other "economic schools" (for whatever that term is worth today), and may be misleading, so I was inspired to write a refutation and share it with all of you.
In some ways, criticizing monetarism is more of a historical discussion, given that the mainstream has moved on since then. New Keynesian models are the bleeding edge now, not Milton Friedman-style monetarism. That these things keep being discussed is more a symptom of Argentinian political culture being stuck in the 1970s on economics.
Before getting to the meat of the argument, it's good to have in mind some common definitions about money supply measures (specifically, MB, M1 and M2). These definitions apply to US but one can find analogous stuff for other countries.
Argentina, lacking access to credit because of its economic mismanagement, and facing a drop in government income because of the recession, is monetizing deficits far more than before (apparently around half of the budget is money-financed), yet we have seen some disinflation (worth mentioning: widespread price freezes have been in place for a few months). The author reasons that monetary phenomena cannot properly explain inflation, that other explanations are needed, and condemns monetarism. Here are the six points he makes:
1 - Is it a mechanical rule?
This way, we can ask by symmetry: if it is certain that when emission increases, inflation increases, the reverse should happen when emission turns negative, yielding negative inflation. Nonetheless, we know this does not happen: prices increase easily and are quite rigid downward. So the identity between emission and inflation doesn't hold like that; deflation almost never happens, and the rhythm of price movements cannot be controlled with money quantity alone. There is no mechanical relationship between one thing and the other.
First, the low-hanging fruit: deflation is not that uncommon. For those of you who live in the US or Europe this should be obvious, given the difficulties central banks have had achieving their targets, but even Argentina saw deflation during its depression 20 years ago.
Second, we have to be careful with what we mean by emission. A statement of the quantity theory of money (extracted from "Money Growth and Inflation: How Long is the Long-Run?") would say:
Inflation occurs when the average level of prices increases. Individual price increases in and of themselves do not equal inflation, but an overall pattern of price increases does. The price level observed in the economy is that which leads the quantity of money supplied to equal the quantity of money demanded. The quantity of money supplied is largely controlled by the [central bank]. When the supply of money increases or decreases, the price level must adjust to equate the quantity of money demanded throughout the economy with the quantity of money supplied. The quantity of money demanded depends not only on the price level but also on the level of real income, as measured by real gross domestic product (GDP), and a variety of other factors including the level of interest rates and technological advances such as the invention of automated teller machines. Money demand is widely thought to increase roughly proportionally with the price level and with real income. That is, if prices go up by 10 percent, or if real income increases by 10 percent, empirical evidence suggests people want to hold 10 percent more money. When the money supply grows faster than the money demand associated with rising real incomes and other factors, the price level must rise to equate supply and demand. That is, inflation occurs. This situation is often referred to as too many dollars chasing too few goods. Note that this theory does not predict that any money-supply growth will lead to inflation—only that part of money supply growth that exceeds the increase in money demand associated with rising real GDP (holding the other factors constant).
So it's not mere emission, but money supply growing faster than money demand, that we should consider. Negative emission is therefore not a necessary condition for deflation in this theory.
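In growth-rate form, the quantity equation underlying the excerpt above boils down to simple arithmetic. A minimal sketch (the function name and the illustrative numbers are mine, not the article's or the paper's):

```python
# Quantity-theory sketch in growth-rate form: from MV = PY,
#     %change(P) ~ %change(M) + %change(V) - %change(Y)
# so inflation tracks money growth in excess of real-income growth,
# holding velocity roughly stable. Illustrative numbers only.

def qtom_inflation(money_growth: float, real_gdp_growth: float,
                   velocity_growth: float = 0.0) -> float:
    """Approximate inflation (in percent) implied by the quantity equation."""
    return money_growth + velocity_growth - real_gdp_growth

# 10% M2 growth fully absorbed by 10% real-income growth: no inflation.
print(qtom_inflation(10.0, 10.0))  # -> 0.0

# Money supply grows 10 points faster than money demand: ~10% inflation.
print(qtom_inflation(13.0, 3.0))   # -> 10.0
```

This is why negative emission isn't required for deflation: money growth merely below the growth of money demand already pushes the price level down.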
It's worth mentioning that the relationship with prices is observed for a broad measure of money (M2), and after a lag. From the same source one can observe, in Fig. 3a, that the correlation between inflation and money growth for the US becomes stronger the longer the data is averaged. Price rigidities don't have to change this long-term relationship per se.
But what about causality and Argentina? This neat paper runs regressions over two historical periods, 1976-1989 and 1991-2001. The same relationship between M2 and inflation is observed: stronger in the first, highly inflationary period and weaker in the second, more stable one. The regressions show a roughly 1-to-1 relationship in the high-inflation period but deviate a bit in the low-inflation period (the relationship is still there). Granger causality, as interpreted in the paper, shows prices caused money growth in the high-inflation period (arguably because spending was monetized), while the reverse was true in the more stable period.
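To make the "1-to-1 relationship" concrete, here is a toy version of the kind of lagged OLS fit such papers run. The series below are synthetic and the unit slope is built in by construction; this only illustrates what a coefficient near 1 means, it is not the paper's data or method:

```python
# Toy lagged regression of inflation on money growth. Synthetic data:
# inflation tracks the previous year's M2 growth one-for-one plus a
# constant, so OLS recovers a slope of exactly 1.

def ols_slope(x, y):
    """Slope of y = a + b*x by ordinary least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

m2_growth = [40, 80, 120, 300, 600, 450, 200]       # annual %, made up
inflation = [g + 5 for g in m2_growth[:-1]]          # next-year inflation, lagged

slope = ols_slope(m2_growth[:-1], inflation)
print(round(slope, 2))  # -> 1.0, i.e. a 1-to-1 money-inflation relationship
```

In a low-inflation economy, shocks to velocity and money demand are large relative to money growth, which is one way to see why the fitted slope drifts away from 1 there.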
So one can argue that there is a mechanical relationship, albeit one more complicated than simple QTOM theory. The relationship is complicated for low-inflation economies too; it becomes more relevant the higher inflation is.
Another point the author makes is that the liquidity trap is often ignored. I'll set aside the fact that you need specific conditions for the liquidity trap to be relevant to Argentina and address the point anyway. Worth noting that while market monetarists (not exactly old-fashioned monetarists) prefer alternative explanations for monetary policy at very low interest rates, this phenomenon has a good monetary basis, as explained by Krugman in his famous Japanese liquidity trap paper and his NYT blog (see this and this for some relevant articles). The simplified version is that while inflation may follow M2 growth, with all the qualifiers needed, central banks may find it difficult to target inflation when interest rates are low and agents are used to credible inflation targets. Central banks can change MB, not M2; in normal times that is good enough, but at those times M2 is out of control and "credibly irresponsible" policies are needed to return to normal (a more detailed explanation can be found in the paper I just linked, go for it if you are still curious).
It's not that monetary policy doesn't work; it's that central banks have to do very unconventional things to achieve their targets in a low-interest-rate environment. It's still an open problem, but given that symmetric inflation-targeting policies are becoming more popular, I'm optimistic.
2 - Has inflation one or many causes?
In Argentina we know that the main determinant of inflation is increases in the dollar price. On top of that act economic concentration in key markets, utility price adjustments, fuel prices, distributive struggles, external commodity values, expectations, productive disequilibria, world interest rates, the economic cycle, seasonality, and external-sector restrictions.
Let's see a simple example: during Macri's government, from mid-2017 to 2019, emission was practically null, but when the dollar's value doubled in 2018, inflation doubled too (it went from 24% to 48% in 2018), and it went up again a year later. We see here that the empirical validity of monetarist theory was absent.
For the first paragraph, one could try to run econometric tests on all those variables, at least from my layman perspective. But since it doesn't pass the smell test (has any country exploited this while ignoring monetary policy? Also, I have shown above that there is at least some evidence for the money-price relationship), I'll instead address what happened under Macri's government and whether monetarism (or at least some reasonable extension of it) can account for it.
For a complete description of macroeconomic policy in that period, Sturzenegger's account is a good one (even if a bit unreliable, given that he was the central banker for that government and is considered to have been a failure). The short version: central banks use bonds to manage monetary policy and absorb money; given the country's history of defaults, the Argentinian Central Bank (BCRA) uses its own peso-denominated bonds instead of treasury bonds. In that period, the BCRA still financed the treasury, but the amount was reduced. It also emitted pesos to buy dollar reserves and then sterilized them, perhaps risking credibility further.
Near the end of 2017 it was evident that the government had limited appetite for budget cuts; it had more or less abandoned its inflation-targeting regime, and the classic problem of fiscal dominance emerged, as shown in the classic "Some Unpleasant Monetarist Arithmetic" paper by Sargent and Wallace. Monetary policy becomes less effective when the real value of bonds falls, and raising interest rates may be counterproductive in that environment. Rational expectations are needed to complement QTOM.
So, given that Argentina's reforms were going nowhere, it was expected that money financing would increase at some point in the future; BCRA bonds were dumped in 2018 and 2019 as their perceived value fell, and so peso demand decreased. It's not that the dollar's value increased and inflation followed; rather, peso demand fell suddenly!
The IMF deal required MB growth to be null or almost null, but that doesn't say much about M2 (which is the relevant variable here). Without credible policies, peso demand keeps falling because bonds are dumped even more (see 2019 for a hilariously brutal example).
It's not emission per se, but emission failing to adjust to falling peso demand. That doesn't mean raising interest rates is enough to fix it, following the Sargent and Wallace model.
This is less a strict proof that a monetary phenomenon is involved and more a statement that the author hasn't shown any problem with it; there are reasonable models for this situation. It doesn't look like a clear empirical failure to me yet.
3 - What are we talking about when we talk about emission?
The author mentions several money measures (M0, M1, M2) but doesn't address them meaningfully, as I tried to do above. It feels more like a rhetorical device, because there is no point here beyond "this stuff exists".
Also, it's worth pointing out that there are real criticisms to make of Friedman on those grounds. He failed to forecast US inflation at some points when he switched to M1 instead of M2, although he later reverted. Monetarism kind of "failed" there (it also "failed" in the sense that modern central banks use interest rates, not money, as their main tool; "failed" because, despite being outdated, it was influential on modern central banking). This is often brought up in these kinds of discussions as if economics hadn't moved beyond that. For an account of Friedman's thoughts on monetary policy and his failures, see this.
4 - Why do many countries print and inflation doesn't increase there?
There is a mention of the Japanese situation in the 90s (the liquidity trap), which I have addressed.
The author mentions that many countries "printed" like crazy during the pandemic, and he says:
Monetarism's apologists answer, when confronted with those grave empirical problems happening in "serious countries", that the population "trusts" its monetary authorities, with money demand even increasing in those places despite the emission. Curious, though: it's an appeal to "trust", implying that the relationship between emission and inflation is not objective but subjective and cultural, an appreciation that abandons the mechanistic view and the basic certainty of monetarism, because evaluations and diagnostics, often ideological, contextual or historical, intervene.
That's just a restatement of rational expectations applied to central bank operations. I don't see a problem with it. Rational expectations is not magic; it's an assessment of future outcomes by economic actors. Humans may not be 100% rational, but central banking somehow works in many countries. You cannot just say that people are ideologues and leave it at that. What's your model?
Worth noting the author shills for bitcoin a bit in this section, for more cringe.
5 - Are we talking of a physical science or a social science?
Again, a vague mention of rational expectations ("populist and pro-market politicians could enact the same policies with different results because of how agents respond ideologically, and expectations") without handling the subject meaningfully. It criticizes universal macroeconomic rules that apply everywhere (this is more often used to dismiss evidence from other countries uncritically than as a meaningful point).
6 - How do limits work?
The last question to monetarism allows us to concede it something: effectively, we can think of a kind of link between emission and inflation under extreme conditions. Otherwise, with no monetary rule, no government would need taxes; it could emit and spend all it needs without consequence. We know it's not like that: no government can print infinitely without undesirable effects.
Ok, good disclaimer, but given what he wrote before, what is the mechanism that makes money printing inflationary at some point? It was rejected before, but now it seems to exist. What was even the point of the article?
Now, the problem is thinking of monetarism at its extremes: without emission we sometimes have inflation; at other times we have emission without inflation; we know that negative emission doesn't guarantee negative inflation, but that if emission is radically uncontrolled there will be economic effects.
As I wrote above, that's not what monetarism says (even in its simpler form), nor a consequence of it. You can see some deviations in low-inflation environments, but that's not really Argentina's current situation.
Let's add other problems: the elasticity between money and prices is not evident. Neither are the time lags over which it works or is neutral. So the question is the limit cases for monetarism, which has some merit but some difficulty in explaining them: by which means, and at which moments, the rules work, and when they don't.
I find the time-lag point to be a red herring. You can observe the relationship empirically, and not having a proper short- or medium-run model doesn't invalidate QTOM in the long run. While raising interest rates or freezing MB may not be effective, that's less a problem of the theory and more a problem of policy implementation.
Conclusion:
I find that the article doesn't really get monetarism to begin with (see the points it makes about emission and money demand), nor how it's implemented in practice, nor does it seem aware of more modern theories that, while putting money in the background, don't necessarily invalidate it (rational expectations, and eventually New Keynesian models, which handle things like liquidity traps properly).
There are proper criticisms to be made of Friedman's old ideas, but he was still a relevant man in his time, and the economic community has moved on to new, better theories that owe some debt to his. I feel most economic discussion about monetarism in Argentina is a strawman of mainstream economics, or an attack on Austrians, more than a set of genuine points ("monetarism" is used as shorthand for those who think inflation is a monetary phenomenon, rather than referring to Friedman and his disciples per se).
submitted by Neronoah to badeconomics [link] [comments]

Scaling Reddit Community Points with Arbitrum Rollup: a piece of cake

Submitted for consideration to The Great Reddit Scaling Bake-Off
Baked by the pastry chefs at Offchain Labs
Please send questions or comments to [[email protected] ](mailto:[email protected])
1. Overview
We're excited to submit Arbitrum Rollup for consideration to The Great Reddit Scaling Bake-Off. Arbitrum Rollup is the only Ethereum scaling solution that supports arbitrary smart contracts without compromising on Ethereum's security or adding points of centralization. For Reddit, this means that Arbitrum can not only scale the minting and transfer of Community Points, but it can foster a creative ecosystem built around Reddit Community Points enabling points to be used in a wide variety of third party applications. That's right -- you can have your cake and eat it too!
Arbitrum Rollup isn't just Ethereum-style: its Layer 2 transactions are byte-for-byte identical to Ethereum's, which means Ethereum users can continue to use their existing addresses and wallets, and Ethereum developers can continue to use their favorite toolchains and development environments out of the box with Arbitrum. By coupling Arbitrum’s tooling compatibility with its trustless asset interoperability, Reddit can not only scale but also onboard the entire Ethereum community at no cost, by giving them the same experience they already know and love (well, certainly know).
To benchmark how Arbitrum can scale Reddit Community Points, we launched the Reddit contracts on an Arbitrum Rollup chain. Since Arbitrum provides full Solidity support, we didn't have to rewrite the Reddit contracts or try to mimic their functionality using an unfamiliar paradigm. Nope, none of that. We launched the Reddit contracts unmodified on Arbitrum Rollup complete with support for minting and distributing points. Like every Arbitrum Rollup chain, the chain included a bridge interface in which users can transfer Community Points or any other asset between the L1 and L2 chains. Arbitrum Rollup chains also support dynamic contract loading, which would allow third-party developers to launch custom ecosystem apps that integrate with Community Points on the very same chain that runs the Reddit contracts.
1.1 Why Ethereum
Perhaps the most exciting benefit of distributing Community Points using a blockchain is the ability to seamlessly port points to other applications and use them in a wide variety of contexts. Applications may include simple transfers such as a restaurant that allows Redditors to spend points on drinks. Or it may include complex smart contracts -- such as placing Community Points as a wager for a multiparty game or as collateral in a financial contract.
The common denominator among all the fun uses of Reddit points is that they need a thriving ecosystem of both users and developers, and the Ethereum blockchain is perhaps the only smart contract platform with significant adoption today. While many Layer 1 blockchains boast lower cost or higher throughput than Ethereum, more often than not these attributes mask the reality of little usage, weaker security, or both.
Perhaps another platform with significant usage will rise in the future. But today, Ethereum captures the mindshare of the blockchain community, and for Community Points to provide the most utility, the Ethereum blockchain is the natural choice.
1.2 Why Arbitrum
While Ethereum's ecosystem is unmatched, the reality is that fees are high and capacity is too low to support the scale of Reddit Community Points. Enter Arbitrum. Arbitrum Rollup provides all of the ecosystem benefits of Ethereum, but with orders of magnitude more capacity and at a fraction of the cost of native Ethereum smart contracts. And most of all, we don't change the experience for users. They continue to use the same wallets, addresses, languages, and tools.
Arbitrum Rollup is not the only solution that can scale payments, but it is the only developed solution that can scale both payments and arbitrary smart contracts trustlessly, which means that third party users can build highly scalable add-on apps that can be used without withdrawing money from the Rollup chain. If you believe that Reddit users will want to use their Community Points in smart contracts--and we believe they will--then it makes the most sense to choose a single scaling solution that can support the entire ecosystem, eliminating friction for users.
We view being able to run smart contracts in the same scaling solution as fundamentally critical, since if there's significant demand for running smart contracts from Reddit's ecosystem, this would be a load on Ethereum and would itself require a scaling solution. Moreover, having different scaling solutions for the minting/distribution/spending of points and for third-party apps would be burdensome for users, as they'd have to constantly shuffle their Points back and forth.
2. Arbitrum at a glance
Arbitrum Rollup has a unique value proposition as it offers a combination of features that no other scaling solution achieves. Here we highlight its core attributes.
Decentralized. Arbitrum Rollup is as decentralized as Ethereum. Unlike some other Layer 2 scaling projects, Arbitrum Rollup doesn't have any centralized components or centralized operators who can censor users or delay transactions. Even in non-custodial systems, centralized components provide a risk as the operators are generally incentivized to increase their profit by extracting rent from users often in ways that severely degrade user experience. Even if centralized operators are altruistic, centralized components are subject to hacking, coercion, and potential liability.
Massive Scaling. Arbitrum achieves order-of-magnitude scaling over Ethereum's L1 smart contracts. Our software currently supports 453 transactions per second for basic transactions (at 1616 Ethereum gas per tx). We have a lot of room left to optimize (e.g. aggregating signatures), and over the next several months capacity will increase significantly. As described in detail below, Arbitrum can easily support and surpass Reddit's anticipated initial load, and its capacity will continue to improve as Reddit's capacity needs grow.
Low cost. The cost of running Arbitrum Rollup is quite low compared to L1 Ethereum and other scaling solutions such as those based on zero-knowledge proofs. Layer 2 fees are low, fixed, and predictable and should not be overly burdensome for Reddit to cover. Nobody needs to use special equipment or high-end machines. Arbitrum requires validators, which is a permissionless role that can be run on any reasonable on-line machine. Although anybody can act as a validator, in order to protect against a “tragedy of the commons” and make sure reputable validators are participating, we support a notion of “invited validators” that are compensated for their costs. In general, users pay (low) fees to cover the invited validators’ costs, but we imagine that Reddit may cover this cost for its users. See more on the costs and validator options below.
Ethereum Developer Experience. Not only does Arbitrum support EVM smart contracts, but the developer experience is identical to that of L1 Ethereum contracts and fully compatible with Ethereum tooling. Developers can port existing Solidity apps or write new ones using their favorite and familiar toolchains (e.g. Truffle, Buidler). There are no new languages or coding paradigms to learn.
Ethereum wallet compatibility. Just as in Ethereum, Arbitrum users need only hold keys, but do not have to store any coin history or additional data to protect or access their funds. Since Arbitrum transactions are semantically identical to Ethereum L1 transactions, existing Ethereum users can use their existing Ethereum keys with their existing wallet software such as Metamask.
Token interoperability. Users can easily transfer their ETH, ERC-20 and ERC-721 tokens between Ethereum and the Arbitrum Rollup chain. As we explain in detail below, it is possible to mint tokens in L2 that can subsequently be withdrawn and recognized by the L1 token contract.
Fast finality. Transactions complete with the same finality time as Ethereum L1 (and it's possible to get faster finality guarantees by trading away trust assumptions; see the Arbitrum Rollup whitepaper for details).
Non-custodial. Arbitrum Rollup is a non-custodial scaling solution, so users control their funds/points and neither Reddit nor anyone else can ever access or revoke points held by users.
Censorship Resistant. Since it's completely decentralized, and the Arbitrum protocol guarantees progress trustlessly, Arbitrum Rollup is just as censorship-proof as Ethereum.
Block explorer. The Arbitrum Rollup block explorer allows users to view and analyze transactions on the Rollup chain.
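As a sanity check on the throughput figure quoted above (453 tps at 1616 Ethereum gas per basic transaction), one can work out the ceiling implied by L1 capacity. The block gas limit and block time below are our assumptions about 2020-era mainnet, not figures from the submission:

```python
# Back-of-the-envelope ceiling on rollup throughput, given the L1 gas
# cost per basic Arbitrum transaction quoted in the text.

GAS_PER_TX = 1616            # L1 gas per basic rollup tx (from the text)
BLOCK_GAS_LIMIT = 12_500_000 # assumed Ethereum block gas limit (2020-era)
BLOCK_TIME_S = 13            # assumed average block time, seconds

l1_gas_per_second = BLOCK_GAS_LIMIT / BLOCK_TIME_S
tps_at_full_capacity = l1_gas_per_second / GAS_PER_TX

# Theoretical max if every unit of L1 gas carried rollup transactions.
print(round(tps_at_full_capacity))  # -> 595
```

The quoted 453 tps sits below this ceiling, consistent with the later statement that the benchmarks do not assume Arbitrum consumes the entire capacity of Ethereum.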
Limitations
Although this is a bake-off, we're not going to sugar coat anything. Arbitrum Rollup, like any Optimistic Rollup protocol, does have one limitation, and that's the delay on withdrawals.
As for the concrete length of the delay, we've done a good deal of internal modeling and have blogged about this as well. Our current modeling suggests a 3-hour delay is sufficient (but as discussed in the linked post there is a tradeoff space between the length of the challenge period and the size of the validators’ deposit).
Note that this doesn't mean that the chain is delayed for three hours. Arbitrum Rollup supports pipelining of execution, which means that validators can keep building new states even while previous ones are “in the pipeline” for confirmation. As the challenge delays expire for each update, a new state will be confirmed (read more about this here).
So activity and progress on the chain are not delayed by the challenge period. The only thing that's delayed is the consummation of withdrawals. Recall though that any single honest validator knows immediately (at the speed of L1 finality) which state updates are correct and can guarantee that they will eventually be confirmed, so once a valid withdrawal has been requested on-chain, every honest party knows that the withdrawal will definitely happen. There's a natural place here for a liquidity market in which a validator (or someone who trusts a validator) can provide withdrawal loans for a small interest fee. This is a no-risk business for them as they know which withdrawals will be confirmed (and can force their confirmation trustlessly no matter what anyone else does) but are just waiting for on-chain finality.
3. The recipe: How Arbitrum Rollup works
For a description of the technical components of Arbitrum Rollup and how they interact to create a highly scalable protocol with a developer experience that is identical to Ethereum, please refer to the following documents:
Arbitrum Rollup Whitepaper
Arbitrum academic paper (describes a previous version of Arbitrum)
4. Developer docs and APIs
For full details about how to set up and interact with an Arbitrum Rollup chain or validator, please refer to our developer docs, which can be found at https://developer.offchainlabs.com/.
Note that the Arbitrum version described on that site is older and will soon be replaced by the version we are entering in Reddit Bake-Off, which is still undergoing internal testing before public release.
5. Who are the validators?
As with any Layer 2 protocol, advancing the protocol correctly requires at least one validator (sometimes called block producers) that is honest and available. A natural question is: who are the validators?
Recall that the validator set for an Arbitrum chain is open and permissionless; anyone can start or stop validating at will. (A useful analogy is to full nodes on an L1 chain.) But we understand that even though anyone can participate, Reddit may want to guarantee that highly reputable nodes are validating their chain. Reddit may choose to validate the chain themselves and/or hire third-party validators. To this end, we have begun building a marketplace for validator-for-hire services so that dapp developers can outsource validation services to reputable nodes with high up-time. We've announced a partnership in which Chainlink nodes will provide Arbitrum validation services, and we expect to announce more partnerships shortly with other blockchain infrastructure providers.
Although there is no requirement that validators are paid, Arbitrum’s economic model tracks validators’ costs (e.g. amount of computation and storage) and can charge small fees on user transactions, using a gas-type system, to cover those costs. Alternatively, a single party such as Reddit can agree to cover the costs of invited validators.
6. Reddit Contract Support
Since Arbitrum contracts and transactions are byte-for-byte compatible with Ethereum, supporting the Reddit contracts is as simple as launching them on an Arbitrum chain.
Minting. Arbitrum Rollup supports hybrid L1/L2 tokens which can be minted in L2 and then withdrawn onto the L1. An L1 contract at address A can make a special call to the EthBridge which deploys a "buddy contract" to the same address A on an Arbitrum chain. Since it's deployed at the same address, users can know that the L2 contract is the authorized "buddy" of the L1 contract on the Arbitrum chain.
For minting, the L1 contract is a standard ERC-20 contract which mints and burns tokens when requested by the L2 contract. It is paired with an ERC-20 contract in L2 which mints tokens based on whatever programmer provided minting facility is desired and burns tokens when they are withdrawn from the rollup chain. Given this base infrastructure, Arbitrum can support any smart contract based method for minting tokens in L2, and indeed we directly support Reddit's signature/claim based minting in L2.
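The paired-contract flow described above can be modeled in a few lines. This is a Python stand-in for the Solidity contracts, with invented names, meant only to show the mint-on-L2 / burn-and-withdraw mechanics, not Arbitrum's actual implementation:

```python
# Toy model of the hybrid L1/L2 token: points are minted on L2 by some
# programmer-provided facility, and a withdrawal burns them on L2 while
# minting the same amount on the paired L1 ERC-20-style contract.

class L1Token:
    """Stand-in for the L1 ERC-20 that mints when the L2 buddy requests it."""
    def __init__(self):
        self.balances = {}

    def mint(self, user, amount):
        self.balances[user] = self.balances.get(user, 0) + amount

class L2Token:
    """Stand-in for the buddy contract deployed at the same address on L2."""
    def __init__(self):
        self.balances = {}

    def mint(self, user, amount):            # e.g. a signature/claim mint
        self.balances[user] = self.balances.get(user, 0) + amount

    def withdraw(self, user, amount, l1):
        assert self.balances.get(user, 0) >= amount
        self.balances[user] -= amount        # burn on L2...
        l1.mint(user, amount)                # ...mint on L1 via the bridge

l1, l2 = L1Token(), L2Token()
l2.mint("alice", 100)         # 100 points minted in L2
l2.withdraw("alice", 40, l1)  # 40 points withdrawn onto the L1 contract
print(l2.balances["alice"], l1.balances["alice"])  # -> 60 40
```

Total supply across the two layers is conserved by construction, which is the invariant the buddy-contract pairing exists to guarantee.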
Batch minting. What's better than a mint cookie? A whole batch! In addition to supporting Reddit’s current minting/claiming scheme, we built a second minting design, which we believe outperforms the signature/claim system in many scenarios.
In the current system, Reddit periodically issues signed statements to users, who then take those statements to the blockchain to claim their tokens. An alternative approach would have Reddit directly submit the list of users/amounts to the blockchain and distribute the tokens to the users without the signature/claim process.
To optimize the cost efficiency of this approach, we designed an application-specific compression scheme to minimize the size of the batch distribution list. We analyzed the data from Reddit's previous distributions and found that the data is highly compressible since token amounts are small and repeated, and addresses appear multiple times. Our function groups transactions by size, and replaces previously-seen addresses with a shorter index value. We wrote client code to compress the data, wrote a Solidity decompressing function, and integrated that function into Reddit’s contract running on Arbitrum.
When we ran the compression function on the previous Reddit distribution data, we found that we could compress batched minting data down to 11.8 bytes per minting event (averaged over a 6-month trace of Reddit’s historical token grants), compared with roughly 174 bytes of on-chain data needed for the signature/claim approach to minting (roughly 43 for an RLP-encoded null transaction + 65 for Reddit's signature + 65 for the user's signature + roughly 8 for the number of Points).
The relative benefit of the two approaches with respect to on-chain calldata cost depends on the percentage of users who will actually claim their tokens on chain. With the above figures, batch minting is cheaper whenever more than roughly 5% of users redeem their claims. We stress that our compression scheme is not Arbitrum-specific and would be beneficial in any general-purpose smart contract platform.
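A quick sanity check of the break-even point, using only the byte counts above (this ignores per-batch overhead and per-byte gas differences, so it lands slightly above the rough 5% figure, but at the same order of magnitude):

```python
# Break-even redemption fraction: batch minting pays ~11.8 bytes for every user,
# while the signature/claim approach pays ~174 bytes only for users who redeem.
BATCH_BYTES_PER_MINT = 11.8
CLAIM_BYTES_PER_MINT = 174.0   # null tx + Reddit sig + user sig + Points amount

break_even = BATCH_BYTES_PER_MINT / CLAIM_BYTES_PER_MINT
# ≈ 6.8% by raw byte counts -- the same order as the rough 5% figure in the text
print(f"batch minting wins once more than ~{break_even:.1%} of users redeem")
```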
8. Benchmarks and costs
In this section, we give the full costs of operating the Reddit contracts on an Arbitrum Rollup chain, including the L1 gas costs for the Rollup chain, the costs of computation and storage for the L2 validators, and the capital lockup requirements for staking.
Arbitrum Rollup is still on testnet, so we did not run mainnet benchmarks. Instead, we measured the L1 gas cost and L2 workload for Reddit operations on Arbitrum and calculated the total cost assuming current Ethereum gas prices. As noted below in detail, our measurements do not assume that Arbitrum is consuming the entire capacity of Ethereum. We will present the details of our model now, but for full transparency you can also play around with it yourself and adjust the parameters by copying the spreadsheet found here.
Our cost model is based on measurements of Reddit’s contracts, running unmodified (except for the addition of a batch minting function) on Arbitrum Rollup on top of Ethereum.
On the distribution of transactions and frequency of assertions. Reddit's instructions specify the following minimum parameters that submissions should support:
Over a 5 day period, your scaling PoC should be able to handle:
  • 100,000 point claims (minting & distributing points)
  • 25,000 subscriptions
  • 75,000 one-off points burning
  • 100,000 transfers
We provide the full costs of operating an Arbitrum Rollup chain with this usage under the assumption that tokens are minted or granted to users in batches, but other transactions are uniformly distributed over the 5 day period. Unlike some other submissions, we do not make unrealistic assumptions that all operations can be submitted in enormous batches. We assume that batch minting is done in batches that use only a few percent of an L1 block's gas, and that other operations come in evenly over time and are submitted in batches, with one batch every five minutes to keep latency reasonable. (Users are probably already waiting for L1 finality, which takes at least that long to achieve.)
We note that assuming only 300,000 transactions arriving uniformly over the 5 day period makes our benchmark numbers look worse, but we believe that this reflects the true cost of running the system. To see why, say that batches are submitted every five minutes (20 L1 blocks) and there's a fixed overhead of c bytes of calldata per batch, the cost of which gets amortized over all transactions executed in that batch. Assume that each individual transaction adds a marginal cost of t. Lastly, assume the capacity of the scaling system is high enough that it can support all of Reddit's 300,000 transactions within a single 20-block batch (i.e. that there are more than c + 300,000*t bytes of calldata available in 20 blocks).
Consider what happens if c, the per-batch overhead, is large (which it is in some systems, but not in Arbitrum). In the scenario where transactions actually arrive at the system's capacity and each batch is full, c gets amortized over 300,000 transactions. But if we assume that the system is not running at capacity -- and only receives 300,000 transactions arriving uniformly over 5 days -- then each 20-block assertion will contain only about 200 transactions, and each transaction will pay a nontrivial cost due to c.
We are aware that other proposals presented scaling numbers assuming that 300,000 transactions arrived at maximum capacity and were executed in a single mega-transaction, but according to our estimates, for at least one such report, this led to a reported gas price that was 2-3 orders of magnitude lower than it would have been assuming uniform arrival. We make more realistic batching assumptions, and we believe Arbitrum compares well when batch sizes are realistic.
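The amortization argument above can be made concrete. With batches every five minutes over five days, the per-transaction share of a per-batch overhead c differs sharply between the uniform-arrival and mega-batch assumptions (the values of c and t below are purely illustrative, not measured figures):

```python
# Per-transaction overhead amortization under the two arrival assumptions above.
# c = fixed calldata overhead per batch (bytes), t = marginal bytes per transaction.

TOTAL_TX  = 300_000
DAYS      = 5
BATCH_MIN = 5                                 # one batch every five minutes
batches   = DAYS * 24 * 60 // BATCH_MIN       # 1440 batches over the 5-day trace
tx_per_batch = TOTAL_TX / batches             # ~208, i.e. "about 200" per assertion

def cost_per_tx(c, t, n):
    """Amortized on-chain bytes per transaction when n transactions share a batch."""
    return c / n + t

# Illustrative numbers only -- the effect is what matters, not these values:
c, t = 10_000, 12
mega_batch   = cost_per_tx(c, t, TOTAL_TX)       # all 300k tx in one batch
uniform_rate = cost_per_tx(c, t, tx_per_batch)   # ~200 tx per 20-block batch
# Uniform arrival pays ~48 extra bytes/tx of overhead; the mega-batch pays ~0.03.
```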
Our model. Our cost model includes several sources of cost:
  • L1 gas costs: This is the cost of posting transactions as calldata on the L1 chain, as well as the overhead associated with each batch of transactions, and the L1 cost of settling transactions in the Arbitrum protocol.
  • Validator’s staking costs: In normal operation, one validator will need to be staked. The stake is assumed to be 0.2% of the total value of the chain (which is assumed to be $1 per user who is eligible to claim points). The cost of staking is the interest that could be earned on the money if it were not staked.
  • Validator computation and storage: Every validator must do computation to track the chain’s processing of transactions, and must maintain storage to keep track of the contracts’ EVM storage. The cost of computation and storage are estimated based on measurements, with the dollar cost of resources based on Amazon Web Services pricing.
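As one example of how these line items combine, the staking cost can be computed directly from the stated assumptions (the 0.2% stake and $1-per-eligible-user chain value come from the model above; the interest rate here is a hypothetical input, not a figure from the proposal):

```python
# Staking-cost line item of the model above. The 0.2% stake fraction and the
# $1-per-eligible-user chain value are from the text; the interest rate is a
# hypothetical input used only for illustration.

def annual_staking_cost(eligible_users, stake_fraction=0.002, interest_rate=0.05):
    chain_value = eligible_users * 1.0     # $1 per user eligible to claim points
    stake = stake_fraction * chain_value   # one staked validator in normal operation
    return stake * interest_rate           # forgone interest on the locked stake

# e.g. 100,000 eligible users -> $200 staked -> ~$10/year forgone at a 5% rate
print(annual_staking_cost(100_000))
```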
It’s clear from our modeling that the predominant cost is for L1 calldata. This will probably be true for any plausible rollup-based system.
Our model also shows that Arbitrum can scale to workloads much larger than Reddit’s nominal workload, without exhausting L1 or L2 resources. The scaling bottleneck will ultimately be calldata on the L1 chain. We believe that cost could be reduced substantially if necessary by clever encoding of data. (In our design any compression / decompression of L2 transaction calldata would be done by client software and L2 programs, never by an L1 contract.)
9. Status of Arbitrum Rollup
Arbitrum Rollup is live on Ethereum testnet. All of the code written to date, including everything in the Reddit demo, is open source and permissively licensed under the Apache V2 license. The first testnet version of Arbitrum Rollup was released on testnet in February. Our current internal version, which we used to benchmark the Reddit contracts, will be released soon and will be a major upgrade.
Both the Arbitrum design and the implementation have been heavily audited by independent third parties. The Arbitrum academic paper was published at USENIX Security, a top-tier peer-reviewed academic venue. For the Arbitrum software, we have engaged Trail of Bits for a security audit, which is currently ongoing, and we are committed to having a clean report before launching on Ethereum mainnet.
10. Reddit Universe Arbitrum Rollup Chain
The benchmarks described in this document were all measured using the latest internal build of our software. When we release the new software upgrade publicly we will launch a Reddit Universe Arbitrum Rollup chain as a public demo, which will contain the Reddit contracts as well as a Uniswap instance and a Connext Hub, demonstrating how Community Points can be integrated into third party apps. We will also allow members of the public to dynamically launch ecosystem contracts. We at Offchain Labs will cover the validating costs for the Reddit Universe public demo.
If the folks at Reddit would like to evaluate our software prior to our public demo, please email us at [email protected] and we'd be more than happy to provide early access.
11. Even more scaling: Arbitrum Sidechains
Rollups are an excellent approach to scaling, and we are excited about Arbitrum Rollup which far surpasses Reddit's scaling needs. But looking forward to Reddit's eventual goal of supporting hundreds of millions of users, there will likely come a time when Reddit needs more scaling than any Rollup protocol can provide.
While Rollups greatly reduce costs, they don't break the linear barrier. That is, all transactions have an on-chain footprint (because all calldata must be posted on-chain), albeit a far smaller one than on native Ethereum, and the L1 limitations end up being the bottleneck for capacity and cost. Since Ethereum has limited capacity, this linear use of on-chain resources means that costs will eventually increase superlinearly with traffic.
The good news is that we at Offchain Labs have a solution in our roadmap that can satisfy this extreme-scaling setting as well: Arbitrum AnyTrust Sidechains. Arbitrum Sidechains are similar to Arbitrum Rollup, but deviate in that they name a permissioned set of validators. When a chain’s validators agree off-chain, they can greatly reduce the on-chain footprint of the protocol and require almost no data to be put on-chain. When validators can't reach unanimous agreement off-chain, the protocol reverts to Arbitrum Rollup. Technically, Arbitrum Sidechains can be viewed as a hybrid between state channels and Rollup, switching back and forth as necessary, and combining the performance and cost that state channels can achieve in the optimistic case, with the robustness of Rollup in other cases. The core technical challenge is how to switch seamlessly between modes and how to guarantee that security is maintained throughout.
Arbitrum Sidechains break through this linear barrier, while still maintaining a high level of security and decentralization. Arbitrum Sidechains provide the AnyTrust guarantee, which says that as long as any one validator is honest and available (even if you don't know which one will be), the L2 chain is guaranteed to execute correctly according to its code and guaranteed to make progress. Unlike in a state channel, offchain progress does not require unanimous consent, and liveness is preserved as long as there is a single honest validator.
Note that the trust model for Arbitrum Sidechains is much stronger than for typical BFT-style chains, which introduce a consensus "voting" protocol among a small permissioned group of validators. BFT-based protocols require a supermajority (more than 2/3) of validators to agree. In Arbitrum Sidechains, by contrast, all you need is a single honest validator to achieve guaranteed correctness and progress. Notice that in Arbitrum adding validators strictly increases security, since the AnyTrust guarantee provides correctness as long as any one validator is honest and available. By contrast, in BFT-style protocols, adding nodes can be dangerous, as a coalition of dishonest nodes can break the protocol.
Like Arbitrum Rollup, the developer and user experiences for Arbitrum Sidechains will be identical to those of Ethereum. Reddit would be able to choose a large and diverse set of validators, and all it would need to guarantee to break through the scaling barrier is that a single one of them remains honest.
We hope to have Arbitrum Sidechains in production in early 2021, and thus when Reddit reaches a scale that surpasses the capacity of Rollups, Arbitrum Sidechains will be waiting and ready to help.
While the idea to switch between channels and Rollup to get the best of both worlds is conceptually simple, getting the details right and making sure that the switch does not introduce any attack vectors is highly non-trivial and has been the subject of years of our research (indeed, we were working on this design for years before the term Rollup was even coined).
12. How Arbitrum compares
We include a comparison to several other categories, as well as specific projects where appropriate, and explain why we believe that Arbitrum is best suited for Reddit's purposes. We focus our attention on other Ethereum projects.
Payment-only Rollups. Compared to Arbitrum Rollup, ZK-Rollups and other Rollups that support only token transfers have several disadvantages:
  • As outlined throughout the proposal, we believe that the entire draw of Ethereum is in its rich smart contracts support which is simply not achievable with today's zero-knowledge proof technology. Indeed, scaling with a ZK-Rollup will add friction to the deployment of smart contracts that interact with Community Points as users will have to withdraw their coins from the ZK-Rollup and transfer them to a smart contract system (like Arbitrum). The community will be best served if Reddit builds on a platform that has built-in, frictionless smart-contract support.
  • All other Rollup protocols of which we are aware employ a centralized operator. While it's true that users retain custody of their coins, the centralized operator can often profit from censoring, reordering, or delaying transactions. A common misconception is that since these are non-custodial protocols, a centralized sequencer does not pose a risk; this is incorrect, as the sequencer can wreak havoc or shake down users for side payments without directly stealing funds.
  • Sidechain-type protocols can eliminate some of these issues, but they are not trustless. Instead, they require trust in some quorum of a committee, often requiring two-thirds of the committee to be honest, compared to rollup protocols like Arbitrum that require only a single honest party. In addition, not all sidechain-type protocols have committees that are diverse, or even non-centralized, in practice.
  • Plasma-style protocols have a centralized operator and do not support general smart contracts.
13. Concluding Remarks
While it's ultimately up to the judges’ palate, we believe that Arbitrum Rollup is the bakeoff choice that Reddit kneads. We far surpass Reddit's specified workload requirement at present, have much room to optimize Arbitrum Rollup in the near term, and have a clear path to get Reddit to hundreds of millions of users. Furthermore, we are the only project that gives developers and users the identical interface as the Ethereum blockchain and is fully interoperable and tooling-compatible, and we do this all without any new trust assumptions or centralized components.
But no matter how the cookie crumbles, we're glad to have participated in this bake-off and we thank you for your consideration.
About Offchain Labs
Offchain Labs, Inc. is a venture-funded New York company that spun out of Princeton University research, and is building the Arbitrum platform to usher in the next generation of scalable, interoperable, and compatible smart contracts. Offchain Labs is backed by Pantera Capital, Compound VC, Coinbase Ventures, and others.
Leadership Team
Ed Felten
Ed Felten is Co-founder and Chief Scientist at Offchain Labs. He is on leave from Princeton University, where he is the Robert E. Kahn Professor of Computer Science and Public Affairs. From 2015 to 2017 he served at the White House as Deputy United States Chief Technology Officer and senior advisor to the President. He is an ACM Fellow and member of the National Academy of Engineering. Outside of work, he is an avid runner, cook, and L.A. Dodgers fan.
Steven Goldfeder
Steven Goldfeder is Co-founder and Chief Executive Officer at Offchain Labs. He holds a PhD from Princeton University, where he worked at the intersection of cryptography and cryptocurrencies including threshold cryptography, zero-knowledge proof systems, and post-quantum signatures. He is a co-author of Bitcoin and Cryptocurrency Technologies, the leading textbook on cryptocurrencies, and he has previously worked at Google and Microsoft Research, where he co-invented the Picnic signature algorithm. When not working, you can find Steven spending time with his family, taking a nature walk, or twisting balloons.
Harry Kalodner
Harry Kalodner is Co-founder and Chief Technology Officer at Offchain Labs where he leads the engineering team. Before the company he attended Princeton as a Ph.D candidate where his research explored economics, anonymity, and incentive compatibility of cryptocurrencies, and he also has worked at Apple. When not up at 3:00am writing code, Harry occasionally sleeps.
submitted by hkalodner to ethereum

Lines of Navigation | Monthly Portfolio Update - July 2020

Our little systems have their day;
They have their day and cease to be
- Tennyson, In Memoriam A.H.H.
This is my forty-fourth portfolio update. I complete this update monthly to check my progress against my goal.
Portfolio goal
My objective is to reach a portfolio of $2 180 000 by 1 July 2021. This would produce a real annual income of about $87 000 (in 2020 dollars).
This portfolio objective is based on an expected average real return of 3.99 per cent, or a nominal return of 6.49 per cent.
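The arithmetic connecting these figures can be checked directly from the numbers above:

```python
# The portfolio objective arithmetic from the figures above.
TARGET = 2_180_000
REAL_RETURN = 0.0399       # expected average real return
NOMINAL_RETURN = 0.0649    # expected nominal return

income = TARGET * REAL_RETURN            # ~$87,000 pa in 2020 dollars
implied_inflation = NOMINAL_RETURN - REAL_RETURN   # the ~2.5% gap between the two
print(round(income))
```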
Portfolio summary
Total portfolio value: $1 800 119 (+$34 376 or 1.9%)
Asset allocation
Presented visually, below is a high-level view of the current asset allocation of the portfolio.
[Chart]
Comments
The portfolio has substantially increased this month, continuing the recovery in portfolio value since March.
The strong portfolio growth of over $34 000, or 1.9 per cent, returns the value of the portfolio close to that achieved at the end of February this year.
[Chart]
This month there was minimal movement in the value of Australian and global equity holdings. There was, however, a significant lift of around 6 per cent in the value of gold exchange traded fund units, as well as a rise in the value of Bitcoin holdings.
These movements have pushed the value of gold holdings to their highest level so far on the entire journey. Their total value has approximately doubled since the original major purchases across 2009 to 2015.
For most of the past year gold has functioned as a portfolio stabiliser, having a negative correlation to movements in Australian equities (of around -0.3 to -0.4). As low and negative bond rates spread across the world, however, the opportunity cost of holding gold is reduced, and its potential diversification benefits loom larger.
The fixed income holdings of the portfolio also continued to fall beneath the target allocation, making this question of what represents a defensive (or negatively correlated to equity) asset far from academic.
This steady fall is a function of the slow maturing of Ratesetter loans, which were largely made between 2015 and 2017. Ratesetter has recently advised of important changes to its market operation, and placed a fixed maximum cap on new loan rates. By replacing market-set rates with maximum rates, the peer-to-peer lending platform appears to be shifting to more of an 'intermediated' role in which higher past returns (of around 8 to 9 per cent) will no longer be possible.
[Chart]
The expanding value of gold and Bitcoin holdings since January last year has actually had the practical effect of driving new investments into equities, since effectively for each dollar of appreciation, for example, my target allocation to equities rises by seven dollars.
Consistent with this, investments this month have been in the Vanguard international shares exchange-traded fund (VGS) using Selfwealth. This has been directed to bring my actual asset allocation more closely in line with the target split between Australian and global shares.
Fathoming out: franking credits and portfolio distributions
Earlier last month I released a summary of portfolio income over the past half year. This, like all before it, noted that the summary was prepared on a purely 'cash' basis, reflecting dividends actually paid into a bank account, and excluding consideration of franking credits.
Franking credits are credits for company tax paid at the company level, which can be passed to individual shareholders, reducing their personal tax liability. They are not cash, but for a personal investor with tax liabilities they can have equivalent value. This means that comparing equity returns to other investments without factoring these credits can produce a distorted picture of an investor's final after-tax return.
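For a fully franked dividend at Australia's 30% company tax rate, the attached credit follows the standard gross-up formula (a general illustration of the mechanism, not a statement about this portfolio's actual holdings or tax position):

```python
# Franking credit attached to a franked dividend at the 30% company tax rate.
# General illustration of the gross-up mechanism described above.
COMPANY_TAX_RATE = 0.30

def franking_credit(cash_dividend, franking_pct=1.0):
    """Credit = dividend * rate / (1 - rate), scaled by the franked percentage."""
    return cash_dividend * COMPANY_TAX_RATE / (1 - COMPANY_TAX_RATE) * franking_pct

# A $700 fully franked cash dividend carries a $300 credit: grossed-up taxable
# income is $1,000, with $300 of company tax already paid on the investor's behalf.
print(round(franking_credit(700), 2))
```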
In past portfolio summaries I have noted an estimate for franking credits in footnotes, but updating the value for this recently resulted in a curiosity about the overall significance of this neglected element of my equity returns.
This neglect resulted from my perception earlier in the journey that they represented a marginal and abstract factor, which could effectively be assumed away for the sake of simplicity in reporting.
This is not a wholly unfair view, in the sense that income physically received and able to be spent is something definably different in kind from a notional 'pre-payment' credit for future tax costs. Yet, since personal tax is, as the saying goes, as certain as death, in some senses a credit of this kind can be as valuable as a cash distribution.
Restoring the record: trends and drivers of franking credits
To collect a more accurate picture of the trends and drivers of franking credits I relied on a few sources - tax statements, records and the automatic franking credit estimates that the portfolio tracking site Sharesight generates.
The chart below sets out both the level and major different sources of franking credits received over the past eleven years.
[Chart]
From this chart some observations can be made.
The key reason for the rapid growth over the recent decade has been the increased investment holdings in Australian equities. As part of the deliberate rebalancing towards Australian shares across the past two years, these holdings have expanded.
The chart below sets out the total value of Australian shares held over the comparable period.
[Chart]
As an example, at the beginning of this record Australian equities valued at around $276 000 were held. Three years later, the holdings were nearly three times larger.
The phase of consistently increasing the Australian equities holding to meet its allocated weighting is largely complete. This means that the period of rapid growth seen in the past few years is unlikely to repeat. Rather, growth will revert to be in proportion to total portfolio growth.
Close to cross-over: the credit card records
One of the most powerful initial motivators to reach financial independence was the concept of the 'cross-over' point in Vicki Robin and Joe Dominguez's Your Money or Your Life. This is the point at which monthly expenses are exceeded by investment income.
One of the metrics I have traced is this 'cross-over' point in relation to recorded credit card expenses. And this point is now close indeed.
Expenditures on the credit card have continued their downward trajectory across the past month. The three year rolling average of monthly credit card spending remains at its lowest point over the period of the journey. Distributions on the same basis now meet over 99 per cent of card expenses - with the gap now the equivalent of less than $50 per month.
[Chart]
The notional and contingent form of financial independence first achieved in April has continued.
The below chart illustrates this temporary state, setting out the extent to which portfolio distributions (red) cover estimated total expenses (green), measured month to month.
[Chart]
An alternative way to view the same data is to examine the degree to which total expenses (i.e. fixed payments not made on credit card added to monthly credit card expenses) are met by distributions received.
An updated version of this is seen in the chart below.
[Chart]
Interestingly, on a trend basis, this currently identifies a 'crossing over' point of trend distributions fully meeting total expenditure from around November 2019. This is not conclusive, however, as the trend curve is sensitive to the unusual COVID-19 related observations of the first half of this year, and could easily shift further downward if normal expense patterns resume.
One issue this analysis raises is what to do with the 'credit card purchases' measure reported below. This measure is designed to provide a stylised benchmark of how close the current portfolio is to a target of generating the income required to meet an annual average credit card expenditure of $71 000.
The problem with this is that continued falling credit card spending means that average credit card spending is lower than that benchmark for all time horizons - measured as three and four year averages, or in fact taken as a whole since 2013. So the set benchmark may, if anything, be understating actual progress compared to the graphs and data above by not reflecting changing spending levels.
In the past I have addressed this trend by reducing the benchmark. Over coming months, or perhaps at the end of the year, I will need to revisit both the meaning, and method, of setting this measure.
Progress
Progress against the objective, and the additional measures I track, is set out below.
Measure Portfolio All Assets
Portfolio objective – $2 180 000 (or $87 000 pa) 82.6% 111.5%
Credit card purchases – $71 000 pa 100.7% 136.0%
Total expenses – $89 000 pa 80.7% 109.0%
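The first "Portfolio" cell of this table can be reproduced directly from the figures above (the "All Assets" column and the other rows depend on holdings and spending data not given here):

```python
# Reproducing the first "Portfolio" cell of the progress table above.
portfolio = 1_800_119    # total portfolio value this month
objective = 2_180_000    # portfolio objective

progress = portfolio / objective * 100
print(f"{progress:.1f}%")   # matches the 82.6% shown in the table
```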
Summary
One of the most challenging aspects of closing in on a fixed numerical target for financial independence with risk assets still in place is that the updrafts and downdrafts of market movements can push the goal further away, or surprisingly close.
There have been long periods of the journey where the total value of the portfolio has barely grown, despite regular investments being made. As an example, the portfolio ended 2018 lower than it started the year. The past six months have been another such period. This can create a sense of treading water.
Yet amidst the economic devastation affecting real lives and businesses, this is an extremely fortunate position to be in. Australia and the globe are set to experience an economic contraction far more severe than the Global Financial Crisis, with a lesser capacity than previously for interest rates to cushion the impact. Despite similar measures being adopted by governments to address the downturn, it is not clear whether these are fit for purpose.
Asset allocation in this environment - of being almost suspended between two realities - is a difficult problem. The history of markets can tell us that just when assets seem most 'broken', they can produce outsized returns. Yet the problem remains that, far from markets being broken, bubble-like conditions appear to be proliferating.
This recent podcast discussion with the founder of Grant's Interest Rate Observer provided useful historical context for current financial conditions this month. One of the themes of the conversation was 'thinking the unthinkable', such as a return of inflation. Similarly, this Hoover Institution video discussion, with a 'Back from the future' premise, provides some entertaining, informed and insightful views on the surprising and contingent nature of what we know to be true.
Some of our little systems may well have had their day, but what could replace them remains obscured to any observer.
The post, links and full charts can be seen here.
submitted by thefiexpl to fiaustralia [link] [comments]
