Minimum Viable Issuance - Why Ethereum’s lack of a hard cap on ETH issuance is a good thing.

This post will explain how the argument the average Bitcoin maximalist reaches for when talking about issuance, believing they have found Ethereum’s Achilles’ heel, actually highlights one of Ethereum’s strong points and one of the main threats to the longevity of the Bitcoin network.
So first let’s answer the question which I know many people have about Ethereum:

What is Ethereum’s ETH issuance schedule?

Ethereum has an issuance policy of Minimum Viable Issuance. So what does this mean exactly? It means that the issuance of ETH will be as low as possible while still maintaining a sufficient budget to pay miners (and soon-to-be stakers) to keep the network secure. For example, if ETH issuance were halved, miners would drop off the network and stop mining, as it would no longer be profitable for them to mine. As a result, the network would be less secure, because it would cost less money for an attacker to control 51% of the hash power and attack the network. This means that the Ethereum community plans to change ETH issuance over time to maintain a reasonable security budget which keeps the network secure while also keeping inflation in check. We have done this twice in the past with EIP-649 and EIP-1234, which reduced block rewards from 5 ETH per block to 3 ETH and from 3 ETH to 2 ETH respectively. I previously made a graph of ETH issuance over time here: https://redd.it/it8ce7
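For a rough sense of what those reward reductions meant in absolute terms, here is a back-of-the-envelope sketch. The ~6,500 blocks/day figure assumes roughly 13-second block times and ignores uncle rewards, so treat the outputs as ballpark numbers only:

```rust
fn main() {
    // Approximate annual PoW issuance at each block-reward level,
    // assuming ~6,500 blocks per day (~13 s blocks), ignoring uncle rewards.
    let blocks_per_year = 6_500.0 * 365.0;
    for &(era, reward_eth) in &[("pre-EIP-649", 5.0), ("EIP-649", 3.0), ("EIP-1234", 2.0)] {
        println!("{era}: ~{:.1}M ETH/year", reward_eth * blocks_per_year / 1e6);
    }
}
```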
So while Ethereum doesn’t have a strictly defined issuance schedule, the community will reject any proposal which puts the security of the network at risk (such as the recent EIP-2878), and it will equally reject any proposal which leads to excessive network security and therefore an unnecessarily high inflation rate (conversely, it will accept proposals which reduce issuance once the price, and therefore the security budget, rises). This means that when Bitcoiners accuse the Ethereum Foundation of being no better than a central bank because they can “print more Ether”, this is completely untrue. Any proposal made by the EF which would increase issuance unnecessarily would be rejected by the community in the same way that a proposal to increase the supply of Bitcoin from 21 million to 22 million would be rejected. There is a social contract around both Bitcoin’s and Ethereum’s issuance schedules. Any network or proposal which breaks the social contract of 21 million Bitcoins or minimum viable issuance of Ether would be a breach of that contract; the newly proposed network would be labeled by the community as illegitimate and the original network would live on.

So why is minimum viable issuance better than a hard cap?

Minimum viable issuance is better than a hard cap because it puts the most important part of the network first: the security. MVI ensures that the Ethereum network will always have a security budget which keeps the cost of a 51% attack impractically high. Bitcoin, on the other hand, halves its security budget every 4 years until eventually only the transaction fees pay for network security. This means that every 4 years, the amount of money paying for network security halves, until eventually the value of attacking the network becomes greater than the security budget and someone performs a 51% attack. (Technically the security budget only halves in terms of BTC, not in dollars. However, even if the price of Bitcoin more than doubles in the time that the security budget halves, the ratio of security budget to value secured on the network still halves, doubling the financial viability of attacking the network.) The strategy to pay for the security budget once Bitcoin issuance stops is for transaction fees to secure the network, since transaction fees are paid to miners. Not only does this have its own security problems which I won’t detail here, but unless Bitcoin scales on layer 1 (layer 2 scaling solutions have their own security mechanisms separate from L1), fees would have to run well into the thousands of dollars to secure a trillion-dollar market cap Bitcoin that is secured by nothing but fees. If Bitcoin maximalists want a 10 trillion or 100 trillion dollar market cap, then expect fees to go up another 10 or 100 times from there.
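To make the halving dynamic concrete, here is a quick sketch of the shrinking subsidy, using the 210,000-block halving interval and an average of ~144 blocks per day; the 6.25 BTC starting subsidy reflects the May 2020 halving:

```rust
fn main() {
    // Bitcoin's block subsidy halves every 210,000 blocks (~4 years).
    let mut subsidy_btc = 6.25; // subsidy after the May 2020 halving
    let blocks_per_year = 144.0 * 365.0; // ~10-minute block target

    for halvings_from_now in 0..4 {
        let annual_issuance = subsidy_btc * blocks_per_year;
        println!(
            "+{} halvings: {:.4} BTC/block, ~{:.0} BTC issued per year",
            halvings_from_now, subsidy_btc, annual_issuance
        );
        subsidy_btc /= 2.0; // the security budget in BTC terms halves with it
    }
}
```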
Ethereum, on the other hand, will be able to keep its network secure with approximately 1-2% annual issuance being paid to stakers under ETH 2.0. This is because not all of the network will be staking: if 33 million of the approximately 110 million Ether in existence stakes under ETH 2.0, then paying this 33 million Ether 6% a year (a very decent yield!) would cost just under 2 million ETH per year, which equates to less than 2% annual ETH inflation. This is also before considering EIP-1559, which will burn a portion of transaction fees, countering the effect of this inflation and potentially even making ETH deflationary if the sum of all burned transaction fees is greater than the annual issuance. Also, under ETH 2.0, an attacker performing a 51% attack would have their funds slashed (they would lose their funds), meaning that they can only perform a 51% attack once. In Bitcoin, however, anyone who controls 51% of the mining hash power could perform multiple 51% attacks without losing everything, as they would under ETH 2.0.
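The staking arithmetic above checks out with a few lines (the 33M ETH staked and 6% yield figures are the post’s own assumptions):

```rust
fn main() {
    let staked_eth = 33_000_000.0;    // assumed amount staked under ETH 2.0
    let total_supply = 110_000_000.0; // approximate ETH supply
    let staking_yield = 0.06;         // 6% per year paid to stakers

    let annual_issuance = staked_eth * staking_yield;  // ~1.98M ETH/year
    let inflation = annual_issuance / total_supply;    // ~1.8% annual inflation

    println!(
        "issuance: ~{:.2}M ETH/year, inflation: ~{:.2}% (before EIP-1559 burns)",
        annual_issuance / 1e6,
        inflation * 100.0
    );
}
```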
So in conclusion, while Ethereum doesn’t have the guaranteed anti-inflation security of a hard cap, it does have the guarantee of always paying its miners (or stakers under ETH 2.0) enough to keep the network secure. In contrast, while Bitcoin’s social contract may guarantee a hard cap of 21 million, it cannot simultaneously guarantee network security in the long run. Eventually, its users will have to decide whether they want a secure network with more than 21 million coins, a tax to pay for security, or an insecure network with super-high fees and a hard cap of 21 million Bitcoin.
Disclaimer: The details I covered around 51% attacks and network security are simplified. I am not an expert in this field and things are a lot more nuanced than I laid out in my simplifications above.
submitted by Tricky_Troll to ethfinance [link] [comments]


Syscoin Platform’s Great Reddit Scaling Bake-off Proposal

We are excited to participate and present Syscoin Platform's ideal characteristics and capabilities towards a well-rounded Reddit Community Points solution!
Our scaling solution for Reddit Community Points involves 2-way peg interoperability with Ethereum. This will provide a scalable token layer built specifically for speed and high volumes of simple value transfers at a very low cost, while providing sovereign ownership and onchain finality.
Token transfers scale by taking advantage of a globally sorting mempool that provides probabilistically secure assumptions of “as good as settled”. The opportunity here for token receivers is app-layer interactivity on the speed/security tradeoff (99.9999% assurance within 10 seconds). We call this Z-DAG, and it achieves high throughput across a mesh network topology presently composed of about 2,000 geographically dispersed full nodes. Similar to Bitcoin, these nodes benefit network security by running full nodes; additionally, they are incentivized to do so through a bonded validator scheme. These nodes do not participate in the consensus of transactions or block validation any differently than other nodes, and therefore do not degrade Bitcoin’s validate-first-then-trust security model across every node. Each token transfer settles on-chain. The protocol follows Bitcoin Core policies, so it has adequate code coverage and protocol hardening to qualify as production-quality software. It shares a significant portion of Bitcoin’s own hashpower through merged-mining.
This platform as a whole can serve token microtransactions, larger settlements, and store-of-value in an ideal fashion, providing probabilistic scalability whilst remaining decentralized according to Bitcoin design. It is accessible to ERC-20 via a permissionless and trust-minimized bridge that works in both directions. The bridge and token platform are currently available on the Syscoin mainnet. This has been gaining recent attention for use by loyalty point programs and stablecoins such as Binance USD.

Solutions

Syscoin Foundation identified a few paths for Reddit to leverage this infrastructure, each with trade-offs. The first provides the most cost-savings and scaling benefits at some sacrifice of token autonomy. The second offers more preservation of autonomy with a narrower scope of cost savings than the first option, but meaningful savings even so. The third introduces more complexity than the previous two yet provides the most overall benefits. We consider the third the most viable, as it enables Reddit to benefit even while retaining existing smart contract functionality. We will focus on the third option, and include the first two for good measure.
  1. Distribution, burns and user-to-user transfers of Reddit Points are entirely carried out on the Syscoin network. This full-on approach to utilizing the Syscoin network provides the most scalability and transaction cost benefits of these scenarios. The tradeoff here is distribution and subscription handling likely migrating away from smart contracts into the application layer.
  2. The Reddit Community Points ecosystem can continue to use existing smart contracts as they are used today on the Ethereum mainchain. Users migrate a portion of their tokens to Syscoin, the scaling network, to gain much lower fees, scalability, and a proven base layer, without sacrificing sovereign ownership. They would use Syscoin for user-to-user transfers: tips redeemable in ten seconds or less, a high-throughput relay network, and onchain settlement at a block target of 60 seconds.
  3. Integration between Matic Network and Syscoin Platform - similar to Syscoin’s current integration with Ethereum - will provide Reddit Community Points with EVM scalability (including the Memberships ERC777 operator) on the Matic side, and performant simple value transfers, robust decentralized security, and sovereign store-of-value on the Syscoin side. It’s “the best of both worlds”. The trade-off is more complex interoperability.

Syscoin + Matic Integration

Matic and Blockchain Foundry Inc, the public company formed by the founders of Syscoin, recently entered a partnership for joint research and business development initiatives. This is ideal for all parties as Matic Network and Syscoin Platform provide complementary utility. Syscoin offers characteristics for sovereign ownership and security based on Bitcoin’s time-tested model, and shares a significant portion of Bitcoin’s own hashpower. Syscoin’s focus is on secure and scalable simple value transfers, trust-minimized interoperability, and opt-in regulatory compliance for tokenized assets rather than scalability for smart contract execution. On the other hand, Matic Network can provide scalable EVM for smart contract execution. Reddit Community Points can benefit from both.
Syscoin + Matic integration is actively being explored by both teams, as it is helpful to Reddit, Ethereum, and the industry as a whole.

Proving Performance & Cost Savings

Our POC focuses on 100,000 on-chain settlements of token transfers on the Syscoin Core blockchain. Transfers and burns perform equally on Syscoin. For POCs related to smart contracts (subscriptions, etc.), refer to the Matic Network proposal.
On-chain settlement of 100k transactions was accomplished within roughly twelve minutes, far surpassing Reddit’s five-day expectation. This was performed using six full nodes operating on compute-optimized AWS c4.2xlarge instances which were geographically distributed (Virginia, London, São Paulo, Oregon, Singapore, Germany). A higher quantity of settlements could be reached within the same time-frame with more broadcasting nodes involved, or by using hosts with more resources for faster execution of the process.
Addresses used: 100,014
The demonstration was executed using this tool. The results can be seen in the following blocks:
612722: https://sys1.bcfn.ca/block/6d47796d043bb4c508d29123e6ae81b051f5e0aaef849f253c8f3a6942a022ce
612723: https://sys1.bcfn.ca/block/8e2077f743461b90f80b4bef502f564933a8e04de97972901f3d65cfadcf1faf
612724: https://sys1.bcfn.ca/block/205436d25b1b499fce44c29567c5c807beaca915b83cc9f3c35b0d76dbb11f6e
612725: https://sys1.bcfn.ca/block/776d1b1a0f90f655a6bbdf559ff5072459cbdc5682d7615ff4b78c00babdc237
612726: https://sys1.bcfn.ca/block/de4df0994253742a1ac8ac9eec8d2a8c8b0a6d72c53d6f3caa29bb6c171b0a6b
612727: https://sys1.bcfn.ca/block/e5e167c52a9decb313fbaadf49a5e34cb490f8084f642a850385476d4ef10d70
612728: https://sys1.bcfn.ca/block/ab64d989edc71890e7b5b8491c20e9a27520dc45a5f7c776d3dae79057f59fe7
612729: https://sys1.bcfn.ca/block/5e8b7ecd0e36f99d07e4ea6e135fc952bf7ec30164ab6f4d1e98b0f2d405df6d
612730: https://sys1.bcfn.ca/block/d395df3d31dde60bbb0bece6bd5b358297da878f0beb96be389e5f0e043580a3
It is important to note that this POC is not focused on Z-DAG. The performance of Z-DAG has been benchmarked within realistic network conditions: Whiteblock’s audit is publicly available. Network latency tests showed an average TPS around 15k with burst capacity up to 61k, while the zero-latency control group exhibited ~150k TPS. Mainnet testing of the Z-DAG network is achievable and will require further coordination and additional resources.
Even further optimizations are expected in the upcoming Syscoin Core release, which will implement a UTXO model for our token layer, bringing further efficiency as well as opening the door to additional scaling technology currently under research by our team and academic partners. At present our token layer is account-based, similar to Ethereum. Opt-in compliance structures will also be introduced soon, offering some positive performance characteristics as well. It makes the most sense to implement these optimizations before performing another benchmark for Z-DAG, especially on the mainnet, considering the resources required to stress-test this network.

Cost Savings

Total cost for these 100k transactions: $0.63 USD
See the live fee comparison for savings estimation between transactions on Ethereum and Syscoin. Below is a snapshot at time of writing:
ETH price: $318.55
ETH gas price: 55.00 Gwei (≈ $0.37 for a simple transfer)
Syscoin price: $0.11
Snapshot of live fee comparison chart
Z-DAG provides a more efficient fee-market. A typical Z-DAG transaction costs 0.0000582 SYS. Tokens can be safely redeemed/re-spent within seconds or allowed to settle on-chain beforehand. The costs should remain about this low for microtransactions.
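As a quick sanity check on those figures (and on the $0.63 total reported above), multiplying the per-transaction fee by the quoted SYS price reproduces the POC cost:

```rust
fn main() {
    let transactions = 100_000.0;
    let fee_sys_per_tx = 0.0000582; // typical Z-DAG transaction fee in SYS
    let sys_price_usd = 0.11;       // snapshot price quoted above

    let total_sys = transactions * fee_sys_per_tx; // 5.82 SYS
    let total_usd = total_sys * sys_price_usd;     // ≈ $0.64
    println!("100k transfers: {total_sys:.2} SYS ≈ ${total_usd:.2}");
}
```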
Syscoin will achieve further reduction of fees and even greater scalability with offchain payment channels for assets, with Z-DAG as a resilience fallback. New payment channel technology is one of the topics under research by the Syscoin development team with our academic partners at TU Delft. In line with the calculation in the Lightning Network white paper, payment channels using assets with Syscoin Core will bring theoretical capacity for each person on Earth (7.8 billion) to have five on-chain transactions per year, per person, without requiring anyone to enter a fee market (aka “wait for a block”). This exceeds the minimum LN expectation of two transactions per person, per year: one to exist on-chain and one to settle aggregated value.
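For a sense of the scale implied by that claim, here is a rough conversion into sustained on-chain throughput (our arithmetic for illustration, not a figure from the proposal):

```rust
fn main() {
    let people = 7.8e9; // world population assumed in the claim
    let tx_per_person_per_year = 5.0;
    let seconds_per_year = 365.25 * 24.0 * 3600.0;

    // Sustained load if those transactions were spread evenly over the year.
    let tps = people * tx_per_person_per_year / seconds_per_year;
    println!("implied sustained on-chain load: ~{:.0} tx/s", tps);
}
```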

Tools, Infrastructure & Documentation

Syscoin Bridge

Mainnet Demonstration of Syscoin Bridge with the Basic Attention Token ERC-20
A two-way blockchain interoperability system that uses Simple Payment Verification to enable:
  • Any Standard ERC-20 token to be moved from Ethereum to the Syscoin blockchain as a Syscoin Platform Token (SPT), and back to Ethereum
  • Any SPT to be moved from Syscoin to the Ethereum blockchain as an ERC-20 token, and back to Syscoin
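Conceptually, the Ethereum → Syscoin direction works like the sketch below: lock the ERC-20 on one chain, prove the transaction via SPV, then mint on the other (the SPT → ERC-20 direction runs the same way in reverse). Every name and type here is a hypothetical stand-in for illustration; the actual Sysethereum contracts and agent APIs differ.

```rust
/// Hypothetical SPV proof of an Ethereum transaction (illustrative only).
struct SpvProof {
    tx_hash: String,            // hash of the lock transaction on Ethereum
    merkle_branch: Vec<String>, // inclusion path within the block
    block_header: String,       // header the branch is verified against
}

// Step 1: freeze ERC-20 tokens in the bridge contract on Ethereum (stubbed).
fn lock_erc20_on_ethereum(amount: u64, syscoin_dest: &str) -> SpvProof {
    println!("locking {amount} tokens, destination {syscoin_dest}");
    SpvProof {
        tx_hash: "0xabc…".into(),
        merkle_branch: vec![],
        block_header: "0xdef…".into(),
    }
}

// Step 2: present the proof on Syscoin; if it verifies against the mirrored
// Ethereum headers, the equivalent SPT amount is minted (verification stubbed).
fn mint_spt_on_syscoin(proof: &SpvProof, amount: u64, dest: &str) -> bool {
    !proof.tx_hash.is_empty() && amount > 0 && !dest.is_empty()
}

fn main() {
    let proof = lock_erc20_on_ethereum(1_000, "sys1q-example-address");
    let minted = mint_spt_on_syscoin(&proof, 1_000, "sys1q-example-address");
    println!("minted on Syscoin: {minted}");
}
```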

Benefits

  • Permissionless
  • No counterparties involved
  • No trading mechanisms involved
  • No third-party liquidity providers required
  • Cross-chain Fractional Supply - 2-way peg - Token supply maintained globally
  • ERC-20s gain vastly improved transactionality with the Syscoin Token Platform, along with the security of bitcoin-core-compliant PoW.
  • SPTs gain access to all the tooling, applications and capabilities of Ethereum for ERC-20, including smart contracts.

Source code

https://github.com/syscoin/?q=sysethereum
Main Subprojects

API

Tools to simplify using Syscoin Bridge as a service with dapps and wallets will be released some time after implementation of Syscoin Core 4.2. These will be based upon the same processes which are automated in the current live Sysethereum Dapp that is functioning with the Syscoin mainnet.

Documentation

Syscoin Bridge & How it Works (description and process flow)
Superblock Validation Battles
HOWTO: Provision the Bridge for your ERC-20
HOWTO: Setup an Agent
Developer & User Diligence

Trade-off

The Syscoin Ethereum Bridge is secured by Agent nodes participating in a decentralized and incentivized model that involves the roles of Superblock challengers and submitters. This model is open to participation. The benefits here are trust-minimization, permissionlessness, and potentially less legal/regulatory red tape than interop mechanisms that involve liquidity providers and/or trading mechanisms.
The trade-off is that, due to the decentralized nature, there are cross-chain settlement times of one hour to cross from Ethereum to Syscoin, and three hours to cross from Syscoin to Ethereum. We are exploring ways to reduce this time while maintaining decentralization via zero-knowledge proofs. Even so, an “instant bridge” experience could be provided by means of a third-party liquidity mechanism. That option exists but is not required for bridge functionality today. Typically bridges are used with batch value, not with high frequencies of smaller values, and generally it is advantageous to keep some value on both chains for maximum availability of utility. Even so, the cross-chain settlement time is worth mentioning here.

Cost

Ethereum -> Syscoin: Matic or Ethereum transaction fee for bridge contract interaction, negligible Syscoin transaction fee for minting tokens
Syscoin -> Ethereum: Negligible Syscoin transaction fee for burning tokens, 0.01% transaction fee paid to Bridge Agent in the form of the ERC-20, Matic or Ethereum transaction fee for contract interaction.

Z-DAG

Zero-Confirmation Directed Acyclic Graph is an instant settlement protocol that is used as a complementary system to proof-of-work (PoW) in the confirmation of Syscoin service transactions. In essence, a Z-DAG is simply a directed acyclic graph (DAG) where validating nodes verify the sequential ordering of transactions that are received in their memory pools. Z-DAG is used by the validating nodes across the network to ensure that there is absolute consensus on the ordering of transactions and no balances are overdrawn (no double-spends).
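As a conceptual illustration of that double-spend check (a sketch of the idea only, not Syscoin Core’s actual implementation):

```rust
use std::collections::HashMap;

/// A toy "pending ledger": reject any transfer that would overdraw a sender's
/// unconfirmed balance, which is the essence of Z-DAG's pre-PoW consistency check.
struct PendingLedger {
    balances: HashMap<String, u64>,
}

impl PendingLedger {
    fn apply(&mut self, from: &str, to: &str, amount: u64) -> bool {
        let from_balance = self.balances.get(from).copied().unwrap_or(0);
        if from_balance < amount {
            return false; // conflicting transfer: reject and do not relay
        }
        self.balances.insert(from.to_string(), from_balance - amount);
        *self.balances.entry(to.to_string()).or_insert(0) += amount;
        true
    }
}

fn main() {
    let mut mempool = PendingLedger {
        balances: HashMap::from([("alice".to_string(), 100)]),
    };
    assert!(mempool.apply("alice", "bob", 60));    // fine: alice has 100 pending
    assert!(!mempool.apply("alice", "carol", 60)); // would double-spend: rejected
}
```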

Benefits

  • Unique fee-market that is more efficient for microtransaction redemption and settlement
  • Uses decentralized means to enable tokens with value transfer scalability that is comparable or exceeds that of credit card networks
  • Provides high throughput and secure fulfillment even if blocks are full
  • Probabilistic and interactive
  • 99.9999% security assurance within 10 seconds
  • Can serve payment channels as a resilience fallback that is faster and lower-cost than falling-back directly to a blockchain
  • Each Z-DAG transaction also settles onchain through Syscoin Core at 60-second block target using SHA-256 Proof of Work consensus

Source code

https://github.com/syscoin/syscoin

API

Syscoin-js provides tooling for all Syscoin Core RPCs including interactivity with Z-DAG.

Documentation

Z-DAG White Paper
Useful read: An in-depth Z-DAG discussion between Syscoin Core developer Jag Sidhu and Brave Software Research Engineer Gonçalo Pestana

Trade-off

Z-DAG enables the ideal speed/security tradeoff to be determined per use-case in the application layer. It minimizes the sacrifice required to accept and redeem fast transfers/payments while providing more-than-ample security for microtransactions. This is supported on the premise that a Reddit user receiving points does need security, yet generally doesn’t want or need to wait for the same level of security as a nation-state settling an international trade debt. In any case, each Z-DAG transaction settles onchain at a block target of 60 seconds.

Syscoin Specs

Syscoin 3.0 White Paper
(4.0 white paper is pending. For improved scalability and less blockchain bloat, some features of v3 no longer exist in current v4: Specifically Marketplace Offers, Aliases, Escrow, Certificates, Pruning, Encrypted Messaging)
  • 16 MB of block bandwidth per minute, assuming SegWit witness-carrying transactions averaging ~200 bytes
  • SHA256 merge mined with Bitcoin
  • UTXO asset layer, with base Syscoin layer sharing identical security policies as Bitcoin Core
  • Z-DAG on asset layer, bridge to Ethereum on asset layer
  • On-chain scaling with prospect of enabling enterprise grade reliable trustless payment processing with on/offchain hybrid solution
  • Focus only on Simple Value Transfers. The MVP of a blockchain’s consensus footprint is balances and their ownership. Everything else can reduce data availability in exchange for scale (the Ethereum 2.0 model). We leave that to other designs; we focus on transfers.
  • Future integrations of MAST/Taproot to get more complex value transfers without trading off trustlessness or decentralization.
  • Zero-knowledge proofs are a new cryptographic frontier. We are dabbling here to generalize the concept of bridging and also to verify the state of a chain efficiently. We also apply them in our Digital Identity projects at Blockchain Foundry (a publicly traded company which develops Syscoin software for clients). We are also looking to integrate privacy-preserving payment channels for off-chain payments through a zkSNARK hub-and-spoke design which does not suffer from the HTLC attack vectors evident on LN. Many of the issues plaguing Lightning Network can be resolved using a zkSNARK design, whilst also providing the ability to do a multi-asset payment channel system. We have also found a showstopper attack (the American Call Option) on LN if multiple assets were used; that attack would not exist in a system such as this.

Wallets

Web3 and mobile wallets are under active development by Blockchain Foundry Inc as WebAssembly applications and expected for release not long after mainnet deployment of Syscoin Core 4.2. Both of these will be multi-coin wallets that support Syscoin, SPTs, Ethereum, and ERC-20 tokens. The Web3 wallet will provide functionality similar to Metamask.
Syscoin Platform and tokens are already integrated with Blockbook. Custom hardware wallet support currently exists via ElectrumSys. First-class HW wallet integration through apps such as Ledger Live will exist after 4.2.
Current supported wallets
Syscoin Spark Desktop
Syscoin-Qt

Explorers

Mainnet: https://sys1.bcfn.ca (Blockbook)
Testnet: https://explorer-testnet.blockchainfoundry.co

Thank you for close consideration of our proposal. We look forward to feedback, and to working with the Reddit community to implement an ideal solution using Syscoin Platform!

submitted by sidhujag to ethereum [link] [comments]

RiB Newsletter #14 – Are We Smart (Contract) Yet?

We’re seeing a bunch of interesting Rust blockchain and crypto projects, so this month the “Interesting Things” section is loaded up with news, papers, and project links.
This month, Elrond appeared on our radar with the launch of their mainnet. Although not written in Rust, it runs Rust smart contracts on its Arwen WASM VM, which itself is based on the Rust Wasmer VM. Along with NEAR, Nervos, and Enigma (and probably others), this continues an encouraging trend of blockchains enabling smart contracts in Rust. See the “Interesting Things” section for examples of Elrond’s Rust contracts.
Rust continues to be popular for research into zero-knowledge proofs, with Microsoft releasing Spartan, a zk-SNARK system without trusted setup.
In RiB news, we published a late one-year anniversary blog post. It has some reflection on the changes to, and growth of, RiB over the last year.
The Awesome Blockchain Rust project, which is maintained by Sun under the rust-in-blockchain GitHub org, has received a stream of updates recently, and is now published as the Awesome-RiB page on rustinblockchain.org.
It’s a pretty good resource for finding blockchain-related Rust projects, with links to many of the more prominent and mature projects noted in the RiB newsletter. It could use more eyes on it though.

Project Spotlight

Each month we like to shine a light on a notable Rust blockchain project. This month that project is…
ethers.rs
ethers.rs is an Ethereum & Celo library and wallet implementation, written as a port of the ethers.js library to Rust.
Ethereum client programming is usually done in JavaScript with either web3.js or ethers.js, ethers.js being the newer of the two. These clients communicate with an Ethereum node, typically via JSON-RPC (or, when in the browser, via an “injected” client provider that follows EIP-1193, like MetaMask).
ethers.rs then provides a strongly-typed alternative for writing software that interacts with the Ethereum network.
As of now it is only suited for non-browser use cases, but if you prefer hacking in Rust to JavaScript, as some of us surely do, it is worth looking into for your next Ethereum project.
The author of ethers.rs, Georgios Konstantopoulos, accepts donations to sponsor their work.
Note that there is also a Rust alternative to web3.js, rust-web3.
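For a flavour of what client code looks like, here is a minimal read-only sketch against the ethers-rs provider API (module paths and method names reflect our reading of the crate’s docs and may differ across versions; the RPC endpoint is a placeholder):

```rust
// Assumed Cargo.toml dependencies: ethers, and tokio with the "full" feature.
use ethers::providers::{Http, Middleware, Provider};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Any Ethereum JSON-RPC endpoint works here; this URL is a placeholder.
    let provider = Provider::<Http>::try_from("https://mainnet.example-rpc.io")?;

    // Two strongly-typed reads: the latest block number and the gas price.
    let block = provider.get_block_number().await?;
    let gas_price = provider.get_gas_price().await?;

    println!("latest block: {block}, gas price: {gas_price} wei");
    Ok(())
}
```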

Interesting Things

News

Blog Posts

Papers

Projects

Podcasts and Videos


Read more: https://rustinblockchain.org/newsletters/2020-08-05-are-we-smart-contract-yet/
submitted by Aimeedeer to rust [link] [comments]

Providing Some Clarity on Bitcoin Unlimited's Financial Decisions


Introduction

As promised in our previous article, we want to provide some extra clarity on Bitcoin Unlimited’s financial choices, as there has been a lot of confusion and misinformation within the community as to the reasons behind them.
It has been claimed by a small number of influential people in the ecosystem that Bitcoin Unlimited does not support BCH (see the previous article debunking this claim) and that BU’s holdings are supposedly evidence of this.

Background

Bitcoin Unlimited was founded in 2015 as a response to the Bitcoin block size debate. More specifically, it was created to provide software that allowed on-chain scaling as originally proposed by Satoshi Nakamoto. As we all know, on-chain scaling is a vital component required for peer-to-peer electronic cash to serve the world’s population. Without it, Bitcoin would be limited to serving only a small number of people willing and able to pay exorbitantly high fees. Our organisation was created to make Bitcoin unlimited. This prediction of high fees and limited capacity has played out in the BTC we know today, just as we predicted.
Bitcoin Unlimited received a large anonymous donation in BTC in 2016 from supporters of the ‘on-chain scaling’ movement. This donation allowed our organisation to remain independent and focussed on building software that allows on-chain scaling.
As you all know, in August of 2017, Bitcoin Cash was created after an unsuccessful multi-year effort to allow Bitcoin (BTC) to scale on-chain. Bitcoin Cash was created with the goal of on-chain scaling to support the world’s population right at its heart and BU has been supporting it since the idea was originally formulated.
Once Bitcoin Cash was created it also meant that all funds Bitcoin Unlimited held (BTC) were forked into two equal sets of coins, BTC and BCH. This put BU into a position where we had to make an important decision on how to handle these funds in a way that was in the interest of both BCH and BU.

Financial Prudence

Any organisation that wants to be effective in its goals must aim to always be financially sustainable. Without money, achieving anything becomes significantly more difficult. Cryptocurrencies only magnify this issue even further. Highly volatile asset values, opaque and dynamic tax and regulatory environments, and the unique properties of cryptocurrencies all contribute towards making the financial operations of an organisation an extreme challenge to say the least. Navigating this challenging landscape is a necessary requirement for the success of any organisation within our industry though.
While Bitcoin Unlimited’s primary goal is to make sure peer-to-peer electronic cash (as set out in the Bitcoin white-paper) becomes a reality, a secondary goal must be to make sure that it has the resources required to make its primary goal achievable, and an important part of these resources are its funds.
After Bitcoin forked into BTC and BCH, Bitcoin Unlimited held an equal number of both. Although a BUIP was passed to authorize some extra conversion, significant practical obstacles to doing so exist (this is still being worked on). However, since the overarching reason to convert a significant number of BTC to BCH is to maintain financial prudence (based on the reasons outlined below), and since the poor BCH price performance has heavily skewed our holdings, we do anticipate some rebalancing when these obstacles are resolved.
We will further expand on these reasons below.

Historic Volatility

It is a fact that BCH has historically been more volatile than BTC. An organisation that wishes to maintain a lower level of risk must aim to hold a majority of funds in assets which will maintain their value over time, i.e. be less volatile in their price. While there has been lots of progress and maturation of the BCH ecosystem, this price volatility is likely due to BCH still being a smaller and less developed ecosystem than BTC. The graphs below show levels of volatility in the two coins compared.

Volatility comparison charts: BTC and BCH
This higher volatility in BCH has meant that significantly increasing BU’s holdings of BCH for ideological reasons would expose the organisation to a higher level of risk. BTC is already a high-volatility asset, and the decision to expose the organisation’s funds to even higher volatility and further risk should not be taken on the basis of simplistic ideology, but rather with the strategy of maximising the organisation’s ability to achieve its primary goals. This meant deciding not to take on higher exposure to price volatility, and instead maintaining a more conservative risk profile.

Lack Of Say In The Protocol

One argument that has been put forward suggests that this decision does not make sense because it is analogous to the CEO of a company holding more shares in a competitor’s company. This analogy does not accurately reflect the current scenario for BU or BCH. In this analogy BU is the CEO and BCH is the company. Ignoring the shareholders, a CEO is able to have the largest impact on a company compared to any other stakeholder. Their actions have a direct impact on the operations of the company and therefore on its value and the value of its shares.
Unfortunately, Bitcoin Unlimited currently has little to no input on the BCH protocol. It has no way to directly influence the direction or success of BCH. There are two reasons for this. Firstly, BCH has a mining-software homogeneity that is as centralised as BTC’s (i.e. essentially all miners and pools run a single client, BitcoinABC). This means that, although BU has a slight majority in non-mining and in-consensus nodes, BU has no say in protocol decisions unless a collaborative and decentralised development model were to be used by BitcoinABC. This is an unfortunate situation considering the fact that the community split from BTC for this very reason and is strongly in support of decentralised development. Secondly, BitcoinABC does not take a collaborative approach to development. All decisions and features are dictated by BitcoinABC.
In fact the situation is unfortunately even worse than this. BitcoinABC has decided to take an actively hostile position against Bitcoin Unlimited (and many other valuable participants in the ecosystem) and would rather that it did not exist at all.
While a number of members of BitcoinABC were previously members of BU, they unfortunately used their privilege as members to try (but fortunately failed) to sabotage the organisation.
https://www.bitcoinunlimited.info/voting/rendeproposal_vote_result/7eb0ded0487a6593ac3976b63422294e1a84b209be1307c46f373489922212a0
https://www.bitcoinunlimited.info/voting/rendeproposal_vote_result/6285fcef8fa44416b8e83f25bfebe79aff502c1446a7b60bfab28ec58c35b609
https://www.bitcoinunlimited.info/voting/rendeproposal_vote_result/b10f54ece2ea3b9001086ebdde0001fbef9dc2fd83729a65ba207c0f1d9dfceb
These three voting records show members of BitcoinABC voting for the purchase of BSV coin, voting for an unfeasibly large block size increase (10TB), and voting for implementation of and miner-activation of BSV features into the BU client. None of these actions were implemented in the ABC client, and the inclusion of BSV features is likely the single biggest criticism certain ABC affiliated people have made against BU, yet members of BitcoinABC voted for it.
While it is important to assume good faith, under no interpretation can this be seen as anything other than an act of bad will towards BU. Unfortunately this kind of behaviour is the rule rather than the exception, and it has likely been a major factor in BCH’s struggle to attract quality developers into the ecosystem.
Regardless of the hard work done by members of BU to create useful software for Bitcoin Cash, and its continued commitment towards peer-to-peer electronic cash for the past 5 years, ABC will unfortunately never allow any of BU’s work to go into the BCH protocol willingly.
If BU were to invest all its funds into BCH it would be making a highly risky bet on BitcoinABC’s leadership, a leadership that has not only been historically unsuccessful (when looking at the price of BCH since its creation, both in dollar terms and BTC/BCH ratio terms), but also actively hostile to our organisation. A more cautious approach that takes these factors into account is to keep the funds held where there has been less volatility.
Regardless of all of this, BU is still 100% committed to supporting Bitcoin Cash.

Game Theory: The Strategy of Betting Against Yourself

Counterintuitively, a strategy where you bet against yourself can provide a beneficial low-risk profile: if you lose, you win, and if you win, you win. With BU’s current asset holdings of BCH and BTC, the organisation is financially hedged in a way that it wins if BCH wins, and if BTC wins then BU lives to fight another day for worldwide peer-to-peer electronic cash.
If BTC goes down and BCH goes up then it means BCH is succeeding, and our funds in BCH will sustain us for longer. Not only that, but there would likely be more funds available for BCH development in this scenario. If BTC goes up and BCH goes down then BU will be sustained for longer to continue the fight for BCH and peer-to-peer electronic cash.
This is very similar to the strategy of BCH-supporting miners mining on BTC and then converting the BTC block rewards into BCH in an effort to use BTC gains to support BCH price. BU is similarly using its gains in BTC and converting them to efforts and initiatives in support of BCH. In doing so Bitcoin Unlimited is able to turn any BTC win into a positive for BCH.

Incentives

It has been suggested that the situation created by holding a larger portion of funds in BTC than in BCH creates negative incentives that push BU towards supporting BTC. It is important to keep in mind that Bitcoin Unlimited is not a profit driven organisation. While an increase in value of its assets is of course beneficial to the organisation, our primary goal is to accelerate the global adoption of peer-to-peer electronic cash as described in the Bitcoin white-paper, and the officials, membership and founding articles of Bitcoin Unlimited are the driving force for this.
It is also important to point out that there is no evidence to support the claim that BU is in support of BTC (or BSV). In fact, the voting record clearly shows the opposite. BU has continually worked in support of peer-to-peer electronic cash, and specifically in support of BCH since it was created. This is thanks to the strong commitment of the BU officials and members, all of whom are long-time Bitcoiners and supporters of the ‘on-chain scaling’ movement. The only members who receive any payment from the organisation are those who provide significant value in the form of various skilled services, and all of these payments are voted on by the membership. The BUIP record also shows that compensated individuals are often paid at far under market rates for developers of their caliber. Should the price of BTC increase, no member receives any direct benefit beyond any appreciation in value of any BTC they privately hold. Therefore there are no strong incentives for BU to drive the price of BTC up and push the price of BCH down, as this would run counter to our primary goal.

Has This Strategy Been Successful?

Bitcoin Unlimited and its members, all being long-time Bitcoiners, are acutely aware of the need to play the long game to make sure a globally adopted peer-to-peer electronic cash becomes a reality. BU is the oldest entity within the BCH ecosystem and with good reason. The financial strategy of BU to date has been highly effective in sustaining the organisation over a long period of time, and allowing it to independently support BCH development initiatives. This is made clear by the fact that BU continues to have enough funding to provide value to the BCH ecosystem for the foreseeable future.
Had BU converted all funds to BCH at, or at almost any point after, the time of the BCH/BTC fork in August 2017, then for much of the time since it would have been forced to either scale back operations or shut down support for BCH developers completely. We now see development teams such as BitcoinABC facing the prospect of being unable to fund their development of BCH, and their financial strategy may have contributed to this reality. This is despite the fact that nearly all the funds donated in the recent community funding drive sponsored by bitcoin.com were directed towards BitcoinABC.
Lack of a sustainable funding model also seems to have been a major factor in pushing BitcoinABC to make the highly controversial decision to support a change to the BCH protocol that would divert 12.5% of the block reward to themselves. Being financially prudent and sticking to its principles (as defined in the founding Articles of Federation) has allowed Bitcoin Unlimited to steer clear of any conflicts of interest such as this.

Summary

Through its financial strategy Bitcoin Unlimited has been able to maintain its independence and financial sustainability and has therefore remained in a strong position to support Bitcoin Cash. BU’s officials and membership have continually made good decisions that have allowed BU to provide long-term support for the Bitcoin Cash ecosystem.
submitted by BU-BCH to btc [link] [comments]

A new whitepaper analysing the performance and scalability of the Streamr pub/sub messaging Network is now available. Take a look at some of the fascinating key results in this introductory blog


Streamr Network: Performance and Scalability Whitepaper


The Corea milestone of the Streamr Network went live in late 2019. Since then a few people in the team have been working on an academic whitepaper to describe its design principles, position it with respect to prior art, and prove certain properties it has. The paper is now ready, and it has been submitted to the IEEE Access journal for peer review. It is also now published on the new Papers section on the project website. In this blog, I’ll introduce the paper and explain its key results. All the figures presented in this post are from the paper.
The reasons for doing this research and writing this paper were simple: many prospective users of the Network, especially more serious ones such as enterprises, ask questions like ‘how does it scale?’, ‘why does it scale?’, ‘what is the latency in the network?’, and ‘how much bandwidth is consumed?’. While some answers could be provided before, the Network in its currently deployed form is still small-scale and cannot yet show a track record of scalability, for example, so there was clearly a need to produce some in-depth material about the structure of the Network and its performance at a large, global scale. The paper answers these questions.
Another reason is that decentralized peer-to-peer networks have experienced a new renaissance due to the rise in blockchain networks. Peer-to-peer pub/sub networks were a hot research topic in the early 2000s, but not many real-world implementations were ever created. Today, most blockchain networks use methods from that era under the hood to disseminate block headers, transactions, and other events important for them to function. Other megatrends like IoT and social media are also creating demand for new kinds of scalable message transport layers.

The latency vs. bandwidth tradeoff

The current Streamr Network uses regular random graphs as stream topologies. ‘Regular’ here means that nodes connect to a fixed number of other nodes that publish or subscribe to the same stream, and ‘random’ means that those nodes are selected randomly.
Random connections can of course mean that absurd routes get formed occasionally, for example a data point might travel from Germany to France via the US. But random graphs have been studied extensively in the academic literature, and their properties are not nearly as bad as the above example sounds — such graphs are actually quite good! Data always takes multiple routes in the network, and only the fastest route counts. The less-than-optimal routes are there for redundancy, and redundancy is good, because it improves security and churn tolerance.
There is an important parameter called node degree, which is the fixed number of nodes to which each node in a topology connects. A higher node degree means more duplication and thus more bandwidth consumption for each node, but it also means that fast routes are more likely to form. It’s a tradeoff; better latency can be traded for worse bandwidth consumption. In the following section, we’ll go deeper into analyzing this relationship.

Network diameter scales logarithmically

One useful metric for estimating the behavior of latency is the network diameter, which is the number of hops on the shortest path between the most distant pair of nodes in the network (i.e. the “longest shortest path”). The below plot shows how the network diameter behaves depending on node degree and number of nodes.

Network diameter
We can see that the network diameter increases logarithmically (very slowly), and a higher node degree ‘flattens the curve’. This is a property of random regular graphs, and this is very good — growing from 10,000 nodes to 100,000 nodes only increases the diameter by a few hops! To analyse the effect of the node degree further, we can plot the maximum network diameter using various node degrees:
Network diameter in network of 100 000 nodes
We can see that there are diminishing returns for increasing the node degree. On the other hand, the penalty (number of duplicates, i.e. bandwidth consumption) increases linearly with node degree:

Number of duplicates received by the non-publisher nodes
In the Streamr Network, each stream forms its own separate overlay network and can even have a custom node degree. This allows the owner of the stream to configure their preferred latency/bandwidth balance (imagine such a slider control in the Streamr Core UI). However, finding a good default value is important. From this analysis, we can conclude that:
  • The logarithmic behavior of network diameter leads us to hope that latency might behave logarithmically too, but since the number of hops is not the same as latency (in milliseconds), the scalability needs to be confirmed in the real world (see next section).
  • A node degree of 4 yields good latency/bandwidth balance, and we have selected this as the default value in the Streamr Network. This value is also used in all the real-world experiments described in the next section.
It’s worth noting that in such a network, the bandwidth requirement for publishers is determined by the node degree and not the number of subscribers. With a node degree 4 and a million subscribers, the publisher only uploads 4 copies of a data point, and the million subscribing nodes share the work of distributing the message among themselves. In contrast, a centralized data broker would need to push out a million copies.
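To see these numbers in action, here is a small sketch using the standard diameter approximation for random d-regular graphs, diameter ≈ ln(n) / ln(d − 1). The approximation is our illustration, not a formula quoted from the paper; the duplicate count per received message is simply d − 1:

```rust
fn main() {
    // diameter ≈ ln(n) / ln(d - 1) for a random d-regular graph of n nodes.
    for &n in &[1_000.0_f64, 10_000.0, 100_000.0, 1_000_000.0] {
        for &d in &[4.0_f64, 8.0, 16.0] {
            let diameter = n.ln() / (d - 1.0).ln();
            let duplicates = d - 1.0; // extra copies of each message per node
            println!(
                "n = {:>9}, degree = {:>2}: diameter ≈ {:>4.1} hops, {} duplicates/node",
                n, d, diameter, duplicates
            );
        }
    }
}
```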

Latency scales logarithmically

To see if actual latency scales logarithmically in real-world conditions, we ran large numbers of nodes in 16 different Amazon AWS data centers around the world. We ran experiments with network sizes between 32 and 2048 nodes. Each node published messages to the network, and we measured how long it took for the other nodes to get the message. The experiment was repeated 10 times for each network size.
The below image displays one of the key results of the paper. It shows a CDF (cumulative distribution function) of the measured latencies across all experiments. The y-axis runs from 0 to 1, i.e. 0% to 100%.
CDF of message propagation delay
From this graph we can easily read things like: in a 32 nodes network (blue line), 50% of message deliveries happened within 150 ms globally, and all messages were delivered in around 250 ms. In the largest network of 2048 nodes (pink line), 99% of deliveries happened within 362 ms globally.
To put these results in context, PubNub, a centralized message brokering service, promises to deliver messages within 250 ms — and that’s a centralized service! Decentralization comes with unquestionable benefits (no vendor lock-in, no trust required, network effects, etc.), but if such protocols are inferior in terms of performance or cost, they won’t get adopted. It’s pretty safe to say that the Streamr Network is on par with centralized services even when it comes to latency, which is usually the Achilles’ heel of P2P networks (think of how slow blockchains are!). And the Network will only get better with time.
Then we tackled the big question: does the latency behave logarithmically?
Mean message propagation delay in Amazon experiments
Above, the thick line is the average latency for each network size. From the graph, we can see that the latency grows logarithmically as the network size increases, which means excellent scalability.
The shaded area shows the difference between the best and worst average latencies in each repeat. Here we can see the element of chance at play; due to the randomness in which nodes become neighbours, some topologies are faster than others. Given enough repeats, some near-optimal topologies can be found. The difference between average topologies and the best topologies gives us a glimpse of how much room for optimisation there is, i.e. with a smarter-than-random topology construction, how much improvement is possible (while still staying in the realm of regular graphs)? Out of the observed topologies, the difference between the average and the best observed topology is between 5–13%, so not that much. Other subclasses of graphs, such as irregular graphs, trees, and so on, can of course unlock more room for improvement, but they are different beasts and come with their own disadvantages too.
It’s also worth asking: how much worse is the measured latency compared to the fastest possible latency, i.e. that of a direct connection? While having direct connections between a publisher and subscribers is definitely not scalable, secure, or often even feasible due to firewalls, NATs and such, it’s still worth asking what the latency penalty of peer-to-peer is.

Relative delay penalty in Amazon experiments
As you can see, this plot has the same shape as the previous one, but the y-axis is different. Here, we are showing the relative delay penalty (RDP). It’s the latency in the peer-to-peer network (shown in the previous plot), divided by the latency of a direct connection measured with the ping tool. So a direct connection equals an RDP value of 1, and the measured RDP in the peer-to-peer network is roughly between 2 and 3 in the observed topologies. It increases logarithmically with network size, just like absolute latency.
Again, given that latency is the Achilles’ heel of decentralized systems, that’s not bad at all. It shows that such a network delivers acceptable performance for the vast majority of use cases, only excluding the most latency-sensitive ones, such as online gaming or arbitrage trading. For most other use cases, it doesn’t matter whether it takes 25 or 75 milliseconds to deliver a data point.

Latency is predictable

It’s useful for a messaging system to have consistent and predictable latency. Imagine for example a smart traffic system, where cars can alert each other about dangers on the road. It would be pretty bad if, even minutes after publishing it, some cars still haven’t received the warning. However, such delays easily occur in peer-to-peer networks. Everyone in the crypto space has seen first-hand how plenty of Bitcoin or Ethereum nodes lag even minutes behind the latest chain state.
So we wanted to see whether it would be possible to estimate the latencies in the peer-to-peer network if the topology and the latencies between connected pairs of nodes are known. We applied Dijkstra’s algorithm to compute estimates for average latencies from the input topology data, and compared the estimates to the actual measured average latencies:
Mean message propagation delay in Amazon experiments
We can see that, at least in these experiments, the estimates seemed to provide a lower bound for the actual values, and the average estimation error was 3.5%. The measured value is higher than the estimated one because the estimation only considers network delays, while in reality there is also a little bit of a processing delay at each node.
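A rough reconstruction of that estimation procedure, assuming the topology is available as a graph weighted by the measured pairwise latencies (using the networkx library here, with invented link latencies):

```python
import networkx as nx

# Invented topology: per-link latencies in ms. In the experiments these
# weights were the measured latencies between connected node pairs.
G = nx.Graph()
G.add_weighted_edges_from([
    ("pub", "a", 40), ("pub", "b", 70), ("a", "c", 60), ("b", "c", 30),
])

# Dijkstra shortest-path distances from the publisher give the estimate;
# it is a lower bound because per-node processing delay is not modelled.
estimates = nx.single_source_dijkstra_path_length(G, "pub")
print(estimates)  # {'pub': 0, 'a': 40, 'b': 70, 'c': 100}
```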

Conclusion

The research has shown that the Streamr Network can be expected to deliver messages in roughly 150–350 milliseconds worldwide, even at a large scale with thousands of nodes subscribing to a stream. This is on par with centralized message brokers today, showing that the decentralized and peer-to-peer approach is a viable alternative for all but the most latency-sensitive applications.
It’s thrilling to think that by accepting a latency only 2–3 times longer than that of an unscalable and insecure direct connection, applications can interconnect over an open fabric with global scalability, no single point of failure, no vendor lock-in, and no need to trust anyone — all of that becomes available out of the box.
In the real-time data space, there are plenty of other aspects to explore, which we didn’t cover in this paper. For example, we did not measure throughput characteristics of network topologies. Different streams are independent, so clearly there’s scalability in the number of streams, and heavy streams can be partitioned, allowing each stream to scale too. Throughput is mainly limited, therefore, by the hardware and network connection used by the network nodes involved in a topology. Measuring the maximum throughput would basically be measuring the hardware as well as the performance of our implemented code. While interesting, this is not a high priority research target at this point in time. And thanks to the redundancy in the network, individual slow nodes do not slow down the whole topology; the data will arrive via faster nodes instead.
Also out of scope for this paper is analysing the costs of running such a network, including the OPEX for publishers and node operators. This is a topic of ongoing research, which we’re currently doing as part of designing the token incentive mechanisms of the Streamr Network, due to be implemented in a later milestone.
I hope that this blog has provided some insight into the fascinating results the team uncovered during this research. For a more in-depth look at the context of this work, and more detail about the research, we invite you to read the full paper.
If you have an interest in network performance and scalability from a developer or enterprise perspective, we will be hosting a talk about this research in the coming weeks, so keep an eye out for more details on the Streamr social media channels. In the meantime, feedback and comments are welcome. Please add a comment to this Reddit thread or email [[email protected]](mailto:[email protected]).
Originally published by Henri at blog.streamr.network on August 24, 2020.
submitted by thamilton5 to streamr [link] [comments]

Summary Golem Factory AMA, January 22nd 2020!

Hi all,
First of all, hope you have all had a great start to the new decade.
Golem held an AMA on the 22nd of January, and there was a lot to discuss, with over 50 questions from all of you. It is understandable that many do not want to read the whole thing, so I will try to recap the most important and relevant questions for the current state of development. As always, I will include a juicy Tl;dr at the end.
General Development Direction and Product Adoption
"We believe that decentralization, in the upcoming years, will not only be needed, but will be inevitable. We’re then preparing for when that time comes, as we are aware that Golem will need to grow robuster and then, the worries of low requestor supplies, will be a thing of the past. Taking into account how dependent we have become from corporations we believe that this trend will have to change and we have to be ready. Nowadays, the adoption is not going as quickly as we expected, and as quickly as we all wished for. Not only for the Golem network but for the whole cryptospace. We believe this is a moment to think progressively and overcome doubts by bulding."
(Viggith) "We're almost about to become Clay officially. Reaching this milestone gave us a lot of opportunities to learn. As the whole process took quite some time, we could observe the development decisions made in other projects, how the tech stack matured, and how expectations in the community changed shape."
"Right now we’re mostly focused on the general platform development rather than working on deep development of integrations. It doesn’t mean that we’re not actively looking for the new ones, we just want to encourage devs to build their own rather then build them interally. However, we have several examples and PoCs that are being integrated - computational chemistry software for one of the scientific research projects from IChO, the transcoding use- case is at its MVP stage. We are also investigating the usage of gWASM for gas price optimization for Ethereum, and we had a PoC for a meta-use case with tools for devops’. We are striving to improve the existing software including Task API, so that the gWASM and Task APIusers will propose new integrations."
Task API Launch and Concent
Last week, the Task API launched on Testnet which allows users to build their tasks on the Golem Network. This has been perceived to be the largest component that will transition Golem from the Brass stage, to Clay. For more information and elaboration on Concent, see this comment
"We worked on the task-api component with a small and agile team, with proper planning and preparation we were able to not have big hiccups. The largest changes where that subtask-id was only unique when combined with task-id. The largest fights with code were about windows exceptions and the issues between twisted and asyncio. Twisted is our old async library, asyncio is the new one that has better native python support.
For the mainnet release we would like to have more use-cases, better developer utilities and a lot of testing, by the team and the community. The main focus is to stabilize the task-api"
For quick examples of the Task API:
"As examples for the task-api we made two apps: `blenderapp` and `tutorialapp`. blenderapp can be run by anyone on the current testnet using these instructions. tutorialapp can be build and run locally using these instructions ( NOTE: technical ). As for tests we made unit tests on almost all levels: the apps, connecting libraries and golem core. In Golem core there are also multiple integration tests to test integration with core, one for testing blenderapp, one for testing apps while developing them."
Other Usecases for Golem on the Horizon
"We did some research on integrating BOINC and BOINC-like computations. For now it seems that it is technically possible. But it will require more effort. Recently we are planning to try to cross-compile [email protected] to gwasm application for the start and run it on mainnet. Another possible way is to use the testnet Task API as you mentioned. In general, it would be better to do so on mainnet but we need to wait for the release.
(...)Golem should be presented to science oriented researchers and be recognized in voluntary computations. That would improve our userbase, it would contribute to non-profit organizations and, of course, would bring dApps to the non blockchain world. (...) I see that there have been more discussions on reddit and we will review them and speak internally."
"Right now we’re mostly focused on the general platform development rather than working on deep development of integrations. It doesn’t mean that we’re not actively looking for the new ones, we just want to encourage devs to build their own rather then build them interally. However, we have several examples and PoCs that are being integrated - computational chemistry software for one of the scientific research projects from IChO, the transcoding use- case is at its MVP stage. We are also investigating the usage of gWASM for gas price optimization for Ethereum, and we had a PoC for a meta-use case with tools for devops’. We are striving to improve the existing software including Task API, so that the gWASM and Task APIusers will propose new integrations."
The GNT, Layer 2 and DeFi
"We crowdfunded for this project, and GNT has always been a utility token. So, in short, the narrative "the price does not matter" would be neither politically nor logically correct. However, we need to look after the best interests of all users, either golem software users or token holders, that helped us kickstart this venture.
(...)So, the GNT should be easy to use directly on the platform. Still, the token should also supplement the platform in other ways (e.g., through community-driven projects on the platform utilizing economic mechanisms envisioned and developed by the community members). The token should also be easy to use in a broader context (e.g., the DeFi), which may or may not result in a direct connection with the Golem platform."
"The current model with on-chain payments is not sustainable for Golem and other similar projects which need a trade-off between the cost of transaction, security/finality and timing. When it comes to small (aka micro) payments it’s even more important. It may happen that due to Ethereum congestion one has to pay more for the gas than the computations itself.
Here comes the idea of moving payments to layer 2 solutions. Unfortunately there is currently no such solution in production which fits our platform needs, though the situation is very dynamic and we can expect candidates to appear in the coming months."
"It is no secret that we have been thinking about migrating to ERC-20 for a long time. For one reason or another, we always postponed. But with all the 2019 astronomical DeFi growth, the flame was reignited(...).
We’ve been working with ETHWorks on finding the best approach for migrating GNT to ERC20. We chose to work with this particular company as our goal is to make sure that the passage to ERC20 allows the (new) GNT to adapt to various matters: for instance, to be used for layer 2 scaling solutions, or Universal Logins, gasless transactions, among others. Right now, doing gasless transactions with the current GNT is cumbersome, and there are many solutions in the market that would be a great fit if GNT was ERC20. (...) As we continue the work & research, we may come up with more ideas that go beyond this, but our main focus remains on giving our users the chance to improve their Golem experience, trade without KYC (if they want to) - while we simultaneously look into the DeFi ecosystem, and see if we can have the chance of using the token in other platforms."
New Team Members and GolemGrid
"Radek Tereszczuk has joined us in order to work on the long-term vision of the project and how it fits in the overall web 3 vision. He is an inventor, expert and consultant in areas such as IT, telecommunications, statistics, machine learning, genetics and physics. After hours, research on the new class of programming languages ​​based on his own discoveries in graph mathematics. Has 20+ years professional experience in both his own start-ups and big enterprises (mainly banks and insurance), acting as dev / analyst / architect / project and product manager.
Kuba Kucharski is joining us as Chief Product Engineering Officer to boost our product and engineering efforts. He has vast experience in leading developer teams and building product organisations. Involved in Blockchain space since 2013, some of his projects being OrisiOracles (smart contract framework built on top of Bitcoin and BitMessage) and Userfeeds (attention economy / blockchain explorer built on Ethereum)."
Phillip from GolemGrid has officially joined the team as well, after his support, mainly on Rocket Chat (chat.golem.network). When asked about his product GolemGrid (golemgrid.com) and working remotely, he said the following:
"So far so good. No issues with working remotely for what i’m currently doing. We have an internal chat for the team members, so if there’s any questions one can just type in there and receive an answer fairly quickly.
No challenges to GolemGrid. Actually all more helpful since i've got the smart developers around to answers questions about Golem if needed. (...)
Currently there has not been any talk of GolemGrid integrating into Golem or something similar. So atm it’s purely separated as it always has been. I myself will always integrate what’s possible with Golem to GolemGrid, so whether that’s ML, Rendering or a third thing, I want to integrate it all when released.
I have plans to fiddle with the Task API in the nearest time and see if I can create something unique and useful for others."
Events and upcoming Promo
"Kubkon’s speaking at FOSSDEM 2020 in Belgium in a matter of days, then he’s heading towards ETHCC 2020 in Paris to spread the word about gWASM even further.The very eloquent Marcin Benke is speaking in April at EDCON.
MP is also doing active outreach to conferences to help with programming and to introduce some Golem angles we’ve not presented before, and maybe more generalized knowledge that our team can share. We’re adding more conferences every month - and most importantly, we will focus on hackathons. You can rest assured that angle will be thoroughly covered; whether local or more international initiatives, we’ll have a lot of news on this front."
"We are working on our content schedule for 2020 (including regular blogposts as we’ve been doing), planning to add tutorial videos and workshops / hackathons. The planned marketing activities for the first two quarters of 2020 are going to be targeted towards quite technical people and they are going to be heavily tech oriented (tutorials, docs, hackathons, explanatory videos, workshops etc). The promo video was a representation of the more mainstream marketing that forms part of our long-term goal."
Tl;dr
Golem has not gained users as quickly as expected, but that goes for a lot of projects in the cryptospace. The focus is currently on making the platform more robust and on UX, instead of deep development and integrations. The Task API is live on Testnet. The blenderapp can be run by anyone on the current testnet using these instructions; tutorialapp can be built and run locally using these instructions (NOTE: technical). Other possible use-cases for Golem are BOINC and BOINC-like computations, however these currently require more effort and have been passed on for internal discussion. Several PoCs are being integrated: chemistry software for one of the scientific research projects from IChO, gas-optimization calculations for gWASM, and a PoC for a meta-use case with tools for developers.
The GNT should be easy to use directly on the platform. The current model with on-chain payments is not sustainable for Golem and other similar projects, which need a trade-off between the cost of transactions, security/finality and timing; for small (micro) payments this is even more important. Golem has not yet found a layer 2 solution that satisfies these needs. Golem has been working with ETHWorks on finding the best approach for migrating GNT to ERC20. They chose to work with this particular company to make sure that the passage to ERC20 allows the (new) GNT to adapt to various use cases. Their main focus remains on giving users the chance to improve their Golem experience and trade without KYC (if they want to) - while simultaneously looking into the DeFi ecosystem to see if the token can be used on other platforms.
Radek Tereszczuk, Kuba Kucharski and Phillip from GolemGrid have joined the team and they will help in the fields of long-term web3.0 vision, boosting product and engineering performance and efforts and tech support as well as community support respectively.
Golem will be speaking at FOSDEM 2020 in Belgium in a matter of days and will then be heading towards ETHCC 2020 in Paris to spread the word about gWASM even further. They will also be speaking in April at EDCON, as well as doing active outreach to conferences to help with programming.

See you all next AMA!
submitted by PSVjasper99 to GolemProject [link] [comments]

AT2: Asynchronous Trustworthy Transfers

AT2 is a fairly new and little-known technology for creating a decentralized asset transfer system without a blockchain.
This week there was an article at www.computing.co.uk; see below.
link: https://www.computing.co.uk/feature/4017118/at2-answer-cryptocurrency-energy-performance
AT2 paper: https://arxiv.org/pdf/1812.10844.pdf

Could AT2 be the answer to cryptocurrency's energy and performance problems?
Blockchains are slow, wasteful and ill-suited for digital currencies, say researchers who believe they've found a better way
Blockchains solve a hard problem: how to ensure consensus across a distributed, decentralised network, where messages arrive out of order if at all, where individual nodes may fail, and where a certain proportion may be actively malicious.
The original blockchain, bitcoin, was designed to support a novel digital currency, and the issue its consensus algorithm solved was preventing double-spend. It also successfully introduced game theory for security: adversaries would have to spend more money on an attack than they could expect to gain financially. All this and the original protocol was just a few hundred lines of code.
But this achievement came at a high cost in terms of energy use and performance.
With bitcoin, a new leader is required to verify each block of transactions, that leader being the first device to complete a computationally heavy challenge (Proof of Work, PoW). As a result, the blockchain's throughput is painfully slow at around seven transactions per second (Visa claims it can do 56,000) and the whole process is massively wasteful of energy. These drawbacks have been surmounted, to some degree, in newer blockchain designs using overlay networks, sharding and different types of "proofs of" and by non-blockchain directed acyclic graphs (DAGs), but each requires tradeoffs in terms of centralisation, complexity or security.
A group of researchers led by computer scientist Professor Rachid Guerraoui of the Swiss university Ecole Polytechnique Fédérale de Lausanne (EPFL) decided to look afresh at the problem. Is this gargantuan security apparatus, in which every node in a network of thousands or millions must come to a consensus about the ordering of events, really necessary every time someone makes a purchase? Could a leaderless mechanism be applied to the problem instead? If so, could it be guaranteed to be reliably consistent, even when a certain number of nodes are malicious or faulty (Byzantine)?
The headline answer, published in an initial paper last year, is that network-wide consensus is overkill for simple asset transfers. If cryptocurrencies could be rebooted, all the fossil fuels burned by miners of bitcoin and its clones could be left in the ground and Visa-level transaction speeds could be achieved without any loss of security or reliance on centralised control. As compact as Satoshi's original bitcoin protocol itself, the few hundred lines of code that make up their Asynchronous Trustworthy Transfers (AT2) algorithm could solve some of the tricky problems that have plagued decentralised token-based networks from the off.
AT2 can be used to validate transactions within two different decentralised networking scenarios: (1) permissioned or small unpermissioned networks, and (2) global-scale unpermissioned networks. In the first case, the algorithm uses a quorum for validating actions, whereby a certain proportion of the network's nodes must agree an action is correct before it can take place. The second scenario, networks made up of very large numbers of machines (nodes), uses probabilistic sampling. Instead of asking all nodes, it checks a number of randomly selected nodes for their viewpoint. This is much more efficient and scalable than the deterministic quorum but carries a tiny (ca. 10⁻¹⁵) possibility of failure.
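To get a feel for why sampling can be made this safe, here is a toy binomial model (my own illustration, not the estimator from the AT2 paper): sample nodes uniformly at random and ask how likely it is that Byzantine nodes dominate the sample.

```python
import math

def sample_failure_prob(f_byzantine: float, sample_size: int, threshold: int) -> float:
    """Probability that at least `threshold` of `sample_size` uniformly
    sampled nodes are Byzantine, i.e. the sample can report a wrong view."""
    return sum(
        math.comb(sample_size, k) * f_byzantine**k * (1 - f_byzantine)**(sample_size - k)
        for k in range(threshold, sample_size + 1)
    )

# With a third of the network Byzantine, a few hundred samples already
# drive the per-query failure probability down to negligible levels.
for n in (64, 256, 1024):
    print(n, sample_failure_prob(1 / 3, n, threshold=2 * n // 3))
```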
Doing away with network-wide consensus means AT2 sidesteps the bane of decentralised networks, the FLP Impossibility - the theory that in a fully asynchronous system, a deterministic consensus algorithm cannot be safe, live and fault-tolerant.
Computing caught up with Matteo Monti, who worked on the statistical aspects of AT2, and by email with Guerraoui to find out more. We also spoke to David Irvine of networking firm MaidSafe, which has adopted AT2 to simplify its consensus process.

Incentivising improvements
We asked Monti to summarise the innovation that AT2 brings to the table.
"What we noticed is that there's a specific subclass of problems that can be solved on a decentralised, distributed network without requiring consensus," he said. "The main use for consensus at the moment, cryptocurrency transactions, is part of that class. We can solve this using a weaker abstraction and in doing so you gain the ability to work in a completely asynchronous environment."
Bitcoin doesn't even solve consensus well. It solves eventual consensus, which is an even weaker abstraction, he added, whereas AT2 can guarantee strong eventual consistency. Another issue it tackles is PoW's incentivization model, under which improvements in technology do not translate into a better-performing network.
"With bitcoin, the bottleneck is always electricity. If everyone doubles their computational speed it's not going to change the efficiency of the network. Everyone's competing not to compute but to waste energy."
In place of PoW, AT2 uses ‘Proof of Bandwidth', i.e. evidence of recent interaction, to verify that a node is real. Since it doesn't rely on consensus, the performance of AT2 should allow messaging speeds across the network that approach the theoretical maximum, and improvements in hardware will translate into better overall performance.

Security measures
Blockchains like bitcoin are extremely resilient against Sybil attacks; bitcoin is still running after all, in the face of unwavering opposition from powerful nation states and bankers. Sybil attacks are a major vulnerability in permissionless decentralised networks where anyone can join anonymously, but there are others too.
Monti said the most challenging aspect of designing the AT2 algorithm was distilling all the potential types of dangerous Byzantine behaviour into a manageable set so they could be treated using probability theory. As a result of studying many possible failure scenarios, including Sybil, the algorithm is able to quickly react to deviations from the norm.
Other security features flow from the fact that each network node needs to know only a limited amount about its counterparts for the system to function. For example, the randomness used in sampling operations is generated locally on the calling device rather than on the network, making this vector hard to utilise by an attacker looking to influence events.
Signals are passed across the network via a messaging system called Byzantine Reliable Broadcasting (BRB), a gossip-based method by which nodes can quickly and reliably come to an agreement about a message even if some are Byzantine.
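The core of such gossip protocols is an echo threshold: a node only accepts a message once enough distinct peers have vouched for it. A toy sketch of the decision rule follows; the threshold shown is the classic one from Bracha-style reliable broadcast under n > 3f, and AT2's exact parameters may differ.

```python
def deliver_decision(echoes: int, n: int, f: int) -> bool:
    """Accept once more than (n + f) / 2 distinct nodes have echoed the
    message, so any two accepting quorums share at least one correct node."""
    return echoes > (n + f) // 2

print(deliver_decision(echoes=7, n=10, f=3))  # True: quorum reached
```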
As a result of these features, AT2 does not rely on economic game theory for security, said Monti.
"I'd go as far as saying that the moment you need to implement an economic disadvantage to attacking the system, it means that you failed to make it impossible to attack the system. We don't care about your interests in attacking the system. What we want to achieve is a proof that no matter what you do, the system will not be compromised."

‘Crypto-Twitter'
AT2 starts with the simple idea that rather than requiring the whole network to maintain a time-ordered record of my transactions (as with a blockchain or DAG), the only person who needs to keep that tally is me.
If I decide to spend some money, I merely announce that fact to the network over BRB and this request will be held in a memory snapshot escrow. Depending on the network type, a representative sample or a quorum of other nodes then check my balance and inspect my ordered transaction history to ensure that the funds haven't already been spent (each transaction has a unique sequential ID) and provided all is correct the transaction is guaranteed to go through, even if up to a third of those validators are malicious. If I try to cheat, the transaction will be blocked.
Monti likens a wallet on an AT2 network to a social media timeline.
"What we've proved, essentially, is that you can have a cryptocurrency on Twitter," he explained.
"A payment works in two steps. First, there's a withdrawal from my account via a tweet, then the second step is a deposit, or a retweet. I tweet a message saying I want to pay Bob. Bob then retweets this message on his own timeline, and in the act of retweeting he's depositing money in his account.
"So everyone has their own independent timeline and while the messages - my tweets - are strictly ordered, that's only in my own timeline; I don't care about ordering relative to other timelines. If I try to pay someone else, it will be obvious by the sequence of tweets in my account, and my account only, whether I can perform that payment.
"In contrast, consensus effectively squeezes all of the messages into a unique timeline on which everybody agrees. But this is overkill, you don't need it. We can prove that it still works even if the ordering is partial and not total, and this enables us to switch from consensus to reliable broadcast."
But of course, nothing comes for free. AT2 can verify exchanges of tokenised assets, but aside from arrangements between a small number of opted-in parties, it does not have the ability to support smart contracts of the type that are viable on ethereum and other blockchains, because this does require network-wide consensus. Guerraoui said his team is working on "refinements and extensions" to support such functionality in the future.

Early adopters
AT2 is still pretty ‘cutting edge'. Three papers have been accepted for peer review, the latest published in February, but it provides the sort of efficiencies and simplifications that could bring real progress. Guerraoui said AT2 has "received interest from many groups including companies ‘selling' blockchain approaches, as well as companies and organisations using such approaches".
One organisation that has already picked up on the potential of AT2 is Scotland's MaidSafe, creator of the SAFE Network. MaidSafe is already using AT2 to replace its Parsec consensus algorithm, which testing showed was indeed overkill for many network operations. CEO David Irvine said he and his colleagues came across AT2 while working on another way of propagating changes to data without consensus, conflict-free replicated data types (CRDTs), and promptly forked the code and started to apply it.
SAFE, currently in Alpha, is a sharded network, meaning it's subdivided into small semi-autonomous sections. On a network level, the way it works is that trusted 'elder' nodes vote on a requested action then pass instructions to other sections to carry it out.
AT2 allows the initial task of accumulating the votes for an action, which had been done by the elders using a consensus algorithm, to be moved off the network and onto the requesting client which is much more lightweight and efficient. Once a quorum of votes has been gathered, the client simply resubmits the request and the elders will ensure it's carried out. The system is much simpler and should be more secure too. "It's 200 lines of logic compared to 15,000 for a start," Irvine said.
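In pseudocode terms, the shift is that the client, not the section, accumulates the signatures. A hypothetical sketch follows; the elder interface (`sign_if_valid`, `node_id`) is invented for illustration and is not SAFE's actual API.

```python
def gather_quorum(request: bytes, elders, quorum: int) -> dict:
    """Client-side vote accumulation: collect elder signatures until a
    quorum is reached, then the client resubmits request + votes."""
    votes = {}
    for elder in elders:
        signature = elder.sign_if_valid(request)  # hypothetical elder API
        if signature is not None:
            votes[elder.node_id] = signature
        if len(votes) >= quorum:
            return votes
    raise RuntimeError("quorum not reached")
```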
AT2 is not just used to validate token transfers. By the same mechanism, it can also be used to authorise requests to store or change data. Together with CRDTs, which guarantee that such changes cannot fail, this makes for a very tight and efficient ship, said Irvine.
"AT2 is for us a missing link. The difficulty of several nodes agreeing is simplified by the initiator taking on the effort of accumulating quorum votes. It seems so simple but in fact, it's an amazing innovation. It certainly falls into the category of 'why didn't I think of that?'."
submitted by ZaadNek to CryptoTechnology [link] [comments]

Events of the month: February 2020



Do you organize events? Do you know of others? Report them to u/-Defkon1-
submitted by -Defkon1- to ItalyInformatica [link] [comments]

Subreddit Stats: btc top posts from 2019-01-06 to 2020-01-05 11:19 PDT

Period: 363.85 days
| | Submissions | Comments |
|---|---|---|
| Total | 1000 | 86748 |
| Rate (per day) | 2.75 | 237.19 |
| Unique Redditors | 317 | 7747 |
| Combined Score | 194633 | 356658 |

Top Submitters' Top Submissions

  1. 31014 points, 162 submissions: Egon_1
    1. Vitalik Buterin to Core Maxi: “ok bitcoiner” .... (515 points, 206 comments)
    2. These men are serving life without parole in max security prison for nonviolent drug offenses. They helped me through a difficult time in a very dark place. I hope 2019 was their last year locked away from their loved ones. FreeRoss.org/lifers/ Happy New Year. (502 points, 237 comments)
    3. "It’s official Burger King just accepted Bitcoin Cash and GoC token as a payment option in Slovenia." (423 points, 112 comments)
    4. "HOLY SATOSHI! 😱😱 I did it! A smart card that produces valid BitcoinCash signatures. Who would love to pay with a card—to a phone?? Tap took less than a second!👟..." (368 points, 105 comments)
    5. Chrome 'Has Become Surveillance Software. It's Time to Switch' -> Brave to support BCH! (330 points, 97 comments)
    6. Gavin Andresen (2017): "Running a network near 100% capacity is irresponsible engineering... " (316 points, 117 comments)
    7. "Evidently @github has banned all the Iranian users without an ability for them to download their repositories. A service like Github must be a public good and must not be controlled by a centralized entity. Another great example of why we as a society need to make web3 a reality" (314 points, 117 comments)
    8. Roger Ver: "Bitcoin Cash acceptance is coming to thousands of physical shops in Korea" (313 points, 120 comments)
    9. Paul Sztorc: “Will people really spend $70-$700 to open/modify a lightning channel when there's an Altcoin down the street which will process a (USD-denominated) payment for $0.05 ? Many people seem to think yes but honestly I just don't get it” (306 points, 225 comments)
    10. Food For Thought (303 points, 105 comments)
  2. 29021 points, 157 submissions: MemoryDealers
    1. Bitcoin Cash is Lightning Fast! (No editing needed) (436 points, 616 comments)
    2. Brains..... (423 points, 94 comments)
    3. Meanwhile in Hong Kong (409 points, 77 comments)
    4. Ross Ulbricht has served 6 years in federal prison. (382 points, 156 comments)
    5. Just another day at the Bitcoin Cash accepting super market in Slovenia. (369 points, 183 comments)
    6. Why I'm not a fan of the SV community: My recent bill for defending their frivolous lawsuit against open source software developers. (369 points, 207 comments)
    7. History Reminder: (354 points, 245 comments)
    8. It's more decentralized this way. (341 points, 177 comments)
    9. The new Bitcoin Cash wallet is so fast!!!!! (327 points, 197 comments)
    10. The IRS wants to subpoena Apple and Google to see if you have downloaded crypto currency apps. (324 points, 178 comments)
  3. 6909 points, 37 submissions: BitcoinXio
    1. Tim Pool on Twitter: “How the fuck are people justifying creating a world like the one's depicted in Fahrenheit 451 and 1984? You realize that censorship and banning information was a key aspect of the dystopian nightmare right?” (435 points, 75 comments)
    2. The creator of the now famous HODL meme says that the HODL term has been corrupted and doesn’t mean what he intended; also mentions that the purpose of Bitcoin is to spend it and that BTC has lost its value proposition. (394 points, 172 comments)
    3. Erik Voorhees on Twitter: “I wonder if you realize that if Bitcoin didn’t work well as a payment system in the early days it likely would not have taken off. Many (most?) people found the concept of instant borderless payments captivating and inspiring. “Just hold this stuff” not sufficient.” (302 points, 66 comments)
    4. Bitfinex caught paying a company to astroturf on social media including Reddit, Twitter, Medium and other platforms (285 points, 86 comments)
    5. WARNING: If you try to use the Lightning Network you are at extremely HIGH RISK of losing funds and is not recommended or safe to do at this time or for the foreseeable future (274 points, 168 comments)
    6. Craig Wright seems to have rage quit Twitter (252 points, 172 comments)
    7. No surprise here: Samson Mow among other BTC maxi trolls harassed people to the point of breakdown (with rape threats, etc) (249 points, 85 comments)
    8. On Twitter: “PSA: The Lightning Network is being heavily data mined right now. Opening channels allows anyone to cluster your wallet and associate your keys with your IP address.” (228 points, 102 comments)
    9. btc is being targeted and attacked, yet again (220 points, 172 comments)
    10. Brian Armstrong CEO of Coinbase using Bitcoin Cash (BCH) to pay for food, video in tweet (219 points, 66 comments)
  4. 6023 points, 34 submissions: money78
    1. BSV in a nutshell... (274 points, 60 comments)
    2. There is something going on with @Bitcoin twitter account: 1/ The URL of the white paper has been changed from bitcoin.com into bitcoin.org! 2/ @Bitcoin has unfollowed all other BCH related accounts. 3/ Most of the posts that refer to "bitcoin cash" have been deleted?!! Is it hacked again?! (269 points, 312 comments)
    3. "Not a huge @rogerkver fan and never really used $BCH. But he wiped up the floor with @ToneVays in Malta, and even if you happen to despise BCH, it’s foolish and shortsighted not to take these criticisms seriously. $BTC is very expensive and very slow." (262 points, 130 comments)
    4. Jonathan Toomim: "At 32 MB, we can handle something like 30% of Venezuela's population using BCH 2x per day. Even if that's all BCH ever achieved, I'd call that a resounding success; that's 9 million people raised out of poverty. Not a bad accomplishment for a hundred thousand internet geeks." (253 points, 170 comments)
    5. Jonathan Toomim: "BCH will not allow block sizes that are large enough to wreak havoc. We do our capacity engineering before lifting the capacity limits. BCH's limit is 32 MB, which the network can handle. BSV does not share this approach, and raises limits before improving actual capacity." (253 points, 255 comments)
    6. What Bitcoin Cash has accomplished so far 💪 (247 points, 55 comments)
    7. Which one is false advertising and misleading people?! Bitcoin.com or Bitcoin.org (232 points, 90 comments)
    8. A message from Lightning Labs: "Don't put more money on lightning than you're willing to lose!" (216 points, 118 comments)
    9. Silk Road’s Ross Ulbricht thanks Bitcoin Cash’s [BCH] Roger Ver for campaigning for his release (211 points, 29 comments)
    10. This account just donated more than $6600 worth of BCH via @tipprbot to multiple organizations! (205 points, 62 comments)
  5. 4514 points, 22 submissions: unstoppable-cash
    1. Reminder: bitcoin mods removed top post: "The rich don't need Bitcoin. The poor do" (436 points, 89 comments)
    2. Peter R. Rizun: "LN User walks into a bank, says "I need a loan..." (371 points, 152 comments)
    3. It was SO simple... Satoshi had the answer to prevent full-blocks back in 2010! (307 points, 150 comments)
    4. REMINDER: "Bitcoin isn't for people that live on less than $2/day" -Samson Mow, CSO of BlockStream (267 points, 98 comments)
    5. "F'g insane... waited 5 hrs and still not 1 confirmation. How does anyone use BTC over BCH BitcoinCash?" (258 points, 222 comments)
    6. Irony:"Ave person won't be running LN routing node" But CORE/BTC said big-blocks bad since everyone can't run their own node (256 points, 161 comments)
    7. BitPay: "The Wikimedia Foundation had been accepting Bitcoin for several years but recently switched pmt processors to BitPay so they can now accept Bitcoin Cash" (249 points, 61 comments)
    8. FreeTrader: "Decentralization is dependent on widespread usage..." (195 points, 57 comments)
    9. The FLIPPENING: Fiat->OPEN Peer-to-Peer Electronic Cash! Naomi Brockwell earning more via BitBacker than Patreon! (193 points, 12 comments)
    10. LN Commentary from a guy that knows a thing or 2 about Bitcoin (Gavin Andresen-LEAD developer after Satoshi left in 2010) (182 points, 80 comments)
  6. 3075 points, 13 submissions: BeijingBitcoins
    1. Last night's BCH & BTC meetups in Tokyo were both at the same restaurant (Two Dogs). We joined forces for this group photo! (410 points, 166 comments)
    2. Chess.com used to accept Bitcoin payments but, like many other businesses, disabled the option. After some DMs with an admin there, I'm pleased to announce that they now accept Bitcoin Cash! (354 points, 62 comments)
    3. WSJ: Bitfinex Used Tether Reserves to Mask Missing $850 Million, Probe Finds (348 points, 191 comments)
    4. Bitcoiners: Then and Now [MEME CONTEST - details in comments] (323 points, 72 comments)
    5. I'd post this to /Bitcoin but they would just remove it right away (also I'm banned) (320 points, 124 comments)
    6. So this is happening at the big protest in Hong Kong right now (270 points, 45 comments)
    7. /Bitcoin mods are censoring posts that explain why BitPay has to charge an additional fee when accepting BTC payments (219 points, 110 comments)
    8. The guy who won this week's MillionaireMakers drawing has received ~$55 in BCH and ~$30 in BTC. It will cost him less than $0.01 to move the BCH, but $6.16 (20%) in fees to move the BTC. (164 points, 100 comments)
    9. The Bitcoin whitepaper was published 11 years ago today. Check out this comic version of the whitepaper, one of the best "ELI5" explanations out there. (153 points, 12 comments)
    10. Two Years™ is the new 18 Months™ (142 points, 113 comments)
  7. 2899 points, 18 submissions: jessquit
    1. Oh, the horror! (271 points, 99 comments)
    2. A few days ago I caught flak for reposting a set of graphs that didn't have their x-axes correctly labeled or scaled. tvand13 made an updated graph with correct labeling and scaling. I am reposting it as I promised. I invite the viewer to draw their own conclusions. (214 points, 195 comments)
    3. Do you think Bitcoin needs to increase the block size? You're in luck! It already did: Bitcoin BCH. Avoid the upcoming controversial BTC block size debate by trading your broken Bitcoin BTC for upgraded Bitcoin BCH now. (209 points, 194 comments)
    4. Master list of evidence regarding Bitcoin's hijacking and takeover by Blockstream (185 points, 113 comments)
    5. PSA: BTC not working so great? Bitcoin upgraded in 2017. The upgraded Bitcoin is called BCH. There's still time to upgrade! (185 points, 192 comments)
    6. Nobody uses Bitcoin Cash (182 points, 88 comments)
    7. Double-spend proofs, SPV fraud proofs, and Cashfusion improvements all on the same day! 🏅 BCH PLS! 🏅 (165 points, 36 comments)
    8. [repost] a reminder on how btc and Bitcoin Cash came to be (150 points, 102 comments)
    9. Holy shit the entire "negative with gold" sub has become a shrine devoted to the guilded astroturfing going on in rbtc (144 points, 194 comments)
    10. This sub is the only sub in all of Reddit that allows truly uncensored discussion of BTC. If it turns out that most of that uncensored discussion is negative, DON'T BLAME US. (143 points, 205 comments)
  8. 2839 points, 13 submissions: SwedishSalsa
    1. With Bitcoin, for the first time in modern history, we have a way to opt out. (356 points, 100 comments)
    2. In this age of rampant censorship and control, this is why I love Bitcoin. (347 points, 126 comments)
    3. The crypto expert (303 points, 29 comments)
    4. Satoshi reply to Mike Hearn, April 2009. Everybody, especially newcomers and r-bitcoin-readers should take a step back and read this. (284 points, 219 comments)
    5. Bitcoin Cash looking good lately. (235 points, 33 comments)
    6. Roger Ver bad (230 points, 61 comments)
    7. History of the BTC scaling debate (186 points, 54 comments)
    8. MFW i read Luke Jr wants to limit BTC blocks to 300k. (183 points, 116 comments)
    9. Meanwhile over at bitcoinsv... (163 points, 139 comments)
    10. Listen people... (155 points, 16 comments)
  9. 2204 points, 10 submissions: increaseblocks
    1. China bans Bitcoin again, and again, and again (426 points, 56 comments)
    2. China bans Bitcoin (again) (292 points, 35 comments)
    3. Bitcoin Cash Network has now been upgraded! (238 points, 67 comments)
    4. So you want small blocks with high fees to validate your own on chain transactions that happen OFF CHAIN? (212 points, 112 comments)
    5. It’s happening - BTC dev Luke jr writing code to Bitcoin BTC codebase to fork to lower the block size to 300kb! (204 points, 127 comments)
    6. Former BTC maximalist admits that maxi's lied cheated and stealed to get SegWit and Lightning (201 points, 135 comments)
    7. Just 18 more months to go! (172 points, 86 comments)
    8. Bitcoin Cash ring - F*CK BANKS (167 points, 51 comments)
    9. LTC Foundation chat leaked: no evidence of development, lack of transparency (155 points, 83 comments)
    10. A single person controls nearly half of all the Lightning Network’s capacity (137 points, 109 comments)
  10. 2138 points, 12 submissions: JonyRotten
    1. 'Craig Is a Liar' – Early Adopter Proves Ownership of Bitcoin Address Claimed by Craig Wright (309 points, 165 comments)
    2. 200,000 People Have Signed Ross Ulbricht's Clemency Petition (236 points, 102 comments)
    3. Street Artist Hides $1,000 in BTC Inside a Mural Depicting Paris Protests (236 points, 56 comments)
    4. Craig Wright Ordered to Produce a List of Early Bitcoin Addresses in Kleiman Lawsuit (189 points, 66 comments)
    5. Ross Ulbricht Clemency Petition Gathers 250,000 Signatures (163 points, 24 comments)
    6. Ross Ulbricht Letter Questions the Wisdom of Imprisoning Non-Violent Offenders (160 points, 50 comments)
    7. Expert Witness in Satoshi Case Claims Dr Wright's Documents Were Doctored (155 points, 44 comments)
    8. California City Official Uses Bitcoin Cash to Purchase Cannabis (151 points, 36 comments)
    9. Money Transmitter License Not Required for Crypto Businesses in Pennsylvania (141 points, 9 comments)
    10. McAfee to Launch Decentralized Token Exchange With No Restrictions (137 points, 35 comments)

Top Commenters

  1. jessquit (16708 points, 2083 comments)
  2. Ant-n (7878 points, 1517 comments)
  3. MemoryDealers (7366 points, 360 comments)
  4. Egon_1 (6205 points, 1001 comments)
  5. 500239 (5745 points, 735 comments)
  6. BitcoinXio (4640 points, 311 comments)
  7. LovelyDay (4353 points, 457 comments)
  8. chainxor (4293 points, 505 comments)
  9. MobTwo (3420 points, 174 comments)
  10. ShadowOfHarbringer (3388 points, 478 comments)

Top Submissions

  1. The perfect crypto t-shirt by Korben (742 points, 68 comments)
  2. The future of Libra Coin by themadscientistt (722 points, 87 comments)
  3. when you become a crypto trader... by forberniesnow (675 points, 54 comments)
  4. A Reminder Why You Shouldn’t Use Google. by InMyDayTVwasBooks (637 points, 209 comments)
  5. Imagine if in 2000 Apple just sat around all day shit-talking Microsoft. Apple would have never gone anywhere. Apple succeeded because they learned from their mistakes, improved, and got better. BCH should do the same. by guyfawkesfp (552 points, 255 comments)
  6. Bitcoin made The Simpsons intro! Sorry for the potato quality by Johans_wilgat (521 points, 44 comments)
  7. Vitalik Buterin to Core Maxi: “ok bitcoiner” .... by Egon_1 (515 points, 206 comments)
  8. Can't stop won't stop by Greentoboggan (514 points, 78 comments)
  9. These men are serving life without parole in max security prison for nonviolent drug offenses. They helped me through a difficult time in a very dark place. I hope 2019 was their last year locked away from their loved ones. FreeRoss.org/lifers/ Happy New Year. by Egon_1 (502 points, 237 comments)
  10. Blockchain? by unesgt (479 points, 103 comments)

Top Comments

  1. 211 points: fireduck's comment in John Mcafee on the run from IRS Tax Evasion charges, running 2020 Presidential Campaign from Venezuela in Exile
  2. 203 points: WalterRothbard's comment in I am a Bitcoin supporter and developer, and I'm starting to think that Bitcoin Cash could be better, but I have some concerns, is anyone willing to discuss them?
  3. 179 points: Chris_Pacia's comment in The BSV chain has just experienced a 6-block reorg
  4. 163 points: YourBodyIsBCHn's comment in I made this account specifically to tip in nsfw/gonewild subreddits
  5. 161 points: BeijingBitcoins's comment in Last night's BCH & BTC meetups in Tokyo were both at the same restaurant (Two Dogs). We joined forces for this group photo!
  6. 156 points: hawks5999's comment in You can’t make this stuff up. This is how BTC supporters actually think. From bitcoin: “What you can do to make BTC better: check twice if you really need to use it!” 🤦🏻‍♂️
  7. 155 points: lowstrife's comment in Steve Wozniak Sold His Bitcoin at Its Peak $20,000 Valuation
  8. 151 points: kdawgud's comment in The government is taking away basic freedoms we each deserve
  9. 147 points: m4ktub1st's comment in BCH suffered a 51% attack by colluding miners to re-org the chain in order to reverse transactions - why is nobody talking about this? Dangerous precident
  10. 147 points: todu's comment in Why I'm not a fan of the SV community: My recent bill for defending their frivolous lawsuit against open source software developers.
Generated with BBoe's Subreddit Stats
submitted by subreddit_stats to subreddit_stats [link] [comments]

Searching for the Unicorn Cryptocurrency

For someone first starting out as a cryptocurrency investor, there is no trustworthy manual for screening a cryptocurrency’s merits, as we are still in the early, Wild West days of the cryptocurrency market. One would need to become deeply familiar with the inner workings of blockchain to be able to perform even the bare minimum of due diligence.
One might believe, over time, that finding the perfect cryptocurrency may be nothing short of futile. If a cryptocurrency claims infinite scalability, then it is probably either lightweight with limited features or highly centralized among a limited number of nodes that perform consensus services, especially under Proof of Stake or Delegated Proof of Stake. Similarly, a cryptocurrency that promises comprehensive privacy may have technical obstacles to overcome if it aims to expand its applications, such as smart contracts. The bottom line is that it is extremely difficult for a cryptocurrency to have all important features jam-packed into itself.
The cryptocurrency space is stuck in the era of the “dial-up internet,” in a manner of speaking. Currently blockchain can’t scale – not without certain tradeoffs – and it hasn’t fully resolved certain intractable issues, such as user-unfriendly long addresses and the ever-growing blockchain size, to name two.
In other words, we haven’t found the ultimate cryptocurrency. That is, we haven’t found the mystical unicorn cryptocurrency that ushers the era of decentralization while eschewing all the limitations of traditional blockchain systems.
“But wait – what about Ethereum once it implements sharding?”
“Wouldn’t IOTA be able to scale infinitely with smart contracts through its Qubic offering?”
“Isn’t Dash capable of having privacy, smart contracts, and instantaneous transactions?”
Those thoughts and comments may come from cryptocurrency investors who have done their research. It is natural for the informed investors to invest in projects that are believed to bring cutting edge technological transformation to blockchain. Sooner or later, the sinking realization will hit that any variation of the current blockchain technology will always likely have certain limitations.
Let us pretend that there indeed exists a unicorn cryptocurrency somewhere that may or may not be here yet. What would it look like, exactly? Let us set the 5 criteria of the unicorn cryptocurrency:
Unicorn Criteria
(1) Perfectly solves the blockchain trilemma:
o Infinite scalability
o Full security
o Full decentralization
(2) Zero or minimal transaction fee
(3) Full privacy
(4) Full smart contract capabilities
(5) Fair distribution and fair governance
For each of the above 5 criteria, there would not be any middle ground. For example, a cryptocurrency with just an in-protocol mixer would not be considered as having full privacy. As another example, an Initial Coin Offering (ICO) may possibly violate criterion (5) since with an ICO the distribution and governance are often heavily favored towards an oligarchy – this in turn would defy the spirit of decentralization that Bitcoin was found on.
There is no cryptocurrency currently that fits the above profile of the unicorn cryptocurrency. Let us examine an arbitrary list of highly hyped cryptocurrencies that meet the above list at least partially. The following list is by no means comprehensive but may be a sufficient sampling of various blockchain implementations:
Bitcoin (BTC)
Bitcoin is the very first and the best known cryptocurrency that started it all. While Bitcoin is generally considered extremely secure, it suffers from mining centralization to a degree. Bitcoin is not anonymous, lacks smart contracts, and most worrisomely, can only do about 7 transactions per second (TPS). Bitcoin is not the unicorn, notwithstanding all the Bitcoin maximalists.
Ethereum (ETH)
Ethereum is widely considered the gold standard of smart contracts aside from its scalability problem. Sharding as part of Casper’s release is generally considered to be the solution to Ethereum’s scalability problem.
The goal of sharding is to split up validating responsibilities among various groups or shards. Ethereum’s sharding comes down to duplicating the existing blockchain architecture and sharing a token. This does not solve the core issue and simply kicks the can further down the road. After all, full nodes still need to exist one way or another.
Ethereum’s blockchain size problem is also an issue as will be explained more later in this article.
As a result, Ethereum is not the unicorn due to its incomplete approach to scalability and, to a degree, security.
Dash
Dash’s masternodes are widely considered to be centralized due to their high funding requirements, and there are accounts of a pre-mine in the beginning. Dash is not the unicorn due to its questionable decentralization.
Nano
Nano boasts rightfully for its instant, free transactions. But it lacks smart contracts and privacy, and it may be exposed to well orchestrated DDOS attacks. Therefore, it goes without saying that Nano is not the unicorn.
EOS
While EOS claims to execute millions of transactions per second, a quick glance reveals centralized parameters with 21 nodes and a questionable governance system. Therefore, EOS fails to achieve the unicorn status.
Monero (XMR)
One of the best known and respected privacy coins, Monero lacks smart contracts and may fall short of infinite scalability due to CryptoNote’s design. The unicorn rank is out of Monero’s reach.
IOTA
IOTA’s scalability is based on the number of transactions the network processes, and so its supposedly infinite scalability would fluctuate and is subject to the whims of the underlying transactions. While IOTA’s scalability approach is innovative and may work in the long term, it should be reminded that the unicorn cryptocurrency has no middle ground. The unicorn cryptocurrency would be expected to scale infinitely on a consistent basis from the beginning.
In addition, IOTA’s Masked Authenticated Messaging (MAM) feature does not bring privacy to the masses in a highly convenient manner. Consequently, the unicorn is not found with IOTA.

PascalCoin as a Candidate for the Unicorn Cryptocurrency
Please allow me to present a candidate for the cryptocurrency unicorn: PascalCoin.
According to the website, PascalCoin claims the following:
“PascalCoin is an instant, zero-fee, infinitely scalable, and decentralized cryptocurrency with advanced privacy and smart contract capabilities. Enabled by the SafeBox technology to become the world’s first blockchain independent of historical operations, PascalCoin possesses unlimited potential.”
The above summary is a mouthful to be sure, but let’s take a deep dive on how PascalCoin innovates with the SafeBox and more. Before we do this, I encourage you to first become acquainted with PascalCoin by watching the following video introduction:
https://www.youtube.com/watch?time_continue=4&v=F25UU-0W9Dk
The rest of this section will be split into 10 parts in order to illustrate most of the notable features of PascalCoin. Naturally, let’s start off with the SafeBox.
Part #1: The SafeBox
Unlike traditional UTXO-based cryptocurrencies in which the blockchain records the specifics of each transaction (address, sender address, amount of funds transferred, etc.), the blockchain in PascalCoin is only used to mutate the SafeBox. The SafeBox is a separate but equivalent cryptographic data structure that snapshots account balances. PascalCoin’s blockchain is comparable to a machine that feeds the most important data – namely, the state of an account – into the SafeBox. Any node can still independently compute and verify the cumulative Proof-of-Work required to construct the SafeBox.
The PascalCoin whitepaper elegantly highlights the unique historical independence that the SafeBox possesses:
“While there are approaches that cryptocurrencies could use such as pruning, warp-sync, "finality checkpoints", UTXO-snapshotting, etc, there is a fundamental difference with PascalCoin. Their new nodes can only prove they are on most-work-chain using the infinite history whereas in PascalCoin, new nodes can prove they are on the most-work chain without the infinite history.”
Some cryptocurrency old-timers might instinctively balk at the idea of full nodes eschewing the entire history for security, but such a reaction would showcase a lack of understanding on what the SafeBox really does.
A concrete example would go a long way to best illustrate what the SafeBox does. Let’s say I input the following operations in my calculator:
5 * 5 – 10 / 2 + 5
It does not take a genius to calculate the answer, 25. Now, the expression “5 * 5 – 10 / 2 + 5” would be forever imbued on a traditional blockchain’s history. But the SafeBox begs to differ. It says that the expression “5 * 5 – 10 / 2 + 5” should instead be simply “25” so as to preserve simplicity, time, and space. In other words, the SafeBox simply preserves the account balance.
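A toy version of that idea in code (an illustration of the principle, not PascalCoin's actual data format): fold the whole operation history down to one balance per account and keep only the result.

```python
# The full operation history (the "5 * 5 - 10 / 2 + 5")...
history = [
    ("alice", +50), ("alice", -10), ("bob", +10), ("alice", -5), ("bob", +5),
]

# ...folds into a snapshot of balances (the "25"): one entry per account.
safebox = {}
for account, delta in history:
    safebox[account] = safebox.get(account, 0) + delta

print(safebox)  # {'alice': 35, 'bob': 15}
```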
But some might still be unsatisfied and claim that if one cannot trace the series of operations (transactions) that lead to the final number (balance) of 25, the blockchain is inherently insecure.
Here are four important security aspects of the SafeBox that some people fail to realize:
(1) SafeBox Follows the Longest Chain of Proof-of-Work
The SafeBox mutates itself every 100 blocks. Each new SafeBox mutation must reference both the previous SafeBox mutation and the preceding 100 blocks in order to be valid, and the resultant hash of the new mutated SafeBox must then be referenced by each of the new subsequent blocks, and the process repeats itself forever.
Requiring each new SafeBox mutation to reference the previous SafeBox mutation is comparable to relying on the entire history. The previous SafeBox mutation already encapsulates the cumulative result of the entire history except for the most recent 100 blocks, which is why each new SafeBox mutation requires both the previous mutation and those 100 blocks.
So in a sense, there is a single interconnected chain of inflows and outflows, supported by Byzantine Proof-of-Work consensus, instead of the entire history of transactions.
More concretely, the SafeBox follows the path of the longest chain of Proof-of-Work simply by design, and is thus cryptographically equivalent to the entire history even without tracing specific operations in the past. If the chain is rolled back with a 51% attack, only the attacker’s own account(s) in the SafeBox can be manipulated as is explained in the next part.
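To make the chaining concrete, here is a minimal sketch, under my own naming assumptions, of how each SafeBox checkpoint can commit to the previous checkpoint plus the 100 blocks it absorbs, so that the chain of checkpoints stands in for the full history:

```python
# Minimal sketch of SafeBox checkpoint chaining; function and variable
# names are assumptions, not PascalCoin's actual implementation.

import hashlib

def safebox_hash(prev_checkpoint: bytes, last_100_block_hashes: list) -> bytes:
    """New checkpoint = H(previous checkpoint || the 100 blocks it absorbs)."""
    h = hashlib.sha256(prev_checkpoint)
    for block_hash in last_100_block_hashes:
        h.update(block_hash)
    return h.digest()

genesis = hashlib.sha256(b"genesis").digest()
blocks = [hashlib.sha256(f"block-{i}".encode()).digest() for i in range(100)]
checkpoint = safebox_hash(genesis, blocks)
# Each checkpoint transitively commits to everything before it, so a new
# node only needs the latest checkpoint plus its Proof-of-Work to verify.
```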
(2) A 51% Attack on PascalCoin Functions the Same as on Other Chains
A 51% attack on PascalCoin would work in a similar way as with other Proof-of-Work cryptocurrencies. An attacker cannot modify a transaction in the past without affecting the current SafeBox hash which is accepted by all honest nodes.
Someone might claim that by rolling back all the current blocks plus the 100 blocks prior to the SafeBox’s last mutation, one could create a forged SafeBox with different balances for all accounts. This would be incorrect: with a 51% attack, one would be able to manipulate only his or her own account(s) in the SafeBox – just as is the case with UTXO cryptocurrencies. The SafeBox stores the balances of all accounts, which are in turn irreversibly linked only to their respective owners’ private keys.
(3) One Could Preserve the Entire History of the PascalCoin Blockchain
Nothing about the SafeBox forces blockchain data in PascalCoin to be deleted. Since the SafeBox is cryptographically equivalent to a full node with the entire history, as explained above, PascalCoin full nodes are not expected to retain the infinite history. But for whatever reason(s) one may have, one could still keep all of the PascalCoin blockchain history alongside the SafeBox as an option, even though it would be redundant.
Even without storing the entire history of the PascalCoin blockchain, you can still trace the specific operations of the most recent 100 blocks before the SafeBox absorbs and reflects their net result (a single balance for each account). But if you’re interested in tracing operations further back in the past – as redundant as that may be – you have the option to do so by storing the entire history of the PascalCoin blockchain.
(4) The SafeBox is Equivalent to the Entire Blockchain History
Some skeptics may ask: “What if the SafeBox is forever lost? How would you be able to verify your accounts?” Asking this question is tantamount to asking what would happen to Bitcoin if its entire history were erased. The result would be chaos, of course, but the SafeBox is still in line with the general security model of a traditional blockchain with respect to black swans.
Now that we know the security of the SafeBox is not compromised, what are the implications of this new blockchain paradigm? Even a colorful illustration like the following wouldn’t do justice to the subtle revolution the SafeBox ushers in. The automobiles we see on the street are the bread-and-butter representation of traditional blockchain systems. The SafeBox, on the other hand, supercharges those traditional cars to become the Transformers from Michael Bay’s films.
The SafeBox is an entirely different blockchain architecture that is impressive in its simplicity and ingenuity. And the SafeBox’s design is only the opening act for PascalCoin’s vast nuclear arsenal. If the above were all that PascalCoin offers, it still wouldn’t come close to achieving unicorn status – but luckily, we have just scratched the surface. Please keep reading if you want to learn how PascalCoin is going to shatter the cryptocurrency industry into pieces. Buckle up, as this is going to be a long read as we explore the SafeBox’s implications further.
Part #2: 0-Confirmation Transactions
To begin, 0-confirmation transactions are secure in PascalCoin thanks to the SafeBox.
The following paraphrases an explanation of PascalCoin’s 0-confirmations from the whitepaper:
“Since PascalCoin is not a UTXO-based currency but rather a State-based currency thanks to the SafeBox, the security guarantees of 0-confirmation transactions are much stronger than in UTXO-based currencies. For example, in Bitcoin if a merchant accepts a 0-confirmation transaction for a coffee, the buyer can simply roll that transaction back after receiving the coffee but before the transaction is confirmed in a block. The way the buyer does this is by re-spending those UTXOs to himself in a new transaction (with a higher fee), thus invalidating them for the merchant. In PascalCoin, this is virtually impossible since the buyer’s transaction to the merchant is simply a delta-operation to debit/credit a quantity from/to accounts respectively. The buyer is unable to erase or pre-empt this two-sided, debit/credit-based transaction from the network’s pending pool until it either enters a block for confirmation or is discarded with respect to both sender and receiver ends. If the buyer tries to double-spend the coffee funds after receiving the coffee but before they clear, the double-spend transaction will not propagate through the network, since nodes cannot propagate a double-spending transaction thanks to the debit/credit nature of the transaction. A UTXO-based transaction is initially one-sided before confirmation and is therefore more exposed to one-sided malicious schemes of double spending.”
Phew, that explanation was technical but it had to be done. In summary, PascalCoin possesses the only secure 0-confirmation transactions in the cryptocurrency industry, and it goes without saying that this means PascalCoin is extremely fast. In fact, PascalCoin is capable of 72,000 TPS even prior to any additional extensive optimizations down the road. In other words, PascalCoin is as instant as it gets and gives Nano a run for its money.
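To give a feel for why a state-based pending pool can refuse double-spends outright, here is a toy sketch. The account model, the per-account operation counter, and all names below are my own loose modeling of the whitepaper’s description, not PascalCoin’s actual node code:

```python
# Toy state-based pending pool; fields and logic are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Op:
    sender: int    # sender account (PASA) number
    receiver: int  # receiver account number
    amount: int    # quantity to debit from sender / credit to receiver
    counter: int   # assumed per-account operation counter

class PendingPool:
    def __init__(self, balances):
        self.balances = dict(balances)  # confirmed SafeBox balances
        self.pending = {}               # (sender, counter) -> Op
        self.reserved = {}              # sender -> total pending debits

    def accept(self, op: Op) -> bool:
        # A conflicting op reusing the same counter is refused outright:
        # this is what blocks the "re-spend to myself" trick.
        if (op.sender, op.counter) in self.pending:
            return False
        # Debits beyond the confirmed balance minus pending debits are refused.
        available = self.balances[op.sender] - self.reserved.get(op.sender, 0)
        if op.amount > available:
            return False
        self.pending[(op.sender, op.counter)] = op
        self.reserved[op.sender] = self.reserved.get(op.sender, 0) + op.amount
        return True

pool = PendingPool({5471: 100})
print(pool.accept(Op(5471, 1234, 60, counter=1)))  # True: the coffee payment
print(pool.accept(Op(5471, 5471, 60, counter=1)))  # False: double-spend rejected
```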
Part #3: Zero Fee
Let’s circle back to our discussion of PascalCoin’s 0-confirmation capability. Here’s a fun twist on PascalCoin’s 0-confirmation magic: 0-confirmation transactions are zero-fee. As in, you don’t pay a single cent in fees for each 0-confirmation! There is just a tiny downside: if you create a second transaction within a 5-minute block window, then you need to pay a minimal fee. Imagine using Nano but with significantly stronger anti-DDoS protection against spam! And there shouldn’t be any complaint, as this fee amounts to 0.0001 Pascal, or $0.00002 based on the price of a Pascal at the time of this writing.
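In code, the rule described above would look something like the following minimal sketch (the 0.0001 Pascal constant is from the text; the function itself is an illustrative assumption, not protocol code):

```python
# Sketch of the anti-spam fee rule; illustrative only.
MIN_FEE = 0.0001  # in Pascal, per the figure quoted above

def required_fee(ops_already_sent_this_block: int) -> float:
    """The first operation per account per ~5-minute block is free."""
    return 0.0 if ops_already_sent_this_block == 0 else MIN_FEE

print(required_fee(0))  # 0.0     -> your first transaction is free
print(required_fee(1))  # 0.0001  -> a second one in the same window pays the spam fee
```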
So, how come the fee for blazingly fast transactions is nonexistent? This is where the magic of the SafeBox arises in three ways:
(1) PascalCoin possesses the secure 0-confirmation feature as discussed above that enables this speed.
(2) There is no fee bidding competition of transaction priority typical in UTXO cryptocurrencies since, once again, PascalCoin operates on secure 0-confirmations.
(3) There is no fee incentive needed to run full nodes on behalf of the network’s security beyond the consensus rewards.
Part #4: Blockchain Size
Let’s expand more on the third point above, using Ethereum as an example. Since Ethereum’s launch in 2015, its full blockchain size has grown to around 2 TB, give or take, but let’s just call it 100 GB for now to avoid offending the Ethereum elitists who insist there are lighter types of full nodes. Whoever runs Ethereum’s full nodes would expect storage fees on top of the typical consensus fees, as it takes significant resources to shoulder Ethereum’s full blockchain size and in turn secure the network. What if I told you that PascalCoin’s full blockchain size will never exceed a few GB even after thousands of years? That is just what the SafeBox enables PascalCoin to do. It is estimated that by 2072, PascalCoin’s full nodes will only be 6 GB, which is low enough not to warrant any fee incentives for hosting full nodes. Remember, the SafeBox is an ultra-light cryptographic data structure that is cryptographically equivalent to a blockchain with the entire transaction history. In other words, the SafeBox is a compact spreadsheet of all account balances that functions as PascalCoin’s full node!
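As a back-of-the-envelope sanity check on that 6 GB figure, here is a quick calculation. The inputs are assumptions on my part – roughly 5 new accounts minted per 5-minute block and about 200 bytes per SafeBox account entry – so the real constants may differ:

```python
# Back-of-the-envelope SafeBox size estimate; inputs are assumptions.
BLOCKS_PER_YEAR = 365.25 * 24 * 12   # one block every ~5 minutes
ACCOUNTS_PER_BLOCK = 5               # assumed minting rate
BYTES_PER_ACCOUNT = 200              # assumed SafeBox entry size

years = 2072 - 2016                  # PascalCoin launched at the end of 2016
accounts = BLOCKS_PER_YEAR * ACCOUNTS_PER_BLOCK * years
size_gb = accounts * BYTES_PER_ACCOUNT / 1e9
print(f"{accounts / 1e6:.1f}M accounts, ~{size_gb:.1f} GB")  # ~29.5M accounts, ~5.9 GB
```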
Not only does the SafeBox’s infinitesimal memory size help to reduce transaction fees by phasing out any storage fees, but it also paves the way for true decentralization. It would be trivial for every PascalCoin user to run a full node in the form of a wallet. This is extreme decentralization at its finest, since the majority of users of other cryptocurrencies ditch full nodes due to their burdensome sizes. It is naïve to believe that storage costs will fall to the point where hosting full nodes becomes trivial. Take a look at the following chart outlining the trend of storage cost.

Chart: hard drive cost per gigabyte over time – https://www.backblaze.com/blog/hard-drive-cost-per-gigabyte/
As we can see, storage costs continue to decrease, but the descent is slowing down, as is the norm with technological improvements. In the meantime, the blockchain sizes of other cryptocurrencies are increasing linearly or, in the case of smart contract engines like Ethereum, parabolically. Imagine a cryptocurrency smart contract engine like Ethereum garnering worldwide adoption; what do you think Ethereum’s size would look like in the far future based on the following chart?


Chart: Ethereum blockchain size over time – https://i.redd.it/k57nimdjmo621.png

Ethereum’s future blockchain size is not looking pretty in terms of sustainable security. Sharding is not a fix for this issue, since full nodes are still required – but that is a different topic for another time.
It is astonishing that the cryptocurrency community as a whole has passively accepted this forever-expanding-blockchain-size problem as an inescapable fate.
PascalCoin is the only cryptocurrency that has fully escaped the death vortex of forever expanding blockchain size. Its blockchain size wouldn’t exceed 10 GB even after many hundreds of years of worldwide adoption. Ethereum’s blockchain size after hundreds of years of worldwide adoption would make fine comedy.
Part #5: Simple, Short, and Ordinal Addresses
Remember how the SafeBox works by snapshotting all account balances? As it turns out, the account address system is almost as cool as the SafeBox itself.
Imagine yourself in this situation: on a very hot and sunny day, you’re wandering down the street across from your house and run into a lemonade stand – the old-fashioned kind without any QR code or credit card terminal. The kid across from you is selling a lemonade cup for 1 Pascal, with a poster listing the payment address as 5471-55. You flip out your phone and click “Send” with 1 Pascal to the address 5471-55; voilà, exactly one second later you’re drinking your lemonade without paying a cent in transaction fees!
The last thing anyone wants to do on the spot is figure out how to copy/paste an address like 1BoatSLRHtKNngkdXEeobR76b53LETtpyT, wouldn’t you agree? Gone are the obnoxiously long addresses that plague all cryptocurrencies. The days of those unreadable addresses have to be numbered if blockchain is to reinvent itself for the general public. EOS has a similar feature for readable addresses, but in a very limited manner by comparison, and nicknames attached to addresses in GUIs don’t count since blockchain-wide compatibility wouldn’t hold.
Not only does PascalCoin have the neat feature of addresses (called PASAs) that are at most 6 or 7 digits long, but PascalCoin can also incorporate in-protocol address naming, as opposed to GUI address nicknames. Suppose I want to order something from Amazon using Pascal; I simply search the word “Amazon” and the corresponding account number shows up. Pretty neat, right?
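For the curious, a PASA is just an account number plus a short checksum. The formula below is the one I understand PascalCoin to use from its improvement proposals – treat it as an assumption – and note that the lemonade-stand address above is illustrative rather than a verified checksummed PASA:

```python
# Sketch of the PASA "number-checksum" format; the formula is an assumption
# based on my reading of PascalCoin's documentation.

def pasa_string(account_number: int) -> str:
    checksum = ((account_number * 101) % 89) + 10
    return f"{account_number}-{checksum}"

print(pasa_string(1234))  # "1234-44" -- short enough to print on a poster
```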
The astute reader may gather that PascalCoin’s address system makes it necessary to commoditize addresses, and he/she would be correct. Some view this as a weakness; part #10 later in this segment addresses this incorrect perception.
Part #6: Privacy
As if the above wasn’t enough, here’s another secret that PascalCoin has: it is a full-blown privacy coin. It uses two separate foundations to achieve comprehensive anonymity: an in-protocol mixer for transfer amounts, and zk-SNARKs for private balances. The former has been implemented and the latter is on the roadmap. Between 0-confirmation transactions and negligible transaction fees, PascalCoin would be the most scalable privacy coin of any cryptocurrency, pending the zk-SNARKs implementation.
Part #7: Smart Contracts
Next, PascalCoin will take smart contracts to the next level with a layer-2 overlay consensus system that pioneers sidechains and other smart contract implementations.
In formal terms, this layer-2 architecture will facilitate the transfer of data between PASAs which in turn allows clean enveloping of layer-2 protocols inside layer-1 much in the same way that HTTP lives inside TCP.
To summarize:
· The layer-2 consensus method is separate from the layer-1 Proof-of-Work. This layer-2 consensus method is independent and flexible. A sidechain – based on a single encompassing PASA – could apply Proof-of-Stake (POS), Delegated Proof-of-Stake (DPOS), or Directed Acyclic Graph (DAG) as the consensus system of its choice.
· Such a layer-2 smart contract platform can be written in any language.
· Layer-2 sidechains will also provide very strong anonymity since funds are all pooled and keys are not used to unlock them.
· This layer-2 architecture is ingenious in that the computation is separate from layer-2 consensus, in effect removing any bottleneck.
· Horizontal scaling exists in this paradigm, as there is no interdependence between smart contracts and state is not managed by slow sidechains.
· Speed and scalability are fully independent of PascalCoin’s layer-1.
One would be able to run the entire global financial system on PascalCoin’s smart contract platform and it would still scale infinitely. In fact, this layer-2 architecture would be orders of magnitude faster than Ethereum, even after Ethereum’s sharding is implemented.
All this is the main focus of PascalCoin’s upcoming version 5 in 2019. A whitepaper add-on for this major upgrade will be released in early 2019.
Part #8: RandomHash Algorithm
Surely there must be some tradeoffs to PascalCoin’s impressive capabilities, you might be asking yourself. One might bring up the fact that PascalCoin’s layer-1 is based on Proof-of-Work and is thus susceptible to mining centralization. This would be a fallacy, as PascalCoin has pioneered the very first truly ASIC-, GPU-, and dual-mining-resistant algorithm, known as RandomHash, which obliterates anything that is not CPU-based and gives all the power back to solo miners.
Here is the official description of RandomHash:
“RandomHash is a high-level cryptographic hash algorithm that combines other well-known hash primitives in a highly serial manner. The distinguishing feature is that calculations for a nonce are dependent on partial calculations of other nonces, selected at random. This allows a serial hasher (CPU) to re-use these partial calculations in subsequent mining saving 50% or more of the work-load. Parallel hashers (GPU) cannot benefit from this optimization since the optimal nonce-set cannot be pre-calculated as it is determined on-the-fly. As a result, parallel hashers (GPU) are required to perform the full workload for every nonce. Also, the algorithm results in 10x memory bloat for a parallel implementation. In addition to its serial nature, it is branch-heavy and recursive making it optimal for CPU-only mining.”
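To get an intuition for that description, here is a heavily simplified toy in the spirit of RandomHash – not the real algorithm (see the whitepaper linked below for that). Each nonce’s hash depends on partial digests of other pseudo-randomly chosen nonces, which a serial miner can cache and reuse but a parallel miner cannot precompute:

```python
# Toy, RandomHash-*inspired* sketch; all structure here is illustrative.
import hashlib
import random

PRIMITIVES = [hashlib.sha256, hashlib.sha3_256, hashlib.blake2b]
partial_cache = {}  # nonce -> partial digest, reused by a serial (CPU) miner

def partial(header: bytes, nonce: int) -> bytes:
    """First-round digest for a nonce; a serial miner caches these."""
    if nonce not in partial_cache:
        algo = PRIMITIVES[nonce % len(PRIMITIVES)]
        partial_cache[nonce] = algo(header + nonce.to_bytes(8, "big")).digest()
    return partial_cache[nonce]

def toy_random_hash(header: bytes, nonce: int, rounds: int = 4) -> bytes:
    """Each round mixes in the partial digest of another nonce chosen
    pseudo-randomly from the evolving state, so the dependency set
    cannot be precomputed in parallel."""
    state = partial(header, nonce)
    for _ in range(rounds):
        neighbour = random.Random(state).randrange(2**32)
        algo = PRIMITIVES[state[0] % len(PRIMITIVES)]
        state = algo(state + partial(header, neighbour)).digest()
    return state

print(toy_random_hash(b"block-header", nonce=42).hex())
```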
One might be understandably skeptical of any Proof-of-Work algorithm that claims to solve ASIC and GPU centralization once and for all, because countless proposals for various algorithms have been thrown around since the dawn of Bitcoin. Is RandomHash truly the ASIC & GPU killer that it claims to be?
Herman Schoenfeld, the inventor of RandomHash, described his algorithm as follows:
“RandomHash offers endless ASIC-design breaking surface due to its use of recursion, hash algo selection, memory hardness and random number generation.
For example, changing how round hash selection is made and/or random number generator algo and/or checksum algo and/or their sequencing will totally break an ASIC design. Conceptually if you can significantly change the structure of the output assembly whilst keeping the high-level algorithm as invariant as possible, the ASIC design will necessarily require proportional restructuring. This results from the fact that ASIC designs mirror the ASM of the algorithm rather than the algorithm itself.”
Polyminer1 (pseudonym), one of the members of the PascalCoin core team who developed RHMiner (official software for mining RandomHash), claimed as follows:
“The design of RandomHash is, to my experience, a genuine innovation. I’ve been 30 years in the field. I’ve rarely been surprised by anything. RandomHash was one of my rare surprises. It’s elegant, simple, and achieves resistance in all fronts.”
PascalCoin may have been the first to win the race toward what could be described as the “God algorithm” for Proof-of-Work cryptocurrencies. Look no further than Howard Chu, one of Monero’s core developers since 2015. In September 2018, Howard declared that he had found a solution, called RandomJS, to permanently keep ASICs off the network without repetitive algorithm changes. This solution closely mirrors RandomHash’s approach. Discussing his algorithm, Howard asserted that “RandomJS is coming at the problem from a direction that nobody else is.”
Link to Howard Chu’s article on RandomJS:
https://www.coindesk.com/one-musicians-creative-solution-to-drive-asics-off-monero
Yet when Herman was asked about Howard’s approach, he responded:
“Yes, looks like it may work although using Javascript was a bit much. They should’ve just used an assembly subset and generated random ASM programs. In a way, RandomHash does this with its repeated use of random mem-transforms during expansion phase.”
In the end, PascalCoin may have implemented the most revolutionary Proof-of-Work algorithm to date – one that eclipses Howard’s burgeoning vision – and almost nobody knows about it. To learn more about RandomHash, refer to the following resources:
RandomHash whitepaper:
https://www.pascalcoin.org/storage/whitepapers/RandomHash_Whitepaper.pdf
Technical proposal for RandomHash:
https://github.com/PascalCoin/PascalCoin/blob/master/PIP/PIP-0009.md
Someone might claim that PascalCoin still suffers from mining centralization after RandomHash, and this is somewhat misleading as will be explained in part #10.
Part #9: Fair Distribution and Governance
Not only does PascalCoin rest on superior technology, but it also has its roots in the correct philosophy of decentralized distribution and governance. There was no ICO or pre-mine, and the developer fund exists as a percentage of mining rewards as voted by the community. This developer fund is 100% governed by a decentralized autonomous organization – currently facilitated by the PascalCoin Foundation – that will eventually be transformed into an autonomous smart contract platform. Not only is the developer fund voted upon by the community, but PascalCoin’s development roadmap is also voted upon by the community via the Protocol Improvement Proposals (PIPs).
This decentralized governance also serves an important benefit as a powerful deterrent to unseemly fork wars that befall many cryptocurrencies.
Part #10: Common Misconceptions of PascalCoin
“The branding is terrible”
PascalCoin is currently working very hard on its image and is preparing several branding and marketing initiatives in the short term. For example, two of PascalCoin’s core developers recently interviewed with the Fox Business Network. A YouTube replay of this interview will be heavily promoted.
Some people object to the name PascalCoin. First, it’s worth noting that PascalCoin is the name of the project, while Pascal is the name of the underlying currency. Second, Google and YouTube received excessive criticism early on for their name choices. Look at where those companies are nowadays – surely a somewhat similar situation faces PascalCoin until the name’s familiarity percolates into the public.
“The wallet GUI is terrible”
As the team consists of a small yet extremely dedicated group of developers, multiple priorities can be challenging to juggle. The lack of funding through an ICO or a pre-mine also makes it difficult to accelerate development. The top priority of the core developers is to continue developing PascalCoin’s groundbreaking technology full-time. In the meantime, an updated and user-friendly wallet GUI has been in the works for some time and will be released in due course. Rome wasn’t built in a day.
“One would need to purchase a PASA in the first place”
This is a complicated topic since PASAs need to be commoditized by the SafeBox’s design, meaning that PASAs cannot be obtained at no charge to prevent systematic abuse. This raises two seemingly valid concerns:
· As a chicken and egg problem, how would one purchase a PASA using Pascal in the first place if one cannot obtain Pascal without a PASA?
· How would the price of PASAs stay low and affordable in the face of significant demand?
With regards to the chicken and egg problem, there are many ways – some finished and some unfinished – to obtain your first PASA as explained on the “Get Started” page on the PascalCoin website:
https://www.pascalcoin.org/get_started
More important, however, is the fact that there are a few methods to get your first PASA for free. The team will also soon release another method in which you could obtain your first PASA for free via a single SMS message. This would probably become by far the simplest and easiest way to obtain your first PASA for free, and more easy ways will follow down the road.
What about ensuring the PASA market at large remains inexpensive and affordable following your first (and probably free) PASA acquisition? This would be achieved in two ways:
· Decentralized governance of the PASA economics per the explanation in the FAQ section on the bottom of the PascalCoin website (https://www.pascalcoin.org/)
· Unlimited and free pseudo-PASAs based on layer-2 in the next version release.
“PascalCoin is still centralized after the release of RandomHash”
Did the implementation of RandomHash from version 4 live up to its promise?
The official goals of RandomHash were as follows:
(1) Implement a GPU & ASIC resistant hash algorithm
(2) Eliminate dual mining
The two goals above were achieved by every possible measure.
Yet one mining pool, Nanopool, was able to regain its hash majority after a significant but temporary dip.
The official conclusion is that, from a probabilistic viewpoint, solo miners are more profitable than pool miners. However, pool mining remains enticing for miners who 1) have limited hardware, as pools provide a steady income instead of the more profitable but probabilistic income of solo mining, and 2) prefer convenient software and/or a GUI.
What is the next step, then? While the barrier to entry for solo miners has successfully been lowered, additional work needs to be done. The PascalCoin team and community are earnestly investigating further steps to improve mining decentralization with respect to pool mining specifically, building on RandomHash’s successful elimination of GPU, ASIC, and dual-mining dominance.
It is likely that the PascalCoin community will promote the following two initiatives in the near future:
(1) Establish a community-driven, nonprofit mining pool with attractive incentives.
(2) Optimize RHMiner, PascalCoin’s official solo mining software, for performance upgrades.
A single pool’s dominance is likely to be short-lived once more options emerge for individual CPU miners who want to avoid solo mining for whatever reason(s).
Let us use Bitcoin as an example. Bitcoin mining is dominated by ASICs and mining pools, but no single pool is – at the time of this writing – even close to obtaining the hash majority. With CPU solo mining a feasible option, and with ASIC and GPU mining eradicated by RandomHash, the future hash rate distribution of PascalCoin should be far more promising than Bitcoin’s.
PascalCoin is the Unicorn Cryptocurrency
If you’ve read this far, let’s cut straight to the point: PascalCoin IS the unicorn cryptocurrency.
It is worth noting that PascalCoin is still a young cryptocurrency, as it was launched at the end of 2016. This means that many features are still works in progress, such as zk-SNARKs, smart contracts, and pool decentralization, to name a few. However, it appears that all of the unicorn criteria are within PascalCoin’s reach once its technical roadmap is mostly completed.
Based on this exposition of PascalCoin’s technology, there is every reason to believe that PascalCoin is the unicorn cryptocurrency. PascalCoin also solves two fundamental blockchain problems beyond the unicorn criteria that were previously considered unsolvable: blockchain size and a simple address system. The SafeBox pushes PascalCoin to the forefront of the cryptocurrency zeitgeist, as it is a superior solution compared to UTXO, Directed Acyclic Graph (DAG), Block Lattice, Tangle, and any other blockchain innovation.


THE UNICORN

Author: Tyler Swob
submitted by Kosass to CryptoCurrency [link] [comments]
