Secure Multiparty Computations on Bitcoin

Communications of the ACM - Secure Multiparty Computations on Bitcoin

Ren (REN) is now on DeFi Swap
Ren (REN) is now available on DeFi Swap. Users can swap REN, be REN Liquidity Providers to earn fees and boost their yield by up to 20x when staking CRO.
Ren (REN) is an open protocol that enables the permissionless and private transfer of value between any blockchain. Ren built and released the first decentralized dark pool (RenEx) in 2018 and is now generalizing this technology into an ecosystem for building, deploying, and running general-purpose, privacy-preserving applications, using zkSNARKs on top of a newly developed secure multiparty computation (sMPC) protocol. Ren's core product, RenVM, brings interoperability to decentralized finance (DeFi): it is a decentralized and trustless custodian that holds your digital assets as they move between blockchains, using zero-knowledge proofs over an sMPC-based protocol. The state, inputs, and outputs of all programs that RenVM runs are kept hidden from everyone, including the Darknodes that power it.
REN joins a growing list of tokens on DeFi Swap, such as UMA (UMA), Swerve (SWRV), Harvest Finance (FARM), Uniswap (UNI), Wrapped Bitcoin (WBTC), Yearn Finance (YFI), (Wrapped) Ether (WETH), Tether (USDT), USD Coin (USDC), Dai (DAI), Chainlink (LINK), Compound (COMP) and Crypto.com Coin (CRO).
Start swapping, farming and staking now.
Please see the blog for more details about DeFi Swap.
submitted by BryanM_Crypto to Crypto_com

What is Cardano? [ADA]

Cardano is a blockchain project, often called a third-generation blockchain because of its research-driven philosophy, designed and developed by a worldwide team of scientists and engineers. The aim of the project is to develop a technology that is secure, flexible, and scalable, and can therefore be used by many millions of users.
In contrast to other projects, Cardano pursues a policy that seeks to reconcile the needs of users with those of regulatory authorities, combining privacy with regulation. Cardano’s vision is to advance global financial inclusion by giving everyone open access to fair financial services. Cardano wants to create a technological platform on which financial applications can be developed and executed.
Cardano is broadly similar to Ethereum: like Ethereum, EOS, or NEO, it is a platform that enables the creation of new tokens, decentralized applications (dApps), and smart contracts. From a technical point of view, however, there are major differences, which we will discuss later.
The cryptocurrency of Cardano is ADA. Like any cryptocurrency, ADA allows users to send value within the Cardano network over the Internet securely, quickly, and seamlessly. You can find the current price of ADA on our chart page.

Cardano History

Cardano was founded by Ethereum co-founders Charles Hoskinson and Jeremy Wood, who both left the Ethereum project after a disagreement over its direction. The future Cardano founders wanted to build a commercial enterprise behind Ethereum, while Vitalik Buterin’s group prevailed with its plan to set up a charitable foundation behind the project.
Hoskinson and Wood consequently founded a company, Input Output Hong Kong (IOHK), to manage Cardano’s research and development. Between September 2015 and January 2017, Cardano conducted a public ICO that raised a total of 62 million US dollars; about two thirds of all Ada tokens were sold.
Cardano was officially launched on 29 September 2017. Currently the project is still in its bootstrap era (“Byron”). In the bootstrap era, when people buy or sell Ada, transactions are automatically delegated to a pool of trusted nodes that manage the network; these nodes do not receive block rewards in this phase of the project. IOHK is currently working on numerous improvements and features. The next phase, “Shelley”, is to be introduced in 2019; in this phase, the project will grow into a fully decentralized and autonomous system.
This will be followed by the “Goguen” era, which will integrate smart contracts; the “Basho” phase, which is intended to improve performance; and finally the “Voltaire” phase, which will add a treasury system and a governance model (“Liquid Democracy”). The complete roadmap can be seen here.

The organization behind Cardano

As mentioned above, Charles Hoskinson founded IOHK to guarantee coordinated, planned development of the project. IOHK is responsible for the design, development, and maintenance of the Cardano platform until 2020.
In addition, however, there were originally two other institutions that took care of the Cardano project: Emurgo and the Cardano Foundation, based in Switzerland. Emurgo is a Japanese company that drives partnerships with other commercial companies and organizes Cardano’s business development.
The Cardano Foundation, in particular, was entrusted with administrative tasks; it was also responsible for public relations, trademark law, lobbying, and cooperation with governments and regulators. In October 2018, however, there was a break between IOHK/Emurgo and the Cardano Foundation. Among other things, Hoskinson accused the Cardano Foundation, under the direction of Michael Parsons, of inaction. As a result, IOHK and Emurgo decided to take over the foundation’s tasks.

Cardano: 3rd generation blockchain

Charles Hoskinson argues that second-generation blockchains must still solve several open problems to succeed in the long run, in particular scalability, interoperability, and sustainability. Cardano has developed (partly new) concepts and technologies for each of them.

Scalability

In terms of scalability, there are three challenges for Cardano:
  1. Transactions per second (TPS)
  2. Network / Bandwidth
  3. Data scaling

Transactions per second

Transactions per second (TPS) measures how many transactions can be written to a block each second. According to Hoskinson, however, this is only part of the scaling problem. Bitcoin manages 3-7 TPS and Ethereum 10-20 TPS, which is far from enough to host millions of users. Cardano's answer is the Ouroboros proof-of-stake algorithm, which we will discuss later.

Network

A further challenge is the network: with millions of users, demand for network resources grows with the number of transactions, and will eventually reach the range of several hundred terabytes or even exabytes.
Therefore, it will be impossible to maintain a homogeneous network topology in which every node forwards every transaction and message; not every node will have the necessary resources. To solve this problem, Cardano intends to use a technology called RINA (Recursive InterNetwork Architecture).

Data scaling

Since the blockchain has to store data forever, there will be an enormous and constantly growing amount of data to scale (“Data Scale”). The problem is obvious: if every node has to keep a complete copy of the entire blockchain, this will not be possible for every node from a resource point of view. The solution is that not every node needs all the data.

Interoperability

Charles Hoskinson believes that there will not be one single cryptocurrency in use in the future. Therefore Cardano aims to enable different blockchains to communicate with each other. Cardano’s vision is to create an “Internet of blockchains” with no intermediaries.
Cardano's proposed solution to this problem is sidechains, a concept that has been around in the crypto space for quite some time. Simply put, sidechains are parallel blockchains that can communicate with the main blockchain.
In addition, Cardano tries to comply with all existing compliance rules: KYC (Know Your Customer), AML (Anti-Money Laundering), and ATF (Anti-Terrorist Financing). To make this work, Cardano can attach metadata to each transaction.

Sustainability

According to Hoskinson, sustainability is the most difficult challenge. Basically, it means that the system must be continuously developed, which requires financial resources. For Hoskinson, neither patronage nor an ICO makes sense: patronage leads to centralization, while ICOs provide short-term resources for the ecosystem but at the same time produce a new, useless token.
For this reason, Cardano takes its cue from Dash’s treasury system. Each time a block is added to the blockchain, a portion of the block rewards is added to the treasury; if funds are needed for development, they can be allocated from it.
To this end, Cardano has developed a governance model (“liquid democracy”) in which stakeholders can vote on a proposal for the distribution of funds.

Ada: Ouroboros Proof-of-Stake

Similar to Ethereum’s planned proof-of-stake (PoS) protocol “Casper”, Cardano also relies on PoS. This means that there are no miners within the Cardano network responsible for validating transactions. A new proof-of-stake algorithm called Ouroboros was developed for Cardano. The basic difference between Ouroboros and Ethereum’s Casper or other similar algorithms is how block reward recipients (validators) are selected.
The core idea of the Ouroboros proof of stake is that a node is selected to create a new block with a probability proportional to its share of the total amount of Ada. This means that the more Ada a stakeholder holds, the more he can earn over time.
In principle, each node with an Ada balance greater than 0 is referred to as a “stakeholder”. When a node is selected to create a new block, it is called a slot leader. The slot leader writes the transactions into a block, signs this block with his secret key, and publishes the block to the network.
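To make stake-proportional selection concrete, here is a toy sketch in Python (names and numbers invented; the real Ouroboros election derives its randomness from the coin-flipping protocol described next rather than from a local RNG):

```python
import random

def pick_slot_leader(stakes, seed):
    """Pick a slot leader with probability proportional to stake.

    stakes: dict mapping stakeholder id -> Ada balance (> 0)
    seed:   shared epoch randomness, here just an integer
    """
    rng = random.Random(seed)
    total = sum(stakes.values())
    ticket = rng.uniform(0, total)       # a point on the stake line
    running = 0.0
    for holder, stake in stakes.items():
        running += stake
        if ticket <= running:
            return holder
    return holder                        # floating-point edge case

# B holds 60% of the stake and is elected roughly 60% of the time:
stakes = {"A": 20, "B": 60, "C": 20}
leaders = [pick_slot_leader(stakes, seed) for seed in range(10_000)]
print(leaders.count("B") / len(leaders))  # ~0.6
```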
A fundamental requirement in this election process is randomness. To achieve it, Cardano uses a multiparty computation (MPC) approach in which each elector independently performs an action known as coin tossing (a coin-flipping protocol).
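A minimal commit-reveal coin toss conveys the flavor of such a protocol (a sketch only: production designs, Ouroboros included, add verifiable secret sharing so that a participant who aborts after seeing others' reveals cannot bias the seed):

```python
import hashlib
import secrets

def commit(value: bytes, nonce: bytes) -> bytes:
    """Binding, hiding commitment to a random value."""
    return hashlib.sha256(nonce + value).digest()

# Commit phase: each party publishes only the hash.
parties = []
for _ in range(3):
    value, nonce = secrets.token_bytes(32), secrets.token_bytes(32)
    parties.append((value, nonce, commit(value, nonce)))

# Reveal phase: check every commitment, then XOR the values together.
seed = bytes(32)
for value, nonce, c in parties:
    assert commit(value, nonce) == c, "party cheated"
    seed = bytes(a ^ b for a, b in zip(seed, value))

print(seed.hex())  # shared randomness no single party controls
```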
submitted by hashatoshi to u/hashatoshi

Why I think Ren is a game-changer for decentralized finance.

I'm Ken, the author of The Weekly Coin, a newsletter where every week I highlight high-potential, lower-cap cryptocurrency projects you might have overlooked: the projects listed beyond the first page of CoinMarketCap. The Weekly Coin is merely a jumping-off point for you to do your own research.
If you're interested you can sign up for The Weekly Coin here. No ads, no spam, just the results of my research.
This week in The Weekly Coin we’re highlighting Ren. Ren currently sits in the 77th slot on CoinMarketCap, so it's not hidden in the realms of the 2nd CMC page and beyond. But after researching this coin, I think it's a game-changer for DeFi and, well, the cryptocurrency world as a whole. Also important to note: I'm not invested in Ren, no one paid me to write this, and of course I'm not a financial adviser. 😁
TL;DR
Ren allows the free movement of value between all blockchains and transfer of tokens in zero-knowledge. Unlocking new liquidity and resources to power a new wave of value in the open finance movement. With Ren all decentralized applications can run in secret, preserving the privacy of all users and data. (Renproject.io)

What is Ren?

Ren had humble beginnings as a project called Republic Protocol. Republic Protocol launched in 2017 and held its ICO on February 3, 2018. The platform focused on providing decentralized dark pools. A dark pool is a term from traditional finance for an anonymous over-the-counter (OTC) order book that lets large trades execute without moving the market (more on dark pools here). Republic Protocol's decentralized dark pools were designed to address OTC trading, which is known for its high fees, and to help increase institutional money flow into crypto. The technology behind these decentralized dark pools is called RenVM.
💡 Institutional investors want to get in the game but are staying on the sidelines until crypto has matured and risk is low.
RenVM turned out to be the perfect solution for interoperability between blockchains, because in a decentralized dark pool the blockchains need to see each other. Republic Protocol saw how massive this opportunity was and rebranded to Ren.
Ren is the evolution of the technology that underpins RepublicProtocol, in its most useful and general form. It becomes something much bigger than Republic Protocol and will empower developers to build decentralized and trustless applications, with a distinct focus on financial applications. Using our own newly developed secure multiparty computation protocol, all DeFi applications will have access to interoperable liquidity and run in complete secrecy. (Ren —The Evolution of a Protocol)
💡 To dumb it down a bit, Ren is the organization's name, REN is the ERC-20 based token, and RenVM is Ren’s core product.

Token use

To put it simply, for now the REN token is used as a work token: it serves as a bond to run a Darknode. What is a Darknode?
RenVM is replicated over thousands of machines that work together to power it, contributing their network bandwidth, their computational power, and their storage capacity. These machines are known as Darknodes.
Darknodes communicate with other Darknodes around the world to keep RenVM running. Without them, there is no virtual machine upon which Ren can exist. RenVM uses Byzantine Fault Tolerant consensus algorithms and secure multiparty computations so that Darknodes can be operated by anyone without needing to trust them. This is what makes RenVM — and by extension, Ren itself — decentralized, trustless, and private. (Ren Documentation)
Ok so now we know what a Darknode is but what is REN Tokens used for?
The decentralized network of Darknodes is permissionless, but to prevent the forging of a large number of identities and to encourage good behavior, a bond of 100,000 REN tokens is required in order to register and run a Darknode. This prevents malicious adversaries from running an unbounded number of Darknodes and overwhelming the network with misbehaving Darknodes. (Ren Documentation)
As stated, to run a Darknode you'll need 100,000 REN. You can liken this to proof-of-stake (PoS) systems, in which you stake a currency to encourage honest behavior in block production. The benefit of running a Darknode is that you are paid transaction fees. Initially the fees were paid in REN tokens, but Darknode operators can now be paid in other cryptocurrencies such as BTC, ETH, ZEC, and other ERC-20 tokens.
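As a rough mental model of that bond gate (a toy sketch with invented names; the actual registry is an Ethereum smart contract maintained by Ren), registration might look like this:

```python
REN_BOND = 100_000  # REN required per Darknode, per the documentation

class DarknodeRegistry:
    """Minimal model: entry is permissionless but gated by the bond."""

    def __init__(self):
        self.bonds = {}                  # operator address -> locked bond

    def register(self, operator: str, bond: int) -> None:
        if bond < REN_BOND:
            raise ValueError("a bond of 100,000 REN is required")
        self.bonds[operator] = bond      # locked while the node runs

    def deregister(self, operator: str) -> int:
        return self.bonds.pop(operator)  # bond released when exiting

registry = DarknodeRegistry()
registry.register("0xOperator1", 100_000)    # accepted
# registry.register("0xOperator2", 50_000)   # rejected: bond too small
```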
The REN token is required to run a Darknode, but it is not required to use RenVM. Say someone wants to acquire BAT with their BTC using a RenVM-enabled DEX; where would they get the REN to perform the transaction? A centralized exchange? That defeats the purpose. A decentralized exchange? They'd need ETH or an ERC-20 token. If you've used Uniswap before, you know you don't need a Uniswap token to swap, send, or pool. Using RenVM is similar: you don't need the REN token.

Why I think Ren is a game-changer

With around 900 million dollars locked up in DeFi, it's no secret that DeFi is booming right now. Go to DeFi Pulse and take a look at which chains the DeFi applications are being built on, go on. What you'll see is that every application is built on Ethereum.
Ethereum has a smart contract platform for business logic, whereas Bitcoin doesn't. You can loan and mint something like DAI without trusting an intermediary; all of this is done on Ethereum. But what if you wanted to use Bitcoin on a DEX to get DAI? Another problem is that Ethereum doesn't know Bitcoin exists, and vice versa. Ren fixes this: you can trade Bitcoin on a DEX like Uniswap against Ethereum, or mint Dai with Bitcoin. Check out this GitHub repo outlining how to enable BTC and ZEC in Uniswap. As a developer of a decentralized exchange myself, I think this is huge. 🔥
You might be saying "We have wrapped Bitcoin", but wBTC isn't actually Bitcoin; it's more like an IOU token or a proxy. An imposter!
You might also be saying "We have Atomic Swaps", but Atomic Swaps are slow, annoying, and complex: the two parties looking to transact have to send a transaction, wait for confirmation, send another transaction, and then wait for confirmation again. What if the internet goes down for one of the parties? What happens if the market gets flooded?
With RenVM it's a single transaction: quick, easy, and you send actual Bitcoin. There is also no need for specialized software; RenVM takes care of it all. Check out these examples of RenVM in action.
https://youtu.be/7nqTpNt4BD0
https://youtu.be/FfQ5gBbhzlc
Also, check out Ren's Chaos Dex where you can swap SAI (DAI) for BTC or ZEC. For more information on Chaos Dex go here. Idk about you but this stuff blows my mind.
Ren is in talks with dApps in the DeFi space, making sure RenVM is aligned with their platforms. There is also documentation and there are integrations for you to build on RenVM. Ren has also been making great strides in inter-blockchain liquidity, recently announcing The Ren Alliance.
The Ren Alliance is a consortium of DeFi companies and/or projects that are helping secure, develop, and utilize RenVM. (Introducing the Ren Alliance)
Also important to note: Ren has top-tier investors in the likes of FGB Capital, Polychain Capital, and Kenetic Capital. But its headlining investor is crypto giant Huobi, which this past July launched its Huobi Cloud platform for OTC desks. Work is being done and a lot of progress is being made.

Why do we even need Interoperability?

I'm gonna yank the text straight from an article by Capgemini which explains the benefits of interoperability quite nicely.
The blockchain-based networks are being built to offer specific capabilities such as making payments, storing and trading assets and others. However, these capabilities are being offered in isolation where these networks don’t talk to each other and cannot share data. Existing centralized systems have evolved to offer the same capabilities in a more integrated way where these systems are able to run end to end transactions seamlessly making it easier for users.
If blockchain-based networks have to make a strong case for their adoption, they have to be able to work with each other and offer this seamless integration of capabilities to their users.
Strong interoperability would give users a much more useful, user-friendly experience. With this interoperability, users will be able to experience the seamless integration of capabilities being offered by the blockchain-based networks. If we have to hypothesize an example, it would look something like this – User will be able to tokenize the asset (e.g. artwork) over Ethereum based DApp, will be able to transfer the tokenized asset to another address over Cardano, and pay any corresponding transaction fees over the bitcoin network. (Capgemini)
The cryptocurrency space is pretty fragmented at times, and there's a lot of tribalism. It can be a bit annoying: one coin makes massive efforts over here and another coin makes massive efforts over there. What if we could combine those efforts into one big force of nature? I think we could take over the world. Which would be huge.
Idk about y’all, but I’m gonna be following the Ren team to see what they come up with. The impact this coin could have is massive. I think Ren is definitely a coin you should look at.
submitted by Raleigh_CA to CryptoCurrency

Scaling Reddit Community Points with Arbitrum Rollup: a piece of cake

Submitted for consideration to The Great Reddit Scaling Bake-Off
Baked by the pastry chefs at Offchain Labs
Please send questions or comments to [email protected]
1. Overview
We're excited to submit Arbitrum Rollup for consideration to The Great Reddit Scaling Bake-Off. Arbitrum Rollup is the only Ethereum scaling solution that supports arbitrary smart contracts without compromising on Ethereum's security or adding points of centralization. For Reddit, this means that Arbitrum can not only scale the minting and transfer of Community Points, but it can foster a creative ecosystem built around Reddit Community Points enabling points to be used in a wide variety of third party applications. That's right -- you can have your cake and eat it too!
Arbitrum Rollup isn't just Ethereum-style. Its Layer 2 transactions are byte-for-byte identical to Ethereum, which means Ethereum users can continue to use their existing addresses and wallets, and Ethereum developers can continue to use their favorite toolchains and development environments out-of-the-box with Arbitrum. Coupling Arbitrum’s tooling-compatibility with its trustless asset interoperability, Reddit not only can scale but can onboard the entire Ethereum community at no cost by giving them the same experience they already know and love (well, certainly know).
To benchmark how Arbitrum can scale Reddit Community Points, we launched the Reddit contracts on an Arbitrum Rollup chain. Since Arbitrum provides full Solidity support, we didn't have to rewrite the Reddit contracts or try to mimic their functionality using an unfamiliar paradigm. Nope, none of that. We launched the Reddit contracts unmodified on Arbitrum Rollup complete with support for minting and distributing points. Like every Arbitrum Rollup chain, the chain included a bridge interface in which users can transfer Community Points or any other asset between the L1 and L2 chains. Arbitrum Rollup chains also support dynamic contract loading, which would allow third-party developers to launch custom ecosystem apps that integrate with Community Points on the very same chain that runs the Reddit contracts.
1.1 Why Ethereum
Perhaps the most exciting benefit of distributing Community Points using a blockchain is the ability to seamlessly port points to other applications and use them in a wide variety of contexts. Applications may include simple transfers such as a restaurant that allows Redditors to spend points on drinks. Or it may include complex smart contracts -- such as placing Community Points as a wager for a multiparty game or as collateral in a financial contract.
The common denominator between all of the fun uses of Reddit points is that it needs a thriving ecosystem of both users and developers, and the Ethereum blockchain is perhaps the only smart contract platform with significant adoption today. While many Layer 1 blockchains boast lower cost or higher throughput than the Ethereum blockchain, more often than not, these attributes mask the reality of little usage, weaker security, or both.
Perhaps another platform with significant usage will rise in the future. But today, Ethereum captures the mindshare of the blockchain community, and for Community Points to provide the most utility, the Ethereum blockchain is the natural choice.
1.2 Why Arbitrum
While Ethereum's ecosystem is unmatched, the reality is that fees are high and capacity is too low to support the scale of Reddit Community Points. Enter Arbitrum. Arbitrum Rollup provides all of the ecosystem benefits of Ethereum, but with orders of magnitude more capacity and at a fraction of the cost of native Ethereum smart contracts. And most of all, we don't change the experience for users. They continue to use the same wallets, addresses, languages, and tools.
Arbitrum Rollup is not the only solution that can scale payments, but it is the only developed solution that can scale both payments and arbitrary smart contracts trustlessly, which means that third party users can build highly scalable add-on apps that can be used without withdrawing money from the Rollup chain. If you believe that Reddit users will want to use their Community Points in smart contracts--and we believe they will--then it makes the most sense to choose a single scaling solution that can support the entire ecosystem, eliminating friction for users.
We view being able to run smart contracts in the same scaling solution as fundamentally critical since if there's significant demand in running smart contracts from Reddit's ecosystem, this would be a load on Ethereum and would itself require a scaling solution. Moreover, having different scaling solutions for the minting/distribution/spending of points and for third party apps would be burdensome for users as they'd have to constantly shuffle their Points back and forth.
2. Arbitrum at a glance
Arbitrum Rollup has a unique value proposition as it offers a combination of features that no other scaling solution achieves. Here we highlight its core attributes.
Decentralized. Arbitrum Rollup is as decentralized as Ethereum. Unlike some other Layer 2 scaling projects, Arbitrum Rollup doesn't have any centralized components or centralized operators who can censor users or delay transactions. Even in non-custodial systems, centralized components provide a risk as the operators are generally incentivized to increase their profit by extracting rent from users often in ways that severely degrade user experience. Even if centralized operators are altruistic, centralized components are subject to hacking, coercion, and potential liability.
Massive Scaling. Arbitrum achieves order of magnitude scaling over Ethereum's L1 smart contracts. Our software currently supports 453 transactions-per-second for basic transactions (at 1616 Ethereum gas per tx). We have a lot of room left to optimize (e.g. aggregating signatures), and over the next several months capacity will increase significantly. As described in detail below, Arbitrum can easily support and surpass Reddit's anticipated initial load, and its capacity will continue to improve as Reddit's capacity needs grow.
Low cost. The cost of running Arbitrum Rollup is quite low compared to L1 Ethereum and other scaling solutions such as those based on zero-knowledge proofs. Layer 2 fees are low, fixed, and predictable and should not be overly burdensome for Reddit to cover. Nobody needs to use special equipment or high-end machines. Arbitrum requires validators, which is a permissionless role that can be run on any reasonable on-line machine. Although anybody can act as a validator, in order to protect against a “tragedy of the commons” and make sure reputable validators are participating, we support a notion of “invited validators” that are compensated for their costs. In general, users pay (low) fees to cover the invited validators’ costs, but we imagine that Reddit may cover this cost for its users. See more on the costs and validator options below.
Ethereum Developer Experience. Not only does Arbitrum support EVM smart contracts, but the developer experience is identical to that of L1 Ethereum contracts and fully compatible with Ethereum tooling. Developers can port existing Solidity apps or write new ones using their favorite and familiar toolchains (e.g. Truffle, Buidler). There are no new languages or coding paradigms to learn.
Ethereum wallet compatibility. Just as in Ethereum, Arbitrum users need only hold keys, but do not have to store any coin history or additional data to protect or access their funds. Since Arbitrum transactions are semantically identical to Ethereum L1 transactions, existing Ethereum users can use their existing Ethereum keys with their existing wallet software such as Metamask.
Token interoperability. Users can easily transfer their ETH, ERC-20 and ERC-721 tokens between Ethereum and the Arbitrum Rollup chain. As we explain in detail below, it is possible to mint tokens in L2 that can subsequently be withdrawn and recognized by the L1 token contract.
Fast finality. Transactions complete with the same finality time as Ethereum L1 (and it's possible to get faster finality guarantees by trading away trust assumptions; see the Arbitrum Rollup whitepaper for details).
Non-custodial. Arbitrum Rollup is a non-custodial scaling solution, so users control their funds/points and neither Reddit nor anyone else can ever access or revoke points held by users.
Censorship Resistant. Since it's completely decentralized, and the Arbitrum protocol guarantees progress trustlessly, Arbitrum Rollup is just as censorship-proof as Ethereum.
Block explorer. The Arbitrum Rollup block explorer allows users to view and analyze transactions on the Rollup chain.
Limitations
Although this is a bake-off, we're not going to sugar coat anything. Arbitrum Rollup, like any Optimistic Rollup protocol, does have one limitation, and that's the delay on withdrawals.
As for the concrete length of the delay, we've done a good deal of internal modeling and have blogged about this as well. Our current modeling suggests a 3-hour delay is sufficient (but as discussed in the linked post there is a tradeoff space between the length of the challenge period and the size of the validators’ deposit).
Note that this doesn't mean that the chain is delayed for three hours. Arbitrum Rollup supports pipelining of execution, which means that validators can keep building new states even while previous ones are “in the pipeline” for confirmation. As the challenge delays expire for each update, a new state will be confirmed (read more about this here).
So activity and progress on the chain are not delayed by the challenge period. The only thing that's delayed is the consummation of withdrawals. Recall though that any single honest validator knows immediately (at the speed of L1 finality) which state updates are correct and can guarantee that they will eventually be confirmed, so once a valid withdrawal has been requested on-chain, every honest party knows that the withdrawal will definitely happen. There's a natural place here for a liquidity market in which a validator (or someone who trusts a validator) can provide withdrawal loans for a small interest fee. This is a no-risk business for them as they know which withdrawals will be confirmed (and can force their confirmation trustlessly no matter what anyone else does) but are just waiting for on-chain finality.
3. The recipe: How Arbitrum Rollup works
For a description of the technical components of Arbitrum Rollup and how they interact to create a highly scalable protocol with a developer experience that is identical to Ethereum, please refer to the following documents:
Arbitrum Rollup Whitepaper
Arbitrum academic paper (describes a previous version of Arbitrum)
4. Developer docs and APIs
For full details about how to set up and interact with an Arbitrum Rollup chain or validator, please refer to our developer docs, which can be found at https://developer.offchainlabs.com/.
Note that the Arbitrum version described on that site is older and will soon be replaced by the version we are entering in Reddit Bake-Off, which is still undergoing internal testing before public release.
5. Who are the validators?
As with any Layer 2 protocol, advancing the protocol correctly requires at least one validator (sometimes called block producers) that is honest and available. A natural question is: who are the validators?
Recall that the validator set for an Arbitrum chain is open and permissionless; anyone can start or stop validating at will. (A useful analogy is to full nodes on an L1 chain.) But we understand that even though anyone can participate, Reddit may want to guarantee that highly reputable nodes are validating their chain. Reddit may choose to validate the chain themselves and/or hire third-party validators. To this end, we have begun building a marketplace for validator-for-hire services so that dapp developers can outsource validation services to reputable nodes with high up-time. We've announced a partnership in which Chainlink nodes will provide Arbitrum validation services, and we expect to announce more partnerships shortly with other blockchain infrastructure providers.
Although there is no requirement that validators are paid, Arbitrum’s economic model tracks validators’ costs (e.g. amount of computation and storage) and can charge small fees on user transactions, using a gas-type system, to cover those costs. Alternatively, a single party such as Reddit can agree to cover the costs of invited validators.
6. Reddit Contract Support
Since Arbitrum contracts and transactions are byte-for-byte compatible with Ethereum, supporting the Reddit contracts is as simple as launching them on an Arbitrum chain.
Minting. Arbitrum Rollup supports hybrid L1/L2 tokens which can be minted in L2 and then withdrawn onto the L1. An L1 contract at address A can make a special call to the EthBridge which deploys a "buddy contract" to the same address A on an Arbitrum chain. Since it's deployed at the same address, users can know that the L2 contract is the authorized "buddy" of the L1 contract on the Arbitrum chain.
For minting, the L1 contract is a standard ERC-20 contract which mints and burns tokens when requested by the L2 contract. It is paired with an ERC-20 contract in L2 which mints tokens based on whatever programmer provided minting facility is desired and burns tokens when they are withdrawn from the rollup chain. Given this base infrastructure, Arbitrum can support any smart contract based method for minting tokens in L2, and indeed we directly support Reddit's signature/claim based minting in L2.
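As a mental model of this hybrid L1/L2 flow (a toy sketch with invented names; the real mechanism is the EthBridge plus a pair of ERC-20 contracts deployed at the same address), the invariant is that each token lives on exactly one layer at a time:

```python
class HybridToken:
    """Toy model: mint on L2, withdraw to L1, deposit back to L2."""

    def __init__(self):
        self.l1 = {}  # balances held by the L1 ERC-20 "buddy"
        self.l2 = {}  # balances held by the L2 contract

    def mint_l2(self, user, amount):
        # Whatever programmer-provided minting facility is desired,
        # e.g. Reddit's signature/claim scheme.
        self.l2[user] = self.l2.get(user, 0) + amount

    def withdraw_to_l1(self, user, amount):
        assert self.l2.get(user, 0) >= amount
        self.l2[user] -= amount                        # burn on L2
        self.l1[user] = self.l1.get(user, 0) + amount  # mint on L1

    def deposit_to_l2(self, user, amount):
        assert self.l1.get(user, 0) >= amount
        self.l1[user] -= amount                        # burn on L1
        self.l2[user] = self.l2.get(user, 0) + amount  # mint on L2

t = HybridToken()
t.mint_l2("alice", 100)        # claimed on the Rollup chain
t.withdraw_to_l1("alice", 40)  # recognized by the L1 token contract
```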
Batch minting. What's better than a mint cookie? A whole batch! In addition to supporting Reddit’s current minting/claiming scheme, we built a second minting design, which we believe outperforms the signature/claim system in many scenarios.
In the current system, Reddit periodically issues signed statements to users, who then take those statements to the blockchain to claim their tokens. An alternative approach would have Reddit directly submit the list of users/amounts to the blockchain and distribute the tokens to the users without the signature/claim process.
To optimize the cost efficiency of this approach, we designed an application-specific compression scheme to minimize the size of the batch distribution list. We analyzed the data from Reddit's previous distributions and found that the data is highly compressible since token amounts are small and repeated, and addresses appear multiple times. Our function groups transactions by size, and replaces previously-seen addresses with a shorter index value. We wrote client code to compress the data, wrote a Solidity decompressing function, and integrated that function into Reddit’s contract running on Arbitrum.
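A sketch of the idea in Python (the encoding details below are invented for illustration; the production scheme is tuned byte-by-byte to Reddit's distribution data):

```python
from collections import defaultdict

def compress_batch(mints):
    """Group mint events by amount and replace repeated addresses
    with short back-references."""
    groups = defaultdict(list)                 # amount -> recipients
    for addr, amount in mints:
        groups[amount].append(addr)

    index = {}                                 # address -> small integer
    encoded = []
    for amount, addrs in groups.items():
        entries = []
        for a in addrs:
            if a in index:
                entries.append(("ref", index[a]))  # a few bytes
            else:
                index[a] = len(index)
                entries.append(("addr", a))        # full 20-byte address
        encoded.append((amount, entries))
    return encoded

# Two distinct amounts, one repeated address:
print(compress_batch([("0xA1", 100), ("0xB2", 100), ("0xA1", 250)]))
```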
When we ran the compression function on the previous Reddit distribution data, we found that we could compress batched minting data down to 11.8 bytes per minting event (averaged over a 6-month trace of Reddit’s historical token grants), compared with roughly 174 bytes of on-chain data needed for the signature/claim approach to minting (roughly 43 for an RLP-encoded null transaction + 65 for Reddit's signature + 65 for the user's signature + roughly 8 for the number of Points).
The relative benefit of the two approaches with respect to on-chain call data cost depends on the percentage of users that will actually claim their tokens on chain. With the above figures, batch minting will be cheaper if roughly 5% of users redeem their claims. We stress that our compression scheme is not Arbitrum-specific and would be beneficial in any general-purpose smart contract platform.
8. Benchmarks and costs
In this section, we give the full costs of operating the Reddit contracts on an Arbitrum Rollup chain including the L1 gas costs for the Rollup chain, the costs of computation and storage for the L2 validators as well as the capital lockup requirements for staking.
Arbitrum Rollup is still on testnet, so we did not run mainnet benchmarks. Instead, we measured the L1 gas cost and L2 workload for Reddit operations on Arbitrum and calculated the total cost assuming current Ethereum gas prices. As noted below in detail, our measurements do not assume that Arbitrum is consuming the entire capacity of Ethereum. We will present the details of our model now, but for full transparency you can also play around with it yourself and adjust the parameters, by copying the spreadsheet found here.
Our cost model is based on measurements of Reddit’s contracts, running unmodified (except for the addition of a batch minting function) on Arbitrum Rollup on top of Ethereum.
On the distribution of transactions and frequency of assertions. Reddit's instructions specify the following minimum parameters that submissions should support:
Over a 5 day period, your scaling PoC should be able to handle:
  • 100,000 point claims (minting & distributing points)
  • 25,000 subscriptions
  • 75,000 one-off points burning
  • 100,000 transfers
We provide the full costs of operating an Arbitrum Rollup chain with this usage under the assumption that tokens are minted or granted to users in batches, but other transactions are uniformly distributed over the 5 day period. Unlike some other submissions, we do not make the unrealistic assumption that all operations can be submitted in enormous batches. We assume that batch minting is done in batches that use only a few percent of an L1 block’s gas, and that other operations come in evenly over time and are submitted in batches, with one batch every five minutes to keep latency reasonable. (Users are probably already waiting for L1 finality, which takes at least that long to achieve.)
We note that assuming that there are only 300,000 transactions that arrive uniformly over the 5 day period will make our benchmark numbers lower, but we believe that this will reflect the true cost of running the system. To see why, say that batches are submitted every five minutes (20 L1 blocks) and there's a fixed overhead of c bytes of calldata per batch, the cost of which will get amortized over all transactions executed in that batch. Assume that each individual transaction adds a marginal cost of t. Lastly, assume the capacity of the scaling system is high enough that it can support all of Reddit's 300,000 transactions within a single 20-block batch (i.e. that there are more than c + 300,000*t bytes of calldata available in 20 blocks).
Consider what happens if c, the per-batch overhead, is large (which it is in some systems, but not in Arbitrum). In the scenario that transactions actually arrive at the system's capacity and each batch is full, then c gets amortized over 300,000 transactions. But if we assume that the system is not running at capacity--and only receives 300,000 transactions arriving uniformly over 5 days-- then each 20-block assertion will contain about 200 transactions, and thus each transaction will pay a nontrivial cost due to c.
We are aware that other proposals presented scaling numbers assuming that 300,000 transactions arrived at maximum capacity and was executed in a single mega-transaction, but according to our estimates, for at least one such report, this led to a reported gas price that was 2-3 orders of magnitude lower than it would have been assuming uniform arrival. We make more realistic batching assumptions, and we believe Arbitrum compares well when batch sizes are realistic.
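The amortization argument fits in a few lines (illustrative constants only; c and t are the per-batch overhead and marginal per-transaction calldata defined above):

```python
def per_tx_calldata(c, t, txs_per_batch):
    """Per-transaction calldata bytes: fixed overhead c amortized
    over the batch, plus marginal cost t per transaction."""
    return c / txs_per_batch + t

# 300,000 transactions over 5 days, one batch every 5 minutes
# => 1,440 batches of ~208 transactions each.
print(per_tx_calldata(c=5_000, t=20, txs_per_batch=208))      # ~44 bytes
print(per_tx_calldata(c=5_000, t=20, txs_per_batch=300_000))  # ~20 bytes
```

With a large c, the uniform-arrival figure is far worse than the single-mega-batch figure, which is exactly the distortion described above.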
Our model. Our cost model includes several sources of cost (a toy calculation follows the list):
  • L1 gas costs: This is the cost of posting transactions as calldata on the L1 chain, as well as the overhead associated with each batch of transactions, and the L1 cost of settling transactions in the Arbitrum protocol.
  • Validator’s staking costs: In normal operation, one validator will need to be staked. The stake is assumed to be 0.2% of the total value of the chain (which is assumed to be $1 per user who is eligible to claim points). The cost of staking is the interest that could be earned on the money if it were not staked.
  • Validator computation and storage: Every validator must do computation to track the chain’s processing of transactions, and must maintain storage to keep track of the contracts’ EVM storage. The cost of computation and storage are estimated based on measurements, with the dollar cost of resources based on Amazon Web Services pricing.
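To show the shape of the model (all parameter values below are placeholders, not our measurements; the real numbers live in the linked spreadsheet):

```python
def total_cost_usd(calldata_bytes, gas_per_byte, gas_price_gwei, eth_usd,
                   stake_usd, interest_rate, days, compute_storage_usd):
    """Sum the three cost sources listed above (toy parameterization)."""
    # 1. L1 gas for calldata posted on-chain.
    l1 = calldata_bytes * gas_per_byte * gas_price_gwei * 1e-9 * eth_usd
    # 2. Interest foregone on the validator's locked stake.
    staking = stake_usd * interest_rate * days / 365
    # 3. Validator compute and storage (e.g. AWS pricing).
    return l1 + staking + compute_storage_usd

# Placeholder run: a $200 stake (0.2% of a $100,000 chain) locked for
# 5 days at 5% interest, plus 10 MB of calldata at 16 gas/byte, 50 gwei:
print(total_cost_usd(10_000_000, 16, 50, 400, 200, 0.05, 5, 25.0))
```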
It’s clear from our modeling that the predominant cost is for L1 calldata. This will probably be true for any plausible rollup-based system.
Our model also shows that Arbitrum can scale to workloads much larger than Reddit’s nominal workload, without exhausting L1 or L2 resources. The scaling bottleneck will ultimately be calldata on the L1 chain. We believe that cost could be reduced substantially if necessary by clever encoding of data. (In our design any compression / decompression of L2 transaction calldata would be done by client software and L2 programs, never by an L1 contract.)
9. Status of Arbitrum Rollup
Arbitrum Rollup is live on Ethereum testnet. All of the code written to date including everything included in the Reddit demo is open source and permissively licensed under the Apache V2 license. The first testnet version of Arbitrum Rollup was released on testnet in February. Our current internal version, which we used to benchmark the Reddit contracts, will be released soon and will be a major upgrade.
Both the Arbitrum design as well as the implementation are heavily audited by independent third parties. The Arbitrum academic paper was published at USENIX Security, a top-tier peer-reviewed academic venue. For the Arbitrum software, we have engaged Trail of Bits for a security audit, which is currently ongoing, and we are committed to have a clean report before launching on Ethereum mainnet.
10. Reddit Universe Arbitrum Rollup Chain
The benchmarks described in this document were all measured using the latest internal build of our software. When we release the new software upgrade publicly we will launch a Reddit Universe Arbitrum Rollup chain as a public demo, which will contain the Reddit contracts as well as a Uniswap instance and a Connext Hub, demonstrating how Community Points can be integrated into third party apps. We will also allow members of the public to dynamically launch ecosystem contracts. We at Offchain Labs will cover the validating costs for the Reddit Universe public demo.
If the folks at Reddit would like to evaluate our software prior to our public demo, please email us at [email protected] and we'd be more than happy to provide early access.
11. Even more scaling: Arbitrum Sidechains
Rollups are an excellent approach to scaling, and we are excited about Arbitrum Rollup which far surpasses Reddit's scaling needs. But looking forward to Reddit's eventual goal of supporting hundreds of millions of users, there will likely come a time when Reddit needs more scaling than any Rollup protocol can provide.
While Rollups greatly reduce costs, they don't break the linear barrier. That is, all transactions have an on-chain footprint (because all calldata must be posted on-chain), albeit a far smaller one than on native Ethereum, and the L1 limitations end up being the bottleneck for capacity and cost. Since Ethereum has limited capacity, this linear use of on-chain resources means that costs will eventually increase superlinearly with traffic.
The good news is that we at Offchain Labs have a solution in our roadmap that can satisfy this extreme-scaling setting as well: Arbitrum AnyTrust Sidechains. Arbitrum Sidechains are similar to Arbitrum Rollup, but deviate in that they name a permissioned set of validators. When a chain’s validators agree off-chain, they can greatly reduce the on-chain footprint of the protocol and require almost no data to be put on-chain. When validators can't reach unanimous agreement off-chain, the protocol reverts to Arbitrum Rollup. Technically, Arbitrum Sidechains can be viewed as a hybrid between state channels and Rollup, switching back and forth as necessary, and combining the performance and cost that state channels can achieve in the optimistic case, with the robustness of Rollup in other cases. The core technical challenge is how to switch seamlessly between modes and how to guarantee that security is maintained throughout.
Arbitrum Sidechains break through this linear barrier, while still maintaining a high level of security and decentralization. Arbitrum Sidechains provide the AnyTrust guarantee, which says that as long as any one validator is honest and available (even if you don't know which one will be), the L2 chain is guaranteed to execute correctly according to its code and guaranteed to make progress. Unlike in a state channel, offchain progress does not require unanimous consent, and liveness is preserved as long as there is a single honest validator.
Note that the trust model for Arbitrum Sidechains is much stronger than for typical BFT-style chains which introduce a consensus "voting" protocols among a small permissioned group of validators. BFT-based protocols require a supermajority (more than 2/3) of validators to agree. In Arbitrum Sidechains, by contrast, all you need is a single honest validator to achieve guaranteed correctness and progress. Notice that in Arbitrum adding validators strictly increases security since the AnyTrust guarantee provides correctness as long as any one validator is honest and available. By contrast, in BFT-style protocols, adding nodes can be dangerous as a coalition of dishonest nodes can break the protocol.
Like Arbitrum Rollup, the developer and user experiences for Arbitrum Sidechains will be identical to that of Ethereum. Reddit would be able to choose a large and diverse set of validators, and all that they would need to guarantee to break through the scaling barrier is that a single one of them will remain honest.
We hope to have Arbitrum Sidechains in production in early 2021, and thus when Reddit reaches the scale that surpasses the capacity of Rollups, Arbitrum Sidechains will be waiting and ready to help.
While the idea to switch between channels and Rollup to get the best of both worlds is conceptually simple, getting the details right and making sure that the switch does not introduce any attack vectors is highly non-trivial and has been the subject of years of our research (indeed, we were working on this design for years before the term Rollup was even coined).
12. How Arbitrum compares
We include a comparison to several other categories, as well as specific projects when appropriate, and explain why we believe that Arbitrum is best suited for Reddit's purposes. We focus our attention on other Ethereum projects.
Payment only Rollups. Compared to Arbitrum Rollup, ZK-Rollups and other Rollups that only support token transfers have several disadvantages:
  • As outlined throughout the proposal, we believe that the entire draw of Ethereum is in its rich smart contracts support which is simply not achievable with today's zero-knowledge proof technology. Indeed, scaling with a ZK-Rollup will add friction to the deployment of smart contracts that interact with Community Points as users will have to withdraw their coins from the ZK-Rollup and transfer them to a smart contract system (like Arbitrum). The community will be best served if Reddit builds on a platform that has built-in, frictionless smart-contract support.
  • All other Rollup protocols of which we are aware employ a centralized operator. While it's true that users retain custody of their coins, the centralized operator can often profit from censoring, reordering, or delaying transactions. A common misconception is that since they're non-custodial protocols, a centralized sequencer does not pose a risk but this is incorrect as the sequencer can wreak havoc or shake down users for side payments without directly stealing funds.
  • Sidechain type protocols can eliminate some of these issues, but they are not trustless. Instead, they require trust in some quorum of a committee, often requiring two-third of the committee to be honest, compared to rollup protocols like Arbitrum that require only a single honest party. In addition, not all sidechain type protocols have committees that are diverse, or even non-centralized, in practice.
  • Plasma-style protocols have a centralized operator and do not support general smart contracts.
13. Concluding Remarks
While it's ultimately up to the judges’ palate, we believe that Arbitrum Rollup is the bakeoff choice that Reddit kneads. We far surpass Reddit's specified workload requirement at present, have much room to optimize Arbitrum Rollup in the near term, and have a clear path to get Reddit to hundreds of millions of users. Furthermore, we are the only project that gives developers and users the identical interface as the Ethereum blockchain and is fully interoperable and tooling-compatible, and we do this all without any new trust assumptions or centralized components.
But no matter how the cookie crumbles, we're glad to have participated in this bake-off and we thank you for your consideration.
About Offchain Labs
Offchain Labs, Inc. is a venture-funded New York company that spun out of Princeton University research, and is building the Arbitrum platform to usher in the next generation of scalable, interoperable, and compatible smart contracts. Offchain Labs is backed by Pantera Capital, Compound VC, Coinbase Ventures, and others.
Leadership Team
Ed Felten
Ed Felten is Co-founder and Chief Scientist at Offchain Labs. He is on leave from Princeton University, where he is the Robert E. Kahn Professor of Computer Science and Public Affairs. From 2015 to 2017 he served at the White House as Deputy United States Chief Technology Officer and senior advisor to the President. He is an ACM Fellow and member of the National Academy of Engineering. Outside of work, he is an avid runner, cook, and L.A. Dodgers fan.
Steven Goldfeder
Steven Goldfeder is Co-founder and Chief Executive Officer at Offchain Labs. He holds a PhD from Princeton University, where he worked at the intersection of cryptography and cryptocurrencies including threshold cryptography, zero-knowledge proof systems, and post-quantum signatures. He is a co-author of Bitcoin and Cryptocurrency Technologies, the leading textbook on cryptocurrencies, and he has previously worked at Google and Microsoft Research, where he co-invented the Picnic signature algorithm. When not working, you can find Steven spending time with his family, taking a nature walk, or twisting balloons.
Harry Kalodner
Harry Kalodner is Co-founder and Chief Technology Officer at Offchain Labs where he leads the engineering team. Before the company he attended Princeton as a Ph.D candidate where his research explored economics, anonymity, and incentive compatibility of cryptocurrencies, and he also has worked at Apple. When not up at 3:00am writing code, Harry occasionally sleeps.
submitted by hkalodner to ethereum

Threshold Signature Explained— Bringing Exciting Applications with TSS

— A deep dive into threshold signature without mathematics, by ARPA’s cryptographer Dr. Alex Su
Threshold signature is a distributed multi-party signature protocol that includes distributed key generation, signing, and verification algorithms.
In recent years, with the rapid development of blockchain technology, signature algorithms have gained widespread attention in both academic research and real-world applications. Properties such as security, practicality, scalability, and decentralization have been examined in depth.
Because blockchain and signatures are so closely connected, the development of signature algorithms and the introduction of new signature paradigms directly affect the characteristics and efficiency of blockchain networks.
In addition, the key-management needs of institutional and personal accounts on distributed ledgers have spawned many wallet applications, and this change has also affected traditional enterprises. Whether in blockchain or in traditional financial institutions, threshold signature schemes can improve security and privacy in various scenarios. As an emerging technology, threshold signatures are still the subject of academic research and discussion, with unverified security risks and practical problems remaining.
This article starts from the technical rationale, covering the relevant cryptography and its relation to blockchain. We then compare multi-party computation and threshold signatures, and discuss the pros and cons of different signature paradigms. At the end there is a list of threshold signature use cases, so that the reader can quickly get up to speed on threshold signatures.
I. Cryptography in Daily Life
Before introducing threshold signatures, let’s get a general understanding of cryptography. How does cryptography protect digital information? How do you create an identity in the digital world? At the very beginning, people wanted secure storage and transmission. Once someone creates a key, he can use symmetric encryption to store secrets. If two people have the same key, they can communicate securely: the king encrypts a command, and the general decrypts it with the corresponding key.
But when two people have no secure channel to use, how can they create a shared key? The key exchange protocol was born for this. Analogously, if the king issues an order to all the people in the digital world, how can everyone prove that the sentence originated from the king? For this, the digital signature protocol was invented. Both protocols are based on public key cryptography, also known as asymmetric cryptography.


The “Tiger Rune” (tiger tally) was a troop deployment tool used by ancient emperors: a bronze or gold token in the shape of a tiger, split in half, with one half given to a general and the other half kept by the emperor. Only when the two halves were combined would the holder gain the right to dispatch troops.
Symmetric and asymmetric encryption constitute the main components of modern cryptography. Both have three fixed parts: key generation, encryption, and decryption. Here, we focus on digital signature protocols. The key generation process produces a pair of associated keys: a public key and a private key. The public key is open to everyone, while the private key represents the identity and is revealed only to its owner; whoever owns the private key owns the identity it represents. The signing algorithm takes the private key as input and generates a signature on a piece of information. The verification algorithm uses the public key to verify the validity of the signature and the correctness of the information.
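The three fixed parts map directly onto a few lines of code; here is a minimal sketch using Ed25519 signatures from the third-party pyca/cryptography package:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()  # key generation
public_key = private_key.public_key()       # published to everyone

message = b"attack at dawn"
signature = private_key.sign(message)       # only the key owner can sign

try:
    public_key.verify(signature, message)   # anyone can verify
    print("signature valid")
except InvalidSignature:
    print("signature invalid")
```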
II. Signature in the Blockchain
Turning to blockchain: it uses a consensus algorithm to construct a distributed ledger, and signatures provide identity information for the blockchain. Every transaction on the blockchain is identified by the signature of the transaction initiator, and the blockchain can verify the signature according to specific rules to check transaction validity, all thanks to the immutability and verifiability of signatures.
Blockchain uses more cryptography than just signature protocols; the Proof-of-Work consensus algorithm, for instance, relies on hash functions. Blockchain provides an infrastructure layer of consensus and transactions, on top of which novel cryptographic protocols such as secure multi-party computation, zero-knowledge proofs, and homomorphic encryption thrive. For example, secure multi-party computation, which is naturally adapted to distributed networks, can build secure data-transfer and machine-learning platforms on the blockchain, while the special nature of zero-knowledge proofs makes verifiable anonymous transactions feasible. The combination of these cutting-edge cryptographic protocols and blockchain technology will drive the development of the digital world in the next decade, leading to secure data sharing, privacy protection, and applications not yet imagined.
III. Secure Multi-party Computation and Threshold Signature
Having introduced how the digital signature protocol affects our lives and how it helps the blockchain establish identities and record transactions, we now turn to secure multi-party computation (MPC), from which we can see how threshold signatures achieve decentralization. For more about MPC, please refer to our previous posts, which detail the technical background and application scenarios.
MPC, by definition, is a computation that several participants execute jointly and securely. Security here means that in one computation, every participant provides a private input and can obtain the result, yet cannot learn any of the private inputs of the other parties. In 1982, when Prof. Andrew Yao proposed the concept of MPC, he gave an example called the “Millionaires’ Problem”: two millionaires want to know who is richer without telling each other the true amount of their assets. Specifically, secure multi-party computation cares about the following properties (a toy sketch follows the list):
  • Privacy: No participant can obtain the private inputs of other participants, except for whatever can be inferred from the computation results.
  • Correctness and verifiability: The computation should execute correctly, and the legitimacy and correctness of the process should be verifiable by participants or third parties.
  • Fairness or robustness: Unless agreed otherwise in advance, all parties involved in the computation should obtain the result at the same time, or no one should obtain it.
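To make the privacy property concrete, here is a toy additive secret-sharing sketch in plain Python. The modulus, inputs, and party count are illustrative assumptions; a real MPC protocol additionally handles communication, authentication, and malicious participants:

```python
import secrets

Q = 2**61 - 1  # a Mersenne prime used as the sharing modulus (toy size)

def share(x, n):
    """Split x into n random additive shares that sum to x mod Q."""
    parts = [secrets.randbelow(Q) for _ in range(n - 1)]
    parts.append((x - sum(parts)) % Q)
    return parts

inputs = [42, 17, 99]                 # each party's private input
n = len(inputs)

# Party i deals one share of its input to each party j.
dealt = [share(x, n) for x in inputs]

# Each party locally sums the shares it received (one "column" each).
partial_sums = [sum(dealt[i][j] for i in range(n)) % Q for j in range(n)]

# Publishing only the partial sums reveals the total, not any single input.
assert sum(partial_sums) % Q == sum(inputs) % Q
```

Each individual share is uniformly random, so no single column of shares leaks anything about a particular input beyond what the final sum itself implies.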
Supposing we use secure multi-party computation to produce a digital signature in the general sense, we would proceed as follows (a toy key generation sketch follows this list):
  • Key generation phase: all future participants are involved and jointly do two things: 1) each party generates its own secret private key share; 2) the public key is computed from the set of private key shares.
  • Signature phase: the participants in a given signature use their private key shares as private inputs, and the message to be signed as a public input, to jointly perform the signing computation and obtain a signature. In this process, the privacy of secure multi-party computation protects the private key shares, while correctness and robustness guarantee that the signature is unforgeable and that every participant obtains it.
  • Verification phase: the signature is verified against the corresponding public key exactly as in the traditional algorithm. There is no secret input during verification, so verification can be performed without multi-party computation, which turns out to be an advantage of MPC-style distributed signatures.
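Here is a toy sketch of the key generation phase, assuming a discrete-log group with illustrative (insecure) parameters. Each party keeps its share secret and publishes only a public contribution, from which the joint public key is derived; the combined private key never exists anywhere:

```python
import secrets

p = 2**127 - 1   # a Mersenne prime; toy parameters, NOT secure sizes
q = p - 1        # modulus for exponent arithmetic
g = 3            # fixed group element used as the generator

n = 3  # number of participants

# 1) Each party samples a private key share locally and never reveals it.
shares = [secrets.randbelow(q) for _ in range(n)]

# 2) Each party broadcasts only g^share; the joint public key is their product.
contributions = [pow(g, s, p) for s in shares]
joint_pub = 1
for c in contributions:
    joint_pub = (joint_pub * c) % p

# Sanity check, done here only because this is a demo. In a real protocol
# the combined secret exponent is never assembled in one place:
combined_secret = sum(shares) % q
assert joint_pub == pow(g, combined_secret, p)
```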
The signature protocol constructed on this idea of secure multi-party computation is the threshold signature. Note that we have omitted some details, because secure multi-party computation is really a collective name for a family of cryptographic protocols: different security assumptions and threshold settings lead to different constructions, and threshold signatures in different settings therefore have distinctive properties. This article will not cover each setting, but the next section compares threshold signatures with other signature schemes.
IV. Single Signature, Multi-Signature and Threshold Signature
Besides the threshold signature, what other methods can we choose?
Bitcoin at the beginning used single signatures: each account is allocated one private key, and a message signed by that key is considered legitimate. Later, to avoid single points of failure and to support accounts managed by multiple people, Bitcoin added a multi-signature function. Multi-signature can be understood simply: each account owner signs in turn, all signatures are posted to the chain, and the chain verifies them in order. When the specified conditions are met, the transaction is legitimate. This achieves control by multiple private keys.
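A toy m-of-n check, sketched below under the same assumptions as the earlier Ed25519 example, captures the essence of on-chain multi-signature verification: every submitted signature is checked individually, which is why fees and verification time grow with the number of signers.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def count_valid(message, signatures, public_keys):
    """Verify each (signature, key) pair in order and count the valid ones."""
    valid = 0
    for sig, pub in zip(signatures, public_keys):
        try:
            pub.verify(sig, message)
            valid += 1
        except InvalidSignature:
            pass
    return valid

keys = [Ed25519PrivateKey.generate() for _ in range(3)]
pubs = [k.public_key() for k in keys]
msg = b"spend 1 coin to address X"

# Only the first two owners sign; the third slot carries a dummy signature.
sigs = [keys[0].sign(msg), keys[1].sign(msg), b"\x00" * 64]

# A 2-of-3 policy: the transaction is legitimate once two signatures verify.
assert count_valid(msg, sigs, pubs) >= 2
```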
So, what’s the difference between multi-signature and threshold signature?
Several constraints of multi-signature are:
  1. The access structure is not flexible. Once an account’s access structure is given, that is, which sets of private keys can complete a legal signature, it cannot be adjusted at a later stage, for example when a participant withdraws or a new party joins. If you must change it, you need to repeat the initial setup process, which also changes the public key and the account address.
  2. Lower efficiency. First, on-chain verification consumes the computing power of all nodes and therefore requires a processing fee; verifying multiple signatures costs as much as verifying that many single signatures. Second, performance: the verification obviously takes more time.
  3. It requires smart contract support and algorithm adaptation that varies from chain to chain, because multi-signature is not natively supported everywhere. Given the possible vulnerabilities in smart contracts, this support is considered risky.
  4. No anonymity, which is not trivially a disadvantage or an advantage, since anonymity is only required in specific settings. Anonymity here means that multi-signature directly exposes all participating signers of a transaction.
Correspondingly, the threshold signature has the following features (a toy joint-signing sketch follows the list):
  1. The access structure is flexible. Through an additional multi-party computation, the existing private key shares can be extended to assign shares to new participants. This process exposes neither the old nor the newly generated shares, and it changes neither the public key nor the account address.
  2. It is more efficient. On chain, a threshold signature is indistinguishable from a single signature, which brings several improvements: a) verification is the same as for a single signature and needs no additional fee; b) the signers’ identities stay invisible, because other nodes verify against the same single public key; c) no on-chain smart contract is needed to provide additional support.
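Continuing the toy discrete-log setup from the key generation sketch above, the following sketch shows an n-of-n Schnorr-style joint signature. The parameters are illustrative assumptions, a true t-of-n threshold needs share structure on top of this, and production schemes (especially threshold ECDSA) are far more intricate. Its point is feature 2: the combined signature verifies against the single joint public key, exactly like an ordinary signature.

```python
import hashlib
import secrets

p = 2**127 - 1   # toy Mersenne-prime modulus, NOT a secure size
q = p - 1
g = 3

def H(*parts):
    """Hash group elements and the message into a challenge scalar."""
    h = hashlib.sha256()
    for part in parts:
        h.update(str(part).encode())
    return int.from_bytes(h.digest(), "big") % q

n = 3
x = [secrets.randbelow(q) for _ in range(n)]      # private key shares
y = 1
for xi in x:
    y = (y * pow(g, xi, p)) % p                    # joint public key

msg = "pay 1 coin to Bob"

# Each signer picks a local nonce and publishes only g^nonce.
k = [secrets.randbelow(q) for _ in range(n)]
R = 1
for ki in k:
    R = (R * pow(g, ki, p)) % p

e = H(R, msg)  # shared challenge

# Each signer computes a partial signature locally; only these are combined.
partials = [(ki + e * xi) % q for ki, xi in zip(k, x)]
s = sum(partials) % q

# Verification needs only (R, s) and the joint public key. It looks exactly
# like checking a single Schnorr signature: g^s == R * y^e (mod p).
assert pow(g, s, p) == (R * pow(y, e, p)) % p
```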
In addition to the above discussion, there are distributed signature schemes based on Shamir secret sharing. Secret sharing has a long history: it is used to slice information for storage and to provide error correction, from the underlying algorithms of secure computation to the error correction of discs. The technology has always played an important role, but the main problem is that, when used in a signature protocol, Shamir secret sharing needs to recover the master private key.
With multi-signature or threshold signatures, by contrast, the master private key is never reconstructed. With Shamir sharing it must be rebuilt, even if only briefly in memory or a cache, and for vital accounts even that short-lived reconstruction is not tolerable.
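The sketch below, a minimal Shamir split/reconstruct over a toy prime field (parameters illustrative only), makes the objection concrete: the reconstruct step materializes the whole master secret in one place, which is exactly what threshold signing avoids.

```python
import secrets

P = 2**127 - 1  # prime field modulus (toy size)

def split(secret, n, t):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
    f = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0: the master secret EXISTS here."""
    secret = 0
    for xj, yj in shares:
        num, den = 1, 1
        for xm, _ in shares:
            if xm != xj:
                num = (num * -xm) % P
                den = (den * (xj - xm)) % P
        secret = (secret + yj * num * pow(den, P - 2, P)) % P
    return secret

master = secrets.randbelow(P)
shares = split(master, n=5, t=3)
assert reconstruct(shares[:3]) == master  # 3 shares rebuild the key in one place
```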
V. Limitations
Just like other secure multi-party computation protocols, the introduction of additional participants makes the security model different from that of traditional point-to-point encrypted transmission. Collusion and malicious participants were problems that earlier algorithms did not need to take into account; but the behavior of physical entities cannot be restricted, and wrongdoers may well be present among the participants.
Therefore, multi-party cryptographic protocols cannot automatically obtain the security strength of traditional ones. Effort is still needed to develop threshold signature applications, integrate them with existing infrastructure, and test the true strength of threshold signature schemes.
VI. Scenarios
1. Key Management
Using threshold signatures in a key management system enables more flexible administration, as in ARPA’s enterprise key management API. One can use the access structure to design authorization patterns for users with different privileges. In addition, when new entities join, the threshold scheme can quickly refresh the key shares; the same operation can be performed periodically to raise the difficulty of hacking multiple private key shares at the same time (see the sketch below). Finally, because a threshold signature looks no different from a traditional signature to the verifier, it is compatible with existing equipment and reduces the update cost. ARPA’s enterprise key management modules already support the elliptic curve signature parameters secp256k1 and ed25519, and will be compatible with more parameters in the future.
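A proactive share refresh can be illustrated in the same toy additive-sharing setting used earlier (illustrative parameters, not a production protocol): the parties add deltas that sum to zero, so every share changes while the joint secret and the public key stay fixed.

```python
import secrets

p = 2**127 - 1   # toy Mersenne-prime modulus, NOT a secure size
q = p - 1
g = 3
n = 3

shares = [secrets.randbelow(q) for _ in range(n)]
pubkey = pow(g, sum(shares) % q, p)

# Refresh: random deltas summing to zero re-randomize every share.
deltas = [secrets.randbelow(q) for _ in range(n - 1)]
deltas.append((-sum(deltas)) % q)
new_shares = [(s + d) % q for s, d in zip(shares, deltas)]

# An attacker must now compromise the NEW shares; stolen old shares go stale.
assert new_shares != shares
assert pow(g, sum(new_shares) % q, p) == pubkey
```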

https://preview.redd.it/c27zuuhdl0q41.png?width=757&format=png&auto=webp&s=26d46e871dadbbd4e3bea74d840e0198dec8eb1c
2. Crypto Wallet
Wallets based on threshold signatures are more secure because the private key never needs to be rebuilt. And since the individual signatures are not all posted publicly, a degree of anonymity can be achieved. Compared with multi-signature, threshold signatures also require lower transaction fees. As with the key management applications, the administration of digital asset accounts becomes more flexible. Furthermore, a threshold signature wallet can support blockchains that do not natively support multi-signature, which avoids the risk of smart contract bugs.

Conclusion

This article describes why people need threshold signatures and the inspiring properties they bring: higher security, more flexible control, and a more efficient verification process. In practice, different signature technologies suit different application scenarios, such as aggregate signatures, not covered here, and BLS-based multi-signatures. Readers are also encouraged to read more about secure multi-party computation. Secure computation is the holy grail of cryptographic protocols and can accomplish much more than threshold signatures; in the near future, it will solve more concrete application problems in the digital world.

About Author

Dr. Alex Su works at ARPA as a cryptography researcher. He received his Bachelor’s degree in Electronic Engineering and his Ph.D. in Cryptography from Tsinghua University. Dr. Su’s research interests include multi-party computation and the implementation and acceleration of post-quantum cryptography.

About ARPA

ARPA is committed to providing secure data transfer solutions based on cryptographic operations for businesses and individuals.
The ARPA secure multi-party computing network can be used as a protocol layer to implement privacy computing capabilities for public chains, and it enables developers to build efficient, secure, and data-protected business applications on private smart contracts. Enterprise and personal data can, therefore, be analyzed securely on the ARPA computing network without fear of exposing the data to any third party.
ARPA’s multi-party computation technology supports secure data markets, precision marketing, credit score calculations, and even the secure monetization of personal data.
ARPA’s core team is international, with PhDs in cryptography from Tsinghua University; experienced systems engineers from Google, Uber, Amazon, Huawei, and Mitsubishi; blockchain experts from the University of Tokyo, AIG, and the World Bank; data scientists from CircleUp; and financial and data professionals from Fosun and Fidelity Investments.
For more information about ARPA, or to join our team, please contact us at [email protected].
Learn about ARPA’s recent official news:
Telegram (English): https://t.me/arpa_community
Telegram (Việt Nam): https://t.me/ARPAVietnam
Telegram (Russian): https://t.me/arpa_community_ru
Telegram (Indonesian): https://t.me/Arpa_Indonesia
Telegram (Thai): https://t.me/Arpa_Thai
Telegram (Philippines):https://t.me/ARPA_Philippines
Telegram (Turkish): https://t.me/Arpa_Turkey
Korean Chats: https://open.kakao.com/o/giExbhmb (Kakao) & https://t.me/arpakoreanofficial (Telegram, new)
Medium: https://medium.com/@arpa
Twitter: u/arpaofficial
Reddit: https://www.reddit.com/arpachain/
Facebook: https://www.facebook.com/ARPA-317434982266680/54
submitted by arpaofficial to u/arpaofficial [link] [comments]

Constructing an Opt-In alternative reward for securing the blockchain

Since a keyboard with a monero logo got upvoted to the top I realized I should post various thoughts I have and generate some discussion. I hope others do the same.
Monero is currently secured by a dwindling block reward. There is a chance that the tail emission reward plus transaction fees could become insufficient to secure the blockchain, allowing a scenario where it is profitable for someone to execute a 51% attack.
To understand this issue better, read this:
In Game Theory, Tragedy of the Commons is a market failure scenario where a common good is produced in lower quantities than the public desires, or consumed in greater quantities than desired. One example is pollution: it is in the public's best interest not to pollute, but every individual has an incentive to pollute (e.g. because burning fossil fuel is cheap, and individually each consumer doesn't affect the environment much).
The relevance to Bitcoin is a hypothetical market failure that might happen in the far future when the block reward from mining drops near zero. In the current Bitcoin design, the only fees miners earn at this time are transaction fees. Miners will accept transactions with any fees (because the marginal cost of including them is minimal) and users will pay lower and lower fees (on the order of satoshis). It is possible that the honest miners will be under-incentivized, and that too few miners will mine, resulting in lower difficulty than what the public desires. This might mean various 51% attacks will happen frequently, and Bitcoin will not function correctly.
The Bitcoin protocol can be altered to combat this problem. One proposed solution is Dominant Assurance Contracts. Another more radical proposal (in the sense that the required change won't be accepted by most bitcoiners) is to have a perpetual reward that is constant in proportion to the monetary base. That can be achieved in two ways: an ever increasing reward (inflatacoin/expocoin), or a constant reward plus a demurrage fee on all funds that caps the monetary base (freicoin). This scenario was discussed in several threads: Tragedy of the Commons - Disturbingly low future difficulty equilibrium https://bitcointalk.org/index.php?topic=6284.0 - Stack Exchange http://bitcoin.stackexchange.com/questions/3111/will-bitcoin-suffer-from-a-mining-tragedy-of-the-commons-when-mining-fees-drop-t Currently there is no consensus whether this problem is real, and if so, what is the best solution.
Source: https://en.bitcoin.it/wiki/Tragedy_of_the_Commons
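To make the under-incentivization scenario concrete, here is a back-of-the-envelope sketch in Python. Every number is a hypothetical placeholder; the point is only the shape of the comparison between rentable attack cost and double-spend gain:

```python
# All figures are hypothetical, chosen only to illustrate the comparison.
honest_hashrate = 1_000_000        # hashes/s currently securing the chain
rent_cost_per_hash_hour = 0.0001   # USD to rent 1 hash/s for one hour
attack_hours = 6                   # time needed to outpace the honest chain

# An attacker needs slightly more than the honest hashrate.
attack_cost = honest_hashrate * 1.01 * rent_cost_per_hash_hour * attack_hours

double_spend_gain = 1_000          # USD the attacker can double-spend

print(f"attack cost ~ ${attack_cost:,.2f}")
print("attack is profitable" if double_spend_gain > attack_cost
      else "attack is not profitable")
```

When the block reward dwindles, honest hashrate falls with it, pushing the attack cost down while the potential double-spend gain stays tied to transaction volume.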

I suspect that the least contentious solution is not to change the code, the emission, or to artificially increase fees (which would actually undermine the tail emission and lead to other problems, I believe: https://freedom-to-tinker.com/2016/10/21/bitcoin-is-unstable-without-the-block-reward/) but rather to use a Dominant Assurance Contract that makes it rational for those who benefit from Monero to contribute to the block reward.

Dominant assurance contracts
Dominant assurance contracts, created by Alex Tabarrok, involve an extra component: an entrepreneur who profits when the quorum is reached and pays the signers extra if it is not. If the quorum is not formed, the signers do not pay their share and indeed actively profit from having participated, since they keep the money the entrepreneur paid them. Conversely, if the quorum succeeds, the entrepreneur is compensated for taking the risk of the quorum failing. Thus, a player benefits whether or not the quorum succeeds: if it fails he reaps a monetary return, and if it succeeds he pays only a small amount more than under an ordinary assurance contract, and the public good is provided.
Tabarrok asserts that this creates a dominant strategy of participation for all players. Because all players will calculate that it is in their best interests to participate, the contract will succeed, and the entrepreneur will be rewarded. In a meta-game, this reward is an incentive for other entrepreneurs to enter the DAC market, driving down the cost disadvantage of dominant assurance contracts versus regular assurance contracts.
Monero doesn't have many scripting options to work with currently, so it is very hard for me to see how one might go about creating a Dominant Assurance Contract using Monero, especially in regards to paying out to a miner address.
This is how it could work in Bitcoin:
https://en.bitcoin.it/wiki/Dominant_Assurance_Contracts
This scheme is an attempt at Mike Hearn's exercise for the reader: an implementation of dominant assurance contracts. The scheme requires the use of multisignature transactions, nLockTime and transaction replacement which means it won't work until these features are available on the Bitcoin network.
  1. A vendor agrees to produce a good if X BTC are raised by date D, and to pay Y BTC to each of n contributors if X BTC are not raised by date D, or to pay nY BTC if X BTC are raised and the vendor fails to produce the good to the satisfaction of 2 of 3 independent arbitrators picked through a fair process.
  2. The arbitrators specify a 2-of-3 multisignature script to use as an output for the fundraiser, with a public key from each arbitrator, which will allow them to judge the performance on actually producing the good.
  3. For each contributor:
    3.1. The vendor and the contributor exchange public keys.
    3.2. They create a 2-of-2 multisignature output from those public keys.
    3.3. With no change, they create but do not sign a transaction with an input of X/n BTC from the contributor and an input of Y BTC from the vendor, with X/n+Y going to the output created in 3.2.
    3.4. The contributor creates a transaction where the output is X+nY to the address created in step 2 and the input is the output of the transaction in 3.3, signs it using SIGHASH_ALL | SIGHASH_ANYONECANPAY with version = UINT_MAX, and gives it to the vendor.
    3.5. The vendor creates a transaction of the entire balance of the transaction in 3.3 to the contributor, with nLockTime of D and version < UINT_MAX, signs it, and gives it to the contributor.
    3.6. The vendor and contributor then both sign the transaction in 3.3 and broadcast it to the network, making the transaction in 3.4 valid when enough contributors participate and the transaction in 3.5 valid when nLockTime expires.
  4. As date D nears, nLockTime comes close to expiration.
    - If enough (n) people contribute, all of the inputs from 3.4 can combine to make the output valid when signed by the vendor, creating a valid transaction sending that money to the arbitrators, who only agree to release the funds when the vendor produces a satisfactory output.
    - If not enough people contribute, each contributor can use the nLockTime refund transaction from 3.5 to reclaim their X/n contribution plus the vendor's Y bond. (Note that there is a limit at which it can be more profitable for the vendor to make the remaining contributions himself as D approaches.)
  5. Now the arbitrators have control of X (the payment from the contributors) + nY (the performance bond from the vendor) BTC, and pay the vendor only when the vendor performs satisfactorily.
Such contracts can be used for crowdfunding. Notable examples from Mike Hearn include:
Funding Internet radio stations which don't want to play ads: donations are the only viable revenue source as pay-for-streaming models allow undercutting by subscribers who relay the stream to their own subscribers
Automatically contributing to the human translation of web pages


Monero has these features:
  1. Multisig
  2. LockTime (but it is much different then BTCs)
  3. A possibility to do MoJoin (CoinJoin-like) transactions, even if less than optimally private. There is hope that the MoJoin schemes will allow for better privacy in the future:
I have a draft writeup for a merged-input system called MoJoin that allows multiple parties to generate a single transaction. The goal is to complete the transaction merging with no trust in any party, but this introduces significant complexity and may not be possible with the known Bulletproofs multiparty computation scheme. My current version of MoJoin assumes partial trust in a dealer, who learns the mappings between input rings and outputs (but not true spends or Pedersen commitment data).

Additionally, Non-Interactive Refund Transactions could also be possible in Monero's future.
https://eprint.iacr.org/2019/595
I can't fully work out how all of these could fit together to make a DAC that allows miners to put up and pay out a reward if it doesn't succeed, or how we could make it so *any* miner who participated (by putting up a reward) could claim the reward if it succeeded. I think this should really be explored, as it could make for a much more secure blockchain, potentially saving us if a "crypto winter" hits where the value of Monero and the number of transactions are low, making for a blockchain that is hard to trust because it would be so cheap to perform a 51% attack.


I am still skeptical of Dominant Assurance Contracts; despite success in an initial test (https://marginalrevolution.com/marginalrevolution/2013/08/a-test-of-dominant-assurance-contracts.html),
they still remain questionable, or at least confusing: https://forum.ethereum.org/discussion/747/im-not-understanding-why-dominant-assurance-contracts-are-so-special
submitted by Vespco to Monero [link] [comments]

REN (Ren) get listed on OMGFIN!

REN (Ren) get listed on OMGFIN!
https://preview.redd.it/vyucrbl00kk31.png?width=2000&format=png&auto=webp&s=2eec013ab1e7599153fda5c8c0ffe1d77f0f10c2
Dear Users,
OMGFIN is extremely proud to announce yet another great project coming to our trading platform. REN (Ren) is now available on OMGFIN; you can deposit and withdraw REN now. Supported trading pairs include REN/BTC, REN/ETH, REN/UQC, and REN/USDT. Please take note of the following schedule:
REN (Ren) trading: 12:00 Sept 4, 2019 (UTC+8)
Get 25% cashback on trading fees at OMGFIN for all REN (Ren) coin trading pairs.
Don't miss out:
BTC Market : https://omgfin.com/exchange/trade/market/RENUSDT
USDT Market : https://omgfin.com/exchange/trade/market/RENUSDT
UQC Market : https://omgfin.com/exchange/trade/market/RENUQC
ETH Market : https://omgfin.com/exchange/trade/market/RENETH
https://preview.redd.it/hglgy2vxzjk31.png?width=513&format=png&auto=webp&s=d8bf1dfec8f28b0d74ad5fce928e825aeb3cbac9
REN (Ren) Introduction:
REN is an open-source decentralized dark pool for trustless cross-chain atomic trading of Ether, ERC20 tokens, and Bitcoin. REN is an ERC20 token built on the Ethereum network. Ren is an ecosystem for building, deploying, and running general-purpose, privacy-preserving applications using zkSNARKs and its own newly developed secure multiparty computation protocol. It makes it possible for any kind of application to run in a decentralized, trustless, and fault-tolerant environment similar to blockchains, with the distinguishing feature that all application inputs, outputs, and state remain secret even to the participants running the network.
REN (Ren) Official Website:https://renproject.io/
REN (Ren) Whitepaper:click here
Risk Warning: Investing in cryptocurrency is akin to being a venture capital investor. The cryptocurrency market is available worldwide 24 x 7 for trading with no market close or open times. Please do your own risk assessment when deciding how to invest in cryptocurrency and blockchain technology. OMGFIN attempts to screen all tokens before they come to market, however, even with the best due diligence there are still risks when investing. OMGFIN is not liable for investment gains or losses.
We sincerely appreciate your support and understanding.
Regards,
OMGFIN Team
submitted by Rajladumor1 to omgfin [link] [comments]

Weekly Wrap: This Week In Chainlink June 8 - June 14

Weekly Wrap: This Week In Chainlink June 8 - June 14

Announcements and Integrations 🎉.

Kyber Network's DEX KyberSwap is now using Chainlink Price Reference Data to update asset prices in their UI. This provides KyberSwap users with more reliable price feeds for calculating slippage rates and further safeguards against price manipulation.
Graph Protocol is making it easier for devs to pull indexed subgraph data into smart contracts. Dapps can use Chainlink to access data to index DEX liquidity to calculate slippage, catalog block data for gas fees, and sort identity data for user profiles.
ARPA is integrating Chainlink oracles to bring reliable data to its multiparty computation (MPC) network. ARPA plans to use Chainlink Price Reference Data to support its financial services, including a DeFi lending dApp for cross-chain assets.
Distributed Energy Trading Platform Dipole_Tech is integrating Chainlink oracles to price energy assets using premium data sources. Dipole also plans to use Chainlink to connect traditional & cryptocurrency payment options to its energy trading markets.
NFT devs natealex6677 & skylerfly from the generative art project ChainFaces are using Chainlink VRF to power the spin-off game FaceGolf. Chainlink VRF will be used in determining the outcome of golf matches, the random creation of new NFTs and more

Ecosystem & Community Celebrations 👏

Featured Community Videos 🎥

This video features the Chainlink and Aave teams, with presentations from the ETH Global HackMoney winning projects, along with a Q&A with Marc Zeller of Aave discussing the newest developer trends in the Aave and Chainlink communities and sparking ideas for continued building.

Upcoming Community Events 📅

Are you interested in hosting your own meetup? Apply to become a Chainlink Community Advocate today: https://events.chain.link/advocate

SmartContract is hiring: Check out these open roles 👩‍💼

View all open roles at https://careers.smartcontract.com
Are there other community content and celebrations that we missed? Post them in the comments below! ⤵️
submitted by linkedkeenan to Chainlink [link] [comments]

Best General RenVM Questions | September 2019

Best General RenVM Questions | September 2019 *These questions are sourced directly from Telegram

Q: Given the RenVM Mainnet Roll-out Plan, what are the differences between how Darknodes participate in the P2P Network, Consensus, and Execution within RenVM?
A: An outline of each component and its role in the RenVM system is given below.
P2P Network: The peer-to-peer network is used for two core purposes: peer discovery, and message saturation. Peer discovery allows Darknodes to learn about other active Darknodes in their shard, and in the network at large. Message saturation ensures that all messages sent around the network are seen by everyone.
Consensus: The consensus engine is used to reach a strict ordering of transactions that go through RenVM. This ensures that the Darknodes powering RenVM are able to agree on what actions to take, and when.
Execution: The execution engine is used to run secure multiparty computations. This is how actions in RenVM are ultimately taken. These actions involve generating private keys, signing interoperability transactions, and, in the future, running general-purpose application logic. And all of this in secret.

Q: How do I shut down my current Darknode(s)?
A: Follow this instruction set explicitly and you won't have any issues: https://renproject.zendesk.com/hc/en-us/articles/360020365234-How-to-Fully-Deregister-a-Darknode

Q: Is running a Darknode on Chaosnet useful for the team?
A: Yes, by running a Chaosnet Darknode you are inherently helping us test. One of the core purposes of Chaosnet is to test the real-world incentives of RenVM. Running (and continuing to run) a Chaosnet Darknode says something about the incentives at play: they’re enough to get people running Darknodes. And this helps us! In fact, by not running a Chaosnet Darknode you’re also inherently helping us test. It’s telling us there’s something not quite right with the incentives.

Q: And what's the incentive for someone to collude and attack the network during Chaosnet?
A: The ability to steal real BTC/ZEC/BCH, the want to help us test the network, the want to betray their fellow colluders and take their REN bonds, and of course, some (wo)men just want to watch the world burn.

Q: All of this de-registering and re-registering for mainnet is a bit annoying, is it necessary?
A: We certainly understand the point, as it's been discussed at length, but registration for the RenVM Mainnet is a necessary step: applying automatic updates to current Darknodes so that they run RenVM is not technically feasible. This announcement is very much an administrative piece, ensuring our community has plenty of time and notice to proceed at the speed they prefer. Chaosnet is designed for testing and for those willing to actively experiment; it is certainly not mandatory, and there is no pressure on the general community to be active during this period.
In summary for those who prefer to be less active, should de-register their current Darknode(s) and wait patiently for activation at the release of Mainnet SubZero, no other action is needed.

Q: Is RenVM secure against quantum computing?
A: The core of RZL sMPC is information-theoretically secure. This means that no amount of compute power can break it (making it post-quantum safe). Some parts of it are not (zkSNARKs, and some hashes that aren’t known to be post-quantum safe or not), but these are easy to replace (with zkSTARKs and post-quantum-safe hashes).
RZL sMPC provides ECDSA signatures because that’s what is used by Ethereum, Bitcoin, etc. Whatever post-quantum solution those chains come up with will be the solution RZL has to be upgraded to use (the whole point of RenVM is not to tell other chains how to do things, and still provide interop; this means waiting on them to define their solution and then working with that).
In short, if a quantum computer can steal funds from RenVM, it’s because it can steal funds from any Ethereum/Bitcoin/etc. private key.

Q: If I don't deregister my Darknode by RenVM Mainnet, will I lose my 100K REN?
A: The REN bond is safe forever. You can deregister your Darknode from the legacy Mainnet whenever. We recommend doing it now, because it can take three days, and once Chaosnet rolls around that’s where our support focus will be.

Q: When shifting in funds, say a user doesn't have ETH and this call fails: const newSigResult = await ethSig.submitToEthereum(web3.currentProvider). What is the best way for that user to pick up where they left off if they leave the web page to get some ETH, and then come back? Should the app generate a new shift-in object, override the params and gateway address objects, re-submit to RenVM, and then make the above call again? Assume the transaction info, such as the original params and gateway address, is stored in local storage, so it will be available when the user comes back.
A: This is the approach we take. We store the RenVM tx in local storage and then when the user comes back we can construct the Ethereum tx and hand it to them for signing again. You can construct the RenVM tx locally and store it before asking the user to send their BTC to the gateway to protect against unexpected shutdowns. This way, you can recover from them leaving the app at any point in the process without loss of funds. (This also allows you to resend the RenVM tx in the event that the first send fails for any reason.)

Q 1: Could you elaborate on the proportionality of (a) Total value of bonded REN (b) Total value of assets under RenVM control? Does RenVM require (b) <= (a) at all times? RenVM would need an Oracle to determine the USD value of both (a) and (b).
A 1: The oraclisation is done by the Darknodes. Each of them assesses what they determine that value of (a) and (b) to be and if 2/3rds of them independently decide (b) can be increased then the network will be able to go ahead with the computation. We do require (b) < (a) but have not determined the exact ratio. Because Darknodes are randomly sampled (and constantly reshuffled) from the entire group, this value can consider the entire amount of REN bonded (not just the REN bonded by one shard).
Q 2: There's potentially an incentive-misalignment issue here: Darknodes would want to bypass the (b) < (a) limit in order to continue to process more tx's and collect fees.
A 2: True, but there’s also a natural incentive for Darknodes to want to keep the network secure. A hack would likely render their REN to drop dramatically in price and they’re REN will be locked for 2-3 months after deregistration. This is also true of users. They should be wary of keeping assets locked up when it nears the secure threshold. This can be encouraged by scaling down the burning fees/raising minting fees to encourage the movement of funds “in the right direction”

Q: Quick question: right now, a developer can choose to wait for 0 confirmations before minting zBTC on Ethereum when shifting in real BTC. Will the RenVM network require a minimum number of bitcoin confirmations, or is that always up to the application developer? If it's up to the developer, what if the developer chooses 0 confirmations, mints zBTC, and then double spends on the bitcoin network, invalidating that original bitcoin transaction? shouldn't that invalidate the zBTC that was already minted from the original 0 conf transaction?
A: The developer cannot choose. RenVM will wait for the appropriate number of confirmations. On Testnet, this number is currently set to zero because it makes testing easier. On Mainnet, there will be systems for people to take on the “confirmation risk” and provide float. Devs can also set it up so that people can deposit ahead-of-time. We are also exploring Lightning and similar concepts.

Q: I've noticed an increase in the number of txs made through RenVM as testing goes on; have you met any unexpected obstacles?
A: We’ve encountered a few issues with nodes when they are rebooted/crash (we are constantly rebooting/crashing them to make sure the network continues to operate as expected under those circumstances). But, we have fixes in the work for all these issues and it hasn’t prevented us from being able to add new features (BCash and SegWit support has recently hit Devnet and will be arriving on Testnet soon).

Q1: If home chain = destination chain, then RenVM is effectively a mixing service?
A1: It can be used that way, definitely. But, it has to have a few more privacy features enabled, shifting alone won’t do.
Q2: RenVM mints Aztec notes for example?
A2: Yep, that’s the plan; we need to wait until the Ignition ceremony before this can be done. It’s one of the next features in our pipeline though! BTC would “appear” on Ethereum with no known owner. And, if you wait an amount of time between getting the authorizing from RenVM and using the signature, then it would be impossible to trace it back to the request that went to RenVM.

Q: When I go to the Command Center, the page doesn't load?
A: One has to be on the Kovan Testnet (on Metamask). To do this, select the top middle button on your Metamask tab and click Kovan Test Network (Purple circle). If you’d like to see it in action, submit a trade on our Testnet Dex Demo (https://renproject.github.io/renvm-demo/) and see it proceed through RenVM via the Hyperdrive tab: https://dcc-testnet.republicprotocol.com/hyperdrive

Q: Mixicles & RenVM: It seems like Mixicles could be used to preserve privacy features for on and off-chain settlements in a blockchain agnostic way. Wouldn’t this be seen as a threat as smart contracts could now replace a darkpool while maintaining the element of anonymity?
A: Mixicles (and all other ZK on-chain stuff we’ve seen) gives you privacy on the chain. So you can prove things have been done right (one of the things we like about public blockchains), without exposing any information about the thing (an issue with public blockchains). But, the prover still has access to the information. This rules it out for many kinds of private apps. RenVM gives you absolute privacy. You can do things with data, and prove things about data, without anyone anywhere ever knowing anything about the data. This is much more general.

Q: Can’t people just fork RenVM?
A: What ultimately prevents forks is the network effect. All projects that want to take decentralization seriously need to open-source their implementations. Almost by definition, a decentralized network is nothing but its community of people willing to work together; this is the very essence of “trust no-one except for the majority”. If you refuse to open-source you don’t have a community, you have hostages.
Building up momentum and creating a large network and community is incredibly valuable and not something that can be forked. Bitcoin is still Bitcoin, despite the large number of forks that have been created, and most of the time forks don’t overtake or outpace the original because there is too much inertia in the original community.
There are other, less philosophical, benefits too. Open-source code means you can get more feedback, people can help fix bugs, identify potential security issues, anyone can validate the implementation, people can build their own implementations (resulting in highly desirable “N versioning” which prevents a single bug compromising all nodes).

https://renproject.zendesk.com/hc/en-us/articles/360001180915-General-RenVM-Questions-September-2019
submitted by RENProtocol to RenProject [link] [comments]

Cardano is not just a Wallet

From time to time, FUD about the Cardano project appears somewhere on social media. We see opinions like “Cardano is just a wallet”, “Cardano will never launch its mainnet”, “PoS will never work”, or “it’s just a white paper”. These opinions are born of impatience, ignorance of the depth and complexity of the project, or an inability to assess the matter objectively. Often, this FUD is spread intentionally by supporters of a competing project, just to make that competitor look more relevant.
Let’s look at some facts in today’s article.
Cardano is a very complex project
Cardano is the first project based on formal-methods development, and it is built as a mission-critical project. The IOHK team has done thorough research in all areas related to blockchains and distributed networks. The team studied existing work and sought the best solutions to technical problems in a real environment, and it has published many scientific papers that have undergone rigorous peer review and today have countless citations. The team starts production software development only after the specifications are available. The critical parts of the project are written in Haskell, a purely functional programming language that confines side effects within the type system.
Blockchain incorporates technological, economic, and social components. It is a system that aims to replace the current financial system and compete with today's IT giants. Such a system cannot be built just by hiring a bunch of programmers to produce a mix of Bitcoin and Ethereum. Without an emphasis on overall quality and detail, it will never work reliably in the long term. A quick-and-dirty project, on the other hand, can be delivered in just one year; but do we need it?
Cardano took a different and more challenging path. If the team had simply delivered another blockchain, it would rank among hundreds of similar projects. It was necessary to bring together experts in cryptography, software security, distributed networking, threat modeling, protocol design, game theory, operating systems, programming language design, and economics, and of course software architects and programmers. All these people had to work together to deliver a network that would be great in all respects. These are people who are respected leaders in their fields, often at the peak of their careers.
If one network is to serve the whole world, it must be capable of global scalability. It must never stop and allow all people on the planet to freely engage in network consensus. Including cheaters. Such a network will be massively attacked. Cardano must endure it and continue to function smoothly. It is a more complex task than you might think. And believe me, there are not many people in the world who could fully understand that in all details.
It takes a lot of time and effort to build such a project. No existing project is capable of mass adoption and is at the same time demonstrably secure and sustainable in the long term. Cardano will be. Global, open, public, permissionless networks are brand new. There was nothing like that before Bitcoin. The first generation of cryptocurrencies suffers from technological imperfection. Often they are in the experimental phase and are improving in full operation. In the case of Bitcoin, everybody is scared to change the first layer, as there is a legitimate concern that something will fail. The second layer can improve something, but it will always creak. The quality of the project is directly related to the quality of the team and the time spent on research, experimentation, implementation, and testing. These phases can’t be underestimated or omitted. If you do, it will fire back at you later. We can see thousands of projects on CoinMarketCap, but few are worth the attention. In ten years, there will be maybe only three of them.
Transparency
Cardano is one of the most transparent projects in crypto. The CEO of IOHK, Charles Hoskinson, does AMAs and status updates very often, sometimes several times in one month. During an AMA you can ask him literally whatever you want. Other team members have recently started updating us as well. Cardano, as one of the few projects to do so, has all of its scientific papers publicly available. Anyone can look into them and critically review the content, and the number of papers grows regularly. If anyone has doubts about the quality of the project, they can try to find mistakes in these papers. And believe me, that will be hard.
This work by itself has already pushed crypto forward a great deal. Until now, nobody had worked out exactly what a ledger is and how it should work, whether PoW is really safe, how to write smart contracts safely, how to create a sustainable economic model, and so on. All other projects can benefit from this work.
Do not believe that? Well, we can give you one of the many possible examples. Aggelos Kiayias is the chair in cybersecurity and privacy at the University of Edinburgh. His research interests are in computer security, information security, applied cryptography, and foundations of cryptography with a particular emphasis on blockchain technologies and distributed systems, e-voting and secure multiparty protocols, as well as privacy and identity management. He joined IOHK in 2017 as a chief scientist through a long-term consulting agreement between IOHK and the University of Edinburgh, where he is also the director of the Blockchain Technology Laboratory. Aggelos is one of the brains behind Ouroboros PoS. It is relatively easy to find the works in which he participated and the number of citations.
You can just open any scientific paper from IOHK, check the list of authors and find the number of citations. You can also easily verify there is no competitor in the whole crypto. Of course, the important thing is to get the scientific work into the source code. And that happens. You can check out GitHub for all the project repositories.
Be careful, many people and aggregation sites for some reason only look at the cardano-sl repository. Look at them all!
We understand that if you are not a programmer, it will be difficult to judge the quality of code. You just have to educate yourself here if you really want to know what is going on on GitHub. Activity is compared by the number of commits, which is some modification of the source code. Usually, a new piece is added or something old is deleted.
You can easily look at the details of each repository. You can see how many people are actively developing the code, you can see what they are working on, how often they add changes, and more. Let’s have a look at the Ouroboros-network repository.
Shelley protocol is written in Haskell. The quality and activity of the project can also be judged well by the number of new and already solved issues.
If you are in any doubt about any project, learn how to read GitHub activity. In the case of the Cardano project, be absolutely calm. The activity is one of the highest, if not absolutely highest, in the entire crypto-sphere. If anyone tells you otherwise, please refer to the IOHK library and GitHub. You can even check out the site where all the Cardano GitHub data is altogether.
Delays are usual in software development
We have already talked about how complex the Cardano project is. Believe me, delays are a common thing when creating software. There are plenty of well-described reasons for that. Every software engineer could confirm that. Let’s dive into it just a bit.
Accurate estimates when planning software are nearly impossible. You always have a pile of items and tasks for a group of engineers. Tasks often overlap and depend on each other; worse, they are mostly abstract, and it is often hard to predict all possible obstacles. You can measure the performance of the team to improve time estimates, but you still never know when a plan will fail due to unexpected problems. Team members have to work together for years before all estimates become solid. The Cardano team is international, so coordination can be a bit more demanding. When small and seemingly unnecessary tasks are skipped at the beginning, unexpected delays can appear later simply because those tasks have to be handled eventually. And sometimes the plan or the priorities change.
Many engineers are very optimistic when asked how long a task will take, and they are often wrong; in reality, tasks are usually more complex than engineers think. Team members change over time. IT experts are a special breed: they like to change jobs very often. If a team expert leaves, it may take a while to find an adequate replacement, which also causes delay. The IOHK team has a lot of members.
Delivering 90% of the functionality can be relatively easy compared to the last 10%. The biggest problems can arise at the end of development, when all the individual components are tested together. Problems encountered at the end must be addressed, which often means changing a lot of things or redesigning something. The testing itself is very demanding, as a global network and its setup must be simulated. In addition, the software must run on different versions of operating systems.
Let’s have a look at some famous software delays. Mac OS X began development under the name Rhapsody in 1997; version 1 was released four years later. Windows Vista was originally planned to ship in 2003 and arrived three years late.
If you look for more examples, you will find many. A project like Cardano simply can't be done in a year or two. If Shelley is launched this year, it will actually be very fast. If you look at examples from the crypto world, you will see the big delay in delivering Ethereum 2.0, and you can count how long the Lightning Network has been under construction. And look at how many errors and failures we've seen; these failures have often led to large financial losses. If you look for the reasons, you will find badly designed or badly implemented software.
Of course, it is in the best interest of the team to deliver Cardano to the market as quickly as possible. But certainly not at the cost of technical imperfections. This is not a race against time. A year or two has almost no role if we can see a really good and functional, secure network capable of mass adoption.
Learn patience. Software delivery has its phases. First, everything has to be well thought out and designed. The team must be built. Then research and experiments are carried out. During this, the first source code can be written. Then the testing phase takes place. Only after all this can the network reach the public.
Testnet has been launched
Testnet has been running since the end of 2019. And it is a great success. The team expected an interest of about 100 pool operators. There are over 1000 registered pools. The network is stable and runs almost without problems. People’s interest in Cardano is huge. Community people are working on useful tools. Just check adapools.org or pooltool.io. Pool operators have no problem communicating with the team. All problems are gradually solved and many of them very quickly.
Everyone in the world can run their own node and become a pool operator. Anyone who claims that Cardano is just a wallet should try it for themselves.
Do you like smart contracts? Do you want to write one? Well, you can try it. Cardano will have Plutus and Marlowe. You can try both of them on the online playground.
No Lambo, sorry
Cardano’s not here to get you a Lambo. Charles promised to deliver the most decentralized network, secure smart contracts, project governance, and a solution to scalability, all based on scientific research. If such a network can be delivered and people adopt it, only then can we say that the team has succeeded. Will that affect the price of ADA coins? Definitely yes. But the team must be fully focused on development, not on the price. So do not complain about the price; nobody is going to help you. Only you are responsible.
Summary
IOHK is one of the best teams in crypto, and there is a lot of work behind this team that everyone can inspect. Nothing is patented; everything is open source. We also cannot forget Emurgo and the Cardano Foundation: these entities, too, are completely transparent, with much visible work behind them. See for yourself. If someone looks at all this and still claims that Cardano is just a wallet, they must be blind and deaf. Delays in delivering software as complex as Cardano are quite common; if anything, the team is moving very fast. Cardano has come so far that neither Microsoft nor IBM will be able to compete. Thanks to Cardano, the whole crypto space will move up a lot and become more relevant. If you don't believe anything we have written, you have many opportunities to check it out. So don't believe us: verify it yourself.
You can read the full article with all links and images here: https://medium.com/@Cardanians_io/cardano-is-not-just-a-wallet-2c27eab9fac7
submitted by Cardanians to cardano [link] [comments]

[PDF] Fair Two-Party Computations via Bitcoin Deposits

submitted by jedunnigan to Bitcoin [link] [comments]

AMA with Wanchain VP Lini

AMA with Wanchain VP Lini
Original article here: https://medium.com/wanchain-foundation/ama-with-wanchain-vp-lini-58ada078b4fe

“What is unique about us is that we have actually put theory into practice.”
— Lini
https://preview.redd.it/n6lo2xcmtn621.png?width=800&format=png&auto=webp&s=281acce4b45eed8acf0c52b201d01cb6f0d13507
https://preview.redd.it/10aj3ointn621.png?width=800&format=png&auto=webp&s=6a187e8a6eb5ac0445ddc73d5b0f9077f12bce39
Wanchain’s Vice President of Business Development, Lini, sat down with blockchain media organization Neutrino for an AMA covering a wide range of topics concerning Wanchain’s development.
The following is an English translation of the original Chinese AMA which was held on December 13th, 2018:
Neutrino: Could you please first share with us a little basic background? What are the basic concepts behind cross-chain technology? What core problems are solved with cross-chain? In your opinion, what is the biggest challenge in implementing cross-chain value transfer between different chains?
Lini: Actually, this question is quite big. Let me break it down into three smaller parts:
  1. First, what is the meaning of “cross-chain”?
https://preview.redd.it/cpui6t7qtn621.png?width=720&format=png&auto=webp&s=86bc39d94b0713949c150598e2397a4f9d3ac491
In China, we like to use the word “cross-chain”, while the term “interoperability” is used more frequently abroad. Interoperability is also one of the important technologies that Vitalik identified for the development of the future blockchain ecosystem in the Ethereum white paper. So cross-chain is basically the concept of interoperability between chains.
  2. The core problem solved by cross-chain is that of “multi-ledger” synchronous accounting
https://preview.redd.it/603dl86stn621.png?width=720&format=png&auto=webp&s=425b827298ac919f8cf05909037458a173100cc4
In essence, blockchain is a distributed bookkeeping technique, also known as distributed ledger technology. Tokens are the core units of account on each chain, and there currently exist many different chains, each with its own token. Especially important is the way these ledgers use tokens to interact with each other for the purpose of clearing and settlement.
  3. The core purpose of cross-chain technology is to serve as one of the key infrastructures of a future economy based on digital currencies.
https://preview.redd.it/3d61f26utn621.png?width=720&format=png&auto=webp&s=b735482c9734e1d32176e406adce1718be20583e
Cross chain technology is one of the foundational technological infrastructures that is necessary for the large scale application of blockchain technology.
Neutrino: As we all know, there are many different kinds of cross-chain technologies. Please give us a brief introduction to several popular cross-chain technologies on the market, and the characteristics of each of these technologies。
Lini: Before answering this question, it is very important to share two pairs of concepts with our friends: heterogeneity versus homogeneity, and centralization versus decentralization.
https://preview.redd.it/n6wbs77wtn621.png?width=720&format=png&auto=webp&s=83fcadd09afb214d2aa5a2a6deb6c24d0d4da671
These two points are especially important for understanding various cross-chain technologies, because there are many different technologies and terminologies, and these are some of the foundational concepts needed for understanding them.
There are also two core challenges which must be overcome to implement cross-chain:
https://preview.redd.it/84wqd28ytn621.png?width=720&format=png&auto=webp&s=dafe1cd2993f853547b532421404e6ab86e185f1
Combining the above two points, we look at the exploration of some solutions in the industry and the design concepts of other cross-chain projects.
First I’d like to discuss the Relay solution.
https://preview.redd.it/qgcqiwlztn621.png?width=720&format=png&auto=webp&s=0925d4221c9e92e365e150638c645bef8c609b3f
However, the Relay solution must consume a relatively large amount of gas to read the BTC headers. Another downside is that, as we all know, Bitcoin's blocks are relatively slow, so the wait for verification is long: it takes about 10 minutes for one block to confirm, and best practice is to wait for 6 blocks.
The next concept is the idea of Sidechains.
https://preview.redd.it/9cg79bl1un621.png?width=720&format=png&auto=webp&s=1260e14213b1757eadc4b6141a365ed3b0e20316
This solution is good, but not all chains support SPV, the simple payment verification method, so there are certain drawbacks. Of course, this two-way peg solves challenge beta, the atomicity of the transaction, very well.
These two technical concepts have already been incorporated into a number of existing cross chain projects. Let’s take a look at two of the most influential of these.
The first is Polkadot.
https://preview.redd.it/1o3xwz93un621.png?width=720&format=png&auto=webp&s=249909a33b5420050a6010b961a944285fc94926
This is just a summary based on Polkadot's whitepaper and most recent developments. The theoretical design is very good and can solve challenges alpha and beta. Last week, Neutrino organized a meetup with Polkadot, which we attended. In his talk, Gavin focused on governance and didn't go into much technical detail, but he shared some very interesting ideas about chain governance mechanisms! A more detailed analysis of Polkadot's technical specifics may have to wait until after their mainnet is online.
Next is Cosmos.
https://preview.redd.it/5gtjf6x4un621.png?width=720&format=png&auto=webp&s=94d6408ff65dc7041316f0130867888e108848b2
Cosmos is a star project whose basic concept is similar to Polkadot's. Cosmos's approach is based on a central hub. Both projects take heterogeneous cross-chain transactions into account, and both have also considered how to solve challenges alpha and beta.
To sum up, each research and project team has done a lot of exploration on the best methods for implementing cross-chain technology, but many are still in the theoretical design stage. Unfortunately, since the main net has not launched yet, it is not possible to have a more detailed understanding of each project’s implementation. A blockchain’s development can be divided into two parts: theoretical design, and engineering implementation. Therefore, we can only wait until after the launch of each project’s main network, and then analyze it in more detail.
Neutrino: As mentioned in the whitepaper, Wanchain is a general ledger based on Ethereum, with the goal of building a distributed digital asset financial infrastructure. There are a few questions related to this. How do you solve Ethereum's scaling problem? How does Wanchain compare with Ripple, which aims to be the standard trading protocol common to all major banks around the world? As a potential foundational financial infrastructure, what makes Wanchain stand out?
Lini: This question is actually composed of two small questions. Let me answer the first one first.
  1. Considerations about TPS.
First of all, Wanchain is not built on top of Ethereum. Instead, it draws on some of Ethereum's code, including its excellent smart contract and EVM virtual machine implementations and other mature technical solutions, to build Wanchain's own mainnet.
Ethereum's TPS is not high at this stage, limited by various factors such as the POW consensus mechanism. However, this is also in part a consequence of how distributed and decentralized Ethereum is. Therefore, in order to improve TPS, Wanchain stated in its whitepaper that it will launch its own POS consensus, which will partially solve the performance issues related to TPS. Wanchain's POS is completely different from the POS mechanism of Ethereum 2.0's Casper.
Of course, at the same time, we are also paying close attention to many good proposals from the Ethereum community, such as sharding, state channels, sidechains, and the Raiden Network. Since blockchain exists in the world of open source, we can learn from others' technological breakthroughs and combine them with our own POS to further improve TPS. If we have some time at the end, I'd love to share some points about Wanchain's POS mechanism.
2. Concerning Ripple: it is completely different from what Wanchain hopes to do.
Ripple is focused on exchange between different fiat currency pairs and the sharing of data between banks and financial institutions, acting as a clearing and settlement system, and also on applications of DLT, for example the notary agent mechanism.
Wanchain is focused on different use cases: it acts as a bridge between tokens and other tokens, and between assets and tokens. Various cross-chain applications must consume WAN as a gas fee, which is paid out to the nodes.
So the purposes Ripple and Wanchain serve are quite different. Of course, Ripple's cross-chain mechanism does use notary witnesses, which means everyone must trust a middleman. Since Ripple mainly serves financial clients and banks, that trust is essentially already there.
Neutrino: We see that Wanchain uses a multi-party computation and threshold key sharing scheme for joint anchoring, and achieves "minimum cost" integration through cross-chain communication protocols without changing the original chain's mechanism. What are the technical characteristics of multi-party computation and threshold key sharing? How do other chains access Wanchain, and what is the cross-chain communication protocol here? And what exactly does "minimum cost" amount to?
Lini: The answer to this question is more technical, involving a lot of cryptography; I will try to explain it in a simple way.
  1. About sMPC
sMPC stands for secure multi-party computation. I will explain it using an example proposed by the scholar Andrew Yao, China's only Turing Award winner, in a scenario called Yao's Millionaires' Problem: how can two millionaires determine who is wealthier without revealing the details of their wealth to each other or to a trusted third party? I'm not going to explain the answer in detail here, but those who are interested can do a web search to learn more.
In sMPC, multiple parties, each holding their own piece of private data, jointly perform a calculation (for example, computing a maximum value) and obtain the result, without any party leaking their respective data in the process. Essentially, sMPC allows a protocol to be designed without relying on any trusted third party, since no individual ever has access to the complete private information.
Secure multiparty computation can be understood abstractly as two parties who each have their own private data and can calculate the result of a public function without leaking that data. When the calculation is complete, only the result is revealed to both parties; neither learns the other party's data or the intermediate values of the calculation. The building blocks for secure multiparty computation are homomorphic encryption + secret sharing + oblivious transfer (OT) (+ commitment schemes + zero-knowledge proofs, etc.).
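As a toy illustration of the secret-sharing ingredient, here is a three-party "secure sum" in Python using additive shares: each input is split into random shares that sum to it modulo a public prime, each party only ever sees masked values, and only the final total is revealed. This is a teaching example only; it is far simpler than a production sMPC protocol.

```python
import secrets

Q = 2**61 - 1  # public modulus for the share arithmetic

def share(secret: int, n_parties: int) -> list:
    """Split `secret` into n additive shares; any n-1 shares reveal nothing."""
    parts = [secrets.randbelow(Q) for _ in range(n_parties - 1)]
    parts.append((secret - sum(parts)) % Q)
    return parts

inputs = [42, 7, 99]                     # each party's private value
all_shares = [share(x, 3) for x in inputs]
# party i sums the i-th share of every input, seeing only masked numbers
partials = [sum(s[i] for s in all_shares) % Q for i in range(3)]
assert sum(partials) % Q == sum(inputs) % Q   # only the total is revealed
```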
Wanchain's 21 cross-chain Storeman nodes use sMPC to participate in the verification of a transaction without ever obtaining a user's complete private key. Simply put, the key is split into 21 pieces given to 21 anonymous parties, each of whom holds only their own 1/21 share and cannot reconstruct the whole key.
  2. Shamir's secret sharing
There are often plots in movies where a top secret document needs to be handed over to, let's say, five secret agents. To protect against the chance of an agent being arrested or betraying the rest, the five agents each hold only part of the key that reveals the contents of the document. But there is also a hidden danger: if one of the agents really is caught, how can the remaining agents access the information in the document? At this point, you may wonder whether there is a way for the agents to recover the original text with only a portion of the keys. In other words, is there a method that allows any majority of the five to unlock the top secret document? If so, the enemy would have to compromise more than half of the agents to learn its contents.
Wanchain uses a threshold scheme with M <= N, where N = 21 and M = 16. That is to say, at least 16 Storeman nodes must participate in the multi-party computation to confirm a transaction; not all 21 are required. This solves the security problem of managing private keys.
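The following sketch shows Shamir's scheme with the stated parameters (N = 21 shares, threshold M = 16): the secret becomes the constant term of a random degree-15 polynomial over a prime field, and any 16 shares recover it by Lagrange interpolation at zero. The field size and the secret here are illustrative only.

```python
import secrets

P = 2**127 - 1  # public prime, larger than any secret we share

def shamir_split(secret: int, n: int, m: int) -> list:
    """Split `secret` into n shares such that any m of them reconstruct it."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(m - 1)]
    poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def shamir_recover(shares: list) -> int:
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

shares = shamir_split(123456789, n=21, m=16)
assert shamir_recover(shares[:16]) == 123456789    # any 16 shares suffice
assert shamir_recover(shares[3:19]) == 123456789   # a different 16 also work
```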
"Cross-chain communication protocols" refers to the different communication methods used with different chains. This is because heterogeneous cross-chain methods can't change the mechanisms of the original chains: Nakamoto and Vitalik are not going to modify their main chains just because we need BTC and ETH interoperability. Therefore, cross-chain project teams have to create a different protocol for each chain they want to "talk" to. So the essence of a cross-chain protocol is not a single standard, but multiple sets of standards. Underneath them all, however, is the shared sMPC and threshold design of the Storeman nodes.
The minimum cost is quite low, as Wanchain 3.0's cross-chain implementation shows. In fact, it requires just two smart contracts, one each on Ethereum and Wanchain, to connect the two chains. To connect with Bitcoin, all that is needed is a Bitcoin script. Our implementation guarantees both security and decentralization while remaining simple and computationally light. The Ethereum contracts and Bitcoin scripts are online and can be examined by anyone interested in learning more.
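One common pattern such contracts and scripts enforce is the hash time-lock: the counterparty can claim funds by revealing a hash preimage before a deadline, otherwise the sender reclaims them afterward, which is what makes a swap atomic (challenge beta). Below is a hypothetical Python mock-up of that state machine; it illustrates the locking pattern, not Wanchain's deployed contract logic.

```python
import hashlib, time

class HTLC:
    """Toy hash time-locked contract: claim-with-preimage or refund-after-timeout."""
    def __init__(self, hashlock: bytes, deadline: float, sender: str, receiver: str):
        self.hashlock, self.deadline = hashlock, deadline
        self.sender, self.receiver = sender, receiver
        self.settled = False

    def claim(self, preimage: bytes) -> str:
        if self.settled or time.time() >= self.deadline:
            raise RuntimeError("expired or already settled")
        if hashlib.sha256(preimage).digest() != self.hashlock:
            raise ValueError("wrong preimage")
        self.settled = True
        return self.receiver        # funds go to the receiver

    def refund(self) -> str:
        if self.settled or time.time() < self.deadline:
            raise RuntimeError("not yet expired")
        self.settled = True
        return self.sender          # funds return to the sender

secret = b"swap-secret"
lock = HTLC(hashlib.sha256(secret).digest(), time.time() + 3600, "alice", "bob")
assert lock.claim(secret) == "bob"  # revealing the preimage settles the swap
```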
Neutrino: What kind of consensus mechanism is currently used by Wanchain? In addition, what are the consensus and incentive mechanisms for cross-chain transactions, and what is the purpose behind them? Wanchain will support cross-chain transactions on mainstream public chains (such as BTC and ETH), cross-chain asset transactions between consortium chains, and cross-chain transactions between public and consortium chains. How do you achieve cross-chain asset security and privacy?
Lini: The current consensus is PPOW (Permissioned Proof of Work), which ensures the reliability of nodes until the cross-chain protocol design is completed and we are ready to switch to POS (per the whitepaper roadmap). The cross-chain consensus was covered above: a quorum (at least 16 nodes) of the 21 Storeman nodes participates through sMPC and threshold secret sharing.
In addition, the incentive works through two mechanisms: 1) 100% of the cross-chain transaction fees are used to reward the Storeman nodes; 2) Wanchain has set aside a portion of its total token reserve as an incentive to encourage Storeman nodes while cross-chain transaction volume is still small in the beginning.
I can reveal that Storeman participation is opening up gradually and will become completely distributed and decentralized in batches. The first phase of the Storeman node participation and rewards program will launch at the end of 2018, and we expect the selection of participants to be completed within one quarter. Please watch for our official announcements this month.
In addition, for public chains, consortium chains, and private chains, asset transfers will also follow the cross-chain mechanism described above, generally relying on the sMPC and threshold technology to ensure cross-chain security.
Privacy is a bigger topic. Going back to the Wanchain whitepaper, we provide privacy protection on the Wanchain mainnet. Simply put, the principle is ring signatures: the basic idea is to mix the sender's address with many other addresses to ensure privacy. We also use one-time addresses, a mechanism in which a stamp system generates a fresh one-time address from a common address. This has been implemented since our 2.0 release.
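To give a flavor of how a fresh address can be derived from a public "common" address, here is a simplified stealth-address sketch over secp256k1, written from first principles (a vetted library should be used in practice): the sender mixes an ephemeral key with the recipient's public key so that only the recipient can compute the matching private key. This follows the generic textbook construction; the details of Wanchain's actual stamp system differ.

```python
import hashlib, secrets

# secp256k1 domain parameters
p = 2**256 - 2**32 - 977
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def ec_add(P, Q):
    if P is None: return Q
    if Q is None: return P
    if P[0] == Q[0] and (P[1] + Q[1]) % p == 0:
        return None                                    # point at infinity
    if P == Q:
        lam = 3 * P[0] * P[0] * pow(2 * P[1], -1, p) % p
    else:
        lam = (Q[1] - P[1]) * pow(Q[0] - P[0], -1, p) % p
    x = (lam * lam - P[0] - Q[0]) % p
    return (x, (lam * (P[0] - x) - P[1]) % p)

def ec_mul(k, P):
    R = None
    while k:
        if k & 1: R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

def point_hash(point) -> int:
    raw = f"{point[0]}:{point[1]}".encode()
    return int.from_bytes(hashlib.sha256(raw).digest(), "big") % n

# recipient's long-term ("common") key pair
b = secrets.randbelow(n - 1) + 1
B = ec_mul(b, G)

# sender: ephemeral key r, with R published alongside the payment
r = secrets.randbelow(n - 1) + 1
R = ec_mul(r, G)
one_time_pub = ec_add(ec_mul(point_hash(ec_mul(r, B)), G), B)   # H(r*B)*G + B

# recipient: b*R equals r*B, so only they can derive the matching private key
one_time_priv = (point_hash(ec_mul(b, R)) + b) % n
assert ec_mul(one_time_priv, G) == one_time_pub
```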
For now, privacy protection is only available for native WAN transactions. Extending privacy protection to cross-chain transactions, along with improving the user experience, will be one of our important tasks for 2019.
Neutrino: At present, Wanchain uses Storeman nodes for cross-chain transactions. Can you introduce the Storeman mechanism and how these nodes are protected?
Lini: Let me address this problem from two aspects.
  1. As I explained earlier when introducing sMPC, a Storeman node never holds the user's complete private key; it only participates in computing the transaction in an anonymous and secure state, and the technology prevents Storeman nodes from colluding.
  2. On top of the technical guarantees, we also designed a "double protection" against risk from an economic point of view: each node participating as a Storeman must pledge WAN in the contract as a stake. The pledged WAN will be greater than the amount of any single transaction, as a guarantee against loss of funds.
If a node acts maliciously (even if the probability is one in a billion), the community will be compensated for the loss by confiscating the malicious node's staked WAN. This is like the POS mechanism planned for ETH; using staking to deter bad behavior is a common principle.
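A stripped-down sketch of this economic layer: a registry holds each Storeman's pledged WAN and confiscates it into a community compensation pool if the node misbehaves. The names and amounts below are invented for illustration.

```python
class StoremanRegistry:
    """Toy deposit-and-slash registry illustrating stake-as-collateral."""
    def __init__(self, min_stake: int):
        self.min_stake = min_stake
        self.stakes = {}              # node id -> pledged WAN
        self.compensation_pool = 0    # confiscated funds for the community

    def register(self, node: str, stake: int) -> None:
        if stake < self.min_stake:
            raise ValueError("stake must exceed the largest single transaction")
        self.stakes[node] = stake

    def slash(self, node: str) -> None:
        """Confiscate a misbehaving node's entire stake."""
        self.compensation_pool += self.stakes.pop(node)

registry = StoremanRegistry(min_stake=100_000)
registry.register("storeman-07", 120_000)
registry.slash("storeman-07")         # malicious behavior detected
assert registry.compensation_pool == 120_000
```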
Neutrino: On December 12th, the mainnet of Wanchain 3.0 was launched. Wanchain 3.0 opened cross-chain transactions between Bitcoin, Ethereum, and ERC20 tokens (such as MakerDao's stablecoin DAI and MKR). What does this version mean for you and for the industry? The cross-chain integration with Bitcoin is the biggest highlight of this upgrade. So, if you can now use Wanchain to transact between different tokens, what is the difference between a cross-chain platform like Wanchain and a cryptocurrency exchange?
Lini: The release of 3.0 makes us the industry's first major network to have bridged ETH and BTC, and it has been very stable so far. As mentioned above, many cross-chain cryptographic designs are distinctive in theory, but whether they can actually be achieved in engineering implementation is a big question mark. Wanchain is the first network launched in the world to achieve this, and users are welcome to test and attack it. This also means that Wanchain has connected the two most difficult and most challenging public networks. We are confident we will soon be connecting other well-known public chains.
Alongside the 3.0 release, we also introduced cross-chain integration with more ERC20 tokens in the 2.X versions, such as MakerDao's DAI and MKR, LRC, etc., which means that more tokens from excellent projects on Ethereum will gradually be integrated with Wanchain.
Some people will be curious: since Wanchain has connected so many well-known public chains and projects, how is it different from crypto exchanges? In fact, it is very simple: one is centralized, the other is distributed. Going back to Nakamoto's whitepaper, isn't decentralization the original intention of blockchain? What Wanchain does is essentially solve one of the core technical difficulties at the bottom layer of the blockchain.
Anyone building a DEX (decentralized exchange), digital lending, or other application scenarios can base their application on Wanchain. There is a Wanchain-based DEX prototype made by our community members Jeremiah and Harry, which is quite amazing. Take a look at the video below.
https://www.youtube.com/watch?v=codcqb66G6Q
Neutrino: What are the specific application use cases following the launch of Wanchain 3.0? Most teams are still exploring small-scale projects. In your experience, what will the killer blockchain applications of the future be? What problems need to be solved in the meantime, and how many years will it take?
Lini:
  1. Wanchain is a technology platform rather than an application provider; that is, Wanchain will continue to support the community and the projects that use its cross-chain technology, promoting a wide range of use cases for Wanchain.
  2. Cross-chain applications that we anticipate include things like decentralized exchanges, digital lending, cross-chain games, social networking dApps, gambling, etc. We also expect to see applications using non-fungible tokens, for example the exchange of real assets, STOs, etc.
  3. We recently proposed the WanDAPP solution. Simply speaking, suppose a game developer has been building on Ethereum and has already issued ERC20 tokens, but hopes to expand the game's player base and attract more people to participate in and make full use of their DAPP. They can use the WanDAPP solution to deploy the game DAPP on other public platforms, such as EOS, TRON, etc., without issuing new tokens on those chains and while keeping the original ERC20 tokens. In this way the potential user population of the game can be increased greatly without issuing more tokens on a new chain, improving the real value of the original token. This is accomplished entirely through Wanchain's cross-chain mechanism (see the sketch after this list).
  4. For large-scale applications, blockchain infrastructure is not yet complete; issues such as TPS, sharding, sidechains, and state channels must be solved before blockchain applications can reach large scale. I don't dare guess when that will happen; it depends on the progress of the various technical projects. In short, industry practitioners and enthusiasts need a little faith and patience.
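To picture what WanDAPP-style deployment implies economically, a lock-and-mint mapping keeps the original token supply honest: tokens locked on the home chain back mapped tokens one-to-one on the destination chain, and burning mapped tokens releases the originals. This is a hypothetical illustration of the general pattern, not Wanchain's implementation.

```python
class LockAndMintBridge:
    """Toy lock-and-mint mapping between a home chain and a destination chain."""
    def __init__(self):
        self.locked = 0          # original ERC20 tokens held on the home chain
        self.mapped_supply = 0   # mapped tokens circulating on the other chain

    def lock_and_mint(self, amount: int) -> None:
        self.locked += amount
        self.mapped_supply += amount   # invariant: mapped_supply == locked

    def burn_and_unlock(self, amount: int) -> None:
        if amount > self.mapped_supply:
            raise ValueError("cannot burn more than was minted")
        self.mapped_supply -= amount
        self.locked -= amount

bridge = LockAndMintBridge()
bridge.lock_and_mint(1_000)      # game tokens become usable on, e.g., EOS or TRON
bridge.burn_and_unlock(400)      # and can always move back 1:1
assert bridge.locked == bridge.mapped_supply == 600
```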
Neutrino community member Block Venture Capital Spring: Will Wanchain be developing any more cross-chain products aimed at general users? For example, will the wallet be developed to make automatic cross-chain transfers with other public chains? Another issue the community is concerned about is token issuance: currently more than 100 million WAN are circulating; when will the rest be released?
Lini: As a cross-chain public chain, we do not favor professional developers over ordinary developers; they are all the same to us. As mentioned above, we provide a platform as infrastructure, and everyone is free to develop applications on it.
For example, if it is a decentralized exchange, it will be for ordinary users to trade on; if it is some kind of financial derivatives product, it is more likely to be used by finance professionals. As for cross-chain wallets that exchange automatically: I'm not sure if you are talking about distributed exchanges, but the wallet will not be "automatic" at first, though you will be able to "automatically" redeem other tokens.
Finally, the remaining WAN tokens will be released strictly in accordance with the plan laid out in the whitepaper. For example, the POS node rewards mentioned above will draw on 10% of the total supply. At the same time, there are also community rewards through the bounty program; the DEX prototype I just mentioned is a masterpiece by overseas community developers and also received tokens from our incentive program.
Neutrino community member's question: There are many projects on the market tackling cross-chain problems, such as Cosmos and Polkadot. What are Wanchain's advantages and innovations relative to these projects?
Lini: As I mentioned earlier, Cosmos and Polkadot have both proposed very good solutions in theory. Compared with them, I don't claim that Wanchain's theory is particularly unique; the theoretical basis for our work is cryptography, derived from the academic foundations laid by scholars such as Andrew Yao and Silvio Micali. Our main strength is that we have taken the theory and put it into practice.
Actually, the reason people often question whether a blockchain project can be realized is that whitepapers are often too ambitious, and when teams actually start developing there are constant delays and setbacks. So we focus on completing solid, realizable engineering goals. As for other projects, we hope to keep learning from each other in this space.
Neutrino community member Amos from Huobi Research Institute: How did you decide on 21 Storeman nodes?
Lini: As for the nodes, we don't make choices based on quantity alone. The "S" in POS actually also includes the time tokens are staked, so even if a user stakes fewer tokens, the length of time they stake them is also used to calculate the reward, which is fairer. We designed the ULS (Unique Leader Selection) algorithm to reduce reliance on the corruption-delay assumption (from Cardano's POS theory); it helps ensure fairness so that all participants in the system can have a share of the reward, not just a few large token holders.
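As a purely hypothetical illustration of weighting both stake amount and stake time (this is an assumption for teaching purposes, not Wanchain's actual ULS formula), a reward weight might combine the two factors like this:

```python
def stake_weight(amount: float, lock_days: int, max_days: int = 180) -> float:
    """Hypothetical weight where locking longer boosts a smaller stake's share."""
    time_factor = min(lock_days, max_days) / max_days
    return amount * (1 + time_factor)

small_long = stake_weight(10_000, lock_days=180)   # small holder, long lock
large_short = stake_weight(100_000, lock_days=30)  # large holder, short lock
total = small_long + large_short
print(f"small long-term staker's share: {small_long / total:.2%}")
```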
Wu Di, a member of the Neutrino community: Many big exchanges have already begun to deploy decentralized exchanges. For example Binance, and it seems their progress is very fast. Will you be working with these influential exchanges in the future? Will you have the opportunity to cooperate with them and broaden your own influence?
Lini: I have also seen some other exchanges' DEXs. Going back to the original point, distributed cross-chain nodes and centralized ones are completely different. I'm guessing that most exchanges use a centralized cross-chain solution, so it may not be the same as Wanchain's 21-member Storeman group; I think most exchanges will likely use their own token and exchange system. This is my personal understanding. That said, if you are developing cross-chain technology, you will cooperate with many exchanges that want to build a DEX: not only Binance, but also Huobi, Bithumb, Coinbase… And anyone else who would like to cooperate is welcome!
Neutrino community member AnneJiang from Maker: Dai, as the first stablecoin on Wanchain, will open a direct trading channel between Dai and BTC. In relation to the Dai integration, has any new progress been made on Wanchain so far?
Lini: The DAI stablecoin has already been integrated on Wanchain. I just saw it yesterday; let me give you a picture. It's visible on the current 3.0 explorer, https://www.wanscan.org/, so you can take a look for yourself.
This means that users with DAI are now free to trade it for BTC, ETH, or various ERC20 tokens. There is also LINK from Chainlink, and LRC from Loopring, so there are already quite a few excellent project tokens. You can use Wanchain to trade yourself, but since the DEX is not yet open, for now you can only trade with friends you know.
https://preview.redd.it/jme5s99bun621.png?width=800&format=png&auto=webp&s=7ba3d430ba3e7ddcab4dbcdedc05d596d832f5a7

About Neutrino

Neutrino is a distributed, innovative collaborative community for blockchains. At present, we have established physical collaboration spaces in Tokyo, Singapore, Beijing, Shanghai and other places, and have plans to expand into important blockchain innovation cities such as Seoul, Bangkok, New York and London. Through global community resources and partnerships, Neutrino organizes a wide range of online and offline events, seminars, and more around the world to help developers in different regions better communicate and share their experiences and knowledge.

About Wanchain

Wanchain is a blockchain platform that enables decentralized transfer of value between blockchains. The Wanchain infrastructure enables the creation of distributed financial applications for individuals and organizations. Wanchain currently enables cross-chain transactions with Ethereum, and today’s product launch will enable the same functionalities with Bitcoin. Going forward, we will continue to bridge blockchains and bring cross-chain finance functionality to companies in the industry. Wanchain has employees globally with offices in Beijing (China), Austin (USA), and London (UK).
You can find more information about Wanchain on our website. Additionally, you can reach us through Telegram, Discord, Medium, Twitter, and Reddit. You can also sign up for our monthly email newsletter here.
https://preview.redd.it/w7ezx27dun621.png?width=720&format=png&auto=webp&s=6ef7a651a2d480658f60d213e1431ba636bfbd8c