Ren (REN) is now available on DeFi Swap. Users can swap REN, be REN Liquidity Providers to earn fees and boost their yield by up to 20x when staking CRO.
Ren (REN) is an open protocol that enables the permissionless and private transfer of value between blockchains. Ren built and released the first decentralized dark pool (RenEx) in 2018 and is now generalizing this technology to create an ecosystem for building, deploying, and running general-purpose, privacy-preserving applications using zkSNARKs on a newly developed secure multiparty computation (sMPC) protocol. Ren's core product is now RenVM, which brings interoperability to decentralized finance (DeFi). RenVM is a decentralized and trustless custodian that holds your digital assets as they move between blockchains, using zero-knowledge proofs over an sMPC-based protocol. The state, inputs, and outputs of all programs that RenVM runs are kept hidden from everyone, including the Darknodes that power it.
REN joins a growing list of tokens on DeFi Swap, such as UMA (UMA), Swerve (SWRV), Harvest Finance (FARM), Uniswap (UNI), Wrapped Bitcoin (WBTC), Yearn Finance (YFI), (Wrapped) Ether (WETH), Tether (USDT), USD Coin (USDC), Dai (DAI), Chainlink (LINK), Compound (COMP) and Crypto.com Coin (CRO).
Start swapping, farming and staking now.
Please see the blog for more details about DeFi Swap.
Ren allows the free movement of value between all blockchains and the transfer of tokens in zero-knowledge, unlocking new liquidity and resources to power a new wave of value in the open finance movement. With Ren, all decentralized applications can run in secret, preserving the privacy of all users and data. (Renproject.io)
Ren is the evolution of the technology that underpins Republic Protocol, in its most useful and general form. It becomes something much bigger than Republic Protocol and will empower developers to build decentralized and trustless applications, with a distinct focus on financial applications. Using our own newly developed secure multiparty computation protocol, all DeFi applications will have access to interoperable liquidity and run in complete secrecy. (Ren — The Evolution of a Protocol)

💡 To dumb it down a bit: Ren is the organization's name, REN is the ERC-20 based token, and RenVM is Ren's core product.
RenVM is replicated over thousands of machines that work together to power it, contributing their network bandwidth, their computational power, and their storage capacity. These machines are known as Darknodes.

Ok, so now we know what a Darknode is, but what are REN tokens used for?
Darknodes communicate with other Darknodes around the world to keep RenVM running. Without them, there is no virtual machine upon which Ren can exist. RenVM uses Byzantine Fault Tolerant consensus algorithms and secure multiparty computations so that Darknodes can be operated by anyone without needing to trust them. This is what makes RenVM — and by extension, Ren itself — decentralized, trustless, and private. (Ren Documentation)
The decentralized network of Darknodes is permissionless, but to prevent the forging of a large number of identities and to encourage good behavior, a bond of 100,000 REN tokens is required in order to register and run a Darknode. This prevents malicious adversaries from running an unbounded number of Darknodes and overwhelming the network with misbehaving Darknodes. (Ren Documentation)

As stated, to run a Darknode you'll need 100,000 REN. You can liken this to Proof of Stake (PoS) systems, in which you stake a certain currency to encourage honest behavior in block production. The benefit of running a Darknode is that you are paid transaction fees. Initially, the fees were paid in REN tokens, but now Darknode operators can be paid in other cryptocurrencies such as BTC, ETH, ZEC, and other ERC-20 tokens.
The Ren Alliance is a consortium of DeFi companies and projects that are helping secure, develop, and utilize RenVM. (Introducing the Ren Alliance)

Also important to note: Ren has top-tier investors in the likes of FGB Capital, Polychain Capital, and Kenetic Capital. But its headlining investor is crypto giant Huobi, which this past July launched its Huobi Cloud platform for OTC desks. Work is being done and a lot of progress is being made.
Blockchain-based networks are being built to offer specific capabilities, such as making payments or storing and trading assets. However, these capabilities are offered in isolation: the networks don't talk to each other and cannot share data. Existing centralized systems have evolved to offer the same capabilities in a more integrated way, running end-to-end transactions seamlessly and making things easier for users.

The cryptocurrency space is pretty fragmented at times. There is also a bunch of tribalism. It can be a bit annoying. One coin makes massive efforts over here, and another coin makes massive efforts over there. What if we could combine the efforts into one big force of nature? I think we could take over the world. Which would be huge.
If blockchain-based networks are to make a strong case for their adoption, they have to be able to work with each other and offer this seamless integration of capabilities to their users.
Strong interoperability would give users a much more useful, user-friendly experience: the seamless integration of capabilities offered by different blockchain-based networks. To hypothesize an example, it would look something like this — a user would be able to tokenize an asset (e.g. artwork) in an Ethereum-based DApp, transfer the tokenized asset to another address over Cardano, and pay any corresponding transaction fees over the Bitcoin network. (Capgemini)
Submitted for consideration to The Great Reddit Scaling Bake-Off
Baked by the pastry chefs at Offchain Labs
Please send questions or comments to [email protected]
We're excited to submit Arbitrum Rollup for consideration to The Great Reddit Scaling Bake-Off. Arbitrum Rollup is the only Ethereum scaling solution that supports arbitrary smart contracts without compromising on Ethereum's security or adding points of centralization. For Reddit, this means that Arbitrum can not only scale the minting and transfer of Community Points, but it can foster a creative ecosystem built around Reddit Community Points enabling points to be used in a wide variety of third party applications. That's right -- you can have your cake and eat it too!
Arbitrum Rollup isn't just Ethereum-style. Its Layer 2 transactions are byte-for-byte identical to Ethereum, which means Ethereum users can continue to use their existing addresses and wallets, and Ethereum developers can continue to use their favorite toolchains and development environments out-of-the-box with Arbitrum. Coupling Arbitrum’s tooling-compatibility with its trustless asset interoperability, Reddit not only can scale but can onboard the entire Ethereum community at no cost by giving them the same experience they already know and love (well, certainly know).
To benchmark how Arbitrum can scale Reddit Community Points, we launched the Reddit contracts on an Arbitrum Rollup chain. Since Arbitrum provides full Solidity support, we didn't have to rewrite the Reddit contracts or try to mimic their functionality using an unfamiliar paradigm. Nope, none of that. We launched the Reddit contracts unmodified on Arbitrum Rollup complete with support for minting and distributing points. Like every Arbitrum Rollup chain, the chain included a bridge interface in which users can transfer Community Points or any other asset between the L1 and L2 chains. Arbitrum Rollup chains also support dynamic contract loading, which would allow third-party developers to launch custom ecosystem apps that integrate with Community Points on the very same chain that runs the Reddit contracts.
1.1 Why Ethereum
Perhaps the most exciting benefit of distributing Community Points using a blockchain is the ability to seamlessly port points to other applications and use them in a wide variety of contexts. Applications may include simple transfers such as a restaurant that allows Redditors to spend points on drinks. Or it may include complex smart contracts -- such as placing Community Points as a wager for a multiparty game or as collateral in a financial contract.
The common denominator among all of the fun uses of Reddit points is that they need a thriving ecosystem of both users and developers, and the Ethereum blockchain is perhaps the only smart contract platform with significant adoption today. While many Layer 1 blockchains boast lower cost or higher throughput than the Ethereum blockchain, more often than not, these attributes mask the reality of little usage, weaker security, or both.
Perhaps another platform with significant usage will rise in the future. But today, Ethereum captures the mindshare of the blockchain community, and for Community Points to provide the most utility, the Ethereum blockchain is the natural choice.
1.2 Why Arbitrum
While Ethereum's ecosystem is unmatched, the reality is that fees are high and capacity is too low to support the scale of Reddit Community Points. Enter Arbitrum. Arbitrum Rollup provides all of the ecosystem benefits of Ethereum, but with orders of magnitude more capacity and at a fraction of the cost of native Ethereum smart contracts. And most of all, we don't change the experience for users. They continue to use the same wallets, addresses, languages, and tools.
Arbitrum Rollup is not the only solution that can scale payments, but it is the only developed solution that can scale both payments and arbitrary smart contracts trustlessly, which means that third party users can build highly scalable add-on apps that can be used without withdrawing money from the Rollup chain. If you believe that Reddit users will want to use their Community Points in smart contracts--and we believe they will--then it makes the most sense to choose a single scaling solution that can support the entire ecosystem, eliminating friction for users.
We view being able to run smart contracts in the same scaling solution as fundamentally critical: if there's significant demand for running smart contracts in Reddit's ecosystem, that demand would itself be a load on Ethereum and would itself require a scaling solution. Moreover, having different scaling solutions for the minting/distribution/spending of points and for third-party apps would be burdensome for users, as they'd have to constantly shuffle their Points back and forth.
2. Arbitrum at a glance
Arbitrum Rollup has a unique value proposition as it offers a combination of features that no other scaling solution achieves. Here we highlight its core attributes.
Decentralized. Arbitrum Rollup is as decentralized as Ethereum. Unlike some other Layer 2 scaling projects, Arbitrum Rollup doesn't have any centralized components or centralized operators who can censor users or delay transactions. Even in non-custodial systems, centralized components pose a risk, as the operators are generally incentivized to increase their profit by extracting rent from users, often in ways that severely degrade the user experience. Even if centralized operators are altruistic, centralized components are subject to hacking, coercion, and potential liability.
Massive Scaling. Arbitrum achieves order of magnitude scaling over Ethereum's L1 smart contracts. Our software currently supports 453 transactions-per-second for basic transactions (at 1616 Ethereum gas per tx). We have a lot of room left to optimize (e.g. aggregating signatures), and over the next several months capacity will increase significantly. As described in detail below, Arbitrum can easily support and surpass Reddit's anticipated initial load, and its capacity will continue to improve as Reddit's capacity needs grow.
Low cost. The cost of running Arbitrum Rollup is quite low compared to L1 Ethereum and other scaling solutions such as those based on zero-knowledge proofs. Layer 2 fees are low, fixed, and predictable and should not be overly burdensome for Reddit to cover. Nobody needs to use special equipment or high-end machines. Arbitrum requires validators, which is a permissionless role that can be run on any reasonable on-line machine. Although anybody can act as a validator, in order to protect against a “tragedy of the commons” and make sure reputable validators are participating, we support a notion of “invited validators” that are compensated for their costs. In general, users pay (low) fees to cover the invited validators’ costs, but we imagine that Reddit may cover this cost for its users. See more on the costs and validator options below.
Ethereum Developer Experience. Not only does Arbitrum support EVM smart contracts, but the developer experience is identical to that of L1 Ethereum contracts and fully compatible with Ethereum tooling. Developers can port existing Solidity apps or write new ones using their favorite and familiar toolchains (e.g. Truffle, Buidler). There are no new languages or coding paradigms to learn.
Ethereum wallet compatibility. Just as in Ethereum, Arbitrum users need only hold keys, but do not have to store any coin history or additional data to protect or access their funds. Since Arbitrum transactions are semantically identical to Ethereum L1 transactions, existing Ethereum users can use their existing Ethereum keys with their existing wallet software such as Metamask.
Token interoperability. Users can easily transfer their ETH, ERC-20 and ERC-721 tokens between Ethereum and the Arbitrum Rollup chain. As we explain in detail below, it is possible to mint tokens in L2 that can subsequently be withdrawn and recognized by the L1 token contract.
Fast finality. Transactions complete with the same finality time as Ethereum L1 (and it's possible to get faster finality guarantees by trading away trust assumptions; see the Arbitrum Rollup whitepaper for details).
Non-custodial. Arbitrum Rollup is a non-custodial scaling solution, so users control their funds/points and neither Reddit nor anyone else can ever access or revoke points held by users.
Censorship Resistant. Since it's completely decentralized, and the Arbitrum protocol guarantees progress trustlessly, Arbitrum Rollup is just as censorship-proof as Ethereum.
Block explorer. The Arbitrum Rollup block explorer allows users to view and analyze transactions on the Rollup chain.
Although this is a bake-off, we're not going to sugar coat anything. Arbitrum Rollup, like any Optimistic Rollup protocol, does have one limitation, and that's the delay on withdrawals.
As for the concrete length of the delay, we've done a good deal of internal modeling and have blogged about this as well. Our current modeling suggests a 3-hour delay is sufficient (but as discussed in the linked post there is a tradeoff space between the length of the challenge period and the size of the validators’ deposit).
Note that this doesn't mean that the chain is delayed for three hours. Arbitrum Rollup supports pipelining of execution, which means that validators can keep building new states even while previous ones are “in the pipeline” for confirmation. As the challenge delays expire for each update, a new state will be confirmed (read more about this here).
So activity and progress on the chain are not delayed by the challenge period. The only thing that's delayed is the consummation of withdrawals. Recall though that any single honest validator knows immediately (at the speed of L1 finality) which state updates are correct and can guarantee that they will eventually be confirmed, so once a valid withdrawal has been requested on-chain, every honest party knows that the withdrawal will definitely happen. There's a natural place here for a liquidity market in which a validator (or someone who trusts a validator) can provide withdrawal loans for a small interest fee. This is a no-risk business for them as they know which withdrawals will be confirmed (and can force their confirmation trustlessly no matter what anyone else does) but are just waiting for on-chain finality.
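The pipelining described above can be illustrated with a toy schedule (the numbers are assumptions for illustration: one assertion posted every five minutes and the 3-hour challenge window modeled earlier):

```python
# Toy timeline of pipelined Rollup assertions (assumed numbers:
# one assertion every 5 minutes, a 180-minute challenge window).
CHALLENGE_MINUTES = 180
POST_INTERVAL = 5

# Each assertion is posted on schedule and confirmed one challenge
# window later; posting never waits for earlier confirmations.
assertions = [
    {"posted": i * POST_INTERVAL,
     "confirmed": i * POST_INTERVAL + CHALLENGE_MINUTES}
    for i in range(10)
]

# Chain progress is set by the posting rate; only withdrawal finality
# lags behind, by the fixed challenge window.
```

Note that many assertions are posted (and their states are usable on the chain) well before the first one is even confirmed, which is exactly why only withdrawals, not chain activity, feel the delay.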
3. The recipe: How Arbitrum Rollup works
For a description of the technical components of Arbitrum Rollup and how they interact to create a highly scalable protocol with a developer experience that is identical to Ethereum, please refer to the following documents:
Arbitrum Rollup Whitepaper
Arbitrum academic paper (describes a previous version of Arbitrum)
4. Developer docs and APIs
For full details about how to set up and interact with an Arbitrum Rollup chain or validator, please refer to our developer docs, which can be found at https://developer.offchainlabs.com/.
Note that the Arbitrum version described on that site is older and will soon be replaced by the version we are entering in Reddit Bake-Off, which is still undergoing internal testing before public release.
5. Who are the validators?
As with any Layer 2 protocol, advancing the protocol correctly requires at least one validator (sometimes called a block producer) that is honest and available. A natural question is: who are the validators?
Recall that the validator set for an Arbitrum chain is open and permissionless; anyone can start or stop validating at will. (A useful analogy is to full nodes on an L1 chain.) But we understand that even though anyone can participate, Reddit may want to guarantee that highly reputable nodes are validating their chain. Reddit may choose to validate the chain themselves and/or hire third-party validators.

To this end, we have begun building a marketplace for validator-for-hire services so that dapp developers can outsource validation services to reputable nodes with high up-time. We've announced a partnership in which Chainlink nodes will provide Arbitrum validation services, and we expect to announce more partnerships shortly with other blockchain infrastructure providers.
Although there is no requirement that validators are paid, Arbitrum’s economic model tracks validators’ costs (e.g. amount of computation and storage) and can charge small fees on user transactions, using a gas-type system, to cover those costs. Alternatively, a single party such as Reddit can agree to cover the costs of invited validators.
6. Reddit Contract Support
Since Arbitrum contracts and transactions are byte-for-byte compatible with Ethereum, supporting the Reddit contracts is as simple as launching them on an Arbitrum chain.
Minting. Arbitrum Rollup supports hybrid L1/L2 tokens which can be minted in L2 and then withdrawn onto the L1. An L1 contract at address A can make a special call to the EthBridge which deploys a "buddy contract" to the same address A on an Arbitrum chain. Since it's deployed at the same address, users can know that the L2 contract is the authorized "buddy" of the L1 contract on the Arbitrum chain.
For minting, the L1 contract is a standard ERC-20 contract which mints and burns tokens when requested by the L2 contract. It is paired with an ERC-20 contract in L2 which mints tokens based on whatever programmer provided minting facility is desired and burns tokens when they are withdrawn from the rollup chain. Given this base infrastructure, Arbitrum can support any smart contract based method for minting tokens in L2, and indeed we directly support Reddit's signature/claim based minting in L2.
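The mint-and-withdraw flow between the paired contracts can be sketched as a toy model (Python, purely illustrative; the class and method names are hypothetical, not Arbitrum's actual interfaces):

```python
# Toy model (names hypothetical) of the hybrid L1/L2 token pairing:
# the L2 side mints and burns; the L1 side mints only when it sees a
# withdrawal message from its L2 buddy contract.
class L2Token:
    def __init__(self):
        self.balances = {}

    def mint(self, user, amount):
        # Stands in for whatever programmer-provided minting facility
        # the L2 contract exposes (e.g. Reddit's claim logic).
        self.balances[user] = self.balances.get(user, 0) + amount

    def withdraw(self, user, amount):
        # Burn on L2 and emit a message for the L1 buddy contract.
        assert self.balances.get(user, 0) >= amount
        self.balances[user] -= amount
        return ("withdrawal", user, amount)


class L1Token:
    def __init__(self):
        self.balances = {}

    def finalize_withdrawal(self, msg):
        # Mint on L1 only in response to a burn on L2.
        kind, user, amount = msg
        assert kind == "withdrawal"
        self.balances[user] = self.balances.get(user, 0) + amount
```

In the real system the pairing is enforced by deploying the L2 contract at the same address as its L1 counterpart via the EthBridge; this toy model elides that mechanism and the challenge-period delay on withdrawals.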
Batch minting. What's better than a mint cookie? A whole batch! In addition to supporting Reddit’s current minting/claiming scheme, we built a second minting design, which we believe outperforms the signature/claim system in many scenarios.
In the current system, Reddit periodically issues signed statements to users, who then take those statements to the blockchain to claim their tokens. An alternative approach would have Reddit directly submit the list of users/amounts to the blockchain and distribute the tokens to the users without the signature/claim process.
To optimize the cost efficiency of this approach, we designed an application-specific compression scheme to minimize the size of the batch distribution list. We analyzed the data from Reddit's previous distributions and found that the data is highly compressible since token amounts are small and repeated, and addresses appear multiple times. Our function groups transactions by size, and replaces previously-seen addresses with a shorter index value. We wrote client code to compress the data, wrote a Solidity decompressing function, and integrated that function into Reddit’s contract running on Arbitrum.
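The address-deduplication part of the idea can be sketched as follows (a simplified Python illustration; the production scheme's actual encoding, field widths, and size-grouping are not reproduced here, and the function names are our own):

```python
# Simplified sketch of address deduplication for batch minting: an
# address seen before is replaced by the index of its first appearance,
# which is far shorter than repeating the full 20-byte address.
def compress_batch(mints):
    """mints: list of (address, amount) pairs."""
    first_seen = {}  # address -> index of first appearance
    out = []
    for addr, amount in mints:
        if addr in first_seen:
            out.append(("idx", first_seen[addr], amount))  # short back-ref
        else:
            first_seen[addr] = len(first_seen)
            out.append(("addr", addr, amount))  # full address, first time
    return out


def decompress_batch(stream):
    """Inverse of compress_batch; on-chain this role is played by the
    Solidity decompressor mentioned above."""
    table, mints = [], []
    for kind, ref, amount in stream:
        if kind == "addr":
            table.append(ref)
            addr = ref
        else:
            addr = table[ref]
        mints.append((addr, amount))
    return mints
```

Because repeat claimants are common in the distribution data, back-references dominate after the first few batches, which is where most of the size reduction comes from.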
When we ran the compression function on the previous Reddit distribution data, we found that we could compress batched minting data down to 11.8 bytes per minting event (averaged over a 6-month trace of Reddit’s historical token grants), compared with roughly 174 bytes of on-chain data needed for the signature/claim approach to minting (roughly 43 for an RLP-encoded null transaction + 65 for Reddit's signature + 65 for the user's signature + roughly 8 for the number of Points).
The relative benefit of the two approaches with respect to on-chain call data cost depends on the percentage of users that will actually claim their tokens on chain. With the above figures, batch minting will be cheaper if roughly 5% of users redeem their claims. We stress that our compression scheme is not Arbitrum-specific and would be beneficial in any general-purpose smart contract platform.
8. Benchmarks and costs
In this section, we give the full costs of operating the Reddit contracts on an Arbitrum Rollup chain including the L1 gas costs for the Rollup chain, the costs of computation and storage for the L2 validators as well as the capital lockup requirements for staking.
Arbitrum Rollup is still on testnet, so we did not run mainnet benchmarks. Instead, we measured the L1 gas cost and L2 workload for Reddit operations on Arbitrum and calculated the total cost assuming current Ethereum gas prices. As noted below in detail, our measurements do not assume that Arbitrum is consuming the entire capacity of Ethereum. We will present the details of our model now, but for full transparency you can also play around with it yourself and adjust the parameters, by copying the spreadsheet found here.
Our cost model is based on measurements of Reddit’s contracts, running unmodified (except for the addition of a batch minting function) on Arbitrum Rollup on top of Ethereum.
On the distribution of transactions and frequency of assertions. Reddit's instructions specify the following minimum parameters that submissions should support:
Over a 5 day period, your scaling PoC should be able to handle:
We note that assuming that there are only 300,000 transactions that arrive uniformly over the 5 day period will make our benchmark numbers lower, but we believe that this will reflect the true cost of running the system. To see why, say that batches are submitted every five minutes (20 L1 blocks) and there's a fixed overhead of c bytes of calldata per batch, the cost of which gets amortized over all transactions executed in that batch. Assume that each individual transaction adds a marginal cost of t. Lastly, assume the capacity of the scaling system is high enough that it can support all of Reddit's 300,000 transactions within a single 20-block batch (i.e., that there is more than c + 300,000*t bytes of calldata available in 20 blocks).
Consider what happens if c, the per-batch overhead, is large (which it is in some systems, but not in Arbitrum). In the scenario that transactions actually arrive at the system's capacity and each batch is full, then c gets amortized over 300,000 transactions. But if we assume that the system is not running at capacity--and only receives 300,000 transactions arriving uniformly over 5 days-- then each 20-block assertion will contain about 200 transactions, and thus each transaction will pay a nontrivial cost due to c.
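The arithmetic behind this amortization argument can be made concrete (illustrative numbers only; c = 10,000 bytes is an assumed per-batch overhead, not a measured figure for any particular system):

```python
# Illustrative arithmetic for amortizing a fixed per-batch overhead c
# over however many transactions actually land in each batch.
def per_tx_overhead(c_bytes, txs_per_batch):
    return c_bytes / txs_per_batch

TOTAL_TXS = 300_000
BATCHES = 5 * 24 * 12                 # one batch per 5 minutes for 5 days
uniform_batch = TOTAL_TXS / BATCHES   # ~208 txs per batch, uniform arrival

at_capacity = per_tx_overhead(10_000, TOTAL_TXS)     # one full mega-batch
at_uniform = per_tx_overhead(10_000, uniform_batch)  # realistic arrival

# Uniform arrival pays BATCHES (= 1440) times more overhead per
# transaction, so a large c matters exactly when batches are not full.
```

This is why a benchmark that assumes one full-capacity mega-batch can understate per-transaction cost by orders of magnitude relative to uniform arrival.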
We are aware that other proposals presented scaling numbers assuming that 300,000 transactions arrived at maximum capacity and were executed in a single mega-transaction, but according to our estimates, for at least one such report, this led to a reported gas price that was 2-3 orders of magnitude lower than it would have been assuming uniform arrival. We make more realistic batching assumptions, and we believe Arbitrum compares well when batch sizes are realistic.
Our model. Our cost model includes several sources of cost:
Our model also shows that Arbitrum can scale to workloads much larger than Reddit’s nominal workload, without exhausting L1 or L2 resources. The scaling bottleneck will ultimately be calldata on the L1 chain. We believe that cost could be reduced substantially if necessary by clever encoding of data. (In our design any compression / decompression of L2 transaction calldata would be done by client software and L2 programs, never by an L1 contract.)
9. Status of Arbitrum Rollup
Arbitrum Rollup is live on Ethereum testnet. All of the code written to date including everything included in the Reddit demo is open source and permissively licensed under the Apache V2 license. The first testnet version of Arbitrum Rollup was released on testnet in February. Our current internal version, which we used to benchmark the Reddit contracts, will be released soon and will be a major upgrade.
Both the Arbitrum design as well as the implementation are heavily audited by independent third parties. The Arbitrum academic paper was published at USENIX Security, a top-tier peer-reviewed academic venue. For the Arbitrum software, we have engaged Trail of Bits for a security audit, which is currently ongoing, and we are committed to have a clean report before launching on Ethereum mainnet.
10. Reddit Universe Arbitrum Rollup Chain
The benchmarks described in this document were all measured using the latest internal build of our software. When we release the new software upgrade publicly we will launch a Reddit Universe Arbitrum Rollup chain as a public demo, which will contain the Reddit contracts as well as a Uniswap instance and a Connext Hub, demonstrating how Community Points can be integrated into third party apps. We will also allow members of the public to dynamically launch ecosystem contracts. We at Offchain Labs will cover the validating costs for the Reddit Universe public demo.
If the folks at Reddit would like to evaluate our software prior to our public demo, please email us at [email protected] and we'd be more than happy to provide early access.
11. Even more scaling: Arbitrum Sidechains
Rollups are an excellent approach to scaling, and we are excited about Arbitrum Rollup which far surpasses Reddit's scaling needs. But looking forward to Reddit's eventual goal of supporting hundreds of millions of users, there will likely come a time when Reddit needs more scaling than any Rollup protocol can provide.
While Rollups greatly reduce costs, they don't break the linear barrier. That is, all transactions have an on-chain footprint (because all calldata must be posted on-chain), albeit a far smaller one than on native Ethereum, and the L1 limitations end up being the bottleneck for capacity and cost. Since Ethereum has limited capacity, this linear use of on-chain resources means that costs will eventually increase superlinearly with traffic.
The good news is that we at Offchain Labs have a solution in our roadmap that can satisfy this extreme-scaling setting as well: Arbitrum AnyTrust Sidechains. Arbitrum Sidechains are similar to Arbitrum Rollup, but deviate in that they name a permissioned set of validators. When a chain’s validators agree off-chain, they can greatly reduce the on-chain footprint of the protocol and require almost no data to be put on-chain. When validators can't reach unanimous agreement off-chain, the protocol reverts to Arbitrum Rollup. Technically, Arbitrum Sidechains can be viewed as a hybrid between state channels and Rollup, switching back and forth as necessary, and combining the performance and cost that state channels can achieve in the optimistic case, with the robustness of Rollup in other cases. The core technical challenge is how to switch seamlessly between modes and how to guarantee that security is maintained throughout.
Arbitrum Sidechains break through this linear barrier, while still maintaining a high level of security and decentralization. Arbitrum Sidechains provide the AnyTrust guarantee, which says that as long as any one validator is honest and available (even if you don't know which one will be), the L2 chain is guaranteed to execute correctly according to its code and guaranteed to make progress. Unlike in a state channel, offchain progress does not require unanimous consent, and liveness is preserved as long as there is a single honest validator.
Note that the trust model for Arbitrum Sidechains is much stronger than for typical BFT-style chains, which introduce a consensus "voting" protocol among a small permissioned group of validators. BFT-based protocols require a supermajority (more than 2/3) of validators to agree. In Arbitrum Sidechains, by contrast, all you need is a single honest validator to achieve guaranteed correctness and progress. Notice that in Arbitrum, adding validators strictly increases security, since the AnyTrust guarantee provides correctness as long as any one validator is honest and available. By contrast, in BFT-style protocols, adding nodes can be dangerous, as a coalition of dishonest nodes can break the protocol.
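The way the AnyTrust guarantee strengthens with each added validator can be illustrated numerically (assuming, for illustration only, that validators fail independently with the same probability):

```python
# Assumes independent validator failures with probability (1 - p_honest)
# each; real deployments are not independent, so this is illustrative only.
def anytrust_failure(p_honest, n_validators):
    # AnyTrust fails only if *every* validator is dishonest or offline,
    # so each added validator multiplies the failure odds by (1 - p_honest).
    return (1 - p_honest) ** n_validators
```

With 90%-reliable validators, each added node cuts the failure probability by another factor of ten; a BFT-style quorum requirement has no such monotone improvement, since new nodes also enlarge the set a dishonest coalition can be drawn from.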
Like Arbitrum Rollup, the developer and user experiences for Arbitrum Sidechains will be identical to that of Ethereum. Reddit would be able to choose a large and diverse set of validators, and all that they would need to guarantee to break through the scaling barrier is that a single one of them will remain honest.
We hope to have Arbitrum Sidechains in production in early 2021, and thus when Reddit reaches the scale that surpasses the capacity of Rollups, Arbitrum Sidechains will be waiting and ready to help.
While the idea to switch between channels and Rollup to get the best of both worlds is conceptually simple, getting the details right and making sure that the switch does not introduce any attack vectors is highly non-trivial and has been the subject of years of our research (indeed, we were working on this design for years before the term Rollup was even coined).
12. How Arbitrum compares
We include a comparison to several other categories, as well as specific projects where appropriate, and explain why we believe that Arbitrum is best suited for Reddit's purposes. We focus our attention on other Ethereum projects.
Payment only Rollups. Compared to Arbitrum Rollup, ZK-Rollups and other Rollups that only support token transfers have several disadvantages:
While it's ultimately up to the judges’ palate, we believe that Arbitrum Rollup is the bakeoff choice that Reddit kneads. We far surpass Reddit's specified workload requirement at present, have much room to optimize Arbitrum Rollup in the near term, and have a clear path to get Reddit to hundreds of millions of users. Furthermore, we are the only project that gives developers and users the identical interface as the Ethereum blockchain and is fully interoperable and tooling-compatible, and we do this all without any new trust assumptions or centralized components.
But no matter how the cookie crumbles, we're glad to have participated in this bake-off and we thank you for your consideration.
About Offchain Labs
Offchain Labs, Inc. is a venture-funded New York company that spun out of Princeton University research, and is building the Arbitrum platform to usher in the next generation of scalable, interoperable, and compatible smart contracts. Offchain Labs is backed by Pantera Capital, Compound VC, Coinbase Ventures, and others.
Ed Felten is Co-founder and Chief Scientist at Offchain Labs. He is on leave from Princeton University, where he is the Robert E. Kahn Professor of Computer Science and Public Affairs. From 2015 to 2017 he served at the White House as Deputy United States Chief Technology Officer and senior advisor to the President. He is an ACM Fellow and member of the National Academy of Engineering. Outside of work, he is an avid runner, cook, and L.A. Dodgers fan.
Steven Goldfeder is Co-founder and Chief Executive Officer at Offchain Labs. He holds a PhD from Princeton University, where he worked at the intersection of cryptography and cryptocurrencies including threshold cryptography, zero-knowledge proof systems, and post-quantum signatures. He is a co-author of Bitcoin and Cryptocurrency Technologies, the leading textbook on cryptocurrencies, and he has previously worked at Google and Microsoft Research, where he co-invented the Picnic signature algorithm. When not working, you can find Steven spending time with his family, taking a nature walk, or twisting balloons.
Harry Kalodner is Co-founder and Chief Technology Officer at Offchain Labs, where he leads the engineering team. Before founding the company he attended Princeton as a Ph.D. candidate, where his research explored the economics, anonymity, and incentive compatibility of cryptocurrencies; he has also worked at Apple. When not up at 3:00am writing code, Harry occasionally sleeps.
A deep dive into threshold signature without mathematics, by ARPA's cryptographer Dr. Alex Su
Threshold signature is a distributed multi-party signature protocol that includes distributed key generation, signature, and verification algorithms.
In recent years, with the rapid development of blockchain technology, signature algorithms have gained widespread attention in both academic research and real-world applications. Properties such as security, practicability, scalability, and decentralization are being closely examined.
Because blockchain and signatures are closely connected, the development of signature algorithms and the introduction of new signature paradigms directly affect the characteristics and efficiency of blockchain networks.
In addition, the institutional and personal key management requirements stimulated by distributed ledgers have spawned many wallet applications, and this change has also affected traditional enterprises. Whether in blockchain or in traditional financial institutions, the threshold signature scheme can bring security and privacy improvements in various scenarios. As an emerging technology, threshold signatures are still under academic research and discussion, with unverified security risks and practical problems remaining.
This article starts from the technical rationale and discusses how cryptography relates to blockchain. We then compare multi-party computation with threshold signature, discuss the pros and cons of different signature paradigms, and end with a list of threshold signature use cases, so that the reader can quickly get up to speed on the topic.
I. Cryptography in Daily Life
Before introducing threshold signatures, let's get a general understanding of cryptography. How does cryptography protect digital information? How does one create an identity in the digital world? At the very beginning, people wanted secure storage and transmission. After one creates a key, one can use symmetric encryption to store secrets. If two people share the same key, they can achieve secure transmission between them: for example, the king encrypts a command and the general decrypts it with the corresponding key.
But when two people have no secure channel to use, how can they create a shared key? The key exchange protocol came into being for this purpose. Analogously, if the king issues an order to all the people in the digital world, how can everyone prove that the sentence originated from the king? For this, the digital signature protocol was invented. Both protocols are based on public key cryptography, also known as asymmetric cryptography.
The “Tiger Rune” was a troop deployment tool used by ancient emperors: a bronze or gold token in the shape of a tiger, split in half, with one half given to the general and the other kept by the emperor. Only when the two tiger amulets were combined did the holder gain the right to dispatch troops.
Symmetric and asymmetric encryption constitute the main components of modern cryptography. Both have three fixed parts: key generation, encryption, and decryption. Here, we focus on digital signature protocols. The key generation process generates a pair of associated keys: the public key and the private key. The public key is open to everyone, while the private key represents the identity and is revealed only to its owner; whoever owns the private key holds the identity the key represents. The encryption algorithm, or signature algorithm, takes the private key as input and generates a signature on a piece of information. The decryption algorithm, or signature verification algorithm, uses the public key to verify the validity of the signature and the correctness of the information.
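The three fixed parts can be shown with a toy "textbook RSA" signature. The tiny primes and the absence of hashing and padding are purely for exposition; no real system should sign this way.

```python
# Toy textbook-RSA signature illustrating key generation, signing,
# and verification. Deliberately insecure: real schemes use large
# primes, hashing, and padding.

def keygen():
    p, q = 61, 53                  # small primes (illustrative only)
    n, phi = p * q, (p - 1) * (q - 1)
    e = 17                         # public exponent
    d = pow(e, -1, phi)            # private exponent: e*d = 1 (mod phi)
    return (n, e), (n, d)          # (public key, private key)

def sign(private_key, m):
    n, d = private_key
    return pow(m, d, n)            # only the private-key holder can do this

def verify(public_key, m, sig):
    n, e = public_key
    return pow(sig, e, n) == m     # anyone with the public key can check

pub, priv = keygen()
sig = sign(priv, 65)
print(verify(pub, 65, sig))        # True
print(verify(pub, 66, sig))        # False: the message was altered
```

The asymmetry is the whole point: signing needs the private exponent, while verification needs only the public key.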
II. Signature in the Blockchain
Looking back at blockchain: it uses a consensus algorithm to construct a distributed ledger, and signatures provide identity information for the blockchain. Every transaction on the blockchain is identified by the signature of its initiator, and the blockchain can verify the signature according to specific rules to check the transaction's validity, all thanks to the immutability and verifiability of signatures.
For cryptography, blockchain is more than its use of signature protocols, or the fact that Proof-of-Work consensus relies on a hash function. Blockchain provides an infrastructure layer for consensus and transactions, and on top of it novel cryptographic protocols such as secure multi-party computation, zero-knowledge proofs, and homomorphic encryption thrive. For example, secure multi-party computation, which is naturally adapted to distributed networks, can support secure data transfer and machine learning platforms on the blockchain, while the special nature of zero-knowledge proofs makes verifiable anonymous transactions feasible. The combination of these cutting-edge cryptographic protocols with blockchain technology will drive the development of the digital world in the next decade, leading to secure data sharing, privacy protection, and applications not yet imagined.
III. Secure Multi-party Computation and Threshold Signature
Having introduced how the digital signature protocol affects our lives and how it helps the blockchain establish identities and record transactions, we now turn to secure multi-party computation (MPC), from which we can see how threshold signatures achieve decentralization. For more about MPC, please refer to our previous posts, which detail the technical background and application scenarios.
MPC, by definition, is a secure computation that several participants jointly execute. Security here means that, in a single computation, all participants provide their own private inputs and can obtain the result, but cannot learn any private information entered by the other parties. In 1982, when Prof. Yao proposed the concept of MPC, he gave an example called the “Millionaires' Problem”: two millionaires want to know who is richer without revealing their true assets. Specifically, secure multi-party computation cares about the following properties:
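To give a minimal flavor of "jointly compute without revealing inputs", here is a toy additive-secret-sharing computation of a sum. (Yao's millionaires' comparison itself needs heavier machinery such as garbled circuits; a sum keeps the sketch short, and all names here are our own.)

```python
# Three parties jointly compute the SUM of their private inputs using
# additive secret sharing: each input is split into random shares, and
# no single share (or partial sum) reveals any individual input.
import random

P = 2**61 - 1  # arithmetic is done modulo a public prime

def share(secret, n=3):
    """Split `secret` into n random shares that sum to it mod P."""
    shares = [random.randrange(P) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

inputs = [42, 7, 100]                 # each party's private value
all_shares = [share(x) for x in inputs]

# Party j collects the j-th share of every input and sums them locally;
# only the combination of all partial sums reveals the total.
partial = [sum(col) % P for col in zip(*all_shares)]
print(sum(partial) % P)               # 149: the total, and nothing else
```

Each party ends up knowing the total (149) while learning nothing about the other parties' individual inputs.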
IV. Single Signature, Multi-Signature and Threshold Signature
Besides the threshold signature, what other methods can we choose?
Bitcoin originally used single signatures, allocating each account one private key; a message signed by that key is considered legitimate. Later, to avoid single points of failure and to allow account management by multiple people, Bitcoin added a multi-signature function. Multi-signature can be understood simply: each account owner signs in turn, all signatures are posted to the chain, and the chain verifies them in order. When certain conditions are met, the transaction is legitimate. This achieves control by multiple private keys.
So, what’s the difference between multi-signature and threshold signature?
Several constraints of multi-signature are:
With both multi-signature and threshold signature, the master private key is never reconstructed, not even in memory or cache, whereas schemes that split and later reassemble a single key do reconstruct it briefly; for vital accounts, even such short-term reconstruction is not tolerable.
As with other secure multi-party computation protocols, the introduction of additional participants makes the security model different from traditional point-to-point encrypted transmission. Earlier algorithms did not account for collusion or malicious participants; the behavior of physical entities cannot be restricted, and wrongdoers may be introduced into the participating group.
Therefore, multi-party cryptographic protocols cannot attain the same security strength as before. Effort is still needed to develop threshold signature applications, integrate them with existing infrastructure, and test the true strength of threshold signature schemes.
1. Key Management
Using threshold signatures in a key management system allows more flexible administration, as in ARPA's enterprise key management API. One can use the access structure to design authorization patterns for users with different priorities. In addition, when new entities join, threshold signatures allow the key shares to be refreshed quickly; this operation can also be performed periodically to raise the difficulty of compromising multiple private key shares at the same time. Finally, to the verifier a threshold signature is indistinguishable from a traditional signature, so it is compatible with old equipment and reduces upgrade costs. ARPA's enterprise key management modules already support the Elliptic Curve Digital Signature Algorithm with secp256k1 and ed25519 parameters, and will be compatible with more parameters in the future.
2. Crypto Wallet
Wallets based on threshold signatures are more secure because the private key never needs to be rebuilt. Also, since not all signatures are posted publicly, anonymity can be achieved. Compared to multi-signature, threshold signatures incur lower transaction fees. As with key management applications, the administration of digital asset accounts can also be more flexible. Furthermore, a threshold signature wallet can support blockchains that do not natively support multi-signature, which reduces the risk of smart contract bugs.
Conclusion

This article describes why people need threshold signatures and what inspiring properties they bring. One can see that threshold signatures offer higher security, more flexible control, and a more efficient verification process. Different signature technologies suit different application scenarios, such as the aggregate signatures and BLS-based multi-signatures not covered here. Readers are also welcome to read more about secure multi-party computation. Secure computation is the holy grail of cryptographic protocols; it can accomplish much more than threshold signatures, and in the near future it will solve more concrete application problems in the digital world.
About the Author

Dr. Alex Su works at ARPA as a cryptography researcher. He received his Bachelor's degree in Electronic Engineering and his Ph.D. in Cryptography from Tsinghua University. His research interests include multi-party computation and post-quantum cryptography implementation and acceleration.
About ARPA

ARPA is committed to providing secure data transfer solutions based on cryptographic operations for businesses and individuals.
The ARPA secure multi-party computing network can be used as a protocol layer to implement privacy computing capabilities for public chains, and it enables developers to build efficient, secure, and data-protected business applications on private smart contracts. Enterprise and personal data can, therefore, be analyzed securely on the ARPA computing network without fear of exposing the data to any third party.
ARPA’s multi-party computation technology supports secure data markets, precision marketing, credit score calculations, and even the secure monetization of personal data.
ARPA’s core team is international, with Ph.D.s in cryptography from Tsinghua University; experienced systems engineers from Google, Uber, Amazon, Huawei, and Mitsubishi; blockchain experts from the University of Tokyo, AIG, and the World Bank; data scientists from CircleUp; and financial and data professionals from Fosun and Fidelity Investments.
For more information about ARPA, or to join our team, please contact us at [email protected].
Learn about ARPA’s recent official news:
Telegram (English): https://t.me/arpa_community
Telegram (Việt Nam): https://t.me/ARPAVietnam
Telegram (Russian): https://t.me/arpa_community_ru
Telegram (Indonesian): https://t.me/Arpa_Indonesia
Telegram (Thai): https://t.me/Arpa_Thai
Telegram (Turkish): https://t.me/Arpa_Turkey
Korean Chats: https://open.kakao.com/o/giExbhmb (Kakao) & https://t.me/arpakoreanofficial (Telegram, new)
In game theory, the Tragedy of the Commons is a market failure scenario where a common good is produced in lower quantities than the public desires, or consumed in greater quantities than desired. One example is pollution: it is in the public's best interest not to pollute, but every individual has an incentive to pollute (e.g. because burning fossil fuel is cheap, and individually each consumer doesn't affect the environment much).

The relevance to Bitcoin is a hypothetical market failure that might happen in the far future, when the block reward from mining drops near zero. In the current Bitcoin design, the only income miners earn at that point is transaction fees. Miners will accept transactions with any fees (because the marginal cost of including them is minimal) and users will pay lower and lower fees (on the order of satoshis). It is possible that honest miners will be under-incentivized and too few miners will mine, resulting in lower difficulty than what the public desires. This might mean various 51% attacks will happen frequently, and Bitcoin will not function correctly.

The Bitcoin protocol can be altered to combat this problem. One proposed solution is Dominant Assurance Contracts. Another, more radical proposal (in the sense that the required change won't be accepted by most bitcoiners) is a perpetual reward that is constant in proportion to the monetary base. That can be achieved in two ways: an ever increasing reward (inflatacoin/expocoin), or a constant reward plus a demurrage fee on all funds that caps the monetary base (freicoin).
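The "reward drops near zero" premise can be sketched numerically. The function below approximates Bitcoin's subsidy schedule (the real client computes the subsidy with integer satoshi right-shifts, so treat this float version as an illustration only):

```python
# Bitcoin's block subsidy halves every 210,000 blocks, so the reward
# component of miner income tends to zero over time and transaction
# fees must eventually dominate -- the premise of the concern above.

def block_subsidy_btc(height: int) -> float:
    halvings = height // 210_000
    if halvings >= 64:           # subsidy has been shifted below 1 satoshi
        return 0.0
    return 50.0 / (2 ** halvings)

# Heights roughly illustrating the decay (years are approximate):
for year, height in [(2009, 0), (2013, 250_000), (2021, 700_000), (2033, 1_260_000)]:
    print(year, block_subsidy_btc(height))
```

After a few decades the subsidy is a rounding error, which is exactly when the fee-equilibrium question becomes live.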
This scenario was discussed in several threads:

- Tragedy of the Commons - Disturbingly low future difficulty equilibrium: https://bitcointalk.org/index.php?topic=6284.0
- Stack Exchange: http://bitcoin.stackexchange.com/questions/3111/will-bitcoin-suffer-from-a-mining-tragedy-of-the-commons-when-mining-fees-drop-t

Currently there is no consensus on whether this problem is real, and if so, what the best solution is.

Source: https://en.bitcoin.it/wiki/Tragedy_of_the_Commons
Dominant assurance contracts

Monero doesn't have a lot of scripting options to work with currently, so it is very hard for me to understand how one might go about creating a Dominant Assurance Contract using Monero, especially in regards to paying out to a miner address.
Dominant assurance contracts, created by Alex Tabarrok, involve an extra component, an entrepreneur who profits when the quorum is reached and pays the signors extra if it is not. If the quorum is not formed, the signors do not pay their share and indeed actively profit from having participated since they keep the money the entrepreneur paid them. Conversely, if the quorum succeeds, the entrepreneur is compensated for taking the risk of the quorum failing. Thus, a player will benefit whether or not the quorum succeeds; if it fails he reaps a monetary return, and if it succeeds, he pays only a small amount more than under an assurance contract, and the public good will be provided.
Tabarrok asserts that this creates a dominant strategy of participation for all players. Because all players calculate that it is in their best interest to participate, the contract will succeed, and the entrepreneur will be rewarded. In a meta-game, this reward is an incentive for other entrepreneurs to enter the DAC market, driving down the cost disadvantage of dominant assurance contracts versus regular assurance contracts.
This scheme is an attempt at Mike Hearn's exercise for the reader: an implementation of dominant assurance contracts. The scheme requires the use of multisignature transactions, nLockTime and transaction replacement which means it won't work until these features are available on the Bitcoin network.
A vendor agrees to produce a good if X BTC are raised by date D, and to pay Y BTC to each of n contributors if X BTC are not raised by date D, or to pay nY BTC if X BTC are raised and the vendor fails to produce the good to the satisfaction of 2 of 3 independent arbitrators picked through a fair process
The arbitrators specify a 2-of-3 multisignature script to use as an output for the fundraiser with a public key from each arbitrator, which will allow them to judge the performance on actually producing the good
For each contributor:
The vendor and the contributor exchange public keys
They create a 2-of-2 multisignature output from those public keys
With no change, they create but do not sign a transaction with an input of X/n BTC from the contributor and an input of Y BTC from the vendor, with X/n+Y going to the output created in 3.2
The contributor creates a transaction where the output is X+nY to the address created in step 2 and the input is the output of the transaction in 3.3, signs it using SIGHASH_ALL | SIGHASH_ANYONECANPAY, with version = UINT_MAX and gives it to the vendor
The vendor creates a transaction of the entire balance of the transaction in 3.3 to the contributor with nLockTime of D and version < UINT_MAX, signs it and gives it to the contributor
The vendor and contributor then both sign the transaction in 3.3 and broadcast it to the network, making the transaction in 3.4 valid when enough contributors participate and the transaction in 3.5 valid when nLockTime expires
As date D nears, nLockTime comes close to expiration.
If enough (n) people contribute, all of the inputs from 3.4 can combine to make the output valid when signed by the vendor, creating a valid transaction sending that money to the arbitrators, which only agree to release the funds when the vendor produces a satisfactory output
If not enough people (fewer than n) contribute, the transactions from 3.5 become valid when nLockTime expires, returning to each contributor the entire balance of their 3.3 transaction, i.e. their X/n BTC plus the vendor's Y BTC
Note that there is a limit past which it can be more profitable for the vendor to make the remaining contributions himself as D approaches
Now the arbitrators have control of X (the payment from the contributors) + nY (the performance bond from the vendor) BTC and pay the vendor only when the vendor performs satisfactorily
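A stylized payoff model (our own, abstracting away the Bitcoin-script mechanics above) shows why participating is attractive either way; the variables v, x, y and the sample numbers are our hypothetical choices:

```python
# Dominant-assurance-contract payoff sketch: each contributor pays x
# if the quorum forms, and keeps a refund-plus-bonus y if it fails.
# v is the contributor's private value for the public good (v > x).

def contributor_payoff(participates: bool, quorum_forms: bool,
                       v: float, x: float, y: float) -> float:
    if not participates:
        return v if quorum_forms else 0.0   # free-ride, or get nothing
    if quorum_forms:
        return v - x                        # good provided, share paid
    return y                                # signed but quorum failed: keep bonus

# With v=10, x=8, y=1: a participant gets 1 on failure (vs 0 for a
# non-participant) and v - x = 2 on success, the intuition behind
# Tabarrok's "dominant strategy" claim.
print(contributor_payoff(True, False, 10, 8, 1))   # 1.0
print(contributor_payoff(True, True, 10, 8, 1))    # 2.0
```

A full dominance argument also depends on how pivotal each contributor is to the quorum, but the sign of the incentives is already visible in this sketch.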
Such contracts can be used for crowdfunding. Notable examples from Mike Hearn include:
Funding Internet radio stations which don't want to play ads: donations are the only viable revenue source as pay-for-streaming models allow undercutting by subscribers who relay the stream to their own subscribers
Automatically contributing to the human translation of web pages
I have a draft writeup for a merged-input system called MoJoin that allows multiple parties to generate a single transaction. The goal is to complete the transaction merging with no trust in any party, but this introduces significant complexity and may not be possible with the known Bulletproofs multiparty computation scheme. My current version of MoJoin assumes partial trust in a dealer, who learns the mappings between input rings and outputs (but not true spends or Pedersen commitment data).
OMGFIN is extremely proud to announce yet another great project coming to our trading platform. REN (Ren) is now available on OMGFIN; you can deposit and withdraw REN now. Supported trading pairs include REN/BTC, REN/ETH, REN/UQC, and REN/USDT. Please take note of the following schedule:
REN (Ren) trading: 12:00 Sept 4, 2019 (UTC+8)
Get 25% cashback on trading fees at OMGFIN for all REN (Ren) coin trading pairs.
Don't miss out :

BTC Market : https://omgfin.com/exchange/trade/market/RENBTC

USDT Market : https://omgfin.com/exchange/trade/market/RENUSDT
UQC Market : https://omgfin.com/exchange/trade/market/RENUQC
ETH Market : https://omgfin.com/exchange/trade/market/RENETH
REN (Ren) Introduction:
Ren is an open-source decentralized dark pool for trustless cross-chain atomic trading of Ether, ERC20 tokens, and Bitcoin. REN is an ERC20 token built on the Ethereum network. Ren is an ecosystem for building, deploying, and running general-purpose, privacy-preserving applications using zkSNARKs and its own newly developed secure multiparty computation protocol. It makes it possible for any kind of application to run in a decentralized, trustless, and fault-tolerant environment similar to blockchains, but with the distinguishing feature that all application inputs, outputs, and state remain secret even to the participants running the network.
REN (Ren) Official Website: https://renproject.io/

REN (Ren) Whitepaper: click here
Risk Warning: Investing in cryptocurrency is akin to being a venture capital investor. The cryptocurrency market is available worldwide 24 x 7 for trading with no market close or open times. Please do your own risk assessment when deciding how to invest in cryptocurrency and blockchain technology. OMGFIN attempts to screen all tokens before they come to market, however, even with the best due diligence there are still risks when investing. OMGFIN is not liable for investment gains or losses.
We sincerely appreciate your support and understanding.
Original article here: https://medium.com/wanchain-foundation/ama-with-wanchain-vp-lini-58ada078b4fe
“What is unique about us is that we have actually put theory into practice.”
Wanchain’s Vice President of Business Development, Lini, sat down with blockchain media organization Neutrino for an AMA covering a wide range of topics concerning Wanchain’s development.
The following is an English translation of the original Chinese AMA which was held on December 13th, 2018:
Neutrino: Could you please first share with us a little basic background, what are the basic concepts behind cross chain technology? What are the core problems which are solved with cross-chain? In your opinion, what is the biggest challenge of implementing cross chain to achieve value transfer between different chains?
Lini: Actually, this question is quite big. Let me break it down into three smaller parts:
In China, we like to use the word “cross-chain”; abroad, the term “interoperability” is used more frequently. Interoperability is also one of the important technologies Vitalik identified in the Ethereum white paper for the development of the future blockchain ecosystem. So cross-chain is basically the concept of interoperability between chains.
In essence, blockchain is a distributed bookkeeping technique, also known as distributed ledger technology. Tokens are the core units of account on each chain; there currently exist many different chains, each with its own token. Especially important is the way in which each ledger uses tokens to interact with the others for the purpose of clearing and settlement.
Cross chain technology is one of the foundational technological infrastructures that is necessary for the large scale application of blockchain technology.
Neutrino: As we all know, there are many different kinds of cross-chain technologies. Please give us a brief introduction to several popular cross-chain technologies on the market, and the characteristics of each of these technologies。
Lini: Before answering this question, it is very important to share two pairs of concepts with our friends: heterogeneity versus homogeneity, and centralization versus decentralization.
These two points are especially important for understanding various cross-chain technologies, because there are many different technologies and terminologies, and these are some of the foundational concepts needed for understanding them.
There are also two core challenges which must be overcome to implement cross-chain:
Combining the above two points, we look at the exploration of some solutions in the industry and the design concepts of other cross-chain projects.
First I’d like to discuss the Relay solution.
However, the Relay solution must consume a relatively large amount of gas to read the BTC block headers. Another downside is that, as we all know, Bitcoin's blocks are relatively slow, so the wait for verification is long: it usually takes about 10 minutes for one block to confirm, and best practice is to wait for 6 blocks.
The next concept is the idea of Sidechains.
This solution is good, but not all chains support SPV, a simple verification method, so there are certain drawbacks. Of course, this two-way peg solves challenge beta, the atomicity of the transaction, very well.
These two technical concepts have already been incorporated into a number of existing cross chain projects. Let’s take a look at two of the most influential of these.
The first is Polkadot.
This is just a summary based on Polkadot's whitepaper and most recent developments. The theoretical design is very good and can solve challenges alpha and beta. Last week, Neutrino organized a meetup with Polkadot, which we attended. In his talk, Gavin focused on governance; he didn't go into much technical detail, but he shared some very interesting ideas about chain governance mechanisms. The specific technical details of Polkadot may have to wait until after their main net is online before they can be analyzed.
Next is Cosmos.
Cosmos is a star project whose basic concept is similar to Polkadot's. Cosmos's approach is based on a central hub. Both projects take into account heterogeneous cross-chain transactions, and both have considered how to solve challenges alpha and beta.
To sum up, each research and project team has done a lot of exploration on the best methods for implementing cross-chain technology, but many are still in the theoretical design stage. Unfortunately, since their main nets have not launched yet, it is not possible to examine each project's implementation in detail. A blockchain's development can be divided into two parts, theoretical design and engineering implementation, so we can only wait until after each project's main network launches and then analyze it further.
Neutrino: As mentioned in the white paper, Wanchain is a general ledger based on Ethereum, with the goal of building a distributed digital asset financial infrastructure. There are a few questions related to this. How do you solve Ethereum's scaling problem? How does Wanchain compare with Ripple, which aims to become the standard trading protocol common to all major banks around the world? As a potential foundational financial infrastructure, what makes Wanchain stand out?
Lini: This question is actually composed of two small questions. Let me answer the first one first.
The TPS of Ethereum is not high at this stage, limited by factors such as the POW consensus mechanism. However, this is also partly due to Ethereum's highly distributed and decentralized nature. Therefore, in order to improve TPS, Wanchain stated in its whitepaper that it will launch its own POS consensus, partially solving the performance issues related to TPS. Wanchain's POS is completely different from the POS mechanism of Ethereum 2.0's Casper.
Of course, at the same time, we are also paying close attention to many good proposals from the Ethereum community, such as sharding, state channels, side chains, and the Raiden network. Since blockchain exists in the world of open source, we can of course learn from other technological breakthroughs and use our own POS to further improve TPS. If we have some time at the end, I’d love to share some points about Wanchain’s POS mechanism.
Wanchain is focused on different use cases: it acts as a bridge between different tokens, and between assets and tokens. Various cross-chain applications consume WAN as a gas fee that is paid out to nodes.
So it seems that the purposes Ripple and Wanchain serve are quite different. Of course, Ripple's cross-chain mechanism relies on notary witnesses, meaning everyone must trust the middleman. Ripple mainly serves financial clients and banks, so essentially that trust is already there.
Neutrino: We see that Wanchain uses a multi-party computing and threshold key sharing scheme for joint anchoring, and achieves “minimum cost” for integration through cross-chain communication protocols without changing the original chain mechanism. What are the technical characteristics of multi-party computing and threshold key sharing? How do other chains access Wanchain, what is the cross-chain communication protocol here? What is the cost of “minimum cost?
Lini: The answer to this question is more technical, involving a lot of cryptography, I will try to explain it in a simple way.
In sMPC, multiple parties, each holding their own piece of private data, jointly perform a calculation (for example, computing a maximum value) and obtain the result. In the process, no party leaks any of its own data. Essentially, sMPC allows designing protocols without relying on any trusted third party, since no individual ever has access to the complete private information.
Secure multiparty computing can be abstractly understood as two parties who each have their own private data, and can calculate the results of a public function without leaking their private data. When the entire calculation is completed, only the calculation results are revealed to both parties, and neither of them knows the data of the other party and the intermediate data of the calculation process. The protocol used for secure multiparty computing is homomorphic encryption + secret sharing + OT (+ commitment scheme + zero knowledge proofs, etc.)
Wanchain’s 21 cross-chain Storeman nodes use sMPC to participate in the verification of a transaction without ever holding a user's complete private key. Simply put, the user's key material is split into 21 pieces held by 21 anonymous parties; each party gets only its own 1/21 share and cannot assemble the whole key.
Wanchain uses the threshold M <= N, with N = 21 and M = 16. That is to say, at least 16 Storeman nodes must participate in the multi-party calculation to confirm a transaction; not all 21 are required. This is our solution to the security problem of managing private keys.
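The 16-of-21 threshold property can be sketched with Shamir secret sharing. Note this is only an illustration of the M-of-N idea: in an sMPC system like the one described, the key is never actually reassembled in one place; signing itself is done jointly. Here, a degree-15 polynomial hides the secret, and any 16 of the 21 shares recover it via Lagrange interpolation, while 15 or fewer reveal nothing.

```python
import secrets

P = 2**127 - 1  # prime field modulus (a Mersenne prime)
N, M = 21, 16   # 21 shares in total; any 16 reconstruct the secret

def make_shares(secret):
    # Random polynomial of degree M-1 whose constant term is the secret.
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(M - 1)]
    def f(x):
        return sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, N + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 using M shares.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P-2, P) is the modular inverse (Fermat's little theorem).
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

secret = 123456789
shares = make_shares(secret)
# Any 16 of the 21 shares recover the secret.
assert reconstruct(shares[:M]) == secret
assert reconstruct(shares[5:5 + M]) == secret
print("recovered with 16 of 21 shares")
```

This is why losing up to 5 Storeman nodes does not halt the system, while compromising 15 of them still yields nothing.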
Cross-chain communication protocols refer to the different communication methods used by different chains, because heterogeneous cross-chain methods cannot change the mechanisms of the original chains. Bitcoin and Ethereum will not modify their main chains just because we need BTC and ETH interoperability. Therefore, cross-chain project teams can only create a different protocol for each chain to “talk,” or communicate. So the essence of a cross-chain protocol is not a single standard but multiple sets of standards, though they all share the same sMPC and threshold design with the Storeman nodes.
The minimum cost is quite low, as Wanchain 3.0’s cross-chain implementation shows. It requires just two smart contracts, one each on Ethereum and Wanchain, to connect the two chains; to connect with Bitcoin, all that is needed is a Bitcoin script. Our implementation guarantees both security and decentralization while remaining simple and computationally cheap. The Ethereum contract and Bitcoin scripts are online and can be examined by anyone interested in learning more.
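To give a feel for how little machinery a basic cross-chain lock needs, here is a toy hash-lock in Python. This is not Wanchain's actual contract code; it is a minimal sketch of the generic hash-locked pattern such bridges build on: funds lock under a hash on one chain, and revealing the preimage within a deadline releases the corresponding asset (after which the sender could reclaim an expired lock).

```python
import hashlib
import time

class HashLock:
    """Toy hash-time-locked vault: redeemable only with the secret preimage."""

    def __init__(self, hashlock: bytes, timeout_seconds: float):
        self.hashlock = hashlock              # sha256 digest the redeemer must match
        self.deadline = time.time() + timeout_seconds
        self.redeemed = False

    def redeem(self, preimage: bytes) -> bool:
        if time.time() >= self.deadline:
            raise RuntimeError("lock expired; sender may reclaim the funds")
        if hashlib.sha256(preimage).digest() != self.hashlock:
            raise ValueError("wrong preimage; funds stay locked")
        self.redeemed = True
        return True

secret = b"cross-chain-secret"
lock = HashLock(hashlib.sha256(secret).digest(), timeout_seconds=3600)
assert lock.redeem(secret)  # revealing the preimage releases the funds
```

On a real deployment the same check is expressed once as an Ethereum contract condition and once as a Bitcoin script, which is what keeps the integration cost low.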
Neutrino: What kind of consensus mechanism is currently used by Wanchain? In addition, what are the consensus and incentive mechanisms for cross-chain transactions, and what is the purpose of designing them this way? Wanchain will support cross-chain transactions on mainstream public chains (such as BTC and ETH), asset transfers between consortium chains, and cross-chain transactions between public and consortium chains. How can you achieve cross-chain security and privacy for these assets?
Lini: It is currently PPOW (Permissioned Proof of Work), used to ensure the reliability of the nodes until the cross-chain protocol design is completed, at which point we will switch to POS (as per the whitepaper roadmap). The cross-chain consensus was mentioned above: a quorum (at least 16 nodes) of the 21 Storeman nodes participates through sMPC and threshold secret sharing.
In addition, incentives come from two sources: 1) 100% of cross-chain transaction fees are used to reward the Storeman nodes; 2) Wanchain has set aside a portion of its total token reserve to reward Storeman nodes in case cross-chain transaction volume is small in the beginning.
It can be revealed that Storeman participation is opening up gradually and will become completely distributed and decentralized in batches. The first phase of the Storeman node participation and rewards program launches at the end of 2018, and we expect the selection of participants to be completed within one quarter. Please watch our official announcements this month.
In addition, for public chains, consortium chains, and private chains, asset transfers will also follow the cross-chain mechanism described above, generally using the same sMPC and threshold technology to ensure cross-chain security.
When it comes to privacy, that is a bigger topic. Going back to the Wanchain whitepaper, we already provide privacy protection on the Wanchain mainnet. Simply put, the principle is ring signatures: the original address is mixed with many other addresses to ensure privacy. We also use one-time addresses; in this mechanism a stamp-like system generates a fresh one-time address from a common address. This has been implemented since our 2.0 release.
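The one-time-address idea can be sketched very loosely. The real scheme derives a fresh, unlinkable public key with elliptic-curve math; in this deliberately simplified stand-in, hashing the recipient's long-term key together with per-payment randomness plays that role, so two payments to the same recipient land at addresses an outside observer cannot link.

```python
import hashlib
import secrets

def one_time_address(recipient_pubkey: bytes) -> tuple[str, bytes]:
    """Derive a fresh pseudo-address from a long-term key plus a random nonce.

    Toy illustration only: real one-time addresses use elliptic-curve
    key derivation, not a bare hash.
    """
    nonce = secrets.token_bytes(32)  # fresh randomness for every payment
    addr = hashlib.sha256(recipient_pubkey + nonce).hexdigest()[:40]
    return addr, nonce

pubkey = b"recipient-long-term-public-key"
addr1, _ = one_time_address(pubkey)
addr2, _ = one_time_address(pubkey)
# Two payments to the same recipient hit unlinkable addresses.
assert addr1 != addr2
```

Combined with ring signatures on the sender side, neither endpoint of a transfer is directly visible on-chain.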
But for now, privacy protection covers only native WAN transactions. Cross-chain privacy protection and the accompanying user experience will be one of our important tasks for 2019.
Neutrino: At present, Wanchain uses Storeman as a cross-chain trading node. Can you introduce the Storeman mechanism and how to protect these nodes?
Lini: Let me address this problem from two aspects.
Neutrino: On December 12th, the mainnet of Wanchain 3.0 was launched. Wanchain 3.0 opened cross-chain transactions between Bitcoin, Ethereum and ERC20 tokens (such as MakerDao’s stablecoin DAI and MKR). What does this version mean for you and the industry? The cross-chain integration with Bitcoin is the biggest highlight of this upgrade. So, now that you can use Wanchain to make transactions between different tokens, what is the difference between a cross-chain platform like Wanchain and a cryptocurrency exchange?
Lini: The release of 3.0 makes us the industry’s first major network to bridge ETH and BTC, and it has been very stable so far. As mentioned above, many cross-chain cryptographic designs are theoretically distinctive, but whether they can be achieved in engineering practice is a big question mark. Wanchain is the first network launched in the world to achieve this; users are welcome to test and attack it. It also means that Wanchain has connected the two most difficult and most challenging public networks, and we are confident we will soon connect other well-known public chains.
Alongside the 3.0 release, we also introduced cross-chain integration with other ERC20 tokens from the 2.X versions, such as MakerDao’s DAI and MKR, and LRC, which means that tokens from more excellent projects on Ethereum will gradually be integrated with Wanchain.
Some people will be curious: since Wanchain has bridged so many well-known public chains and projects, how is it different from a crypto exchange? In fact, it is very simple: one is centralized; one is distributed. Going back to Nakamoto’s whitepaper, is decentralization not the original intention of blockchain? So what Wanchain is doing is essentially solving one of the core technical difficulties of the blockchain’s bottom layer.
Anyone trying to create a DEX (decentralized exchange), digital lending, or other application scenarios can build their application on Wanchain. There is a Wanchain-based DEX prototype made by our community members Jeremiah and Harry, which is quite amazing. Take a look at the video below.
Neutrino: What are the specific application use cases following the launch of Wanchain 3.0? Most teams are still exploring small-scale projects. In your experience, what will the killer blockchain applications of the future be? What problems need to be solved along the way, and how many years will it take?
Lini: As a cross-chain public chain, we are not biased towards either professional developers or ordinary developers; they are all the same to us. As mentioned above, we provide the platform as infrastructure, and everyone is free to develop applications on top of it.
For example, a decentralized exchange must be for ordinary users to trade on, while some kinds of financial derivatives are more likely to be used by finance professionals. As for cross-chain wallets that exchange automatically, I’m not sure if you are talking about distributed exchanges; the wallet will not be “automatic” at first, but you will be able to “automatically” redeem other tokens.
Finally, the remaining WAN tokens are handled strictly in accordance with the plan laid out in the whitepaper. For example, the POS node reward mentioned above allocates 10% of the total supply. At the same time, there are also rewards for the community bounty program. The DEX prototype we just saw is a masterpiece of our overseas community developers and also received tokens from our incentive program.
Neutrino community member’s question: There are many projects in the market to solve cross-chain problems, such as: Cosmos, Polkadot, what are Wanchain’s advantages and innovations relative to these projects?
Lini: As I mentioned earlier, Cosmos and Polkadot have both proposed very good solutions in theory. Compared with them, I don’t think Wanchain has created anything particularly unique in theory. The theoretical basis of our work is cryptography, derived from the academic foundations of scholars such as Andrew Yao (Yao Qizhi) and Silvio Micali. Our main strength is that we have taken theory and put it into practice.
Actually, the reason people often question whether a blockchain project can be realized is that whitepapers are often too ambitious; when teams actually start developing, there are constant delays and setbacks. So we focus on completing very solid, realizable engineering goals. As for other projects, we hope to continue learning from each other in this space.
Neutrino community member Amos from Huobi Research Institute question: How did you come to decide on 21 storeman nodes?
Lini: As for the nodes, we don’t make choices based on quantity alone. The S in POS also includes the time the tokens are staked, so even a user staking fewer tokens has their staking duration counted towards the reward, which is fairer. We designed the ULS (Unique Leader Selection) algorithm to reduce reliance on the corruption-delay assumption (from Cardano’s POS theory); it helps ensure fairness so that all participants in the system can have a share of the reward, not only a few large token holders.
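The "amount times time" fairness point above can be made concrete. The exact Wanchain formula is not given here, so this sketch assumes a hypothetical linear weighting of tokens staked multiplied by staking duration, just to show how a small staker with a long commitment can out-earn a whale with a short one.

```python
# Hypothetical linear weighting: reward share is proportional to
# (tokens staked) * (days staked). Names and numbers are illustrative.
stakers = {
    "alice": (10_000, 30),   # mid-size stake, one month
    "bob":   (1_000, 365),   # small stake, a full year
    "carol": (50_000, 7),    # large stake, one week
}

weights = {name: amount * days for name, (amount, days) in stakers.items()}
total = sum(weights.values())
shares = {name: w / total for name, w in weights.items()}

# Bob stakes the fewest tokens yet earns the largest share,
# because duration counts as much as quantity in this model.
for name, s in sorted(shares.items()):
    print(f"{name}: {s:.1%}")
```

Under these numbers bob's weight (1,000 × 365 = 365,000) beats carol's (50,000 × 7 = 350,000), which is the kind of outcome a time-weighted stake is meant to produce.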
Wu Di, a member of the Neutrino community: Many big exchanges have already begun to deploy decentralized exchanges, for example Binance, and their progress seems very fast. Will we work with these influential exchanges in the future? Will we have the opportunity to cooperate with them and broaden our own influence?
Lini: I have also seen some other exchanges’ DEXes. Going back to the original point, distributed cross-chain nodes and centralized ones are completely different. I’m guessing that most exchanges use a centralized cross-chain solution, so it may not be the same as Wanchain’s 21-member Storeman group; I think most exchanges will likely use their own token and exchange system. This is my personal understanding. That said, if you are developing cross-chain technology, you will cooperate with many exchanges that want to build a DEX. Not only Binance, but also Huobi, Bithumb, Coinbase… And anyone else who would like to cooperate is welcome!
Neutrino community member AnneJiang from Maker: Dai, as the first stablecoin on Wanchain, will open a direct trading channel between Dai and BTC. In relation to the Dai integration, has any new progress been made on Wanchain so far?
Lini: The DAI stablecoin has already been integrated on Wanchain. I just saw it yesterday; let me give you a picture. It’s on the current 3.0 browser, https://www.wanscan.org/ — you can take a look for yourself.
This means that users with DAI are now free to trade for BTC, ETH or certain ERC20 tokens. There is also LINK from Chainlink, and LRC is Loopring’s token, so there are already quite a few tokens from excellent projects. You may use Wanchain to trade yourself, but since the DEX is not yet open, for now you can only trade with friends you know.
About Neutrino
Neutrino is a distributed, innovative collaborative community for blockchains. At present, we have established physical collaboration spaces in Tokyo, Singapore, Beijing, Shanghai and other places, and have plans to expand into important blockchain innovation cities such as Seoul, Thailand, New York and London. Through global community resources and partnerships, Neutrino organizes a wide range of online and offline events, seminars, etc. around the world to help developers in different regions better communicate and share their experiences and knowledge.
About Wanchain
Wanchain is a blockchain platform that enables decentralized transfer of value between blockchains. The Wanchain infrastructure enables the creation of distributed financial applications for individuals and organizations. Wanchain currently enables cross-chain transactions with Ethereum, and today’s product launch will enable the same functionalities with Bitcoin. Going forward, we will continue to bridge blockchains and bring cross-chain finance functionality to companies in the industry. Wanchain has employees globally with offices in Beijing (China), Austin (USA), and London (UK).
You can find more information about Wanchain on our website. Additionally, you can reach us through Telegram, Discord, Medium, Twitter, and Reddit. You can also sign up for our monthly email newsletter here.