What’s Up With Mining Difficulty On The Bitcoin Blockchain?

Masari: Simple Private Money

Masari (MSR) is a scalability-focused, untraceable, secure, and fungible cryptocurrency using the RingCT protocol. Masari is the first CryptoNote coin to develop uncle mining and a fully client-side web wallet.

Why running a Bitcoin node makes you "your own bank" and why you should run one

We have many newcomers, and many seem not to understand what a Bitcoin node is, what power it has, why it is important to run one, and why node decentralization matters more than anything else. I want to share my opinion on why, by running a node, you are indeed your own bank.
I also invite you to read this post. It shows how you can represent the Bitcoin network visually, and it helps in understanding the technical role of nodes in the protocol. Here, I want to focus on my view of the banking function of a node.

What is a Bitcoin node? What does the Bitcoin network do?


A bitcoin node is a peer of the Bitcoin P2P banking network. It is just that, but that alone doesn't tell you much, so a better question is: what is a banking network, actually?
A banking network is a mesh of banks that can issue, transfer, and redeem their IOUs, which we call money. Issuance is money creation, and each bank may follow certain rules to enable such creation for a customer. A banking network allows customers to transfer freshly created (or received) money (which is a bank's IOU) to someone else more or less freely (sometimes you can't: KYC, AML rules...). Transfers are written in a ledger, a trusted record of the IOU exchanges maintained in some way by all the banks. This is what makes this kind of IOU so different from others: the lender (the bank's customer) can ask the borrower (the bank) to transfer the future repayment to someone else, so that the customer can pay that third party. Cash is one way to make the transfer; a wire payment is another. Banks can also destroy money by removing it from the ledger, though they had better not do so at random if they want customers to keep using their banking network (which may explain why burning cash is forbidden...).

Now, the heart of bank policy is this: what truly makes several banks a banking network is the fact that they follow the same rules for money creation, transfer, and destruction, and that they agree on the same ultimate ledger. There are several ways to organize such a network; let's look at fiat currencies first:
In fiat currencies the network is centralized, and it has a sort of fractal structure: the central bank manages the currency's ultimate ledger but delegates part of its power (money creation/destruction and transfers) to private banks through law and banking regulation. Each bank has its own policy, its own responsibility for money creation, and its own ledger, which it may in turn delegate to regional desks that manage the policy of local branches. Generally, each level has its own ledger, and only net settlements are written to the ledger of the level above. Fiat IOUs are usually represented by the numbers shown as your bank account balance, or by cash (which only the central bank can print). Finally, when money is created in a fiat currency, the bank always creates its IOU against another debt IOU (state and private-bank debt for central banks, customer debt for private ones); when the loan is redeemed, the bank gets its own IOU back and can safely destroy it. We call this "money based on debt": fiat currencies today are backed by debt. Being rich in fiat means that people collectively owe you a lot (and you may see why inflation is needed in such a system: without it, it would be impossible to repay the total interest on all loans). During the gold standard, the central bank only created money against gold (though private banks still created money based on debt; this is fractional-reserve banking), so it was a commodity-based money, backed by gold.
So now, what about Bitcoin? It is a P2P banking network. Nodes are the banks, and all nodes follow the same rules. However, the network is not at all like fiat currencies: it is peer-to-peer and fully decentralized. This means no node has more power than any other, and they all keep track of the same ledger (there is no intermediate ledger as in fiat). The nodes check that the supply schedule (the money-creation part) is respected and that all transactions and blocks are valid, and they propagate them to other peers if that is the case (the transfer function of the banking network). The IOU of this banking network is the bitcoin token.

What makes Bitcoin a banking network? (And what makes bitcoin a money?)


Now you may say that my story of "a bank's IOU is the money of the banking network" doesn't fit Bitcoin, since money is never really destroyed and no one will give you anything in return if you burn your coins... but I have a way to look at it that makes it coherent.
Bitcoin tokens are destroyed when you pay transaction fees: they disappear from the ledger. Transaction fees are indeed not a payment, because you don't know exactly whom you are paying; the protocol forces you to give them to whichever miner includes your transaction. In exchange for this money destruction, you gain a priority score in the queue to write your transaction into the most secure, replicated, immutable database in the world: Bitcoin's ledger. Why? Because there is a rule saying that the miner of a block (who is also like a bank customer who borrows) can claim newly created money at most equal to the supply schedule's new tokens plus the total transaction fees in the block, so the money that was just destroyed is immediately recreated... exactly as any normal bank would wish to do! In Bitcoin's case, the money is in a sense "hash based", and hashes are priced by the network difficulty and the competition for priority. In exchange for the hashing power miners provide to secure the ledger, the banking network grants them a transferable right (embodied in a bitcoin UTXO) to write to the database stored by all nodes. Creating money against hashrate is a perfect alignment of incentives: the Bitcoin banking network creates money for those who move the ledger forward. You can check in a block explorer that miners claim the full reward directly in the coinbase, and that the raw data of a transaction has no "transaction fee" field: the fee is always equal to the unclaimed (= destroyed) bitcoins, i.e. the value of the inputs minus the outputs of the transaction.
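To make the "no fee field" point concrete, here is a minimal sketch. The amounts are made up for illustration; the fee is simply whatever value the inputs carry beyond what the outputs claim:

```python
# Sketch: a transaction's fee is implicit. It is the value of the spent
# inputs minus the value of the created outputs; there is no explicit
# "fee" field in the raw transaction. Amounts here are in satoshis and
# the transaction itself is hypothetical.

def tx_fee(input_values, output_values):
    """Fee = sum of spent inputs - sum of created outputs."""
    fee = sum(input_values) - sum(output_values)
    if fee < 0:
        raise ValueError("outputs exceed inputs: invalid transaction")
    return fee

# A hypothetical transaction spending 0.015 BTC into two outputs:
fee = tx_fee(input_values=[1_500_000], output_values=[1_000_000, 490_000])
print(fee)  # -> 10000 satoshis left unclaimed, claimable by the miner
```

A block explorer shows exactly this: the coinbase of each block claims the block subsidy plus the sum of these implicit leftovers.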

So there you have it: a bitcoin node is a peer of the Bitcoin P2P banking network, a monetary or settlement network (not a payment network like Visa; you can't get a loan from Visa, so it is not a bank). A node follows the protocol rules to ensure the network fulfills its function: managing a currency that takes the form of electronic cash. A bitcoin node is to Bitcoin what a bank is to a fiat currency network, but decentralized and with full powers; that's why we also call bitcoin nodes "full nodes". That's why you are your own bank when you run a node, and why you are just a "bank's customer" if you don't.

That's also why, if you don't run a node, you technically don't have a say in Bitcoin's rules.

What if I mine or own bitcoins without running a node?


You can mine without running a node; you just have to join a mining pool. The pool uses a node to send you validated transactions, which you include in the block you are currently mining. In that case you are trusting the pool's node, with no way to be sure it sends you the transactions with the highest fees, for example: you are a customer of the pool's node (which is your bank; you don't own it if you don't run your own node). This is generally not an issue, because the pool is a custodian of the mining rewards anyway and distributes them fairly later, and miners often run a node of their own too.
When you own bitcoin without running your own node, you use wallet software: "something to store your money" and nothing more. You can spend bitcoins using the wallet software, but the software must contact a bank (= a node) to receive and broadcast transactions on the monetary network. Owning bitcoin without running a node is like having an account protected by your keys at the big Bitcoin bank: you are not the bank itself, so a node you connect to may spy on you (chain-analysis Electrum servers), scam you (the Electrum 3.4.1 scam), or censor you, and you may not follow a valid chain if you simply check proof of work during a 51% attack.

How do you run a node, then?

In Bitcoin, we try to make this task as simple as possible: you only have to install Bitcoin Core, the main node software (see the home page of this subreddit, right panel, for a link). But do your own research on how to configure Bitcoin Core for your needs, because the Bitcoin blockchain is big, and your PC may work hard, potentially for several weeks, to catch up to the current state of the ledger. This is a time investment; it is OK not to run a node for small amounts or if you never transact, but running a node will teach you a lot about Bitcoin.
At minimum, learn how to use your node to broadcast your transactions (sendrawtransaction may be your friend here) and to notify you when you receive money. This way, you are really trusting no one.
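As a sketch of what this looks like in practice, here is how a `sendrawtransaction` JSON-RPC payload for Bitcoin Core could be built. The transaction hex is a placeholder, and the connection details mentioned in the comments (localhost, port 8332, rpcuser/rpcpassword) are assumptions about a typical default setup:

```python
import json

# Sketch of broadcasting a raw transaction through your own node's
# JSON-RPC interface (Bitcoin Core's `sendrawtransaction`).

def build_rpc_request(method, params):
    """Build a JSON-RPC 1.0 payload in the shape Bitcoin Core expects."""
    return json.dumps({
        "jsonrpc": "1.0",
        "id": "my-node",
        "method": method,
        "params": params,
    })

# Placeholder: substitute your own fully signed, hex-encoded transaction.
raw_tx_hex = "0200000001..."
payload = build_rpc_request("sendrawtransaction", [raw_tx_hex])

# To actually broadcast, POST `payload` to your node (by default
# http://127.0.0.1:8332/) with your rpcuser/rpcpassword credentials,
# or simply run: bitcoin-cli sendrawtransaction <raw_tx_hex>
print(payload)
```

Either way, the transaction enters the network through a node you control, so nobody can silently drop or censor it on your behalf.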
Guides exist for installing an always-online node on a Raspberry Pi with many useful tools (RaspiBolt, RaspiBlitz, myNode...). If you have skin in the game, these are really cool projects that let you run a trustless Lightning node, among many other things!
Enjoy the endless rabbit hole !
submitted by Pantamis to Bitcoin

Gridcoin 5.0.0.0-Mandatory "Fern" Release

https://github.com/gridcoin-community/Gridcoin-Research/releases/tag/5.0.0.0
Finally! After over ten months of development and testing, "Fern" has arrived! This is a whopper: 240 pull requests merged. The essentially complete rewrite that started with the scraper (the "neural net" rewrite) in "Denise" has now been completed. Practically the ENTIRE Gridcoin-specific codebase resting on top of the vanilla Bitcoin/Peercoin/Blackcoin PoS code has been rewritten. This removes the team requirement at last (see below), along with many other important improvements.
Fern was a monumental undertaking. We had to encode all of the old rules active for the v10 block protocol in new code and ensure that the new code was 100% compatible. This had to be done in such a way as to clear out all of the old spaghetti and ring-fence it with tightly controlled class implementations. We then wrote an entirely new, simplified ruleset for research rewards and reengineered contracts (which includes beacon management, polls, and voting) using properly classed code. The fundamentals of Gridcoin with this release are now on a very sound and maintainable footing, and the developers believe the codebase as updated here will serve as the fundamental basis for Gridcoin's future roadmap.
We have been testing this for MONTHS on testnet in various stages. The v10 (legacy) compatibility code has been running on testnet continuously as it was developed to ensure compatibility with existing nodes. During the last few months, we have done two private testnet forks and then the full public testnet testing for v11 code (the new protocol which is what Fern implements). The developers have also been running non-staking "sentinel" nodes on mainnet with this code to verify that the consensus rules are problem-free for the legacy compatibility code on the broader mainnet. We believe this amount of testing is going to result in a smooth rollout.
Given the amount of changes in Fern, I am presenting TWO changelogs below. One is high level, which summarizes the most significant changes in the protocol. The second changelog is the detailed one in the usual format, and gives you an inkling of the size of this release.

Highlights

Protocol

Note that the protocol changes will not become active until we cross the hard-fork transition height to v11, which has been set at 2053000. Given current average block spacing, this should happen around October 4, about one month from now.
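The "about one month" estimate is back-of-the-envelope arithmetic from the remaining blocks and the average spacing. In the sketch below, both the current height and the block spacing are illustrative assumptions, not values taken from the post; substitute your node's actual numbers:

```python
# ETA for the v11 hard fork at a fixed transition height. The current
# height (1,993,480) and average block spacing (45 s) below are
# illustrative assumptions only.

FORK_HEIGHT = 2_053_000

def days_until_fork(current_height, avg_block_spacing_s):
    """Blocks remaining, converted to days at the observed spacing."""
    blocks_remaining = FORK_HEIGHT - current_height
    return blocks_remaining * avg_block_spacing_s / 86_400  # s per day

# e.g. ~59,520 blocks to go at ~45 s per block is about a month:
print(round(days_until_fork(1_993_480, 45), 1))  # -> 31.0
```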
Note that to get all of the beacons in the network onto the new protocol, we are requiring ALL beacons to be validated. A two-week (14-day) grace period is provided by the code, starting at the transition height, for people currently holding a beacon to validate it and prevent it from expiring. That means that EVERY CRUNCHER must advertise and validate their beacon AFTER the v11 transition (around Oct 4th) and BEFORE October 18th (or more precisely, 14 days from the actual date of the v11 transition). If you do not advertise and validate your beacon by this time, your beacon will expire and you will stop earning research rewards until you advertise and validate a new beacon. This process has been made much easier by a brand new beacon "wizard" that helps manage beacon advertisements and renewals.

Once a beacon has been validated and is a v11 protocol beacon, the normal 180-day expiration rules apply. Note, however, that the 180-day expiration on research rewards has been removed with the Fern update. This means that while your beacon might expire after 180 days, your earned research rewards will be retained and can be claimed by advertising a beacon with the same CPID and going through the validation process again. In other words, you no longer lose earned research rewards if you fail to stake a block within 180 days and keep your beacon up-to-date.
The transition height is also when the team requirement will be relaxed for the network.

GUI

Besides the beacon wizard, there are a number of improvements to the GUI, including new UI transaction types (and icons) for staking the superblock, sidestake sends, beacon advertisement, voting, poll creation, and transactions with a message. The main screen has been revamped with a better summary section, and better status icons. Several changes under the hood have improved GUI performance. And finally, the diagnostics have been revamped.

Blockchain

The wallet sync speed has been DRASTICALLY improved. A decent machine with a good network connection should be able to sync the entire mainnet blockchain in less than 4 hours. A fast machine with a really fast network connection and a good SSD can do it in about 2.5 hours. One of our goals was to reduce or eliminate the reliance on snapshots for mainnet, and I think we have accomplished that goal with the new sync speed. We have also streamlined the in-memory structures for the blockchain which shaves some memory use.
There are so many goodies here it is hard to summarize them all.
I would like to thank all of the contributors to this release, but especially thank @cyrossignol, whose incredible contributions formed the backbone of this release. I would also like to pay special thanks to @barton2526, @caraka, and @Quezacoatl1, who tirelessly helped during the testing and polishing phase on testnet with testing and repeated builds for all architectures.
The developers are proud to present this release to the community and we believe this represents the starting point for a true renaissance for Gridcoin!

Summary Changelog

Accrual

Changed

Most significantly, nodes calculate research rewards directly from the magnitudes in EACH superblock between stakes instead of using a two- or three-point average based on a CPID's current magnitude and the magnitude for the CPID when it last staked. For those long-timers in the community, this has been referred to as "Superblock Windows," and was first done in proof-of-concept form by @denravonska.

Removed

Beacons

Added

Changed

Removed

Unaltered

As a reminder:

Superblocks

Added

Changed

Removed

Voting

Added

Changed

Removed

Detailed Changelog

[5.0.0.0] 2020-09-03, mandatory, "Fern"

Added

Changed

Removed

Fixed

submitted by jamescowens to gridcoin

Crypto Mining Investment

To be successful at mining, speed is of the essence, since the miner is trying to solve a puzzle, add a block to the chain, and reap the rewards before anyone else. The more answers suggested in the shortest period of time, the better the chances of solving that block. These days it's impossible to mine Bitcoin using a standard desktop computer: the computational difficulty of mining is too high. When Bitcoin first arrived, it was possible to mine using a standard home computer or, more specifically, the computer's GPU (graphics processing unit).
Since Bitcoin’s phenomenal rise in 2017, many people have been wondering how they can get in on the cryptocurrency action. One way to get involved in the revolution is by mining cryptocurrency. However, before you can get started with mining, you first need to know the basics and get the equipment. In this guide, we’ll cover some of the best GPUs on the market that you can use to mine cryptocurrency.
Mining is the process of solving a cryptographic puzzle in order to add cryptocurrency transactions to a blockchain. Transparency is one of the most vital aspects of this space, so recording legitimate transactions on a blockchain is essential. When a transaction occurs, it is broadcast to everyone on that token’s network. This informs the miners that there is work to be done.
Miners will then begin solving the puzzles to play the role of a ‘digital accountant’ and add the transactions to the blockchain. This is the process of cryptocurrency mining. The incentives for miners to complete this work are the transaction fee offered by those who want to record their transaction and the block reward. A block reward is awarded to successful miners for completing their job. For example, Bitcoin’s current block reward is 12.5 Bitcoin, whereas Ethereum’s reward is 3 Ether.
As you might have guessed, the GPU market is dominated by Nvidia and AMD. In general, AMD typically makes faster products and gets better performance rates from its GPUs by adding more ALUs that function at a lower clock speed, while Nvidia makes use of more complex ALUs that are individually more powerful. Another thing worth noting is that Bitcoin, for example, uses the SHA-256 hashing algorithm, and AMD products only need a single hardware instruction to compute a SHA-256 round, whereas Nvidia needs three.
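The hash puzzle mentioned above is easy to demonstrate in miniature. The toy miner below uses Bitcoin-style double SHA-256 and searches for a nonce whose hash clears a tiny, illustrative difficulty; the "header" bytes are made up, and real Bitcoin difficulty is astronomically higher:

```python
import hashlib

# Toy proof-of-work loop: hash (header + nonce) twice with SHA-256 and
# keep incrementing the nonce until the result falls below a target
# determined by the difficulty.

def double_sha256(data: bytes) -> bytes:
    """Bitcoin-style double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header: bytes, difficulty_bits: int) -> int:
    """Find a nonce whose double-SHA-256 hash has at least
    `difficulty_bits` leading zero bits (i.e. falls below the target)."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = double_sha256(header + nonce.to_bytes(8, "little"))
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# A tiny difficulty so the search finishes in a fraction of a second:
nonce = mine(b"example block header", 12)
print("found nonce:", nonce)
```

Raising `difficulty_bits` roughly doubles the expected work per extra bit, which is why commodity desktops long ago stopped being competitive.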
Promote Amacmining LTD & earn
We are introducing the AMAC MINING LTD Referral Program. Now you can get up to 15% from each purchase that your referral makes. Refer your family members, friends, colleagues and earn extra cash!
GET REWARDED UP TO 20%
The more affiliates you invite, the higher the commission percentage you earn.
How you can get referrals?
- Many members will buy PTC ads on other Faucet/PTC websites.
- Many other members will buy a banner ad in Ad network websites such as Adhitz, A-ads, Moonads, Mellowads, Cointraffic, Bitmedia, Coinverti, etc.
- Some members will use traffic exchange websites.
But is there any solution to refer people without any cost?
1- Forums: Make a topic on top forums and share your referral link along with a short description about Amacmining.com
2- Forum signature: If you cannot make a topic, no problem. You can participate in the forums with a personal signature that includes your Amacmining.com referral link.
3- Social media activity: Share your referral link on Facebook, Instagram, Twitter, Telegram, Whatsapp, Pinterest, Reddit, Myspace, Badoo, etc.
4- Build a blog: Write a blog post about our website and share your referral link on it.
5- Create video tutorials on YouTube: Make a video about Amacmining.com and let people know how it works! Publish your video on YouTube, Vimeo.com, Dailymotion.com, and Bitchute.com.
6- Offer a referral rebate to your friends. This means that you can share part of your earnings with them to encourage them to register using your referral link.
But do not forget the rules:
1- Do not spam. Do not force people to join.
2- We do not allow using services that sell referrals.
3- Your referrals must be unique and real. You cannot bring fake referrals, bots, etc.
4- Do not over-promise or give an unrealistic assurance to anyone with what they can earn on the website.
5- Do not try to abuse the system in any way, whether or not it is specifically mentioned in the rules/terms.
submitted by Scoggin223 to u/Scoggin223

Research Report about Conflux Network

Research Report about Conflux Network
Author: Gamals Ahmed, CoinEx Business Ambassador
An open network for a new world of DApps, Web 3.0, finance

ABSTRACT

Conflux represents the latest addition to an active field of distributed ledger technologies that are trying to address some of the scalability limitations of blockchains like Ethereum or Bitcoin. Dfinity, Hashgraph, Zilliqa, and Algorand are some of the most prominent projects focused on this area. The ideas behind Conflux have some similarities with those projects, but it remains loyal to the Nakamoto-consensus model. Conceptually, the Conflux protocol improves upon Nakamoto consensus's sequential model of producing one block at a time, allowing the network to create an arbitrary number of concurrent blocks without sacrificing the immutability of the blockchain. Technically, this optimization can enable the concurrent processing of thousands of transactions per second. Obviously, the devil is in the details, but Conflux seems to have combined several well-known computer science techniques to arrive at this major breakthrough.
Conflux keeps all fork blocks in its Tree-Graph, thereby improving the overall throughput of the system and achieving scalability without compromising security. The overall idea is not complicated. However, the engineering implementation raises many difficulties that require deep technical and theoretical work: the global ordering of all transactions, the processing of the Tree-Graph's complex topology, block generation incentives, and the reduction of duplicate transactions. At present, Conflux has already launched and open-sourced its testnet.

1.INTRODUCTION


1.1 OVERVIEW ABOUT CONFLUX NETWORK

Founded in 2018, Conflux has raised US$35 million, according to Crunchbase, from investors including Sequoia China and Baidu Ventures — the venture capital arm of China’s equivalent of Google.
Conflux Network is an open protocol for a new world of DApps, finance, and Web 3.0. As a fast and secure public blockchain, Conflux Network combines Proof of Work and a Tree-Graph structure to power a new generation of decentralized commerce by developing decentralized open-source technologies.
Born from top minds at Tsinghua University and the University of Toronto, Conflux Network is overseen by a diverse global team with the mission of promoting cross-border collaboration.
Conflux is a new PoW network with a Turing-complete smart contract language similar to that of Ethereum. The Conflux network provides significant performance improvements by processing parallel blocks in a Directed Acyclic Graph (DAG) structure, which lowers confirmation times and increases transaction throughput substantially.
Conflux Network is a scalable and decentralized blockchain system with high throughput and fast confirmation. It operates a novel consensus protocol GHAST, which optimistically processes concurrent blocks without discarding any as forks, and adaptively assigns heterogeneous weights to blocks based on their topologies in the ledger structure (called Tree-Graph). Therefore, Conflux is able to achieve high throughput as well as fast confirmation and liveness guarantee.
The only state-endorsed public, permissionless blockchain project in China, Conflux Network is an open-source, layer-1 blockchain protocol delivering heightened scalability, security, and extensibility for the next generation of open commerce, decentralized applications, financial services, and Web 3.0. Conflux Network is overseen by a global team of world-class engineers and innovative computer scientists, led by Turing Award recipient Dr. Andrew Yao. Fostering entrepreneurship and innovation, Conflux elevates startups and organizations across industries and continents to generate decentralized marketplaces and digital assets for meaningful business and social impact. Founded in 2018, Conflux has raised $35 million in capital from prominent investors including Sequoia China, Metastable, Baidu Ventures, F2Pool, Huobi, IMO Ventures, and the Shanghai Municipal Science and Technology Commission.
To address the space congestion challenge, Conflux requires users to bond native tokens into storage in order to occupy space, which implicitly creates a disincentive to occupy space unnecessarily. The disincentive stems from the payment of interest on existing tokens in the system: the interest on the bonded storage is paid to miners instead of the users, creating a long-term income for the miners. To address the fairness attack challenge, Conflux assigns the block reward in a way that eliminates the winner-take-all characteristic of mining. Instead of competing for the longest chain, miners in Conflux receive block rewards for all the blocks they generate, albeit with some penalty mechanisms that encourage following the consensus protocol. Competing blocks are jointly penalized so that selfish mining is not profitable, and different miners are incentivized to cooperate along the protocol to keep the network stable and secure.
Similarly to Ethereum, Conflux operates with an account-based model: every normal account is associated with a balance, and each smart contract account contains the corresponding byte code as well as an internal state. Conflux supports a modified version of Solidity (the main contract language on Ethereum) and the Ethereum Virtual Machine (EVM) for its smart contracts, so that smart contracts from Ethereum can migrate to Conflux easily.
A transaction in Conflux refers to a message that initiates a payment transaction, or deploys/executes smart contract code. Each block consists of a list of transactions that are verified by the proposing miner. Each node maintains a pool of verified, received transactions that have not yet been included in a block. Miners compete with one another by solving PoW puzzles to include transactions into blocks. Similar to Bitcoin and Ethereum, Conflux adjusts the PoW difficulty so as to maintain a stable block generation rate. Each node also maintains a local state constructed from the received blocks.
The Conflux consensus algorithm operates on a special Directed Acyclic Graph (DAG) structure called a Tree-Graph. Unlike Ethereum, which only accepts transactions on a single chain into its ledger, the Conflux consensus algorithm safely incorporates and processes transactions in all concurrent blocks. There are two kinds of edges between blocks: parent edges and reference edges. Each block (except the genesis) in the Tree-Graph has exactly one parent edge to its chosen parent block. Each block can also have multiple reference edges to earlier blocks. All parent edges form a tree embedded inside the Directed Acyclic Graph (DAG) of all edges.
At a high level, Conflux uses the novel Greedy Heaviest Adaptive SubTree (GHAST) algorithm (Li and Yang (2020)), which assigns a weight to each block according to the topologies in the TreeGraph. Under this weight assignment, there is a deterministically heaviest chain within the graph called pivot chain, which corresponds to the relatively most stable chain from the genesis to the tip of the parental tree.
Parent edges, reference edges, and the pivot chain together enable Conflux to split all blocks in the DAG into epochs. As shown in Figure 1, every block in the pivot chain corresponds to one epoch. Each epoch contains all blocks that are reachable from the corresponding pivot-chain block via the combination of parent edges and reference edges and that are not included in previous epochs. Details about the consensus algorithm can be found in Li and Yang (2020).
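As a rough illustration of the epoch rule (a toy sketch, not the actual Conflux implementation): walk the pivot chain, and assign to each pivot block every block reachable from it through parent and reference edges that no earlier epoch has claimed. The four-block graph below is made up:

```python
# Toy Tree-Graph epoch partitioning. edges[b] lists the blocks that b
# points back to: its single parent edge first, then any reference edges.

def ancestors(block, edges):
    """All blocks reachable from `block` (itself included) via edges."""
    seen, stack = set(), [block]
    while stack:
        b = stack.pop()
        if b in seen:
            continue
        seen.add(b)
        stack.extend(edges.get(b, []))
    return seen

def epochs(pivot_chain, edges):
    """Each pivot block's epoch = its reachable set minus earlier epochs."""
    claimed, result = set(), []
    for p in pivot_chain:
        epoch = ancestors(p, edges) - claimed
        claimed |= epoch
        result.append(sorted(epoch))
    return result

# Genesis A; B and C both extend A; D's parent is B and it references C.
edges = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
print(epochs(["A", "B", "D"], edges))  # -> [['A'], ['B'], ['C', 'D']]
```

Note how the fork block C is not discarded: it simply lands in D's epoch, which is the mechanism behind Conflux keeping all fork blocks.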
Experimental results have shown that Conflux is capable of processing 4,000 transactions per second for simple payment transactions, at least two orders of magnitude higher throughput than Ethereum and Bitcoin. The improvement in throughput is a result of the Tree-Graph structure and the consensus algorithm: the network can operate with a much faster block generation rate, no forks are discarded, and block space is better utilized. According to the technical specification, the mainnet of Conflux will run at a fixed block generation rate of two blocks per second. The daily block generation rate is therefore 60 × 60 × 24 × 2 = 172,800 blocks per day.
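The daily figure is just the fixed rate carried out over one day:

```python
# Two blocks per second, over one day's worth of seconds:
seconds_per_day = 60 * 60 * 24
blocks_per_day = 2 * seconds_per_day
print(blocks_per_day)  # -> 172800
```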

1.1.1 CONFLUX FOUNDATION

Conflux is the next generation blockchain protocol with scalability, security, and extensibility. In addition to the research and development of Conflux protocol, Conflux Foundation is committed to integrating blockchain technology into real-world scenarios with strategic partners and developing decentralized applications with mass adoption.

1.1.2 CONFLUX MISSION

The goal of the foundation is to gradually use technology to create the next generation of distributed systems. Conflux aims to promote the integration of technology and business through advanced blockchain technology, and it strives to bring together resources from technology, industry, business, and all other fields to achieve that goal. The team provides a rigorous security proof for the GHAST protocol: when adversarial computing power is bounded away from 50%, GHAST guarantees logarithmically bounded liveness and low confirmation latency.
The team also evaluated Conflux on Amazon EC2 clusters with up to 12,000 Conflux full nodes. The consensus protocol achieves a consensus throughput of 9.6 Mbps with a per-node network bandwidth limit of 20 Mbps. On a combined workload of random payment transactions and Ethereum historical transactions, the Conflux network achieves an end-to-end throughput of up to 3,480 transactions per second while confirming transactions in under one minute.

1.1.3 WHAT IS THE SECRET TO CONFLUX NETWORK’S SUCCESS IN CHINA?

Conflux wins rare endorsements from Shanghai and Hunan provincial governments as other blockchain firms struggle for a toehold on the mainland.
In the shadow of the splitting up of the Blockchain Service Network — China’s once-hyped blockchain bridge between the East and the West — Conflux announced that it had received another endorsement from a regional government, this time in Hunan.
Hunan’s official embrace of Conflux Network, which is headquartered in Singapore, follows a similar endorsement by Shanghai last December.
In China, a government endorsement allows a company to access lucrative public sector contracts. Relationships are key to doing business in the country, and official endorsements like Hunan’s are notable events with consequences for a company far beyond mere public relations value. The ultimate goal of Conflux’s partnership with Hunan, according to an email from the company, is to share and verify all government administrative data on a blockchain infrastructure powered by Conflux.
Conflux — an open-source, layer-1 blockchain protocol — seems to be pulling off what few blockchain companies have been able to do on the mainland before: getting government seals of approval, so to speak, for a public, permissionless blockchain network.
The secret to Conflux’s success in China
Though Conflux is registered in Singapore, the company’s investors and core employees are part of China’s tech elite and have deep mainland roots.
According to Conflux, at least 10 out of the company’s 18-person development team, including founders David Chow and Fan Long, graduated from the computer science program at Tsinghua University — which is sometimes referred to as “China’s M.I.T.”
“Conflux is honored to be endorsed by the Hunan Government, whom we very much look forward to working with,” said Conflux Global Managing Director, Eden Dhaliwal. “Hunan is evolving into an epicenter for science and technology innovation and Conflux is proud to be at the forefront of this exploration and development. We strongly believe a positive relationship with Chinese state-backed entities is an instrumental piece to the puzzle of widespread blockchain adoption.”
Hunan’s plans for Conflux
According to Conflux and local media coverage, the company will be working closely with the Hunan government to build out blockchain infrastructure to enhance the region’s GovTech initiatives. Recently, the Hunan government initiated a three-year plan for blockchain, to build out the industry in the province while accelerating blockchain projects for supply chain management, tax compliance, electronic signatures and digital contracts.
The three-year blockchain plan was the first of its kind for a provincial government in China when it was unveiled earlier this year. Under it, Hunan aims to build 10 blockchain-based public service platforms and five blockchain industrial parks, with a target of connecting 30,000 enterprises that will produce 3 billion yuan (about US$443.5 million) in revenue across a variety of sectors.
“The lab at Hunan University creates a launching pad for more research and development opportunities in the great province of Hunan,” Dhaliwal said. “Not only will Conflux founders Fan Long and Ming Wu be presented as honorary professors, they will help foster an ongoing partnership with students and professors who may be interested in developing a Conflux incubation in the future.”

2. CONFLUX NETWORK TECHNOLOGY

Conflux Network is a scalable and decentralized blockchain system with high throughput and fast confirmation. It operates a novel consensus protocol, GHAST, which optimistically processes concurrent blocks without discarding any as forks, and adaptively assigns heterogeneous weights to blocks based on their topologies in the ledger structure (called the Tree-Graph). Conflux is therefore able to achieve high throughput as well as fast confirmation and a liveness guarantee.
The core value of blockchain is that it establishes a trustworthy and participatory decentralized platform without relying on the endorsement of any organization or individual. Importantly, Conflux serves to maximize this core value: it permits anyone to participate in the public blockchain consensus, even without holding any stake.
Conflux inherited the central design idea of the Bitcoin system: computing power is the only weapon in competing for the right of block generation. Additionally, in Conflux, the order of all transactions is solely determined by the Tree-Graph (TG). Conflux strives to prevent illegal transactions: each transaction in the core chain is examined by all nodes of the network. The strict algorithm design and the comprehensive theoretical analysis of Conflux protect it from most attacks. More importantly, Conflux pursues efficiency without sacrificing any of its safety, decentralization, or trustworthiness.

2.1 VERSATILITY OF APPLICATION

Most of the PoW-based blockchain systems currently in the field (e.g., Bitcoin and Ethereum) have very limited transaction throughput, far lower than centralized transaction services like Visa that can execute more than 1,000 transactions per second. Such low throughput restricts the range of meaningful applications that can be built on blockchain systems. By adopting the Tree-Graph (TG), Conflux achieved a throughput of 3,000–6,000 TPS in internal network testing without compromising decentralization. In the future, the team anticipates that Conflux will have versatile applications in many fields, such as financial transactions, Internet identity, the Internet of Things, and property rights.

2.2 CONFLUX ARCHITECTURE

Unlike some other blockchain scalability attempts, Conflux does not propose a brand-new consensus model so much as a mechanism for extending the well-established Nakamoto consensus. Powering networks like Bitcoin, Nakamoto consensus has developed a reputation for being incredibly robust and secure as well as annoyingly slow. A Nakamoto-consensus protocol usually organizes transactions into an ordered list of blocks, each of which contains multiple transactions and a link to its predecessor. Each newly generated block is appended at the end of the longest chain, making the chain even longer and therefore harder to revert. While incredibly secure, the Nakamoto-consensus model has the limitation that only one participant can win the competition and contribute to the blockchain.
https://preview.redd.it/sqezf2f9gky51.png?width=700&format=png&auto=webp&s=4ccc059ded0b3328dc3f73c334aa33a1faa934e0
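The longest-chain rule just described can be illustrated with a toy sketch (plain Python; the class and names are illustrative, not any client's actual data structures):

```python
# Toy illustration of the Nakamoto longest-chain rule: each block links
# to one parent, and a new block is appended to the tip of the chain
# with the greatest height.

class Block:
    def __init__(self, parent=None):
        self.parent = parent
        self.height = 0 if parent is None else parent.height + 1

def longest_tip(tips):
    """Pick the tip of the longest known chain; ties go to the first seen."""
    return max(tips, key=lambda b: b.height)

genesis = Block()
a = Block(genesis)     # height 1
b = Block(a)           # height 2 -- main chain
fork = Block(genesis)  # height 1 -- a competing fork, effectively discarded
print(longest_tip([b, fork]).height)  # 2
```

Only one of the two height-1 blocks ends up on the winning chain, which is exactly the wasted-work limitation the paragraph points out.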
To address the limitations of Nakamoto consensus, several blockchains have relied on alternatives such as Byzantine fault tolerance (BFT), which relies on a hierarchical model to determine the order of transactions. However, there is a school of thought, one that evidently includes the researchers behind Conflux, which believes that BFT blockchains can’t be sufficiently decentralized at scale. The Conflux protocol follows a different approach powered by a simple idea: “What if we could optimistically process concurrent blocks and transactions, deferring their final order until after commit time?”
Released from the constraints of sequential Nakamoto-consensus, Conflux introduces concurrent block processing by following a few simple steps:
1) Optimistically process concurrent blocks
2) Organize blocks into direct acyclic graphs (DAG)
3) First agree on a total order of all blocks (assuming transactions do not conflict with each other)
4) Then derive the transaction order from the agreed block order (resolving transaction conflicts lazily)
To implement these steps, the Conflux consensus protocol maintains two kinds of relationships between blocks. When a participant node generates a new block, the node identifies a parent (predecessor) block for it and creates a parent edge between the two blocks, as in Bitcoin. These parent edges enable Conflux to achieve consistent, irreversible consensus on its ledger. The end result is that the edges between blocks form a directed acyclic graph (DAG) that is both easy to navigate and introduces a certain level of resistance to forks. From that perspective, Conflux can be seen as a DAG-based Nakamoto consensus protocol.
https://preview.redd.it/1oqmtz0hgky51.png?width=700&format=png&auto=webp&s=4660133a9c9843e76ea8b38b478ad197c027ecb8
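A minimal sketch of the two kinds of relationships the text describes, with one parent edge per block plus optional reference edges, so the blocks form a DAG rather than a single chain (the field names are illustrative assumptions, not Conflux's actual ones):

```python
# Each block has exactly one parent edge (as in Bitcoin) and may
# additionally reference other recent blocks, so concurrent blocks are
# linked into the graph instead of being discarded as forks.

class DagBlock:
    def __init__(self, name, parent=None, references=()):
        self.name = name
        self.parent = parent                # exactly one parent edge
        self.references = list(references)  # zero or more reference edges

genesis = DagBlock("genesis")
a = DagBlock("A", parent=genesis)
b = DagBlock("B", parent=genesis)            # concurrent with A, kept anyway
c = DagBlock("C", parent=a, references=[b])  # links both branches into one DAG

# Every edge points from a newer block to an older one, so no cycle can form.
print(c.parent.name, [r.name for r in c.references])  # A ['B']
```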
Blockchain is more than a consensus protocol. Conflux provides a fairly simple architecture that extends some of the core principles of Bitcoin with innovative scalability constructs. The base architecture of Conflux includes the following components:
  • Gossip Network: All participant nodes in Conflux are connected via a gossip network, which is responsible for broadcasting a transaction to all nodes in the network.
  • Pending Transaction Pool: Each node in the Conflux network maintains a pending transaction pool that includes transactions that have been heard by the node but are not yet packed into any block. Whenever a node receives a new transaction from the gossip network, the node adds the transaction to its pool.
  • Block Generator: Conflux’s nodes use a block generator based on Proof-of-Work (PoW) to create blocks for pending transactions.
  • DAG State: Each node in Conflux maintains a local copy of the DAG that contains all the blocks the node is aware of.
https://preview.redd.it/7pyx1cvngky51.png?width=700&format=png&auto=webp&s=44028df42f68b343f1d4e0b705ca3ccac9632760
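The four components listed above can be pictured as the state a single node keeps. The following is a schematic sketch under that reading, not the real implementation (all names are illustrative):

```python
# Schematic Conflux node state: gossip peers, a pending transaction pool,
# a PoW block generator (PoW itself omitted here), and the local DAG copy.

class ConfluxNode:
    def __init__(self):
        self.peers = set()         # gossip network connections
        self.pending_pool = set()  # transactions heard but not yet in a block
        self.dag = {}              # local copy of the block DAG, by block hash

    def on_transaction(self, tx):
        """Gossip handler: remember any new transaction for later packing."""
        self.pending_pool.add(tx)

    def pack_block(self):
        """Block generator: drain the pool into a candidate block (no PoW here)."""
        block = frozenset(self.pending_pool)
        self.pending_pool.clear()
        return block

node = ConfluxNode()
node.on_transaction("tx1")
node.on_transaction("tx2")
print(sorted(node.pack_block()))  # ['tx1', 'tx2']
```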
The principles of this architecture allow Conflux to combine the security and robustness of Bitcoin with the scalability of modern blockchains in a single network. The Conflux model ensures three very desirable properties of blockchain networks: security, concurrent transaction processing and correctness. However, the Conflux team went beyond the theoretical exercise and decided to see if some of their ideas work in practice.

2.2.1 CONFLUX IN ACTION

The Conflux team ran a prototype implementation of the protocol on a cluster of 800 AWS EC2 instances running dozens of nodes each. In total, the experiments used an architecture of about 10,000 Conflux nodes processing blocks of 1 MB to 8 MB in size. Not surprisingly, Conflux showed a clear advantage in throughput compared to other Nakamoto-consensus blockchains.
https://preview.redd.it/g4rohz1tgky51.png?width=700&format=png&auto=webp&s=aabd72b6111358143700fc8bbc627dd1ce66a089
A bit more impressive is that Conflux’s block utilization ratio vastly outperforms that of other comparable blockchains.
https://preview.redd.it/sj5dhn0xgky51.png?width=700&format=png&auto=webp&s=fbf1cfc5b4d00eff1dd4bf3ef3c6c29e33869798
Other experiments showed that Conflux can scale consistently to about 20,000 concurrent users without major impact on confirmation time.
Conflux takes a novel approach to blockchain scalability by not completely departing from the principles of Bitcoin. At the moment, Conflux is not much more than a sophisticated research exercise but some of its ideas might evolve as a new blockchain or even be incorporated into Bitcoin.

2.2.2 CONFLUX AND SUCCEEDING IN DOUBLE SPENDING ATTACK

1. An attacker cannot revert a transaction unless he/she reverts the Pivot Chain
In order to double-spend Tx2 (in block A), an attacker may use the genesis block as the parent and expect the malicious block (Attack A) to precede A in the total order. However, as long as the pivot chain is not reverted, the malicious block (Attack A) must belong to a later epoch, so the attacker cannot double-spend Tx2.
2. An attacker cannot revert the Pivot Chain unless he/she controls 50% of the block generation power.
How to revert an old block:
  • Suppose an attacker wants to revert a pivot chain block A.
  • Honest participants may create small forks, but always under the subtree of A.
  • The attacker needs at least 50% of the block generation power to make the subtree of A’ heavier than that of A.
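The argument can be made concrete with a toy GHOST-style weight comparison: the pivot chain follows, at each step, the child whose subtree carries the most total weight, so honest forks under A still count toward A (the tree structure and unit weights below are illustrative):

```python
# To revert A, the attacker's sibling subtree rooted at A' must become
# heavier than A's entire subtree. Honest forks land inside A's subtree,
# so the attacker must outweigh all honest work, hence the 50% bound.

def subtree_weight(block, children):
    """Total weight of a block plus all its descendants (weight 1 each)."""
    return 1 + sum(subtree_weight(c, children) for c in children.get(block, []))

# Genesis has honest child A and attacker child A'; honest nodes keep
# building under A (including a small fork), the attacker under A'.
children = {
    "genesis": ["A", "A'"],
    "A": ["B", "B_fork"],  # an honest fork, still inside A's subtree
    "B": ["C"],
    "A'": ["X"],           # attacker's blocks
}
print(subtree_weight("A", children), subtree_weight("A'", children))  # 4 2
```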

2.3 GHAST MECHANISM

To address the “liveness attack” problem, Conflux designed the GHAST mechanism.
The core parts of the GHAST mechanism can be summarized as follows:
1) The heaviest chain rule is applied, but a block can have one of three different weights: 0, 1, or X, where X is a relatively large number, for example X = 1000 (ignoring situations involving adjustment of mining difficulty).
2) There are two types of blocks in the network: normal blocks and special blocks. The weight of a normal block is always 1; the weight of a special block is determined by the block's difficulty: a fraction 1/X of special blocks have weight X, while the rest have weight 0. Mining a normal block has the same difficulty as mining a special block.
3) The block type is determined by the historical Tree-Graph structure of the block, so the generator of a block cannot arbitrarily specify its type.
4) In the absence of an attack, all newly generated honest blocks become normal blocks; once an attacker conducts any kind of liveness attack and continues for long enough, all newly generated honest blocks become special blocks.
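The weighting rule in points 1) and 2) can be sketched as follows; note that a special block's expected weight is still 1, since weight X occurs with probability 1/X (X = 1000 follows the example in the text, and the uniform-draw interface is an illustrative assumption, not the actual protocol encoding):

```python
# GHAST-style block weights: normal blocks always weigh 1; a special
# block weighs X with probability 1/X (derived from its PoW quality)
# and 0 otherwise, so E[weight of a special block] = X * (1/X) = 1.

X = 1000

def block_weight(is_special, pow_quality_draw):
    """pow_quality_draw: a uniform value in [0, 1) derived from the PoW hash."""
    if not is_special:
        return 1
    return X if pow_quality_draw < 1 / X else 0

print(block_weight(False, 0.5))    # 1    (normal block)
print(block_weight(True, 0.0001))  # 1000 (lucky special block, ~1/X chance)
print(block_weight(True, 0.5))     # 0    (typical special block)
```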
Full Report
submitted by CoinEx_Institution to ConfluxChain [link] [comments]

Best Cryptocurrency To Invest In 2020

Cryptocurrency came into the spotlight with the arrival of blockchain technology, and everyone is searching for profitable crypto investments in 2020.
Back in 2017, cryptocurrency made headlines thanks to Bitcoin's second major price surge, which peaked at around 20,000 USD per coin.
Even so, after that spike the price trend gradually declined.
Statista publishes updated figures that help illustrate the rise-and-fall pattern in Bitcoin's price from November 2016 to November 2019.
For a short period it was a significant climb, and it succeeded in catching the attention of many people.
Now everyone is looking in the same direction again, hoping to find the best cryptocurrency to invest in for 2020.
On the crypto market, everyone is asking the same question: which coin should I invest in?
Best CryptoCurrency to Invest in 2020
The truth is, a coin performing well right now does not mean it should be your first investment choice.
To invest well in 2020, a key first step is to find out which cryptocurrencies were the best investments in 2019.
There are around 2,073 listed cryptocurrencies according to CoinMarketCap. Some have great potential, and many do not.
Most Commonly Known Cryptocurrencies
If I ask people which coins they would want to invest in, their response will most likely be one of the following:
1 – Bitcoin
2 – Ethereum
3 – Ripple
4 – Litecoin
5 – Bitcoin Cash
It's obvious that most people are only familiar with a few famous names; however, there are plenty of lesser-known coins worth investigating.
So stay with us. I'll share a list of cryptocurrencies that stand out in crypto news coverage and among investment firms.
After reading this article, you should have an answer to what the best cryptocurrency to invest in for 2020 is.
Before investing in cryptocurrency, it is essential to learn the basics of how such investments work.
Reasons to invest in cryptocurrency
Cryptocurrency is a new and growing investment option, comparable to the stock market but with its own investment community.
However, there are a few things to consider first, such as: how long do you want to stay invested?
The reason is that Bitcoin was released in 2009 and received little attention until 2016, then saw a huge market-value move at the beginning of 2017.
Stock investing has well-established rules and publicly available options to examine, but cryptocurrency does not, because it is decentralized money.
Thus cryptocurrency price moves are hard to predict; they behave more like pumps and dumps driven by trending news.
As with other corporate investments, investors should first decide how much return they plan to make.
Likewise, there is potential for both losses and gains.
Investment period: the first thing you should ask yourself is how long you want to invest for. It is a hard decision, but I will help you answer it.
Long-term investment
As the cryptocurrency phenomenon has shown, prices vary a great deal over time.
Much like other investments, you can choose the duration of the investment yourself.
For each cryptocurrency we have picked, it is a wise decision to do the required research and try to find its future growth potential.
If you want to invest for a longer period, look for answers to the questions below.
  1. What was the coin's price pattern over the last two years?
  2. Can you tolerate holding it for that long?
  3. Assess the project's future potential.
  4. What about the project's planned partnerships with other companies?
  5. For the worst-case scenario, decide your profit/loss threshold in advance.
If you can answer the above five questions, then get on board.
Of course, the long-term decision can take different forms:
  1. A period tied to reaching a particular target level.
  2. A period of holding the funds for a specific number of years or months.
Those two are essential determinants that should be settled before the investment is made.
Short-term investment
This option is usually suitable for those already well experienced in the cryptocurrency industry.
The reason is that there are two possibilities: you can either fail or succeed.
Near-term growth prospects are harder to judge than long-term ones.
As you saw at the start of 2017, once Bitcoin took off everyone began investing, and there was a positive trend in the market.
Relatively few people made the bold early decision to invest, and they happened to gain a great advantage.
Wasn't it worth it, despite the risk?
I think so, but short-term profit opportunities are never guaranteed; attempt them only if you are fully informed about the market.
You have heard of IPOs in the capital markets; the equivalent in cryptocurrency is the ICO (initial coin offering).
A new project sells a set portion of its coins in an ICO at a cheaper price.
Now is the moment to take advantage of such opportunities: participate in well-vetted coin offerings and buy as much as your plan allows.
Electroneum Coin
Take Electroneum, for example: its price was around $0.01 per coin during its ICO, and after the official launch it traded at around $0.30 per coin on Cryptopia.
That is a many-fold profit within a few days.
This is where the smarter choice comes in: there are various ways to invest for a short period, such as day trading, buying into ICOs, and holding coins briefly.
Before making a short-term decision, make sure you can answer the questions below:
  1. Where are you prepared to invest: an ICO, a specific coin, or a standard exchange?
  2. What is your investment timeframe?
  3. The most fundamental question: how much profit or loss are you willing to sell at?
  4. Do you have enough knowledge to invest, given the high short-term risk of failure?
  5. Can you keep up with crypto news and community discussion regularly?
By now, it seems like you're ready to invest.
All the information we've shared is based on experience up to 2019, but 2020 has now begun.
So prepare the latest strategy using the lessons accumulated in past years. Let me share the list of cryptocurrencies.
Now, I'll share the best cryptocurrencies to invest in for 2020.
Top 8 Best Cryptocurrencies to invest in 2020. I don't know how familiar you are with cryptocurrencies, but I'll try to share as much information as possible.
The market seems able to keep growing, but it's up to you to decide where to invest.
Let me walk you through the list:
1-Bitcoin (BTC)
Bitcoin is the mother of cryptocurrencies; most other cryptocurrencies are traded against Bitcoin.
And how could we overlook it?
The best part is that news about Bitcoin is always in the spotlight, which makes it useful for both long-term and short-term investment.
So finding market-moving information doesn't require much extra effort.
Want to find out more about Bitcoin?
Read on. What is Bitcoin, and how did it reach its current value? Bitcoin's price currently hovers around $4,000–4,500, and it is worth about $114 billion in total market capitalization.
Beyond its broad acceptance, it is worth buying, and you can even find ATMs to convert it into cash in the USA itself.
However, in terms of unit price it is much higher than other coins, and it takes a lot of capital to achieve a high overall return.
Still, the decision to invest in Bitcoin is a profitable one, and I find it among the best cryptocurrencies to invest in for 2020.
If you believe buying Bitcoin now is justified despite the risk, don't miss our definitive Bitcoin future guide.
2-Ethereum (ETH)
Ethereum is the second investment option; many coins from ICOs only accept Ethereum in exchange.
It has a market capitalization of about $23.5 billion.
From the price chart covering November 2016 to November 2019, we saw a very large increase in 2017, though Bitcoin was trending too.
That confirms its price changes are strongly influenced by Bitcoin's trend. In January 2018 it had risen to around $1,000 per coin.
Spotting trending news is therefore easier, given its clear correlation with Bitcoin.
Ethereum's best feature is that the team behind its development is well known, so it is much easier to follow their announcements and gauge the coin's likely future trajectory.
Ethereum: everything you need to know about mining, wallets, and its future.
3-Bitcoin Cash(BCH)
Bitcoin Cash is another significant cryptocurrency, created in 2017 from a fork of Bitcoin itself.
It was implemented to address Bitcoin's limited base block size, and its popularity and transaction speed have made it a strong investment alternative.
It is also worth knowing that if such a fork occurs on Bitcoin to create a new coin, anyone who holds Bitcoin receives an equivalent amount of the new coin for free.
That was the case with Bitcoin Cash, and I received a $100 bonus as a result.
Bitcoin Cash has a market capitalization of around $9.32 billion.
From the trend chart, it looks like it should be a priority when considering an investment alternative in 2020.
4-Ripple (XRP)
Do you think Ripple is the best cryptocurrency to invest in for 2020, or not?
Those who invested in mid-2017 had smiles on their faces near the end of 2017, when XRP hit around $3 per coin, up from $0.12.
It is widely known and commonly used; the reason is that it is not just a cryptocurrency but also a widely used payment-processing technology, with processing times comparable to Ethereum's.
It currently has a large market valuation of around US$15 billion.
The interesting thing about Ripple (XRP) is that major enterprises such as JP Morgan, American Express, and the national bank of Saudi Arabia use Ripple's technology to clear their transactions.
submitted by parimageek to u/parimageek [link] [comments]

I built a decentralized legal-binding smart contract system. I need peer reviewers and whitepaper proofreaders. Help greatly appreciated!

I posted this on /cryptotechnology . It attracted quite a few upvotes but not many potential contributors. Someone mentioned I should try this sub. I read the rules and it seems to fit within them. Hope this kind of post is alright here...
EDIT: My mother tongue is French (I'm from Montreal, Canada). Please excuse any blatant grammatical errors.
TLDR: I built a decentralized legal-binding smart contract system. I need peer reviewers and whitepaper proofreaders. If you're interested, send me an email to discuss: [email protected] . Thanks in advance!
Hi guys,
For the last few years, I've been working on a decentralized legal-binding contract system. Basically, I created PoW blockchain software that can receive one hash as an address, and another hash as a bucket, in each transaction.
The address hash is used to tell a specific entity (application/contract/company/person, etc.) that uses the blockchain that a transaction might be addressed to it. The bucket hash tells the nodes which hash tree of files they need to download in order to execute that contract.
Buckets are shared within the network of nodes. Someone could, for example, write a contract with a series of nodes to have them host data. Buckets can hold any kind of data and can be of any size, including encrypted data.
The blockchain's blocks are chained together using a mining system similar to Bitcoin's (the hashcash algorithm). Each block contains transactions. The required difficulty increases linearly with the number of transactions in a block. Then, when a block is mined properly, another smaller mining effort is requested to link the block to the network's head block.
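A hashcash-style sketch of the mining rule just described, with difficulty (leading zero bits) growing linearly in the transaction count; the base difficulty and linear factor here are illustrative assumptions, not the project's actual parameters:

```python
# Toy hashcash miner: a block is valid when SHA-256(block || nonce) has
# enough leading zero bits, and the requirement grows linearly with the
# number of transactions packed into the block.

import hashlib
from itertools import count

def required_zero_bits(num_txs, base=8, per_tx=1):
    """Toy linear difficulty: more transactions means a harder target."""
    return base + per_tx * num_txs

def mine(block_bytes, num_txs):
    target_bits = required_zero_bits(num_txs)
    for nonce in count():
        digest = hashlib.sha256(block_bytes + str(nonce).encode()).digest()
        if int.from_bytes(digest, "big") >> (256 - target_bits) == 0:
            return nonce  # found a hash with enough leading zero bits

nonce = mine(b"block-with-2-txs", num_txs=2)
print(nonce >= 0)  # True
```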
To replace a block, you need to create another block containing more transactions than the total transacted in and after the mined block.
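That replacement rule can likewise be sketched (the transaction counts below are illustrative):

```python
# Toy version of the replacement rule: a competing block displaces a mined
# block only if it carries more transactions than everything transacted
# in that block and after it.

def can_replace(candidate_tx_count, chain_tx_counts, index):
    """chain_tx_counts: tx count per block; index: block to replace."""
    at_and_after = sum(chain_tx_counts[index:])
    return candidate_tx_count > at_and_after

chain = [10, 4, 6]                # tx counts of three mined blocks
print(can_replace(9, chain, 1))   # False: needs more than 4 + 6 = 10
print(can_replace(11, chain, 1))  # True
```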
I expect current payment processors to begin accepting transactions and mining them for their customers, making money on fees in parallel. Under such a mechanism, miners will need a lot of available bandwidth to keep downloading other miners' blocks, just like today's payment processors.
Contracts are code written in our custom programming language. Their code is pushed using a transaction and hosted in buckets. As you can see, a contract's data is off-chain; only its bucket hash is on-chain. A contract can listen to events that occur on the blockchain, in any bucket hosted by nodes, or on any website that can be crawled and parsed by the contract.
There is also an identity system and a vouching system, which enable the creation of soft money (a promise of future payment in hard money, our cryptocurrency, if a series of events occurs).
The contracts can also be compiled to a legally binding framework and potentially be used in court. The contracts currently compile to English and French only.
I also built a browser that contains a 3D viewport, using OpenGL. The browser includes a domain name system (DNS) in the form of contracts. Anyone can buy a new domain by creating a transaction with a bucket containing code that reserves a specific name. When a user requests a domain name, the browser discovers the bucket attached to the domain, downloads it, and executes its scripts, which render in the 3D viewport.
When people interact with an application, the application can create contracts on behalf of the user and send them to the blockchain via a transaction. This enables normal users (non-developers) to interact with others through legal contracts, using GUI software.
The hard money (cryptocurrency) is all pre-mined and will be sold to entities (people/companies) that want to use the network. The hard money can be resold using the contract proposition system, for payment in cash or by bank transfer. The fiat funds will go to my company to create services that use this network of contracts. The goal is to use the funds to grow the network and increase demand for hard money. For now, we plan to create:
A logistic and transportation company
A delivery company
A company that buys and sells real estate options
A company that manages real estate
A software development company
A world-wide fiat money transfer company
A payment processor company
We chose these niches because our team has a lot of experience in these areas: we currently run companies in these fields. These niches also generate a lot of revenue and expenses, making the value of exchanges high. We expect this to drive volume in contracts and in soft- and hard-money exchanges.
We also plan to use the funds to create a venture capital fund that invests in startups that want to build contracts on our network to deliver a specific service in a specific niche.
I'm about to release the software open source very soon and begin executing our commercial activities on the network. Before launching, I'd like to open a discussion with the community regarding the details of how this software works and how it is explained in the whitepaper.
If you'd like to read the whitepaper and open a discussion with me regarding how things work, please send me an email at [email protected] .
If you have any comments, please comment below and I'll try to answer every question. Please note that before the software and whitepaper are peer-reviewed, I'd like to keep the specific details of the software private, but I can discuss the general design. A release date will be given once my work has been peer reviewed.
Thanks all in advance!
P.S.: This project is not a competitor to Bitcoin. My goal with this project is to enable companies to write contracts together, easily follow the events executed in their contracts, understand what to expect from their partnerships and what they need to give in order to receive their share of deals, and sell contracts they no longer need to other community members.
Bitcoin already has a network of people that uses it. It has its own value. In fact, I plan to create contracts on our network to exchange value from our network for bitcoin and vice versa. The same goes for any commodity or currency that currently exists in this world.
submitted by steve-rodrigue to compsci [link] [comments]

How EpiK Protocol “Saved the Miners” from Filecoin with the E2P Storage Model?

https://preview.redd.it/n5jzxozn27v51.png?width=2222&format=png&auto=webp&s=6cd6bd726582bbe2c595e1e467aeb3fc8aabe36f
On October 20, Eric Yao, Head of EpiK China, and Leo, Co-Founder & CTO of EpiK, visited the Deep Chain Online Salon and discussed “How EpiK saved the miners eliminated by Filecoin by launching the E2P storage model.” The following is a transcript of the sharing session.
Sharing Session
Eric: Hello, everyone, I’m Eric, graduated from School of Information Science, Tsinghua University. My Master’s research was on data storage and big data computing, and I published a number of industry top conference papers.
Since 2013, I have invested in Bitcoin, Ethereum, Ripple, Dogecoin, EOS, and other well-known blockchain projects, and I have been active in the crypto community as an early technology-focused investor and industry observer. I am also a blockchain community initiator and technology evangelist.
Leo: Hi, I’m Leo, I’m the CTO of EpiK. Before I got involved in founding EpiK, I spent 3 to 4 years working on blockchain, public chain, wallets, browsers, decentralized exchanges, task distribution platforms, smart contracts, etc., and I’ve made some great products. EpiK is an answer to the question we’ve been asking for years about how blockchain should be landed, and we hope that EpiK is fortunate enough to be an answer for you as well.
Q & A
Deep Chain Finance:
First of all, let me ask Eric, on October 15, Filecoin’s main website launched, which aroused everyone’s attention, but at the same time, the calls for fork within Filecoin never stopped. The EpiK protocol is one of them. What I want to know is, what kind of project is EpiK Protocol? For what reason did you choose to fork in the first place? What are the differences between the forked project and Filecoin itself?
Eric:
First of all, let me answer the first question, what kind of project is EpiK Protocol.
With the Fourth Industrial Revolution already upon us, comprehensive intelligence is one of the core goals of this stage, and the key to comprehensive intelligence is making machines understand what humans know and learn new knowledge based on what they already know. Building knowledge graphs at scale is a key step towards full intelligence.
In order to solve the many challenges of building large-scale knowledge graphs, the EpiK Protocol was born. EpiK Protocol is a decentralized, hyper-scale knowledge graph that organizes and incentivizes knowledge through decentralized storage technology, decentralized autonomous organizations, and generalized economic models. Members of the global community will expand the horizons of artificial intelligence into a smarter future by organizing all areas of human knowledge into a knowledge graph that will be shared and continuously updated as an eternal knowledge vault for humanity.
And then, for what reason was the fork chosen in the first place?
EpiK’s project founders are all senior blockchain industry practitioners and have been closely following the industry development and application scenarios, among which decentralized storage is a very fresh application scenario.
However, during Filecoin's development, the team found that due to certain design mechanisms and historical reasons, Filecoin had deviated from the project's original intentions: for example, the overly harsh penalty mechanism weakens security rather than strengthening it, and computing power competition has produced a computing power monopoly among large miners, who thereby monopolize packaging rights and can inflate their computing power by uploading useless data themselves.
These problems will cause the data environment on Filecoin to get worse and worse, leading to on-chain data with little real value, high data redundancy, and difficulty commercializing the project.
Having identified the above problems, the project proposes to introduce multiple roles and a decentralized collaboration platform (a DAO) to ensure the high value of on-chain data through a reasonable economic model and incentive mechanism, and to store the high-value data (the knowledge graph) on the blockchain through decentralized storage, so that the lack of on-chain data value and the computing power monopoly of large miners can be solved to a large extent.
Finally, what differences exist between the forked project and Filecoin itself?
Building on the issues above, EpiK's design is very different from Filecoin's. First, EpiK is more focused in its business model: it targets a different market and track from the cloud storage market where Filecoin sits, because decentralized storage has no advantage over professional centralized cloud storage in terms of storage cost and user experience.
EpiK focuses on building a decentralized knowledge graph, which reduces data redundancy and safeguards the value of the data stored on-chain while preventing the knowledge graph from being tampered with by a few people, thus making the commercialization of the entire project reasonable and feasible.
From the perspective of ecosystem construction, EpiK treats miners more kindly and solves Filecoin's pain points to a large extent. First, it replaces Filecoin's storage collateral and commitment collateral with a one-time collateral.
Miners participating in EpiK Protocol are only required to pledge 1,000 EPK each, and only once before mining, not for each sector.
To put 1,000 EPK in perspective: you only need to participate in pre-mining for about 50 days to earn the tokens used for pledging. The EPK pre-mining campaign is currently underway and runs from early September to December, with a daily release of 50,000 ERC-20 standard EPK. The pre-mining nodes whose applications are approved divide these tokens according to their share of mining that day, and the tokens can be exchanged 1:1 once the main network launches. This move will continue to expand the number of miners eligible to participate in EPK mining.
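The "about 50 days" figure can be sanity-checked with a quick calculation. A minimal sketch, where the 0.04% network share is a purely hypothetical miner chosen so the quoted numbers line up:

```python
# Back-of-the-envelope check of the pre-mining figures quoted above:
# 50,000 EPK released per day, 1,000 EPK one-time pledge per miner.
DAILY_RELEASE = 50_000   # EPK released to pre-mining nodes per day
PLEDGE = 1_000           # one-time pledge required per miner

def days_to_pledge(network_share: float) -> float:
    """Days a miner needs to earn the pledge, given its fraction of
    the daily release (e.g. 0.0004 = 0.04%, a hypothetical miner)."""
    daily_earnings = DAILY_RELEASE * network_share
    return PLEDGE / daily_earnings
```

So the ~50-day claim corresponds to a miner capturing roughly 0.04% of each day's release; a larger share shortens the wait proportionally.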
Secondly, EpiK has a more lenient penalty mechanism than Filecoin's official consensus, storage and contract penalties. Because data can only be uploaded by field experts (the "Expert to Person" mode) and every piece of data is backed up across miners, one or more miners going offline does not have much impact on the network. A miner who fails to submit the proof of spacetime in time because of being offline only forfeits the effective computing power of that sector, not the pledged coins.
If the miner re-submits the proof of spacetime within 28 days, he regains that power.
Unlike Filecoin's 32 GB sectors, EpiK's encapsulated sectors are smaller, only 8 MB each, which largely solves Filecoin's sector space wastage problem, and all miners have the opportunity to complete fast encapsulation, which is very friendly to miners with little computing power.
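As a quick arithmetic aside (not from the interview), the difference in encapsulation granularity is easy to quantify:

```python
# One 32 GB Filecoin sector holds as much data as 4,096 of EpiK's 8 MB
# sectors, which is why small miners get far more chances to finish a seal.
FILECOIN_SECTOR_MB = 32 * 1024  # 32 GB expressed in MB
EPIK_SECTOR_MB = 8

sectors_per_filecoin_sector = FILECOIN_SECTOR_MB // EPIK_SECTOR_MB
```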
Constraints on data volume and quality will also ensure that the gap in effective computing power between large and small miners does not widen.
Finally, unlike Filecoin's P2P data uploading model, EpiK changes data uploading and maintenance to E2P uploading: field experts upload the data and guarantee its quality and value on the chain, while a rational economic model introduces a game relationship between the data storage roles and the data generation roles to ensure the stability of the whole system and a continuous high-quality output of on-chain data.
Deep Chain Finance:
Eric, on the eve of Filecoin's mainnet launch, issues such as Filecoin's pre-collateral aroused a lot of controversy among the miners. In your opinion, what kind of impact will Filecoin's launch have on itself and on the whole distributed storage ecosystem? Do you think the current chaotic FIL prices are reasonable, and what should a normal price of FIL be?
Eric:
The Filecoin mainnet has launched and many latent problems have been exposed, such as the aforementioned high pre-collateral, the storage resource waste and computing power monopoly caused by unreasonable sector encapsulation, and the harsh penalty mechanism. These problems are quite serious and will greatly affect the development of the Filecoin ecosystem.
Here are two examples. Take the computing power monopoly of big miners: once big miners have monopolized computing power, a very delicate state emerges. When a miner stores a file for an ordinary user, there is no way to verify on-chain whether what he stored was uploaded by someone else or by himself, because a miner can fake another identity and upload data for himself. That means that when any miner chooses which data to store, he has only one goal: to inflate his computing power, as fast as he can.
In terms of computing power, there is no difference between storing other people's data and storing my own. When I store someone else's data, I don't know that data; the uploader may be anywhere in the world, and the bandwidth between us may not be good enough.
So the best option is to store my own local data. That logic leads to no one storing other people's data on the chain at all: everyone stores only their own data because it is the most economical for them, and the network ends up with essentially no storage utility, since no one is providing storage for the mass of retail users.
The harsh penalty mechanism will also severely deplete miners' profits, because a DDoS attack is a very common technique for an attacker: a big miner can gain a very high profit in a short period of time by attacking others, and this is profitable for every big miner.
As things stand, the vast majority of miners do not maintain their infrastructure well, so they are poorly protected against even low-grade DDoS attacks. The penalty regime is therefore grim for them.
The contradiction between an unreasonable system and real demand will inevitably push the system to evolve in a more reasonable direction, so there will be many forked projects with more reasonable mechanisms, attracting Filecoin miners and diverting storage power.
Since all of these projects are on the decentralized storage track, their requirements for miners are similar or even mutually compatible, so miners will gravitate to the forked projects with better economic benefits and business scenarios, filtering out the projects with real value on the ground.
As for the chaotic FIL price: FIL is a project that has been through several years and carries too many expectations, so one can only say the current situation has its reasons for existing. There is no way to predict a reasonable price for FIL, because in the long run one must consider whether the project can commercialize and whether the on-chain data carries actual value. In other words, we need to keep observing whether Filecoin becomes a game of computing power or a real value carrier.
Deep Chain Finance:
Leo, we just mentioned that the pre-collateral issue of Filecoin caused dissatisfaction among miners, and after the Filecoin mainnet launch, the second round of space race test coins was directly converted into real coins, and the official selling of FIL hit the market, so many miners said they were betrayed. What I want to know is, EpiK's main motto is "save the miners eliminated by Filecoin": how does EpiK deal with Filecoin's various problems, and how will it achieve this "save"?
Leo:
Filecoin's tacit approval of computing power inflation amounted to the officials openly abandoning the small miners. And converting test coins into real coins hurt the interests of the loyal big miners in one stroke. We do not know why such low-level mistakes were made; we can only regret them.
EpiK did not set out to fork Filecoin. Rather, to build a shared knowledge graph ecosystem, EpiK had to integrate decentralized storage, so it chose Filecoin's most hardcore pieces: the PoRep and PoSt decentralized verification technology. To ensure the quality of knowledge graph data, EpiK only allows community-voted field experts to upload data, so EpiK naturally prevents miners from inflating computing power, and valueless data has no reason to occupy such expensive decentralized storage resources.
With computing power inflation impossible, the difference between big miners and small miners is minimal while the amount of knowledge graph data is still small.
We can’t say that we can save the big miners, but we are definitely the optimal choice for the small miners who are currently in the market to be eliminated by Filecoin.
Deep Chain Finance:
Let me ask Eric: According to EpiK protocol, EpiK adopts the E2P model, which allows only experts in the field who are voted to upload their data. This is very different from Filecoin’s P2P model, which allows individuals to upload data as they wish. In your opinion, what are the advantages of the E2P model? If only voted experts can upload data, does that mean that the EpiK protocol is not available to everyone?
Eric:
First, let me explain the advantages of the E2P model over the P2P model.
There are five roles in the DAO ecosystem: miners, coin holders, field experts, bounty hunters and gateways. These five roles share the EPK generated every day once the main network launches.
Miners receive 75% of the EPK, field experts receive 9%, and voting users share 1%.
The remaining 15% of the EPK fluctuates based on daily network traffic, and that 15% is partly a game between miners and field experts.
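The fixed split above can be sketched in a few lines. The daily issuance figure is an assumption borrowed from the pre-mining release rate quoted earlier in the interview; the mainnet figure may differ:

```python
# Sketch of the daily EPK reward split described above.
# DAILY_ISSUANCE is illustrative only (reused from the pre-mining rate).
DAILY_ISSUANCE = 50_000

SHARES = {
    "miners": 0.75,         # 75% to miners
    "field_experts": 0.09,  # 9% to field experts
    "voters": 0.01,         # 1% shared by voting users
    "floating": 0.15,       # 15% contested between miners and experts
}

def daily_rewards(issuance: float = DAILY_ISSUANCE) -> dict:
    """Split one day's issuance across the fixed shares."""
    return {role: issuance * share for role, share in SHARES.items()}
```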
Let me first describe the relationship between these two roles.
The first group of field experts was selected by the Foundation and covers different areas of knowledge (a wide range, including not only serious subjects but also home, food, travel, etc.). This group of field experts can recommend the next group, and a recommended expert only needs to receive 100,000 EPK in votes to become a field expert.
The field expert’s role is to submit high-quality data to the miner, who is responsible for encapsulating this data into blocks.
Network activity is judged by the amount of EPK the entire network pledges for daily traffic (1 EPK = 10 MB/day). A higher ratio indicates higher data demand, which requires miners to increase bandwidth quality; a lower ratio requires field experts to provide higher-quality data.
It is like a library: when there are more visitors, more seats are needed, i.e., the miners are paid to upgrade bandwidth; when there are fewer visitors, more money goes to better books to attract visitors, i.e., the bounty hunters and field experts are paid to generate more high-quality knowledge graph data. This game between miners and field experts is the most important game in the ecosystem, unlike the game between the officials and big miners in the Filecoin ecosystem.
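The traffic rule and the library analogy can be sketched together. The 50% threshold and the winner-of-the-floating-15% rule below are illustrative assumptions of mine, not taken from EpiK's documentation; only the 1 EPK = 10 MB/day conversion comes from the interview:

```python
# Sketch of the floating-15% game, assuming a made-up 50% demand threshold.
def implied_traffic_mb(pledged_epk: float) -> float:
    """1 pledged EPK entitles 10 MB/day of traffic, per the interview."""
    return pledged_epk * 10.0

def floating_share(pledged_epk: float, capacity_mb: float) -> str:
    """Which role the floating 15% favours (hypothetical rule)."""
    demand_ratio = implied_traffic_mb(pledged_epk) / capacity_mb
    # High demand: pay miners to add bandwidth ("more seats").
    # Low demand: pay experts/bounty hunters for better data ("better books").
    return "miners" if demand_ratio >= 0.5 else "experts"
```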
This game relationship between data producers and data storers, together with a more rational economic model, will inevitably make the E2P model generate and store on-chain data of much higher quality than the P2P model, with better access bandwidth as well, resulting in greater business value and better landing scenarios.
I will then answer the question of whether this means that the EpiK protocol will not be universally accessible to all.
The E2P model only constrains the quality of the data generated and stored, not who can play a role in the ecosystem. On the contrary, with the introduction of the DAO model, the variety of roles in the EpiK ecosystem (which includes roles for ordinary people, such as bounty hunters who are competent at their tasks) gives everyone a logical way to participate in the system.
For example, a miner with computing power can provide storage, a person with a certain domain knowledge can apply to become an expert (this includes history, technology, travel, comics, food, etc.), and a person willing to mark and correct data can become a bounty hunter.
The presence of various efficient support tools from the project owner will lower the barriers to entry for various roles, thus allowing different people to do their part in the system and together contribute to the ongoing generation of a high-quality decentralized knowledge graph.
Deep Chain Finance:
Leo, some time ago, EpiK released a white paper and an economy whitepaper, explaining the EpiK concept from the perspective of technology and economy model respectively. What I would like to ask is, what are the shortcomings of the current distributed storage projects, and how will EpiK protocol be improved?
Leo:
Distributed storage can easily be confused with systems like Alibaba's OceanBase, but in the blockchain field we should focus on decentralized storage first.
There is a big problem with the decentralized storage on the market now, which can be summed up by the old idiom "why not eat meat porridge?", i.e., a solution detached from users' real circumstances.
How to understand this? By its technical principles, decentralized storage cannot be cheaper than centralized storage; if it ever is, the centralized storage being compared must be rubbish.
What incentive does the average user have to spend more money on decentralized storage to store data?
Is it safer?
No: miners on decentralized storage can shut down at any time, which is by no means as secure as keeping a copy each with Alibaba and Amazon.
More private?
There's no difference between storing encrypted data on decentralized storage and storing encrypted data on Amazon.
Faster?
The scattered consumer bandwidth behind decentralized storage simply doesn't compare to the fiber in a centralized server room. This is the root problem of the business model: no one uses it, no one buys it, so what good is the big vision?
EpiK's goal is to guide all community participants to jointly build and share field knowledge graph data, which is the best way for machines to understand human knowledge. The more knowledge graph data there is, the more knowledge a robot has, and the exponentially more intelligent it becomes. In other words, EpiK uses decentralized storage technology to capture the value of exponentially growing data at linearly growing hardware cost, and that is where the buy-in for EPK comes from.
Organized data is worth a lot more than organized hard drives, and there is a demand for EPK when robots have the need for intelligence.
Deep Chain Finance:
Let me ask Leo, how many forked projects does Filecoin have so far, roughly? Do you think there will be more or less waves of fork after the mainnet launches? Have the requirements of the miners at large changed when it comes to participation?
Leo:
We don't have specific statistics. Now that the main network has launched, we expect forked projects to increase; there are so many stranded miners in the market that they need to be organized efficiently.
However, most of the forked projects we currently see simply modify the parameters of Filecoin's economic model. That is undesirable: this level of modification cannot change the status quo of miners inflating computing power, and its only effect on the market is to make some big miners feel more comfortable mining, which does nothing to help decentralized storage land.
We need more reasonable landing scenarios so that idle mining resources can be turned into effective productivity, instead of pitching yet another 100x coin and whipping up one wave of FOMO after another.
Deep Chain Finance:
How far along is the EpiK Protocol project, Eric? What other big moves are coming in the near future?
Eric:
The development of the EpiK Protocol is divided into five major phases:
Phase I: the test network "Obelisk".
Phase II: Main Network 1.0 "Rosetta".
Phase III: Main Network 2.0 "Hammurabi".
Phase IV: enriching the knowledge graph toolkit.
Phase V: enriching the knowledge graph application ecosystem.
We are currently in the first phase, the test network "Obelisk". Anyone can sign up to participate in the test network pre-mining to obtain ERC-20 EPK tokens, which can be exchanged one-to-one after the mainnet launches.
We have recently listed ERC-20 EPK on Uniswap; you can buy and sell it freely there or download our EpiK mobile wallet.
In addition, we will soon launch the EpiK Bounty platform, and we welcome all community members to do tasks together to build the EpiK community. At the same time, we are also pushing forward listings on centralized exchanges.
Users’ Questions
User 1:
Some KOLs say Filecoin has already consumed its value for the next few years, so it will plunge. What do you think?
Eric:
First of all, a judgment about the market has to be matched to the cycle. Before deciding you are not optimistic about FIL, first decide whether you are pessimistic about the project's economic model or about the distributed storage track itself.
We are very confident in the distributed storage track. It will certainly go through cycles of growth and decline, and those cycles help select the better projects.
Since the existing group of miners and the computing power already deployed are fixed, and since EpiK miners and FIL miners are compatible, miners can at any time switch to the more promising and economically viable projects.
As for "Filecoin has consumed the value of the next few years, so it will plunge": a plunge is not something we predict. In this industry one should keep iterating on learning and value judgment, because market sentiment is only one factor among more important ones, such as the big washout in March this year. One can only say such events slow down the development of the FIL community; prices themselves are unpredictable.
User2:
Actually, in the end, if there are no applications and no one really uploads data, the market value will drop. So what are the landing applications of EpiK?
Leo: The best and most direct application of EpiK’s knowledge graph is the question and answer system, which can be an intelligent legal advisor, an intelligent medical advisor, an intelligent chef, an intelligent tour guide, an intelligent game strategy, and so on.
submitted by EpiK-Protocol to u/EpiK-Protocol [link] [comments]

Why i’m bullish on Zilliqa (long read)

Edit: TL;DR added in the comments
 
Hey all, I've been researching coins since 2017 and have gone through hundreds of them in the last 3 years. I got introduced to blockchain via Bitcoin of course, analyzed Ethereum thereafter, and from that moment I've had a keen interest in smart contract platforms. I'm passionate about Ethereum, but I find Zilliqa to have a better risk-reward ratio, especially because Zilliqa has, in my opinion, found an elegant balance between being secure, decentralized and scalable.
 
Below I post my analysis of why from all the coins I went through I’m most bullish on Zilliqa (yes I went through Tezos, EOS, NEO, VeChain, Harmony, Algorand, Cardano etc.). Note that this is not investment advice and although it's a thorough analysis there is obviously some bias involved. Looking forward to what you all think!
 
Fun fact: the name Zilliqa is a play on 'silica' (silicon dioxide), as in "silicon for the high-throughput consensus computer."
 
This post is divided into (i) Technology, (ii) Business & Partnerships, and (iii) Marketing & Community. I’ve tried to make the technology part readable for a broad audience. If you’ve ever tried understanding the inner workings of Bitcoin and Ethereum you should be able to grasp most parts. Otherwise, just skim through and once you are zoning out head to the next part.
 
Technology and some more:
 
Introduction
 
The technology is one of the main reasons why I’m so bullish on Zilliqa. First thing you see on their website is: “Zilliqa is a high-performance, high-security blockchain platform for enterprises and next-generation applications.” These are some bold statements.
 
Before we deep dive into the technology let’s take a step back in time first as they have quite the history. The initial research paper from which Zilliqa originated dates back to August 2016: Elastico: A Secure Sharding Protocol For Open Blockchains where Loi Luu (Kyber Network) is one of the co-authors. Other ideas that led to the development of what Zilliqa has become today are: Bitcoin-NG, collective signing CoSi, ByzCoin and Omniledger.
 
The technical white paper was made public in August 2017, and since then they have achieved everything stated in it, and have also created their own open-source intermediate-level smart contract language called Scilla (a functional programming language similar to OCaml).
 
Mainnet has been live since the end of January 2019, with daily transaction rates growing continuously. About a week ago mainnet reached 5 million transactions and 500,000+ addresses in total, with 2,400 nodes keeping the network decentralized and secure. Circulating supply is nearing 11 billion, and only mining rewards remain to be issued. The maximum supply is 21 billion, with annual inflation currently at 7.13% and only decreasing over time.
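A rough sanity check of those supply figures (the inflation number is applied naively to the circulating supply here; actual emission follows the mining-reward schedule):

```python
# Zilliqa supply arithmetic from the figures quoted above.
MAX_SUPPLY = 21_000_000_000    # ZIL
CIRCULATING = 11_000_000_000   # ZIL (approximate, at the time of writing)
ANNUAL_INFLATION = 0.0713      # 7.13% per year, decreasing over time

remaining_rewards = MAX_SUPPLY - CIRCULATING      # ZIL left as mining rewards
annual_issuance = CIRCULATING * ANNUAL_INFLATION  # implied new ZIL per year
```

That puts the remaining mining rewards at 10 billion ZIL and the implied issuance near 784 million ZIL per year.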
 
Zilliqa realized early on that the usage of public cryptocurrencies and smart contracts was increasing, but decentralized, secure, and scalable alternatives were lacking in the crypto space. They proposed to apply sharding to a public smart contract blockchain, where the transaction rate increases almost linearly with the number of nodes. More nodes = higher transaction throughput and increased decentralization. Sharding comes in many forms, and Zilliqa uses network, transaction and computational sharding. Network sharding opens up the possibility of using transaction and computational sharding on top. Zilliqa does not use state sharding for now. We'll come back to this later.
 
Before we continue dissecting how Zilliqa achieves this from a technological standpoint, it's good to keep in mind that making a blockchain decentralized, secure and scalable at the same time is still one of the main hurdles to widespread usage of decentralized networks. In my opinion this needs to be solved before blockchains can get to the point where they create and add large-scale value. So I invite you to read the next section to grasp the underlying fundamentals. After all, these premises need to be true, otherwise there isn't a fundamental case to be bullish on Zilliqa, right?
 
Down the rabbit hole
 
How have they achieved this? Let’s define the basics first: key players on Zilliqa are the users and the miners. A user is anybody who uses the blockchain to transfer funds or run smart contracts. Miners are the (shard) nodes in the network who run the consensus protocol and get rewarded for their service in Zillings (ZIL). The mining network is divided into several smaller networks called shards, which is also referred to as ‘network sharding’. Miners subsequently are randomly assigned to a shard by another set of miners called DS (Directory Service) nodes. The regular shards process transactions and the outputs of these shards are eventually combined by the DS shard as they reach consensus on the final state. More on how these DS shards reach consensus (via pBFT) will be explained later on.
 
The Zilliqa network produces two types of blocks: DS blocks and Tx blocks. One DS block consists of 100 Tx blocks. And as previously mentioned there are two types of nodes concerned with reaching consensus: shard nodes and DS nodes. Whether a node becomes a shard node or a DS node is determined by the result of a PoW cycle (Ethash) at the beginning of a DS block. All candidate mining nodes compete with each other, running the PoW (Proof-of-Work) cycle for 60 seconds, and the submissions achieving the highest difficulty are allowed onto the network. To put it in perspective: the average difficulty for one DS node is ~2 Th/s, equaling 2,000,000 Mh/s, or 55 thousand+ GeForce GTX 1070 8 GB GPUs at 35.4 Mh/s each. Each DS block, 10 new DS nodes are admitted. A shard node currently needs to provide around 8.53 GH/s (around 240 GTX 1070s). Dual mining ETH/ETC and ZIL is possible and can be done via mining software such as Phoenix and Claymore. There are pools, and if you have large amounts of (Ethash) hashing power available you could mine solo.
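Those GPU equivalences are straightforward to reproduce, using the 35.4 Mh/s GTX 1070 figure from the post (real-world hashrates vary with clocks and drivers):

```python
# Reproducing the GPU-equivalent arithmetic quoted above (Ethash).
GPU_MHS = 35.4  # assumed GTX 1070 hashrate in Mh/s

def gpus_needed(hashrate_mhs: float) -> int:
    """GPUs needed to reach a target hashrate, rounded to nearest."""
    return round(hashrate_mhs / GPU_MHS)

ds_node_gpus = gpus_needed(2_000_000)  # 2 Th/s DS-node difficulty
shard_node_gpus = gpus_needed(8_530)   # 8.53 GH/s shard-node target
```

This lands at roughly 56,500 GPUs for a DS node and about 241 for a shard node, matching the "55 thousand+" and "around 240" figures above.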
 
The PoW cycle of 60 seconds is a peak performance and acts as an entry ticket to the network. The entry ticket is a sybil resistance mechanism and makes it incredibly hard for adversaries to spawn lots of identities and manipulate the network with them. After every 100 Tx blocks, which corresponds to roughly 1.5 hours, this PoW process repeats. In between those 1.5 hours no PoW needs to be done, meaning Zilliqa's energy consumption to keep the network secure is low. For more detailed information on how mining works click here.
Okay, hats off to you. You have made it this far. Before we go any deeper down the rabbit hole we first must understand why Zilliqa goes through all of the above technicalities and understand a bit more what a blockchain on a more fundamental level is. Because the core of Zilliqa’s consensus protocol relies on the usage of pBFT (practical Byzantine Fault Tolerance) we need to know more about state machines and their function. Navigate to Viewblock, a Zilliqa block explorer, and just come back to this article. We will use this site to navigate through a few concepts.
 
We have established that Zilliqa is a public and distributed blockchain. Meaning that everyone with an internet connection can send ZILs, trigger smart contracts, etc. and there is no central authority who fully controls the network. Zilliqa and other public and distributed blockchains (like Bitcoin and Ethereum) can also be defined as state machines.
 
Taking the liberty of paraphrasing examples and definitions given by Samuel Brooks’ medium article, he describes the definition of a blockchain (like Zilliqa) as: “A peer-to-peer, append-only datastore that uses consensus to synchronize cryptographically-secure data”.
 
Next, he states that: "blockchains are fundamentally systems for managing valid state transitions”. For some more context, I recommend reading the whole medium article to get a better grasp of the definitions and understanding of state machines. Nevertheless, let’s try to simplify and compile it into a single paragraph. Take traffic lights as an example: all its states (red, amber, and green) are predefined, all possible outcomes are known and it doesn’t matter if you encounter the traffic light today or tomorrow. It will still behave the same. Managing the states of a traffic light can be done by triggering a sensor on the road or pushing a button resulting in one traffic lights’ state going from green to red (via amber) and another light from red to green.
 
With public blockchains like Zilliqa, this isn't so straightforward and simple. It started with block #1 almost 1.5 years ago, and every 45 seconds or so a new block linked to the previous one is added, resulting in a chain of blocks with transactions that everyone can verify from block #1 to the current block #647,000+. The state is ever-changing and the states it can find itself in are infinite. And while a traffic light might work in tandem with various other traffic lights, that is rather insignificant compared to a public blockchain, because Zilliqa consists of 2,400 nodes who need to work together to reach consensus on the latest valid state while some of these nodes may have latency or broadcast issues, drop offline, or deliberately try to attack the network.
 
Now go back to the Viewblock page, take a look at the number of transactions, addresses, block height and DS height, and hit refresh. As expected, you see newly incremented values for one or all of the parameters. And how did the Zilliqa blockchain manage to transition from the previous valid state to the latest valid state? By using pBFT to reach consensus on the latest valid state.
 
After having obtained the entry ticket, miners execute pBFT to reach consensus on the ever-changing state of the blockchain. pBFT requires a series of network communications between nodes, and as such there is no GPU work involved (only CPU). As a result, the total energy consumed to keep the blockchain secure, decentralized and scalable is low.
 
pBFT stands for practical Byzantine Fault Tolerance and is an optimization on the Byzantine Fault Tolerant algorithm. To quote Blockonomi: “In the context of distributed systems, Byzantine Fault Tolerance is the ability of a distributed computer network to function as desired and correctly reach a sufficient consensus despite malicious components (nodes) of the system failing or propagating incorrect information to other peers.” Zilliqa is such a distributed computer network and depends on the honesty of the nodes (shard and DS) to reach consensus and to continuously update the state with the latest block. If pBFT is a new term for you I can highly recommend the Blockonomi article.
 
The idea of pBFT was introduced in 1999 - one of the authors even won a Turing award for it - and it is well researched and applied in various blockchains and distributed systems nowadays. If you want more advanced information than the Blockonomi link provides click here. And if you’re in between Blockonomi and the University of Singapore read the Zilliqa Design Story Part 2 dating from October 2017.
Quoting from the Zilliqa tech whitepaper: “pBFT relies upon a correct leader (which is randomly selected) to begin each phase and proceed when the sufficient majority exists. In case the leader is byzantine it can stall the entire consensus protocol. To address this challenge, pBFT offers a view change protocol to replace the byzantine leader with another one.”
 
pBFT can tolerate up to ⅓ of the nodes being dishonest (offline counts as Byzantine, i.e. dishonest) and the consensus protocol will function without stalling or hiccups. Once more than ⅓ but no more than ⅔ of the nodes are dishonest, the network will stall and a view change will be triggered to elect a new DS leader. Only when more than ⅔ of the nodes are dishonest do double-spend attacks become possible.
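These thresholds are easy to model. The sketch below classifies network health using the ⅓ and ⅔ fractions quoted above; it is a simplified illustration, not Zilliqa's actual code:

```python
# Classify a pBFT network's health per the 1/3 and 2/3 thresholds above.
# Integer multiplication avoids floating-point fractions.

def pbft_status(n: int, dishonest: int) -> str:
    """Return 'live', 'stalled', or 'unsafe' for n nodes with the
    given number of dishonest (Byzantine or offline) nodes."""
    if dishonest * 3 <= n:        # up to 1/3 dishonest: consensus proceeds
        return "live"
    if dishonest * 3 <= 2 * n:    # between 1/3 and 2/3: stall + view change
        return "stalled"
    return "unsafe"               # more than 2/3: double-spends possible

# A 600-node DS committee, per these fractions:
print(pbft_status(600, 200))  # live
print(pbft_status(600, 201))  # stalled
print(pbft_status(600, 401))  # unsafe
```

(The classical pBFT bound is slightly stricter, f ≤ ⌊(n−1)/3⌋, but the fractions above match how the text describes it.)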
 
If the network stalls no transactions can be processed and one has to wait until a new honest leader has been elected. When the mainnet was just launched and in its early phases, view changes happened regularly. As of today the last stalling of the network - and view change being triggered - was at the end of October 2019.
 
Another benefit of using pBFT for consensus, besides low energy use, is the immediate finality it provides. Once your transaction is included in a block and the block is added to the chain, it's done. Lastly, take a look at this article where three types of finality are defined: probabilistic, absolute and economic finality. Zilliqa falls under absolute finality (just like Tendermint, for example). Although lengthy already, we have only skimmed some of the inner workings of Zilliqa's consensus: read the Zilliqa Design Story Part 3 and you will be close to having a complete picture of it. Enough about PoW, sybil resistance mechanisms, pBFT, etc. Another thing we haven't looked at yet is the amount of decentralization.
 
Decentralisation
 
Currently, there are four shards, each of them consisting of 600 nodes: 1 shard with 600 so-called DS nodes (Directory Service; they need to achieve a higher difficulty than shard nodes) and 1800 shard nodes, of which 250 are shard guards (centralized nodes controlled by the team). The number of shard guards has been steadily declining, from 1200 in January 2019 to 250 as of May 2020. On the Viewblock statistics you can see that many of the nodes are located in the US, but those are only the (CPU parts of the) shard nodes that perform pBFT; there is no data on where the PoW sources are coming from. And when the Zilliqa blockchain starts reaching its transaction capacity limit, a network upgrade needs to be executed to lift the current cap of 2400 nodes, allowing more nodes and the formation of more shards, which will let the network keep scaling according to demand.
Besides shard nodes there are also seed nodes. The main role of seed nodes is to serve as direct access points (for end-users and clients) to the core Zilliqa network that validates transactions. Seed nodes consolidate transaction requests and forward these to the lookup nodes (another type of nodes) for distribution to the shards in the network. Seed nodes also maintain the entire transaction history and the global state of the blockchain which is needed to provide services such as block explorers. Seed nodes in the Zilliqa network are comparable to Infura on Ethereum.
 
The seed nodes were first operated only by Zilliqa themselves, exchanges and Viewblock. Operators of seed nodes like exchanges had no incentive to open them to the greater public, so they were centralised at first. Decentralisation at the seed node level has been steadily rolled out since March 2020 (Zilliqa Improvement Proposal 3). Currently the number of seed nodes is being increased and they are public-facing; at the same time PoS is applied to incentivize seed node operators and make it possible for ZIL holders to stake and earn passive yields. Important distinction: seed nodes are not involved in consensus! That is still PoW as the entry ticket and pBFT for the actual consensus.
 
5% of the block rewards are being assigned to seed nodes (from the beginning in 2019) and those are being used to pay out ZIL stakers. The 5% block rewards with an annual yield of 10.03% translate to roughly 610 MM ZILs in total that can be staked. Exchanges use the custodial variant of staking and wallets like Moonlet will use the non-custodial version (starting in Q3 2020). Staking is being done by sending ZILs to a smart contract created by Zilliqa and audited by Quantstamp.
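The relationship between the quoted yield and the ~610 MM stakeable ZIL is simple arithmetic. A back-of-the-envelope sketch, using only figures implied by the text (not official Zilliqa numbers):

```python
# Sanity-check the staking figures above: an annual reward flow at a target
# yield determines how much ZIL can be staked in total.

def max_stakeable(annual_rewards: float, target_yield: float) -> float:
    """Total ZIL that can be staked while sustaining the target yield."""
    return annual_rewards / target_yield

# ~61.2 MM ZIL/year to seed nodes (implied: 610 MM * 10.03%) at a 10.03%
# yield supports roughly 610 MM ZIL staked in total.
annual_rewards = 61_183_000  # ZIL per year, derived from the text's figures
print(round(max_stakeable(annual_rewards, 0.1003) / 1e6))  # ≈ 610 (MM ZIL)
```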
 
With a high number of DS and shard nodes, and with seed nodes becoming more decentralized too, Zilliqa qualifies for the label of decentralized in my opinion.
 
Smart contracts
 
Let me start by saying I'm not a developer and my programming skills are quite limited. So I'm taking the ELI5 route (maybe ELI12), but if you are familiar with JavaScript, Solidity or specifically OCaml, please head straight to Scilla - read the docs to get a good initial grasp of how Zilliqa's smart contract language Scilla works; and if you ask yourself "why another programming language?", check this article. If you want to play around with some sample contracts in an IDE, click here. The faucet can be found here. More information on architecture, dapp development and the API can be found on the Developer Portal.
If you are more into listening and watching: check this recent webinar explaining Zilliqa and Scilla. Link is time-stamped so you’ll start right away with a platform introduction, roadmap 2020 and afterwards a proper Scilla introduction.
 
Generalized: programming languages can be divided into being 'object-oriented' or 'functional'. Here is an ELI5 given by a software development academy: “All programs have two basic components: data – what the program knows – and behavior – what the program can do with that data. So object-oriented programming states that combining data and related behaviors in one place, called an “object”, makes it easier to understand how a particular program works. On the other hand, functional programming argues that data and behavior are different things and should be separated to ensure their clarity.”
 
Scilla is on the functional side and shares similarities with OCaml: OCaml is a general-purpose programming language with an emphasis on expressiveness and safety. It has an advanced type system that helps catch your mistakes without getting in your way. It's used in environments where a single mistake can cost millions and speed matters, is supported by an active community, and has a rich set of libraries and development tools. For all its power, OCaml is also pretty simple, which is one reason it's often used as a teaching language.
 
Scilla is blockchain agnostic and can be implemented on other blockchains as well; it is recognized by academics and won a Distinguished Artifact Award at the end of last year.
 
One of the reasons the Zilliqa team decided to create their own programming language focused on preventing smart contract vulnerabilities is that adding logic to a blockchain (programming) means you cannot afford to make mistakes; otherwise it could cost you. It's all great and fun that blockchains are immutable, but updating your code because you found a bug isn't as simple as with a regular web application, for example. And smart contracts inherently involve cryptocurrencies in some form, and thus value.
 
Another difference with programming languages on a blockchain is gas. Every transaction you do on a smart contract platform like Zilliqa or Ethereum costs gas. With gas you basically pay for computational costs. Sending a ZIL from address A to address B costs 0.001 ZIL currently. Smart contracts are more complex, often involve various functions and require more gas (if gas is a new concept click here ).
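The fee model is just price times consumption. The sketch below shows the shape of the calculation; the gas units and unit price are assumptions chosen so a simple payment comes out near the 0.001 ZIL quoted above, not Zilliqa's actual parameters:

```python
# A hedged sketch of how a gas fee works on platforms like Zilliqa or
# Ethereum: fee = gas price * gas consumed. All numbers are illustrative.

def tx_fee(gas_price_zil: float, gas_used: int) -> float:
    """Total fee paid for a transaction, in ZIL."""
    return gas_price_zil * gas_used

# Assume a simple A->B payment consumes 50 gas units at 0.00002 ZIL each.
simple_payment = tx_fee(0.00002, 50)
print(round(simple_payment, 6))  # ≈ 0.001 ZIL

# A smart contract call touches more state and runs more code, so it
# consumes more gas and costs proportionally more.
contract_call = tx_fee(0.00002, 5000)
print(round(contract_call, 6))  # ≈ 0.1 ZIL
```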
 
So with Scilla, similar to Solidity, you need to make sure that “every function in your smart contract will run as expected without hitting gas limits. An improper resource analysis may lead to situations where funds may get stuck simply because a part of the smart contract code cannot be executed due to gas limits. Such constraints are not present in traditional software systems”. Scilla design story part 1
 
Some examples of smart contract issues you’d want to avoid are: leaking funds, ‘unexpected changes to critical state variables’ (example: someone other than you setting his or her address as the owner of the smart contract after creation) or simply killing a contract.
 
Scilla also allows for formal verification. Wikipedia to the rescue: In the context of hardware and software systems, formal verification is the act of proving or disproving the correctness of intended algorithms underlying a system with respect to a certain formal specification or property, using formal methods of mathematics.
 
Formal verification can be helpful in proving the correctness of systems such as: cryptographic protocols, combinational circuits, digital circuits with internal memory, and software expressed as source code.
 
“Scilla is being developed hand-in-hand with a formalization of its semantics and its embedding into the Coq proof assistant, a state-of-the-art tool for mechanized proofs about properties of programs.”
 
Simply put, with Scilla and its accompanying tooling, developers can be mathematically certain, and can prove, that the smart contract they have written does what they intend it to do.
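To give a feel for what "proving a property" means, the sketch below checks an invariant (transfers never create or destroy tokens) over every input in a small range. Formal verification in Coq proves this for *all* inputs; this is only a lightweight stand-in, and the contract model is hypothetical, not real Scilla:

```python
# Exhaustively check a supply-conservation invariant on a toy token
# transfer function - a poor man's version of what a Coq proof guarantees.

def transfer(balances: dict, sender: str, recipient: str, amount: int) -> dict:
    """Move `amount` from sender to recipient, rejecting overdrafts."""
    if amount < 0 or balances.get(sender, 0) < amount:
        return balances  # invalid transfer: state is left unchanged
    new = dict(balances)
    new[sender] -= amount
    new[recipient] = new.get(recipient, 0) + amount
    return new

# Property: no transfer, valid or invalid, changes the total supply.
start = {"alice": 10, "bob": 5}
for amount in range(-3, 20):
    end = transfer(start, "alice", "bob", amount)
    assert sum(end.values()) == sum(start.values())
print("supply conserved for all checked inputs")
```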
 
Smart contract on a sharded environment and state sharding
 
There is one more topic I’d like to touch on: smart contract execution in a sharded environment (and what is the effect of state sharding). This is a complex topic. I’m not able to explain it any easier than what is posted here. But I will try to compress the post into something easy to digest.
 
Earlier on we established that Zilliqa can process transactions in parallel thanks to network sharding; this is where the linear scalability comes from. We can define simple transactions: a transaction from address A to B (Category 1), a transaction where a user interacts with one smart contract (Category 2), and the most complex ones, where triggering a transaction results in multiple smart contracts being involved (Category 3). The shards are able to process transactions on their own, without interference from the other shards. With Category 1 transactions that is doable, with Category 2 transactions sometimes (if that address is in the same shard as the smart contract), but with Category 3 you definitely need communication between the shards. Solving that requires defining a set of communication rules the protocol needs to follow in order to process all transactions in a generalised fashion.
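A simplified model of how Category 1 transactions can be parallelised: assign each transaction to a shard deterministically from the sender's address, so every node agrees on the assignment without cross-shard communication. The hashing scheme below is an assumption for illustration, not Zilliqa's actual assignment rule:

```python
# Deterministically map transactions onto shards by sender address,
# so independent payments can be processed in parallel.

import hashlib

NUM_SHARDS = 4  # the four shards mentioned above

def shard_of(address: str) -> int:
    """Map an address deterministically onto one of the shards."""
    digest = hashlib.sha256(address.encode()).digest()
    return digest[0] % NUM_SHARDS

tx_senders = ["0xaaa1", "0xbbb2", "0xccc3", "0xddd4"]
# Every node computes the same assignment without asking other shards.
assignments = {addr: shard_of(addr) for addr in tx_senders}
print(all(0 <= s < NUM_SHARDS for s in assignments.values()))  # True
```

Category 3 transactions break this neat picture: when one transaction touches contracts living on different shards, the shards must coordinate, which is exactly the communication problem described above.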
 
And this is where the trade-offs of state sharding come in. Currently, all shards in Zilliqa have access to the complete state. Yes, the state size (0.1 GB at the moment) grows and all of the nodes need to store it, but it also means they don't need to shop around for information held on other shards, which would require more communication and add more complexity. Links requiring computer science and/or developer knowledge if you want to dig further:
- Scilla - language grammar
- Scilla - Foundations for Verifiable Decentralised Computations on a Blockchain
- Gas Accounting
- NUS x Zilliqa: Smart contract language workshop
 
Easier-to-follow links on programming in Scilla: https://learnscilla.com/home and Ivan on Tech.
 
Roadmap / Zilliqa 2.0
 
There is no strictly defined roadmap, but here are the topics being worked on. Via the Zilliqa website there is also more information on the projects they are working on.
 
Business & Partnerships
 
It’s not only technology in which Zilliqa seems to be excelling, as their ecosystem has been expanding and starting to grow rapidly. The project is on a mission to provide OpenFinance (OpFi) to the world, and Singapore is the right place to be due to its progressive regulations and futuristic thinking. Singapore has taken a proactive approach towards cryptocurrencies by introducing the Payment Services Act 2019 (PS Act). Among other things, the PS Act will regulate intermediaries dealing with certain cryptocurrencies, with a particular focus on consumer protection and anti-money laundering. It will also provide a stable regulatory licensing and operating framework for cryptocurrency entities, effectively covering all crypto businesses and exchanges based in Singapore. According to PWC, 82% of the surveyed executives in Singapore reported blockchain initiatives underway and 13% of them have already brought the initiatives live to the market. There is also an increasing list of organizations that are starting to provide digital payment services. Moreover, Singaporean blockchain developer Building Cities Beyond has recently created a $15 million innovation grant to encourage development on its ecosystem. This all suggests that Singapore is trying to position itself as (one of) the leading blockchain hubs in the world.
 
Zilliqa seems to already take advantage of this and recently helped launch Hg Exchange on their platform, together with financial institutions PhillipCapital, PrimePartners and Fundnel. Hg Exchange, which is now approved by the Monetary Authority of Singapore (MAS), uses smart contracts to represent digital assets. Through Hg Exchange financial institutions worldwide can use Zilliqa's safe-by-design smart contracts to enable the trading of private equities. For example, think of companies such as Grab, Airbnb, SpaceX that are not available for public trading right now. Hg Exchange will allow investors to buy shares of private companies & unicorns and capture their value before an IPO. Anquan, the main company behind Zilliqa, has also recently announced that they became a partner and shareholder in TEN31 Bank, which is a fully regulated bank allowing for tokenization of assets and is aiming to bridge the gap between conventional banking and the blockchain world. If STOs, the tokenization of assets, and equity trading will continue to increase, then Zilliqa’s public blockchain would be the ideal candidate due to its strategic positioning, partnerships, regulatory compliance and the technology that is being built on top of it.
 
What is also very encouraging is their focus on banking the un(der)banked. They are launching a stablecoin basket starting with XSGD. As many of you know, stablecoins are currently mostly used for trading. However, Zilliqa is actively trying to broaden the use case of stablecoins. I recommend everybody to read this text that Amrit Kumar wrote (one of the co-founders). These stablecoins will be integrated in the traditional markets and bridge the gap between the crypto world and the traditional world. This could potentially revolutionize and legitimise the crypto space if retailers and companies will for example start to use stablecoins for payments or remittances, instead of it solely being used for trading.
 
Zilliqa also released their DeFi strategic roadmap (dating November 2019) which seems to be aligning well with their OpFi strategy. A non-custodial DEX is coming to Zilliqa made by Switcheo which allows cross-chain trading (atomic swaps) between ETH, EOS and ZIL based tokens. They also signed a Memorandum of Understanding for a (soon to be announced) USD stablecoin. And as Zilliqa is all about regulations and being compliant, I’m speculating on it to be a regulated USD stablecoin. Furthermore, XSGD is already created and visible on block explorer and XIDR (Indonesian Stablecoin) is also coming soon via StraitsX. Here also an overview of the Tech Stack for Financial Applications from September 2019. Further quoting Amrit Kumar on this:
 
“There are two basic building blocks in DeFi/OpFi though: 1) stablecoins, as you need a non-volatile currency to get access to this market, and 2) a DEX to be able to trade all these financial assets. The rest are built on top of these blocks.

So far, together with our partners and community, we have worked on developing these building blocks with XSGD as a stablecoin. We are working on bringing a USD-backed stablecoin as well. We will soon have a decentralised exchange developed by Switcheo. And with HGX going live, we are also venturing into the tokenization space. More to come in the future.”
 
Additionally, they have the ZILHive initiative that injects capital into projects. There have already been 6 waves of various teams working on infrastructure, innovation and research, and they are not only from ASEAN or Singapore but global: see Grantees breakdown by country. Over 60 project teams from over 20 countries have contributed to Zilliqa's ecosystem. This includes individuals and teams developing wallets, explorers, developer toolkits, smart contract testing frameworks, dapps, etc. As some of you may know, Unstoppable Domains (UD) blew up when they launched on Zilliqa. UD aims to replace cryptocurrency addresses with a human-readable name and allows for uncensorable websites. Zilliqa will probably be the only one able to handle all these transactions on-chain due to its ability to scale and its resulting low fees, which is why the UD team launched on Zilliqa in the first place. Furthermore, Zilliqa also has a strong emphasis on security, compliance, and privacy, which is why they partnered with companies like Elliptic, ChainSecurity (part of PwC Switzerland), and Incognito. Their sister company Aqilliz (Zilliqa spelled backwards) focuses on revolutionizing the digital advertising space and is doing interesting things like using Zilliqa to track outdoor digital ads with companies like Foodpanda.
 
Zilliqa is listed on nearly all major exchanges, has several different fiat gateways, and was recently added to Binance's margin trading and futures trading with really good volume. They also have a very impressive team with good credentials and experience. They don't just have "tech people": they have a mix of tech people, business people, marketeers, scientists, and more. Naturally, it's good to have a mix of people with different skill sets if you work in the crypto space.
 
Marketing & Community
 
Zilliqa has a very strong community. If you follow their Twitter, their engagement is remarkably high for a coin with approximately 80k followers. They have also been 'coin of the day' on LunarCrush many times. LunarCrush tracks real-time cryptocurrency value and social data, and according to their data it seems Zilliqa has a more fundamental and deeper understanding of marketing and community engagement than almost all other coins. While almost all coins have been a bit frozen in the last months, Zilliqa seems to be on its own bull run: it was ranked somewhere in the 100s a few months ago and is currently ranked #46 on CoinGecko. Their official Telegram also has over 20k people and is very active, and their community channel, which is over 7k now, is more active and larger than many other official channels. Their local communities also seem to be growing.
 
Moreover, their community started 'Zillacracy' together with the Zilliqa core team (see www.zillacracy.com). It's a community-run initiative where people from all over the world help with marketing and development on Zilliqa. Since its launch in February 2020 they have been doing a lot, and they will also run their own non-custodial seed node for staking. This seed node will allow them to start generating revenue and become a self-sustaining entity that could potentially scale up into a decentralized company working in parallel with the Zilliqa core team. Compared to the other smart contract platforms (e.g. Cardano, EOS, Tezos etc.), they don't seem to have started a similar initiative (correct me if I'm wrong though). This suggests, in my opinion, that these other smart contract platforms do not fully understand how to utilize the 'power of the community'. This is something you cannot 'buy with money', and it gives many projects in the space a disadvantage.
 
Zilliqa also released two social products called SocialPay and Zeeves. SocialPay allows users to earn ZILs while tweeting with a specific hashtag. They have recently used it in partnership with the Singapore Red Cross for a marketing campaign after their initial pilot program. It seems like a very valuable social product with a good use case. I can see a lot of traditional companies entering the space through this product, which they seem to suggest will happen. Tokenizing hashtags with smart contracts to get network effect is a very smart and innovative idea.
 
Regarding Zeeves: this is a tipping bot for Telegram. They already have thousands of signups and plan to keep upgrading it for more and more people to use (e.g. they recently added a quiz feature). They also use it during AMAs to reward people in real time. It's a very smart approach to grow their communities and get people familiar with ZIL. I can see this becoming very big on Telegram. This tool suggests, again, that the Zilliqa team has a deeper understanding of what the crypto space and community needs and is good at finding the right innovative tools to grow and scale.
 
To be honest, I haven't covered everything (I'm also reaching the character limit haha). So many updates have been happening lately that it's hard to keep up, such as the International Monetary Fund mentioning Zilliqa in their report, custodial and non-custodial staking, Binance margin, futures, the widget, entering the Indian market, and more. The Head of Marketing Colin Miles has also released this as an overview of what is coming next. And last but not least, Vitalik Buterin has been mentioning Zilliqa lately, acknowledging Zilliqa and noting that both projects have a lot of room to grow. There is much more info of course, and a good part of it has been served to you on a silver platter. I invite you to continue researching by yourself :-) And if you have any comments or questions please post here!
submitted by haveyouheardaboutit to CryptoCurrency [link] [comments]

- What is bitcoin mining difficulty and how it effects our output (HINDI)
- Mining Pool Shares, Difficulty and Luck Explained
- What is Crypto Mining Difficulty and How it Impacts YOUR Profits - Explained W/ BTC ZenCash ZEC
- Bitcoin basics: What is the difficulty target and how does it adjust itself?
- Crypto Mining Difficulty 101 - Everything You Need to Know

The Bitcoin difficulty chart provides the current Bitcoin difficulty (BTC diff) target as well as a historical data graph visualizing Bitcoin mining difficulty values with BTC difficulty adjustments (both increases and decreases), defaulted to today, with timeline options of 1 day, 1 week, 1 month, 3 months, 6 months, 1 year, 3 years, and all time.

The Bitcoin network difficulty is the measure of how difficult it is to find a new block compared to the easiest it can ever be. It is recalculated every 2016 blocks to a value such that the previous 2016 blocks would have been generated in exactly two weeks had everyone been mining at this difficulty. This will yield, on average, one block every ten ...

The difficulty is measured in hashes (usually terahashes, TH); concerning mining, it signifies the unit of work performed. The network hashrate or nethash (the number of miners) is measured in hashes per second (TH/s). The network itself adjusts difficulty in such a way that difficulty/nethash = block time (in the case of Musicoin it is 15 ...

Difficulty is a measure of how difficult it is to find a hash below a given target. The Bitcoin network has a global block difficulty: valid blocks must have a hash below this target. Mining pools also have a pool-specific share difficulty setting a lower limit for shares. How often does the network difficulty change? Every 2016 blocks.

Share difficulty is important to prove your computer is doing the work of mining. You would set a higher difficulty on powerful hardware (ASICs) so that it doesn't need to request a new share (GetBlockTemplate, GetWork, Stratum, etc.) each time. The result is that your hardware is more active in the mining process and less active waiting on the network or pool to give you work to do. The ...
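The 2016-block retargeting rule described above can be sketched directly. This mirrors the shape of the consensus rule (including the factor-of-4 clamp Bitcoin applies to each adjustment), but it is a simplified illustration, not the exact integer arithmetic used by Bitcoin Core:

```python
# Bitcoin-style difficulty retargeting: every 2016 blocks, scale difficulty
# so that those blocks *would* have taken two weeks at the new difficulty.

TARGET_TIMESPAN = 14 * 24 * 60 * 60  # two weeks, in seconds

def retarget(old_difficulty: float, actual_timespan: int) -> float:
    """Return the next difficulty given how long the last 2016 blocks took."""
    # Bitcoin clamps the adjustment to a factor of 4 in either direction.
    clamped = max(TARGET_TIMESPAN // 4, min(actual_timespan, TARGET_TIMESPAN * 4))
    return old_difficulty * TARGET_TIMESPAN / clamped

# Blocks came in twice as fast as intended -> difficulty doubles.
print(retarget(100.0, TARGET_TIMESPAN // 2))  # 200.0
# Blocks took twice as long -> difficulty halves.
print(retarget(100.0, TARGET_TIMESPAN * 2))   # 50.0
```

This feedback loop is what keeps the average block interval near ten minutes as hashrate rises and falls.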


What is bitcoin mining difficulty and how it effects our output (HINDI)

Mining Bitcoin or Ethereum is a hard task for your computer. But why? And what does the difficulty have to do with the security of blockchains? Learn all abo...

What is crypto mining difficulty, how is it adjusted, what is the point of a block time? Vosk explains how the difficulty for mining a block reward is adjusted when mining Bitcoin on sha-256 or ...

In this video, I attempt to describe how crypto mining difficulty works and how it affects profitability. I also crunch some numbers to show alternative methods for determining profits based on ...

As requested, an overview of shares, difficulty and luck. Excuse my appearance as I am still under the weather a bit. More detailed vids for the series coming. Plotting Rig Build: ASRock X399 TAICHI ...

I am sharing my biased opinion based off speculation. You should not take my opinion as financial advice. You should always do your research before making any investment. You should also ...
