
Don't blindly follow a narrative, it's bad for you and it's bad for crypto in general

I mostly lurk around here, but I see a pattern repeating over and over again here and in multiple communities, so I have to post. I'm posting this here because I appreciate the fact that this sub is a place of free speech and maybe something productive can come out of this post, while bitcoin is just fucking censorship, memes and moon/lambo posts. If you don't agree, write in the comments why, instead of downvoting. You don't have to upvote either, but when you downvote you are killing the opportunity to have a discussion. If you downvote or comment that I'm wrong without providing any counterpoints, you are no better than the BTC maxis you despise.
In various communities I see a narrative being used to bring people in and make them follow something without thinking for themselves. In crypto I see this mostly in BTC vs BCH tribalistic arguments:
- BTC community: "Everything that is not BTC is a shitcoin.", or more recently, as stated by Adam on twitter, "Everything that is not BTC is a ponzi scheme, even ETH.", "what is ETH supply?", and even that they are doing this for "altruistic" reasons, to "protect" the newcomers. Very convenient for them that they are "protecting" the newcomers by having them buy their bags.
- BCH community: "BTC maxis are dumb", "just increase the block size and you will have truly p2p electronic cash", "It is just that simple, there are no trade-offs", "if you don't agree with me you are a BTC maxi", "BCH is satoshi's vision for p2p electronic cash".
This is not exclusive to crypto; you see it in politics too, over and over again on twitter and on reddit.
My point is that narratives are created so people don't have to think: they just choose a narrative that is easy to follow and makes sense to them, and stick with it. And people keep repeating these narratives to bring other people in, maybe out of ignorance, because they truly believe it without questioning, or maybe out of self-interest, because they want to shill you their bags.
Because this is the BCH community, and because bitcoin is censored so I can't post there about the problems in the BTC narrative (some of which are, IMO, correctly identified by the BCH community), I will stick with the narrative I see in the BCH community.
What triggered this post was firstly this post by user u/scotty321: "The BTC Paradox: “A 1 MB blocksize enables poor people to run their own node!” “Okay, then what?” “Poor people won’t be able to use the network!”". You will see many posts of this kind being made by u/Egon_1 as well. Then you have also this comment in that thread by u/fuck_____________1 saying that people who want to run their own nodes are retarded and that there is no reason to want to do that: "Just trust block explorer websites". And the post and comment were highly upvoted. Really? Do you really think there is no problem in having just a few nodes on the network? And that the only thing securing the network is miners?
As stated by user u/co1nsurf3r in that thread:
While I don't think that everybody needs to run a node, a full node does publish blocks it considers valid to other nodes. This does not amount to much if you only consider a single node in the network, but many "honest" full nodes in the network will reduce the probability of a valid block being withheld from the network by a collusion of "hostile" node operators.
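The quoted argument can be made concrete with a toy probability model (my own illustration with made-up numbers, not something from the thread): if some fraction of reachable nodes collude to withhold a block, and a client picks its peers at random, the chance that every peer it talks to is colluding shrinks exponentially with the number of peers. That only works when there are plenty of independent honest nodes to connect to.

```python
# Toy model, illustrative only: `hostile` is the fraction of reachable
# nodes that collude to withhold a valid block; a client connects to
# k peers chosen uniformly at random (independence assumed).
def withholding_probability(hostile: float, k: int) -> float:
    """Chance that all k randomly chosen peers belong to the colluding set."""
    return hostile ** k

for k in (1, 4, 8, 16):
    print(k, withholding_probability(0.3, k))
```

Even with 30% of nodes colluding, eight random peers leave well under a 0.01% chance of being fully eclipsed under these assumptions; with only a handful of nodes on the network, this kind of independent peer selection is impossible.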
But surely this will not get attention here, and will be downvoted by the people who promote the narrative that there is no trade-off in increasing the blocksize, and that the people who don't see it are retarded or are BTC maxis.
The only narrative I have stuck to, for many years now, is that cryptocurrency takes power from the government and gives power to the individual, so you are not restricted to your own economy and can participate in the global economy. There is also the narrative of banking the unbanked, which I hope will come true, but it is not a use case we are seeing right now.
Some people would argue that removing power from governments is a bad thing, but you can't deny the fact that governments can't control crypto (at least we would want them not to).
But if you really want individuals to remain in control of their money and able to transact with anyone in the world, the network needs to be very resistant to any kind of attack. How can you have p2p electronic cash if your network has just a handful of nodes and the Chinese government can locate them and simply block communication to them? I'm not saying that this is BCH's case; I'm just refuting the claim that there is no value in running your own node. If you are relying on block explorers, the government can just block communication to the block explorer websites. Then what? Who will you trust to get chain information? The nodes need to be decentralized, so that if you take one node down, many more can appear; that way the network is hard to censor and you don't have a few points of failure.
Right now BTC is focusing on the use case of being difficult to censor. But with that comes the problem that it is very expensive to transact on the network, which defeats the purpose of anyone being able to participate. Obviously I do think that is also a major problem, and the lightning network is awful right now and probably still years away from being usable, if it ever will be. The best solution is up for debate, but thinking that you just have to increase the blocksize and there is no trade-off is naive or misleading. BCH is doing a good thing in trying to come up with a solution that is inclusive and promotes cheap and fast transactions, but don't forget that centralization is a major concern and nothing to just shrug off.
Saying that "a 1 MB blocksize enables poor people to run their own node" and that because of that "poor people won’t be able to use the network" is a misrepresentation designed to promote a narrative. The 1MB limit is not there to allow "poor" people to run their node; it is there to enable as many people as possible to run a node, to promote decentralization and avoid censorship.
Also, an elephant in the room that you will not see discussed in either the BTC or BCH communities is that mining pools are heavily centralized. And I'm not just talking about miners being mostly in China, but also about big pools controlling a lot of hashing power in both BTC and BCH, which is terrible for the purpose of crypto.
Other projects are trying to solve that. Will they be successful? I don't know; I hope so, because I don't buy into any narrative. There are many challenges and I want to see crypto succeed as a whole. As always guys, DYOR and always question whether you are blindly following a narrative. I'm sure I will be called a BTC maxi, but maybe some people will find value in this. Don't trust guys who are always posting silly "gotchas" against the other "tribe".
EDIT: User u/ShadowOfHarbringer has pointed me to some threads where this has been discussed in the past, and I will just put my take on them here for visibility, as I will be using this thread as a reference in future discussions I engage in:
When there were only 2 nodes in the network, adding a third node increased the redundancy and resiliency of the network as a whole in a significant way. When there are thousands of nodes in the network, adding yet another node only marginally increases the redundancy and resiliency of the network. So the question then becomes a matter of personal judgement of how much that added redundancy and resiliency is worth. For the absolutist, it is absolutely worth it and everyone on this planet should do their part.
What is the magical number of nodes that makes it counterproductive to add new ones? Did he do any math? Does BCH achieve this holy-grail safe number of nodes? Guess what, nobody knows at what number of nodes it starts to become marginally irrelevant to add new ones. Even BTC today might still not have enough nodes to be safe. If you can't know for sure that you are safe, it is better to err on the side of safety. Thousands of nodes is still not enough; as I said, it is much cheaper to run a full node than it is to mine. If it costs millions in hash power to do a 51% attack on block generation, that means nothing if it costs less than $10k to run more nodes than there are in total in the network, causing havoc and slowing people's use of the network. Or to use bot farms to DDoS the thousands of nodes in the network. Not all attacks are monetarily motivated. When you have governments with billions of dollars at their disposal, and something that could threaten their power, they will do anything they can to stop people from using it, and the cheaper it is to do so, the better for them.
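The cost asymmetry argued above can be put into a back-of-the-envelope calculation. All figures here are hypothetical placeholders (the post's own $10k is itself an estimate); the point is only the orders of magnitude, not the exact numbers.

```python
# Hypothetical figures, for illustration only.
hashpower_attack_cost = 1_000_000   # USD, rough cost of a 51% hash attack
node_monthly_cost = 10              # USD/month for a VPS running one full node
honest_node_count = 1_000           # full nodes currently on the network

# Cost to outnumber every honest node with malicious (Sybil) nodes:
sybil_cost = (honest_node_count + 1) * node_monthly_cost
print(sybil_cost)                               # 10010
print(sybil_cost < hashpower_attack_cost / 10)  # True
```

Under these assumptions, outnumbering every honest node costs orders of magnitude less than attacking the hash rate, which is the asymmetry the paragraph is pointing at.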
You should run a full node if you're a big business with e.g. >$100k/month in volume, or if you run a service that requires high fraud resistance and validation certainty for payments sent your way (e.g. an exchange). For most other users of Bitcoin, there's no good reason to run a full node unless you feel like it.
Shouldn't individuals benefit from fraud resistance too? Why just businesses?
Personally, I think it's a good idea to make sure that people can easily run a full node because they feel like it, and that it's desirable to keep full node resource requirements reasonable for an enthusiast/hobbyist whenever possible. This might seem to be at odds with the concept of making a worldwide digital cash system in which all transactions are validated by everybody, but after having done the math and some of the code myself, I believe that we should be able to have our cake and eat it too.
This is a recurrent argument, but again no math is provided: "just trust me, I did the math".
The biggest reason individuals may want to run their own node is to increase their privacy. SPV wallets rely on others (nodes or ElectrumX servers) who may learn their addresses.
It is a reason, and a valid one, but not the biggest reason.
If you do it for fun or experimentation, it's good. If you do it for extra privacy, it's ok. If you do it to help the network, don't. You are just slowing down miners and exchanges.
Yes, it will slow down the network, but that shows how people just don't get the trade-off they are making.
I will just copy/paste what Satoshi Nakamoto said in his own words. "The current system where every user is a network node is not the intended configuration for large scale. That would be like every Usenet user runs their own NNTP server."
Another "it is all or nothing" argument, quoting Satoshi to try and prove their point. Just because every user doesn't need to be a full node doesn't mean that there aren't serious risks in having few nodes.
For this to have any importance in practice, all of the miners, all of the exchanges, all of the explorers and all of the economic nodes would have to go rogue all at once and collude to change consensus. If you have a node you can detect this. It doesn't do much, though, because such a scenario is impossible in practice.
Not true, because as I said, you can DDoS the current nodes or run more malicious nodes than there currently are, because it is cheap to do so.
Non-mining nodes don't contribute to adding data to the blockchain ledger, but they do play a part in propagating transactions that aren't yet in blocks (the mempool). Bitcoin client implementations can apply different validation to transactions they see outside of blocks and transactions they see inside of blocks; this allows for "soft forks" that add new types of transactions without completely breaking older clients. While a transaction is in the mempool, a node receiving a transaction of a new/unknown type could drop it as invalid (not propagate it to its peers), but if that same transaction ends up in a block and that node receives the block, it accepts the block (and the transaction in it) as valid, and therefore doesn't get left behind on the blockchain and become a fork. This participation in the mempool is a sort of "herd immunity" protection for the network, and it was a key talking point for the "User Activated Soft Fork" (UASF) around the time the Segregated Witness feature was being added. If a certain percentage of nodes update their software to not propagate certain types of transactions (or to not communicate with certain types of nodes), then they can control what gets into a block: someone wanting to get that sort of transaction into a block would need to communicate directly with a mining node, or communicate only through nodes that weren't blocking that sort of transaction. It's less specific than the influence on the blockchain data that mining nodes have, but it's definitely not nothing.
The first reasonable comment in that thread, but it is buried deep down with only 1 upvote.
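The mempool-versus-block asymmetry that comment describes can be sketched in a few lines. Everything here (the transaction type names, the KNOWN_TX_TYPES set) is invented for illustration; real node policy is far more involved than a set lookup.

```python
# Sketch of soft-fork tolerance: relay policy is stricter than block
# validation, so an old node won't fork itself off the chain when a
# new transaction type appears. All names here are hypothetical.
KNOWN_TX_TYPES = {"p2pkh", "p2sh"}

def accept_to_mempool(tx_type: str) -> bool:
    # Relay policy: don't propagate transaction types this node doesn't know.
    return tx_type in KNOWN_TX_TYPES

def accept_in_block(tx_type: str) -> bool:
    # Consensus rule (old client): tolerate unknown types inside a block.
    return True

print(accept_to_mempool("segwit"))  # False: not relayed by this node
print(accept_in_block("segwit"))    # True: the block is still accepted
```

If enough nodes adopt the stricter relay policy, a transaction of the blocked type can only reach a miner directly, which is the UASF leverage the comment describes.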
The addition of non-mining nodes does not add to the efficiency of the network, but actually takes away from it because of the latency issue.
That is true, and it is actually a trade-off you are making: sacrificing security to gain scalability.
The addition of non-mining nodes has little to no effect on security, since you only need to destroy mining ones to take down the network
It is true that if you destroy the mining nodes you stop the network from producing new blocks (temporarily), even if you have a lot of non-mining nodes. But it is still better than the situation where the mining nodes are also the only full nodes. If the miners are not the only full nodes, at least you still have full nodes with the blockchain data, so new miners can download it and join. If all the full nodes are miners and you take them down, where will you get all the past blockchain data to start mining again? Just pray that the miners that were taken down come back online at some point in the future?
The real limiting factor is ISPs: imagine a situation where one service provider defrauds 4000 different nodes. Did the excessive number of nodes help at all, when they have all been defrauded by the same service provider? If there are only 30 ISPs in the world, how many nodes do we REALLY need?
You can't be defrauded if the connection is encrypted. Use Tor, for example; it is hard for ISPs to know what you are doing.
Satoshi specifically said in the white paper that after a certain point the number of nodes needed plateaus, meaning that after a certain point, adding more nodes is actually counterproductive, which we also demonstrated (the latency issue). So, we have adequately demonstrated why running non-mining nodes does not add additional value or security to the network.
Again, what is the number of nodes that makes it counterproductive? Did he do any math?
There's also the matter of economically significant nodes and the role they play in consensus. Sure, nobody cares about your average joe's "full node" where he is "keeping his own ledger to keep the miners honest", as it has no significance to the economy and the miners couldn't give a damn about it. However, if, say, some major exchanges got together to protest a miner-activated fork, they would have some protest power against that fork because many people use their service. Of course, there still need to be miners running on said "protest fork" to keep the chain running, but miners do follow the money, and if they got caught mining a fork that none of the major exchanges were trading, they could be coaxed over to said "protest fork".
In consensus, what matters about nodes is only their number; the economic power of a node means nothing, as the protocol doesn't see the net worth of the individual or organization running it.
Running a full node that is not mining and not involved in spending or receiving payments is of very little use. It helps to make sure network traffic is broadcast, and it is another copy of the blockchain, but that is all (and it is probably not needed in a healthy coin with many other nodes).
He gets it right (broadcasting transactions and keeping a copy of the blockchain), but he dismisses the importance of it.
submitted by r0bo7 to btc

Update and Few Thoughts, a (Well-Typed) transcript: Liza&Charles the marketeers, Voltaire kick-off, PrisM and Ebb-and-Flow to fuck ETH2.0 Gasper, the (back)log of a man and a falcon, lots of companies, September Goguen time, Basho, 2021 Titans, Basho, Hydra and much more thoughts and prayers

Hi everybody, this is Charles Hoskinson, broadcasting live from warm, sunny Colorado. I'm trying a new streaming service that allows me to annotate a few things and simulcast to both periscope and youtube. Let's see how this works. I also get to put a little caption. I think for the future I'm just going to put: "I will never give away ada", so when people repost my videos for giveaway scams they at least have that. First off, a thank you: a community member named Daryl decided to carve a log and give his artistic impression of my twitter profile picture of me and the falcon, and that always means a lot when I get these gifts from fans. Also, on the back of the Catalyst presentation, I just wanted to express my profound gratitude and excitement to the community.
You know, it's really really cool to see how much progress has been made in such a short period of time. It was only yesterday when we were saying "when Shelley?". Now Shelley's out and it's evolving rapidly. Voltaire is now starting to evolve rapidly and we're real close to Goguen. At the end of this month we'll be able to talk about some of the realities of Goguen and some of the ideas we have, give some dates for certain things, and give you a sense of where that project is at. The good news is that we have gained an enormous amount of progress and knowledge about what we need to do and how to get that done, and basically people are just executing; it's a much smaller task than getting us to Shelley. With Byron to Shelley we literally had to build a completely new cryptocurrency from the ground up. We had to have new ledger rules, a new update system, we had to invent a way of transitioning from one system to another system, and there were hundreds of other little innovations along the way: a new network stack and so forth. Byron cosmetically looks like Shelley, but under the hood it's completely different, and the Shelley design was built with a lot of the things that we needed for Goguen in mind. For example, we built Shelley with the idea of extended UTXO, and we built Shelley understanding what the realities were for the smart contract model; that's one of the advantages you get when you do this type of bespoke engineering. There are two consequences to that: one, the integration is significantly easier, and two, the integration is significantly faster. We won't see that same complexity there.
The product update at the end of the month... we'll really start discussing some of these things, as well as talk about partners and how the development ecosystem is going to evolve. There are a lot of threads throughout all three organizations happening simultaneously. Emurgo are thinking deeply about DeFi, and they've invited us to collaborate with them on things like stablecoins, but we're also looking at oracles (oracle pools), DEXes and these other things, and because there are already people in the market who have made mistakes and learned lessons, we get the benefit of hindsight. It means we can be much faster to market and build much more competitive things, and the Cardano community gets first access to these next-generation DeFi applications without a lot of the problems of the prior generations, and that's super beneficial to us.
You know, the other side of it is that Voltaire is going to have a systemic influence not just on community funding but also on the overall evolution and direction of the platform. The longer it exists, the more pervasive it will become. It will probably first be applied to the Cardano foundation roadmap, but later on it will definitely have a lot of influence and say over every aspect of the system, including the launch dApps and these other things; basically, long term, the types of problems that Cardano solves. That's incredibly appealing and very exciting to me, because it's like I have this giant community brain, with the best and brightest of all of you working with us to get us where we need to go.
You know, another thing that was super encouraging (it's a small thing, but it shows that we're definitely headed in the right direction) was that we recently got a demo from Pramod (Viswanath) and his team out of the university of Illinois on a protocol they created called PrisM, which is a super fast proof-of-work protocol. They wrote this beautiful paper, and they wrote code along with it, showing that PrisM is ten thousand times faster than Nakamoto consensus. If you take the bitcoin proof-of-work protocol, strip it out and put PrisM in, you can run the entire bitcoin system 10000 times faster, and they have beautiful benchmarks to show that, even in bad network conditions. I'm promoting this team; they're real researchers and real engineers, and they use a lot of cool HPC concepts like springboarding and other things to accomplish that. Then I asked him in the presentation: well, how much faster is it if you replay the Ethereum chain? He says, well, it takes a big performance hit, maybe only a hundred times, because that model is not as easy to optimize and shard with standard computer science concepts; in fact, in some cases there are limitations there that really can't be overcome. It turns out that we're more on the UTXO side than on the account side. By coincidence, or by intent of the design of extended UTXO, we're gonna have a much easier time getting much higher performance where and when it's necessary.
I also approved this week a scaling-up of the Basho project, in particular to build a hydra prototype team. The science has gotten to a point where we can make a really competitive push in that particular direction. What does that mean? It means that in just a few short months we can de-risk technological approaches that long-term will give us a lot of fruit, where and when the community decides that they need infrastructure like hydra. Now, here's the beautiful thing about hydra. If you watch my whiteboard back in September of 2017, when Cardano first hit market with Byron, I talked about this concept of looking at scalability with a very simple test, which is: as you get more people in the system, it stays at the same performance or it gets faster. We all experience systems that do this, for example bittorrent: the more people downloading something, the faster you tend to be able to get it. And we all experience the converse, where the system gets slower as you get more people. What does this mean? It means that hydra is an actual approach towards true scalability in the system, and it's a lot easier to do than sharding, even though we have a beautiful approach to get sharding on the ledger side if we truly desire to go down that way. There are beautiful ideas that we are definitely in deep discussions about, but that's a very complex thing. There was recently a paper ("Ebb-and-Flow Protocols: A Resolution of the Availability-Finality Dilemma") out of Stanford that showed that the Gasper protocol as proposed for ETH2.0 does have some security concerns, and it's going to be the burden on the shoulders of the Ethereum 2.0 developers and Vitalik to address those concerns from those Stanford professors. Whenever you have these very complex protocols, they have so many different ways they can break and things can go wrong, so it's much more appealing when you don't have to embrace complexity to achieve the same ends.
The elegance of hydra is that stake pool operators are very natural parties to put hydra channels on, and every time we add one we get much more performance out of the system as it gets more valuable. The k factor increases, which means you get more stake pool operators, which means you get more hydra channels. So with growth we get appreciation, with appreciation we get more decentralization, and with more decentralization we get more performance. In essence, spiritually speaking, this is really what we meant when we said scalability: the system will always grow to meet its particular needs, and we have a very elegant way of moving in that direction that doesn't require us to embrace very sophisticated techniques. That's not to say these techniques don't have a place and purpose, but the urgency of implementing them is gone, and we then have the luxury to pick the best science when it's ready instead of rushing it to market to resolve a crisis of high fees. We'll never have that crisis, so there's a beauty to Cardano that is missing, in my view, from many cryptocurrencies and blockchains in the marketplace, and we're now seeing that beauty shine through: not only through our community, who are so passionate and amazing, but in the science and the engineering itself, and how easy it is for us to navigate the concepts. How easy it is for us to add more things, to take some things away, to clean some things up here and there, and our ability to move through.
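The scaling claim above (more pool operators means more hydra channels, which means more aggregate throughput) can be sketched as a toy model. The per-channel throughput figure below is purely hypothetical; the point is the shape of the curve, not the numbers.

```python
# Toy model: aggregate throughput = layer-1 TPS plus the sum of the
# throughput of every hydra channel. All figures are hypothetical.
def total_tps(base_tps: float, num_channels: int, per_channel_tps: float) -> float:
    """Aggregate throughput if each stake pool hosts one hydra channel."""
    return base_tps + num_channels * per_channel_tps

# As k grows (more pools, hence more channels), capacity grows with it:
for num_channels in (100, 500, 1000):
    print(num_channels, total_tps(250, num_channels, 1000))
```

This is the bittorrent-like property described earlier: capacity grows with participation rather than degrading.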
I never imagined when, in 2015, I signed up to go in on this crazy ride and try to build a world financial operating system, that we would have made as much progress as we have made today. We've written more than 75 research papers as an organization, many of which are directly applicable to Cardano. We've got great partners who work with Nasa and Boeing and Pfizer, massive companies that have 10 years of history and millions of users, coming in to help us grow better. We've worked with incredible organizations, major universities like the university of Wyoming, the university of Edinburgh and Tokyo Tech, professors all across the world. We've worked with incredible engineering firms like VacuumLabs and AtixLabs and Twig and Well-Typed, runtime verification, QuviQ and dozens of others along the years, and despite the fact that at times there have been delays and friction throughout this entire journey, we've mostly been aligned and we keep learning and growing. It gives me so much hope that our best days are ahead of us, and an almost fanatical belief that success is inevitable in a certain respect. You see, we always find a way to be here tomorrow, and we always find a way to make tomorrow a better day than today, and as long as that's the trend, as long as you're monotonically increasing towards a better tomorrow, you're always going to have that outcome; you're always going to be in a position where Cardano shines bright. Towards the end of the month we'll have a lot more to say about the development side, and that'll be a beginning, just like Voltaire is a beginning, and then suddenly you notice the beautiful parallelism of the roadmap. Shelley continues to evolve, partial delegation is coming; in fact, I signed the contract with vacuumlabs to bring that to Ledger (and Trezor). The Daedalus team is hard at work to make that feature available for everyone, as is the Yoroi team.
You see that with Voltaire now, and soon with Goguen; and these are not endpoints, rather they're just beginnings, and they're never over. We can always make staking better, more diverse, more merit-based, entertain different control models, have better delegation mechanics, have a better user experience. The same goes for smart contracts; that's an endless river, and along the way what we've discovered is that it's easy for us to work with great minds and great people. For example, with testing of smart contracts, I would love to diversify that conversation above and beyond what we can come up with, and bring in some firms who have done this for a long time to basically walk that path with us shoulder to shoulder and build beautiful frameworks to assist us. For example, runtime verification is doing this with the EVM, with a beautiful project called Firefly to replace Truffle. I believe that we can achieve similar ends with Plutus smart contracts.
When you ask yourself what makes a system competitive in the cryptocurrency space, in my view there are four dimensions, and you have to have a good story for all four of them. You need security and correctness. A lot of people don't prioritize that, but when they get it wrong it hurts retail people, it hurts everyday people; billions of dollars have been lost due to the incompetence and ineptitude of junior developers making very bad mistakes, and oftentimes those developers faced no consequences. The people who lost money were innocent people who believed in cryptocurrencies and wanted to be part of the movement but didn't protect themselves adequately. That's a really sad thing, and it's unethical to continue pushing a model where that is the standard or the likely outcome rather than a rare edge case. You have to, as a platform, a third-generation platform, invest heavily in giving developers proper tools to ensure security and correctness. There have been great innovations out of Quantstamp and ConsenSys and dozens of other firms in the space, including runtime verification, who have really made major leaps in the last few years in trying to improve that story. What's unique to Cardano is that we based our foundations on languages that were designed right the first time, and there's over 35 years of history behind the approach that we're following on the Haskell side that allows us, and the developers in our ecosystem, to build high-assurance systems. We didn't reinvent the wheel; we found the best wheel and we're giving it to you.
I think we're going to be dominant in that respect as we enter 2021. Second, you look at things like ease of maintenance, ease of deployment, the life cycle of the software and upgrades to the software, and, as we've demonstrated with things like the hard fork combinator, the fact that Voltaire is not just a governance layer for ada and Cardano but will eventually be reusable for any dApp deployed on our system. You have very natural tooling that's going to allow people to upgrade their smart contracts and their dApps and enable governance for their users at an incredibly low cost, without having to reinvent the governance wheel for each and every application. This is another property unique to our system, and it can be reused for the dApps that you deploy on it, as I've mentioned before. Performance is a significant concern, and this conversation was often corrupted by marketers, especially ICO marketers, who really wanted to differentiate and say: "our protocol, tested on a single server in someone's basement, does 500000 transactions per second", as if that somehow translates to real-life performance; that's antithetical to anyone who's ever studied distributed systems and understands the reality of these systems, where they go and what they do in terms of performance. I think we have the most logical approach. You know, we have 10 years of history with bitcoin; it's a massive system, we've learned a huge amount, there are a lot of papers written about it and a lot of practical projects, and bitcoin is about to step into the world of smart contracts. We congratulate them on getting Schnorr sigs in and on the success of Taproot. That means that entering 2021, 2022, we are going to start seeing legitimate dApps and DeFi projects, real applications, choosing bitcoin instead of Ethereum, Algorand, EOS or Cardano, and they're adding a lot to that conversation. I think that ultimately that model has a lot of promise, which is why we built a better one.
There are still significant limitations with what bitcoin can accomplish from settlement time to the verbosity of contracts that can be written.
The extended UTXO model was designed to be the fastest and most charitable accounting model ever, on and off chain, and hydra was designed to allow you to flex between those two systems seamlessly. Look at the foundations of where we're at and how we can extend this with domain-specific languages for domain experts, such as Marlowe for financial experts, and the DSLs that will come later for others, like lawyers and supply chain experts and medical databases and so forth, and how easy it is to write and deploy these, with Plutus being beautiful glue code for both on-chain and off-chain communications. I think we have an incredibly competitive offering for performance, and when hydra comes, simply put, there'll be no one faster. If we need to shard, we're going to do that, and definitely better than anybody else, because we know where our security model sits and there won't be surprise Stanford papers to blindside us that require immediate addressing.
In terms of operating costs, this is the last component in my view, and that's basically: how much does it cost you, the developer, to run your application? There are really two dimensions, one is predictability and the other is amount. It's not good enough to say: it's a penny per transaction today. You need to know that after you spend millions of dollars and months or years of effort building and deploying something, you're not going to wake up tomorrow and find it's five dollars to do what used to cost a penny. You need that cost to be as low as possible and as predictable as possible, and again, the way we architected our system, as we turn things on towards the end of this year and as we enter into the next year, we believe we have a great approach to achieve low operating cost. One person asks: why Cardano? Well, because we have great security and correctness in the development experience, and tools with 35 years of legacy that were built right the first time and don't put the burdens of mistakes on your customers. They ask why Cardano and we say: well, the chain itself is going to give you great solutions with identity, value transformation and governance itself, and as a consequence, when you talk about upgrading your applications, having a relationship with the customers of your applications, and the ease of maintenance of those applications, there's going to be a good story there. We have beautiful frameworks like Voltaire that allow that story to evolve and we keep adding partners who have decades of experience to move us along. We won't stop until it's much better. They ask why Cardano? We say: because at the moment we're 10 times faster than Ethereum and that's all we really need for this year and next year, to be honest, and in the future we can be as fast as we need to be, because we're truly scalable.
As the system gets more decentralized the system improves performance and where and when we need to shard we can do that. We'll have the luxury of time to do it right, the Cardano way, and when people ask why Cardano? Because the reality is, it's very cheap to do things on our platform and the way we're building things. That's going to continue being the case and we have the governance mechanisms to allow the community to readjust fees and parameters so that it can continue being affordable for users. Everything in the system will eventually be customizable and parameterizable: from block size, to transaction fees and the community will be in a good position to dynamically allocate these things where and when needed so that we can enjoy as an ecosystem predictability in our cost.
In the coming weeks and months, especially in my company, we're going to invest a lot of time and effort into comparison marketing and product marketing. When I see people say: oh well, you've launched proof of stake, a lot of other people have done that, I don't think those people fully appreciate the magnitude of what we actually accomplished as an ecosystem and the quality of the protocols that are in distribution. That's not their fault, it's our fault, because we didn't take the time, in simple terms, not scientific papers and deep code and formal specifications but everyday language, to really show why we're different. I admit that that's a product failing and it needs to be corrected, so we hired a great marketing director, named Liza (Horowitz?), and she is going to work full time with me and others in the ecosystem, a great team of people, every single day, to get out there and explain that what we have done is novel, unique, competitive and special to our industry. Everything from Ouroboros, in contrast to the other major protocols, the EOSes and Algorands and Tezos of the world, and why we're different and which trade-offs we chose over them, to our network stack, to the extended UTXO model, to Plutus, to Marlowe. We're going to keep hammering away at that until we get it right and everybody acknowledges and sees what has been accomplished.
I've spent five years of my life, good years of my life, and missed a lot, to get this project where it needs to go. All of our employees have invested huge parts of their personal lives, their time, their brand, their careers, in trying to make this the most magical and special cryptocurrency and blockchain infrastructure around. No one ever signed up at this company, or the other companies working on Cardano, to work on a mediocre protocol, just another blockchain. They signed up to change the world, they signed up to build a system that can legitimately look you in the face and say: one day we have the potential to have a billion users! That's what they signed up for and they showed up to play. They built technology that evolves in that direction with some certainty and great foundations, and we have an obligation to market it in a way that shows the world why, succinctly, with clarity. Understandably, this has been a failing in the past, but you know what? You can always be better tomorrow, monotonically increasing, making it better, and that's what we're going to do. We recognized it and we're going to invest in it, and with Voltaire, if we can't do it, you the community can do it and we'll work with you; if you can do a better job, the funding will be there to get that done. In addition to this, we think about 2021 and we ask: where does the future take us? I've thought a lot about this, you know, I've thought a lot about how we get through the next five years as we close out 2020, and here's the reality: we're not going to leave as a company until we have smart contracts and multi-asset, Voltaire has evolved to a point where the community can comfortably make decisions about the future of the protocol, and the staking experience has solidified and is stable.
I don't care if this costs me millions or tens of millions of dollars out of my own pocket to make happen. I'm going to do that because that's my commitment to you, the community, and every product update will keep pushing our way there. We'll continue to get more transparent, we'll continue to get more aggressive, hire more and parallelize more where we can, to deliver that experience so that Cardano gets where it needs to go. Then when we ask where we go next, the reality is that the science as an industry, the engineering as an industry, has given us a menu of incredibly unique, attractive and sexy things that we can pursue. What we're going to do is work with the community and the very same tools that are turning on today, the Voltaire tools, the cardano.ideascale.com tools, and we're going to propose a consortium, bring the best and brightest together and give a vision of where we can take the system in another five years. With the benefit of hindsight, massively improved processes, better estimation capabilities and the fact that we're not starting with two people at IOG. We're starting with 250 people, the best scientific division in our industry and a legacy of, by the end of this year, nearly 100 scientific papers. And that's just us; there are dozens of companies throughout this history who have worked on Cardano. It's about time to scale them up too and get client diversity. So come next year, when the protocol has evolved to the point where it's ready for it, we'll have that conversation with you the community, and that's going to be a beautiful conversation. At the conclusion of it, there's going to be certainty about how we're going to evolve over the next five years to get ourselves beyond the cryptocurrency space. I'm very tired of these conversations we have about: are you going to go to (coindesk's) consensus or not? Or who's going to be the big winner?
What about Libra or what about this particular regulation and this crypto unicorn and this thing?
You know, I've been in the space a long time and I've noticed that people keep saying the same things year after year in the same venues. Yes, the crowd sizes get larger and the amount of value at risk gets larger, but I haven't seen a lot of progress in the places where I feel it is absolutely necessary for this technology to be permanent: in the developing world. We need to see economic identity. People often ask: what is the mission for Cardano? For us at IOG, you look at economic identity and you draw a roadmap for it; you scale up and down, each and every step along the way, from open data, to self-sovereign identity, to financial inclusion. You can keep going down: to decentralized lending, decentralized insurance, decentralized banking, each and every step along the way to economic identity. When you map it onto a blockchain, it tells you there's a collection of applications and infrastructure that you need to build.
My life's work is to get to a point where we have the technology and infrastructure to do that, with principles, and so we'll keep evolving Cardano, and the space and the science as a whole, until I can wake up and say: for each box on that road to economic identity, for all people, not just one group, we have a solution. I'm going to put those applications on Cardano, and success for me is not about us being king of the crypto hill, having a higher market cap than bitcoin, or being entrepreneur of the year or coindesk's most influential person. That's meaningless noise. Success for me is reflecting back on the things we have accomplished together and recognizing that millions, if not billions, now live in a system where they all matter, they all have a voice, they all have an equal footing. The Jeff Bezoses of the world have the very same experience as the person born in Rwanda, and we're not done until that's the case. It's a long road, it's a hard road, but you know what? We're making progress. We have great people in Africa, great people in eastern Europe, great people in southeast Asia and great partners all along the way. Great people in Latin America, great people in South America, great people here in the United States.
When we talk about economic identity, there are millions, if not tens of millions, of Americans who don't have it. Same for Canadians: hundreds of thousands who don't have it. In developed western cultures it's the greatest blind spot of policy, and as we enter a depression as a result of coronavirus, add millions if not tens of millions more onto that list. Generations are being disenfranchised by this legacy system, and we as an ecosystem, we as an entire community, are offering a different way forward. Not hyper-centralization, not social credit, but a way forward where you own your own money, your own identity, your own data. You're not a victim of surveillance capitalism, you're not a victim of civil asset forfeiture, you don't get shut out of society when you say the wrong things. Each and every human being matters, and I'm optimistic enough to believe that when you remind people that they matter, they're gonna rise to the occasion. That is the point of my company and the things that we do each and every day. That's our mission: to give the platforms to the world so that those who don't have economic identity can get it and keep it, no one can take it from them, and they can enjoy an ever-increasing growth of standard of living, wealth and prosperity.
However you want to measure it, that is my goalpost. I couldn't care less about the cryptocurrency space. It was a great place to start, but the space needs to be reminded why it exists. Bitcoin was given a mandate on the back of the 2008 financial crisis to do something different. It was not given a mandate to become a new settlement layer for central banks, or a new way for the old guard to make more money, for banks to get bigger, and for those in control to preserve their power. The whole point of doing something so crazy as buying a coin that doesn't even exist in real life, that's just a bunch of numbers in the cloud, was so that we as a society could do something different from the way we'd been doing things before. So, each and every member of the cryptocurrency space needs to remind everyone else from time to time why we're here, where we came from and where we're going.
The beauty of Cardano is that we have already achieved, for the most part, a decentralized brain, and that momentum is pushing harder than ever. More and more scientists are waking up, more and more institutions are waking up, getting us there. We have the code, the right approach, and I think we have a great competitive offering for 2021 as we go and battle the titans, and that's going to be a lot of fun, but we know who we are, where we're going, and we're in the right places. It's so incredibly encouraging to see the stake pool operators not just be from California or Texas or New York or Canada, but to see a lot of stake pool operators from the places that need the most help. Everybody does matter, and it means a lot to me for the people who are there, but it means a lot to everybody to say that we have created an equal platform. It makes the participation of all of us so much more meaningful. We're not just talking to each other, we're talking to the world, and by working together on this platform we're lifting the world up and giving people hope. That's the point. There's a lot more to do, we didn't get everything done. You never do: you aspire, you work hard, you set a moonshot, and sometimes you can just get to orbit on the first go, but you know what? When you build the next rocket you can go to Mars.
Thank you all for being with me, thank you all for being part of this. Today was a damn good day with the announcement of Voltaire. Go to cardano.ideascale.com and participate; the end of September is going to be a good day too. There are a lot of good days to come, and in between, a lot of hard days doing tasks that are sometimes entirely forgettable but always necessary to keep the revolution and the movement going. I cannot wait for 2021. Our best days are ahead of us, because of you. You all take care now.
Source: https://www.youtube.com/watch?v=BFa9zL_Dl_w
Other things mentioned:
https://cardano.ideascale.com/
https://www.atixlabs.com/blockchain
https://www.well-typed.com/
https://www.vacuumlabs.com/
https://medium.com/interdax/what-is-taproot-and-how-will-it-benefit-bitcoin-5c8944eed8da
https://medium.com/interdax/how-will-schnorr-signatures-benefit-bitcoin-b4482cf85d40
https://quantstamp.com/
https://bloxian.com/bloxian-platforms/ (TWIG)
https://runtimeverification.com/firefly/
https://www.trufflesuite.com/
https://experts.illinois.edu/en/publications/prism-deconstructing-the-blockchain-to-approach-physical-limits (PrisM and not our Prism https://atalaprism.io/)
Ebb-and-Flow Protocols: A Resolution of the Availability-Finality Dilemma (aka Gasper and ETH2.0 fucker) https://arxiv.org/abs/2009.04987
http://www.quviq.com/products/
https://en.wikipedia.org/wiki/Schnorr_signature
submitted by stake_pool to cardano [link] [comments]

Two reasons why stable 10-minute block time averages are more important than a rather small long-term schedule drift.

Before I open my two reasons, two preliminary facts:
  1. Current schedule drift is less than 1 year over the lifetime of Bitcoin Cash up to now.
  2. If we correct the oscillation and implement a stable new difficulty algorithm that is absolutely scheduled, we stop future drift. We end up with only the drift of the past, measured from wherever we base our absolute scheduling, which may mean we do not correct for past drift at all. To consider the question of whether we should, I want to put up two reasons why I think a stable 10-minute block time average is more important than a small long-term schedule drift.
1. The average time will be a factor in dimensioning computer systems that build on Bitcoin Cash.
Right now, one might think the 10-minute block average leaves plenty of headroom, because the network is not being heavily used and there is a quasi-consensus rule which implies most nodes will not accept blocks > 32MB.
But 32MB is far from the end goal of Bitcoin Cash on-chain scaling. For Bitcoin Cash to succeed, we hope blocks will become MUCH bigger. This means the performance of relaying and processing blocks will become an important factor for node implementations.
Software needs hardware and hardware systems need to be dimensioned for the expected workload.
A stable 10-minute average allows easier dimensioning of the server hardware needed to deal with large-sized blocks.
As Gavin Andresen said: "design for success".
Therefore we should think about the case where blocks are large.
Now, what happens if we implement an algorithm where the blocks can take longer for, let's say, the next 5 years, but then suddenly the difficulty drops a bit so that they effectively arrive faster?
Then your system which you dimensioned for e.g. 11.25 minutes / block would suddenly need to process more per time interval and might be under-dimensioned.
We could say: no problem - technology is going to improve anyway. The consensus around which block sizes are allowed needs to move toward higher block sizes anyway which will make current systems under-dimensioned - perhaps much faster.
But is it really necessary to add another complicating factor to this already complex calculation by implementing a diminishing block time average?
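To make reason 1 concrete, here is a small sketch of how dimensioning changes when the effective block interval shrinks. The block size and intervals used are illustrative assumptions, not protocol constants:

```python
# Illustrative capacity sketch: how much data per second must a node
# sustain for a given block size and average block interval?
# All numbers are hypothetical, not protocol constants.

def required_throughput_mb_per_s(block_size_mb: float, interval_minutes: float) -> float:
    """Average data rate a node must sustain just to keep up with new blocks."""
    return block_size_mb / (interval_minutes * 60)

# System dimensioned for an 11.25-minute average block interval:
planned = required_throughput_mb_per_s(256, 11.25)
# Difficulty drops and blocks effectively start arriving every 10 minutes:
actual = required_throughput_mb_per_s(256, 10)

print(f"planned: {planned:.3f} MB/s, actual: {actual:.3f} MB/s")
# The same hardware suddenly needs 12.5% more sustained throughput.
```

The point is not the specific numbers but that a diminishing average quietly raises the workload on hardware that was sized for the old average.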
2. Enabling changes to monetary policy such as influencing the emission schedule opens the Overton window to including less well considered monetary policy changes.
In the less harmful scenario, something like drift correction would be well motivated, its benefits and risks laid out clearly, discussed and eventually accepted by Bitcoin Cash (stake)holders.
In the more harmful scenario, such changes would be pushed through without much discussion by a small group, demonstrating that the protocol is easy to change in ways that are neither well motivated nor largely risk-free and may even disproportionately benefit a certain subset of stakeholders. That would not be a good selling point for Bitcoin Cash.
Those are the two reasons I want to bring up for considering the stability of the "10 minute block time average" as a more important point than correcting for a drift which compared to long term emission (> 100 years) is less than 1%.
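The "less than 1%" comparison above is easy to verify with back-of-envelope arithmetic; the exact figures below are assumptions for illustration:

```python
# Back-of-envelope check of the "< 1%" claim: a schedule drift of under
# one year, measured against an emission schedule spanning more than a
# century, is a sub-1% deviation. Figures are illustrative bounds.
drift_years = 1.0        # accumulated drift so far (upper bound, per fact 1)
emission_years = 120.0   # rough length of the remaining emission schedule

relative_drift = drift_years / emission_years
print(f"relative drift: {relative_drift:.2%}")
```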
submitted by Pablo_Picasho to btc [link] [comments]

Can Bitcoin Scale?

You have some bitcoins in your wallet and want to spend them on your daily purchases. But what would that look like in a world where Visa, Mastercard and other financial services still dominate the market?
The ability for bitcoin to compete with other payment systems has long been up for debate in the cryptocurrency community. When Satoshi Nakamoto programmed the blocks to have a size limit of approximately 1MB each to prevent network spam, he also created the problem of bitcoin illiquidity.
Since each block takes an average of 10 minutes to process, only a small number of transactions can go through at a time. For a system that many claimed could replace fiat payments, this was a big barrier. While Visa handles around 1,700 transactions a second, bitcoin could process up to 7. An increase in demand would inevitably lead to an increase in fees, and bitcoin’s utility would be limited even further.
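The "up to 7" figure is simple arithmetic: block capacity divided by average transaction size, spread over the block interval. The 250-byte average transaction size below is a common rough assumption, not a protocol constant:

```python
# Back-of-envelope bitcoin throughput estimate.
BLOCK_SIZE_BYTES = 1_000_000   # ~1MB block size limit
AVG_TX_SIZE_BYTES = 250        # assumed average transaction size
BLOCK_INTERVAL_SECONDS = 600   # ~10 minutes between blocks

tx_per_block = BLOCK_SIZE_BYTES // AVG_TX_SIZE_BYTES
tps = tx_per_block / BLOCK_INTERVAL_SECONDS
print(f"{tx_per_block} tx/block -> {tps:.1f} tx/s")  # roughly 7 tx/s
```

Smaller average transactions push the figure up and larger ones push it down, which is why quoted estimates range from about 3 to 7 transactions per second.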
The scaling debate has unleashed a wave of technological innovation in the search of workarounds. While significant progress has been made, a sustainable solution is still far from clear.
A simple solution initially appeared to be an increase in the block size. Yet that idea turned out to be not simple at all.
First, there was no clear agreement as to how much it should be increased by. Some proposals advocated for 2MB, another for 8MB, and one wanted to go as high as 32MB.
The core development team argued that increasing the block size at all would weaken the protocol’s decentralization by giving more power to miners with bigger blocks. Plus, the race for faster machines could eventually make bitcoin mining unprofitable. Also, the number of nodes able to run a much heavier blockchain could decrease, further centralizing a network that depends on decentralization.
Second, not everyone agrees on this method of change. How do you execute a system-wide upgrade when participation is decentralized? Should everyone have to update their bitcoin software? What if some miners, nodes and merchants don’t?
And finally, bitcoin is bitcoin, why mess with it? If someone didn’t like it, they were welcome to modify the open-source code and launch their own coin.
One of the earliest solutions to this issue was proposed by developer Pieter Wuille in 2015. It's called Segregated Witness, or SegWit.
This process would increase the capacity of the bitcoin blocks without changing their size limit, by altering how the transaction data was stored.
SegWit was deployed on the bitcoin network in August 2017 via a soft fork to make it compatible with nodes that did not upgrade. While many wallets and other bitcoin services are gradually adjusting their software, others are reluctant to do so because of the perceived risk and cost.
Several industry players argued that SegWit didn’t go far enough – it might help in the short term, but sooner or later bitcoin would again be up against a limit to its growth.
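In rough terms, SegWit's capacity gain can be sketched with the block-weight accounting it introduced (per BIP141: weight = base size × 3 + total size, capped at 4,000,000 weight units, so witness bytes count four times cheaper than base bytes). The transaction sizes below are illustrative assumptions:

```python
# Sketch of SegWit block-weight accounting (BIP141).
# Transaction sizes are illustrative, not real averages.
MAX_BLOCK_WEIGHT = 4_000_000  # weight units

def tx_weight(base_bytes: int, witness_bytes: int) -> int:
    """weight = base_size * 3 + total_size (BIP141)."""
    total = base_bytes + witness_bytes
    return base_bytes * 3 + total

# A legacy-style transaction: everything counts as base data.
legacy = tx_weight(base_bytes=250, witness_bytes=0)
# A SegWit transaction moving signature data into the witness.
segwit = tx_weight(base_bytes=150, witness_bytes=100)

print(MAX_BLOCK_WEIGHT // legacy, "legacy-style tx per block")
print(MAX_BLOCK_WEIGHT // segwit, "segwit-style tx per block")
```

This is how blocks can carry more transactions while nodes that never upgraded still see a valid, size-limited base block.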
In 2017, coinciding with CoinDesk’s Consensus conference in New York, a new approach was revealed: Segwit2X. This idea – backed by several of the sector’s largest exchanges – combined SegWit with an increase in the block size to 2MB, effectively multiplying the pre-SegWit transaction capacity by a factor of 8.
Far from solving the problem, the proposal created a further wave of discord. The manner of its unveiling (through a public announcement rather than an upgrade proposal) and its lack of replay protection (transactions could happen on both versions, potentially leading to double spending) rankled many. And the perceived redistribution of power away from developers towards miners and businesses threatened to cause a fundamental split in the community.
Other technological approaches are being developed as a potential way to increase capacity.
Schnorr signatures offer a way to consolidate signature data, reducing the space it takes up within a bitcoin block (and enhancing privacy). Combined with SegWit, this could allow a much greater number of transactions without changing the block size limit.
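A back-of-envelope sketch of the potential savings. The signature sizes are approximate, and collapsing many per-input signatures into one is only one proposed use of Schnorr, not a deployed bitcoin feature:

```python
# Rough signature sizes: a DER-encoded ECDSA signature is ~72 bytes,
# a Schnorr signature is 64 bytes. Under a hypothetical aggregation
# scheme, n per-input signatures collapse into a single 64-byte one.
ECDSA_SIG_BYTES = 72
SCHNORR_AGG_SIG_BYTES = 64

def signature_savings(num_inputs: int) -> int:
    """Bytes saved if n ECDSA signatures are replaced by one aggregate."""
    return num_inputs * ECDSA_SIG_BYTES - SCHNORR_AGG_SIG_BYTES

print(signature_savings(1))   # a single-input tx saves only a few bytes
print(signature_savings(10))  # multi-input txs save far more
```

The savings grow with the number of inputs, which is why consolidation-heavy transactions benefit the most.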
And work is proceeding on the lightning network, a second layer protocol that runs on top of bitcoin, opening up channels of fast microtransactions that only settle on the bitcoin network when the channel participants are ready.
Adoption of the SegWit upgrade is slowly spreading throughout the network, increasing transaction capacity and lowering fees.
Progress is accelerating on more advanced solutions such as lightning, with transactions being sent on testnets (as well as some using real bitcoin). And the potential of Schnorr signatures is attracting increasing attention, with several proposals working on detailing functionality and integration.
While bitcoin’s use as a payment mechanism seems to have taken a back seat to its value as an investment asset, the need for a greater number of transactions is still pressing as the fees charged by the miners for processing are now more expensive than fiat equivalents. More importantly, the development of new features that enhance functionality is crucial to unlocking the potential of the underlying blockchain technology.
submitted by hackatoshi to u/hackatoshi [link] [comments]

Filecoin | Development Status and Mining Progress

Author: Gamals Ahmed, CoinEx Business Ambassador
A decentralized storage network that turns cloud storage into an algorithmic market. Miners earn the protocol's native token by providing data storage and/or retrieval; conversely, clients pay miners to store or distribute data and to retrieve it.
Filecoin announced that there will be more delays before its main network is officially launched.
Filecoin developers postponed the release of their main network to a window between late July and late August 2020.
As mentioned in a recent announcement, the Filecoin team said that the initiative completed the first round of the internal protocol security audit. Platform developers claim that the results of the review showed that they need to make several changes to the protocol’s code base before performing the second stage of the software testing process.
Created by Protocol Labs, Filecoin was developed using File System (IPFS), which is a peer-to-peer data storage network. Filecoin will allow users to trade storage space in an open and decentralized market.
Filecoin developers conducted one of the largest cryptocurrency sales of 2017, privately raising over $200 million from professional or accredited investors, including many institutional investors.
The main network was slated to launch last month, but in February 2020 the Filecoin development team delayed the release of the main network to between July 15 and July 17, 2020.
They claimed that the outbreak of the Coronavirus (COVID-19) in China was the main cause of the delay. The developers now say that they need more time to solve the problems found during a recent codecase audit.
The Filecoin team noted the following:
“We have drafted a number of protocol changes to ensure that building our major network launch is safe and economically sound.” The project developers will add them to two different implementations of Filecoin (Lotus and go-filecoin) in the coming weeks.
Filecoin developers conducted a survey to allow platform community members to cast their votes on three different launch dates for Testnet Phase 2 and mainnet.
The team reported that the community gave their votes. Based on the vote results, the Filecoin team announced a “conservative” estimate that the second phase of the network test should begin by May 11, 2020. The main Filecoin network may be launched sometime between July 20 and August 21, 2020.
The updates to the project can be found on the Filecoin Road Map.
Filecoin developers stated:
“This option will make us get the most important protocol changes first, and then implement the rest as protocol updates during testnet.” Filecoin has pushed back its final test stage.
Another filecoin decentralized storage network provider launched its catalytic test network, the final stage of the storage network test that supports the blockchain.
In a blog post on its website, Filecoin said it will postpone the last test round until August. The company also announced a calibration period from July 20 to August 3 to allow miners to test their mining setups and get an idea of how competition conditions affect their rewards.
Filecoin had announced earlier last month that the catalytic testnet would precede its flagship launch. The delay in the final test also means that the company has moved the mainnet launch window to between August 31 and September 21.
Despite the lack of clear incentives for miners and multiple delays, Filecoin has succeeded in attracting huge interest, especially in China. Investors remained highly speculating on the network’s mining hardware and its premium price.
Mining in Filecoin
In most blockchain protocols, “miners” are network participants who do the work necessary to promote and maintain the blockchain. To provide these services, miners are compensated in the original cryptocurrency.
Mining in Filecoin works completely differently — instead of contributing to computational power, miners contribute storage capacity to use for dealing with customers looking to store data.
Filecoin will contain several types of miners:
Storage miners, responsible for storing files and data on the network. Retrieval miners, responsible for providing fast pipes for file retrieval. Repair miners, to be implemented later.
Storage miners are the heart of the network. They earn Filecoin by storing data for clients and computing cryptographic proofs to verify storage over time. The probability of earning the block reward and transaction fees is proportional to the amount of storage the miner contributes to the Filecoin network, not to hash power.
Retrieval miners are the veins of the network. They earn Filecoin through bids and mining fees for a specific file, determined by the market value of that file's size. A miner's bandwidth and initial response time for a retrieval deal will determine its ability to close retrieval deals on the network.
The maximum bandwidth of a retrieval miner will determine the total number of deals it can serve at once.
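As a toy illustration of that bound, with all figures assumed, the limit on concurrent retrieval deals is simple division:

```python
# Illustrative only: a retrieval miner's total bandwidth caps how many
# deals it can serve simultaneously. All numbers are assumptions.
bandwidth_mbps = 1000   # miner's total bandwidth, in megabits/s
per_deal_mbps = 50      # bandwidth promised to each retrieval deal

max_concurrent_deals = bandwidth_mbps // per_deal_mbps
print(max_concurrent_deals, "concurrent deals at most")
```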
In the current implementation, the focus is mostly on storage miners, who sell storage capacity for FIL.

Hardware recommendations

The current system specifications recommended for running the miner are:
Compared to the hardware requirements for running a validator node, these specs are much higher, although they are worth it. Since they will not increase in the foreseeable future, the money spent on Filecoin mining hardware should provide many years of reliable service and pay for itself many times over. Think of the investment as a small cloud storage business. Launching a comparable service on the current data hosting model would cost millions of dollars in infrastructure and logistics; with Filecoin, you can do the same for a few thousand dollars.
Proceed to mining
Deals are the primary function of the Filecoin network; a deal represents an agreement between a client and a miner for a storage contract.
Once the customer decides on a miner to store with, based on the available capacity, duration and price, he locks sufficient funds in an associated wallet to cover the total cost of the deal. The deal is then published once the miner accepts the storage agreement. By default, all Filecoin miners are set to automatically accept any deal that meets their criteria, although this can be disabled for miners who prefer to negotiate their deals manually.
After the deal is published, the customer prepares the data for storage and transfers it to the miner. Upon receiving all the data, the miner packs the data into a sector, seals it, and begins submitting proofs to the chain. Once the first confirmation is obtained, the customer can be sure the data is stored correctly, and the deal has officially started.
Throughout the deal, the miner submits continuous proofs to the chain, and the client gradually pays with the funds previously locked. If proofs are missing or late, the miner is penalized. More information about this can be found in the penalties section of this page.
At Filecoin, miners earn two different types of rewards for their efforts: storage fees and block rewards.
Storage fees are the fees that customers pay regularly after a deal is reached, in exchange for storing data. This fee is automatically deposited into the withdrawal wallet associated with the miner as they continue to perform their duties over time, and is locked for a short period upon receipt.
Block rewards are larger sums granted to the miner who mines a new block. Unlike storage fees, these rewards do not come from an associated customer; instead, the network "prints" new FIL as an inflationary measure and an incentive for miners to extend the chain. All active miners on the network have a chance to earn a block reward, with their chance directly proportional to the amount of storage space they currently contribute to the network.
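The "chance proportional to storage" idea can be sketched as a weighted lottery. The miner names, storage figures, and selection code below are purely illustrative, not Filecoin's actual leader-election algorithm:

```python
# Sketch of a weighted lottery: each miner's probability of winning the
# block reward equals its share of total committed storage. All names
# and figures are made up for illustration.
import random

def pick_winner(power_by_miner: dict, rng: random.Random) -> str:
    miners = list(power_by_miner)
    weights = [power_by_miner[m] for m in miners]
    return rng.choices(miners, weights=weights, k=1)[0]

power = {"alice": 100.0, "bob": 300.0, "carol": 600.0}  # TiB committed
rng = random.Random(42)
wins = {m: 0 for m in power}
for _ in range(10_000):
    wins[pick_winner(power, rng)] += 1

# carol holds 60% of the power, so she wins roughly 60% of the rounds.
print(wins)
```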
Uptime, slashing and penalties
"Slashing" is a feature found in most blockchain protocols, used to punish miners who fail to provide reliable uptime or who act maliciously against the network.
In Filecoin, miners are subject to two different types of slashing: storage fault slashing and consensus fault slashing.
Storage fault slashing is a term covering a broader range of penalties, including fault fees, sector penalties, and termination fees. Miners must pay these penalties if they fail to keep their sectors reliably available or decide to leave the network voluntarily.
A fault fee is a penalty that a miner incurs for each day a sector is faulty. A sector penalty is incurred for a faulted sector that was not declared faulty before a WindowPoSt check.
The sector continues to pay the fault fee in addition to the sector penalty once the fault is detected.
A termination fee is a penalty that a miner incurs when a sector is voluntarily or involuntarily terminated and removed from the network.
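The three storage fault penalties above (the daily fault fee, the sector penalty for undeclared faults, and the termination fee) can be sketched as a toy fee schedule. The FIL amounts below are invented for illustration; the real parameters live in the protocol and change with network conditions:

```python
from enum import Enum

class Fault(Enum):
    DECLARED = "declared"       # miner reported the fault before WindowPoSt
    SKIPPED = "skipped"         # fault detected at WindowPoSt, undeclared
    TERMINATED = "terminated"   # sector removed from the network

# Illustrative per-day fee schedule in FIL (made-up numbers).
FAULT_FEE_PER_DAY = 0.001
SECTOR_PENALTY = 0.005
TERMINATION_FEE = 0.02

def penalty(fault: Fault, faulty_days: int) -> float:
    """Sum the penalties a sector owes under this toy schedule."""
    owed = FAULT_FEE_PER_DAY * faulty_days  # fault fee accrues per faulty day
    if fault is Fault.SKIPPED:
        owed += SECTOR_PENALTY  # undeclared faults pay an extra sector penalty
    if fault is Fault.TERMINATED:
        owed += TERMINATION_FEE
    return owed
```

Note how an undeclared (skipped) fault always costs more than a declared one of the same duration, which is the incentive for miners to report faults proactively.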
Consensus fault slashing is the penalty a miner incurs for committing consensus faults. This penalty applies to miners who have acted maliciously against the network's consensus functionality.
Filecoin miners
Eight of the top 10 Filecoin miners are Chinese investors or companies, according to the project's blockchain explorer, while a growing number of companies are selling cloud mining contracts and hardware for the distributed file storage system. CoinDesk's Wolfie Zhao wrote: "China's craze for Filecoin may have been largely related to the long-standing popularity of crypto mining in the country overall, which is home to an estimated 65% of Bitcoin's computing power."
With Filecoin approaching its mainnet launch (after several delays since its $200 million ICO in 2017), Chinese investors are once again speculating heavily on the network's mining hardware, at premium prices.
Since Protocol Labs, the company behind Filecoin, announced its incentivized testnet program on June 9, scheduled to start a week later, more than a dozen Chinese companies have started selling cloud mining contracts and hardware, even though important details, such as the economics of mining incentives on mainnet, remain unsettled.
Sales volumes to date for each of these companies range from half a million to tens of millions of dollars, according to self-reported data on these platforms that CoinDesk has reviewed and interviews with several mining hardware manufacturers.
Filecoin's goal is to build a distributed storage network, with token rewards to spur storage hosting as a way to drive wider adoption. Protocol Labs launched a test network in December 2019. But the tokens mined in the testing environment so far are not the real filecoin (FIL) that can be traded once mainnet goes live. Moreover, the mining economics on testnet do not represent how block rewards will ultimately work on mainnet.
Nevertheless, data from Filecoin testnet block explorers shows that eight of the 10 miners with the most effective mining power on testnet are currently Chinese.
These eight miners together have about 15 petabytes (PB) of effective storage mining power, accounting for more than 85% of the testnet's 17.9 PB total. For context, 1 petabyte of hard disk storage = 1,000 terabytes (TB) = 1,000,000 gigabytes (GB).
The Filecoin craze in China may be closely related to the long-standing popularity of crypto mining in the country overall, which is home to an estimated 65% of Bitcoin's computing power. In addition, there has been a lot of hype in China around mining for the project since 2018, with companies promoting all kinds of devices while the network was still in development.
"Crypto mining has always been popular in China," said Andy Tien, co-founder of 1475, one of several Filecoin mining hardware makers backed by prominent Chinese venture investors such as Fenbushi and HashKey Capital.
"Even though the Filecoin mining process is more technologically sophisticated, the idea of mining using hard drives instead of specialized machines like Bitcoin ASICs may be a lot easier for retail investors to understand," he said.
Meanwhile, according to Feixiaohao, a Chinese service comparable to CoinMarketCap, nearly 50 Chinese crypto exchanges, most of them somewhat obscure but including better-known venues such as Gate.io and Biki, have listed trading pairs for Filecoin futures against USDT.
In bitcoin mining, at the current difficulty level, a hash rate of one terahash per second (TH/s) is expected to generate around 0.000008 BTC within 24 hours. The higher the TH/s, the more bitcoin it should be able to produce, proportionally. But in Filecoin, a miner's effective mining power depends on the amount of data sealed onto its hard drives, not their total capacity.
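Because the relationship is linear, the arithmetic above is easy to make concrete. The 0.000008 BTC per TH/s per day figure is the article's estimate at one difficulty level and shifts constantly; the 70 TH/s machine below is a hypothetical example:

```python
# The article's quoted yield at current difficulty (an assumption; it
# changes with every difficulty adjustment).
BTC_PER_THS_PER_DAY = 0.000008

def daily_btc(hashrate_ths: float) -> float:
    """Expected BTC per day, linear in hash rate at fixed difficulty."""
    return hashrate_ths * BTC_PER_THS_PER_DAY

# A hypothetical 70 TH/s machine would earn about 0.00056 BTC per day.
estimate = daily_btc(70)
```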
To seal data onto a hard drive, a Filecoin miner still needs processing power, i.e. CPU or GPU, as well as RAM. More powerful processors with optimized software can seal data onto the drive faster, so those miners can accumulate effective mining power more quickly on any given day.
At this stage, there appears to be no transparent way at the network level for retail investors to see how much of a purchased hard drive actually translates into effective mining power.
U.S.-based Protocol Labs was behind Filecoin's 2017 initial coin offering, which raised an astonishing $200 million.
That was in addition to a $50 million private raise backed by notable venture capital firms including Sequoia, Andreessen Horowitz and Union Square Ventures. CoinDesk's parent company, Digital Currency Group, has also invested in Protocol Labs.
After rounds of delays, Protocol Labs said in September 2019 that a testnet launch would come around December 2019 and that the mainnet would roll out in the first quarter of 2020.
The testnet launched as promised, but the mainnet has been delayed again and is now expected to launch in August 2020.
What is the Filecoin mining process?
Filecoin mainly consists of three parts: the storage market (on-chain), the Filecoin blockchain, and the retrieval market (off-chain). The storage and retrieval markets are placed on-chain and off-chain, respectively, for security and efficiency. For users, storage happens relatively infrequently and has relatively high security requirements, so the storage process is placed on-chain. Once there is a meaningful amount of data, retrieval happens far more frequently than storage; given the performance cost of processing data on-chain, retrieval is performed off-chain. To solve the payment-security problem during retrieval, Filecoin adopts a micropayment strategy. In simple terms, the file is split into several pieces, and each time the user receives a piece of the data, the corresponding fee is paid. The miner types corresponding to Filecoin's two major markets are storage miners and retrieval miners: storage miners are primarily responsible for storing data and packing blocks, while retrieval miners are primarily responsible for serving data queries. After the Filecoin mainnet is running stably, repair miners will also be introduced, mainly responsible for data maintenance.
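The micropayment retrieval strategy described above, splitting the file and paying as each piece arrives, can be sketched as a toy model. The piece size and per-byte price below are made-up parameters, and real retrieval involves payment channels rather than a simple loop:

```python
def retrieve_with_micropayments(file_bytes: bytes, piece_size: int,
                                price_per_byte: float):
    """Toy model of retrieval micropayments: the file is split into
    pieces, and the client releases a small payment as each piece
    arrives, so neither side ever risks more than one piece's value."""
    received = bytearray()
    paid = 0.0
    for start in range(0, len(file_bytes), piece_size):
        piece = file_bytes[start:start + piece_size]  # miner sends one piece
        received.extend(piece)                        # client accepts it
        paid += len(piece) * price_per_byte           # then pays for that piece only
    return bytes(received), paid

# Retrieving a 1000-byte file in 256-byte pieces at a hypothetical price.
data, cost = retrieve_with_micropayments(b"x" * 1000, piece_size=256,
                                         price_per_byte=0.001)
```

The point of the design is the bounded exposure: if either party disappears mid-transfer, at most one piece's worth of value is lost.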
In the initial release of Filecoin, the storage and retrieval markets do not implement an order-matching mechanism; an order-taking mechanism is used instead. The three main parts of Filecoin correspond to three processes: the storage process, the retrieval process, and the packing-and-reward process. The following figure shows the simplified flow and the miners' income:
The actual Filecoin mining process is considerably more complicated, and the most important factor determining mining profit is effective storage. Effective storage is a key feature that distinguishes Filecoin from other decentralized storage projects. In Filecoin's Expected Consensus (EC), effective storage plays a role similar to stake in PoS: it determines the likelihood that a miner wins the right to pack a block. In other words, a miner's share of the network's total effective storage is proportional to its final mining revenue.
It is also possible to obtain higher effective storage under the same hardware conditions by improving the mining algorithm. However, how much benefit such algorithmic improvements can actually deliver is still unknown.
Filecoin seeks to promote mining with Filecoin Discover
Filecoin announced Filecoin Discover, a program to encourage miners to join the Filecoin network. According to the company, Filecoin Discover is "an ever-growing catalog of numerous petabytes of public data covering literature, science, art, and history." Interested miners can choose which datasets they want to store and receive that data on a drive, at cost. In exchange for storing this verified data, miners will earn additional filecoin on top of the regular block rewards for storing data. The current catalog includes open-source datasets such as ENCODE, the 1000 Genomes Project, Project Gutenberg, and the Berkeley self-driving dataset, with more projects and datasets added every day.
Ian Darrow, Head of Operations at Filecoin, commented on the announcement:
"Over 2.5 quintillion bytes of data are created every day. This data includes 294 billion emails, 500 million tweets and 64 billion messages on social media. But it is also climatology reports, disease-tracking maps, connected-vehicle coordinates and much more. It is extremely important that we preserve data that will serve as the backbone for future research and discovery."
Miners who choose to participate in Filecoin Discover can receive hard drives pre-loaded with verified data, along with setup and maintenance instructions, according to the company. The Filecoin team will also host a Slack channel (fil-discover-support) where miners can learn more.
Filecoin has had its fair share of obstacles along the way. Last month Filecoin announced yet another delay before its mainnet officially launches, after years of fundraising.
In late July, QEBR (OTC: QEBR) announced that it had divested two subsidiaries in order to focus all of the company's resources on building blockchain-based mining operations.
The QEBR technology team previously announced that it had validated its system as a working Filecoin node, with CPU, GPU, bandwidth and storage compatibility meeting all IPFS guidelines. The QEBR test system is connected to the main Filecoin test blockchain and has already test-mined filecoin.
"The divestiture of Sheen Boom and Jihye will allow our team to focus solely on the upcoming global launch of Filecoin. QEBR's subsidiary, Shenzhen DZD Digital Technology Ltd. ("DZD"), has a strong background in blockchain development, data mining, data acquisition, data processing and data technology research. We strongly believe Filecoin has the potential to be a leading blockchain-based cryptocurrency and will make every effort to make QEBR an important player when the Filecoin mainnet launches soon."
IPFS and Filecoin
Filecoin and IPFS are complementary protocols for storing and sharing data in a decentralized network. While users are not required to use Filecoin and IPFS together, the two combined are working to resolve major failures in the current web infrastructure.
IPFS
IPFS is an open-source protocol that allows users to store and transmit verifiable data with each other. IPFS users persist data on the network by pinning it on their own device, using a third-party cloud service (known as a pinning service), or through community-oriented systems where a group of individual IPFS users share resources to ensure the content stays live.
The lack of a built-in incentive mechanism is the challenge Filecoin hopes to solve: it allows users to pay for long-term distributed storage at competitive prices through the storage deal market, while keeping the efficiency and flexibility the IPFS network provides.
Using IPFS
In IPFS, data is hosted by the nodes that pin it. For data to persist while the user's own node is offline, users must either rely on other peers to voluntarily pin their data or use a centralized pinning service to store it.
Relying on peers to cache data can work well when one or more organizations share common files on an internal network, or where strong social contracts can ensure continued hosting and preservation of content over the long run. In practice, though, most users on an IPFS network use a pinning service.
Using Filecoin
Another option is to persist your data on a decentralized storage market such as Filecoin. In Filecoin's model, customers make regular small payments to have data stored at a given level of availability, while miners earn those payments by continuously proving the integrity of that data, storing it, and ensuring its quick retrieval. This lets users pay Filecoin miners to make sure their content stays live when it is needed, a distinct advantage over relying only on other network users, as when using IPFS alone.
Filecoin, powered by IPFS
It is important to know that Filecoin is built on top of IPFS. Filecoin aims to be a tightly integrated, seamless storage market that takes advantage of the basic functionality IPFS provides; the two are connected to each other, but each can be used completely independently of the other. Users do not need to interact with Filecoin in order to use IPFS.
Some advantages of sharing Filecoin with IPFS:
Of all the decentralized storage projects, Filecoin is undoubtedly the most anticipated, and IPFS has been running stably for two years, fully demonstrating the strength of its core protocol.
Filecoin's ability to win market share from traditional centralized storage depends on end-user experience and storage price. Currently, most Filecoin nodes are deployed in IDC machine rooms; actual deployment and operating costs are no lower than traditional centralized cloud storage, and the storage process is more complicated.
PoRep and PoSt, which require a large amount of computation to generate proofs, will make the actual cost of storage high in the early days after Filecoin's launch, possibly higher than centralized cloud storage. However, early storage nodes may lower their storage prices in order to capture block rewards, which could push the effective storage price below that of traditional centralized cloud storage.
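The subsidy argument above can be made concrete with a toy calculation: a miner can quote a price below its real cost as long as expected block rewards cover the gap. All dollar figures here are invented for illustration:

```python
def breakeven_price(cost_per_gb_month: float,
                    reward_subsidy_per_gb_month: float) -> float:
    """Lowest storage price a miner can quote and still break even:
    real cost minus the block-reward subsidy earned by stored data."""
    return max(cost_per_gb_month - reward_subsidy_per_gb_month, 0.0)

# Hypothetical numbers, $/GB-month:
cloud_price = 0.020     # centralized cloud benchmark
filecoin_cost = 0.028   # real cost including PoRep/PoSt proof overhead
reward_subsidy = 0.015  # expected block rewards attributable to the data

quote = breakeven_price(filecoin_cost, reward_subsidy)
# Even though raw cost exceeds the cloud price, the subsidized quote
# can undercut it.
```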
In the long term, Filecoin still needs to take full advantage of its P2P storage model, move storage hardware from specialized to commodity devices, and improve its algorithms to reduce storage costs without hurting user experience. Storage is an important problem for the blockchain field to solve, which is why a large number of storage projects were presented at the 2019 Web3 Summit. IPFS is an important part of the Web3 vision, and its development will shape Web3 to some extent; likewise, Web3's development will partly determine the future of IPFS. Filecoin is an IPFS-based storage project initiated by the IPFS team, and there is no doubt that it carries high expectations.
Resources:
  1. https://www.coindesk.com/filecoin-pushes-back-final-testing-phase-announces-calibration-period-for-miners
  2. https://docs.filecoin.io/mine/#types-of-miners
  3. https://www.nasdaq.com/articles/inside-the-craze-for-filecoin-crypto-mining-in-china-2020-07-12
  4. https://www.prnewswire.com/news-releases/qebr-streamlines-holdings-to-concentrate-on-filecoin-development-and-mining-301098731.html
  5. https://www.crowdfundinsider.com/2020/05/161200-filecoin-seeks-to-boost-mining-with-filecoin-discove
  6. https://zephyrnet.com/filecoin-seeks-to-boost-mining-with-filecoin-discove
  7. https://docs.filecoin.io/introduction/ipfs-and-filecoin/#filecoin-powered-by-ipfs
submitted by CoinEx_Institution to filecoin

Satoshi Nakamoto and Bitcoin are not all there is to blockchain: this public chain, which could change the direction of global finance, is "braving the wind and waves" for DeFi

DeFi is expanding rapidly, and its market value is skyrocketing
Everyone in the blockchain industry, employee and employer alike, has been struck by DeFi. No concept since blockchain technology was created has compared to DeFi in setting the entire industry on fire; even Satoshi Nakamoto, the creator of Bitcoin and blockchain, may not have imagined that the DeFi trend would exceed Bitcoin.
DeFi has been a hot topic in the blockchain field since the beginning of 2019. DeFi is short for Decentralized Finance, also called open finance: building decentralized contracts that belong to an open financial system. DeFi is dedicated to providing financial services to everyone, free of time and space constraints; this is what we call decentralized finance.
In the current financial system, all financial services are controlled and coordinated by centralized institutions; basic functions such as deposits and transfers, loans, and derivatives trading are all monitored and intermediated by centralized financial organizations. DeFi hopes to build a transparent, accessible and inclusive peer-to-peer financial system that minimizes trust risk, simplifies the transaction and payment process, and expands the range of transaction scenarios.
DeFi platforms have three obvious advantages over traditional centralized financial systems.
1. Financial services become globally accessible: anyone can obtain financial services, including everything today's banks provide, through the internet or a smartphone, on top of a decentralized financial system built on blockchains.
2. Blockchain technology is highly open: everyone has the right to access it, but nobody holds central control, decentralizing financial transactions. This was Satoshi Nakamoto's original purpose in creating Bitcoin.
3. Cross-border payments become more convenient and cheaper. DeFi applies blockchain's openness to avoid expensive fees on global payments, making financial transactions more convenient and efficient while minimizing the cost of cross-border transfers.
Thanks to these benefits, DeFi has been able to take first place among blockchain sectors, and capital keeps rushing in, wave after wave. According to on-chain data from Aug 20, the market value of the entire DeFi industry has reached $11.3 billion, surpassing most other digital-currency sectors; meanwhile, daily volume across all decentralized exchanges has reached $429 million, total loans on lending platforms have reached $1.5 billion, and the total value locked in DeFi has reached $6.37 billion.
Against a depressed global economy, capital inflows into DeFi are outpacing most financial industries. When the river rises, the boat floats high: DeFi-related projects are reaping large profits amid vigorous blockchain development, and DeFi token prices are skyrocketing. In two years, the total value of DeFi projects has risen to $10 billion or more. Viewed soberly, though, the DeFi ecosystem is full of bubbles and short on flexible, grounded projects.
Superior projects have something in common.
So-called decentralized finance consists of two parts: decentralization and finance. Under current circumstances, most projects only achieve the "finance" part; real decentralization has not been achieved. For most DeFi projects, the first shipped version does not meet market expectations and most core functions still need updating, so the founding team must keep full control of the project in order to ship updates on time and efficiently.
This means that most of the DeFi projects we see are controlled by their founding teams. Is a controlled DeFi really DeFi? Can any single project achieve finance plus decentralization?
The newborn AITD may satisfy blockchain's expectations for DeFi: AITD is a new-generation, enterprise-grade foundational public chain built for "decentralization + finance".
As market demand from financial industries such as banking, insurance and securities grows year by year, AITD follows the trend closely, embracing the idea and purpose of DeFi to build a healthy, complete decentralized financial ecosystem. Blockchain DeFi + AITD is expanding in new directions: insurance, trusts, pledges, and cross-border payments.
INSURANCE: AITD innovates on the current medical system by integrating easy-to-use insurance scenarios; take medical insurance as an example. The AITD blockchain not only stores digital proofs in blocks but also enables information sharing. AITD can open up each step of the insurance process, solving information asymmetry and making the process transparent for upstream and downstream parties so that value can flow. By rewarding information providers for sharing, medical system information becomes public, opening up the medical-insurance-supervision cycle; medical records and insurance business become electronic, insurance policies are stored on the blockchain via smart contracts, and insurance verification and claims become automatic and intelligent.
PLEDGE: The essence of pledging is new smart business; as a new model it is becoming the main track for real-world application and a solid foundation for decentralized finance. Financial services should not be built on opaque, isolated islands. AITD is dedicated to building a financial system that anyone with an internet connection can access, letting value flow freely. With its high intelligence and high transparency, AITD will bring a revolutionary storm to the global financial system. The direction of pledging's transformation is open finance, and open finance is the future form of finance. In the future we will build a highly ecological operating system, fully integrating frontier technology, smart business, open organization and digital finance into a refined business system.
TRUST: Blockchain-based trusts carry the genes of innovation, freedom and equality. On the premise of justice and fairness, a blockchain trust contains a market-value maintenance and promotion system that can observe real-time feedback from global users through constant updates, driving product changes and improving user experience. AITD's collective trust offers high transparency, requiring real-name authentication for borrowing companies and investors and transparent handling of each project's process. It is dedicated to building a safe, stable, transparent and efficient online and offline platform for small, medium and micro enterprises with capital needs and for individuals with financing needs. It innovates on the traditional trust business model, uses technology to advance the health industry, applies an asset-operation philosophy that integrates "intelligence" and "capital", and collaborates with medical experts who have made great contributions to the field, aiming for highly efficient integration of medical resources, medical research capacity and financial capital.
CROSS-BORDER PAYMENTS: Blockchain payment technology restructures the traditional flow of assets and information, using blockchain's unique advantages to improve on the high costs, low transparency and transaction risks of traditional transfers. AITD has comprehensive, strong international bank-card acquiring products and diverse overseas and local payment-collection methods, providing a one-stop global online payment solution: users can transfer money from anywhere at any time, merchants can accept many different payment habits, and exchange-rate conversion is handled automatically. For cross-border payment scenarios, transfer speed and low cost are the focus; the platform will collaborate with other platforms around the world, helping platforms with global community backgrounds open up payment channels.
AITD integrates blockchain technology and finance to the greatest possible extent. Satoshi Nakamoto's original idea for Bitcoin was to change how the existing financial system works: real monetary freedom, open source, decentralization, money flowing through and being used across society, creating trust and consensus in many financial scenarios. AITD + DeFi can achieve what Bitcoin alone cannot.
AITD's advantage: its own public chain
Traditional DeFi projects are deployed on Ethereum or other networks, where congestion, poor user experience, high fees and developer friction are pushing users and developers away. DeFi on AITD faces the same problems, but the AITD team has already found the best solution, which we will explain below.
The situation the DeFi industry faces: although there is much complaining about Ethereum, neither new nor old projects can do without it. According to DeFi Prime data, of 242 DeFi projects counted at one point, 197 are deployed on Ethereum, while EOS and Bitcoin host only 22 and 23 respectively, and the number of DeFi projects on other public chains approaches zero. Ethereum, regarded as the second leading force in the blockchain industry after Bitcoin, is thus determining the fate of DeFi.
Why does DeFi (such as the red-hot Compound and Uniswap) not exist on other public chains? Ultimately, public chains other than Ethereum can hardly generate DeFi for the following three reasons.
1) Ethereum, as the first smart-contract public chain, has competitive advantages in both the variety and the total amount of assets it hosts.
2) Unlike Ethereum, other public chains did not pay much attention to DeFi early on; having lost the initiative, they are now merely following the trend.
3) Once a DeFi project has scaled under decentralized governance, the cost of migrating from Ethereum to another public chain is hard to estimate.
Indeed, after DeFi shook the crypto market, every public chain entered an arms race for DeFi, pouring capital, technology and talent into it; yet no public chain has surpassed or even seriously challenged Ethereum. Public-chain limitations are the industry's pain point.
An Ethereum researcher once said that the congestion on the Ethereum network is even worse than during the ICO bubble, and this is no exaggeration. During the ICO boom of 2018, the average transaction fee reached $5.4; by 5:00 pm on Aug 13, that number had skyrocketed to $7.4, roughly 15 times the $0.5 of the previous month. DeFi's prosperity on Ethereum has turned into market behavior that punishes its own users.
Under these circumstances, the search for a new public chain becomes necessary. What is the AITD team's solution? The answer is its own public chain.
To escape the punishing fees of the Ethereum public chain and to build a complete, efficient DeFi ecosystem, the AITD team has spent three years researching and developing its own public chain, providing multi-scenario blockchain services to a large user base on the AITD blockchain.
In the future, AITD will provide reliable, safe and convenient blockchain services for public information search, copyright administration, certified-product tracing and product-security scenarios, achieving the multi-path communication that Bitcoin cannot. Meanwhile, the AITD chain closes the loop on its own ecosystem, extending the spirit of DeFi to insurance, trusts, pledges, cross-border payments and other financial scenarios, achieving truly decentralized finance.
Today's blockchain networks are relatively isolated islands of information, and this isolation prevents collaborative operation between blockchain networks, limiting the fields where blockchain technology can be applied. AITD is dedicated to building a highly extensible blockchain network: while achieving fast, safe cross-chain data access, it also builds a network of value for the whole blockchain industry.
An internet of value plus blockchain-based decentralized finance: AITD has strong ambitions and is dedicated to bringing the best applications to the $11.3 billion DeFi market. We await the results.
submitted by AITDBlockchai to u/AITDBlockchai

Some informative responses from Colin and Andy from the just-concluded Nano AMA at the Atomic Wallet Telegram group

The AMA ran today from 13:00 to 14:20 UTC, with Colin and Andy. I've copied over some of the responses that gave me better insight into Nano. Their responses are in italics; responses to different questions are separated by double spaces. Colin's responses are listed first, followed by Andy's. Sorry I couldn't copy over the questions as well. I've added my comments in places.
From Colin:
PoW coins have done good marketing that the energy expenditure makes your coins more secure, but it's really unnecessary. PoW coins need to keep expending work, because if they stop, their security parameter erodes.
Nano has no such problem: once an election for a transaction is complete, it's confirmed. If it sits there it stays confirmed and doesn't need any extra effort. Wow, put that way, Bitcoin seems unsustainable in the long term when there is an alternative like Nano.

Yes, the circulating supply stays like this forever. The reason it can't change is that Nano transactions can only send your current balance or less to someone else, which means new coins can never be injected into the system. Interesting design reason new Nano can't be minted.
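Colin's point, that a send can move at most the sender's current balance, is exactly the invariant that keeps supply fixed. A toy ledger makes this visible (a sketch only, not Nano's block-lattice implementation):

```python
def send(balances: dict, sender: str, receiver: str, amount: int) -> None:
    """A send is only valid if it moves at most the sender's current
    balance, so the total supply can never grow."""
    if amount <= 0 or amount > balances.get(sender, 0):
        raise ValueError("send must be positive and at most the current balance")
    balances[sender] -= amount
    balances[receiver] = balances.get(receiver, 0) + amount

# Hypothetical two-account ledger: total supply is conserved by every send.
ledger = {"alice": 100, "bob": 0}
supply_before = sum(ledger.values())
send(ledger, "alice", "bob", 40)
supply_after = sum(ledger.values())
```

Because every valid state transition is a transfer of existing balance, no sequence of sends can mint coins; an overspend is simply an invalid block.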

Volatility is a focus with all cryptocurrencies, and it comes from low volume; it's not intrinsic to cryptocurrency itself. To cure low volume, our focus is integrating it into parts of the economy where it solves a problem, rather than just emulating credit cards etc.
Not having fees in the network puts us in a very good position for buying beer, for example. Typically credit card providers will charge 2-5% for a purchase, maybe even more, and in tight-margin businesses that make 2-5% profit anyway, this is huge. A lot of Reddit discussion on crypto adoption considers only user experience and overlooks benefits to merchants.

Nano is purpose-built to be the fastest and most decentralized currency around. Our transactions settle in less than 1 second, and it's all done on a network with no fees and a tiny environmental footprint.
Decentralization is an essential focus for us. Many other cryptocurrencies can get fast or low-cost, but they can't also maintain decentralization, which I think we do very well.
Well, the sustainability comes from two main parts. We have a laser-sharp focus on being the most efficient currency. This means our development stays focused, and eventually the amount of things going into the code base will trend downward; once we've achieved the goal, we just have to make things more efficient.
The second part of sustainability is our Open Representative Voting, which is our replacement for PoW mining. We saw the energy expenditure as something that would come into conflict with any system attaining high adoption, so our goal was to get the same or better decentralization benefits while also having a low energy footprint. We think we achieved that goal, as our representatives are all over the world under many different organizations. A healthy, decentralized representative set is good for long-term sustainability.
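Open Representative Voting can be sketched as weighted voting: a block confirms once representatives holding enough delegated weight have voted for it. The representative names, weights, and the 67% quorum below are made-up illustration values, not Nano's actual parameters:

```python
def is_confirmed(votes: set, weight_by_rep: dict,
                 quorum_fraction: float = 0.67) -> bool:
    """A block is confirmed once representatives holding more than the
    quorum fraction of total delegated weight have voted for it."""
    total = sum(weight_by_rep.values())
    voted = sum(w for rep, w in weight_by_rep.items() if rep in votes)
    return voted > quorum_fraction * total

# Hypothetical delegated voting weights (account holders delegate their
# balances to representatives; no energy is spent to vote).
weights = {"rep1": 40, "rep2": 35, "rep3": 25}
```

The key contrast with PoW is that security comes from the distribution of delegated weight, not from ongoing energy expenditure, so a confirmed block stays confirmed at no extra cost.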

And on simplicity, Nano is probably one of the easiest cryptocurrencies to use. There are no fees to calculate; the UX impact of entering a fee is greatly understated. How much should the fee be? Does my grandma know what network load is? What does it mean for the fee?
Nano simply has accounts and balances: you send, and it lands in the recipient's wallet in less than a second. Nothing could be simpler.

We're not looking to expand into DeFi right now; I have some reservations about its viability. One thing I've noticed over many years of watching technology evolve is not to try to change two things at once. We don't want to simultaneously change the currency people use and change how finances are done. First change the currency, then change the finances.
I think Libra suffers from a market misassessment. Essentially what they're claiming to be is a multi-currency bank account for every Facebook user. Getting users electronic bank accounts isn't a technology problem; it's a regulatory and logistics problem. Since Facebook is essentially acting as a bank, it will be required to comply with KYC requirements. Sending and receiving won't be open as it is in cryptocurrency, because of AML requirements. People in remote areas won't have access to the system: how do they deposit, or more importantly withdraw, local currency from their Libra accounts?
I think privacy is a big concern with our transactions and credit card purchases, and it's only getting worse. Letting Facebook/Libra know your entire purchase history is, I think, a huge mistake.
It also doesn't fundamentally solve the central banking problem, where banks can print more money and inflate the currency supply. I see this behavior as fundamentally unethical, something cryptocurrency solves and Libra takes a huge step back on.
I don't see anything compelling about it, and I don't see long-term viability.

I think disk usage is going to be a low concern long term. The goal with Nano is to be a widely used, commercial-grade currency, so the representatives will be banks and other financial institutions, universities, and tech companies. Considering how much YouTube, Instagram, and other social media data is created each day, I don't think the ledger size will be a long-term limiting factor. The role of hobbyists in running nodes will likely diminish as adoption widens.

Nano's value is being the fastest, most efficient currency around. Entrepreneurs make use of natural market incentives and natural efficiencies to make money in a business.
Cryptocurrency has distorted that a bit with something more closely resembling subsidies. Transaction fees and block rewards subsidize the security parameter and processing prioritization. PoW chains need this subsidy because their security parameter costs a lot. Additionally, we've seen miners work to limit the network's throughput in order to rent-seek on the limited transaction space: a clear case of misaligned incentives between users and miners.
The people we're looking for are the entrepreneurs who know how to make use of a faster, lower-cost currency.

Yes, having a fixed supply is an essential component of currency. If people can add more currency to the system, they’re taking value away from everyone else in that process. It’s unfair and unethical.
1 Nano can actually be divided into very small units, so there's no risk of not having enough coins.
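For concreteness, Nano's smallest unit is the "raw", and per the Nano documentation 1 NANO equals 10^30 raw, so working in integer raw units never loses precision:

```python
# 1 NANO = 10**30 raw (per the Nano documentation); doing accounting in
# integer raw units avoids floating-point rounding entirely.

RAW_PER_NANO = 10**30

def nano_to_raw(amount_nano):
    return int(amount_nano * RAW_PER_NANO)

def raw_to_nano(amount_raw):
    return amount_raw / RAW_PER_NANO

assert nano_to_raw(1) == 10**30
assert raw_to_nano(nano_to_raw(5)) == 5.0
```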

In this response, Colin is addressing a question about Steem and other dPoS systems. One major difference with Nano consensus is that having more Nano does not get you more Nano: there are no rewards for holding it. Holding Nano doesn't give people voting privileges on network changes, or any other centralizing advantage associated with holding.
Another big difference is that voting in Nano does not produce blocks; it chooses between conflicting blocks that a user publishes. If you don't attempt to double-spend, your transactions cannot be voted against.

From Andy:
1. The faucet did indeed seed Nano's amazing international communities, and the contributions from around the world to the project have been unbelievable over the last 2.5 years. Communities are still active, engaged and building 💪
2. The effect of Nano being added to Atomic Wallet (and other multi-currency wallets) is twofold. It increases the accessibility and convenience of storing Nano alongside other coins, and it also helps to disperse voting weight across a wider spread of representatives - increasing decentralization!

We certainly feel that Nano possesses far and away the best fundamentals, democratic approach to decentralization, and user experience.
Being fully distributed and operating on mainnet since 2015 is also very important, and puts Nano way ahead of many other projects making bold claims about future potential.
Nano is here today, and works as one would expect digital money to work!

Privacy is an attractive proposition to users of digital money for obvious reasons, and it can be very important. Our position on privacy is more conservative, as we have seen many more hurdles to mainstream adoption placed in front of privacy-based projects.
That being said, we are keeping an eye on the technical implications of introducing privacy, but it is extremely difficult to do this without incurring slowdowns to settlement times.
Throughout 2019 we made significant progress in helping some of the more well-established cryptocurrency services, such as exchanges, fiat gateways, payment platforms, and wallets (like Atomic 😄), to understand and integrate Nano. This proliferation of Nano across the space has made it increasingly convenient for users and merchants to access and begin using Nano for payments.
submitted by Live_Magnetic_Air to nanocurrency [link] [comments]

Providing Some Clarity on Bitcoin Unlimited's Financial Decisions

Providing Some Clarity on Bitcoin Unlimited's Financial Decisions

https://preview.redd.it/zjps7jpg7rg41.jpg?width=1601&format=pjpg&auto=webp&s=defb61fb45c1a2ad5c7e31fe9200541783ba6478

Introduction

As promised in our previous article, we wanted to provide some extra clarity on Bitcoin Unlimited financial choices. We wanted to do this as there has been a lot of confusion and misinformation within the community as to the reasons behind these choices.
It has been claimed by a small number of influential people in the ecosystem that Bitcoin Unlimited does not support BCH (see the previous article debunking this claim) and that BU's holdings are supposedly evidence of this.

Background

Bitcoin Unlimited was founded in 2015 as a response to the Bitcoin block size debate. More specifically, it was created to provide software that allowed on-chain scaling as originally proposed by Satoshi Nakamoto. On-chain scaling is a vital component required for peer-to-peer electronic cash to serve the world's population; without it, Bitcoin would be limited to serving only the small number of people willing and able to pay exorbitantly high fees. Our organisation was created to make Bitcoin unlimited. This prediction of high fees and limited capacity has played out in the BTC we know today.
Bitcoin Unlimited received a large anonymous donation in BTC in 2016 from supporters of the ‘on-chain scaling’ movement. This donation allowed our organisation to remain independent and focussed on building software that allows on-chain scaling.
As you all know, in August of 2017, Bitcoin Cash was created after an unsuccessful multi-year effort to allow Bitcoin (BTC) to scale on-chain. Bitcoin Cash was created with the goal of on-chain scaling to support the world’s population right at its heart and BU has been supporting it since the idea was originally formulated.
Once Bitcoin Cash was created it also meant that all funds Bitcoin Unlimited held (BTC) were forked into two equal sets of coins, BTC and BCH. This put BU into a position where we had to make an important decision on how to handle these funds in a way that was in the interest of both BCH and BU.

Financial Prudence

Any organisation that wants to be effective in its goals must aim to always be financially sustainable. Without money, achieving anything becomes significantly more difficult. Cryptocurrencies only magnify this issue even further. Highly volatile asset values, opaque and dynamic tax and regulatory environments, and the unique properties of cryptocurrencies all contribute towards making the financial operations of an organisation an extreme challenge to say the least. Navigating this challenging landscape is a necessary requirement for the success of any organisation within our industry though.
While Bitcoin Unlimited’s primary goal is to make sure peer-to-peer electronic cash (as set out in the Bitcoin white-paper) becomes a reality, a secondary goal must be to make sure that it has the resources required to make its primary goal achievable, and an important part of these resources are its funds.
After Bitcoin forked into BTC and BCH, Bitcoin Unlimited held an equal number of both. A BUIP was passed to authorize converting some additional BTC to BCH, but significant practical obstacles to doing so exist (this is still being worked on). Since the overarching reason to convert a significant number of BTC to BCH is to maintain financial prudence, based on the reasons outlined below, and since BCH's poor price performance has heavily skewed our holdings, we do anticipate some rebalancing once these obstacles are resolved.
We will further expand on these reasons below.

Historic Volatility

An organisation that wishes to maintain a lower level of risk must aim to hold the majority of its funds in assets that will maintain their value over time, i.e. be less volatile in price. It is unfortunately true that BCH has been a more volatile asset than BTC since its creation. While there has been plenty of progress and maturation in the BCH ecosystem, this price volatility is likely due to BCH still being a smaller and less developed ecosystem than BTC. The graphs below compare the levels of volatility in the two coins.

[Volatility charts: BTC and BCH]
This higher volatility means that significantly increasing BU's holdings of BCH would expose the organisation to a higher level of risk. BTC is already a high-volatility asset, and exposing the organisation's funds to even higher volatility and further risk is a decision that should not be taken on simplistic ideological grounds, but with the strategy of maximising the organisation's ability to achieve its primary goals. This meant deciding not to take on higher exposure to price volatility, and instead maintaining a more conservative risk profile.
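As a sketch of how such volatility comparisons are typically computed (annualized standard deviation of daily log returns; the price series here are invented, not real BTC/BCH data):

```python
import math

def annualized_volatility(prices):
    """Sample std dev of daily log returns, scaled to a 365-day year."""
    rets = [math.log(b / a) for a, b in zip(prices, prices[1:])]
    mean = sum(rets) / len(rets)
    var = sum((r - mean) ** 2 for r in rets) / (len(rets) - 1)
    return math.sqrt(var) * math.sqrt(365)  # crypto trades every day

steady = [100, 101, 100, 102, 101, 103]  # stand-in for the steadier coin
choppy = [100, 110, 95, 115, 90, 120]    # stand-in for the choppier coin
assert annualized_volatility(choppy) > annualized_volatility(steady)
```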

Lack Of Say In The Protocol

One argument that has been put forward is that this decision does not make sense because it is analogous to the CEO of a company holding more shares in a competitor's company. This analogy does not accurately reflect the situation of BU or BCH. In the analogy, BU is the CEO and BCH is the company. Shareholders aside, a CEO has a larger impact on a company than any other stakeholder: their actions directly affect the operations of the company, and therefore its value and the value of its shares.
Unfortunately, Bitcoin Unlimited currently has little to no input on the BCH protocol. It has no way to directly influence the direction or success of BCH, for two reasons. Firstly, BCH has a mining software homogeneity that is as centralised as BTC's (i.e. essentially all miners and pools run a single client, BitcoinABC). This means that, although BU has a slight majority among non-mining, in-consensus nodes, BU has no say in protocol decisions unless BitcoinABC were to adopt a collaborative, decentralised development model. This is an unfortunate situation considering that the community split from BTC for this very reason and strongly supports decentralised development. Secondly, BitcoinABC does not take a collaborative approach to development; all decisions and features are dictated by BitcoinABC.
In fact the situation is unfortunately even worse than this. BitcoinABC has decided to take an actively hostile position against Bitcoin Unlimited (and many other valuable participants in the ecosystem) and would rather that it did not exist at all.
While a number of members of BitcoinABC were previously members of BU, they unfortunately used their privilege as members to try (but fortunately failed) to sabotage the organisation.
https://www.bitcoinunlimited.info/voting/rendeproposal_vote_result/7eb0ded0487a6593ac3976b63422294e1a84b209be1307c46f373489922212a0
https://www.bitcoinunlimited.info/voting/rendeproposal_vote_result/6285fcef8fa44416b8e83f25bfebe79aff502c1446a7b60bfab28ec58c35b609
https://www.bitcoinunlimited.info/voting/rendeproposal_vote_result/b10f54ece2ea3b9001086ebdde0001fbef9dc2fd83729a65ba207c0f1d9dfceb
These three voting records show members of BitcoinABC voting for the purchase of BSV coin, voting for an unfeasibly large block size increase (10TB), and voting for implementation of and miner-activation of BSV features into the BU client. None of these actions were implemented in the ABC client, and the inclusion of BSV features is likely the single biggest criticism certain ABC affiliated people have made against BU, yet members of BitcoinABC voted for it.
While it is important to assume good faith, under no interpretation can this be seen as anything other than an act of bad will towards BU. Unfortunately this kind of behaviour is the rule rather than the exception, and it has likely been a major factor in BCH's struggle to attract quality developers into the ecosystem.
Regardless of the hard work done by members of BU to create useful software for Bitcoin Cash, and its continued commitment towards peer-to-peer electronic cash for the past 5 years, ABC will unfortunately never allow any of BU’s work to go into the BCH protocol willingly.
If BU were to invest all its funds into BCH it would be making a highly risky bet on BitcoinABC’s leadership, a leadership that has not only been historically unsuccessful (when looking at the price of BCH since its creation, both in dollar terms and BTC/BCH ratio terms), but also actively hostile to our organisation. A more cautious approach that takes these factors into account is to keep the funds held where there has been less volatility.
Regardless of all of this, BU is still 100% committed to supporting Bitcoin Cash.

Game Theory: The Strategy of Betting Against Yourself

Counterintuitively, a strategy where you bet against yourself can provide a beneficial low-risk profile: if you lose, you win, and if you win, you win. With BU's current holdings of BCH and BTC, the organisation is financially hedged such that it wins if BCH wins, and if BTC wins, BU lives to fight another day for worldwide peer-to-peer electronic cash.
If BTC goes down and BCH goes up then it means BCH is succeeding, and our funds in BCH will sustain us for longer. Not only that, but there would likely be more funds available for BCH development in this scenario. If BTC goes up and BCH goes down then BU will be sustained for longer to continue the fight for BCH and peer-to-peer electronic cash.
This is very similar to the strategy of BCH-supporting miners mining on BTC and then converting the BTC block rewards into BCH in an effort to use BTC gains to support BCH price. BU is similarly using its gains in BTC and converting them to efforts and initiatives in support of BCH. In doing so Bitcoin Unlimited is able to turn any BTC win into a positive for BCH.
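The hedge described above can be illustrated with a toy payoff table (all prices and holdings are hypothetical):

```python
# Toy illustration of the hedge: a treasury holding both coins retains
# substantial runway in either outcome. All numbers are invented.

holdings = {"BTC": 100, "BCH": 100}  # equal coin counts after the fork

def treasury_value(prices):
    return sum(qty * prices[coin] for coin, qty in holdings.items())

btc_wins = {"BTC": 12000, "BCH": 150}
bch_wins = {"BTC": 3000, "BCH": 900}

# Either way, the organisation stays funded:
assert treasury_value(btc_wins) == 1_215_000
assert treasury_value(bch_wins) == 390_000
```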

Incentives

It has been suggested that the situation created by holding a larger portion of funds in BTC than in BCH creates negative incentives that push BU towards supporting BTC. It is important to keep in mind that Bitcoin Unlimited is not a profit driven organisation. While an increase in value of its assets is of course beneficial to the organisation, our primary goal is to accelerate the global adoption of peer-to-peer electronic cash as described in the Bitcoin white-paper, and the officials, membership and founding articles of Bitcoin Unlimited are the driving force for this.
It is also important to point out that there is no evidence to support the claim that BU is in support of BTC (or BSV). In fact the voting record clearly shows the opposite of this. BU has continually worked in support of peer-to-peer electronic cash, and specifically in support of BCH since it was created. This is thanks to the strong commitment by the BU officials and members, all of whom are long time Bitcoiners and supporters of the ‘on-chain scaling’ movement. The only members who receive any payment from the organisation are those who provide significant value in the form of various skilled services, and all of these are voted on by the membership. The BUIP record also shows that compensated individuals are often compensated at far under market rates for developers of their caliber. Should the price of BTC increase, no member receives any direct benefit from this beyond any appreciation in value of any BTC they privately hold. Therefore there are no strong incentives for BU to drive the price of BTC up and push the price of BCH down as this would be counter to our primary goal.

Has This Strategy Been Successful?

Bitcoin Unlimited and its members, all being long-time Bitcoiners, are acutely aware of the need to play the long game to make sure a globally adopted peer-to-peer electronic cash becomes a reality. BU is the oldest entity within the BCH ecosystem and with good reason. The financial strategy of BU to date has been highly effective in sustaining the organisation over a long period of time, and allowing it to independently support BCH development initiatives. This is made clear by the fact that BU continues to have enough funding to provide value to the BCH ecosystem for the foreseeable future.
Had BU converted all funds to BCH at, or at almost any point after, the time of the BCH/BTC fork in August 2017, then for much of the time since it would have been forced to either scale back operations or shut down support for BCH developers completely. We now see development teams such as BitcoinABC facing the prospect of being unable to fund their development of BCH, and their financial strategy may have contributed to this reality. This is despite the fact that nearly all the funds donated in the recent community funding drive sponsored by bitcoin.com were directed towards BitcoinABC.
Lack of a sustainable funding model also seems to have been a major factor in pushing BitcoinABC to make the highly controversial decision to support a change to the BCH protocol that would divert 12.5% of the block reward to themselves. Being financially prudent and sticking to its principles (as defined in the founding Articles of Federation) has allowed Bitcoin Unlimited to steer clear of conflicts of interest such as this.

Summary

Through its financial strategy Bitcoin Unlimited has been able to maintain its independence and financial sustainability and has therefore remained in a strong position to support Bitcoin Cash. BU’s officials and membership have continually made good decisions that have allowed BU to provide long-term support for the Bitcoin Cash ecosystem.
submitted by BU-BCH to btc [link] [comments]

Polkadot Launch AMA Recap

Polkadot Launch AMA Recap

The Polkadot Telegram AMA below took place on June 10, 2020

https://preview.redd.it/4ti681okap951.png?width=4920&format=png&auto=webp&s=e21f6a9a276d35bb9cdec59f46744f23c37966ef
AMA featured:
Dieter Fishbein, Ecosystem Development Lead, Web3 Foundation
Logan Saether, Technical Education, Web3 Foundation
Will Pankiewicz, Master of Validators, Parity Technologies
Moderated by Dan Reecer, Community and Growth, Polkadot & Kusama at Web3 Foundation

Transcription compiled by Theresa Boettger, Polkadot Ambassador:

Dieter Fishbein, Ecosystem Development Lead, Web3 Foundation

Dan: Hey everyone, thanks for joining us for the Polkadot Launch AMA. We have Dieter Fishbein (Head of Ecosystem Development, our business development team), Logan Saether (Technical Education), and Will Pankiewicz (Master of Validators) joining us today.
We had some great questions submitted in advance, and we’ll start by answering those and learning a bit about each of our guests. After we go through the pre-submitted questions, then we’ll open up the chat to live Q&A and the hosts will answer as many questions as they can.
We’ll start off with Dieter and ask him a set of some business-related questions.

Dieter could you introduce yourself, your background, and your role within the Polkadot ecosystem?

Dieter: I got my start in the space as a cryptography researcher at the University of Waterloo. This is where I first learned about Bitcoin and started following the space. I spent the next four years or so on the investment team for a large asset manager where I primarily focused on emerging markets. In 2017 I decided to take the plunge and join the space full-time. I worked at a small blockchain-focused VC fund and then joined the Polkadot team just over a year ago. My role at Polkadot is mainly focused on ensuring there is a vibrant community of projects building on our technology.

Q: Adoption of Polkadot of the important factors that all projects need to focus on to become more attractive to the industry. So, what is Polkadot's plan to gain more Adoption? [sic]

A (Dieter): Polkadot is fundamentally a developer-focused product, so much of our adoption strategy is focused on making Polkadot an attractive product for developers. This has many elements. Right now the path for most developers to build on Polkadot is to create a blockchain using the Substrate framework, which they will later connect to Polkadot when parachains are enabled. This means that much of our adoption strategy comes down to making Substrate an attractive tool and framework. However, it's not enough just to make building on Substrate attractive; we must also provide an incentive for these developers to actually connect their Substrate-based chain to Polkadot. Part of this incentive is the security that the Polkadot relay chain provides, but another key incentive is becoming interoperable with a rich ecosystem of other projects that connect to Polkadot. This means that a key part of our adoption strategy is outreach focused. We go out there and try to convince the best projects in the space that building on our technology will provide them with significant value-add. This is not a purely technical argument. We provide significant support to projects building in our ecosystem through grants, technical support, incubator/accelerator programs, and other structured support programs such as the Substrate Builders Program (https://www.substrate.io/builders-program). I do think we really stand out in the significant, continued support that we provide to builders in our ecosystem. You can also take a look at the over 100 grants that we've given from the Web3 Foundation: https://medium.com/web3foundation/web3-foundation-grants-program-reaches-100-projects-milestone-8fd2a775fd6b

Q: On moving forward through your roadmap, what are your most important next priorities? Does the Polkadot team have enough fundamentals (Funds, Community, etc.) to achieve those milestones?

A (Dieter): I would say the top priority by far is to ensure a smooth roll-out of key Polkadot features such as parachains, XCMP and other key parts of the protocol. Our recent Proof of Authority network launch was only just the beginning, it’s crucial that we carefully and successfully deploy features that allow builders to build meaningful technology. Second to that, we want to promote adoption by making more teams aware of Polkadot and how they can leverage it to build their product. Part of this comes down to the outreach that I discussed before but a major part of it is much more community-driven and many members of the team focus on this.
We are also blessed to have an awesome community to make this process easier 🙂

Q: Where can a list of Polkadot's application-specific chains can be found?

A (Dieter): The best list right now is http://www.polkaproject.com/. This is a community-led effort and the team behind it has done a terrific job. We’re also working on providing our own resource for this and we’ll share that with the community when it’s ready.

Q: Could you explain the differences and similarities between Kusama and Polkadot?

A (Dieter): Kusama is fundamentally a less robust, faster-moving version of Polkadot with less economic backing by validators. It is less robust since we will be deploying new technology to Kusama before Polkadot so it may break more frequently. It has less economic backing than Polkadot, so a network takeover is easier on Kusama than on Polkadot, lending itself more to use cases without the need for bank-like security.
In exchange for lower security and robustness, we expect the cost of a parachain lease to be lower on Kusama than on Polkadot. Polkadot will always be 100% focused on security and robustness, and I expect that applications dealing with high-value transactions, such as those in the DeFi space, will always want a Polkadot deployment. However, I think there will be a market for applications willing to accept lower security and robustness in exchange for cheap, high throughput, such as those in the gaming, content distribution, or social networking sectors. Check out https://polkadot.network/kusama-polkadot-comparing-the-cousins/ for more detailed info!

Q: and for what reasons would a developer choose one over the other?

A (Dieter): Firstly, I see some earlier-stage teams who are still iterating on their technology choosing to deploy to Kusama exclusively because of its lower-stakes, faster-moving environment, where it will be easier for them to iterate and build their user base. These will likely encompass the sectors I identified earlier. For these teams, Polkadot becomes an eventual upgrade path if, and when, they are able to perfect their product, build a larger community of users, and start to need the increased stability and security that Polkadot will provide.
Secondly, I suspect many teams who have their main deployment on Polkadot will also have an additional deployment on Kusama to allow them to test new features, either their tech or changes to the network, before these are deployed to Polkadot mainnet.

Logan Saether, Technical Education, Web3 Foundation

Q: Sweet, let's move over to Logan. Logan - could you introduce yourself, your background, and your role within the Polkadot ecosystem?

A (Logan): My initial involvement in the industry was as a smart contract engineer. During this time I worked on a few projects, including a reboot of the Ethereum Alarm Clock project originally by Piper Merriam. However, I had some frustrations at the time with the limitations of the EVM environment and began to look at other tools which could help me build the projects that I envisioned. This led to me looking at Substrate and completing a bounty for Web3 Foundation, after which I applied and joined the Technical Education team. My responsibilities at the Technical Education team include maintaining the Polkadot Wiki as a source of truth on the Polkadot ecosystem, creating example applications, writing technical documentation, giving talks and workshops, as well as helping initiatives such as the Thousand Validator Programme.

Q: The first technical question submitted for you was: "When will an official Polkadot mobile wallet appear?"

A (Logan): There is already an “official” wallet from Parity Technologies called the Parity Signer. Parity Signer allows you to keep your private keys on an air-gapped mobile device and to interactively sign messages using web interfaces such as Polkadot JS Apps. If you’re looking for something that is more of an interface to the blockchain as well as a wallet, you might be interested in PolkaWallet which is a community team that is building a full mobile interface for Polkadot.
For more information on Parity Signer check out the website: https://www.parity.io/signe

Q: Great thanks...our next question is: If someone already developed an application to run on Ethereum, but wants the interoperability that Polkadot will offer, are there any advantages to rebuilding with Substrate to run as a parachain on the Polkadot network instead of just keeping it on Ethereum and using the Ethereum bridge for use with Polkadot?

A (Logan): Yes, the advantage you would get from building on Substrate is more control over how your application will interact with the greater Polkadot ecosystem, as well as a larger design canvas for future iterations of your application.
Using an Ethereum bridge will probably incur more cross-chain latency than using a Polkadot parachain directly, because Ethereum's consensus protocol is separate from Polkadot's. For parachains, messages can be sent for inclusion in the next block with guarantees that they will be delivered. On bridged chains, your application must go through more routes to execute on the desired destination: it must first route from your application on Ethereum to the Ethereum bridge parachain, and afterward dispatch the XCMP message from the Polkadot side of the bridge. In other words, an application on Ethereum would first need to cross the bridge and then send a message, while an application built as a parachain only needs to send the message, without routing across an external bridge.

Q: DOT transfers won't go live until Web3 removes the Sudo module and token holders approve the proposal to unlock them. But when will staking rewards start to be distributed? Will it have to after token transfers unlock? Or will accounts be able to accumulate rewards (still locked) once the network transitions to NPoS?

A (Logan): Staking rewards will be distributed starting with the transition to NPoS. Transfers will still be locked during the beginning of this phase, but reward payments are technically different from the normal transfer mechanism. You can read more about the launch process and steps at http://polkadot.network/launch-roadmap

Q: Next question is: I'm interested in how Cumulus/parachain development is going. ETA for when we will see the first parachain registered working on Kusama or some other public testnet like Westend maybe?

A (Logan): Parachains and Cumulus are a current high-priority development objective of the Parity team. PoC parachains have already been running with Cumulus on local testnets for months. The current work is making the availability and validity subprotocols production-ready in the Polkadot client. The best way to stay up to date is to follow the project boards on GitHub, which delineate all of the tasks to be done. Ideally, we can start seeing parachains on Westend soon, with the first real parachains deployed on Kusama thereafter.
The projects board can be viewed here: https://github.com/paritytech/polkadot/projects
Dan: Also...check out Basti's tweet from yesterday on the Cumulus topic: https://twitter.com/bkchstatus/1270479898696695808?s=20

Q: In what ways does Polkadot support smart contracts?

A (Logan): The philosophy behind the Polkadot Relay Chain is to be as minimal as possible, but allow arbitrary logic at the edges in the parachains. For this reason, Polkadot does not support smart contracts natively on the Relay Chain. However, it will support smart contracts on parachains. There are already a couple major initiatives out there. One initiative is to allow EVM contracts to be deployed on parachains, this includes the Substrate EVM module, Parity’s Frontier, and projects such as Moonbeam. Another initiative is to create a completely new smart contract stack that is native to Substrate. This includes the Substrate Contracts pallet, and the ink! DSL for writing smart contracts.
Learn more about Substrate's compatibility layer with Ethereum smart contracts here: https://github.com/paritytech/frontier

Will Pankiewicz, Master of Validators, Parity Technologies


Q: (Dan) Thanks for all the answers. Now we’ll start going through some staking questions with Will related to validating and nominating on Polkadot. Will - could you introduce yourself, your background, and your role within the Polkadot ecosystem?

A (Will): Sure thing. Like many others, Bitcoin drew me in back in 2013, but it wasn't until Ethereum came that I took the deep dive into working in the space full time. It was the financial infrastructure aspects of cryptocurrencies I was initially interested in, and first worked on dexes, algorithmic trading, and crypto funds. I really liked the idea of "Generalized Mining" that CoinFund came up with, and started to explore the whacky ways the crypto funds and others can both support ecosystems and be self-sustaining at the same time. This drew me to a lot of interesting experiments in what later became DeFi, as well as running validators on Proof of Stake networks. My role in the Polkadot ecosystem as “Master of Validators” is ensuring the needs of our validator community get met.

Q: Cool thanks. Our first community question was "Is it still more profitable to nominate the validators with lesser stake?"

A (Will): It depends on their commission, but generally yes, it is more profitable to nominate validators with less stake. When a validator has less total stake, your nomination makes up a higher percentage of that stake, so when rewards get distributed, the split is more favorable to you, as rewards are split by stake percentage. Our overall rewards scheme is that every era (6 hours on Kusama, 24 hours on Polkadot), a certain amount of rewards gets distributed, where that amount depends on the total amount of tokens staked across the network (50% of all tokens staked is currently optimal). The rewards from an era get distributed roughly equally to all validators active in the validator set. The reward given to each validator is then split between the validator and all its nominators, in proportion to the stake each entity contributes. So if you contribute a higher percentage of the total stake, you will earn more rewards.
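The stake-proportional split described above can be sketched as a quick calculation. This is a minimal illustration with made-up numbers (real payouts also factor in era points, which are not modeled here):

```python
# Hypothetical era payout split for one validator (illustrative numbers only).
# The validator takes its commission off the top; the remainder is split
# among the validator's own stake and its nominators by stake percentage.

def split_reward(validator_reward, commission, stakes):
    """stakes: dict of account -> bonded amount, including the validator's own."""
    commission_cut = validator_reward * commission
    remainder = validator_reward - commission_cut
    total = sum(stakes.values())
    payouts = {who: remainder * amt / total for who, amt in stakes.items()}
    return commission_cut, payouts

# Validator self-bonds 1000; two nominators bond 3000 and 1000; 10% commission.
cut, payouts = split_reward(100.0, 0.10, {"validator": 1000, "nom_a": 3000, "nom_b": 1000})
# The same nomination behind a less-staked validator would be a larger
# fraction of `total`, hence a larger payout.
```
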

Q: What does priority ranking under nominator addresses mean? For example, what does it mean that nominator A has priority 1 and nominator B has priority 6?

A (Will): Priority ranking is just the index of the nomination that gets stored on chain. It has no effect on how stake gets distributed in Phragmen or how rewards get calculated; it is only the order in which the nominator chose their validators. Stake from a nominator is distributed to validators via Phragmen, an algorithm that optimally puts stake behind validators so that the distribution is roughly equal across those that get into the validator set. It tries to maximize the total amount at stake in the network and to maximize the stake behind minimally staked validators.
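A toy version of the sequential Phragmen election can illustrate the idea. This is a simplified sketch under illustrative numbers; the production algorithm in Substrate adds stake redistribution and post-processing steps not shown here:

```python
# Simplified sequential Phragmen: elect `seats` candidates so that stake is
# spread toward minimally backed validators. Every voter starts with load 0;
# each round the candidate with the lowest score wins, and the loads of its
# supporters rise to that score, making them "more expensive" next round.

def seq_phragmen(voters, seats):
    """voters: list of (stake, [approved candidates]). Returns elected list."""
    candidates = {c for _, approvals in voters for c in approvals}
    load = [0.0] * len(voters)
    elected = []
    for _ in range(seats):
        best, best_score = None, None
        for c in candidates - set(elected):
            backing = [(i, stake) for i, (stake, appr) in enumerate(voters) if c in appr]
            total = sum(s for _, s in backing)
            if total == 0:
                continue
            score = (1 + sum(s * load[i] for i, s in backing)) / total
            if best_score is None or score < best_score:
                best, best_score = c, score
        if best is None:
            break
        elected.append(best)
        for i, (stake, appr) in enumerate(voters):
            if best in appr:
                load[i] = best_score
    return elected

# Three voters with hypothetical stakes and approval lists; elect 2 seats.
voters = [(100, ["v1", "v2"]), (100, ["v1"]), (50, ["v2", "v3"])]
```
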

Q: On Polkadot.js, what does it mean when there are nodes waiting on Polkadot?

A (Will): In Polkadot there is a fixed validator set size that is determined by governance. The way validators get into the active set is by having the highest amount of total stake relative to other validators. So if the validator set size is 100, the top 100 validators by total stake will be in the validator set. Those not active in the validator set will be considered “waiting”.
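The active/waiting split is just a top-N cut by total backing. A minimal sketch with made-up names and stake amounts:

```python
# Partition validator candidates into the active set (top N by total stake)
# and the waiting set (everyone else). Numbers are purely illustrative.

def select_active(candidates, set_size):
    """candidates: dict of name -> total backing stake."""
    ranked = sorted(candidates, key=candidates.get, reverse=True)
    return ranked[:set_size], ranked[set_size:]

backing = {"alice": 9000, "bob": 7500, "carol": 6200, "dave": 4100}
active, waiting = select_active(backing, 3)
```
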

Q: Another question...Is it necessary to become a waiting validator node right now?

A (Will): It's not necessary, but highly encouraged if you actively want to validate on Polkadot. The longer you are in the waiting tab, the longer you get exposure to nominators that may nominate you.

Q: Will current validators for Kusama also validate for Polkadot? How strongly should I consider their history (with Kusama) when looking to nominate a good validator for DOTs?

A (Will): A lot of Kusama validators will also be validators for Polkadot, as KSM was initially distributed to DOT holders. The early Kusama Validators will also likely be the first Polkadot validators. Being a Kusama validator should be a strong indicator for who to nominate on Polkadot, as the chaos that has ensued with Kusama has allowed validators to battle test their infrastructure. Kusama validators by now are very familiar with tooling, block explorers, terminology, common errors, log formats, upgrades, backups, and other aspects of node operation. This gives them an edge against Polkadot validators that may be new to the ecosystem. You should strongly consider well known Kusama validators when making your choices as a nominator on Polkadot.

Q: Can you go into more details about the process for becoming a DOT validator? Is it similar as the KSM 1000 validators program?

A (Will): The process for becoming a DOT validator is first to have DOTs. You cannot be a validator without DOTs, as DOTs are used to pay transaction fees, and the minimum amount of DOTs you need is enough to submit a validate transaction. After obtaining enough DOTs, you will need to set up your validator infrastructure. Ideally you should have a validator node with specs that match what we call standard hardware, as well as one or more sentry nodes to help isolate the validator node from attacks. After the infrastructure is up and running, you should have your Polkadot accounts set up correctly, with a stash account bonded to a controller account, and then submit a validate transaction, which tells the network your nodes are ready to be part of the network. You should then try to build a community around your validator to let others know you are trustworthy so that they will nominate you. The 1000 validators programme for Kusama is a programme that gives a certain amount of nominations from the Web3 Foundation and Parity to help validators bootstrap a community and reputation. There may eventually be a similar programme for Polkadot as well.
Dan: Thanks a lot for all the answers, Will. That’s the end of the pre-submitted questions and now we’ll open the chat up to live Q&A, and our three team members will get through as many of your questions as possible.
We will take questions related to business development, technology, validating, and staking. For those wondering about DOT:
DOT tokens do not exist yet. Allocations of Polkadot's native DOT token are technically and legally non-transferable. Hence any publicized sale of DOTs is unsanctioned by Web3 Foundation and possibly fraudulent. Any official public sale of DOTs will be announced on the Web3 Foundation website. Polkadot's launch process started in May, and as the network reaches full decentralization later this year, holders of DOT allocations will determine issuance and transferability. For those who participated in previous DOT sales, you can learn how to claim your DOTs here (https://wiki.polkadot.network/docs/en/claims).


Telegram Community Follow-up Questions Addressed Below


Q: Polkadot looks good, but it confuses me that there are so many other blockchain projects. What should I pay attention to in Polkadot to appreciate its importance? What are you planning to achieve with your project?

A (Will): Personally, what I think differentiates it is the governance process. Forkless upgrades and on-chain social coordination help set it apart.
A (Dieter): The wiki is awesome - https://wiki.polkadot.network/

Q: Over 10,000 ETH was paid as a transaction fee. What if this happens on Polkadot? Is it possible to go through governance to return it to the owner?

A: Anything is possible with governance including transaction reversals, if a network quorum is reached on a topic.
A (Logan): Polkadot transaction fees work differently than the fees on Ethereum so it's a bit more difficult to shoot yourself in the foot as the whale who sent this unfortunate transaction. See here for details on fees: https://w3f-research.readthedocs.io/en/latest/polkadot/Token%20Economics.html?highlight=transaction%20fees#relay-chain-transaction-fees-and-per-block-transaction-limits
However, there is a tip that users can set themselves, which they could accidentally set to a large amount. In that case, yes, they could petition governance to refund the amount that was paid in the tip.
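The fee structure Logan describes can be sketched roughly as follows. All constants and parameter names here are illustrative, not the chain's actual runtime values:

```python
# Rough shape of a Polkadot-style transaction fee: a base fee, plus a fee per
# byte of transaction length, plus a weight-based fee, plus an optional
# user-supplied tip. Constants below are made up for illustration.

def transaction_fee(tx_bytes, weight, tip=0,
                    base_fee=1_000_000, per_byte=100, per_weight=0.5):
    length_fee = tx_bytes * per_byte
    weight_fee = int(weight * per_weight)
    return base_fee + length_fee + weight_fee + tip

# Unlike Ethereum's gas * gas_price, the fee does not scale with a
# user-chosen price, so only the tip can be accidentally oversized.
fee = transaction_fee(tx_bytes=200, weight=150_000, tip=0)
```
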

Q: What is the minimum ideal amount of DOT and KSM to have if you want to become a validator and how much technical knowledge do you need aside from following the docs?

A (Will): It depends on what the other validators in the ecosystem are staking, as well as the validator set size. You just need to be among the top validators by staked amount within the validator set size. So if the set size is 100 validators, you need to be in the top 100 validators by stake.

Q: Will Web3 nominate validators? If yes, which criteria to be elected?

A (Will): Web3 Foundation is running programs like the 1000 validators programme for Kusama. There's a possibility this will continue for Polkadot as well after transfers are enabled. https://thousand-validators.kusama.network/#/
You will need to be an active validator to earn rewards. Only those active in the validator set earn rewards. I would recommend checking out parts of the wiki: https://wiki.polkadot.network/docs/en/maintain-guides-validator-payout

Q: Is it possible to implement hash tables or DAGs with Substrate?

A (Logan): Yes.

Q: Polkadot project looks very futuristic! But, could you tell us the main role of DOT Tokens in the Polkadot Ecosystem?

A (Dan): That's a good question. The short answer is Staking, Governance, Bonding. More here: http://polkadot.network/dot-token

Q: How did you manage to prove that the consensus protocol is safe and unbreakable mathematically?

A (Dieter): We have a research team of over a dozen scientists with PhDs and post-docs in cryptography and distributed computing who perform thorough theoretical analyses of all the protocols used in Polkadot.

Q: What are the prospects for NFT?

A: Already being built 🙂

Q: What will be Polkadot next roadmap for 2020 ?

A (Dieter): Building. But seriously - we will continue to add many more features and upgrades to Polkadot as well as continue to strongly focus on adoption from other builders in the ecosystem 🙂
A (Will): https://polkadot.network/launch-roadmap/
This is the launch roadmap. Ideally parachains and XCMP will be added towards the end of the year.

Q: How Do you stay active in terms of marketing developments during this PANDEMIC? Because I'm sure you're very excited to promote more after this settles down.

A (Dan): The main impact of covid was the impact on in-person events. We have been very active on Crowdcast for webinars since 2019, so it was quite the smooth transition to all-online events. You can see our 40+ past event recordings and follow us on Crowdcast here: https://www.crowdcast.io/polkadot. If you're interested in following our emails for updates (including online events), subscribe here: https://info.polkadot.network/subscribe

Q: Hi, who do you think is your biggest competitor in the space?

A (Dan): Polkadot is a metaprotocol that hasn't been seen in the industry up until this point. We hope to elevate the industry by providing interoperability between all major public networks as well as private blockchains.

Q: Is Polkadot a friend or competitor of Ethereum?

A: Polkadot aims to elevate the whole blockchain space with serious advancements in interoperability, governance and beyond :)

Q: When will there be hardware wallet support?

A (Will): Parity Signer works well for now. Other hardware wallets will be added pretty soon

Q: What are the attractive feature of DOT project that can attract any new users ?

A: https://polkadot.network/what-is-polkadot-a-brief-introduction/
A (Will): Building parachains with cross-chain messaging, plus bridges to other chains, will I think be a very appealing feature for developers

Q: According to you how much time will it take for Polkadot to get into mainstream adoption and execute all the plans set for this project?

A: We are solving many problems that have held back the blockchain industry up until now. Here is a summary in basic terms:
https://preview.redd.it/ls7i0bpm8p951.png?width=752&format=png&auto=webp&s=a8eb7bf26eac964f6b9056aa91924685ff359536

Q: When will bitpie or imtoken support DOT?

A: We are working on integrations on all the biggest and best wallet providers. ;)

Q: What event/call can we track to catch a switch to nPOS? Is it only force_new_era call? Thanks.

A (Will): If you're on riot, useful channels to follow for updates like this are #polkabot:matrix.org and #polkadot-announcements:matrix.parity.io
A (Logan): Yes this is the trigger for initiating the switch to NPoS. You can also poll the ForceEra storage for when it changes to ForceNew.

Q: What strategy will the Polkadot Team use to make new users trust its platform and be part of it?

A (Will): Pushing bleeding-edge cryptography from Web3 Foundation research
A (Dan): https://t.me/PolkadotOfficial/43378

Q: What technology stands behind and What are its advantages?

A (Dieter): Check out https://polkadot.network/technology/ for more info on our tech stack!

Q: What problems do you see occurring in the blockchain industry nowadays, and how does your project aim to solve these problems?

A (Will): Governance I see as a huge problem. For example upgrading Bitcoin and making decisions for changing things is a very challenging process. We have robust systems of on-chain governance to help solve these coordination problems

Q: How involved are the Polkadot partners? Are they helping with the development?

A (Dieter): There are a variety of groups building in the Polkadot ecosystem. Check out http://www.polkaproject.com/ for a great list.

Q: Can you explain the role of the treasury in Polkadot?

A (Will): The treasury is for projects or people that want to build things, but don't want to go through the formal legal process of raising funds from VCs or grants or what have you. You can get paid by the community to build projects for the community.
A: There’s a whole section on the wiki about the treasury and how it functions here https://wiki.polkadot.network/docs/en/mirror-learn-treasury#docsNav

Q: Any plan to introduce Polkadot on Asia, or rising market on Asia?

A (Will): We're globally focused

Q: What kind of impact do you expect from the Council? Although it would be elected by token holders, what kind of people you wish to see there?

A (Will): Community focused individuals like u/jam10o that want to see cool things get built and cool communities form

If you have further questions, please ask in the official Polkadot Telegram channel.
submitted by dzr9127 to dot
