Tag Archives: blockchain

Molly White Reviews Blockchain Book

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2024/02/molly-white-reviews-blockchain-book.html

Molly White—of “Web3 is Going Just Great” fame—reviews Chris Dixon’s blockchain solutions book: Read Write Own:

In fact, throughout the entire book, Dixon fails to identify a single blockchain project that has successfully provided a non-speculative service at any kind of scale. The closest he ever comes is when he speaks of how “for decades, technologists have dreamed of building a grassroots internet access provider”. He describes one project that “got further than anyone else”: Helium. He’s right, as long as you ignore the fact that Helium was providing LoRaWAN, not Internet, that by the time he was writing his book Helium hotspots had long since passed the phase where they might generate even enough tokens for their operators to merely break even, and that the network was pulling in somewhere around $1,150 in usage fees a month despite the company being valued at $1.2 billion. Oh, and that the company had widely lied to the public about its supposed big-name clients, and that its executives have been accused of hoarding the project’s token to enrich themselves. But hey, a16z sunk millions into Helium (a fact Dixon never mentions), so might as well try to drum up some new interest!

Security Vulnerability of Switzerland’s E-Voting System

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2023/10/security-vulnerability-of-switzerlands-e-voting-system.html

Online voting is insecure, period. This doesn’t stop organizations and governments from using it. (And for low-stakes elections, it’s probably fine.) Switzerland—not low stakes—uses online voting for national elections. Andrew Appel explains why it’s a bad idea:

Last year, I published a 5-part series about Switzerland’s e-voting system. Like any internet voting system, it has inherent security vulnerabilities: if there are malicious insiders, they can corrupt the vote count; and if thousands of voters’ computers are hacked by malware, the malware can change votes as they are transmitted. Switzerland “solves” the problem of malicious insiders in their printing office by officially declaring that they won’t consider that threat model in their cybersecurity assessment.

But it also has an interesting new vulnerability:

The Swiss Post e-voting system aims to protect your vote against vote manipulation and interference. The goal is to achieve this even if your own computer is infected by undetected malware that manipulates a user vote. This protection is implemented by special return codes (Prüfcode), printed on the sheet of paper you receive by physical mail. Your computer doesn’t know these codes, so even if it’s infected by malware, it can’t successfully cheat you as long as you follow the protocol.

Unfortunately, the protocol isn’t explained to you on the piece of paper you get by mail. It’s only explained to you online, when you visit the e-voting website. And of course, that’s part of the problem! If your computer is infected by malware, then it can already present to you a bogus website that instructs you to follow a different protocol, one that is cheatable. To demonstrate this, I built a proof-of-concept demonstration.

Appel again:

Kuster’s fake protocol is not exactly what I imagined; it’s better. He explains it all in his blog post. Basically, in his malware-manipulated website, instead of displaying the verification codes for the voter to compare with what’s on the paper, the website asks the voter to enter the verification codes into a web form. Since the website doesn’t know what’s on the paper, that web-form entry is just for show. Of course, Kuster did not employ a botnet virus to distribute his malware to real voters! He keeps it contained on his own system and demonstrates it in a video.
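To make the distinction concrete, here is a toy model of the two flows. This is not Swiss Post’s actual cryptographic protocol; the codes, names, and logic are illustrative assumptions meant only to show why the direction of the comparison matters.

```python
# Toy model: honest return-code display vs. the cheatable "type it in" variant.
PAPER_CODES = {"candidate_A": "7391", "candidate_B": "2648"}  # mailed on paper

def honest_flow(recorded_vote: str) -> str:
    """Real protocol shape: the server derives a return code from the vote
    it actually recorded and DISPLAYS it. The voter compares it against the
    paper; a mismatch exposes a manipulated vote."""
    return PAPER_CODES[recorded_vote]

def malicious_flow(intended_vote: str, code_typed_by_voter: str) -> str:
    """Kuster's variant: the fake site records a different vote, then asks
    the voter to TYPE the paper code into a form. Nothing is displayed, so
    there is nothing to compare, and the typed code is simply discarded."""
    recorded_vote = "candidate_B"   # not what the voter intended
    _ = code_typed_by_voter         # just for show
    return recorded_vote

# Honest flow: a flipped vote would display "2648", which the voter
# would see does not match the paper code next to candidate_A.
print(honest_flow("candidate_A"))
# Malicious flow: the vote is silently flipped and the voter notices nothing.
print(malicious_flow("candidate_A", "7391"))
```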

Again, the solution is paper. (Here I am saying that in 2004.) And, no, blockchain does not help—it makes security worse.

Friday Squid Blogging: Squid Is a Blockchain Thingy

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2023/02/friday-squid-blogging-squid-is-a-blockchain-thingy.html

I had no idea—until I read this incredibly jargon-filled article:

Squid is a cross-chain liquidity and messaging router that swaps across multiple chains and their native DEXs via axlUSDC.

So there.

As usual, you can also use this squid post to talk about the security stories in the news that I haven’t covered.

Read my blog posting guidelines here.

Decarbonizing Cryptocurrencies through Taxation

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2023/01/decarbonizing-cryptocurrencies-through-taxation.html

Maintaining bitcoin and other cryptocurrencies causes about 0.3 percent of global CO2 emissions. That may not sound like a lot, but it’s more than the emissions of Switzerland, Croatia, and Norway combined. As many cryptocurrencies crash and the FTX bankruptcy moves into the litigation stage, regulators are likely to scrutinize the cryptocurrency world more than ever before. This presents a perfect opportunity to curb their environmental damage.

The good news is that cryptocurrencies don’t have to be carbon intensive. In fact, some have near-zero emissions. To encourage polluting currencies to reduce their carbon footprint, we need to force buyers to pay for their environmental harms through taxes.

The difference in emissions among cryptocurrencies comes down to how they create new coins. Bitcoin and other high emitters use a system called “proof of work”: to generate coins, participants, or “miners,” have to solve math problems that demand extraordinary computing power. This allows currencies to maintain their decentralized ledger—the blockchain—but requires enormous amounts of energy.

Greener alternatives exist. Most notably, the “proof of stake” system enables participants to maintain their blockchain by depositing cryptocurrency holdings in a pool. When the second-largest cryptocurrency, Ethereum, switched from proof of work to proof of stake earlier this year, its energy consumption dropped by more than 99.9% overnight.

Bitcoin and other cryptocurrencies probably won’t follow suit unless forced to, because proof of work offers massive profits to miners—and they’re the ones with power in the system. Multiple legislative levers could be used to entice them to change.

The most blunt solution is to ban cryptocurrency mining altogether. China did this in 2018, but it only made the problem worse; mining moved to other countries with even less efficient energy generation, and emissions went up. The only way for a mining ban to meaningfully reduce carbon emissions is to enact it across most of the globe. Achieving that level of international consensus is, to say the least, unlikely.

A second solution is to prohibit the buying and selling of proof-of-work currencies. The European Parliament’s Committee on Economic and Monetary Affairs considered making such a proposal, but voted against it in March. This is understandable; as with a mining ban, it would be both viewed as paternalistic and difficult to implement politically.

Employing a tax instead of an outright ban would largely skirt these issues. As with taxes on gasoline, tobacco, plastics, and alcohol, a cryptocurrency tax could reduce real-world harm by making consumers pay for it.

Most ways of taxing cryptocurrencies would be inefficient, because they’re easy to circumvent and hard to enforce. To avoid these pitfalls, the tax should be levied as a fixed percentage of each proof-of-work-cryptocurrency purchase. Cryptocurrency exchanges should collect the tax, just as merchants collect sales taxes from customers before passing the sum on to governments. To make it harder to evade, the tax should apply regardless of how the proof-of-work currency is being exchanged—whether for a fiat currency or another cryptocurrency. Most important, any state that implements the tax should target all purchases by citizens in its jurisdiction, even if they buy through exchanges with no legal presence in the country.
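To make the mechanics concrete, here is a minimal sketch of how an exchange might apply such a tax at settlement time. The 5 percent rate, the classification of currencies, and the function shape are illustrative assumptions, not part of the proposal itself.

```python
# Hypothetical sketch of exchange-side tax collection at purchase time.
PROOF_OF_WORK = {"BTC", "LTC", "XMR"}   # assumed classification
POW_TAX_RATE = 0.05                     # assumed 5% rate

def settle_purchase(symbol: str, amount_fiat: float) -> dict:
    """Return what the exchange collects for one purchase. The tax applies
    regardless of whether the buyer pays in fiat or in another
    cryptocurrency; here everything is expressed in fiat terms."""
    tax = amount_fiat * POW_TAX_RATE if symbol in PROOF_OF_WORK else 0.0
    return {
        "symbol": symbol,
        "buyer_pays": amount_fiat + tax,
        "tax_remitted_to_government": tax,
    }

print(settle_purchase("BTC", 1_000.0))  # buyer pays 1050.0; 50.0 remitted
print(settle_purchase("ETH", 1_000.0))  # proof of stake: no tax
```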

This sort of tax would be transparent and easy to enforce. Because most people buy cryptocurrencies from one of only a few large exchanges—such as Binance, Coinbase, and Kraken—auditing them should be cheap enough that it pays for itself. If an exchange fails to comply, it should be banned.

Even a small tax on proof-of-work currencies would reduce their damage to the planet. Imagine that you’re new to cryptocurrency and want to become a first-time investor. You’re presented with a range of currencies to choose from: bitcoin, ether, litecoin, monero, and others. You notice that all of them except ether add an environmental tax to your purchase price. Which one do you buy?

Countries don’t need to coordinate across borders for a proof-of-work tax on their own citizens to be effective. But early adopters should still consider ways to encourage others to come on board. This has precedent. The European Union is trying to influence global policy with its carbon border adjustments, which are designed to discourage people from buying carbon-intensive products abroad in order to skirt taxes. Similar rules for a proof-of-work tax could persuade other countries to adopt one.

Of course, some people will try to evade the tax, just as people evade every other tax. For example, people might buy tax-free coins on centralized exchanges and then swap them for polluting coins on decentralized exchanges. To some extent, this is inevitable; no tax is perfect. But the effort and technical know-how needed to evade a proof-of-work tax will be a major deterrent.

Even if only a few countries implement this tax—and even if some people evade it—the desirability of bitcoin will fall globally, and the environmental benefit will be significant. A high enough tax could also cause a self-reinforcing cycle that will drive down these cryptocurrencies’ prices. Because the value of many cryptocurrencies relies largely on speculation, they are dependent on future buyers. When speculators are deterred by the tax, the lack of demand will cause the price of bitcoin to fall, which could prompt more current holders to sell—further lowering prices and accelerating the effect. Declining prices will pressure the bitcoin community to abandon proof of work altogether.

Taxing proof-of-work exchanges might hurt them in the short run, but it would not hinder blockchain innovation. Instead, it would redirect innovation toward greener cryptocurrencies. This is no different than how government incentives for electric vehicles encourage carmakers to improve green alternatives to the internal combustion engine. These incentives don’t restrict innovation in automobiles—they promote it.

Taxing environmentally harmful cryptocurrencies can gain support across the political spectrum, from people with varied interests. It would benefit blockchain innovators and cryptocurrency researchers by shifting focus from environmental harm to beneficial uses of the technology. It has the potential to make our planet significantly greener. It would increase government revenues.

Even bitcoin maximalists have reason to embrace the proposal: it would offer the bitcoin community a chance to prove it can survive and grow sustainably.

This essay was written with Christos Porios, and previously appeared in the Atlantic.

How a blockchain startup built a prototype solution to solve the need of analytics for decentralized applications with AWS Data Lab

Post Syndicated from Dr. Quan Hoang Nguyen original https://aws.amazon.com/blogs/big-data/how-a-blockchain-startup-built-a-prototype-solution-to-solve-the-need-of-analytics-for-decentralized-applications-with-aws-data-lab/

This post is co-written with Dr. Quan Hoang Nguyen, CTO at Fantom Foundation.

Here at Fantom Foundation (Fantom), we have developed a high-performance, highly scalable, and secure smart contract platform, designed to overcome the limitations of the previous generation of blockchain platforms. The Fantom platform is permissionless, decentralized, and open source. However, the majority of decentralized applications (dApps) hosted on the Fantom platform lack an analytics page that provides information to users. Therefore, we wanted to build a data platform with a public web interface where users can search for a smart contract address and view key metrics for that contract. Such an analytics platform gives users insights and trends for applications deployed on the platform, while developers can continue to focus on improving their dApps.

AWS Data Lab offers accelerated, joint-engineering engagements between customers and AWS technical resources to create tangible deliverables that accelerate data and analytics modernization initiatives. Data Lab has three offerings: the Build Lab, the Design Lab, and a Resident Architect. The Build Lab is a 2–5 day intensive build with a technical customer team. The Design Lab is a half-day to 2-day engagement for customers who need a real-world architecture recommendation based on AWS expertise, but aren’t yet ready to build. Both engagements are hosted either online or at an in-person AWS Data Lab hub. The Resident Architect provides AWS customers with technical and strategic guidance in refining, implementing, and accelerating their data strategy and solutions over a 6-month engagement.

In this post, we share the experience of our engagement with AWS Data Lab to accelerate the initiative of developing a data pipeline from an idea to a solution. Over 4 weeks, we conducted technical design sessions, reviewed architecture options, and built the proof of concept data pipeline.

Use case review

The process started with us engaging our AWS Account team to submit a nomination for the Data Lab. This was followed by a call with the AWS Data Lab team to assess the suitability of our requirements against the program. After the Build Lab was scheduled, an AWS Data Lab Architect conducted a series of pre-lab calls with us to finalize the scope, architecture, goals, and success criteria for the lab. The scope was to design a data pipeline that would ingest and store historical and real-time on-chain transactions data and generate key metrics. Once ingested, data should be transformed, stored, and exposed via REST-based APIs, then consumed by a web UI to display key metrics. For this Build Lab, we chose to ingest data for Spooky, a decentralized exchange (DEX) deployed on the Fantom platform that had the largest Total Value Locked (TVL) at the time. Key metrics such as the number of wallets that have interacted with the dApp over time, the number of tokens and their value exchanged through the dApp over time, and the number of transactions for the dApp over time were selected to visualize through a web-based UI.

We explored several architecture options and picked one for the lab that aligned closely with our end goal. The total historical data for the selected smart contract was approximately 1 GB since the dApp’s deployment on the Fantom platform. We used FTMScan, which allows us to explore and search the Fantom platform for transactions, to estimate the rate of transfer transactions at approximately three to four per minute. This allowed us to design a lab architecture that could handle this ingestion rate. We agreed to use an existing application, known as the data producer, that was developed internally by the Fantom team to ingest on-chain transactions in real time. On checking the transactions’ payload size, we found it did not exceed 100 KB per transaction, which gave us an estimate of the number of files that would be created once ingested through the data producer application. We decided to ingest the past 45 days of historic transactions to populate the platform with enough data to visualize key metrics, using the backdating feature that already exists within the data producer application. The Data Lab Architect also advised us to consider using AWS Database Migration Service (AWS DMS) to ingest historic transactions data after the lab. As a last step, we decided to build a React-based webpage with Material-UI that lets users enter a smart contract address and choose a time interval; the app then fetches the necessary data to show the metric values.

Solution overview

We collectively agreed to incorporate the following design principles for the data lab architecture:

  • Simplified data pipelines
  • Decentralized data architecture
  • Minimize latency as much as possible

The following diagram illustrates the architecture that we built in the lab.

We collectively defined the following success criteria for the Build Lab:

  • End-to-end data streaming pipeline to ingest on-chain transactions
  • Historical data ingestion of the selected smart contract
  • Data storage and processing of on-chain transactions
  • REST-based APIs to provide time-based metrics for the three defined use cases
  • A sample web UI to display aggregated metrics for the smart contract

Prior to the Build Lab

As a prerequisite for the lab, we configured the data producer application to use the AWS Software Development Kit (AWS SDK) and the PutRecords API operation to send transactions data to an Amazon Simple Storage Service (Amazon S3) bucket. For the Build Lab, we built additional logic within the application to ingest historic transactions data together with real-time transactions data. As a last step, we verified that transactions data was captured and ingested into a test S3 bucket.
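As an illustration, a producer could batch transactions into the Kinesis data stream described in the next section with a PutRecords call along these lines. The stream name, payload shape, and partition-key choice are assumptions for the sketch, not the Fantom team’s actual code.

```python
# Minimal sketch of batching transactions into a Kinesis data stream.
import json
import boto3

kinesis = boto3.client("kinesis")

def send_transactions(transactions: list[dict], stream: str = "onchain-tx") -> None:
    records = [
        {
            "Data": json.dumps(tx).encode("utf-8"),
            # Partitioning by contract address keeps one contract's
            # events ordered within a shard (assumed key name).
            "PartitionKey": tx["address"],
        }
        for tx in transactions
    ]
    resp = kinesis.put_records(StreamName=stream, Records=records)
    if resp["FailedRecordCount"]:
        # Retry logic for throttled records would go here.
        print(f"{resp['FailedRecordCount']} records failed")
```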

AWS services used in the lab

We used the following AWS services as part of the lab:

  • AWS Identity and Access Management (IAM) – We created multiple IAM roles with appropriate trust relationships and necessary permissions that can be used by multiple services to read and write on-chain transactions data and generated logs.
  • Amazon S3 – We created an S3 bucket to store the incoming transactions data as JSON-based files. We created a separate S3 bucket to store incoming transaction data that failed to be transformed and will be reprocessed later.
  • Amazon Kinesis Data Streams – We created a new Kinesis data stream in on-demand mode, which automatically scales based on data ingestion patterns and provides hands-free capacity management. This stream was used by the data producer application to ingest historical and real-time on-chain transactions. We discussed the need to manage and predict cost, and were advised to switch to provisioned mode once reliable estimates of our throughput requirements were available, and to continue using on-demand mode while data traffic patterns remained unpredictable.
  • Amazon Kinesis Data Firehose – We created a Firehose delivery stream to transform the incoming data and write it to the S3 bucket. To minimize latency, we set the delivery stream buffer size to 1 MiB and the buffer interval to 60 seconds, which ensures a file is written to the S3 bucket as soon as either of the two conditions is satisfied. Transactions data written to the S3 bucket was in JSON Lines format.
  • Amazon Simple Queue Service (Amazon SQS) – We set up an SQS queue of the type Standard and an access policy for that SQS queue to allow incoming messages generated from S3 bucket event notifications.
  • Amazon DynamoDB – To pick a data store for on-chain transactions, we needed a managed service that could store transaction payloads of unstructured data with varying schemas and provide the ability to cache query results. We picked DynamoDB for those reasons. We created a single DynamoDB table to hold the incoming transactions data. After analyzing the access query patterns, we decided to use the address field of the smart contract as the partition key and the timestamp field as the sort key. The table was created with auto scaling of read and write capacity because actual usage would be hard to predict at that time.
  • AWS Lambda – We created the following functions (a minimal sketch of the first two handlers follows this list):
    • A Python-based Lambda function to transform the incoming data from the data producer application: flatten the JSON structure, convert the Unix epoch timestamp to a date/time value, and convert hex-based string values to decimal values representing the number of tokens.
    • A second Lambda function to parse incoming SQS queue messages. Each message contains values for bucket_name and object_key, which hold the reference to a newly created object in the S3 bucket. The function parses these values to locate the S3 object, reads its contents into a Pandas data frame using the AWS SDK for pandas (awswrangler) library, and uses the put_df API call to write the data frame as items into a DynamoDB table. We chose Pandas because of our familiarity with the library and with the functions required to perform data transform operations.
    • Three separate Lambda functions that contain the logic to query the DynamoDB table and retrieve items to aggregate and calculate metric values. Each calculated metric value was formatted as an HTTP response and exposed as a REST-based API.
  • Amazon API Gateway – We created a REST-based API endpoint that uses Lambda proxy integration to pass a smart contract address and a time interval in minutes as query string parameters to the backend Lambda function. The response from the Lambda function was a metric value. We also enabled cross-origin resource sharing (CORS) support within API Gateway so that the web UI, which resides in a different domain, can query the API successfully.
  • Amazon CloudWatch – We used Lambda’s built-in mechanism to send function metrics to CloudWatch. Lambda functions come with a CloudWatch Logs log group and a log stream for each instance of your function. The Lambda runtime environment sends details of each invocation to the log stream, and relays logs and other output from your function’s code.
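To make the two ingestion handlers concrete, here is a minimal Python sketch of what they could look like. The field names (timestamp, value), the flattening rule, and the onchain-transactions table name are assumptions for illustration; wr.s3.read_json and wr.dynamodb.put_df are the awswrangler calls described above, and production code would also need retries and handling for DynamoDB’s numeric type restrictions.

```python
import base64
import json
from datetime import datetime, timezone

import awswrangler as wr  # AWS SDK for pandas

def flatten(d: dict, parent: str = "") -> dict:
    """Flatten nested JSON into a single level of keys."""
    out = {}
    for k, v in d.items():
        key = f"{parent}{k}"
        if isinstance(v, dict):
            out.update(flatten(v, key + "_"))
        else:
            out[key] = v
    return out

def transform_handler(event, context):
    """Firehose transformation: flatten, convert epoch and hex fields."""
    output = []
    for rec in event["records"]:
        tx = flatten(json.loads(base64.b64decode(rec["data"])))
        tx["timestamp"] = datetime.fromtimestamp(
            int(tx["timestamp"]), tz=timezone.utc
        ).isoformat()
        tx["value"] = int(tx["value"], 16) / 10**18  # hex string -> token count
        output.append({
            "recordId": rec["recordId"],
            "result": "Ok",
            # newline-delimited, so S3 objects end up in JSON Lines format
            "data": base64.b64encode((json.dumps(tx) + "\n").encode()).decode(),
        })
    return {"records": output}

def load_handler(event, context):
    """SQS-triggered loader: read the new S3 object into a data frame and
    write its rows to the DynamoDB table."""
    for msg in event["Records"]:
        s3_event = json.loads(msg["body"])
        for rec in s3_event["Records"]:
            bucket = rec["s3"]["bucket"]["name"]
            key = rec["s3"]["object"]["key"]
            df = wr.s3.read_json(f"s3://{bucket}/{key}", lines=True)
            wr.dynamodb.put_df(df=df, table_name="onchain-transactions")
```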

Iterative development approach

Across the 4 days of the Build Lab, we undertook iterative development: we started by developing the foundational layer and iteratively added features through testing and data validation. This allowed us to build confidence in the solution as we tested the metrics output through a web-based UI and verified it against the actual data. As errors were discovered, we deleted the entire dataset and reran all the jobs to verify the results and resolve the errors.

Lab outcomes

In 4 days, we built an end-to-end streaming pipeline ingesting 45 days of historical data and real-time on-chain transactions data for the selected Spooky smart contract. We also developed three REST-based APIs for the selected metrics and a sample web UI that lets users enter a smart contract address, choose a time frequency, and visualize the metric values. In a follow-up call, our AWS Data Lab Architect shared post-lab guidance on the next steps required to productionize the solution:

  • Scaling of the proof of concept to handle larger data volumes
  • Security best practices to protect the data while at rest and in transit
  • Best practices for data modeling and storage
  • Building an automated resilience technique to handle failed processing of the transactions data
  • Incorporating high availability and disaster recovery solutions to handle incoming data requests, including adding a caching layer

Conclusion

Through a short engagement and a small team, we accelerated this project from an idea to a solution. This experience gave us the opportunity to explore AWS services and their analytical capabilities in depth. As a next step, we will continue to work with AWS teams to enhance the solution built during this lab and make it ready for production deployment.

Learn more about how the AWS Data Lab can help your data and analytics journey on the cloud.


About the Authors

Dr. Quan Hoang Nguyen is currently a CTO at Fantom Foundation. His interests include DLT, blockchain technologies, visual analytics, compiler optimization, and transactional memory. He has experience in R&D at the University of Sydney, IBM, Capital Markets CRC, Smarts – NASDAQ, and National ICT Australia (NICTA).

Ankit Patira is a Data Lab Architect at AWS based in Melbourne, Australia.

Responsible Disclosure for Cryptocurrency Security

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2022/09/responsible-disclosure-for-cryptocurrency-security.html

Stewart Baker discusses why the industry-norm responsible disclosure for software vulnerabilities fails for cryptocurrency software.

Why can’t the cryptocurrency industry solve the problem the way the software and hardware industries do, by patching and updating security as flaws are found? Two reasons: First, many customers don’t have an ongoing relationship with the hardware and software providers that protect their funds—nor do they have an incentive to update security on a regular basis. Turning to a new security provider or using updated software creates risks; leaving everything the way it was feels safer. So users won’t be rushing to pay for and install new security patches.

Second, cryptocurrency is famously and deliberately decentralized, anonymized, and low friction. That means that the company responsible for hardware or software security may have no way to identify who used its product, or to get the patch to those users. It also means that many wallets with security flaws will be publicly accessible, protected only by an elaborate password. Once word of the flaw leaks, the password can be reverse engineered by anyone, and the legitimate owners are likely to find themselves in a race to move their assets before the thieves do. Even in the software industry, hackers routinely reverse engineer Microsoft’s patches to find the security flaws they fix and then try to exploit them before the patches have been fully installed.

He doesn’t have any good ideas to fix this. I don’t either. Just add it to the pile of blockchain’s many problems.

On the Dangers of Cryptocurrencies and the Uselessness of Blockchain

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2022/06/on-the-dangers-of-cryptocurrencies-and-the-uselessness-of-blockchain.html

Earlier this month, I and others wrote a letter to Congress, basically saying that cryptocurrencies are a complete and total disaster, and urging them to regulate the space. Nothing in that letter is out of the ordinary, and it is in line with what I wrote about blockchain in 2019. In response, Matthew Green has written—not really a rebuttal—but “a general response to some of the more common spurious objections…people make to public blockchain systems.” In it, he makes several broad points:

  1. Yes, current proof-of-work blockchains like bitcoin are terrible for the environment. But there are other modes like proof-of-stake that are not.
  2. Yes, a blockchain is an immutable ledger making it impossible to undo specific transactions. But that doesn’t mean there can’t be some governance system on top of the blockchain that enables reversals.
  3. Yes, bitcoin doesn’t scale and the fees are too high. But that’s nothing inherent in blockchain technology—that’s just a bunch of bad design choices bitcoin made.
  4. Blockchain systems can have a little or a lot of privacy, depending on how they are designed and implemented.

There’s nothing on that list that I disagree with. (We can argue about whether proof-of-stake is actually an improvement. I am skeptical of systems that enshrine a “they who have the gold make the rules” system of governance. And to the extent any of those scaling solutions work, they undo the decentralization blockchain claims to have.) But I also think that these defenses largely miss the point. To me, the problem isn’t that blockchain systems can be made slightly less awful than they are today. The problem is that they don’t do anything their proponents claim they do. In some very important ways, they’re not secure. They don’t replace trust with code; in fact, in many ways they are far less trustworthy than non-blockchain systems. They’re not decentralized, and their inevitable centralization is harmful because it’s largely emergent and ill-defined. They still have trusted intermediaries, often with more power and less oversight than non-blockchain systems. They still require governance. They still require regulation. (These things are what I wrote about here.) The problem with blockchain is that it’s not an improvement to any system—and often makes things worse.

In our letter, we write: “By its very design, blockchain technology is poorly suited for just about every purpose currently touted as a present or potential source of public benefit. From its inception, this technology has been a solution in search of a problem and has now latched onto concepts such as financial inclusion and data transparency to justify its existence, despite far better solutions to these issues already in use. Despite more than thirteen years of development, it has severe limitations and design flaws that preclude almost all applications that deal with public customer data and regulated financial transactions and are not an improvement on existing non-blockchain solutions.”

Green responds: “‘Public blockchain’ technology enables many stupid things: today’s cryptocurrency schemes can be venal, corrupt, overpromised. But the core technology is absolutely not useless. In fact, I think there are some pretty exciting things happening in the field, even if most of them are further away from reality than their boosters would admit.” I have yet to see one. More specifically, I can’t find a blockchain application whose value has anything to do with the blockchain part, that wouldn’t be made safer, more secure, more reliable, and just plain better by removing the blockchain part. I postulate that no one has ever said “Here is a problem that I have. Oh look, blockchain is a good solution.” In every case, the order has been: “I have a blockchain. Oh look, there is a problem I can apply it to.” And in no cases does it actually help.

Someone, please show me an application where blockchain is essential. That is, a problem that could not have been solved without blockchain that can now be solved with it. (And “ransomware couldn’t exist because criminals are blocked from using the conventional financial networks, and cash payments aren’t feasible” does not count.)

For example, Green complains that “credit card merchant fees are similar, or have actually risen in the United States since the 1990s.” This is true, but has little to do with technological inefficiencies or existing trust relationships in the industry. It’s because pretty much everyone who can and is paying attention gets 1% back on their purchases: in cash, frequent flier miles, or other affinity points. Green is right about how unfair this is. It’s a regressive subsidy, “since these fees are baked into the cost of most retail goods and thus fall heavily on the working poor (who pay them even if they use cash).” But that has nothing to do with the lack of blockchain, and solving it isn’t helped by adding a blockchain. It’s a regulatory problem; with a few exceptions, credit card companies have successfully pressured merchants into charging the same prices, whether someone pays in cash or with a credit card. Peer-to-peer payment systems like PayPal, Venmo, MPesa, and AliPay all get around those high transaction fees, and none of them use blockchain.

This is my basic argument: blockchain does nothing to solve any existing problem with financial (or other) systems. Those problems are inherently economic and political, and have nothing to do with technology. And, more importantly, technology can’t solve economic and political problems. Which is good, because adding blockchain causes a whole slew of new problems and makes all of these systems much, much worse.

Green writes: “I have no problem with the idea of legislators (intelligently) passing laws to regulate cryptocurrency. Indeed, given the level of insanity and the number of outright scams that are happening in this area, it’s pretty obvious that our current regulatory framework is not up to the task.” But when you remove the insanity and the scams, what’s left?

EDITED TO ADD: Nicholas Weaver is also adamant about this. David Rosenthal is good, too.

EDITED TO ADD (7/8/2022): This post has been translated into German.

Proof of Stake and our next experiments in web3

Post Syndicated from Wesley Evans original https://blog.cloudflare.com/next-gen-web3-network/

A little under four years ago we announced Cloudflare’s first experiments in web3 with our gateway to the InterPlanetary File System (IPFS). Three years ago we announced our experimental Ethereum Gateway. At Cloudflare, we often take experimental bets on the future of the Internet to help new technologies gain maturity and stability and for us to gain deep understanding of them.

Four years after our initial experiments in web3, it’s time to launch our next series of experiments to help advance the state of the art. These experiments are focused on finding new ways to help solve the scale and environmental challenges that face blockchain technologies today. Over the last two years there has been a rapid increase in the use of the underlying technologies that power web3. This growth has been a catalyst for a generation of startups, and yet it has also had negative externalities.

At Cloudflare, we are committed to helping to build a better Internet. That commitment balances technology and impact. The impact of certain older blockchain technologies on the environment has been challenging. The Proof of Work (PoW) consensus systems that secure many blockchains were instrumental in the bootstrapping of the web3 ecosystem. However, these systems do not scale well with the usage rates we see today. Proof of Work networks rely on complex cryptographic functions to create consensus and prevent attacks. Every node on the network is in a race to find a solution to this cryptographic problem. This cryptographic function is called a hash function. A hash function takes an input x and produces an output y. If you know x, it’s easy to compute y; given y, however, it’s prohibitively costly to find a matching x. Miners have to find an x whose output y has certain properties, such as ten leading zeros. Even with advanced chips, this is costly and heavily based on chance.
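A minimal sketch of that search loop, using leading zeros in the hex digest as a stand-in for a real difficulty target (bitcoin actually hashes the block header twice with SHA-256 and compares against a numeric target):

```python
# Toy proof-of-work: find a nonce whose hash has the required leading zeros.
import hashlib

def mine(block_data: bytes, difficulty: int = 5) -> int:
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).hexdigest()
        if digest.startswith(target):
            return nonce  # easy to verify, costly to find
        nonce += 1

nonce = mine(b"block header bytes", difficulty=5)
print(nonce)  # verifying takes one hash; finding it took ~16**5 tries on average
```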

While that process is incredibly secure because of the vast amount of hashing that occurs, Proof of Work networks are wasteful. This waste is driven by the fact that Proof of Work consensus mechanisms are electricity-intensive. In a PoW ecosystem, energy consumption scales directly with usage. So, for example, when web3 took off in 2020, the Bitcoin network started to consume the same amount of electricity as Switzerland (according to the Cambridge Bitcoin Electricity Consumption Index).

This inefficiency in resource use is by design, to make it difficult for any single actor to threaten the security of the blockchain; for the majority of the last decade, while use was low, it was an acceptable trade-off. The other major trade-off accepted for the security of Proof of Work has been the number of transactions the network can handle per second. For instance, the Ethereum network can currently process about 30 transactions per second. This ultra-low transaction rate was acceptable when the technology was in development, but does not scale to meet current demand.

As recently as two years ago, web3 was an up-and-coming ecosystem with minimal traffic compared to the majority of the traffic on the Internet. The last two years changed the web3 use landscape. Traffic to the Bitcoin and the Ethereum networks has exploded. Technologies like Smart Contracts that power DeFi, NFTs, and DAOs have started to become commonplace in the developer community (and in the Twitter lexicon). The next generation of blockchains like Flow, Avalanche, Solana, as well as Polygon are being developed and adopted by major companies like Meta.

In order to balance our commitment to the environment and to help build a better Internet, it is clear to us that we should launch new experiments to help find a path forward. A path that balances the clear need to drastically reduce the energy consumption of web3 technologies and the capability to scale the web3 networks by orders of magnitude if the need arises to support increased user adoption over the next five years. How do we do that? We believe that the next generation of consensus protocols is likely to be based on Proof of Stake and Proof of Spacetime consensus mechanisms.

Today, we are excited to announce the start of our experiments for the next generation of web3 networks that are embracing Proof of Stake; beginning with Ethereum. Over the next few months, Cloudflare will launch, and fully stake, Ethereum validator nodes on the Cloudflare global network as the community approaches its transition from Proof of Work to Proof of Stake with “The Merge.” These nodes will serve as a testing ground for research on energy efficiency, consistency management, and network speed.

There is a lot in that paragraph so let’s break it down! The short version though is that Cloudflare is going to participate in the research and development of the core infrastructure that helps keep Ethereum secure, fast, as well as energy efficient for everyone.

Proof of Stake is a next-generation consensus protocol to secure blockchains. Unlike Proof of Work, which relies on miners racing each other with increasingly complex cryptography to mine a block, Proof of Stake secures new transactions to the network through self-interest. Validator nodes (the participants who verify new blocks for the chain) are required to put up a significant asset as collateral in a smart contract to prove that they will act in good faith; for Ethereum, that is 32 ETH. Validator nodes that follow the network’s rules earn rewards; validators that violate the rules have portions of their stake taken away. Anyone can operate a validator node as long as they meet the stake requirement. This is key: Proof of Stake networks require lots and lots of validator nodes to validate and attest to new transactions. The more participants there are in the network, the harder it is for bad actors to launch a 51% attack to compromise the security of the blockchain.

To add new blocks to the Ethereum chain, once it shifts to Proof of Stake, validators are chosen at random to create new blocks (validate). Once they have a new block ready, it is submitted to a committee of 128 other validators who act as attestors. They verify that the transactions included in the block are legitimate and that the block is not malicious. If the block is found to be valid, the attestation is added to the blockchain. Critically, the validation and attestation process is not computationally complex. This reduction in complexity leads to significant energy efficiency improvements.
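Here is a toy sketch of that proposal-and-attestation flow. The selection logic, committee arithmetic, and validity check are simplified assumptions; Ethereum’s actual protocol uses RANDAO-based randomness, slots and epochs, and slashing for misbehavior.

```python
# Toy Proof of Stake: random proposer, committee of attestors, no hashing race.
import random

STAKE_REQUIREMENT_ETH = 32     # collateral per validator
COMMITTEE_SIZE = 128           # attestors per block, per the post

validators = [f"validator_{i}" for i in range(1_000)]  # all fully staked

def is_valid(block: str) -> bool:
    return not block.startswith("malicious")  # stand-in validity check

def propose_and_attest(block: str) -> bool:
    proposer = random.choice(validators)  # chosen at random to propose
    committee = random.sample(
        [v for v in validators if v != proposer], COMMITTEE_SIZE
    )
    approvals = sum(1 for _ in committee if is_valid(block))
    # The block is added when the committee attests to it; no energy-hungry
    # hashing race is involved, which is where the efficiency gain comes from.
    return approvals * 2 > COMMITTEE_SIZE

print(propose_and_attest("block 15537394"))   # True
print(propose_and_attest("malicious block"))  # False
```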

The energy required to operate a Proof of Stake validator node is orders of magnitude less than that of a Proof of Work miner. Early estimates from the Ethereum Foundation suggest that the entire Ethereum network could use as little as 2.6 megawatts of power. Put another way, Ethereum will use 99.5% less energy post-Merge than it does today.

We are going to support the development and deployment of Ethereum by running Proof of Stake validator nodes on our network. For the Ethereum ecosystem, running validator nodes on our network allows us to offer even more geographic decentralization in places like EMEA, LATAM, and APJC while also adding infrastructure decentralization to the network. This helps keep the network secure, outage resistant, and closer to the goal of global decentralization for everyone. For us, it’s a commitment to helping research and experiment with scaling the next generation of blockchain networks that could be a key part of the story of the Internet. Running blockchain technology at scale is incredibly difficult, and Proof of Stake systems have their own unique requirements that will take time to understand and improve. Part of this experimentation is also a commitment: as part of our experiments, Cloudflare has not and will not run its own Proof of Work infrastructure on our network.

Finally, what is “The Merge”? The Merge is the planned combination of the Ethereum Mainnet (the chain you interact with today when you make an Eth transaction) and the Ethereum Beacon Chain. The Beacon chain is the Proof of Stake blockchain running in parallel with the Ethereum Mainnet. The Merge is much more significant than a consensus update.

Consensus updates have happened multiple times already on Ethereum, whether because of a vulnerability or the need to introduce new features. The Merge is different in that it cannot be made through a normal upgrade. Previous forks have been enabled at a specific block number, meaning that once enough blocks are mined, the new consensus rules apply automatically. With The Merge, there should be only one state root at which the fork happens. That merge state root marks the start of the Ethereum Mainnet’s PoS journey. It is chosen and set at a certain difficulty, instead of a block height, to avoid merging a high block with low difficulty.

Once The Merge occurs later this year in 2022, Ethereum will be a full Proof of Stake ecosystem that no longer relies on Proof of Work.

This is just the start of our commitment to help build the next generation of web3 networks. We are excited to work with our partners across the cryptography, web3, and infrastructure communities to help create the next generation of blockchain ecosystems that are environmentally sustainable, secure, and blazing fast.

Let’s Architect! Architecting for Blockchain

Post Syndicated from Luca Mezzalira original https://aws.amazon.com/blogs/architecture/lets-architect-architecting-for-blockchain/

You’ve likely read about or heard someone talk about blockchain. This distributed and decentralized ledger collects immutable blocks of information and helps secure your data without going through a third party. It is commonly used to maintain secure and decentralized records for registries, consensus, cryptocurrencies, and the latest trend: non-fungible tokens (NFTs).

This collection of content will help you learn the basics of blockchain and drill down into the mindset to apply while architecting for blockchain. We focus on the architectural aspects to explain what the blockchain is from a technological perspective, how it works, when we need it, and its characteristics applied to different scenarios.

Amazon Managed Blockchain: When to use blockchain

There is a lot of buzz about blockchain, but when should you use it? What are its benefits and limitations? This video introduces you to Amazon Managed Blockchain and will help you identify if blockchain is a good solution for you and what type of blockchain is best suited for your use case.

John Liu covers the characteristics and benefits of private and public blockchain

Deep Dive on Amazon Managed Blockchain

In this video, Johnathan Fritz, a Principal Product Manager for Managed Blockchain, shares some challenges his team faced while building a distributed and immutable network and how they overcame them. The talk provides a good example of mental models you can use to understand and solve challenges while architecting.

Blockchain is based on a consensus mechanism in a distributed system

Mint and deploy NFTs to the Ethereum blockchain using Amazon Managed Blockchain

Buying NFTs is a hot topic right now. But how do you create your own? This blog post provides a step-by-step guide that shows you how to create an NFT and establish a workflow to deploy ERC-721 contracts to the public Ethereum Rinkeby testnet.

The architecture uses Managed Blockchain to take advantage of maintained Ethereum nodes and allow developers to focus on smart contracts

How Specright uses Amazon QLDB to create a traceable supply chain network

Blockchain and distributed ledger technologies focus on decentralizing applications involving multiple parties where no single entity owns the application. When your application is decentralized and involves multiple, unknown parties, blockchains can be appropriate. On the other hand, if your application only requires a complete and verifiable history of data changes, you can consider a ledger database.

This post shows how Specright uses Amazon Quantum Ledger Database (Amazon QLDB) to generate a complete, verifiable history of data changes in an append-only, immutable journal of events. Their architecture makes sure that all members of the network have access to the same, latest version of the specification and can instantly track change history to investigate quality issues.

This architecture allows all members of the supply chain network to access the same and latest versions of specifications

See you next time!

Thanks for reading! If you’re looking for more tools to architect your workload, check out the AWS Architecture Center.

See you in a couple of weeks when we discuss strategies for running microservices with containers!

Other posts in this series

Google Shuts Down Glupteba Botnet, Sues Operators

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2021/12/google-shuts-down-glupteba-botnet-sues-operators.html

Google took steps to shut down the Glupteba botnet, at least for now. (The botnet uses the bitcoin blockchain as a backup command-and-control mechanism, making it hard to get rid of it permanently.) So Google is also suing the botnet’s operators.

It’s an interesting strategy. Let’s see if it’s successful.

Smart Contract Bug Results in $31 Million Loss

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2021/12/smart-contract-bug-results-in-31-million-loss.html

A hacker stole $31 million from the blockchain company MonoX Finance by exploiting a bug in software the service uses to draft smart contracts.

Specifically, the hack used the same token as both the tokenIn and tokenOut, which are methods for exchanging the value of one token for another. MonoX updates prices after each swap by calculating new prices for both tokens. When the swap is completed, the price of tokenIn (that is, the token sent by the user) decreases and the price of tokenOut (the token received by the user) increases.

By using the same token for both tokenIn and tokenOut, the hacker greatly inflated the price of the MONO token because the updating of the tokenOut overwrote the price update of the tokenIn. The hacker then exchanged the token for $31 million worth of tokens on the Ethereum and Polygon blockchains.
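Based on that description, here is a toy Python reconstruction of the flaw. The real contract is written in Solidity and its pricing formulas differ; the percentage adjustments here are illustrative assumptions.

```python
# Toy reconstruction of the MonoX price-update bug.
prices = {"MONO": 1.0, "USDC": 1.0}

def swap(token_in: str, token_out: str, amount: float) -> None:
    # MonoX recalculated both prices after each swap:
    new_in_price = prices[token_in] * 0.99     # tokenIn price decreases
    new_out_price = prices[token_out] * 1.01   # tokenOut price increases
    prices[token_in] = new_in_price
    # BUG: when token_in == token_out, this second write overwrites the
    # decrease above, so the price only ever goes up.
    prices[token_out] = new_out_price

for _ in range(1000):          # repeat the self-swap
    swap("MONO", "MONO", 100.0)
print(prices["MONO"])          # 1.01**1000, roughly 21,000x the starting price
```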

The article goes on to talk about how common these sorts of attacks are. The basic problem is that the code is the ultimate authority — there is no adjudication protocol — so if there’s a vulnerability in the code, there is no recourse. And, of course, there are lots of vulnerabilities in code.

To me, this is reason enough never to use smart contracts for anything important. Human-based adjudication systems are not useless pre-Internet human baggage, they’re vital.

Integrity Guarantees of Blockchains In Case of Single Owner Or Colluding Owners

Post Syndicated from Bozho original https://techblog.bozho.net/integrity-guarantees-of-blockchains-in-case-of-single-owner-or-colluding-owners/

The title may sound like a paper title rather than a blog post title, because this was originally an idea for a paper; but I’m unlikely to find the time to write a proper one, so here it is – a blog post.

Blockchain has been touted as the ultimate integrity guarantee – if you “have blockchain”, nobody can tamper with your data. Of course, reality is more complicated, and even in the most distributed of ledgers, there are known attacks. But most organizations that are experimenting with blockchain rely on a private network, sometimes as the sole owner of the infrastructure and sometimes sharing it with just a few partners.

The point of having the technology in the first place is to guarantee that once collected, data cannot be tampered with. So let’s review how that works in practice.

First, we have to define two terms – “tamper-resistant” (sometimes referred to as tamper-free) and “tamper-evident”. “Tamper-resistant” means nobody can ever tamper with the data, and the state of the data structure is always guaranteed to be without any modifications. “Tamper-evident”, on the other hand, means that a data structure can be validated for integrity violations, and it will be known whether there have been modifications (alterations, deletions, or back-dating of entries). Therefore, with tamper-evident structures you can prove that the data is intact, but if it’s not intact, you can’t recover the original state. It’s still a very important property, as the ability to prove that data has not been tampered with is crucial for compliance and legal purposes.

Blockchain is usually built on top of several main cryptographic primitives: cryptographic hashes, hash chains, Merkle trees, cryptographic timestamps, and digital signatures. They all play a role in the integrity guarantees, but the most important are the Merkle tree (with all of its variations, like the Patricia Merkle tree) and the hash chain. The original bitcoin paper describes a blockchain as a hash chain based on the roots of multiple Merkle trees (each of which forms a single block). Some blockchains rely on a single, ever-growing Merkle tree, but let’s not get into particular implementation details.
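To ground the terminology, here is a minimal sketch of the two core structures in Python. Real implementations add block headers, timestamps, and signatures; the genesis value and the odd-level padding rule are illustrative (the duplication rule matches bitcoin’s).

```python
# Minimal hash chain and Merkle root.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def hash_chain(entries: list[bytes]) -> bytes:
    """Each link commits to the previous one, so changing any entry
    changes every subsequent hash."""
    link = b"\x00" * 32                       # genesis value
    for entry in entries:
        link = h(link + entry)
    return link

def merkle_root(leaves: list[bytes]) -> bytes:
    """Pairwise hashing up to a single root, duplicating the last node
    on odd levels."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(a + b) for a, b in zip(level[::2], level[1::2])]
    return level[0]

print(hash_chain([b"tx1", b"tx2"]).hex())
print(merkle_root([b"tx1", b"tx2", b"tx3"]).hex())
```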

In all cases, blockchains are considered tamper-resistant because they are distributed widely enough that a sufficient number of members have a copy of the data. If some node modifies that data, e.g. 5 blocks in the past, it has to prove to everyone else that this is the correct Merkle root for that block. You have to control more than 50% of the network capacity in order to do that (and it’s more complicated than just having the capacity), but it’s still possible. In a way, tamper resistance = tamper evidence + distributed data.

But many of the practical applications of blockchain rely on private networks serving one or several entities. These are often based on proof of authority, which means whoever has access to a set of private keys controls what the network agrees on. So let’s review the two cases:

  • Multiple owners – in the case of multiple node owners, several of them can collude to rewrite the chain. The collusion can be based on mutual business interest (e.g. in a supply chain, several members may team up against the producer to report distorted data), or on a security compromise (e.g. multiple members are hacked by the same group). In that case, the remaining node owners can have a backup of the original data, but finding out whether the others were malicious or the changes were a legitimate part of the business logic would require a complicated investigation.
  • Single owner – a single owner can have a nice Merkle tree or hash chain, but an admin with access to the underlying data store can regenerate the whole chain, and it will look legitimate while in reality being tampered with. Splitting access between multiple admins is one approach (or giving them access to separate nodes, none of which has access to a majority), but they often drink beer together, and collusion is again possible. More importantly, you can’t prove to a third party that your own employees haven’t colluded under orders from management in order to cover some tracks and present a better picture to a regulator.

In the case of a single owner, you don’t even have a tamper-evident structure – the chain can be fully rewritten and nobody will be able to tell. In the case of multiple owners, it depends on the implementation. There will be a record of the modification at the non-colluding party, but proving which side “cheated” would be next to impossible. Tamper evidence is only partially achieved, because you can’t prove whose data was modified and whose wasn’t (you only know that one of the copies has tampered data).

The way to achieve a tamper-evident structure in both scenarios is to use anchoring. Checkpoints of the data need to be anchored externally, so that there is a clear record of the state of the chain at different points in time. Before blockchain, the recommended approach was to print the checkpoint in a newspaper (e.g. as an ad); because a newspaper has a large enough circulation, nobody can collect every copy and modify the published checkpoint hash. The published hash would be either the root of the Merkle tree or the latest hash in a hash chain. An ever-growing Merkle tree would allow consistency and inclusion proofs to be validated.

When we have electronic distribution of data, we can use public blockchains to regularly anchor our internal ones in order to achieve proper tamper-evident data. We at LogSentinel, for example, do exactly that – we allow publishing the latest Merkle root and the latest hash chain value to Ethereum. Then even if those with access to the underlying datastore manage to modify and regenerate the entire chain/tree, it will no longer match the publicly advertised values.

How to store data on public blockchains is a separate topic. In the case of Ethereum, you can put any payload within a transaction, so you can put the hash in low-value transactions between two of your own addresses (or in self-transactions). You can use smart contracts as well, but that’s not necessary. For Bitcoin, you can use OP_RETURN. Other implementations may have different approaches to storing data within transactions.
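Here is a sketch of what anchoring buys you, with the submission step left as a hypothetical placeholder. publish_anchor is not a real API – it stands in for an Ethereum self-transaction carrying the hash in its payload, or a bitcoin OP_RETURN output, as described above; the verification step is the real point.

```python
# Anchoring a hash-chain checkpoint and verifying against it later.
import hashlib

def current_root(log_entries: list[bytes]) -> bytes:
    """Recompute the latest hash-chain value over the full log."""
    link = b"\x00" * 32
    for entry in log_entries:
        link = hashlib.sha256(link + entry).digest()
    return link

def publish_anchor(checkpoint: bytes) -> str:
    """Hypothetical placeholder: submit `checkpoint` to a public blockchain
    and return the transaction ID where it can later be looked up."""
    raise NotImplementedError

def verify_against_anchor(log_entries: list[bytes], anchored: bytes) -> bool:
    # Even if an insider regenerates the entire chain, the recomputed
    # value will no longer match the publicly anchored checkpoint.
    return current_root(log_entries) == anchored
```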

If we want to achieve tamper resistance, we just need to have several copies of the data, all subject to tamper-evidence guarantees – just as in a public network. But what a public network gives us is a layer that we can trust to provide the necessary piece for achieving local tamper evidence. Of course, going to hardware, it’s easier to have write-once storage (WORM: write once, read many). The problem with it is that it’s expensive and can’t be reused, which makes it a poor fit for use cases involving short-lived data that requires tamper resistance.

So, in summary: in order to have proper integrity guarantees and the ability to prove that the data in a single-owner or multi-owner private blockchain hasn’t been tampered with, we have to publish the latest hash of whatever structure we are using (chain or tree). If we don’t, we are only complicating our lives by integrating a complex piece of technology without getting the real benefit it can bring – proving the integrity of our data.


Audit Your Supply Chain with Amazon Managed Blockchain

Post Syndicated from Edouard Kachelmann original https://aws.amazon.com/blogs/architecture/audit-your-supply-chain-with-amazon-managed-blockchain/

For manufacturing companies, visibility into complex supply chain processes is critical to establishing resilient supply chain management. Being able to trace events within a supply chain is key to verifying the origins of parts for regulatory requirements, tracing parts back to suppliers if issues arise, and for contacting buyers if there is a product/part recall.

Traditionally, companies will create their own ledger that can be reviewed and shared with third parties for future audits. However, this process takes time and requires verifying the data’s authenticity. In this blog, we offer a solution to audit your supply chain. Our solution allows supply chain participants to safeguard product authenticity and prevent fraud, increase profitability by driving operational efficiencies, and enhance visibility to minimize disputes across parties.

Benefits of blockchain

Blockchain technology offers a new approach for tracking supply chain events. Blockchains are immutable ledgers that allow you to cryptographically prove that, since being written, each transaction remains unchanged. For a supply chain, this immutability is beneficial from a process standpoint. Auditing a supply chain becomes much simpler when you are certain that no one has altered the manufacturing, transportation, storage, or usage history of a given part or product in the time since a failure occurred.

In addition to providing an immutable system of record, many blockchain protocols can run programmable logic written as code in a decentralized manner. This code is often referred to as a “smart contract,” which enables multi-party business logic to run on the blockchain. This means that implementing your supply chain on a blockchain allows members of the network (like retailers, suppliers, etc.) to process transactions that only they are authorized to process.
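
As a hedged illustration (the post does not publish its chaincode, so the contract below is our own minimal sketch using the fabric-contract-api package, with illustrative contract, MSP, and event names), such per-member authorization logic might look like this:

```typescript
import { Context, Contract, Transaction } from "fabric-contract-api";

export class SupplyChainContract extends Contract {
  // Record a supply chain event. Fabric attaches the submitting identity,
  // so only enrolled members of the channel can write at all.
  @Transaction()
  public async recordEvent(ctx: Context, partId: string, eventJson: string): Promise<void> {
    // Illustrative multi-party rule: only the manufacturer's organization
    // may record production events.
    const mspId = ctx.clientIdentity.getMSPID();
    const event = JSON.parse(eventJson);
    if (event.type === "PRODUCED" && mspId !== "ManufacturerMSP") {
      throw new Error(`${mspId} is not authorized to record production events`);
    }
    // Key the record by part and transaction so history accumulates.
    await ctx.stub.putState(`${partId}:${ctx.stub.getTxID()}`, Buffer.from(eventJson));
  }

  // Read-only query, evaluated without writing to the ledger.
  @Transaction(false)
  public async getEvent(ctx: Context, key: string): Promise<string> {
    const data = await ctx.stub.getState(key);
    return data.toString();
  }
}
```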

Benefits of Amazon Managed Blockchain

Amazon Managed Blockchain allows customers to join either private Hyperledger Fabric networks or the Public Ethereum network. On Managed Blockchain, you are relieved of the undifferentiated heavy lifting associated with creating, configuring, and managing the underlying infrastructure for a Hyperledger Fabric network. Instead, you can focus your efforts on mission-critical value drivers like building consortia or developing use case specific components. This allows you to create and manage a scalable Hyperledger Fabric network that multiple organizations can join from their AWS account.

IoT-enabled supply chain architecture

Organizations within the Industrial Internet of Things (IIoT) space want solutions that allow them to monitor and audit their supply chains for strict quality control and accurate product tracking, and using AWS IoT allows them to realize operational efficiency at scale. IoT-enabled equipment on the production plant floor records data such as load, pressure, temperature, humidity, and assembly metrics through multiple sensors. The data is transmitted in real time, either directly to the cloud or through an on-premises AWS IoT gateway (such as any AWS IoT Greengrass compatible hardware), as MQTT messages to the AWS IoT Core endpoint for storage and analytics.
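
For illustration, a device or gateway could publish a reading like this using the generic mqtt package over X.509 mutual TLS; the endpoint, certificate paths, topic, and field names are placeholders, not values from the post:

```typescript
import { connect } from "mqtt";
import { readFileSync } from "fs";

// Placeholder AWS IoT Core endpoint and credentials for this thing.
const client = connect("mqtts://abc123-ats.iot.us-east-1.amazonaws.com:8883", {
  key: readFileSync("private.pem.key"),
  cert: readFileSync("certificate.pem.crt"),
  ca: readFileSync("AmazonRootCA1.pem"),
  clientId: "press-42", // must be allowed by the thing's IoT policy
});

client.on("connect", () => {
  const reading = {
    equipmentId: "press-42", // illustrative sensor payload
    temperature: 81.4,
    pressureKpa: 512,
    timestamp: Date.now(),
  };
  // AWS IoT Core supports QoS 0 and 1 for MQTT publishes.
  client.publish("factory/press-42/telemetry", JSON.stringify(reading), { qos: 1 });
});
```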

This solution provides a pipeline to ingest data provided by IoT. It stores this data in a private blockchain network that is only accessible within member organizations. This is your immutable single source of truth for future audits. In this solution, the Hyperledger Fabric network on Managed Blockchain includes two members, but it can be extended to additional organizations that are part of the supply chain as needed.

Figure 1. Reference architecture for an IoT-enabled supply chain consisting of a retailer and a manufacturer

The components of this solution are:

  • IoT-enabled sensors – These sensors are mounted directly on each piece of factory equipment throughout the supply chain. They publish data to the IoT gateway. For testing purposes, you can start with the IoT Device Simulator solution to create and simulate hundreds of connected devices.
  • AWS IoT Greengrass (optional) – This gateway provides a secure way to seamlessly connect your edge devices to any AWS service. It also enables local processing, messaging, data management, and machine learning (ML) inference, and offers pre-built components such as protocol conversion to MQTT for sensors that only have an OPC UA or Modbus interface.
  • AWS IoT Core – AWS IoT Core subscribes to IoT topics published by the IoT devices or gateway and ingests data into the AWS Cloud for analysis and storage.
  • AWS IoT rule – Rules give your devices the ability to interact with AWS services. Rules are analyzed and actions are performed based on the MQTT topic stream. Here, a rule initiates a serverless Lambda function to extract, transform, and publish data to the Fabric Client. Another rule with an HTTPS endpoint action could forward requests directly to a private API Gateway.
  • Amazon API Gateway – The API Gateway provides a REST interface to invoke the AWS Lambda function for each of the API routes deployed. API Gateway allows you to handle request authorization and authentication, before passing the request on to Lambda.
  • AWS Lambda for the Fabric Client – Using AWS Lambda with the Hyperledger Fabric SDK installed as a dependency, you can communicate with your Hyperledger Fabric peer node(s) to write and read data from the blockchain (see the sketch after this list). The peer nodes run smart contracts (referred to as chaincode in Hyperledger Fabric), endorse transactions, and store a local copy of the ledger.
  • Managed Blockchain – Managed Blockchain is a fully managed service for creating and managing blockchain networks and network resources using open-source frameworks. In our solution, the Fabric Client reaches the network through an endpoint within the customer virtual private cloud (VPC) and interacts with the Hyperledger Fabric components that run within a VPC for your Managed Blockchain network.
    • Peer node – A peer node endorses blockchain transactions and stores the blockchain ledger. In production, we recommend creating a second peer node in another Availability Zone to serve as a fallback if the first peer becomes unavailable.
    • Certificate Authority – Every user who interacts with the blockchain must first register and enroll with their certificate authority.
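
As a rough sketch of the Fabric Client piece (our own illustration using the fabric-network SDK, not the actual code behind the post), a Lambda handler could submit the payload extracted by the IoT rule as a chaincode transaction. The channel, chaincode, identity, and file locations below are all assumptions:

```typescript
import { Gateway, Wallets } from "fabric-network";
import { readFileSync } from "fs";

export const handler = async (event: { partId: string; payload: string }) => {
  // Placeholder locations: in practice the connection profile and enrolled
  // identity would come from S3 or Secrets Manager.
  const connectionProfile = JSON.parse(
    readFileSync("/opt/connection-profile.json", "utf8")
  );
  const wallet = await Wallets.newFileSystemWallet("/tmp/wallet");

  const gateway = new Gateway();
  try {
    await gateway.connect(connectionProfile, {
      wallet,
      identity: "lambda-user", // identity enrolled with the member's CA
      discovery: { enabled: true, asLocalhost: false },
    });
    const network = await gateway.getNetwork("supplychain-channel");
    const contract = network.getContract("supplychain"); // chaincode name
    // Endorsed by the peer nodes, then committed to the ledger.
    await contract.submitTransaction("recordEvent", event.partId, event.payload);
  } finally {
    gateway.disconnect();
  }
};
```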

Choosing a Hyperledger Fabric edition

Edition  | Network size             | Max. # of members | Max. # of peer nodes per member | Max. # of channels per network | Transaction throughput and availability
Starter  | Test or small production | 5                 | 2                               | 3                              | Lower
Standard | Large production         | 14                | 3                               | 8                              | Higher
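
The edition is fixed when the network is created. As an illustration only (the parameter names follow the Managed Blockchain CreateNetwork API as we understand it; the network name, framework version, member name, and password are placeholders), creating a Starter edition network with the AWS SDK for JavaScript v3 might look like this:

```typescript
import {
  ManagedBlockchainClient,
  CreateNetworkCommand,
} from "@aws-sdk/client-managedblockchain";

async function createSupplyChainNetwork(): Promise<void> {
  const client = new ManagedBlockchainClient({ region: "us-east-1" });

  const response = await client.send(
    new CreateNetworkCommand({
      Name: "supply-chain-net", // placeholder network name
      Framework: "HYPERLEDGER_FABRIC",
      FrameworkVersion: "2.2", // check the versions available in your Region
      FrameworkConfiguration: {
        Fabric: { Edition: "STARTER" }, // or "STANDARD", per the table above
      },
      VotingPolicy: {
        // How members vote on proposals, e.g. inviting a new organization.
        ApprovalThresholdPolicy: {
          ThresholdPercentage: 50,
          ProposalDurationInHours: 24,
          ThresholdComparator: "GREATER_THAN",
        },
      },
      MemberConfiguration: {
        Name: "manufacturer", // the first member of the consortium
        FrameworkConfiguration: {
          Fabric: {
            AdminUsername: "admin",
            AdminPassword: "placeholder-password", // placeholder; use Secrets Manager
          },
        },
      },
    })
  );

  console.log("network:", response.NetworkId, "member:", response.MemberId);
}
```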

Our solution allows multiple parties to write and query data on a private Hyperledger Fabric blockchain managed by Amazon Managed Blockchain. This enhances the consumer experience by reducing the overall effort and complexity of getting insight into supply chain transactions.

Conclusion

In this post, we showed you how Managed Blockchain, as well as other AWS services such as AWS IoT, can provide value to your business. The IoT-enabled supply chain architecture gives you a blueprint to realize that value. The value not only stems from the benefits of having a trustworthy and transparent supply chain, but also from the reliable, secure and scalable services that AWS provides.

On Blockchain Voting

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2020/11/on-blockchain-voting.html

Blockchain voting is a spectacularly dumb idea for a whole bunch of reasons. I have generally quoted Matt Blaze:

Why is blockchain voting a dumb idea? Glad you asked.

For starters:

  • It doesn’t solve any problems civil elections actually have.
  • It’s basically incompatible with “software independence”, considered an essential property.
  • It can make ballot secrecy difficult or impossible.

I’ve also quoted this XKCD cartoon.

But now I have this excellent paper from MIT researchers:

“Going from Bad to Worse: From Internet Voting to Blockchain Voting”
Sunoo Park, Michael Specter, Neha Narula, and Ronald L. Rivest

Abstract: Voters are understandably concerned about election security. News reports of possible election interference by foreign powers, of unauthorized voting, of voter disenfranchisement, and of technological failures call into question the integrity of elections worldwide. This article examines the suggestions that “voting over the Internet” or “voting on the blockchain” would increase election security, and finds such claims to be wanting and misleading. While current election systems are far from perfect, Internet- and blockchain-based voting would greatly increase the risk of undetectable, nation-scale election failures. Online voting may seem appealing: voting from a computer or smart phone may seem convenient and accessible. However, studies have been inconclusive, showing that online voting may have little to no effect on turnout in practice, and it may even increase disenfranchisement. More importantly: given the current state of computer security, any turnout increase derived from Internet- or blockchain-based voting would come at the cost of losing meaningful assurance that votes have been counted as they were cast, and not undetectably altered or discarded. This state of affairs will continue as long as standard tactics such as malware, zero days, and denial-of-service attacks continue to be effective. This article analyzes and systematizes prior research on the security risks of online and electronic voting, and shows that not only do these risks persist in blockchain-based voting systems, but blockchains may introduce additional problems for voting systems. Finally, we suggest questions for critically assessing security risks of new voting system proposals.

You may have heard of Voatz, which uses blockchain for voting. It’s an insecure mess. And this is my general essay on blockchain. Short summary: it’s completely useless.

US Postal Service Files Blockchain Voting Patent

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2020/08/us_postal_servi.html

The US Postal Service has filed a patent on a blockchain voting method:

Abstract: A voting system can use the security of blockchain and the mail to provide a reliable voting system. A registered voter receives a computer readable code in the mail and confirms identity and confirms correct ballot information in an election. The system separates voter identification and votes to ensure vote anonymity, and stores votes on a distributed ledger in a blockchain

I wasn’t going to bother blogging this, but I’ve received enough emails about it that I should comment.

As is pretty much always the case, blockchain adds nothing. The security of this system has nothing to do with blockchain, and would be better off without it. For voting in particular, blockchain adds to the insecurity. Matt Blaze is most succinct on that point:

Why is blockchain voting a dumb idea?

Glad you asked.

For starters:

  • It doesn’t solve any problems civil elections actually have.
  • It’s basically incompatible with “software independence”, considered an essential property.
  • It can make ballot secrecy difficult or impossible.

Both Ben Adida and Matthew Green have written longer pieces on blockchain and voting.

News articles.
