In 1865, Britain passed its infamous “Red Flag” Act — copied in many other places — to regulate self-propelled vehicles. It required a crew of three for each vehicle, one member of which was to walk 60 yards ahead carrying a red flag to warn horses and riders of the vehicle's approach. It also imposed a four-mile-per-hour speed limit, or two miles per hour in populated areas.
Why is this relevant now? Because attempts some 158 years later to regulate cryptocurrencies and artificial intelligence will seem equally silly to future generations. Technology transforms society according to its functionality and what people want to do with it, not according to conservative regulations passed by clueless officials.
The collapse of crypto exchange FTX in November 2022, capping an “annus horribilis” for big-name, regulated digital currencies, combined with the demo release of ChatGPT the same month, sent venture capital money fleeing from crypto and into AI. A harder-to-measure, longer-term trend among academics and top developers seems to favour the steady, quiet progress of work in AI over the scandal-ridden boom-and-bust cycle of crypto. These trends are more consequential for the future than anything done in Washington, the ups and downs of Bitcoin, or how non-venture capital is allocated. The automobile — and radio and the Internet and genetic engineering — transformed society in fundamental ways, unrelated to regulators' wishes or stock prices or anything the media was covering at the time.
The competition between crypto and AI for the hearts and minds of tech innovators, and the wallets of venture capitalists, reflects a more general dichotomy. AI is traditionally centralized: routines gobble up all available data and make decisions on behalf of a small group of human designers or, in dystopian science fiction versions, on behalf of no humans at all, the natural limit. Crypto is radically decentralized: all actionable information is held by dispersed individuals in private keys, and no one controls the system.
It's no coincidence that crypto broke through to general consciousness with the massive centralized, interconnected failure of the 2008 financial crisis, while AI took off after the 2020 global pandemic reminded people that we are all connected, like it or not. Crypto scares people because it threatens the ability of centralized human institutions to collect taxes and regulate behaviour such as drug use, sex, gambling, pornography and sedition. AI scares people because it threatens individual human agency and privacy, regulating all human behaviour in a totalitarian nightmare regime, or perhaps even replacing humans altogether. Another issue with traditional AI is that when you pull in all information, you pull in prejudice, intolerance and error along with the good information.
But a deeper look at recent events shows a more complex picture. The hot areas in AI use crypto technology to build decentralized controls. The first-generation, pull-in-all-information AI approaches are failing because the entities controlling information today are not willing to give it up to a faceless algorithm not under their control. Homomorphic encryption allows information holders to get the benefit of AI analysis without either the AI routine itself or its creators accessing the underlying information. Federated learning allows independent, decentralized actors to build and use a common, robust AI tool without sharing their data. Many of the most exciting AI projects are intended to be delivered to and controlled by individuals, gathering information and making decisions without exposing anything about the individual to the Internet at large.
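To make the federated idea concrete, here is a minimal sketch of federated averaging, one common way the technique is implemented; the linear model, synthetic data and training parameters below are illustrative assumptions rather than any particular system's design. Each participant refines the shared model on data that never leaves its hands, and only the resulting weights are pooled.

```python
# Minimal federated-averaging sketch (illustrative assumptions throughout):
# each participant trains a small linear model on its own private data and
# shares only model weights, which are averaged into the next global model.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One participant improves the shared model using only its private data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def federated_round(global_weights, participants):
    """Average locally trained weights; raw data never leaves any participant."""
    local_weights = [local_update(global_weights, X, y) for X, y in participants]
    return np.mean(local_weights, axis=0)

# Three participants hold private datasets generated from the same hidden model.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
participants = []
for _ in range(3):
    X = rng.normal(size=(100, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=100)
    participants.append((X, y))

weights = np.zeros(2)
for _ in range(50):
    weights = federated_round(weights, participants)

print(weights)  # approaches [2.0, -1.0] although no party ever shared its data
```

The point of the sketch is the division of labour: the common model gets better with every round, while each dataset stays with its owner.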
At the same time, many crypto projects are exploiting AI to build complex structures out of decentralized parts. A fundamental goal of most crypto is composability — once an application is built, it should be easy to integrate it as a modular component of any larger application. First-generation crypto projects were built with human developers in mind, but AI offers intoxicating possibilities for assembling much larger and better structures from decentralized, composable applications on the fly. What we call a “smart” contract in crypto today is actually dumb, in that it consists of dumb rules chosen by human counterparties. If you chain together enough dumb rules the contract can seem smart, but that's an illusion; complexity is not intelligence. Moreover, humans are not good at foreseeing all possible future scenarios. AI can make genuinely smart contracts, which could transform many arenas of human interaction.
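As a toy illustration of those dumb rules (the escrow, parties and conditions below are hypothetical, not any real blockchain's contract language), here is a contract that can only settle into outcomes its human authors enumerated in advance; anything they failed to foresee is simply not handled.

```python
# Hypothetical escrow "contract" written as ordinary Python to show that today's
# smart contracts are fixed rules: every outcome was enumerated by the
# counterparties up front, and unforeseen situations fall through the cracks.
from dataclasses import dataclass

@dataclass
class EscrowContract:
    buyer: str
    seller: str
    amount: float
    delivery_confirmed: bool = False
    deadline_passed: bool = False

    def settle(self) -> str:
        # Each branch below was chosen in advance by the human counterparties.
        # The contract cannot reason about scenarios its authors did not list.
        if self.delivery_confirmed:
            return f"release {self.amount} to {self.seller}"
        if self.deadline_passed:
            return f"refund {self.amount} to {self.buyer}"
        return "hold funds"

contract = EscrowContract(buyer="alice", seller="bob", amount=100.0)
contract.delivery_confirmed = True
print(contract.settle())  # release 100.0 to bob
```

Chaining many such branches together can make the behaviour look sophisticated, but the logic never grows beyond what was written down at signing, which is exactly the gap an adaptive, AI-driven contract would aim to close.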
Most utopian science fiction imagines computers with access to all information that follow human instructions slavishly. A few classic science fiction devices — most famously Isaac Asimov's Three Laws of Robotics — deal with the implicit contradiction of that view. But now that we're building the algorithms that already run much of the world, and soon may run all of it, choosing the right mix of centralization and decentralization may be the most important social issue of our time — more important than politics or innovations in non-computer fields. And whatever choice we make will have to be designed very carefully, so it cannot be smashed either by decentralized human activity leading to anarchy or by centralized AI algorithms reducing humans to slaves.
© 2023 Bloomberg LP