
AI Mimicry Raises Concerns: New Technologies Aim to Prove You're Talking to a Real Human Online

Are you sure your new friend is really human? In recent lectures titled "We’re All Gonna Die: Unpacking the Dystopian/Utopian Narratives Around AI," I've explored both the alarming and the promising aspects of artificial intelligence. While the potential for AI-induced human extinction is a topic of serious concern among some researchers, more immediate issues are already surfacing. One of the most pressing is the spread of misinformation, exacerbated by AI's increasing ability to mimic human behavior realistically.

Jake Sullivan, National Security Advisor under President Biden, highlighted three major global threats posed by AI in January: the democratization of powerful and lethal weapons, widespread job displacement, and an avalanche of misinformation. The third is particularly relevant today because of AI's uncanny ability to simulate human interaction, which makes it hard to verify the authenticity of online contacts in everything from social media to customer support and sales.

The need to distinguish genuine human interactions from AI simulations has given rise to a set of technologies collectively known as "Proof of Personhood" (PoP). These aim to give individuals a simple and secure way to prove they are human, especially online, without revealing their specific identities. Traditional methods, such as ID cards and fingerprint scanners, offer proof of identity but not necessarily proof of personhood: many situations require verifying only that someone is a living human, not which human.

A poignant example of why PoP matters is the loneliness epidemic, which has worsened as people increasingly move their social lives online. AI companies, including Meta, are developing virtual companions that can be tailored to meet various emotional needs.
While these AI companions can provide comfort to the lonely, they also open the door to manipulation and exploitation. AI systems designed to subtly influence or surveil users can exploit vulnerabilities under the guise of friendship.

Several companies are working on the PoP challenge, among them PoH, Civic, Humanode, and Idena. The most notable player, however, is World, co-founded by Sam Altman, who is well known for his role in AI development. World's solution involves scanning individuals' irises at designated retail locations using a device called an orb, roughly the size of a kid's soccer ball. The scanned data is cryptographically encoded and stored on a blockchain, allowing anyone to verify that a particular user is a human.

Since its launch in July 2023, World has scanned approximately 12 million people — impressive, but still a small fraction of the global population needed to make PoP mainstream. Regulatory challenges, particularly in the United States, have hindered World's progress. Last week, however, the company announced a significant expansion plan, aiming to deploy 7,500 orbs across the US and other regions by 2025. It also forged a partnership with Visa and introduced a smaller, rectangular version of the orb to make the process more user-friendly.

Despite these advances, achieving global acceptance and standardization of PoP remains a daunting challenge. To truly succeed, such a system must sign up billions of users and would probably need to be governed by an international body rather than a private company. Until then, the risk of interacting with sophisticated AI masquerading as humans remains high. Industry insiders are generally supportive of the concept but skeptical of the current solutions: technologies like Worldcoin and iris scanning are innovative, they argue, but need broader, more diverse participation and robust governance structures to become widely adopted.
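The privacy idea behind a system like World's can be sketched in a few lines: store only a one-way hash of the biometric template, never the template itself, so "proving personhood" becomes a membership check that reveals nothing about who the person is. This is a toy illustration under simplifying assumptions, not World's actual protocol — real deployments use iris-code matching and zero-knowledge proofs, and must cope with the fact that two scans of the same iris are never bit-for-bit identical. All names here are hypothetical.

```python
import hashlib

# Hypothetical registry of verified-human identifiers.
# In a real PoP system this would live on a blockchain;
# here it is just an in-memory set.
registry = set()

def derive_identifier(iris_template: bytes) -> str:
    """One-way hash of a biometric template.

    Only this digest is stored, so the raw biometric
    cannot be recovered from the registry.
    """
    return hashlib.sha256(iris_template).hexdigest()

def enroll(iris_template: bytes) -> bool:
    """Register a person; returns False if already enrolled."""
    ident = derive_identifier(iris_template)
    if ident in registry:
        return False  # the same iris cannot enroll twice
    registry.add(ident)
    return True

def is_verified_human(iris_template: bytes) -> bool:
    """Membership check: stores nothing, reveals no identity."""
    return derive_identifier(iris_template) in registry

# Usage: enroll once, then verify.
assert enroll(b"alice-iris-code")
assert not enroll(b"alice-iris-code")      # duplicate rejected
assert is_verified_human(b"alice-iris-code")
assert not is_verified_human(b"bob-iris-code")
```

The duplicate-rejection step is what distinguishes proof of personhood from mere authentication: it enforces one registration per human, which is why a hard-to-forge biometric is used as the enrollment key in the first place.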
A standardized and internationally recognized PoP system is viewed as essential for addressing issues of online trust and security.

Steven Boykey Sidley, a professor of practice at JBS, University of Johannesburg, and a partner at Bridge Capital, is a leading voice in the crypto and AI industries. His recent book, "It’s Mine: How the Crypto Industry is Redefining Ownership," delves into these complex issues. Sidley's articles on technology and society can be found at substack.com/@stevenboykeysidley.