But as technology continues to develop, microchipping people doesn’t seem that far off. Earlier this year, Elon Musk’s secretive company, Neuralink, revealed that it is working to implant chips “in a human as soon as this year,” and last August one Tesla driver had her car’s valet key implanted in her arm (whether it actually works is unclear).
Even the two Intel researchers from the CW article, Andrew Chien, a professor of computer science at the University of Chicago, and Dean Pomerleau, say their predictions weren’t so far-fetched, given the technology we have now. In 2009, Chien acknowledged that “there are a lot of things that have to be done first, but I think [a microchip implant] is well within the scope of possibility.” In 2020, he points out that “it’s not widespread yet but there are a lot of people trying to make [microchip implants] happen.”
Pomerleau, currently a technology consultant for automotive and self-driving car services, also believes microchipping remains a near-term possibility. In 2009, his team was working simply to decode concrete nouns; today, the technology can predict which sentence a person might be thinking based on certain brain frequencies. “Progress continues to be made and so I think that it is possible that one day we will have the technology to decode brain activity, complex thoughts into bits and bytes that can be used to query the internet or things like that,” he said.
Once the technology is perfected, microchip implantation will probably become commonplace as society’s technological dependency grows, according to Chien. Many people are already psychologically dependent on their devices, which makes an implant seem like a logical next step. “There are all of these studies that are written about these people who have withdrawal symptoms if you take their smartphone away from them. There is already this deep psychological kind of dependence on being tethered to information, so this seems like a logical connection to that,” said Chien.
Further, many people are already so intertwined with data and the Internet that the security and privacy concerns of microchips would be comparable to the risks posed by the connected devices we have now. Chien argues that accessing location and biological data from a microchip would be no more difficult than accessing it from a smartphone or smartwatch synced to Bluetooth and map services, a car connected to the Internet, or a smart home device that listens to conversation for verbal commands.
However, Pomerleau highlighted a key difference between connected technology and a microchip implant: the ability to directly stimulate the brain. If a microchip can decode brain activity to control a computer, it also means that a computer could, in theory, feed information back to influence brain activity. “Having direct access to your thoughts and emotions through technology both for read out and, even more scarily, actually stimulating through brain computer interfaces could really exacerbate both the positive and negative impacts of technology in our lives,” he said.
Making Technology Work for Us
Once microchip implants become commonplace, companies could easily adopt the technology to track and monitor their employees in ways that could endanger workers’ well-being, which might, in turn, be detrimental to the company’s success.

Irina Raicu, the Internet Ethics Program director at Santa Clara University’s Markkula Center for Applied Ethics, argues that technology, whether it be microchipping or workplace productivity programs, strips people of their humanity, turning them into something to manipulate. “We’re treating people as something like robots or just other objects that we can control without allowing them to have autonomy and dignity. Those are things that go away if you’re being tracked all the time, whether by being microchipped or having cameras everywhere or sensors that can sense if you’re at your desk or not,” she said.
Chien has a similar concern. “We ought to be thinking about the balance between the use of those techniques and the supporting of the individual styles, that’s still productive in creating long-term work environments that people can be successful in,” he said. Simply put, people are not at their most productive, creative, or innovative when they are being watched by their organization or by others.
Raicu said that once technology no longer makes workers freer and happier, things need to change. “That’s where we want the technologist to think, ‘OK, who are you serving with this? And what is the ultimate goal? And how does this impact society more broadly?’” she said. The reality is that technology is not in control and does not determine what gets developed and implemented. “Technology doesn’t just develop itself. It’s human beings that are building it and deploying it and marketing it.”
But that means that just as humans are the ones advancing technology, we are also the ones who must regulate it. “We really need a whole education process to educate people about the limitations of these tools and then we will be able to talk about how to regulate them,” Raicu said. “But unfortunately we are in this period of rapid deployment of tools that are not even being audited to make sure that they work as they claim to work and so people are getting hurt right now while we are waiting for some control on these practices.”
Putting the ‘No’ in Technology
Society has developed a narrative that technology is inevitable and uncontrollable, creating a sense of helplessness. But Chien refuses to give technology that power and urges others to reject that helpless mentality as well. “I think the tendency to say that ‘technology has its own will and its way, and it will govern the future’ is a disempowering view and I think it’s actually the wrong view,” he said.

Even if technological developments are, in some ways, inevitable, that only reinforces the need for regulations and policies to guard against unintended outcomes. Nor is new technology necessarily good. Facial recognition is now suspect in every field. Driverless vehicles, once so exciting, are losing momentum as real-world deployment reveals lethal dangers, and while robotic automation encroaches on workplaces, it isn’t always taking away jobs. Microchips are not yet an immediate threat, but they could become another example of our fraught relationship with technology.
As humans continue to develop technology, government officials and lawmakers need to start crafting solutions before it is implemented; otherwise, we will have to face the consequences, said Pomerleau. “It’s not a technological solution but a sociological solution that we’re going to have to come up with if we are going to address the many [tech] challenges we are going to face in the next few decades.”