Anyone who uses a smartphone or relies on algorithm-based products has firsthand experience with the ever-changing nature of technology. Users are swept along by upgrades offering higher speeds, new features, and greater personal exposure. How do we feel about these technologies, and how are they changing our lives?
That’s what behavioral scientist Nathanael Fast wants to know. The USC Marshall Associate Professor of Management and Organization studies the psychology of technology. And if the $1.5 million grant he and his research team won from the DoD’s Minerva Research Initiative is any indication, Fast’s work in this area is helping explain the complex implications of this cultural phenomenon.
“We live in a fascinating time. The future of humanity is being shaped before our very eyes, and the only thing we know for sure is that it’s going to be very different,” said Fast. “Changes are everywhere. How should we be designing and navigating the ever-changing human-technology relationship?”
“We’re finding that algorithm-operated technologies can be used to nudge behavior in ways that humans can’t.”—Nathanael Fast, Associate Professor of Management and Organization
Fast believes science can help, and he is dedicated to organizing the relevant academic disciplines into a new community focused on the psychology of technology. In 2016 he launched his efforts by co-organizing a conference with UC Berkeley behavioral scientist Juliana Schroeder. They have continued organizing the annual conferences and, in 2018, they established the Psychology of Technology Institute to provide ongoing support for the growing field.
“We’re building a network of scholars who want to work across disciplinary boundaries,” said Fast. “Today, science needs to be multidisciplinary in order to stay relevant. Our research explores factors that shape people’s attitudes about new technologies, as well as how these technologies transform how they live, work, play, and interact.”
The Effects of Tech
Fast’s Minerva grant, on which he is co-PI, will enable him to provide insight into the future of work. In the project, he and his colleagues are examining the organizational implications of introducing autonomy-mediated agents (e.g., algorithms, robots, digital assistants) into groups and teams.
In a related paper, “Technology and Social Evaluation: Implications for Individuals and Organizations,” published in The Cambridge Handbook of Technology and Employee Behavior (Cambridge University Press, 2019) and co-authored with his former doctoral student Roshni Raveendhran of UVA Darden, the team examines behavior-tracking technology and virtual/augmented reality to develop insights into the psychological and behavioral consequences of these novel technologies.
The researchers found that people are often more open to being tracked by technology than by humans because the absence of human observers eliminates the fear of being negatively evaluated. “We’re finding that algorithm-operated technologies can be used to nudge behavior in ways that humans can’t,” said Fast.
However, this willingness to use tech can have serious consequences. Fast has found that technology can undermine concern for privacy. His forthcoming paper, “Privacy matters...or does it? Algorithms, Rationalization, and the Erosion of Concern for Privacy,” scheduled for 2020 publication in Current Opinion in Psychology, argues that the unique benefits and psychology surrounding algorithm-based products are leading people down the path to a future without privacy.
In the paper, he and his former postdoctoral researcher, Arthur Jago of the University of Washington, show how the conveniences of algorithms appear to be systematically eroding our capacity and psychological motivation to take meaningful action to defend and protect personal privacy.
At the forefront of these discussions around psychology and technology, Fast asserts the need for more research into the impact of technology and its influence on behavior. He is doing his part. His next paper, “Power and Decision Making: New Directions for Research in the Age of Artificial Intelligence,” co-authored with Juliana Schroeder and scheduled for 2020 publication in Current Opinion in Psychology, looks at the possibility of power dynamics between humans and machines.
Historically, Fast said, the experience of power has been human-to-human. But with new advances in artificial intelligence, we experience power in human-to-AI interactions too. “Now that we can perceive minds in machines, will we experience power in new ways when interacting with the machines?” Fast asked.
When the machine acts like a low-power human, we may feel powerful, he said, which can trigger a high-power mindset. But can machines also make us feel like the lower-power actor, or even be designed to toggle our experience back and forth? What are the implications of our new relationships with machines, not only for power, but for social experiences more broadly?
Again, Fast notes the need for more research. “There are a lot of ways to do technology right,” he said, “and even more ways to mess it up. Hopefully, through innovative research and collaborative efforts to advance the science, we’ll get it right.”