AI can read your emotional response to advertising and your facial expressions in a job interview. But if it can already do all this, what happens next? In part two of a series on emotion AI, Jennifer Strong and the team at MIT Technology Review explore the implications of how it's used and where it's heading.
We meet:
- Shruti Sharma, VSCO
- Gabi Zijderveld, Affectiva
- Tim VanGoethem, Harman
- Rohit Prasad, Amazon
- Meredith Whittaker, NYU’s AI Now Institute
Credits: This episode was reported and produced by Jennifer Strong, Karen Hao, Tate Ryan-Mosley, and Emma Cillekens. We had help from Benji Rosen. We’re edited by Michael Reilly and Gideon Lichfield.