Fact of the Week: Image Recognition Algorithms Decreased Their Error Margin From 30 Percent in 2010 to Less Than 5 Percent in 2016

John Wu March 26, 2018

(Ed. Note: The “Innovation Fact of the Week” appears as a regular feature in each edition of ITIF’s weekly email newsletter. Sign up today.)

Image recognition algorithms are a subset of artificial intelligence technology—a critical technology for the future of productivity growth. Today, these algorithms power applications such as facial recognition and the automated processing of security footage. As the technology develops further, it will find more day-to-day uses, much as computers and smartphones have become commonplace for the modern worker.

Image recognition algorithms have improved rapidly in recent years. In 2010, such algorithms misclassified images roughly 30 percent of the time, an error rate that fell to less than 5 percent by 2016. For further examples of how artificial intelligence technologies are currently being used in the economy, see the Center for Data Innovation’s report The Promise of Artificial Intelligence: 70 Real-World Examples.
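The headline statistic is a classification error rate: the share of test images a model labels incorrectly. As a minimal sketch of how such a rate is computed (this is an illustration with made-up labels, not the benchmark's actual methodology), consider:

```python
def error_rate(predictions, labels):
    """Fraction of examples the classifier labeled incorrectly."""
    wrong = sum(1 for p, y in zip(predictions, labels) if p != y)
    return wrong / len(labels)

# Hypothetical predictions versus ground-truth labels
preds  = ["cat", "dog", "car", "dog", "bird"]
labels = ["cat", "dog", "car", "cat", "bird"]
print(error_rate(preds, labels))  # 1 wrong out of 5 -> 0.2
```

A drop from a 30 percent to a 5 percent error rate means the algorithms went from mislabeling roughly 3 in 10 images to fewer than 1 in 20.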