Assessing the State of Digital Skills in the U.S. Economy

November 29, 2021

An increasingly digitalized global economy requires ever-more digitally skilled workforces for nations to remain productive. Unfortunately, domestic and international assessments of digital skills show the United States is lagging its competitors.


According to the OECD, fully one-third of working-age Americans possess no or only limited digital skills. One in six is unable to use email, web search, or other basic online tools.
The United States ranks just 29th out of 100 countries for the digital acumen of its workforce in business, technology, and data science, according to Coursera.
This comes against a backdrop of increasing digital skills requirements for many U.S. occupations. Brookings found that whereas only 44 percent of U.S. jobs required medium-high digital skill levels in 2002, 70 percent did by 2016.
Digital skills are critical to higher wages: Jobs that incorporate higher levels of digital content pay more—in fact, for every 10 percent increase in ICT-task intensity, the average U.S. worker’s salary increases 4 percent.
The United States needs to increase its number of computer science graduates and concentrate particularly on women, who represented 37 percent of U.S. computer scientists in 1995, but just 24 percent today.
The United States needs to significantly increase its investment in workforce training, including for digital skills. As a share of GDP, the federal government now invests less than half as much in such programs as it did 30 years ago.
With corporate investment in workforce training also falling by 30 percent as a share of GDP from 1999 to 2015, Congress should expand Section 127 tuition credits.

Executive Summary

What Are Digital Skills and Why Do They Matter?

Assessing the State of Digital Skills in the U.S. Workforce

Best Practices in Teaching Digital Skills to the U.S. Workforce

Policy Recommendations



Executive Summary 

The global economy is increasingly digitalized. Oxford Economics estimated that in 2016 the digital economy accounted for 22.5 percent of global gross domestic product (GDP).[1] Going forward, analysts at the research firm IDC have estimated that as much as 60 percent of global GDP will be digitalized (meaning largely impacted by the introduction of digital tools) by 2022.[2] Countries that wish to successfully compete in the global digital economy must cultivate workforces possessing the requisite digital skills so that industries, enterprises, and even individuals can thrive in the digital environment. This report explores the state of digital skills across the U.S. economy, examining what they are, why they matter, the current extent of workforce digitalization, and how the United States fares in international digital skills comparisons. It concludes by providing a brief overview of some of the best practices and programs being introduced by nonprofit, academic, and corporate organizations to deepen the U.S. digital skills base and suggesting policy recommendations to further foster U.S. digital skills development.

What Are Digital Skills and Why Do They Matter?

The globalization of the digital economy profoundly impacts every industry. While the digitalization of the global economy has certainly brought entirely new industries and enterprises to the fore—web search, social media, artificial intelligence (AI), cloud computing, etc.—at least 75 percent of the value of data flows over the Internet actually accrues to traditional industries such as agriculture, manufacturing, finance, hospitality, and transportation.[3] This dynamic explains why the vast majority of the economic benefits from information and communication technologies (ICTs)—likely more than 80 percent for developed nations and 90 percent for developing ones—stem from greater adoption of ICTs within an economy, far more than the benefits generated by ICT production.[4] In other words, the value that gets created in the global economy is increasingly created digitally: In fact, some estimate that as much as half of all value created across the global economy over the next decade will be created digitally.[5] That’s why one estimate predicts that 97 million new digital jobs will be created globally in the first half of this decade.[6] Similarly, a separate report finds, “If America could train just five million workers for digital jobs in the next five years, it would drive an estimated $250 billion more in U.S. GDP growth.”[7] Therefore, if individual workers are going to contribute value in such an economy, they’ll need sufficient digital skills to do so.

Digital skills can be broadly construed across two different categories. First are the digital skills needed to work directly in ICT or digital industries, including computer science skills needed to code software, AI, and other computer systems; electrical engineering skills to design semiconductors, high-performance computers, and quantum computers; cybersecurity skills; and competencies to manage data centers and telecommunications networks. It’s these digital skills that have given rise to an Internet/digital tech sector that contributed $2.1 trillion to the U.S. economy in 2018, or about 10 percent of GDP.[8]

According to the U.S. Bureau of Labor Statistics’ Occupational Employment and Wage Statistics survey, the number of U.S. workers in the most-relevant category—Computer and Mathematical workers—increased by 40 percent over the past decade, from 3.3 million workers in May 2010 to 4.6 million workers by May 2020. Baselining to 2005, over the past 15 years, employment among information technology (IT) workers has grown almost 60 percent faster than overall employment in the U.S. economy. (See figure 1.)

Figure 1: Indexed job growth in U.S. IT and overall workforce, 2000–2020 (2005 baseline = 1)[9]


The Computing Technology Industry Association (CompTIA) took a slightly different approach in its annual “Cyberstates” report by developing a count of technology professionals employed across the economy, including in the technology sector (e.g., software developers, network architects, database administrators); professionals employed by IT firms though not working in IT roles directly (e.g., sales, marketing, customer service professionals); and self-employed workers in the ICT-enabled “gig” economy (e.g., Uber drivers).[10] By this measure, CompTIA generated a count of 12.4 million “tech workers” in the U.S. economy, with 66 percent, or 8.2 million, of those workers directly employed in core ICT roles.

ICT jobs represent some of the fastest-growing across the economy. Burning Glass Technologies, a Boston-based labor market analytics firm, found that, from 2012 to 2017, demand for data analysts increased by 372 percent, with a subset within that job grouping (data-visualization specialists) rocketing up by 2,574 percent.[11] And that trend continues, with many ICT occupations expected to grow at four to five times the national rate over the course of this decade, led by anticipated growth in jobs in cybersecurity, data science, and software development.[12] (See figure 2.)

Figure 2: Anticipated fastest-growing ICT jobs in U.S. economy over coming decade[13]


A second category of digital skills pertains to individuals’ facility with using digital tools in traditional work environments, such as the ability to work with basic Microsoft Office programs such as word processors, spreadsheets, email, or networking software; the ability to use databases or customer relationship management (CRM) tools; the ability to engage in social media or use video conferencing tools; to use mobile applications supporting point-of-service operations; to interpret the output of AI-based systems; or to use augmented reality/virtual reality (AR/VR) tools to repair an automobile or jet engine. Indeed, it’s this broader capacity of a nation’s workforce that constitutes the transmission mechanism for digitalization to manifest its transformative impact across traditional industries from agriculture and finance to manufacturing and medicine. As the Brookings Institution’s Mark Muro and his colleagues framed it in their excellent report “Digitalization and the American Workforce”:

A sizable portion of the nation’s middle-skill employment now requires dexterity with basic information technology tools, standard health monitoring technology, computer numerical control equipment, basic enterprise management software, customer relationship management software like Salesforce or SAP, or spreadsheet programs like Microsoft Excel.[14]

Nations need to support the development of digital skills across both ends of this spectrum. Economists have long recognized that ICTs constitute a general purpose technology that turbocharges the productivity, innovation, and business-model-generation capacity of virtually all downstream industries that utilize it.[15] Indeed, ICT represents “super capital” that has a much larger impact on productivity growth than other forms of capital do.[16] For instance, ICT capital has a three to seven times greater impact on firm productivity than non-ICT capital does. ICT workers also contribute three to five times more productivity than non-ICT workers do.[17] This dynamic explains why the digital economy has driven 86 percent of U.S. labor productivity growth in recent years, despite accounting for only 8.2 percent of U.S. GDP.[18]

The greater productivity of ICT-industry workers, or workers who possess greater levels of digital skills, explains why such workers are often able to command higher wages. For instance, the Organization for Economic Cooperation and Development (OECD) has found that a 10 percent increase in the ICT-task intensity of jobs (at the country mean) correlates to an average 2.5 percent increase in hourly wages across OECD economies, with the United States experiencing the highest effect at 4.08 percent. (See figure 3.)

Figure 3: Returns on ICT tasks—percentage change in hourly wages for a 10 percent increase in ICT task intensity of jobs (at the country mean), 2012 or 2015[19]


The finding that jobs involving high digital-task intensity and greater levels of digital skills pay more has been corroborated repeatedly. Brookings’ 2017 “Digitalization and the American Workforce” report examines the digital content of 545 occupations covering 90 percent of the U.S. workforce in all industries from 2002 to 2016. The report leverages the Occupation Information Network (O*Net) database, which surveys workers on the knowledge, skills, tools and technology, education and training, and work activities required to perform their jobs.[20] Here, two of O*Net’s three technology-related variables are relevant: “Knowledge: Computer and electronics” (which measures the overall knowledge of computers and electronics required by a job) and “Work activity–interacting with computers” (which quantifies the centrality of computers to the overall work activity of the occupation).[21] Brookings used these variables to develop occupational digitalization scores for the 545 occupations across 23 highest-level industries, segmenting occupations into three tiers of digitalization: low, medium, and high.
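The scoring-and-tiering step described above can be sketched in a few lines of code. This is an illustrative reconstruction, not Brookings’ actual methodology code: the report defines “high” as a digitalization score above 60 on the 100-point scale, while the low/medium boundary of 33 used here is an assumed cutoff for illustration only.

```python
# Illustrative sketch of the tiering step in the Brookings methodology:
# each occupation gets a 0-100 digitalization score (built from the two
# O*Net variables named above) and is then binned into low/medium/high.
# The "high" cutoff of 60 is stated in the report; the low/medium
# boundary of 33 is an assumption for illustration.
def digital_tier(score: float, low_cutoff: float = 33, high_cutoff: float = 60) -> str:
    if not 0 <= score <= 100:
        raise ValueError("digitalization scores run from 0 to 100")
    if score > high_cutoff:
        return "high"
    if score > low_cutoff:
        return "medium"
    return "low"

print(digital_tier(72))  # high
print(digital_tier(45))  # medium
```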

Not surprisingly, Brookings found that the mean annual wage for workers in highly digital occupations reached $72,896 in 2016, with workers in middle-level digital jobs earning $48,274, and those in the least digitally intense positions earning $30,933. (See figure 4.) Importantly, Brookings found that digitalization scores have significant and positive effects on real annual wages even when controlling for education level, and that the wage premium for computer skills nearly doubled between 2002 and 2016. To wit, in 2002, a one-point increase in digitalization score predicted a $166.20 (in 2016 dollars) increase in real annual average wages for occupations with the same education requirements; by 2016, this wage premium had almost doubled to $292.80.[22]
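The wage-premium figures above imply a simple linear relationship that can be made concrete. The sketch below is illustrative only, using the per-point dollar premiums Brookings reported (in 2016 dollars) for occupations with the same education requirements.

```python
# Illustrative linear model of the Brookings wage premium: each one-point
# increase in an occupation's digitalization score predicts a fixed increase
# in real annual wages (2016 dollars), holding education requirements constant.
PREMIUM_PER_POINT = {2002: 166.20, 2016: 292.80}  # dollars per point

def predicted_wage_gain(points: float, year: int = 2016) -> float:
    """Predicted increase in real annual wages for a given score increase."""
    return points * PREMIUM_PER_POINT[year]

# A 10-point score increase predicted roughly $1,662 more in annual wages
# in 2002 but roughly $2,928 more by 2016: the premium nearly doubled.
print(round(predicted_wage_gain(10, 2002), 2))  # 1662.0
print(round(predicted_wage_gain(10, 2016), 2))  # 2928.0
```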

Figure 4: Mean annual wage by digitalization level, 2016[23]


Brookings’ findings have also been corroborated by those from the most-recent OECD Survey of Adult Skills (formally known as the “Programme for the International Assessment of Adult Competencies” or PIAAC, and subsequently elaborated upon), which assesses workers from over 40 nations on 11 ICT-related measures ranging from simple use of the Internet to use of a word processor, spreadsheet software, or programming language.[24] As the National Skills Coalition reported in “The New Landscape of Digital Literacy,” workers with limited or no digital skills are concentrated in the lower earnings quintiles: Among workers reporting no digital skills, 32 percent were in the lower-middle quintile of wage earners and 25 percent were in the bottom quintile, while among those reporting limited digital skills, 26 percent were in the lower-middle quintile and 21 percent in the bottom quintile.[25] (See figure 5.)

Figure 5: Earnings of U.S. workers ages 16–64 in quintiles by digital skill levels[26]


Assessing the State of Digital Skills in the U.S. Workforce

This section assesses the state of digital skills across the U.S. workforce broadly, first by industry and occupation, then across various IT platforms, and finally in terms of international comparisons.

Assessing U.S. Workforce Digital Skills Broadly, By Industry, and By Occupation

The OECD’s PIAAC survey categorizes adult workers into four levels of digital skills: no digital skills, limited digital skills, proficient digital skills, and advanced digital skills.[27] (Technically, the survey tests for “digital problem solving in technology-rich environments.”)[28] Individuals with no digital skills failed to meet one or more of three basic digital skills criteria: 1) prior computer use; 2) willingness to take the computer-based assessment; and 3) ability to complete four out of six extremely basic computing tasks, such as using a mouse or highlighting text on screen.[29] Individuals with limited digital skills could complete only very simple digital tasks involving a generic interface and a few steps (e.g., placing incoming Outlook emails into specified folders).[30] Those possessing proficient digital skills were capable of using basic digital productivity tools (e.g., Microsoft Outlook), and could use digital tools such as the sort function in spreadsheets to solve problems, yet still struggled with tasks involving the use of both generic and specific technology applications.[31] Lastly, those possessing the most-advanced digital skills in the PIAAC survey have facility in “the use of both generic and more specific technology applications… [can navigate] across pages and applications… [and can solve tasks that] may involve multiple steps and operators.”[32]
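The screening logic described above can be summarized as a simple decision rule. This is a hedged sketch of the categorization as described in the text, not PIAAC’s actual scoring procedure, and the parameter names are illustrative:

```python
# Sketch of the PIAAC digital-skills categorization described above.
# A respondent lands in "no digital skills" by failing any of three gateway
# criteria; otherwise the computer-based assessment assigns one of the
# remaining three bands. Parameter names are illustrative, not PIAAC's own.
def piaac_category(prior_computer_use: bool,
                   took_computer_assessment: bool,
                   basic_tasks_passed: int,  # out of six (e.g., using a mouse)
                   assessment_band: str) -> str:
    if assessment_band not in {"limited", "proficient", "advanced"}:
        raise ValueError("unknown assessment band")
    if (not prior_computer_use
            or not took_computer_assessment
            or basic_tasks_passed < 4):
        return "no digital skills"
    return assessment_band

print(piaac_category(True, True, 3, "proficient"))  # no digital skills
print(piaac_category(True, True, 5, "limited"))     # limited
```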

While it should be apparent that even the above-referenced digital problem-solving measures don’t constitute “tests” for extremely high levels of digital aptitude, the PIAAC data reveals that fully one-third of American workers nonetheless lack adequate digital skills, with 13 percent of U.S. workers having no digital skills and 18 percent having at best limited digital skills, while 35 percent could be deemed proficient and 33 percent advanced. (See figure 6.) In essence, one in six working-age Americans is unable to use email, web search, or other basic online tools.[33] As the National Skills Coalition observed, there is significant digital skills fragmentation among U.S. workers: Many may be comfortable using smartphone apps but unfamiliar with how to operate a mouse, use Microsoft Office productivity tools, or even upload employment applications to websites.[34] In part, that’s because 23 percent of U.S. households do not own a desktop or laptop computer, while over 7 percent of all Americans simply don’t use the Internet.[35] This digital divide hits U.S. minority groups particularly hard, with 23 percent of African American and 25 percent of Latino respondents to a 2019 Pew Research Center study reporting having smartphone-only Internet access.[36]

Figure 6: Extent of digital skills in the U.S. workforce according to OECD PIAAC survey data[37]


Table 1: Percentage of workers with no or limited digital skills, selected industries[38]

Industry | No Digital Skills | Limited Digital Skills | No or Limited Combined
Construction, transportation, and storage | – | – | –
Retail, wholesale, and auto repair | – | – | –
Hospitality and other services | – | – | –
Administrative and support services; arts, entertainment, and recreation | – | – | –
Health and social work | – | – | –
Finance, insurance, and real estate | – | – | –

The lack of workforce digital skills is particularly acute in certain industries. Across the U.S. construction, transportation, and storage industry, fully half of all workers have no or only limited digital skills, while that share is over one-third across the health and social work, manufacturing, hospitality, and retail and wholesale industries. (See table 1.) The lack of digital skills in the manufacturing sector is particularly concerning because jobs in U.S. manufacturing increasingly demand facility with digital skills, which matters both for individual workers’ competitiveness and productivity and for the competitiveness of U.S. manufacturing industries more broadly.

Unfortunately, despite this rather significant gap in digital skills across the U.S. workforce, the digital skills requirements of U.S. occupations continue to increase apace. Brookings’ study “Digitalization and the American Workforce” finds that, from 2002 to 2016, the share of employment in occupations with high digital content (defined as occupations with digital scores above 60 on a 100-point scale) nearly quintupled, from 4.8 to 23 percent of employment, while employment in occupations with medium digital content requirements increased from 39.5 to 47.5 percent, and the share of jobs with low digital content requirements nearly halved, shrinking from 55.7 to 29.5 percent. (See figure 7.) In essence, over 70 percent of U.S. jobs now require middle- to high-level digital skills. In absolute terms, Brookings found that, as of November 2017, over 32 million U.S. workers were employed in highly digital jobs, 66 million held moderately digitally intense positions, and only 41 million were in jobs requiring low levels of digital skills.[39] Archetypal occupations requiring high levels of digital skills included ICT workers, electrical engineers, and financial managers; occupations requiring medium digital skills levels included attorneys, mechanics, and registered nurses; while examples of low-digital-skill-level occupations included security guards, restaurant cooks, and personal care aides.

Brookings further found that digitalization scores rose in the overwhelming majority of analyzed occupations (517 of the 545) from 2002 to 2016. Moreover, Brookings found that “job creation in the last few years has heavily favored digitally oriented occupations,” with nearly 4 million of the 13 million new jobs created from 2010 to 2016 (almost 30 percent) requiring high-level digital skills.[40] (See figure 8.)

Figure 7: U.S. employment by levels of digitalization[41]


Figure 8: U.S. jobs created by digital skill level, 2010–2016[42]


The extent of industry digitalization varies widely, with jobs in the ICT industry—where 70 percent require high levels of digital skills and 29 percent medium levels—unsurprisingly leading, followed by jobs in the finance and insurance industry and the utilities sector. (See figure 9.) Perhaps most notable are the digital skills requirements of occupations in advanced manufacturing. Whereas, in 2002, 47 percent of advanced manufacturing jobs required only limited digital skills (and just 15 percent required advanced skills and 38 percent middle-level digital skills), by 2016 the share of advanced manufacturing jobs requiring high levels of digital skills had increased to 34 percent, those in the mid-tier had increased to 48 percent, and those needing the least digital skills had shrunk by nearly 30 percentage points, to a mere 18 percent share.

Figure 9: Share of sector employment by digital score, 2016[43]


Figure 10: Digital skill ratings for occupations in advanced manufacturing[44]


Notably, technology-oriented occupations are increasing their share of advanced manufacturing employment at the expense of routine occupations. For instance, from 2006 to 2016, the U.S. manufacturing industry added 73,130 workers in software development for systems software and 51,900 as mobile application software developers (a job category that scarcely existed in 2002) as well as another 15,000 jobs in computer systems analysis, graphic design, and operations research analysis. Conversely, the industry shed nearly 20,000 jobs in routine roles such as machine feeding.[45] The advent of digital manufacturing has introduced not just new skill requirements and responsibilities but even entirely new types of jobs. For instance, a recent report by MxD and Manpower Group, “The Digital Workforce Succession in Manufacturing,” identifies 165 completely new job roles pertinent to digital manufacturing and design.[46]

Moreover, across virtually every occupation in advanced manufacturing, the need for digital skills increased from 2002 to 2016. For instance, whereas for tool and die making the digital component of the job was virtually nil in 2002, by 2016, this had changed to become a job with a medium level of digital skills content relative to other jobs in the U.S. workforce. And for advanced manufacturing jobs that required a medium level of digital skills in 2002—such as for automotive service technicians and mechanics, industrial engineers, and mechanical engineering technicians—by 2016, this had jumped to requiring a high level of digital skills. (See figure 10.)

It should also be noted that, in general, the higher the mean digital score for occupations in an industry, the greater that industry’s output, productivity, and wage growth from 2010 to 2016. As the Brookings report notes, “In an era of slowing productivity growth, the most highly digitalized sectors led the U.S. economy in productivity increases,” with the ICT and media sectors increasing their annual productivity by more than 2.5 percent.[47] (See table 2.) By contrast, productivity in medium-digital sectors such as health care and advanced manufacturing grew by only a modest 0.7 and 0.3 percent per year, respectively, while the lagging hospitality, construction, nursing and residential care, and accommodation and food services industries experienced negative productivity growth.[48] In general, the more highly digitalized industries in the Brookings study also realized higher output and wage growth from 2010 to 2016. In sum, the Brookings report showcases the ever-increasing importance of digital skills across virtually every occupation and industry in the U.S. economy.

Table 2: Industry mean digitalization scores and economic performance, 2010–2016[49]

Industry | Mean Digital Score, 2016 | Output Change (CAGR, 2010–2016) | Productivity Change (CAGR, 2010–2016) | Wage Change (CAGR, 2010–2016)
Professional, scientific, and technical services | – | – | – | –
Finance and insurance | – | – | – | –
Management of companies and enterprises | – | – | – | –
Healthcare services and hospitals | – | – | – | –
Information and communications technology | – | – | – | –
Wholesale trade | – | – | – | –
Oil and gas extraction | – | – | – | –
Educational services | – | – | – | –
Retail trade | – | – | – | –
Advanced manufacturing | – | – | – | –
Transportation and warehousing | – | – | – | –
Basic goods manufacturing | – | – | – | –
Nursing and residential care facilities | – | – | – | –
Accommodation and food services | – | – | – | –
Mining (except oil and gas) | – | – | – | –

Assessing U.S. Workforce Digital Skills Across Specific Digital Technologies

Beyond examining U.S. workforce digital skills broadly across an economy, it’s also possible to consider them through the lens of facility with emerging digital technologies, such as AI and AR/VR.

U.S. Workforce Digital Skills in AI

AI represents an increasingly important driver of global economic growth. For instance, a 2018 McKinsey Global Institute report estimates that, by 2030, applications of AI will boost global productivity by 1.2 percent annually, increasing the size of the global economy by $13 trillion.[50] Elsewhere, an Accenture report estimates that AI could make the U.S. economy $8.3 trillion larger by 2035.[51] Accenture further estimated that companies “investing in AI and in human-machine collaboration” will boost their revenues by 38 percent between 2018 and 2022.[52]

As Paul Daugherty and James Wilson wrote in their excellent book Human + Machine: Reimagining Work in the Age of AI, the AI era has given rise to numerous new-to-the-world jobs, including data scientist/data analyst, AI applications developer, AI applications engineer, cognitive systems engineer, machine-learning engineer or specialist, collaborative robotics specialist, digital twin analyst or architect, data-quality analyst, interaction modeler, algorithmic forensics analyst, human-computer interaction specialist, and explainability strategist.[53] Beyond creating entirely new jobs, a 2019 Brookings report, “What Jobs Are Affected by AI,” finds that “virtually every occupational group” in the U.S. economy would be impacted by AI.[54] The report also finds that 740 out of 769 occupations assessed “contain a capability pair match with AI patent language, meaning at least one or more of its tasks could potentially be exposed to, complemented by, or completed by AI.”[55]

An essential feature of the AI era will be the need for humans and machines to work together. In Human + Machine, Daugherty and Wilson identified a set of roles wherein human capacity will remain supreme (tasks such as leading, empathizing, creating, and judging) and a set of tasks wherein machines are likely to outperform humans (e.g., transacting, iterating, predicting, and adapting). (See figure 11.) Yet in the middle lies a set of tasks wherein humans will enable machines (in functions such as training, explaining, and sustaining) and ones wherein “AI will give humans superpowers,” thus augmenting human capabilities (in functions such as amplifying, interacting, and embodying).[56] It’s in this “missing middle,” as Daugherty and Wilson contended, where humans and machines complement and extend the potential of one another—and give businesses significant potential to reimagine their existing work processes “to take advantage of collaborative teams of humans working alongside machines.”[57] They argued, “The simple truth is that companies can achieve the largest boosts in performance when humans and machines work together as allies, not adversaries, in order to take advantage of each other’s complementary strengths.”[58]

Figure 11: Roles of humans and machines in the AI era[59]

Thus, as the Information Technology and Innovation Foundation (ITIF) wrote in “The Manufacturing Evolution: How AI Will Transform Manufacturing and the Workforce of the Future,” one of the ironies of the AI era is that as AI becomes more prevalent, investing in people becomes all the more important.[60] Yet there is a disconnect here. For instance, even though almost half of 1,200 CEOs and senior executives surveyed by Accenture in a recent study identified digital skills shortages as a key workforce challenge, only 3 percent reported that their organizations had plans to significantly increase investment in training programs over the next three years.[61] Perhaps one of the challenges here is that employees are actually more ready for the AI transformation than their employers think they are. For instance, only one-quarter of executives in the Accenture study reported they “believe our workforce is ready for AI adoption.”[62] However, 68 percent of highly skilled workers and nearly half of the lower-skilled workers were enthusiastic about AI’s potential impact on their work, while 67 percent of workers considered it important to develop their own skills to work with intelligent machines. Moreover, another survey of workers finds that while nearly all (92 percent) believe the next generation of workplace skills will be radically different from the prior one, 87 percent believe that, within the next five years, new technologies such as AI will improve their work experience.[63]

In 2019, ITIF and the Manufacturers Alliance surveyed over 200 mid-sized U.S. manufacturers, with sales generally between $500 million and $10 billion, inquiring how they anticipated AI would transform their manufacturing operations and workforces.[64] The survey finds that it remains early days for AI jobs in U.S. manufacturing, with 72 percent of companies surveyed having yet to introduce such roles. However, manufacturers expect that situation to change rapidly over the next five years. Among the manufacturers surveyed, by 2024, nearly 80 percent expect to employ data scientists/data analysts (up from 43 percent today), 70 percent expect to employ machine-learning engineers or specialists (up from 33 percent today), 56 percent expect to employ collaborative robotics specialists (up from 29 percent today), and 60 percent expect to employ AI solutions programmers or software designers (up from 26 percent today). (See figure 12.) Surveys such as these make it clear: Facility with AI skills will soon become a critical component of not just manufacturing jobs, but many other jobs across the entire U.S. economy.

Figure 12: AI-related jobs U.S. manufacturers anticipate introducing[65]


U.S. Workforce Digital Skills in AR/VR

AR/VR—immersive technologies that enable users to experience digitally rendered content in both physical and virtual spaces—offer promising opportunities to enhance workforce training. These technologies can give workers hands-on experience in low-risk simulated environments as well as real-time, on-the-ground guidance, which in turn can reduce operational costs, increase engagement, and provide valuable insight for future training.[66] Because of this, there is growing interest in AR/VR training solutions across many industries. According to the research firm IDC, nearly one-third of Global 2000 (G2000) manufacturers plan to be using immersive tools by 2022.[67] Thanks to that kind of penetration, a 2019 report by PwC predicts that VR and AR could add $1.5 trillion to the global economy by 2030, with development and training contributing $294.2 billion.[68] And in a 2018 survey of automotive, manufacturing, and utilities industry executives, a significant majority of respondents indicated they believed AR/VR technologies would become mainstream in their organizations within the next five years.[69] Given the demonstrated benefits and enthusiasm around these tools, it is hardly surprising that the market for immersive training solutions is expected to grow from $20.6 billion in 2020 to $463.7 billion in 2026.[70]

Advanced, highly specialized, and high-risk industries are natural candidates for AR/VR training tools, as immersive solutions allow workers to repeatedly practice certain tasks without having to worry about making costly or dangerous mistakes. For example, VR welding simulators allow students to practice in a safe environment and receive real-time feedback on their performance. One 2012 study estimates that integrating these tools into training could lead to over $200 in savings per student.[71] Immersive solutions can also provide ongoing training and knowledge transfer on the ground. For example, industrial manufacturer Howden developed an AR training system to streamline expert knowledge sharing across global product units and sales teams.[72]

AR/VR solutions can improve workforce training beyond manufacturing and other advanced industries. For example, Walmart uses a VR experience to allow floor associates to practice different scenarios, including “holiday rush” modules that “simulate the high-stakes, chaotic environment of Black Friday.”[73] During the pilot phase, the training resulted in notably higher employee satisfaction, knowledge retention, and test scores.[74] Similarly, JetBlue partnered with the AR/VR solution developer Strivr Labs to develop VR training for technicians working on JetBlue aircraft.[75] Even Kentucky Fried Chicken now trains workers on food safety using VR goggles.[76]

Immersive training is also increasingly popular in health care education. For example, the HoloAnatomy platform developed by Case Western Reserve University uses Microsoft HoloLens mixed reality (MR) headsets to provide medical students with scientifically accurate anatomical models they can interact with in real time.[77] HoloLens has proven to be an engaging, effective, and cost-efficient alternative to traditional cadaver-based anatomy education.[78]

Assessing U.S. Workforce Digital Skills in International Comparisons

The United States posts mixed, and increasingly underwhelming, results in international comparisons of digital skills. The OECD’s PIAAC survey measured “use of ICT skills at work” across 14 OECD countries in 2012. On this index, the United States ranks fourth with a score of 2.1, ahead of the OECD average of 1.99 yet behind leaders New Zealand (and, surprisingly, Mexico) with a score of 2.17. (See figure 13.)

Figure 13: Use of ICT skills at work in OECD PIAAC survey, 2012[79]