I recently participated in a roundtable discussion with top experts in the medical field, and one of the topics focused on AI’s impact on jobs in this area. Many participants voiced concerns about AI affecting some medical jobs, but most were enthusiastic about AI’s impact on medicine in general.
Although I am not a medical expert, I am knowledgeable about AI’s impact on business processes. During the discussion, I emphasized that, regardless of the business vertical, AI’s effect will be enormous. For example, it can streamline medical billing, consumer interaction, customer service, and outpatient health monitoring, in addition to its potential in drug discovery and various medical treatments.
AI isn’t destroying jobs—it’s reshaping them
This focus on jobs led me to reflect on the pivotal point artificial intelligence and employment have now reached. While doom-and-gloom news cycles portray widespread job displacement by AI, emerging evidence from Vanguard provides a different perspective—one that may encourage workers and business leaders to rethink such scenarios.
A recent Vanguard analysis turned up something quite interesting: jobs with high exposure to AI have seen stronger wage and job growth than jobs with minimal exposure. Wages in AI-exposed jobs rose 3.8% from the second quarter of 2023 to the second quarter of 2025, compared with just 0.7% growth in all other jobs. Employment growth shows a similar pattern, up 1.7% in industries closely linked to AI versus 0.8% elsewhere.
These findings contrast sharply with the dystopian predictions often found in discussions of AI and work. Notably, institutional investors and CEOs surveyed expect hiring to increase across all organizational levels in 2026 because of AI's influence.
Of course, context plays a massively important role here. The job market has cooled in recent months, with federal data showing particular weakness in tech, consulting, and other high-skill sectors such as software engineering. However, laying the blame for this at the feet of artificial intelligence would do the full story a disservice.
Martha Gimbel, the executive director of the Budget Lab at Yale University, explains the dilemma aptly: “There’s a lot of change occurring in the labor market at this time.”
Identifying the specific contribution of AI during this tumultuous period requires a rigorous analysis—that’s what Vanguard tried to do.
The team investigated Labor Department databases containing very detailed information on nearly every occupation in the U.S. They looked at occupations in which AI can supplement or replace certain tasks, such as data analysis, and compared them with occupations in which AI plays little or no role at all, such as construction or cleaning.
This dynamic becomes clearer when viewed through specific industries. For example, Aaron Levie, CEO of Box, points to software engineering. He argues that AI has both increased productivity and reduced costs in software development. Rather than displacing engineers, AI has expanded the range of possible solutions, resulting in greater demand for their skills.
“If you make it inexpensive enough to build software, you’ll see 10 times more uses for software,” noted Levie. That, he said, is “the part of the equation that everyone is missing.” His assured prognosis? “People who are predicting massive job destruction are wrong.”
Historical parallels suggest optimism
This pattern is not new. A recent example is Apple's launch of the iPhone. Apple's revolutionary device did not simply displace other methods; it created a whole new economy centered on an application platform, giving rise to occupations that had never existed before.
Nevertheless, we should keep these results in perspective. We are still in the early phases of the AI shift, and the technology's development may well exceed even the boldest predictions.
The Vanguard study offers a telling snapshot of where we are today, not where we will one day find ourselves.
One thing to watch carefully is the character of current AI investment. Much of what is happening now is infrastructure spending, such as data centers and compute capacity, rather than direct hiring. While that spending can ultimately drive productivity, the payoff remains a work in progress.
There is historical precedent for technological advances creating jobs rather than destroying them. Automation has been under way for centuries, and jobs still exist. The same would presumably hold true for AI.
Perspective matters
The picture emerging from the early data is more nuanced than either the pessimists or the optimists first suggested. AI appears to augment human capabilities, allowing workers to take on higher-level tasks while delegating lower-level ones to machines. Because this boosts productivity, it increases workers' value and could lift their wages if the transition is handled with care.
In the roundtable discussion I mentioned above, I said that one thing is certain when it comes to AI and jobs: people will need to learn AI skills to do the jobs of the future. Their current skills will be augmented by AI, helping them be more effective and impactful in their work.
To sum up, while the future remains uncertain, early data indicates that AI has the potential to increase job opportunities and wages rather than reduce them. We should keep a balanced perspective, recognizing that technological change has historically expanded human work opportunities. If handled carefully, AI may do the same.