Interesting, but when you look into it :) , at the end of that article they even link to another one:
More on AI and jobs: AI Is Making It Nearly Impossible to Find a Well-Paying Job. Is This the World We Want?
My 2¢:
Yale’s takeaway (high level)
- Aggregate occupational shares and broad labor statistics do not show a glaring, economy-wide disruption since ChatGPT's release: "no discernible disruption yet." (The Budget Lab at Yale)
What more granular evidence shows
- Job postings for tech/entry roles are down, layoffs in tech/IT are large and ongoing, entry-level/young workers in AI-exposed occupations show meaningful declines, and employers are increasingly demanding AI skills — all consistent with early displacement and a reshaping of demand. (Dice, Indeed Hiring Lab, TechCrunch)
Where Yale’s method is strong and where it’s blind
- Uses robust, well-known aggregate data (occupational shares, labor force statistics) that are reliable for detecting long-run structural changes. (The Budget Lab at Yale)
Key blind spots / limitations (that matter)
- Lagging and aggregated data: occupational shares and monthly unemployment figures are slow and smooth over micro churn (hires, attrition, entry-level squeezes). Early displacement often appears first in postings, layoff announcements, and hiring practices, not immediately in aggregate employment totals. (The Budget Lab at Yale)
- No job-posting / vacancy analysis: changes in demand (fewer job ads for certain roles) are early leading indicators but are not used centrally in Yale's analysis. Job-ad declines are already observable in several markets. (The Guardian)
- No direct layoff attribution or firm-level AI-adoption linkage: Yale cannot say whether recent layoffs are caused by AI vs. macro pressures; firm-level disclosures, layoff trackers, and management statements do point to restructuring often tied to automation strategies. (TechCrunch)
- No wage/quality-of-job signals: wage compression, reduced entry-level opportunities, or fewer high-pay openings are early signs of "hollowing out" but aren't captured by headline employment counts. UK job-tracking data already show fewer graduate/entry openings. (The Guardian)
- Limited task-level resolution: Yale aggregates by occupation, but the most consequential shifts occur at the task level within occupations (some tasks automated while others persist). New task-based indices show better early predictive power. (arXiv)
Concrete empirical items Yale missed (or treated weakly)
- Job postings falling in AI-vulnerable roles. Several sources show postings for tech/entry roles and high-exposure occupations declining since late 2022, an early sign of reduced demand. (Indeed Hiring Lab)
- Sustained tech/IT layoffs and restructuring. Large aggregated layoff tallies (2024 and continuing into 2025) disproportionately hit tech/IT; management frequently cites "efficiency"/automation as the reason. Layoff trackers and TechCrunch lists document the scale. (TechCrunch)
- Young / entry-level workers are being hit first. High-frequency payroll and administrative datasets show relative employment declines for ages ~22–25 in highly AI-exposed occupations. That's a strong micro-signal Yale's occupational shares can smooth away. (Stanford Digital Economy Lab)
- AI skills are shifting hiring requirements. A rising share of tech job postings now list AI skills as required or preferred, meaning competition and job-match requirements are changing rapidly. (Dice)
- Task-based exposure indices predict job-posting declines. New task-based GAISI measures correlate with declines in postings and price premia, suggesting displacement may already outweigh productivity gains in some cohorts/roles. (arXiv)
How the data is skewed and what that does to Yale’s conclusion
- Survivorship and smoothing bias: aggregate employment hides worker churn (firms replacing junior hires with fewer senior staff, contractors, or AI). Yale's "no big change" claim is true for headline totals but misses composition changes. (The Budget Lab at Yale)
- Labeling bias: firms seldom say "AI caused this layoff." They use euphemisms ("restructuring," "streamlining") that dilute causal attribution in aggregate studies. Layoff trackers and reporting pick up the pattern even if official filings don't label it "AI." (TechCrunch)
- Measurement bias toward incumbents: datasets measuring current employment undercount those who never enter a job because entry roles vanished, which harms young workers disproportionately. Yale's approach underweights this missing cohort. (Stanford Digital Economy Lab)
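To make the smoothing-bias point concrete, here's a toy sketch with entirely invented numbers: the headline headcount stays flat year over year, while the entry-level share of the workforce quietly collapses — exactly the kind of composition change an aggregate total can't see.

```python
# Toy illustration (invented numbers, not real labor data):
# aggregate headcount can stay flat while the workforce
# composition shifts away from entry-level roles.

before = {"entry": 400, "mid": 400, "senior": 200}   # year 0
after  = {"entry": 250, "mid": 450, "senior": 300}   # year 1: fewer juniors hired

total_before = sum(before.values())
total_after = sum(after.values())

entry_share_before = before["entry"] / total_before
entry_share_after = after["entry"] / total_after

print(f"headline total: {total_before} -> {total_after}")              # 1000 -> 1000
print(f"entry-level share: {entry_share_before:.0%} -> {entry_share_after:.0%}")  # 40% -> 25%
```

A study that only tracks the headline total would report "no discernible disruption" here, even though entry-level demand dropped by more than a third.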