More than a trillion dollars has been appropriated since September 11, 2001 for U.S. military operations in Iraq, Afghanistan, and elsewhere. In absolute terms, this makes the “war on terrorism” the most costly military engagement in U.S. history; adjusted for inflation, it is the second most expensive U.S. military action after World War II.
A newly updated report from the Congressional Research Service estimates the financial costs of major U.S. wars, from the American Revolution ($2.4 billion in FY2011 dollars) to World War I ($334 billion), World War II ($4.1 trillion), the second Iraq war ($784 billion), and the war in Afghanistan ($321 billion). CRS provides its estimates in current-year dollars (i.e., the year the money was spent), in constant-year dollars (adjusted for inflation), and as a percentage of gross domestic product. Many caveats apply to these figures, all of which are spelled out in the CRS report.
In constant dollars, World War II remains the most expensive of all U.S. wars, having consumed 35.8% of GDP at its height and cost $4.1 trillion in FY2011 dollars. See “Costs of Major U.S. Wars,” June 29, 2010.