Friday, May 24, 2024
The House Foreign Affairs Committee voted Wednesday to advance a bill expanding the White House's authority to police exports of AI systems – including models said to pose a national security threat to the United States.
"AI has created a technology revolution that will determine whether America remains the world's leading superpower or whether it gets eclipsed by China," bill co-author and House Rep Michael McCauln (R-TX) said, joining the list of leaders comparing the technology to the Manhattan Project.
McCaul's key concern is that while the US government's Bureau of Industry and Security (BIS) has the authority to restrict the export of AI accelerators — something the Biden administration has exploited on multiple occasions to stifle Chinese innovation in the space — it lacks the authority to regulate the export of AI models.
"Our top AI companies could inadvertently fuel China's technology, technological ascent, empowering their military and malign ambitions," McCauln warned.
The so-called Enhancing National Frameworks for Overseas Restriction of Critical Exports (ENFORCE) Act [PDF], if passed by the House and Senate, would amend the Export Control Reform Act of 2018 to grant that authority to the BIS, enabling the White House to require US companies or persons to obtain export licenses before supplying China with AI models deemed a national security threat.
"This legislation provides BIS the flexibility to craft appropriate controls on closed AI systems without stifling US innovation or affecting open source models," McCauln touted.
As far as we can tell, the bill in its current form doesn't actually contain any explicit protections or carve-outs for open source models, and it encompasses essentially any AI system, software or hardware, that could be used or modified to behave in a manner that erodes the United States' national security or foreign policy, lowers the barrier to the creation of weapons of mass destruction, or could facilitate cyberattacks.
To be clear, the bill's wording is intentionally vague, and it specifically mandates that the definition of "covered artificial intelligence systems" be updated within a year of the bill being adopted.
"We also made the definitions of AI and AI systems in the bill temporary so that the administration may undertake its usual regulatory process and solicit public comment so that the final definitions are appropriately scoped," House Rep Madeleine Dean (D-PA) explained ahead of the vote.
And that's probably a good thing, as publicly available, pre-trained foundation models like Llama 2 are routinely modified through fine-tuning and other techniques to perform specialized tasks, such as code generation, or to strip away safeguards designed to prevent them from responding to ethically dubious requests.
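For a sense of how routine that kind of repurposing is, here's a minimal sketch of the fine-tuning workflow using the Hugging Face transformers, peft, and datasets libraries. The base model ID and the training file (code_tasks.jsonl) are illustrative placeholders, not anything named in the bill:

```python
# Minimal LoRA fine-tuning sketch: adapt a public pre-trained checkpoint
# to a new task by training a small set of adapter weights. The model ID
# and training file below are placeholders for illustration.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base = "meta-llama/Llama-2-7b-hf"  # any publicly released checkpoint
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token  # Llama ships without a pad token
model = AutoModelForCausalLM.from_pretrained(base)

# LoRA trains only small low-rank adapters on top of frozen base weights,
# which is why repurposing a released model is cheap and routine
model = get_peft_model(model, LoraConfig(
    r=8, lora_alpha=16, lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM"))

dataset = load_dataset("json", data_files="code_tasks.jsonl")["train"]
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True, remove_columns=dataset.column_names)

Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", per_device_train_batch_size=1,
                           num_train_epochs=1),
    train_dataset=dataset,
    # Collator pads batches and copies input_ids into labels for causal LM
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()
```

The specifics matter less than the fact that the entire loop fits in a screenful of code once the weights are public, which is precisely what makes after-the-fact export controls so awkward.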
Because of this, one could argue that many of the most popular models available today could be modified in ways that bring them under the current definition of a "covered artificial intelligence system."
It also isn't clear how such restrictions would be enforced, particularly when it comes to open source models. End user license agreements commonly include provisions warning that use by persons in embargoed countries is prohibited, but that language doesn't actually stop anyone from accessing the source code.
Without explicit protections for open source models, such restrictions could end up having a chilling effect on model developers, for fear that uploading a model to an online repository could land them in hot water if it's ever downloaded by a Chinese national.
These concerns, of course, aren't new. Calls by lawmakers last year to restrict RISC-V exports elicited similar warnings from engineers and other techies who argued attempts to do so were doomed to backfire.
While the House Foreign Affairs Committee has voted to advance the bill, there's no guarantee it'll ever make it to the president's desk: it still has to survive votes in both the House and Senate, and in an election year at that. Still, there's always next year.