An MQ-9 Sea Guardian unmanned maritime surveillance drone flies over the USS Coronado in the Pacific Ocean during a drill, April 21, 2021. (Chief Mass Communication Specialist Shannon Renfroe/U.S. Navy via AP)
When I left Stanford to join Google as an AI research scientist, I “went across the street,” as the saying went. I had been a young assistant professor, first at Georgia Tech and then at Stanford, doing research that was partially funded by the Defense Advanced Research Projects Agency (DARPA). At one point, I brought up the ethical issues of researching surveillance technology with the DARPA program manager, but frankly, raising ethical concerns in such a competitive environment felt a bit like labeling myself a troublemaker.
I was ready to move away from defense work, get recognized for software development, and—yes—make enough money to move out of my small, spider-infested apartment on Alma that shook every time the Caltrain went by.
Since then, I’ve learned that digging deeply into public records—combined with a modicum of data science—can lead to greater accountability and transparency.
In 2018, news broke that Google was secretly helping the Pentagon build artificial intelligence to ramp up its drone surveillance program through “Project Maven.” My instinct was that it would be hypocritical for me to complain, given that DARPA had partly funded my previous job. But months later, I saw further examples of the company doing work that was beyond the pale. Through a widely shared internal link to some of Google’s source code, I learned that the term “human rights” was on a potential blacklist for the mainland China version of its ubiquitous search engine. Senior management didn’t refute my fears about censorship and surveillance in mainland China; indeed, the head of Google AI explicitly rejected the concerns of a coalition of 14 human rights organizations that mirrored my own. So I publicly resigned from Google.
Soon I was asked to serve as the face of the disgruntled U.S. tech worker who refused to build AI weapons or surveillance for the government; that is, behind closed doors, in meetings with a group of senior representatives from the Pentagon and intelligence agencies. Suffice it to say that my pleas for U.S. companies to protect human rights internationally were called, in a non-derogatory way, “idealism” by a senior U.S. official. (That person has since joined the board of an AI defense contractor.)
In these private and candid sessions with national-security officials, I was able to observe how openly focused both government and industry were on blurring the lines between AI companies and the U.S. government. Both sides seemingly hoped to reclaim the “revolving door” as a positive. That’s when I began submitting numerous Freedom of Information Act (FOIA) requests involving the closest thing the Department of Defense has to a venture capital arm, the Defense Innovation Unit, or DIU. (The intelligence agencies fund startups and research through the venture group In-Q-Tel, which also works closely with the DIU.)
Many roadblocks stood in the way. The first thing you learn when submitting Freedom of Information requests for U.S. federal contracts is that the records originate in an obscure and outdated website called the Federal Procurement Data System (FPDS), which IBM deserves both the credit and the blame for. The second is that the password to a FOIA request for a contract (insofar as such a thing exists) is the alphanumeric code called its Procurement Instrument Identifier (PIID). Include that code in your request for the contract (and remember to ask for subawards and associated emails), and with just a year of patience, numerous follow-up calls, and, confusingly, either zero or several hundred dollars in fees, you have a coin flip’s chance of being emailed a heavily redacted set of documents. But the process is very much worth it: the diamonds I have since found in the rough reveal previously unknown information about how the Pentagon and intelligence agencies are pursuing AI for war.
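For readers who want to try the first step themselves, here is a minimal sketch of looking up a contract’s public FPDS record by its PIID. The feed URL reflects FPDS’s public Atom feed, but the exact query syntax and the example PIID are my assumptions for illustration; check them against the live service before relying on the results.

```python
# Minimal sketch: look up a federal contract in the public FPDS Atom feed by its PIID,
# then print the title of each matching record (the base award plus any modifications).
# The PIID-based query syntax is an assumption about how FPDS ezsearch accepts queries.
import requests
import xml.etree.ElementTree as ET

ATOM_NS = "{http://www.w3.org/2005/Atom}"
FPDS_FEED = "https://www.fpds.gov/ezsearch/FEEDS/ATOM"

def fetch_contract_records(piid: str) -> list[str]:
    """Return the titles of FPDS records whose PIID matches `piid`."""
    params = {"FEEDNAME": "PUBLIC", "q": f'PIID:"{piid}"'}
    response = requests.get(FPDS_FEED, params=params, timeout=30)
    response.raise_for_status()
    root = ET.fromstring(response.content)
    titles = []
    for entry in root.findall(f"{ATOM_NS}entry"):
        title = entry.find(f"{ATOM_NS}title")
        if title is not None and title.text:
            titles.append(title.text.strip())
    return titles

if __name__ == "__main__":
    # Hypothetical PIID, used purely for illustration.
    for record in fetch_contract_records("W91CRB19C0035"):
        print(record)
```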
There was too much data, and it all told a complex story about the realities of defense and surveillance technologies in the U.S. and the West. I noticed that overviews of tens of thousands of contracts and contract modifications became public each morning, as well as lobbying filings, public-corporate partnerships, and venture capital investments. Then there were documents from the other members of the so-called Five Eyes intelligence-sharing partnership (which includes the U.S., the United Kingdom, and three of its former colonies: Canada, Australia, and New Zealand). As a former data scientist, I wondered what to do with all of this information.
I started cataloguing tens of thousands of companies and nonprofits based upon these feeds, and built out a map of international weapons and surveillance procurement and influence. Beyond serving as an excellent source of leads for our daily FOIA requests, public records are often revealing in and of themselves: The nonprofit organization I founded, Tech Inquiry, frequently finds itself breaking news about weapons and surveillance procurement and lobbying.
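To give a flavor of what that cataloguing involves, here is a toy sketch that folds one day’s award records into a running vendor-to-agency map. The file name and column names are hypothetical; real feeds (FPDS Atom entries, bulk procurement exports, and so on) require their own parsers and far more normalization of vendor names, parent companies, and subawards.

```python
# Minimal sketch of the cataloguing step: fold a day's worth of award records into a
# running catalogue, so recurring vendor-to-agency relationships surface over time.
# The CSV layout ("vendor_name", "awarding_agency", "obligated_amount") is assumed.
import csv
from collections import defaultdict

def update_catalogue(catalogue: dict, daily_csv_path: str) -> dict:
    """Add one day's records to {vendor: {agency: total obligated dollars}}."""
    with open(daily_csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            vendor = row["vendor_name"].strip().upper()  # crude name normalization
            agency = row["awarding_agency"].strip()
            amount = float(row.get("obligated_amount", 0) or 0)
            catalogue.setdefault(vendor, defaultdict(float))[agency] += amount
    return catalogue

if __name__ == "__main__":
    catalogue: dict = {}
    update_catalogue(catalogue, "awards_2021-04-21.csv")  # hypothetical daily export
    # Print each vendor's biggest customer as a rough procurement "map."
    for vendor, agencies in sorted(catalogue.items()):
        top = max(agencies, key=agencies.get)
        print(f"{vendor}: biggest customer {top} (${agencies[top]:,.0f})")
```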
To my surprise, I recently stumbled upon new details about tech companies subcontracting on the Pentagon’s drone surveillance program known as Project Maven. On a personal level, it suggests to me that I was justified in my concern over academia’s role in U.S. weapons and surveillance systems.
An entire subcontracting network surrounding Project Maven was hidden in plain sight, in public contracts and documents that I had FOIA-ed. The AI drone surveillance project incorporated facial recognition and satellite surveillance into its scoping of potential targets.
Importantly, I learned that Project Maven also tied into both social media and cellphone location-tracking surveillance, which can be neatly combined on a map to give military operators a sense of the political climate in a given area. This so-called “Publicly Available Information” (PAI) is then aggregated within the U.S. Army’s Secure Unclassified Network (SUNet), making heavy use of data fusion software from the notorious surveillance-tech company Palantir. The primary contractor for these projects is ECS Federal, which previously led a similar aggregation effort of public information for the elite U.S. Special Operations Command and its J2 Intelligence Directorate. Presumably, Special Forces have also found internet surveillance a useful component of their situational awareness.
An entire industry of civil society organizations whitewashes this surveillance work. One of our other major findings was that numerous frequently cited research institutions, including the august Center for Strategic and International Studies and the data-driven Center for Advanced Defense Studies (C4ADS), have subcontracted with the U.S. government alongside the location-tracking data brokers Venntel and X-Mode Social (now Outlogic). In the case of C4ADS, the subcontract was with the Department of Defense’s closest analogue to the Central Intelligence Agency, the Defense Intelligence Agency. (Perhaps the fact that C4ADS’s internal data fusion platform is powered by Palantir should have served as a hint.) Since then, C4ADS has been prominently cited by both The New York Times and BuzzFeed News in analyses involving U.S. geostrategic competitors; it even contributed to a Pulitzer-winning series.
AI companies such as Clarifai, Rebellion Defense, and Amazon are more overt about their defense work. Surprisingly, it turns out that tech giant Microsoft was the largest subcontractor on ECS Federal’s Project Maven awards. Also of note: Despite Google ceasing its own work on the project, two of the key companies involved in Project Maven, Rebellion Defense and Orbital Insight, have continued on it with backing from Google’s parent company Alphabet and its former executive Eric Schmidt.
Despite all of this having been a matter of public record for months, if not years, the implications of these findings are significant. Now we know that Microsoft, Clarifai, and Palantir are major AI contractors for the Pentagon. Uncovering these important relationships took nothing more than careful analysis of the large volumes of public procurement data released each day. And I’m confident that repeating this approach will bear similar fruit in the future.