Aligning those who align AI, one satirical website at a time

[Image: construction blockades circling a brain on a green background]

The work of creating artificial intelligence that stays within the guardrails of human values, known in the industry as alignment, has grown into its own (somewhat ambiguous) field of study, rife with policy papers and benchmarks that rank models against one another.

But who aligns the alignment researchers? 

Enter the Center for the Alignment of AI Alignment Centers, an organization purporting to coordinate thousands of AI alignment researchers into “one final AI center singularity.”

At first glance, CAAAC seems legitimate. The website's aesthetics are cool and calming, with a logo of converging arrows evoking togetherness and sets of parallel lines swirling behind black font.

But stay on the page for 30 seconds and the swirls spell out “bullshit,” giving away that CAAAC is all one big joke. One second longer and you’ll notice the hidden gems tucked away in every sentence and page of the fantasy center’s website. 

CAAAC launched Tuesday from the same team that brought us The Box, a literal, physical box that women can wear on dates to avoid the threat of their image being turned into AI-generated deepfake slop. 

“This website is the most important thing that anyone will read about AI in this millennium or the next,” said CAAAC cofounder Louis Barclay, staying in character when talking to The Verge. (The second founder of CAAAC wished to remain anonymous, according to Barclay.)

CAAAC’s vibe is so similar to that of real AI alignment research labs, which are featured on the website’s homepage with working links to their own websites, that even those in the know initially thought it was real, including Kendra Albert, a machine learning researcher and technology attorney who spoke with The Verge.

CAAAC mocks a trend Albert described in an interview with The Verge: those who want to make AI safe drifting away from the “real problems happening in the real world,” such as bias in models, exacerbating the energy crisis, or replacing workers, toward the “very, very theoretical” risks of AI taking over the world.

To fix the “AI alignment alignment crisis,” CAAAC will be recruiting its global workforce exclusively from the Bay Area. All are welcome to apply, “as long as you believe AGI will annihilate all humans in the next six months,” according to the jobs page. 

Those willing to take the plunge and work with CAAAC — the website urges all readers to bring their own wet gear — need only comment on the LinkedIn post announcing the center to automatically become a fellow. CAAAC also offers a generative AI tool to create your own AI center, complete with an executive director, in “less than a minute, zero AI knowledge required.”

The more ambitious job seeker applying to the “AI Alignment Alignment Alignment Researcher” position will, after clicking through the website, eventually find themselves serenaded by Rick Astley’s “Never Gonna Give You Up.”