Pendulum raises $5.9M to help organizations track harmful narratives
Pendulum helps companies, governments and other organizations track harmful narratives on social media platforms and elsewhere on the web. The company today announced that it has raised a $5.9 million seed round led by Madrona Venture Group, with participation from Cercano Management and others. The service was incubated inside Madrona Venture Labs.
“Pendulum’s platform applies AI and NLP technologies to uncover threats and opportunities contained in narratives in the earliest days of their formation and track them as they spread online,” explains Madrona managing director Hope Cochran. “By dissecting and categorizing the narratives in text, video and audio content on social media platforms, companies are better prepared and able to engage with communities as they choose. With support for YouTube, BitChute, Rumble and Podcasts currently available, the platform will grow to encompass all social platforms of importance over the coming months.”
The team behind Pendulum seems uniquely suited to build a product like that. Co-founder Sam Clark, for example, previously worked as a data mining engineer at Decide.com and then at eBay after it acquired that company. He also co-created Transparency Tube, a project that categorizes and analyzes political YouTube channels. Transparency Tube shares quite a bit of DNA with Pendulum, and Clark then teamed up with Madrona to build a commercial product around this general idea of tracking mis- and disinformation online. That’s also where he teamed up with his co-founder Mark Listes, who brings a lot of government experience to the team. Listes was previously the director of policy for the U.S. Election Assistance Commission and the chief of staff for the National Security Innovation Network, where he helped manage the U.S. Department of Defense’s venture engagement.
While Listes expected a pretty calm tenure at the Election Assistance Commission, he obviously picked the wrong time for that when he joined in 2016. “By November and December of ’16, the elections space looked a lot different,” he told me. “We were dealing with foreign interference and intelligence briefs and everything under the sun. Long story short, over the next two and a half years, I and my colleagues got right in the middle of leading the effort to fight foreign interference out of our election system. We experienced it both personally and organizationally, and then helped fight to get it out of our overall system. [We saw] harmful narratives and the impact that narratives, whether they’re mis- or disinformation or malaligned narratives, can have on society at large.”
Yet while Pendulum can be used by government agencies to track online narratives, it’s a commercial service first. “We’re commercial first,” Listes said. “There’s of course an easy, intuitive government play here, but we’re actually focusing exclusively on the commercial sector first and we’re building out some really powerful partnerships there.”
Listes stressed that for a service like Pendulum to work, it has to cover as many platforms as possible. It’s not enough to track only Twitter, which doesn’t offer a representative sample of the population anyway, or only YouTube. That’s why Pendulum also tracks BitChute and Rumble, for example.
But Listes also noted that Pendulum isn’t in the business of adjudicating truth. “We actually have this really strong, powerful narrative tracking engine that is not reliant on whether or not something’s true or false,” he explained. “We’re staying away from truth adjudication — and that opens us up to a wider range of use cases.” That means the company can work with corporations, for example, which may want to track narratives around their executives and assets, for communications as well as security reasons.
Because it doesn’t want to decide whether something is true or not, Pendulum opens itself up for use by nefarious actors as well. Listes argues, though, that the company doesn’t track any personally identifiable information and that the team is quite cognizant of this possibility. “We’re building values to make sure that we’re not ever in any way creating an unfair playing field or empowering malicious actors and things like that through the use of our tool,” he said.