You know the feeling: you’re on a dating app, you start a conversation with someone, and things don’t quite add up. You’ve seen The Tinder Swindler on Netflix, and you’re suddenly wondering: Is someone about to ask me for money? The team at video dating app Filter Off decided to take a novel approach: Every detected scammer got added to a side pool of dating hopefuls that contained only the company’s own chatbots posing as adorable singles, plus other scammers. As you might expect, hilarity ensued.
Filter Off is a video-first dating app, launched at the beginning of the COVID-19 lockdowns. As dating shifted from bars and galleries and picnics to chat and video, the company took off, offering virtual speed-dating events around various topics: Harry Potter date night, dog lovers date night, New York City date night — you name it. The platform has hundreds of thousands of users, and as its popularity grew among humans looking for love, the founders discovered that it attracted a second set of people as well — humans looking for money.
“The first time I noticed there was a problem was when I saw that George Clooney had joined Filter Off. I was like, ‘Holy shit, like, this is wild, I can’t believe…’ but then I looked more closely at his profile,” laughs Brian Weinreich, head of product at Filter Off, in an interview with TC. He realized that Clooney probably wouldn’t be on a dating site, and even if he were, he wouldn’t be a 34-year-old from Lagos, Nigeria. “I deleted their profile to get them off the app. Then I started noticing all these profiles that looked like real people but didn’t add up.”
The product team decided to attack the problem with algorithms, building software that sorts users into humans and likely scammers, “based on certain characteristics that I’m starting to notice in how people sign up and how they use the app.” The team kept deleting the flagged profiles, but for every scammer they cut down, another five would pop up in their place, Hydra-style.
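Filter Off hasn’t published its detection heuristics, but signup-time scoring of the kind Weinreich describes can be sketched roughly like this. Every signal, name, and threshold below is invented for illustration; none of it reflects the company’s actual rules:

```python
# Toy scammer-likelihood score built from signup-time signals.
# All signals and weights are hypothetical -- Filter Off's real
# heuristics are not public.

def scam_score(profile: dict) -> float:
    """Return a score in [0, 1]; higher means more scammer-like."""
    score = 0.0
    # A celebrity name reused as a profile name is a classic tell.
    if profile.get("name", "").lower() in {"george clooney", "brad pitt"}:
        score += 0.5
    # Claimed location disagrees with the app-store country of the install.
    if profile.get("country") != profile.get("store_country"):
        score += 0.2
    # Messaging dozens of users within the first hour of signing up.
    if profile.get("messages_first_hour", 0) > 20:
        score += 0.3
    return min(score, 1.0)

def is_likely_scammer(profile: dict, threshold: float = 0.6) -> bool:
    return scam_score(profile) >= threshold
```

A real system would presumably weigh far more signals and learn the weights from reported accounts, but the shape — score, threshold, flag — is the common pattern.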
“I was like, alright, we need a way to get rid of scammers, but it needs to be in a manner that they can’t just come back and rejoin the platform,” said Weinreich. “I remembered that Reddit and other platforms have a form of ‘shadow banning,’ where users can keep posting, but normal users don’t see their content.”
And so the work began. The team used GPT-3 to build a fleet of chatbots and paired them with a script that generates human-like faces, creating thousands of fake profiles that look and talk like real people. The caveat: these profiles are never shown to “normal” users, only to people the algorithm has flagged as scammers, who get dropped into the bot pool.
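The matchmaking logic of such a shadow pool is simple to sketch: flagged accounts are only ever matched with bots and with each other, while real users never see either. This is a minimal illustration of the idea, with a made-up data model, not Filter Off’s implementation:

```python
import random

# Minimal "shadow pool" sketch: flagged accounts are matched only with
# bots and other flagged accounts, never with real users.
# The data model and function names are illustrative.

def candidate_pool(user, real_users, flagged_users, bots):
    if user in flagged_users:
        # Scammers see a world made entirely of bots and each other.
        return [u for u in flagged_users if u != user] + list(bots)
    # Real users never see flagged accounts or bots.
    return [u for u in real_users if u != user]

def pick_match(user, real_users, flagged_users, bots, rng=random):
    pool = candidate_pool(user, real_users, flagged_users, bots)
    return rng.choice(pool) if pool else None
```

The key property is asymmetry: from inside the shadow pool, the app looks fully alive, which is exactly what keeps a banned scammer from simply creating a new account.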
“The funny part is, two things happen. One, the scammers will run into other bots, but they’ll also run into other scammers, and they’re trying to scam each other,” laughs Weinreich. “They are like, ‘I want $40 for a Google Play gift card,’ and the other scammer replies, ‘no, no, you give a gift card to me,’ and they just keep arguing. It’s just this hilarious thing, where currently in our app, we have probably over 1,000 scammers that I know of that are actively talking to just bots. It’s great. They’re wasting their time and they don’t have to deal with our real users.”
The platform does have report and block features that let real users flag potential scammers. When a scammer is reported, the team can use the report to improve its detection algorithm, and also manually move the scammer into the bot pool.
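That moderation step amounts to moving an account from the real-user pool into the shadow pool once a report is confirmed. A rough sketch, again with invented names and a deliberately simplified data model:

```python
# Sketch of report handling: a confirmed report moves the offender out of
# the real-user set and into the flagged (shadow) set, so they can only
# match with bots and other scammers from then on. Illustrative only.

def handle_report(reported_id, real_users: set, flagged_users: set,
                  confirmed: bool) -> bool:
    """Process one report; return True if the account ends up flagged."""
    if confirmed and reported_id in real_users:
        real_users.remove(reported_id)
        flagged_users.add(reported_id)
    return reported_id in flagged_users
```

In practice the “confirmed” decision is where the human review (or the retrained algorithm) comes in; the pool swap itself is the easy part.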
“The funniest bit about our report feature is the number of reports I get from scammers reporting that they’re talking to a bot. I’m like ‘yeah, I know, that’s the point’,” shrugs Weinreich.
The company has collected some of the most absurd conversations scammers have had with each other and with the bots on its blog, which is well worth a skim.