For Sex Traffickers, Jack Dorsey’s Cash App Is ‘King’
The Twitter and Block billionaire made Cash App into a $700 million monster. Now police officers, nonprofit critics and current and former employees say it’s struggling to fight “rampant” criminality.
Earlier this year, Kik Messenger user “heyyyydude1” was selling a stash of videos he’d amassed of child sexual abuse. One customer, who said he was a 35-year-old father of two, offered to buy 200 videos for $45. “How do I pay?” he asked.
“Cash App,” heyyyydude1 responded, sending over his payment details and a code for a Cash App referral fee. With each transaction, and many more disturbing videos sent, the seller was unknowingly providing a pile of evidence to an undercover agent with the Immigration and Customs Enforcement’s child exploitation unit. That’s according to government investigators, who claimed that after Cash App and Kik data was subpoenaed, heyyyydude1’s real name and address were revealed to be 33-year-old Philadelphia resident Michael Wilcox. Indicted in April, he awaits trial. His counsel declined to comment.
Current and former police, as well as nonprofits working directly with cops to fight child exploitation, say that such crimes are often happening via Cash App, which brings in billions in gross profit every year for Block, Inc., the Jack Dorsey-run payments giant formerly known as Square. They say that whether it’s to pay for sex with a minor, to send children funds in return for nude images or to traffic a young adult victim, Cash App is often the payment tool of choice. Block says it does not tolerate crime on its technologies and that the company proactively scans transactions for suspicious signals that could indicate criminality, but the claims of illicit use of Cash App are supported by police department data and dozens of cases reviewed by Forbes in which the tool was used to facilitate sex crimes.
Joe Scaramucci, a detective who set up the human trafficking unit for the McLennan County Sheriff’s Office in Waco, Texas, says that in his experience, outside of physical cash, Block’s tech is the primary payment tool used by people selling sex, whether they’re trafficking adults or children. Searching a police database of sex ads for “CashApp” in Waco returned 2,200 ads, compared to 1,150 for PayPal and 725 for Venmo, a PayPal-owned rival, according to data provided by Scaramucci. Similar data put together by the Arizona attorney general shows that between 2016 and 2021, there were 480,000 online sex ads where Cash App was listed as a form of payment, nearly double that of nearest rival Venmo at 260,000. For sex traders, “Cash App is king,” Scaramucci said.
Current and former employees are now raising concerns about the company’s investment in keeping Cash App clean, and critics are sounding the alarm over what they claim is Block’s lax monitoring and reporting of myriad criminal behaviors on its popular payment tool. Though it recently launched a Cash App for Teens feature, the company is conspicuously absent from collaborative efforts to fight abuse, failing to provide any tips to the National Center for Missing and Exploited Children (NCMEC), America’s national clearing house for sexual abuse material found on tech platforms. Its primary rival PayPal, the owner of Venmo, has contributed tips to the nonprofit.
“Does Cash App have the appropriate level of staff to deal with this?”
Cash App’s detractors claim criminals are attracted to its lack of thorough identity checks as compared to traditional banks and the ability for those who have abused the app to quickly sign up again. “We’re definitely seeing an escalation of the use of Cash App for a number of reasons. One is that its due diligence is next to none,” says Colm Gannon, a former sex crimes investigator in New Zealand and cofounder of Pathfinder, a police contractor focused on online child abuse.
There’s also the lure of apparent anonymity offered by Cash App, says ex-Arizona attorney general human trafficking investigator Chad Brink. Users only need to reveal their profile name, called a “cashtag,” to receive and send funds. Because it’s possible to shift funds so quickly between accounts—one signal of criminality that Block’s algorithms could detect and flag as suspicious—it’s easier to hide and cash out ill-gotten gains, added Brink. He complained that responses to his data requests from Cash App took far longer than those from rivals, leading him to question whether Block has invested in legal compliance in line with its explosive growth. “Does Cash App have the appropriate level of staff to deal with this?” he asked.
Block spokesperson Danika Owsley said that no form of human trafficking is permitted on Cash App. She added that the company operated “in line with applicable laws and regulations,” closing accounts and reporting suspicious transactions to police, while working voluntarily with nonprofits, industry and law enforcement.
“We do not tolerate illegal activity on Cash App. We maintain robust compliance programs and continuously assess and enhance our systems and controls to prevent, detect and report bad activity on the platform,” she added.
Launched in 2013 by Block, Cash App quickly became a cash cow for Dorsey, helping him amass a fortune of $4 billion. Competing with PayPal, Venmo and Zelle, it rapidly gained market share by offering a quick, simple and free alternative.
With Block’s market cap currently at $30 billion six years after its IPO, much of its growth is due to Cash App, which makes revenue by charging a small fee for business use, and through its stock trading and cryptocurrency exchange features. In its latest quarterly results, Dorsey’s business revealed that this June, 47 million accounts carried out transactions on Cash App. The business generated gross profit of $705 million in the most recent quarter, up from $624 million in the preceding quarter, accounting for nearly half of Block’s total $1.5 billion in gross profit in Q2.
But as Cash App has grown in revenue and popularity, so has its use in sex crimes. Hundreds of pages of court filings describe cases where law enforcement said Cash App was used to either pay for sexualized images or sex with minors and adults. Among the filings referencing Cash App were multiple instances where a victim was trafficked on Skipthegames.com, one of the largest sex advertisement websites. That included one recently unsealed investigation where a victim in Grand Rapids, Michigan, was just 17 years old. A search for “Cash App” and “CashApp” on Skipthegames.com collectively returned nearly 900,000 results, compared to 470,000 for PayPal and a similar number for Venmo.
In one of the more egregious recent cases, a 26-year-old Asheville, North Carolina, man enticed 15 minor females into either sending him sexually explicit images or having sex with him, typically sending the victim hundreds of dollars over Block’s app. He was sentenced to 28 years in prison. And in another recently unsealed warrant, an FBI agent said the agency had raided the Cash App data of an individual who was suspected of trafficking a 15-year-old runaway in San Diego.
The highest-profile case involving Cash App and alleged sex crimes, though, revolves around Florida Rep. Matt Gaetz. The Justice Department is reportedly investigating whether the Congressman paid to have sex with various women, including a 17-year-old girl. Last year, the New York Times reported that it had reviewed receipts from Cash App and Apple Pay showing payments from Gaetz and an associate—indicted Florida tax official Joel Greenberg—to one of the women. (Gaetz has previously denied any wrongdoing. His spokesman, Joel Valdez, said, “It must be a slow day for Forbes to be covering an 18-month-old debunked news story. We usually don’t comment on those.” Greenberg’s counsel did not respond to a request for comment. He’d previously pleaded guilty to identity theft and sex trafficking of a 17-year-old.)
Got a tip about Block, Inc, Square or Cash App? Do you have additional information about how they deal with criminality on their platforms? We’d like to hear from you. Contact Thomas Brewster at tbrewster@forbes.com or +44 778 237 6697 on Signal.
The cases show how vital Cash App data can be when piecing together evidence of a crime, definitively proving how and when illegal content or services were bought. Cash App is often ordered to hand over user information via search warrants, subpoenas and other court orders, and it has regularly provided data, whether it’s a customer’s name, payment details, geolocation or IP addresses used to access the tool, according to case files reviewed by Forbes. Cash App spokesperson Owsley said the company had a dedicated law enforcement response team to provide information in compliance with local, state and federal laws.
Yet response times can vary. Scaramucci said the last time he filed a request for Cash App data in December 2021, he got it back within 12 days. But Brink claimed the company has become increasingly slow to respond to such requests. He said Cash App often took longer than 30 days to respond, while PayPal and Venmo were much more responsive. “As the Cash App business has grown, there has been significant investment of resources in our compliance organization across all areas of the team and levels of leadership,” Block spokesperson Owsley said in response.
Even though Cash App will turn user data over to the cops, the app is designed so that anyone selling or trading sex can remain anonymous from other users. In a recent post on the USASexGuide.nl forum, a significant resource for people buying and selling sex, a user claimed many sex workers moved to Cash App as a form of protection, having previously been robbed of hard cash. Digital money, in such inherently dangerous situations, was seen as safer than the analog equivalent.
Victims of sex crimes, though, see the flip side of that anonymity. In 2021, Brian shared intimate photos with a woman he met over a dating app. She soon threatened to send the images to his friends and family unless he paid $500 over Cash App. Brian, who asked Forbes not to share his full name, had become the victim of a sextortion attack. After a second attempted payment to the extorter failed, Brian said his Cash App account was permanently disabled.
Now a moderator of a Reddit forum dedicated to helping sextortion victims, he frets about the ease with which a person can set up a new account. “My account was disabled and it took very little effort to create a new one,” Brian told Forbes. “I imagine scammers do the same thing.” Cash App spokesperson Owsley said the company had controls in place to prevent previously removed users from signing up again. She did not elaborate on what those controls were, or explain why Brian’s Cash App account was disabled.
Brian’s experience points to what critics say is the central problem with Cash App: that it doesn’t carry out enough checks on a new user to ensure they’re not someone previously banned, an entirely fake persona or an identity thief using someone else’s personal information to cover their tracks. When users sign up for Cash App, all they have to provide is a phone number or email, along with a zip code. In contrast, PayPal also asks for a full address and employs security checks on its mobile app to verify a user is not a bot. “When Cash App is being used, it’s being used without any type of guarantee of oversight,” says former cop Gannon.
One former employee who told Forbes they quit the company, in part, because of concerns over what they alleged to be “rampant” crime on the platform, said that Cash App allows users to create bulk accounts on a single device without any clear limits. That’s potentially attractive to criminals who want to spin up as many scam accounts as possible, so if one gets shut down, they can continue their swindles easily on another. Cash App’s Owsley said she wouldn’t comment further on company controls as it was “proprietary information.”
With Cash App’s gross profits surging by $420 million in just two years, some current and former employees have raised questions about how much Block has invested in solving its crime problem. One former employee said that in the late 2010s, Cash App’s compliance division consisted of only a handful of staff, and that monitoring of criminal activity appeared lax, with leadership more focused on growth than compliance. While the company has built a larger team in recent years, one current staffer said that when they or colleagues raised issues like sex trafficking and money laundering, they were ignored.
Cash App spokesperson Owsley said that Cash App is investing in growing its compliance function.
Andrew Marane, a former lead on global investigations at Block between 2017 and 2018, said the company has historically built code to look for unusual patterns and block payments.
“To give you an example, there’s a lot of child exploitation in the Philippines, and they tend to utilize PayPal; they tend to utilize Cash App,” he said. “The minute we start seeing, like $25 to $50, numerous transactions from disparate people going to one account, that’s a good signal and we can pick up on that very quickly.”
Marane and Owsley said the company worked with the U.S. Financial Coalition Against Child Sexual Exploitation (FCACSE), joining trainings by law enforcement and sharing information to combat the crime.
“I think with kids . . . you have an obligation to do something.”
Cash App’s problems with criminality go beyond sex trafficking, though. Fraud in particular has long been a big issue for Cash App. In December, Forbes revealed merchants across the U.S. were rejecting Cash App use in stores because of fears over fraud. And in October, Forbes reported Cash App’s European business, Verse, had been ordered by its banking licensee in Lithuania to address money laundering issues or face a penalty.
Cash App said it combines its own in-house detection models with those created by third-party vendors to identify suspicious activity, all of which are audited by third parties for efficacy. Where appropriate, it works with partner financial institutions to prevent criminal behavior on Cash App. It did not provide specific examples.
Block also does not participate in some major anti-child-exploitation initiatives, while rivals do.
Unlike PayPal, Block hasn’t been providing tips about child exploitation paid for on its app to the National Center for Missing and Exploited Children. By law, tech companies that see CSAM spreading on their sites are required to report it to NCMEC, a nonprofit that passes the “cybertips” to the relevant law enforcement agency.
Companies that don’t host content can still provide tips, and PayPal, despite not being legally compelled to share suspicious transactions, is disclosing them: 970 in 2021, 282 in 2020, 322 in 2019. PayPal said it did so voluntarily, and the tips can be used to provide insight into payments that could be associated with CSAM trading that took place on another platform. According to NCMEC, Block hasn’t provided any tips, ever.
Block did not dispute that it has never provided NCMEC with cybertips, though spokesperson Owsley said the company regularly attended NCMEC-led meetings. Instead, she pointed to a partnership with Polaris, a nonprofit that runs America’s national hotline for complaints and tips about any kind of human trafficking. When reached for comment, Polaris said the partnership was at an early stage and limited. PayPal, meanwhile, co-led and helped found Polaris’ Financial Intelligence Unit, which tracks finances of traffickers and helps share that information with law enforcement.
Staca Shehan, NCMEC’s vice president of analytical services, wouldn’t say why Block hadn’t submitted any reports. “At the end of the day, that’s a voluntary choice on their side,” Shehan said. “I think with kids . . . you have an obligation to do something.”
“The risks and the potential harm to our reputation are magnified in instances of fraud or unauthorized or inappropriate transactions involving minors . . . ”
A former senior Justice Department attorney, speaking on the condition of anonymity as he wasn’t authorized to talk on the matter, said the Justice Department often receives tips from tech companies, in particular on child sexual exploitation, terrorism and election interference. The prosecutor, who worked predominantly on major fraud and cybercrime cases in which Cash App was frequently used, said that while he received tips from PayPal and Venmo, he did not get any from Block.
Despite its significant use in crimes targeting kids, Cash App last year launched Cash App Teens, opening up the app to users between the ages of 13 and 17, as long as they get the approval of a parent or guardian. Block, cognizant that it was opening itself up to more exploitation with the new feature, wrote in its recent quarterly financial filing with the SEC: “The risks and the potential harm to our reputation are magnified in instances of fraud or unauthorized or inappropriate transactions involving minors.”
One former employee said leadership had been warned about the lack of age and identity verification for both the child and the adult approving a Cash App Teens account, but the warnings were ignored. In response, Block said parents or guardians were the legal owners of their minor’s account and could monitor activity, turn off access and close the account at any time. It pointed Forbes to the Cash App for families page and said adults were verified by checks on their legal name, date of birth and Social Security Number.
The Teens feature may have promised parents more control over how their kids use the app, but minors have continued to use Cash App and become victims of crimes in the process. According to a recently unsealed search warrant obtained by Forbes, in February 2022 a 14-year-old in Virginia was on Snapchat when she struck up a conversation with a man under investigation for the sexual assault of a 13-year-old. He wanted to buy the minor’s nude images and asked, “How do I pay?”
“Cash App,” the teenager replied.