The internet did not create human trafficking – but it industrialized it. What once required physical proximity — a street corner, a truck stop, a classified ad — now requires nothing more than a smartphone and a fake profile. The recruitment pipeline has gone digital, and the platforms enabling it are not hapless bystanders. They are, in several documented cases, knowing infrastructure.
From Street Corners to Algorithms: The Infrastructure Shift
In Week 1, we established the behavioral architecture of trafficking – the grooming model, the offender typology, the seven-stage sequence from target selection to long-term control. This week, we examine the delivery mechanism: how traffickers recruit, and specifically how digital platforms have become the dominant infrastructure for that recruitment.
The numbers are not ambiguous. The Human Trafficking Institute’s 2024 data identifies Facebook, Instagram, and Snapchat as the top three platforms used to recruit trafficking victims in the United States. Not forums on the dark web. Not encrypted criminal channels. The same platforms that host your organization’s communications, your employees’ personal networks, and, in many cases, your children’s social lives.
The National Human Trafficking Hotline has recorded recruitment across a broad spectrum of platforms including Facebook, Instagram, Snapchat, Kik, WhatsApp, MeetMe, Tinder, and Grindr, among others. The pattern is consistent. Traffickers build an intimate relationship with a target through repeated digital contact, or they advertise false job opportunities that exploit economic vulnerability. In Thailand, where The Exodus Road conducts investigations, recruitment often begins openly on X (formerly Twitter) – tweets offering work to girls “willing to make an appointment.” The language is vague enough to survive automated detection. The intent is not.
Dr. Melissa Farley’s research – now spanning 14 countries and more than three decades – identified the online migration of the sex trade as a structural amplifier, not merely a tactical shift. In her peer-reviewed paper “Online Prostitution and Trafficking,” co-authored with Kenneth Franzblau and M. Alexis Kennedy and published in the Albany Law Review, Farley documented how internet technologies have transformed the commercial sex industry’s reach, speed, and anonymity. She described the internet not as a new venue for an old crime but as a force multiplier. “Technology, smartphones, and other digital devices make it possible to conduct business, advertise, and increase earnings” from women who, in her findings, are “for the most part trafficked or coerced.” The platforms changed, but the coercion did not.
The behavioral signature Hazelwood and Burgess documented in face-to-face grooming translates directly to digital grooming – and in some respects becomes more dangerous online. A trafficker can now create a fake profile specifically calibrated to a target’s interests, interact with them for months before any physical contact, build trust, manufacture dependency, and initiate isolation – all while remaining geographically remote and operationally anonymous. The victim walks into the exploitation willingly because they believe they know the person asking them to come. This is Stages 2 through 4 of the grooming model, executed at scale, with zero marginal cost per target.
Meta, Snapchat, and the Corporate Accountability Gap
In January 2024, the U.S. Senate convened a hearing to hold major social media platforms accountable for child exploitation on their networks. Mark Zuckerberg appeared. Senators asked pointed questions. The platform executives expressed concern. Then everyone went home, and the platforms continued operating as before.
This is not a cynical characterization. It is what the data shows.
According to NCMEC’s 2024 CyberTipline report, Meta – which operates Facebook, Instagram, and WhatsApp – was the largest single source of child sexual exploitation reports in both 2023 and 2024. In 2023, NCMEC received 36.2 million reports of suspected child sexual exploitation, and almost 31 million of them came from Meta’s platforms – a 93% increase from Meta’s 2019 submission volume. A Wall Street Journal investigation published in June 2023 found that Meta’s own algorithms were guiding pedophiles toward sellers of child sexual abuse material, in the Journal’s words, “essentially connecting a vast pedophile network.” Meta’s internal documents, reviewed during subsequent shareholder proceedings filed with the SEC, showed that the company had rejected efforts to improve child safety and that an estimated 100,000 children were being sexually harassed on its platforms every day.
In 2024, total CyberTipline reports dropped from 36.2 million to 20.5 million, but this was not evidence of improvement. NCMEC explained it directly in Congressional testimony: the decline was attributable primarily to Meta’s adoption of end-to-end encryption across its messaging platforms and the introduction of report “bundling.” End-to-end encryption is a legitimate privacy measure. It is also, as NCMEC told Congress plainly, a mechanism that systematically reduces platform visibility into trafficking and exploitation activity – and reduces the reports that would otherwise flow to law enforcement.
“The internet is just the ‘internet streets.’ The same predators and hustlers are meeting you with the same intentions – except they look like straight people who go to medical school.”
— Survivor testimony cited in Farley, Franzblau & Kennedy, Albany Law Review
What NCMEC’s 2024 data did show, despite the reporting decline, is that online enticement – defined as an adult communicating with a child for sexual purposes, including grooming and sextortion – increased by 192%. Child sex trafficking reports to NCMEC rose 55% from 2023 to 2024.
The REPORT Act, signed into law in May 2024, expanded mandatory reporting requirements to include child sex trafficking and online enticement for the first time. In the first half of 2025 alone, child sex trafficking reports surged from 5,976 to 62,891 – a tenfold increase in 18 months – driven largely by the REPORT Act making visible what was already happening.
The crime was not new, and the volume of offending had not suddenly multiplied. The reporting obligation was new.
The picture into 2025 is worse, not better. Mid-year 2025 data from NCMEC showed online enticement reports up 77% versus the same period in 2024, from 292,951 to 518,720.
AI-generated child sexual exploitation material exploded from 6,835 reports in the first half of 2024 to 440,419 in the first half of 2025 – a 6,344% increase in twelve months. NCMEC began tracking generative AI’s contribution to exploitation in 2023. “The growth,” in their words, “has been staggering.”
Snapchat, TikTok, and Discord face parallel criticism for inadequate safeguarding. The Social Media Victims Law Center, founded by liability attorney Matthew Bergman, has pursued civil accountability claims against these platforms on behalf of trafficking survivors, arguing that their design decisions – not merely their failure to moderate content – facilitate exploitation.
A two-year Guardian investigation published in 2023 found that Meta was not only “failing to report,” but “failing to even detect the full extent” of child sex trafficking on its platforms.
The UK’s Online Safety Act 2023 represents a more direct attempt at legislative accountability, requiring platforms to actively prevent harmful content and creating new obligations around age verification and protection of minors. Implementation remains contested, but the framework acknowledges what U.S. law has largely refused to: that platform design is not neutral, and that algorithmic choices have foreseeable human consequences.
FOSTA-SESTA: Well-Intentioned, Operationally Counterproductive
The most significant U.S. legislative attempt to address online trafficking to date is FOSTA-SESTA – the Fight Online Sex Trafficking Act and Stop Enabling Sex Traffickers Act, signed in 2018. The intent was to pierce Section 230 of the Communications Decency Act and hold online platforms liable for hosting content that facilitated sex trafficking. However, the reality was much more complicated.
The U.S. Government Accountability Office examined FOSTA-SESTA’s prosecutorial impact and found it had not helped prosecutors tackle trafficking cases. In seven years, the law produced effectively one federal prosecution. The GAO concluded that gathering evidence had become “more difficult due to the relocation of platforms overseas, platforms’ use of complex payment systems, and the increased use of social media platforms.” That last point is the critical one. FOSTA-SESTA did not push trafficking off the internet. It pushed it from dedicated classified platforms – where law enforcement had developed investigative expertise and digital forensic methods – onto social media platforms that are larger, more encrypted, and significantly harder to surveil.
The DOJ itself testified during the legislative process that SESTA would make trafficking investigations more difficult. They were right. Law enforcement that had relied on platforms like Backpage.com as investigative resources – following digital evidence, identifying traffickers through ad patterns, building cases from visible transactions – found those resources eliminated. The traffickers simply moved. The investigators’ evidence base did not move with them.
The downstream effects on victims were also severe and well-documented. After FOSTA-SESTA, approximately 40% of sex workers reported heightened physical and sexual assault, including robbery, rape, and confinement – because the online screening mechanisms they had used to vet clients were removed. More people moved to street-based sex work, which carries higher violence rates. Dependency on third-party managers increased, because the ability to independently find and screen clients online was gone. FOSTA-SESTA, designed to protect trafficking victims, made a significant population of already-vulnerable people more physically unsafe.
This is not an argument for leaving platforms unregulated. It is an argument for understanding that blunt regulatory instruments, applied without adequate operational modeling, can displace harm rather than reduce it – and can inadvertently strengthen the position of the traffickers they were designed to disrupt.
| CASE FILE Operation Cross Country: Behavioral Profiling at Scale The FBI’s Operation Cross Country, run through the Innocence Lost National Initiative, represents the largest coordinated domestic effort to apply behavioral intelligence to online trafficking recruitment. Between its launch in 2003 and its most recent large-scale iteration, the operation has recovered more than 5,000 minor victims of sex trafficking across the United States. What distinguishes Cross Country from conventional law enforcement sweeps is its methodology. Investigators did not simply respond to reported incidents. They profiled the behavioral and digital signatures of traffickers operating across platforms, built cases from pattern analysis, and coordinated across jurisdictions to identify the network structure of trafficking operations. This is BSU methodology applied at a national scale. The Behavioral Analysis Unit contributed this offender profile to the Innocence Lost initiative: “The prototypical domestic minor sex trafficking offender is opportunistic but patterned, typically male, and operating within a defined geographic circuit. Victims are recruited through personal relationships, social media, and runaway networks. The digital component of recruitment – social media as a first point of contact, mobile payments as transaction infrastructure – has grown with every successive operation.” In 2018, a single Cross Country enforcement operation recovered 103 children and identified 308 adult sex trafficking victims in a one-week period across 55 U.S. cities. In that same period, 120 suspects were identified. The operation required 500 FBI agents and partnerships with 400 local law enforcement agencies. The scale of the infrastructure needed to find 103 children in seven days tells you something significant about the scale of the problem those 103 children were part of. The lesson from Cross Country is one that every security and operations professional should internalize.
Operational pattern recognition at scale requires investment in analytical capacity, not just enforcement capacity. The former finds networks. The latter finds incidents. |
Farley’s Framework: Pornography, Trafficking, and the Digital Blur
Dr. Melissa Farley’s contribution to understanding digital trafficking goes beyond the nine-country study cited in Week 1. Her work specifically on the intersection of pornography, online platforms, and trafficking deserves direct engagement here, because it illuminates a dimension of the digital recruitment problem that most institutional frameworks systematically ignore.
Farley’s research position – developed across 52 peer-reviewed publications and refined through fieldwork in 14 countries – is that prostitution, pornography, and trafficking are not separate phenomena with meaningful legal or experiential distinctions. They are a continuum of the same organized sexual exploitation, with digital technology now serving as the connective tissue between them. As she wrote in the Albany Law Review paper, “Pornographers are indistinguishable from other pimps. Both exploit women’s and girls’ economic and psychological vulnerabilities or coerce them to get into and stay in the sex industry.”
The operational implication for digital platforms is significant. When a platform hosts pornographic content, recruits users toward that content through algorithmic recommendation, and creates economic incentives for content creators, it is not operating in a legally and morally clean space that happens to be adjacent to trafficking. According to Farley’s framework – which is grounded in survivor testimony and clinical data, not theoretical advocacy – it is operating as part of the same ecosystem. The documented crossover between pornography platforms and trafficking networks is not accidental. Nevada pimps, she documents, explicitly describe “cross-fertilizing” legal brothels with strip clubs, escort services, websites, and pornography production. The same logic applies to the digital environment.
This has direct implications for due diligence in corporate environments. An organization that operates, invests in, or maintains commercial relationships with platforms generating revenue from user-created sexual content should understand that those platforms exist within a documented chain of exploitation – and that reputational, legal, and ethical exposure flows from that chain.
“Pornography and prostitution are so linked that we’ve found you can’t separate them. Everything I know comes from the base of survivors’ perceptions and observations of the sex trade.”
— Melissa Farley, Ph.D., Prostitution Research & Education
The Financial Architecture: Mobile Payments and Cryptocurrency
Recruitment is only one half of the digital trafficking infrastructure. The other half is the financial mechanism. Cash has historically been the trafficker’s preferred payment medium – it is anonymous, immediate, and leaves no institutional record. The digital era introduced a complication – electronic payments are traceable. Traffickers adapted.
Mobile payment applications – Venmo, CashApp, Zelle, and their international equivalents – are now the primary identified payment mechanism in sex trafficking cases in the United States, according to law enforcement agencies, which describe them as a “significant challenge” to financial tracking. These applications were designed for consumer convenience: peer-to-peer payments with minimal friction, instant transfer, and limited mandatory disclosure to law enforcement. They were not designed with trafficking as a threat model. They function as one regardless.
Beyond mobile payments, cryptocurrency – particularly privacy coins like Monero, which leave no transparent transaction ledger – has been documented in trafficking cases involving higher-volume or more organized operations. Dr. Farley’s 2014 Albany Law Review paper noted Bitcoin’s early adoption in the commercial sex trade, stating “unregulated online currency that unlike credit cards provides the anonymity of cash is being used to pay for web access to sites containing extremely violent or illegal images of real women and children.” In 2026, the technological sophistication of those payment mechanisms has advanced considerably. The operational intent has not changed.
For financial compliance officers, anti-money-laundering teams, and corporate security functions: the indicators are behavioral, not just transactional. Multiple small, frequent payments to an individual from multiple unrelated accounts. Payments with notes that use coded language. Consistent transaction patterns that spike on specific days or evenings. The financial signature of trafficking is not invisible – but it requires someone looking for it.
| // OPERATIONAL INTELLIGENCE — DIGITAL RED FLAGS Recruitment red flags in digital environments: unsolicited contact from an unknown person who quickly becomes intensely interested and attentive; offers of modeling, entertainment, or travel work that bypass normal application processes; requests to move communication off-platform to encrypted apps immediately; offers of housing, money, or transportation tied to undefined ‘work’ or ‘opportunities.’ Financial red flags: multiple small, frequent incoming payments from unrelated accounts; payment app activity that spikes on specific evenings; use of prepaid debit cards, gift cards, or cryptocurrency for recurring payments; cash-intensive patterns in businesses without obvious cash-intensive explanations. Organizational red flags: employees who appear controlled by a companion; workers without ID, independent housing, or financial accounts; workers who give scripted or identical answers to questions about their personal situations; any worker whose pay appears to be going to a third party. |
What This Means for Security and Operations Professionals
The digital dimension of trafficking requires security and operations professionals to update their threat models. The indicators are no longer confined to physical environments – the hotel lobby, the truck stop, the agricultural labor camp. They exist in your organization’s digital infrastructure, in your employees’ network environments, and in the platforms your organization’s supply chain operates on.
The following is not a comprehensive checklist. It is a starting framework for organizations that have not yet built one.
PLATFORM EXPOSURE
Does your organization have active social media presences that interact with the public? If so, your platforms can be used as grooming vectors. Traffickers use comment sections, direct messages, and public posts to identify and make initial contact with targets. This is not theoretical – it is documented. Review your organization’s social media policies to determine whether they address inappropriate contact patterns, requests to move communication off-platform, or what constitutes grooming-style engagement.
SUPPLY CHAIN AND THIRD-PARTY RELATIONSHIPS
Industries with documented trafficking exposure include hospitality, logistics and transportation, agriculture, construction, healthcare, and domestic service. If your supply chain touches any of these sectors – particularly through subcontractors, staffing agencies, or seasonal labor providers – you have indirect exposure to trafficking risk. Due diligence requirements should include labor practice audits that specifically address worker access to their own identification documents, earnings, and freedom of movement.
FINANCIAL PATTERN MONITORING
AML and fraud teams should be briefed on the financial signature of trafficking operations. Mobile payment platforms are the primary vector. Pattern recognition – multiple small incoming payments from unrelated accounts, consistent timing patterns tied to trafficking activity cycles, geographic clustering of transactions – should be incorporated into existing monitoring frameworks.
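For teams translating these indicators into monitoring logic, the core heuristic can be sketched as a simple rule: flag recipients who receive many small payments from many unrelated senders, concentrated in evening hours. The schema, thresholds, and function names below are illustrative assumptions for a minimal sketch, not a production detection model or any specific vendor's API:

```python
from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Payment:
    """Hypothetical minimal transaction record for illustration."""
    sender: str
    recipient: str
    amount: float
    timestamp: datetime

def flag_recipients(payments, max_amount=200.0, min_senders=8, evening_share=0.6):
    """Flag recipients matching the behavioral signature described above:
    many small incoming payments from many distinct senders, with
    transactions concentrated in evening/night hours. Thresholds are
    placeholders to be tuned against an organization's own baselines."""
    by_recipient = defaultdict(list)
    for p in payments:
        if p.amount <= max_amount:          # keep only small payments
            by_recipient[p.recipient].append(p)

    flagged = []
    for recipient, txns in by_recipient.items():
        distinct_senders = {t.sender for t in txns}
        if len(distinct_senders) < min_senders:
            continue                        # too few senders to match pattern
        evening = sum(1 for t in txns
                      if t.timestamp.hour >= 20 or t.timestamp.hour < 4)
        if evening / len(txns) >= evening_share:
            flagged.append(recipient)
    return flagged
```

In practice a rule like this would feed a human review queue, not an automated action: the same pattern also matches legitimate activity (tip jars, fundraisers), which is why the article stresses behavioral context alongside transactional signals.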
EMPLOYEE TRAINING AND INCIDENT REPORTING
The person who identifies a trafficking situation in your organization is most likely not your security team. It is a front desk worker, a line supervisor, a driver, a nurse, a hotel housekeeper. The Stanislaus County H.E.A.T. Task Force model is instructive. Detection rates improve dramatically when trained eyes are widely distributed, not concentrated in a specialist function. A one-hour training on behavioral indicators and response protocol – who to call, what to say, what not to do – costs almost nothing. The alternative is documented in every jurisdiction that has not built that capacity.
One in three runaway youth is targeted by a trafficker within 48 hours of leaving home. Sixty-two percent of trafficking victims know their recruiter. The recruitment is happening on the platforms you are reading this on. The question is not whether this problem is present in your operating environment. It is whether you have built the capacity to see it.
This article was originally published in Risk Optics Weekly, a free weekly intelligence briefing for senior leaders. Subscribe at frazergthompson.substack.com