Corporate Governance and Human Trafficking
How information and people get exploited and blackmailed in today’s digital economy.
By Sally A. Vazquez-Castellanos, Esq.
Publication Date: January 24, 2026.
My Morning Chat
This morning’s reflection began while watching Netflix’s Killing Eve. It is a sophisticated, compelling drama—but the series is not without flaws.
That observation is not about storytelling or performance. It is about context.
Streaming has become routine. Like social media platforms, the algorithms that form the foundation of streaming services are increasingly controversial. Data collection may be par for the course, but the reality is that we all love to watch movies in our spare time.
The problem is the weaponization of these services, television sets, and telecommunications for profiling. In some instances, the services may be exploited to cause psychological harm, such as the unique voices, noises, and frequencies being exploited inside of my residence and vehicles. Obviously, this goes well beyond the traditional exploitation of data. Not that any exploitation is acceptable, but when individuals or agencies use a service with the intention of profiling others to cause psychological harm or distress, then something is terribly wrong with streaming. The Courts should know that this is a real form of technology harassment and should not ignore the possibilities.

Another form of harm is acoustics carefully laden throughout communities. Just imagine the psychological harm that can be inflicted once individuals begin piping in sounds of your daughter being waterboarded as a child. I am not letting you off the hook because this has been done for years. So much so that there is no point in hiding it anymore: if I suddenly pass away, please thoroughly inspect my vehicles and home, and please check for encephalitis or any other brain abnormalities. Thank you in advance.
It’s not like we haven’t been here before. This is not just a Netflix problem; it is also a TikTok problem, a Meta problem, and, sadly, a psychological problem.
My premise is straightforward—data is a commodity and there are those who are skilled at the exploitation and sale of information. Over time, it becomes reasonable to consider what sits behind these systems—and what that means not only for companies, but for subscribers and their families.
Supply Chains Are No Longer Only Physical
Historically, supply-chain discussions—especially in the context of human trafficking and modern slavery—have focused on physical inputs: labor, factories, raw materials, and transportation.
That framework still matters. But it no longer tells the whole story. After all, digital information is currency. When people engage in the sale or exploitation of information in the context of blackmail, it becomes a problem not only for those engaged in the practice of exploitation but also for the companies that support them and their naughty little habit.
Many companies today, particularly in media and technology, operate digital supply chains that include:
-Subscriber and viewer data, including engagement and viewing patterns
-Recommendation and ranking systems
-Advertising, analytics, and data-sharing partners
These systems do not manufacture goods, but they do shape exposure, influence behavior, and generate value. From a corporate-governance standpoint, they should be treated with the same seriousness as traditional supply chains.
However, the problems for companies are growing. If you own a small business or if you are a solo practitioner—where do you begin while trying to become proficient in your profession or business?
Today’s issues are different from when I began branding for our law practice years ago. The digital environment of today is replete with a very scary Metaverse and related issues such as trafficking, sexual exploitation, unfair competition, and reputational harm. And that is just online. It simmers into something far more dangerous when it trails off into allegations of trafficking human embryos.
Understand that the mere allegation of trafficking a woman’s embryos would be sufficient to cause anyone grave concern. As a former IVF patient and attorney at a family law practice, I have previously voiced my opinions about IVF in an effort to help other families.
Sadly, there are those who would resort to using that information to justify data breaches and the hacking of personal information that extends to family, friends, and colleagues. This includes the breach of medical information that impacted my mother in New York. She is suffering every day with interstitial lung disease. But in all of your selfishness, you not only disrespected me, you disrespected my entire family.
What’s unfortunate is that once you head down that road, you may have created a disaster of your own making, and you quickly become the victim of your own hubris and brazen behavior. I have not been silent about my anger. That does not mean that you did not try to keep me silent.
Children, Algorithms, and Foreseeable Risk
Most companies do not intend to place children at risk. That bears stating plainly.
At the same time, experience with smartphones and social media has shown that algorithmic systems can influence behavior in ways that are not always anticipated. Design choices that appear neutral can have cumulative effects, particularly for younger users.
Streaming services operate within similar engagement models. For adults, this may simply mean watching more content. For children, it can mean growing up in an environment where influence is continuous and largely unseen.
Let’s clarify that point, because it is also possible for some children to live under rather unique circumstances, something you may know nothing about, nor should you have been entitled to that information. Yet your misguided behavior placed children in jeopardy, which may include placing families in the crosshairs of traffickers and other criminals. This may include individuals engaged in risky lifestyles, predators living in some fantasy world of believing that the sexual exploitation and objectification of minors would be excused under any circumstances in this country.
The truth is that what began as an internet supporting entrepreneurs in our American economy has descended into a cesspool of political and other violence. The so-called “signs” began manifesting themselves in high-stakes data breaches that led all the way to our nation’s Intelligence Community with the leak of CIA tools online. At that point, the intended joke leads to something far more dangerous, and no corporate governance system can permit that kind of reckless activity.
While public safety and our nation’s immigration policies are the rallying cry for denying individuals due process—there remains the shield of something far more intrusive and destructive to the country as a whole.
WannaCry and related cyber incidents illustrate how hacking, layered messaging, and carefully designed innuendo can be used to threaten, coerce, intimidate, and manipulate people. These tactics can divide communities, fuel culture wars, and amplify extremist narratives. The resulting disruption affects human-race relations, educational systems, elections, smartphones, technology infrastructure, privacy, human rights, and dignity, ultimately leading to a broader national security concern.
The invariable result: Ana and others get exploited or become pawns at every step along the supply chain, with the expectation of turning them into something far more sinister. The problem may not be Netflix but a dangerous combination of dynamics that includes politics being infused into some of the latest mergers and acquisitions among media companies today.
In spite of human trafficking laws and legislation such as the Violence Against Women Act (VAWA), championed by President Biden, the world of innovation may be amplifying these concerns. As a result, Ana is used, objectified, and exploited by so-called elites adept at manipulating social media platforms and telecommunications systems as a whole.
It’s a big FU to our nation’s first African American President, Barack Obama, and First Lady Michelle Obama, who proudly proclaimed, “When they go low, we go high.” Come to think of it, I supported First Lady Michelle Obama when she made that statement. You know who else I supported? The Pantsuit Nation and former First Lady and Madam Secretary, Hillary Rodham Clinton.
How’s that for a sky full of stars?
These challenges are not unique to streaming services. Social-media companies such as Meta have publicly acknowledged the unintended consequences of engagement-driven systems. The lesson is familiar: intent does not eliminate responsibility for foreseeable outcomes.
Where the Concern Actually Lies
It is important to be clear about what this discussion is—and is not—about.
This is not a critique of the many hardworking people involved in producing and distributing content. Creative work is complex, collaborative, and often done in good faith. Most of the time, anyway, unless and until politics gradually takes over the industry. Corporate executives have a responsibility to advocate on behalf of their employees, no matter how inconvenient it becomes for some. What is inappropriate is to influence operations at any company in the name of politics, so much so that you invariably destroy its mission and legacy. This is the problem with network news, entertainment, and our global media operations.
The concern lies in corporate decision-making. Recent notable deals made by streamers involve the collection, aggregation, and monetization of large volumes of data. We can also see the battle over redistricting playing out among the states.
Those decisions go to the heart of what it means to:
-Protect data
-Respect privacy
-Care for subscribers
-Maintain trust
When content libraries, platforms, and data are combined or expanded through deal-making, governance must keep pace. That issue cannot be ignored.
Overlapping Systems: Corporate Tools and Public Power
Technology companies increasingly operate in environments that intersect with government systems. Historically, the very foundation of the Internet derives from our nation’s military research infrastructure.
Data access, compliance obligations, and public-private cooperation are now common. In my blog entitled, “It’s Personal: Children, Privacy, Technology and the Law,” I explored this overlap through the fictional story of Ana, informed by current U.S. immigration policy.
The point is not to be accusatory. It is illustrative.
When corporate systems intersect with law enforcement or immigration processes, governance becomes more important, particularly where families or children may already be vulnerable or under attack.
Innovation Without Guardrails
Innovation culture rewards speed. In many instances, there is a lot we do not understand about societal impacts, and the justice system is usually slow to adapt, often with good reason. The costs of disruption are often treated as abstract or temporary.
Human consequences are neither.
Modern slavery and trafficking laws reflect a consistent lesson: harm occurs because we may be moving far too fast. A national emergency such as Covid is one thing, but forcing everyone to put all of their information online may be reckless. Technology does not change concerns such as the elimination of jobs. It often magnifies them.
Innovation without governance creates blind spots.

Killing Fields of Juárez: A Structural Warning
The reality is that the problems plaguing our nation such as colonialism, racism, and discrimination have always been with us.
Long before today’s discussions about algorithms and data, journalists, myself included, have documented how systemic failure places women at risk.
Reporting on the murders and disappearances of women in Ciudad Juárez—often referred to as the killing fields—revealed how labor practices, migration, weak enforcement, and institutional indifference converged.
This includes “Forgotten: The Women of Juárez,” which examines femicides and systemic impunity and demonstrates how systems without accountability create conditions where exploitation can persist.
The lesson is structural and enduring.
Children Are Not a Marginal Risk Group
A persistent misconception in trafficking discussions is that risk is confined to distant or marginalized communities.
In reality, children in well-resourced households often have:
-More device access
-More data collected about them
-More algorithmic exposure
-Fewer meaningful barriers
It is also true that when people conspire to hurt your family, it becomes increasingly difficult for any parent or caregiver to police. This reality underlies modern children’s privacy and safety laws. Influence does not require traditional vulnerability, only access. Some individuals have more access than others, which places the Facebook privileges issue among celebrities and politicians squarely at issue.
Entertainment, Scale, and Corporate Responsibility
Watching television on a streaming service such as Netflix is now an ordinary part of daily life.
The issue is not entertainment. It is scale and stewardship. But how do you lead in an environment that shows signs of a national security nightmare? Given the recent TikTok deal, maybe it is not all doom and gloom. We have to think critically from the moment that issue arose to where we are today. Perhaps there is a path toward leadership. But the road ahead is not about party; it is about inclusion toward resolving a national crisis that does include corporate responsibility.
When platforms reach millions of households, governance considerations include:
-How data is collected and combined
-How recommendation systems operate
-Whether parental controls are meaningful in practice
-Whether deal-making aligns with stated privacy commitments
These are governance questions, not cultural judgments. However, culture does matter. Diversity, equity, and inclusion matter.
Mergers, Data, and Why This Matters Now
Recent press coverage of media-industry mergers has emphasized the sheer volume of data being acquired alongside content libraries and platforms.
From a corporate and regulatory standpoint, mergers now raise questions about:
-Data concentration.
-Cross-platform profiling and automated decision making.
-Global privacy and data-transfer obligations.
-Cybersecurity measures and the protection of personal and sensitive data.
-A careful examination of cloud computing and storage.
-Third-party sharing governance, including data-sharing arrangements with enforceable auditing measures.
-Long-term implications for subscribers and families.
It is not surprising that regulators, including the Department of Justice, are examining these issues alongside competition and privacy policy.
For subscribers, the concern is straightforward: what happens to personal data when companies combine it at scale?
Closing
Modern slavery law offers a useful reminder: harm grows where responsibility fades.
Technology reduces distance and increases influence. Corporate governance must adjust accordingly—particularly where data, families, and children are concerned.
The adjustment requires that we all care.
Sources
UK Modern Slavery Act 2015.
California Transparency in Supply Chains Act.
Trafficking Victims Protection Act (U.S.).
“Forgotten: The Women of Juárez” (iHeartMedia investigative podcast), reporting on Ciudad Juárez.
DOJ and FTC materials on data, privacy, and competition.
Public reporting on media-industry mergers and data aggregation.

About the Author
California Attorney and Shareholder at Los Angeles-based family law firm Castellanos & Associates, APLC. Focuses on legal issues at the intersection of children’s privacy, global data protection, and the impact of media and technology on families.
