Sara Hussein lands at LAX after a conference in London and gets pulled aside by agents who tell her something that makes no sense: she’s going to hurt her husband, and they have data to prove it. The data came from her dreams, recorded by the DreamCloud device she bought to manage her insomnia. She purchased the product to help herself sleep and ended up selling her subconscious to a company that sold it to the government. Now she’s being held in a retention center for crimes she hasn’t committed and probably never will commit, but the algorithm says she’s a risk, so she stays.
The psychology of being detained for what you might do is different from being detained for what you’ve done. Charged with a past act, you can defend yourself against specific evidence. Charged with a predicted one, you’re expected to prove a negative: to convince people you won’t do something you weren’t planning to do in the first place. The burden flips entirely. Sara finds herself in a position where the only way to prove she’s not dangerous is to demonstrate perfect compliance, which means suppressing every part of herself that the algorithm might flag as problematic. She has to become smaller, quieter, less herself.
The Trap of Compliance
The retention center operates on rules that shift constantly. The longer Sara stays, the more infractions pile up, even when she’s trying her hardest to follow instructions. She gets tased for trying to check a spelling error on official paperwork. She gets punished for taking her shirt off while doing laundry in extreme heat. The system isn’t designed to let people out. It’s designed to trap them deeper. Each misstep extends her stay. Each extension creates more opportunities for missteps.
What Lalami shows is how surveillance reshapes behavior at the most basic level. Sara starts observing herself constantly. She monitors her own thoughts. She’s aware of every gesture, every word choice, every emotion. She’s essentially become her own jailer. The retention center gave her the external structure, but she’s internalized the surveillance so thoroughly that she doesn’t actually need guards watching anymore. She watches herself. This is the actual psychology of being surveilled: eventually, you become the enforcement mechanism.
The dream work is particularly brutal because it’s the one thing Sara has no control over. Her conscious mind can follow the rules. Her subconscious can’t. She’s supposed to prove she’s safe while continuing to dream troubling things, and there’s no mechanism available to her to prevent that. The system has designed an impossible situation and then expects her to escape it through perfect behavior.
The Corporate Machinery of Control
Safe-X runs the retention center as a business. The detainees are cheap labor. They work assessing AI images for a film production company. They launder clothes. They perform tasks that make corporations money while simultaneously providing “rehabilitation.” The system turns incarceration into profit. Every service—emails, clean sheets, necessities—costs money. The detainees have no income. They’re accumulating debt while imprisoned for crimes they didn’t commit.
This is where the novel gets at something genuinely unsettling about how technology and corporate power intersect. Nobody is intentionally trying to be cruel. The system just operates according to its logic. Safe-X needs to recoup costs. The government needs to assess risk. The algorithm needs data. Everyone’s following protocols. But the result is that ordinary people like Sara end up trapped indefinitely because the incentive structure never actually rewards release.
The Question of What Privacy Actually Means
Sara’s an archivist. She understands that information, once recorded, becomes reality. What’s documented gets treated as fact. She starts keeping her own records of her dreams, trying to correct the official narrative about who she is. But the problem is that the algorithm doesn’t care about truth. It cares about prediction. The algorithm has looked at Sara’s data and decided she’s dangerous. Anything she does to prove otherwise just creates more data for the algorithm to interpret.
The DreamCloud device that Sara bought to help with insomnia represents something broader about how technology sells us convenience while harvesting our most private information. Nobody reads the terms of service. Nobody imagines that their dreams might be sold to the government. People accept the device because it solves an immediate problem. The larger consequences are too abstract, too far in the future, too disconnected from the immediate need to sleep.
Where the Ending Doesn’t Actually Resolve Anything
Sara does eventually leave the retention center, but she doesn’t get closure. She doesn’t escape the system. She just navigates through it. The novel doesn’t offer the satisfaction of her vindication or her triumph. Instead, it shows her understanding that she’ll never be truly free from surveillance, that the data will follow her, that the system has already transformed how she sees herself and her life.
This is psychologically honest in uncomfortable ways. A more conventional novel would have Sara finally proving her innocence and reclaiming her life. But the actual psychology of what Lalami describes doesn’t permit that kind of resolution. Once you’ve been subjected to this level of surveillance, once you’ve internalized the monitoring as a fundamental part of how you move through the world, you don’t simply move past it because circumstances change.
The Future That’s Already Here
The novel’s greatest achievement is making the reader understand that this isn’t actually speculative fiction. Predictive policing exists. Stop-and-frisk programs existed. The no-fly list exists. Algorithmic risk assessment is being used right now to determine who gets released from prison, who gets hired, and who gets approved for loans. Lalami’s novel takes what’s already happening and just pushes it slightly further. She shows what happens when the systems we’ve already accepted start to feel as oppressive as they actually are.
Sara’s situation is extraordinary, but the psychological mechanisms through which her situation develops—the incremental acceptance of surveillance, the normalization of data harvesting, the gradual erosion of privacy—these are ordinary. Most people aren’t detained. But most people have already accepted similar losses of privacy in exchange for convenience. The novel asks whether the distinction between Sara’s situation and ours is really as large as we’d like to believe.